Saturday 15 December 2012

Max's Story.

This post was written a couple of days ago by a mum from Australia who has a little boy on our distance programme. Max has been on the Snowdrop programme for around 12 months and in that time he has made incredible progress. Although your kind words are much appreciated, Faith, you are the real star here, having worked incessantly to rescue your little boy from the depths of brain injury.

-----------------------------------------------------------

Wake up Sleepyhead

 

I started writing this blog about 2 weeks after Max's stroke. A good friend suggested it might be cathartic to write about our experiences. It's also been a good way to disseminate the information everyone wanted to know. In the first few weeks of writing, I wrote furiously to get all the facts down on paper before I forgot them. My early posts are pretty crappy and straight to the point. Over time, I've written a few 'flashback' posts and talked about certain events in more detail. Which is exactly what this post is going to be...


Max spent about 2 days, officially, in a medically induced coma. It was the only way they could stop the seizures which were causing his brain to swell dangerously. It probably sounds strange, but I was grateful for those 2 comatose days. It gave me a chance to process everything that was going on. Every medical professional I spoke to in those 2 days uttered the same phrase: "we'll know more about his prognosis once he's out of the coma". I was in no hurry for him to wake up; I was petrified of the reality we were going to face.

In those 2 days, I sat with him with every intention of reading him stories and singing his favourite songs. I never did either of those things because I couldn't find the strength. Instead, I cleaned his eyes when they got mucky, I cleaned his mouth and kept his lips moist. Occasionally, I changed his nappy, although there wasn't much point because he had a catheter and his bowels were essentially 'paralyzed'.

The thing I did most of all during those days was think. I had no idea who my son would be when he woke up. By that stage, we knew he had permanent brain damage but didn't know how it would affect him. Would he be permanently paralyzed? Would he be dependent on me for the rest of his life? Would he be mentally handicapped? Would he ever do those 'normal' things that other parents take for granted? I feel a little ashamed, but the one thought which had the most air time was 'I never signed up for this sh*t'.

Coming out of a coma is nothing like you see in the movies. On TV, the (beautifully made up) comatose patient's eyelids flutter before slowly opening; the patient looks at the person sitting lovingly by their bedside and stutters "wh-wh-at happened?".

In reality, the process is very long and extremely tedious. For 2 days, I'd basically seen no signs of life from my baby. No twitching, no eyelid flutters, no response to anything. 100% of his breathing was done by a life support machine. Slowly, we started to see little twitches, his eyes started moving behind closed eyelids and every now and then he breathed for himself. His eyes didn't flutter open like they do in the movies. They opened millimeter by millimeter, over the course of 24 hours. Once they were open, his gaze was vacant.

There's one particular photo I look at fairly often, which tells the story of how far we've come since last June. It was taken by my mother on the day Max started coming out of the coma. I don't think I took any photos on that day, because it hurt like hell. When I think back to that day, I still feel the stabbing pain in my heart.

It took days, maybe even weeks, for the Thiopental (aka the coma drug) to wash out of his body. The doctors explained in layman's terms that Thiopental literally soaks into every fat and muscle cell of the body. It was going to take his little body a while to rid itself of the drug. (Incidentally, I later discovered that Thiopental is the first of the three drugs given in lethal injection executions - I'm glad I didn't know that at the time.)

Even after the Thiopental and painkillers were out of his system, Max was still a space cadet. He would've been happy lying in his cot all day and staring at the wall. Not once did I let him do that. When Max was awake, his brain was being stimulated somehow. We played music, read stories and took him for walks around the hospital. I was the crazy mummy who took her baby to the Starlight room and joined in the art groups. Despite all of this, he was still pretty vague. I could bang saucepans only meters away from him and he wouldn't respond. Yet we knew his hearing was perfect.

After months of (slightly obsessive) researching, I started Max on the Snowdrop Program. He literally 'woke up' after only one day of the program. On Monday, I could carry him into the shops easy peasy. On Tuesday, he was a humanoid octopus who tried to grab everything off the shelves.

Looking at him now, it's almost impossible to believe he's the same child. Check out these recent pictures: he is alert, hyperactive and incredibly mischievous.

From a physical perspective, we still have a long way to go, but that's a whole other post. I have days when the cheeky little sod is driving me batty and I have to remind myself of how far he's come. There aren't enough ways to say thank you to Andrew and the Snowdrop Program. It's my greatest wish for us to travel to the UK, because there's one thing I really need to say to him in person: "Thank you for bringing my son back".

This isn't a sponsored post, but for anyone wanting to know more about the program, here's the link:

Snowdrop for Brain Injured Children

Saturday 8 December 2012

Music can help you to sleep.

This needs more research, but it is interesting and could help our children in the future. Snowdrop already incorporates music into our programmes of treatment for children with cerebral palsy, autism, ADHD and more - for instance, both Gregorian chant and Baroque music have been shown to influence brainwave patterns positively towards sleep. http://www.medicalnewstoday.com/articles/253101.php

Wednesday 28 November 2012

Brain Plasticity in Action (on a trampoline)!

This young man is just 19 months old. When he was 6 weeks of age he suffered a massive stroke which destroyed the left side of his brain. His doctors told his mum that as a consequence he would never be able to use his right-side limbs, which meant he would never crawl and never walk, and that because language functions are situated in the left hemisphere, he would never understand or produce spoken language. His mum refused to accept this and after many months of despair, she found Snowdrop via an internet search. We instituted a programme of neuro-developmental stimulation, which he has been following for just 1 year. The results have been astonishing: he did crawl, he does walk (and run), and he most certainly does talk! Here we see him coordinating both legs in order to enjoy the trampoline. This young man is proof positive that not only can we stimulate brain plasticity, we can successfully direct it down a developmental pathway and thus restore the functions of children who have suffered brain injury.

Here he is, trampolining using both legs in coordinated style! Go Max!

Monday 26 November 2012

The Principles of the Snowdrop Programme


(1). Brain injury is in the brain, and if we are to help our children overcome their problems we must direct our efforts towards influencing brain plasticity.

(2). The brain responds to three major influences: genetic instruction, its internal operating environment, and the demands placed on it by the environment. These three factors drive the development of the child forward. We cannot influence genetic instruction, but we can influence the other two factors.

(3). How do we influence the demands of the environment and therefore also influence brain plasticity? We do so through repetition of stimulus. A brain injury acts as a 'roadblock', preventing stimuli from the environment from being processed properly in the brain, and therefore the child fails to develop. The Snowdrop programme assesses where that developmental roadblock lies in each area of development and provides an appropriate developmental activity, which is repeated over weeks and months and acts as an increased environmental stimulus, helping to overcome the roadblock and allowing the correct stimulation to reach the brain.

(4). The brain prefers to take in information in short, sharp bursts, which is why most activities within the programme are carried out for between 1 and 3 minutes.

(5). The brain needs plenty of 'downtime' in order to process and organise information; for this reason the programme is not as 'intensive' as might be imagined.

(6). Children learn and develop in social situations with the help of family and friends. All new abilities begin as abilities which are just beyond the child's reach, and he or she can only perform them at first with help from family and friends. The programme activities are therefore carried out with the child by family and friends.

(7). Those friends and family who are helping the child learn and develop in social situations are providing assistance which Bruner termed 'scaffolding', enabling the child to complete developmental tasks which are just outside his or her ability to complete alone. As the child becomes increasingly competent at the ability through repetition of stimulus, the scaffolding is gradually withdrawn until the ability is 'internalised' and the child has attained that developmental ability. This is what Vygotsky termed 'passage through the zone of proximal development.' In this way we marry academically sound Vygotskian psychology with current evidence on stimulating neuroplasticity.

Tuesday 13 November 2012

The link between music and language development.

This is the reason why exposure to music is a primary factor within the Snowdrop programme for brain injured children.  With thanks to 'Medical News Today.'

---------------------------------------------------------------------------

Contrary to the prevailing theories that music and language are cognitively separate or that music is a byproduct of language, theorists at Rice University's Shepherd School of Music and the University of Maryland, College Park (UMCP) advocate that music underlies the ability to acquire language. 

"Spoken language is a special type of music," said Anthony Brandt, co-author of a theory paper published online this month in the journal Frontiers in Cognitive Auditory Neuroscience. "Language is typically viewed as fundamental to human intelligence, and music is often treated as being dependent on or derived from language. But from a developmental perspective, we argue that music comes first and language arises from music." 

Brandt, associate professor of composition and theory at the Shepherd School, co-authored the paper with Shepherd School graduate student Molly Gebrian and L. Robert Slevc, UMCP assistant professor of psychology and director of the Language and Music Cognition Lab. 

"Infants listen first to sounds of language and only later to its meaning," Brandt said. He noted that newborns' extensive abilities in different aspects of speech perception depend on the discrimination of the sounds of language - "the most musical aspects of speech." 

The paper cites various studies that show what the newborn brain is capable of, such as the ability to distinguish the phonemes, or basic distinctive units of speech sound, and such attributes as pitch, rhythm and timbre. 

The authors define music as "creative play with sound." They said the term "music" implies an attention to the acoustic features of sound irrespective of any referential function. As adults, people focus primarily on the meaning of speech. But babies begin by hearing language as "an intentional and often repetitive vocal performance," Brandt said. "They listen to it not only for its emotional content but also for its rhythmic and phonemic patterns and consistencies. The meaning of words comes later." 

Brandt and his co-authors challenge the prevailing view that music cognition matures more slowly than language cognition and is more difficult. "We show that music and language develop along similar time lines," he said. 

Infants initially don't distinguish well between their native language and all the languages of the world, Brandt said. Throughout the first year of life, they gradually hone in on their native language. Similarly, infants initially don't distinguish well between their native musical traditions and those of other cultures; they start to hone in on their own musical culture at the same time that they hone in on their native language, he said. 

The paper explores many connections between listening to speech and music. For example, recognizing the sound of different consonants requires rapid processing in the temporal lobe of the brain. Similarly, recognizing the timbre of different instruments requires temporal processing at the same speed - a feature of musical hearing that has often been overlooked, Brandt said. 

"You can't distinguish between a piano and a trumpet if you can't process what you're hearing at the same speed that you listen for the difference between 'ba' and 'da,'" he said. "In this and many other ways, listening to music and speech overlap." The authors argue that from a musical perspective, speech is a concert of phonemes and syllables. 

"While music and language may be cognitively and neurally distinct in adults, we suggest that language is simply a subset of music from a child's view," Brandt said. "We conclude that music merits a central place in our understanding of human development." 

Brandt said more research on this topic might lead to a better understanding of why music therapy is helpful for people with reading and speech disorders. People with dyslexia often have problems with the performance of musical rhythm. "A lot of people with language deficits also have musical deficits," Brandt said. 

More research could also shed light on rehabilitation for people who have suffered a stroke. "Music helps them reacquire language, because that may be how they acquired language in the first place," Brandt said. 

Sunday 9 September 2012

Green tea and its effects upon neurogenesis.

With thanks to 'Medical News Today'. This looks interesting! The chemical within green tea (EGCG) seems to affect neurogenesis - the production of new brain cells during life, which occurs mainly in the hippocampus, the part of the brain responsible for learning and part of memory formation. Moreover, the evidence suggests that EGCG can turn these new cells to various uses in the brain, as the researchers discovered that

"EGCG helps to promote the making of neural progenitor cells, which are similar to stem cells which can turn into many different kinds of cells."

This has immediate practical implications for the treatment of brain injured children and will be incorporated into the Snowdrop programme with immediate effect. (We shall be recommending caffeine-free green tea, of course.)

Green Tea Improves Memory and Spatial Awareness.

Saturday 21 July 2012

The Reticular Formation and Sensory Processing.

We are constantly taking in information from the environment through our senses. It is something we cannot help but do and we use this sensory information to each construct our version of reality. But what is reality?

None of us really have any idea what reality is actually like: all we have is a limited sensory system, which interprets visual, auditory and tactile information and relays it to our conscious awareness. But people can only interpret a small part of reality, being unable to detect, for example, radiation or large parts of the light spectrum.

This is one reason why there is folly in totally accepting the world your senses provide you with. But there is another reason, one that you have more direct control over: the sensitisation of your reticular system and what it means for how you experience life on a daily basis.

The general rule of your reticular system is that whatever dominates your thoughts - both conscious and unconscious - will also dominate your attention, whether you like it or not. Ever had a toothache and then noticed that there seem to be an awful lot of adverts on TV about toothpaste and dentists? This is your reticular system at work. When a mother has a baby, she becomes acutely aware, even in sleep, of every noise her baby makes. This is her reticular system at work, tuning attention to what is dominating her thought processes.

Now let's consider what happens when the functioning of the reticular system is not as it should be. Many children suffer from sensory oversensitivity, whether it be visual, auditory or tactile - or all three! This might present itself as a general oversensitivity in the affected modality, or a more specific oversensitivity, such as being oversensitive to specific sights, sounds and/or sensations. This is again the work of the reticular system (in conjunction with the thalamus). Because of a dysfunction within the brain, whether caused by genetics or brain injury, the reticular system of the child becomes sensitised to particular stimuli, whether visual, auditory or otherwise, and works in conjunction with the thalamus to excite the cortex so that the stimulus is processed. However, because of the dysfunctional reticular system, the cortex becomes over-excited and the child, not understanding why the stimulus is triggering this reaction in his system, reacts wildly. Here we have the basis for sensory oversensitivity in many types of developmental disability, including cerebral palsy, autism and Asperger's syndrome, or any other type of brain injury.

Fortunately, these neurological structures can be re-tuned, as they constantly are in the uninjured human being, as our awareness and attention are constantly redirected to salient features of our environment. Snowdrop has developed techniques to help children who suffer from this type of difficulty to re-tune the dysfunctional reticular formation, thus allowing the opportunity for normal developmental processes to resume.

If you would like more information about Snowdrop's treatment programmes for brain injury, visit http://www.snowdrop.cc

Wednesday 11 July 2012

Study Shows the Deaf Brain Processes Touch Differently

This study again highlights the brain's adaptability. It demonstrates not only the 'rewiring' phenomenon we see in our children as a result of their participation in the Snowdrop programme, but also the fact that areas of the brain previously thought to be specialised for specific functions can adapt and take on other functions.

http://neurosciencenews.com/study-shows-the-deaf-brain-processes-touch-differently/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+neuroscience-rss-feeds-neuroscience-news+%28Neuroscience+News+Updates%29

Friday 29 June 2012

The Brains of Children with Autism are Wired Differently.

Research into how the brain is connected in a different way in children with autism. What this study doesn't tell you is that 'wiring patterns' in the brain can be changed. The brain responds mainly to two things: genetic instruction (faulty genetic instruction can cause a faulty wiring pattern) and the stimuli it receives from the environment. The environment is by far the most powerful force and the stimulation from it can be manipulated so as to encourage the brain to change. This is what the Snowdrop programme is all about.
-------------------------------------------------------------------------------
A research team led by Elizabeth Aylward, a University of Washington professor of radiology, report that brains of adults with autism are “wired” differently from people without the disorder. The researchers, who are affiliated with the University of Washington’s Autism Center, also found that this abnormal connection pattern may be the cause of the social impairments characteristic of autism in children.

The research team used functional magnetic resonance imaging in the study, which also revealed that the subjects with the most severe social impairment showed the most abnormal pattern of connectivity in the brain regions that process faces. One of the earliest characteristics to emerge in autistic children is a deficit in face processing, and this study is the first to examine how connected these brain regions are while the brain processes information about faces.

Lead author Natalia Kleinhans states that "This study shows that these brain regions are failing to work together efficiently" and that the “work seems to indicate that the brain pathways of people with autism are not completely disconnected, but they are not as strong as in people without autism."

The study’s participants were 19 high-functioning autistic adults aged 18 to 44 with IQs of at least 85, and 21 age- and intelligence-matched typically developed adults. Within the autism spectrum disorder group were 8 individuals diagnosed with autism, 9 diagnosed with Asperger's syndrome, and 2 with pervasive developmental disorder not otherwise specified. Levels of social impairment were drawn from clinical observations and diagnoses.

Participants were shown 4 series of 12 pictures of faces and a similar series of pictures of houses, all while having their brains scanned. The pictures were viewed for 3 seconds, and occasionally they were repeated. The participants were instructed to press a button when a picture was repeated.

Because this was a basic task, the two groups showed no difference in performance, but, according to co-author Todd Richards, "Differences might have shown up if they had been asked to do something more complicated."

While there was no difference in performance, the two groups exhibited different patterns of brain activity. The typically developing adults showed significantly more connectivity between the area of the brain involved in face identification and two other areas of the brain than did the autism group.

Those autistic participants with the largest social impairment demonstrated the lowest level of connectivity between the areas of the brain, leading the authors to conclude that "This study shows that the brains of people with autism are not working as cohesively as those of people without autism when they are looking at faces and processing information about them."

Does this research mean that children with autism need to be 'stuck' with this connectivity problem? This is not what I am finding. We know that the brain has qualities of plasticity - that it is capable of re-organising its structure and functioning through environmental stimulation. We know that this plasticity is achieved through 'sprouting' - that is, the forming of new synaptic connections through dendritic growth in response to this environmental stimulation. As I said at the beginning of this post, this means that the faulty wiring pattern which the brains of children with autism adopt can be changed. The question is, how do we do this? At Snowdrop, I do this by providing the child with an enriched developmental environment which provides stimulation appropriate to the child's sensory and cognitive needs. In the particular instance of poor face recognition processing, we can utilise specialised techniques to enhance the abilities of children to process information concerning faces. Very often this leads to greater eye contact, better facial regard and the development of mutual attention. As these abilities underpin both language and social development, we can also see improvements in these areas.

Friday 8 June 2012

Music and Language are Processed By Some of the Same Brain Systems

 
This is further justification for the use of music as a tool for treatment within the Snowdrop programme, both generally and using such tools as 'The Listening Programme', of which Snowdrop is a provider.

With thanks to MNT
 -----------------------------------------------------------------
Researchers have long debated whether or not language and music depend on common processes in the mind. Now, researchers at Georgetown University Medical Center have found evidence that the processing of music and language do indeed depend on some of the same brain systems. 

Their findings, which are currently available on-line and will be published later this year in the journal NeuroImage, are the first to suggest that two different aspects of both music and language depend on the same two memory systems in the brain. One brain system, based in the temporal lobes, helps humans memorize information in both language and music -- for example, words and meanings in language and familiar melodies in music. The other system, based in the frontal lobes, helps us unconsciously learn and use the rules that underlie both language and music, such as the rules of syntax in sentences, and the rules of harmony in music. 

"Up until now, researchers had found that the processing of rules relies on an overlapping set of frontal lobe structures in music and language. However, in addition to rules, both language and music crucially require the memorization of arbitrary information such as words and melodies," says the study's principal investigator, Michael Ullman, Ph.D., professor of neuroscience, psychology, neurology and linguistics. 

"This study not only confirms that one set of brain structures underlies rules in both language and music, but also suggests, for the first time, that a different brain system underlies memorized information in both domains," Ullman says. "So language and music both depend on two different brain systems, each for the same type of thing -- rules in one case, and arbitrary information in the other." 

Robbin Miranda, Ph.D., currently a post-doctoral researcher in the Department of Neuroscience, carried out this research with Ullman for her graduate dissertation at Georgetown. They enrolled 64 adults. They used a technique called Event-Related Potentials, in which they measured the brain's electrical activity using electrodes placed on the scalp. 

The subjects listened to 180 snippets of melodies. Half of the melodies were segments from tunes that most participants would know, such as "Three Blind Mice" and "Twinkle, Twinkle Little Star." The other half included novel tunes composed by Miranda. Three versions of each well-known and novel melody were created: melodies containing an in-key deviant note (which could only be detected if the melody was familiar, and therefore memorized); melodies that contained an out-of-key deviant note (which violated rules of harmony); and the original (control) melodies. 

For listeners familiar with a melody, an in-key deviant note violated the listener's memory of the melody -- the song sounded musically "correct" and didn't violate any rules of music, but it was different than what the listener had previously memorized. In contrast, in-key "deviant" notes in novel melodies did not violate memory (or rules) because the listeners did not know the tune. 

Out-of-key deviant notes constituted violations of musical rules in both well-known and novel melodies. Additionally, out-of-key deviant notes violated memory in well-known melodies. 

Miranda and Ullman examined the brain waves of the participants who listened to melodies in the different conditions, and found that violations of rules and memory in music corresponded to the two patterns of brain waves seen in previous studies of rule and memory violations in language. That is, in-key violations of familiar (but not novel) melodies led to a brain-wave pattern similar to one called an "N400" that has previously been found with violations of words (such as, "I'll have my coffee with milk and concrete"). Out-of-key violations of both familiar and novel melodies led to a brain-wave pattern over frontal lobe electrodes similar to patterns previously found for violations of rules in both language and music. Finally, out-of-key violations of familiar melodies also led to an N400-like pattern of brain activity, as expected because these are violations of memory as well as rules. 

"This tells us that these two aspects of music, that is rules and memorized melodies, depend on two different brain systems -- brain systems that also underlie rules and memorized information in language," Ullman says. "The findings open up exciting new ways of thinking about and investigating the relationship between language and music, two fundamental human capacities."

Friday 27 April 2012

The Brain Prefers Small 'Chunks' of Information.

Ever wondered why the Snowdrop programme activities are presented as small 'bite sized' chunks, and why our Vygotskian workshop activities are broken down into 'bite sized' series of simple units of activity?
------------------------------------------------------

In order to comprehend the continuous stream of stimuli that battle for our attention, humans seem to break down activities into smaller, more digestible chunks, a phenomenon that psychologists describe as "event structure perception."

Event structure perception was originally believed to be confined to our visual system, but new research published in Psychological Science, a journal of the Association for Psychological Science, reports that a similar process occurs in all sensory systems. 

Researchers at Washington University examined event structure perception by studying subjects going about everyday activities while undergoing tests to measure neural activity. The subjects were then invited back a few days later to perform the same tasks, this time without their neural activity being measured. Instead, they were asked to divide the task where they believed one segment of an activity ended and another segment began.

The researchers surmised that if changes in neural activity occurred at the same points that the subjects divided the activities, then it could be safe to suggest that humans are physiologically disposed to break down activities into bite sized chunks (remember that the same subjects had no idea during the first part of the experiment that they would later be asked to segment the activity). 

As expected, activity in certain areas of the brain increased at the points that subjects had identified as the beginning or end of a segment, otherwise known as an "event boundary." Consistent with previous research, such boundaries tended to occur during transitions in the task, such as changes of location or a shift in the character's goals. Researchers have hypothesised that people break down activities into smaller chunks when they are involved in an activity. However, this is the first study to demonstrate that this process occurs naturally, without awareness, and to identify some of the brain regions that are involved in this process.

These results are particularly important to our understanding of how humans comprehend everyday activity. The researchers suggest that the findings provide evidence not only that people are able to identify the structure of activities, but also that this process of segmenting the activity into discrete events occurs without us being aware of it.

In addition, a subset of the network of brain regions that also responds to event boundaries while subjects view movies of everyday events was activated. It is believed that "this similarity between processing of actual and observed activities may be more than mere coincidence, and may reflect the existence of a general network for understanding event structure."

Tuesday 17 April 2012

The Importance of Zinc.

All parents with children on the Snowdrop programme are made aware of the importance of Zinc.
-------------------------------------------

To the multitude of substances that regulate neuronal signaling in the brain and spinal cord add a new key player: zinc. By engineering a mouse with a mutation affecting a neuronal zinc target, researchers have demonstrated a central role for zinc in modulating signaling among the neurons. Significantly, they found the mutant mouse shows the same exaggerated response to noise as children with the genetic disorder called "startle disease," or hyperekplexia. 

The findings shed light on a nagging mystery in neurobiology: why the connections among certain types of neurons contain considerable pools of free zinc ions. And even though many studies had shown that zinc can act toxically on the transmission of neural impulses, in half a century of experiments researchers had not been able to show conclusively that the metal plays a role in normal nerve cell transmission.

However, in an article in the journal Neuron, published by Cell Press, Heinrich Betz and colleagues conclusively demonstrate just such a role for zinc. 

In their experiments, the researchers produced mice harboring a mutant form of a gene for a receptor for zinc in neurons--thereby compromising the neurons' ability to respond to zinc. The mutation in the receptor, called the glycine receptor, targets the same receptor known to be mutated in humans with hyperekplexia. The receptor functions as a modulator of neurons in both motor and sensory signaling pathways in the brain and spinal cord. 

The genetic approach used by the researchers was a more targeted technique than previous experiments in which researchers reduced overall neuronal zinc levels using chemicals called chelators that soak up zinc ions. 

The resulting mutant mice showed tremors, delayed ability to right themselves when turned over, abnormal gait, altered transmission of visual signals, and an enhanced startle response to sudden noise. 

Electrophysiological studies of the mutant animals' brain and spinal neurons showed significant zinc-related abnormalities in transmission of signals at the connections, called synapses, among neurons. 

Betz and his colleagues wrote that "The data presented in our paper disclose a pivotal role of ambient synaptic [zinc ion] for glycinergic neurotransmission in the context of normal animal behavior." They also concluded that their results implied that manipulating synaptic zinc levels could affect the neuronal action of zinc, but that such manipulation "highlights the complexity of potential therapeutic interventions," which could cause an imbalance between the excitatory and inhibitory circuitry in the central nervous system. 

In a preview of the paper in the same issue of Neuron, Alan R. Kay, Jacques Neyton, and Pierre Paoletti wrote "Undoubtedly this work is important, since it directly demonstrates that zinc acts as an endogenous modulator of synaptic transmission." They wrote that the findings "will certainly revive the flagging hopes of zincologists. This work provides a clear demonstration that interfering with zinc modulation of a synaptic pathway leads to a significant alteration in the phenotype of the animal." The three scientists added that the finding "puts a nice dent in the zinc armor, which held firm for more than 50 years."


Hirzel et al.: "Hyperekplexia Phenotype of Glycine Receptor α1 Subunit Mutant Mice Identifies Zn2+ as an Essential Endogenous Modulator of Glycinergic Neurotransmission." Neuron 52, 679-690, November 22, 2006. DOI 10.1016/j.neuron.2006.09.035

Monday 16 April 2012

Exposure to speech sounds is the basis of speech production.

This is why, in the Snowdrop programme, there are activities designed to give children the maximum exposure to speech sounds.  As I say to every family, there is a link in the brain between exposure to language and language production.
-------------------------------------
Experience, as the old saying goes, is the best teacher. And experience seems to play an important early role in how infants learn to understand and produce language. 

Using new technology that measures the magnetic field generated by the activation of neurons in the brain, researchers tracked what appears to be a link between the listening and speaking areas of the brain in newborn, 6-month-old and one-year-old infants, before infants can speak. 

The study, which appears in this month's issue of the journal NeuroReport, shows that Broca's area, located in the front of the left hemisphere of the brain, is gradually activated during an infant's initial year of life, according to Toshiaki Imada, lead author of the paper and a research professor at the University of Washington's Institute for Learning and Brain Sciences.

Broca's area has long been identified as the seat of speech production and, more recently, as that of social cognition, and is critical to language and reading, according to Patricia Kuhl, co-author of the study and co-director of the UW's Institute for Learning and Brain Sciences.

"Magnetoencephalography is perfectly non-invasive and measures the magnetic field generated by neurons in the brain responding to sensory information that then 'leaks' through the skull," said Imada, one of the world's experts in the uses of magnetoencephalography to study the brain. 

Kuhl said there is a long history of a link in the adult brain between the areas responsible for understanding and those responsible for speaking language. The link allows children to mimic the speech patterns they hear when they are very young. That's why people from Brooklyn speak "Brooklynese," she said. 

"We think the connection between perception and production of speech gets formed by experience, and we are trying to determine when and how babies do it," said Kuhl, who also is a professor of speech and hearing sciences. 

The study involved 43 infants in Finland - 18 newborns, 17 six-month-olds and 8 one-year-olds. Special hardware and software developed for this study allowed the infants' brain activity to be monitored even if they moved, and captured brain activation with millisecond precision.

The babies were exposed to three kinds of sounds through earphones - pure tones that do not resemble speech, like notes played on a piano; a three-tone harmonic chord that resembles speech; and two Finnish syllables, "pa" and "ta." The researchers collected magnetic data only from the left hemisphere of the brain among the newborns because they cannot sit up and the magnetoencephalography cap was too big to fit their heads securely.

At all three ages the infants showed activation in the auditory areas in the temporal part of the brain that are responsible for listening to and understanding speech, showing they were able to detect sound changes for all three stimuli. But the pure perception of sound did not activate the areas of the brain responsible for speaking. However, researchers began seeing some activation in Broca's area when the 6-month-old infants heard the syllables or harmonic chords. By the time the infants were one year old, the speech stimuli activated Broca's area simultaneously with the auditory areas, indicating "cross-talk" between the area of the brain that hears language and the area that produces language, according to Kuhl.

"We think that early in development babies need to play with sounds, just as they play with their hands. And that helps them map relationships between sounds with the movements of their mouth and tongue," she said. "To master a skill, babies have to play and practice just as they later will in learning how to throw a baseball or ride a bike. Babies form brain connections by listening to themselves and linking what they hear to what they did to cause the sounds. Eventually they will use this skill to mimic speakers in their environments." 

This playing with language starts, Kuhl said, when babies begin cooing around 12 weeks of age and begin babbling around seven months of age. 

"They are cooing and babbling before they know how to link their mouth and tongue movements. This brain connection between perception and production requires experience," she said.