"Somewhere, something incredible is waiting to be known."
- Carl Sagan

Monday, February 28, 2011

Studies Confirm That Serotonin Plays A Role In Many Autism Cases


Mouse models are yielding important clues about the nature of autism spectrum disorders, which impact an estimated one in 110 children in the U.S. In labs at the UT Health Science Center San Antonio, researchers are studying strains of mice that inherently mimic the repetitive and socially impaired behaviors present in these disorders. Georgianna Gould, Ph.D., research assistant professor of physiology in the Graduate School of Biomedical Sciences, is eyeing the role that serotonin plays in autism spectrum disorders.
Serotonin is known for giving a sense of well-being and happiness. It is a neurotransmitter, a chemical that acts like a radio tower in the brain conveying signals among cells called neurons. Thirty percent of autism cases may have a serotonin component. In a recent paper in the Journal of Neurochemistry, Dr. Gould and colleagues showed that a medication called buspirone improved the social behaviors of mice. Buspirone is approved by the U.S. Food and Drug Administration for use in adults as an anti-anxiety and antidepressant adjuvant medication.

Some genetic variations result in diminished transmission of serotonin between neurons. Buspirone increased transmission by partially mimicking the effects of serotonin at cellular sites called receptors.

Reactions to a newly encountered mouse
Social interaction behaviors of the mice were measured by placing them in a three-chamber social interaction test and positioning a "stranger" mouse in one of the chambers. Buspirone-treated mice spent more time in the chamber with the stranger mouse than untreated mice and more time sniffing the stranger. "No animal model is completely characteristic of humans, and we're far from saying that buspirone is a treatment for behaviors of autistic people," Dr. Gould said. "But this does offer further proof that serotonin is involved in a significant proportion of autism cases."

Support from the San Antonio Area Foundation made the project possible. Co-authors of the journal article are Julie Hensler, Ph.D., and Teri Frosto Burke, M.S., of the pharmacology department at the Health Science Center; Lynette Daws, Ph.D., of the university's physiology department in whose lab the work was conducted; and Robert Benno, Ph.D., and Emmanuel Onaivi, Ph.D., of the biology department at William Paterson University in Wayne, N.J.

2nd serotonin-related avenue

Dr. Gould now plans to study the impact of a diet rich in the amino acid tryptophan on the social behavior of the mice. Tryptophan is a biochemical precursor of serotonin, meaning the body converts it into serotonin during metabolism. Foods such as turkey are rich in tryptophan.

"We are going to supplement the diet of mice with tryptophan to see if behavior improves, and also reduce it to see if behavior worsens," Dr. Gould said. The future study of tryptophan is funded by the Morrison Trust, a San Antonio trust that lists nutrition as one of its topics of interest.

Source: University of Texas Health Science Center at San Antonio



Saturday, February 26, 2011

Brain areas for depression

Overlooked Brain Area Is An Important Locus Of Depression


A team of neuroscientists at Cold Spring Harbor Laboratory (CSHL), Brookhaven National Laboratory (BNL) and UC San Diego (UCSD) has collected evidence suggesting that a previously overlooked portion of the brain could be a prime locus of human depression. In two rat models of human depression, the scientists have demonstrated that neurons in a tiny area in the central brain called the lateral habenula (LHb) are hyperactive.
Specifically, as the team reports today online ahead of print in the journal Nature, excitatory synaptic inputs onto neurons in the LHb are enhanced in "depressed" animals, a finding they regard as significant because this excitation in turn causes the inhibition of "downstream targets" - including neurons in a part of the brain called the ventral tegmental area (VTA), which is important in the brain's reward system and heavily populated by dopamine neurons.

Furthermore, the team, which includes Professor Fritz Henn of CSHL and BNL and Assistant Professor Bo Li of CSHL, as well as Professor Roberto Malinow of UCSD, was able to use an analog of deep brain stimulation (DBS), a novel form of electrical stimulation involving the implantation of electrodes into a specific brain area, to reverse depression-like symptoms in the rats.

DBS is an important new experimental treatment for refractory depression in people, as well as a potentially important approach to other neurophysiological disorders, most notably Parkinson's disease. The team's results point to the LHb as a potential therapeutic target for DBS. Ongoing depression experiments at other laboratories, which have shown promise in a small number of human patients, have used DBS to target an area of the cingulate cortex called Brodmann Area 25. Henn and colleagues in Germany last year reported success in treating one case of intractable human depression with DBS targeting the LHb.

The animal models used in the team's rat experiments displayed a behavior called learned helplessness. The rats were stressed in ways that were unpredictable and inescapable; over time, they developed depression-like symptoms, notably a lack of motivation to evade unpleasant stress. "It's not a perfect model of human depression by any means," says Li, "but it is very valuable, for it does enable us to study the neural mechanisms of certain aspects of depression in people."

The team's most important decision was to study the lateral habenula. "This is an area of the brain that has often been overlooked, perhaps because of its size," Li noted. "It covers an area only about 1-2 mm across." So far only two brain imaging studies have implicated the LHb in depression, because of the difficulty of resolving it with existing technologies such as PET and fMRI.



Friday, February 25, 2011

Close Friends Light Up Your Medial Prefrontal Cortex Brain Region

The title of this article may not catch on as a song lyric, but new neurobiology research shows that close friends evoke a stronger response in brain regions such as the medial prefrontal cortex than strangers do.

The research subjects' brain regions responded more to questions about their close friends than to questions about strangers with similar interests. The experiments suggest that, in some tasks relying on the medial prefrontal cortex, social closeness matters more than similarity of beliefs when evaluating others. The research is explained more thoroughly below.

Imaging study shows brain responds more to close friends

People’s brains are more responsive to friends than to strangers, even if the stranger has more in common with them, according to a study in the Oct. 13 issue of the Journal of Neuroscience. Researchers examined a brain region known to be involved in processing social information, and the results suggest that social alliances outweigh shared interests.

In a study led by graduate student Fenna Krienen and senior author Randy Buckner, PhD, of Harvard University, researchers investigated how the medial prefrontal cortex and associated brain regions signal someone’s value in a social situation. Previous work has shown that perceptions of others’ beliefs guide social interactions. Krienen and her colleagues wondered whether these brain regions respond more to those we know, or to those with whom we share similar interests.
“There are psychological and evolutionary arguments for the idea that the social factors of ‘similarity’ and ‘closeness’ could get privileged treatment in the brain; for example, to identify insiders versus outsiders or kin versus non-kin,” Krienen said. “However, these results suggest that social closeness is the primary factor, rather than social similarity, as previously assumed.”

The researchers first imaged the brain activity of 32 participants as they judged how well lists of adjectives described their personalities. This helped to identify brain regions that respond to personally relevant information. In separate experiments, 66 different participants provided personality information about themselves and two friends — one friend whom they believed had similar preferences and one believed to be dissimilar.
The authors made up biographies of similar and dissimilar strangers for each volunteer based on their personality profiles. Then, while in a scanner, participants played a game similar to the TV show “The Newlywed Game,” in which they predicted how another person would answer a question. For example, would a friend or stranger prefer an aisle or window seat on a flight? The authors found that activity in the medial prefrontal cortex increased when people answered questions about friends. Notably, whether the person had common interests made no difference in brain response.
“In all experiments, closeness but not similarity appeared to drive responses in medial prefrontal regions and associated regions throughout the brain,” Krienen said. “The results suggest social closeness is more important than shared beliefs when evaluating others.” Read Montague, PhD, of Baylor College of Medicine, an expert on decision-making and computational neuroscience, said the study’s large number of participants and experimental approach make it a solid contribution to the field. “The authors address an important component of social cognition: the relevance of people close to us,” Montague said.

The research was supported by the National Institute on Aging, the Howard Hughes Medical Institute, the Simons Foundation, the U.S. Department of Defense, and an Ashford Graduate Fellowship in the Sciences.

Contact: Kat Snodgrass, Society for Neuroscience; Fenna Krienen and Randy Buckner, Harvard University

Source: Society for Neuroscience

Wednesday, February 23, 2011

First Clinical Trial of Human Embryonic Stem Cell Therapy in the World Begins

Human embryonic stem cell therapy is being tried on a human for the first time in a new clinical trial, the first of its kind in the world. The first patient is reported to be at an Atlanta spinal cord and brain injury rehabilitation hospital. The patient’s injuries were between 7 and 14 days old when the first injections were given. To take part in the study, the patient had to have suffered a spinal cord injury that resulted in paralysis from the chest down.

This patient has been injected with cells derived from human embryonic stem cells obtained from a fertility clinic. Researchers are optimistic that this therapy will not only help alleviate the symptoms of the injury, but permanently repair the damage that caused the paralysis. This is a huge step for regenerative medicine, embryonic stem cell research, spinal cord and brain injury therapy, and science in general.

Geron Initiates Clinical Trial of Human Embryonic Stem Cell-Based Therapy

First Patient Treated at Shepherd Center in Atlanta

Geron Corporation today announced the enrollment of the first patient in the company’s clinical trial of human embryonic stem cell (hESC)-derived oligodendrocyte progenitor cells, GRNOPC1. The primary objective of this Phase I study is to assess the safety and tolerability of GRNOPC1 in patients with complete American Spinal Injury Association (ASIA) Impairment Scale grade A thoracic spinal cord injuries. Participants in the study must be newly injured and receive GRNOPC1 within 14 days of the injury.
The patient was enrolled at Shepherd Center, a 132-bed spinal cord and brain injury rehabilitation hospital and clinical research center in Atlanta, GA. Shepherd Center is one of seven potential sites in the United States that may enroll patients in the clinical trial.
“Initiating the GRNOPC1 clinical trial is a milestone for the field of human embryonic stem cell-based therapies,” said Thomas B. Okarma, Ph.D., M.D., Geron’s president and CEO. “When we started working with hESCs in 1999, many predicted that it would be a number of decades before a cell therapy would be approved for human clinical trials. This accomplishment results from extensive research and development and a succession of inventive steps to enable production of cGMP master cell banks, scalable manufacture of differentiated cell product, and preclinical studies in vitro and in animal models of spinal cord injury, leading to concurrence by the FDA to initiate the clinical trial.”

“We are pleased to have our patients participating in this exciting research,” said Donald Peck Leslie, M.D., medical director, Shepherd Center. “Our medical staff will evaluate the patients’ progress as part of this study. We look forward to participating in clinical trials that may help people with spinal cord injury.” David Apple, M.D., Shepherd Center’s medical director emeritus and principal investigator of the trial at Shepherd Center, said, “This clinical trial represents another step forward in Shepherd Center’s involvement in the attempt to find a cure for paralysis in people with spinal cord injury. Shepherd Center is an ideal place to conduct this study because of our clinical expertise and the volume of patients referred here for rehabilitation care.”

In addition to Shepherd Center, Northwestern Medicine in Chicago, IL is also open for patient enrollment. As additional trial sites come online and are ready to enroll patients, they will be listed on the Patient Information pages of Geron’s website and on the NIH clinical trials registry, ClinicalTrials.gov.
Further information on the criteria for patient eligibility for the study is also available on ClinicalTrials.gov.

About Spinal Cord Injury



Spinal Cord Injury (SCI) is caused by trauma to the spinal cord that results in a loss of such functions as locomotion, sensation or bowel/bladder control. A traumatic blow to the spine can fracture or dislocate vertebrae that may cause bone fragments or disc material to injure the nerve fibers and damage the glial cells that insulate the nerve fibers in the spinal cord. Most human SCIs are contusions (bruises) to the cord, rather than a severance of the nerve fibers. Every year approximately 12,000 people in the U.S. sustain spinal cord injuries. The most common causes are automobile accidents, falls, gunshot wounds and sports injuries.

About GRNOPC1
GRNOPC1 contains hESC-derived oligodendrocyte progenitor cells that have demonstrated remyelinating and nerve growth stimulating properties leading to restoration of function in animal models of acute spinal cord injury. Preclinical studies showed that administration of GRNOPC1 significantly improved locomotor activity and kinematic scores of animals with spinal cord injuries when injected seven days after the injury. Histological examination of the injured spinal cords treated with GRNOPC1 showed improved axon survival and extensive remyelination surrounding the rat axons. For more information about GRNOPC1, visit www.geron.com/GRNOPC1Trial/.

Sources: CNN, Geron



Tuesday, February 22, 2011

Brain can learn to overcome sleep apnea: U of T scientist


From sound science to sound sleep

By Mike Kennedy

New research from U of T could provide some restful nights for the 18 million North Americans who suffer from obstructive sleep apnea. In a recent study that appeared in the Journal of Neuroscience, U of T scientists demonstrated that repeated obstruction of the airways triggers a form of learning that requires release of the brain chemical noradrenaline. The release of this chemical helps the brain learn to breathe more effectively and purposefully.

“What we showed is that repeated disruption of normal lung activity - what happens during sleep apnea - triggers a form of learning that helps you breathe better. This type of brain plasticity could be harnessed to help overcome the breathing insufficiency that typifies sleep apnea,” said Professor John Peever of cell and systems biology, lead author of the study.

To mimic the experience of severe sleep apnea, the scientists induced short, 15-second apneas in sedated rats by repeatedly restricting airflow into the lungs. They found that repeated apneas caused the brain to trigger progressively more forceful contraction of the respiratory muscles, producing an increase in breathing that lasted for over an hour.
Peever said it seems the brain is using the unwanted side-effects of sleep apnea to help it learn to prevent future apneas by increasing the depth of breathing. This study also pinpointed the brain chemical that allows this type of plasticity to occur. They found that noradrenaline is required in the case of repeated apneas to cause brain plasticity and enhance breathing.
These findings are important because they suggest that manipulating noradrenaline levels in the brain with common drugs could also help improve breathing in patients suffering from sleep apnea. This work could serve as the basis for developing a long-sought pill for sleep apnea.







Monday, February 21, 2011

Moderate Exercise Enhances Connectivity in Brain Circuits

A new study published in Frontiers in Aging Neuroscience shows that moderate exercise can enhance connectivity in brain circuits. Exercise can also help improve cognition and combat the decline in brain function associated with aging.

Attention, couch potatoes! Walking boosts brain connectivity, function

A group of “professional couch potatoes,” as one researcher described them, has shown that even moderate exercise - in this case, walking at one’s own pace for 40 minutes three times a week - can enhance the connectivity of important brain circuits, combat declines in brain function associated with aging and increase performance on cognitive tasks.

The study, in Frontiers in Aging Neuroscience, followed 65 adults, aged 59 to 80, who joined a walking group or stretching and toning group for a year. All of the participants were sedentary before the study, reporting less than two episodes of physical activity lasting 30 minutes or more in the previous six months. The researchers also measured brain activity in 32 younger (18- to 35-year-old) adults.

Rather than focusing on specific brain structures, the study looked at activity in brain regions that function together as networks. “Almost nothing in the brain gets done by one area; it’s more of a circuit,” said University of Illinois psychology professor and Beckman Institute Director Art Kramer, who led the study with kinesiology and community health professor Edward McAuley and doctoral student Michelle Voss. “These networks can become more or less connected. In general, as we get older, they become less connected, so we were interested in the effects of fitness on connectivity of brain networks that show the most dysfunction with age.”

Neuroscientists have identified several distinct brain circuits. Perhaps the most intriguing is the default mode network (DMN), which dominates brain activity when a person is least engaged with the outside world - either passively observing something or simply daydreaming. Previous studies found that a loss of coordination in the DMN is a common symptom of aging and in extreme cases can be a marker of disease, Voss said.

“For example, people with Alzheimer’s disease tend to have less activity in the default mode network and they tend to have less connectivity,” she said. Low connectivity means that the different parts of the circuit are not operating in sync. Like poorly trained athletes on a rowing team, the brain regions that make up the circuit lack coordination and so do not function at optimal efficiency or speed, Voss said.

In a healthy young brain, activity in the DMN quickly diminishes when a person engages in an activity that requires focus on the external environment. Older people, people with Alzheimer’s disease and those who are schizophrenic have more difficulty “down-regulating” the DMN so that other brain networks can come to the fore, Kramer said.
A recent study by Kramer, Voss and their colleagues found that older adults who are more fit tend to have better connectivity in specific regions of the DMN than their sedentary peers. Those with more connectivity in the DMN also tend to be better at planning, prioritizing, strategizing and multi-tasking.

The new study used functional magnetic resonance imaging (fMRI) to determine whether aerobic activity increased connectivity in the DMN or other brain networks. The researchers measured participants’ brain connectivity and performance on cognitive tasks at the beginning of the study, at six months and after a year of either walking or toning and stretching.

At the end of the year, DMN connectivity was significantly improved in the brains of the older walkers, but not in the stretching and toning group, the researchers report. The walkers also had increased connectivity in parts of another brain circuit (the fronto-executive network, which aids in the performance of complex tasks) and they did significantly better on cognitive tests than their toning and stretching peers.
Previous studies have found that aerobic exercise can enhance the function of specific brain structures, Kramer said. This study shows that even moderate aerobic exercise also improves the coordination of important brain networks.

“The higher the connectivity, the better the performance on some of these cognitive tasks, especially the ones we call executive control tasks: things like planning, scheduling, dealing with ambiguity, working memory and multitasking,” Kramer said. These are the very skills that tend to decline with aging, he said.

This study was supported by the National Institute on Aging at the National Institutes of Health.
Editor’s note: To contact Art Kramer, call 217-244-8373; e-mail a-kramer@illinois.edu. To reach Michelle Voss, call 217-244-4461; e-mail mvoss@illinois.edu.

The paper, “Plasticity of Brain Networks in a Randomized Intervention Trial of Exercise Training in Older Adults,” is available online: http://frontiersin.org/Aging_Neuroscience/10.3389/fnagi.2010.00032/abstract

Contact: Diana Yates - Source: University of Illinois at Urbana-Champaign



Friday, February 18, 2011

First-Hand Experience of a Major Stroke

JNS Article Analyzes The Role Of Helmets In Reducing Skull Fractures Incurred By Children In Skiing And Snowboarding Accidents


A compelling clinical article published online in the March 2011 issue of the Journal of Neurosurgery: Pediatrics, entitled "Helmet Use Reduces Skull Fractures in Skiers and Snowboarders Admitted to the Hospital," discusses skull fractures incurred by young skiers and snowboarders and the role helmets play in reducing these head injuries. Authors are Anand I. Rughani, MD, Chih Lin, MD, Michael A. Horgan, MD, Bruce I. Tranmer, MD, Ryan P. Jewell, MD (Division of Neurosurgery, University of Vermont, Burlington, Vt.); William J. Ares, BS (College of Medicine, University of Vermont, Burlington, Vt.); Deborah A. Cushing, MPH, and Jeffrey E. Florman, MD (Neurosciences Institute, Maine Medical Center, Portland, Maine).

Severe head trauma is the most frequent cause of death and severe disability in skiers and snowboarders and accounts for about 15 percent of all skiing- and snowboarding-related injuries. Although helmet use is apparently increasing, it remains far from universal. The authors cite a large survey of skiers and snowboarders of all ages in the western United States and Canada, published in 2003, which indicated that 12.1 percent wore helmets, and a more recent 2008 study indicating that as many as 42 percent of children in the state of New York wore helmets. At present, no state laws mandate helmets for skiers or snowboarders.

The role of protective helmets in reducing ski and snowboard injuries is a matter of active epidemiological research and some debate. Several large observational surveys collectively suggest that the use of helmets reduces the need to be evacuated by ambulance or visit a hospital. A recent meta-analysis concludes that helmet use reduces head injuries by 35 percent, and another recent meta-analysis suggests head injury reductions ranging from 15-60 percent. However, none of these studies have established whether or not helmet use in skiers and snowboarders reduces the incidence of head injuries seen on CT scans. The present study is the first to analyze head injury patterns sustained by helmeted versus unhelmeted skiers and snowboarders under the age of 21, as confirmed on CT scans.

The authors reviewed data on head-injured skiers and snowboarders treated at two level 1 trauma centers in New England from 2003 to 2009, focusing on 57 children and young adults (age 21 and younger). The primary endpoints of interest were CT findings that included epidural hematoma, subdural hematoma, other traumatic intracranial hemorrhage, and skull fractures. The secondary endpoints of interest were the presence of cervical spine injury, the need for a neurosurgical procedure, the admission location, length of hospital stay, discharge location, and incidence of death. Noteworthy results culled from this in-depth analysis:

- Helmet usage: 19 helmeted (33.3 percent), 38 unhelmeted (66.7 percent)

- Helmet usage by sport: 30.8 percent skiers, 35.5 percent snowboarders

- Skull Fractures: 5.2 percent of helmeted patients suffered skull fractures versus 36.8 percent of unhelmeted patients.

- Helmeted fracture patterns: 1 non-depressed skull fracture

- Unhelmeted fracture patterns: 14 skull fractures, 8 of which were depressed
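
A quick back-of-the-envelope check of those figures (my own arithmetic on the counts reported above, not the authors' statistical analysis) shows the size of the association:

```python
# Illustrative arithmetic on the counts reported above (1 fracture among 19
# helmeted patients; 14 fractures among 38 unhelmeted patients). This is a
# reader's sanity check, not the study's own statistical analysis.
helmeted_fractures, helmeted_total = 1, 19
unhelmeted_fractures, unhelmeted_total = 14, 38

risk_helmeted = helmeted_fractures / helmeted_total        # ~0.053 (reported as 5.2%)
risk_unhelmeted = unhelmeted_fractures / unhelmeted_total  # ~0.368 (36.8%)
relative_risk = risk_helmeted / risk_unhelmeted

print(f"Helmeted fracture risk:   {risk_helmeted:.1%}")
print(f"Unhelmeted fracture risk: {risk_unhelmeted:.1%}")
print(f"Relative risk:            {relative_risk:.2f}")    # ~0.14
```

In other words, taking the reported counts at face value, helmeted patients in this sample had roughly one-seventh the skull-fracture risk of unhelmeted patients.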


The authors cite several studies that attributed a majority of skiing fatalities to head injury:

- Utah study: 88.9 percent of fatal injuries attributed to head injury

- Vermont study: 87.5 percent of fatal injuries attributed to head injury

- Alberta study: 80.0 percent of fatal injuries attributed to head injury

- Switzerland study: 80.0 percent of fatal injuries attributed to head injury

A review of other studies shows compelling evidence that skull fractures sustained by children in skiing and snowboarding pose serious risk. A New Hampshire study reported that 71.1 percent of all children involved in a ski accident and admitted to the hospital had suffered a skull fracture. An analysis of 16 fatal ski injuries in Vermont from 1980-1986 revealed that of the 16 deaths, 14 patients had suffered head injuries and 13 of those were associated with skull fractures.

"We are able to show that helmets are associated with reduced skull fractures in skiers and snowboarders seen at the hospital. Given that skull fractures can be an indication of severe brain injury and sometimes associated with intracranial bleeding, a reduction in skull fractures is a compelling finding. Furthermore, we did not see any increase in the risk of cervical spine injuries as some might predict. Although not a focus of our work, other research has shown that helmet use in skiers and snowboarders does not increase risk-taking behavior. This work supports the protective role of helmets in skiers and snowboarders," said Dr. Rughani.

The authors report no conflict of interest.
Source: American Association of Neurological Surgeons (AANS)





Thursday, February 17, 2011

Alzheimer's and Human Tau

Alzheimer’s Symptoms Reversed in Mice with Human Tau Genes

Alzheimer’s disease research has led to important findings involving the tau gene and the possibility of reversing the disease’s progression.
The researchers used transgenic mice with two different human tau gene variants. One variant leads to tau proteins that can become entangled, the other leads to tau proteins that cannot.

The mice with tau genes that could not be entangled did not develop Alzheimer’s disease symptoms. Mice with tau genes that lead to entangled proteins did develop Alzheimer’s disease symptoms.
When the mice with Alzheimer’s disease symptoms had their tau genes switched off, the symptoms were reversed. Symptoms such as dementia, memory loss and reduction of synapses receded drastically within just a few weeks of the genes being switched off.
More animal studies are in the works to further investigate these findings. The ability to produce and reverse Alzheimer’s disease symptoms using human tau genes in mice is likely to be extremely beneficial to Alzheimer’s research, as mouse tau proteins don’t usually tangle. These results should help produce even better mouse models of human Alzheimer’s disease and help focus the search for better treatments.

This research is extremely promising to those interested in reversing Alzheimer’s disease in humans as it shows that reversal of many symptoms is possible in mouse models now.
Many more important details are presented in the release below.


Tau-induced memory loss in Alzheimer’s mice is reversible

Amyloid-beta and tau protein deposits in the brain are characteristic features of Alzheimer disease. The effect on the hippocampus, the area of the brain that plays a central role in learning and memory, is particularly severe. However, it appears that the toxic effect of tau protein is largely eliminated when the corresponding tau gene is switched off. Researchers from the Max Planck Research Unit for Structural Molecular Biology at DESY in Hamburg have succeeded in demonstrating that once the gene is deactivated, mice with a human tau gene, which previously presented symptoms of dementia, regain their ability to learn and remember, and that the synapses of the mice also reappear in part. The scientists are now testing active substances to prevent the formation of tau deposits in mice. This may help to reverse memory loss in the early stages of Alzheimer disease – in part, at least. (Journal of Neuroscience, February 16, 2011)

Whereas aggregated amyloid-beta protein forms insoluble clumps between the neurons, the tau protein accumulates inside them. Tau protein stabilises the tube-shaped fibers of the cytoskeleton, known as microtubules, which provide the “rails” for cellular transport. In Alzheimer disease, excess phosphate groups cause the tau protein to malfunction and form clumps (the ‘neurofibrillary tangles’). As a result, nutrient transport breaks down and the neurons and their synapses die off. This process is accompanied by the initial stage of memory loss.
Together with colleagues from Leuven, Hamburg and Erlangen, Eva and Eckhard Mandelkow’s team from the Max Planck Research Unit for Structural Molecular Biology generated regulatable transgenic mice with two different human tau gene variants that can be switched on and off again: one group was given a form of the protein that cannot become entangled (anti-aggregant), and a second was provided with the code for the strongly aggregating protein variant (pro-aggregant). The mice with the first form developed no Alzheimer symptoms; the rodents that were given the pro-aggregant tau developed the disease.

The scientists measured the mice’s memory loss with the help of a swimming test: healthy mice quickly learn how to find a life-saving platform located under the surface of the water in a water basin. In contrast, the transgenic animals, which carry the additional pro-aggregant tau gene, paddle aimlessly around the basin until they accidentally stumble on the platform; they require over four times as long to do this as their healthy counterparts. However, if the mutated toxic tau gene is switched off again, the mice learn to reach “dry land” with ease just a few weeks later. As a control, the mice with the anti-aggregant form of tau show no defects in learning, just like normal non-transgenic mice.
Surprising tissue results

Tissue tests showed that, as expected, no tau clumps had formed in the brains of the first group of mice expressing anti-aggregant tau. In the second group – the mice suffering from Alzheimer’s – co-aggregates from human tau and “mouse tau” were formed – against expectations, because tau protein from mice does not usually aggregate. “Even more astonishingly, weeks after the additional gene had been switched off, the aggregated human tau had dissolved again. However, the ‘mouse tau’ remained clumped. Despite this, the mice were able to learn and remember again,” says Eckhard Mandelkow. More precise tests revealed that new synapses had actually formed in their brains.

The scientists concluded from this that mutated or pathological tau can alter healthy tau. It appears that pro-aggregant tau can act like a crystal nucleus - once it has started to clump, it drags neighboring “healthy” tau into the clumps as well. This is what makes the process so toxic to the neurons. “The really important discovery here, however, is that the progression of Alzheimer’s disease can be reversed in principle - at least at an early stage of the illness, before too many neurons have been destroyed,” explains Eva Mandelkow who, together with her husband, will be awarded the Potamkin Prize 2011 for Alzheimer’s disease research, which is sponsored by the American Academy of Neurology.

The aggregation of tau proteins, however, cannot simply be switched off in humans the way it can in the transgenic mice. Nevertheless, special substances exist that could dissolve the tau aggregates. By screening 200,000 substances, the Hamburg researchers have already identified several classes of active substances that could re-convert the tau aggregates into soluble tau. These are now being tested on animals.

Details about this article:

Original Article: Tau-induced Defects in Synaptic Plasticity, Learning and Memory are reversible in Transgenic Mice after Switching off the Toxic Tau Mutant

Astrid Sydow, Ann Van der Jeugd, Fang Zheng, Tariq Ahmed, Detlef Balschun, Olga Petrova, Dagmar Drexler, Lepu Zhou, Gabriele Rune, Eckhard Mandelkow, Rudi D’Hooge, Christian Alzheimer, Eva-Maria Mandelkow

Journal of Neuroscience, February 16, 2011
Contacts: Dr. Eva-Maria Mandelkow & Prof. Dr. Eckhard Mandelkow - Max Planck Research Unit for Structural Molecular Biology at DESY

Source: Max Planck



Tuesday, February 15, 2011

Parkinson's Hope: Stem Cells Delivered in Nasal Spray

Stem Cells Delivered in Nasal Spray Ease Parkinson’s Disease Symptoms in Rats

Scientists have shown that stem cells delivered via a nasal spray improve motor function in rats with Parkinson’s disease-like symptoms.


Mesenchymal stem cells sprayed into the rats' noses migrated to the brain and survived for at least six months. Dopamine levels increased in previously damaged areas, and motor function improved to up to 68% of normal in the stem cell-treated rats.

A nasal spray delivery system for stem cells could help avoid problems related to surgical implantation of stem cells. This new method could also make repeated stem cell treatments much safer.

More general information is provided in the release below.

Dramatic Improvement in Parkinson Disease Symptoms Following Intranasal Delivery of Stem Cells to Rat Brains

Successful intranasal delivery of stem cells to the brains of rats with Parkinson disease yielded significant improvement in motor function and reversed the dopamine deficiency characteristic of the disease. These highly promising findings, reported in Rejuvenation Research, a peer-reviewed journal published by Mary Ann Liebert, Inc., highlight the potential for a noninvasive approach to cell therapy delivery in Parkinson disease - a safer and effective alternative to surgical transplantation of stem cells. The article is available free online.

In this groundbreaking study, mesenchymal stem cells (MSCs) delivered via the nose preferentially migrated to the brain and were able to survive for at least 6 months. Substantial improvement in motor function - up to 68% of normal - was reported in the MSC-treated rat model of Parkinson disease. Levels of the neurotransmitter dopamine were significantly higher in affected rat brain regions exposed to MSCs compared to non-treated brain regions, reported Lusine Danielyan and an international team of researchers from University Hospital of Tübingen, University of Göttingen Medical School, and University of Tübingen (Stuttgart, Germany); HealthPartners Research Foundation (St. Paul, MN); German University in Cairo (Egypt); Harvard University (Cambridge, MA); Institute of Molecular Biology NAS RA (Yerevan, Armenia); and Geneva University Hospital (Switzerland).

The authors present their findings in the article, “Therapeutic Efficacy of Intranasally Delivered Mesenchymal Stem Cells in a Rat Model of Parkinson Disease.” They explain that intranasal delivery of MSCs avoids the tissue trauma and related inflammation and brain swelling associated with surgical implantation of therapeutic stem cells. Importantly, this noninvasive delivery method would also make it possible to provide repeated stem cell treatments over time.

Rejuvenation Research, the Official Journal of the European Society of Preventive, Regenerative and Anti-Aging Medicine (ESAAM) and the World Federation & World Virtual Institute of Preventive & Regenerative Medicine (PYRAMED), is an authoritative, peer-reviewed journal published bimonthly in print and online. Led by Editor-in-Chief Aubrey D.N.J. de Grey, PhD, SENS Foundation, Cambridge, UK, the Journal publishes cutting-edge work on the development of rejuvenation therapies in the laboratory and clinic and explores the molecular and cellular mechanisms behind these novel therapeutic approaches.

Contact: Vicki Cohn - Mary Ann Liebert, Inc., Publishers

Source: Mary Ann Liebert, Inc., Publishers



Monday, February 14, 2011

Heart and Brain for Valentine's Day

JPEG For The Mind: How The Brain Compresses Visual Information

Most of us are familiar with the idea of image compression in computers. File extensions like ".jpg" or ".png" signify that millions of pixel values have been compressed into a more efficient format, reducing file size by a factor of 10 or more with little or no apparent change in image quality. The full set of original pixel values would occupy too much space in computer memory and take too long to transmit across networks.
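
As a concrete, if toy, illustration of those ratios, the Python sketch below saves a synthetic one-megapixel image as JPEG and compares the result with the uncompressed pixel count. This is my own example, not anything from the study; the Pillow and NumPy packages are assumed to be installed, and the exact factor depends entirely on image content.

```python
# Toy demonstration of lossy image compression, assuming Pillow and NumPy
# are installed (pip install Pillow numpy). Not from the study itself.
import io

import numpy as np
from PIL import Image

# A smooth synthetic 1-megapixel grayscale image stands in for a photograph.
x = np.linspace(0.0, 1.0, 1000)
pixels = (255 * np.outer(x, x)).astype(np.uint8)
img = Image.fromarray(pixels)  # mode "L": one byte per pixel

raw_bytes = pixels.size  # 1,000,000 uncompressed pixel values
buf = io.BytesIO()
img.save(buf, format="JPEG", quality=75)
jpeg_bytes = buf.getbuffer().nbytes

print(f"Uncompressed: {raw_bytes:,} bytes")
print(f"JPEG:         {jpeg_bytes:,} bytes")
print(f"Factor:       {raw_bytes / jpeg_bytes:.0f}x")  # content-dependent
```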
The brain is faced with a similar problem. The images captured by light-sensitive cells in the retina are on the order of a megapixel. The brain does not have the transmission or memory capacity to deal with a lifetime of megapixel images. Instead, the brain must select out only the most vital information for understanding the visual world.
In the online issue of Current Biology, a Johns Hopkins team led by neuroscientists Ed Connor and Kechen Zhang describes what appears to be the next step in understanding how the brain compresses visual information down to the essentials.
They found that cells in area "V4," a midlevel stage in the primate brain's object vision pathway, are highly selective for image regions containing acute curvature. Experiments by doctoral student Eric Carlson showed that V4 cells are very responsive to sharply curved or angled edges, and much less responsive to flat edges or shallow curves.
To understand how selectivity for acute curvature might help with compression of visual information, co-author Russell Rasquinha (now at University of Toronto) created a computer model of hundreds of V4-like cells, training them on thousands of natural object images. After training, each image evoked responses from a large proportion of the virtual V4 cells - the opposite of a compressed format. And, somewhat surprisingly, these virtual V4 cells responded mostly to flat edges and shallow curvatures, just the opposite of what was observed for real V4 cells.

The results were quite different when the model was trained to limit the number of virtual V4 cells responding to each image. As this limit on responsive cells was tightened, the selectivity of the cells shifted from shallow to acute curvature. The tightest limit produced an eight-fold decrease in the number of cells responding to each image, comparable to the file size reduction achieved by compressing photographs into the .jpeg format. At this level, the computer model produced the same strong bias toward high curvature observed in the real V4 cells.
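
The press release does not include the model's code, but the core manipulation - capping how many model cells may respond to each image - can be sketched in a few lines. The following is a generic "top-k" response constraint of my own devising, purely illustrative and far simpler than the actual trained network:

```python
# Generic "top-k" sparsity constraint: for each image, keep only the k most
# strongly driven model units and silence the rest. Illustrative only; the
# study's actual model and training procedure are not reproduced here.
import numpy as np

def k_sparse_responses(drive: np.ndarray, k: int) -> np.ndarray:
    """Zero out all but the k largest responses in each row (image)."""
    out = np.zeros_like(drive)
    topk = np.argsort(drive, axis=1)[:, -k:]   # indices of the k winners
    rows = np.arange(drive.shape[0])[:, None]
    out[rows, topk] = drive[rows, topk]
    return out

rng = np.random.default_rng(0)
images = rng.standard_normal((5, 64))      # 5 toy image feature vectors
weights = rng.standard_normal((64, 100))   # 100 model V4-like units

dense = images @ weights                   # every unit responds
sparse = k_sparse_responses(images @ weights, k=12)

print("active units per image (dense): ", (dense != 0).sum(axis=1))   # 100
print("active units per image (sparse):", (sparse != 0).sum(axis=1))  # 12
```

Under a cap like this, units tuned to common features exhaust the limited response budget, so training tends to push selectivity toward rarer features - consistent with the explanation in the next paragraph.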
Why would focusing on acute curvature regions produce such savings? Because, as the group's analyses showed, high-curvature regions are relatively rare in natural objects, compared to flat and shallow curvature. Responding to rare features rather than common features is automatically economical. Despite the fact that they are relatively rare, high-curvature regions are very useful for distinguishing and recognizing objects, said Connor, a professor in the Solomon H. Snyder Department of Neuroscience in the School of Medicine, and director of the Zanvyl Krieger Mind/Brain Institute.

"Psychological experiments have shown that subjects can still recognize line drawings of objects when flat edges are erased. But erasing angles and other regions of high curvature makes recognition difficult," he explained Brain mechanisms such as the V4 coding scheme described by Connor and colleagues help explain why we are all visual geniuses.

"Computers can beat us at math and chess," said Connor, "but they can't match our ability to distinguish, recognize, understand, remember, and manipulate the objects that make up our world." This core human ability depends in part on condensing visual information to a tractable level. For now, at least, the brain format seems to be the best compression algorithm around.

Source: Lisa DeNike - Johns Hopkins University

Saturday, February 12, 2011

Diet Pop and Strokes


A study just presented at the American Stroke Assn.’s International Stroke Conference reported a link between the amount of diet soda someone drinks and the risk of having a stroke or heart attack.

Here’s the outline of the study, which was started in 2003, and what it found:

A total of 2,564 people in the study were asked about their intake of sodas (among other questions) at the start of the study. After nine years, 559 cardiovascular events had occurred, and those who had reported drinking diet soda every day had a 60% higher rate of these events, which included various forms of stroke as well as heart attacks.

The scientists adjusted for certain factors, such as age, sex, race, smoking, exercise, alcohol and daily calories. When they added additional factors to do with heart disease risk, such as metabolic syndrome, the risk was still 48% higher for the daily-diet-soda-drinking group. (Metabolic syndrome is a group of factors that can include extra weight around the waist and the inability to efficiently process blood sugar.)

“If our results are confirmed with future studies, then it would suggest that diet soda may not be the optimal substitute for sugar-sweetened beverages for protection against vascular outcomes,” noted the study lead author, Hannah Gardener of the University of Miami School of Medicine, in a release from the stroke association. She made that point also in news reports, such as one written up by WebMD.

It’s worth noting, as some scientists did, that this is a link, not proof of cause and effect. After all, there are many things that people who slurp diet sodas every day are apt to do – like eat a lousy diet — and not all of these can be adjusted for, no matter how hard researchers try. Maybe those other factors are responsible for the stroke and heart attack risk, not the diet drinks. (Those who drink daily soda of any stripe, diet or otherwise, are probably not the most healthful among us.)

Here’s one comment from Dr. Steven Greenberg, a professor of neurology at Harvard Medical School and vice-chairman of the stroke-meeting conference committee: “You try to control for everything but you can’t.” (It’s from the report at WebMD.)

Second, the link depended on self-reported food intake - and self-reports are not always reliable.

Third, the self-report was at the start of the study. Habits do change over time.

This isn't the first report suggesting a potential link to heart health, however, as fellow Times blogger Karen Kaplan noted in an item in January. The famous Framingham heart study has also reported a link between people who drank sodas -- diet or otherwise -- and the risk for metabolic syndrome. I guess we'll learn more as the months and years unfold -- if future studies can tease apart all the lifestyle factors that may be linked to a soda-guzzling habit.

In the meantime, water doesn't prime us to expect cloying sweetness with every mouthful. And sodas -- diet or otherwise -- can certainly do that.

Copyright © 2011, Los Angeles Times

Thursday, February 3, 2011

War on bacteria

Frisky bacteria: war on drugs revealed

By James Gallagher

Health reporter, BBC News

Ever since medicine declared war on bacteria with the discovery of penicillin, the two have been locked in an arms race. Antibiotics are met by resistance from germs; so researchers develop new drugs, and germs become resistant again. Now some scientists believe genetics will be the new weapon in the fight, with doctors consulting bacterial genomes when treating disease. This week a team at the Wellcome Trust Sanger Institute published a paper in the journal Science, which they say shows the first genetic picture of the evolutionary war between medicine and bacteria.

"Potentially every time someone is ill we could isolate the genome of a bacterial infection” Bacterial genetics can be tricky. With humans, one person's DNA is passed on to their children, then to their children, and so on down the family tree.
Bacteria are altogether more frisky.

They pass DNA on to their descendants when they divide in two, but they also swap DNA with other bacteria, changing their genetic code. It is like popping to the shop and swapping eye colour with someone at the checkout. This study has managed to tease out the differences between the two ways of passing on DNA in Streptococcus pneumoniae and draw its family tree.

From the lab to the hospital

The researchers were able to show how the species responded to different antibiotics, how and where it became resistant, and how the resistance spread around the world. It is the first time the whole of a genome has been studied to measure the genetic response to medicine. Other studies have come to some of the same conclusions, but as a review in the same journal said: "Suggesting that we knew all this before however misses the importance of their study, in which a single experiment provided more information than has been achieved over 15 years of research."

Studying the whole of a genome is getting cheaper and Dr Stephen Bentley, from the Sanger Institute, believes it could change the way we treat illnesses. He told the BBC: "Potentially every time someone is ill we could isolate the genome of a bacterial infection, determine if it is resistant, how it will behave in humans and match it up to a database to monitor the spread of an outbreak."

Writing in Science, Professors Mark Enright and Brian Spratt reviewed the study: "The ease with which investigators can now obtain whole genomes of bacterial pathogens is opening up a number of questions that previously were impossible or difficult to address. One of these is how virulent or highly drug-resistant strains of bacterial pathogens spread within hospitals and nursing homes within a region."
Dr Bentley thinks pathogen genomics could become part of normal hospital practice in five to 10 years' time.

Artificial blood vessels kept in fridge until heart operation

By James Gallagher, Health reporter, BBC News

US scientists believe they can produce a ready-made supply of blood vessels for use in heart bypass surgery. A study on baboons and dogs in Science Translational Medicine suggests the vessels could be stored for up to a year and used in any patient. Blood-carrying tubes can already be grown from a patient's own cells, but this takes several months.

UK experts said the research was exciting.

In coronary heart disease, the arteries which bring oxygen to the heart muscle narrow and become blocked. It is the UK's biggest killer. Every year 28,000 coronary artery bypasses are performed, in which blood vessels from other parts of the body are taken and used to "bypass" the blockage. It is not always possible to use the patient's own blood vessels, and several research groups are trying to create artificial ones.

"This study shows that bioengineering can be used to create a novel type of vascular graft that has the potential to improve outcomes for patients.” Professor Jeremy Pearson British Heart Foundation

The researchers, at the biotech firm Humacyte, the Brody School of Medicine at East Carolina University and Duke University Medical Centre in North Carolina, built an artificial tube-shaped scaffold and added human smooth muscle cells. As the cells grow they build their own scaffolding out of collagen, and the original structure breaks down. The researchers then used detergent to kill off the cells so the remaining collagen tube could be implanted in anyone without triggering an immune reaction.
The tubes can be stored for at least 12 months, and when used in baboons they were still allowing blood to flow normally after a six-month trial. It is the combination of storage and the fact that the vessels could be implanted into any patient that has the researchers excited.

They said: "Patients have no waiting period for graft production because the grafts have already been created and stored as opposed to custom made grafts for each patient that involve a lengthy waiting time."
Professor Laura Niklason, co-founder of Humacyte, told the BBC: "I think it really takes regenerative medicine to the next level. Normally you have to take cells and grow a tissue for one patient at a time; now we can do it on a mass scale. It's a game changer."

Professor John Hunt, of the UK Centre for Tissue Engineering, said: "It's very exciting; you just have to address the safety issues."
"It's a big leap from producing cell-based products for healthy animals for a short time to producing them for unhealthy humans for a lifetime. How do you ensure it lasts for 10 to 15 years, which would be a major advance?"
Professor Jeremy Pearson, associate medical director at the British Heart Foundation, said: "Not everyone is well enough to have a vein taken from another part of their body during heart surgery, so using synthetic veins can become an important part of a patient's treatment. However, sometimes even synthetic veins aren't suitable.
"This study shows that bioengineering can be used to create a novel type of vascular graft that has the potential to improve outcomes for patients. We look forward to the results of clinical trials designed to test this."The method of engineering blood vessel tissue which can be implanted into any patient could have other applications.Professor Niklason said: "It can be used for skin, ligaments, cartilage or other simple tissues where it is really the structure, not the cells, which provide the function."

The researchers hope to begin human trials on artificial blood vessels next year.

Monday, January 31, 2011

Brain anatomy

Douglas Adams and the Analog Clock

Author Douglas Adams famously made fun of earthlings for being “so amazingly primitive that they still think digital watches are a pretty neat idea.” Shortly before he died, Adams gave a talk at the University of California, Santa Barbara (not far from his home), at the end of which there was a brief question-and-answer session. A woman stood up and asked Adams the question that had been bothering her for decades: what did he have against digital watches? The crowd probably expected him to toss off a witty one-liner in response. Instead, he gave a very thoughtful answer that, in true Douglas Adams fashion, made ordinary human behavior seem self-evidently absurd.

After admitting that his comment had originally been written in the days when digital watches were themselves fairly primitive (and, ironically, required two hands to operate), Adams couched his complaint—appropriately—using an analogy. In the early days of personal computers, he said, people got very excited that their spreadsheet programs could finally create pie charts. This was considered a revolutionary advance, because as everyone knows, a pie chart visually represents a part-whole relationship in a way that is immediately obvious—a way that, to be more specific, mere columns of numbers did not. Well, the hands of an analog timepiece form wedges that look very much like a pie chart, and like a pie chart, they represent a sort of part-whole relationship in a way that requires a bare minimum of mental effort to comprehend. Not so digital timepieces, which for all their precision say nothing about the relationship of one time of day to another.

Ticked Off

Although digital watches have their place and are in no danger of becoming extinct, analog models have enjoyed a steady comeback over the past decade or two, and I think that’s just marvelous. Now, I’ve always liked digital watches, partly for the practical reason that they tend to be very accurate, and partly because the absence of moving parts strikes me as elegantly minimalist. But when I stopped to think about it, I realized that it is far less useful to know that it’s 10:13 than that it’s quarter past ten, and that when I read the time in a digital format, I nearly always have to perform an additional mental calculation to figure out what time it “really” is—that is, what that string of numbers actually means in terms of how much of the day has gone by or how much longer I can sleep before getting up to go to work. And, naturally, with very few exceptions, analog timepieces are much easier to set than their digital cousins.
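
To make the pie-chart analogy concrete, the little Python sketch below (my own illustration, not something from Adams) converts a digital time into the hand angles an analog dial would show:

```python
# Convert a digital time to analog hand angles, measured clockwise from 12.
# Illustrative only: the point is that the hour hand's position directly
# encodes the fraction of the 12-hour cycle that has elapsed.
def hand_angles(hour: int, minute: int) -> tuple[float, float]:
    minute_angle = minute * 6.0                      # 360 deg / 60 minutes
    hour_angle = (hour % 12) * 30.0 + minute * 0.5   # 30 deg/hour plus drift
    return hour_angle, minute_angle

h_angle, m_angle = hand_angles(10, 13)
print(f"10:13 -> hour hand at {h_angle:.1f} deg ({h_angle / 360:.0%} of the dial)")
# The wedge is visible at a glance; the digits "10:13" require arithmetic.
```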

Studies have shown alarmingly that many children today—and even a fair number of adults—cannot tell time using an analog clock or watch because they have only ever been exposed to digital timepieces. Presumably, someone whose only experience of timekeeping has been digital would not expend any extra mental effort figuring out how much of an hour had elapsed at 7:52—but then, such a person may have to think harder on other occasions, being unable to visually judge the “distance” between two times. Be that as it may, one clearly must be able at least to identify numbers and count in order to tell time with a digital watch, whereas even without knowing any numbers, someone can tell roughly what time of day it is using an analog watch.

Or consider the digital watch I own that has a built-in (digital) compass. I press a button and the readout tells me that the direction I’m facing is 103° E-SE, but this information becomes useful to me only when I line up the little dial on the outside with that number so I can tell visually which way is north. By including an analog dial with the digital compass, the watch manufacturer tacitly acknowledges the superiority of the “pie chart” concept in judging direction (if not time).

Second Opinion

Analog timepieces—even the ones whose hands decisively click into well-defined positions rather than moving smoothly in a circle—convey a fuzzy or approximate sense of time at a casual glance. This is a good thing, not only for the sake of children’s education but because time itself is continuous, not an infinite series of discrete steps. Units like seconds, minutes, and hours are just a convenient fiction, after all—they don’t represent anything objectively real in the world. As linguist George Lakoff pointed out in his book Women, Fire, and Dangerous Things, we like to talk about time as though it were money—a thing that can be spent, saved, earned, or wasted—but this is all merely a conceit of language. Thus, to the extent that analog timepieces distance us less from reality than digital ones do, I’ve got to believe they help to keep us ever-so-slightly more human.

Some people have argued that analog watches, for all their merits, are still too complicated because they artificially divide the day into two arbitrary cycles; there are, unsurprisingly, 24-hour clocks and watches of various designs intended to address this limitation. When an hour hand goes around a dial just once per day, it’s easier to picture what it’s an analog of: namely, the rotation of the Earth (more or less). A sundial, needless to say, gives a representation of time that’s even closer cognitively to its source, but much less accurate (and inconveniently nonfunctional at night).

Nowadays, it’s possible to make analog timepieces every bit as accurate as digital models, and for those who dislike a ticking sound or prefer the no-moving-parts aesthetic, you can even buy watches and clocks with analog-like “hands” formed by the segments of an LCD. In short, the analog clock is having its revenge by providing all the benefits of digital timepieces in a human-brain-friendly package. None of this is news; I could have written the same thing a decade ago. But it is exciting that we earthlings are somehow able to come to our senses and overcome these collective blips of faulty judgment. Let’s all keep up the good work. —Joe Kissell



Increased Nonfasting Triglyceride Levels Associated With Higher Risk Of Stroke


 Elevated nonfasting triglyceride levels, previously associated with an increased risk for heart attack, also appear to be associated with an increased risk for ischemic stroke, according to a new study.


Recent studies found a strong association between elevated levels of nonfasting triglycerides, which indicate the presence of remnant (a small portion that remains) lipoproteins, and increased risk of ischemic heart disease. "It is therefore possible that nonfasting triglyceride levels are also associated with increased risk of ischemic stroke," the authors write. "Triglyceride levels are usually measured after an 8- to 12-hour fast, thus excluding most remnant lipoproteins; however, except for a few hours before breakfast, most individuals are in the nonfasting state most of the time. Therefore, by mainly studying fasting rather than nonfasting triglyceride levels, several previous studies may have missed an association between triglycerides and ischemic stroke."

Jacob J. Freiberg, M.D., of Copenhagen University Hospitals, Denmark, and colleagues conducted a study to determine if increased levels of nonfasting triglycerides are associated with risk of ischemic stroke. The Copenhagen City Heart Study, a Danish population–based study initiated in 1976 with follow-up through July 2007, included 13,956 men and women age 20 through 93 years. Participants had their nonfasting triglyceride levels measured at the beginning of the study and at follow-up examinations.

Of the 13,956 participants in the study, 1,529 developed ischemic stroke. The researchers found that the cumulative incidence of ischemic stroke increased with increasing levels of nonfasting triglycerides. Compared with men whose nonfasting levels were below 89 mg/dL, men had:

- a 30 percent higher risk at levels of 89 through 176 mg/dL;
- a 60 percent higher risk at 177 through 265 mg/dL;
- a 50 percent higher risk at 266 through 353 mg/dL;
- 2.2 times the risk at 354 through 442 mg/dL;
- 2.5 times the risk at 443 mg/dL or greater.

Compared with women whose nonfasting levels were below 89 mg/dL, women had:

- a 30 percent higher risk at levels of 89 through 176 mg/dL;
- twice the risk at 177 through 265 mg/dL;
- a 40 percent higher risk at 266 through 353 mg/dL;
- 2.5 times the risk at 354 through 442 mg/dL;
- 3.8 times the risk at 443 mg/dL or greater.

Absolute 10-year risk of ischemic stroke ranged from 2.6 percent in men younger than 55 years with nonfasting triglyceride levels of less than 89 mg/dL to 16.7 percent in men age 55 years or older with levels of 443 mg/dL or greater. These values in women were 1.9 percent and 12.2 percent, respectively. Men with a previous ischemic stroke vs. controls had nonfasting triglyceride levels of 191 mg/dL vs. 148 mg/dL; for women, these values were 167 mg/dL vs. 127 mg/dL.
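
The dose-response pattern above can be restated as a simple lookup. The short Python sketch below is illustrative only: the function and its names are mine, not the study's, and the bands do nothing more than re-encode the relative risks quoted in this article.

    # Illustrative only: maps a nonfasting triglyceride level (mg/dL) to the
    # relative risk of ischemic stroke reported above. Function and variable
    # names are hypothetical; cut-points and ratios restate the article.

    RISK_BANDS = [
        # (lower bound in mg/dL, relative risk for men, for women)
        (443, 2.5, 3.8),
        (354, 2.2, 2.5),
        (266, 1.5, 1.4),
        (177, 1.6, 2.0),
        (89,  1.3, 1.3),
        (0,   1.0, 1.0),  # reference group: below 89 mg/dL
    ]

    def relative_stroke_risk(level_mg_dl, sex):
        """Approximate relative risk vs. the below-89 mg/dL reference group."""
        for lower, men, women in RISK_BANDS:
            if level_mg_dl >= lower:
                return men if sex == "male" else women
        raise ValueError("triglyceride level must be non-negative")

    print(relative_stroke_risk(200, "female"))  # 2.0 (twice the reference risk)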

"By using levels of nonfasting rather than fasting triglycerides and by having more statistical power than any previous study, we detected a previously unnoticed association between linear increases in levels of nonfasting triglycerides and stepwise increases in risk of ischemic stroke …", the authors write. "Even the most recent European and North American guidelines on stroke prevention do not recognize elevated triglyceride levels as a risk factor for stroke."

"Our results, together with those from 2 previous studies, suggest that elevated levels of nonfasting triglycerides and remnant lipoprotein cholesterol could be considered together with elevated levels of low-density lipoprotein cholesterol for prediction of cardiovascular risk. However, these findings require replication in other populations."



Sunday, January 30, 2011

Mini-Strokes Leave 'Hidden' Brain Damage



ScienceDaily (Jan. 29, 2011) — Each year, approximately 150,000 Canadians have a transient ischemic attack (TIA), sometimes known as a mini-stroke. New research published January 28 in Stroke, the journal of the American Heart Association, shows these attacks may not be transient at all; they may in fact create lasting damage to the brain.

The stroke research team, led by Dr. Lara Boyd, physical therapist and neuroscientist with the Brain Research Centre at Vancouver Coastal Health and the University of British Columbia, studied 13 patients from the Stroke Prevention Clinic at Vancouver General Hospital and compared them against 13 healthy study participants. The TIA subjects had all experienced an acute episode affecting motor systems, but their symptoms had resolved within 24 hours. The patients were studied within 14-30 days of their episode and showed no impairment on clinical evaluation or standard imaging (CT or MRI). Participants then underwent a unique brain mapping procedure using transcranial magnetic stimulation (TMS), with profound results.

"What we found has never been seen before," says Dr. Boyd, who also holds the Canada Research Chair in Neurobiology of Motor Learning at UBC. "The brain mapping capabilities of the TMS showed us that TIA is actually causing damage to the brain that lasts much longer than we previously thought it did. In fact, we are not sure if the brain ever recovers."

In the TIA group, brain cells on the affected side of the brain showed changes in their excitability -- making it harder for both excitatory and inhibitory neurons to respond as compared to the undamaged side and to a group of people with healthy brains. These changes are very concerning to the researchers as they show that TIA is likely not a transient event.

A transient ischemic attack is a brief interruption of blood flow to part of the brain, creating symptoms such as numbness or tingling, temporary loss of vision, difficulty speaking, or weakness on one side of the body. Symptoms usually resolve quickly, and many people do not take such an episode seriously. However, TIAs are often warning signs of a future stroke. The risk of a stroke increases dramatically in the days after an attack, so a TIA may offer an opportunity to find a cause or minimize risk factors, helping to prevent the permanent neurologic damage that a full stroke causes.

"These findings are very important," says Dr. Philip Teal, head of the Stroke Prevention Clinic at VGH and co-author of the study. "We know that TIA is a warning sign of future stroke. We treat every TIA as though it will result in a stroke, but not every person goes on to have a stroke. By refining this brain mapping technique, our hope is to identify who is most at risk, and direct treatment more appropriately."



Friday, January 28, 2011

Humans 'left Africa much earlier'



By Paul Rincon, Science reporter, BBC News

Modern humans may have emerged from Africa up to 50,000 years earlier than previously thought, a study suggests.

Researchers have uncovered stone tools in the Arabian peninsula that they say were made by modern humans about 125,000 years ago. The tools were unearthed at the site of Jebel Faya in the United Arab Emirates, a team reports in the journal Science. The results are controversial: genetic data strongly points to an exodus from Africa 60,000-70,000 years ago. Simon Armitage, from Royal Holloway, University of London, Hans-Peter Uerpmann, from the University of Tuebingen, Germany, and colleagues, uncovered 125,000-year-old stone tools at Jebel Faya which resemble those found in East Africa at roughly the same time period.

The authors of the study say the people who made the tools were newcomers in the area with origins on the other side of the Red Sea. The researchers were able to date the tools using a light-based technique known as luminescence dating, which tells scientists when the stone artifacts were buried.

Genetics questioned

So-called anatomically modern humans are thought to have emerged somewhere in Africa some 200,000 years ago. They later spread out, migrating to other continents where they displaced the indigenous human groups such as the Neanderthals in Europe and the Denisovans in Asia.

DNA from the cell's powerhouses - or mitochondria - can be used as a "clock" for reconstructing the timing of human migrations. This is because mitochondrial DNA (mtDNA) accumulates mutations, or changes, at a known rate.
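
The logic of such a molecular clock can be shown with a toy calculation. In the Python sketch below, the mutation rate is a placeholder chosen for round numbers, not a measured value; only the divide-by-two step (mutations accumulate along both diverging lineages) reflects how these clocks actually work.

    # Toy molecular-clock estimate (placeholder rate, for illustration only).
    # After two lineages split, each accumulates mutations independently, so
    # observed differences ~ 2 * rate * time, giving time = diffs / (2 * rate).

    MUTATION_RATE = 2.5e-5  # substitutions per site per 1,000 years (assumed)

    def divergence_time_kyr(differences, sequence_length):
        """Estimated time since two sequences diverged, in thousands of years."""
        diffs_per_site = differences / sequence_length
        return diffs_per_site / (2 * MUTATION_RATE)

    # e.g. 30 differences along a 1,000-base stretch of mtDNA:
    print(divergence_time_kyr(30, 1000))  # 600.0, i.e. roughly 600,000 years

The wide error bars Professor Uerpmann complains about come from uncertainty in exactly such rate assumptions.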

Studies of mtDNA had suggested a timing for the "Out of Africa" exodus of 60,000-70,000 years ago. But scientists behind the latest study argue that the people who made tools at Jebel Faya 125,000 years ago are ancestral to humans living outside Africa today. Professor Uerpmann said the estimates of time based on genetic data were "very rough". "The domestic dog was said to be 120,000 years old, and now it is 20,000. You can imagine how variable the genetic dating is," he explained.

Commenting on the findings, Professor Chris Stringer, a palaeoanthropologist at London's Natural History Museum, said: "This archaeological work by Armitage and colleagues provides important clues that early modern humans might have dispersed from Africa across Arabia, as far as the Straits of Hormuz, by 120,000 years ago. This research augments the controversial idea that such populations could have migrated even further across southern Asia, despite conflicting genetic data suggesting such movements only occurred after 60,000 years ago."

Multiple migrations?

The researchers say the toolmakers at Jebel Faya may have reached the Arabian Peninsula at a time when changes in the climate were transforming it from arid desert into a grassland habitat with lakes and rivers. These human groups could later have moved on towards the Persian Gulf, trekking around the Iranian coast and on to South Asia. Indeed, Dr Mike Petraglia at the University of Oxford has uncovered tools in India that he says could have been made by modern humans before 60,000 years ago. Some tools were sandwiched in ash from the eruption of the Toba super-volcano in Indonesia that geologists can date very accurately to 74,000 years ago.

However, other researchers suggest that the people living in India at this time could have died out and been replaced by a later wave of humans. Anthropologists already knew of an early foray out of Africa by modern humans. Remains found at Skhul and Qafzeh in Israel date to between 119,000 and 81,000 years ago. But the Skhul and Qafzeh people are generally thought to have died out or retreated south, perhaps because of climatic fluctuations. They subsequently disappear, and their sites are re-occupied by Neanderthals.

Professor Stringer said the fact that the tools found at Jebel Faya did not resemble those associated with modern humans at Qafzeh and Skhul hinted at "yet more complexity in the exodus of modern humans from Africa". He posed the question: "Could there have been separate dispersals, one from East Africa into Arabia, and another from North Africa into the Levant?"

Thursday, January 27, 2011

Lack of sleep 'linked to early death'


Getting less than six hours sleep a night can lead to an early grave, UK and Italian researchers have warned. They said people who regularly got so little sleep were 12% more likely to die over a 25-year period than those who got an "ideal" six to eight hours.

They also found an association between sleeping for more than nine hours and early death, although that much sleep may merely be a marker of ill health. The findings, based on data from 1.5 million people in 16 studies, are reported in the journal Sleep. The study looked at the relationship between sleep and mortality by reviewing earlier studies from the UK, US and European and East Asian countries.

Premature death from all causes was linked to getting either too little or too much sleep outside of the "ideal" six to eight hours per night. But while a lack of sleep may be a direct cause of ill health, ultimately leading to an earlier death, too much sleep may merely be a marker of ill health already, the UK and Italian researchers believe.

Time pressures

Professor Francesco Cappuccio, leader of the Sleep, Health and Society Programme at the UK's University of Warwick, said: "Modern society has seen a gradual reduction in the average amount of sleep people take, and this pattern is more common amongst full-time workers, suggesting that it may be due to societal pressures for longer working hours and more shift-work. On the other hand, the deterioration of our health status is often accompanied by an extension of our sleeping time."


If the link between a lack of sleep and death is truly causal, it would equate to over 6.3 million attributable deaths in the UK in people over 16 years of age. Prof Cappuccio said more work was needed to understand exactly why sleep seemed to be so important for good health. Professor Jim Horne, of the Loughborough Sleep Research Centre, said other factors may be involved rather than sleep per se.

"Sleep is just a litmus paper to physical and mental health. Sleep is affected by many diseases and conditions, including depression," he said.

And getting improved sleep may not make someone better or live longer, he said. "But having less than five hours a night suggests something is probably not right. Five hours is insufficient for most people, and being drowsy in the day increases your risk of having an accident if driving or operating dangerous machinery."





Wednesday, January 26, 2011

Tobacco plants may be new incubator for vaccines for flu – or bio-terrorism


DEBRA BLACK


A $21-million infusion from the U.S. Defense Department’s research arm will help a Canadian company set up a manufacturing plant to incubate flu vaccine in tobacco leaves. Medicago Inc., a Quebec City-based biotech firm, is setting up an 85,000-square-foot facility in Durham, N.C., to manufacture 10 million doses of flu vaccine a month using its new technology. The company’s start-up there will translate into about 85 new jobs in Durham by the end of the 14-month contract, with some new hires in Canada as well.

Medicago’s technology uses tobacco plants rather than hen’s eggs to manufacture the vaccine, said Andy Sheldon, the CEO of the company. The production process is relatively simple and remarkably fast, Sheldon said. A five-week-old Australian tobacco plant, which does not contain nicotine, is immersed in a solution full of bacteria carrying the DNA that encodes the vaccine. The leaves and solution are put in a steel tank and a vacuum is created.

The plants then take up the genetic instructions carried in the solution. They are pulled out of the steel tank and incubated in a greenhouse for five days, during which the cells within the tobacco leaves produce the protein for the vaccine. Next, scientists “extract the protein from the leaves,” said Sheldon, by breaking down the cellulose of the leaf.

But it’s not just the battle against influenza that concerns the United States’ Defense Advanced Research Projects Agency (DARPA), prompting it to fund work at Medicago. One of the things the Quebec firm’s technology could be used for is vaccines or antidotes against biological terrorism or warfare.
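
To get a rough sense of why this matters for speed, here is a back-of-the-envelope timeline in Python. Only the five-week plant age and the five-day incubation come from the article; the other durations are assumptions added purely for illustration.

    # Back-of-the-envelope production timeline (durations in days).
    # Article-sourced: five-week-old plants, five-day greenhouse incubation.
    # Assumed for illustration: infiltration and extraction times.

    plant_based = {
        "grow tobacco plants": 35,               # five weeks (article)
        "vacuum infiltration": 1,                # assumed
        "greenhouse incubation": 5,              # five days (article)
        "protein extraction/purification": 14,   # assumed
    }

    total = sum(plant_based.values())
    print(f"plant-based: about {total} days ({total / 7:.0f} weeks)")
    print("egg-based 2009 benchmark: about 180 days (six months, per article)")

Even with generous assumptions for the unstated steps, the plant-based route comes in months faster than the egg-based benchmark the article describes.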

“DARPA understands that there needs to be faster technology, not just for vaccines but for any bio threat that comes along,” Sheldon said in an interview with the Star. “There’s a large market for bio-threat products,” he said. Over the next 14 months, the Quebec-based company will build a plant, scale it up and then see if it can produce 10 million doses of H1N1 in one month, a considerably shorter time than it took companies to produce a vaccine last year when the fear of an H1N1 pandemic gripped the world. It took almost six months before an H1N1 vaccine was ready to be shipped, leaving many to panic when the vaccine was not immediately available.

“What we saw through the pandemic flu scenario last year,” said Sheldon, “(was) you couldn’t deliver product to market in time.” But Sheldon said that, with the technique using tobacco leaves, the company can produce a large amount of product very quickly. “We’re quicker. We believe we can get to market faster. But we’re going to need all kinds of technology to meet demand. Egg-based and cell-based culture systems are still good.”

Meanwhile, Medicago’s Quebec plant is completing clinical trials on an avian flu vaccine also made using tobacco plants.





Tuesday, January 25, 2011

Blocking a gene stops cancer cells spreading



By James Gallagher, Health reporter, BBC News



A gene which encourages cancer to move around the body has been discovered by researchers at the University of East Anglia. Experiments on tissue cultures, published in Oncogene, suggest that blocking it would prevent cancers spreading. The researchers hope their work will lead to a new generation of cancer drugs within the decade.

Cancer Research UK said the study improved understanding of the disease, but was still at the laboratory stage.

There are treatments for primary cancers, but tumours have the potential to spread. Cells can break off and travel around the body, through the bloodstream or lymph fluid, and start a new or secondary tumour where they land, a process known as metastasis. Breast cancers are known to spread to lymph nodes, the bones and the lungs. These secondary tumours are notoriously difficult to treat.

The rogue gene

The team at the University of East Anglia has found a gene which helps the cancer spread. The gene, WWP2, leads to the breakdown of an inhibitor that normally keeps cells in check. The researchers showed, in tissue cultures, that without the inhibitor, Smad7, cancer progressed very quickly and spread.

Blocking the gene prevented that spread.

Dr Andrew Chantry, who led the study, said: "I think we're really onto something important if we can put a wall around a cancer and lock it in place."

"The discovery could lead to the development of a new generation of drugs within the decade that could be used to stop the aggressive spread of most forms of the disease."
The team are now recruiting chemists to help them design a drug which could interrupt the gene's activity.

Dr Kat Arney, science information manager at Cancer Research UK, said: "Over recent decades researchers all over the world have discovered genes that drive the growth and spread of cancer, and this research adds one more to this ever-growing list. "But, while these new results aid our understanding of the complexities of cancer and could point towards potential leads for future anti-cancer drugs, the work is still at the laboratory stage."

Monday, January 24, 2011

Brain cooling could aid stroke recovery



Cooling the brain of patients who have suffered a stroke could dramatically improve their recovery, a group of Scottish doctors has said. They are joining others from across Europe who believe that inducing hypothermia in some patients can boost survival rates and reduce brain damage. Similar techniques have already been tried successfully on heart attack patients and those with birth injuries. Scientists are in Brussels to discuss a Europe-wide trial of the technique.

To date, studies have involved the bodies of patients being cooled using ice-cold intravenous drips and cooling pads applied to the skin. This lowers the body temperature to about 35C, just a couple of degrees below its normal level.

The technique puts the body into a state of artificial hibernation, in which the brain can survive with less blood supply, giving doctors vital time to treat blocked or burst blood vessels.

Dr Malcolm Macleod, head of experimental neuroscience at the Centre for Clinical Brain Sciences at the University of Edinburgh, said: "Every day 1,000 Europeans die from stroke - that's one every 90 seconds - and about twice that number survive but are disabled. Our estimates are that hypothermia might improve the outcome for more than 40,000 Europeans every year."

Space travel

Dr Macleod and his Scottish team are joining a consortium of clinicians from across Europe to seek funding for a trial involving 1,500 stroke patients. Speaking for the European Stroke Research Network for Hypothermia (EuroHYP), a group of European researchers from more than 20 countries, Dr Macleod added: "The preliminary evidence is all there - now it is time for Europe to act." The European research project, which will also include hospitals in Germany, Italy and France, is being led by Professor Dr Stefan Schwab.

Dr Schwab said: "We know the financial situation is difficult, but based on current evidence, the personal and economic benefits of avoiding stroke related death and disability means that the trial would pay for itself in less than a year.

"As the population ages, this trial will become even more important, and a benefit of cooling demonstrated in the proposed study will set the stage for future studies with hypothermia, extending the eligibility of the treatment to even greater number of patients." The progress of the clinicians is also reportedly being watched by those from the European Space Agency because of its possible application for the future of long distance space travel.

Sunday, January 23, 2011

Human Thought Can Voluntarily Control Neurons in Brain

Neuroscience research involving epileptic patients with brain electrodes surgically implanted in their medial temporal lobes shows that patients learned to consciously control individual neurons deep in the brain with their thoughts.

Subjects learned to control mouse cursors, play video games and alter the focus of digital images with their thoughts. The patients were each using brain-computer interfaces, deep brain electrodes and software designed for the research.

Controlling Individual Cortical Nerve Cells by Human Thought


Five years ago, neuroscientist Christof Koch of the California Institute of Technology (Caltech), neurosurgeon Itzhak Fried of UCLA, and their colleagues discovered that a single neuron in the human brain can function much like a sophisticated computer and recognize people, landmarks, and objects, suggesting that a consistent and explicit code may help transform complex visual representations into long-term and more abstract memories.

Now Koch and Fried, along with former Caltech graduate student and current postdoctoral fellow Moran Cerf, have found that individuals can exert conscious control over the firing of these single neurons—despite the neurons’ location in an area of the brain previously thought inaccessible to conscious control—and, in doing so, manipulate the behavior of an image on a computer screen.

The work, which appears in a paper in the October 28 issue of the journal Nature, shows that “individuals can rapidly, consciously, and voluntarily control neurons deep inside their head,” says Koch, the Lois and Victor Troendle Professor of Cognitive and Behavioral Biology and professor of computation and neural systems at Caltech.

The study was conducted on 12 epilepsy patients at the David Geffen School of Medicine at UCLA, where Fried directs the Epilepsy Surgery Program. All of the patients suffered from seizures that could not be controlled by medication. To help localize where their seizures were originating in preparation for possible later surgery, the patients were surgically implanted with electrodes deep within the centers of their brains. Cerf used these electrodes to record the activity, as indicated by spikes on a computer screen, of individual neurons in parts of the medial temporal lobe—a brain region that plays a major role in human memory and emotion.

Prior to recording the activity of the neurons, Cerf interviewed each of the patients to learn about their interests. “I wanted to see what they like—say, the band Guns N’ Roses, the TV show House, and the Red Sox,” he says. Using that information, he created for each patient a data set of around 100 images reflecting the things he or she cares about. The patients then viewed those images, one after another, as Cerf monitored their brain activity to look for the targeted firing of single neurons. “Of 100 pictures, maybe 10 will have a strong correlation to a neuron,” he says. “Those images might represent cached memories—things the patient has recently seen.”

The four most strongly responding neurons, representing four different images, were selected for further investigation. “The goal was to get patients to control things with their minds,” Cerf says. By thinking about the individual images—a picture of Marilyn Monroe, for example—the patients triggered the activity of their corresponding neurons, which was translated first into the movement of a cursor on a computer screen. In this way, patients trained themselves to move that cursor up and down, or even play a computer game.

But, says Cerf, “we wanted to take it one step further than just brain–machine interfaces and tap into the competition for attention between thoughts that race through our mind.” To do that, the team arranged for a situation in which two concepts competed for dominance in the mind of the patient. “We had patients sit in front of a blank screen and asked them to think of one of the target images,” Cerf explains. As they thought of the image, and the related neuron fired, “we made the image appear on the screen,” he says. That image is the “target.” Then one of the other three images is introduced, to serve as the “distractor.”

“The patient starts with a 50/50 image, a hybrid, representing the ‘marriage’ of the two images,” Cerf says, and then has to make the target image fade in—just using his or her mind—and the distractor fade out. During the tests, the patients came up with their own personal strategies for making the right images appear; some simply thought of the picture, while others repeated the name of the image out loud or focused their gaze on a particular aspect of the image. Regardless of their tactics, the subjects quickly got the hang of the task, and they were successful in around 70 percent of trials.
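
The closed-loop “fading” task lends itself to a small simulation. Everything in the Python sketch below is assumed for illustration (the names, the simulated firing rates, and the update rule); the paper’s actual decoding scheme is not described in this article.

    import random

    def read_firing_rate(mean_rate):
        """Stand-in for reading one neuron's spike rate from its electrode;
        here we simply simulate a noisy rate in spikes per second."""
        return max(0.0, random.gauss(mean_rate, 2.0))

    def update_blend(alpha, target_rate, distractor_rate, gain=0.01):
        """Shift the on-screen hybrid toward whichever neuron fires harder.
        alpha is the fraction of the target image shown (0.5 at trial start)."""
        alpha += gain * (target_rate - distractor_rate)
        return min(1.0, max(0.0, alpha))  # clamp to [0, 1]

    alpha = 0.5                      # trial starts with a 50/50 hybrid
    for _ in range(500):             # one simulated trial
        target = read_firing_rate(12.0)       # patient attends to the target
        distractor = read_firing_rate(8.0)
        alpha = update_blend(alpha, target, distractor)
        if alpha in (0.0, 1.0):
            break
    print("target faded in" if alpha == 1.0 else "distractor won or timed out")

In this toy version, a modest but consistent firing-rate difference is enough to drag the hybrid to the target, which is the essence of what the patients learned to do in roughly 70 percent of trials.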

“The patients clearly found this task to be incredibly fun as they started to feel that they control things in the environment purely with their thought,” says Cerf. “They were highly enthusiastic to try new things and see the boundaries of ‘thoughts’ that still allow them to activate things in the environment.”

Notably, even in cases where the patients were on the verge of failure—with, say, the distractor image representing 90 percent of the composite picture, so that it was essentially all the patients saw—“they were able to pull it back,” Cerf says. Imagine, for example, that the target image is Bill Clinton and the distractor George Bush. When the patient is “failing” the task, the George Bush image will dominate. “The patient will see George Bush, but they’re supposed to be thinking about Bill Clinton. So they shut off Bush—somehow figuring out how to control the flow of that information in their brain—and make other information appear. The imagery in their brain,” he says, “is stronger than the hybrid image on the screen.”

According to Koch, what is most exciting “is the discovery that the part of the brain that stores the instruction ‘think of Clinton’ reaches into the medial temporal lobe and excites the set of neurons responding to Clinton, simultaneously suppressing the population of neurons representing Bush, while leaving the vast majority of cells representing other concepts or familiar persons untouched.”

The work in the paper, “On-line voluntary control of human temporal lobe neurons,” is part of a decade-long collaboration between the Fried and Koch groups, funded by the National Institute of Neurological Disorders and Stroke, the National Institute of Mental Health, the G. Harold & Leila Y. Mathers Charitable Foundation, and Korea’s World Class University program.

Contact: Kathy Svitil

Source: California Institute of Technology (Caltech)