Somewhere, something incredible is waiting to be known.
- Carl Sagan

Monday, January 31, 2011

Brain anatomy

Douglas Adams and the Analog Clock

Author Douglas Adams famously made fun of earthlings for being “so amazingly primitive that they still think digital watches are a pretty neat idea.” Shortly before he died, Adams gave a talk at the University of California, Santa Barbara (not far from his home), at the end of which there was a brief question-and-answer session. A woman stood up and asked Adams the question that had been bothering her for decades: what did he have against digital watches? The crowd probably expected him to toss off a witty one-liner in response. Instead, he gave a very thoughtful answer that, in true Douglas Adams fashion, made ordinary human behavior seem self-evidently absurd.

After admitting that his comment had originally been written in the days when digital watches were themselves fairly primitive (and, ironically, required two hands to operate), Adams couched his complaint—appropriately—using an analogy. In the early days of personal computers, he said, people got very excited that their spreadsheet programs could finally create pie charts. This was considered a revolutionary advance, because as everyone knows, a pie chart visually represents a part-whole relationship in a way that is immediately obvious—a way that, to be more specific, mere columns of numbers did not. Well, the hands of an analog timepiece form wedges that look very much like a pie chart, and like a pie chart, they represent a sort of part-whole relationship in a way that requires a bare minimum of mental effort to comprehend. Not so digital timepieces, which for all their precision say nothing about the relationship of one time of day to another.

Ticked Off

Although digital watches have their place and are in no danger of becoming extinct, analog models have enjoyed a steady comeback over the past decade or two, and I think that’s just marvelous. Now, I’ve always liked digital watches, partly for the practical reason that they tend to be very accurate, and partly because the absence of moving parts strikes me as elegantly minimalist. But when I stopped to think about it, I realized that it is far less useful to know that it’s 10:13 than that it’s quarter past ten, and that when I read the time in a digital format, I nearly always have to perform an additional mental calculation to figure out what time it “really” is—that is, what that string of numbers actually means in terms of how much of the day has gone by or how much longer I can sleep before getting up to go to work. And, naturally, with very few exceptions, analog timepieces are much easier to set than their digital cousins.
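To make that "additional mental calculation" concrete, here is a tiny Python sketch (mine, not Adams's or the essay's) that takes a digital reading such as 10:13 and works out the quantities an analog dial shows at a glance: the fraction of the hour and of the day that has elapsed, and the angles of the two hands.

```python
# Toy illustration: the arithmetic an analog dial does for free.
def analog_view(hour, minute):
    frac_of_hour = minute / 60                            # portion of the hour gone by
    frac_of_day = (hour % 24 + frac_of_hour) / 24         # portion of the day gone by
    minute_angle = 360 * frac_of_hour                     # minute hand, degrees past 12
    hour_angle = 360 * ((hour % 12) + frac_of_hour) / 12  # hour hand, degrees past 12
    return frac_of_hour, frac_of_day, minute_angle, hour_angle

print(analog_view(10, 13))  # roughly 22% of the hour and 43% of the day have passed
```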

Alarmingly, studies have shown that many children today—and even a fair number of adults—cannot tell time using an analog clock or watch because they have only ever been exposed to digital timepieces. Presumably, someone whose only experience of timekeeping has been digital would not expend any extra mental effort figuring out how much of an hour had elapsed at 7:52—but then, such a person may have to think harder on other occasions, being unable to visually judge the "distance" between two times. Be that as it may, one clearly must be able at least to identify numbers and count in order to tell time with a digital watch, whereas even without knowing any numbers, someone can tell roughly what time of day it is using an analog watch.

Or consider the digital watch I own that has a built-in (digital) compass. I press a button and the readout tells me that the direction I’m facing is 103° E-SE, but this information becomes useful to me only when I line up the little dial on the outside with that number so I can tell visually which way is north. By including an analog dial with the digital compass, the watch manufacturer tacitly acknowledges the superiority of the “pie chart” concept in judging direction (if not time).

Second Opinion

Analog timepieces—even the ones whose hands decisively click into well-defined positions rather than moving smoothly in a circle—convey a fuzzy or approximate sense of time at a casual glance. This is a good thing, not only for the sake of children’s education but because time itself is continuous, not an infinite series of discrete steps. Units like seconds, minutes, and hours are just a convenient fiction, after all—they don’t represent anything objectively real in the world. As linguist George Lakoff pointed out in his book Women, Fire, and Dangerous Things, we like to talk about time as though it were money—a thing that can be spent, saved, earned, or wasted—but this is all merely a conceit of language. Thus, to the extent that analog timepieces distance us less from reality than digital ones do, I’ve got to believe they help to keep us ever-so-slightly more human.

Some people have argued that analog watches, for all their merits, are still too complicated because they artificially divide the day into two arbitrary cycles; there are, unsurprisingly, 24-hour clocks and watches of various designs intended to address this limitation. When an hour hand goes around a dial just once per day, it’s easier to picture what it’s an analog of: namely, the rotation of the Earth (more or less). A sundial, needless to say, gives a representation of time that’s even closer cognitively to its source, but much less accurate (and inconveniently nonfunctional at night).

Nowadays, it’s possible to make analog timepieces every bit as accurate as digital models, and for those who dislike a ticking sound or prefer the no-moving-parts aesthetic, you can even buy watches and clocks with analog-like “hands” formed by the segments of an LCD. In short, the analog clock is having its revenge by providing all the benefits of digital timepieces in a human-brain-friendly package. None of this is news; I could have written the same thing a decade ago. But it is exciting that we earthlings are somehow able to come to our senses and overcome these collective blips of faulty judgment. Let’s all keep up the good work. —Joe Kissell



Increased Nonfasting Triglyceride Levels Associated With Higher Risk Of Stroke


 Elevated nonfasting triglyceride levels, previously associated with an increased risk for heart attack, also appear to be associated with an increased risk for ischemic stroke, according to a new study.


Recent studies found a strong association between elevated levels of nonfasting triglycerides, which indicate the presence of remnant (a small portion that remains) lipoproteins, and increased risk of ischemic heart disease. "It is therefore possible that nonfasting triglyceride levels are also associated with increased risk of ischemic stroke," the authors write. "Triglyceride levels are usually measured after an 8- to 12-hour fast, thus excluding most remnant lipoproteins; however, except for a few hours before breakfast, most individuals are in the nonfasting state most of the time. Therefore, by mainly studying fasting rather than nonfasting triglyceride levels, several previous studies may have missed an association between triglycerides and ischemic stroke."

Jacob J. Freiberg, M.D., of Copenhagen University Hospitals, Denmark, and colleagues conducted a study to determine if increased levels of nonfasting triglycerides are associated with risk of ischemic stroke. The Copenhagen City Heart Study, a Danish population–based study initiated in 1976 with follow-up through July 2007, included 13,956 men and women age 20 through 93 years. Participants had their nonfasting triglyceride levels measured at the beginning of the study and at follow-up examinations.

Of the 13,956 participants in the study, 1,529 developed ischemic stroke. The researchers found that the cumulative incidence of ischemic stroke increased with increasing levels of nonfasting triglycerides. Men with elevated nonfasting triglyceride levels of 89 through 176 mg/dL had a 30 percent higher risk for ischemic stroke; for levels 177 through 265 mg/dL, there was a 60 percent increased risk; for 266 through 353 mg/dL, a 50 percent higher risk; for 354 through 442 mg/dL, a 2.2 times elevated risk; and for 443 mg/dL or greater, the risk of ischemic stroke was 2.5 times greater compared to men with nonfasting levels less than 89 mg/dL.

Corresponding values for women were a 30 percent increased risk of ischemic stroke for nonfasting triglyceride levels of 89 through 176 mg/dL; twice the risk for levels 177 through 265 mg/dL; a 40 percent higher risk for levels of 266 through 353 mg/dL; 2.5 times the risk for 354 through 442 mg/dL; and 3.8 times the risk for ischemic stroke for women with nonfasting triglyceride levels of 443 mg/dL or greater compared to women with nonfasting triglyceride levels less than 89 mg/dL.
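For readers who want the reported figures in one place, the following Python snippet simply transcribes the relative risks from the two paragraphs above into a lookup table (the band boundaries and function name are mine; the numbers are the study's reported values, relative to the under-89 mg/dL reference group).

```python
# Relative risk of ischemic stroke vs. the <89 mg/dL reference group,
# keyed by nonfasting triglyceride band in mg/dL (values as reported above).
RELATIVE_RISK = {
    (89, 176):           {"men": 1.3, "women": 1.3},
    (177, 265):          {"men": 1.6, "women": 2.0},
    (266, 353):          {"men": 1.5, "women": 1.4},
    (354, 442):          {"men": 2.2, "women": 2.5},
    (443, float("inf")): {"men": 2.5, "women": 3.8},
}

def relative_risk(level_mg_dl, sex):
    """Return the reported relative risk for a given nonfasting triglyceride level."""
    if level_mg_dl < 89:
        return 1.0  # reference group
    for (low, high), risks in RELATIVE_RISK.items():
        if low <= level_mg_dl <= high:
            return risks[sex]

print(relative_risk(300, "women"))  # 1.4
```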

Absolute 10-year risk of ischemic stroke ranged from 2.6 percent in men younger than 55 years with nonfasting triglyceride levels of less than 89 mg/dL to 16.7 percent in men age 55 years or older with levels of 443 mg/dL or greater. These values in women were 1.9 percent and 12.2 percent, respectively. Men with a previous ischemic stroke vs. controls had nonfasting triglyceride levels of 191 mg/dL vs. 148 mg/dL; for women, these values were 167 mg/dL vs. 127 mg/dL.

"By using levels of nonfasting rather than fasting triglycerides and by having more statistical power than any previous study, we detected a previously unnoticed association between linear increases in levels of nonfasting triglycerides and stepwise increases in risk of ischemic stroke …", the authors write. "Even the most recent European and North American guidelines on stroke prevention do not recognize elevated triglyceride levels as a risk factor for stroke."

"Our results, together with those from 2 previous studies, suggest that elevated levels of nonfasting triglycerides and remnant lipoprotein cholesterol could be considered together with elevated levels of low-density lipoprotein cholesterol for prediction of cardiovascular risk. However, these findings require replication in other populations."



Sunday, January 30, 2011

Mini-Strokes Leave 'Hidden' Brain Damage



ScienceDaily (Jan. 29, 2011) — Each year, approximately 150,000 Canadians have a transient ischemic attack (TIA), sometimes known as a mini-stroke. New research published January 28 in Stroke, the journal of the American Heart Association, shows these attacks may not be transient at all; in fact, they create lasting damage to the brain.
The stroke research team, led by Dr. Lara Boyd, physical therapist and neuroscientist with the Brain Research Centre at Vancouver Coastal Health and the University of British Columbia, studied 13 patients from the Stroke Prevention Clinic at Vancouver General Hospital and compared them against 13 healthy study participants. The TIA subjects had all experienced an acute episode affecting motor systems, but their symptoms had resolved within 24 hours. The patients were studied within 14-30 days of their episode, and showed no impairment through clinical evaluation or standard imaging (CT or MRI). Participants then underwent a unique brain mapping procedure using transcranial magnetic stimulation (TMS), with profound results.

"What we found has never been seen before," says Dr. Boyd, who also holds the Canada Research Chair in Neurobiology of Motor Learning at UBC. "The brain mapping capabilities of the TMS showed us that TIA is actually causing damage to the brain that lasts much longer than we previously thought it did. In fact, we are not sure if the brain ever recovers."

In the TIA group, brain cells on the affected side of the brain showed changes in their excitability -- making it harder for both excitatory and inhibitory neurons to respond as compared to the undamaged side and to a group of people with healthy brains. These changes are very concerning to the researchers as they show that TIA is likely not a transient event.

A transient ischemic attack is characterized by a brief interruption of blood flow to the brain, creating symptoms such as numbness or tingling, temporary loss of vision, difficulty speaking, or weakness on one side of the body. Symptoms usually resolve quickly and many people do not take such an episode seriously. However, TIAs are often warning signs of a future stroke. The risk of a stroke increases dramatically in the days after an attack, and the TIA may offer an opportunity to find a cause or minimize the risk to prevent the permanent neurologic damage that results from a stroke.

"These findings are very important," says Dr. Philip Teal, head of the Stroke Prevention Clinic at VGH and co-author of the study. "We know that TIA is a warning sign of future stroke. We treat every TIA as though it will result in a stroke, but not every person goes on to have a stroke. By refining this brain mapping technique, our hope is to identify who is most at risk, and direct treatment more appropriately."



Friday, January 28, 2011

Humans 'left Africa much earlier'



By Paul Rincon Science reporter, BBC News

The tools from Jebel Faya were made by modern humans, the researchers argue. Modern humans may have emerged from Africa up to 50,000 years earlier than previously thought, a study suggests.

Researchers have uncovered stone tools in the Arabian peninsula that they say were made by modern humans about 125,000 years ago. The tools were unearthed at the site of Jebel Faya in the United Arab Emirates, a team reports in the journal Science. The results are controversial: genetic data strongly points to an exodus from Africa 60,000-70,000 years ago. Simon Armitage, from Royal Holloway, University of London, Hans-Peter Uerpmann, from the University of Tuebingen, Germany, and colleagues, uncovered 125,000-year-old stone tools at Jebel Faya which resemble those found in East Africa at roughly the same time period.

The authors of the study say the people who made the tools were newcomers in the area with origins on the other side of the Red Sea. The researchers were able to date the tools using a light-based technique, which tells scientists when the stone artifacts were buried.
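The article does not name the method, but a "light-based technique" for dating buried artifacts is typically optically stimulated luminescence (OSL) dating, in which the burial age is the radiation dose the buried grains have absorbed divided by the local dose rate. The sketch below shows that standard relation only; the dose numbers are hypothetical, not values from the Jebel Faya work.

```python
# Standard luminescence-dating relation: age = equivalent dose / environmental dose rate.
def luminescence_age_ka(equivalent_dose_gray, dose_rate_gray_per_ka):
    """Return burial age in thousands of years (ka)."""
    return equivalent_dose_gray / dose_rate_gray_per_ka

# Hypothetical numbers chosen for illustration only:
print(luminescence_age_ka(equivalent_dose_gray=150.0, dose_rate_gray_per_ka=1.2))  # 125.0 ka
```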

Genetics questioned

So-called anatomically modern humans are thought to have emerged somewhere in Africa some 200,000 years ago. They later spread out, migrating to other continents where they displaced the indigenous human groups such as the Neanderthals in Europe and the Denisovans in Asia. DNA from the cell's powerhouses - or mitochondria - can be used as a "clock" for reconstructing the timing of human migrations. This is because mitochondrial DNA (mtDNA) accumulates mutations, or changes, at a known rate.

Researchers used a dating technique that relies on when the tools were buried.

Studies of mtDNA had suggested a timing for the "Out of Africa" exodus of 60,000-70,000 years ago. But scientists behind the latest study argue that the people who made tools at Jebel Faya 125,000 years ago are ancestral to humans living outside Africa today. Professor Uerpmann said the estimates of time using genetic data were "very rough". "The domestic dog was said to be 120,000 years old, and now it is 20,000. You can imagine how variable the genetic dating is," he explained.

Commenting on the findings, Professor Chris Stringer, a palaeoanthropologist at London's Natural History Museum, said: "This archaeological work by Armitage and colleagues provides important clues that early modern humans might have dispersed from Africa across Arabia, as far as the Straits of Hormuz, by 120,000 years ago. "This research augments the controversial idea that such populations could have migrated even further across southern Asia, despite conflicting genetic data that such movements only occurred after 60,000 years."

Multiple migrations?

The researchers say the toolmakers at Jebel Faya may have reached the Arabian Peninsula at a time when changes in the climate were transforming it from arid desert into a grassland habitat with lakes and rivers. These human groups could later have moved on towards the Persian Gulf, trekking around the Iranian coast and on to South Asia. Indeed, Dr Mike Petraglia at the University of Oxford has uncovered tools in India that he says could have been made by modern humans before 60,000 years ago. Some tools were sandwiched in ash from the eruption of the Toba super-volcano in Indonesia that geologists can date very accurately to 74,000 years ago.

However, other researchers suggest that the people living in India at this time could have died out and been replaced by a later wave of humans. Anthropologists already knew of an early foray out of Africa by modern humans. Remains found at Skhul and Qafzeh in Israel date to between 119,000 and 81,000 years ago. But the Skhul and Qafzeh people are generally thought to have died out or retreated south, perhaps because of climatic fluctuations. They subsequently disappeared, and their sites were re-occupied by Neanderthals.

Professor Stringer said the fact that the tools found at Jebel Faya did not resemble those associated with modern humans at Qafzeh and Skhul hinted at "yet more complexity in the exodus of modern humans from Africa". He posed the question: "Could there have been separate dispersals, one from East Africa into Arabia, and another from North Africa into the Levant?"

Thursday, January 27, 2011

Lack of sleep 'linked to early death'


Not too little sleep, yet not too much, the experts advise.

Getting less than six hours' sleep a night can lead to an early grave, UK and Italian researchers have warned. They said people regularly getting so little sleep were 12% more likely to die over a 25-year period than those who got an "ideal" six to eight hours.

They also found an association between sleeping for more than nine hours and early death, although that much sleep may merely be a marker of ill health. The journal Sleep reports the findings, based on 1.5 million people in 16 studies. The study looked at the relationship between sleep and mortality by reviewing earlier studies from the UK, US, and other European and East Asian countries.

Premature death from all causes was linked to getting either too little or too much sleep outside of the "ideal" six to eight hours per night. But while a lack of sleep may be a direct cause of ill health, ultimately leading to an earlier death, too much sleep may merely be a marker of ill health already, the UK and Italian researchers believe.

Time pressures

Professor Francesco Cappuccio, leader of the Sleep, Health and Society Programme at the UK's University of Warwick, said: "Modern society has seen a gradual reduction in the average amount of sleep people take and this pattern is more common amongst full-time workers, suggesting that it may be due to societal pressures for longer working hours and more shift-work. "On the other hand, the deterioration of our health status is often accompanied by an extension of our sleeping time."


If the link between a lack of sleep and death is truly causal, it would equate to over 6.3 million attributable deaths in the UK in people over 16 years of age. Prof Cappuccio said more work was needed to understand exactly why sleep seemed to be so important for good health. Professor Jim Horne, of the Loughborough Sleep Research Centre, said other factors may be involved rather than sleep per se.
"Sleep is just a litmus paper to physical and mental health. Sleep is affected by many diseases and conditions, including depression," he said.

And getting improved sleep may not make someone better or live longer, he said. "But having less than five hours a night suggests something is probably not right. Five hours is insufficient for most people, and being drowsy in the day increases your risk of having an accident if driving or operating dangerous machinery."





Wednesday, January 26, 2011

Tobacco plants may be new incubator for vaccines for flu – or bio-terrorism


DEBRA BLACK


A $21-million infusion from the U.S. Defense Department’s research arm will help a Canadian company set up a manufacturing plant to incubate flu vaccine in tobacco leaves. Medicago Inc., a Quebec City-based biotech firm, is setting up an 85,000-square-foot facility in Durham, N.C., to manufacture 10 million doses of flu vaccine a month using its new technology. The company’s start-up there will translate into about 85 new jobs in Durham by the end of the 14-month contract, with some new hires in Canada as well.

Medicago has a unique technology that uses tobacco plants rather than hen’s eggs to manufacture the vaccine, said Andy Sheldon, the CEO of the company. The production process is relatively simple and amazingly fast, Sheldon said. A five-week-old Australian tobacco plant, which does not contain nicotine, is put in a solution full of bacteria carrying the DNA that encodes the vaccine. The leaves and solution are put in a steel tank and a vacuum is created.

The plants then absorb the information that is carried in the solution. The plants are pulled out of the steel tank and are incubated in a greenhouse for five days. The cells within the tobacco leaves then produce the protein for the vaccine. Next, scientists “extract the protein from the leaves,” said Sheldon, by breaking down the cellulose of the leaf.

But it’s not just the battle against influenza that concerns the United States’ Defense Advanced Research Projects Agency, prompting it to fund work at Medicago. One of the things that the Quebec firm’s technology could be used for is vaccines or antidotes for biological terrorism or warfare.

“DARPA understands that there needs to be faster technology, not just for vaccines but for any bio threat that comes along,” Sheldon said in an interview with the Star. “There’s a large market for bio-threat products,” he said. Over the next 14 months, the Quebec-based company will build a plant, scale it up and then see if it can produce 10 million doses of H1N1 in one month, a considerably shorter time than it took companies to produce a vaccine last year when the fear of an H1N1 pandemic gripped the world. It took almost six months before an H1N1 vaccine was ready to be shipped, leaving many to panic when the vaccine was not immediately available.

“What we saw through the pandemic flu scenario last year,” said Sheldon, “(was) you couldn’t deliver product to market in time.” But Sheldon said that, with the technique using tobacco leaves, the company can produce a large amount of product very quickly. “We’re quicker. We believe we can get to market faster. But we’re going to need all kinds of technology to meet demand. Egg-based and cell-based culture systems are still good.”

Meanwhile, Medicago’s Quebec plant is completing clinical trials on an avian flu vaccine also made using tobacco plants.





Tuesday, January 25, 2011

Blocking a gene stops cancer cells spreading



By James Gallagher Health reporter, BBC News

Blocking a gene could prevent cancer cells spreading


A gene which encourages cancer to move around the body has been discovered by researchers at the University of East Anglia. Experiments on tissue cultures, published in Oncogene, suggest that blocking it would prevent cancers spreading. The researchers hope their work will lead to a new generation of cancer drugs within the decade.

Cancer Research UK said the study improved understanding of the disease, but was still at the laboratory stage. There are treatments for primary cancers, but tumours have the potential to spread. Cells can break off and travel around the body, through the bloodstream or lymph fluid, and start a new or secondary tumour where they land, a process known as metastasis. Breast cancers are known to spread to lymph nodes, the bones and the lungs. These secondary tumours are notoriously difficult to treat.

The rogue gene

The team at the University of East Anglia has found a gene which helps the cancer spread.

Breast cancer can spread to the spine, skull, pelvis, ribs, shoulders, hips and knees.

The gene, WWP2, leads to the breakdown of an inhibitor that normally keeps cells in check. The researchers showed, in tissue cultures, that without the inhibitor, Smad7, cancer progressed very quickly and spread.

Blocking the gene prevented that spread.

Dr Andrew Chantry, who led the study, said: "I think we're really onto something important if we can put a wall around a cancer and lock it in place. The discovery could lead to the development of a new generation of drugs within the decade that could be used to stop the aggressive spread of most forms of the disease."
The team are now recruiting chemists to help them design a drug which could interrupt the gene's activity.

Dr Kat Arney, science information manager at Cancer Research UK, said: "Over recent decades researchers all over the world have discovered genes that drive the growth and spread of cancer, and this research adds one more to this ever-growing list. "But, while these new results aid our understanding of the complexities of cancer and could point towards potential leads for future anti-cancer drugs, the work is still at the laboratory stage."

Monday, January 24, 2011

Brain cooling could aid stroke recovery



The bodies of stroke patients are cooled using ice cold intravenous drips.

Cooling the brain of patients who have suffered a stroke could dramatically improve their recovery, a group of Scottish doctors has said. They are joining others from across Europe who believe that inducing hypothermia in some patients can boost survival rates and reduce brain damage. Similar techniques have already been tried successfully on heart attack patients and those with birth injuries. Scientists are in Brussels to discuss a Europe-wide trial of the technique.
To date, studies have involved the body of patients being cooled using ice cold intravenous drips and cooling pads applied to the skin.
This lowers the body temperature to about 35C, just a couple of degrees below its normal level.
The technique puts the body into a state of artificial hibernation, where the brain can survive with less blood supply, giving doctors vital time to treat blocked or burst blood vessels. Dr Malcolm Macleod, head of experimental neuroscience at the Centre for Clinical Brain Sciences at the University of Edinburgh, said: "Every day 1,000 Europeans die from stroke - that's one every 90 seconds - and about twice that number survive but are disabled. Our estimates are that hypothermia might improve the outcome for more than 40,000 Europeans every year."

Space travel

Dr Macleod and his Scottish team are joining a consortium of clinicians from across Europe to seek funding for a trial involving 1,500 stroke patients. Speaking for the European Stroke Research Network for Hypothermia (EuroHYP), a group of European researchers from more than 20 countries, Dr Macleod added: "The preliminary evidence is all there - now it is time for Europe to act." The European research project, which will also include hospitals in Germany, Italy and France, is being led by Professor Dr Stefan Schwab.

Dr Schwab said: "We know the financial situation is difficult, but based on current evidence, the personal and economic benefits of avoiding stroke related death and disability means that the trial would pay for itself in less than a year.

"As the population ages, this trial will become even more important, and a benefit of cooling demonstrated in the proposed study will set the stage for future studies with hypothermia, extending the eligibility of the treatment to even greater number of patients." The progress of the clinicians is also reportedly being watched by those from the European Space Agency because of its possible application for the future of long distance space travel.

Sunday, January 23, 2011

Human Thought Can Voluntarily Control Neurons in Brain

Neuroscience research involving epileptic patients with brain electrodes surgically implanted in their medial temporal lobes shows that patients learned to consciously control individual neurons deep in the brain with thoughts.

Subjects learned to control mouse cursors, play video games and alter focus of digital images with their thoughts. The patients were each using brain computer interfaces, deep brain electrodes and software designed for the research.

Controlling Individual Cortical Nerve Cells by Human Thought


Five years ago, neuroscientist Christof Koch of the California Institute of Technology (Caltech), neurosurgeon Itzhak Fried of UCLA, and their colleagues discovered that a single neuron in the human brain can function much like a sophisticated computer and recognize people, landmarks, and objects, suggesting that a consistent and explicit code may help transform complex visual representations into long-term and more abstract memories.

Now Koch and Fried, along with former Caltech graduate student and current postdoctoral fellow Moran Cerf, have found that individuals can exert conscious control over the firing of these single neurons—despite the neurons’ location in an area of the brain previously thought inaccessible to conscious control—and, in doing so, manipulate the behavior of an image on a computer screen. The work, which appears in a paper in the October 28 issue of the journal Nature, shows that “individuals can rapidly, consciously, and voluntarily control neurons deep inside their head,” says Koch, the Lois and Victor Troendle Professor of Cognitive and Behavioral Biology and professor of computation and neural systems at Caltech.

The study was conducted on 12 epilepsy patients at the David Geffen School of Medicine at UCLA, where Fried directs the Epilepsy Surgery Program. All of the patients suffered from seizures that could not be controlled by medication. To help localize where their seizures were originating in preparation for possible later surgery, the patients were surgically implanted with electrodes deep within the centers of their brains. Cerf used these electrodes to record the activity, as indicated by spikes on a computer screen, of individual neurons in parts of the medial temporal lobe—a brain region that plays a major role in human memory and emotion.

Prior to recording the activity of the neurons, Cerf interviewed each of the patients to learn about their interests. “I wanted to see what they like—say, the band Guns N’ Roses, the TV show House, and the Red Sox,” he says. Using that information, he created for each patient a data set of around 100 images reflecting the things he or she cares about. The patients then viewed those images, one after another, as Cerf monitored their brain activity to look for the targeted firing of single neurons. “Of 100 pictures, maybe 10 will have a strong correlation to a neuron,” he says. “Those images might represent cached memories—things the patient has recently seen.”

The four most strongly responding neurons, representing four different images, were selected for further investigation. “The goal was to get patients to control things with their minds,” Cerf says. By thinking about the individual images—a picture of Marilyn Monroe, for example—the patients triggered the activity of their corresponding neurons, which was translated first into the movement of a cursor on a computer screen. In this way, patients trained themselves to move that cursor up and down, or even play a computer game.

But, says Cerf, “we wanted to take it one step further than just brain–machine interfaces and tap into the competition for attention between thoughts that race through our mind.” To do that, the team arranged for a situation in which two concepts competed for dominance in the mind of the patient. “We had patients sit in front of a blank screen and asked them to think of one of the target images,” Cerf explains. As they thought of the image, and the related neuron fired, “we made the image appear on the screen,” he says. That image is the “target.” Then one of the other three images is introduced, to serve as the “distractor.”

“The patient starts with a 50/50 image, a hybrid, representing the ‘marriage’ of the two images,” Cerf says, and then has to make the target image fade in—just using his or her mind—and the distractor fade out. During the tests, the patients came up with their own personal strategies for making the right images appear; some simply thought of the picture, while others repeated the name of the image out loud or focused their gaze on a particular aspect of the image. Regardless of their tactics, the subjects quickly got the hang of the task, and they were successful in around 70 percent of trials.
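The paper's actual software is not described here, so the Python sketch below is only a loose illustration of the closed-loop idea in the preceding paragraphs (every name and number in it is my own stand-in): on each update the firing rates of the "target" and "distractor" neurons are compared, and the target image's share of the composite is nudged toward whichever concept is currently winning.

```python
import random

def update_blend(blend, target_rate, distractor_rate, step=0.05):
    """Nudge the target image's share of the composite toward the more active neuron."""
    if target_rate > distractor_rate:
        blend += step
    elif distractor_rate > target_rate:
        blend -= step
    return min(1.0, max(0.0, blend))  # keep the share within [0, 1]

# Hypothetical trial: start at a 50/50 hybrid and run a bounded number of updates.
blend = 0.5
for _ in range(200):
    target_rate = random.gauss(12, 3)      # stand-in for the measured spike rate (Hz)
    distractor_rate = random.gauss(10, 3)  # stand-in for the competing neuron's rate
    blend = update_blend(blend, target_rate, distractor_rate)

print("target image's final share of the composite:", round(blend, 2))
```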

“The patients clearly found this task to be incredibly fun as they started to feel that they control things in the environment purely with their thought,” says Cerf. “They were highly enthusiastic to try new things and see the boundaries of ‘thoughts’ that still allow them to activate things in the environment.” Notably, even in cases where the patients were on the verge of failure—with, say, the distractor image representing 90 percent of the composite picture, so that it was essentially all the patients saw—“they were able to pull it back,” Cerf says. Imagine, for example, that the target image is Bill Clinton and the distractor George Bush. When the patient is “failing” the task, the George Bush image will dominate. “The patient will see George Bush, but they’re supposed to be thinking about Bill Clinton. So they shut off Bush—somehow figuring out how to control the flow of that information in their brain—and make other information appear. The imagery in their brain,” he says, “is stronger than the hybrid image on the screen.”

According to Koch, what is most exciting “is the discovery that the part of the brain that stores the instruction ‘think of Clinton’ reaches into the medial temporal lobe and excites the set of neurons responding to Clinton, simultaneously suppressing the population of neurons representing Bush, while leaving the vast majority of cells representing other concepts or familiar persons untouched.” The work in the paper, “On-line voluntary control of human temporal lobe neurons,” is part of a decade-long collaboration between the Fried and Koch groups, funded by the National Institute of Neurological Disorders and Stroke, the National Institute of Mental Health, the G. Harold & Leila Y. Mathers Charitable Foundation, and Korea’s World Class University program.

Contact: Kathy Svitil

Source: California Institute of Technology (Caltech)



Saturday, January 22, 2011

Better Learning

Practicing Memory Recall Boosts Science Learning

Psychology researchers recently found that practicing memory recall led to improved long-term retention of science information when compared to other learning techniques. The researchers compared students who learned by using concept maps with a second group that practiced retrieval. They found that the students who practiced retrieval performed better on long-term retention tests. This learning and memory research could help improve science teaching methods and lead to new strategies to teach science more effectively.

Science Learning Easier When Students Put Down Textbooks and Actively Recall Information

Actively recalling information from memory beats elaborate study methods. Put down those science textbooks and work at recalling information from memory. That’s the shorthand takeaway message of new research from Purdue University that says practicing memory retrieval boosts science learning far better than elaborate study methods. “Our view is that learning is not about studying or getting knowledge ‘in memory,’” said Purdue psychology professor Jeffrey Karpicke, the lead investigator for the study that appears today in the journal Science. “Learning is about retrieving. So it is important to make retrieval practice an integral part of the learning process.”

Educators traditionally rely on learning activities that encourage elaborate study routines and techniques focused on improving the encoding of information into memory. But, when students practice retrieval, they set aside the material they are trying to learn and instead practice calling it to mind. The study, “Retrieval Practice Produces More Learning Than Elaborative Studying With Concept Mapping,” tested both learning strategies alongside each other. The research was funded by the National Science Foundation’s Division of Undergraduate Education.

“In prior research, we established that practicing retrieval is a powerful way to improve learning,” said Karpicke. “Here we put retrieval practice to the test by comparing its effectiveness to an elaborative study method, specifically elaborative studying by creating concept maps.” Concept mapping requires students to construct a diagram–typically using nodes or bubbles–that shows relationships among ideas, characteristics or materials. These concepts are then written down as a way of encoding them in a person’s memory.

The researchers say the practice is used extensively for learning about concepts in sciences such as biology, chemistry or physics. In two studies, reported by Karpicke and his colleague, Purdue University psychology student Janell Blunt, a total of 200 students studied texts on topics from different science disciplines. One group engaged in elaborative study using concept maps while a second group practiced retrieval; they read the texts, then put them away and practiced freely recalling concepts from the text.

After an initial study period, both groups recalled about the same amount of information. But when the students returned to the lab a week later to assess their long-term learning, the group that studied by practicing retrieval showed a 50 percent improvement in long-term retention over the group that studied by creating concept maps. This was despite the students’ own predictions about how much they would actually remember. “Students do not always know what methods will produce the best learning,” said Karpicke in discussing whether students are good at judging the success of their study habits.

He found that when students have the material right in front of them, they think they know it better than they actually do. “It may be surprising to realize that there is such a disconnect between what students think will afford good learning and what is actually best. We, as educators, need to keep this in mind as we create learning tools and evaluate educational practices,” he said.

The researchers showed retrieval practice was superior to elaborative studying in all comparisons.

“The final retention test was one of the most important features of our study, because we asked questions that tapped into meaningful learning,” said Karpicke.

The students answered questions about the specific concepts they learned as well as inference questions asking them to draw connections between things that weren’t explicitly stated in the material. On both measures of meaningful learning, practicing retrieval continued to produce better learning than elaborative studying. Karpicke says there’s nothing wrong with elaborative learning, but argues that a larger place needs to be found for retrieval practice. “Our challenge now is to find the most effective and feasible ways to use retrieval as a learning activity–but we know that it is indeed a powerful way to enhance conceptual learning about science.”

Principal Investigator: Jeffrey Karpicke, Purdue University

The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2010, its budget is about $6.9 billion. NSF funds reach all 50 states through grants to nearly 2,000 universities and institutions. Each year, NSF receives over 45,000 competitive requests for funding, and makes over 11,500 new funding awards. NSF also awards over $400 million in professional and service contracts yearly.

Source: NSF News

Elaborate study methods that use concept maps like this one require students to construct a diagram--typically using nodes or bubbles--that shows relationships among ideas, characteristics, materials, etc. Recent test results showed retrieval practice was superior to elaborative studying in all comparisons. Credit: Vicwood40, Creative Commons Attribution-Share Alike 3.0 Unported license

Friday, January 21, 2011

Too much sitting after work harms heart: study


CBC News

As TV viewing options expand to include 3D, doctors are warning heart risks grow as time in front of televisions and computers increases.

Spending more than two hours a day of leisure time in front of a TV or computer screen may increase the risk of heart disease and shorten life, a new study suggests. British researchers compared the effects of sitting for different lengths of time during leisure hours, outside of work.

People who spent more than four hours each day watching TV, using a computer or playing video games were about twice as likely to suffer a major cardiac event, Emmanuel Stamatakis of University College London's department of epidemiology and public health and his co-authors reported in Monday's issue of the Journal of the American College of Cardiology.

Stamatakis is calling for public health guidelines to warn people of the risks of being physically inactive during their down time. The warnings are important, "especially as a majority of working age adults spend long periods being inactive while commuting or being slouched over a desk or computer," the study's authors wrote. The study focused on 4,512 adults who took part in a survey of Scottish households. Participants said how much time they spent watching TV, DVDs, using computers and playing video games.

There was a 1.52 times higher risk of cardiovascular events such as heart attack, stroke or heart failure among participants reporting four or more hours of screen time compared with those who said they got less than two hours. During more than four years of follow-up, 325 of the subjects died and 215 had a cardiovascular event, the researchers reported. The study is the first time researchers have examined the association between screen time and cardiovascular health in such detail, the authors said.

Inflammation and metabolism

The researchers took into account traditional risk factors such as smoking, hypertension, body mass index, social class and exercise. Stamatakis's team did find an association between levels of inflammation and cholesterol in sedentary people, which they say explains about one-fourth of the relationship between sitting and heart health.

The study helps doctors to understand the significant role that a sedentary lifestyle plays in heart disease, Dr. Suzanne Steinbaum, a preventive cardiologist at Lenox Hill Hospital in New York City and a spokeswoman for the American Heart Association, told HealthDay.

Steinbaum suggested that people avoid sitting when they don't have to during leisure time and try to get moving instead.

Read more: http://www.cbc.ca/health/story/2011/01/10/screen-time-heart-tv-computer-sedentary.html#ixzz1AgYPNlWf



Thursday, January 20, 2011

Amoebas show primitive farming behaviour as they travel


The amoeba is known to gather together in large "fruiting bodies"
A species of amoeba - among the simplest life forms on Earth - has been seen "farming" the bacteria it eats.
When the bacteria become scarce, the Dictyostelium discoideum slime mould gathers up into a "fruiting body" that disperses spores to a new area.

Research described in Nature shows that a third of these spores contain some of the bacteria to grow at the new site. Food management has been seen in animals including ants and snails, but never in creatures as simple as these. The behaviour falls short of the kind of "farming" that more advanced animals do; ants, for example, nurture a single fungus species that no longer exists in the wild. But the idea that an amoeba that spends much of its life as a single-celled organism could stop short of consuming a food supply before decamping is an astonishing one.

More than just a snack for the journey of dispersal, the idea is that the bacteria that travel with the spores can "seed" a new bacterial colony, and thus a food source, in case the new locale should be lacking in bacteria. D. discoideum is already something of a famous creature, having proven its "social" nature as it gathers together into a mobile, multicellular structure in which a fifth of the individuals die, to the benefit of the ones that make it into the fruiting body.

Researchers from Rice University in Texas, looking to study the amoebas further, happened across another, truly unique behaviour - discovered perhaps because the samples came from the wild, rather than being grown in the laboratory. "It was a bit of serendipity, really," said Debra Brock, lead author of the Nature study. "I had previously worked with them, looking at developmental genes. Not many people work with wild clones, but I had started in a new lab and my advisers had a large collection of them, and I came with a bit of a different perspective."

Costly choice

Once Ms Brock spotted the amoebas' fruiting bodies carrying bacteria, she measured how many of the spores were responsible, finding that about a third of them traveled with their bacterial seeds. The behaviour seems to be genetically built-in; clones of the "farmer" amoebas in turn developed into farmers, while clones of the "non-farmers" did not.

The bacteria form the basis of a food crop at the spores' new locations

"To think of a single-celled amoeba performing something that you could consider farming, I think, is surprising," Ms Brock said."Choices like that are generally costly, so there has to be a pretty large benefit for it to persist in nature."
That is to say, the amoebas, in choosing not to consume all of the bacteria around them, are forced to make smaller fruiting bodies that cannot travel as far when they disperse.There is thus an evolutionary balance to be struck between the advantage gained by showing up with the beginnings of a crop and the cost of bringing it.

Jacobus Boomsma of the University of Copenhagen said that the find was a surprising one and gave insight that has been absent in the farming creatures already known to science. For example, all the individuals of a given ant or termite species farm particular species of fungus exclusively, and the "free" versions of the advanced farmed fungi no longer exist.

"Here, farming and non-farming [members of the species] coexist, so they look perfectly normal until you put them under the microscope and know what you're looking at; [the bacteria] don't assume specialisesd roles as crops like fungi that ants and termites rear," Professor Boomsma told BBC News.
"In other farming systems that we see, they always lack this intermediate stage."

Ms Brock said that further study has already found other species of amoeba that "pack a lunch", and that D. discoideum carries more than just a snack. "Bacteria generally provide huge resources that are really untapped," Ms Brock said. "These amoebas carry bacteria that aren't just used for food, so that's what I'm looking into now."

Tuesday, January 18, 2011

Climate secrets of Marianas Trench probed

By Rebecca Morelle Science reporter, BBC News

The scientists used a hi-tech submersible to study the trench

The climate secrets of the deepest part of the ocean, the Marianas Trench in the western Pacific Ocean, have been probed by scientists. The international team used a submersible, designed to withstand immense pressures, to study the bottom of the 10.9km-deep underwater canyon. Their early results reveal that ocean trenches are acting as carbon sinks. This suggests that they play a larger role in regulating the Earth's chemistry and climate than was thought.

Although two explorers, Jacques Piccard and Don Walsh, reached the deepest part of the Marianas Trench - a point called the Challenger Deep - in 1960, no humans have been back since. And the handful of scientific missions, including this recent visit to this deepest spot, have been carried out using unmanned underwater vehicles. Lead researcher Professor Ronnie Glud, from the University of Southern Denmark and the Scottish Association for Marine Science (Sams), said that working at more than 1,000 atmospheres of pressure was challenging, but advances in technology had made it possible. He told BBC News: "This is the first time we have been able to set down sophisticated instruments at these depths to measure how much carbon is buried there."

Under pressure

Professor Glud, working with scientists from the Japan Agency for Marine Earth Science and Technology (Jamstec) and from the UK and Germany, used a lander equipped with special sensors packed in a titanium cylinder that was able to resist the remarkable pressures. The lander was launched from a ship and took three hours to free-fall to the sea bottom, where it carried out pre-programmed experiments before releasing its ballast and returning to the surface. The tests helped the scientists to assess the abundance of carbon at those murky depths. Professor Glud said: "Basically, we are interested in understanding how much organic material - that is all the material produced by algae or fish in the water above - settles at the sea bed, and is either eaten by bacteria and degraded or is buried.

"The ratio that is either degraded or buried is the ultimate process determining what are the oxygen and carbon dioxide concentrations of the oceans and the atmosphere, and this gives us an overall picture of how efficiently the sea can capture and sequester carbon in the global carbon cycle."

While this has been studied in other parts of the ocean, such as the abyssal plain - the large flat area of the ocean that lies between 4.6km and 5.5km of depth - the role deep sea trenches play in the carbon cycle has until now remained largely unknown. Professor Glud said: "Although these trenches cover just 2% of the ocean, we thought they might be disproportionately important, because it was likely that they would accumulate much more carbon because they would act as a trap, with more organic matter drifting to the bottom of them than in other parts of the ocean." He explained that preliminary data from his experiments suggested that this was the case.

He said: "Our results very strongly suggest that the trenches do act as sediment traps. And they also had high activity, meaning that more carbon is turned over by bacteria in the trenches than is turned over at 6,000m of depth in the abyssal plain.“To see an experiment such as this carried out at these extreme depths is a great leap forward in deep-sea science.  Dr Alan Jamieson Oceanlab

"What it means is that we have carbon storage going on in these trenches that is higher than we thought before, and this really means that we have a carbon dioxide sink in the deep ocean that wasn't recognised before."The next stage for the team is to quantify their results and work out exactly how much more carbon is stored in deep sea trenches compared with other parts of the sea, and how much carbon turnover by bacteria is being carried out. This, the researchers said, should help them to better establish the role of the ocean trenches in regulating climate.

Surprising finds

This is not the first time deep sea trenches have surprised scientists. Recent studies by the University of Aberdeen's Oceanlab team have revealed that marine life is much more abundant in this hostile habitat than was previously thought. In 2008, they filmed the deepest living fish ever to be caught on camera - a 17-strong shoal found at depths of 7.7km in the Japan Trench - and revealed that other animals such as amphipods were present in large numbers even deeper.


Dr Alan Jamieson, from Oceanlab, said the new study was helping researchers to build up a better idea of what happens in the deepest of the deep. He said: "The trenches continue to amaze us. And to see an experiment such as this carried out at these extreme depths is a great leap forward in deep-sea science.
"These studies will greatly enhance our understanding of how the deep trenches contribute to carbon cycling in the world's oceans."

Monday, January 17, 2011

Gaming as a way to teach boys

As the mother of four sons, I have long believed that our education system is failing our boys in particular. I have also believed that gaming is the ideal way to re-engage boys. Rather than sending them on a mindless quest, why not get them to solve mathematical or historical problems? I am posting the following TED talk, which holds the point of view that gaming can reach boys. I find the talk a bit defensive, but its main point is valid.

Sunday, January 16, 2011

Harvesting energy: body heat to warm buildings

About 250,000 people pass through Stockholm's Central Station each day.

Body heat is not an energy source that normally springs to mind when companies want to keep down soaring energy costs. But it did spring to the mind of one Swedish company, which decided the warmth that everybody generates naturally was in fact a resource that was going to waste.

Jernhusen, a real estate company in Stockholm, has found a way to channel the body heat from the hordes of commuters passing through Stockholm's Central Station to warm another building that is just across the road. "This is old technology being used in a new way. The only difference here is that we've shifted energy between two different buildings," says Klas Johansson, who is one of the creators of the system and head of Jernhusen's environmental division. "There are about 250,000 people a day who pass through Stockholm Central Station. They in themselves generate a bit of heat. But they also do a lot of activities. They buy food, they buy drinks, they buy newspapers and they buy books.

Excess body heat

All this energy generates an enormous amount of heat. So why shouldn't we use this heat? It's there. If we don't use it then it will just be ventilated away to no avail."

So how does the system work in practice?

Heat exchangers in the Central Station's ventilation system convert the excess body heat into hot water. That is then pumped to the heating system in the nearby building to keep it warm. Not only is the system environmentally friendly but it also lowers the energy costs of the office block by as much as 25%.
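The article gives the 25% saving but not the underlying heat budget, so here is a hedged back-of-envelope estimate in Python. The roughly 100 W of metabolic heat per person and the average dwell time are my assumptions, not Jernhusen's figures; only the 250,000 daily visitors comes from the article.

```python
# Back-of-envelope: heat the station's crowd could in principle supply per day.
PEOPLE_PER_DAY = 250_000   # from the article
WATTS_PER_PERSON = 100     # assumed resting metabolic heat output per person
DWELL_HOURS = 0.25         # assumed average time each visitor spends in the station

person_hours_per_day = PEOPLE_PER_DAY * DWELL_HOURS
energy_kwh_per_day = person_hours_per_day * WATTS_PER_PERSON / 1000

print(f"~{energy_kwh_per_day:,.0f} kWh of body heat per day")  # ~6,250 kWh/day
```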

"This is generally good business," says Mr Johansson. "We save money in energy costs and so the building becomes worth more. "We are quite surprised that people haven't done this before. For a large scale project like Kungbrohuset (the office block) this means a lot of money." Over the next 40 years, most experts agree that the supply of oil and gas will become less abundant.
There will be strong competition and higher prices for the resources that remain. Given the abundance of human body heat worldwide and the growing need for renewable energy to replace costly fossil fuels - is this Swedish idea going to catch on?

Costs and benefits

Stockholm's Central Station is one building reusing heat from passengers passing through

"People are now starting to think about urban heat distribution networks everywhere," says Doug King, a consultant specialising in design innovation and sustainable development in construction. "But the financial costs and the benefits will depend very much on the climate and the pricing of energy in a particular country." He explains that harnessing body heat works particularly well in Sweden because of their low winter temperatures and high gas prices.

Spin offs

"It means a low-grade waste heat source, like body heat, can be used advantageously. It's worth them spending a little bit of money on electricity to move heat from building to building, rather than spending a lot on heating with gas."

Mr Johansson is hoping there will be a lot of spin offs from their idea at Stockholm Central Station: "To get energy usage down in buildings what we need to do is use the energy that is being produced all around us. We own both Central Station and Kungbrohuset along with the land in between. So we are in charge of all of it and that has made it easier for us. But this doesn't mean that it cannot be done otherwise. It just means that real estate owners have to collaborate with each other."

He also advocates sustainability. "It's important in Sweden. But it should be important everywhere. Sustainability is the key ingredient to the future of mankind. We need to get sustainable with energy if we are supposed to live on this planet for a long time to come." But what about the Swedish commuters in the station - will they be the ones left out in the cold? "The commuters won't get chilly because we don't steal energy from Central Station; we use excess heat that was already there before," says Mr Johansson.

So, with its freezing winters, green credentials and high energy costs, Sweden is taking a creative approach to heating: to stay warm, all they need to do is keep the heat on. And if Stockholm's Central Station stays busy, then for one building at least it is well on the road to a low-carbon and energy-secure future.

Saturday, January 15, 2011

Is there a genius in all of us?

Those who think geniuses are born and not made should think again, says author David Shenk.

Where do athletic and artistic abilities come from? With phrases like "gifted musician", "natural athlete" and "innate intelligence", we have long assumed that talent is a genetic thing some of us have and others don't.

But new science suggests the source of abilities is much more interesting and improvisational. It turns out that everything we are is a developmental process and this includes what we get from our genes.

A century ago, geneticists saw genes as robot actors, always uttering the same lines in exactly the same way, and much of the public is still stuck with this old idea. In recent years, though, scientists have seen a dramatic upgrade in their understanding of heredity.

They now know that genes interact with their surroundings, getting turned on and off all the time. In effect, the same genes have different effects depending on who they are talking to.

Malleable

"There are no genetic factors that can be studied independently of the environment," says Michael Meaney, a professor at McGill University in Canada.

"It would be folly to suggest that anyone can literally do or become anything. But the new science tells us that it's equally foolish to think that mediocrity is built into most of us” David Shenk Author of The Genius in All of Us

"And there are no environmental factors that function independently of the genome. [A trait] emerges only from the interaction of gene and environment."

This means that everything about us - our personalities, our intelligence, our abilities - is actually determined by the lives we lead. The very notion of "innate" no longer holds together.

"In each case the individual animal starts its life with the capacity to develop in a number of distinctly different ways," says Patrick Bateson, a biologist at Cambridge University.

"The individual animal starts its life with the capacity to develop in a number of distinctly different ways. Like a jukebox, the individual has the potential to play a number of different developmental tunes. The particular developmental tune it does play is selected by [the environment] in which the individual is growing up."

Does this mean genes don't matter? Of course not. We're all different and have different theoretical potentials from one another. There was never any chance of me being Cristiano Ronaldo. Only tiny Cristiano Ronaldo had a chance of being the Cristiano Ronaldo we know now.

But we also have to understand that he could have turned out to be quite a different person, with different abilities. His future football magnificence was not carved in genetic stone.

Doomed

This new developmental paradigm is a big idea to swallow, considering how much effort has gone into persuading us that each of us inherits a fixed amount of intelligence, and that most of us are doomed to be mediocre.

How a London cabbie's brain grows
London cabbies famously navigate one of the most complex cities in the world.

In 1999, neuroscientist Eleanor Maguire conducted MRI scans on their brains and compared them with the brain scans of others.

In contrast with non-cabbies, experienced taxi drivers had a greatly enlarged posterior hippocampus - that part of the brain that specialises in recalling spatial representations.

What's more, the size of cabbies' hippocampi correlated directly with each driver's experience: the longer the driving career, the larger the posterior hippocampus.

That showed that spatial tasks were actively changing cabbies' brains. This was perfectly consistent with studies of violinists, Braille readers, meditation practitioners, and recovering stroke victims.

Our brains adapt in response to the demands we put on them.
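
To make the word "correlated" concrete, here is a minimal Python sketch that computes a Pearson correlation between years of driving and posterior hippocampal volume. The numbers are invented purely for illustration; they are not Maguire's data.

```python
# A minimal illustration of the kind of relationship reported for cabbies:
# years of taxi driving against posterior hippocampal volume.
# All values below are invented for illustration; they are not real data.

years_driving = [1, 3, 5, 8, 12, 15, 20, 25, 30, 35]
hippocampus_volume = [2.01, 2.03, 2.08, 2.10, 2.15, 2.14, 2.20, 2.24, 2.27, 2.31]  # arbitrary units

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# A value close to +1 means: the longer the career, the larger the volume.
print(f"r = {pearson_r(years_driving, hippocampus_volume):.2f}")
```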

The notion of a fixed IQ has been with us for almost a century. Yet the original inventor of the IQ test, Alfred Binet, had quite the opposite opinion, and the science turns out to favour Binet.

"Intelligence represents a set of competencies in development," said Robert Sternberg from Tufts University in the US in 2005, after many decades of study.

Talent researchers Mihaly Csikszentmihalyi, Kevin Rathunde and Samuel Whalen agree.

"High academic achievers are not necessarily born 'smarter' than others," they write in their book Talented Teenagers, "but work harder and develop more self-discipline."

James Flynn of the University of Otago in New Zealand has documented how IQ scores themselves have steadily risen over the century - which, after careful analysis, he ascribes to increased cultural sophistication. In other words, we've all gotten smarter as our culture has sharpened us.

Most profoundly, Carol Dweck from Stanford University in the US has demonstrated that students who understand that intelligence is malleable rather than fixed are much more intellectually ambitious and successful.

The same dynamic applies to talent. This explains why today's top runners, swimmers, bicyclists, chess players, violinists and on and on, are so much more skilful than in previous generations.

All of these abilities are dependent on a slow, incremental process which various micro-cultures have figured out how to improve. Until recently, the nature of this improvement was merely intuitive and all but invisible to scientists and other observers.

Soft and sculptable

But in recent years, a whole new field of "expertise studies", led by Florida State University psychologist Anders Ericsson, has emerged which is cleverly documenting the sources and methods of such tiny, incremental improvements.

Born to be a footballer?

Bit by bit, they're gathering a better and better understanding of how different attitudes, teaching styles and precise types of practice and exercise push people along very different pathways.

Does your child have the potential to develop into a world-class athlete, a virtuoso musician, or a brilliant Nobel-winning scientist?

It would be folly to suggest that anyone can literally do or become anything. But the new science tells us that it's equally foolish to think that mediocrity is built into most of us, or that any of us can know our true limits before we've applied enormous resources and invested vast amounts of time.

Our abilities are not set in genetic stone. They are soft and sculptable, far into adulthood. With humility, with hope, and with extraordinary determination, greatness is something to which any kid - of any age - can aspire.

David Shenk is the author of The Genius in All of Us.

Friday, January 14, 2011

Whatever happened to Little Albert?

Psychology


Little Albert was a young child who became famous as the subject of a series of psychological experiments, conducted in the early days of the field, in which researchers deliberately induced fears in him.

Thursday, January 13, 2011

Treatment of traumatic brain injury like Ms Giffords'

The technique, called a hemicraniectomy, may sound crude and barbaric to the untrained observer, but it could mean the difference between life and death for Ms Giffords. Although she may have beaten the bullet, it is the damage it has left in its wake that her doctors now need to worry about.

Like any other part of the body, when the brain is injured it will swell. But because it is housed in a bony box - the skull - the swelling has nowhere to go. Left untreated, the pressure would mount and cause further damage to the jelly-like substance that is the brain.

Hole in the head

Excessive intracranial pressure can damage delicate brain tissues, leading to lasting disability. Even higher pressure can cause death. Faced with this, doctors have few options other than to find a way to let the pressure out.

They can try drugs to bring down the swelling or drain off some of the fluid that bathes the brain, but in extreme cases, surgery may be the only answer. For the procedure, the surgeon removes a section of the skull - a "bone flap" - to give the swelling brain room to expand. The bone flap is preserved in a fridge until it can be replaced once doctors have the swelling under control.

Eventually it can be screwed back on using metal plates, which can be removed once the bone has knitted together. Making a hole in the head is not a new idea; surgeons have been doing it for centuries. Evidence of trepanation, or making burr holes, has been found in prehistoric human remains from Neolithic times onward. Those early operations used sharp objects such as teeth as tools, rather than the precision surgical saws and drills used today.

Historians believe the procedures were used to treat a range of ailments, possibly including mental illness as well as epilepsy and migraines. While this sounds too dangerous to attempt in the days before modern medicine and the discovery of antibiotics, human remains show that some patients did survive the operation.

Resting brain

Today, craniectomies are frequently used by military surgeons in Afghanistan to treat soldiers with severe traumatic brain injury due to bomb blast and high velocity penetrating missile injuries.

Wednesday, January 12, 2011

¾ of Ocean life still unknown

At least three-quarters of the world's ocean species remain unknown following a 10-year census of marine life, Canadian researchers say.

"We've estimated that for every species we know about, there's probably another three or four that we don't know, that have never been sampled by science," said Paul Snelgrove, a professor at the Memorial University of Newfoundland's Ocean Science Centre who led the group that compiled the results of the international Census of Marine Life.

Larger species tend to be better known, and when it comes to small invertebrates and microbes, "our level of knowledge is zero in many parts of the ocean," said Snelgrove Tuesday.

He was speaking during a break in a two-day meeting in Ottawa of Canadian marine biologists and representatives of ocean-focused federal agencies and departments such as Fisheries and Oceans.

Even in well-known areas of Canada's coast, new species have been pulled up as recently as this past summer, including a possibly new species of sponge found near Nova Scotia in July. (Bedford Institute of Oceanography/Canadian Press)

The meeting is a Canadian follow-up to the census completed in 2010. Its goal is to figure out how to harness Canadian marine biodiversity expertise and apply it to make sustainable use of fish and other ocean resources.

Snelgrove said universities have a lot of the infrastructure and resources needed to keep tabs on the oceans and make informed decisions.

"So, partnerships between universities and federal agencies and between agencies, I think, are the only way that we can continue to do a better on job on sustainable oceans," Snelgrove said.

At the meeting, much of the focus was on how to decide the boundaries of protected marine areas.

"If we're going to give up those areas for activities like fishing … then we want to put them in areas where they're going to be most effective," Snelgrove said.

In the case of spawning cod, for example, the protected areas should be ones where the eggs and young are likely to survive, he said, and not places like the edge of a continental shelf where many eggs would get swept out to sea.

Meanwhile, there has been increasing recognition that even small, inedible species play an important role in the oceans by contributing to the balance of resources like nutrients, oxygen and a good base for the food chain, Snelgrove said.

"Species that we may not think are important to us, in fact, are because of other relationships to the environment … and also to the species that we do harvest," Snelgrove said. "If we're losing some of them, as we know we are, are we losing aspects of ocean health with them?"

Arctic, deep ocean largely unexplored

Knowledge about marine species in two areas of Canadian waters in particular is limited, Snelgrove said:

• The Arctic, because of the ice that covers much of it.

• The deeper areas of the ocean, because exploring them is expensive.

Even off Nova Scotia and Canada's Pacific coast, new species have been pulled up as recently as this past summer, said Verena Tunnicliffe, director of VENUS, a group of underwater observatories near Vancouver and Victoria. She also holds a Canada Research Chair in deep ocean research at the University of Victoria.

Tunnicliffe addressed the conference Monday night with a talk called "Exploring the ocean frontiers — We have more to learn." Even in locations where most species are known, a lot of research continues, slowly unravelling how environmental factors interact with marine life. One example is the shipping lane of Vancouver harbour, where researchers are exploring how factors such as oxygen, temperature, carbon or even noise affect plankton blooms and zooplankton.

"Theres's tonnes left to discover," she said. "This really is a major frontier."
Read more: http://www.cbc.ca/technology/story/2011/01/11/marine-species-census.html#ixzz1AlrjPzJF

Tuesday, January 11, 2011

Neurobiologists Genetically Engineer Mice to Smell Light

Neurobiologists have genetically engineered mice to smell light. This optogenetics research provides a better understanding of the neural basis of olfaction. By integrating light sensitive proteins, channelrhodopsins, into the olfactory system of mice, the neurobiologists were better able to control tests involving the olfactory bulb. Those interested in optogenetics, genetics, neuroscience, olfactory systems and neural systems in general may appreciate the article.
Mice that ‘smell’ light could help us better understand olfaction

Harvard University neurobiologists have created mice that can “smell” light, providing a potent new tool that could help researchers better understand the neural basis of olfaction. The work, described this week in the journal Nature Neuroscience, has implications for the future study of smell and of complex perception systems that do not lend themselves to easy study with traditional methods. “It makes intuitive sense to use odors to study smell,” says Venkatesh N. Murthy, professor of molecular and cellular biology at Harvard. “However, odors are so chemically complex that it is extremely difficult to isolate the neural circuits underlying smell that way.”

Murthy and his colleagues at Harvard and Cold Spring Harbor Laboratory used light instead, applying the infant field of optogenetics to the question of how cells in the brain differentiate between odors.
Optogenetic techniques integrate light-reactive proteins into systems that usually sense inputs other than light. Murthy and his colleagues integrated these proteins, called channelrhodopsins, into the olfactory systems of mice, creating animals in which smell pathways were activated not by odors, but rather by light.
“In order to tease apart how the brain perceives differences in odors, it seemed most reasonable to look at the patterns of activation in the brain,” Murthy says. “But it is hard to trace these patterns using olfactory stimuli, since odors are very diverse and often quite subtle. So we asked: What if we make the nose act like a retina?”
With the optogenetically engineered animal, the scientists were able to characterize the patterns of activation in the olfactory bulb, the brain region that receives information directly from the nose. Because light input can easily be controlled, they were able to design a series of experiments stimulating specific sensory neurons in the nose and looking at the patterns of activation downstream in the olfactory bulb.
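The logic of that experimental design can be sketched in a few lines of code: flash one light-addressable input channel at a time, record which downstream units respond, and assemble the results into an input-to-output map. The sketch below is purely schematic; the channel counts and the random connectivity matrix are invented and model nothing about real olfactory-bulb physiology or the authors' analysis.

```python
# A purely schematic sketch of the mapping logic that controllable light
# input makes possible: flash one input channel at a time and record which
# downstream units respond. The counts and the random "wiring" are invented.

import numpy as np

rng = np.random.default_rng(0)

N_INPUTS = 8    # hypothetical light-addressable sensory channels
N_OUTPUTS = 12  # hypothetical downstream olfactory-bulb units

# The unknown wiring we want to characterise (random for this sketch)
true_wiring = rng.random((N_OUTPUTS, N_INPUTS)) < 0.3

response_map = np.zeros((N_OUTPUTS, N_INPUTS), dtype=bool)
for channel in range(N_INPUTS):
    stimulus = np.zeros(N_INPUTS)
    stimulus[channel] = 1.0                          # "flash" a single channel
    responses = true_wiring.astype(float) @ stimulus
    response_map[:, channel] = responses > 0         # record which units fired

# Because the stimulus is controlled one channel at a time, the recorded
# map recovers the wiring exactly - the point of using a controllable input.
print("Recovered map matches the wiring:", np.array_equal(response_map, true_wiring))
```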
“The first question was how the processing is organized, and how similar inputs are processed by adjacent cells in the brain,” Murthy says.

But it turns out that the spatial organization of olfactory information in the brain does not fully explain our ability to sense odors. The temporal organization of olfactory information sheds additional light on how we perceive odors. In addition to characterizing the spatial organization of the olfactory bulb, the new study shows how the timing of the “sniff” plays a large part in how odors are perceived.
The paper has implications not only for future study of the olfactory system, but more generally for teasing out the underlying neural circuits of other systems.

Murthy’s co-authors on the Nature Neuroscience paper are Ashesh K. Dhawale of Cold Spring Harbor Laboratory and the National Centre for Biological Sciences in Bangalore, India, Akari Hagiwara of Harvard, Upinder S. Bhalla of the National Centre for Biological Sciences, and Dinu F. Albeanu of Cold Spring Harbor Laboratory. Their work was sponsored by Harvard and Cold Spring Harbor Laboratory.

Contact: Steve Bradt

Source: Harvard University

Monday, January 10, 2011

Scientists Shed Light On What Causes Brain Cell Death In Parkinson's Patients

Just 5 percent of Parkinson's disease cases can be explained by genetic mutation, while the rest have no known cause. But a new discovery by researchers at The University of Texas Health Science Center may begin to explain why the vast majority of Parkinson's patients develop the progressive neurodegenerative disease.
This week in The Journal of Neuroscience, the researchers demystified a process that leads to the death of brain cells - or neurons - in Parkinson's patients. When researchers blocked the process, the neurons survived.

The findings could lead to an effective treatment to slow the progression of Parkinson's disease, rather than simply address symptoms that include tremors, slowed movement, muscle stiffness and impaired balance. Further studies could lead to a diagnostic test that could screen for Parkinson's years before symptoms develop, said Syed Z. Imam, Ph.D., adjunct assistant professor at the UT Health Science Center.

Parkinson's disease, which usually is not diagnosed until age 60 or later, affects an estimated half-million people in the United States. Dr. Imam joined the U.S. Food and Drug Administration (FDA) after the research was conducted. Co-authors are from the Health Science Center's Barshop Institute for Longevity and Aging Studies; the South Texas Veterans Health Care System; and the Hertie Institute for Clinical Brain Research in Tübingen, Germany.

The mechanism

After analyzing cells and post-mortem brain tissue from animals and humans, researchers noted that oxidative stress - a known culprit in neuron death - activated a protein called tyrosine kinase c-Abl in the nigra-striatum area of the brain. Neurons in this part of the brain are particularly vulnerable to Parkinson's injury. Activation of this protein led to changes in another protein called parkin, which is known to be mutated in hereditary Parkinson's. The altered parkin lacked the capacity to break down other proteins, leading to harmful clumps of unprocessed protein in the neuron. The scientists believe this accumulation leads to progressive neuron death, resulting in Parkinson's symptoms that worsen over time.

Implications

"When we blocked tyrosine kinase c-Abl activation, parkin function was preserved and neurons were spared," Dr. Imam said. "We believe these studies provide sound rationale for moving forward with a preclinical trial of tyrosine kinase c-Abl inhibitors, with the goal of developing a potent therapeutic drug for slowing the progression of Parkinson's."
If preclinical trials in animal models of Parkinson's disease yield positive results, the next step would be clinical trials in human patients, Dr. Imam said. Tyrosine kinase c-Abl inhibitors are approved by the FDA for treating myeloid leukemia and gastrointestinal tumors. This could speed approval of the drug for Parkinson's and its translation from bench research to clinical practice.
"The race is on to understand the mechanism of the 95 percent of Parkinson's cases with no known cause, and our finding certainly is a building block," Dr. Imam said. "We have found a specific signaling mechanism that is only turned on by oxidative stress and is selective only to Parkinson's-affected neurons of the nigra-striatum, which is the area that sends signals for balance to the cerebellum."

Acknowledgements

Co-authors from the UT Health Science Center San Antonio are Senlin Li, M.D., senior author; Qing Zhou, Ph.D.; Anthony J. Valente, Ph.D.; Mona C. Bains, Ph.D.; Robert A. Clark, M.D.; and James L. Roberts, Ph.D., whose primary appointment is now at Trinity University in San Antonio. Co-authors represent the School of Medicine and the Graduate School of Biomedical Sciences as well as the Barshop Institute.

The National Institutes of Health, Michael J. Fox Foundation, American Parkinson's Disease Association, Parkinson's Disease Foundation, San Antonio Area Foundation and Health Science Center Presidential Research Enhancement Fund supported the research.

Source:

Will Sansom

University of Texas Health Science Center at San Antonio

Sunday, January 9, 2011

2011: International Year of Chemistry

By Bob McDonald, host of the CBC science radio program Quirks & Quarks.

Following in the footsteps of the international years of Astronomy, Darwin and Biodiversity, this year is dedicated to the science that has given us everything from Teflon to Tylenol. Interestingly, much of the focus of this year's celebration of chemistry is to try to clean up the worldwide damage that chemical byproducts have inflicted on the environment.

Unless you are sitting in a cave, wearing animal skins, take a look around you right now and you will see the products of chemistry everywhere. From the clothes on your back to the synthetic carpet under your feet, from ingredients in your food and drink to, of course, that most ubiquitous chemical product, plastic, our modern-day life would not be where it is without the wizards who have managed to cook up miracle products that have literally shaped modern civilization.
Beyond Earth, the chemists tell us what's going on elsewhere in the universe. Instruments on spacecraft or telescopes can detect chemical elements on distant planets or in clouds of gas floating between the stars. But it's the chemists who interpret those ingredients and analyse processes such as methane rain cycles on a moon of Saturn or reactions within interstellar gas clouds that form amino acids, the building blocks of life.
Even the Periodic Table of Elements, the fundamental face of chemistry, is being revised to more clearly define elemental weights. (You can hear more about it this week on Quirks & Quarks.) It may not mean much to most of us, but it dramatically sharpens the tools that are used in forensics and pollution monitoring.

But as remarkable as the achievements of chemists have been, read through any environmental story and you will see chemicals of some kind involved in air, water and land pollution, whether it's pipes discharging waste into rivers, agricultural runoff contaminating shorelines, toxic fumes blowing out stacks, or just plain litter.
To the credit of the industry, many activities planned for this year are environmentally based, including a series of hands-on activities for young people, designed to look at water quality in their neighbourhoods and compare their results to young people around the world. After all, problems that involve chemicals in the environment need chemists to both identify the problems and propose effective solutions. (Hey, even the word "solution" is a chemical term.)
Chemistry often gets a bad rap because it involves unpronounceable names, complicated formulae and failed experiments in high school. But while most of us have only a basic understanding of the subject, we rely daily on chemical reactions to move us from place to place or calm a headache. And whether you think of chemicals as modern miracles or evil incarnate, try to imagine living your life without them. For starters, you wouldn't be reading this column on your computer, because your computer wouldn't exist.

So this year, whether chemistry is your thing or not, try to take in some of the events that will be staged at science centres and universities, or set up your own event, such as a river or shoreline cleanup. There are even Chemistry Olympics. So here's to chemistry - the basis of life on Earth and perhaps elsewhere, too.

Saturday, January 8, 2011

Vaccination against Cocaine?

Cocaine vaccine could make drug addiction a distant memory

The first-ever vaccine for drug addiction has just been created: by combining a cocaine-like molecule with part of the common cold virus, researchers produced a vaccine that turns the immune system against cocaine, keeping it away from the brain.

So far, the vaccine has only been tested on mice, but the results are extraordinary. Mice given the vaccine no longer exhibited any of the hyperactive signs of a cocaine high when they were next given the drug. The vaccine was created by taking just the part of the cold virus that alerts the body's immune system to its presence, and then researchers connected the signalling mechanism to a more stable version of the cocaine molecule.

Once the mice received an injection of the vaccine, they started producing anti-cocaine antibodies which targeted and destroyed any cocaine that then entered their system. Normally, cocaine does not produce an immune response, leaving it free to wreak havoc on the brain and body of whoever takes it. But the cold virus segments taught the immune system to treat cocaine like a hostile invader, offering a nearly impregnable wall of protection from the cocaine's effects.

Researcher Ronald Crystal explains what this means:

"Our very dramatic data shows that we can protect mice against the effects of cocaine, and we think this approach could be very promising in fighting addiction in humans. While other attempts at producing immunity against cocaine have been tried, this is the first that will likely not require multiple, expensive infusions, and that can move quickly into human trials. There is currently no FDA (Food and Drug Administration) approved vaccine for any drug addiction.The vaccine may help [drug addicts] kick the habit, because if they use cocaine, an immune response will destroy the drug before it reaches the brain's pleasure center

Bacteria made fast work of Gulf spill methane - hard to believe

Not sure whether or not this is just propaganda but I am posting it for your consideration.
By Sharon Oosthoek CBC News

Methane-munching bacteria digested nearly all the gas released by the Deepwater Horizon blowout in the Gulf of Mexico within just four months, according to scientists, who say they were surprised by how quickly the bacteria worked.

The findings, published Thursday by a team of U.S. researchers in the peer-reviewed journal Science, are good news because they show the naturally occurring bacteria's digestion of methane did not lead to dangerously low oxygen levels, as was first feared. The methane and an estimated five million barrels of oil were released into the Gulf starting on April 20 until the well was effectively sealed on July 15.

The methane-eating bacteria used up some oceanic oxygen while breaking down the gas, and again when they died and decomposed once the buffet was over.

(Photo caption: Texas A&M University oceanographer John Kessler sampling water as part of the research into what happened to the methane gas released by the Deepwater Horizon blowout in April 2010. Credit: Elizabeth Crap, NOAA)

But the oxygen depletion — a three per cent decrease from normal levels — was not significant enough to harm marine life, said Texas A&M University oceanographer John Kessler, the lead co-author of the study.

"Normally, you need to remove about 67 per cent of the oxygen to get to dangerous levels," he said. "Fortunately, the bacterial response to the spilled methane took approximately two months to get started, which allowed enough time for the methane to disperse enough so that no hypoxic or anoxic areas occurred as a result of the methane consumption."But once the bacteria got to work, they appear to have devoured methane faster than anyone anticipated, even though methane concentrations around the well were higher than the team had seen anywhere else in the ocean.

Methane levels around the site of the disaster ranged from 10,000 to 100,000 times the ocean's normal background levels, Kessler said.

Methane naturally seeps from the ocean floor, but even measurements taken near methane vents did not come close to the levels measured in the water around the Deepwater Horizon rig after it exploded last April 20. Previous measurements of the rate at which bacteria decompose methane in the ocean suggested it would take a long time for the gas from the well to break down.

"It's just a slow process," Kessler said. "We suspected the methane would be around for years." The scientists think the difference in this case is that massive amounts of methane led to an equally massive spike in bacteria.The results suggest that large-scale natural releases of methane in the deep ocean are likely to be met by a similarly rapid bacterial response, the researchers say.

While the amount of methane — a highly potent greenhouse gas — emitted by the well was huge, it was not enough to have a significant impact on atmospheric levels, Kessler said. And in any case, methane that might have been released into the atmosphere was largely trapped under heavy salt water long enough for the bacteria to do their work and keep it from dispersing into the air. Bacteria that went after the oil released in the Deepwater blowout also used up oxygen, but Kessler said other studies showed the steepest recorded drop in oxygen levels in the water was about 40 per cent — again, not enough to harm marine wildlife.

The methane study did not examine the fate of the Deepwater Horizon oil, but in November the U.S. government released a peer-reviewed "technical document" that estimated 17 per cent was directly recovered from the well, 16 per cent was chemically dispersed, five per cent was burned and three per cent skimmed. The document also estimated 24 per cent evaporated or dissolved. Another U.S. government report prepared for Congress and released in December says there is still much uncertainty about the oil's fate.

"Even assuming that approximately half of the oil has been removed from the Gulf ecosystem through direct recovery, burning, skimming or evaporation, the fate of the remaining oil is unknown," the Congressional Research Service report states.Kessler says oil is difficult to track because it is a much more complex substance than methane. It is made up of thousands of different molecules, some of which float, while others sink or dissolve.

The Congressional report suggests authorities may never know what happened to the oil. "It is debatable whether the fate of the remaining oil will ever be established conclusively, because multiple challenges hinder this objective: the complexity of the Gulf system; resources required to collect data; and varied interpretations over the results and observations."

Read more: http://www.cbc.ca/technology/story/2011/01/06/tech-oil-spill-gulf-of-mexico-methane-bacteria-oxygen.html#ixzz1AMI2ymSY

Thursday, January 6, 2011

Hair colour predicted from genes

By Paul Rincon Science reporter, BBC News

Red hair can be estimated with around 90% accuracy, the study says.

Scientists say they have developed a way to predict a person's probable hair colour using markers in their DNA. The study paves the way for a forensic test that could estimate the hair colour of a suspect from DNA left at a crime scene. The information could then be used to refine the description of an unknown but wanted person.

A Dutch-Polish team of researchers have published details in the journal Human Genetics.
The researchers found that it was possible to determine with an accuracy of more than 90% whether a person had red hair, with a similar accuracy for people with black hair. They could estimate with an accuracy of more than 80% whether a person's hair colour was blonde or brown.

Predictive power

This new genetic approach is also able to differentiate between some hair colours that are similar, for example between red and reddish blonde, or between blonde and dark blonde hair. The DNA can be taken from blood, sperm, saliva or other samples that would be relevant in forensic casework, say the researchers. Lead scientist Manfred Kayser, from Erasmus University Medical Center in Rotterdam, said: "That we are now making it possible to predict different hair colours from DNA represents a major breakthrough because, so far, only red hair colour, which is rare, could be estimated from DNA."

The researchers studied DNA and hair colour information from hundreds of Europeans. They investigated genes previously known to influence the differences in hair colour. "We identified 13 'DNA markers' from 11 genes that are informative to predict a person's hair colour," said Professor Kayser, chair of the Department of Forensic Molecular Biology at Erasmus.
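
As an illustration of what a marker-based predictor of this kind might look like in code, here is a minimal sketch that trains a multinomial classifier on 13 genotype columns coded as 0, 1 or 2 copies of a variant allele. The data are randomly generated stand-ins and the model is a generic scikit-learn logistic regression, not the markers or the method published by Kayser's team.

```python
# A minimal sketch of a marker-based hair-colour predictor.
# Genotypes and "effect sizes" are synthetic; this is not the published model.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

N_PEOPLE, N_MARKERS = 500, 13
COLOURS = ["blond", "brown", "black", "red"]

# Genotypes coded as 0/1/2 copies of the variant allele at each marker
genotypes = rng.integers(0, 3, size=(N_PEOPLE, N_MARKERS))

# Fake per-marker effects so the synthetic labels actually depend on the markers
effects = rng.normal(size=(N_MARKERS, len(COLOURS)))
labels = np.argmax(genotypes @ effects + rng.normal(size=(N_PEOPLE, len(COLOURS))), axis=1)

X_train, X_test, y_train, y_test = train_test_split(genotypes, labels, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print(f"Hold-out accuracy on synthetic data: {model.score(X_test, y_test):.2f}")
print("Predicted colour for the first test sample:", COLOURS[model.predict(X_test[:1])[0]])
```

The real study's accuracy figures come from validated markers in pigmentation genes, so the held-out accuracy printed here says nothing about the published test; the sketch only shows the shape of the prediction task.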

Emerging field

Predicting human "phenotypes" - a person's outward traits such as hair colour or eye colour - from DNA information is a newly emerging field in forensics.Scientists have developed a way for estimating age using T cells. Genetic profiling compares DNA at a crime scene with that of a known suspect or with other profiles in a database in search of a match. But researchers say that when this approach draws a blank, clues to the appearance of a suspect could provide valuable leads in an investigation.

But only a few phenotypic traits can currently be identified from DNA information with enough accuracy to have practical applications. "This research lays the scientific basis for the development of a DNA test for hair colour prediction," said Professor Ate Kloosterman of the Netherlands Forensic Institute (NFI). "A validated DNA test system for hair colour shall become available for forensic research in the not too distant future." He added: "This new development results in an important expansion of the future DNA tool-kit used by forensic investigators to track down unknown offenders."

Professor Kayser's team at Erasmus University has already developed a test for eye colour based on DNA markers. In September they published details of a technique to estimate the age of a suspect from blood left at a crime scene. The method exploits a characteristic of immune cells carried in the blood known as T cells. The approach enables scientists to estimate a person's age, give or take nine years either side.