News

Newswise — How does stress – which, among other things, causes our bodies to divert resources from non-essential functions – affect the basic exchange of materials that underlies our everyday life? Weizmann Institute of Science researchers investigated this question by looking at a receptor in the brains of mice, and they came up with a surprising answer. The findings, which recently appeared in Cell Metabolism, may in the future aid in developing better drugs for stress-related problems and eating disorders. Dr. Yael Kuperman began this study as part of her doctoral research in the lab of Prof. Alon Chen of the Department of Neurobiology. Dr. Kuperman, presently a staff scientist in the Veterinary Resources Department, Prof. Chen, and research student Meira Weiss focused on an area of the brain called the hypothalamus, which has a number of functions, among them helping the body adjust to stressful situations, controlling hunger and satiety, and regulating blood glucose and energy production. When stress hits, cells in the hypothalamus step up production of a receptor called CRFR1. It was known that this receptor contributes to the rapid activation of a stress-response sympathetic nerve network – increasing heart rate, for example. But since this area of the brain also regulates the body’s exchange of materials, the team thought that the CRFR1 receptor might play a role in this, as well. Prof. Chen and his group characterized the cells in a certain area of the hypothalamus, finding that the receptor is expressed in around half of the cells that arouse appetite and suppress energy consumption. These cells comprise one of two main populations in the hypothalamus – the second promotes satiety and the burning of energy. “This was a bit of a surprise,” says Dr. 
Kuperman, “as we would instinctively expect the receptor to be expressed on the cells that suppress hunger.” To continue investigating, the researchers removed the CRFR1 receptor in mice from just the cells that arouse appetite in the hypothalamus, and then observed how this affected the animals’ bodily functions. At first, the team did not see any significant changes, confirming that this receptor is reserved for stressful situations. But when they exposed the mice to stress – cold or hunger – they got another surprise. In response to cold, the sympathetic nervous system activates a unique type of fat called brown fat, which produces heat to maintain the body’s internal temperature. When the receptor was removed, body temperature dropped dramatically – but only in the female mice. Their temperatures failed to stabilize even after the stressor was removed, while male mice showed hardly any change. Fasting produced a similarly drastic response in the female mice. Normally, when food is scarce, the brain sends a message to the liver to produce glucose, conserving a minimum level in the blood. But when food was withheld from female mice missing the CRFR1 receptor, the amount of glucose their livers produced dropped significantly. In hungry male CRFR1-deficient mice, the result was similar to the effects of exposure to cold: the exchange of materials in their bodies was barely affected. “We discovered that the receptor has an inhibitory effect on the cells, and this is what activates the sympathetic nervous system,” says Dr. Kuperman. Among other things – revealing exactly how this receptor works and how it contributes to the stress response – the findings show that male and female bodies may exhibit significant differences in the ways that materials are exchanged under stress. Indeed, the fact that the receptor suppresses hunger in females may help explain why women are much more prone to eating disorders than men. 
Because drugs can enter the hypothalamus with relative ease, the findings could be relevant to the development of treatments for regulating hunger or stress responses, including anxiety disorders or depression. Indeed, several pharmaceutical companies have already begun developing psychiatric drugs to block the CRFR1 receptor. The scientists caution, however, that because the cells are involved in the exchange of materials, blocking the receptor could turn out to have such side effects as weight gain. Prof. Alon Chen’s research is supported by the Henry Chanoch Krenter Institute for Biomedical Imaging and Genomics; the Perlman Family Foundation, Founded by Louis L. and Anita M. Perlman; the Adelis Foundation; the Irving I. Moskowitz Foundation; the European Research Council; the estate of Tony Bieber; and the Ruhman Family Laboratory for Research in the Neurobiology of Stress. The Weizmann Institute of Science in Rehovot, Israel, is one of the world’s top-ranking multidisciplinary research institutions. The Institute’s 3,800-strong scientific community engages in research addressing crucial problems in medicine and health, energy, technology, agriculture, and the environment. Outstanding young scientists from around the world pursue advanced degrees at the Weizmann Institute’s Feinberg Graduate School. The discoveries and theories of Weizmann Institute scientists have had a major impact on the wider scientific community, as well as on the quality of life of millions of people worldwide.
Newswise — AUSTIN — About 800,000 strokes occur in America each year; that’s about one every 40 seconds. Houston resident Joe Carrabba experienced one of them. Carrabba was having a massive stroke last year when UTHealth’s Mobile Stroke Unit assessed him, performed a CT scan and administered drugs to dissolve blood clots. The Unit is a modified ambulance whose crew has the specialized training and equipment — which first responders do not — to diagnose and treat stroke patients en route to the hospital. When a stroke occurs, every minute counts. Because of the Unit’s quick action, Carrabba spent only four days at the hospital before he made a full recovery and was sent home. “I wouldn’t be here right now if it weren’t for the Mobile Stroke Unit,” he said. UTHealth’s Mobile Stroke Unit — the nation’s first — is one of many ways the UT System’s 14 health and academic institutions are fighting stroke through research, technology and patient care. Last year, UT System Chancellor William H. McRaven launched “Leading the Brain Health Revolution,” one of eight audacious strategic initiatives described as Quantum Leaps. This particular initiative aims to understand, prevent, treat and cure the diseases of the brain, such as stroke. “We will make a tremendous investment in leveraging and connecting all the cutting-edge science occurring at our UT System institutions to discover new knowledge and solutions to neurological diseases,” McRaven said. “This Quantum Leap will drive collaboration, incentivize partnerships and demand scientific and clinical cooperation between our talented physicians and researchers.” Although May is designated as National Stroke Awareness Month, UT System institutions are fighting year-round to defeat stroke and continue Leading the Brain Health Revolution.

Technology

New stroke-related technologies have been developed by several UT System institutions. 
For example: UTHealth in Houston used telemedicine to enroll patients remotely into a stroke clinical trial – the first of its kind. UTHealth also launched a clinical trial investigating the use of a physician-monitored app to help first-time minority stroke patients become healthier. At UT Health Science Center San Antonio, physicians are using devices called stent retrievers to extract clots from stroke patients with large-vessel occlusions. Meanwhile, UT Dallas researchers Michael Kilgard, Ph.D., and Robert Rennaker, Ph.D., have conducted clinical trials stimulating a key nerve to correct unwanted changes in the brain following a stroke. Their work helped Dallas-based startup MicroTransponder develop its Vivistim® System, a medical device designed to stimulate the vagus nerve in stroke patients to restore lost arm function. Kilgard and Rennaker will receive $2.3 million from the National Institutes of Health over the next five years to test the effectiveness of using vagus nerve stimulation to enhance stroke recovery.

Research

Researchers at institutions across the UT System are launching studies with the potential to dramatically improve stroke prevention and treatment. An international team of researchers, led by Clay Johnston, M.D., Ph.D., dean of the Dell Medical School at UT Austin, released a study published in the New England Journal of Medicine, which did not find the prescription drug ticagrelor to be better than aspirin — the current standard — at reducing the risk of stroke, heart attack or death in patients who had suffered a minor stroke or a transient ischemic attack (TIA). UT Austin researchers also discovered that patients who rely heavily on their better-functioning side after a stroke can actually limit the recovery or worsen the use of a limb damaged by stroke. Researchers from UT Southwestern in Dallas are studying nursing protocols to better triage and treat stroke patients using telemedicine. 
It will be the first study to use a new consortium of medical centers known as the Lone Star Stroke Consortium.

Patient Care

UT Southwestern’s Robert D. Rogers Advanced Comprehensive Stroke Center — part of UT Southwestern’s Peter O’Donnell Jr. Brain Institute — is among an elite group of stroke centers recognized nationally for its exemplary care of specialized stroke cases. It received certification by The Joint Commission and American Heart Association/American Stroke Association as an Advanced Comprehensive Stroke Center. For patients with high-risk or specialized stroke cases, this means quick access to the best care, the highest-trained team of neurosurgeons and neurovascular experts, and the ability to use the latest stroke treatments and technologies available. This access to specialized care changed Fort Worth resident Kellie Whitton’s life. Whitton was diagnosed with moyamoya disease — a rare condition that causes constriction of blood vessels in the brain and life-threatening strokes — six weeks before her wedding. Babu Welch, M.D., a UT Southwestern associate professor of neurological surgery and radiology, performed a brain bypass surgery, moving an artery from her scalp and attaching it to one in her brain that was not affected by her condition. The 3-hour surgery was a success, and because Whitton was determined not to postpone her wedding, Welch even minimized how much of her head had to be shaved. “Dr. Welch was able to part my hair and go in through the hairline,” Whitton said. “That’s one of the reasons I really love Dr. Welch, because he is so hands-on. I completely trusted him with everything, and his team.”

About The University of Texas System

Educating students, providing care for patients, conducting groundbreaking basic, applied and clinical research, and serving the needs of Texans and the nation for more than 130 years, The University of Texas System is one of the largest public university systems in the United States. 
With 14 institutions and an enrollment of more than 217,000, the UT System confers more than one-third of the state’s undergraduate degrees, educates almost two-thirds of the state’s health care professionals annually and accounts for almost 70 percent of all research funds awarded to public institutions in Texas. The UT System has an annual operating budget of $16.9 billion (FY 2016) including $3 billion in sponsored programs funded by federal, state, local and private sources. With about 20,000 faculty – including Nobel laureates – and more than 70,000 health care professionals, researchers, student advisors and support staff, the UT System is one of the largest employers in the state.
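As a quick sanity check of the rate quoted at the top of this article — about 800,000 strokes per year, or roughly one every 40 seconds — the arithmetic can be sketched as follows (figures from the article; the script itself is only illustrative):

```python
# Verify that ~800,000 strokes per year works out to about one every 40 seconds.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000 seconds in a non-leap year

strokes_per_year = 800_000
seconds_per_stroke = SECONDS_PER_YEAR / strokes_per_year

print(f"one stroke every {seconds_per_stroke:.1f} seconds")  # about 39.4
```

Rounding to the nearest ten seconds gives the widely quoted "one every 40 seconds."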
Newswise — If you’re an avid runner, logging dozens of miles every week, and you happen to be over 65, odds are you’re burning oxygen at nearly the same rate as a runner in her 20s. Scientists call this rate of oxygen consumption “running economy,” and a new study by HSU Kinesiology Professor Justus Ortega and his colleagues at the University of Colorado, Boulder is helping define the benefits of maintaining a running habit well into one’s senior years. The study was published recently in Medicine & Science in Sports & Exercise, the official journal of the American College of Sports Medicine. Previous studies indicated that running economy worsens as people age, either because muscle imbalances develop, leaving muscles across the body working against each other, or because overall muscle efficiency decreases. However, these studies only surveyed people up to 61 years old. This new study takes a look at runners over the age of 65 and how their bodies cope with the demands of the sport. Researchers found that individuals who maintain an active jogging habit into their senior years spend nearly the same amount of metabolic energy as a 20-year-old. But what about those who don’t jog? What’s causing those non-active seniors to see such increases in the metabolic cost of moving? “Our prior research suggests that the muscles themselves are becoming less efficient. I like to think of it as your body is like a car with a fuel efficiency level,” says Ortega. “Your body has its own fuel efficiency, and what we’ve seen is that the fuel efficiency in muscles is reduced in older adults who are sedentary or only walk occasionally.” The present study looked at the running economy and mechanics of 15 young runners and 15 older runners. Each participant had a history of running at least three times per week for a minimum of 30 minutes per session over a six-month period. 
The study’s trials took place on a specialized treadmill that reads the amount of force a runner applies to the running deck. Participants ran 5-minute sessions at 2.01, 2.46 and 2.91 meters per second (4.5 to 6.5 mph). Researchers found that despite differences in running mechanics, older runners consumed metabolic energy at a similar rate to young runners across the range of speeds. Essentially, these older runners maintain a youthful running economy into their 60s. However, researchers did find differences in the biomechanics of the two age groups, indicating that older runners adjust their techniques as they age but still maintain youthful energy levels while exercising. Future studies are aimed at determining whether other choices of exercise can have the same effect on increasing muscle efficiency that running does, and whether a sedentary individual can reap the same benefit by becoming more active. “There’s good evidence that it’s never too late to get into exercise; it’s about finding what types of exercise are right for your body,” says Ortega.
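As a unit check on the treadmill speeds reported above, converting meters per second to miles per hour recovers the quoted 4.5 to 6.5 mph range (the conversion factor is standard; the script is illustrative only):

```python
# Convert the reported treadmill speeds from meters per second to miles per hour.
MPH_PER_MPS = 2.23694  # 1 m/s is approximately 2.23694 mph

for speed_mps in (2.01, 2.46, 2.91):
    print(f"{speed_mps:.2f} m/s = {speed_mps * MPH_PER_MPS:.1f} mph")
# 4.5 mph, 5.5 mph and 6.5 mph, respectively
```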
Newswise — A large worldwide study has found that, contrary to popular thought, low-salt diets may not be beneficial and may actually increase the risk of cardiovascular disease (CVD) and death compared to average salt consumption. In fact, the study suggests that the only people who need to worry about reducing sodium in their diet are those with hypertension (high blood pressure) who also have high salt consumption. The study, involving more than 130,000 people from 49 countries, was led by investigators of the Population Health Research Institute (PHRI) of McMaster University and Hamilton Health Sciences. They looked specifically at whether the relationship between sodium (salt) intake and death, heart disease and stroke differs in people with high blood pressure compared to those with normal blood pressure. The researchers showed that regardless of whether people have high blood pressure, low-sodium intake is associated with more heart attacks, strokes and deaths compared to average intake. “These are extremely important findings for those who are suffering from high blood pressure,” said Andrew Mente, lead author of the study, a principal investigator of PHRI and an associate professor of clinical epidemiology and biostatistics at McMaster’s Michael G. DeGroote School of Medicine. “While our data highlight the importance of reducing high salt intake in people with hypertension, they do not support reducing salt intake to low levels. “Our findings are important because they show that lowering sodium is best targeted at those with hypertension who also consume high-sodium diets.” Current intake of sodium in Canada is typically between 3.5 and 4 grams per day, and some guidelines have recommended that the entire population lower sodium intake to below 2.3 grams per day, a level that fewer than five per cent of Canadians and people around the world consume. 
Previous studies have shown that low sodium intake, compared to average sodium intake, is related to increased cardiovascular risk and mortality, even though low sodium intake is associated with lower blood pressure. This new study shows that the risks associated with low sodium intake – less than three grams per day – are consistent regardless of a patient’s hypertension status. Further, the findings show that while there is a limit below which sodium intake may be unsafe, the harm associated with high sodium consumption appears to be confined to only those with hypertension. Only about 10 per cent of the population in the global study had both hypertension and high sodium consumption (greater than 6 grams per day). Mente said that this suggests that the majority of individuals in Canada and most countries are consuming the right amount of salt. He added that targeted salt reduction in those who are most susceptible because of hypertension and high salt consumption may be preferable to a population-wide approach to reducing sodium intake in most countries, except those where the average sodium intake is very high, such as parts of central Asia or China. He added that what is now generally recommended as a healthy daily ceiling for sodium consumption appears to be set too low, regardless of a person’s blood pressure level. “Low sodium intake reduces blood pressure modestly, compared to average intake, but low sodium intake also has other effects, including adverse elevations of certain hormones, which may outweigh any benefits. The key question is not whether blood pressure is lower with very low salt intake; instead, it is whether it improves health,” Mente said. Dr. 
Martin O’Donnell, a co-author on the study and an associate clinical professor at McMaster University and the National University of Ireland Galway, said: “This study adds to our understanding of the relationship between salt intake and health, and questions the appropriateness of current guidelines that recommend low sodium intake in the entire population.” “An approach that recommends salt in moderation, particularly focused on those with hypertension, appears more in line with current evidence.” The study was funded from more than 50 sources, including the PHRI, the Heart and Stroke Foundation of Canada and the Canadian Institutes of Health Research.
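The intake bands discussed in this study can be sketched as a small helper. The cutoffs (low below 3 grams per day, high above 6 grams per day) come from the article, but the function and its labels are purely illustrative, not clinical guidance:

```python
# Classify daily sodium intake using the bands described in the study:
# below 3 g/day was associated with increased risk regardless of blood
# pressure, while harm above 6 g/day appeared confined to people with
# hypertension. Labels are illustrative only.
def sodium_band(grams_per_day: float) -> str:
    if grams_per_day < 3.0:
        return "low"
    if grams_per_day <= 6.0:
        return "average"
    return "high"

# Typical Canadian intake of 3.5-4 g/day falls in the middle band.
print(sodium_band(3.75))  # "average"
```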
Newswise —  During the 2014-15 flu season, the poor match between the virus used to make the world’s vaccine stocks and the circulating seasonal virus yielded a vaccine that was less than 20 percent effective. While this year’s vaccine is a much better match to the circulating seasonal strains of influenza, the shifty nature of the virus and the need to pick the viruses used to make global vaccine stocks well before the onset of the flu season can make vaccine strain selection a shot in the dark. That process — dependent on the careful selection of circulating virus strains and the identification of mutations in the part of the virus that recognizes host cells — could soon be augmented by a new approach. It would more precisely forecast the naturally occurring mutations that help seasonal flu virus dodge the vaccine. Writing this week (May 23, 2016) in the journal Nature Microbiology, a team of researchers led by University of Wisconsin-Madison School of Veterinary Medicine virologist Yoshihiro Kawaoka describes a novel strategy to predict the antigenic evolution of circulating influenza viruses and give science the ability to more precisely anticipate seasonal flu strains. It would foster a closer match for the so-called “vaccine viruses” used to create the world’s vaccine supply. The approach Kawaoka and his colleagues used involved techniques commonly employed in virology for the past 30 years and enabled his group to assemble the 2014 flu virus before the onset of the epidemic. “This is the first demonstration that one can accurately anticipate in the lab future seasonal influenza strains,” explains Kawaoka, a UW-Madison professor of pathobiological sciences who also holds a faculty appointment at the University of Tokyo. “We can identify the mutations that will occur in nature and make those viruses available at the time of vaccine (virus) candidate selection.” Influenza depends on its ability to co-opt the cells of its host to replicate and spread. 
To gain access to host cells, the virus uses a surface protein known as hemagglutinin which, like a key to a lock, opens the cell to infection. Vaccines prevent infection by priming the immune system to create antibodies that effectively block the lock, prompting the virus to reengineer the hemagglutinin key through chance mutation. “Influenza viruses randomly mutate,” notes Kawaoka. “The only way the virus can continue to circulate in humans is by (accumulating) mutations in the hemagglutinin.” To get ahead of the constant pace of mutations in circulating flu viruses, Kawaoka’s group assembled libraries of human H1N1 and H3N2 viruses from clinical isolates that possessed various natural, random mutations in the hemagglutinin protein. The viruses were then mixed with antibodies to weed out only those that had accumulated enough mutations to evade the antibody. Because the sources of the viruses were known, the patterns of mutation could be mapped using “antigenic cartography.” The mapping, says Kawaoka, identifies clusters of viruses featuring novel mutations which, according to the new study, can effectively predict the molecular characteristics of the next seasonal influenza virus. Such a prediction, says Kawaoka, could then be used to more effectively develop the vaccine virus stockpiles the world needs each flu season. Each year the World Health Organization (WHO), comparing genetic sequence and antigenic data, makes recommendations about which circulating strains of influenza will make the best matching vaccine. The method described by Kawaoka and his colleagues is conceptually different in that it mimics the mutations that occur in nature and accelerates their accumulation in the critical hemagglutinin protein. “Our method may therefore improve the current WHO influenza vaccine selection process,” Kawaoka and his group conclude in the Nature Microbiology report. 
“These in vitro selection studies are highly predictive of the antigenic evolution of H1N1 and H3N2 viruses in human populations.”
Newswise —  Chatting on the phone with a “sleep coach” and keeping a nightly sleep diary significantly improve sleep quality and reduce insomnia in women through all stages of menopause, according to a new study published today in JAMA Internal Medicine. The study also found that such phone-based cognitive behavioral therapy significantly reduced the degree to which hot flashes, or vasomotor symptoms, interfered with daily functioning. This is good news for women who do not want to use sleeping pills or hormonal therapies to treat menopause-related insomnia and hot flashes, according to paper co-author Dr. Katherine Guthrie, a member of the Public Health Sciences and Clinical Research divisions at Fred Hutchinson Cancer Research Center. “Most women experience nighttime hot flashes and problems sleeping at some point during the menopause transition. Poor sleep leads to daytime fatigue, negative mood and reduced daytime productivity. When sleep problems become chronic — as they often do — there are also a host of negative physical consequences, including increased risk for weight gain, diabetes and cardiovascular disease,” Guthrie said. “Many women do not want to use sleeping medications or hormonal therapies to treat their sleep problems because of concerns about side-effect risks. For these reasons, having effective, non-pharmacological options to offer them is important.” The research, believed to be the first and the largest study to show that cognitive behavioral therapy for insomnia helps healthy women with hot flashes to sleep better, was conducted via MsFLASH, a research network funded by the National Institute on Aging that conducts randomized clinical trials focused on relieving the most common, bothersome symptoms of menopause. Guthrie serves as principal investigator of the Fred Hutch-based MsFLASH Data Coordinating Center. 
The clinical trial involved more than 100 Seattle-area women (between 40 and 65 years of age) with moderate insomnia who experienced at least two hot flashes a day. All of the women were asked to keep diaries to document their sleep patterns throughout the study and rated the quantity, frequency and severity of their hot flashes at the beginning of the study, at eight weeks and at 24 weeks. Half of the women were selected at random to take part in a cognitive behavioral therapy intervention that involved talking with a sleep coach for less than 30 minutes six times over eight weeks. Importantly, non-sleep specialists (a social worker and a psychologist) delivered the therapy. Before conducting the phone sessions they underwent a day of training in cognitive behavioral therapy techniques. “Since the intervention was delivered by non-sleep specialists over the phone, it potentially could be widely disseminated through primary and women’s health centers to women who do not have good access to behavioral sleep-medicine specialists or clinics,” said the paper’s first and corresponding author Dr. Susan McCurry, a clinical psychologist and research professor at the University of Washington School of Nursing. “Such an intervention would be much less expensive to deliver than traditional, in-person cognitive behavioral therapy protocols, which are typically six to eight sessions that are one hour each,” said McCurry, principal investigator of the randomized trial. The goal of the therapy was to get women to the point where they consistently estimated that they were asleep at least 85 percent of the time they were in bed. To this end, they were given specific sleep/wake schedules and were taught to limit time spent in bed at night, which ultimately helped them fall asleep more quickly and stay asleep. They also were taught “stimulus-control” rules, which are designed to strengthen the association between bed and sleep. 
“For example, the women were asked to not do anything in bed except sleep and have sex,” McCurry said. “So, no reading, watching television, checking email or paying bills in bed.” Stimulus control also emphasizes the importance of getting up at the same time each day and not napping during the day. The women received an educational booklet about menopause and were given information about how sleep normally changes with age. They learned to create bedtime routines and an environment conducive to sleep, such as turning off electronics at least 30 minutes prior to bed, not drinking caffeine or alcohol after dinner, and keeping their bedroom a slightly cool temperature. They also were taught a technique called “constructive worry” to practice when ruminating thoughts kept them awake at night. The other half of the women were assigned to a menopause education control intervention. These study participants also talked to a sleep coach with the same frequency and duration as the cognitive behavioral therapy group. They received information about women’s health, including diet and exercise, and how they related to hot flashes and sleep quality. The coaches reviewed their weekly sleep diaries with them and provided the same educational booklet about menopause that the other group received. The coaches did not, however, teach cognitive strategies such as constructive worry, and they made no recommendations regarding sleep/wake schedules or restricting time in bed. “This intervention was supportive but very nondirective,” McCurry said. The main outcomes of the study were that women in the cognitive behavioral therapy group experienced statistically significant, clinically meaningful, and long-term, sustained improvements in sleep as compared to the women in the menopause education group. The women who received cognitive behavioral therapy also fared better with regard to hot flashes. 
Although the frequency and severity of their hot flashes did not change, the women reported that the vasomotor symptoms interfered less with their daily functioning than prior to receiving such therapy. The researchers said that delivering this therapy by phone — a dissemination model similar to phone-based smoking-cessation programs that have proven to be effective — potentially allows it to be an efficient, cost-effective way to reach large populations of women seeking treatment for midlife sleep problems. They also said that these results support further research, such as testing the effectiveness of phone-based cognitive behavioral therapy for insomnia versus traditional pharmacological approaches. “This study demonstrates that it is possible to significantly improve the sleep of many women going through the menopausal transition without the use of sleeping medications or hormone therapies, even if hot flashes are waking them up at night. This is good news for millions of women who are suffering from poor sleep at this time of life,” Guthrie said. In addition to Guthrie, the MsFLASH research group is led by co-principal investigators Dr. Andrea LaCroix at the University of California, San Diego and Dr. Susan Reed of the University of Washington, both of whom are also co-authors on the paper. Editor’s note: To obtain a copy of the embargoed JAMA Internal Medicine paper, “Telephone-Based Cognitive Behavioral Therapy for Insomnia in Perimenopausal and Postmenopausal Women with Vasomotor Symptoms,” please contact the journal at mediarelations@jamanetwork.org or 312.464.5262. At Fred Hutchinson Cancer Research Center, home to three Nobel laureates, interdisciplinary teams of world-renowned scientists seek new and innovative ways to prevent, diagnose and treat cancer, HIV/AIDS and other life-threatening diseases. 
Fred Hutch’s pioneering work in bone marrow transplantation led to the development of immunotherapy, which harnesses the power of the immune system to treat cancer with minimal side effects. An independent, nonprofit research institute based in Seattle, Fred Hutch houses the nation’s first and largest cancer prevention research program, as well as the clinical coordinating center of the Women’s Health Initiative and the international headquarters of the HIV Vaccine Trials Network. Private contributions are essential for enabling Fred Hutch scientists to explore novel research opportunities that lead to important medical breakthroughs. For more information visit fredhutch.org or follow Fred Hutch on Facebook, Twitter or YouTube. The UW School of Nursing is one of the nation’s premier nursing schools dedicated to addressing challenges in health care and improving the health of communities locally and globally. For almost 100 years, the UW School of Nursing has been a leader and innovator in nursing science and education. For more information about the #huskynurse community, visit nursing.uw.edu or follow us on Facebook, Twitter or Instagram. 
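The 85 percent target the sleep coaches worked toward in the trial above is a sleep-efficiency ratio: time asleep divided by time in bed. A minimal sketch, using made-up diary numbers rather than any study data:

```python
# Sleep efficiency: the fraction of time in bed actually spent asleep.
# The trial's coaching goal was a self-estimated efficiency of at least 85%.
def sleep_efficiency(minutes_asleep: float, minutes_in_bed: float) -> float:
    return minutes_asleep / minutes_in_bed

# Hypothetical diary entry: 7 hours asleep out of 8 hours in bed.
eff = sleep_efficiency(minutes_asleep=420, minutes_in_bed=480)
print(f"{eff:.1%}")  # 87.5%, above the 85% target
```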
Newswise — Some adults learn a second language better than others, and their secret may involve the rhythms of activity in their brains. New findings by scientists at the University of Washington demonstrate that a five-minute measurement of resting-state brain activity predicted how quickly adults learned a second language. The study, published in the June-July issue of the journal Brain and Language, is the first to use patterns of resting-state brain rhythms to predict subsequent language learning rate. "We've found that a characteristic of a person's brain at rest predicted 60 percent of the variability in their ability to learn a second language in adulthood," said lead author Chantel Prat, a faculty researcher at the Institute for Learning & Brain Sciences and a UW associate professor of psychology. At the beginning of the experiment, volunteers — 19 adults aged 18 to 31 years with no previous experience learning French — sat with their eyes closed for five minutes while wearing a commercially available EEG (electroencephalogram) headset. The headset measured naturally occurring patterns of brain activity. The participants came to the lab twice a week for eight weeks for 30-minute French lessons delivered through an immersive, virtual reality computer program. The U.S. Office of Naval Research — which funded the current study — also funded the development of the language training program. The program, called the Operational Language and Cultural Training System (OLCTS), aims to get military personnel functionally proficient in a foreign language with 20 hours of training. The self-paced program guides users through a series of scenes and stories. A voice-recognition component enables users to check their pronunciation. Watch a video demonstration of the language software: https://www.youtu.be/piA6dMkBroQ To ensure participants were paying attention, the researchers used periodic quizzes that required a minimum score before proceeding to the next lesson. 
The quizzes also served as a measure of how quickly each participant moved through the curriculum. At the end of the eight-week language program, participants completed a proficiency test covering however many lessons they had finished. The fastest learners moved through the curriculum twice as quickly as the slowest, yet performed just as well on the proficiency test. The recordings from the EEG headsets revealed that patterns of brain activity related to language processes were most strongly linked to the participants' rate of learning. So, should people who don't have this biological predisposition not even try to learn a new language? Prat says no, for two reasons. "First, our results show that 60 percent of the variability in second language learning was related to this brain pattern — that leaves plenty of opportunity for important variables like motivation to influence learning," Prat said. Second, Prat said it's possible to change resting-state brain activity using neurofeedback training — something she's studying now in her lab. Neurofeedback is a brain training regimen through which individuals can strengthen the brain activity patterns linked to better cognitive abilities. "We're looking at properties of brain function that are related to being ready to learn well. Our goal is to use this research in combination with technologies such as neurofeedback training to help everyone perform at their best," she said. Ultimately, neurofeedback training could help people who want to learn a second language but lack the advantageous brain patterns: they would do the brain training exercises first, and then begin the language program. "By studying individual differences in the brain, we're figuring out key constraints on learning and information processing, in hopes of developing ways to improve language learning, and eventually, learning more generally," Prat said.
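The "60 percent of the variability" figure corresponds to a coefficient of determination (R squared) of about 0.6 in a regression of learning rate on the resting-state measure. The following sketch shows how such a figure is computed; the numbers are entirely synthetic, not the study's data:

```python
# Illustration only: how "X percent of the variability" maps to R^2.
# The data below are synthetic; they are NOT the study's measurements.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical resting-state EEG power for 19 participants (arbitrary units)
beta_power = rng.normal(10.0, 2.0, size=19)

# Hypothetical learning rate: partly driven by the EEG measure, partly noise
learning_rate = 0.5 * beta_power + rng.normal(0.0, 1.0, size=19)

# Ordinary least-squares fit of learning rate on the resting-state measure
slope, intercept = np.polyfit(beta_power, learning_rate, 1)
predicted = slope * beta_power + intercept

# R^2 = 1 - (residual sum of squares / total sum of squares): the fraction
# of variance in learning rate accounted for by the predictor
ss_res = np.sum((learning_rate - predicted) ** 2)
ss_tot = np.sum((learning_rate - np.mean(learning_rate)) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"R^2 = {r_squared:.2f}")
```

An R squared of 0.6 means the predictor accounts for 60 percent of the variance in the outcome, leaving the remaining 40 percent to other factors such as motivation.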
Newswise — Studying fruit flies, whose sleep is remarkably similar to that of people, Johns Hopkins researchers say they’ve identified brain cells that are responsible for why delaying bedtime creates chronic sleepiness. In a report on the research, published online on May 19 in Cell, the scientists say they found a group of brain cells in charge of so-called sleep drive that becomes more active the longer flies are kept awake. The same mechanism, they say, also plays a role in putting the flies to sleep and keeping them that way. The findings may offer insight into human sleep disorders and open up new strategies to promote long-lasting sleep for people with chronic insomnia who don’t respond to available sleep drugs. “Although fruit flies look very different from people on the surface, they actually share many of the same genes and even behaviors,” says Mark Wu, M.D., Ph.D., associate professor of neurology at the Johns Hopkins University School of Medicine. “And with what we believe is the first identification of a mechanism behind the adjustable nature of sleep drive, researchers can look for the same processes in mammals, including, one day, in humans.” In their search for sleep-regulating cells, Wu’s team used genetic engineering to turn on small numbers of neurons in more than 500 fruit fly strains, then measured how the flies slept when those neurons “fired.” Several strains continued to sleep for several hours even after the researchers turned the neurons off, suggesting that the firing had triggered a sleep drive in these flies that led to the persistent sleepiness. Using fluorescence microscopy, the scientists then examined the fly brains to pinpoint the identity and location of the sleep drive-inducing cells. The firing neurons, genetically engineered to glow green, were found in a structure called the ellipsoid body and are known as the R2 neurons. 
To pin down more of what was going on, the researchers blocked the neurons from firing by genetically engineering the R2 neurons to make tetanus toxin, which silences the cells. The flies with the silenced R2 neurons slept on their normal schedule, but when they were deprived of sleep during the night by mechanically shaking the vials that housed them, they got about 66 percent less “rebound sleep” than control flies, suggesting that they felt less sleepy after sleep deprivation. Next, the researchers tested how R2 neurons behaved on their own in awake, sleeping or sleep-deprived fruit flies. They used tiny electrodes to measure the firing of the R2 neurons in well-rested, awake fruit flies; in fruit flies that were an hour into their sleep cycle; and in fruit flies after 12 hours of sleep deprivation. In the well-rested fruit flies, the neurons were the least active, firing only about once per second. In the sleeping fruit flies, the neurons fired almost four times per second. In the sleep-deprived fruit flies, the neurons were the most active, firing about seven times per second. “These R2 neurons had higher firing rates the more sleep-deprived the fruit flies were, and firing these neurons puts flies to sleep, suggesting that we’ve identified the key cells responsible for sleep drive,” says Wu. Wu says it’s long been thought that getting to sleep requires an increase in sleep-promoting chemicals in specific parts of the brain as night and bedtime approach in the normal 24-hour sleep-wake cycle. However, he says, these chemicals last for only a few minutes at a time, so it has been puzzling how they can account for a sleep drive that lasts hours. To answer this question, Wu and colleagues used a genetic technique to light up the places on the surface of the R2 neurons where they actively release small chemical neurotransmitters, sending information to neighboring cells. 
Compared with well-rested flies, sleep-deprived flies had an increase in the number and size of the sites releasing the neurotransmitter, and those sites appeared much brighter. Wu says these changes in the number of neurotransmitter release sites account for how the neurons are able to adjust over time, creating a system for sleep drive that works over a period of hours rather than minutes, unlike the known sleep-promoting chemicals. This flexible system can adjust to times when the flies are sleep-deprived or when they are just nearing their normal bedtime. He adds that the sleep drive process in the R2 neurons works similarly to how memories are encoded in other types of neurons, where changes in the neuron’s information-sending and receiving parts adjust over time. “Figuring out how sleep drive works should help us one day figure out how to treat people who have an overactive sleep drive that causes them to be sleepy all the time and resistant to current therapies,” Wu says. Sha Liu, Qili Liu and Masashi Tabuchi of Johns Hopkins Medicine also contributed to this study. The research was funded by grants from the National Institute of Neurological Disorders and Stroke (R01 NS079584 and R21 NS088521) and a Burroughs Wellcome Fund Career Award for Medical Scientists.
Newswise — An experimental model uses genetics-guided biomechanics and patient-derived stem cells to predict what type of inherited heart defect a child will develop, according to the authors of a new study in the journal Cell. The multi-institutional team developing the technology, led by the Cincinnati Children’s Heart Institute, reported May 19 that it could let doctors intervene earlier to help patients manage their conditions and help inform future pharmacologic treatment options. In laboratory tests, the model accurately predicts whether mouse models and stem cell-derived heart cells from human patients will develop hypertrophic or dilated cardiomyopathy. “This technology would make it possible to predict the eventual cardiac phenotype in pediatric patients and help guide their treatment and future monitoring,” said Jeffery Molkentin, PhD, lead author and a researcher in the Division of Molecular Cardiovascular Biology at Cincinnati Children’s and the Howard Hughes Medical Institute. “It could help when counseling patients about athletic endeavors, in which sudden death can occur with hypertrophic cardiomyopathy. Or it could help decide whether certain patients should consider an implantable cardioverter defibrillator to prevent sudden death as they grow into young adulthood.” Inherited cardiomyopathy is a genetically diverse group of heart muscle diseases affecting about one of every 500 people. There are two primary clinical manifestations: hypertrophic cardiomyopathy (HCM) and dilated cardiomyopathy (DCM). The diseases involve nearly 1,500 different gene mutations affecting sarcomeres, the part of the heart muscle that generates tension and contraction. In HCM, the heart’s chambers and valves grow asymmetrically. 
The dimension of the ventricular chamber is reduced, the interventricular septum thickens, and patients suffer from diastolic dysfunction (in which heart muscle doesn’t relax normally), causing an increased risk of sudden death from arrhythmia. With DCM, people have an enlarged left ventricular chamber accompanied by a lengthening of heart cells (myocytes) that results in reduced systolic function and eventually heart failure. Effective drug regimens to manage the conditions do not exist, although there is research looking for new drugs. The only effective treatment at present is a heart transplant. This leaves an urgent need to develop new technologies to manage, treat, cure or prevent the diseases, according to researchers. In developing the technology, scientists analyzed how sarcomeres generate tension coupled with alterations in calcium cycling, which is critical to heart function. The coupling of tension generation and calcium cycling is altered in patients with sarcomeric gene mutations. The alteration can be measured and then used to predict how the heart will change as disease progresses, Molkentin said. To study the influence of gene mutations on this process, researchers tested an array of genetically altered mice. The mouse models were either normal (wild type) mice or those expressing different gene mutations for various cardiomyopathies. This allowed researchers to examine tension generation and associated calcium cycling rates through heart muscle in a highly defined manner. That information was used to create a mathematical model for disease prediction that integrates the total tension generated by isolated cardiomyocytes. The tension-integrated, algorithm-based model was able to predict if hearts in mouse models would undergo hypertrophic or dilated cardiac growth. The scientists next tested the computational model on human cells from cardiomyopathy patients by using induced pluripotent stem cell (iPSC) technology. 
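As a rough illustration of the classification idea described above (not the study's actual model, which also couples tension generation to calcium-cycling measurements), a mutant heart cell could be labeled by whether its tension-time integral falls above or below a wild-type baseline. All traces, units and thresholds in this sketch are invented:

```python
# Hypothetical sketch of a tension-integral classifier. The thresholds,
# units, and sample traces are invented for illustration; the real model
# in the study is more elaborate.
import numpy as np

def tension_time_integral(tension, dt):
    """Integrate a twitch tension trace over time (rectangle rule)."""
    return float(np.sum(tension) * dt)

def classify(mutant_tti, wild_type_tti, tolerance=0.1):
    """Predict remodeling from the tension-time integral (TTI) relative to
    wild type: higher TTI -> hypertrophic (HCM)-like growth,
    lower TTI -> dilated (DCM)-like growth."""
    ratio = mutant_tti / wild_type_tti
    if ratio > 1.0 + tolerance:
        return "HCM-like"
    if ratio < 1.0 - tolerance:
        return "DCM-like"
    return "normal-range"

# Toy twitch traces sampled at 1 ms: half-sine "twitches" of differing
# amplitude (arbitrary units)
dt = 0.001
t = np.arange(0.0, 0.3, dt)
wild_type = np.sin(np.pi * t / 0.3)        # baseline twitch
hcm_like = 1.5 * np.sin(np.pi * t / 0.3)   # stronger, larger integral
dcm_like = 0.6 * np.sin(np.pi * t / 0.3)   # weaker, smaller integral

wt_tti = tension_time_integral(wild_type, dt)
print(classify(tension_time_integral(hcm_like, dt), wt_tti))  # HCM-like
print(classify(tension_time_integral(dcm_like, dt), wt_tti))  # DCM-like
```

The point of the sketch is only the shape of the logic: a single integrated quantity, measured from isolated cells, is mapped to a predicted direction of cardiac growth.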
Reprogrammed and derived from actual patient skin fibroblast cells, iPSCs can become virtually any cell type in the human body and then be used for scientific investigation of disease properties. Molkentin and his colleagues generated patient-specific cardiomyocytes, which under a microscope can be seen pulsating rhythmically, similar to a beating heart. Patient-derived iPSCs also carry the same genetic makeup (including mutations) as the person donating the original starter cells. In the study, iPSC heart cells developed the same cardiomyopathy tension deficits as the patients’ own hearts. Researchers then used their heart defect prediction method to see how accurately it determined the heart defect type of specific cardiomyopathy patients. With collaborators at the Stanford University School of Medicine, Molkentin and his colleagues generated four lines of developing, early-stage patient-specific iPSC heart cells (cardiomyocytes). They report that their technology accurately determined the HCM vs. DCM heart defect of the donor patients. Researchers continue to develop and test the technology by using it to determine the cardiac disease state of patients with specific mutations in a sarcomere-encoding gene. They caution the technology is years away from potential clinical use, pending further testing and refinement. The research team includes collaborators from Temple University (Philadelphia), the University of Washington (Seattle), Stanford University School of Medicine (Stanford, Calif.), Harvard Medical School-Brigham and Women’s Hospital (Boston) and the University of Minnesota Medical School (Minneapolis). Funding support for the study came in part from the National Institutes of Health (R37HL60562) and the Howard Hughes Medical Institute. About Cincinnati Children’s: Cincinnati Children’s Hospital Medical Center ranks third in the nation among all Honor Roll hospitals in U.S. News & World Report’s 2015 Best Children’s Hospitals. 
It is also ranked in the top 10 for all 10 pediatric specialties, including a #1 ranking in pulmonology and #2 in cancer and in nephrology. Cincinnati Children’s, a non-profit organization, is one of the top three recipients of pediatric research grants from the National Institutes of Health, and a research and teaching affiliate of the University of Cincinnati’s College of Medicine. The medical center is internationally recognized for improving child health and transforming delivery of care through fully integrated, globally recognized research, education and innovation. Additional information can be found at http://www.cincinnatichildrens.org/default/. Connect on the Cincinnati Children’s blog, via Facebook and on Twitter.
Newswise —  Context plays a big role in our memories, both good and bad. Bruce Springsteen's "Born to Run" on the car radio, for example, may remind you of your first love -- or your first speeding ticket. But a Dartmouth- and Princeton-led brain scanning study shows that people can intentionally forget past experiences by changing how they think about the context of those memories. The findings have a range of potential applications centered on enhancing desired memories, such as developing new educational tools, or diminishing harmful memories, including treatments for post-traumatic stress disorder. The study appears in the journal Psychonomic Bulletin and Review. A PDF is available on request. Since Ancient Greece, memory theorists have known that we use context -- or the situation we're in, including sights, sounds, smells, where we are, who we are with -- to organize and retrieve our memories. But the Dartmouth- and Princeton-led team wanted to know whether and how people can intentionally forget past experiences. They designed a functional magnetic resonance imaging (fMRI) experiment to specifically track thoughts related to memories' contexts, and put a new twist on a centuries-old psychological research technique of having subjects memorize and recall a list of unrelated words. In the new study, researchers showed participants images of outdoor scenes, such as forests, mountains and beaches, as they studied two lists of random words, manipulating whether they were told to forget or remember the first list prior to studying the second list. "Our hope was the scene images would bias the background, or contextual, thoughts that people had as they studied the words to include scene-related thoughts," says lead author Jeremy Manning, an assistant professor of psychological and brain sciences at Dartmouth. "We used fMRI to track how much people were thinking of scene-related things at each moment during our experiment. 
That allowed us to track, on a moment-by-moment basis, how those scene or context representations faded in and out of people's thoughts over time." The study's participants were told to either forget or remember the random words presented to them interspersed between scene images. Right after they were told to forget, the fMRI showed that they "flushed out" the scene-related activity from their brains. "It's like intentionally pushing thoughts of your grandmother's cooking out of your mind if you don't want to think about your grandmother at that moment," Manning says. "We were able to physically measure and quantify that process using brain data." But when the researchers told participants to remember the studied list rather than forget it, this flushing out of scene-related thoughts didn't occur. Further, the degree to which people flushed out scene-related thoughts predicted how many of the studied words they would later remember, which shows the process is effective at facilitating forgetting. The study has two important implications. "First, memory studies are often concerned with how we remember rather than how we forget, and forgetting is typically viewed as a 'failure' in some sense, but sometimes forgetting can be beneficial, too," Manning says. "For example, we might want to forget a traumatic event, as soldiers with PTSD often do. Or we might want to get old information 'out of our head,' so we can focus on learning new material. Our study identified one mechanism that supports these processes." The second implication is more subtle but also important. "It's very difficult to specifically identify the neural representations of contextual information," Manning says. "If you consider the context you experience something in, we're really referring to the enormously complex, seemingly random thoughts you had during that experience. Those thoughts are presumably idiosyncratic to you as an individual, and they're also potentially unique to that specific moment. 
So, tracking the neural representations of these things is extremely challenging because we only ever have one measurement of a particular context. Therefore, you can't directly train a computer to recognize what context 'looks like' in the brain because context is a continually moving and evolving target. In our study, we sidestepped this issue using a novel experimental manipulation -- we biased people to incorporate those scene images into the thoughts they had when they studied new words. Since those scenes were common across people and over time, we were able to use fMRI to track the associated mental representations from moment to moment." ### Dartmouth Assistant Professor Jeremy Manning is available to comment at Jeremy.R.Manning@dartmouth.edu. The study, which included scientists at Bard College and the University of Illinois at Urbana-Champaign, was supported by the John Templeton Foundation, the National Institutes of Health and the National Science Foundation.
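The decoding strategy Manning describes, training on representations that are common across people and then tracking them over time, can be caricatured in a few lines. The templates, data and nearest-centroid scoring below are synthetic stand-ins, not the study's analysis pipeline:

```python
# Toy illustration of the decoding idea: because the scene images were
# common across participants, a classifier can learn what "scene-related"
# activity looks like and score it at each timepoint. All data here are
# synthetic; this is not the study's method.
import numpy as np

rng = np.random.default_rng(1)
n_voxels = 50

# Synthetic "template" activity patterns for scene-related and baseline states
scene_template = rng.normal(0.0, 1.0, n_voxels)
baseline_template = rng.normal(0.0, 1.0, n_voxels)

def scene_evidence(volume):
    """Nearest-centroid score: how much closer a brain volume is to the
    scene template than to the baseline template (positive -> scene-like)."""
    d_scene = np.linalg.norm(volume - scene_template)
    d_base = np.linalg.norm(volume - baseline_template)
    return d_base - d_scene

# Simulate a timecourse in which scene-related thoughts fade out,
# as after a "forget" instruction
timepoints = []
for t in range(10):
    weight = max(0.0, 1.0 - 0.15 * t)  # scene signal decaying over time
    volume = (weight * scene_template
              + (1.0 - weight) * baseline_template
              + rng.normal(0.0, 0.2, n_voxels))
    timepoints.append(scene_evidence(volume))

# Evidence trends downward as the scene representation is "flushed out"
print([round(x, 2) for x in timepoints])
```

The design choice mirrored here is the one Manning highlights: context in general is unique to a person and a moment, but a deliberately shared ingredient (the scenes) gives the decoder something stable to train on.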