
New processes for cost-efficient solar cell production

Written by Unknown on Wednesday, 19 September 2012 | 08:16

ScienceDaily (Sep. 19, 2012) — The competition in the photovoltaics market is fierce. When it comes to price, Asian manufacturers are frequently ahead of the competition by a nose. Now, Fraunhofer researchers are designing new coating processes and thin layer systems that, if used, could help to reduce the price of solar cells significantly.

Scientists will unveil a few of these new processes at the EU PVSEC trade show in Frankfurt from September 25 to 28.

Many people answer with a resounding "yes!" when asked if they want environmentally friendly solar cell-based power -- as long as it is inexpensive. For this reason, a veritable price war is raging among the makers of photovoltaic cells. Above all, it is the cheap products of Asian origin that are making life tough for domestic manufacturers. Tough, that is, until now: researchers at the Fraunhofer Institute for Surface Engineering and Thin Films IST in Braunschweig are providing support to these companies. They are engineering coating processes and thin-film systems aimed at drastically lowering the production costs of solar cells.

Hot wires instead of plasma

The photovoltaic industry is pinning its hopes particularly on high-efficiency solar cells that can achieve efficiencies of up to 23 percent. These "HIT" cells (Heterojunction with Intrinsic Thin layer) consist of a crystalline silicon absorber with additional thin layers of silicon. Until now, manufacturers have used the plasma CVD process (CVD is short for Chemical Vapor Deposition) to apply these layers to the substrate: the reaction chamber is filled with silane -- a gas whose molecules consist of one silicon and four hydrogen atoms -- together with the crystalline silicon substrate. A plasma activates the gas, breaking apart the silicon-hydrogen bonds. The freed silicon atoms and silicon-hydrogen fragments settle on the surface of the substrate. But there is a problem: the plasma activates only 10 to 15 percent of the expensive silane gas; the remaining 85 to 90 percent is lost unused, at enormous cost.

The researchers at IST have now replaced this process: instead of using plasma, they activate the gas with hot wires. "This way, we can use almost all of the silane gas, so we actually recover 85 to 90 percent of the costly gas. This reduces the overall manufacturing costs of the layers by over 50 percent. The price of the wire that we need for this process is negligible when compared to the price of the silane," explains Dr. Lothar Schäfer, department head at IST. "In this respect, our system is the only one that coats the substrate continuously while it is moving -- this is also referred to as an in-line process." This is possible because the silicon film grows on the surface about five times faster than with plasma CVD -- and still with the same layer quality. At this point, the researchers are coating a surface measuring 50 by 60 centimeters; however, the process can easily be scaled up to the more common industry format of 1.4 square meters. Another advantage: the system technology is much simpler than for plasma CVD, so the system is substantially cheaper. For example, the generator that produces the electric current to heat the wires costs only around one-tenth as much as its counterpart in the plasma CVD process.
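The gas economics can be sketched with a small back-of-envelope calculation. The amount of silane fed into the chamber scales inversely with the fraction that ends up in the deposited film; the 10-15 percent and 85-90 percent utilization figures come from the article, everything else below is illustrative arithmetic:

```python
# Gas that must be fed in scales inversely with the fraction that
# actually ends up in the deposited film.
def relative_gas_cost(utilization):
    return 1.0 / utilization

plasma = relative_gas_cost(0.125)    # plasma CVD: 10-15 % used, midpoint 12.5 %
hot_wire = relative_gas_cost(0.875)  # hot-wire CVD: 85-90 % used, midpoint 87.5 %

saving = 1 - hot_wire / plasma
print(f"silane cost, plasma CVD:   {plasma:.2f} (relative units)")
print(f"silane cost, hot-wire CVD: {hot_wire:.2f}")
print(f"gas-cost reduction:        {saving:.0%}")
```

The roughly 86 percent saving on gas alone is consistent with the article's claim that overall layer manufacturing costs fall by more than half, since silane is only one component of the total cost.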

In addition, this process is also suitable for thin-film solar cells. With efficiencies of slightly more than ten percent, these have so far offered only a moderate pay-off. By tripling the solar cells, that is, by stacking three cells on top of each other, the efficiency rises considerably. But there is another problem: because each of the three cells incurs considerable material losses with plasma CVD coating, these triple photovoltaic cells are expensive. The researchers therefore see another potential use for their process: the new coating method would make the cells much more cost-effective. Triple cells could even succeed over the long term if the scarce but highly efficient germanium is used. Germanium, however, is also very expensive: for it to be a profitable choice, the layers must be applied while losing as little of the germanium as possible, for instance by using the hot-wire CVD process.

Saving 35 percent in the sputter process for transparent conductive oxide

The power generated by photovoltaic cells has to be able to flow out before it can be used. To this end, a metal contact grid is usually evaporated onto the solar cells to conduct away the resulting holes and electrons. For HIT cells, however, this grid is insufficient. Instead, transparent, conductive layers, similar to those in an LCD television, are needed over the entire surface.

This is normally done by sputtering: ceramic tiles made from aluminum-doped zinc oxide or indium-zinc oxide are atomized, and the ejected components attach to the surface, producing a thin layer. Unfortunately, the ceramic tiles are quite expensive. The researchers at IST therefore use metallic tiles, which are 80 percent cheaper than their ceramic counterparts. An electronic control ensures that the metal tiles do not oxidize, because oxidation would change the manner in which the metal sputters. "Even though the control outlay is greater, we can still lower the cost of this production process by 35 percent for 1.4 square meter coatings," says Dr. Volker Sittinger, group manager at IST.
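A hedged sketch shows how a 35 percent figure of this kind could decompose. The 80 percent saving on the metal tiles comes from the article; the cost shares below are pure assumptions for illustration:

```python
# Hypothetical cost breakdown of the sputter process (assumed numbers):
target_share = 0.50       # assumed: half the process cost is in the tiles
target_saving = 0.80      # from the article: metal tiles are 80 % cheaper
control_overhead = 0.05   # assumed: extra outlay for the oxidation control

net_saving = target_share * target_saving - control_overhead
print(f"net process-cost saving: {net_saving:.0%}")
```

Under these invented shares the net saving lands at 35 percent; the real breakdown at IST may of course differ.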

The research team intends to combine the two processes over the long term in order to make thin-film solar cells more cost-effective and, ultimately, more profitable. "You can produce all the silicon layers using hot-wire CVD, and all the transparent conductive layers by sputtering with metal tiles. In principle, these processes should also be suitable for large formats," states Sittinger. The processes are not yet ready for production, however: although the researchers already apply them to areas of many square centimeters, it will take about three to five years before they can be used in solar cell production.

Story Source:

The above story is reprinted from materials provided by Fraunhofer-Gesellschaft.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.


Note: If no author is given, the source is cited instead.

Disclaimer: Views expressed in this article do not necessarily reflect those of ScienceDaily or its staff.

Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/Xa3FXfoW08Q/120919082933.htm

Modern DNA techniques applied to nineteenth-century potatoes

ScienceDaily (Sep. 19, 2012) — Researchers led by Professor Bruce Fitt, now at the University of Hertfordshire, have used modern DNA techniques on late nineteenth-century potatoes to show how the potato blight may have survived between cropping seasons after the Irish potato famine of the 1840s.

Late blight of potato is caused by the microorganism Phytophthora infestans, which rapidly destroys the leaves of potato crops and was responsible for the infamous Irish potato famine of the 1840s, which left over one million people dead and drove another million Irish to emigrate. With growing concerns over food shortages and climate change, late blight remains a serious disease problem in current potato production and has also emerged as a significant disease threat to the organic tomato industry.

In the research paper published in Plant Pathology, DNA was extracted from the Rothamsted potato samples that had been dried, ground and stored in glass bottles in the nineteenth century. The DNA was then analysed for the presence of the potato blight pathogen.

Bruce Fitt, Professor of Plant Pathology at the University of Hertfordshire and formerly at Rothamsted Research, said: "It was the foresight of two nineteenth-century plant scientists to archive potato samples from their experiment that has enabled us to apply modern DNA techniques to better understand late potato blight and the implications for today's food security. The analysis of these late nineteenth-century potato samples is the earliest proof of how this disease survived between seasons in England."

This research has shown that the DNA technique applied to the potato samples is a very useful tool in plant disease diagnosis, for example to test seed potatoes or tomato transplants for the presence of the late blight pathogen. The technique can be developed further to test for other diseases of different plants that affect food production.

Bruce continued: "Using modern DNA techniques to detect and quantify the pathogen in potatoes enables us to better understand the spread of potato late blight. This disease is still a serious threat to worldwide potato production."

Story Source:

The above story is reprinted from materials provided by University of Hertfordshire, via AlphaGalileo.

Journal Reference:

  1. J. B. Ristaino, C. H. Hu, B. D. L. Fitt. Evidence for presence of the founder Ia mtDNA haplotype of Phytophthora infestans in 19th century potato tubers from the Rothamsted archives. Plant Pathology, 2012; DOI: 10.1111/j.1365-3059.2012.02680.x

Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/voQn23516Vs/120919083403.htm

Major changes needed to protect Australia's species and ecosystems

ScienceDaily (Sep. 18, 2012) — A landmark study has found that climate change is likely to have a major impact on Australia's plants, animals and ecosystems that will present significant challenges to the conservation of Australia's biodiversity.

The comprehensive study by CSIRO highlights the sensitivity of Australia's species and ecosystems to climate change, and the need for new ways of thinking about biodiversity conservation.

"Climate change is likely to start to transform some of Australia's natural landscapes by 2030," lead researcher, CSIRO's Dr Michael Dunlop said.

"By 2070, the ecological impacts are likely to be very significant and widespread. Many of the environments our plants and animals currently exist in will disappear from the continent. Our grandchildren are likely to experience landscapes that are very different to the ones we have known."

Dr Dunlop said climate change will magnify existing threats to biodiversity, such as habitat clearing, water extraction and invasive species. Future climate-driven changes in other sectors, such as agriculture, water supply and electricity supply, could add yet more pressure on species and ecosystems.

"These other threats have reduced the ability of native species and ecosystems to cope with the impacts of climate change," Dr Dunlop said.

One of the challenges for policy and management will be accommodating changing ecosystems and shifting species.

The study suggests the Australian community and scientists need to rethink what it means to conserve biodiversity, as managing threatened species and halting ecological change become increasingly difficult.

"We need to give biodiversity the greatest opportunity to adapt naturally in a changing and variable environment rather than trying to prevent ecological change," Dr Dunlop said.

The study highlights the need to start focusing more on maintaining the health of ecosystems as they change in response to climate change, from one type of ecosystem to another.

"This could require new expectations from the community, possibly new directions in conservation policy, and new science to guide management," Dr Dunlop said.

"To be effective we also need flexible strategies that can be implemented well ahead of the large-scale ecological change. It will probably be too late to respond once the ecological change is clearly apparent and widespread."

The study found the National Reserve System will continue to be an effective conservation tool under climate change, but conserving habitat on private land will be increasingly important to help species and ecosystems adapt.

The team of researchers from CSIRO carried out modelling across the whole of Australia, as well as detailed ecological analysis of four priority biomes, together covering around 80 per cent of Australia.

The study was funded by the Australian Government Department of Sustainability, Environment, Water, Population and Communities, the Department of Climate Change and Energy Efficiency and the CSIRO Climate Adaptation Flagship.

Further information: http://www.csiro.au/Organisation-Structure/Flagships/Climate-Adaptation-Flagship/adapt-national-reserve-system.aspx

Story Source:

The above story is reprinted from materials provided by CSIRO Australia.

Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/PgKQGLugFjg/120919103616.htm

Birth is no reason to go to hospital, review suggests

ScienceDaily (Sep. 19, 2012) — A new Cochrane Review concludes that all countries should consider establishing proper home birth services. They should also provide low-risk pregnant women with information enabling them to make an informed choice. The review was prepared by Ole Olsen, senior researcher and statistician at the Research Unit for General Practice, University of Copenhagen, and midwifery lecturer Jette Aaroe Clausen, PhD.

In many countries it is believed that the safest option for all women is to give birth in hospital. However, observational studies of steadily improving quality, conducted in different settings, suggest that in many places planned home birth can be as safe as planned hospital birth, with less intervention and fewer complications.

"If home birth is going be an attractive and safe option for most pregnant women, it has to be an integrated part of the health care system," Ole Olsen says and adds, "In several Danish regions the home birth service has been very well organized for several years. This is not the case everywhere in the world."

The updated Cochrane Review concludes that there is no strong evidence from experimental studies (randomized trials) to favor either planned hospital birth or planned home birth for low-risk pregnant women, provided the planned home birth is assisted by an experienced midwife with collaborative medical back-up in case transfer becomes necessary.

Fewer interventions in home birth

Routines and easy access to medical interventions may increase the risk of unnecessary interventions during birth, which may explain why women who give birth at home are more likely to have a spontaneous labour. Among women who plan a home birth there are 20-60 per cent fewer interventions, for example cesarean sections, epidurals and augmentation, and 10-30 per cent fewer complications, for example postpartum bleeding and severe perineal tears.

"Patience is important if women want to avoid interference and give birth spontaneously," says Jette Aaroe Clausen. "At home the temptation to make unnecessary interventions is reduced. The woman avoids for example routine electronic monitoring that may easily lead to further interventions in birth."

Jette Aaroe Clausen adds that interventions in childbirth are common in many countries, but also that there is growing concern internationally because interventions may lead to iatrogenic effects, that is, unintended consequences of the intervention itself. Routine electronic monitoring may, for example, lead to more women having artificial rupture of membranes, which in turn can lead to further interventions.

Evidence and human rights

While the scientific evidence from observational studies has been growing, the European Court of Human Rights in Strasbourg in the case Ternovszky versus Hungary has handed down a judgment stating that "the right to respect for private life includes the right to choose the circumstances of birth." This is quoted in the review.

Thus the conclusions of the review are based on human rights and ethics as well as on results from the best available scientific studies.

Story Source:

The above story is reprinted from materials provided by University of Copenhagen.

Journal Reference:

  1. Ole Olsen, David Jewell. Home versus hospital birth. The Cochrane Library, 12 Sep 2012; DOI: 10.1002/14651858.CD000352

Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of ScienceDaily or its staff.

Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/FpKVCisqAts/120919083454.htm

Tackling 'frequent flyers' won't solve the rising emergency hospital admissions problem

ScienceDaily (Sep. 18, 2012) — Patients who are regularly admitted to hospital as emergencies (known as 'frequent flyers') make up a large proportion of admissions, but focusing just on them won't solve the problem of rising admissions, say experts on bmj.com today.

Martin Roland and Gary Abel from the Cambridge Centre for Health Services Research argue that this is one of several misconceptions about emergency admissions that must be tackled if we are to reduce the number of people being admitted as emergencies. Around the world, the pressure to reduce healthcare costs is huge. Emergency hospital admissions are an expensive aspect of care and rates have been rising for several years, particularly among the elderly and those with several conditions (co-morbidities).

In the UK, many initiatives have been set up to reduce emergency admissions, mainly in primary care and with a focus on high risk patients who are thought to use a disproportionate share of resources.

But Roland and Abel argue that there are "some fundamental flaws" in this approach. Exclusively focusing on high risk patients won't solve the problem, they say, as data show that most admissions come from low and medium risk groups. Instead they suggest interventions may need to be targeted on larger population groups, such as elderly patients.

They also challenge the widespread view that improving primary care could prevent many emergency admissions and suggest that some of the rise in admissions may be due to the introduction of four hour waiting targets in A&E. They say that primary and secondary care doctors need to work together to achieve a common set of goals.

They also point to the problem of "supply induced demand" for services which could explain the apparent increase in admissions found in some studies of intensive case management (e.g. community matrons).

Apart from a few exceptions, evaluations of interventions to reduce emergency admissions have been disappointing. Evaluations need to be evidence based: they must allow for the tendency of individual patients' admission rates to fall of their own accord (a phenomenon known as regression to the mean) and take account of variation due to chance. The authors suggest some guidelines for those needing to focus on this important and expensive aspect of healthcare.
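Regression to the mean is easy to demonstrate in a simulation: select patients on a high observed admission count in one year, and their average falls the next year with no intervention at all. A minimal sketch, with all admission rates invented for illustration:

```python
import math
import random

random.seed(1)

def poisson(lam):
    # Knuth's algorithm, to keep the sketch stdlib-only.
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Each patient has a fixed true admission rate; yearly counts vary around it.
true_rates = [random.uniform(0.2, 2.0) for _ in range(10_000)]
year1 = [poisson(r) for r in true_rates]
year2 = [poisson(r) for r in true_rates]

# "Frequent flyers": patients selected on a high count in year 1.
flyers = [i for i, n in enumerate(year1) if n >= 4]
mean1 = sum(year1[i] for i in flyers) / len(flyers)
mean2 = sum(year2[i] for i in flyers) / len(flyers)

print(f"selected patients, year 1: {mean1:.2f} admissions on average")
print(f"same patients, year 2:     {mean2:.2f} (lower, with no intervention)")
```

The drop in year 2 is a pure selection effect: extreme observed counts partly reflect chance, so re-measurement pulls the group back toward its true average. An evaluation that credits this fall to a case-management programme overstates the programme's effect.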

Story Source:

The above story is reprinted from materials provided by BMJ-British Medical Journal.

Journal Reference:

  1. M. Roland, G. Abel. Reducing emergency admissions: are we on the right track? BMJ, 2012; 345 (sep18 1): e6017 DOI: 10.1136/bmj.e6017

Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/HFJoIZ2ZQ6k/120918185625.htm

Average 25% pay gap between men and women doctors largely 'inexplicable'

ScienceDaily (Sep. 18, 2012) — According to the latest survey of UK hourly pay by the Office for National Statistics (ONS), female doctors' pay lags behind that of their male colleagues by 28.6%.

This "eye opener" pay gap, which trends suggest has stood at around 25% on average since 2000, remains largely inexplicable, says John Appleby, Chief Economist at the King's Fund, in an article published on bmj.com today.

He explores possible reasons for this persistent gender divide in medicine and suggests that doctors have some way to catch up with other health care jobs.

For example, nursing auxiliaries and assistants show the smallest bias in pay towards men, writes Appleby, with women's median hourly pay being 0.1% less than men's. For nurses the pay gap widens to 1.9%.

Female paramedics' and health service managers' pay also lags behind that of their male colleagues, by 4.9% and 5.8% respectively, while at 16% the pay gap for pharmacists is nearly treble this.

Interestingly, female medical radiographers appear to earn 5.3% more than their male counterparts on average, adds Appleby.
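For readers wanting to check such figures: an ONS-style gap is simply the difference between men's and women's median hourly pay expressed as a share of men's. The pay levels below are invented for illustration; only the percentages come from the article:

```python
def pay_gap(men_median, women_median):
    """Gender pay gap as a fraction of men's median hourly pay."""
    return (men_median - women_median) / men_median

# Invented pay levels chosen to reproduce two figures quoted above:
doctors = pay_gap(40.00, 40.00 * (1 - 0.286))   # women paid 28.6 % less
radiographers = pay_gap(20.00, 20.00 * 1.053)   # women paid 5.3 % more

print(f"doctors:       {doctors:+.1%}")
print(f"radiographers: {radiographers:+.1%}")
```

A negative gap, as for the radiographers, means women's median pay exceeds men's.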

But what explains the big gap in medical practitioners' pay between men and women?

A 2009 study for the BMA suggested that some of the difference may be 'legitimate' and explained by factors such as experience, grade and administrative duties, "although why men end up with more experience or on higher grades -- and hence more pay -- begs some questions," he writes.

Nevertheless, a significant part of the pay gap appeared to be 'unexplained' by such factors. The analysis suggested that female doctors were disadvantaged due to caring roles, a 'hostile culture' and geographical limitations which reduced their ability to change jobs (a key way to increase pay).

"These are of course problems faced by women in other occupations too. But it may be that these factors are more acute for female medical practitioners," suggests Appleby.

"Maybe there are lessons to be learned from some other health care professions, where gender pay differences are closer to zero," he concludes.

Story Source:

The above story is reprinted from materials provided by BMJ-British Medical Journal.

Journal Reference:

  1. J. Appleby. Is there equal pay in healthcare? Not if you are a doctor. BMJ, 2012; 345 (sep18 1): e6191 DOI: 10.1136/bmj.e6191

Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/DiQHpA77s4I/120918185627.htm

Blood pressure diet works, but adherence drops among African-Americans

ScienceDaily (Sep. 19, 2012) — Better adherence to the DASH (Dietary Approaches to Stop Hypertension) diet is associated with significant reductions in blood pressure. However, African-Americans may be less likely than whites to adopt the diet, according to researchers at Duke University Medical Center.

The findings, which appear online Sept. 19 in the Journal of the Academy of Nutrition and Dietetics, suggest that altering traditional recipes to meet nutritional guidelines rather than eliminating certain foods altogether may result in better adherence among African-Americans.

The DASH diet is recognized as the diet of choice for preventing and managing high blood pressure. The diet is rich in fruits, vegetables, and low fat dairy products, and is low in fats and cholesterol.

"Previous research, including results from our ENCORE study, established the DASH diet as an important approach for lowering blood pressure, and for some individuals, it may be an effective alternative to taking medication for hypertension," said James A. Blumenthal, PhD, professor of behavioral medicine in the Department of Psychiatry and Behavioral Science at Duke University Medical Center. "In this study we were interested in whether dietary adherence was related to blood pressure changes and what factors predicted who would adhere to the diet."

The study was a new analysis of data from the ENCORE trial, led by Duke researchers to evaluate the effectiveness of the DASH diet on cardiovascular health. Participants were 144 sedentary, overweight or obese adults, who had high blood pressure and were not taking medication.

Researchers measured a series of clinical and behavioral factors at the start of the study including blood pressure, weight, and physical fitness, as well as dietary habits. Depression, anxiety and social support were also evaluated as potential predictors of adherence to the regimen.

Participants were randomly assigned to one of three treatment groups: the DASH diet alone; the DASH diet in combination with weight-loss counseling and aerobic exercise; or no change in diet and exercise habits.

After four months, participants in the group that got the DASH diet plus weight-loss counseling and exercise lost an average of 19 pounds, while weight remained stable in the other two groups.

Participants in both the DASH diet alone and DASH diet plus counseling groups had significant reductions in blood pressure, with greater adherence to the DASH diet producing the largest drops. The finding suggests that following the DASH diet lowers blood pressure independent of exercise and weight loss.

However, the addition of weight loss and exercise to the DASH diet promoted even greater reductions in blood pressure and improved other measures of cardiovascular health. "For overweight or obese patients with high blood pressure, clinicians should recommend the DASH diet in conjunction with exercise and weight loss for the best results," said Alan Hinderliter, MD, a cardiologist at the University of North Carolina at Chapel Hill and an investigator in this study.

The researchers noted that African-American participants were less likely than white participants to eat foods recommended in the DASH diet prior to beginning the study. While both African-American and white participants in the DASH treatment groups increased the amount of DASH foods they ate, African-Americans were less likely to adopt the DASH diet compared to their white counterparts. No other demographic, behavioral, or social variable predicted whether participants would adhere to the DASH diet.

"We need to be aware of cultural differences in dietary preferences in order to help people better adopt a DASH-friendly diet," Blumenthal said. "It is important to take into account traditional food choices and cooking practices when attempting to incorporate more DASH foods into daily meal plans."

Culturally sensitive changes to implementing the DASH diet, such as modifying traditional "soul food" recipes to meet nutritional recommendations rather than eliminating foods altogether, may result in better adherence among African-Americans.

"Given the success of the DASH diet, we know that changing lifestyles can make a significant difference in people's health," Blumenthal said. "And in the long run, if people are able to maintain changes to their diet and exercise habits, it can lead to a lower risk for heart attack and stroke."

In addition to Blumenthal, Duke researchers include Dawn Epstein, Andrew Sherwood, Patrick J. Smith, Carla Caccia, Pao-Hwa Lin, Michael A. Babyak, and Julie J. Johnson. Other researchers include Linda Craighead of Emory University, and Alan Hinderliter of the University of North Carolina at Chapel Hill.

The study was funded with grants from the National Heart, Lung, and Blood Institute (HL074103), and the General Clinical Research Center, National Institutes of Health (M01-RR-30).

Story Source:

The above story is reprinted from materials provided by Duke University Medical Center.

Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/9pfuYLjxSMc/120919083452.htm

Evolutionary history of lizards and snakes reconstructed using massive molecular dataset

ScienceDaily (Sep. 18, 2012) — A new study, published online in Biology Letters on Sept. 19, has utilized a massive molecular dataset to reconstruct the evolutionary history of lizards and snakes. The results reveal a surprising finding about the evolution of snakes: that most snakes we see living on the surface today arose from ancestors that lived underground.

The article, entitled "Resolving the phylogeny of lizards and snakes (Squamata) with extensive sampling of genes and species," describes research led by John J. Wiens, an Associate Professor in the Department of Ecology and Evolution at Stony Brook University. The study was based on 44 genes and 161 species of lizards and snakes, one of the largest genetic datasets assembled for reptiles.

The results show that almost all groups of snakes arose from within a bizarre group of burrowing blind snakes called scolecophidians. This finding implies that snakes ancestrally lived underground, and that the thousands of snake species living today on the surface evolved from these subterranean ancestors.

The authors suggest that there are still traces of this subterranean ancestry in the anatomy of surface-dwelling snakes. "For example, no matter where they live, snakes have an elongate body and a relatively short tail, and outside of snakes, this body shape is only found in lizards that live underground," said Professor Wiens. "Snakes have kept this same basic body shape as they have evolved to invade nearly every habitat on the planet -- from rainforest canopies to deserts and even the oceans."

Co-authors of the study include Carl R. Hutter, Daniel G. Mulcahy, Brice P. Noonan, Ted M. Townsend, Jack W. Sites Jr., and Tod W. Reeder. The work was performed at Stony Brook University, Brigham Young University, and San Diego State University.



Story Source:

The above story is reprinted from materials provided by Stony Brook University.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.


Journal Reference:

  1. J. J. Wiens, C. R. Hutter, D. G. Mulcahy, B. P. Noonan, T. M. Townsend, J. W. Sites, T. W. Reeder. Resolving the phylogeny of lizards and snakes (Squamata) with extensive sampling of genes and species. Biology Letters, 2012; DOI: 10.1098/rsbl.2012.0703

Note: If no author is given, the source is cited instead.

Disclaimer: Views expressed in this article do not necessarily reflect those of ScienceDaily or its staff.

19 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/iRDhWGVfsCk/120919081834.htm

The cost of glaucoma care: Small group of patients accounts for large part of costs

Written By Unknown on Tuesday, 18 September 2012 | 17.43

ScienceDaily (Sep. 18, 2012) — A small subset of patients with open-angle glaucoma (OAG) accounts for a large proportion of all glaucoma-related charges in the United States, according to new data published by researchers at the University of Michigan Kellogg Eye Center and Washington University in St. Louis.

These findings have importance for future evaluations of the cost-effectiveness of screening and treatment for glaucoma.

"We've identified risk factors associated with patients who are the costliest recipients of glaucoma-related eye care," says Joshua D. Stein, M.D., M.S., glaucoma specialist at Kellogg. "Among these factors are younger age, living in the northeastern United States, undergoing cataract surgery, and having other eye conditions. Understanding the characteristics of these individuals and finding ways to reduce disease burden and costs associated with their care can result in substantial cost savings."

The study, published in the September 2012 issue of the American Journal of Ophthalmology, reviewed claims data from 19,927 patients with newly diagnosed OAG who were enrolled in a large U.S. managed care network.

The researchers identified glaucoma-related charges for all such patients from 2001 through 2009. They found that the costliest 5 percent of enrollees were responsible for $10,202,871, or 24 percent, of all glaucoma-related charges. They also found that glaucoma patients generally consume the greatest relative share of resources during their first six months of care after diagnosis.
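The reported figures permit a quick back-of-envelope check. The Python sketch below is illustrative only and is not part of the study; the implied overall total and the per-patient averages are derived from the article's numbers, not reported by the authors:

```python
# Figures quoted in the article; everything derived from them is approximate.
top5_charges = 10_202_871   # charges attributed to the costliest 5% of enrollees
top5_share = 0.24           # their share of all glaucoma-related charges
n_patients = 19_927         # enrollees with newly diagnosed OAG

implied_total = top5_charges / top5_share            # ~$42.5 million overall
n_top5 = round(n_patients * 0.05)                    # ~996 patients
mean_top5 = top5_charges / n_top5                    # average charge, costliest group
mean_rest = (implied_total - top5_charges) / (n_patients - n_top5)

print(f"implied total charges: ${implied_total:,.0f}")
print(f"costliest 5% ({n_top5} patients): ${mean_top5:,.0f} each on average")
print(f"remaining 95%: ${mean_rest:,.0f} each on average")
```

The contrast between the two averages is the point of the study: a small group drives a disproportionate share of spending.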

"Although there have been several studies examining the cost of caring for patients with glaucoma, most have been based on individuals who have already been diagnosed, and few have examined changes in cost of care over time," says Stein. "In this investigation, we examined two questions: What is the pattern of resource use for patients with OAG during the first seven years after disease onset, and what are the characteristics of those patients who have the greatest glaucoma-related resource use."

A chronic, progressive, incurable disease, OAG affects more than 2 million individuals in the United States and many more worldwide. It is the most common form of glaucoma in the United States and the most common cause of blindness among African Americans. Caring for patients with OAG in the United States carries a total societal cost estimated at nearly $1 billion annually.

"Developing an understanding of the resource use of people with glaucoma and identifying those expected to have the largest resource use is important in a resource-constrained health care environment," says Stein. "Further, by collecting longitudinal information on resource use we can better quantify the value of slowing glaucoma progression through various interventions."

Stein is a member of U-M's Institute for Healthcare Policy and Innovation, which brings together hundreds of U-M researchers who study and test ways to improve patient care.

Citation: Longitudinal Trends in Resource Use in an Incident Cohort of Open-Angle Glaucoma Patients: Resource Use in Open-Angle Glaucoma. American Journal of Ophthalmology, September 2012.

Authors: Joshua D. Stein, M.D., M.S.; Leslie M. Niziol, M.S.; David C. Musch, Ph.D., M.P.H; Paul P. Lee, M.D., J.D.; Sameer V. Kotak, M.S.; Colleen M. Peters, M.A.; Steven M. Kymes, Ph.D.

For more information about glaucoma care and research at the Kellogg Eye Center, visit http://kellogg.umich.edu/patientcare/glaucoma.service.html



Story Source:

The above story is reprinted from materials provided by University of Michigan Health System.



Journal Reference:

  1. Joshua D. Stein, Leslie M. Niziol, David C. Musch, Paul P. Lee, Sameer V. Kotak, Colleen M. Peters, Steven M. Kymes. Longitudinal Trends in Resource Use in an Incident Cohort of Open-Angle Glaucoma Patients: Resource Use in Open-Angle Glaucoma. American Journal of Ophthalmology, 2012; 154 (3): 452 DOI: 10.1016/j.ajo.2012.03.032


Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of ScienceDaily or its staff.

Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/DVkblisX8Zw/120918184758.htm

Compound found in purple corn may aid in developing future treatments for type 2 diabetes, kidney disease

ScienceDaily (Sep. 18, 2012) — Diabetic nephropathy is one of the most serious complications related to diabetes, often leading to end-stage kidney disease. Purple corn grown in Peru and Chile is a relative of blue corn, which is readily available in the U.S. The maize is rich in anthocyanins (a class of flavonoids), which are reported to have anti-diabetic properties.

Scientists from the Department of Food and Nutrition and Department of Biochemistry at Hallym University in Korea investigated the cellular and molecular activity of purple corn anthocyanins (PCA) to determine whether and how it affects the development of diabetic nephropathy (DN). Their findings suggest that PCA inhibits multiple pathways involved in the development of DN, which may help in developing therapies aimed at type 2 diabetes and kidney disease.

The study is entitled "Purple corn anthocyanins inhibit diabetes-associated glomerular monocyte activation and macrophage infiltration." It appears in the online edition of the American Journal of Physiology -- Renal Physiology, published by the American Physiological Society.

Methodology

Researcher Min-Kyung Kang and colleagues performed a two-part study, an in vitro experiment investigating the effects of PCA on human endothelial cells cultured under hyperglycemic kidney conditions and an in vivo study that investigated the effects of PCA on kidney tissue in diabetic mice. In the in vitro experiment, cultured cells were exposed to 1-20 µg/ml of PCA for six hours (control cells were not exposed), then assessed for level of monocyte-endothelial cell adhesion, a major factor in the development of diabetic glomerulosclerosis. In the in vivo experiment, diabetic and control mice were dosed with PCA for eight weeks, then changes in kidney tissue were assessed and immunohistological analyses were performed. Kidney tissue was further analyzed for levels of inflammatory chemokines, which are key components in DN.

Results

Researchers found that in human endothelial cells cultured in hyperglycemic kidney conditions, induction of endothelial cell adhesion molecules decreased in a dose-dependent manner with PCA exposure, meaning that the PCA likely interfered with cell-cell adhesion in glomeruli. PCA also appeared to interfere with leukocyte recruitment and adhesion to glomerular endothelial cells. In diabetic mice, PCA exposure slowed mesangial expansion and interrupted the cellular signaling pathway that may instigate glomerular adhesion and infiltration of inflammatory cells responsible for diabetic glomerulosclerosis. Finally, PCA inhibited levels of macrophage inflammatory protein-2 and monocyte chemotactic protein-1 in kidney tissue, demonstrating that it may inhibit macrophage infiltration, which is closely related to renal inflammation.

Importance of the Findings

The research suggests that anthocyanins may be the main biofunctional compound in purple corn and could protect against mesangial activation of monocytes and infiltration of macrophages in glomeruli -- the two major contributors to DN. It further suggests that renoprotection by PCA against mesangial activation may offer a specific therapy targeting diabetes-associated glomerulosclerosis and renal inflammation. Finally, PCA supplementation may be an important strategy in preventing renal vascular disease in type 2 diabetes.

"PCA may be a potential renoprotective agent treating diabetes-associated glomerulosclerosis," wrote the researchers.

Research Team

In addition to Min-Kyung Kang, the study team included Jing Li, Ju-Hyun Gong, Su-Nam Kwak, Jung Han Yoon Park, Soon Sung Lim and Young-Hee Kang, all also of the Department of Food and Nutrition at Hallym University in Korea, and Jae-Yong Lee, of the Department of Biochemistry at Hallym University.

Funding

This study was funded by a grant from the Ministry of Food, Agriculture, Forestry and Fisheries through Korea Institute of Planning and Evaluation for Technology of Food, Agriculture, Forestry and Fisheries; and by the National Research Foundation of Korea.



Story Source:

The above story is reprinted from materials provided by American Physiological Society (APS).



Journal Reference:

  1. M.-K. Kang, J. Li, J.-L. Kim, J.-H. Gong, S.-N. Kwak, J. H. Y. Park, J.-Y. Lee, S. S. Lim, Y.-H. Kang. Purple corn anthocyanins inhibit diabetes-associated glomerular monocyte activation and macrophage infiltration. AJP: Renal Physiology, 2012; DOI: 10.1152/ajprenal.00106.2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/Yp-PtOFU4gs/120918184756.htm

Nanoparticles detect biochemistry of inflammation

ScienceDaily (Sep. 18, 2012) — Inflammation is the hallmark of many human diseases, from infection to neurodegeneration. The chemical balance within a tissue is disturbed, resulting in the accumulation of reactive oxygen species (ROS) such as hydrogen peroxide, which can cause oxidative stress and associated toxic effects.

Although some ROS are important in cell signaling and the body's defense mechanisms, these chemicals also contribute to and are indicators of many diseases, including cardiovascular dysfunction. A non-invasive way of detecting measurable, low levels of hydrogen peroxide and other ROS would provide a viable way to detect inflammation. Such a method would also provide a way to selectively deliver drugs to their targets.

Adah Almutairi, PhD, associate professor at the Skaggs School of Pharmacy and Pharmaceutical Sciences, the Department of NanoEngineering, and the Materials Science and Engineering Program at the University of California, San Diego, and colleagues have developed the first degradable polymer that is extremely sensitive to low but biologically relevant concentrations of hydrogen peroxide.

Their work is currently published in the online issue of the Journal of the American Chemical Society.

These polymeric capsules, or nanoparticles, are taken up by macrophages and neutrophils -- immune system cells that rush to the site of inflammation. The nanoparticles then release their contents when they degrade in the presence of hydrogen peroxide produced by these cells.

"This is the first example of a biocompatible way to respond to oxidative stress and inflammation," said Almutairi, director of the UC San Diego Laboratory of Bioresponsive Materials. "Because the capsules are tailored to biodegrade and release their cargo when encountering hydrogen peroxide, they may allow for targeted drug delivery to diseased tissue."

Almutairi is looking to test this method in a model of atherosclerosis. "Cardiologists have long needed a non-invasive method to determine which patients are vulnerable to a heart attack caused by ruptured plaque in the arteries before the attack," she said. "Since the most dangerous of plaques is inflamed, our system could provide a safe way to detect and treat this disease."

Additional contributors to the study include Caroline de Gracia Lux, Shivanjali Joshi-Barr, Trung Nguyen, Enas Mahmoud, Eric Schopf and Nadezda Fomina.

This research was supported by the NIH Director's New Innovator Award 1DP2OD006499-01 and a King Abdulaziz City for Science and Technology center grant to the Center of Excellence in Nanomedicine at UC San Diego.



Story Source:

The above story is reprinted from materials provided by University of California, San Diego.



Journal Reference:

  1. Caroline de Gracia Lux, Shivanjali Joshi-Barr, Trung Nguyen, Enas Mahmoud, Eric Schopf, Nadezda Fomina, Adah Almutairi. Biocompatible Polymeric Nanoparticles Degrade and Release Cargo in Response to Biologically Relevant Levels of Hydrogen Peroxide. Journal of the American Chemical Society, 2012; DOI: 10.1021/ja303372u


Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/c1N28sMKcJQ/120918184800.htm

Food supplements have little effect on the weight of malnourished children

ScienceDaily (Sep. 18, 2012) — Providing energy dense food supplements within a general household food distribution has little effect on the weight of children at risk of malnutrition, new research shows.

Giving energy-dense food supplements -- Ready-to-Use Supplementary Food (RUSF), a lipid-based nutrient supplement -- to young children in addition to a general food distribution in a country with food shortages (Chad) did not reduce levels of wasting (low weight for height, a sign of acute undernutrition), but it did slightly increase the children's height and hemoglobin levels. The study, conducted by the international non-governmental organization Action Against Hunger-France (ACF-France) in collaboration with European researchers, is published in this week's PLOS Medicine.

In emergency situations, international aid organizations support affected populations by distributing food and sometimes by also providing nutritional supplements, such as RUSF, to children at risk of malnutrition. In a cluster randomized controlled trial, researchers from Belgium and France, led by Lieven Huybregts from Ghent University in Belgium, investigated the effect of a targeted daily dose of RUSF in 6- to 36-month-old children by randomly assigning fourteen household clusters in the city of Abeche, Chad, to an intervention or control arm. All the households received a general food distribution that included staple foods, but eligible children in the intervention households were also given a daily RUSF ration.
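The allocation step of such a trial can be illustrated with a minimal sketch of cluster-level randomization: whole household clusters, not individual children, are assigned to an arm. This is hypothetical code for illustration only; the function name, the seed, and the balanced split are assumptions, not the trial's actual procedure:

```python
import random

def randomize_clusters(cluster_ids, seed=None):
    """Assign whole clusters to two arms, so every eligible child in a
    cluster receives the same treatment (the defining feature of a
    cluster randomized trial)."""
    rng = random.Random(seed)        # seeded for a reproducible allocation
    shuffled = cluster_ids[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"intervention": sorted(shuffled[:half]),
            "control": sorted(shuffled[half:])}

# Fourteen household clusters, as in the article.
arms = randomize_clusters(list(range(1, 15)), seed=42)
print(arms)  # seven clusters per arm
```

Randomizing at the cluster level avoids sharing of rations between neighboring intervention and control children, at the cost of needing the cluster-adjusted statistics the authors used.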

At the end of the study period, the researchers found that the addition of RUSF to the household food rations had little effect on the incidence of wasting. However, compared to the children in the control group, those in the intervention group had a greater gain in height-for-age, slightly higher hemoglobin levels, and lower rates of diarrhea and fever, as reported by the child's parents.

The authors say: "Adding child-targeted RUSF supplementation to a general food distribution resulted in increased hemoglobin status and linear growth, accompanied by a reduction in diarrhea and fever episodes. However, we could not find clear evidence that adding RUSF to a household food ration distribution of staple foods was more effective in preventing acute malnutrition."

The authors continue: "Other context-specific alternatives for preventing acute malnutrition should therefore be investigated."

And in an accompanying Perspective article, Kathryn Dewey and Mary Arimond from the University of California in the USA (uninvolved in the study), say: "There is clearly a need for additional research to understand the potential growth-promoting effect of certain ingredients in Lipid-based Nutritional supplements (e.g., milk powder, essential fatty acids). The new study by Huybregts et al. is an important contribution to the evidence base."

Dewey and Arimond add: "High-quality programmatic studies can help provide urgently needed information on the cost and comparative cost effectiveness of different integrated strategies for filling nutrient gaps and promoting healthy growth."



Story Source:

The above story is reprinted from materials provided by Public Library of Science.



Journal References:

  1. Lieven Huybregts, Freddy Houngbé, Cécile Salpéteur, Rebecca Brown, Dominique Roberfroid, Myriam Ait-Aissa, Patrick Kolsteren. The Effect of Adding Ready-to-Use Supplementary Food to a General Food Distribution on Child Nutritional Status and Morbidity: A Cluster-Randomized Controlled Trial. PLoS Medicine, 2012; 9 (9): e1001313 DOI: 10.1371/journal.pmed.1001313
  2. Kathryn G. Dewey, Mary Arimond. Lipid-Based Nutrient Supplements: How Can They Combat Child Malnutrition? PLoS Medicine, 2012; 9 (9): e1001314 DOI: 10.1371/journal.pmed.1001314


Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/GQILNxTnwnk/120918185621.htm

Is magnetic therapy effective for tinnitus?

ScienceDaily (Sep. 18, 2012) — Loyola University Medical Center is studying whether a new form of non-invasive magnetic therapy can help people who suffer debilitating tinnitus (ringing in the ears).

The therapy, transcranial magnetic stimulation (TMS), sends short pulses of magnetic fields to the brain. TMS has been approved since 2009 for patients who have major depression and have failed at least one antidepressant.

The Loyola study will include patients who suffer from both depression and tinnitus. Recent studies have found that about 12 percent of people with chronic tinnitus also suffer depression and anxiety -- a rate three times higher than that of the general population.

Tinnitus is the perception of sound in one or both ears when there is no external source. It can include ringing, hissing, roaring, whistling, chirping or clicking. About 50 million Americans have at least some tinnitus; 16 million seek medical attention and about 2 million are seriously debilitated, according to the American Tinnitus Association. There is no cure.

The perception of phantom sounds can be more pronounced in people who are depressed. Moreover, antidepressant medications can occasionally cause tinnitus, said Dr. Murali Rao, principal investigator of Loyola's TMS tinnitus study.

Several earlier studies have found that TMS can benefit tinnitus patients. Loyola's study is the first to examine patients who suffer from both tinnitus and depression. "The combination of these two conditions can be extremely debilitating," Rao said.

During TMS treatment, the patient reclines in a comfortable padded chair. A magnetic coil, placed next to the left side of the head, sends short pulses of magnetic fields to the surface of the brain. This produces currents that stimulate brain cells. The currents, in turn, affect mood-regulatory circuits deeper in the brain. The resulting changes in the brain appear to be beneficial to patients who suffer depression. Each treatment lasts 35 to 40 minutes.

The study will enroll 10 to 15 patients. Each patient will receive five treatments a week for four to six weeks, for a total of 20 to 30 treatments. Each patient will be evaluated by a physician three times during the treatment course, or more frequently if the doctor deems necessary.

The treatments do not require anesthesia or sedation. Afterward, a patient can immediately resume normal activities, including driving. Studies have found that patients do not experience memory loss or seizures. Side effects include mild headache or tingling in the scalp, which can be treated with Tylenol.

Rao is chair of the Department of Psychiatry and Behavioral Neurosciences of Loyola University Chicago Stritch School of Medicine. His co-investigator in the study is Sam Marzo, MD, medical director of Loyola's Balance and Hearing Center. Other investigators are Matthew Niedzwiecki, MD, a psychiatry resident; and James Sinacore, PhD, a statistician.



Story Source:

The above story is reprinted from materials provided by Loyola University Health System, via Newswise.




Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/ewazQZO71oE/120918184754.htm

Statins are unlikely to prevent blood clots, large analysis finds

ScienceDaily (Sep. 18, 2012) — Despite previous studies suggesting the contrary, statins (cholesterol-lowering drugs) may not prevent blood clots (venous thrombo-embolism) in adults, according to a large analysis by international researchers published in this week's PLOS Medicine.

In 2009, an additional analysis of data from a randomized controlled trial called the JUPITER trial reported that the statin rosuvastatin halved the risk of venous thromboembolic events among apparently healthy adults. However, this finding was based on a small number of patients who had thromboembolic events (34 vs. 60). To gather more evidence about the possible benefits of statins, a group of international researchers led by Kazem Rahimi from the George Centre for Healthcare Innovation at the University of Oxford in the UK performed a meta-analysis combining the results of 29 suitable published and unpublished randomised controlled trials of the effects of statins, involving over 100,000 participants and more than 1,000 events. Only two studies presented venous thrombotic events in the published report, but such events had been recorded as adverse events in all of the included trials, which allowed the authors to include them in the analysis.

In the combined analysis, the authors found that venous thrombosis occurred in 0.9% of people taking statins compared to 1% of people not taking statins, suggesting that statins have a very small effect, if any. These results did not change when the findings of the JUPITER trial were excluded. The authors also found no effect in people taking either high or low doses of statins.
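To see what a 0.9% versus 1.0% event rate amounts to, the sketch below uses hypothetical round counts consistent with the article's "over 100,000 participants and more than 1,000 events"; it is a crude aggregate illustration, not the paper's actual per-trial meta-analysis:

```python
# Hypothetical counts chosen to match the reported 0.9% vs 1.0% risks.
events_statin, n_statin = 450, 50_000
events_control, n_control = 500, 50_000

risk_statin = events_statin / n_statin        # 0.9%
risk_control = events_control / n_control     # 1.0%
risk_ratio = risk_statin / risk_control
absolute_reduction = risk_control - risk_statin

print(f"risk ratio: {risk_ratio:.2f}")
print(f"absolute risk reduction: {absolute_reduction:.3%}")
```

The absolute difference of about one event per thousand people is why the authors describe the effect as very small, if present at all, even though the ratio alone (0.90) might look meaningful.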

The authors conclude: "this study provides a more detailed assessment of the potential effects of statins (or higher dose statins) on venous thromboembolic events than has previously been possible. We were unable to confirm the large proportional reduction in risk suggested by some previous studies."

The authors add: "However, a more modest but perhaps clinically worthwhile reduction in venous thromboembolic events in some or all types of patient cannot be ruled out."

In an accompanying Perspective article, Frits Rosendaal from the Leiden University Medical Center in The Netherlands (uninvolved in the study) argues that even if the study cannot provide definite answers to the statin question, some tentative conclusions can be drawn. He says: "Firstly, that for the association between statins and venous thrombosis the methodologically strongest analysis shows at most a very small effect. Secondly, if we do not wish to discard the possibility of a beneficial effect for the whole class, any such effects are limited to rosuvastatin."



Story Source:

The above story is reprinted from materials provided by Public Library of Science.



Journal References:

  1. Kazem Rahimi, Neeraj Bhala, Pieter Kamphuisen, Jonathan Emberson, Sara Biere-Rafi, Vera Krane, Michele Robertson, John Wikstrand, John McMurray. Effect of Statins on Venous Thromboembolic Events: A Meta-analysis of Published and Unpublished Evidence from Randomised Controlled Trials. PLoS Medicine, 2012; 9 (9): e1001310 DOI: 10.1371/journal.pmed.1001310
  2. Frits R. Rosendaal. Statins and Venous Thrombosis: A Story Too Good to Be True? PLoS Medicine, 2012; 9 (9): e1001311 DOI: 10.1371/journal.pmed.1001311


Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/1gJc4g-9Re8/120918185623.htm

Music underlies language acquisition, theorists propose

ScienceDaily (Sep. 18, 2012) — Contrary to the prevailing theories that music and language are cognitively separate or that music is a byproduct of language, theorists at Rice University's Shepherd School of Music and the University of Maryland, College Park (UMCP) advocate that music underlies the ability to acquire language.

"Spoken language is a special type of music," said Anthony Brandt, co-author of a theory paper published online this month in the journal Frontiers in Cognitive Auditory Neuroscience. "Language is typically viewed as fundamental to human intelligence, and music is often treated as being dependent on or derived from language. But from a developmental perspective, we argue that music comes first and language arises from music."

Brandt, associate professor of composition and theory at the Shepherd School, co-authored the paper with Shepherd School graduate student Molly Gebrian and L. Robert Slevc, UMCP assistant professor of psychology and director of the Language and Music Cognition Lab.

"Infants listen first to sounds of language and only later to its meaning," Brandt said. He noted that newborns' extensive abilities in different aspects of speech perception depend on the discrimination of the sounds of language -- "the most musical aspects of speech."

The paper cites various studies that show what the newborn brain is capable of, such as the ability to distinguish the phonemes, or basic distinctive units of speech sound, and such attributes as pitch, rhythm and timbre.

The authors define music as "creative play with sound." They said the term "music" implies an attention to the acoustic features of sound irrespective of any referential function. As adults, people focus primarily on the meaning of speech. But babies begin by hearing language as "an intentional and often repetitive vocal performance," Brandt said. "They listen to it not only for its emotional content but also for its rhythmic and phonemic patterns and consistencies. The meaning of words comes later."

Brandt and his co-authors challenge the prevailing view that music cognition matures more slowly than language cognition and is more difficult. "We show that music and language develop along similar time lines," he said.

Infants initially don't distinguish well between their native language and all the languages of the world, Brandt said. Throughout the first year of life, they gradually home in on their native language. Similarly, infants initially don't distinguish well between their native musical traditions and those of other cultures; they start to home in on their own musical culture at the same time that they home in on their native language, he said.

The paper explores many connections between listening to speech and music. For example, recognizing the sound of different consonants requires rapid processing in the temporal lobe of the brain. Similarly, recognizing the timbre of different instruments requires temporal processing at the same speed -- a feature of musical hearing that has often been overlooked, Brandt said.

"You can't distinguish between a piano and a trumpet if you can't process what you're hearing at the same speed that you listen for the difference between 'ba' and 'da,'" he said. "In this and many other ways, listening to music and speech overlap." The authors argue that from a musical perspective, speech is a concert of phonemes and syllables.

"While music and language may be cognitively and neurally distinct in adults, we suggest that language is simply a subset of music from a child's view," Brandt said. "We conclude that music merits a central place in our understanding of human development."

Brandt said more research on this topic might lead to a better understanding of why music therapy is helpful for people with reading and speech disorders. People with dyslexia often have problems with the performance of musical rhythm. "A lot of people with language deficits also have musical deficits," Brandt said.

More research could also shed light on rehabilitation for people who have suffered a stroke. "Music helps them reacquire language, because that may be how they acquired language in the first place," Brandt said.

The research was supported by Rice's Office of the Vice Provost for Interdisciplinary Initiatives, the Ken Kennedy Institute for Information Technology and the Shepherd School of Music.



Story Source:

The above story is reprinted from materials provided by Rice University.



Journal Reference:

  1. Anthony Brandt, Molly Gebrian, L. Robert Slevc. Music and Early Language Acquisition. Frontiers in Psychology, 2012; 3 DOI: 10.3389/fpsyg.2012.00327


19 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/H_HjBmQZ7Vs/120918185629.htm

Extreme temperatures may raise risk of premature cardiovascular death

ScienceDaily (Sep. 17, 2012) — Extreme temperatures during heat waves and cold spells may increase the risk of premature cardiovascular disease (CVD) death, according to new research in Circulation: Cardiovascular Quality and Outcomes, an American Heart Association journal.

The study in Brisbane, Australia, is the first in which researchers examined the association between daily average temperature and "years of life lost" due to CVD. Years of life lost measures premature death as the gap between a person's age at death and the average life expectancy.
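As a rough illustration of the measure (this is not the study's method, and the life-expectancy figure and ages below are invented), years of life lost can be sketched as:

```python
# Illustrative sketch of "years of life lost" (YLL). The study's actual
# life tables are not reproduced here -- the 82-year figure is an assumption.
def years_of_life_lost(ages_at_death, life_expectancy=82.0):
    """Sum each death's shortfall from average life expectancy (floored at 0)."""
    return sum(max(life_expectancy - age, 0.0) for age in ages_at_death)

# Three hypothetical CVD deaths at ages 60, 75 and 90:
print(years_of_life_lost([60, 75, 90]))  # 22 + 7 + 0 = 29.0 years
```

Deaths after the average life expectancy contribute zero, so the measure weights early deaths most heavily.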

The findings are important because of how the body responds to temperate extremes, the growing obesity trend and Earth's climate changes, said Cunrui Huang, M.Med., M.S.P.H., the study's lead researcher and a Ph.D. scholar at the School of Public Health and Institute of Health and Biomedical Innovation at Queensland University of Technology (QUT) in Brisbane, Australia.

Exposure to extreme temperatures can trigger changes in blood pressure, blood thickness, cholesterol and heart rate, according to previous research.

"With increasing rates of obesity and related conditions, including diabetes, more people will be vulnerable to extreme temperatures and that could increase the future disease burden of extreme temperatures," Huang said.

Researchers collected data on daily temperatures in Brisbane, Australia between 1996 and 2004 and compared them to documented cardiovascular-related deaths for the same period.

Brisbane has hot, humid summers and mild, dry winters. The average daily mean temperature was 68.9 degrees Fahrenheit (20.5 degrees Celsius), with the coldest 1 percent of days (53 °F / 11.7 °C) characterized as cold spells and the hottest 1 percent (84.5 °F / 29.2 °C) as heat waves.
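The percentile-based classification described above can be sketched as follows (the temperature series is simulated for illustration, not the Brisbane data):

```python
import random

# Simulated daily mean temperatures in deg C; the 20.5 mean matches the
# article, but the series itself is made up for illustration.
random.seed(0)
temps = [random.gauss(20.5, 5.0) for _ in range(3000)]

srt = sorted(temps)
cold_cutoff = srt[int(0.01 * len(srt))]   # ~1st percentile -> "cold spell"
heat_cutoff = srt[int(0.99 * len(srt))]   # ~99th percentile -> "heat wave"

cold_spell_days = [t for t in temps if t <= cold_cutoff]
heat_wave_days = [t for t in temps if t >= heat_cutoff]
```

By construction, roughly 1 percent of days fall into each extreme category regardless of the local climate, which lets the same definition be applied to hot and cold cities alike.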

Per 1 million people, 72 years of life were lost per day due to CVD, researchers said.

Risk of premature CVD death rose more when extreme heat was sustained for two or more days, researchers found.

"This might be because people become exhausted due to the sustained strain on their cardiovascular systems without relief, or health systems become overstretched and ambulances take longer to reach emergency cases," said Adrian G. Barnett, Ph.D., co-author of the study and associate professor of biostatistics at QUT. "We suspect that people take better protective actions during prolonged cold weather, which might be why we did not find as great a risk of CVD during cold spells."

Spending a few hours daily in a temperate environment can help reduce heat- and cold-related illnesses and deaths, Barnett said.

The researchers acknowledged that the findings may not apply to other communities and that they only considered deaths where CVD was the underlying cause.

Other co-authors are: Xiaoming Wang, Ph.D. and Shilu Tong, Ph.D. Funding and author disclosures are on the manuscript.

Learn about protecting your heart in the heat and the impact of cold weather on cardiovascular disease. For the latest heart and stroke news, follow us on twitter: @HeartNews.



Story Source:

The above story is reprinted from materials provided by American Heart Association.



Journal Reference:

  1. C. Huang, A. G. Barnett, X. Wang, S. Tong. Effects of Extreme Temperatures on Years of Life Lost for Cardiovascular Deaths: A Time Series Study in Brisbane, Australia. Circulation: Cardiovascular Quality and Outcomes, 2012; 5 (5): 609 DOI: 10.1161/CIRCOUTCOMES.112.965707



19 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/hPhJRKcrCBY/120918184816.htm

Mathematicians show how shallow water may help explain tsunami power

ScienceDaily (Sep. 18, 2012) — While wave watching is a favorite pastime of beachgoers, few notice what is happening in the shallowest water. A closer look by two University of Colorado Boulder applied mathematicians has led to the discovery of interacting X- and Y-shaped ocean waves that may help explain why some tsunamis are able to wreak so much havoc.

Professor Mark Ablowitz and doctoral student Douglas Baldwin repeatedly observed such wave interactions in ankle-deep water at both Nuevo Vallarta, Mexico, and Venice Beach, Calif., in the Pacific Ocean -- interactions that were thought to be very rare but which actually happen every day near low tide. There they saw single, straight waves interacting with each other to form X- and Y-shaped waves as well as more complex wave structures, all predicted by mathematical equations, said Ablowitz.

When most ocean waves collide, the "interaction height" is the sum of the incoming wave heights, said Baldwin. "But the wave heights that we saw from such interactions were much taller, indicating that they are what we call nonlinear," he said.

Satellite observations of the 2011 tsunami generated by the devastating earthquake that struck Japan indicate there was an X-shaped wave created by the merger of two large waves. "This significantly increased the destructive power of the event," said Ablowitz. "If the interaction had happened at a much greater distance from shore, the devastation could have been even worse as the amplitude could have been even larger. Not every tsunami is strengthened by interacting waves, but when they do intersect there can be a powerful multiplier because of the nonlinearity."

Ablowitz first observed the nonlinear wave action in 2009 while visiting Nuevo Vallarta just north of Puerto Vallarta with his family. He took hundreds of photographs and videos of the peculiar waves over the next several years.

"Unlike most new physics, you can see these interactions without expensive equipment or years of training," said Ablowitz. "A person just needs to go to a flat beach, preferably near a jetty, within a few hours of low tide and know what to look for."

A paper on the subject by Ablowitz and Baldwin was published this month in the journal Physical Review E.

Baldwin, who is studying under Ablowitz, wanted to go the extra mile to verify that the wave interactions observed by his professor were not unique to one beach. In this case he drove more than 1,000 miles to the Los Angeles area "on a whim" to search for the types of waves Ablowitz had observed in Mexico. He hit the jackpot at Venice Beach.

"I don't think there is anything more enjoyable in science than discovering something by chance, predicting something you haven't seen, and then actually seeing what you predicted," said Baldwin.

To see photos and videos of the wave interactions visit http://www.douglasbaldwin.com/nl-waves.html and http://www.markablowitz.com/line-solitons.



Story Source:

The above story is reprinted from materials provided by University of Colorado at Boulder.



Journal Reference:

  1. Mark Ablowitz, Douglas Baldwin. Nonlinear shallow ocean-wave soliton interactions on flat beaches. Physical Review E, 2012; 86 (3) DOI: 10.1103/PhysRevE.86.036305



19 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/yhthFveySt8/120918185725.htm

Adult obesity rates could exceed 60 percent in 13 U.S. states by 2030, according to new study

ScienceDaily (Sep. 18, 2012) — The number of obese adults, along with related disease rates and health care costs, is on course to increase dramatically in every state in the United States over the next 20 years, according to F as in Fat: How Obesity Threatens America's Future 2012, a report released today by Trust for America's Health (TFAH) and the Robert Wood Johnson Foundation (RWJF).

For the first time, the annual report includes an analysis that forecasts 2030 adult obesity rates in each state and the likely resulting rise in obesity-related disease rates and health care costs. By contrast, the analysis also shows that states could prevent obesity-related diseases and dramatically reduce health care costs if they reduced the average body mass index of their residents by just 5 percent by 2030.

"This study shows us two futures for America's health," said Risa Lavizzo-Mourey, MD, RWJF president and CEO. "At every level of government, we must pursue policies that preserve health, prevent disease and reduce health care costs. Nothing less is acceptable."

The analysis, which was commissioned by TFAH and RWJF and conducted by the National Heart Forum, is based on a peer-reviewed model published last year in The Lancet. Findings include:

Projected Increases in Adult Obesity Rates

If obesity rates continue on their current trajectories, by 2030, 13 states could have adult obesity rates above 60 percent, 39 states could have rates above 50 percent, and all 50 states could have rates above 44 percent.

By 2030, Mississippi could have the highest obesity rate at 66.7 percent, and Colorado could have the lowest rate for any state at 44.8 percent. According to the latest data from the U.S. Centers for Disease Control and Prevention (CDC), obesity rates in 2011 ranged from a high of 34.9 percent in Mississippi to a low of 20.7 percent in Colorado.

Projected Increases in Disease Rates

If states' obesity rates continue on their current trajectories, the number of new cases of type 2 diabetes, coronary heart disease and stroke, hypertension and arthritis could increase 10 times between 2010 and 2020 -- and double again by 2030.

Obesity could contribute to more than 6 million cases of type 2 diabetes, 5 million cases of coronary heart disease and stroke, and more than 400,000 cases of cancer in the next two decades.

Currently, more than 25 million Americans have type 2 diabetes, 27 million have chronic heart disease, 68 million have hypertension and 50 million have arthritis. In addition, 795,000 Americans suffer a stroke each year, and approximately one in three deaths from cancer per year (approximately 190,650) are related to obesity, poor nutrition or physical inactivity.

Projected Increase in Costs for Health Care and Lost Productivity

By 2030, medical costs associated with treating preventable obesity-related diseases are estimated to increase by $48 billion to $66 billion per year in the United States, and the loss in economic productivity could be between $390 billion and $580 billion annually by 2030. Although the medical cost of adult obesity in the United States is difficult to calculate, current estimates range from $147 billion to nearly $210 billion per year.

Over the next 20 years, nine states also could see their obesity-related health care costs climb by more than 20 percent, with New Jersey on course to see the biggest increase at 34.5 percent. Sixteen states and Washington, D.C., could see increases between 15 percent and 20 percent.

How Reducing Adult Obesity Could Lower Disease Rates and Health Care Costs

The analysis also explored a scenario based on states successfully lowering adult obesity rates. It found that, if states could reduce the average body mass index (BMI) of residents by just 5 percent by 2030, every state could help thousands or millions of people avoid obesity-related diseases, while saving billions of dollars in health care costs. For a six-foot-tall person weighing 200 pounds, a 5 percent reduction in BMI would be the equivalent of losing roughly 10 pounds.
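The report's worked example can be checked with the standard imperial BMI formula (this sketch is an illustration, not the report's model):

```python
# Standard imperial BMI formula: 703 * weight(lb) / height(in)^2.
# The 5 percent target and the 6-ft / 200-lb example come from the report.
def bmi(weight_lb, height_in):
    return 703.0 * weight_lb / height_in ** 2

def weight_for_bmi(target_bmi, height_in):
    return target_bmi * height_in ** 2 / 703.0

height_in = 72                      # six feet
current_bmi = bmi(200, height_in)   # about 27.1
target_bmi = 0.95 * current_bmi     # 5 percent lower
new_weight = weight_for_bmi(target_bmi, height_in)
print(round(200 - new_weight))      # -> 10 pounds
```

Because BMI is linear in weight at a fixed height, a 5 percent BMI reduction is simply a 5 percent weight reduction, i.e. about 10 pounds for a 200-pound person.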

If BMIs were lowered, the number of Americans who could be spared from developing major obesity-related diseases could range from:

  • Type 2 diabetes: 14,389 in Alaska to 796,430 in California;
  • Coronary heart disease and stroke: 11,889 in Alaska to 656,970 in California;
  • Hypertension: 10,826 in Alaska to 698,431 in California;
  • Arthritis: 6,858 in Wyoming to 387,850 in California; and
  • Obesity-related cancer: 809 in Alaska to 52,769 in California.

And nearly every state could save between 6.5 percent and 7.9 percent in health care costs. This could equate to savings ranging from $81.7 billion in California to $1.1 billion in Wyoming. Florida, the only state that would save less than 6.5 percent in health care costs, could save 2.1 percent or $34 billion.

"We know a lot more about how to prevent obesity than we did 10 years ago," said Jeff Levi, PhD, executive director of TFAH. "This report outlines how policies like increasing physical activity time in schools and making fresh fruits and vegetables more affordable can help make healthier choices easier. Small changes can add up to a big difference. Policy changes can help make healthier choices easier for Americans in their daily lives."

Report Recommendations

On the basis of the data collected and a comprehensive analysis, TFAH and RWJF recommend making investments in obesity prevention in a way that matches the severity of the health and financial toll the epidemic takes on the nation. The report includes a series of policy recommendations, including:

  • Fully implement the Healthy, Hunger-Free Kids Act, by implementing the new school meal standards and updating nutrition standards for snack foods and beverages in schools;
  • Protect the Prevention and Public Health Fund;
  • Increase investments in effective, evidence-based obesity-prevention programs;
  • Fully implement the National Prevention Strategy and Action Plan;
  • Make physical education and physical activity a priority in the reauthorization of the Elementary and Secondary Education Act;
  • Finalize the Interagency Working Group on Food Marketed to Children Guidelines;
  • Fully support healthy nutrition in federal food programs; and
  • Encourage full use of preventive health care services and provide support beyond the doctor's office.

The full report with state rankings in all categories is available on TFAH's website at www.healthyamericans.org and RWJF's website at www.rwjf.org. TFAH and RWJF collaborated on the report, which was supported by a grant from RWJF.



Story Source:

The above story is reprinted from materials provided by Trust for America's Health.





19 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/9aDyhvgaHP8/120918190923.htm

Prejudice can cause depression at the societal, interpersonal, and intrapersonal levels

ScienceDaily (Sep. 18, 2012) — Although depression and prejudice traditionally fall into different areas of study and treatment, a new article suggests that many cases of depression may be caused by prejudice from the self or from another person. In an article published in the September 2012 issue of Perspectives on Psychological Science, a journal of the Association for Psychological Science, William Cox of the University of Wisconsin-Madison and colleagues argue that prejudice and depression are fundamentally connected.

Consider the following sentence: "I really hate _____. I hate the way _____ look. I hate the way _____ talk."

What words belong in the blanks? It's possible that the statement expresses prejudice toward a stigmatized group: "I really hate Black people," "I hate the way gay men look," or "I hate the way Jews talk." But this statement actually comes from a depressed patient talking about herself: "I really hate me. I hate the way I look. I hate the way I talk."

The fact that the statement could have been completed in two equally plausible ways hints at a deep connection between prejudice and depression. Indeed, Cox and colleagues argue that the kinds of stereotypes about others that lead to prejudice and the kinds of schemas about the self that lead to depression are fundamentally similar. Among many features that they have in common, stereotypes of prejudice and schemas of depression are typically well-rehearsed, automatic, and difficult to change.

Cox and colleagues propose an integrated perspective of prejudice and depression, which holds that stereotypes are activated in a "source" who then expresses prejudice toward a "target," causing the target to become depressed.

This depression caused by prejudice -- which the researchers call deprejudice -- can occur at many levels. In the classic case, prejudice causes depression at the societal level (e.g., Nazis' prejudice causing Jews' depression), but this causal chain can also occur at the interpersonal level (e.g., an abuser's prejudice causing an abusee's depression), or even at the intrapersonal level, within a single person (e.g., a man's prejudice against himself causing his depression).

The researchers state that the focus of their theory is on cases of depression that are driven primarily by the negative thoughts that people have about themselves or that others have about them and does not address "depressions caused by neurochemical, genetic, or inflammatory processes." Understanding that many people with depression are not "just" depressed -- they may have prejudice against themselves that causes their depression -- has powerful theoretical implications for treatment.

Cox and colleagues propose that interventions developed and used by depression researchers -- such as cognitive behavior therapy and mindfulness training -- may be especially useful in combating prejudice. And some interventions developed and used by prejudice researchers may be especially useful in treating depression.

Using a wider lens to see the common processes associated with depression and prejudice will help psychological scientists and clinicians to understand these phenomena better and develop cross-disciplinary interventions that can target both problems.

Share this story on Facebook, Twitter, and Google:

Other social bookmarking and sharing tools:


Story Source:

The above story is reprinted from materials provided by Association for Psychological Science.

Note: Materials may be edited for content and length. For further information, please contact the source cited above.


Journal Reference:

  1. W. T. L. Cox, L. Y. Abramson, P. G. Devine, S. D. Hollon. Stereotypes, Prejudice, and Depression: The Integrated Perspective. Perspectives on Psychological Science, 2012; 7 (5): 427 DOI: 10.1177/1745691612455204

Note: If no author is given, the source is cited instead.

Disclaimer: This article is not intended to provide medical advice, diagnosis or treatment. Views expressed here do not necessarily reflect those of ScienceDaily or its staff.

19 Sep, 2012


-
Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/c1q6hsvxF1w/120918185631.htm
--
Manage subscription | Powered by rssforward.com
16.45 | 0 komentar | Read More

Erroneous link between prostate cancer and retrovirus from mice confirmed: Contamination of XMRV in cancer tissue sample

ScienceDaily (Sep. 18, 2012) — A once-promising discovery linking prostate cancer to an obscure retrovirus derived from mice was the result of an inadvertent laboratory contamination, a forensic analysis of tissue samples and lab experiments -- some dating back nearly a decade -- has confirmed.

The connection, which scientists have questioned repeatedly over the last couple of years, was first proposed more than six years ago, when the telltale signature of the virus, known as XMRV, was detected in genetic material derived from tissue samples taken from men with prostate cancer.

Later studies failed to find the same signature, and researchers reported that while XMRV is a real, previously undiscovered virus with interesting and useful properties, it infects human prostate cancer cells in laboratories, not prostate cancer patients.

Now, an analysis by a team of scientists led by researchers from the University of California, San Francisco (UCSF), Cleveland Clinic and Abbott has uncovered the complete story behind this contamination.

As described this week in the open-access journal PLOS ONE, the original association between XMRV and prostate cancer resulted from traces of XMRV that appear to have found their way into the prostate samples from other cells being handled in the same laboratory in 2003. These cells were also contaminated with the retrovirus.

"Everything arose from this presumed contamination event," said Charles Chiu, MD, PhD, an assistant professor of laboratory medicine at UCSF and director of the UCSF-Abbott Viral Diagnostics and Discovery Center.

Anatomy of Lab Contamination

XMRV became a focus of research after its genetic signature was first found in prostate cancer samples in 2006. Similar studies in 2009 also detected the virus among samples taken from people with Chronic Fatigue Syndrome -- though both discoveries have now been called into question. The original publication related to Chronic Fatigue Syndrome has since been retracted.

When the prostate connection first emerged, there was a lot of excitement in the field, said Chiu, because of the lesson from human papillomavirus, a virus known to cause cervical cancer in women. HPV taught doctors that a cancer caused by a virus could be prevented by giving people a vaccine.

For people with Chronic Fatigue Syndrome, the 2009 news offered hope because it promised new tests that could definitively diagnose their condition -- and possibly lead to treatment with antiviral drugs that block XMRV.

But the connection between the virus and both diseases began to unravel after a number of follow-up studies in several different laboratories failed to detect XMRV in tissue samples taken from men with prostate cancer and people with Chronic Fatigue Syndrome. Other studies added to the doubt by providing strong evidence that XMRV may have arisen simply from laboratory contamination.

Working with the original groups that made the 2006 discovery, Chiu and his colleagues sought to definitively uncover how this contamination occurred.

They first repeated the 2006 experiments in an unbiased way by using a variety of methods to examine new samples taken from a cohort of 39 men with prostate cancer. Failing to detect any trace of XMRV in these samples, they went back to the original material described in the 2006 paper and retraced its route through the laboratory step by step.

Quickly, they determined that the virus detected in these samples was essentially identical in each -- which suggested contamination rather than natural infection. Viruses like XMRV readily mutate, and if the different men who had donated prostate tissues had truly been infected, there likely would have been more than one strain present.
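The reasoning can be illustrated with a toy sequence-identity check (the sequences and patient labels are invented; this is not the study's analysis pipeline):

```python
# Toy illustration: natural infections drift apart by mutation, so
# near-identical sequences across independent patients suggest a single
# contaminating source rather than separate infections.
def identity(a, b):
    """Fraction of matching positions between two equal-length sequences."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

reads = {
    "patient_1": "ACGTACGTACGTACGTACGT",
    "patient_2": "ACGTACGTACGTACGTACGT",   # identical: contamination signal
    "patient_3": "ACGTACGTACGAACGTACGT",   # one substitution: 95% identity
}
names = list(reads)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(a, b, identity(reads[a], reads[b]))
```

Real analyses align full viral genomes and build phylogenies, but the underlying signal is the same: independently infected hosts should carry measurably divergent strains.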

Looking further, the scientists found that while the virus was present in genetic extracts made from the samples -- and analyzed in the 2006 study -- it was not present in the original prostate tissues themselves, samples of which were fixed in waxy paraffin immediately after they were first surgically removed.

That discovery suggested that XMRV was introduced as a contaminant at some point while the tissue was being manipulated in the laboratory that processed the prostate tissue samples, before they were sent to UCSF for analysis.

Searching for a possible culprit, the team found a completely different cell line that was not used in the study but had been used in the same laboratory at the same time. They found frozen samples of these cells, called "LNCaP," which had been packed away in a lab freezer since 2003. The virus was in these cells.

Finding Demonstrates Power of Mitochondrial RNA Profiling

Using a sophisticated new technique called mitochondrial RNA profiling, the researchers showed that these cells were indeed the source of the virus detected in the prostate samples.

But how did the LNCaP cells themselves become contaminated? Looking further, the scientists found that the source was another type of cell, called the 22Rv1 cell line, which was developed at Case Western Reserve University and is used extensively in prostate cancer research. Prior research by other scientists showed that this virus appears to have been created accidentally in the laboratory in the 1990s in a "recombination event" in which two viruses combined to form XMRV. This event occurred while scientists were working with mice and a prostate cancer tumor to make the 22Rv1 cell line.

"These findings underscore the importance of rapidly evolving new technologies such as deep sequencing and a novel application of this technology, mitochondrial RNA profiling," said John Hackett Jr., PhD, senior research fellow of Emerging Pathogens and Virus Discovery at Abbott. If these scientific tools were available when XMRV was first discovered, contamination would likely have been identified far sooner. The most important contribution of our study to the scientific community may well be the demonstration of how these technologies can be applied in future studies."

The whole affair is something of a cautionary tale, Chiu said. "This is basically the nature of science -- fallible and not necessarily error-free, yet ultimately self-correcting."

"It's been known for over a year that XMRV was the result of lab contamination. I couldn't rest until we figured out how it happened. It felt like the right thing to do was to collaborate with Dr. Chiu and the others to get the answers," said Robert Silverman, PhD, interim chair of Cancer Biology at the Cleveland Clinic and one of the authors of the original study. "I'm gratified that we finally got to the bottom of the story."

The article, "In-Depth Investigation of Archival and Prospectively Collected Samples Reveals No Evidence for XMRV Infection in Prostate Cancer," by Deanna Lee, Jaydip Das Gupta, Christina Gaughan, Imke Steffen, Ning Tang, Ka-Cheung Luk, Xiaoxing Qiu, Anatoly Urisman, Nicole Fischer, Ross Molinaro, Miranda Broz, Gerald Schochetman, Eric A. Klein, Don Ganem, Joseph L. DeRisi, Graham Simmons, John Hackett Jr., Robert H. Silverman and Charles Y. Chiu, appears in the journal PLOS ONE on Sept. 18.

In addition to UCSF, authors on this study are affiliated with the University of San Francisco, Cleveland Clinic, Blood Systems Research Institute in San Francisco, Abbott in Abbott Park, Ill., University Medical Center Hamburg-Eppendorf in Germany, Emory University School of Medicine in Atlanta, and Novartis Institutes for Biomedical Research in Emeryville, Calif.

This work was funded by the National Institutes of Health (grant #CA103943, #1R21HL109761, #R56-AI08953 and #R01-HL105704), the Charlotte Geyer Foundation, the Maltz Family Foundation, Abbott and the Howard Hughes Medical Institute.

Several authors on the study not affiliated with UCSF are included as inventors on one or more patent applications, which include either the Cleveland Clinic, Abbott Laboratories, or both. Abbott is the sole assignee of issued U.S. Patent No. 8,183,349 relating to XMRV.



Story Source:

The above story is reprinted from materials provided by University of California, San Francisco (UCSF). The original article was written by Jason Bardi.



Journal Reference:

  1. Deanna Lee, Jaydip Das Gupta, Christina Gaughan, Imke Steffen, Ning Tang, Ka-Cheung Luk, Xiaoxing Qiu, Anatoly Urisman, Nicole Fischer, Ross Molinaro, Miranda Broz, Gerald Schochetman, Eric A. Klein, Don Ganem, Joseph L. DeRisi, Graham Simmons, John Hackett, Robert H. Silverman, Charles Y. Chiu. In-Depth Investigation of Archival and Prospectively Collected Samples Reveals No Evidence for XMRV Infection in Prostate Cancer. PLoS ONE, 2012; 7 (9): e44954 DOI: 10.1371/journal.pone.0044954



19 Sep, 2012


Source: http://feeds.sciencedaily.com/~r/sciencedaily/~3/junu9eEoz2w/120918184751.htm