Friday, May 9, 2008

The DNA Network

Origins of Human Malaria [HENRY » genetics]

Posted: 09 May 2008 06:22 PM CDT

In today’s MBE, Origins of Human Malaria: Rare Genomic Changes and Full Mitochondrial Genomes Confirm the Relationship of Plasmodium falciparum to Other Mammalian Parasites but Complicate the Origins of Plasmodium vivax (doi:10.1093/molbev/msn069):

Despite substantial work, the phylogeny of malaria parasites remains debated. The matter is complicated by concerns about patterns of evolution in potentially strongly selected genes as well as the extreme AT bias of some Plasmodium genomes. Particularly contentious has been the position of the most virulent human parasite Plasmodium falciparum, whether grouped with avian parasites or within a larger clade of mammalian parasites. Here, we study 3 classes of rare genomic changes, as well as the sequences of mitochondrial ribosomal RNA (rRNA) genes.

We report 3 lines of support for a clade of mammalian parasites:
1) we find no instances of spliceosomal intron loss in a hypothetical ancestor of P. falciparum and the avian parasite Plasmodium gallinaceum, suggesting against a close relationship between those species;
2) we find 4 genomic mitochondrial indels supporting a mammalian clade, but none grouping P. falciparum with avian parasites; and
3) slowly evolving mitochondrial rRNA sequences support a mammalian parasite clade with 100% posterior probability.

We further report a large deletion in the mitochondrial large subunit rRNA gene, which suggests a subclade including both African and Asian parasites within the clade of closely related primate malarias. This contrasts with previous studies that provided strong support for separate Asian and African clades, and reduces certainty about the historical and geographic origins of Plasmodium vivax. Finally, we find a lack of synapomorphic gene losses, suggesting a low rate of ancestral gene loss in Plasmodium.

New technique determines that the number of fat cells remains constant in all body types [Think Gene]

Posted: 09 May 2008 04:16 PM CDT

The radioactive carbon-14 produced by above-ground nuclear testing in the 1950s and ’60s has helped researchers determine that the number of fat cells in a human’s body, whether lean or obese, is established during the teenage years. Changes in fat mass in adulthood can be attributed mainly to changes in fat cell volume, not an increase in the actual number of fat cells.

These results could help researchers develop new pharmaceuticals to battle obesity as well as the accompanying diseases such as high blood pressure and diabetes.

A new study by Lawrence Livermore National Laboratory scientist Bruce Buchholz - along with colleagues from the Karolinska Institute in Sweden; Humboldt University Berlin; the Foundation of Research and Technology in Greece; Karolinska University Hospital; and Stockholm University - applied carbon dating to DNA to discover that the number of fat cells stays constant in adulthood in lean and obese individuals, even after marked weight loss, indicating that the number of fat cells is set during childhood and adolescence.

Carbon dating is typically used in archaeology and paleontology to determine the age of artifacts. However, in this application, which appeared in the May 4 early online edition of the journal Nature, the scientists used the pulse of radiocarbon to analyze fat cell turnover in humans.

Radiocarbon, or carbon-14, is naturally produced by cosmic ray interactions with air and is present at low levels in the atmosphere and food. Its concentration remained relatively constant during the past 4,000 years, but atmospheric testing of nuclear weapons from 1950-1963 produced a global pulse in the amount of radiocarbon in the atmosphere, Buchholz said.

In the new study, Buchholz analyzed the uptake of carbon-14 in genomic DNA within fat cells to establish the dynamics of fat cell turnover. Approximately 10 percent of fat cells are renewed annually at all adult ages and levels of body mass index.
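
As a back-of-the-envelope illustration of what a constant 10 percent annual turnover implies (my own sketch, not part of the study), the cohort of fat cells present at the end of adolescence decays geometrically even though the total count stays fixed:

    # Sketch: fixed total fat cell number with ~10% of cells replaced per year.
    # The 10% turnover figure comes from the study; the rest is illustrative.
    turnover = 0.10

    for years in (1, 5, 10, 20):
        surviving = (1 - turnover) ** years
        print(f"after {years:2d} years, {surviving:.0%} of the original cells remain")

    # At steady state, the mean cell age is roughly 1 / turnover = 10 years.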

Neither the death rate nor the generation rate of fat cells is altered in early-onset obesity, suggesting a tight regulation of the number of fat cells in obese adults.

"Fat cells change in size but no one had ever measured fat cell turnover," Buchholz said. "An increase in cell size means it can hold more mass."

Obesity is reaching epidemic proportions in most countries and poses a public health problem by enhancing the risks for cardiovascular diseases and metabolic disorders such as type 2 diabetes. According to the Centers for Disease Control and Prevention, the prevalence of overweight and obesity has increased sharply for both adults and children since the 1970s. Data from two National Health and Nutrition Examination surveys show that among adults aged 20-74 years the prevalence of obesity increased from 15 percent (in the 1976-80 survey) to 32.9 percent (in the 2003-04 survey).

The two surveys also show increases in overweight children and teens. For children aged 2-5 years, the prevalence increased from 5 percent to 13.9 percent; for those aged 6-11 years, prevalence increased from 6.5 percent to 18.8 percent; and for those aged 12-19 years, prevalence increased from 5 percent to 17.4 percent.

In the Nature study, the team first found a direct correlation between measures of fat mass (derived from body mass index, or BMI) and fat cell volume in both subcutaneous fat, which represents about 80 percent of all fat, and visceral fat.

In a study of 687 adults, the researchers found that the number of fat cells increases in childhood and adolescence, but levels off and remains constant in adulthood. The group looked at whether the number of fat cells changes under extreme conditions, such as the drastic weight loss produced by a radical reduction in caloric intake after bariatric surgery. The treatment resulted in a significant decrease in BMI and fat cell volume; however, it did not reduce the number of fat cells two years after the surgery. Similarly, significant weight gain (15-25 percent) over several months in non-obese adult men resulted in a significant increase in body fat volume but no change in fat cell number. Subsequent weight loss back to baseline resulted in a decrease in fat cell volume but no change in the number of fat cells.

"If you are overweight and you lose weight, you still have the capacity to store lipids because you still have the same number of fat cells. That may be why it’s so hard to keep the weight off," Buchholz said.

Overweight and obesity result from an energy imbalance - eating too many calories and not getting enough physical activity. Body weight is the result of genes, metabolism, behavior, environment, culture and socioeconomic status. "This work may give us new ideas of how to deal with the diseases that go along with obesity," Buchholz said.

Source: DOE/Lawrence Livermore National Laboratory

Researchers uncover mechanism of action of antibiotic able to reduce neuronal cell death in brain [Think Gene]

Posted: 09 May 2008 04:16 PM CDT

Research Highlights:

  • Mechanism of action of compound found to induce neurotransmitter activity in brain cells
  • The findings may lead researchers to develop potential novel therapies to treat Alzheimer’s disease, amyotrophic lateral sclerosis, Huntington's disease, epilepsy, stroke/ischemia, dementia and malignant gliomas

Virginia Commonwealth University researchers have discovered how an antibiotic works to modulate the activity of a neurotransmitter that regulates brain functions, which eventually could lead to therapies to treat Alzheimer's disease, Huntington's disease, epilepsy, stroke, dementia and malignant gliomas.

Neurodegenerative diseases are caused by the deterioration of neurons in the brain and spine resulting in problems related to either movement or memory. For most patients, it may be months or years before symptoms are evident because a large number of neurons die or stop functioning over a period of time. Currently, there are few treatment options for stopping this degeneration, and those currently being evaluated have shown minimal or no beneficial activity.

Paul B. Fisher, M.Ph., Ph.D., a professor and interim chair of the Department of Human and Molecular Genetics, and director of the VCU Institute of Molecular Medicine, in the VCU School of Medicine, and colleagues recently reported on the mechanism of action of ceftriaxone, a third-generation antibiotic with neuroprotective properties, in glutamate transport. The findings, published in the May 9 issue of the Journal of Biological Chemistry, suggest that this antibiotic or a similar drug may serve as a potential therapy against neurodegenerative disease caused by glutamate toxicity.

Glutamate is an amino acid that is important in nerve transmission at the synapse - the region that connects one neuron to another in the brain. If an excess of glutamate collects in the synapse and is not cleared, neurons become damaged and die by a process called excitotoxicity. In previous studies, Fisher's team identified ceftriaxone as a potent physiological stimulator of glutamate transport both in cell culture and in animal models.

"Glutamate excitotoxicity is a very important and fundamental process in neurodegeneration," said Fisher. "Finding molecules, such as ceftriaxone, that may correct this problem can lead to preservation and increased survival of neurons in the brain and it may have direct implications in the therapy of many neurodegenerative diseases, such as in Alzheimer's disease, stroke, ALS and epilepsy."

In this study, Fisher and his colleagues were interested in identifying how the promoter region of the EAAT2 gene controls expression of the glutamate transporter in a group of brain cells called astrocytes. Using molecular biological approaches, the team examined all the regions and sequences in the promoter and systematically eliminated them to define which region was necessary to respond to ceftriaxone.

According to Fisher, this led the team to a critical transcription factor called nuclear factor kappaB (NF-kappaB), which regulates many functions in the brain and other parts of the body and is a central molecule in the regulation of genes controlling cell growth and survival. Once they had identified critical regions in the EAAT2 promoter that might regulate activity, mutation of one specific NF-kappaB site showed that this site was responsible for the up-regulation of EAAT2 expression, and consequently glutamate transport, by ceftriaxone.

"This work not only has implications for the field of neurodegeneration and neurobiology, but may also help us more clearly understand brain cancer, including malignant glioma, an invariably fatal tumor, and how it impacts brain function," said Fisher, who is the first incumbent of the Thelma Newmeyer Corman Endowed Chair in Cancer Research and researcher with the VCU Massey Cancer Center.

Future studies will examine ways to modify the structure of ceftriaxone through medicinal chemistry to create molecules that are pharmacologically improved. Currently, ceftriaxone needs to be injected, which is not ideal for patient therapy; an oral form would be a preferable way to treat patients.

Source: Virginia Commonwealth University

Get your Gene Genie submissions in [adaptivecomplexity's column]

Posted: 09 May 2008 03:45 PM CDT

The next Gene Genie carnival is going to feature a Mother's Day edition right here at Adaptive Complexity.

Get your submissions in now!

Reproducibility of SNP Testing, Part II [The Genetic Genealogist]

Posted: 09 May 2008 03:39 PM CDT

The Quantified Self has a follow-up to last week’s post about the reproducibility of SNP testing by 23andMe and deCODEme using Illumina SNP chips (see the Quantified Self’s post and my post). In that post, it was revealed that two comparisons of the 560,000 overlapping SNP results from the two different companies had revealed differences at just 23 locations for one individual and 35 for another.

Soon after last week’s post, one of these individuals - Ann Turner - contacted The Quantified Self with new information: 4 of the SNPs on her list of 35 disagreeing results are also on the other individual’s (Antonio Oliveira’s) list of 23 disagreeing results. From Ann’s email to The Quantified Self:

Four of those (rs11149566, rs4458717, rs4660646, and rs754499) were also found in Antonio’s list. That’s more than you would expect by chance.

Interesting results, and as Kelly at TGS points out, “This is why sharing results is so valuable and a key to great quantified self understanding.” For anyone who might be interested in doing further comparison, here is Oliveira’s list (also available here):

rs4660646, rs4458717, rs754499, rs11149566, rs1934496, rs10933181, rs9881405, rs1064205, rs312330, rs11100437, rs2955195, rs7033246, rs1536928, rs10793963, rs10894749, rs3921012, rs510978, rs12296276, rs4965862, rs2290505, rs12960185, rs4814138, rs6615048

And here is Turner’s list (also available here):

rs4660646, rs4458717, rs754499, rs11149566, rs10435795, rs1045363, rs10743414, rs10945383, rs11179382, rs11707159, rs11915402, rs1209171, rs1221986, rs12907462, rs1303912, rs13422439, rs161381, rs17328647, rs1961196, rs1966357, rs2016461, rs2064034, rs2290516, rs2853981, rs3952469, rs4336661, rs4423481, rs4572718, rs6531490, rs6942478, rs7102702, rs7812884, rs845217, rs9332128, rs9476380
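
As a rough check of Ann’s point that four shared rs-IDs is more than chance would predict, here is a minimal sketch (my own calculation, using only the numbers reported above) of the expected overlap and a hypergeometric tail probability:

    # Sketch: overlap between two discordance lists drawn from ~560,000 SNPs.
    # All counts are from the post; the calculation itself is illustrative.
    from scipy.stats import hypergeom

    N = 560_000   # overlapping SNPs compared between the two companies
    a = 23        # discordant SNPs on Oliveira's list
    b = 35        # discordant SNPs on Turner's list
    k = 4         # rs-IDs appearing on both lists

    expected = a * b / N              # ~0.0014 shared SNPs expected by chance
    p = hypergeom.sf(k - 1, N, a, b)  # P(overlap >= 4)
    print(f"expected overlap: {expected:.4f}, P(>= {k} shared) = {p:.1e}")

The vanishingly small probability is consistent with those four SNPs failing for some systematic reason (problematic probes, for example) rather than at random.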

For everyone not familiar with SNPs, or Single Nucleotide Polymorphisms, see this brief introduction at Wikipedia, including the helpful diagram, or read the SNP Page at SNPedia (which links to a helpful YouTube video).

Gamers, get your folding on [business|bytes|genes|molecules]

Posted: 09 May 2008 03:17 PM CDT

Protein before and after folding.

Technology Review was the first place I saw it, then someone put it up on Friendfeed and now Andrew Perry has a great post on Foldit. Foldit comes out of the lab of a bbgm favorite, David Baker, right here at the University of Washington.

Foldit combines gaming with protein structure prediction. It’s an interesting approach to opening scientific problems to the crowd. Folding@home built upon the success of Seti@home and the geek cred of running on gaming consoles, and has built quite a following. Will Foldit, which presents a simple, fun interface to get people interested in protein structure (the existence of Folding@home makes this somewhat familiar to geeks everywhere), be an example of how we can leverage crowdsourcing? Andrew makes some interesting points (which I agree with) on weighting crowdsourced contributions (always a hard thing to do), and I’d like to see karma, etc., come into play here.

It’s good to see protein structure getting some attention and continuing to attract creative approaches. It’s always been my favorite scientific subject. The field lends itself to “pretty pictures”, so getting non-experts involved is a real possibility.

The site and server have had connectivity issues since I’ve been trying, so perhaps they need help with web resources, because lots of people seem to be interested.

Here is a list of the project’s supporters: UW Animation Research Labs, UW Baker Lab, DARPA, Microsoft, and Adobe. Nice list.

Image via Wikipedia


What The Platypus Genome Is and Isn't [adaptivecomplexity's column]

Posted: 09 May 2008 02:46 PM CDT

I haven't contributed a single thing to the platypus genome project, but since my desk sits one floor above where people and robots broke the platypus DNA into chunks, cloned those chunks into bacteria, sequenced the pieces of DNA, and used massive amounts of computing power to assemble the stretches of sequence into a complete genomic whole, I'm going to consider myself somewhat of an authority on the subject and tell you what's wrong with other people's ideas about the platypus.

The genome sequence of the platypus was published Thursday in Nature, and from the press headlines, you could be excused for thinking that genomics has in fact confirmed that the platypus is a freak of nature: part bird, part reptile, and part mammal. The animal certainly looks like it - the platypus has the webbed feet and bill of a duck, and venomous spines and rubbery eggs that remind us of reptiles, but it has fur and feeds its young with milk, so it must be a mammal. The confusing press headlines might even lead you to believe that we sequenced the platypus genome just to figure out what this thing is, when the truth is, as we'll see below, that the genome sequence has essentially confirmed what evolutionary biologists have already deduced about the position of the platypus on the tree of life.

Is the platypus part bird, part reptile, part mammal - an amalgam of very different groups of animals? Is it a primitive mammal that resembles the early ancestors of all mammals? Can we figure out just what this creature is by gazing at its genome?


Photo Credit: Stefan Kraft, courtesy of the Wikipedia Commons


Who Wants to Solve a Protein Structure? [Bayblab]

Posted: 09 May 2008 02:28 PM CDT

The Folding@Home project allows PS3 and computer owners to use spare processor cycles to help solve 3-dimensional protein structures. A new game allows players to use spare brain power to do the same. Foldit taps into human 3-D problem-solving skills, getting players to fold proteins in a video game interface and awarding points based on the energy of a given configuration. The game has been in a testing phase using proteins with known structures, but is about to challenge players with unknown structures.
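
Foldit’s real scoring is derived from a full molecular force field (Rosetta), but as a toy illustration of energy-based scoring (entirely my own sketch, not Foldit’s actual function), a score built from pairwise Lennard-Jones terms over residue coordinates might look like this:

    # Toy scoring sketch: sum pairwise Lennard-Jones energies and negate,
    # so better-packed (lower-energy) configurations earn higher scores.
    import numpy as np

    def score(coords, epsilon=1.0, sigma=3.8):
        energy = 0.0
        n = len(coords)
        for i in range(n):
            for j in range(i + 1, n):
                r = np.linalg.norm(coords[i] - coords[j])
                energy += 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
        return -energy

    # Three dummy "residues" at angstrom-scale positions, purely illustrative.
    fold = np.array([[0.0, 0.0, 0.0], [3.8, 0.0, 0.0], [5.0, 3.0, 0.0]])
    print(f"score: {score(fold):.2f}")

Moving the third point around and re-scoring is, in miniature, what the game asks players’ visual intuition to optimize.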

Try it out or read more about the game here.

Postdoc life vs PI life [Mailund on the Internet]

Posted: 09 May 2008 08:28 AM CDT

I just saw this post today, linked from Bitesize Bio’s Around the Blogs. It bears the title “Postdocs always overestimate their intellectual contributions” but is really about the relationship between postdocs and PIs, as seen from the postdoc perspective (and an under-appreciation of the PI’s work).

Life as a postdoc

The postdoc (and grad student, for that matter) perspective is — and I recognize it both from my own experience and from my colleagues’ — that you come up with most of the ideas, do all the work, and the PI just takes the credit. That is a bit exaggerated, but the point is that the vast majority of the time spent on a project is time spent by a postdoc or grad student. The advisor reads the occasional draft paper, comments on the research a bit here and there, but doesn’t really put in the hours.

I think this is true. I think this is absolutely true (and on this point I think I disagree with the post I link to above… or maybe not when I elaborate on the point…)

Before I elaborate, though, I should probably explain my own situation, so you know where this perspective comes from.

I am somewhere in between being a postdoc and a PI. The last couple of years I have been working on my own grants and have not really been associated with any particular group, so I haven’t been a postdoc as such. Although I’m currently officially an associate professor, I haven’t got tenure yet, so I cannot start my own group. With the current rules in Aarhus, I cannot have grad students associated either. Instead, I have attached myself to various other groups for individual projects, and when doing so I take on the role of either “postdoc” or “advisor”, depending on the project and who else is involved.

Anyway, the situation is that I was relatively recently a postdoc — and remember how I felt about academic life then — and now I have experienced both the advisor and the postdoc world (but so far have never really been responsible for my own group, so my perspective might change later on).

So, as I said, I think it is true that on any one project, the postdoc will be doing the vast majority of the work, at least measured in hours. This isn’t really surprising. The postdoc can focus on one project (or a few projects), while the PI needs to manage many. Of course the postdoc will contribute the most time — but who contributes the most research?

Division of labour

This is a trickier question… first of all because it is not easy to define or measure “research”. The best measure we have is publications, but within a project you cannot use them to measure the output of the first author vs the last author.

The PI will be more experienced (or at least should be, but I’ll come back to that), so he will have a better idea of how to approach a problem, he will be able to spot dead-ends long before the postdoc, he will know the literature better, etc. This is all experience he gained when he was the one doing the grunt work, and now he should use it to teach the postdoc. A PI hour might simply be worth more than a postdoc hour because of the extra experience.

That is not the whole story, though. There is also what I would call a trade-off between “detail work” and “big picture”.

Most research is fiddling with details, trying to make epsilon improvements upon previous work. We all dream of a great breakthrough, but day-to-day research is about epsilon improvements, and that means fiddling with details. It requires a lot of work to fully understand the details of any kind of problem, and it is very easy to get lost in the details and lose the big picture. How does the current work fit into the larger problems in a field, how does it relate to previous work, and what will be the consequences of whatever you are trying to prove or solve? Someone needs to figure out the details, but someone also needs to keep the big picture in mind!

I’m not sure you can do both at the same time. At least I find it difficult to do both at the same time.

The postdoc deals with the details, and is probably the only one who fully understands the problem being worked on. The PI keeps the big picture in mind, trying to relate the current problem to the rest of the field.

The PI will spend less time on the particular problem — contribute fewer hours to the project — but at the same time he probably needs to keep up with a broader part of the literature and keep track of related projects to improve on this one. It doesn’t feel like he is contributing to the specific problem when he is just reading papers about other people’s work, but the hours he puts in there help him keep an eye on the big picture instead of getting caught in the details.

Keeping up with the literature pays off for more than one project at a time, of course, so you get more publications that way. It is one of the perks of that side of research.

Experience matters (if you have it)

One thing is the “details”/“big picture” division, but another is experience. The postdoc will be less experienced in doing research: the whole package, including doing the actual research, writing the papers — just as important as doing the experiments and analysing the data, really — applying for grants, etc. This is stuff you learn from experience, and it is a lot easier to learn with the assistance of someone who already has that experience. That would be the PI.

The postdoc is still learning. He is being taught (or should be taught) by the PI. It might not feel like a student/teacher relationship, because it is very different from the typical teaching at a university, but it is such a relationship. It is just more of a master/apprentice relationship. The apprentice starts out doing only the grunt work, but gradually does more and more of the complicated work under less and less supervision, until he has mastered the discipline.

Experience is mainly a way to avoid dead ends, in my experience (no pun intended). When you have worked with something for a long time, you have a good intuition for what might work and what will not, so you avoid a lot of wrong turns. The experience of the PI will help the postdoc avoid a lot of pitfalls (probably without ever realising that there was a potential pitfall). There should be room for making mistakes — that is how we learn — but by supervising the postdoc, the PI can help to avoid the worst mistakes. By supervising less and less, leaving room for some mistakes but not until the postdoc is ready to deal with them, the PI allows the postdoc to gain the experience while still being a bit sheltered.

When supervising, it is actually harder to allow someone to make a mistake than it is to spot a mistake and avoid it, but sometimes mistakes are necessary to learn, so they should never be totally avoided… but that is a topic for another day…

Will the PI always be more experienced? Many paragraphs ago I promised to get back to this point. In most ways I would say yes. With writing papers and applying for grants, he should be more experienced; otherwise he wouldn’t be the PI, really. But there is a limit to that experience, and that is also important to keep in mind.

There are two scenarios where the postdoc will be more experienced than the PI: if the project he is working on is outside the PI’s previous experience, or when the detail-work is important but completely lost to the PI.

Let’s take the first scenario first. The PI gets an idea. “It could be fun to figure out if …” and although it is not really related to anything he has done before, he manages to get a grant for doing research on the project. He has the grant but not the experience. So he hires a postdoc to work on the problem, but cannot really supervise the science! (This is not as made up as it might sound; I have seen it happen a couple of times…)

The second scenario is probably more common. Here the problem is that by focusing on “the big picture” for so long, the PI has forgotten the details and ignores that that is where the problems really are. The big picture might tell you what the important questions are, but the answers are found in the details. People forget this too often, in my opinion. They ask an interesting question and then suggest “maybe you can solve it using an MCMC or something…” or “maybe you can solve it using dynamic programming…” but that is not a helpful suggestion at all! It is a complete disregard for the details, and the details are important (damn it)!

You lose the feel for the details if you stay away from them for too long, and when that happens you cannot teach others how to deal with them any more.

Of course, a PI will never be as deep into the details as the postdoc working with them every day, but luckily most problems are somewhat similar, so what works in one setting might work in another, and some experience is transferable. This is why experience can substitute for hard work in many cases.

Sometimes the experience just isn’t there (for either of the two reasons I mentioned), and in that case the PI isn’t pulling his own weight on the project. A few general remarks on a draft and completely generic comments on solutions (“have you considered an ABC solution?”) are of little help. In such cases, I think a postdoc would be right in thinking that the PI just gets the credit but doesn’t do the work. Fortunately, it is not something that happens that often, and most PIs would recognise that they are not actually capable of helping with the problems and try to improve by getting to know the details a bit better.

Life as a PI

Being in transit from postdoc to PI, I’ve noticed a few things that I never really thought about before: new problems that appear when you are no longer a postdoc in a group. I’ve probably only noticed half of them, since I have yet to experience running my own group.

Anyway, the most important difference, the way I see it, is related to the “big picture”/“details” thing. I no longer have time to focus on just a single problem for days or weeks. There are always a lot of different projects that require my attention, and that means that I cannot put in the time really necessary to solve detailed problems. It is something I really miss, being able to completely focus on a problem for a couple of weeks until I have it nailed. If I want to do that now, I have to do it in my vacation.

Another thing is the time it takes to come up with those few suggestions to drafts and projects. I would have guessed that you would spend half an hour before a meeting and that would be the time you would put into it when supervising a project. Maybe some people can do that, but I find that to really think about it I need to spend hours. Checking the literature, doing the math to check out if it works out, looking over the data with my own simple methods to get a feeling for it. It is not enough to get the feeling for the details that is really needed — as I said, there simply isn’t the time for it any more — but it takes a lot longer than I would have thought.

I’ve already begun to think of the postdoc days as the good old days, but I fully remember being envious of the PIs at the time. Over the coming years, if I get tenure, I will get more and more administrative and teaching tasks and I might end up missing the postdoc time even more.

GINA redux [genomeboy.com]

Posted: 09 May 2008 08:26 AM CDT

An op-ed I wrote with Bob Cook-Deegan on the imminent passage (we hope!) of GINA into law appears in the Oakland Tribune:

…people fear genetic discrimination, which is what makes this new law so historic. In a recent survey, the Genetics and Public Policy Center found that 92 percent of respondents worried that genetic tests could be used in ways that are harmful to those getting tested. Only one in four said they would trust insurers with access to their genetic test results. Just one in six would trust employers.

The consequences of such fear, whether well-founded or not, could be grave. If people at risk for inherited diseases are unwilling to undergo genetic testing, they forego information of potentially immense importance to their lives. And if that same mistrust prevents citizens from participating in genetic and genomic research, the process by which our society develops new medicines and cures will suffer. Therefore, if GINA serves as nothing more than a reassuring symbol to a skittish public, it’s still well worth the price of the occasional lawsuit.

Workforce shortages in biotechnology, part I. Why is this a problem? [Discovering Biology in a Digital World]

Posted: 09 May 2008 08:00 AM CDT

Workforce shortages are a growing problem in the biotech industry. Communities are concerned that a lack of trained workers will either keep companies away or cause companies to move. If companies do move, those jobs may be lost forever. According to Robert Reich, former U.S. secretary of labor and now a professor at UC-Berkeley, biotech companies that can't hire in the U.S. will recruit foreign workers or open research centers overseas (Luke Timmerman, Seattle PI).


What’s on the web (2008 May 9) [ScienceRoll]

Posted: 09 May 2008 07:25 AM CDT


  • Web 2.0 101 (My MD Journey): Videos about RSS, wikis, podcasts and blogs:

A community-wide study in upstate New York found that nearly 28 percent of all visits to the pediatric emergency department could have been replaced with a more cost-effective Internet doctor's "visit," or telemedicine, according to investigators from the University of Rochester Medical Center.

  • The new medical bloggers’ list is up at Medblog.nl. Check it out!

Platypus sex chromosomes and basal-equals-primitive. [T Ryan Gregory's column]

Posted: 09 May 2008 07:17 AM CDT

There has been considerable interest in the publication of the platypus genome, which is good. Unfortunately, much of the reporting has been distorted, which is bad. However, rather than picking on the press, I want to focus on an example from the scientific literature where a misconception about evolutionary relationships seems to creep in and generate confusion.


Teatime [Sciencebase Science Blog]

Posted: 09 May 2008 07:00 AM CDT

I commented on a post on the Bad Language blog, produced by my good friend Matthew Stibbe, earlier this week. He was waxing lyrical about cutting power consumption in his SOHO and mentioned how he prefers to brew tea with freshly drawn water. I pointed out that while this may have benefits, it would actually increase his kettle limescale problems through the addition of extra calcium and magnesium ions. The effect will be negligible, but if we are adding up every single kilowatt-second then it could make a difference. Of course, brewing tea is not environmentally friendly in the first place, and we should all really be drinking trapped dew under a hessian bivouac, or somesuch.

Anyway, Matthew immediately followed up my comment with a defence of using freshly drawn water for making a cuppa. He’s a man after my own heart. I’ve done this once or twice in the past, and it exemplifies precisely how blogs are, if nothing else, a dialogue (please don’t prove me wrong by not commenting on this post…)

I’d better qualify my boiling/reboiling comment on his blog. Chemically speaking, the difference between starting with freshly drawn water each time and reboiling what is already in the kettle is a simple matter of the formation of insoluble calcium and magnesium salts. With freshly drawn water you’re adding new metal ions, which will effectively add to your limescale. However, the de-hardening of hard water by heating is not a perfect process, so some ions will be retained in the beverage once you pour it over the tea leaves; the actual balance depends on how soft or hard your water supply is in the first place.
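
For reference, the reaction behind the limescale is the standard temporary-hardness chemistry:

    Ca(HCO3)2 (aq)  --heat-->  CaCO3 (s) + H2O (l) + CO2 (g)

Heating converts dissolved calcium bicarbonate into insoluble calcium carbonate (the scale), so every fresh fill brings a new dose of Ca2+ and Mg2+ ions for the next boil to deposit.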

However, now that I’ve had a glass or two of vino (at the time of writing), it has also occurred to me that there are lots of other, organic, components in fresh tapwater, such as humic acids and organochlorine compounds (possibly even fluorine compounds, depending on where you live). These will presumably be degraded and/or boiled off to a degree with the first boil. In the second boiling it is more likely that you will get rid of all these flavoursome ingredients from the water. So, perhaps there is something in the use of fresh water for the best cuppa, but it’s marginal, given that any flavours in the water will essentially be overwhelmed by the flavour of the tea itself. It’s like worrying about the sounds they leave out when compressing a music file into mp3 format.

Meanwhile, the origins of tea lie in an attempt at “storing” water in Asia, so legend goes, and at protecting it from contamination by pathogens (namely cholera, although the agent was not known at the time). The polyphenolics and other materials in tea infused into the water are to a degree antimicrobial, but perhaps more importantly, the simple act of boiling kills off the microbes quickly and simply, without any recourse to chemistry.

In the “West”, the equivalent solution to the great clean water problem was the addition of fermenting fruits and the subsequent production of wine or beer depending on the region. It’s thought to explain why westerners have evolved an enzyme to break down alcohol and its metabolites whereas some Asians lack this enzyme system.

Given the choice between a freshly brewed cuppa and a glass of wine, I know which I prefer, especially at this time of the evening… now where’s that corkscrew?

A post from David Bradley Science Writer


Going home… [Mailund on the Internet]

Posted: 09 May 2008 06:04 AM CDT

Despite Thomas Wolfe’s wild claim, you can go home, and today I am going home. That is, I am leaving for home in the evening but it will be past midnight before I am actually home.

I’ve been in Oxford and London the last few days — last weekend in Oxford and this week in London — visiting Jotun Hein in Oxford and David Balding at Imperial College.

The Oxford visit was more social than work. Over the weekend, it couldn’t really be more, but it was nice meeting the people there (who Jotun had working weekends, of course).

The visit to Imperial was for making progress on a paper I am writing with David. We didn’t make as much progress on it as I’d have liked — the time was simply too short — but we have a plan for how to progress from here, so that is fine. Meeting the group here and chatting with people has been very stimulating.

I’ve never really been much for going to conferences. Sometimes they inspire some work, but usually I am just bored through most of it. I’m just not good at sitting quietly and listening to talks. Visiting people and chatting with them, on the other hand, I find very inspiring. Plus, you really learn a lot from talking with people: things they would never mention in a formal presentation.

Anyway, now I’m soon on my way home.  Teaching awaits me when I am back, so until the end of term I won’t be able to go travelling again. Over summer I probably will.  Most likely to DeCODE.

Around the Blogs [Bitesize Bio]

Posted: 09 May 2008 04:55 AM CDT

This week’s around the blogs focuses on lab life and impacts of science on society. That’s a big area to cover, but there are still only a handful of really noteworthy discussions in the last couple of weeks on the topic. Check ‘em out.

Relationships in Lab Groups - How do the dynamics of interactions within research groups affect our perceptions of norms in science?

We Need to Stop Pigeon-Holing Science - Crossing the boundaries between disciplines and addressing the interesting questions, as opposed to getting caught up in what we think a *insert field here*-ist should focus on, is the way to go.

Postdocs always overestimate their intellectual contributions - In a repost, the topic of thinking too highly of oneself is discussed.

Are We Training Too Many Scientists? - With competition in academia for funding sky-high, this post asks a good question.

Gene Patents and Genetic Testing - Should patenting of genes be permitted?

Books About DNA: Coming to Life by Christiane Nusslein-Volhard [Eye on DNA]

Posted: 09 May 2008 03:14 AM CDT

Coming to Life: How Genes Drive Development by Christiane Nusslein-Volhard

Christiane Nüsslein-Volhard, winner of the Nobel Prize in Physiology or Medicine, gives a concise and illustrative overview of genetics, evolution, and cellular processes, as well as a discussion of current ethical issues in human biology.

An excerpt from the American Scientist review of the book:

The subtitle of Nüsslein-Volhard’s book is How Genes Drive Development. That’s really the essence of her conception of developmental biology, a view that guides the organization of the book. She begins with chapters that introduce the genetic machinery, heredity, chromosomes, genes and proteins. She moves on to a brief discussion of the role of model organisms that have been crucial in developmental genetics and proceeds to the first of these, D. melanogaster.

Free personal genomics! [Genetic Future]

Posted: 08 May 2008 11:21 PM CDT

Over at Eye on DNA, Hsien wonders about the effects of a slowing economy on the personal genomics market. Well, no matter how hard it's getting to make your mortgage repayments, you can probably still afford personal genomics if it doesn't cost you anything:

In New Jersey, meanwhile, the nonprofit Coriell Institute for Medical Research is developing a service that will test for a slate of validated genetic markers, and provide free — yes, free — information and analysis for common diseases. The institute plans to sign up 10,000 people in the next two years, and eventually enlist 100,000 people.

(From a recent piece in Wired). You can sign up here; there is a pretty extensive FAQ here. Note that you will need to physically attend an enrollment session at the Coriell Institute in New Jersey. Also, I see that Coriell is adopting the paternalistic "need to know" approach pioneered by Navigenics, and won't provide participants with any information about genetic variants that aren't "medically actionable" (for example, risk variants for incurable diseases), although it will hand out information on non-disease traits like eye colour. Still, if I lived anywhere near New Jersey I'd be signing up right now rather than wasting time writing this post.

(As an aside, I wonder why Coriell is using a saliva-based method when it could be using its considerable expertise to create and store cell lines from blood - essentially generating an endless source of DNA for researchers to analyse. That seems like a missed opportunity that someone will be seriously regretting in a few years when there's no DNA left for whole-genome sequencing, or epigenome analysis, or whatever.)

If you're more ambitious, you could also sign up for (eventual) free whole-genome sequencing via the Personal Genome Project.


23andMe, deCODEme and Navigenics at Cold Spring Harbor [Genetic Future]

Posted: 08 May 2008 10:46 PM CDT

Just in case anyone has been wondering why I've been so quiet, I'm in beautiful Cold Spring Harbor this week for the Biology of Genomes meeting. Be warned that I'll have a lot more to say about this meeting once I've recovered from the combined effects of jet-lag and the punishing schedule (last night's evening session finished at 11:30pm!), most of which will include the words "next", "generation", "sequencing" and "wow".

For now, I want to download some thoughts on this afternoon's panel discussion on direct-to-consumer genetic testing between three giants of the personal genomics industry: 23andMe's Linda Avey, deCODEme's Kari Stephansson and Navigenics' Dietrich Stephan. The session was ably chaired by the former head of the public Human Genome Project, Francis Collins.

Each of the three company representatives was given around 10 minutes to push their vision for the personal genomics industry. While all three are selling essentially the same product (a read-out of between 500,000 and 1,000,000 variable positions from your genome), each did their best to differentiate their company from the competition.

Navigenics was first off the mark, and the tone was serious: Western society is headed for a healthcare crisis, with the combination of an ageing population with an increasing frequency of lifestyle diseases like obesity and diabetes leading us inexorably towards an increased load of common diseases. Within a generation, said Stephan, we need a solution, and the only solution is early prediction and intervention.

The Navigenics message has remained on target since its launch a month ago: this is a careful, serious, completely disease-focused company. Stephan talked up the quality control on the Navigenics testing service, which includes repeating any disease-related genotype calls that fail the first time around (unlike 23andMe and deCODEme). He also emphatically stated what Navigenics wasn't: the company will never offer testing of genetic predictors of non-disease traits like height or eye colour; will never offer genetic ancestry testing; and will not offer comparisons of family members (apparently out of concerns about unexpected discoveries of non-paternity). This is, of course, an explicit attempt to portray the competitors - who both offer ancestry and trait prediction - as frivolous.

Next off the rank was Google-funded 23andMe. Avey immediately turned Stephan's accusation back on itself: rather than being frivolous, 23andMe simply "looks at genetics holistically", with information on non-disease traits and ancestry being part of the big picture that customers want ("genealogy is the second most popular hobby on the internet," she said. "You can guess what the most popular one is.")

Avey explained that her decision to found 23andMe was based on her frustrating experiences at Perlegen attempting to recruit large cohorts of patients to use for large-scale genetics, and during her talk there was a strong emphasis on 23andMe as a system for driving genetic research. Just as the internet-based communities of Web 2.0 are willing to share their information with one another and reap the benefits, 23andMe is apparently aiming to create "Research 2.0", in which patients volunteer their genetic and clinical information for researchers to work with, and then use the results of that research to inform their own lifestyle decisions. Avey cutely terms this model "23andWe".

Finally, the intimidating Viking-like figure of Kari Stephansson took to the platform. Stephansson played continuously on the research cred of deCODEme's parent company, deCODE - not without justification, given the impressive list of large-scale genetic studies performed by the company (which Stephansson rather heavy-handedly flashed up on the screen, one by one, for what seemed like forever).

In fact, these distinguished scientific credentials seemed to be pretty much the only thing Stephansson could find to distinguish deCODEme from its rivals. Otherwise, the deCODEme ethos seems to resemble the open information model of 23andMe rather than the more old-fashioned paternalistic approach of Navigenics. "Is it always laudable when people learn more about themselves?" he asked, and argued that the answer was unambiguously yes. "Our customers will benefit from knowing themselves better."

After the three company talks, there were presentations from Johns Hopkins' Kathy Hudson and the National Coalition for Health Professional Education in Genetics' Joseph McInerney. Hudson emphasised the strong appetite of consumers for genetic information, and the desperate need for empirical data regarding the responses of people to information about genetic risk variants (which I've wondered about myself). McInerney made a strong case for the complete under-preparedness of health care providers for the boom in personal genetics, and introduced a new database called GeneFacts (currently under development) which will store expert-curated information about genetic tests for both consumers and health providers.

The following discussion was lively, and nowhere near as hostile as I expected. A few highlights:
  • Stephansson was remarkably candid about the usefulness of current tests involving common variants (which all three companies rely on): in one exchange he noted that "we are marketing these tests without any claim that they will impact on people's lives"; in another, he admitted that common variants probably provide marginal utility beyond simply collecting general information on family history.

  • Kathy Hudson responded to the problem of patients being given data of very limited predictive value with a very sensible solution: "In the absence of demonstrable harm, the default should be to provide the information." In her talk, she referred to the argument that genetic tests should only be ordered through a health-care provider as "an old-fashioned model" - a clear rebuff to both Navigenics, which boasts about using an in-house doctor to authorise all of its tests, and the American College of Medical Genetics, which recently issued a statement which argues that "a knowledgeable health professional should be involved in the process of ordering and interpreting a genetic test".

  • Stephansson was sceptical of Avey's claims that 23andMe can perform useful research, given the limitations of self-reported data (I agree). Avey explained that this problem is one of the reasons why 23andMe is interested in working with companies like Google to integrate genetic data with medical records - a suggestion that resulted in some shocked muttering from the audience.

  • Eric Lander (founding director of the Broad Institute) suggested that the medical genetics community needs to play a stronger role in the field of personal genetics, possibly contributing to some sort of expert-curated wiki-style database of information about genetic associations. He indicated that this couldn't be privately funded due to conflicts of interest. McInerney suggested that the soon-to-arrive GeneFacts database could serve as a starting point; Hudson stated that there has been substantial discussion about a genetic test registry at the Genetics & Public Policy Center, and that clear support from the medical genetics community would help this move forward. It appears that DNA Perspectives may have some competition in the very near future. Either way, this is good news for genetic test consumers.

  • There was an interesting back-and-forth between Stephan and Stephansson over the issue of ancestry testing both in their talks and in the discussion: Stephan basically sees ancestry testing as a distraction from the serious business of disease, while Stephansson argued that the effect size of disease risk variants frequently varies between populations, so ancestry testing is highly relevant to calculations of disease risk. This is an empirical question that will soon be answerable (deCODE is apparently currently investing in association studies in Asian populations).

  • All three of the company representatives mentioned their interest in developing whole-genome sequencing capabilities, which is unsurprising - sequencing has always been the Holy Grail of personal genomics, with the current SNP chip technology really little more than a crude place-holder until sequencing prices drop.

I also chatted with both Stephan and Avey after the session - for what it's worth, they're both extremely personable, and there was no overt animosity between any of the three competitors (somewhat to the disappointment of the audience, I suspect).

Anyway, that's the bulk of the personal genomics component of this meeting. As I hinted at the beginning of the post, the other main message has been the rapid advance in next-generation sequencing technology and its application to human genomes, which I'll hopefully be able to post about over the next few days.



Ick update [genomeboy.com]

Posted: 08 May 2008 09:15 PM CDT

You know, I’m not a prude. I’ve been known to tell an off-color joke or two. I don’t presume to exude good taste, nor do I get squeamish easily. But my God, every day I have to purge the most disgusting and perplexing spam, full of acts I’ve either never heard of or that are physiologically impossible.

I just wanted you to know what you’re missing. That is all.

276 pages of pure reality! [The Gene Sherpa: Personalized Medicine and You]

Posted: 08 May 2008 02:19 PM CDT

Have you read it? Come on.....You didn't. Well, you are missing out. Back in 2004 I started watching the Secretary's Advisory Committee on Genetics, Health, and Society. But even more importantly I...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
