Thursday, July 24, 2008

The DNA Network


Dinosaurs, Supertrees, and the Cretaceous Terrestrial Revolution [HENRY » genetics]

Posted: 24 Jul 2008 08:54 PM CDT

Today’s Proceedings of the Royal Society B sees the publication of a supertree of 600 dinosaur species. Awesome. ScienceDaily has more information here. Here’s a picture of it:

Dinosaur Supertree

The abstract says:

The observed diversity of dinosaurs reached its highest peak during the mid- and Late Cretaceous, the 50 Myr that preceded their extinction, and yet this explosion of dinosaur diversity may be explained largely by sampling bias. It has long been debated whether dinosaurs were part of the Cretaceous Terrestrial Revolution (KTR), from 125–80 Myr ago, when flowering plants, herbivorous and social insects, squamates, birds and mammals all underwent a rapid expansion.

Although an apparent explosion of dinosaur diversity occurred in the mid-Cretaceous, coinciding with the emergence of new groups (e.g. neoceratopsians, ankylosaurid ankylosaurs, hadrosaurids and pachycephalosaurs), results from the first quantitative study of diversification applied to a new supertree of dinosaurs show that this apparent burst in dinosaurian diversity in the last 18 Myr of the Cretaceous is a sampling artefact. Indeed, major diversification shifts occurred largely in the first one-third of the group’s history.

Despite the appearance of new clades of medium to large herbivores and carnivores later in dinosaur history, these new originations do not correspond to significant diversification shifts. Instead, the overall geometry of the Cretaceous part of the dinosaur tree does not depart from the null hypothesis of an equal rates model of lineage branching. Furthermore, we conclude that dinosaurs did not experience a progressive decline at the end of the Cretaceous, nor was their evolution driven directly by the KTR.
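The "equal rates model of lineage branching" the abstract tests against is the Yule null: at every moment, each living lineage is equally likely to be the next one to split. A toy simulation (my own sketch, not the paper's method) shows how lopsided clade sizes, and hence apparent diversity "bursts", arise by chance alone under this null:

```python
import random

def yule_clade_fraction(n_tips, seed=1):
    """Grow a tree to n_tips under an equal-rates (Yule) model and
    return the fraction of tips descended from the root's first child.
    Equal rates means every current tip is equally likely to split next."""
    random.seed(seed)
    tips = ["A", "B"]  # the root's two daughter clades
    while len(tips) < n_tips:
        tips.append(random.choice(tips))  # a uniformly chosen tip splits
    return tips.count("A") / len(tips)

# Even with no rate differences, one clade can dominate by chance:
fractions = [yule_clade_fraction(600, seed=s) for s in range(5)]
print([round(f, 2) for f in fractions])
```

Under this null the basal split is asymptotically uniform on (0, 1), so a strongly uneven pair of sister clades is expected without invoking any real diversification shift, which is why an apparent mid-Cretaceous burst can be consistent with equal rates plus sampling bias.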

Compensatory Gene Therapy: A Psychiatric Potential? [Think Gene]

Posted: 24 Jul 2008 05:50 PM CDT

Guest post by Paul Jaffe, coordinator, Manhattan Adult Attention Deficit Disorder Support Group, New York, NY, USA. [7/17/08]

Recently, gene-therapy researchers — who have had their ups and downs — scored a point: the restoration of some vision to patients with retinal degeneration [1,2].

To effect this, they used a generally harmless adeno-associated virus (AAV). The goal was to deliver, to the retina, a functioning copy of a defective gene thought to trigger the illness. This viral “vector” was injected through a surgical procedure deemed reasonably safe.

Within gene therapy, AAV vectoring – which may soon turn a corner [3] — is now standard. This includes a well-publicized effort [4,5,6] — and a less-publicized effort [7] — to treat Parkinson’s Disease.

Unlike the above, which might be termed corrective gene therapy, the PD applications are closer to compensatory gene therapy. Here, the aim is to alter the brain so as to mimic a treatment several steps removed from an underlying pathology.

In each PD clinical trial, a gene has been inserted to encode a specific enzyme. These are:

  • glutamic acid decarboxylase (GAD), which catalyzes the synthesis of the neurotransmitter gamma-aminobutyric acid (GABA); and
  • human aromatic l-amino acid decarboxylase (AADC or hAADC), which does the same for dopamine (DA).

The vectors are known, respectively, as the AAV-GAD and the AAV-AADC.

The PD research might — or might not — succeed. (One vector is in US Phase II testing; the other, in Phase I.) The question here is: might either be used elsewhere? (more…)

Another Perfect Laboratory Inspection for DDC! [The DNA Testing Blog]

Posted: 24 Jul 2008 01:52 PM CDT

DNA Diagnostics Center (DDC) has done it again—the world's largest private DNA testing laboratory has achieved another perfect rating after a surprise inspection by AABB, the organization that sets industry standards for family relationship testing in the United States. Last week, the lead assessor from the AABB carefully examined DDC's performance in the many crucial areas [...]

Tuning and Temperament [Bayblab]

Posted: 24 Jul 2008 12:16 PM CDT

Dr. Ken Garson, lab ninja and science overlord extraordinaire (here is a picture of his lab bench), and I recently had a discussion about music theory. I have taken a fair bit of music theory over the years and I remember very little. However, I was certain that Ken was wrong when he stated that in certain scales a C# is different from a D-flat.
Don't mess with Garson. It turns out that while I was correct under the modern tuning of equal temperament, back in the day of musical geniuses like Bach, instrument tuning was not so crude and was specific to the key in which a piece was being played. Intervals in musical scales are defined by frequency ratios, and they aren't always so pretty. C# is not D-flat according to that definition.
Ken has done some fine research on this subject in order to point out my inferiority. Equal temperament is used so that an instrument like the piano can be played in any key. However, from a great summary on the history of tuning and temperament:

* Our first written instructions for setting equal temperament come from Giovanni Maria Lanfranco in 1533:

* "The 5ths are tuned so flat that the ear is not well pleased with them; and the 3rds are as sharp as can be endured."

Equal temperament has had its share of critics. Very few composers or organists preferred equal temperament until the French Romantic school.

* In 1879, William Pole wrote in his book The Philosophy of Music, "The modern practice of tuning all organs to equal temperament has been a fearful detriment to their quality of tone. Under the old tuning, an organ made harmonious and attractive music. Now, the harsh 3rds give it a cacophonous and repulsive effect."

So is there an audible difference to those of us who aren't musical geniuses or Ken Garson? Check out this YouTube link of a guy playing the same piece using different temperaments. I think I can tell the difference, but maybe the guy is just playing it better. A computer simulation would probably be best.
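Short of a full simulation, the arithmetic is easy to check. The sketch below uses standard textbook ratios (nothing from Ken's notes): a Pythagorean C# is built by stacking pure 3:2 fifths upward from C, a Pythagorean D-flat by stacking fifths downward, and both are compared to the equal-tempered semitone:

```python
# Frequencies are relative to C = 1.0.
EQUAL_SEMITONE = 2 ** (1 / 12)  # equal temperament: 12 identical semitones per octave

def pythagorean(fifths_up):
    """Ratio of a note reached by stacking pure 3:2 fifths from C,
    folded back into a single octave (negative = fifths downward)."""
    ratio = (3 / 2) ** fifths_up
    while ratio >= 2:
        ratio /= 2
    while ratio < 1:
        ratio *= 2
    return ratio

c_sharp = pythagorean(7)    # 7 fifths up from C   -> 2187/2048, about 1.0679
d_flat  = pythagorean(-5)   # 5 fifths down from C ->  256/243,  about 1.0535
print(round(c_sharp, 4), round(d_flat, 4), round(EQUAL_SEMITONE, 4))
```

In Pythagorean tuning the C# really does come out sharper than the D-flat (roughly 1.068 vs. 1.053 times the frequency of C), with the equal-tempered semitone (about 1.059) splitting the difference; that gap is exactly what a single key-independent tuning has to paper over.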

Drum yourself into shape? [Bayblab]

Posted: 24 Jul 2008 12:04 PM CDT

Wii Fit, for the Nintendo Wii, recently hit store shelves, touting itself as a fun fitness product, claiming:
By playing Wii Fit a little every day, you, your friends, and your family can work towards personal goals of better health and fitness.
Perhaps the game Rock Band should take a similar tack in advertising.

A story at BBC News equates rock drummers to top athletes. These conclusions were reached by tests on Clem Burke, drummer for Blondie, who was hooked up to a heart monitor and other equipment during a performance. An hour in concert can burn up to 600 calories and tests on Burke showed his heart averaged 140-150 bpm, peaking at 190 (see this chart for Fox and Haskell's exercise zone formula). These rates, the article claims, are comparable to Premier League football (soccer) players.
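The Fox and Haskell arithmetic behind those zones is simple enough to sketch. HRmax = 220 minus age is the classic estimate; the age and the zone percentages below are illustrative assumptions on my part, not figures from the BBC article:

```python
def hr_max(age):
    """Classic Fox-Haskell estimate of maximum heart rate."""
    return 220 - age

def hr_zone(age, lo_pct, hi_pct):
    """Heart-rate band as a fraction of estimated HRmax."""
    m = hr_max(age)
    return (m * lo_pct, m * hi_pct)

# Illustrative only: a drummer in his early 50s (Burke's exact age
# is an assumption here, not taken from the article).
age = 53
print(hr_max(age))               # estimated HRmax
print(hr_zone(age, 0.70, 0.85))  # the commonly cited "aerobic" band
```

By this estimate, an average of 140-150 bpm would sit at or above the top of the aerobic band for a man that age, which is what makes the reported numbers sound athlete-like in the first place.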

I don't doubt that you can get a workout playing the drums - I've certainly broken a sweat hammering away at Rock Band (sorry, not a musician here) - but the BBC piece seems to go a bit far in its comparison to top athletes:
"Footballers can normally expect to play 40 to 50 games a year - but in one 12 month period, Clem played 90-minute sets at 100 concerts.

"Footballer [sic] find playing a Champions League game once every two weeks a drain, but these guys are doing it every day when they are on tour.
This quote, combined with the previous comparison to professional football players, seems to imply that drumming is more physically demanding. Instead, I would read it as going against their thesis that "rock drummers are top athletes"; that is, drumming isn't nearly as strenuous as professional sports.

Enjoy your drumming, and feel good knowing that you're burning some calories, but if you're planning on leaving your pick-up soccer league to go pro there are probably better ways to get in shape.

Any thoughts from drummers out there?

The Brassy Professor [Tomorrow's Table]

Posted: 24 Jul 2008 12:03 PM CDT

I can't say that I have often been called "brassy," but I sure do prefer that to some other descriptions I have seen lately, like "irreverent" or "disingenuous" or, in the case of my latest research proposal, "denied."

So I was pleased that Andrew Leonard of Salon used that lovely word in his July 16th post in regards to my proposal that the best way to ensure a safe food supply with the least amount of damage to the environment is to integrate genetic engineering with organic farming.

Here is the good part:

... Ronald was even brassy enough to cite "Silent Spring's" Rachel Carson as a potential muse for those who aim to merge the latest biotechnology with sustainable agriculture.

After all, in 1962 she said:

"A truly extraordinary variety of alternatives to the chemical control of insects is available. Some are already in use and have achieved brilliant success. Others are in the stage of laboratory testing. Still others are little more than ideas in the minds of imaginative scientists, waiting for the opportunity to put them to the test. All have this in common: they are biological solutions, based on understanding of the living organisms they seek to control, and of the whole fabric of life to which these organisms belong. Specialists representing various areas of the vast field of biology are contributing -- entomologists, pathologists, geneticists, physiologists, biochemists, ecologists -- all pouring their knowledge and their creative inspirations into the formation of a new science of biotic controls."

....But the quote that really seemed to sum up an approach to farming (and just about everything else) that this blog can get behind also came from Ronald, quoting a farmer friend of hers:

As Mike Madison, a fellow farmer, neighbor and writer says, "In dealing with nature, to be authoritarian is almost always a mistake. In the long run, things work out better if the farmer learns to tolerate complexity and ambiguity... Having the right tools helps."

Leonard indicates that our books have joined his "endless queue." Does that mean he will buy them but not read them? Or, worse yet, just jot the names down on a long list?

Staying clean [The Tree of Life]

Posted: 24 Jul 2008 10:25 AM CDT

Well, sorry for the lack of posting recently. Out sick thanks to a fun antibiotic-resistant bacterium. In honor of that, here are some tips for staying clean:

A Germ-Zapper's Guide to Clean (from the Washington Post; hat tip to Doug Rusch for pointing this out and giving me something to do other than worry about bacterial infections)

A Not So Good GM Crop - Pharma Corn [Bitesize Bio]

Posted: 24 Jul 2008 09:51 AM CDT

Tuesday, I posted on Bt Corn as an example of a “good” genetically modified (GM) crop whose benefits vastly outweigh its risks, risks which are akin to the development of antibiotic resistance in the health sector. Here, I thought I’d mention Pharma Corn as an example of a “bad” GM crop, in my opinion at least.

The PLoS Biology article from last time touches on Pharma Corn as well, but only for a short summary. Check out the Journal for Agrobiotechnology Management and Economics (2005) for a more comprehensive article on Biopharming and the Food System.

For Pharma Corn, gene flow *is* a serious concern, and extensive measures are required to contain it. Just think of the consequences of hybridization between corn engineered to manufacture drugs and corn intended for human consumption. What happens if you inadvertently consume a drug that conflicts with your body's chemistry or your current prescriptions, just by eating an ear of corn?

But okay, the FDA is known for tight regulation of such things, and Pharma Corn doesn’t require much acreage, making containment easier. But look at the situation:

Given the potential risks and liabilities associated with accidental commingling with the food supply, and facing the daunting task of ensuring near-100% containment, the food and the biotech industries have taken a precautionary approach to pharmaceutical crops and support for risk-based regulations. The Prodigene incident case in 2002 illustrates the type of risks facing the food industry. In Nebraska, during the 2002 growing season, APHIS inspectors discovered "pharmaceutical" volunteer corn growing in a soybean field. The corn was from the previous year, when Prodigene had tested a pharmaceutical corn to produce a swine vaccine. As a result, both the harvested soybeans (500 bushels) and the entire soybean load of 500,000 bushels in local elevator were quarantined. In another accident in Iowa, the USDA forced Prodigene to burn 155 acres of conventional corn that may have cross-pollinated with some of the company's pharmaceutical plants. In both cases, the infraction was viewed to come from Prodigene's failure to adhere to permit protocols issued by APHIS. Prodigene was fined US$250,000 and required to pay approximately $3 million for the cleanup costs and disposal of contaminated corn and soybeans.

Clearly, “prodigious obstacles” summarizes the situation for Pharma Corn as an agricultural product of the future.

Requiescat in pace []

Posted: 24 Jul 2008 09:33 AM CDT


Victor Almon McKusick (1921-2008), the Father of Medical Genetics. I think many of us believed he’d live forever.

After a long day at OSCON [business|bytes|genes|molecules]

Posted: 24 Jul 2008 08:27 AM CDT

OSCON is a very interesting meeting. Apart from opportunities to meet old friends and acquaintances, talk to people like Damian Conway and Toby Segaran, and have a very productive conference, one also gets a read on some trends; more on this in a bit.

I was also pleased to find some life science types here, including people from Pfizer and 454. I still feel that one sign that academia, and even commercial life science companies, don’t quite get it when it comes to software development and engineering is that while we are quick to travel to any number of scientific conferences, attendance at programming and technology-centric conferences is limited. Which means you miss out on talks about XMPP and how it can be used to power and scale web services, learning about the open web, etc.

XMPP is definitely making its presence felt at OSCON. I only got a chance to attend one talk, where XMPP was essentially presented as an alternative to REST, assuming that SOAP was so bad that you wanted to avoid it and REST had limitations. I won’t go into that argument (other than agreeing on the SOAP bit), but twitter chatter suggested that there was a lot of discussion around XMPP.

It’s also interesting to see the level of interest in things like memcached and the importance of community. I also found it interesting to observe the differences between the crowd at OSCON and at other geek-tech conferences. There are a number of people here, not a large number, but enough, who are not quite web aware. That surprised me a little.

Anyway, all I have time for right now. More later.



A pronounced affection for parasites. [Genomicron]

Posted: 24 Jul 2008 06:37 AM CDT

According to Peter Olson of the Natural History Museum in London, "All free-living organisms host one or more parasites". This can be taken two ways, both of them generally true: a) that each individual multicellular organism hosts at least one individual parasite within its body, and b) that each free-living species plays host to at least one species of parasite that attacks it exclusively. Consider this second point for a moment. For each free living species there is one or more (usually several more) parasite species -- that is, as a category (polyphyletic, obviously), parasites may very well be the most diverse types of organisms on the planet.

On the other hand, most parasites are much smaller than their hosts, and so it has typically been assumed that they contribute a negligible fraction to any particular ecosystem's total biomass. Nuh-uh. In a report published in Nature this week, Kuris and colleagues presented five years' worth of analyses of estuaries in California and Baja California in which they measured the amount of biomass made up of parasites and free-living organisms.

In sum, Kuris et al. (2008) examined 138 species of infectious agents, 199 species of free-living animals (including invertebrates as well as fishes and birds), and 15 species of free-living plants in their study. They found that plants contributed the most biomass to all three of the estuaries they studied, followed by groups such as snails, bivalves, and crabs. Parasites made up only about 0.2% to 1.2% of the animal biomass of each environment, and on average parasite groups had biomasses 1000 times lower than the average free-living group. However, as Kuris et al. (2008) report,
Certain parasitic groups dominated the parasite biomass, reaching levels similar to those of common free-living groups. For instance, the biomass of trematode worms was comparable to that of the fishes, burrowing shrimps, polychaetes or small arthropods. In all estuaries, trematode biomass exceeded bird biomass by threefold to ninefold.
In other words, parasites make up a larger fraction of the living matter in these environments than do the top predators. In particular, parasites that castrate their hosts (i.e., prevent them from diverting resources into reproductive effort) were the most abundant. The world is not fishy or feathery, it is fluky.

So, whereas the famous quote attributed to J.B.S. Haldane that if there is a creator he must have "an inordinate fondness for beetles" still applies, it may be that he has an even more pronounced affection for parasites. Especially the castrating sort.


Kuris, A.M., R.F. Hechinger, J.C. Shaw, K.L. Whitney, L. Aguirre-Macedo, C.A. Boch, A.P. Dobson, E.J. Dunham, B.L. Fredensborg, T.C. Huspeni, J. Lorda, L. Mababa, F.T. Mancini, A.B. Mora, M. Pickering, N.L. Talhouk, M.E. Torchin, and K.D. Lafferty. 2008. Ecosystem energetic implications of parasite and free-living biomass in three estuaries. Nature 454: 515-518.

Fewer papers to read, more data to use... [The Seven Stones]

Posted: 24 Jul 2008 05:42 AM CDT

In a nice post at bbgm, Deepak writes:

...historical online literature lacks the relevant structure and metadata to make our task easier, but it is time that publishers thought ahead about some of the advantages of online publishing.

I couldn't agree more. I have sometimes heard the claim that within 5-10 years, more than 95% of the scientific literature is going to be read by computers only. Possible. However, the converse alternative might be interesting to consider: what if 95% of scientific papers could be 'written' by computers? Even if this formulation is obviously provocative and unrealistic, the point is that harnessing the 'network effect' of the web may have two complementary components, one community-driven, the other computer-driven. On one hand, web 2.0 functionalities enable community-driven commenting, rating and even writing of scientific publications. On the other hand, semantic web technologies are expected to facilitate computer-driven integration of scientific data from multiple sources, which is likely to play an increasingly important role in science. Rather than mining thousands of unread papers, the scientist of the future may search the web for relevant data first and integrate it to generate, or 'write', novel insight. In fact, integration of large datasets already represents a major field of research in systems biology (see Chuang et al 2007, Xue et al 2007 or Mani et al 2008 as recent examples published in Mol Syst Biol).

It seems, then, that in addition to being web 2.0 enabled, new publishing models should 'embed' more structured data into online publications. In short, 'papers' could progressively transform into hybrid online objects that more closely resemble database records (see Timo Hannay's post on this topic) or highly structured documents. At the extreme, one could even imagine publishing 'naked' datasets, without any 'stories' around them. Of course, efficient data integration will require the data to be in a standard and structured format, and its quality will have to be well characterized. These are all far from trivial qualities.

The good old-fashioned papers are probably not going to disappear as publication units, in particular for high-impact studies reporting novel and deep insights. Nor is the point here to propose dumping every scientist's hard drive onto the web; data-rich publications would be published only when the authors felt it appropriate. There might thus be some equilibrium to be found between papers that will never be read except by a text mining engine and pure datasets, published as a resource, easier to search, to mine and to integrate. This dialectic may ultimately boil down to the question of how well text mining and data integration technologies will perform in the future.

In any case, within the context of the current debate about the saturation of the peer-review system, I wonder whether a data-centric form of scientific publishing could help relieve some of the pressure. Reviewing of datasets might be quicker and could rely more on standardized evaluation parameters. If paired with proper credit attribution mechanisms and metrics of impact, data-rich (or even data-only) publications may represent an alternative model complementing the traditional 'paper' format. It would prevent the loss of useful data otherwise buried in verbal descriptions and, most importantly, would hopefully stimulate web-wide integration of disparate datasets.

The science of tapping beer cans [Mailund on the Internet]

Posted: 24 Jul 2008 04:14 AM CDT

I’m linking to a Danish page here, sorry to those of you who do not quite master that language yet.

It concerns the age-old question of whether you can prevent a shaken beer can from spilling when you open it by tapping the top of the can first.  A lot of people do this, but does it have any effect?

Klaus Seiersen — incidentally an old drinking buddy of mine from the physics department — did an experiment with 30 cans and didn’t see any correlation between tapping and the amount spilled.

You spill your beer when it foams out of the can, and the foam is caused by the CO2 in the beer.  When you shake the can, the CO2 gets mixed with the beer, and the beer foams. To prevent this, all you have to do is to wait until the CO2 settles in the top of the can again…

New “rocket science” logo [Mailund on the Internet]

Posted: 24 Jul 2008 03:50 AM CDT

I guess the squeaky wheel gets the oil.  After complaining about access to CLC bio’s whitepaper, Lasse Görlitz sent me an email explaining that they are capitalists and that should explain why I need to fill out a webform to download a paper… anyway, he attached this logo and all is forgiven.

Yes, I am that easy to buy off!

Bioinformatics Rocks!

NGS whitepaper from CLC bio … maybe [Mailund on the Internet]

Posted: 24 Jul 2008 03:33 AM CDT

According to this press release, CLC bio has released a whitepaper describing their assembly algorithm.  I’m very interested in reading it.  The speed sounds impressive, and assembly is an interesting algorithmic problem.  There’s just one problem: the link to the paper isn’t a link to a paper at all!  It’s a web form that lets you apply for it … and after filling out the form they kindly tell you that they’ll get back to you.

Sorry guys, this is too lame!  Just give me the damn paper!

The Genetic Genealogist’s Wordle [The Genetic Genealogist]

Posted: 24 Jul 2008 02:00 AM CDT

Since everyone else in the genealogical world has already submitted a version of their Wordle, I thought I’d join in on the fun.

Here’s mine:

And another:

ABI Expands SOLiD Software Community []

Posted: 24 Jul 2008 12:08 AM CDT

Looks like ABI has begun releasing more tools to their user community... see their new site (I really like the color scheme they chose for the site). A variety of assembly and mapping tools are available... as far as I can tell they are parts of the pipeline already present on the instrument... "corona" and "matoGff" are built-in tools that do the automated secondary analysis. Now they are provided as standalones with fairly detailed documentation... which is always helpful. Also there are...

Read more and join the community...
