Wednesday, July 2, 2008

The DNA Network


Only Nature could turn the success of PLoS One into a model of failure [The Tree of Life]

Posted: 02 Jul 2008 05:03 PM CDT

Now, mind you, I like Nature as a publishing unit. They publish some very fine journals. Now, most of them are not Open Access, so I choose not to publish there if I can avoid it. But I still like them. And many of the editors and reporters there are excellent - smart, creative, insightful and such. But Nature the publisher can also be completely inane when it comes to writing about Open Access and PLoS. In a new article by Declan Butler, Nature takes another crack at the PLoS "publishing model".

The problem with PLoS now is ... wait for this ... the success of PLoS One.  PLoS One it turns out is publishing a lot of papers (including one by me, today).  And bringing in a decent amount of money to PLoS apparently (note for full disclosure - I am involved in PLoS Biology as "Academic Editor in Chief" and PLoS Computational Biology as an Academic Editor ... although I should note I am not involved in financial discussions at PLoS in any way).  

So why is the success of PLoS One a problem?  Well, because it allows Nature to do the old good cop bad cop routine and to write, again, about the "failings" of the PLoS publication model.  Now, mind you, the article does not quote a single source for what the PLoS publication model is.  But they do say it has failed.  From what I can tell here is the logic of the failure argument:
  1. Nature believes PLoS' model for success revolved solely around PLoS Biology and PLoS Medicine and some of the other PLoS journals being self-sustaining after a few years.
  2. Analysis of some financial information suggests that PLoS Biology and Medicine currently are not breaking even.
  3. PLoS One is apparently wildly successful and thus is bringing in some money to PLoS.
  4. PLoS One publishes a lot of papers (they discuss this a bit and imply that this is a bad thing because some of the papers must be bad.  Note - they do not back this up with any evidence.  Silly of me to ask a science journal to use evidence).
  5. Therefore, the entire PLoS Publication model is a failure.
The problems with this logic are, well, large.  Here are some:
  1. Does Nature really think that there ever was a single "model" for how PLoS should be evaluated?  
  2. If so, where is the documentation of what this model actually was?
  3. Even if there was a PLoS model and even if it turns out to be not exactly what PLoS is doing now, what is the big deal?  If you were a stockholder of any company and they told you "we are never going to change our business model no matter what happens in the world around us" I would recommend you not buy their stock.  It is simply farcical to expect any entity to stick to a single simple model forever. 
  4. Doesn't Nature supplement some of its bigger journals with its higher-volume journals?
  5. Most companies these days use high profile entities such as PLoS Biology and PLoS Medicine to attract attention to other portions of their company in order to help bring in money.  Is this somehow not allowed by PLoS?  Doesn't Nature do the same thing?
  6. If you look at the figure Nature shows of PLoS $$$, it shows income rising in 2007 and expenses going down.  How did that get turned into a bad thing?
So - I still do like Nature publishing because much of the time it has high quality stuff.  It even has high quality stuff commenting/criticizing the Open Access movement and pointing out some of the challenges with it.  But this article by Butler is not an impressive piece of work.  I really wanted to give him an award but could not think of what to give.

See also 

A bloody wish list. [T Ryan Gregory's column]

Posted: 02 Jul 2008 04:03 PM CDT

Depending on the animals in question, the amount of DNA per cell may be associated with body size, metabolic rate, developmental rate, or other traits.

Read More...

The genetics of metabolic syndrome across populations [Yann Klimentidis' Weblog]

Posted: 02 Jul 2008 03:53 PM CDT

Regions of strong genetic association with phenotypes related to metabolic syndrome are found to differ between groups (whites, Mexican Americans, African Americans, and Japanese Americans).

Genome-wide Linkage Scan for the Metabolic Syndrome: The GENNID Study
Karen L. Edwards, Carolyn M. Hutter, Jia Yin Wan, Helen Kim and Stephanie A. Monks
Obesity (2008) 16 7, 1596–1601
Abstract: In the United States, the metabolic syndrome (MetS) constitutes a major public health problem with over 47 million persons meeting clinical criteria for MetS. Numerous studies have suggested genetic susceptibility to MetS. The goals of this study were (i) to identify susceptibility loci for MetS in well-characterized families with type 2 diabetes (T2D) in four ethnic groups and (ii) to determine whether evidence for linkage varies across the four groups. The GENNID study (Genetics of NIDDM) is a multicenter study established by the American Diabetes Association in 1993 and comprises a comprehensive, well-characterized resource of T2D families from four ethnic groups (whites, Mexican Americans, African Americans, and Japanese Americans). Principal component factor analysis (PCFA) was used to define quantitative phenotypes of the MetS. Variance components linkage analysis was conducted using microsatellite markers from a 10-cM genome-wide linkage scan, separately in each of the four ethnic groups. Three quantitative MetS factors were identified by PCFA and used as phenotypes for MetS: (i) a weight/waist factor, (ii) a blood pressure factor, and (iii) a lipid factor. Evidence for linkage to each of these factors was observed. For each ethnic group, our results suggest that several regions harbor susceptibility genes for the MetS. The strongest evidence for linkage for MetS phenotypes was observed on chromosome 2 (2q12.1–2q13) in the white sample and on chromosome 3 (3q26.1–3q29) in the Mexican-American sample. In conclusion, the results suggest that several regions harbor MetS susceptibility genes and that heterogeneity may exist across groups.

The Leiden 8 -- no buttons, though. [T Ryan Gregory's column]

Posted: 02 Jul 2008 02:52 PM CDT

So, over at Rationally Speaking, Massimo Pigliucci asks "Is there fundamental scientific d

Read More...

Leveraged Knowledge Management [Sciencebase Science Blog]

Posted: 02 Jul 2008 01:00 PM CDT

Several years ago, I was called on by a multinational producer of hygiene, food, and cleaning products to pay a visit to their research and information centre. My role was to play editorial consultant for content for their new Intranet.

You see, the company had lots of researchers in one building who were working hard on non-stick ice cream and insect-deterring shaving gel, while the information team were in a separate building trawling patents and fishing for pertinent technology news. Unfortunately, the two teams worked a different shift system and rarely met, and even when they did meet, they didn’t have much to say to each other. One group were more concerned with the latest formulation change that might result in a shift in rheological properties for a new cheese-and-chocolate pie filling or a non-grip floor cleaner, while the other group were interested only in assessing prior art or working out Boolean strings for searching databases.

The new Intranet would, however, bring the two groups together. The system would provide regularly updated news and information about what each team was doing, forums for discussing ways to improve efficiency and, most importantly, offer them an area where ideas might be mashed up to help each see the possibilities of the other's efforts. It would be pretty much an internal version of what we now know as online social networking. And, as with many mashups we see these days, it would, the reasoning went, allow the company to come up with novel products and marketing strategies based on the two sides of their knowledge.

At the heart of the solution was the concept of knowledge management. Indeed, the person in charge had KM somewhere in his job title and, unfortunately, the word leverage was also in his job description. [This post's title is meant to be ironic, in case you didn't spot it, db]

Now, before this consultancy work, I had not come across the abbreviation KM, nor the phrase knowledge management and had certainly never wittingly used the word leverage. To my mind it sounded like nothing more than market-speak. Nevertheless, I quickly worked out what it was meant to mean and undertook the task with my usual level of workaholic diligence.

Apparently, I’m not alone in my perspective on KM. But, more to the point, many people are unaware of the fact that the stuff with which they work is labelled knowledge and that they can manage it nevertheless. In fact, that’s the main conclusion from a paper entitled “Dimensions of KM: they know not it’s called knowledge…but they can manage it!”, published today in the International Journal of Teaching and Case Studies (2008, 1, 253).

In that paper, Sonal Minocha, a senior manager at Raffles University, Singapore, and George Stonehouse of the Napier University Business School, Craiglockhart Campus, in Edinburgh, UK, have explored how, for business students, the concept of knowledge is a fascinating one, as most of them wonder what is encompassed within “knowledge management” for it to be a subject. Yet these same students can manage knowledge almost without needing to know precisely what KM is. Perhaps that finding extends into the workplace too. Indeed, following a case study involving the fall and rise of a major vehicle manufacturer, the researchers come to the following conclusion:

Competitive advantage, however, depends upon the creation of new knowledge, based upon this learning, centred upon the development of new business, new ways of doing business, improved customer and supplier relationships, and the development and leverage of new knowledge-based core competencies.

Aside from the use of the word leverage and phrases like core competencies, that seems to be a fair conclusion.

Well, by now you may be wondering what happened to that proto-social network pioneered by the manufacturer of anti-shark surfboard wax and child-repellent screen cleaners. My consultancy work lasted a week during which time, I’m not proud to say, I taught the management team a thing or two about how not to flip a flip-chart and almost blinded the team leader with a laser pointer.

At the end of the week, we had a fairly solid outline for how to proceed with the new networking system. Unfortunately, the staff member in charge of the KM project to bring the research and information people together was leveraged out of the company soon after; for reasons unrelated to the incident with the flipchart and laser pointer, I should add.

A post from David Bradley Science Writer

Leveraged Knowledge Management

IVF and Human Cloning: the truth behind the petri dish [Mary Meets Dolly]

Posted: 02 Jul 2008 12:34 PM CDT


I love Biopolitical Times. Their blog is an absolute must read. In this entry, Jesse Reynolds exposes the shady connection between the cloning lab and the IVF clinic:

Cloning-based stem cell research puts women's health on the line. It requires large numbers of fresh human eggs, whose extraction poses significant risks. The politicized and well-funded drive for research cloning has been balanced by a policy consensus that women should not be paid to provide eggs for research. But now, advocates and (hopeful) practitioners of cloning-based work are pushing to undo the rules (1, 2, 3, 4) that protect women's health in this speculative line of work.


This is most evident in California, where biotech startup Stemagen recently created the first human clonal embryos [PDF] while skirting California law and national guidelines:

So, the same doctors that are extracting women's eggs to be used in the touchy-feely world of reproductive technologies are using said eggs to clone human embryos for research. Take, for example, Samuel H. Wood, from above. His webpage at the San Diego Reproductive Sciences Center says their facilities are "where babies come from" and yet at the bottom of Dr. Wood's list of publications is his paper on cloning human embryos. [French AJ, Adams CA, Anderson LS, Kitchen JR, Hughes MR, Wood SH. Development of Human cloned Blastocysts Following Somatic Cell Nuclear Transfer (SCNT) with Adult Fibroblasts. Stem Cells. 2008 Jan 17; [Epub ahead of print]]

WAKE UP PEOPLE! Once you start creating human life in a dish, it immediately becomes a product, a commodity to be exploited. Now that our society says it is fine to create human beings in a lab, it is not surprising that the very same doctors and researchers are expanding their area of expertise into human cloning. And do not think they aren't champing at the bit to take therapeutic cloning back into the clinic so they can tout that they are "where cloned babies come from."


15 a week....not even close to 100 in 10 days. [The Gene Sherpa: Personalized Medicine and You]

Posted: 02 Jul 2008 12:32 PM CDT

I wanted to tell everyone about what the Wellcome Trust has been doing. Aside from their new commitment to 90,000 genomes, they have now reached 1 terabase. That is.... the staggering total of...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]

My first PLoS One paper .... yay. [The Tree of Life]

Posted: 02 Jul 2008 10:45 AM CDT

Well, I have truly entered the modern world. My first PLoS One paper has just come out. It is entitled "An Automated Phylogenetic Tree-Based Small Subunit rRNA Taxonomy and Alignment Pipeline (STAP)" and well, it describes automated software for analyzing rRNA sequences that are generated as part of microbial diversity studies. The main goal behind this was to keep up with the massive amounts of rRNA sequences we and others could generate in the lab and to develop a tool that would remove the need for "manual" work in analyzing rRNAs.


The work was done primarily by Dongying Wu, a Project Scientist in my lab, with assistance from Amber Hartman, who is a PhD student in my lab. Naomi Ward, who was on the faculty at TIGR and is now at Wyoming, and I helped guide the development and testing of the software.

We first developed this pipeline/software in conjunction with analyzing the rRNA sequences that were part of the Sargasso Sea metagenome; results from that work were included in the Venter et al. Sargasso paper. We then used the pipeline and continued to refine it as part of a variety of studies, including a paper by Kevin Penn et al. on coral-associated microbes. Kevin was working as a technician for me and Naomi and is now a PhD student at Scripps Institution of Oceanography. We also had some input from various scientists we were working with on rRNA analyses, especially Jen Hughes Martiny.

We made a series of further refinements and worked with people like Saul Kravitz from the Venter Institute and the CAMERA metagenomics database to make sure that the software could be run outside of my lab. And then we finally got around to writing up a paper .... and now it is out.

You can download the software here. The basics of the software are summarized below: (see flow chart too).


  • Stage 1: Domain Analysis
    • take an rRNA sequence
    • blast it against a database of representative rRNAs from all lines of life
    • use the blast results to help choose sequences to use to make a multiple sequence alignment
    • infer a phylogenetic tree from the alignment
    • assign the sequence to a domain of life (bacteria, archaea, eukaryotes)
  • Stage 2: First pass alignment and tree within domain
    • take the same rRNA sequence
    • blast against a database of rRNAs from within the domain of interest
    • use the blast results to help choose sequences for a multiple alignment
    • infer a phylogenetic tree from the alignment
    • assign the sequence to a taxonomic group
  • Stage 3: Second pass alignment and tree within domain
    • extract sequences from members of the putative taxonomic group (as well as some others to balance the diversity)
    • make a multiple sequence alignment
    • infer a phylogenetic tree
From the above path, we end up with an alignment, which is useful for things such as counting the number of species in a sample, as well as a tree, which is useful for determining what types of organisms are in the sample.
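The three stages above can be sketched roughly as follows. This is a hypothetical outline rather than the actual STAP code: the real pipeline wraps external tools (BLAST, a multiple aligner, a tree builder), which are stubbed out here as placeholder functions.

```python
# Hypothetical sketch of the three-stage STAP flow described above.
# All helper functions are placeholders for the external tools the
# real pipeline wraps; only the control flow mirrors the description.

def blast(query: str, database: str) -> list[str]:
    """Return IDs of reference rRNAs similar to the query (stub)."""
    return [f"{database}_hit_{i}" for i in range(3)]

def align(sequences: list[str]) -> str:
    """Build a multiple sequence alignment (stub)."""
    return "alignment(" + ",".join(sequences) + ")"

def infer_tree(alignment: str) -> str:
    """Infer a phylogenetic tree from an alignment (stub)."""
    return f"tree[{alignment}]"

def classify(tree: str, level: str) -> str:
    """Assign the query to a group based on its tree placement (stub)."""
    return f"{level}_assignment"

def stap(query: str) -> tuple[str, str]:
    # Stage 1: domain assignment against representatives of all life
    hits = blast(query, "all_domains")
    domain = classify(infer_tree(align([query] + hits)), "domain")

    # Stage 2: first-pass taxonomy within the assigned domain
    hits = blast(query, domain)
    taxon = classify(infer_tree(align([query] + hits)), "taxon")

    # Stage 3: refined alignment and tree within the putative group
    neighbours = blast(query, taxon)
    alignment = align([query] + neighbours)
    return alignment, infer_tree(alignment)
```

In the real pipeline each stage's BLAST hits seed the alignment, and the tree from each stage determines which reference database the next stage searches.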

I note - the key is that it is completely automated, can be run on a single machine or a cluster, and produces results comparable to manual methods. In the long run we plan to connect this to other software that other labs develop to build a metagenomics and microbial diversity workflow that will help in the processing of massive amounts of sequence data for microbial diversity studies.

I should note this work was supported primarily by a National Science Foundation grant to me and Naomi Ward as part of their "Assembling the Tree of Life" Program (Grant No. 0228651). Some final work on the project was funded by the Gordon and Betty Moore Foundation (grant #1660 to Jonathan Eisen).

Natural Selection Before Darwin [T Ryan Gregory's column]

Posted: 02 Jul 2008 10:03 AM CDT

Charles Darwin (1809-1882) opened his first notebook about "the species question" in 1837, not long after his return from the voyage of the Beagle.  By 1838, he had developed the basic outline of his theory of natural selection to explain the evolution of species.  He spent the next 20 years developing the theory and marshalling evidence in favour of both the fact that species are related through common descent and his particular theory to explain this.  After receiving word that another naturalist, Alfred Russel Wallace (1823-1913), had independently come upon the same theory, he assembled his work for publication, first in a joint paper with Wallace presented to the Linnean Society of London in 1858 and then his "abstract", On the Origin of Species, in 1859.

Some authors have argued that Edward Blyth (1810-1873), an acquaintance of Darwin's, developed the central idea of selection in an 1835 paper in the Magazine of Natural History.  For example, Eiseley and Grote (1959) claimed that "the leading tenets of Darwin's work -- the struggle for existence, variation, natural selection and sexual selection are all fully expressed in Blyth's 1835 paper", from which they then quoted the following:

Read More...

Navigenics has a blog [Genetic Future]

Posted: 02 Jul 2008 07:40 AM CDT

I just noticed that personal genomics company Navigenics has been rather quietly running a blog, "The Navigator", for a week or so.

A quick perusal of the posts so far highlights the differences in attitude between Navigenics and its peppier rival, 23andMe: while the long-running 23andMe blog, The Spittoon, conveys a real sense of a group of bright people who enjoy sharing advances in genetics with the public, the Navigator so far feels more sober, careful and considered.

This fits with the differing target audiences of the two companies: whereas 23andMe has the right mix of technical information and bright colours to draw in the young Wired crowd, Navigenics has marked itself from the start as being a serious company providing serious health-centred information to busy, serious people who don't have time to learn what a SNP is, and certainly are far too busy and serious to worry about frivolous things like genes for eye colour. 23andMe's blog is thus much more relevant to the company's target audience than Navigenics' (in fact I'm unsure whether the Navigenics target customer even knows what a blog is).

I do note that Navigenics chose to comment on the recent regulatory scuffle in California on their blog, a topic on which the Spittoon has been unexpectedly silent so far - so perhaps their blog will be used as an official communication channel as well as a means of drawing attention to new SNPs being added to Navigenics' serious list of Diseases They Think You Need To Know About.

Anyway, I won't be too harsh given that the blog has only just launched - anyone following the industry will probably want to subscribe, or at least drop by occasionally.


Subscribe to Genetic Future.

Did Darwin delay? [T Ryan Gregory's column]

Posted: 02 Jul 2008 06:13 AM CDT

In my evolution course, I note that "Darwin spent 20 years working out his ideas and gathering evidence" before releasing On the Origin of Species in 1859.

Read More...

The Slowing of Drug Discovery [Bitesize Bio]

Posted: 02 Jul 2008 05:09 AM CDT

The June 20th issue of Science had an interesting story worth noting (interesting to anyone into molecular pharmacology, anyway), on Drugs, Industry, and Academia. It caught my attention because of some conversations that I’ve been having recently with colleagues and friends in the industry - how is Big Pharma going to maintain itself amid slowing drug discovery trends?

Now, I’m working at a generic pharmaceutical company for the time being, and the coming decade will be a boom for companies who aren’t following the discovery model - so many very profitable drugs will be coming off their patents over that time.

Just about everyone agrees on two things, though. First, we need more basic research, the vast amount of which takes place in academia and is sponsored by federal funding; thus, the government makes investments in the relevant fields of study to maintain economic prosperity in the pharmaceuticals market. And second, high-throughput screening, combinatorial chemistry, and rationalized drug design have largely been failures, suggesting that we don’t have an efficient system for discovery that can out-perform an army of inventive academics.

Beyond those two items, it sounds as though nobody has a clear vision for what the pharmaceutical industry should become. I imagine that each company will formulate their own strategy, and the one(s) with the right combination of effective qualities in the changing market will come out on top.

Health Benefits of Indium [Sciencebase Science Blog]

Posted: 02 Jul 2008 03:10 AM CDT

Yet another health supplement hits the streets, this time in the form of indium sulfate. Never heard of it? Apparently, it “is a rare trace mineral that supports several hormonal systems in the body. Indium may strongly elevate immune activity and reduce the severity and duration of a myriad of human conditions.” That’s according to the NaturalHealthConsult.com website, which goes on to claim that the element will “normalize the hypothalamus and pituitary gland in the brain.”

The site explains that “As the conductor of various studies on indium, Dr. Schroeder (the scientist best known for inventing the means to take lead out of gasoline) found that possibly the most important function of Indium is to normalize the hypothalamus and pituitary gland in the brain.”

Well, the late Henry A. Schroeder of Dartmouth Medical School, a leading toxicologist, spent years highlighting the problems of lead toxicity, but did not, as far as I know and despite the wording of the quote above, develop a method for removing lead from gasoline. Why would you need to do that? The petrochemical companies used to add tetraethyl lead as an antiknocking agent, so the simplest method for its “removal” is just not to add it in the first place.

Tetraethyllead was first added to gasoline in 1923 and it quickly became obvious that workers at the three manufacturing plants were becoming psychotic and dying from its toxic effects. The issue was essentially hushed up and “research” between 1926 and 1965 claimed a consensus that lead was only a problem at high exposure levels and atmospheric lead from vehicle exhausts was not a problem at all. We now know different, thanks partly to the efforts of Schroeder.

Meanwhile, back to indium. Elemental discoveries are a boon to any marketeer, especially if you can convince consumers to buy, buy, buy. A document entitled “Patented Indium Trace Element in Marketing Form Available for License” suggests how this can be so:

There are 3 questions that a company should ask when considering the addition of a new product for its product line:

  • What is it that I can sell abundantly, at a high profit and worldwide, exclusively?
  • Why will my customers want to continue to use it, daily, for the rest of their lives?
  • How will my customers be able to afford to use it daily, all of their lives, continuously?

It then tells us how these pertain to indium (In, element 49) and how a clever marketer might exploit the patent on the health benefits of indium.

So, where do the supposed health benefits of indium come from? Or is it just a marketing scam? And what about those claims to affect the activity of glands in the brain? Well, indium is an element in the same group of the periodic table as boron, aluminium, and gallium, oh and thallium, so one would not expect it to be particularly beneficial or even essential to health. Indeed, aluminium is a neurotoxin.

However, Schroeder, towards the end of his life, wheelchair-bound with muscular dystrophy, included indium in some of his last few experiments. He apparently demonstrated that lab animals on lifetime indium had fewer cancers than controls. Other than references to the use of indium in imaging agents, I can find nothing in the medical literature regarding the positive health benefits of daily supplementation with indium, not then, not now.

Yet, the web is littered with so-called health food sites selling indium sulfate to unwitting consumers, presumably exploiting that marketing guidance I found on at least one site. However, there is one site to which I shall refer you, and that is the WebElements site from Sheffield University’s Mark Winter. The entry for indium explains that, depending on dose:

All indium compounds should be regarded as highly toxic. Indium compounds damage the heart, kidney and liver, and may be teratogenic

So, who do you trust most: a health website hoping to get repeat sales based on your fears of poor health as you get older, or a well-respected site from a leading research team at a top university? I’m pretty sure I don’t want to be ingesting an indium compound daily for the rest of my life in the vague hope that it represents some undiscovered panacea; that would just be bad medicine.

This item originally published July 25, 2005, was overhauled and updated at the request of a Sciencebase reader on July 2, 2008.

A post from David Bradley Science Writer

Health Benefits of Indium

What does DNA mean to you? #12 [Eye on DNA]

Posted: 02 Jul 2008 03:01 AM CDT

Sandra Porter of Discovering Biology in a Digital World shares what DNA means to her:

DNA means opportunity and adventure. Opportunity, in that my livelihood is completely tied up in DNA. I teach about it. I work with DNA sequences. I enjoy playing with DNA structures. And of course, our company (Geospiza) sells software for managing the production and analysis of DNA data. Opportunity is also the key term because of the diseases that we’ll someday be able to prevent and treat because of the things we’ll be able to learn from DNA. As far as adventure, “adventure” applies to DNA because getting from here to there is certainly an adventure. Along the way, we’ll find things we want to know and things that we’d prefer not to know, but the adventure of discovery and the process of finding out who we are and where we’ve been is most certainly an adventure.

Is it heritable? Twin study evidence suggests acne can be inherited [DNA and You]

Posted: 02 Jul 2008 02:08 AM CDT

Have you ever wondered whether acne risk can be inherited?  Twin studies can shed light on this question.

In a previous post, I discussed twin studies, which are the genetic gold standard to determine whether the likelihood of developing a given medical condition (or trait) is subject to heritable (i.e., genetic) influences.  The concept is fairly simple: Identical twins share essentially 100% of their genetic material while fraternal twins share 50%.  If genetics plays a strong role in risk for a disease, when twin pairs in which one member has the disease are studied, both twins will have the same disease in a larger percentage of the identical twin pairs as compared to the fraternal twin pairs.

One study applied this concept to the study of acne.  The results were clear: in this study, approximately 80% of the degree to which acne severity varied was due to genetic effects. 
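The twin-study logic above is often quantified with Falconer's formula, which estimates heritability as twice the difference between the trait correlations in identical and fraternal twin pairs. The correlation values in this sketch are hypothetical, chosen only to illustrate how an estimate of roughly 80% could arise; they are not the figures from the study discussed.

```python
# Falconer's formula: h2 = 2 * (r_mz - r_dz), where r_mz and r_dz are
# the trait correlations in identical (monozygotic) and fraternal
# (dizygotic) twin pairs. Values below are hypothetical illustrations.

def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Estimate heritability from twin-pair trait correlations."""
    return 2 * (r_mz - r_dz)

# Hypothetical correlations that would yield ~80% heritability:
print(round(falconer_h2(r_mz=0.80, r_dz=0.40), 2))
```

The intuition: if identical twins resemble each other much more than fraternal twins do, the extra resemblance must come from the extra 50% of shared genes.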

Although the underlying causative genes are not known, the fact that genetics plays such a large role in the development of acne suggests that future efforts to find the underlying genes are likely to reveal new drug targets.

Comparative whole-genome sequencing of yeast [Next Generation Sequencing]

Posted: 02 Jul 2008 01:51 AM CDT

In a new paper out in PNAS, Michael Lynch and colleagues report their findings from a comparative whole-genome pyrosequencing of yeast genomes. A total of four haploid genomes were sequenced in the study using the 454 platform. Surprisingly, the researchers found that the number of large scale genome events (insertions, deletions, duplications etc) was comparable [...]

Sanger Institute sequences its trillionth DNA base [Genetic Future]

Posted: 02 Jul 2008 12:48 AM CDT

The Wellcome Trust Sanger Institute, a major genetics research institute set in idyllic countryside a few miles south of Cambridge, UK, has reached a symbolic milestone: the sequencing of over one trillion DNA letters in just six months.

That's the equivalent of almost 350 entire human genomes. At the current rate (which is rapidly increasing) the Sanger is churning out more DNA sequence every two minutes than was generated by the entire research community from 1982-1987. This obscene rate of data generation has been enabled by the development of next-generation DNA sequencing platforms, which can each churn out one human genome equivalent in less than a week.
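As a rough sanity check on the figures above, here is a back-of-envelope sketch. The assumed haploid genome size (~3.1 Gb) and six-month window (~182 days) are my assumptions, not figures from the post.

```python
# Back-of-envelope check of the throughput claims above.
# Assumptions (mine, not the article's): haploid human genome ~3.1 Gb,
# "six months" ~182 days.
TOTAL_BASES = 1e12       # one trillion bases
GENOME_SIZE = 3.1e9      # assumed bases per haploid human genome
DAYS = 182               # assumed length of the six-month window

genome_equivalents = TOTAL_BASES / GENOME_SIZE
two_minute_windows = DAYS * 24 * 60 / 2
bases_per_two_minutes = TOTAL_BASES / two_minute_windows

print(f"~{genome_equivalents:.0f} human genome equivalents")
print(f"~{bases_per_two_minutes / 1e6:.1f} Mb per two-minute window")
```

With these assumptions the total works out to roughly 320-350 genome equivalents (depending on the genome size used) and several megabases of sequence every two minutes, consistent with the scale described.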

About a third of the trillion bases was generated for the 1000 Genomes Project, a massive international collaboration that will ultimately sequence the entire genomes of over 1,000 individuals drawn from multiple human populations. I commented last month on the fact that this project has become an important proving ground for the three major next-gen sequencing platforms on the market (454, Solexa and SOLiD), each of which is keen to position itself early as the technology of choice as the era of medical genomics draws ever closer.

From the data posted on the Institute's web site, the Sanger actually reached the milestone a few weeks ago - the graph below shows the cumulative amount of purity-filtered (high-quality) sequence data generated since August 2007, with the curve crossing the magic "1000 Gb" mark in early June.



It's worth noting that there are also several other large sequencing facilities around the world (such as the Broad Institute) that are churning out roughly comparable amounts of sequencing data.

The brief window of time in which celebrity genomics was enough to get you into Nature is already closing; it's time for population genomics.

Update: Wired has a concise and timely review of progress in sequencing technology.


Subscribe to Genetic Future.

High Throughput Sequencing, Moore's Law, The Next Generation, and Storage Options [DNA and You]

Posted: 02 Jul 2008 12:44 AM CDT

It will be interesting to see if DNA sequencing technology continues to follow a Moore's law-like trajectory over the coming decades.  Clearly, next generation technologies are going to have a massive impact both in the research setting and on medical resequencing. 

Alexis Madrigal at Wired Science wrote a nice piece on the technological advances leading to the current generation of high-throughput sequencers and also the next generation: Pacific Biosciences and Helicos.

Also notable is a post from the always excellent Daniel MacArthur at Genetic Future on options for storage of personal genome sequences.

New oral angiogenesis inhibitor offers potential nontoxic therapy for a wide range of cancers [Think Gene]

Posted: 01 Jul 2008 11:57 PM CDT

An Old Drug, Transformed by Nanotechnology

The first oral, broad-spectrum angiogenesis inhibitor, specially formulated through nanotechnology, shows promising anticancer results in mice, report researchers from Children’s Hospital Boston. Findings were published online on June 29 by the journal Nature Biotechnology.

Because it is nontoxic and can be taken orally, the drug, called Lodamin, may be useful as a preventive therapy for patients at high risk for cancer or as a chronic maintenance therapy for a variety of cancers, preventing tumors from forming or recurring by blocking the growth of blood vessels to feed them. Lodamin may also be useful in other diseases that involve aberrant blood-vessel growth, such as age-related macular degeneration and arthritis.

Developed by Ofra Benny, PhD, in the Children’s laboratory of the late Judah Folkman, MD, Lodamin is a novel slow-release reformulation of TNP-470, a drug developed nearly two decades ago by Donald Ingber, MD, PhD, then a fellow in Folkman’s lab, and one of the first angiogenesis inhibitors to undergo clinical testing. In clinical trials, TNP-470 suppressed a surprisingly wide range of cancers, including metastatic cancers, and produced a few complete remissions. Trials were suspended in the 1990s because of neurologic side effects that occasionally occurred at high doses, but it remains one of the broadest-spectrum angiogenesis inhibitors known.

Lodamin appears to retain TNP-470’s potency and broad spectrum of activity, but with no detectable neurotoxicity and greatly enhanced oral availability. While a number of angiogenesis inhibitors, such as Avastin, are now commercially available, most target only single angiogenic factors, such as VEGF, and they are approved only for a small number of specific cancers. In contrast, Lodamin prevented capillary growth in response to every angiogenic stimulus tested. Moreover, in mouse models, Lodamin reduced liver metastases, a fatal complication of many cancers for which there is no good treatment.

“The success of TNP-470 in Phase I and II clinical trials opened up anti-angiogenesis as an entirely new modality of cancer therapy, along with conventional chemotherapy, radiotherapy and surgical approaches,” says Ingber, now co-interim director of the Vascular Biology Program at Children’s.

TNP-470 was first reformulated several years ago by Ronit Satchi-Fainaro, PhD, a postdoctoral fellow in Folkman’s lab, who attached a large polymer to prevent it from crossing the blood-brain barrier (Cancer Cell, March 2005). That formulation, Caplostatin, has no neurotoxicity and is being developed for clinical trials. However, it must be given intravenously.

Benny took another approach, attaching two short polymers (PEG and PLA) to TNP-470. Experimenting with polymers of different lengths, she found a combination that formed stable, “pom-pom”-shaped nanoparticles known as polymeric micelles, with TNP-470 at the core. The polymers (both FDA-approved and widely used commercially) protect TNP-470 from the stomach’s acidic environment, allowing it to be absorbed intact when taken orally. The micelles reach the tumor, react with water and break down, slowly releasing the drug.

Tested in mice, Lodamin had a significantly increased half-life, selectively accumulated in tumor tissue, blocked angiogenesis, and significantly inhibited primary tumor growth in mouse models of melanoma and lung cancer, with no apparent side effects when used at effective doses. Subsequent tests suggest that Lodamin retains TNP-470’s unusually broad spectrum of activity. “I had never expected such a strong effect on these aggressive tumor models,” Benny says.

Notably, Lodamin accumulated in the liver without causing toxicity, preventing liver metastases and prolonging survival. “This was one of the most surprising things I saw,” says Benny. “When I looked at the livers of the mice, the treated group was almost clean. In the control group you couldn’t recognize the livers — they were a mass of tumors.”

TNP-470 itself has an interesting history. It was derived from fumagillin, a compound with strong anti-angiogenic effects produced by a mold that Ingber discovered accidentally while culturing endothelial cells (the cells that line blood vessels). Ingber noticed that in certain dishes — those contaminated with the mold — the cells changed their shape by rounding, a behavior that inhibits capillary cell growth. Ingber cultured the fungus, disregarding lab policy, which called for contaminated cultures to be discarded immediately. He and Folkman later developed TNP-470, a synthetic analog of fumagillin, with the help of Takeda Chemical Industries in Japan (Nature, December 1990). It has shown activity against dozens of tumor types, though its mechanism of action is only partly known.

“It’s been an evolution,” says Benny, “from fumagillin to TNP-470 to Caplostatin to Lodamin.”

Source: Children’s Hospital Boston

An orally delivered small-molecule formulation with antiangiogenic and anticancer activity. Ofra Benny, Ofer Fainaru, Avner Adini, Flavia Cassiola, Lauren Bazinet, Irit Adini, Elke Pravda, Yaakov Nahmias, Samir Koirala, Gabriel Corfas, Robert J D’Amato & Judah Folkman. Nature Biotechnology. Published online: 29 June 2008; | doi:10.1038/nbt1415

Josh:

This is great. I wonder why the original drug had neurological side effects but this one does not. Is it the lower dosage? I’m surprised that this works as well as it does. Do any doctors on here know whether these drugs targeting angiogenesis affect healing and normal blood vessel growth, and if so, to what extent?

New technique produces genetically identical stem cells [Think Gene]

Posted: 01 Jul 2008 11:43 PM CDT

Adult cells of mice created from genetically reprogrammed cells—so-called induced pluripotent stem (IPS) cells—can be triggered with a drug to enter an embryonic-stem-cell-like state, without the need for further genetic alteration.

The discovery, which promises to bring new efficiencies to embryonic stem cell research, is reported in the July 1, 2008, online issue of Nature Biotechnology.

“This technical advancement will allow thousands of identical reprogrammed cells to be used in experiments,” says Marius Wernig, one of the paper’s two lead authors and a postdoctoral researcher in Whitehead Member Rudolf Jaenisch’s lab.

“Using these cells could help define the milestones of how cells are reprogrammed and screen for drug-like molecules that replace the potentially cancer-causing viruses used for reprogramming,” adds Christopher Lengner, the other lead author and also a postdoctoral researcher in the Jaenisch lab.

In the current work, Wernig and Lengner made mice created in part from the embryonic-stem-cell-like cells known as IPS cells. The IPS cells were created by reprogramming adult skin cells using lentiviruses to randomly insert four genes (Oct4, Sox2, c-Myc and Klf4) into the cells’ DNA. The IPS cells also were modified to switch on these four genes when a drug trigger, doxycycline, is added to the cells.

Wernig and Lengner then took cells from each IPS mouse and introduced the doxycycline trigger, thereby changing the adult mouse cells into IPS cells.

While earlier reprogramming experiments have typically induced pluripotency in adult skin cells, Wernig and Lengner were able to employ this novel method to successfully reprogram multiple cell and tissue types, including cells of the intestine, brain, muscle, kidney, adrenal gland, and bone marrow. Importantly, the technique allows researchers to create large numbers of genetically identical IPS cells, because all cells in the mouse contain the same number of viral integrations in the same location within the genome. With previous approaches, each reprogrammed cell differed because the viruses used to insert the reprogramming genes could integrate anywhere in the cell’s DNA with varying frequency.

Wernig and Lengner’s method also increases the reprogramming efficiency from one in a thousand cells to one in twenty.
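Those two efficiency figures from the press release imply a fifty-fold improvement, a quick sketch:

```python
# Fold improvement implied by going from ~1 in 1,000 cells reprogrammed
# to ~1 in 20 (efficiency figures quoted in the press release).
old_efficiency = 1 / 1000
new_efficiency = 1 / 20

fold_change = new_efficiency / old_efficiency
print(f"{fold_change:.0f}x improvement")  # 50x
```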

The large numbers of IPS cells that can be created by this method can aid experiments requiring millions of identical cells for reprogramming, such as large-scale chemical library screening assays.

“In experiments, the technique will eliminate many of the reprogramming process’s unpredictable variables and simplify enormously the research on the reprogramming mechanism and the screening for virus replacements,” says Jaenisch, who is also a professor of biology at Massachusetts Institute of Technology.

Source: Whitehead Institute for Biomedical Research

A drug-inducible transgenic system for direct reprogramming of multiple somatic cell types. Marius Wernig, Christopher J Lengner, Jacob Hanna, Michael A Lodato, Eveline Steine, Ruth Foreman, Judith Staerk, Styliani Markoulaki & Rudolf Jaenisch. Nature Biotechnology. Published online: 01 July 2008; | doi:10.1038/nbt1483

Josh:

I made mention of the techniques used in the earlier paper in an April Fools joke this year. When reading this press release, I first thought it must be incorrect when it says the new technique “replace[s] the potentially cancer-causing viruses used for reprogramming”, since lentiviruses are still used to introduce the vector. However, upon reading the paper, I found the authors make the same claim! Perhaps I’m just missing something (such as the lentiviral DNA not integrating into the host genome) or am misreading, but this seems very misleading. They say a virus is used to introduce the genes, and then say that the technique eliminates the need to use a virus. What the technique does ensure, however, is that the cells have the same number of integrations and are identical to one another, which would help in these types of experiments.

A mammalian clock protein responds directly to light [Think Gene]

Posted: 01 Jul 2008 11:17 PM CDT

We all know that light affects the growth and development of plants, but what effect does light have on humans and animals? A new paper by Nathalie Hoang et al., published in PLoS Biology this week, explores this question by examining cryptochromes in flies, mice, and humans. In plants, cryptochromes are photoreceptor proteins that absorb and process blue light for functions such as growth, seedling development, and leaf and stem expansion. Cryptochromes are present in humans and animals as well and have been shown to regulate the mechanisms of the circadian clock. But how they work in humans and animals is still somewhat of a mystery.

When plants are exposed to blue light, the flavin pigments in their cryptochromes undergo photoreduction. This reduction activates the cryptochromes and thus allows for growth and seedling development. Hoang et al. sought to study the effect of blue light on fly, mouse, and human cryptochromes by exposing them to blue light and measuring the change in the amount of oxidized flavin. After prolonged exposure to blue light, the authors found that the amount of oxidized flavin did in fact decrease, as it does in plants.

While this research reveals a similarity in the responses of flies, mice, humans, and plants to blue light, the decrease in flavins affects circadian rhythms differently. The mouse cryptochromes, Mcry1 and Mcry2, interact with key parts of the circadian clock: mice with these cryptochromes missing exhibited a complete loss in circadian rhythm behaviors such as wheel-running. However, this change in behavior was independent of light exposure.

Although this paper by Hoang et al. shows that cryptochromes in animals and humans do respond to light in a fashion similar to those in plants, the question of how exactly light affects them is still open for further research. Although cryptochromes are mainly found in the retina of the eye, they are also present in many other tissues of the body that are close to the surface. This suggests that cryptochromes may have non-visual functions, and may also affect protein levels and behavior.

Source: Public Library of Science

Hoang N, Schleicher E, Kacprzak S, Bouly JP, Picot M, et al. (2008) Human and Drosophila cryptochromes are light activated by flavin photoreduction in living cells. PLoS Biol 6(7): e160. doi:10.1371/journal.pbio.0060160

Danny Hillis gets science, George Dyson doesn’t [business|bytes|genes|molecules]

Posted: 01 Jul 2008 10:20 PM CDT

I don’t know why Chris Anderson’s Wired article has got me so bugged. Things rarely do. Perhaps it’s because of the name behind the article. Perhaps it’s the fact that it’s been picked up by pretty much everyone, from Hacker News to bloggers to Edge. It’s also amazing to see the responses (in blog posts and on Edge) by Kevin Kelly and George Dyson, especially the latter. For someone who is a science historian, his statement at the end of the Edge version of the Anderson article is even worse than the Anderson article itself. If someone could really explain what he’s trying to say, it would be much appreciated, because I can’t make any sense of it. The following is an excellent example of what I am talking about.

It may be that our true destiny as a species is to build an intelligence that proves highly successful, whether we understand how it works or not.

“It may be our destiny” is NOT science, and neither is setting out to “build an intelligence” we don’t understand. What science is, is best described by Danny Hillis in a quote at the end of the same Edge piece, which I reproduce in its entirety here. The quote summarizes my own thinking completely.

Chris Anderson says that “this approach to science — hypothesize, model, test — is becoming obsolete”. No doubt the statement is intended to be provocative, but I do not see even a little bit of truth in it. I share his enthusiasm for the possibilities created by petabyte datasets and parallel computing, but I do not see why large amounts of data will undermine the scientific method. We will begin, as always, by looking for simple patterns in what we have observed and use that to hypothesize what is true elsewhere. Where our extrapolations work, we will believe in them, and when they do not, we will make new models and test their consequences. We will extrapolate from the data first and then establish a context later. This is the way science has worked for hundreds of years.

W. Daniel Hillis

There is a beauty to science, a structure, a certain ethos. Too many people seem to miss it completely, even those who should know better.

And unless someone comes up with another idiotic post about science, this is the last I am writing about this.


