PCR Better than Pap Test for Preventing Cervical Cancer

“Power of PCR” as a Transformative Diagnostic Method

  • FDA Approves Roche PCR Test for Cervical Cancer Screening
  • Automated Test Replaces Pap Test as First-line Cervical Cancer Screening
  • Demonstrates the “Power of PCR” as a Transformative Diagnostic Method

Pap Test

The Papanicolaou test—aka Pap test, Pap smear, cervical smear, or smear test—is a method of cervical screening used to detect potentially pre-cancerous and cancerous processes in the endocervical canal of the female reproductive system. Unusual findings are often followed up by more sensitive diagnostic procedures, and, if warranted, interventions that aim to prevent progression to cervical cancer.

Proper interpretation of microscopic results requires a “trained eye.” This is evident from the representative example shown below, which I found in an online educational textbook; quite frankly, I had trouble discerning the visual keys described in the verbatim caption. Notwithstanding this issue, Pap tests were—until now—the accepted “gold standard.”

Taken from homepage.smc.edu via Bing Images.

The source document for this Pap smear reads as follows. “The cytologic features of normal squamous epithelial cells can be seen at the center top and bottom, with orange to pale blue plate-like squamous cells that have small pyknotic nuclei. The dysplastic cells in the center extending to upper right are smaller overall with darker, more irregular nuclei.”

The eponymous test was pioneered by Georgios Papanicolaou, a prominent Greek physician, who in 1928 was the first to report that uterine cancer could be diagnosed by means of a vaginal smear. However, the importance of this work was not widely recognized until his 1943 publication of Diagnosis of Uterine Cancer by the Vaginal Smear, coauthored with Herbert F. Traut, both at Cornell University Medical College.

Georgios Papanicolaou moved to Miami, Florida in 1961 to establish the Papanicolaou Cancer Research Institute at the University of Miami, but died in 1962 prior to its opening. Papanicolaou was the recipient of the Albert Lasker Award for Clinical Medical Research in 1950—the Lasker Awards are sometimes referred to as “America’s Nobels,” as eighty-six Lasker laureates have received the Nobel Prize. Papanicolaou’s portrait appeared on the Greek 10,000-drachma banknote of 1995-2001, prior to its replacement by the Euro.

Cervical Cancer Statistics

Cervical cancer is the second most common cancer in women worldwide, according to an NIH publication in 2007. Country-by-country data for cervical cancer reveal a striking geographical distribution. According to currently available U.S. Centers for Disease Control and Prevention (CDC) FastStats, cervical cancer mortality in the U.S. in 2010 was ~4,000 deaths, or ~2.5 per 100,000 females.

The global statistics provided by Cancer Research U.K. are far more saddening. Worldwide, there were ~275,000 deaths from cervical cancer in 2010, accounting for ~10% of female cancer deaths.

Remarkably, mortality rates are reported to vary seventeen-fold between different regions of the world. When years-of-life-lost (YLL) by young and middle-aged women (25-64 years old) are estimated for different regions of the world, cervical cancer emerges as the single most important cause of cancer-related YLL in Latin America, the Caribbean, and populous regions of Sub-Saharan Africa and South-Central Asia. The overall picture is not very sensitive to the age-weighting function used. The report notes that, since this loss of life is preventable using existing technologies, more health-resource allocation in low-income settings is needed.
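For readers unfamiliar with the YLL metric, here is a minimal sketch of the standard (unweighted) calculation; the example numbers are invented for illustration and are not taken from the report cited above.

```python
# Minimal sketch of years-of-life-lost (YLL): sum the remaining life
# expectancy over all deaths (unweighted; the report also considers
# age-weighting). The example numbers below are invented, not from the report.
def yll(deaths_by_age, life_expectancy=80):
    return sum(n * max(life_expectancy - age, 0) for age, n in deaths_by_age.items())

# e.g., 100 deaths at age 40 plus 50 deaths at age 60:
print(yll({40: 100, 60: 50}))  # 100*40 + 50*20 = 5,000 years of life lost
```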

Pap Test Statistics

Currently available CDC FastStats for Pap test use in the U.S. in 2010 (the most recent year available) are as follows:

  • Percent of women 18 years of age and over who had a Pap test within the past 3 years: 73.2%
  • Number of physician office visits during which Pap tests were ordered or provided: 29.4 million
  • Number of hospital outpatient department visits during which Pap tests were ordered or provided: 2.4 million

Pap Test Recommendations as of 2013

To put today’s blog-post headline about switching from Pap to PCR in perspective, here are snippets from the most recent CDC guidelines and comments made available in a January 2013 press release headlined with “more women getting Pap tests as recommended [but] some women get Pap tests without need.”

  • In 2012, the U.S. Preventive Services Task Force, American College of Obstetricians and Gynecologists, and American Cancer Society recommended that women begin Pap test screening at age 21 and repeat it every three years.
  • The same groups agree that screening is unnecessary for most women who have had a total hysterectomy (removal of the uterus and uterine cervix) for non-cancerous reasons, or for women aged 65 years and older with several years of normal test results.
  • Studies that analyzed Pap test survey data from CDC’s Behavioral Risk Factor Surveillance System found the following:
    • The percentage of women aged 18-21 years who reported never being screened increased from 23.6% in 2000 to 47.5% in 2010; however, screening is not recommended for women under the age of 21.
    • In 2010, 58.7% of women aged 30 years and older who had a hysterectomy were still given a Pap test.
    • Because of the Affordable Care Act (aka Obamacare), many private health plans and Medicare now cover certain preventive services, including cervical cancer screening, with no copays or other out-of-pocket costs.

HPV: The Cervical Cancer-Causing Agent and Key to Early Detection

In a landmark publication in 1999 entitled Human papillomavirus is a necessary cause of invasive cervical cancer worldwide, Dutch investigators used PCR data to establish that the worldwide HPV prevalence in cervical carcinomas is 99.7 per cent. They noted that “the presence of HPV in virtually all cervical cancers implies the highest worldwide attributable fraction so far reported for a specific cause of any major human cancer.” More importantly, they presciently concluded that “the extreme rarity of HPV-negative cancers reinforces the rationale for HPV testing in addition to, or even instead of, cervical cytology in routine cervical screening.”

Due in part to technical challenges posed by the numerous genotypes of HPV with varying cancer causality (detailed elsewhere), and to the unavoidably time-consuming clinical studies required for FDA approval, it has taken ~15 years for a PCR test to reach the point of displacing the Pap test as the primary diagnostic approach for early detection of cervical cancer.

Those of you who are interested in the technical underpinnings of Roche’s investigations to this end are referred to this 2013 publication by Roche and collaborators entitled Development and characterization of the cobas human papillomavirus test. In contrast to the tedious Pap test protocol and its “visually challenging” manual microscopic analysis, this “cobas”-based PCR test provided by Roche is fully automated. The test process involves two instruments: one that completes sample preparation (COBAS® AmpliPrep) and another that performs the PCR process and detection of the pathogen DNA in real time (COBAS® TaqMan® Analyzer).
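For those curious about what “detection of the pathogen DNA in real time” boils down to computationally, here is a generic sketch of how a TaqMan-style readout is reduced to a detected/not-detected call from the threshold-crossing cycle (Ct). This is illustrative logic only (not Roche’s cobas software), and the threshold and fluorescence curve below are invented.

```python
# Generic real-time PCR interpretation (illustrative only, not Roche's cobas
# software): the cycle at which fluorescence first crosses a set threshold
# (the Ct value) indicates whether target HPV DNA was amplified.
def ct_value(fluorescence, threshold):
    """Return the first cycle whose signal meets the threshold, else None."""
    for cycle, signal in enumerate(fluorescence, start=1):
        if signal >= threshold:
            return cycle
    return None

# Invented 40-cycle curve: flat baseline followed by an exponential rise.
curve = [1.0] * 24 + [2 ** (i / 2) for i in range(16)]
ct = ct_value(curve, threshold=10.0)
print("HPV DNA detected" if ct is not None else "Not detected", "Ct =", ct)
```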

Incidentally, I traced back the term “cobas” to late-1970s Roche instrumentation named the “cobas-bio” analyzer, but could not decipher what “cobas” stands for! If any of you know the answer, please let us know by a comment at the end of this post.

FDA Panel Recommends Replacement for the Pap Test

This attention-grabbing headline of a March 2014 NY Times article by Andrew Pollack was the catalyst for my decision to research and write this blog exemplifying the “power of PCR” as a transformative diagnostic method. While this and numerous other popular news stories all made reference to an FDA panel’s report, it took some digging to find the actual source report, which is an 80-page PDF that can be accessed here to peruse in detail, if you wish. However, a much shorter but essential-fact-laden article by Joyce Frieden, News Editor of MedPage Today, provided the following excerpts.

The FDA’s Medical Devices Advisory Committee Microbiology Panel agreed by a vote of 13-0 in each of three successive votes that the cobas® viral DNA test for HPV—made by Roche Molecular Systems—was safe and effective for cervical cancer screening, and that the benefits of the tests outweighed the risks. The Panel recommended that this Roche HPV test replace the Pap smear as the first-line standard of care for cancer screening.

The Roche test is seen as better than Pap tests in finding precancerous lesions (taken from the NY Times).

The cobas® test currently has approval as a follow-up assessment for women 21 and older who have abnormal Pap tests, and as a co-test with the Pap smear to screen for the high-risk HPV 16 and 18 strains in women 30 to 65. The test comprises genotyping for HPV16 and 18 and pooled assessment of 12 additional high-risk HPV strains.

According to the proposal submitted by Roche, women 25 and older who test positive for HPV16 or 18 would proceed directly to colposcopy for further assessment.

Patients who test negative for HPV16 or 18 but positive for the other high-risk strains would have a Pap test to determine the need for colposcopy. Women who have a completely negative test would be followed at their physician’s discretion.
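Pulling the proposed triage together, here is a minimal sketch of the decision logic described in the preceding paragraphs for women 25 and older; the function name and the wording of the outcomes are mine, for illustration only.

```python
# Sketch of the first-line screening triage proposed by Roche, as described
# above (function name and outcome wording are mine, for illustration only).
def triage(hpv16_or_18_positive: bool, other_high_risk_positive: bool) -> str:
    """Next step for a woman 25 or older screened with the HPV test."""
    if hpv16_or_18_positive:
        return "refer directly to colposcopy"
    if other_high_risk_positive:
        return "reflex Pap test to determine need for colposcopy"
    return "routine follow-up at the physician's discretion"

print(triage(True, False))   # refer directly to colposcopy
print(triage(False, True))   # reflex Pap test to determine need for colposcopy
print(triage(False, False))  # routine follow-up at the physician's discretion
```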

Panelists did express some concerns about dropping the age at which women should have the test from 30 to 25. The ATHENA study of over 47,000 patients with long-term follow-up, which was used as the basis for the application, found that about 11% of women ages 25 to 29 tested positive for HPV16 or 18 with the cobas test, compared with 7.28% among women 25 to 29 who had cytology alone as their first-line screening. Panel member Paula Hillard, MD, of Stanford University in California, was quoted as saying that would mean more patients in that age group “will be anxious about potentially having cancer.”

In addition, Hillard is quoted as expressing concern about off-label use. “I’m concerned that all those women potentially with other high-risk positivity won’t go to Paps next but go [straight] to colposcopy. That’s not what’s proposed here, but what control does FDA have once it’s out there?”

Panelist Kenneth Noller, MD, of the American Board of Obstetrics and Gynecology, in Dallas, agreed that real-world use could differ from the protocol proposed by Roche. He’s quoted as saying that “I’ve been watching how people practice; if you’re high-risk HPV positive you’re going to get colposcopy.” Furthermore, he said “that doesn’t necessarily mean it’s bad—it’s what you do with the colposcopy.”

Noller added that although he was “somewhat biased against dropping the age to 25 before I came here … I find the data presented today somewhat compelling to drop it to 25.”

Agreeing with this was panel member Kimberly Hanson, MD, MHS, of the University of Utah and ARUP Laboratories, both in Salt Lake City: “now we have the opportunity to identify women earlier, and to me that’s compelling,” adding that “although colposcopy is invasive and can be anxiety-provoking, it’s really very safe, so I think I’m leaning toward earlier screening.”

According to the summary submitted by FDA staff members, “The data show that the proposed primary screening indication for the cobas HPV test detects more women with disease and requires fewer women without disease to go to colposcopy than cytology alone.”

Benefit-risk analyses favored the HPV DNA test whether expressed in terms of number of cases of high-grade cervical disease per 10,000 women screened or per 100 colposcopy procedures.

The FDA is not bound to follow its advisory committees’ recommendations, but does so in most cases. On April 25—coincidentally DNA Day 2014—the FDA formally approved Roche’s HPV test as a first-line cervical cancer screening method.

The “Entrenchment Factor”

At the risk of “throwing cold water” on the aforementioned PCR test benefits, I feel compelled to quote from Pollack’s NY Times story that ended with the following caveat.

“The Pap test, which is well entrenched and has been highly successful, will not go away quickly, if at all, however.

Assuming the FDA itself agrees with its advisory committee and approves the new use of Roche’s test, it would become just another option, not a replacement for the older testing regimens. And many doctors will not adopt the new test unless professional societies recommend it in guidelines, which could take years.”

Let’s all hope that these professional societies—and any other persuasive factors—lead to relatively rapid adoption by doctors.

As always, your comments are welcomed.

Venter’s Latest Venture: Increasing Human Longevity

  • The Quest to Make 100 the New 60
  • Aiming to Sequence 100,000 Human Genomes Per Year (Wow!)
  • Adding Genomes, Microbiomes, and Metabolomes to Health Records May Lead to Better Health and Longevity

Prologue

Although this post is mainly about a new start-up called Human Longevity, whose mission is to apply genomics to guide increased longevity, the fact that this company was founded by J. Craig Venter has certainly created a “buzz.” The very name Venter—to me—is synonymous with scientifically unorthodox ideas that are big and bold. If you’re familiar with Venter’s accomplishments and “genomics rock star” status, go to the next section; if not, here are some highlights of his rise to fame.

As an investigator at NIH, Venter gained notoriety when he caused a brouhaha with his intentions to patent genes he discovered using expressed sequence tags (ESTs). The controversy was so extensive that it precipitated the 1992 resignation of Nobel Laureate James D. Watson, who was then Director of NIH’s Human Genome Office. This caught the eye of a venture capitalist, Wallace Steinberg, who wanted to start a gene-finding company—with Venter as its head. Venter, however, insisted on a nonprofit venture, so Steinberg set him up in a nonprofit entity called The Institute for Genomic Research (TIGR), supported by a new company, Human Genome Sciences. (A NY Times story by Nicholas Wade entitled A Maverick Making Waves provides a nice overview of Venter’s career path through 2000.)

Venter generated thousands of ESTs from the human genome that became the intellectual property of Human Genome Sciences and enabled the company to develop far-reaching claims to many medical genes of interest. In partnership with Nobel Laureate Hamilton O. Smith, Venter next used shotgun sequencing—then unproven and controversial—to completely sequence Haemophilus influenzae. This gave scientists their first glimpse into the set of genes necessary for life. Moreover, this achievement set off a revolution in medical microbiology, inspiring efforts to decode every major pathogen and learn the microbes’ entire playbook for attacking human cells.

These early sequencing successes in turn led Michael W. Hunkapiller—then President of PE Biosystems, which made the leading brand of DNA sequencing instrument—to recruit Venter to run Celera, a new private company. Venter boldly declared to the media that Celera would decode the human genome using shotgun sequencing by 2001, ahead of the public consortium, sparking a contentious “race for the human genome.”

Fast forwarding from his role in decoding the human genome—described as the single most important scientific breakthrough of modern times—Venter is Founder, Chairman, and CEO of the J. Craig Venter Institute (JCVI), a non-profit research organization with approximately 300 scientists and staff dedicated to human, microbial, plant, synthetic and environmental genomic research, and the exploration of social and ethical issues in genomics.

Venter is also Founder and CEO of Synthetic Genomics Inc. (SGI), a privately held company dedicated to commercializing genomic-driven solutions to address global needs such as new sources of energy, new food and nutritional products, and next generation vaccines. Recently Venter announced his latest venture, Human Longevity Inc. (HLI), “a genomics and cell therapy-based diagnostic and therapeutic company focused on extending the healthy, high performance human life span.”

Venter relaxing on his 95-foot sailboat/research vessel named Sorcerer II. Photograph: Rick Friedman/Corbis (taken from theguardian.com via Bing Images).

Since 2003, scientists at the J. Craig Venter Institute have been on a quest to unlock the secrets of the oceans by sampling, shotgun sequencing, and analyzing the DNA of the microorganisms living in these waters. In February 2014, the vessel embarked on a sampling expedition of the Amazon River and its tributaries, which account for about one-fifth of the Earth’s river flow.

Human Longevity: Genomics-Based Fountain of Youth?

The Fountain of Youth is a spring that supposedly restores the youth of anyone who drinks or bathes in its waters. Tales of such a fountain have been recounted across the world for thousands of years, beginning with writings by the Greek historian Herodotus. The tale was particularly prominent in the 16th century, when it became attached to the Spanish explorer Juan Ponce de León, who was searching for the Fountain of Youth when, in 1513, he traveled to what is now Florida.

Artistic rendering of Ponce de León accepting water from the Fountain of Youth (taken from tabuherbalsmoke.com via Bing Images).

Given this legendary history and our collective wish for healthy, long lives, it’s not surprising that Venter’s announcement earlier this year attracted widespread media attention and significant funding—to the tune of $70 million. A story in the NY Times quotes Venter as saying that the largest of the investors is K. T. Lim, a Malaysian billionaire who runs Genting Berhad, a gambling conglomerate. Venter adds that a ‘not insignificant’ part of the funding comes from Illumina—for reasons that will be appreciated by reading further.

The press release goes on to say that HLI’s funding is being used “to build the largest human sequencing operation in the world to compile the most comprehensive and complete human genotype, microbiome, and phenotype database available to tackle the diseases associated with aging-related human biological decline.” HLI is also “leading the development of cell-based therapeutics to address age-related decline in endogenous stem cell function.”

In addition, HLI’s “revenue streams will be derived from database licensing to pharmaceutical, biotechnology and academic organizations, sequencing, and development of advanced diagnostics and therapeutics.”

Venter is quoted as saying that “using the combined power of our core areas of expertise—genomics, informatics, and stem cell therapies, we are tackling one of the greatest medical/scientific and societal challenges—aging and aging related diseases,” and that “HLI is going to change the way medicine is practiced by helping to shift to a more preventive, genomic-based medicine model which we believe will lower healthcare costs. Our goal is not necessarily lengthening life, but extending a healthier, high performing, more productive life span.”

HLI cofounder Peter H. Diamandis, M.D. puts this another and trendier way, according to the NY Times, which quotes Diamandis as saying that the goal was not to make people live forever, but rather to make “100 years old the new 60.” The NY Times goes on to say that Venter, who is 67, sounds as if he might not need the company to succeed, quoting Venter: “I feel like I have at least 20 or 30 years left in my career.”

HLI’s humongous database-to-be will surely be, in my opinion, a prime example of Big Data—itself a “hot trend.” It aims to have genomic sequences from “a variety of humans—children, adults and super centenarians [i.e. people who have attained the age of at least 110 years] and those with disease and those that are healthy,” according to the press release.

Illumina Looms Large in Longevity’s Plans

HLI has initially purchased two Illumina HiSeq X Ten Sequencing Systems (with the option to acquire three additional systems) to sequence up to 40,000 human genomes per year, with plans to rapidly scale to 100,000 human genomes per year.

Let me repeat this to be sure you don’t think these are typos.

40,000 and then 100,000 human genomes per year!

As pictured below, each of these newly introduced Sequencing Systems comprises ten—count them—HiSeq X instruments, which I’ve previously written about as enabling the long-elusive $1,000 genome cost target. Ironically, this goal was set by Venter as a technical challenge in 2002 at a now-monumental TIGR conference.

Each Illumina HiSeq X Ten Sequencing System has a list price of $10 million.
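For a rough sense of scale, here is a back-of-envelope calculation using only the figures quoted in this post (two ten-instrument systems, 40,000 genomes per year, and the $10 million list price); the four-year amortization period is my assumption, and the result covers instrument cost only, ignoring reagents, labor, and informatics.

```python
# Back-of-envelope arithmetic from the figures quoted above. The 4-year
# amortization period is my assumption, not an HLI or Illumina figure, and
# the result reflects instrument cost only (no reagents, labor, informatics).
systems = 2
instruments_per_system = 10
genomes_per_year = 40_000
list_price_per_system = 10_000_000  # USD
amortization_years = 4              # assumed

per_instrument_per_year = genomes_per_year / (systems * instruments_per_system)
instrument_cost_per_genome = (systems * list_price_per_system) / (genomes_per_year * amortization_years)

print(f"{per_instrument_per_year:.0f} genomes per instrument per year")  # 2000
print(f"~${instrument_cost_per_genome:.0f} instrument cost per genome")  # ~$125
```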

Microbiome and Metabolome Data

Relative proportion of sequences determined at the taxonomic phylum level at eight anatomical sites. High-throughput sequencing has revealed substantial intra-individual microbiome variation at different anatomical sites, and inter-individual variation at the same anatomical sites. Such site-specific differences and the observed conservation between human hosts provide an important framework to determine the biological and pathological significance of a particular microbiome composition (taken from nature.com via Bing Images).

Along with the genomic data gleaned from the sequencing of complete human genomes, HLI will also be generating microbiome data for many of these individuals through its Biome Healthcare division. The division is led by Karen Nelson, who at TIGR led the first human microbiome study on the human gut, published in Science in 2006.

The microbiome—a very “hot” trend in genomics research that I wrote about last year—consists of all the microbes that live in and on the human body and that contribute to the health and disease status of an individual. By better understanding a person’s microbiome—from gut, oral, skin, lung, and other body sites—the company said that it “anticipates developing improved probiotics and other advanced diagnostic and therapeutic approaches to improve health and wellness.”

HLI will also capture and analyze each individual’s metabolomic data. The metabolome is the full complement of metabolites, biochemicals and lipids circulating throughout the human body. HLI has signed an agreement with Metabolon Inc., a diagnostic products and services company offering a biochemical profiling platform, to capture this information from each of the genomic samples that HLI is collecting. “Metabolomics is important because quantifying and understanding the full picture of circulating chemicals in the body can help researchers get a clearer picture of that individual’s health status, and provide markers and pathways associated with drug action,” according to HLI.

Schematic of the ‘omic hierarchy: genomics, transcriptomics, proteomics, and metabolomics—yes, the figure leaves out a few others, e.g. epigenomics and phenomics (taken from schaechter.asmblog.org via Bing Images).

Stem Cell Therapies

This part of the company’s multi-pronged strategy utilizing stem cell therapy advances is said to be “premised on the theory that as the human body ages many biological changes occur, including substantial changes and degradation to the genome of the differentiated, specialized cells found in all body tissues. There is also a depletion and degradation of healthy regenerative stem cell populations in the body over time. HLI will monitor the genomic changes which occur during stem cell differentiation, normal aging, and in association with the onset of disease.”

In this regard, it’s worth mentioning that TriLink BioTechnologies is a leading provider of biosynthetic modified mRNAs that encode factors used for cellular reprogramming and regenerative medicine. Further information about these catalog products and custom services is available here.

Commercial Potential

Robert Hariri, M.D., Ph.D., who cofounded HLI with Venter and Diamandis, is quoted in HLI’s press release as saying that “the global market for healthy human longevity is enormous with total healthcare expenditures in those 65 and older running well over $7 trillion.” He adds, “we believe that HLI’s unique science and technology, along with our business leadership, will positively impact the healthcare market with novel diagnostics and therapeutics.”

Time will tell.

Personally, over the many years since Mike Hunkapiller introduced me to then “NIHer” Craig Venter, I’ve learned not to bet against him.

In closing, I should mention that Venter et al. are not the first to eye the commercial potential of longevity. Last September, Google’s chief executive, Larry Page, announced that his company was creating an anti-aging company, Calico, which is being run by Arthur D. Levinson, the former chief executive of Genentech. Even earlier, Oracle’s chief executive, Lawrence J. Ellison, had financed anti-aging research through his foundation. However, last December, Ellison announced this research would end due to a funding crunch.

Your comments are welcomed.

Crowdfunding: Science & Startups Go Social

  • Websites Proliferate for “Reaching out to Engage” for Needed Dollars
  • Academics, Universities and Startups are Now Crowdfunding
  • Some Projects Attract Big Bucks But Most Get Little

Wikipedia defines crowdfunding as “the collective effort of individuals who network and pool their money, usually via the internet, to support efforts initiated by other people or organizations.” In the context of science, it’s tempting to think that this is just a trendy way to characterize what has been done for a long time as charitable giving. However, I hope the following will convince you that crowdfunding is quite different and offers a number of unique advantages.

Faced with research goals that, for various reasons, are not fundable by traditional mechanisms, researchers are turning to crowdfunding to get the dollars they need. “Scientists are beginning to use crowdfunding to support their work, but don’t stop filling out those grant applications just yet”—this according to Jessica Marshall writing in the venerable Proceedings of the National Academy of Sciences.

While crowdfunding is a generic process (in the present context it involves asking members of the public to chip in money for projects that interest them), a number of crowdfunding web sites now offer researchers a page to pitch their idea, typically including a short video whereby potential donors can “meet” would-be recipients. Importantly, crowdfunding is also being used to raise money for startup companies in lieu of traditional venture capital (VC) financing.

Crowdfunding research through Web-based social media bypasses traditional grant reviews by peers or other experts (taken from conteudoegeek.blogspot.com via Bing Images).

Intrigued by this radical trend in internet-enabled, social media-catalyzed funding of scientific research, I did some homework. In this post I’ll touch on some similarities and differences among the main crowdfunding websites, explore a few representative “case studies”, and touch on some concerns that have been expressed thus far.

Crowdfunding Websites for Research

Indiegogo (“Fund what matters to you”) is an international crowdfunding site founded in 2008 and headquartered in San Francisco, CA. The site’s structure allows users to create a page for their funding campaign, set up an account with PayPal, make a list of “perks” for different levels of investment—think TV fundraisers—then create a social media-based publicity effort. Users promote the projects themselves through Facebook, Twitter and other social media platforms. The site levies a 4% fee for successful campaigns. For campaigns that fail to raise their target amount, users have the option of either refunding all money to their contributors at no charge or keeping all money raised but with a 9% fee. I assume this “penalty” encourages users to keep trying to meet their target.

RocketHub (“The world’s crowdfunding machine”) was launched in 2010 and shares similarities with Kickstarter, which was started a year earlier and focuses on helping “bring creative projects to life.” If the selected funding target is not reached by the deadline, the project leader is still able to keep the collected funds. RocketHub charges 4% of funds collected, plus 4% payment processing fees, if the project is fully funded, and 8% plus 4% payment processing fees if the project does not reach its goal. Seems like another “early withdrawal penalty” feature. SciFund Challenge has recently been founded to leverage RocketHub’s platform for science projects.

Experiment is a U.S. website dating back to 2012 that works on the “all-or-nothing” funding model: 5% for Experiment and 3% for payment processing, but only if the campaign is successful. If the campaign does not reach the funding goal, no one is charged. In February 2014, the site changed its name from Microryza to Experiment.com. The former name Microryza was inspired by Mycorrhizae, symbiotic fungi that live in the roots of plants. Unlike Indiegogo, backers of Experiment projects do not get tangible rewards for giving money. However, researchers share the scientific process directly with the backers and become a part of the project—think symbiosis.
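To make these fee structures easier to compare, here is a small sketch that computes net proceeds for a hypothetical campaign on each platform, using the percentages quoted above (the platforms do, of course, change their fees over time).

```python
# Net proceeds under each platform's fee structure as described above
# (percentages as quoted in this post; platforms change their fees over time).
def indiegogo(raised, goal):
    fee = 0.04 if raised >= goal else 0.09   # keep-it-all option on a missed goal
    return raised * (1 - fee)

def rockethub(raised, goal):
    fee = 0.04 if raised >= goal else 0.08
    return raised * (1 - fee - 0.04)         # plus 4% payment processing

def experiment(raised, goal):
    if raised < goal:                        # all-or-nothing: no one is charged
        return 0.0
    return raised * (1 - 0.05 - 0.03)        # 5% platform + 3% processing

for name, net in [("Indiegogo", indiegogo), ("RocketHub", rockethub), ("Experiment", experiment)]:
    print(name, net(2_000, goal=2_000), net(1_500, goal=2_000))
```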

Crowdfunding Numbers

Ethan O. Perlstein, who you’ll read about in the next section, posted this plot of total number of donors (x-axis) vs. the total amount raised (y-axis) for 115 science projects across four different crowdfunding platforms—as of October 20, 2013—with the comment “115 science projects have been crowdfunded, but crowdfunding can’t compete with grants — yet.”


He added that Kickstarter claims that over 50,000 projects had been successfully funded to the tune of $836 million, with an increasing “pie slice” going to science projects that traditionally would have been funded almost exclusively by the government. Perlstein then offers the following series of comments that I think are quite interesting—including a provocative “universal fundraising statistic.”

“The majority of these 115 science projects fit the academic profile: professionally trained, university-employed, grant-dependent researchers asking focused research questions. But there are also examples of unconventional projects led by self-taught (aka citizen) scientists, student-led, e.g., iGEM teams, and pedagogical research.

Cumulatively, 115 science projects raised $5,082,028 from 47,958 donors, with two megaprojects comprising over half of these totals. Since the average is thrown off by those whoppers, the median is more useful. The median project goal is $3,029, and the median number of project donors is 39. 

The above plot reveals two interesting facts about the distribution of science projects. First, the ratio between dollars and donors is roughly 100-to-1 across three orders of magnitude, from $1,000 goals to $1,000,000 goals. In other words, the average donation for science projects is $100. Technically, the average donation falls within a range of $100-$60 per donor. This range is consistent with non-science projects and is probably a universal fundraising statistic that reflects economic and psychological drivers of charitable giving.

Second, there is a ceiling between $25,000 and $35,000 above which Microryza and RocketHub science projects don’t go, but above which Kickstarter and Indiegogo science projects do boldly go. What’s causing this separation? The answer appears to boil down to incentives, i.e., whether science projects offer products or not. There is also a role for the size and engagement of each crowdfunding platform’s donor community, especially repeat donors. Briefly put: tangible rewards shift the average donation size higher than people might spontaneously donate, and built-in communities on Kickstarter, and to a less extent on Indiegogo, means a larger captive audience.”
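As a quick sanity check, Perlstein’s roughly 100-to-1 dollars-to-donors ratio follows directly from the cumulative totals he cites:

```python
# Average donation implied by the cumulative totals quoted above.
total_raised, total_donors = 5_082_028, 47_958
print(f"average donation ≈ ${total_raised / total_donors:.0f}")  # ≈ $106
```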

Crowdfunding Case Studies

If you’re seriously interested in pursuing crowdfunding for your research—or participating as a contributor to research by others—more information can be found by using keyword (e.g. DNA, genes, etc.) search engines provided at the aforementioned websites. In any case, the following exemplary case studies taken from Jessica Marshall’s PNAS article and elsewhere are intended to provide a sense of scope, scale, and success, or lack thereof.

Lauren Kuehne, University of Washington, Seattle, WA (taken from PNAS).

Crowdfunding Platform: SciFund Challenge
Funding Goal / Raised: $2,000 / $2,048
Project: Soundscape characterization of freshwater lakes
Reward Offered: Sound chart of donor’s name being spoken
Take-home Message: “It was a lot of work for $2,000, I’m not going to lie.”

According to Marshall, while a campaign is live, success hinges on staying engaged, responding to questions and glitches, and continuing to spread the word on social media and among potentially interested networks, including relevant businesses and the press. After the campaign, teams need to update backers on the project and distribute the perks, which can include t-shirts, tokens related to the research, or lab tours for larger donors. “I would recommend that people not do this if their sole purpose is fundraising,” says Kuehne.

So why do it if not for money? “I immediately had this network of people who were aware of the research. People would send me papers and ideas for extending the research, and ideas for data management,” she adds. Also, “I felt like I was this little hub for people who were interested in freshwater, in noise, in data management, in new technologies.” Marshall adds that about one-third of Kuehne’s backers were her friends and family, a similar fraction were people she knew but would not have thought to ask for support, and one-third were strangers.

Rob Knight, American Gut project, University of Colorado, Boulder, CO (taken from PNAS).

Crowdfunding Platform: Indiegogo
Funding Goal / Raised: $400,000 / $339,960
Project: Global survey of people’s microbiomes
Reward Offered: Profiles of individuals’ microbiomes
Take-home Message: “One thing that’s been very much a learning process is figuring out ways to make the results accessible to the general public.”

With some projects, connecting with the public is an essential part of the research. The American Gut project, and the similar project uBiome, invited backers to donate money and samples from their own microbial populations in exchange for a printout detailing the bacteria in their bodies—a continuing “hot topic” of research that I wrote about here last year. The teams counted on growing public awareness about the importance of the microbiome and people’s innate desire to know more about their own microbiome to drive involvement.

Together, the two projects raised almost $700,000. ‘One curious aspect of this crowdfunding approach is that certain groups—like people who ascribe to caloric restriction, or those interested in following paleolithic diets—have shown strong interest’, says Rob Knight, who is the lead scientist on the American Gut project. Marshall added, “this skews the sample population, but it should allow the researchers to make some interesting comparisons, and they plan to contact participants for follow-up studies.”

uBiome is interested in taking crowdsourcing one step further by allowing backers to submit research questions as the database grows. “We wanted to bring the public in in a bigger way, let them ask questions of the data, and really harness the scientific understanding,” uBiome cofounder Jessica Richman told Marshall.

Ethan O. Perlstein, founder of Perlstein Lab, B Corp (taken from PNAS).

Crowdfunding Platform: RocketHub
Funding Goal / Raised: $25,000 / $25,460
Project: Interactions of methamphetamine with brain cells
Reward Offered: Model of methamphetamine molecule made with a 3D printer
Take-home Message: “The formula is straightforward: social networks + external media + time and commitment.”

Marshall opines that crowdfunding may be more likely to succeed for a project with broad public appeal. Perlstein’s project capitalized on the popularity of the television program “Breaking Bad”, about a chemistry teacher who starts making methamphetamine. Perlstein and a colleague created a short video explaining the concept and posted it on RocketHub. The scientists offered prizes, including the chance to talk science over beer for people who donated $100 or more. The men quickly raised $25,000 from family, friends and strangers.

This past February, Amy Dockser Marcus reported in The Wall Street Journal that Perlstein plans to use crowdfunding to raise $1.5 million for his research at the eponymous Perlstein Lab, B Corp. But why do this? In a nutshell, Perlstein had earned a Ph.D. in molecular biology from Harvard, spent five years doing postdoctoral research at Princeton and there led a team that published two papers on pharmacology, which all sounds very promising as a springboard to become an independent academic researcher. But he was turned down by 27 universities when he sought a tenure-track position, and decided to instead set up his own lab as a Benefit Corporation (aka B Corp) and raise money through crowdfunding.

Dr. Perlstein’s specialty is something he calls “evolutionary pharmacology.” He plans to study lysosomal storage diseases in yeast, fish, flies and worms, trying to find drugs that slow or stop the disorders, then test the drug candidates in patient cells collected by researchers. He anticipates making money by licensing or auctioning the findings to drug companies or others that can run trials in humans.

Needless to say, there are probably hundreds of researchers who will be eagerly watching to see if this successfully plays out.

Crowdfunding in Academia

Universities are also exploring how to incorporate crowdfunding into their operations, and view the campaigns as potentially valuable outreach—this according to Jessica Marshall, and quite predictable to me. She adds that The University of California, San Francisco (UCSF), has forged a partnership with Indiegogo that, among other things, allows backers to donate under tax-exempt status. Tuhin Sinha, a UCSF administrator who is coordinating the efforts, is quoted by Marshall as saying that “we’re totally in a pilot mode. We’re having mixed reviews with this.”

Taking a different approach is Michael Greenberg, Director of Innovation and Strategic Initiatives for the Office of Research at the University of California, Los Angeles, and the cofounder of ScaleFunder, a company that has developed a crowdfunding platform tailored to universities. ScaleFunder lists UCLA, UCSF, and UCSC among its clients. Since all sorts of new trends, like the weather, tend to “blow from West to East,” it’s likely that Harvard and MIT might also try this route.

According to Greenberg, when universities coordinate crowdfunding campaigns, researchers can access a much larger network of potential backers. The universities can also provide support as the researchers prepare their text and videos, and ensure that no university regulations are breached. “Institutions have the ability not only to bring in major donors but also major corporations,” Greenberg is quoted as saying.

Crowdfunding Startups

Stephanie M. Lee is a San Francisco Chronicle staff writer who entitled her February 15, 2014 story “Biotech startup turns to crowdfunding,” from which I’ve taken these selected excerpts.

“When Exogen Biotechnology needed to raise money, the startup turned not to venture capitalists but to crowdfunding site Indiegogo. In just 10 days this year, nearly 300 donors from all over the world chipped in $50,000.


Jonathan Tang (left) and Sylvain Costes developed an agent-based model that is the first multi-scale model of the development of a mammary gland from the onset of puberty all the way to adulthood (photo by Roy Kaltschmidt; taken from technology.org via Bing Images).

The two biophysicists at Lawrence Berkeley National Laboratory are developing a technology that measures DNA damage, which could determine if a person might develop cancer, neurodegenerative disorders and immunological diseases.

‘If you find you have a high level of DNA damage, you can try to find out what’s causing it to reduce it,’ Tang said.

At a time when venture capital firms are investing fewer dollars in young biotech companies, a growing number of entrepreneurial scientists are raising money through online crowdsourcing communities like Indiegogo, Medstartr, VentureHealth, RocketHub and AngelList.

Since the recession in 2008, investors have been increasingly reluctant to pour cash into risky biotech startups that face a long and uncertain road to winning approval from the U.S. Food and Drug Administration. First-time venture deals in biotechnology hit a 17-year low in the first three quarters of 2013, according to a Burrill Report analysis of the U.S. life sciences industry. As a result, startups are looking to alternative sources like angel investors, venture philanthropists and now crowdfunding sites.

‘Early-stage life science is just in poor shape in terms of having a variety of funding sources,” said Stephanie Marrus, director of UCSF’s Entrepreneurship Center. “Everyone should keep their eyes on (crowdfunding). Looking at the global trends, it’s definitely become a source of capital. If it becomes appropriated adaptively to health care, maybe we have a new avenue to go down.’

Exogen’s founders say they turned to Indiegogo in part because they want to treat the campaign as a citizen science project. Two preliminary trials with 100 people last year gave the scientists confidence in their idea when they found that the older the subjects were, the higher their level of DNA damage. In addition, four former cancer patients had among the highest levels of DNA damage for their age group.

Donors who chip in $99 or more receive a kit to extract blood samples. The samples are analyzed by Costes, a biophysicist who studies the effects of low-dose radiation on cellular processes, and Tang, a postdoctoral fellow in Costes’ lab.

Traditionally, if a scientist wanted to measure the level of a person’s DNA damage, they would examine cells in a microscope and manually count the number of DNA breaks. Costes, however, said he has invented a way to automate the task with a machine that can scan and objectively score the damaged DNA.

Exogen plans to allow users to check their scores privately online. To prevent the onset of diseases like cancer, people with high levels of DNA damage could try to modify their diet and exercise or limit exposure to the sun. The founders are careful to note that for now, the company won’t dish out medical advice or sequence people’s genomes. But the company plans to seek FDA approval for its technology as a diagnostic tool.”

Crowdfunding Concerns and Criticisms

While crowdfunding has proven to be an efficient and effective way to raise funding, there are obvious concerns around fraud and the quality of the research being conducted. I won’t comment on fraud, since this is a concern that is common to any online—or old-fashioned mail—solicitation. On the other hand, quality of science is an issue peculiar to crowdfunding. After cogitating about various ways to address this concern, I concluded that it’s not easy or practical to deal with, given that the vast majority of donors are non-scientists. Perhaps posting “letters of support” from bona fide academics would be an effective way to handle this.

Lee quotes Arthur Caplan, a bioethics professor at New York University’s Langone Medical Center, as saying that “you can say anything and everything when you’re crowdfunding, so the person donating money has to be vigilant. They could say, ‘I’m going to find a cure for your child’s brain cancer—if you fund it, I’ll do it.’”

Erika Check Hayden, writing in Nature this past February, said that a crowd-funded HIV vaccine project called the Immunity Project has sparked debate, as scientists question the tactics for the public campaign for this project. “It is dreaming big,” she quotes co-founder Reid Rubsamen as saying, adding that his enthusiasm makes him an effective pitch man for the Immunity Project, based in Oakland, CA, which has raised more than $400,000.

But missing from Rubsamen’s promotional campaign are any HIV researchers or data supporting the effort’s scientific strategy. The unorthodox approach raises the question of whether crowdfunding, which tends to be more impressed with technology and marketing than peer-reviewed data, is compatible with medical research. This is an increasingly pertinent issue as scientists appeal to the public to fund more projects aimed at developing therapies.

I’m obviously intrigued by the rising popularity and future potential of crowdfunding. I welcome your thoughts, opinions and comments regarding this topic.

Smartphone Science: Nifty Accessories for Bio-Medical Applications

  • DIY Cholesterol and Vitamin D Monitoring Using Smartphones
  • Single-molecule Microscopy on Smartphones…Coming Soon?
  • Pee-powered Smartphones Developed
  • Solar Powered Smartphone-assisted “sample-to-answer” with PCR and Human Skin Biopsies Demonstrated

Prologue

It’s hard to believe that the first cell phone (invented about 30 years ago) was actually considered to be very convenient even though it weighed more than two pounds and looked like a large brick—plus it took 10 hours to charge the battery to get only 30 minutes of talk-time—“o BTW no txt” back then, either! Fortunately, that cell phone has evolved into present day smartphones, which are amazingly versatile and seemingly ubiquitous extensions of our hands, thumbs, mouths, and minds that can talk to us and remind us what to do. 

Motorola produced the first handheld mobile phone (far left). Martin Cooper, a Motorola engineer and executive, made the first mobile telephone call from handheld subscriber equipment on April 3, 1973 in front of reporters, placing a call to Dr. Joel S. Engel of Bell Labs. Cooper has stated his vision for the handheld device was inspired by Captain James T. Kirk using his Communicator (far right) in the 1960s television show Star Trek.

Evolution of Smartphones

Present day smartphones are essentially hand-held computers with lots of processing speed and memory that allow us to communicate in various ways, take remarkably high-quality digital pictures, “surf the net,” connect to docking ports, and run hundreds of thousands of applications. Given their impressive functionality and remarkable convenience, it’s not surprising that there is increasing interest in developing plug-in accessories to enable smartphones to carry out various types of bio-medical applications.

One of my previous posts highlighted a report of DNA isothermal amplification/fluorescence detection using a disposable microchip interfaced with an iPod Touch—and transmitting the results via a WiFi interface—as an example of the trend toward point-of-care (POC) and even self-diagnostic tools. Following are a few recent examples that further illustrate this trend employing smartphones.

Testing Cholesterol and Vitamin D Levels with Your Smart Phone

It is estimated that 60% of adults in the US have high cholesterol (>200 mg/dL), and 37 million have very high cholesterol (>250 mg/dL). Studies on the effect of serum cholesterol on coronary heart disease mortality indicate that there is a 17% increase in mortality for every 20 mg/dL increase above 210 mg/dL, yet high cholesterol often goes undetected.
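One way to read the quoted 17%-per-20 mg/dL statistic is as a compounding relative-risk multiplier above 210 mg/dL; the compounding assumption is my own reading of that figure, not something taken from the cited studies, but it gives a feel for the numbers.

```python
# Rough relative-mortality multiplier implied by the statistic above.
# Compounding 17% per 20 mg/dL step above 210 mg/dL is my assumption about
# how to read that figure; it is not taken from the cited studies.
def relative_risk(total_cholesterol_mg_dl):
    steps = max(total_cholesterol_mg_dl - 210, 0) / 20
    return 1.17 ** steps

for chol in (210, 250, 290):
    print(chol, "mg/dL ->", f"{relative_risk(chol):.2f}x")  # 1.00x, 1.37x, 1.87x
```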

Earlier this year, a team of US scientists reported that they had developed a smartphone accessory that will allow individuals to test their blood cholesterol levels themselves. Self-detection and routine monitoring would undoubtedly save lives given that cholesterol levels can often be controlled through simple changes in diet such as consuming less saturated fat.

Prof. David Erickson using his iPhone interfaced with a blood test strip. Photo credit: Jason Koski/University Photography; taken from news.cornell.edu via Bing Images.

Prof. David Erickson and co-workers from Cornell University in New York developed a system that consists of a small accessory device that attaches onto a smartphone, an app, and dry reagent test strips that are already commercially available. A drop of blood is placed onto the test strip and an enzymatic, colorimetric reaction occurs. This strip is then placed into the accessory device and an image of the strip is generated using the camera on the phone. The app then quantifies the color change and converts this into a blood cholesterol concentration using a calibration curve. Check out this “must see” video of Prof. Erickson demonstrating how it all works.

According to Megan Tyler reporting in Chemistry World, the achievement of Erickson and his colleagues should not be underestimated. Although what they have done may sound simple, developing a smartphone-based system that enables precise and reproducible diagnostic measurements is actually very difficult. She adds that “the largest challenge comes from having to account for different lighting conditions and reaction times, differences between the cameras and camera settings in different types of phone, and the potential for misalignment of the test strip.” The team overcame the lighting problem by using the accessory device to block out external light so that the test strip would be uniformly illuminated by the flash on the camera. Meanwhile, algorithms in the app account for the other potential variables.
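For readers curious about what “quantify the color change and convert it into a concentration” might look like computationally, here is a minimal sketch of the idea: average the color of the test-strip region in the camera image and map it through a calibration curve. This is not the smartCARD code, and the region of interest and calibration points below are invented.

```python
# Minimal sketch of colorimetric quantification (not the smartCARD app's
# actual code; the region of interest and calibration points are invented).
import numpy as np

def strip_hue(image_rgb, roi):
    """Average hue proxy of the test-strip region of interest (y0, y1, x0, x1)."""
    y0, y1, x0, x1 = roi
    patch = image_rgb[y0:y1, x0:x1].astype(float) / 255.0
    r, g, b = patch[..., 0].mean(), patch[..., 1].mean(), patch[..., 2].mean()
    return float(np.arctan2(np.sqrt(3) * (g - b), 2 * r - g - b))  # hue angle proxy

# Invented calibration curve: hue proxy -> total cholesterol (mg/dL).
cal_hue = np.array([0.2, 0.5, 0.9, 1.4])
cal_conc = np.array([140, 180, 220, 280])

image = (np.random.rand(100, 100, 3) * 255).astype(np.uint8)  # stand-in for a camera frame
hue = strip_hue(image, roi=(40, 60, 40, 60))
print("estimated cholesterol:", np.interp(hue, cal_hue, cal_conc), "mg/dL")
```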

Tyler reports that Erickson and co-workers are now working to commercialize their system, so it may be available for the general public to purchase in the near future.

I contacted Prof. Erickson to obtain a copy of his publication about this system (called “smartCARD”—smartphone Cholesterol Application for Rapid Diagnostics) so I could read about the details.

Read about low, normal, and high levels of HDL, LDL, triglycerides and cholesterol at Understanding Cholesterol (taken from thescienceofeating.com via Bing Images).

In that paper, he concludes by stating that “…using the smartCARD system presented here it is possible to measure other commercially available colorimetric test strips for LDL, HDL, cholesterol, and triglycerides. Such a device would be a great advance in ‘cloud’ based self-diagnostics and monitoring of this quartet of compounds critical to our cardiovascular health.”

As I was finishing this blog, I came across yet another publication led by Prof. Erickson entitled “A smartphone platform for the quantification of vitamin D levels.” The abstract begins by pointing out that vitamin D deficiency has been linked to a number of diseases and adverse outcomes including osteoporosis, infections, diabetes, cardiovascular diseases, and even cancer. At present the vast majority of vitamin D testing is performed in large-scale laboratories at the request of a physician as part of an annual panel of blood tests. In contrast, Erickson and coworkers developed a DIY system for rapid quantification of vitamin D using a system similar to the cholesterol test described above that enables colorimetric detection of vitamin D using a novel gold nanoparticle-based immunoassay. This system was compared with well-established ELISA test kits for serum samples of unknown concentration and gave equivalent results. These investigators concluded that they “envision this as the first step towards the development of the NutriPhone, a comprehensive system for the analysis of multiple vitamins and micronutrients on a smartphone.”

Single-molecule Microscopy on a Smartphone…Coming Soon?

Lord Kelvin—19th-century inventor of the eponymous Kelvin temperature scale and source of provocative quotes—helped calculate Avogadro’s number, one I know all too well from my days working in the lab with seemingly very tiny quantities of material, such as picomoles of oligonucleotide primers for PCR. Looking back, the number of oligonucleotide molecules was still quite large, 6.022 × 10^11 per picomole, thus I’ve been mightily impressed—if not downright amazed—by the trend in new technologies for manipulating and optically detecting single molecules, let alone doing this using a smartphone!

Taken from rugusavay.com via Bing Images.

Aydogan Ozcan received the 2011 Presidential Early Career Award for Scientists and Engineers (taken from newsroom.ucla.edu via Bing Images).

Single-molecule optical detection on a smartphone would be truly amazing because it would need to enable technical achievements that usually require big, powerful lasers and large microscopes that take up lots of bench space and operate in a pitch-black lab. A major step toward this seemingly impossible goal has received considerable editorial praise in ACS Nano—a premier specialty journal of the American Chemical Society—as well as widespread science media coverage. The team at UCLA that’s getting all this well-deserved attention is led by Prof. Aydogan Ozcan, whom I wrote about here last year as “adapting smartphones for measurement of the cell count of HIV patients in resource limited settings or doing fluorescent microscopy.”

According to the UCLA Newsroom, your smartphone now can see what the naked eye cannot: a single virus less than one-thousandth of the width of a human hair. Prof. Ozcan’s team have created a portable smartphone attachment that can be used to perform sophisticated field testing to detect viruses and bacteria without the need for bulky and expensive microscopes and lab equipment. The device weighs less than half a pound.

“This cellphone-based imaging platform could be used for specific and sensitive detection of sub-wavelength objects, including bacteria and viruses and therefore could enable the practice of nanotechnology and biomedical testing in field settings and even in remote and resource-limited environments,” Ozcan said. “These results also constitute the first time that single nanoparticles and viruses have been detected using a cellphone-based, field-portable imaging system.”

The new research, published in ACS Nano, comes on the heels of Ozcan’s other recent inventions, including a cellphone camera–enabled sensor for allergens in food products and a smart phone attachment that can conduct common kidney tests.

Capturing clear images of objects as tiny as a single virus is difficult because the optical signal strength and contrast are very low for objects that are smaller than the wavelength of light. In the ACS Nano paper, Ozcan details a fluorescent microscope device fabricated by 3D printing—a very hot trend in science and consumer products—that contains a color filter, an external lens and a laser diode. As pictured here, the diode illuminates fluid or solid samples at a steep angle of roughly 75 degrees. This oblique illumination avoids detection of scattered light that would otherwise interfere with the intended fluorescent image.

Using this device attached to the camera module on a smartphone, Ozcan’s team was able to detect single human cytomegalovirus (HCMV) particles. HCMV is a common virus that can cause birth defects such as deafness and brain damage. This same virus can hasten the death of adults who have received organ transplants, who are infected with the HIV virus or whose immune systems have otherwise been weakened. A single HCMV particle measures about 150–300 nanometers; a human hair is roughly 100,000 nanometers thick. To verify these results, researchers in Ozcan’s lab used a photon-counting confocal microscope.

(a) Cell phone image of fluorescently labeled HCMV at a concentration of 10^7 PFU/mL. (b) Photon-counting map for dashed area in (a) using a confocal laser microscope. Note that absolute photon counts are different. (c) Distribution of intensity of HCMV in cell phone images. (d) Cell-phone-based virus density vs. virus incubation concentrations [taken from Wei et al. ACS Nano (2013)].

It’s important to note that these feasibility studies used an Alexa-488-conjugated secondary antibody to multiply label HCMV via an abundant glycoprotein, which was the target of a first antibody. A single virus thus carries many fluorophores, thereby increasing detection sensitivity. The Editorial in ACS Nano suggests that the ultimate goal of imaging any unlabeled virus (or other microbe) might be achieved by hybridization with genome-specific, doubly labeled oligonucleotides called molecular beacons (available from TriLink, I might add) that “light up” only after binding to the genomic target, as shown in Figure 1 below.
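
For readers who like to see the design logic spelled out, here is a toy sketch (entirely my own illustration, not anything from the ACS Nano editorial or TriLink’s design tools) of the basic idea behind a molecular beacon: a loop that is the reverse complement of the genomic target, flanked by short self-complementary “stem” arms that hold the fluorophore against the quencher until the loop hybridizes.

```python
# Toy molecular-beacon design sketch (illustrative only; real beacon design
# involves thermodynamic and secondary-structure considerations).
COMP = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    return seq.upper().translate(COMP)[::-1]

def toy_beacon(target_site: str, stem: str = "CGCTC") -> str:
    loop = reverse_complement(target_site)           # hybridizes to the target
    return stem + loop + reverse_complement(stem)    # self-complementary arms close the hairpin

print(toy_beacon("ATGGCTAGCTAGGTCAAGTC"))
```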

Taken from Khatua & Orrit, ACS Nano (2013).

Hopefully, Prof. Ozcan’s vision of a smartphone-based “field-portable imaging system” will become a reality in the near future. If so, this raises the challenging issue of having adequate battery power for the smartphone. Solar-based battery charging stations are one solution, but what if the sun doesn’t cooperate when needed? The next section describes an alternative approach that—in my opinion—is quite creative, to say the least.

“Pee Powered” Smartphones

As reported by Jennifer Newton in Chemistry World, the first cell phone battery to be directly charged by microbial fuel cells feeding on urine has been described by scientists in the UK. This work builds upon previous experiments in 2011 aimed at development of urine-powered fuel cells by Ioannis Ieropoulos and colleagues at Bristol Robotics Laboratory. They had shown that urine was an excellent fuel for direct electricity generation. As a bonus, the cells can reclaim essential nutrients from the urine, making wastewater treatment easier.

This latest study is the first time a commercially available mobile phone has been powered by urine-fed fuel cells. Cascades of electrically connected fuel cells use bacterial action to convert chemical energy in organic matter in urine into electricity.

The team hopes their work will lead to emergency charging devices for remote locations. The diagram below illustrates one embodiment. Some field conditions might require alternative—dare I say primitive—means of collecting urine, as well as easily portable fuel cells.

If you’ve been exploring the diagnostic potential of smartphones or, shall we say, organic ways of powering these phones, we’d be most interested in hearing from you via the comments section below. As always, all thoughts and opinions are welcomed.

Postscript

Just after I completed this post, Prof. Erickson’s group published a proof-of-concept study of a solar-powered, smartphone-assisted “sample-to-answer” molecular diagnostic test with PCR and human skin biopsies. The following are key aspects of the abstract from Nature:

“Here we integrate solar heating with microfluidics to eliminate thermal cycling power requirements as well as create a simple device infrastructure for PCR. Tests are completed in less than 30 min, and power consumption is reduced to 80 mW, enabling a standard 5.5 Wh iPhone battery to provide 70 h of power to this system. Additionally, we demonstrate a complete sample-to-answer diagnostic strategy by analyzing human skin biopsies infected with Kaposi’s Sarcoma herpesvirus through the combination of solar thermal PCR, HotSHOT DNA extraction and smartphone-based fluorescence detection. We believe that exploiting the ubiquity of solar thermal energy as demonstrated here could facilitate broad availability of nucleic acid-based diagnostics in resource-limited areas.”  
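
The battery-life figure is easy to sanity-check with back-of-the-envelope arithmetic (my own, not from the paper):

```python
# 5.5 Wh battery powering an 80 mW load.
battery_wh = 5.5
power_w = 0.080
print(f"{battery_wh / power_w:.0f} h")   # ~69 h, consistent with the quoted ~70 h
```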

Artificially Expanding DNA’s Genetic Code in Designer Microbes

  • DNA isn’t What it Used To Be!
  • Scripps Team Led by Floyd Romesberg Demonstrates Six-Base DNA Replication in Living Bacteria
  • Are We On Our Way to Semi-synthetic Life-forms? With Vast Potential—But at What Risks? 

Doing the Impossible 

Seemingly everyone these days knows about DNA and how simple pairings of A-T and G-C encode all life forms. Now, a team led by Floyd Romesberg, a biological chemist at the Scripps Research Institute in San Diego, California, has created a synthetic base pair, which you can simply think of as X-Y, to produce artificial DNA that replicates with six bases! Click here to read more about Romesberg’s findings that were reported last month in Nature.

Prof. Floyd E. Romesberg, Department of Chemistry, The Scripps Research Institute, La Jolla, California (taken from utsandiego.com via Bing Images).

Expanding the genetic code of DNA has been pursued for decades. Romesberg’s achievement, however, has set the scientific world—and news media—abuzz because his version of expanded DNA (eDNA) actually replicates, setting the stage for future biosynthetic engineering aimed at microbes having correspondingly expanded RNA (eRNA) and—here’s the punchline—greatly expanded protein (e-protein) diversity.

Whereas natural proteins are composed of 20 amino acids encoded by four-base RNA/DNA, Romesberg’s e-proteins could draw on as many as 172 (!) amino acids—both natural and, mostly, artificial—as the result of the many additional triplet codons available with six-base eRNA/eDNA.
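
A quick back-of-the-envelope count shows why a third base pair so dramatically expands coding capacity (my arithmetic, not a figure from the paper; how one gets from 216 codons to 172 encodable amino acids depends on assumptions about stop signals and codon assignments):

```python
# Triplet codon counting for natural vs. expanded genetic alphabets.
natural_codons = 4 ** 3     # 64 codons -> 20 amino acids plus stop signals
expanded_codons = 6 ** 3    # 216 codons once the unnatural X-Y pair is included
print(natural_codons, expanded_codons)
```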

As listed in the graphic below taken from The Wall Street Journal, this stunning achievement in DNA manipulation research is a very big deal in view of the large number of potential applications it enables, covering virtually the entire spectrum of biotechnology and health.

Potential applications of expanded DNA across biotechnology and health (graphic taken from The Wall Street Journal).

The remainder of this post provides a bit of technical detail and, perhaps more interestingly, some reported opinions that are decidedly positive or—not surprisingly—strongly negative because of concerns for “unintended consequences” of the sort that society has experienced in the past.

Look Folks, no Watson-Crick H-bonds!

Structure and paired orientation of the unnatural X-Y base pair compared to natural C-G base pairing (taken from Romesberg and coworkers Nature 2014).

What immediately struck me as being quite unexpected—if not amazing—about the unnatural X-Y base pair was the absence of Watson-Crick H-bonding commonly associated with specific A-T and G-C base pairing. Instead, these X and Y bases—with actual abbreviations d5SICS and dNaM—have relatively simple bicyclic aromatic rings and minimalistic substituents. My assumption is that these hydrophobic moieties—which aren’t even actual bases (!)—have very specific complementary geometry and “pi-stacking” interactions with hydrophobic moieties in flanking A-T and/or G-C base pairs.

Readers interested in technical details underlying this feat—of which there are many—will need to read the entire report. However, some of the “tricks” used by the Romesberg team are worth mentioning here, admittedly in over-simplified terms for the sake of brevity.

  • Their earlier results indicated that passive diffusion of unnatural nucleosides into microbes was possible but subsequent conversion to unnatural nucleotide triphosphates that are required for DNA synthesis was inefficient.
  • That problem was cleverly finessed by “borrowing” a suitable nucleotide triphosphate transporter (NTT) from a certain eukaryotic marine phytoplankton. They also had to include a couple of chemicals in growth media for this NTT to adequately function.
  • Conventional molecular biology was used to prepare a circular double-stranded plasmid having X-Y at a specific locus for transfection into E. coli bacteria to determine if bacterial DNA polymerase would replicate X and Y when “fed” the triphosphate forms of X and Y (dXTP and dYTP).

After 15 hours of growth, which represented 24 doublings or roughly 2 x 10^7-fold amplification of the initial X-Y plasmid, analysis confirmed what had been hoped—namely, the unnatural X-Y base pair was retained during replication. Voilà!
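
That amplification figure follows directly from the doubling count (a one-line check of my own):

```python
doublings = 24
print(f"{2 ** doublings:.1e}")   # ~1.7e+07, i.e., roughly 2 x 10^7-fold
```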

They also investigated resistance of X-Y base pairs to E. coli DNA excision-repair processes after reaching stationary phase, and found X-Y to be quite stable: 45% and 15% retention (toward replacement by A-T) after days 3 and 6, respectively.

Here’s what Romesberg and coworkers opined in their concluding remarks:

“In the future, this organism, or a variant with the [unnatural base pairs] incorporated at other episomal or chromosomal loci, should provide a synthetic biology platform to orthogonally re-engineer cells, with applications ranging from site-specific labeling of nucleic acids in living cells to the construction of orthogonal transcription networks and eventually the production and evolution of proteins with multiple, different unnatural amino acids.”

At the risk of belittling this major milestone, there is clearly much more to do in order to extend—pun intended—X and Y into unnatural eRNA and, eventually, unnatural e-proteins prophetically imagined in the above graphic. Obviously there are many challenges ahead, but as the saying goes “every journey begins with a single step,” and in my opinion Romesberg’s team has taken a huge leap forward.

eDNA opens a world of possibilities in terms of unique products for companies synthesizing nucleotides. TriLink would certainly like to add all sorts of future dXTP and dYTP (and ribo versions) to their already extensive offering of modified nucleotides. Hopefully, that won’t be too far in the future. Time will tell.

Laudatory Views and Some Expression of Concern

My quickie survey of quotations in various stories reporting this work by Romesberg’s team, which incidentally included scientists at enzyme-purveyor New England Biolabs, indicated mostly high praise.

For example, a May 7th NY Times story by Andrew Pollack quotes Eric T. Kool, a professor of chemistry at Stanford who is also doing research in the area, as saying “it took some clever problem-solving to get where they got,” adding “it is clear that the day is coming that we’ll have stably replicating unnatural genetic structures.”

A May 9th editorial by Robert F. Service in venerable Science magazine quotes Ross Thyer, a molecular biologist at the University of Texas, Austin, as saying that “this is an amazing enabling technology,” and that the feat opens the way to a universe of new proteins—a vastly more diverse menu of proteins with a wide variety of new chemical functions, such as medicines better able to survive in the body and protein-based materials that assemble themselves.

Despite the resoundingly positive feedback, some concern about eDNA has been voiced. Jim Thomas of the ETC Group, a Canadian advocacy organization, said in an email, “The arrival of this unprecedented ‘alien’ life form could in time have far-reaching ethical, legal and regulatory implications. While synthetic biologists invent new ways to monkey with the fundamentals of life, governments haven’t even been able to cobble together the basics of oversight, assessment or regulation for this surging field.”

As for scary possibilities such as creating an unnatural dangerous organism, the editorial in Science says that creating synthetic “superbacteria” might sound ominous, but Kool thinks the risks are low. “These organisms cannot survive outside the laboratory,” Kool is quoted as saying. “Personally, I think it’s a less dangerous way to modify DNA than existing genetic engineering.”

I’m in the camp of carrying on with this line of research as long as the unnatural base pairs have to be chemically synthesized and “fed” to host organisms in a legitimate lab or manufacturing facility, thus having virtually zero possibility of unintended growth and function.

What do you think?

As usual, your comments are welcomed.

The Elusive Commercial Pursuit of RNAi Drugs

  • No Synthetic siRNA Drug Approval Some 13 Years “A.T.” (After Tuschl)
  • Alnylam Buys Sirna RNAi Assets from Merck and Forges Alliance with Genzyme
  • Novartis Cuts Back on In-House RNAi Efforts
  • Will RNAi for Therapeutics or for AgBio Pay-Off First?

Discovery, development and successful clinical investigations leading to new drugs are long and costly endeavors. In a detailed overview provided by The Pharmaceutical Research and Manufacturers of America (PhRMA), this process generally takes 10-15 years—at best. None of this work is cheap, and a relatively recent article in Forbes is critical of the drug industry “tossing around the $1 billion number for years.” The article digs into Big Pharma data to show that actual costs can reach $15 billion when failed drugs and all R&D are taken into account.

Thomas Tuschl (taken from The Rockefeller University via Bing Images).

Having said this, it’s no surprise that the first oligonucleotide drug based on RNA interference (RNAi) is still elusive today, some 13 years after Thomas Tuschl and collaborators at the Max-Planck-Institute for Biophysical Chemistry in Göttingen, Germany first reported use of synthetic 21-nucleotide RNA duplexes for RNAi. Their landmark publication in Nature in 2001, which has been cited over 9,000 times, demonstrated that these short-interfering RNA (siRNA) duplexes specifically suppressed expression of endogenous and heterologous genes in different mammalian cell lines. They presciently concluded that “21-nucleotide siRNA duplexes provide a new tool for studying gene function in mammalian cells and may eventually be used as gene-specific therapeutics.”
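
To give a flavor of how such 21-mers are chosen, here is a toy sketch, my own simplification of the early “AA(N19)” rule of thumb from published siRNA design guidelines rather than code from the Tuschl lab, that scans an mRNA for candidate target sites:

```python
# Toy siRNA target-site finder: look for AA followed by 19 nt and apply a
# simple GC-content filter. Real designs add many more criteria
# (off-target screening, thermodynamic asymmetry, etc.).
def candidate_sirna_targets(mrna: str, min_gc: float = 0.30, max_gc: float = 0.70):
    mrna = mrna.upper().replace("U", "T")
    for i in range(len(mrna) - 20):
        if mrna[i:i + 2] == "AA":
            site = mrna[i:i + 21]
            gc = (site.count("G") + site.count("C")) / len(site)
            if min_gc <= gc <= max_gc:
                yield i, site

for pos, site in candidate_sirna_targets("ATGAAGCTTGCCGTAAGGCTCCATGGAATTCGGCAAGTGACCT"):
    print(pos, site)
```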

While siRNA did in fact become an amazingly powerful “new tool” very quickly, couching potential therapeutic utility of siRNA as an outcome that “may eventually” occur has indeed proven apropos.

Thousands of publications have led to elucidation of molecular pathways for RNAi offering various possible mechanisms of action for other types of RNAi agents. That—and delivery approaches for RNAi clinical candidates being investigated—can be read about in an excellent review by Rossi and others. The focus of this post is the elusive commercial pursuit of an RNAi drug.

Four Phases of the Business of RNAi Therapeutics

The elusive nature of RNAi therapeutics is not for lack of trying or underinvestment.  According to Dirk Haussecker, ‘the business of RNAi therapeutics’ has gone through four phases, which he explores in his excellent account entitled The Business of RNAi Therapeutics in 2012. His views in that article are paraphrased as follows:

It all began with the discovery phase (2002–05), which was defined by the early adopters of RNAi as a therapeutic modality. These were small, risk-taking biotechnology companies such as Ribozyme Pharmaceuticals (aka Sirna Therapeutics), Atugen (aka Silence Therapeutics) and Protiva (aka Tekmira). As much as they may have believed in the potential of RNAi therapeutics, their strategic reorientation was also a gamble on a technology with considerable technical uncertainties in hopes of turning around declining business fortunes by leveraging their nucleic acid therapeutics know-how to become leaders in a potentially disruptive technology. This phase also saw the founding of Alnylam Pharmaceuticals—by Thomas Tuschl, Phillip Sharp (1993 Nobel Prize), and others—based on the idea of cornering the IP on the molecules that mediate RNAi so that it may finance its own drug development by collecting a toll from all those engaged in RNAi therapeutics.

Left-to-right: Craig Mello, Andrew Fire, and Alfred Nobel (taken from ambassadors.net via Bing Images).

Big Pharma initially saw the value of RNAi largely as a research tool only, but this quickly changed. The defining feature of this second phase—the boom phase (2005–08)—was the impending patent cliff and the hope that the technology would mature in time to soften its financial impact. A bidding war, largely for access to potentially gate-keeping RNAi IP, erupted. Most notably, Merck acquired Sirna Therapeutics for $1.1 billion, while a Roche and Alnylam alliance provided a limited platform license from Alnylam for $331 million in upfront payments and equity investment. This boom phase was also fueled by the award of a Nobel Prize to Andrew Fire and Craig Mello for their seminal discovery of double-stranded RNA (dsRNA) as the trigger of RNAi.

This period of high expectations and blockbuster deals was followed by a backlash phase (2008–2011), or buyer’s remorse, in part due to the absence of adequate delivery technologies and concerns about specificity and innate immune stimulation as safety issues. Suffering from RNAi-specific scientific and credibility issues, and with first drug approvals still years away, RNAi therapeutics was among the first to feel the cost-cutting axe. The exit of Roche from in-house RNAi therapeutics development sent shockwaves through the industry. Roche had invested heavily in the technology only 2–3 years earlier and was considered an innovation bellwether within Big Pharma, so its decision in late 2010 found a number of imitators and can be credited (or blamed, depending on your perspective) for the tepid investment in RNAi therapeutics ever since.

The backlash, incidentally, also had cleansing effects, many of which form the basis for the 4th and final phase, recovery (2011–present). This shift is most evident in the evolution of the RNAi therapeutics clinical pipeline that has become more and more populated with candidates based on sound scientific rationales, especially in terms of delivery approaches and anti-immunostimulatory strategies. For the recovery, however, to firmly take root and for the long-term health of the industry, it is important for the current clinical dataflow to bring back investors.

Current Status of RNAi Therapeutics

Dirk Haussecker’s The RNAi Therapeutics Blog richly chronicles the aforementioned and many more events dating back to 2007 and continuing through today. Particularly worth visiting is the Google-based World of RNAi Therapeutics map that shows current companies and—more importantly—the various RNAi agents under investigation. The screen shot below  exemplifies the kind of information that is displayed when you click on any company on the map—Alnylam in this case. Very convenient indeed!

Screen shot of World of RNAi Therapeutics exemplified with selection of Alnylam from the updated list of companies in the panel on the left (taken from The RNAi Therapeutics Blog).

It’s worth mentioning that ClinicalTrials.gov is a web-based resource that provides patients, their family members, health care professionals, researchers, and the public with easy access to information on publicly and privately supported clinical studies on a wide range of diseases and conditions. The website is maintained by the National Library of Medicine at the National Institutes of Health. Information on ClinicalTrials.gov is provided and updated by the sponsor or principal investigator of the clinical study. Studies are generally submitted to the website (that is, registered) when they begin, and the information on the site is updated throughout the study.

My February 2014 search of “siRNA” as a keyword at ClinicalTrials.gov found 31 studies listed. These are initially shown as a simplified list of the clinical study name and whether each study is completed, actively recruiting, active but not recruiting, terminated, etc. The list can be easily sorted by condition (i.e. disease type), sponsor/collaborators, and other parameters. Another useful feature is viewing found studies based on geographic location, as shown below. Click on any region or sub-region (i.e., state) to view information regarding studies in that area.

Global map of siRNA clinical studies taken from ClinicalTrials.gov

Map of “siRNA” clinical studies in the USA taken from ClinicalTrials.gov
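
For readers who prefer scripting to clicking, a search like this can also be done programmatically. The minimal sketch below assumes the current ClinicalTrials.gov v2 REST API (which post-dates my 2014 search), and the JSON field names reflect my understanding of that API rather than anything documented in this post; verify them against the official API documentation before relying on it:

```python
import requests

# Keyword search for "siRNA" studies via the ClinicalTrials.gov v2 API (assumed endpoint).
resp = requests.get(
    "https://clinicaltrials.gov/api/v2/studies",
    params={"query.term": "siRNA", "pageSize": 20},
    timeout=30,
)
resp.raise_for_status()
for study in resp.json().get("studies", []):
    protocol = study.get("protocolSection", {})
    title = protocol.get("identificationModule", {}).get("briefTitle", "n/a")
    status = protocol.get("statusModule", {}).get("overallStatus", "n/a")
    print(f"{status:25s} {title}")
```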

Alnylam Ascending

The big news for 2014 in ‘the Business of RNAi’—to borrow Dirk Haussecker’s expression—will most likely be centered around two deals involving Alnylam that were announced in January. The first announcement was that Alnylam will acquire “investigational RNAi therapeutic assets” from Merck for “future advancement through Alnylam’s commitment to RNAi Therapeutics.” The acquisition of Merck’s wholly owned subsidiary, Sirna Therapeutics, provides IP and RNAi assets including pre-clinical therapeutic candidates, chemistry, siRNA-conjugate and other delivery technologies.

Under the agreement, in exchange for acquiring the stock of Sirna Therapeutics, Alnylam will pay Merck an upfront payment of $175 million in cash and equity—$25 million cash and $150 million in Alnylam common stock. In addition, Merck is eligible to receive up to $105 million in developmental and sales milestone payments per product, as well as single-digit royalties, associated with the progress of certain pre-clinical candidates discovered by Merck. Merck is also eligible to receive up to $10 million in milestone payments and single-digit royalties on Alnylam products covered by Sirna Therapeutics’ patent estate.

Merck’s decision was quoted to be “consistent with [Merck’s] strategy to reduce emphasis on platform technologies and prioritize [Merck’s] R&D efforts to focus on product candidates capable of providing unambiguous promotable advantages to patients and payers.”

Alnylam expects to have six to seven genetic medicine product candidates in clinical development—including at least two programs in Phase 3 and five to six programs with human proof of concept—by the end of 2015. These are referred to as “Alnylam 5×15” programs, details for which can be accessed here and in presentations.

The second announcement, which came one day after the acquisition of Sirna from Merck was announced, was that Alnylam and Genzyme would form a “transformational alliance” for RNAi therapeutics as genetic medicines. This new collaboration is “expected to accelerate and expand global product value for the RNAi therapeutic genetic medicine pipeline, including ‘Alnylam 5×15’ programs.”

Alnylam will retain product rights in North America and Western Europe, while Genzyme will obtain the right to access Alnylam’s current “5×15” and future genetic medicines pipeline in the rest of the world (ROW), including global product rights for certain programs. In addition, Genzyme becomes a major Alnylam shareholder through an upfront purchase of $700 million of newly issued stock, representing an approximately 12% ownership position. This alliance significantly bolsters Alnylam’s balance sheet to over $1 billion in cash that was said “to increase [Alnylam’s] investment in new RNAi therapeutic programs, while securing a cash runway that [Alnylam] believes will allow [it] to develop and launch multiple products as breakthrough medicines.”

In addition to the upfront equity purchase, Alnylam will receive R&D funding, starting on January 1, 2015, for programs where Genzyme has elected to opt-in for development and commercialization. In addition, Alnylam is eligible to receive milestones totaling up to $75 million per product for regional and co-developed/co-promoted programs. In the case of global Genzyme programs, Alnylam is eligible to receive up to $200 million in milestones per product. Finally, Alnylam is also eligible to receive tiered double-digit royalties up to 20% on net sales on all products commercialized by Genzyme in its territories. In the case of Genzyme’s co-developed/co-promoted products in the Alnylam territory, the parties will share profits equally and Alnylam will book net sales revenues.

Those interested in a “deep dive” into Alnylam’s impressive array of other strategic alliances can find lead information here.

First RNAi Drug Approval on the Horizon

Hopefully, the ‘Business of RNAi’ is entering its 5th phase: drug approval for sale, which would—finally—provide the long-awaited demonstration of clinical utility and a commercial payback on the huge investments made to date.

In this regard, Alnylam has recently begun recruiting patients for a pivotal Phase III clinical trial that could lead to the first RNAi drug approval in the near future. This comes shortly after Alnylam’s November 2013 detailed press release announcing positive data from a Phase II clinical trial of patisiran (ALN-TTR02) for the treatment of transthyretin-mediated amyloidosis (ATTR), presented at the International Symposium on Familial Amyloidotic Polyneuropathy. The 24-slide deck of this Symposium presentation can be downloaded as a pdf by clicking here. Results showed that multiple doses of a Tekmira Pharmaceuticals lipid nanoparticle formulation of ALN-TTR02 led to robust and statistically significant knockdown of serum TTR protein levels of up to 96%, with mean levels of TTR knockdown exceeding 85%. Knockdown of TTR, the disease-causing protein in ATTR, was found to be rapid, dose dependent, and durable, and similar activity was observed toward both wild-type and mutant protein. In addition, ALN-TTR02 was found to be generally safe and well tolerated in this study.

Details for the Phase III multicenter, multinational, randomized, double-blind, placebo-controlled study to evaluate the efficacy and safety of ALN-TTR02 can be read here at ClinicalTrials.gov. Among the details are the following facts, including the targeted completion date. While the trials look promising so far, January 2017 is several years away, and it’s wise to “never count your chickens before they hatch.”

Estimated Enrollment: 200
Study Start Date: November 2013
Estimated Study Completion Date: May 2017
Estimated Primary Completion Date: January 2017 (Final data collection date for primary outcome measure)

Novartis Cuts Back Its In-House RNAi R&D

While investment in RNAi at Alnylam is ascending, the situation at Novartis is descending, based on an article in GenomeWeb in April of 2014 stating that Novartis will be cutting back its 26-person effort. The article adds that, according to a Novartis spokesperson, the decision was driven by “ongoing challenges with formulation and delivery and the reality that the current range of medically relevant targets where siRNA may be used is quite narrow.”

Despite its decision to dial down its RNAi programs, Novartis still holds onto the rights to use Alnylam’s technology against the 31 targets covered under their one-time partnership, the company spokesperson said.

And as work continues on those targets, albeit by a downsized research team, Novartis will also be considering partnering opportunities in the space, the spokesperson added.

With the seemingly never-ending challenges of formulation and delivery, perhaps RNAi will pay off first in the agricultural biotechnology (aka AgBio) space, as briefly discussed in the next section.

Pros and Cons of RNAi for AgBio

RNAi can be achieved using genetically encoded sequences rather than using chemically synthesized siRNA duplexes or other types of synthetic oligonucleotides. Agricultural biotechnology has already taken advantage of such genetically engineered constructs in producing stable and heritable RNAi phenotype in plant stocks. Analogous procedures can be applied to other organisms—including humans, such as in antiviral stratagems against HIV-1.

Andrew Pollack recently reported in the New York Times that agricultural biotechnology companies are investigating RNAi as a possible approach to kill crop-damaging insects and pathogens by disabling their genes. By zeroing in on a genetic sequence unique to one species, the technique has the potential to kill a pest without harming beneficial insects. That would be a big advance over chemical pesticides.

Subba Reddy Palli, an entomologist at the University of Kentucky who is researching the technology, is quoted as saying “if you use a neuro-poison, it kills everything, but this one is very target-specific.”

Some specialists, however, fear that releasing gene-silencing agents into fields could harm beneficial insects, especially among organisms that have a common genetic makeup, and possibly even endanger human health. Pollack adds that this controversy echoes the larger debate over genetically modified crops, which has been raging for years. The Environmental Protection Agency (EPA), which regulates pesticides, is meeting with scientific advisers to discuss the potential risks of RNA interference.

RNAi May Be a Bee’s Best Friend

Monsanto is exploring the use of RNAi to kill a mite that may play a role in bee die-offs. Photo: Monsanto (taken from nytimes.com).

In addition to use in AgBio, RNAi may prove useful in reviving bee populations. RNAi is of interest to beekeepers because one possible use, under development by Monsanto, is to kill a mite that is believed to be at least partly responsible for the mass die-offs of honeybees in recent years.

In opposition to this, the National Honey Bee Advisory Board is quoted as saying “to attempt to use this technology at this current stage of understanding would be more naïve than our use of DDT in the 1950s.”

Pollack reports that some bee specialists told the EPA that they would welcome attempts to use RNAi to save honeybees, and groups representing corn, soybean and cotton farmers also support the technology: “commercial RNAi technology brings U.S. agriculture into an entirely new generation of tools holding great promise,” the National Corn Growers Association said.

Corn Growers Need a New Tool

For a decade, corn growers have been combating the rootworm, one of the costliest of agricultural pests, by planting so-called Bt crops, which are genetically engineered to produce a toxin that kills the insects when they eat the crop. Or at least the toxin is supposed to kill them. Rootworms are now evolving resistance to at least one Bt toxin.

Given that rootworm larvae can destroy significant percentages of corn if left untreated, a robust alternative is crucial to protecting future corn crops. Current estimates in the US indicate as much as 40% of corn acreage is infested with corn rootworms, and the area is expected to grow over the next 20 years. RNAi is now being studied as a possible alternative to Bt toxins, and Monsanto has applied for regulatory approval of corn that is genetically engineered to use RNAi to kill the western corn rootworm.

Corn rootworm damage. Photo: IPM Images (taken from intlcorn.com via Bing Images).

Personally, I’m not completely averse to RNAi for AgBio, especially in view of the need to adequately feed the world’s growing population. Careful regulatory scrutiny, even if it results in slow moving progress, seems wise in order to avoid unintended consequences that could be very problematic.

As usual, your comments are welcomed.

Biomass Bonanza

In recognition of International Plant Appreciation Day (April 13th) and inspired by plantea.com, which asks us all to imagine a world without coffee, pants or toilet paper (or any one of the thousands of plant-derived products used daily), I decided to explore the world of biomass in further detail. In this post, we’ll discuss:

  • The Controversial World of Ethanol Mandates
  • Algae: Amazing Single-cell Plants May be the Key to Renewable Energy
  • Loving Lignin: High-margin Chemicals from Biomass Waste 
Imagine a world without toilet paper! (Taken from plantea.com)

Over the past year or so I’ve collected quite a stack of journal publications and news articles—yes, all printed on paper—covering various aspects of using bioengineering—loosely speaking—to produce much needed commodities or important but scarce compounds. Some are needed on humongous scales, thus leading to other engineering challenges, but all must be produced at costs that are either competitive with or, ideally, much lower than those of current sources. This is the so-called “economic calculus” that virtually every manufactured product must confront.

Frankly, I did not have an easy time trying to decide what to focus on for this post, as all of the items in the stack had intriguing titles, such as those listed below. I’ve linked each one to the source for you to peruse later, should you wish:

My choices reflect the increasing shift from the perhaps ill-conceived corn-to-ethanol mandates advocated by Presidents Bush and Obama, to hopefully more realistic goals utilizing non-edible plants or algae to produce “gasoline-like” molecules. I also decided to explore some of the more interesting pursuits of higher margin (i.e. more profitable) chemicals that are useful for diverse purposes, such as anticancer drugs and “environment friendly” bioerodible replacement for conventional plastics derived from oil.

By the way, there are plenty of nucleic acid-related technologies underlying all of these topics, but those are not the focus of my comments in this post, which instead deals with broader stories of general interest.

The Controversial World of Ethanol Mandates

Ethanol (ethyl alcohol, CH3CH2OH) is classified as being either “synthetic ethanol” that is produced chemically from something called syngas, with which you are likely unfamiliar—as I was—or “bioethanol” that has been made for millennia through biological fermentation of sugars from grains and fruits by yeasts, with which you are likely familiar—as I am, and enjoy in adult beverages.

When you have time later, and if you are so inclined, click here to read about the history and details of US energy policies (and politics) prompted by the “1973 oil crisis”—which I’m old enough to have experienced—waiting in long lines at gas stations, hoping there would be some for my car when I got to the pump. One statement therein that caught my attention—and with which I agree—is that there is “criticism that federal energy policies since the 1973 oil crisis have been dominated by crisis-mentality thinking, promoting expensive quick fixes and single-shot solutions that ignore market and technology realities. Instead of providing stable rules that support basic research while leaving plenty of scope for American entrepreneurship and innovation, congresses and presidential administrations have repeatedly backed policies which promise solutions that are politically expedient, but whose prospects are doubtful.”

Here’s where corn-to-ethanol comes in. With the Iowa political caucuses on the horizon in 2007, presidential candidate Barack Obama made homegrown corn a centerpiece of his plan to slow global warming. And when President George W. Bush signed a law that year requiring oil companies to add billions of gallons of ethanol to their gasoline each year, Bush predicted it would make the country “stronger, cleaner and more secure.” That quote in the San Francisco Chronicle is followed by these harsh but apparently true criticisms:

But the ethanol era has proven far more damaging to the environment than politicians promised and much worse than the government admits today.

As farmers rushed to find new places to plant corn, they wiped out millions of acres of conservation land, destroyed habitat and polluted water.

Environmentalist Craig Cox, above, says the push to plant corn for ethanol is damaging land and water. Photo: Charlie Riedel, Associated Press (taken from sfgate.com).

The consequences are so severe that environmentalists and many scientists have now rejected corn-based ethanol as bad environmental policy. But the Obama administration stands by it, highlighting its benefits to the farming industry rather than any negative impact.

The government’s predictions of the benefits have proven so inaccurate that independent scientists question whether it will ever achieve its central environmental goal: reducing greenhouse gases. “This is an ecological disaster,” said Craig Cox with the Environmental Working Group, a natural ally of the president that, like others, now finds itself at odds with the White House.

Are Bioethanol and Other Biofuels Really Delivering on their Promise?

An in-depth analysis in The Economist points out that bioethanol made from edible plants, such as corn, and biodiesel made from vegetable fat are first-generation biofuels that have drawbacks that extend beyond the negative environmental impact mentioned above.

They are made from plants rich in sugar, starch or oil that might otherwise be eaten by people or livestock. Ethanol production already consumes 40% of America’s corn harvest and a single new ethanol plant in Hull, England is about to become Britain’s largest buyer of wheat. Ethanol and biodiesel also have limitations as vehicle fuels, performing poorly in cold weather and capable of damaging unmodified engines.

According to the The Economist, overcoming these limitations is being addressed by dozens of start-up companies aimed at developing second-generation biofuels. They hoped to avoid the “food versus fuel” debate by making fuel from non-edible (e.g. cellulosic) biomass feedstocks with no nutritional value, such as agricultural waste or fast-growing trees and grasses raised on otherwise unproductive land. Other firms planned to make “drop in” biofuels that could replace conventional fossil fuels directly, rather than having to be blended in.

Unfortunately, the biofuels utopia didn’t pan out—start-ups went bust, surviving companies scaled back their plans and, as the price of first-generation biofuels rose, consumer interest waned. Furthermore, the spread of “fracking” unlocked new oil and gas reserves that provided an alternative path to planned USA energy independence. By 2012 America’s Environmental Protection Agency (EPA) had slashed the 2013 target for cellulosic biofuels to just 53 million (m) liters vs. the planned 3,800m liters—a whopping 98.6% decrease!
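
That percentage is easy to verify (my arithmetic, not a figure from the article):

```python
planned, revised = 3800, 53        # million liters
print(f"{100 * (planned - revised) / planned:.1f}% decrease")   # 98.6%
```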

The article continues by asking and answering the question: what went wrong?

Basically, three challenges need to be overcome to provide viable second-generation biofuel:

  • break down woody cellulose and lignin polymers into simple plant sugars
  • convert those sugars into drop-in fuels to suit existing vehicles chemically or biochemically
  • do all this on a humongous scale—and, ah, it has to be cheap

And thus comes the real challenge—producing biofuels at a reasonable cost. Oil and gas giant Shell reportedly had ten advanced biofuels projects that are now defunct because they worked in the lab but couldn’t be cost effectively scaled up.

The optimism of five years ago may have waned, but efforts to develop second-generation biofuels continue. Half a dozen companies are now putting the final touches on industrial-scale plants and several are already producing small quantities of second-generation biofuels. Some even claim to be making money doing so.

On the bright side, three plants in America were reported to start producing cellulosic ethanol from waste corn cobs, leaves and husks in 2014: POET-DSM Advanced Biofuels (75m liters) and DuPont (110m liters), both in Iowa, and Abengoa (95m liters) in Kansas. The first company to produce ethanol using enzymes on an industrial scale is Beta Renewables, a spin-off from Chemtex, an Italian chemical giant. An 80m-liter cellulosic ethanol plant near Turin has been running at half capacity since the summer of 2013, using straw from nearby farms. It will run on corn waste in the autumn, rice straw in the winter and then perennial eucalyptus in the spring. The plant claims to be running at a profit but only with local, cheap feedstocks.

Other biofuels companies are continuing to pursue drop-in fuels. You might be wondering why given the challenges previously discussed. One attraction is that drop-in fuels are less susceptible to changing political whims compared with ethanol, the demand for which depends to a large extent on government mandates that it be blended into conventional fuels. Another is that drop-in fuels are commonly made with sugar as a feedstock, either conventionally sourced or cellulosic, and sugar is widely available and easily transported.

The next few years will be an exciting time in the pursuit of second-generation biofuels.  With several plants expected to come on line during 2015 and 2016, I’ll be paying close attention to this area to see if the food versus fuel debate heats up or fizzles out.

Algae: Amazing Single-cell Plants May be the Key to Renewable Energy

Can’t wait 300 million years? Solazyme can produce oil in a few days. This is the catchy title of an article by Jeff Benzak at Fueling Growth.org, featuring Solazyme’s use of algae to produce oil. Details of how this came about and how it’s done are better told in a lengthy story reported by Diane Cardwell in The New York Times that I have drawn upon here. Fittingly for International Plant Appreciation Day, you’ll learn that the trick is to use single-celled plants!

Friends since college, Jonathan S. Wolfson (law and business degrees) and Harrison F. Dillon (genetics PhD and law degrees) often mused about using biotechnology to create renewable energy—“delusional rantings” according to Mr. Wolfson. Then Mr. Dillon found algae, and delusional became real. More specifically microalgae, which are a large and diverse group of single-celled plants, produce a variety of substances, including oils, and are thought to be responsible for most of the fossilized oil deposits in the earth. These, it seemed, were micro-organisms with potential. With prodding, they could be re-engineered to make fuel.

Jonathan Wolfson (left) and Harrison Dillon gaze into the future of algae. Credit : Chronicle/Michael Macor. Taken from cleantechrepublic.com via Bing Images.

So, according to the article, in 2003 Mr. Wolfson packed up and moved from New York to Palo Alto, California, where Mr. Dillon lived. They started a company called Solazyme. In mythical Silicon Valley tradition—à la Steve Jobs and Steve Wozniak starting Apple in the Jobs family garage—they worked in Mr. Dillon’s garage, growing algae in test tubes. And they found a small knot of investors attracted by the prospect of compressing a multimillion-year process into a matter of days.

Now, a decade later—and located in South San Francisco only a short walk from genetic engineering pioneer Genentech—they have released into the marketplace their very first algae-derived oil produced at commercial scale. However, the destination for this oil—pale, odorless and dispensed from a small matte-gold bottle with an eyedropper—is not gas tanks, but the faces of women and men worried about their aging skin! Sold under the brand name Algenist at $79 for a one-ounce bottle, the product would seem to have nothing in common with oil refineries and transportation fuel.

It turns out that Algenist and other niche products, such as healthier low-fat food ingredients, are sold at premium prices and may be helping to finance Solazyme’s scale-up of drop-in biofuels. This additional revenue source may be enabling the company to get past the point where so many other clean-tech companies have run out of gas: the so-called “Valley of Death” where new businesses stall trying to shift to commercial-scale production.

Wolfson and Dillon had sold investors on an energy business, not one that made cosmetics, nutritional supplements and soap. They had also told their board that they would be able to make fuel through photosynthesis, a process then considered “sexy,” Mr. Wolfson is quoted as saying in the Times. That’s because the sunlight that would fuel the algae’s growth was free; other methods of goosing the algae included adding food sources like sugar. But growing algae where they could get enough sunlight required huge ponds of water. After “genetic tinkering” with the algae, other processes have been developed, and continuing cost-effective scale-up is being demonstrated.

Algae samples in a Solazyme lab are genetically manipulated into varieties having specific properties. Image credit: Ken James/Bloomberg (taken from sustainablelifemedia.com via Bing Images)

In June 2012, Solazyme announced the successful commissioning of its first fully integrated biorefinery (IBR) in Peoria, Illinois, to produce algal oil. The IBR was partially funded with a federal grant that Solazyme received from the U.S. Department of Energy (DOE) in December 2009 to demonstrate integrated commercial-scale production of renewable algal-based fuels. The demonstration / commercial-scale plant will have a capacity of two million liters of oil annually, and was said to provide an important platform for continued work on feedstock flexibility and scaling of new tailored oils into the marketplace.

At Solazyme’s plant in Peoria, Ill., algae is converted into oil. In just a few days Solazyme can recreate a process that in the natural world takes about 300 million years (taken from fuelinggrowth.org via Bing Images).

Solazyme currently works with Chevron, UOP Honeywell, and additional industry leading refining partners, to produce SoladieselRD® renewable diesel, SoladieselHRF-76® renewable diesel for ships, and Solajet® renewable jet fuel for both military and commercial application testing.

In 2010, Solazyme delivered over 80,000 liters of algal-derived marine diesel and jet fuel to the U.S. Navy, constituting the world’s largest delivery of 100% microbial-derived, non-ethanol biofuel. The company was subsequently awarded another contract with the U.S. Department of Defense for production of up to 550,000 additional liters of SoladieselHRF-76®.

While this fuel-producing capacity is only “a tiny drop in the barrel” relative to billions of gallons of fuels used globally, even a doubtful critic would have to admit that the Peoria plant represents a giant step forward relative to Dillon’s Palo Alto garage!

Biomass Bonanza

“Companies have put biofuels on the back burner to aim for higher margin chemicals,” according to Emma Davies’ presumably pun-intended, lead-in sentence of her excellent article in Chemistry World that I draw from here.

Tom Welton, who is head of chemistry at Imperial College London, UK, views lignin—the ‘really hard stuff’ that protects plants from biological attack—as a valuable source of renewable speciality chemicals, notes Davies. She adds that his group has found a way to extract it from lignocellulosic biomass using ionic liquids—novel ‘liquid salts’—and has plans to make polymers using the technology.

In industrial biochemical and biofuel production, lignin is commonly viewed as little more than waste, sometimes burned to generate energy. Instead, the focus is on biomass cellulose, a polymer composed of repeated units of glucose, locked deep within the complex matrix of lignin and polysaccharides that make up lignocellulose.

Davies adds that “the biofuel bubble has not burst, but it was deformed by the fact that oil prices have not risen as quickly as predicted several years ago.” Consequently, efforts have been redirected to find valuable markets outside of fuels wherein lower scales are applicable and don’t need as much capital investment.

From left to right: 5-(chloromethyl) furfural (CMF); 5-(hydroxymethyl)furfural (HMF) and furan-2,5-dicarboxylic acid (taken from Chemistry World).

One success story involves a process that uses hydrochloric acid and gentle heat to digest raw biomass to produce high yields of 5-(chloromethyl) furfural (CMF), a starting point for two compounds with a growing market: 5-(hydroxymethyl)furfural and furan-2,5-dicarboxylic acid. The latter is used to make poly(furan-2,5-dicarboxylic acid) known as PEF, a furan version of PET (polyethylene terephthalate), which can be used to make plastic bottles.

Thus soft drinks are partly responsible for PEF’s expanding market. In 2011, Coca-Cola announced multi-million dollar partnership agreements with three biotechnology companies, Virent, Gevo and Avantium, to ‘accelerate development’ of the commercial solutions for bottles made solely from plant-based materials. While Gevo and Virent focus on PET made from bio-based paraxylene, Dutch company Avantium uses plant-based materials as feedstocks to produce PEF; it also has a bottle deal with French food company Danone.

Such chemical-based processes have clear advantages over the alternative of using selected microorganisms to digest biomass into chemicals. The main advantage is time: no matter how clever your bugs get, they still need time—usually days—to process whatever they are working on. Also, microorganisms don’t work for free; if you start with a glucose molecule, two of its six carbons end up as carbon dioxide. A chemical process, in contrast, offers the possibility of keeping all of the carbon.
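
To make the carbon-loss point concrete, recall the classic ethanol fermentation stoichiometry (my addition, not spelled out in the Chemistry World article):

C6H12O6 → 2 CH3CH2OH + 2 CO2

In other words, two of glucose’s six carbons are vented as carbon dioxide, so at best only about two-thirds of the feedstock carbon can end up in the product.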

Lovin’ lignin

Partial structure of lignin is rich in ‘aromatic’ 6-membered benzene-ring derivatives (taken from polypompholyx.com via Bing Images).

The process to convert raw biomass to CMF leaves behind lignin. Davies’ article quotes a researcher as saying that ‘if anybody were ever to come up with a really simple economic way of getting some very useful molecules—simple aromatic hydrocarbons—in a really good yield from lignin, that would be worth a Nobel prize’. The researcher adds, however, that ‘it’s an enormous challenge because it’s such complicated stuff and so highly oxidized. Oxygens are attached directly to the benzene ring. That’s a very powerful bond which is difficult to break.’

Davies goes on to quote Art Ragauskas from the Georgia Institute of Technology, in Atlanta. ‘There is a lot of interest in what can be done with lignin because the cellulosic ethanol plants generate a lot of it. We have always argued that, since lignin is 20–30% of biomass, then you cannot afford just to say we’ll leave it there and burn it for energy.’

Ragauskas is working on ways to overcome the significant problem of biomass ‘recalcitrance’, the way that lignocellulose’s structural complexity makes it quite resistant to enzymatic hydrolysis used to unlock the cellulose. ‘Recalcitrance is probably the largest sole contributor to the capital and operating costs of converting lignocellulosics to biofuels,’ he says.

‘Five to six years ago, when people talked about recalcitrance, some were skeptical that it would be addressable,’ he says. Now the situation is improving. Ragauskas works with biologists from the BioEnergy Science Center looking at transgenic plants that can reduce recalcitrance, while also carrying out in-depth structural studies to gain an insight into the effects of different types of biomass pre-treatment.

It’s Ionic

According to Davies, Imperial’s Welton is excited about his group’s work to extract lignin from biomass using ionic liquids. The discovery was made by chance by PhD student Agnieszka Brandt, who was working on dissolving biomass using ionic liquids. When some of the reactions did not work as well as others, Welton asked Brandt to find out why. ‘I’ve selectively taken the lignin out,’ revealed Brandt after further investigation. ‘We were lucky but we had prepared minds. This really does look promising as an approach,’ says Welton.

Post-petroleum Polymers

Meanwhile, Welton has a new project to investigate ionic liquid biorefining of lignocellulose to sustainable polymers, sponsored by the UK’s Engineering and Physical Sciences Research Council. He outlines three ‘synthetic challenges’ for renewables. The first, he says, is to get a new source of a compound in use today. Second is to make materials with properties that match those of chemicals currently in use, and third is to create large volumes of chemicals that have new properties and then discover what the chemicals could be used for.

Welton is particularly taken with the second challenge, which his polymer project targets. ‘It’s not the sophisticated chemicals that will be a problem [in a post-petroleum era]. We will find a way of making all the sophisticated chemicals that we want, as we do now,’ he says. ‘We really need to think about the low-cost, high-volume, high-mass materials.’ He looks at all the plastic items on his desk, from the folder that holds his lecture notes to his pen holder. ‘What are we going to make those kind of things from, the really mundane stuff that we take for granted that is all around us?’ he asks.

‘It has to be very simple, based on what’s available and the sophistication of the synthesis has to be really, really good because the margins will be so tiny. So that puts an amazing discipline on the chemistry that you’re going to have to invent,’ says Welton. ‘For me, as a chemist, I find that interesting.’

Big Biomass Breakthrough

In closing, I should add that a research group in the Department of Chemical and Biological Engineering at the University of Wisconsin-Madison reported in Science in January 2014 the first nonenzymatic sugar production from biomass using biomass-derived γ-valerolactone (GVL). They achieved laboratory-scale production of soluble carbohydrates from hardwood and softwood at high yields (70 to 90%) in a solvent mixture of GVL, water, and dilute sulfuric acid. GVL promotes the reaction by completely solubilizing the biomass, including the lignin fraction. The carbohydrates can be recovered and concentrated by extraction from GVL into an aqueous phase by addition of NaCl or liquid CO2. The researchers claim that “this strategy is well suited for catalytic upgrading to furans or fermentative upgrading to ethanol at high titers and near theoretical yield. We estimate through preliminary techno-economic modeling that the overall process could be cost-competitive for ethanol production, with biomass pretreatment followed by enzymatic hydrolysis.”

Hopefully the cutting edge research touched upon in this posting will accelerate the world’s ability to transition from dwindling petroleum-based resources to environmentally safe renewable resources.

Discussions around biofuels always seem to evoke a number of vast and varied opinions. I hope you’ll share yours in the comments section below.

DNA Day 2014

  • How’s the World Celebrating?
  • My Top 3 “Likes” for 2014 DNA Day
  • Where’s the Love for RNA Day?


The Why, What and Where of DNA Day

DNA Day is celebrated on April 25 and commemorates the day in 1953 when James Watson, Francis Crick, Maurice Wilkins, Rosalind Franklin and colleagues published papers in Nature on the double-helix structure of DNA. Furthermore, on that day in 2003 it was declared that the Human Genome Project was nearly 100% complete. “The remaining tiny gaps are considered too costly to fill,” according to BBC News, owing to technical issues—hence the now popular term “accessible genome.”

Book of Life: the sequence of the human genome is published in Science and Nature (taken from lifesciencesfoundation.org).


In the USA, DNA Day was first celebrated on April 25, 2003 by proclamation of both the Senate and the House of Representatives. However, they only declared a one-time celebration. Every year from 2003 onward, annual DNA Day celebrations have been organized by the National Human Genome Research Institute (NHGRI). April 25 has since been declared “International DNA Day” and “World DNA Day” by several groups.

Metal working model used by James Watson and Francis Crick to determine the double-helical structure of the DNA molecule in 1953 (taken from lebbeuswoods.wordpress.com via Bing Images).


Researching DNA Day 2014 for this post revealed the following sampling of major international conferences, country-centric activities, local happenings, and social media—all of which struck me as a remarkable testament that elucidation of the structure and role of DNA has unquestionably had a profound influence on science and society.

The 5th World DNA and Genome Day will be held during April 25-29, 2014 in Dalian, China. Its theme is World’s Dream of Bio-Knowledge Economy. This event aims to promote life science and biotech development, and accelerate international education and scientific information exchange in China. Eleven Nobel Laureates representing various disciplines and countries are showcased in a forum that will undoubtedly provide stimulating discussion. In addition, there will be eight concurrent tracks said to cover “major hot fields” in genetics and genomics.

The NHGRI website for National DNA Day enthusiastically proclaims it as “a unique day when students, teachers and the public can learn more about genetics and genomics!” Featured are an online chatroom (with transcripts back to 2005), various educational webcasts and podcasts, loads of great teaching tools for all levels, and even ambassadors—NHGRI researchers, trainees, and other staff who present current topics in genetics, the work they do, as well as career options in the field to high school audiences. As a former teacher—and continuing taxpayer—it’s gratifying to see all these educational resources and outreach!

NHGRI Ambassadors for National DNA Day educational outreach (taken from nih.gov via Bing Images)


The American Society of Human Genetics (ASHG) held its 9th Annual DNA Day Essay Contest for students in grades 9-12, in parallel with the same contest sponsored by the European Society of Human Genetics. Cash prizes to students—and teaching material grants to their teachers—are awarded to winning essays that address the “2014 Question” quoted as follows:

Complex traits, such as blood pressure, height, cardiovascular disease, or autism, are the combined result of multiple genes and the environment. For ONE complex human trait of your choosing, identify and explain the contributions of at least one genetic factor AND one environmental factor. How does this interplay lead to a phenotype? Keep in mind that the environment may include nutrition, psychological elements, and other non-genetic factors. If the molecular or biological basis of the interaction between the genetic and environmental factors is known, be sure to discuss it. If not, discuss the gaps in our knowledge of how those factors influence your chosen trait.

I don’t know what life-sciences education you received in grades 9-12, but mine was limited to dissecting a smelly, formaldehyde-laced worm and starfish, and definitely did not cover genetic factors and phenotypes! Thanks to the “DNA Revolution,” teaching introductory genetics has markedly progressed!

The pervasiveness of the ‘DNA Revolution’ extends to social media as well. National DNA Day has its own Facebook page chock full of all sorts of interesting and informative links. Back in January of this year there were already 13,000 “likes” and 250 “talking about this”. I found these two items to be interesting enough to read more about them.

Taken from redorbit.com via Facebook


Something that smells wonderful to you could be offensive to your friend, but why this is so has been a mystery. The answer could lie in your genetic makeup, says a research team from Duke University. Their findings, published in the early online edition of Nature Neuroscience, reveal that a difference at the smallest level of DNA — one amino acid on one gene — determines whether or not you like a smell.

Taken from bbc.co.uk via Facebook


Behavior can be affected by events in previous generations which have been passed on through a form of genetic memory, animal studies suggest. Experiments showed that a traumatic event could affect the DNA in sperm and alter the brains and behavior of subsequent generations. A Nature Neuroscience study shows mice trained to avoid a smell passed their aversion on to their “grandchildren.”

My Top 3 “Likes” for DNA Day this Year

Reflecting on DNA Day 2014 led me to musing over which DNA-related topics were especially noteworthy for this post. It wasn’t easy, but I’ve narrowed it down to my Top 3 “Likes” à la Facebook jargon. I was going to reveal these in beauty pageant manner going from runners-up to the winner, but decided that they are completely different and each a winner in a unique way.

I’m admittedly biased about DNA synthesis, which I’ve done for many years, so I’ll start with the next-generation 1536-well oligonucleotide synthesizer with on-the-fly dispenser reported by a team led by Prof. Ronald Davis at Stanford University’s Genome Technology Center. While this is a “must read” for synthetic oligonucleotide aficionados, the following snippets are significant “punch lines”—especially regarding throughput, scale, and cost that collectively drive applications such as the emerging field of synthetic biology. (A quick cost check based on these figures follows the list below.)

  • Produces 1536 samples in a single run using a multi-well filtered titer plate, with the potential to synthesize up to 3456 samples per plate, using an open-well system where spent reagents are drained to waste under vacuum.
  • During synthesis, reagents are delivered on-the-fly to each micro-titer well at volumes ≤ 5μl with plate speeds up to 150 mm/s [that’s fast!].
  • Using gas-phase cleavage and deprotection, a full plate of 1536 60-mers may be processed with same-day turnaround with an average yield per well at 3.5 nmol. Final product at only $0.00277/base [that’s cheap!] is eluted into a low-volume collection plate for immediate use in downstream applications via robotics.
  • Crude oligonucleotide quality is comparable to that of commercial synthesis instrumentation, with an error rate of 1.53/717 bases. Furthermore, mass spectral analysis on strands synthesized up to 80 bases showed high purity with an average coupling efficiency of 99.5%.                     
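For those who, like me, enjoy back-of-the-envelope arithmetic, here is a minimal Python sketch that checks what those per-base and per-well figures imply at the plate level. It uses only the numbers quoted in the bullets above, nothing from the paper itself.

```python
# Back-of-the-envelope check of the quoted cost and yield figures.
# All inputs are the numbers cited in the bullets above.

WELLS_PER_PLATE = 1536
OLIGO_LENGTH = 60           # bases (60-mers)
COST_PER_BASE = 0.00277     # USD per base
YIELD_PER_WELL_NMOL = 3.5   # average crude yield per well

cost_per_oligo = OLIGO_LENGTH * COST_PER_BASE
cost_per_plate = cost_per_oligo * WELLS_PER_PLATE
total_yield_nmol = YIELD_PER_WELL_NMOL * WELLS_PER_PLATE

print(f"Cost per 60-mer:  ${cost_per_oligo:.3f}")        # ~$0.17
print(f"Cost per plate:   ${cost_per_plate:.2f}")        # ~$255
print(f"Yield per plate:  {total_yield_nmol:.0f} nmol")  # 5376 nmol
```

In other words, a full plate of 1536 crude 60-mers works out to roughly $255 in synthesis cost, or about 17 cents per oligonucleotide. Cheap indeed!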

Synthetic biology is the segue into my next “like,” which is somewhat controversial, namely Do-It-Yourself Biology (DIYbio). It is explained at the DIYbio organization’s website, where various activities can be accessed and information is available about the DIYbio logo and “DIYbio revolution” shown below. The website provides links to global discussions, local groups and events, the DIYbio blog, an “ask a biosafety expert your safety question” feature, and a subscription to a quarterly “postcard update.”

An Institution for the Do-It-Yourself Biologist

DIYbio.org was founded in 2008 with the mission of establishing a vibrant, productive and safe community of DIY biologists.  Central to its mission is the “belief that biotechnology and greater public understanding about it has the potential to benefit everyone.”



While this “democratization” of biology is a fascinating “grassroots movement”, a GenomeWeb article reported that some think “it is enabling weekend bioterrorists, disaffected teens, and inventive supervillains to use synthetic biology tools to whip up recipes of synthetic super viruses as easy as grandma’s ragout sauce. It’s only a matter of time until this is the reality, isn’t it?”

Probably not, according to a new report called “Seven Myths and Realities about Do-It-Yourself Biology.” Most of the fears about DIYbio are based on a “miscomprehension about the community’s ability to wield and manipulate life,” says the survey, which was conducted by the Synthetic Biology Project at the Woodrow Wilson International Center for Scholars in Washington DC.

  • The survey of 305 DIYers found that many of them work in multiple spaces, with 46 percent working at a community lab, 35 percent at hackerspaces, 28 percent at academic, corporate, or government labs, and 26 percent at home.
  • This finding goes against the ‘myth’ that most DIYers work anonymously and in solitude. The survey found that only 8 percent of respondents work exclusively at home labs.
  • The project says it is a myth that DIYers are capable of unleashing a deadly epidemic.
  • “The community survey suggests that, far from developing novel pathogens, which would require the skill set of a seasoned virologist and access to pathogens, most DIYers are still learning basic biotechnology,” it says.
  • DIYers also are not averse to oversight or ethical standards, the survey found. So far, they have largely been left out of conversations about government oversight concerning things like dangerous pathogens, though they do lack a formalized checking system. However, the survey found, in part because most of them work in shared spaces, there are informal checks that exclude the use of animals or pathogens.
  • Lastly, group labs are not necessarily going to become havens for bioterrorists, the report says, as DIY community labs have strict rules about access. At Brooklyn’s Genspace, for example, lab community directors evaluate new members and project safety, and consult with a safety committee.
  • The Synthetic Biology Project report also lays out several policy proposals and recommendations for ways to nurture DIYbio and to keep it safe. Education programs should be fostered, academic and corporate partners should get engaged, benchmarks and risk limits should be set, and governments should fund networks of community labs, the report says.

Last but not least of my top 3 “Likes” is Illumina’s recent announcement of achieving the $1,000 Genome! As noted in my March 31st post, this truly amazing milestone—albeit with some cost caveats—has been realized some 12 years after Craig Venter convened and moderated a diverse panel of experts to discuss The Future of Sequencing: Advancing Towards the $1,000 Genome as a ‘hot topic’ at the 14th International Genome Sequencing and Analysis Conference (GSAC 14) in Boston on Oct 2nd 2002. The pre-conference press release presciently added that “the panel will explore new DNA sequencing technologies that have the potential to change the face of genomics over the next few years.” Indeed it has, and getting there has provided very powerful DNA sequencing tools that have transformed life science and enabled a new era of personalized medicine.

Congratulations to everyone who contributed in some way to make this happen!

This book has received a “4-out-of-5 star” rating at Amazon (taken from Bing Images).


Where’s the Love for RNA Day?

While writing this post, it struck me that RNA should get its day, too! Here’s why:

While DNA encodes the “blueprint” for life, its transcription into messenger RNA (mRNA) and the subsequent translation of that mRNA turn the blueprint into proteins and, ultimately, all living organisms. But mRNA and other requisite RNAs, such as ribosomal RNA (rRNA) and transfer RNA (tRNA), are only part of the story of life. A host of additional classes of RNA, namely short and long non-coding RNAs, are now recognized to play critical roles. A 2014 review in Nature puts it this way:

The importance of the non-coding transcriptome has become increasingly clear in recent years—comparative genomic analysis has demonstrated a significant difference in genome utilization among species (for example, the protein-coding genome constitutes almost the entire genome of unicellular yeast, but only 2% of mammalian genomes). These observations suggest that the non-coding transcriptome is of crucial importance in determining the greater complexity of higher eukaryotes and in disease pathogenesis. Functionalizing the non-coding space will undoubtedly lead to important insight about basic physiology and disease progression.

DNA and RNA are wonderfully intertwined in the molecular basis of life, so why shouldn’t RNA have its day like DNA does? Any suggested dates for RNA Day? Let’s start the celebration!

Your comments about this or anything else in this post are welcomed.

Sequence Every Newborn?

  • Envisaged in the 2002 Challenge for Achieving a $1,000 Genome
  • Are We There Yet? Yes…and No
  • So Where Are We, Actually?

The notion of metaphorically ‘dirt cheap’ genome sequencing is now so prevalent that it seems to have been always available—and have virtually unlimited utility—as previously touted in a provocative article in Nature Biotechnology rhetorically entitled What would you do if you could sequence everything? The notion of ‘everything’ obviously includes all people, and—as we’re reminded by the now familiar t-shirt statement—‘babies are people too.’ Unlike the ‘big bang’ origin of the universe, cheap-enough-sequencing-for-everything, including all babies (newborns, actually) just didn’t happen spontaneously, so when and how did this come about?

In the Beginning…

According to the bible, the history of creation began when, “in the beginning God created the heavens and the earth.” The history of cheap-enough-sequencing-for-everything, according to my tongue-in-cheek reckoning, is that “In the beginning Craig Venter convened and moderated a diverse panel of experts to discuss The Future of Sequencing: Advancing Towards the $1,000 Genome.” This was a ‘hot topic’ at the 14th International Genome Sequencing and Analysis Conference (GSAC 14) in Boston on Oct 2nd 2002. The pre-conference press release added that “the panel will explore new DNA sequencing technologies that have the potential to change the face of genomics over the next few years.”

While it’s taken considerably more than a ‘few years’ to achieve the $1,000 genome, continual decrease in sequencing costs has enabled large-scale sequencing initiatives such as the 1000 Genomes Project. Nick Loman’s 2013 blog entitled The biggest genome sequencing projects: the uber-list! outlines the 15 largest sequencing projects to date and takes a look at some of the massive projects that are currently in the works.

According to GenomeWeb on Jan 14th 2014, Illumina launched a new sequencing system that can now produce a human genome for under $1,000, claiming to be the first to achieve this ‘long sought-after goal’—roughly 12 years in the making, by my reckoning. A LinkedIn post on Jan 15th by Brian Maurer, Inside Sales at Illumina, quotes Illumina’s CEO as saying that “one reagent kit to enable 16 genomes per run will cost $12,700, or $800 per genome for reagents. Hardware will add an additional $137 per genome, while sample prep will range between $55 and $65 per genome.” While Illumina is staking claim to the $1,000 genome and customers acknowledge the new system provides a drastic price reduction, the actual costs of sequencing a genome are being debated. Sequencing service provider, AllSeq, discusses the cost breakdown in their blog. While initially a bit negative, their outlook on the attainable costs seems to be improving.
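Those figures are easy to sanity check. Here is a minimal Python sketch using only the numbers quoted from that LinkedIn post; the $800 reagent figure is simply $12,700 divided by 16, rounded up.

```python
# Rough sanity check of the per-genome cost breakdown quoted above.
# Sample prep is quoted as a $55-$65 range, so both ends are shown.

REAGENT_KIT_COST = 12_700     # USD; one kit enables 16 genomes per run
GENOMES_PER_KIT = 16
HARDWARE_PER_GENOME = 137     # USD, amortized instrument cost
SAMPLE_PREP_RANGE = (55, 65)  # USD per genome

reagents_per_genome = REAGENT_KIT_COST / GENOMES_PER_KIT   # $793.75
for prep in SAMPLE_PREP_RANGE:
    total = reagents_per_genome + HARDWARE_PER_GENOME + prep
    print(f"With ${prep} sample prep: ~${total:,.0f} per genome")
# Both totals land just under $1,000 (~$986 and ~$996), hence the claim.
```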

So, with affordable genome sequencing apparently being a reality, are we ready to sequence every newborn?

Yes…and No

Why this conflicting answer of ‘yes…and no’? The ‘yes’ part is based on the fact that NIH has recently funded four studies on the benefits—and risks—of newborn genome sequencing. The ‘no’ part reflects the added facts that these studies will take five years, and will look at how genome testing of newborns could improve screening, and address what some geneticists view as their most sensitive ethical questions yet.

Put another way, and as detailed in the following section, the requisite low-cost DNA sequencing technology is now available but it needs to be demonstrated—through technical feasibility investigations and in clinical pilot studies—that newborns receive health benefits of the type expected, and that numerous ‘tricky’ ethical issues can be dealt with in an ‘acceptable manner’—although to whom it is acceptable is not obvious to me at this time. Likely controversial views on reimbursement also have to be addressed, but that’s a whole other topic—dare I say ‘political football’—vis-à-vis Obamacare, oops, I mean the Affordable Care Act.

So What’s the Plan?

The following answer is an adaptation of the Sep 13th 2013 Science News & Analysis by Jocelyn Kaiser entitled Researchers to Explore Promise, Risks of Sequencing Newborns’ DNA.

The National Institute of Child Health and Human Development (NICHD) is rolling out a $25 million, 5-year federal research program to explore the potential value of looking at an infant’s genome to examine all of the genes or perhaps a particularly informative subset of them. This genome testing could significantly supplement the decades-old state screening programs that take a drop of blood from nearly every newborn’s heel and test it for biochemical markers for several dozen rare disorders. Diagnosing a child at birth can help prevent irreversible damage, as in phenylketonuria, a single-gene metabolic disorder that can be controlled with diet.


Handle with care. Genome testing could enhance newborn screening, but it raises ethical issues [credit: Spencer Grant/Science Vol. 341, p. 1163 (2013)].

Screening for biochemical markers often turns up false positives, however, which genetic tests might help avoid. Moreover, genome sequencing of a single sample could potentially look for all of the ~4,000 (some estimate ~10,000) monogenic diseases—i.e., those caused by defects in single genes. For more information on the subject, the World Health Organization website provides a good introduction.

The ethical concern here is that genome sequencing, unlike the current newborn screening tests, could potentially reveal many more unexpected genetic risks, some for untreatable diseases. Which of these results should be divulged is already controversial, according to Kaiser, who added that “sparks are still flying” over an earlier report described in Science as follows:

Geneticists, ethicists, and physicians reacted with shock to recommendations released last week by the American College of Medical Genetics and Genomics: that patients undergoing genomic sequencing should be informed whether 57 of their genes put them at risk of serious disease in the future, even if they don’t want that information now. The recommendations also apply to children, whose parents would be told even if illness wouldn’t strike until adulthood. The advice runs counter to the long-standing belief that patients and parents have the right to refuse DNA results. This is the first time that a professional society has advised labs and doctors what to do when unanticipated genetic results turn up in the course of sequencing a patient’s genome for an unrelated medical condition.

Given this background, it’s reassuring—in my opinion—that NICHD is taking a ‘go slow’ approach. I’m further reassured that NICHD is funding research of technical and ethical/social issues in four different but interrelated studies (see table below). Kaiser adds that “all will examine whether genomic information can improve the accuracy of newborn screening tests, but they differ in which additional genes they will test and what results they will offer parents.”

New ground: four projects funded at a total of $25 million over 5 years will look at how genome testing could improve newborn screening and other questions [credit: Spencer Grant/Science Vol. 341, p. 1163 (2013)].


More specifically, Stephen Kingsmore at Children’s Mercy Hospital in Kansas City, Missouri, wants to halve the time for his current 50-hour test—discussed in the next section—which he has used to diagnose genetic disorders in up to 50% of infants in his hospital’s neonatal intensive care unit. The test homes in on a subset of genes that may explain the baby’s symptoms. While his group may ask parents if they’re interested in unrelated genetic results, the focus is on ‘a critically ill baby and a distressed family who wants answers,’ Kingsmore told Kaiser.

A team at the University of North Carolina is studying how to return results to low-income families and others who might not be familiar with genomics. It is also dividing genetic findings into three categories—mutations that should always be reported; those that parents can choose to receive, which might include risk genes for adult cancers; and a third set that should not be disclosed, e.g. untreatable adult-onset diseases such as Alzheimer’s.

A team at Brigham and Women’s Hospital in Boston and Boston Children’s Hospital hopes to learn how doctors and parents will use genomic information. ‘We’re trying to imagine a world where you have this information available, whether you’re a sick child or healthy child. How will it change the way doctors care for children?’ asks co-principal investigator Robert Green.

Ethicist Jeffrey Botkin of the University of Utah opined that sequencing might never replace existing newborn screening because of its costs and the complexity, according to Kaiser. However, Kaiser said that Botkin and others believe that it’s important to explore these issues because wealthy, well-informed parents will soon be able to mail a sample of their baby’s DNA to a company to have it sequenced—regardless of whether medical experts think that’s a good idea. ‘There’s an appetite for this. It will be filled either within the medical establishment or outside of it,’ Kaiser quotes Green as saying.

I should add that this parental ‘appetite’ doesn’t seem to be easily satisfied at the moment, based on my—admittedly superficial—survey of what’s currently available in the commercial genome sequencing space. For now, companies such as Personalis and Knome restrict their offerings to researchers and clinicians, not direct-to-consumers, such as parents-on-behalf-of-newborns—yet.

Sample-to-Whole-Genome-Sequencing-Diagnosis in Only 50 Hours

Having been a laboratory investigator during the stunning evolution from manual Maxam-Gilbert sequencing to highly automated Sanger sequencing—the title of this section ‘blows my mind’ and seems impossible—but it’s not! Stephen Kingsmore and collaborators at the Children’s Mercy Hospital in Kansas City, Missouri reported this remarkable achievement in Science Translational Medicine in 2012 and, as noted in the aforementioned table, aim to cut this turnaround time to within 24 hours!

In that 2012 report, they make a compelling case for whole-genome sequence-based diagnostics—and super speedy sample-to-result by noting that monogenic diseases are frequent causes of neonatal morbidity and mortality, and disease presentations are often undifferentiated at birth. Of the ~4,000 monogenic diseases that have been characterized, clinical testing is available for only a handful of them and many feature clinical and genetic heterogeneity. Hence, an immense unmet need exists for improved molecular diagnosis in infants. Because disease progression is extremely rapid—albeit heterogeneous—in  newborns, molecular diagnoses must occur quickly to be relevant for clinical decision-making.

Using the workflow and timeline outlined below, they describe 50-hour differential diagnosis of genetic disorders by whole-genome sequencing (WGS) that features automated bioinformatics analysis and is intended to be a prototype for use in neonatal intensive care units. I should add that an automated bioinformatics analysis is critical for clinical utility, and has been the subject of ‘musings’ by Elaine Mardis in Genome Medicine entitled The $1,000 genome, the $100,000 analysis?

Summary of the steps and timing (t, hours) resulting in an interval of 50 hours between consent and delivery of a preliminary, verbal diagnosis [taken from Saunders et al. Sci Transl Med 4, 154ra135 (2012)].

To validate the feasibility of automated matching of clinical terms to diseases and genes, they retrospectively entered the presenting features of 533 children, who had received a molecular diagnosis at Children’s Mercy Hospital over the previous 10 years, into symptom- and sign-assisted genome analysis (SSAGA)—a new clinico-pathological correlation tool that maps the clinical features of 591 well-established, recessive genetic diseases with pediatric presentations to corresponding phenotypes and genes known to cause the symptoms. Sensitivity was 99.3%, as determined by correct disease and affected gene nominations.

Rapid WGS was made possible by two innovations. First, a widely used WGS platform (Illumina HiSeq 2500) was modified to generate up to 140 Gb [Gb = giga base pairs = 1,000,000,000 base pairs] of sequence in less than 30 hours. Second, sample preparation was streamlined to only 4.5 hours, while 2 × 100-base-pair genome sequencing took 25.5 hours. The total ‘hands-on’ time for technical staff was 5 hours.
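Two derived numbers help put that turnaround in perspective. The short Python sketch below assumes a haploid human genome of roughly 3.1 Gbp (my assumption, not a figure from the paper); the run output and step timings come from the paragraph above.

```python
# Quick arithmetic on the rapid-WGS numbers quoted above.
# The ~3.1 Gbp haploid human genome size is an assumption on my part.

MAX_RUN_OUTPUT_GB = 140   # Gb per rapid run, generated in < 30 hours
HUMAN_GENOME_GB = 3.1     # assumed haploid genome size, in Gb

TOTAL_TURNAROUND_H = 50   # consent to preliminary verbal diagnosis
SAMPLE_PREP_H = 4.5
SEQUENCING_H = 25.5

max_coverage = MAX_RUN_OUTPUT_GB / HUMAN_GENOME_GB
remaining_h = TOTAL_TURNAROUND_H - SAMPLE_PREP_H - SEQUENCING_H

print(f"Theoretical max coverage per run: ~{max_coverage:.0f}x")  # ~45x
print(f"Hours left for everything else (consent, alignment, variant "
      f"calling, SSAGA matching, reporting): {remaining_h:.0f}")  # 20
```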

Readers who are interested in more technical details for sample prep, sequencing, and bioinformatics/analytics should read the full text. However, the authors’ abstract provides the following succinctly described diagnostic ‘payoff’, so to speak:

Prospective WGS disclosed potential molecular diagnosis of a severe GJB2-related skin disease in one neonate; BRAT1-related lethal neonatal rigidity and multifocal seizure syndrome in another infant; identified BCL9L as a novel, recessive visceral heterotaxy gene (HTX6) in a pedigree; and ruled out known candidate genes in one infant. Sequencing of parents or affected siblings expedited the identification of disease genes in prospective cases. Thus, rapid WGS can potentially broaden and foreshorten differential diagnosis, resulting in fewer empirical treatments and faster progression to genetic and prognostic counseling.

These are compelling results, in my opinion. Let me know if you also find this compelling.  As always, your comments are welcomed.

Postscript

After I finished the above post, Andrew Pollack of The New York Times published a fascinating article giving examples of how pharmaceutical companies are heavily investing in genetic studies that employ exome sequencing of large study groups in a search for clues to aid drug development. Regeneron is conducting one such study that includes 100,000 genomes.

Searching for ‘Genius Genes’ by Sequencing the Super-Smart

  • Brainchild of a high-school dropout
  • Joined by two renowned Professors in the USA and UK
  • Enabled by the world’s most powerful sequencing facility
  • Jonathan Rothberg to do the same for math ability

Prologue

Before plunging into this post, a quick aside: those of you who follow college basketball are eagerly awaiting the start of “March Madness” and its “bracketology” for predicting all the winners, at odds of 1-in-9.2 quintillion (that’s nine followed by 18 zeros), which is why Warren Buffett will almost certainly not have to pay out the $1 billion he offered for doing so.

The following short story of how basketball came about is worth a quick read before getting to this posting’s DNA sequencing projects, which are not “madness” but definitely long-shot bets—and criticized by some. 


The original 1891 “Basket Ball” court in Springfield College used a peach basket attached to the wall (taken from Wikipedia).

James Naismith (1861 – 1939) was a Canadian American sports coach and innovator. He invented the sport of basketball in 1891 and wrote the original basketball rulebook. At Springfield College, Naismith struggled with a rowdy class that, confined to indoor games throughout the harsh New England winter, was perpetually short-tempered. Under orders from Dr. Luther Gulick, head of Springfield College Physical Education, Naismith was given 14 days to create an indoor game that would provide an “athletic distraction.” Gulick demanded that it not take up much room, that it help the track athletes keep in shape, and explicitly emphasized to “make it fair for all players and not too rough.” Naismith did so using the actual “basket and ball” pictured above.

SNPs and GWAS assist in finding the roots of intelligence

Many studies indicate that intelligence is heritable, but to what extent is yet uncertain (taken from the Wall Street Journal via Bing Images).


Many of you are well aware of—if not actually involved in—the use of DNA sequence analysis to identify common single nucleotide polymorphisms (SNPs) that are associated with diseases or traits in a study population, relative to a normal control population. These genome-wide association studies (GWAS) were principally enabled in the 1990s by high-density “SNP chips” developed by Affymetrix and then Agilent. While technically straightforward, there’s a lot of genetics and not-so-simple statistics to deal with in designing GWAS and—especially—properly interpreting the results.
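To give a flavor of what those “not-so-simple statistics” involve, here is a minimal, purely illustrative Python sketch of the per-SNP test at the heart of a case/control GWAS. The allele counts are invented for the example; a real study would add quality control, correction for population structure, and adjustment for the millions of SNPs tested, which is why the stringent genome-wide threshold appears below.

```python
# Minimal illustration of a single-SNP case/control association test,
# the basic building block of a GWAS. Allele counts are hypothetical.

from scipy.stats import chi2_contingency

# 2x2 allele-count table: rows = cases/controls, columns = risk/other allele
allele_counts = [[1200,  800],   # cases    (hypothetical)
                 [1000, 1000]]   # controls (hypothetical)

chi2, p_value, dof, _ = chi2_contingency(allele_counts)

# Conventional genome-wide threshold (~1 million independent tests)
GENOME_WIDE_ALPHA = 5e-8

print(f"chi-square = {chi2:.1f}, p = {p_value:.1e}")
print("Genome-wide significant" if p_value < GENOME_WIDE_ALPHA
      else "Not genome-wide significant")
```

Repeat that test a few million times across the genome and the multiple-testing burden alone explains why effects must be sizeable, or samples enormous, before anything rises above the noise.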

In the future, Junior’s DNA sequence could implicate other reasons for his failing academic performance, e.g. not studying enough (taken from dailymail.co.uk via Bing Images).


Now, following the advent of massively parallel “next generation” sequencing (NGS) platforms from Illumina and Life Technologies, whole genomes of larger populations (i.e. “many” 1,000s of individuals) can be studied, and less common (aka rare) SNPs can be sought. All of this has fueled pursuit of more challenging—and controversial—GWAS.

So it is the following two ongoing stories that I’ve referred to as the search for genius genes. One was conceived by Bowen Zhao—a teenaged Chinese high-school dropout—and aims to find the roots of intelligence in our DNA by sequencing the “off-the-chart” super-smarties; the other is a newer project by Jonathan Rothberg—über-famous founder of Ion Torrent, which commercialized the game-changing semiconductor-sequencing technology acquired for mega millions by Life Tech—aimed at identifying the roots of mathematical ability by, need I say, Ion Torrent sequencing.

From Chinese high-school dropout to founder of a Cognitive Genomics Unit

It’s a gross understatement to say that Mr. Bowen Zhao is an interesting person—he’s actually an amazing person. As a 13 year old in 2007, he skipped afternoon classes at his school in Beijing and managed to get an internship at the Chinese Academy of Agricultural Sciences where he cleaned test tubes and did other simple jobs. In return, the graduate students let him borrow genetics textbooks and participate in experiments, including the sequencing of the cucumber genome. When the study of the cucumber genome was published in Nature Genetics in 2009, Mr. Zhao was listed as a co-author at the age of 15.

Tantalized by genomics, Mr. Zhao quit school and began to work full-time at BGI Shenzhen (near Hong Kong), one of the largest genomics research centers in the world. BGI (formerly known as the Beijing Genomics Institute) is a private company—partly funded by the Chinese government—that significantly expanded its sequencing throughput last year by acquiring Complete Genomics of Mountain View, California.

Mr. Bowen Zhao is a young researcher with amazing accomplishments (taken from thetimes.co.uk via Bing Images)


The BGI project is sequencing DNA from IQ outliers comparable to Einstein (taken from rosemaryschool.org via Bing Images).


In 2010, BGI founded the Cognitive Genomics Unit and named Mr. Zhao as its Director of Bioinformatics. The Cognitive Genomics Unit seeks to better understand human cognition with the goal of identifying the genes that influence intelligence. Mr. Zhao and his team are currently using more than 100 state-of-the-art next generation sequencers to decipher some 2,200 DNA samples from some of the brightest people in the world—extreme IQ outliers. The majority of the DNA samples come from people with IQs of 160 or higher, which puts them at the same level as Einstein. By comparison, average IQ in any population is set at 100, and the average Nobel laureate registers at around 145. Only one in every 30,000 people (0.003%) would qualify to participate in the BGI project.
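As a quick aside, that “one in 30,000” figure checks out if one assumes, as is conventional, that IQ scores are normally distributed with a mean of 100 and a standard deviation of 15 (the assumption is mine; the paragraph above quotes only the cutoff and the rarity):

```python
# Quick check of the "one in every 30,000 people" claim for IQ >= 160.
# Assumes IQ ~ Normal(mean=100, sd=15), the usual scoring convention.

from scipy.stats import norm

IQ_MEAN, IQ_SD, CUTOFF = 100, 15, 160

fraction_above = norm.sf(CUTOFF, loc=IQ_MEAN, scale=IQ_SD)  # ~3.2e-5

print(f"Fraction with IQ >= {CUTOFF}: {fraction_above:.2e}")
print(f"That is roughly 1 in {1 / fraction_above:,.0f} people "
      f"({fraction_above * 100:.4f}%)")
```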

In an article by Gautam Naik of the Wall Street Journal, Mr. Zhao is quoted as saying that “people have chosen to ignore the genetics of intelligence for a long time.” Mr. Zhao, who hopes to publish his team’s initial findings this year, added that “people believe it’s a controversial topic, especially in the West [but] that’s not the case in China,” where IQ studies are regarded more as a scientific challenge and therefore are easier to fund.

According to Naik, the roots of intelligence are a mystery, and studies show that at least half of IQ variation is inherited. While scientists have identified some genes that can significantly lower IQ—in people afflicted with mental retardation, for example—truly important genes that affect normal IQ variation have yet to be pinned down.

The BGI researchers hope to crack the problem by comparing the genomes of super-high-IQ individuals with the genomes of people drawn from the general population. By studying the variation in the two groups, they hope to isolate some of the hereditary factors behind IQ. Their conclusions could lay the groundwork for a genetic test to predict a person’s inherited cognitive ability. Although such a tool could be useful, it also might be divisive.

“If you can identify kids who are going to have trouble learning, you can intervene” early on in their lives, through special schooling or other programs, says Robert Plomin, Professor of Behavioral Genetics at King’s College, London, who is involved in the BGI project and quoted by Naik.

Critics, however, worry that genetic data related to IQ could easily be misconstrued—or misused. Research into the science of intelligence has been used in the past “to target particular racial groups or individuals and delegitimize them,” said Jeremy Gruber, President of the Council for Responsible Genetics, a watchdog group based in Cambridge, Massachusetts. “I’d be very concerned that the reductionist and deterministic trends that still are very much present in the world of genetics would come to the fore in a project like this,” Gruber added.

Obtaining access to ‘genius genes’ wasn’t easy

Getting DNA to sequence from super-smart people was easier said than done. According to Naik, Zhao’s first foray into the genetics of intelligence was a plan to collect DNA from high-achieving kids at local high schools. It didn’t work. “Parents were afraid [of giving consent] because their children’s blood would be taken,” Zhao told Naik.

In the spring of 2010, Stephen Hsu—a theoretical physicist from the University of Oregon (now at Michigan State University) who was also interested in the genetics of cognitive ability—visited BGI and joined Zhao to launch the BGI intelligence project. One part of the plan called for shifting to saliva-based DNA samples obtained from mathematically gifted people, including Chinese who had participated in mathematics or science Olympiad training camps. Another involved the collection of DNA samples from high-IQ individuals from the U.S. and other countries, including those with extremely high SAT scores, and those with a doctorate in physics or math from an elite university. In addition, anyone could enroll via BGI’s website—if they met the criteria—as have about 500 qualifying volunteers to date.

Interestingly, most of the samples so far have come from outside of China. The main source is Prof. Plomin of King’s College, who for his own research had collected DNA samples from about 1,600 individuals whose IQs were off the charts. Those samples were obtained through a U.S. project known as the Study of Mathematically Precocious Youth, now in its fourth decade. Dr. Plomin tracked down 1,600 adults who had enrolled as kids in the U.S. project, now based at Vanderbilt University. Their DNA contributions make up the bulk of the BGI samples.

Frequently asked questions about the BGI intelligence project, as well as a link to the detailed project proposal, can be read by clicking here. The penultimate and last paragraphs of the introductory section of this proposal are the following:

The brain evolved to deal with a complex, information-rich environment. The blueprint of the brain is contained in our DNA, although brain development is a complicated process in which interactions with the environment play an important role. Nevertheless, in almost all cases a significant portion of cognitive or behavioral variability in humans is found to be heritable—i.e., attributable to genetic causes.

The goal of the BGI Cognitive Genomics Lab (CGL) is to investigate the genetic architecture of human cognition: the genomic locations, allele frequencies, and average effects of the precise DNA variants affecting variability in perceptual and cognitive processes. This document outlines the CGL’s proposal to investigate one trait in particular: general intelligence or general mental ability, often referred to as “g.”

On Jan 1st 2014, I contacted Prof. Hsu, who coauthored BGI’s “g” proposal, and asked him to clarify whether genome sequencing was in fact being used, as opposed to SNP genotyping chips that were specified in the aforementioned proposal’s Materials and Methods section. I also inquired as to whether any results have been published. His reply on the same day was that the “initial plan was SNPs but [was] upgraded to sequencing. No results yet.”

Stay tuned.

Jonathan Rothberg’s ‘Project Einstein’ taps 400 top mathematicians

In the October 31st 2013 issue of Nature, Erika Check Hayden reported on ‘Project Einstein,’ Ion Torrent founder/inventor/serial entrepreneur Jonathan Rothberg’s new venture aimed at identifying the genetic roots of math genius.

Jonathan Rothberg, founder of CuraGen, 454 Life Sciences, Ion Torrent, Rothberg Center for Childhood diseases, and RainDance Technologies (taken from nathanielwelch.com via Bing Images).


According to Check Hayden’s news article, Rothberg and physicist/author Max Tegmark at MIT in Cambridge “will be wading into a field fraught with controversy” by enrolling about 400 mathematicians and theoretical physicists from top-ranked US universities in ‘Project Einstein’ to sequence the participants’ genomes using Ion Torrent machines that Rothberg developed. Critics claim that the study population, like BGI’s “g” project, is too small to yield meaningful results for such complex traits. Check Hayden adds that “some are concerned about ethical issues. If the projects find genetic markers for math ability, these could be used as a basis for the selective abortion of fetuses or in choosing between embryos created through in vitro fertilization.” She says that Rothberg is pushing ahead, and quotes him as stating, “I’m not at all concerned about the critics.”

On the positive side, Prof. Plomin mentioned above in BGI project “g” is said to believe that there is no reason why genome sequencing won’t work for math ability. To support this position, Plomin refers to his 2013 publication entitled Literacy and numeracy are more heritable than intelligence in primary school, which indicates that as much as two-thirds of a child’s mathematical aptitude seems to be influenced by genes.

I’ll be keeping tabs on the project to see how it progresses and how the ethics issue plays out.

The genetics of intelligence is complex and has foiled attempts to decipher it

After reading about the scientifically controversial aspects of both project “g” and ‘Project Einstein,’ I became curious about the outcomes of previous attempts to decipher the genetic basis of intelligence. There was way too much literature to delve into deeply, but a 2013 New Scientist article by Debora MacKenzie entitled ‘Intelligence genes’ evade detection in largest study is worth paraphrasing, as it distills out some simplified takeaways from the referenced study by Koellinger and 200 (!) collaborators published in Science.

  • This team of researchers assembled 54 sets of data on more than 126,000 people who had their genomes analyzed for 2.5 million common SNPs, and for whom information was available on length and level of education. Study organizer Koellinger admits that educational achievement is only a rough proxy for intelligence, but this information was available for the requisite large number of people.
  • Three SNPs from 100,000 people correlated significantly with educational achievement, and were tested against SNPs from the other 26,000 people. The same correlations held, replicating the first analysis. However, the strength of the correlations for each SNP accounted for at most 0.002% of the total variation in educational attainment.
  • “Probably thousands of SNPs are involved, each with an effect so small we need a much larger sample to see it,” says Koellinger. Either that, or intelligence is affected to a greater degree than other heritable traits by genetic variations beyond these SNPs—perhaps rare mutations or interactions between genes. (A rough sample-size calculation, shown after this list, illustrates just how much larger.)
  • Robert Plomin adds that whole genome sequencing, as being done by BGI, allows researchers to “look for sequence variations of every kind.” Then, the missing genes for intelligence may finally be found, concludes MacKenzie.
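To see why “much larger” is an understatement, here is a rough sample-size sketch based on the standard Fisher z-transformation approximation for detecting a correlation. The 0.002% variance-explained figure comes from the bullets above; the genome-wide significance threshold and 80% power are my assumptions, not values taken from the study.

```python
# Rough sample size needed to detect a SNP explaining ~0.002% of trait
# variance, using the Fisher z-transformation approximation for a
# correlation test. Alpha and power choices are assumptions.

import math
from scipy.stats import norm

VARIANCE_EXPLAINED = 0.00002   # 0.002% of variance, per the bullet above
ALPHA = 5e-8                   # genome-wide significance threshold
POWER = 0.80

r = math.sqrt(VARIANCE_EXPLAINED)     # implied correlation, ~0.0045
z_alpha = norm.ppf(1 - ALPHA / 2)     # ~5.45
z_power = norm.ppf(POWER)             # ~0.84
fisher_z = 0.5 * math.log((1 + r) / (1 - r))

n_required = ((z_alpha + z_power) / fisher_z) ** 2 + 3
print(f"Approximate sample size needed: {n_required:,.0f}")  # ~2 million
```

Taken at face value, that works out to on the order of two million people for well-powered detection of a single such SNP, which helps explain Koellinger’s call for much larger samples.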

Parting Thoughts 

Most, if not all of you, will agree with the contention that a human being is not merely a slave to his or her genes. After all, hasn’t determinism been swept away by the broom of quantum mechanical probabilities as a physical basis of free will? If so, then what role does inherited genetics actually play in intelligence? While the answer to this rhetorical question is obviously not simple, and still hotly debated, I found my thoughts to be largely reflected by a posting at Rose Mary School paraphrased as follows, keeping in mind that all analogies are imperfect:

Human life has been compared to a game of cards. At birth, every person is dealt a hand of cards—i.e., his or her genetic make-up. Some receive a good hand, others a less good one. Success in any game, however, is almost always a matter of education, learning, and culture. For sure, there are often certain innate qualities that will give one person an advantage over another in a specific game. However, without having learned the game and without regular and rigorous practice, nobody will ever become a champion at any game. In the same way, the outcome of the game of life is not solely determined by the quality of a person’s initial hand of cards, but also by the way in which he or she takes part in the game of life. His or her ability to take part in the game of life satisfactorily, perhaps even successfully, will be determined to a very large extent by the quality and quantity of education that he or she has enjoyed.

When I gave advice to students, as a teacher, it was very simple and what I did—and still do—myself: “study hard, work harder, and success will follow.”

As always, your comments are welcomed.