The rapid advancement of next-generation sequencing technology, also known as massively parallel sequencing (MPS), has revolutionized many areas of applied research. One such area, the analysis of mitochondrial DNA (mtDNA) in forensic applications, has traditionally used another method—Sanger sequencing followed by capillary electrophoresis (CE).
Although MPS can provide a wealth of information, its adoption in forensic workflows has been slow. However, the barriers to adoption have been lowered in recent years, as exemplified by the number of abstracts discussing the use of MPS presented at the 29th International Symposium on Human Identification (ISHI 29), held in September 2018. Compared to Sanger sequencing, MPS can provide more data on minute variations in the human genome, particularly for the analysis of mtDNA and single-nucleotide polymorphisms (SNPs). It is especially powerful for analyzing mixture samples or those where the DNA is highly degraded, such as in human remains.
Are we better off now than we were 10 years ago? Oftentimes this question is answered subjectively, and the answer will vary from person to person. We can empirically show how life expectancy has increased over the centuries thanks to advances in the fields of agriculture and medicine, but what about quality of life? Science affects our lives every day, and the general notion is that better science will (eventually) translate into better lives. I share with many others a burning curiosity to quantify how science has progressed over the years:
Bornmann and Mutz demonstrate in the image shown above how we have been doubling scientific output every nine years since the 1940s. That is not to say that we have become twice as smart or efficient; this phenomenon could be partially fueled by a desire to gain prestige through a high number of publications. To better assess the topic of efficiency, we can measure how long it takes to perform specific procedures and how much they cost. This article compares the rate of improvement for DNA sequencing, PCR, GC-MS and general automation to the rate of improvement for supercomputers and video game consoles.
It is with sadness that we recognize the passing of Dr. Frederick Sanger. Sanger is known to molecular biologists and biochemists worldwide for his DNA sequencing technique, which won him the 1980 Nobel Prize in Chemistry.
Also noteworthy, Sanger’s laboratory completed the first whole-genome sequence: that of bacteriophage φX174, a viral DNA genome more than 5,000 nucleotides in length.
The 1980 prize was Sanger’s second Nobel award, the first having been awarded in 1958 for determining the chemical structure of proteins. In this work, Sanger elucidated not only the amino acids that make up insulin but also the order in which those amino acids occur.
About Sanger Sequencing
Sanger DNA sequencing is also known as the chain-termination method of sequencing. The Sanger technique uses dideoxynucleotides (ddNTPs) in addition to the typical deoxynucleotides (dNTPs) in the reaction. Incorporation of a ddNTP terminates the growing DNA strand, because ddNTPs lack the 3’-OH group required to form a phosphodiester bond with the next nucleotide. Without this bond, the chain of nucleotides being synthesized cannot be extended further.
Sanger sequencing requires a single-stranded DNA template, a DNA primer (either radiolabeled or carrying a fluorescent tag), DNA polymerase, dNTPs and ddNTPs. Four reactions are set up, one for each nucleotide: G, A, T and C. Each reaction includes all four dNTPs, but only one ddNTP (ddATP, ddCTP, ddGTP or ddTTP). The sequencing reactions are performed, and the products are denatured and separated by size using polyacrylamide gel electrophoresis.
This reaction mix produces fragments of varying lengths that mark, for instance, the location of each A in the sequence. Although there is far more dATP than ddATP in the A reaction, there is enough ddATP that, across the population of molecules, a ddATP is eventually incorporated at every position calling for an A, terminating the chain there. Separation by gel electrophoresis reveals the sizes of these ddATP-terminated fragments, and thus the locations of all A nucleotides in the sequence. The G, C and T reactions provide the same information for the other three bases.
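The logic of reading the four reactions off a gel can be sketched in code. The short Python simulation below is purely illustrative (the function names and toy template are my own invention, not any standard tool): each reaction yields fragment lengths marking where its ddNTP terminated the chain, and sorting all fragments by size, as a gel does, reads out the synthesized strand.

```python
# Toy simulation of Sanger chain termination (illustrative only).
# In the "A" reaction, a fragment terminates wherever the growing
# strand would incorporate an A (i.e., opposite a T in the template).

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def termination_fragments(template, ddntp_base):
    """Return lengths of synthesized fragments that end where the
    growing strand incorporates the given ddNTP base."""
    lengths = []
    for i, tmpl_base in enumerate(template):
        # The polymerase adds the complement of the template base;
        # a ddNTP of that base terminates the chain at this position.
        if COMPLEMENT[tmpl_base] == ddntp_base:
            lengths.append(i + 1)  # fragment includes this position
    return lengths

def read_sequence(template):
    """Pool all four reactions and sort fragments by size,
    mimicking reading a sequencing gel from bottom to top."""
    gel = {}
    for base in "GATC":
        for length in termination_fragments(template, base):
            gel[length] = base
    return "".join(gel[size] for size in sorted(gel))

template = "TACGGTA"
print(read_sequence(template))  # the strand synthesized against it
```

For the toy template TACGGTA, the A reaction yields fragments of lengths 1 and 6 (the two T positions in the template), and pooling all four reactions reads out ATGCCAT, the complement of the template.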
The Maxam and Gilbert DNA sequencing method had the advantage at the time that it could be used with double-stranded DNA. However, it required DNA strand separation or fractionation of the restriction enzyme fragments, making it somewhat more time-consuming than the 1977 method published by Sanger et al.
Dr. Sanger was born in Gloucestershire, U.K. in 1918, the son of a physician. Though he initially planned to follow his father into medicine, biochemistry became his lifelong passion and area of research. Sanger retired at age 65 to spend more time on his hobbies of gardening and boating.
At the recent International Symposium on Human Identification, Kevin Davies, the keynote speaker and author of The $1,000 Genome, entertained attendees with a history of human genome sequencing efforts and discussed ways in which the resulting information has infiltrated our everyday lives. Obviously, there is enough material on the subject to fill a book, but I will describe just a few of the high points of his talk here.
DNA testing methods are being used to solve problems in an ever-increasing number of fields. From crime scene analysis to tissue typing, from mammoths to Neanderthals, and from Thutmose I to Richard III, both modern mysteries and age-old secrets are being revealed. The availability of fast, accurate and convenient DNA amplification and sequencing methods has made DNA analysis a viable option for many types of investigation. Now it is even being applied to solve such mundane mysteries as the precise ingredients used in a sausage recipe, and to answer that most difficult of questions: “what exactly is in a doner kebab?”
For sixty years now, scientists have studied the role of DNA as a vehicle for the storage and transmission of genetic information from generation to generation. We have marveled at the capacity of DNA to store all the information required to describe a human being using only a 4-letter code, and to pack that information into a space the size of the nucleus of a single cell. A letter published last week in Nature exploits this phenomenal storage capacity of DNA to archive a quite different kind of information. Forget CDs, hard drives and chips: the sum of human knowledge can now be stored in synthetic DNA strands. The Nature letter, authored by scientists from the European Bioinformatics Institute in Cambridge, UK, and Agilent Technologies in California, describes a proof-of-concept experiment in which synthetic DNA was used to encode Shakespeare’s Sonnets, Martin Luther King’s “I Have a Dream” speech, a picture of the Bioinformatics Institute, and the original Watson and Crick paper on the double-helical nature of DNA.
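To get a feel for how arbitrary data maps onto a 4-letter alphabet, here is a deliberately simplified sketch. It is not the scheme used in the Nature letter (which, as I understand it, used a base-3 code designed to avoid long homopolymer runs that are hard to synthesize and sequence); this toy version simply writes each byte as four base-4 digits, one nucleotide per digit.

```python
# Toy text-to-DNA encoding (NOT the published scheme): each byte
# becomes four base-4 digits, each mapped to one nucleotide.

DIGIT_TO_BASE = "ACGT"  # 0->A, 1->C, 2->G, 3->T
BASE_TO_DIGIT = {b: i for i, b in enumerate(DIGIT_TO_BASE)}

def encode(text):
    """Encode UTF-8 text as a string of A/C/G/T, 4 bases per byte."""
    bases = []
    for byte in text.encode("utf-8"):
        for shift in (6, 4, 2, 0):  # four 2-bit digits per byte
            bases.append(DIGIT_TO_BASE[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(dna):
    """Recover the original text from the A/C/G/T string."""
    data = bytearray()
    for i in range(0, len(dna), 4):
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | BASE_TO_DIGIT[base]
        data.append(byte)
    return data.decode("utf-8")

line = "Shall I compare thee to a summer's day?"
dna = encode(line)
assert decode(dna) == line
print(len(dna), "bases for", len(line), "characters")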
Last week I read an article in Wired Science that described how an outbreak of antibiotic resistant Klebsiella pneuomiae was tracked in real-time at an NIH hospital using DNA sequencing technologies. The article described how whole genome sequencing of disease isolates and environmental samples from the hospital was used to track the source and spread of the outbreak.
The scientists monitoring the outbreak tracked spontaneous random mutations in the K. pneumoniea DNA sequence to determine that the outbreak was caused by a single source, and to track the spread of the organism within the hospital. The sequencing information helped investigators identify when and where infection occurred, and also to track transmission of the infection from person-to-person. It also revealed that the order of transmission was different from the order in which the cases presented with symptoms, and helped identify how the organism was spread between individuals.
The article describes how epidemiology, infection control and sequence identification were used together to influence outcome in this situation, but also shows the power of whole genome sequencing to find and track subtle differences between isolates that could not have been identified in any other way.
To me, this is a powerful illustration of just how far DNA sequencing has come over the last few years. Not so long ago, the idea of sequencing the entire genome of numerous disease isolates during an outbreak would have been almost laughable—an idea confined to episodes of the X-files or to science fiction stories. Now, thanks to advanced automated sequencing technologies and the computing power to analyze the results, it is doable within a reasonable timeframe for hospitals with access to the right facilities. Although this type of investigation is still beyond the capabilities of most hospitals, the costs and turnaround times for sequencing are coming down rapidly as new technologies capable of faster, cheaper analysis become available.
We have come a very long way since the days when DNA sequencing was a laborious process involving pouring a gel, running samples,and manually reading the resulting autoradiogram hoping to get a read of 50–100 bases. My reading of the wired article prompted me to find out more about the newer types of sequencing technology available today. Here’s what I learned about each: Continue reading “DNA Sequencing from AutoRads and Gels to Nanopores”