The Stories in the Bones: DNA Forensic Analysis 20 Years after 9/11

September 11, 2001, is the day that will live in infamy for my generation. On that beautiful late summer day, I was at my desk working on the Fall issue of Neural Notes magazine when a colleague learned of the first plane hitting the World Trade Center. As the morning wore on, we quickly learned that it wasn’t just one plane, and it wasn’t just the World Trade Center.

Two beams of light mark the site of the World Trade Center attack. Today, DNA forensic analysis applies new technologies to bring closure to the families of victims.

Information was sparse. The world wide web was incredibly slow, and social media wasn’t much of a thing—nothing more than a few listservs for the life sciences. Someone managed to find a TV with a rabbit-eared, foil-covered antenna, and we gathered in the cafeteria of Promega headquarters—our shock growing as more footage became available. At Promega, conversation immediately turned to how we could bring our DNA forensic analysis expertise to help and support the authorities with the identification of victims and cataloguing of reference samples.

Just as the internet and social media have evolved into faster and more powerful means of communication—no longer do we rely on TVs with antennas for breaking news—the technology that is used to identify victims of a tragedy from partial remains like bone fragments and teeth has also evolved to be faster and more powerful.

Teeth and Bones: Then and Now

“Bones tell me the story of a person’s life—how old they were, what their gender was, their ancestral background.” –Kathy Reichs

Many stories, both fact and fiction, start with a discovery of bones from a burial site or other scene. Bones can be recovered from harsh environments, having been exposed to extreme heat, time, acidic soils, swamps, chemicals, animal activities, water, or fires and explosions. These exposures degrade the sample and make recovering DNA from the cells deep within the bone matrix difficult.

Continue reading “The Stories in the Bones: DNA Forensic Analysis 20 Years after 9/11”

Is MPS right for your forensics lab?

Today’s post was written by guest blogger Anupama Gopalakrishnan, Global Product Manager for the Genetic Identity group at Promega. 

Next-generation sequencing (NGS), or massively parallel sequencing (MPS), is a powerful tool for genomic research. This high-throughput technology is fast and accessible—you can acquire a robust data set from a single run. While NGS systems are widely used in evolutionary biology and genetics, there is a window of opportunity for adoption of this technology in the forensic sciences.

Currently, the gold standard is capillary electrophoresis (CE)-based analysis of short tandem repeats (STRs). These systems continue to evolve, with increasing sensitivity, robustness and inhibitor tolerance, and with the introduction of probabilistic genotyping in data analysis—all with the combined goal of extracting maximum identity information from low-quantity, challenging samples. However, obtaining profiles from these samples and interpreting mixture samples continue to pose challenges.

MPS systems enable simultaneous analysis of forensically relevant genetic markers to improve efficiency, capacity and resolution—with the ability to generate results on nearly 10-fold more genetic loci than current technology. What samples would truly benefit from MPS? Mixture samples, undoubtedly. The benefit of MPS is also exemplified in cases where the samples are highly degraded or the only samples available are teeth, bones and hairs without a follicle. By adding a sequencing component to the allele-length component of CE technology, MPS addresses the greatest current challenges in forensic DNA analysis—namely, identifying allele sharing between contributors and PCR artifacts, such as stutter. Additionally, single nucleotide polymorphisms in the sequences flanking the repeat can reveal additional alleles, adding discrimination power. For example, sequencing of Y chromosome loci can help distinguish between mixed male samples from the same paternal lineage and, therefore, provide valuable information in decoding mixtures that contain more than one male contributor. Also, because MPS technology is not limited by real estate, all primers in an MPS system can target small loci, maximizing the probability of obtaining a usable profile from the degraded DNA typical of challenging samples.
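As a rough illustration of this sequence-level resolution, here is a minimal Python sketch; the repeat motif, reads and counts are all invented for the example. Two alleles of identical length, which CE would report as one shared allele, separate cleanly once sequence is considered:

```python
from collections import Counter

# Hypothetical example: all values are invented for illustration.
# Every read below is 11 repeat units long, so length-based CE typing
# would report a single "11" allele shared by both contributors.
reads = [
    "TCTA" * 11,                       # contributor A: simple 11-repeat allele
    "TCTA" * 11,
    "TCTA" * 11,
    "TCTA" * 5 + "TCTG" + "TCTA" * 5,  # contributor B: same length, internal variant
    "TCTA" * 5 + "TCTG" + "TCTA" * 5,
]

# Grouping by (repeat count, sequence) resolves the single length-based
# allele into two distinct sequence-based alleles.
alleles = Counter((len(read) // 4, read) for read in reads)
for (repeats, seq), count in sorted(alleles.items()):
    print(f"{repeats} repeats, {count} reads: {seq}")
```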

Continue reading “Is MPS right for your forensics lab?”

Massively Parallel Sequencing for Forensic DNA Analysis

Today’s blog is written by guest blogger Douglas R. Storts, PhD, head of Research, Nucleic Acid Technologies, Promega Corporation.

Massively parallel sequencing (MPS), also called next-generation sequencing (NGS), has the potential to alleviate some of the biggest challenges facing forensic laboratories, namely degraded DNA and samples containing DNA from multiple contributors. Unlike capillary electrophoresis, MPS genotyping methods do not require fluorescently labeled oligonucleotides to distinguish amplification products of similar size. Furthermore, it is not necessary to design primers within a color channel to generate amplicons of different sizes to avoid allele overlap. Consequently, all the amplicons can be of a similar, small size (typically <275 base pairs). The small size of the amplicons is particularly advantageous when working with degraded DNA. Because the alleles are distinguished by both the number of repeats and the DNA sequence, additional information can be derived from a sample. This can be especially important when genotyping mixtures. As previously demonstrated (1), this sequence variation can help distinguish stutter “peaks” from minor contributor alleles.
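A hypothetical sketch of that stutter point (the motif, variant and read counts are invented): a stutter product inherits its parent allele’s internal sequence variant, while a true minor-contributor allele of the same length can carry a distinct sequence:

```python
# Hypothetical example: motif, variant and read counts are invented.
# The major contributor's 11-repeat allele carries an internal TCTG variant.
major_allele = "TCTA" * 5 + "TCTG" + "TCTA" * 5  # 11 repeats, one variant unit

# Two observed 10-repeat sequences, each one repeat unit shorter
# than the major allele:
observed = {
    "TCTA" * 4 + "TCTG" + "TCTA" * 5: 120,  # retains the variant
    "TCTA" * 10: 95,                        # plain motif
}

for seq, count in observed.items():
    if "TCTG" in seq:
        call = "likely stutter of the major allele (inherits its variant)"
    else:
        call = "candidate true allele from a minor contributor"
    print(f"{count:>4} reads: {call}")
```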

Because there is no reliance upon size and fluorescent label, significantly greater multiplexing is possible with MPS approaches. In addition to autosomal short tandem repeats (STRs), we can also sequence Y-STRs, single nucleotide polymorphisms (SNPs), and the mitochondrial DNA control region. The advantage of this approach is that the forensic analyst does not need a priori knowledge of which genotyping method would benefit a sample most.

Despite these major advantages, there are limitations to the near-term, broad deployment of current MPS technology into forensic laboratories. The limitations fall into four main categories: workflow, cost, performance with forensically relevant samples, and community guidelines. Continue reading “Massively Parallel Sequencing for Forensic DNA Analysis”

Real-Time (Quantitative) qPCR for Quantitating Library Prep before NGS

Real-time (quantitative) PCR monitors amplification as it happens, allowing you to measure the starting material in your reaction.

This is the last in a series of four blogs about quantitation for NGS, written by guest blogger Adam Blatter, Product Specialist in Integrated Solutions at Promega.

When it comes to nucleic acid quantitation, real-time or quantitative PCR (qPCR) is considered the gold standard because of its unmatched sensitivity, specificity and accuracy. qPCR relies on thermal cycling, consisting of repeated cycles of heating and cooling for DNA melting and enzymatic replication. Detection instrumentation measures the accumulation of DNA product after each round of amplification in real time.
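As a simple illustration of how those real-time measurements become a concentration estimate, here is a minimal sketch with an invented standard curve; it relies on the standard relationship that Cq is linear in the log of the starting concentration:

```python
import numpy as np

# Invented example values: quantify an unknown against a qPCR standard curve.
# Cq (quantification cycle) is linear in log10 of the starting concentration.
std_conc = np.array([10.0, 1.0, 0.1, 0.01])    # standards, ng/ul
std_cq   = np.array([18.1, 21.4, 24.8, 28.2])  # measured Cq for each standard

slope, intercept = np.polyfit(np.log10(std_conc), std_cq, 1)
efficiency = 10 ** (-1 / slope) - 1            # ~1.0 corresponds to 100% efficiency

unknown_cq = 23.0
unknown_conc = 10 ** ((unknown_cq - intercept) / slope)

print(f"Amplification efficiency: {efficiency:.0%}")
print(f"Unknown sample: {unknown_conc:.3f} ng/ul")
```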

Because PCR amplifies specific regions of DNA, the method is highly sensitive and specific to DNA, and it can determine whether a sample can truly be amplified. Degraded DNA or free nucleotides, which might otherwise skew your quantitation, will not contribute to the signal, and your measurement will be more accurate.

However, while qPCR does provide technical advantages, the method requires special instrumentation and specialized reagents, and it is more time-consuming. In addition, you will probably need to optimize your qPCR assay for each of your targets to achieve the desired results.

Because of the added complexity and cost, qPCR is a technique best suited for post-library quantitation, when you need to know the exact amount of amplifiable, adapter-ligated DNA. PCR is the only method capable of specifically targeting these library constructs over other DNA that may be present. This specificity is important because accurate normalization is critical for producing even coverage in multiplex experiments, where equimolar amounts of several libraries are added to a pooled sample. This normalization process is essential if you are screening for rare variants that might be lost in the background and go undetected if underrepresented in a mixed pool.
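To make the normalization arithmetic concrete, here is a minimal sketch with assumed library concentrations and fragment sizes, using the common conversion from mass concentration to molarity (about 660 g/mol per base pair):

```python
# Assumed library concentrations and fragment sizes, for illustration only.
# Convert mass concentration to molarity, then pool equimolar amounts.
AVG_BP_MASS = 660.0  # average mass of one base pair, g/mol

libraries = {
    "library_A": {"ng_per_ul": 12.0, "avg_size_bp": 350},
    "library_B": {"ng_per_ul": 25.0, "avg_size_bp": 420},
}

target_fmol = 100.0  # amount of each library to add to the pool

for name, lib in libraries.items():
    # nM = (ng/ul * 1e6) / (660 g/mol per bp * average library size in bp)
    conc_nM = lib["ng_per_ul"] * 1e6 / (AVG_BP_MASS * lib["avg_size_bp"])
    volume_ul = target_fmol / conc_nM  # 1 nM = 1 fmol/ul
    print(f"{name}: {conc_nM:.1f} nM -> add {volume_ul:.2f} ul for {target_fmol:.0f} fmol")
```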

 

Read Part 1: When Every Step Counts: Quantitation for NGS

Read Part 2: Nucleic Acid Quantitation by UV Absorbance: Not for NGS

Read Part 3: Fluorescence Dye-Based Quantitation: Sensitive and Specific for NGS Applications

Fluorescence Dye-Based Quantitation: Sensitive and Specific for NGS Applications

This is the third post in a series of blogs on quantitation for NGS applications written by guest blogger Adam Blatter, Product Specialist in Integrated Solutions at Promega.

Fluorescent dye-based quantitation uses specially designed DNA-binding compounds that intercalate only with double-stranded DNA molecules. When excited by a specific wavelength of light, only dye in the DNA-bound state will fluoresce. These properties give the technique low background signal and, therefore, the ability to accurately and specifically detect very low quantities of DNA in solution, even at the nanogram quantities used in NGS applications.
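As a rough sketch of how such readings become concentrations (the fluorescence values here are invented), an unknown is typically interpolated against a standard curve of known DNA amounts:

```python
import numpy as np

# Invented fluorescence readings, for illustration only. Because only
# DNA-bound dye fluoresces, signal is roughly linear in dsDNA concentration,
# so an unknown can be interpolated against a standard curve.
std_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # dsDNA standards, ng/ul
std_rfu  = np.array([52, 410, 795, 1570, 3120])  # relative fluorescence units

slope, intercept = np.polyfit(std_conc, std_rfu, 1)  # fit RFU = m*conc + b

unknown_rfu = 1100
unknown_conc = (unknown_rfu - intercept) / slope
print(f"Unknown sample: {unknown_conc:.2f} ng/ul dsDNA")
```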

For commercial NGS systems, such as the Nextera Rapid Capture Enrichment protocol by Illumina, this specificity and sensitivity of quantitation are critical. The Nextera protocol is optimized for 50 ng of total genomic DNA. A higher mass input of genomic DNA can result in incomplete tagmentation and larger insert sizes, which can adversely affect enrichment. A lower mass input of genomic DNA, or low-quality DNA, can generate smaller-than-expected inserts, which can be lost during subsequent cleanup steps, giving lower diversity of inserts. Continue reading “Fluorescence Dye-Based Quantitation: Sensitive and Specific for NGS Applications”

Nucleic Acid Quantitation by UV Absorbance: Not for NGS

For UV-Vis spectrophotometry, light is split into its component wavelengths and directed through a solution. Molecules in the solution absorb specific wavelengths of light.

This is the second in a series of four blogs about quantitation for NGS, written by guest blogger Adam Blatter, Product Specialist in Integrated Solutions at Promega.

Perhaps the most ubiquitous quantitation method is UV spectrophotometry (also called absorbance spectroscopy). This technique takes advantage of the Beer-Lambert law: the observation that many compounds absorb UV-visible light at characteristic wavelengths and that, for a fixed path length, the absorbance of a solution is directly proportional to the concentration of the absorbing species. DNA, for example, has a peak absorbance at 260 nm (A260).
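A minimal sketch of that relationship in practice, using the common convention that an A260 of 1.0 over a 1 cm path corresponds to roughly 50 ng/µl of double-stranded DNA:

```python
# Sketch of the Beer-Lambert relationship applied to dsDNA quantitation.
# A = epsilon * l * c, so at fixed path length, concentration scales with
# absorbance. By convention, A260 = 1.0 over a 1 cm path corresponds to
# roughly 50 ng/ul of double-stranded DNA.
def dsdna_ng_per_ul(a260: float, path_cm: float = 1.0) -> float:
    return a260 * 50.0 / path_cm

a260, a280 = 0.25, 0.13
print(f"dsDNA concentration: {dsdna_ng_per_ul(a260):.1f} ng/ul")
print(f"A260/A280 ratio: {a260 / a280:.2f} (~1.8 is typical for pure DNA)")
```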

This method is user friendly, quick and easy. But, it has significant limitations, especially when quantitating samples for NGS applications. Continue reading “Nucleic Acid Quantitation by UV Absorbance: Not for NGS”

When Every Step Counts: Quantitation for NGS

This series of blogs about quantitation for NGS is written by guest blogger Adam Blatter, Product Specialist in Integrated Solutions at Promega.

As sequencing technology races toward ever cheaper, faster and more accurate ways to read entire genomes, we find ourselves able to study biological systems at a level never before possible. From basic science to translational research, massively parallel sequencing (also known as next-generation sequencing or NGS) has opened up new avenues of inquiry in genomics, oncology and ecology.

Many commercial sequencing platforms have been established (e.g., Illumina, Ion Torrent, 454, PacBio), and new technologies are developed every day to enable new and unique applications. However, all of these platforms and technologies work on the same general principle: nucleic acid must be extracted from a sample, arranged into platform-specific library constructs and loaded into the sequencer. Regardless of the sample type or the platform used, every step of this workflow is critical for successful results. An often overlooked part of the NGS workflow is sample quantitation. Here we present the first in a series of four short blogs about this critical step in NGS workflows.

Sample input is critical to NGS in terms of both quality and quantity. Knowing how much DNA you have, often in nanogram quantities, can make the difference between success and failure. There are several key points in the NGS workflow where sample quantitation is important before you can proceed:

  • After initial nucleic acid extraction from the sample matrix and before proceeding with library preparation
  • Post-library preparation when pooling barcoded libraries for multiplexing
  • Final pooled library quantitation immediately before loading for sequencing

There are several common methods for quantitating nucleic acids: UV spectroscopy, fluorescence spectroscopy and real-time quantitative PCR (qPCR). Because of inherent differences in sensitivity, specificity, time and cost, each of these techniques poses certain advantages and disadvantages with respect to the specific sample you are quantitating. Our next three blogs will discuss each of these methods against the backdrop of quantitating samples for NGS applications.

 

Read Part 2: Nucleic Acid Quantitation by UV Absorbance: Not for NGS

Read Part 3: Fluorescence Dye-Based Quantitation: Sensitive and Specific for NGS Applications

Read Part 4: Real-Time (Quantitative) qPCR for Quantitating Library Prep before NGS

Molecular Autopsies in the Whole Genome Sequencing Era

Engraving of the human heart by T. Milton, 1814. Image courtesy of Wikimedia Commons.
Every year, nearly 8 million people die from sudden cardiac death (SCD), defined as the unexpected death of a seemingly healthy person due to a malfunction of the heart’s electrical system and loss of cardiac function. Although SCD is usually associated with older adults, it claims thousands of young lives every year. In most cases, the cause of death can be determined by autopsy or toxicological analysis, but up to 30% of these premature deaths have no apparent cause, leaving medical examiners and family members of the young victims to wonder what happened.

In cases where traditional pathological examinations cannot provide insight into causation, medical examiners are increasingly turning to molecular autopsies to determine if there is an underlying genetic factor that contributed to a person’s death.

Continue reading “Molecular Autopsies in the Whole Genome Sequencing Era”

Simplifying Next Generation Sequencing Workflow with QuantiFluor® dsDNA System

Next-generation sequencing (NGS), also known as high-throughput parallel sequencing, is the all-encompassing term used to describe a number of different modern sequencing technologies. These include Illumina (Solexa) sequencing, Roche 454 sequencing, Ion Torrent (Proton/PGM) sequencing and SOLiD sequencing, to name a few [1].

With the advent of these technologies, sequencing DNA and RNA has become much simpler and more affordable than the previously used Sanger sequencing. For these reasons, NGS has been a game-changer in the field of modern genomics and molecular biology.

A common starting point for template preparation for NGS platforms is random fragmentation of target DNA and addition of platform-specific adapter sequences to flanking ends. Protocols typically use sonication to shear input DNA, coupled with several rounds of enzymatic modification to produce a sequencer-ready product [2].

Accurate quantification of DNA preparations is essential to ensure high-quality reads and efficient generation of data. Too much DNA can lead to issues such as mixed signals, unresolvable data and a lower number of usable reads. Too little DNA, on the other hand, might result in insufficient sequencing coverage, reduced read depth or empty runs, all of which incur higher costs. The quality of the DNA can also vary depending on the source or the extraction method applied, which further reinforces the need for appropriate management of the input material. Continue reading “Simplifying Next Generation Sequencing Workflow with QuantiFluor® dsDNA System”
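As a back-of-the-envelope illustration of the coverage side of this trade-off (the run parameters are assumed for the example), expected mean coverage follows the familiar Lander-Waterman estimate:

```python
# Assumed run parameters, for illustration only. Expected mean coverage
# follows the Lander-Waterman estimate:
#   coverage = (number of reads * read length) / genome size
n_reads     = 4e8     # usable reads after filtering (assumed)
read_length = 150     # bases per read (assumed)
genome_size = 3.1e9   # approximate human genome size, bases

coverage = n_reads * read_length / genome_size
print(f"Expected mean coverage: {coverage:.1f}x")  # ~19x with these numbers
```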

The Power and Potential of Next-Generation Sequencing

Next-generation sequencing (NGS), also known as massively parallel sequencing, is revolutionizing genomic research. NGS technologies have made whole genome sequencing fast and easy, leading to dramatic advances in evolutionary biology and phylogenetics, personalized medicine and forensic science. Why is NGS such a hot topic right now?

Continue reading “The Power and Potential of Next-Generation Sequencing”