Harnessing the Power of Massively Parallel Sequencing in Forensic Analysis

The rapid advancement of next-generation sequencing technology, also known as massively parallel sequencing (MPS), has revolutionized many areas of applied research. One such area, the analysis of mitochondrial DNA (mtDNA) in forensic applications, has traditionally used another method—Sanger sequencing followed by capillary electrophoresis (CE).

Although MPS can provide a wealth of information, its adoption in forensic workflows has been slow. However, the barriers to adoption have been lowered in recent years, as evidenced by the number of abstracts discussing MPS presented at the 29th International Symposium on Human Identification (ISHI 29), held in September 2018. Compared to Sanger sequencing, MPS can provide far more data on minute variations in the human genome, particularly for the analysis of mtDNA and single-nucleotide polymorphisms (SNPs). It is especially powerful for analyzing mixture samples or highly degraded DNA, such as that recovered from human remains.

Is MPS right for your forensics lab?

Today’s post was written by guest blogger Anupama Gopalakrishnan, Global Product Manager for the Genetic Identity group at Promega. 

Next-generation sequencing (NGS), or massively parallel sequencing (MPS), is a powerful tool for genomic research. This high-throughput technology is fast and accessible—you can acquire a robust data set from a single run. While NGS systems are widely used in evolutionary biology and genetics, there is a window of opportunity for adoption of this technology in the forensic sciences.

Currently, the gold standard is capillary electrophoresis (CE)-based analysis of short tandem repeats (STRs). These systems continue to evolve, gaining sensitivity, robustness and inhibitor tolerance, and incorporating probabilistic genotyping into data analysis, all with the goal of extracting maximum identity information from low-quantity, challenging samples. However, obtaining profiles from such samples, and interpreting mixtures, continues to pose challenges.

MPS systems enable simultaneous analysis of forensically relevant genetic markers, improving efficiency, capacity and resolution, with the ability to generate results on nearly 10-fold more genetic loci than the current technology. What samples would truly benefit from MPS? Mixture samples, undoubtedly. The benefit of MPS is also exemplified in cases where the samples are highly degraded or the only samples available are teeth, bones and hairs without a follicle. By adding a sequencing component to the allele-length component of CE technology, MPS addresses the greatest current challenges in forensic DNA analysis: identifying allele sharing between contributors and recognizing PCR artifacts, such as stutter. Additionally, single-nucleotide polymorphisms in the sequences flanking the repeat can reveal additional alleles, adding discrimination power. For example, sequencing of Y-chromosome loci can help distinguish mixed male samples from the same paternal lineage and, therefore, provide valuable information in decoding mixtures that contain more than one male contributor. Also, because MPS does not rely on amplicon length to distinguish loci, all primers in an MPS system can target small loci, maximizing the probability of obtaining a usable profile from the degraded DNA typical of challenging samples.
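To make that last point concrete, here is a minimal Python sketch, using hypothetical reads and a hypothetical GATA repeat, of how two alleles that are indistinguishable by CE length can be resolved by sequence:

```python
# Minimal sketch (hypothetical data): two alleles identical by CE length
# but distinguishable by MPS sequence information.

def count_repeats(sequence, motif="GATA"):
    """Count contiguous copies of the repeat motif at the start of a read."""
    count, i = 0, 0
    while sequence[i:i + len(motif)] == motif:
        count += 1
        i += len(motif)
    return count

# Both reads carry 10 GATA repeats, so CE reports one shared allele ("10").
read_a = "GATA" * 10 + "TCTG"   # contributor 1: T at a flanking SNP site
read_b = "GATA" * 10 + "CCTG"   # contributor 2: C at the same flanking site

assert count_repeats(read_a) == count_repeats(read_b)  # same length: CE cannot resolve them
print(read_a == read_b)  # False: MPS resolves two distinct alleles
```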

Where Would DNA Sequencing Be Without Leroy Hood?

There have been many changes in sequencing technology over the course of my scientific career. In one of the research labs I rotated in as a graduate student, I assisted a third-year grad student with a manual radioactive sequencing gel because, I was told, “every student should run at least one in their career”. My first job after graduate school was as a research assistant in a lab that sequenced bacterial genomes. While I was the one creating shotgun libraries for the DNA sequencing pipeline, the sequencing reaction was performed using dideoxynucleotides labeled with fluorescent dyes and amplified in thermal cyclers. The resulting fragments were separated by manual loading on tall slab polyacrylamide gels (Applied Biosystems ABI 377s) or, once the lab got them running, capillary electrophoresis of four 96-well plates at a time (ABI 3700s).

Sequencing throughput has only increased since I left the lab. Part of that came from higher well density per plate and more capillaries per instrument, but the bigger leap came with the advent of the short-read, massively parallel next-generation sequencing method. The next-gen or NGS technique cut sequencing time because many sequences are determined simultaneously, dramatically increasing capacity. Instruments have also shrunk, as has the price per base pair, a metric we used when I was in the lab. The long-prophesied threshold of $1,000 per genome has arrived. And now, according to a recent tweet from a Nanopore conference, you can add a sequencing module to your mobile device:


Choosing a Better Path for Your NGS Workflow

Imagine you are traveling in your car and must pass through a mountain range to get to your destination. You’ve been following a set of directions when you realize you have a decision to make. Will you stay on your current route, which is many miles shorter but contains a long tunnel that cuts straight through the mountains and obstructs your view? Or will you switch to a longer, more scenic route that bypasses the tunnel ahead and gets you to your destination a bit later than you wanted?

Choosing which route to take illustrates a clear trade-off that has to be considered—which is more valuable, speed or understanding? Yes, the tunnel gets you from one place to another faster. But what are you missing as a result? Is it worth a little extra time to see the majestic landscape that you are passing through?

Considering this trade-off is especially critical for researchers working with human DNA purified from formalin-fixed paraffin-embedded (FFPE) or circulating cell-free DNA (ccfDNA) samples for next-generation sequencing (NGS). These sample types present a few challenges when performing NGS. FFPE samples are prone to degradation, while ccfDNA samples are susceptible to gDNA contamination, and both offer a very limited amount of starting material to work with.


Better NGS Size Selection

One of the most critical parts of a Next Generation Sequencing (NGS) workflow is library preparation, and nearly all NGS library preparation methods use some type of size-selective purification. This process removes unwanted fragment sizes that would otherwise interfere with downstream library preparation steps, sequencing or analysis.

Depending on the application, size selection may mean removing unwanted enzymes and buffers, or removing nucleotides, primers and adapters during NGS library or PCR sample cleanup. In dual size selection methods, both large and small DNA fragments are removed to ensure optimal library sizing prior to sequencing, as sketched below. In all cases, accurate size selection is key to optimal downstream performance and NGS results.
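Conceptually, dual size selection is a two-sided filter on fragment length. A minimal Python sketch, with illustrative sizes and cutoffs rather than protocol recommendations:

```python
# Conceptual sketch of dual size selection: discard fragments outside a
# target window (all values are illustrative, not a protocol recommendation).
fragment_sizes = [95, 150, 210, 280, 320, 410, 650, 900]  # base pairs

LOWER, UPPER = 200, 500  # window suited to the sequencing chemistry
selected = [bp for bp in fragment_sizes if LOWER <= bp <= UPPER]

print(selected)  # [210, 280, 320, 410]
```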

Current methods and chemistries for these purposes have been in use for several years; however, they come at a cost in performance and ease of use. Many library preparation methods involve serial purifications, each of which can lose DNA; current methods can lose as much as 20-30% of the sample with each purification step, and those losses compound, as the quick calculation below shows. Ultimately this may necessitate more starting material, which may not be available for limited, precious samples, or more PCR cycles, which can introduce sequencing bias. Sample-to-sample reproducibility is a daily challenge that is also regularly cited as an area for improvement in size selection.
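A back-of-the-envelope calculation of how per-step losses compound across serial purifications:

```python
# Compounded DNA loss across serial purification steps: with a fractional
# loss per step, recovery after n steps is (1 - loss) ** n.
for loss in (0.20, 0.30):
    for steps in (1, 2, 3):
        recovery = (1 - loss) ** steps
        print(f"{loss:.0%} loss/step, {steps} step(s): {recovery:.0%} recovered")

# At 30% loss per step, three serial purifications leave only ~34% of the
# input DNA, which is why serial cleanups strain limited samples.
```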


Six (and a Half) Reasons to Quantitate Your DNA

Knowing how much DNA you have is fundamental to successful experiments. Without a number you can trust, the DNA input for subsequent experiments can lead you astray. Below are six reasons why DNA samples should be quantitated.

6. Saving time by knowing what you have rather than repeating experiments. Without quantitating your DNA, how certain can you be that the same amount of DNA is consistently added? Always using the same volume for every experiment does not guarantee that the same amount of DNA goes into the assay; the quick example below shows why.
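A minimal sketch, with hypothetical concentrations, of how a fixed volume delivers very different DNA masses, and how quantitation lets you correct for it:

```python
# Why "same volume" is not "same amount": the mass delivered scales with
# each sample's concentration (hypothetical concentrations in ng/µL).
samples = {"A": 12.0, "B": 55.0, "C": 3.5}
FIXED_VOLUME_UL = 2.0
TARGET_NG = 10.0

for name, conc in samples.items():
    mass_added = conc * FIXED_VOLUME_UL   # ng actually delivered at fixed volume
    volume_needed = TARGET_NG / conc      # µL required for a 10 ng input
    print(f"{name}: {mass_added:.1f} ng at fixed volume; "
          f"need {volume_needed:.2f} µL for {TARGET_NG} ng")
```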

Massively Parallel Sequencing for Forensic DNA Analysis

Today’s blog is written by guest blogger Douglas R. Storts, PhD, head of Research, Nucleic Acid Technologies, Promega Corporation.

Massively parallel sequencing (MPS), also called next generation sequencing (NGS), has the potential to alleviate some of the biggest challenges facing forensic laboratories, namely degraded DNA and samples containing DNA from multiple contributors. Unlike capillary electrophoresis, MPS genotyping methods do not require fluorescently-labeled oligonucleotides to distinguish amplification products of similar size. Furthermore, it is not necessary to design primers within a color channel to generate amplicons of different sizes to avoid allele overlap. Consequently, all the amplicons can be of a similar, small size (typically <275 base pairs). The small size of the amplicons is particularly advantageous when working with degraded DNA. Because the alleles are distinguished by the number of repeats and the DNA sequence, additional information can be derived from a sample. This can be especially important when genotyping mixtures. As previously demonstrated (1), this sequence variation can help distinguish stutter “peaks” from minor contributor alleles.
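As an illustration of that last point, here is a minimal Python sketch, using hypothetical repeat sequences, of how sequence information can separate a stutter product from a genuine minor-contributor allele of the same length (real analyses also weigh read counts and locus-specific stutter rates):

```python
# Sketch (hypothetical sequences): a stutter product is one repeat shorter
# than its parent allele but otherwise matches the parent's sequence; a true
# minor-contributor allele of the same length differs elsewhere in sequence.

PARENT = "GATA" * 12                # major contributor, 12 repeats
stutter = "GATA" * 11               # PCR slippage product of PARENT
minor   = "GACA" + "GATA" * 10      # real 11-repeat allele with a variant repeat

def looks_like_stutter(read, parent, motif_len=4):
    """True if the read matches the parent minus exactly one repeat unit."""
    return len(read) == len(parent) - motif_len and read == parent[:len(read)]

print(looks_like_stutter(stutter, PARENT))  # True: consistent with stutter
print(looks_like_stutter(minor, PARENT))    # False: a distinct allele
```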

Because there is no reliance upon size and fluorescent label, significantly greater multiplexing is possible with MPS approaches. In addition to autosomal short tandem repeats (STRs), we can also sequence Y-STRs, single nucleotide polymorphisms (SNPs) and the mitochondrial DNA control region. The advantage of this approach is that the forensic analyst does not need a priori knowledge of which genotyping method would benefit a sample most.

Despite these major advantages, there are limitations to the near-term, broad deployment of current MPS technology in forensic laboratories. The limitations fall into four main categories: workflow, cost, performance with forensically relevant samples, and community guidelines.

Real-Time (Quantitative) PCR for Quantitating Library Prep before NGS

Real-time (quantitative) PCR monitors amplification as it happens, allowing you to measure the starting material in your reaction.


This is the last in a series of four blogs about quantitation for NGS, written by guest blogger Adam Blatter, Product Specialist in Integrated Solutions at Promega.

When it comes to nucleic acid quantitation, real-time or quantitative PCR (qPCR) is considered the gold standard because of its unmatched sensitivity, specificity and accuracy. qPCR relies on thermal cycling, consisting of repeated cycles of heating and cooling for DNA melting and enzymatic replication. Detection instrumentation measures the accumulation of DNA product after each round of amplification in real time.
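In practice, quantitation comes from a standard curve: the quantification cycle (Cq) is linear in the log of input amount, with a slope near -3.32 at 100% amplification efficiency. A minimal sketch with hypothetical standards:

```python
# Sketch of qPCR quantitation from a standard curve (hypothetical data):
# Cq = slope * log10(input) + intercept.
import math

standards = [(10.0, 14.1), (1.0, 17.4), (0.1, 20.8), (0.01, 24.1)]  # (ng, Cq)

# Least-squares fit of Cq against log10(ng)
xs = [math.log10(ng) for ng, _ in standards]
ys = [cq for _, cq in standards]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

efficiency = 10 ** (-1 / slope) - 1              # ~1.0 means 100% efficient
unknown_cq = 19.0
unknown_ng = 10 ** ((unknown_cq - intercept) / slope)
print(f"slope={slope:.2f}, efficiency={efficiency:.0%}, "
      f"unknown approx {unknown_ng:.2f} ng")
```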

Because PCR amplifies specific regions of DNA, the method is highly sensitive, specific to DNA, and able to determine whether a sample is truly amplifiable. Degraded DNA or free nucleotides, which might otherwise skew your quantitation, will not contribute to the signal, so your measurement will be more accurate.

However, while qPCR provides these technical advantages, it requires special instrumentation and specialized reagents, and it is more time-consuming. In addition, you will probably need to optimize your qPCR assay for each of your targets to achieve the desired results.

Because of the added complexity and cost, qPCR is a technique best suited for post-library quantitation, when you need to know the exact amount of amplifiable, adapter-ligated DNA. PCR is the only method capable of specifically targeting these library constructs over other DNA that may be present. This specificity matters because accurate normalization is critical for producing even coverage in multiplex experiments, where equimolar amounts of several libraries are added to a pooled sample. Normalization is essential if you are screening for rare variants that might be lost in the background and go undetected if underrepresented in a mixed pool.
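For example, here is a minimal sketch of equimolar pooling from qPCR-derived library concentrations (all values hypothetical):

```python
# Sketch: equimolar pooling from qPCR-derived library concentrations.
# Concentrations (nM) are hypothetical; target is a 4 nM pool.
libraries = {"lib1": 18.0, "lib2": 9.5, "lib3": 26.0}  # nM, adapter-ligated
POOL_NM = 4.0        # common concentration each library is diluted to
PER_LIB_UL = 10.0    # equal volume of each diluted library to combine

for name, conc in libraries.items():
    dilution = conc / POOL_NM   # fold-dilution to reach the pool concentration
    print(f"{name}: dilute {dilution:.1f}-fold, then add {PER_LIB_UL} µL")
```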


Read Part 1: When Every Step Counts: Quantitation for NGS

Read Part 2: Nucleic Acid Quantitation by UV Absorbance: Not for NGS

Read Part 3: Fluorescence Dye-Based Quantitation: Sensitive and Specific for NGS Applications

Fluorescence Dye-Based Quantitation: Sensitive and Specific for NGS Applications

This is the third post in a series of blogs on quantitation for NGS applications, written by guest blogger Adam Blatter, Product Specialist in Integrated Solutions at Promega.

Fluorescent dye-based quantitation uses specially designed DNA-binding compounds that intercalate only with double-stranded DNA molecules. When excited by a specific wavelength of light, only dye in the DNA-bound state will fluoresce. These properties give the technique a low background signal and, therefore, the ability to accurately and specifically detect very low quantities of DNA in solution, even the nanogram quantities used in NGS applications.

For commercial NGS systems, such as the Nextera Rapid Capture Enrichment Protocol by Illumina, this specificity and sensitivity of quantitation are critical. The Nextera protocol is optimized for 50 ng of total genomic DNA. A higher mass input of genomic DNA can result in incomplete tagmentation and larger insert sizes, which can adversely affect enrichment. A lower mass input of genomic DNA, or low-quality DNA, can generate smaller-than-expected inserts, which can be lost during subsequent cleanup steps, giving lower insert diversity.
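In practice, dye-based fluorescence readings are converted to concentrations against a standard curve. A minimal sketch, with hypothetical fluorescence values and a single 100 ng/µL standard:

```python
# Sketch: dye-based quantitation against a two-point standard curve.
# Fluorescence values are hypothetical instrument units (RFU).
blank_rfu, standard_rfu = 50.0, 2050.0   # readings for 0 and 100 ng/µL standards
STANDARD_CONC = 100.0                    # ng/µL

slope = (standard_rfu - blank_rfu) / STANDARD_CONC   # RFU per ng/µL

sample_rfu = 460.0
sample_conc = (sample_rfu - blank_rfu) / slope
print(f"sample approx {sample_conc:.1f} ng/µL")      # ~20.5 ng/µL
```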

Nucleic Acid Quantitation by UV Absorbance: Not for NGS


For UV-Vis Spectrophotometry, light is split into its component wavelengths and directed through a solution. Molecules in the solution absorb specific wavelengths of light.

This is the second in a series of four blogs about quantitation for NGS, written by guest blogger Adam Blatter, Product Specialist in Integrated Solutions at Promega.

Perhaps the most ubiquitous quantitation method is UV spectrophotometry (also called absorbance spectroscopy). This technique takes advantage of the Beer-Lambert law: the observation that many compounds absorb UV-visible light at characteristic wavelengths and that, for a fixed path length, the absorbance of a solution is directly proportional to the concentration of the absorbing species. DNA, for example, has a peak absorbance at 260 nm (A260).
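A worked example of the arithmetic, using the standard convention that one A260 unit over a 1 cm path corresponds to roughly 50 ng/µL of double-stranded DNA:

```python
# Worked example of the Beer-Lambert relationship A = ε·l·c for dsDNA.
# By convention, 1 A260 unit (1 cm path) corresponds to ~50 ng/µL dsDNA.
DS_DNA_FACTOR = 50.0   # ng/µL per A260 unit at 1 cm path length

a260 = 0.25            # hypothetical absorbance reading
path_cm = 1.0
concentration = a260 * DS_DNA_FACTOR / path_cm
print(f"{concentration:.1f} ng/µL")   # 12.5 ng/µL

# The catch for NGS: anything absorbing at 260 nm (RNA, free nucleotides,
# degraded fragments) inflates A260, so the reading is not DNA-specific.
```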

This method is user-friendly, quick and easy. But it has significant limitations, especially when quantitating samples for NGS applications.