Meet Měnglà Virus: the newest cousin in the Ebola and Marburg virus family tree

Ebola virus (EBOV) and Marburg virus (MARV) are two closely related viruses in the family Filoviridae. Filoviruses are often pathogenic, causing hemorrhagic fever in human hosts. The Ebola outbreak of 2014 caught the world by surprise, spreading so quickly and severely that public health organizations were unprepared. By the time the outbreak ended in 2016, it had claimed more than 11,000 lives. Research that furthers our understanding of filoviruses and their potential for transmission is important for preventing future outbreaks. But what if the next outbreak comes from a virus we’ve never seen before?

Měnglà virus was discovered among filoviruses isolated from Old World fruit bats (Rousettus)

All in the viral family

A recent study published in the journal Nature Microbiology provides evidence of a newly identified filovirus species. Using serum samples taken from bats, a well-known host for filoviruses, Yang et al. used next-generation sequencing to isolate and identify viral RNA corresponding to an unclassified viral genome. The new genome was organized with the same open reading frames as other filoviruses, encoding nucleoprotein (NP), viral protein 35 (VP35), VP40, glycoprotein (GP), VP30, VP24, and the RNA-dependent RNA polymerase (L). It shared up to 54% nucleotide sequence identity with the filovirus species Lloviu virus (LLOV), EBOV and MARV, with MARV being the most similar. Their analysis suggested that this novel virus should be classified within the Filoviridae family tree as a separate genus, Dianlovirus, and it was named Měnglà virus (MLAV).
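The kind of comparison described above boils down to percent identity between aligned genome sequences. A minimal sketch of that calculation, using short hypothetical sequences rather than real MLAV data:

```python
# Minimal sketch: percent nucleotide identity between two aligned sequences,
# the metric behind statements like "up to 54% identity with known filoviruses".
# The sequences below are short, made-up examples, not real viral genomes.

def percent_identity(seq_a: str, seq_b: str) -> float:
    """Fraction of aligned, non-gap positions that match, as a percent."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    aligned = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    matches = sum(1 for a, b in aligned if a == b)
    return 100.0 * matches / len(aligned)

novel = "ATGCGTAC-TTGACGATCA"  # "-" marks an alignment gap
known = "ATGAGTACGTTTACGTTCA"
print(f"{percent_identity(novel, known):.1f}% identity")  # → 83.3% identity
```

Real analyses use whole-genome aligners rather than pre-aligned strings, but the identity score they report is this same ratio.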

Continue reading “Meet Měnglà Virus: the newest cousin in the Ebola and Marburg virus family tree”

Deep in the Jungle Something Is Happening: DNA Sequencing

This blog was written by guest blogger and 2018 Promega Social Media Intern Logan Godfrey.

Only 30 years ago, the polymerase chain reaction (PCR) was used for the first time, allowing the exponential amplification of a specific DNA segment. A small amount of DNA could now be replicated until there was enough of it to study accurately, even allowing sequencing of the amplified DNA. This was a massive breakthrough with immediate effects in forensics and life science research. Since these technologies were first introduced, however, PCR and DNA sequencing have remained largely the sole domain of the molecular biology research laboratory.

Revolutionary as it is, DNA sequencing has been limited by the size and cost of DNA sequencers, which in turn restricts accessibility. However, recent breakthroughs are allowing DNA sequencing to take place in jungles, the arctic, and even space, giving science the opportunity to reach further, faster than ever before.

Gideon Erkenswick begins extractions on fecal samples collected from wild tamarins in 2017. Location: The GreenLab, Inkaterra. Photo credit: Field Projects International.

The newfound accessibility of DNA sequencing means a marriage between fields of science that were previously unacquainted. Genomics and wildlife biology/ecology have largely progressed independently: wildlife biology is practiced in the field through observations and macro-level assessments, while genomics has developed mostly in the lab. Leading the charge in the convergence of the two disciplines is Field Projects International.

Continue reading “Deep in the Jungle Something Is Happening: DNA Sequencing”

Harnessing the Power of Massively Parallel Sequencing in Forensic Analysis

The rapid advancement of next-generation sequencing technology, also known as massively parallel sequencing (MPS), has revolutionized many areas of applied research. One such area, the analysis of mitochondrial DNA (mtDNA) in forensic applications, has traditionally used another method—Sanger sequencing followed by capillary electrophoresis (CE).

Although MPS can provide a wealth of information, its adoption in forensic workflows has been slow. However, the barriers to adoption have been lowered in recent years, as shown by the number of abstracts discussing MPS presented at the 29th International Symposium on Human Identification (ISHI 29), held in September 2018. Compared to Sanger sequencing, MPS can provide more data on minute variations in the human genome, particularly for the analysis of mtDNA and single-nucleotide polymorphisms (SNPs). It is especially powerful for analyzing mixture samples or those where the DNA is highly degraded, such as in human remains. Continue reading “Harnessing the Power of Massively Parallel Sequencing in Forensic Analysis”

Is MPS right for your forensics lab?

Today’s post was written by guest blogger Anupama Gopalakrishnan, Global Product Manager for the Genetic Identity group at Promega. 

Next-generation sequencing (NGS), or massively parallel sequencing (MPS), is a powerful tool for genomic research. This high-throughput technology is fast and accessible—you can acquire a robust data set from a single run. While NGS systems are widely used in evolutionary biology and genetics, there is a window of opportunity for adoption of this technology in the forensic sciences.

Currently, the gold standard is capillary electrophoresis (CE)-based analysis of short tandem repeats (STRs). These systems continue to evolve, with increasing sensitivity, robustness and inhibitor tolerance, and with the introduction of probabilistic genotyping in data analysis, all with the combined goal of extracting maximum identity information from low-quantity, challenging samples. However, obtaining profiles from these samples, and interpreting mixture samples, continues to pose challenges.

MPS systems enable simultaneous analysis of forensically relevant genetic markers to improve efficiency, capacity and resolution, with the ability to generate results on nearly 10-fold more genetic loci than the current technology. What samples would truly benefit from MPS? Mixture samples, undoubtedly. The benefit of MPS is also clear where samples are highly degraded, or where the only samples available are teeth, bones or hairs without a follicle. By adding a sequencing component to the allele-length component of CE technology, MPS addresses the greatest current challenges in forensic DNA analysis, namely distinguishing alleles shared between contributors from PCR artifacts such as stutter. Additionally, single nucleotide polymorphisms in the sequences flanking the repeat can reveal additional alleles, adding discrimination power. For example, sequencing of Y chromosome loci can help distinguish between mixed male samples from the same paternal lineage and therefore provide valuable information in decoding mixtures that contain more than one male contributor. Also, because MPS technology is not limited by real estate, all primers in an MPS system can target small loci, maximizing the probability of obtaining a usable profile from the degraded DNA typical of challenging samples. Continue reading “Is MPS right for your forensics lab?”
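The core idea, that sequence resolves alleles CE lumps together by length, can be shown in a toy example. The reads below are hypothetical repeat sequences, not data from any real locus:

```python
from collections import Counter

# Hypothetical reads at one STR locus. By length alone (the CE view), two of
# these alleles are indistinguishable; sequence (the MPS view) separates them.
reads = [
    "TCTATCTATCTATCTA",  # 4 repeats of TCTA
    "TCTATCTATCTATCTA",
    "TCTATCTGTCTATCTA",  # same length, but an internal variant (TCTG)
    "TCTATCTA" + "TCTA",  # 3 repeats
]

by_length = Counter(len(r) for r in reads)  # what CE can resolve
by_sequence = Counter(reads)                # what MPS can resolve

print("alleles by length:  ", len(by_length))    # 2
print("alleles by sequence:", len(by_sequence))  # 3
```

The same length-versus-sequence logic is what lets MPS separate a true minor-contributor allele from a stutter product of identical size.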

Where Would DNA Sequencing Be Without Leroy Hood?

There have been many changes in sequencing technology over the course of my scientific career. In one of the research labs I rotated in as a graduate student, I assisted a third-year grad student with a manual radioactive sequencing gel because, I was told, “every student should run at least one in their career”. My first job after graduate school was as a research assistant in a lab that sequenced bacterial genomes. While I was the one creating shotgun libraries for the DNA sequencing pipeline, the sequencing reaction was performed using dideoxynucleotides labeled with fluorescent dyes and amplified in thermal cyclers. The resulting fragments were separated by manual loading on tall slab polyacrylamide gels (Applied Biosystems ABI 377s) or, once the lab got them running, capillary electrophoresis of four 96-well plates at a time (ABI 3700s).

Sequencing throughput has only increased since I left the lab, first by increasing the well density of plates and the number of capillaries used in capillary electrophoresis, but more importantly with the advent of short-read, massively parallel next-generation sequencing. The next-gen or NGS technique decreased the time needed to sequence because many sequences are determined at the same time, significantly accelerating sequencing capacity. Instruments have also shrunk, as has the price per base pair, the cost measurement we used when I was in the lab. The long-prophesied threshold of $1,000 per genome has arrived. And now, according to a recent tweet from a Nanopore conference, you can add a sequencing module to your mobile device:

Continue reading “Where Would DNA Sequencing Be Without Leroy Hood?”

Choosing a Better Path for Your NGS Workflow

Imagine you are traveling in your car and must pass through a mountain range to get to your destination. You’ve been following a set of directions when you realize you have a decision to make. Will you stay on your current route, which is many miles shorter but contains a long tunnel that cuts straight through the mountains and obstructs your view? Or will you switch to a longer, more scenic route that bypasses the tunnel ahead and gets you to your destination a bit later than you wanted?

Choosing which route to take illustrates a clear trade-off that has to be considered—which is more valuable, speed or understanding? Yes, the tunnel gets you from one place to another faster. But what are you missing as a result? Is it worth a little extra time to see the majestic landscape that you are passing through?

Considering this trade-off is especially critical for researchers working with human DNA purified from formalin-fixed paraffin-embedded (FFPE) or circulating cell-free DNA (ccfDNA) samples for next-generation sequencing (NGS). These sample types present a few challenges when performing NGS. FFPE samples are prone to degradation, while ccfDNA samples are susceptible to gDNA contamination, and both offer a very limited amount of starting material to work with.

Continue reading “Choosing a Better Path for Your NGS Workflow”

Better NGS Size Selection

One of the most critical parts of a next-generation sequencing (NGS) workflow is library preparation, and nearly all NGS library preparation methods use some type of size-selective purification. This process removes unwanted fragment sizes that would interfere with downstream library preparation steps, sequencing or analysis.

Different applications may call for removing undesired enzymes and buffers, or removing nucleotides, primers and adapters during NGS library or PCR sample cleanup. In dual size selection methods, both large and small DNA fragments are removed to ensure optimal library sizing prior to final sequencing. In all cases, accurate size selection is key to obtaining optimal downstream performance and sequencing results.
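Conceptually, dual size selection is a band-pass filter on fragment length. A minimal sketch, with made-up fragment lengths and an assumed 300–500 bp target window:

```python
# Illustrative sketch (not a bead-chemistry protocol): dual size selection
# modeled as keeping only fragments inside a target window.
fragments = [85, 150, 320, 410, 480, 620, 900]  # fragment lengths in bp (made up)

LOWER_BP, UPPER_BP = 300, 500  # hypothetical target library window

# First cut removes fragments above the window; second cut removes those below.
selected = [f for f in fragments if LOWER_BP <= f <= UPPER_BP]

print(selected)  # → [320, 410, 480]
```

In practice the two cuts are made sequentially with different bead-to-sample ratios, but the end result is this window of retained sizes.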

Current methods and chemistries for the purposes listed above have been in use for several years; however, they come at a cost in performance and ease of use. Many library preparation methods involve serial purifications, which can result in a loss of DNA. Current methods can lose as much as 20–30% of the DNA with each purification step. Ultimately this may necessitate more starting material, which may not be possible with limited, precious samples, or more PCR cycles, which can result in sequencing bias. Sample-to-sample reproducibility is a daily challenge that is also regularly cited as an area for improvement in size selection.

Continue reading “Better NGS Size Selection”

Six (and a Half) Reasons to Quantitate Your DNA

Knowing how much DNA you have is fundamental to successful experiments. Without a number you can be confident in, the DNA input for subsequent experiments can lead you astray. Below are six reasons why DNA samples should be quantitated.

6. Saving time by knowing what you have rather than repeating experiments. Without quantitating your DNA, how certain can you be that the same amount of DNA is consistently added? Always using the same volume for every experiment does not guarantee the same DNA amount goes into the assay. Continue reading “Six (and a Half) Reasons to Quantitate Your DNA”
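The point about equal volumes is just mass = concentration × volume. A tiny worked example, with hypothetical concentrations:

```python
# Same volume, very different amounts: why "I always add 2 µL" is not a
# quantitation strategy. Concentrations below are hypothetical.
samples = {"sample A": 25.0, "sample B": 80.0}  # ng/µL
volume_ul = 2.0  # the same volume pipetted from each sample

for name, conc in samples.items():
    mass_ng = conc * volume_ul
    print(f"{name}: {mass_ng:.0f} ng of DNA in {volume_ul} µL")
# sample A contributes 50 ng; sample B contributes 160 ng - a >3x difference.
```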

Massively Parallel Sequencing for Forensic DNA Analysis

Today’s blog is written by guest blogger, Douglas R. Storts, PhD, head of Research, Nucleic Acid Technologies, Promega Corporation.

Massively parallel sequencing (MPS), also called next generation sequencing (NGS), has the potential to alleviate some of the biggest challenges facing forensic laboratories, namely degraded DNA and samples containing DNA from multiple contributors. Unlike capillary electrophoresis, MPS genotyping methods do not require fluorescently-labeled oligonucleotides to distinguish amplification products of similar size. Furthermore, it is not necessary to design primers within a color channel to generate amplicons of different sizes to avoid allele overlap. Consequently, all the amplicons can be of a similar, small size (typically <275 base pairs). The small size of the amplicons is particularly advantageous when working with degraded DNA. Because the alleles are distinguished by the number of repeats and the DNA sequence, additional information can be derived from a sample. This can be especially important when genotyping mixtures. As previously demonstrated (1), this sequence variation can help distinguish stutter “peaks” from minor contributor alleles.

Because there is no reliance on size and fluorescent labels, significantly greater multiplexing is possible with MPS approaches. In addition to autosomal short tandem repeats (STRs), we can also sequence Y-STRs, single nucleotide polymorphisms (SNPs), and the mitochondrial DNA control region. The advantage of this approach is that the forensic analyst does not need a priori knowledge of which genotyping method would benefit a sample most.

Despite these major advantages, there are limitations to the near-term, broad deployment of current MPS technology in forensic laboratories. The limitations fall into four main categories: workflow, costs, performance with forensically relevant samples, and community guidelines. Continue reading “Massively Parallel Sequencing for Forensic DNA Analysis”

Real-Time (quantitative) qPCR for Quantitating Library Prep before NGS

Real-time (or quantitative, qPCR) monitors PCR amplification as it happens and allows you to measure the starting material in your reaction.

This is the last in a series of four blogs about quantitation for NGS, written by guest blogger Adam Blatter, Product Specialist in Integrated Solutions at Promega.

When it comes to nucleic acid quantitation, real-time or quantitative PCR (qPCR) is considered the gold standard because of its unmatched sensitivity, specificity and accuracy. qPCR relies on thermal cycling, consisting of repeated cycles of heating and cooling for DNA melting and enzymatic replication. Detection instrumentation measures the accumulation of DNA product after each round of amplification in real time.

Because PCR amplifies specific regions of DNA, the method is highly sensitive, specific to DNA, and able to determine whether a sample is truly amplifiable. Degraded DNA or free nucleotides, which might otherwise skew your quantitation, will not contribute to the signal, so your measurement will be more accurate.
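The quantitation itself comes from a standard curve: a dilution series of known inputs yields Ct values, and fitting Ct against log10(input) lets you interpolate an unknown. A minimal sketch with illustrative numbers (a perfectly efficient assay has a slope near −3.32):

```python
import math

# Sketch of qPCR standard-curve quantitation. The dilution series and Ct
# values below are illustrative, not measured data.
standards = [(1e4, 18.0), (1e3, 21.32), (1e2, 24.64), (1e1, 27.96)]  # (copies, Ct)

# Least-squares fit of Ct = slope * log10(copies) + intercept
xs = [math.log10(q) for q, _ in standards]
ys = [ct for _, ct in standards]
n = len(xs)
x_mean, y_mean = sum(xs) / n, sum(ys) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / sum(
    (x - x_mean) ** 2 for x in xs
)
intercept = y_mean - slope * x_mean

# Interpolate an unknown sample from its measured Ct.
unknown_ct = 23.0
copies = 10 ** ((unknown_ct - intercept) / slope)
print(f"slope {slope:.2f} (about -3.32 at 100% efficiency); unknown ~ {copies:.0f} copies")
```

Real instruments report this fit (with efficiency and R²) automatically; the sketch just shows where the numbers come from.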

However, while qPCR does provide technical advantages, the method requires special instrumentation, specialized reagents and is a more time-consuming process. In addition, you will probably need to optimize your qPCR assay for each of your targets to achieve your desired results.

Because of the added complexity and cost, qPCR is a technique suited for post-library quantitation, when you need to know the exact amount of amplifiable, adapter-ligated DNA. PCR is the only method capable of specifically targeting these library constructs over other DNA that may be present. This specificity is important because accurate normalization is critical for producing even coverage in multiplex experiments, where equimolar amounts of several libraries are combined into a pooled sample. Normalization is essential if you are screening for rare variants that might be lost in the background and go undetected if underrepresented in a mixed pool.
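The equimolar-pooling step is simple arithmetic once each library's molar concentration is known. A sketch with hypothetical concentrations (using the fact that 1 nM equals 1 fmol/µL):

```python
# Sketch of normalizing libraries to equimolar amounts before pooling.
# Library concentrations (nM, e.g., from post-library qPCR) are hypothetical.
libraries = {"lib1": 12.0, "lib2": 4.0, "lib3": 8.0}  # nM

target_fmol = 20.0  # femtomoles of each library wanted in the pool

# 1 nM = 1 fmol/µL, so volume (µL) = desired amount (fmol) / concentration (nM)
volumes = {name: target_fmol / conc for name, conc in libraries.items()}

for name, vol in volumes.items():
    print(f"{name}: add {vol:.2f} µL")
```

Each library then contributes the same number of molecules, so no single library dominates the read count in the pooled run.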


Read Part 1: When Every Step Counts: Quantitation for NGS

Read Part 2: Nucleic Acid Quantitation by UV Absorbance: Not for NGS

Read Part 3: Fluorescence Dye-Based Quantitation: Sensitive and Specific for NGS Applications