As the SARS-CoV-2 virus spread around the world in early 2020, many researchers shifted their focus to support the global endeavors to address the challenge. For two professors at the University of Wisconsin, their efforts started with animal models to study pathogenicity and grew into massive SARS-CoV-2 sequencing and COVID-19 testing projects.
“Being a scientist in this field gives a sense of purpose, but also a sense of obligation and responsibility,” says David O’Connor, PhD. “You always want to feel like you’re living up to that.”
Tradeoffs are a constant source of challenge in any research lab. To get faster results, you will probably need to use more resources (people, money, supplies). The powerful lasers used to do live cell imaging may well kill those cells in the process. Purifying DNA often leaves you to choose between purity and yield.
Working with biologics also involves a delicate balancing act. Producing compounds in biological models rather than by chemical synthesis offers many advantages, but it is not without challenges. One of those tradeoffs arises when scaling up: the more plasmid that is produced, the greater the probability of endotoxin contamination.
Implementing automated nucleic acid purification or making changes to your high-throughput (HT) workflow can be complicated and time-consuming. There are also many barriers to success, such as challenging sample types and maintaining desirable downstream results, not to mention actually getting the robotic instrumentation to do what you want it to do. All of this makes it easy to understand why many labs avoid automation or own expensive instrumentation that goes unused.
One of the most critical parts of a Next Generation Sequencing (NGS) workflow is library preparation, and nearly all NGS library preparation methods use some type of size-selective purification. This process removes unwanted fragment sizes that would interfere with downstream library preparation steps, sequencing or analysis.
Different applications may involve removing undesired enzymes and buffers, or removing nucleotides, primers and adapters for NGS library or PCR sample cleanup. In dual size selection methods, both large and small DNA fragments are removed to ensure optimal library sizing prior to final sequencing. In all cases, accurate size selection is key to obtaining optimal downstream performance and sequencing results.
The methods and chemistries currently used for these purposes have been in place for several years; however, they come at a cost in performance and ease of use. Many library preparation methods involve serial purifications, each of which can lose DNA; current methods can lose as much as 20–30% of the sample with each purification step. Ultimately, this may necessitate more starting material, which may not be possible with limited, precious samples, or additional PCR cycles, which can introduce sequencing bias. Sample-to-sample reproducibility is also a daily challenge that is regularly cited as an area for improvement in size selection.
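The per-step losses above compound quickly across serial purifications. As a rough illustration (not a protocol, just the arithmetic implied by the figures cited), the fraction of DNA remaining after n steps with a fixed fractional loss per step can be sketched as:

```python
# Illustrative arithmetic only: cumulative DNA recovery when each serial
# purification step loses a fixed fraction of the material.

def cumulative_recovery(loss_per_step: float, n_steps: int) -> float:
    """Fraction of starting DNA remaining after n serial purification steps."""
    return (1 - loss_per_step) ** n_steps

for loss in (0.20, 0.30):  # the 20-30% per-step loss range cited above
    for steps in (1, 2, 3):
        remaining = cumulative_recovery(loss, steps)
        print(f"{loss:.0%} loss/step, {steps} step(s): {remaining:.1%} remaining")
```

At a 30% loss per step, three serial cleanups leave only about a third of the starting DNA, which is why protocols with multiple purifications can demand so much more input material.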
Back in graduate school, I purified a lot of RNA, and after a while, I became fairly successful at it. My yields were good, and the RNA was intact. However, many of my early attempts at RNA isolation yielded degraded RNA that did not work well in many downstream applications. In my case, successfully isolating high-quality RNA required practice. During my trials and tribulations, I learned a lot of tricks and tips about how to obtain high-quality RNA. Here I share some of these tricks to help you speed through that “practice makes perfect” phase so that you can isolate RNA like a pro.
“I would do more with my samples, but it’s just not possible…I know there’s probably a wealth of information in there, but there is just no way to get it out…I’ve got blocks of tissue sitting in the lab, experiments I want to run, but no good way to get clean nucleic acids out.”
These are a few of the comments I heard when talking with scientists at the American Society of Human Genetics meeting last week in Montreal. They, and countless other researchers, are sitting on a treasure trove of information that may have been locked away a few months ago, a few years ago, or decades ago. I’m referring to formalin-fixed, paraffin-embedded (FFPE) tissue blocks. It is estimated that there are upwards of 400 million tissue blocks archived globally, and scientists are clamoring to find ways to best utilize nucleic acids derived from these tissues in applications like qPCR, microarrays, and next generation sequencing.1