The truth about NGS and library prep: Find out what is inflating your costs
Isn’t it time we spoke the truth about the toll next-generation sequencing (NGS) is taking on your lab?
Yes, NGS is a revolutionary technology that can help you break new ground in your research, but when you get right down to it, how much time is actually spent making the big breakthroughs versus generating the raw sequencing data? If your lab is like most, the answer is: surprisingly little. Why? Because most of your team’s time is probably spent on tedious, labour-intensive tasks like library prep. Should you be concerned about that? Absolutely. Here’s why.
1. Valuable time is being wasted on NGS library preparation
Typical manual library prep workflows require a lot of hands-on time
Since the majority of library prep workflows can now be automated, time spent doing manual work is time that could and should be better spent — for example, on more intellectually demanding tasks like results interpretation and planning the next important experiment.
Another waste of time in the lab is having to order reagents and consumables from multiple vendors, or having to track down the correct reagents and consumables in the lab before library preparation can even begin.
Training is a hassle
Whether preparing libraries manually or with an automated system, an excessive amount of time is often spent on training. This can have a big negative impact on productivity, especially in labs where there is high staff turnover. In cases where automated library prep is not user-friendly or requires some level of programming skills, many labs will hand over day-to-day management and running of the system to a dedicated expert. However, this can be a costly option, not to mention a waste of valuable expertise.
Errors are more likely
Any steps that require manual intervention are opportunities for human error and potential sources of variability. Lots of replicates and boring repetitive tasks like pipetting are a recipe for mistakes because it’s easy to lose concentration. Errors and variability often translate into poor-quality or invalid results, in which case the experiment must be repeated. Not only can this cause an expensive delay, there may not even be enough sample remaining to enable a repeat experiment. For example, if it were a very rare patient sample, you might only have one shot.
Even when library prep is automated, it can take far too long to set up and optimise the scripts
Many automated workflows still involve a lot of time-consuming activities that could be further simplified and streamlined. For example, how much time are your lab members spending making up master mixes, ensuring the work-deck is configured properly, and reconfiguring protocols for their specific applications? Do you need to get help from a programmer every time a new application comes along, or even to make a simple script change? This is not uncommon.
2. Library reproducibility issues may be arising more frequently than necessary
Manual processing increases the risk of variability
Library prep in particular involves numerous pipetting steps, each of which increases the likelihood of errors, sample carry-over, contamination and operator variability.
Lack of standardisation can have a big impact
Use of standardised procedures and protocols is essential to ensure consistent results, yet this is a source of variability that is often overlooked. Problems can arise when there are inconsistencies or unplanned changes in the library prep kits and protocols being used, the quality and/or source of reagents, the standard to which incoming staff are trained, and so on.
Variability in the amount of input DNA or RNA
Nucleic acid variability is another common culprit affecting NGS library reproducibility, particularly if the kit you are using is not designed to tolerate a broad range of input amounts or sample volumes.
3. Your lab staff are getting burned out
Lab work is demanding at the best of times, but stress levels can go through the roof when staff are burdened with many mind-numbing and time-consuming tasks that don’t seem to add much value. In addition to compromising the quality of their work, this situation can cause ill health, job dissatisfaction, and ultimately a higher-than-average attrition rate. Don’t let it get to that point!
The cost of ignoring these issues may be higher than you think
The bottom line here is that while the cost of sequencing has plummeted over the years, a lot of labs are still spending far more than they should on NGS. Many ‘hidden’ costs of NGS can be traced back to the upstream steps of sample and library prep. Over time, the cost of error or poor quality in library prep is often higher than investing in a streamlined library prep workflow in the first instance. Any time spent doing something a robot could do or figuring out how to operate the robot — if there is one — is a total waste of resources. Likewise, having to hire or train dedicated experts is an unnecessary expense if you can source equipment and reagents designed to minimise complexity and maximise ease of use.
Until recently, labs running NGS at low throughput or only infrequently have tended to suffer the most from these sorts of issues because library prep solutions are typically designed with higher throughputs in mind. Fortunately, Tecan is working to make NGS more feasible at low throughputs.
To find out more about recent advances on this front, don’t miss our next article.