Creative solutions alleviate cost and performance pressures

Thursday, 08 June, 2006


The need to save money and improve performance at the same time is providing added impetus to the development of laboratory and analysis solutions. Chemistry is becoming a more significant factor in separation technology, improving both sensitivity and the ability to detect target components.

The chemical properties of substances play a fundamental role wherever you look: materials and active ingredient research, industrial process monitoring, medical diagnosis, environmental monitoring and the detection of genetically modified grain. The task of chemical analysis instrumentation is to determine the exact structure and composition of substances by type and quantity. State-of-the-art laboratory and analytical techniques enable users to identify the type and quantity of a substance with near certainty. They can determine whether a substance is present, whether a chemical reaction has taken place and, if so, what the result of that reaction was. This is no trivial task, given the enormous number of possible chemical compounds and naturally occurring substances.

Although the task itself is the major factor determining the selection of the analytical technique, one general rule applies: to an increasing extent, users are turning to automated, high-performance, networked laboratory and analytical systems to meet today's chemical and analytical challenges. The systems must offer maximum efficiency, selectivity and sensitivity combined with a very rugged design. They also have to meet or exceed the requirements imposed by national quality, environmental and safety standards and regulations over the long term.

Users often rely on computer-based management systems to facilitate and control analytical procedures and to make the results available on a network. These laboratory information management systems (LIMS) have become indispensable tools which enable users to comply with national and international requirements. Every single step must be traceable, including sample weighing, sample status queries at any point in the process and modification of measurement parameters, and this requirement is not limited to the market introduction of new medicines or chemicals.

21 CFR Part 11 requires complete documentation of manual steps such as dispensing solvents or reagents into the sample or modifying the method, and this is not easy. The task can become very time-, labour- and cost-intensive, especially when sample volumes increase. Depending on the type and complexity of the analytical operation, highly qualified (and expensive) personnel may be needed to perform these tasks. This may explain why the need to save time, labour and money is the driving force behind the development and improvement of new methods and equipment.
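
To make the traceability requirement concrete, the following is a minimal sketch of what one electronic audit-trail record might look like. The field names and the append-only log are illustrative assumptions, not a reference to any particular LIMS product or to the text of the regulation.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class AuditRecord:
        """One traceable step: who did what, to which sample, and when."""
        sample_id: str
        user: str
        action: str          # e.g. 'weigh sample', 'add reagent', 'edit method'
        old_value: str
        new_value: str
        timestamp: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat())

    # Records are only ever appended, never modified or deleted.
    audit_log: list[AuditRecord] = []
    audit_log.append(AuditRecord("S-0421", "j.smith", "add reagent",
                                 old_value="", new_value="2.0 ml methanol"))
    audit_log.append(AuditRecord("S-0421", "j.smith", "edit method",
                                 old_value="oven 40 degC", new_value="oven 60 degC"))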

Automation drives down cost and enhances safety and reliability

It takes a relatively large amount of effort before any type of sample reveals information about the composition and constituents of a substance. The effort is particularly large when a sample matrix is involved, as is often the case in biotechnology, gene technology and the life sciences. Experts believe that this field has the largest potential for innovation. When users look for simplification and automation, they tend to focus on sample preparation (often in connection with sample injection), which traditionally involves a large number of manual operations. These steps are error-prone, and they also offer the greatest potential for reducing the cost of chemical analysis.

Users want to automate analytical processes whenever a long time normally elapses between extraction of the sample and availability of usable results, and when the results are crucial for the downstream process, for example in industrial process applications or the monitoring of environmental contaminants. In cases like this, the solution is to deploy online or at-line analysis techniques, send a continuous stream of data from the process or provide fast analysis results.

The trend towards high-performance, multifunctional laboratory robots with minimal sampling volumes continues unabated. To achieve usable results, users have to invest a lot of sweat and consume large amounts of solvents. These solvents are often toxic, and disposal can be both time consuming and costly. The market is looking for systems which are cost effective and have minimal environmental impact. A test institute in Hamburg used pressurised solvent extraction (PSE) and a high-performance industrial robot to detect the presence of carcinogenic pentachlorophenol (PCP) in textiles. Results were available within one hour, as opposed to the 16 hours per sample previously needed to perform the analysis. The institute was able to reduce its annual consumption of solvents by 93 per cent, saving about €21,400 annually.

What a laboratory robot can contribute depends on its mechanical capabilities, for example the sample handling options it offers, what its functional arms can do and how many directions it can operate in. However, system IQ is also crucial, in other words the degree to which the control software can be embedded into the overall analysis system. Ease of use is also a key issue, in order to keep handling errors to an absolute minimum.

Moving towards low detection thresholds and increased sample throughput

Demand for higher sample throughput is driving innovation forward, especially in the field of chromatography, one of the oldest analysis techniques. Although the technology is largely mature, chromatography-based separation methods, in particular capillary gas chromatography (GC) and high-performance liquid chromatography (HPLC), are used in nearly every chemical analysis laboratory, particularly in combination with mass-selective detection.

Put simply, terms such as Fast GC or Fast LC, which have recently been cropping up in the laboratory world, describe the attempt to reduce analysis time to a minimum without affecting the quality of the results. In GC applications, this is conceivable and feasible if the length and inner diameter of the capillary columns are reduced. The development of new column packing materials based on nanotubes or nanoparticles could enhance the performance of chromatography systems, but the technology is still in its infancy.
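
As a rough illustration of why shrinking the column speeds up a separation, the sketch below uses two common rules of thumb: plate height is of the order of the column inner diameter, and the optimum carrier gas velocity scales inversely with it. Both assumptions and the column dimensions are illustrative simplifications, not a method translation tool.

    # Back-of-envelope Fast GC scaling. Assumes plate height H ~ inner
    # diameter d and optimum carrier-gas velocity u ~ 1/d (rules of thumb).
    def plate_count(length_m, id_mm):
        """Approximate plate number N = L / H, with H taken as d."""
        return (length_m * 1000.0) / id_mm      # convert L to mm

    def relative_analysis_time(length_m, id_mm):
        """t = L / u; with u ~ 1/d, time scales with L * d (arbitrary units)."""
        return length_m * id_mm

    standard = (30.0, 0.25)   # conventional 30 m x 0.25 mm column
    fast = (10.0, 0.10)       # short, narrow-bore 10 m x 0.10 mm column

    print(plate_count(*standard), plate_count(*fast))   # ~120,000 vs ~100,000
    speedup = relative_analysis_time(*standard) / relative_analysis_time(*fast)
    print(round(speedup, 1))  # ~7.5x shorter run at similar separation power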

To increase the knowledge they can gain about complex samples, users are increasingly combining a variety of techniques, including headspace analysis, large-volume injection (LVI), direct thermodesorption, solid-phase microextraction (SPME) and stir-bar sorptive extraction. These techniques can be combined with comprehensive two-dimensional gas chromatography (GCxGC), in which columns of different polarity are connected in series, and with mass-selective detection, in particular MS/MS. With this method, ions are trapped and re-fragmented for further investigation under isolated conditions.

One development in the field of HPLC sounds exotic, but it is actually common practice in GC. A number of manufacturers now offer systems, either as new LC systems or as upgrades to existing ones, which make it possible to perform separations at different temperatures or with temperature gradients (temperature-programmed liquid chromatography, or TPLC). This can increase separation performance and resolution, and it can also significantly reduce the consumption of expensive or toxic organic solvents. A producer of aroma chemicals has developed LC-Taste, a method based on a temperature-programmable column oven which uses an eluent free of organic solvents. Taste testing of the eluent takes place in parallel with identification of the analytes using a standard detector.

Enhanced chemical expertise increases detection capabilities

A number of methods are available for structural analysis and quality control. IR and Raman spectroscopy are widely used, and laser spectroscopy is becoming a more popular alternative when other methods reach their limits. Mass spectrometry is one of the more important and widely used analytical techniques. New generations of chip-based detectors can simultaneously identify different masses including the mass of large molecules. Inductively coupled plasma mass spectrometry (ICP-MS) is rapidly becoming the solution of choice for isotope ratio determination thanks to the development of high-performance and increasingly user-friendly multi-collector instruments.

Accurate, efficient calibration continues to be a major issue, which is why isotope dilution mass spectrometry is used in nearly all fields of analysis. Because the isotopically labelled standard differs in mass from the analyte, its signal does not interfere with the analyte signal. This enables users to reduce the number of chromatographic runs and save time, because the standard is added directly to the sample rather than acting as an external standard, which would require additional run time on the equipment.
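
In its simplest form, the isotope dilution calculation derives the analyte amount from the isotope ratio measured in the spiked sample. The sketch below uses the simplified equation without the isotope-abundance sum corrections of the full formula, and all numbers are hypothetical.

    # Minimal isotope dilution sketch: a known amount of isotopically
    # enriched spike is mixed into the sample, and the analyte amount
    # follows from the measured isotope ratio of the blend.
    def isotope_dilution(n_spike, r_spike, r_sample, r_mix):
        """Simplified IDMS: n_x = n_spike * (r_spike - r_mix) / (r_mix - r_sample).

        Ratios are isotope abundance ratios (e.g. heavy/light) of the pure
        spike, the unspiked sample and the measured blend. The full equation
        carries additional isotope-abundance sum terms omitted here.
        """
        return n_spike * (r_spike - r_mix) / (r_mix - r_sample)

    # 10 nmol of spike (ratio 100), natural sample ratio 0.01, measured
    # blend ratio 2.0 -> 10 * 98 / 1.99, i.e. about 492 nmol of analyte
    print(isotope_dilution(10.0, 100.0, 0.01, 2.0))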

Problems create new opportunities

The terrorist attacks of the past five years have stimulated the development and use of security equipment designed to detect chemical weapons, toxins and explosives in high-security zones such as airport passenger check-in areas. The increased interest in ion mobility spectrometers (IMS) reflects the effort to detect explosives. IMS technology is based on the mobility of ionised molecules moving against a countercurrent of gas in an electric field. Molecules can be ionised using a Ni-63 radioactive source. The time of flight is measured, and the user knows very quickly whether the substance in question is present. Sensitivity is extremely high, and with some substances it is even better than GC-MS. The sensitivity range is usually 0.01 to 1000 ppm. IMS is to some extent comparable with time-of-flight (TOF) mass spectrometry, but IMS works at atmospheric pressure.
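
The identification step can be illustrated with a short calculation: the drift time measured over a known tube length and field strength yields a mobility, which is normalised to standard temperature and pressure for comparison against reference values. The drift tube dimensions and conditions below are illustrative assumptions, not the specifications of any particular instrument.

    # Sketch: converting an IMS drift time into a reduced mobility K0.
    # K = L / (t * E), then normalised to 0 degC and 1 atm.
    def reduced_mobility(drift_time_s, drift_length_m, field_v_per_m,
                         temp_k, pressure_pa):
        k = drift_length_m / (drift_time_s * field_v_per_m)
        return k * (273.15 / temp_k) * (pressure_pa / 101325.0)

    # 8 cm drift tube, 20 kV/m field, ~15 ms drift time at ambient conditions
    k0 = reduced_mobility(15e-3, 0.08, 20000.0, 298.0, 101325.0)
    print(round(k0 * 1e4, 2), "cm^2/(V*s)")  # ~2.44, a plausible K0 value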

A number of new developments are directly related to the effort to decode the human genome and eliminate sources of disease. Gene technology and biotechnology are primarily concerned with the discovery and duplication of genetic data. A relatively new branch of the life sciences, proteomics, looks at all of the proteins in a cell. At any given point in time, a cell has expressed nothing like the complete complement of proteins it could express; most of the genes in a cell are never read. Conversely, a number of different proteins can originate from a single gene. These genes consist of modules which can be combined in various ways to form messenger molecules, from which the ribosomal reading machinery then produces the different proteins. This is what enables the cells of the immune system to quickly produce antibodies against a variety of pathogens. It has led a number of scientists to conclude that the complete complement of proteins which can be expressed and modified in a cell, the proteome, is biologically more significant than the genome.

Fluorescent arrays provide a standard way of analysing biomolecules in proteome research, for example to detect proteins and enzymes in complex matrices. However, MS is also an important analytical tool in the proteomics lab, because it provides more information about the composition and structure of a substance from a smaller sample volume than any other analytical technique. Matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS) as well as LC/MS can handle high sample throughput at high resolution. High-resolution Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry also plays a major role, particularly in the final stages of protein analysis, because its mass resolution and accuracy are one hundred times better than those of other MS techniques.
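
To put that hundredfold figure in perspective, resolving power is commonly expressed as R = m/delta-m, the ratio of an ion's mass to the smallest mass difference that can still be separated. The resolving powers below are typical orders of magnitude assumed for illustration, not specifications of particular instruments.

    # Resolving power R = m / delta_m: the smallest separable mass
    # difference at a given m/z. Values are illustrative orders of magnitude.
    def smallest_separable_delta(mz, resolving_power):
        return mz / resolving_power

    print(smallest_separable_delta(1000.0, 10_000))     # 0.1 Da
    print(smallest_separable_delta(1000.0, 1_000_000))  # 0.001 Da with FT-ICR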

Success is a question of time, not size

The move to the mini laboratory continues, and it is a characteristic feature of gene and proteome analysis. Using lab-on-a-chip technology, scientists have been trying for several years to place an entire lab on a microchip. The driving force behind this innovative branch of technology is the goal of saving time and money and minimising the use of sample material which is expensive or only available in limited quantities. Expectations are very high. Time will tell whether a pocket-sized universal lab is feasible and, if so, when we will see one. The initial steps have been encouraging: a number of microarrays used for gene expression analysis are already on the market.

Lab-on-a-chip technology is already being deployed, for example in applications where complex assays run on microfluidic chips. The determination of blood sugar levels in patients with diabetes is a good example. However, it is not possible to analyse intracellular processes using such sensors. Instead, the cell itself is used as a sensor: fluorescent proteins are introduced into the cell to provide a visual indication of what is going on.

Scientists are still waiting for the big breakthrough of lab-on-a-chip technology. The underlying vision is to determine the complete genome of every individual in order to detect different point mutations on an individual basis. Individual genome analysis will only become a reality, however, if the costs come down drastically to about $1000, which is the figure that is often heard in the industry. The current cost for this type of analysis runs into several million dollars.
