A REVOLUTION IN THE MICROBIOLOGY LABORATORY

October 7, 2009


Until comparatively recently, the tools available to the food microbiologist were antiquated in comparison with those at the disposal of analytical chemists and other disciplines. With a basic technology essentially unchanged for decades, food microbiologists were strictly limited in what they could achieve. After a long wait, the revolution in molecular biology that has already changed other branches of microbiology beyond recognition is set to propel food microbiology into the twenty-first century. Now that they are at last equipped with a state-of-the-art toolbox, food microbiologists are on the brink of new discoveries and poised to play a wider and more important role in maintaining the safety and quality of our food supply.

Step into the food chemistry laboratory of a large manufacturer, contract testing business, or research institution at any time during the last twenty years and you would see a growing array of computer-controlled, automated analysis equipment operated by technicians who spend a good deal of their time staring at screens. Test tubes and Bunsen burners would be a rare sight indeed. The corresponding microbiology lab, on the other hand, would probably still be occupied by staff preparing or examining Petri dishes full of agar culture media, often by hand.

Certainly, many tasks in the microbiology laboratory have become at least partly automated over the intervening years, but almost every operation still relies on the need to culture microorganisms. Whether testing is aimed at detecting pathogens, counting potential spoilage organisms, studying the metabolism of bacteria and moulds, or identifying a contaminant, the first requirement is almost always to culture the organism so that there are enough cells to detect or study.

That culture step means that the total time taken for any microbiological test is at least 24 hours – in some cases 5 days or more – and that limits the role that microbiological testing can play in food safety and quality assurance. Not only is the need to culture microorganisms a bottleneck in terms of time, but it also places other restrictions on what can be done. For example, reliance on culture means that only a small proportion of the microbial world can ever be studied. Even in an environment as well understood as the human gut, recent research suggests that two thirds of the species present are unknown to science, mainly because we do not yet know how to grow them in culture.

While chemical testing has become ever more sophisticated, sensitive and rapid, microbiologists have been effectively hamstrung by their need to culture microorganisms.

An expanding market

Despite this, according to Food Micro-2008 to 2013, a recently published report by Strategic Consulting Inc. (SCI), the market for microbiological testing in the food sector has been growing at nearly 9% every year since 1998. Not only that, but the food sector now accounts for nearly half of the total industrial microbiology market. The SCI report estimates that by 2013, the number of tests carried out worldwide will be nearly 970 million, up from 740 million in 2008. The drivers for this growth are identified as an overall increase in food production, food safety concerns, demands from retailers and an increase in regulatory requirements.

The purpose of testing has also changed over the last two decades. Twenty years ago the emphasis was on quality control and it was not uncommon to find food products in warehouses or cold stores awaiting clearance by the in-house microbiology laboratory, something that is now almost unheard of. As the HACCP approach to food safety has been almost universally adopted, so microbiological testing that takes two or more days to complete has become less useful. Conventional microbiological testing can never provide results quickly enough to be used to monitor critical control points in a HACCP system, and it is now used mainly to verify that the system is working correctly.

It is only too clear that a growing market and an aging technology that doesn’t do what its users really need will eventually create a huge pent-up demand for innovative new techniques. In fact, the SCI report estimates that the market for so-called ‘rapid methods’ will grow at more than twice the rate of the overall testing market in the next few years. “By 2013, much will have changed,” says Tom Weschler, president of SCI. “Traditional methods will still be the predominant ones used, accounting for 491.2 million tests. However, traditional will represent only 50.7% of all tests conducted, which is approximately an 8% decrease based on the percentage of tests performed.”

There have been a number of attempts at developing rapid methods for counting microbes over the last 30 years or so, ranging from impedance and conductance measurement, to ATP measurement and dye-reduction tests. Rapid detection and identification tests have also been developed, arguably with more success, often based on biochemical or immunological techniques. Some of these rapid methods have been very successful and have claimed a significant share of the market, but the one thing they typically have in common is that they need a relatively large number of living microbial cells to be present, and that means culturing the organisms. Despite the growth in rapid method use, the ISO methods used for enforcement testing are still mostly traditional, culture-based methods.

The rise and rise of PCR

The strongest challenge yet to traditional techniques has its roots in the 1980s with the development of molecular biology techniques focusing on the genes of microorganisms, rather than their physical and metabolic characteristics. The key discovery was the polymerase chain reaction, or PCR, first developed in 1983 by Dr Kary Mullis, who later won a Nobel Prize for his work. PCR is a method for synthesising multiple copies of (amplifying) a specific piece of DNA and it proved to be a breakthrough technique in molecular biology since it made the speedy detection of very small amounts of DNA not only possible, but also practical. For PCR to take place, four basic components are needed.

  • A DNA template – containing the sequence that is to be amplified. This must be something very specific to the target and is often a single gene. For example, the sequences chosen for detecting food poisoning bacteria are often genes for virulence factors that determine pathogenicity.
  • Primers – a pair of short artificial single-stranded DNA sections, which are exactly complementary to specific parts of the target sequence. Primer design is a critical factor in PCR: the primers must be specific enough to bind only at the correct points bordering the target sequence (the sketch after this list illustrates this complementarity check).
  • A heat-stable DNA-polymerase enzyme – usually Taq polymerase from a thermophilic bacterium, which catalyses the reaction.
  • Free nucleotides – the raw materials used to build multiple copies of the DNA template during the reaction.
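
To make the idea of primer complementarity concrete, the annealing check can be sketched in a few lines of Python. This is purely illustrative – the sequences below are invented, and real primer design must also consider factors such as melting temperature, GC content and secondary structure:

    # Illustrative only: a primer anneals to a template strand (both written
    # 5'->3') wherever the primer's reverse complement appears in that strand.
    COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def reverse_complement(seq):
        """Return the reverse complement of a DNA sequence."""
        return "".join(COMPLEMENT[base] for base in reversed(seq))

    def primer_binds(template, primer):
        """True if the primer can anneal somewhere on this template strand."""
        return reverse_complement(primer) in template

    # Hypothetical 40-base template containing the target region.
    template = "ATGGCGTACCTTGAGCATCGGATTACCGTAGCTAGGCTAA"
    primer = "GGTACGCCAT"  # anneals at the start of the template strand

    print(primer_binds(template, primer))        # True
    print(primer_binds(template, "AAAAAAAAAA"))  # False - no complementary site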

The first step in the process is to extract the DNA template from the sample. This is normally accomplished by heating with a detergent to lyse bacterial cells and release DNA. However, extracting DNA directly from food samples can be difficult and may be affected by so-called ‘PCR inhibitors’, food components such as fats, polysaccharides and polyphenols. To help overcome this problem, DNA purification kits have been developed commercially for food materials, but the usual approach for many food microbiology applications is still to add an enrichment culture step, typically for 16-24 hours, and then extract DNA from the culture – even PCR methods cannot dispense with culturing microbes entirely.

Once the DNA has been extracted and cleaned up, the PCR reaction can begin. The first step is to raise the temperature to about 90-95°C. This causes the double-stranded DNA to denature into single strands. The temperature is then reduced to about 50-65°C to allow the two primers to bind, or anneal, at specific points on the single-stranded DNA of the target sequence. Finally, the temperature is raised to 70-74°C and the DNA-polymerase enzyme catalyses the duplication of the target sequence, using the free nucleotides as building blocks and starting at the annealed primers on each single strand – a process known as extension. This results in two double-stranded DNA fragments, which are identical copies of the original target sequence. The temperature cycling process is then repeated 30-40 times, creating a theoretical doubling of the number of copies of the target sequence at each cycle. This can produce sufficient DNA for reliable detection from a single target sequence in just a few hours.
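
The arithmetic behind this exponential amplification is easy to sketch in Python. The snippet below is a back-of-the-envelope illustration, not a model of any real assay – the per-cycle efficiency figure is an assumption, since real reactions fall somewhat short of perfect doubling:

    # Copies present after a number of cycles, assuming each existing copy
    # yields `efficiency` extra copies per cycle (1.0 = perfect doubling).
    def amplify(start_copies, cycles, efficiency=0.9):
        return start_copies * (1 + efficiency) ** cycles

    for cycles in (30, 35, 40):
        print(f"{cycles} cycles: ~{amplify(1, cycles):.2e} copies")

    # Perfect doubling of a single copy over 30 cycles gives 2**30,
    # i.e. roughly 1.1e9 copies.
    print(f"Perfect doubling, 30 cycles: {amplify(1, 30, efficiency=1.0):.2e}")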

[Figure: How PCR works]

Once enough copies have been generated, the next step in the process is to detect them, and there are two main types of detection method.

  • End-point PCR detection – this takes place once the amplification process is complete and typically involves gel electrophoresis, followed by staining to detect the amplified DNA fragments. This method is quite time-consuming, vulnerable to contamination and can give only a qualitative result.
  • Real-time PCR detection – combining the amplification and detection stages of the process so that amplification is monitored continuously. Real-time detection is more accurate and the result can also be quantified.

Most applications for foodborne pathogens employ real-time PCR detection, and commercial test kits typically use ‘fluorescent reporter probes’ as the detection method. This method utilises an additional short oligonucleotide, the probe, which also binds specifically to the target DNA sequence. Probes have a fluorescent ‘reporter’ dye at one end and a ‘quencher’ dye, which suppresses the reporter’s fluorescence, at the other. During the extension stage the probe is cleaved by the DNA-polymerase, separating the reporter from the quencher so that the reporter begins to fluoresce strongly.


The fluorescence can be measured at each cycle and increases in proportion to the number of target sequence copies produced. This means that the assay can be made quantitative by recording the cycle at which the fluorescence intensity rises above the background level for each test sample and for a set of standards run at the same time. A standard curve can then be drawn and the amount of target DNA present in the sample can be calculated from the standard curve. Probe-based detection also allows for more than one target DNA sequence to be assayed in the same sample, by using specific probes equipped with different coloured dyes. For example, some Campylobacter detection kits allow for several species to be detected and quantified in the same sample.
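
In code, the quantification described above boils down to two small steps: finding the threshold-crossing cycle (Ct) for each reaction, then fitting a straight line of Ct against log10 of starting copies. The Python sketch below uses invented standards and a toy fluorescence trace purely to show the shape of the calculation:

    import math

    def ct_from_curve(fluorescence, threshold):
        """Return the first cycle (1-indexed) whose signal exceeds the threshold."""
        for cycle, signal in enumerate(fluorescence, start=1):
            if signal > threshold:
                return cycle
        raise ValueError("threshold never crossed")

    # Invented standards: (starting copies, measured Ct). Ct falls by roughly
    # 3.3 cycles for every tenfold increase in starting copies.
    standards = [(1e6, 15.1), (1e5, 18.4), (1e4, 21.8), (1e3, 25.2)]

    # Least-squares fit of Ct = slope * log10(copies) + intercept.
    xs = [math.log10(copies) for copies, _ in standards]
    ys = [ct for _, ct in standards]
    n = len(standards)
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean

    # A toy exponential trace that first exceeds the threshold at cycle 20.
    sample_ct = ct_from_curve([1e-6 * 2 ** c for c in range(1, 41)], threshold=1.0)
    copies = 10 ** ((sample_ct - intercept) / slope)
    print(f"Ct = {sample_ct}, estimated starting copies: {copies:.3g}")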

Developing practical products

Real-time PCR assays for food microbiology have been developed into commercial products and are generally highly automated to minimise the number of operations involved and reduce the risk of contamination. The reaction usually takes place inside a combined thermocycler/fluorescence detection instrument and uses pre-prepared reagents, often in dried tablet form. The test protocols for different pathogens are typically designed to run under identical conditions so that several assays can be run simultaneously. The thermocycling and detection processes are controlled by dedicated software that also calculates and interprets the results. For foodborne pathogen detection tests, the entire process can be completed within 20-30 hours.

The main advantage of PCR systems over both traditional and other rapid methods is in saving time – both the time from sampling to result and the technician time needed to run the assay. Even rapid immunoassay-based pathogen detection methods often require a secondary enrichment stage before low numbers of cells can be detected, and can take up to 48 hours to achieve a clear result. The high degree of automation built into PCR systems means that relatively unskilled staff can run them without extensive training. The high specificity of PCR can also mean fewer repeat tests, and the flexibility of real-time PCR assays allows tests for several different pathogens to be run simultaneously. The possibility of quantifying the result is a further advantage over other methods.

Needless to say, there is a downside to PCR assays for food microbiology, and the really big factor is cost. Automated PCR systems are expensive both to buy and to run: the equipment and the consumables cost considerably more than those for other methods, especially in systems using fluorescent probes. Costs may be high enough to put PCR-based testing out of the reach of smaller laboratories, especially when there is no overwhelming cost benefit in rapid test results.

For larger labs the situation is different. Economies of scale and reduced labour costs can drive overall costs down and where time is of the essence, speedy results do carry a cost benefit. In fact, a recent study carried out at the German Federal Institute for Risk Assessment (BfR) concluded that taking all costs into account, real-time PCR could be significantly less expensive than the corresponding – and notably more labour-intensive – ISO culture method for enumerating salmonellae in food samples.

A number of commercial PCR systems are currently offered for food microbiology applications. One of the first into the marketplace was the BAX® system from DuPont Qualicon, and this has been joined by TaqMan® food pathogen detection kits from Applied Biosystems, iQ-Check real-time PCR kits from Bio-Rad, foodproof® real-time PCR detection kits produced by Biotecon Diagnostics and distributed by Merck, and several others. The growing range of available tests is mainly designed for detecting pathogens, including Salmonella, Listeria, E. coli O157 and Campylobacter, but microbial quality screening tests, such as a beer screening kit for the brewing industry and a yeast and mould test, are also available. PCR systems have an advantage in that the basic technology can be quickly adapted to new tests simply by designing appropriate primers. For example, DuPont Qualicon has recently launched a BAX test for Vibrio species.

Future developments

It may well be that we have only seen the tip of the iceberg for the impact that PCR will eventually have on food microbiology. As the technology develops, the range of available tests seems set to grow, perhaps to the point where every conventional microbiological test has a PCR equivalent. There may also be scope to reduce the time taken to produce a result with better DNA extraction and cleanup techniques and more sensitive assays. It may even be possible to detect pathogens in food without that time-consuming enrichment stage. Costs, too, are likely to be driven down as the market grows and competition increases.

PCR can already do things that most conventional methods cannot do, such as detect unculturable and damaged cells and directly detect and identify pathogenic strains. It is possible that future developments may even lead to changes in food legislation. For example, the current EU Microbiological Criteria Regulation could become obsolete as testing capability and specificity improve and new methods are developed. The scope of the microbiology lab could be broadened by undertaking other PCR-based tests for parameters like GMOs, allergens, meat species and authentication of ingredients. Food microbiology labs of the future may well base all their operations on PCR and evolve into molecular biology diagnostic departments. The implications for small labs are less rosy. Before long, labs unable to invest in PCR technology may find themselves left behind. Molecular methods may also drive the trend towards outsourcing testing to well-resourced large contract testing labs able to offer the full range of services. One thing looks certain: molecular biology in general and PCR in particular will change food microbiology forever and could propel the discipline into an exciting new era, ripe with potential.

Adapted from an article first published in Food Engineering and Ingredients
