
Validation of Control Measures

2.1 Introduction
ICMSF previously discussed validation of control measures in the supply chain (Zwietering et al.
2010) and portions of that paper are included in this chapter. The flexibility offered by an outcome
based risk management system must be supported by demonstration that the selected control measures
actually are capable of achieving the intended level of control on a consistent basis. Validation is
defined by the Codex Alimentarius Commission (2008) as:
“Validation: Obtaining evidence that a control measure or combination of control measures, if properly implemented, is capable of controlling the hazard to a specified outcome.”
The overall effectiveness of the control measures should be validated according to the prevalence of
the hazards in the food of concern, taking into consideration the characteristics of the individual
hazard(s) of concern, the established Food Safety Objectives or Performance Objectives, and the level of
risk to the consumer.
2.1.1 Relationship of Validation to Monitoring and Verification
In addition to the definition of validation cited above, the Codex Alimentarius Commission (2008)
adopted the following definitions:
“Monitoring: The act of conducting a planned sequence of observations or measurements of control parameters
to assess whether a control measure is under control.”
“Verification: The application of methods, procedures, tests and other evaluations, in addition to monitoring,
to determine whether a control measure is or has been operating as intended.”
Validation focuses on the collection and evaluation of scientific, technical and observational information
and is different from verification and monitoring. Monitoring is the on-going collection of information
on a control measure at the time the control measure is applied and verification is used to
determine that the control measures have been appropriately implemented. The successful implementation
of HACCP requires validation, which includes the clear identification of hazards, control
measures available, critical control points, critical limits and corrective actions. The outcomes of
monitoring and verification activities associated with a HACCP system assist in determining when
re-evaluation may be necessary. To be effective, the scope of validation may go beyond the control
measures used in the manufacturing facility and may include control areas such as primary processing
and consumer handling.
(Part of this chapter was published as: Zwietering MH, Stewart CM, Whiting RC, ICMSF (2010) Validation of control measures in a food chain using the FSO concept. Food Control 21:1716–1722.)
The production of safe food requires the application of GHP and HACCP principles to develop
and implement a total food safety management system that controls the significant hazards in the
food being produced. Some risk management principles are best addressed through GHP measures
(e.g., controlling the initial levels of a hazard through good hygiene) and others are clearly part of a
defined CCP within HACCP (e.g., reducing the level of a hazard, through a decontamination step).
Food manufacturers design processes to meet Performance Objectives (PO) or Performance
Criteria (PC), which can be set at specific points throughout the food chain to assure food safety.
Regulatory authorities are concerned with whether a group of products or the consequences of a
series of processing and handling steps prior to consumption can meet the Food Safety Objective
(FSO) and ensure that those foods achieve levels that are consistent with the Appropriate Level of
Protection (ALOP) (see Chap. 1, Utility of Microbiological Testing for Food Safety and Quality).
Control measures include controlling ingredients at the initial stages of the food chain and applying
treatments that reduce or eliminate contamination, such as washing, heating and disinfecting.
Control measures are also designed to prevent an increase of hazards during transportation and
storage, through cross-contamination during processing or cooking, or through recontamination
after those steps.
Control measures should be validated to determine whether the products meet the objectives;
however, different segments of the food industry undertake these activities depending on the situation.
Food processors may validate the control measures for the processes they use, and such validation
should focus on meeting the given PO or PC. In this case, both within-lot
and between-lot variability should be considered. In contrast, control measures validated
under the responsibility of regulatory authorities cover all control actions in the system for multiple
products and processes, including consideration of between-lot variability. In this case validation is
targeted at assessing the established PCs, POs and FSOs. For example, the effective risk management
of a meat production system may include validation of:
• Farm practices aimed at ensuring animal health and minimizing the level of infection in the herd
(zoonosis).
• Slaughter practices aimed at minimizing contamination.
• Chilling regimes and temperature control aimed at minimizing the potential for pathogen growth.
• Consumer instructions aimed at ensuring that the product is cooked to the minimum temperature
required to inactivate pathogens.
In this chapter, the prevalence and levels of microorganisms from the initial contamination (H0), reduction
(SR), growth and re-contamination (SI), and factors that influence these are considered throughout food
production until consumption. The influence of these factors on meeting the FSO is represented by the
equation H0 − SR + SI ≤ FSO. Stochastic aspects of the parameters are taken into account as well as deterministic
values. Potential key factors, data and data analysis methods are described. However, some of these
factors may not be relevant for a particular processing line or processor. Examples of the use of data to
validate one or a series of processes, including statistical insights, are provided.
2.2 Considerations for Validation
Processes can be validated through the use of a variety of approaches (Codex Alimentarius 2008)
including predictive modeling, the literature, microbiological challenge studies and use of safe harbors
(i.e., approaches that have been previously approved as delivering a safe product; see Chap. 1).
Not all of these approaches need to be used; often several are combined to supply sufficient validation
evidence. When a safe harbor approach is used, it may not be necessary to conduct validation studies
for that process. For example, a safe harbor for milk pasteurization is to deliver a minimum heat process
of 72°C for 15 s. This process criterion has been validated and therefore can be implemented by manufacturers
without re-validation of the process.
Numerous considerations for establishing the efficacy and equivalency of processes are discussed
by NACMCF (2006), which proposed the following steps for the development of processes intended
to reduce the pathogen(s) of concern:
• Conduct a hazard analysis to identify the microorganism(s) of public health concern for the
food.
• Determine the most resistant pathogen of public health concern that is likely to survive the process.
• Assess the level of inactivation needed. Ideally this would involve determining the initial cell
numbers and normal variation in concentration that occurs before processing.
• Consider the impact of the food matrix on pathogen survival and possible growth during storage.
• Validate the efficacy of the process.
• Define the critical limits that need to be met during processing so that the food will meet the performance
objectives and performance criteria.
• Define the specific equipment and operating parameters for the proposed process.
• Implement the process within GHP and/or HACCP.
Regardless of the methods used to determine and validate process criteria, similar microbiological
considerations need to be taken into account (NACMCF 2010). These include:
• What is the most resistant microorganism of public health significance for each process? When
determining the target microorganism, it is necessary to consider all pathogens that have an
epidemiologically
relevant association with a product, as the most resistant pathogen may not be
present in the highest numbers. Conversely, pathogens controlled by other means may not be of
public health significance in a product when growth is required in order to cause illness (i.e., C.
botulinum controlled by pH).
• Choice of strains used to conduct validation studies
• The phase of growth in which the microorganisms are harvested
• The substrate upon which the culture is grown and the associated environmental conditions (e.g., pH,
temperature, atmospheric conditions), including adaptation of culture when appropriate
• The suspending medium
• The food’s intrinsic factors, such as pH, aW, and preservative levels
• The sample size, preparation and handling (i.e., compositing, homogenizing, subsamples)
• Packaging conditions (packaging material and atmospheric conditions, including modified atmosphere
gas mixtures)
• Cell enumeration methods following the process and selection of appropriate measurement systems
• Processing variability
Three commonly used strategies for process validation are concurrent, retrospective and prospective
process validation. Concurrent process validation is based on collection and evaluation of data
from a process at the same time as its application. This is used when there is a
change or modification to an established and previously validated process. Retrospective process
validation is validation of product already in distribution based upon accumulated production, testing
and control data. This technique is often used in analyzing process failures that result in product
recalls. Prospective process validation is a deliberate, forward-looking, planned approach that determines
if the process can be relied upon with a high degree of confidence to deliver safe food.
Prospective validation is best suited for evaluating novel processes and must consider the equipment,
the process and the product (Keener 2006).
A team of experts is required for system validation because of the many skills involved, such as
engineering, microbiology, physical chemistry, etc. Involvement of external experts and regulatory
officials in the development of both the master validation plan and the validation protocols is essential
to ensure technical adequacy and acceptance by authorities. Process validation requires proper analysis
of objective data.
2.3 Validation of Control Measures
Validation generally begins with microbiological studies on a laboratory scale, progresses to a pilot
plant scale and ends with full validation on a commercial scale when possible or necessary.
Microbiological challenge testing is useful to validate process lethality against a target microorganism(s),
to determine the ability of a food to support microbial growth, and to determine the potential shelf life
of ambient or refrigerated foods. For example, inactivation kinetic studies can be conducted over a
small range of treatments, such as a unique combination of factors and levels (e.g., pH 6.5 and 70°C).
Conversely, studies can also be conducted over a broad range of treatments, and can illustrate where
failure occurs and help assess the margin of safety in any process, as well as providing data that can
be used in evaluation of deviations. Furthermore this facilitates development of predictive models for
future public or private use. Several microbiological predictive models are available, including the
USDA Pathogen Modeling Program (USDA 2006) and COMBASE (2010). Challenge studies can also
be used to determine processing criteria, although they are of less generic use than models and often
are used for particular products or as a way of validating the model predictions. On the other hand
models are often generic, and therefore do not contain all factors that are of relevance for a specific
food. Therefore models and challenge studies should be combined in an iterative way. This is further
discussed by NACMCF (2010). Finally, on a commercial scale, challenge studies can be conducted
using nonpathogenic surrogate microorganisms and shelf life studies with uninoculated product can
also provide useful information for validating a process.
While microbiological challenge testing can be used to determine the stability of a product with
regards to spoilage over the intended shelf life, the remainder of this discussion focuses on
microbiological
safety of food products. In the following sections, the initial contamination (H0),
reduction
(SR), growth and re-contamination (SI), and factors influencing these are discussed
sequentially, including data needs and experimental considerations.
It is important to note that in this text, diagnostic methods are assumed to be 100% sensitive and
100% specific, which is not the case. These characteristics of methods depend largely on the target
microorganism, the diagnostic method used and the food product investigated. False-negative results
might be expected, especially for pathogens present at low levels. These aspects need to be clearly
considered in validation studies.
2.3.1 Initial Level (H0), Standard Deviation and Distribution
The design of the food process influences the importance of incoming material for product safety. The
main source of the pathogen of concern may be from a major or minor ingredient, one incorporated in
the initial processing steps or one added later. It is important to understand which ingredient(s) may
harbor the pathogen and if there is a seasonal effect on the level of the pathogen. For example, the
number of Escherichia coli O157:H7-positive lots of ground beef sampled from 2001 to 2009 increased
in the June-October period in the USA (USDA-FSIS 2009). The geographical source of the ingredient
may also play a role in the likelihood of whether a certain pathogen is present in the raw ingredient.
If contamination is not avoidable, the goal is to develop specifications and criteria for the incoming
material that will lead to achievement of the final PO and FSO, in conjunction with the performance
criteria for the other steps in the food process. The specifications for accepting the incoming materials
include the acceptable proportion above a limit or the mean log level and standard deviation.
Information for validating that incoming materials comply with required specifications can come
from:
• Baseline data from government agencies.
• Documentation from suppliers that specifications are met (supplier provides validation and end
product testing).
• Baseline data from the processor’s experience or
• Test results for incoming lots.
Microbiological testing is one of the tools that can be used to evaluate whether a food safety system
is providing the level of control it was designed to deliver. A number of different types of microbiological
testing may be employed by industry and government. One of the most commonly used is
within lot testing, which compares the level of a microbiological hazard detected in a food against a
prespecified limit, i.e., a Microbiological Criterion (MC) (ICMSF 2002). MCs are designed to determine
adherence to GHP and HACCP (i.e., verification) when more effective and efficient means are
not available. In this context, FSOs and POs are limits to be met, and within-lot testing can provide a
statistically-designed means of determining whether these limits are being met (van Schothorst et al.
2009). To assess compliance of a lot to a MC, a sampling plan based on the MC specified and the
confidence level desired can be established. To do this, the recommendations for setting MCs as outlined
in Appendix A should be followed. The MC should specify the concentration to be met (m, in
CFU/g), the number of sample units (c) allowed to exceed the m value, the number of samples to
be tested (n), and an evaluation of the implications of the chosen sampling plan.
A sampling plan appropriate to assess compliance with a specified concentration can be developed
using the ICMSF spreadsheet (Legan et al. 2002, http://www.icmsf.org). The calculations underlying
the spreadsheet determine the probability that an analytical unit from a lot contains more than any
specified number of cells/g. That probability can be estimated from the mean concentration of the
cells in the lot, and its standard deviation. It is assumed that the distribution of concentrations of cells
in a lot is log-normally distributed. A Performance Objective is determined, e.g., that 99% of units
must contain less than a specified concentration of cells, and a corresponding mean log concentration
determined from the assumed standard deviation. Then the number of samples required to be taken
from the batch, to provide 95% confidence that an unacceptable batch will be rejected by sampling,
can be calculated taking into account the size of the analytical unit. In an example on Listeria monocytogenes
in cooked sausage (ICMSF 2002), the initial number in the raw materials prior to cooking
is assured to be no more than 10^3 CFU/g (i.e., H0 = 3). Often a PO for H0 can also be regarded as the
PO for the output of a previous stage of the food chain.
In any sampling process in microbiology, the actual number of organisms recovered in a sample
taken from a lot will also be affected by the random distribution of cells within the region that is
actually sampled. This randomness is described by the Poisson distribution. The relative effect of this
randomness is small when large numbers of cells are contained, and counted, from the
sample (e.g., when the true mean is 100 cells, the standard deviation is ±10), but it is relatively large when
the target concentration is one cell per sample, such as in presence/absence testing. Including this
consideration in design of a sampling plan is more important when the result of testing is presence
or absence, and has also been incorporated into the spreadsheet calculation (van Schothorst et al. 2009).
As for the evaluation of sampling plans based on testing against a specific number of cells, for evaluation
of sampling plans based on presence/absence testing it is also assumed that the distribution
of the
concentration of cells in the batch is log-normally distributed, and is characterized by a mean log and
standard deviation. The Poisson effect is also included in the calculations for the first alternative, but
is relatively minor.
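To make that logic concrete, the following sketch reproduces the core of the calculation described above under simplifying assumptions: lot concentrations are log-normally distributed, the plan rejects on any unit above the limit (c = 0), and the within-unit Poisson effect is ignored. It is a minimal illustration, not the ICMSF spreadsheet itself, and the function names and numbers are illustrative.

```python
# A minimal sketch (not the ICMSF spreadsheet itself), assuming log-normally
# distributed concentrations and a reject-on-any-positive (c = 0) plan; the
# within-unit Poisson effect is ignored. All numbers are illustrative.
from math import ceil, log
from scipy.stats import norm

def prob_unit_exceeds(limit_log, mean_log, sd_log):
    """Probability that one analytical unit exceeds `limit_log` (log CFU/g)."""
    return norm.sf(limit_log, loc=mean_log, scale=sd_log)

def samples_for_rejection(p_exceed, confidence=0.95):
    """Smallest n giving >= `confidence` that at least one of n units from an
    unacceptable lot exceeds the limit: 1 - (1 - p)^n >= confidence."""
    return ceil(log(1.0 - confidence) / log(1.0 - p_exceed))

# Unacceptable lot: mean 1.5 log CFU/g, sd 0.8, tested against m = 2 log CFU/g
p = prob_unit_exceeds(2.0, 1.5, 0.8)
print(f"P(unit > m) = {p:.3f}; n = {samples_for_rejection(p)} samples")
# -> P(unit > m) = 0.266; n = 10 samples
```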
2.3.2 Inactivation Studies (SR)
2.3.2.1 Modeling Studies
A microbiological predictive model can describe or predict the growth, survival or death of microorganisms
in foods. These models typically relate the microbial growth, survival or death responses to
the levels of the controlling factors, such as temperature, pH, water activity etc. Models generally
should not be used outside the range of the factors used to create them because there is no underlying
principle on which to base extrapolation. Thus consideration of the range over which they will be
used is required before beginning experimentation (Legan et al. 2002). Where extrapolation is necessary,
tests should be conducted to confirm that the extrapolation is valid, e.g., confirm that the established
process destroys a specific population of the target microorganism. However, models that can
predict the rate of death of pathogens can be used to design safe and effective processes.
Several authors describe experimental design for modeling in food microbiology (Ratkowsky et al.
1983; Davies 1993; Ratkowsky 1993; McMeekin et al. 1993). Guidelines for data collection and storage
are also available (Kilsby and Walker 1990; Walker and Jones 1993). A practical guide to modeling,
supported by references to primary sources of modeling information, is discussed by Legan et al.
(2002). The reader should consult these references for details on development of a microbiological
predictive model.
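As a concrete illustration of the simplest inactivation model underlying such studies, the sketch below assumes classical log-linear (first-order) death kinetics characterized by a decimal reduction time D; the D-value and times used are illustrative only, not recommended process values.

```python
# A minimal sketch assuming simple log-linear (first-order) inactivation
# characterized by a decimal reduction time D at a fixed temperature.
# The D-value and hold time below are illustrative, not process criteria.
def log_reduction(time_min: float, d_value_min: float) -> float:
    """Log10 reduction delivered by a hold of `time_min`: log10(N0/N) = t/D."""
    return time_min / d_value_min

def survivors_log(n0_log: float, time_min: float, d_value_min: float) -> float:
    """Surviving level (log CFU/g) from an initial `n0_log` log CFU/g."""
    return n0_log - log_reduction(time_min, d_value_min)

# Example: a 2.0-min hold with an illustrative D = 0.4 min gives a 5-log
# reduction, taking 3 log CFU/g down to -2 log CFU/g.
print(survivors_log(3.0, 2.0, 0.4))  # -> -2.0
```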
2.3.2.2 Microbiological Challenge Studies
Detailed information on the design and implementation of microbiological challenge studies has been
described (IFT 2001; Scott et al. 2005; NACMCF 2010). Microbiological challenge testing is useful
to validate process lethality against a target microorganism(s).
When designing and carrying out a microbiological challenge study, some factors to consider
include the selection of appropriate pathogens or surrogates, the level of the challenge inoculum, the
inoculum preparation and method of inoculation, the duration of the study, formulation factors and
storage conditions, and sample analyses (Vestergaard 2001). Multiple replicates of such studies
should be done to reflect variation in the production lots and other factors. The extent of replication
and the impact on the results of the study must be considered.
2.3.2.3 Challenge Microorganism Selection
The ideal microorganisms for challenge testing are those previously isolated from similar formulations.
If possible, pathogens from known foodborne outbreaks should be included. In contrast to
kinetic studies, challenge studies frequently use a mixture of five or more strains of the target
pathogen
because a single strain may not be the most resistant to each of the multiple stress factors
involved in the product/process combination. Additionally, strains with the shortest generation
time may not have the shortest lag time under the test conditions. Likewise, strains may vary in
response to changes in the inactivation treatment (Scott et al. 2005). The strains in the cocktail should
be present in approximately equal numbers. It is also important to incubate and prepare the challenge
suspension under standardized conditions and format.
When possible, it is desirable to use a pathogen rather than a surrogate microorganism for
validation studies. However, surrogates are sometimes used in place of specific pathogens, for
example, in challenge studies conducted in a processing facility. The characteristics of the surrogate
in relation
to those of the pathogen should be determined and the difference accounted for in the
interpretation
of the challenge studies (Scott et al. 2005). Detailed information on the desirable
attributes for surrogates can be found in IFT (2001).
2.3.2.4 Inoculum Level
The inoculum level depends on the purpose of the study; whether the objective is to determine product
stability or shelf life, or to validate a step in the process designed to reduce microbial numbers.
When validating a process lethality step, it is usually necessary to use a high inoculum level, such as
10^6–10^7 CFU/g of product or higher, to demonstrate the log reduction of the challenge microorganisms.
The actual concentration of the inoculum before and after inoculation should be confirmed.
Uninoculated samples should also be analyzed to check for intrinsic product contamination. Total
inactivation of the inoculum may not be necessary, especially in situations where the H0 is likely to
be low (e.g., when the initial population is <10^3 CFU/g, a 5D process is required, and the inoculum
level in the experiment is 10^7 CFU/g). This may be relevant when validating post-lethality treatments,
where the process is designed to inactivate low levels of pathogens resulting from recontamination
of product after an initial lethal treatment, such as might occur during slicing or packaging
operations.
2.3.2.5 Inoculum Preparation and Inoculation Method
Preparation of the inoculum is an important component of the overall protocol. Typically, the challenge
cultures should be grown in media and under conditions optimal for growth of the specific
challenge culture. In some studies, specific challenge microorganisms may be pre-adapted to certain
conditions.
The method of inoculation is another important consideration. It is essential to avoid changes in
the critical parameters of the product formulation undergoing the challenge. For example, the use of
a diluent adjusted to the approximate water activity of the product using the humectant present in the
food minimizes the potential for erroneous results in intermediate moisture foods. Preliminary analyses
should be done to ensure the water activity or moisture level of the formulation is not changed
after inoculation. For guidelines for inoculation of low water activity products or for challenge studies
with spores refer to IFT (2001).
2.3.2.6 Duration of Challenge Studies for Potential Growth
It is prudent to conduct the challenge study longer than the desired shelf life to determine what would
happen if users stored and consumed the product beyond its intended shelf life. Additionally, when
validating inactivation processes, it is possible that sublethal injury may occur in some products,
leading to a long lag period (Busta 1978). If the product is not tested for at least its entire shelf life,
it is possible to miss the recovery and subsequent growth of the challenge microorganism late in shelf
life. Some regulatory agencies require data for 1.3 times the shelf life of the product when stored as
intended. Shorter times may be considered for refrigerated products that are stored under abuse
conditions.
The frequency of testing is governed by the duration of the challenge study. If the shelf life is
measured in weeks, the test frequency is typically no less than once per week. It is desirable to
have a minimum of 5–7 data points over the shelf life to have a good indication of inoculum
behavior. All studies should start with “zero time” testing, i.e., analysis of the product right after
inoculation and, for inactivation studies, right after processing. It may also be desirable to test
more frequently early in the challenge study and then reduce the frequency of testing to longer
intervals.
A sufficient quantity of product should be inoculated so that a minimum of three replicates per
sampling time is available throughout the challenge study. In some cases, such as in certain revalidation
studies and for uninoculated control samples, fewer replicates may be used.
2.3.2.7 Formulation Factors and Storage Conditions
When evaluating formulation, it is important to understand the range of key factors that control its
microbiological stability such as pH, preservative level and water activity. These intrinsic properties
should be documented. It is useful to collect data on the inherent manufacturing variability of the
critical parameters and ensure that the challenge test conditions encompass this variability by a specified
margin (e.g., with 95% confidence). These parameters should be adjusted to the worst case
condition expected for the product with respect to microbial growth or inactivation (e.g., highest pH).
One approach would be to use the 95% confidence interval for the parameter or the mean plus 2
standard deviations. If there is only one critical parameter, this 95% confidence would mean that one
out of 20 times reality could be outside this range. However, if there are many critical parameters,
setting all at their 95% confidence level might simulate an unrealistic condition. The level of confidence
desired must be considered in evaluating these parameters.
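The caution about stacking worst cases can be quantified with a short sketch: assuming the critical parameters vary independently, the probability that all of them simultaneously sit beyond their individual one-sided worst-case limits falls off geometrically with the number of parameters. The 2.5% tail used below (corresponding roughly to mean + 2 standard deviations) is an illustrative choice.

```python
# A minimal sketch of why stacking worst cases can be unrealistic: assuming
# the critical parameters vary independently, the probability that all k of
# them exceed their one-sided worst-case limits at the same time shrinks
# geometrically with k. The 2.5% tail corresponds to roughly mean + 2 sd.
def prob_all_beyond(k: int, one_sided_tail: float = 0.025) -> float:
    """P(all k independent parameters beyond their individual tail limits)."""
    return one_sided_tail ** k

for k in (1, 2, 3, 5):
    print(k, prob_all_beyond(k))  # 0.025, 0.000625, ~1.6e-05, ~9.8e-09
```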
It is important to test each key variable singly or in combination under worst case conditions. For
example, if the target pH is 4.5 ± 0.2 (95% confidence interval) and the processing capability is within
that range, the challenge product should be on the high side of that range (pH 4.7). This should be
carefully assessed for different parameters. For example, decreasing the water activity of a product
may delay or prevent growth of microorganisms; however, using a different humectant in the system
is a change in the critical factor even if the same water activity (aW) is achieved because growth rates
may vary with different humectants. Further, decreasing the aW of a system may reduce the lethality
of a process (Mattick et al. 2001). Inclusion of the impact of variability in critical factors helps to
ensure that the challenge study covers the process capability range for each critical factor in the
formulation.
2.3.2.8 Sample Analysis
Typically, enumeration is conducted at each sampling time. It is desirable to have at least duplicate
and preferably triplicate samples for analyses at each time point. The selection of enumeration media
and method depends on the microorganisms used in the challenge study. In situations where toxin-producing
microorganisms are used, test for the appropriate toxins at each sampling time using the most
current validated method, since growth may occur without the formation of toxin.
It is prudent to analyze inoculated product and uninoculated control samples at each selected
sampling time to determine how the background microbiota behaves during shelf life. It is also
important to track pertinent physical and chemical parameters over the shelf life as they may influence
the behavior of the microorganism. Understanding how factors such as aW, moisture content, salt
level, pH, Modified Atmosphere Packaging (MAP) gas concentrations, preservative levels and other
variables may change over product shelf life is important to understanding the microbiological stability
of the product. Quality attributes should also be noted.
2.3.2.9 Data Interpretation
Once the challenge study is completed, the data should be analyzed to determine how the microorganisms
behaved over time. For toxin-producing pathogens, no toxin should be detected over the
designated challenge period. Combining quantitative inoculum data for each time point with data on
the background microbiota and the relevant physical and chemical parameters provides a broad
representation
of the microbiological stability of the formulation under evaluation. A well-designed
challenge study can provide
critical information on the microbiological safety and stability of a food
formulation. Such studies
are also invaluable in validating the key lethality or microbiological control
points in a process.
2.3.3 Growth Studies (SI)
An increase in the numbers of a pathogen or spoilage microorganism can occur through growth or
recontamination. This section addresses growth.
Growth may occur if the food, temperature and packaging atmosphere support growth, and sufficient
time is provided under favorable conditions. Growth potential should be assessed for raw ingredients,
intermediate points during the manufacturing and after manufacture during distribution, retail,
food service and home storage and use. Generally, public health cannot be assured unless the potential
for growth is minimized. If the pathogen is not completely inactivated and growth is possible,
then an accurate estimation of the amount of growth that may occur is important in validating product
safety and stability.
As previously described for validating inactivation, estimates for growth may be obtained from a
variety of sources including the literature, models and challenge tests (Scott et al. 2005). Increasing
reliance is given to studies with experimental conditions that more closely reflect the actual conditions
of the food. Satisfactory validation of a pathogen’s growth in a food includes challenge tests
with the normal background microbiota. Models and broth studies can provide support for evaluating
minor changes in formulation and strain differences and for interpolating to conditions not explicitly
tested in the challenge tests. Applications of predictive models in food microbiology include models
that predict the growth rate of bacterial pathogens in response to product or environmental factors
such as aW, temperature or pH. Growth models can be used to design safe product formulations, to
set appropriate storage conditions, to explore the maximum interval between cleaning and sanitizing
of process equipment, and can also be used to inform decisions about when a challenge study is
needed and to design the test parameters.
Factors that should be considered when evaluating growth include the strain(s) used, surrogates,
physiological state of the inoculum, inoculation method, simulation of the experimental or pilot plant
conditions to the commercial process, inclusion of all environmental factors in the food (pH, aW,
acid anions) and external factors (temperature, packaging), and inclusion of the spoilage microorganisms.
Many of these factors were described in the inactivation section; considerations particular to
estimating growth are discussed below.
2.3.3.1 Inoculum Level
IFT (2001) provided a list of microorganisms that can be used in microbiological challenge studies
and recommendations for selection and assessment of tolerable growth. When the objective is to
determine product safety and the extent of growth over its shelf life (SI), an inoculum level of
between 10^2 and 10^3 CFU/g of product is frequently used. Lower or multiple inoculum levels may be
considered if microbial spoilage is a common mode of failure and low numbers are anticipated in the
product. See Sects. 2.3.3.3 and 2.3.3.6, for additional considerations on inoculum level.
2.3.3.2 Formulation Factors and Storage Conditions
When similar products are under evaluation, testing formulations that are more favorable to growth can limit
the need to conduct challenge studies on formulations less favorable to growth. For example, studying products
with a pH near neutrality may represent a worst case when similar products have a lower pH.
Test samples should ideally be stored in the same packaging and under the same conditions
(e.g., MAP) used for the commercial marketplace. The storage temperatures used in the challenge
study should include the typical temperature range at which the product is to be held and distributed.
Refrigerated products should be challenged under representative abuse temperatures. Some challenge
studies may incorporate temperature cycling into the protocol.
2.3.3.3 Lag Phase
A lag phase occurs when cells require time to adjust to a new environment. The lag phase is influenced
by the magnitude of the change and the favorability of the new environment. In general, a
lengthy lag phase occurs when cells experience a significant shift to a less favorable environment
such as to a lower temperature or water activity.
The physiological state of the cell also plays a role in the length of the lag phase. Generally, cells
in the exponential growth phase adapt more rapidly than cells in the stationary phase. Cells that are
starved in nutrient poor environments such as water, frozen or desiccated on a food contact surface
typically have an increased lag time compared to the other cells. Following an inactivation treatment
or other severe stress, surviving cells may need time to repair, which can also appear as a lag phase
before growth. Significant lag times are most likely when certain ingredients are added (e.g., salt,
acidulant) or after a stressful process (heating, thawing, sudden temperature change). A lag phase as
result of temperature changes is less likely in a finished product because the mass of the food, retail
packaging and box/pallet moderate temperature changes. Validation should recognize that the temperature
reduction during a cooling period may extend over one or more days, especially if the food
is boxed and palletized. Validation of a process should strive to replicate the initial physiological state
and environmental changes in order to accurately determine the length of the lag phase, if any.
The length of the lag phase can be affected by the initial number of cells because a log normal
distribution exists for the lag times of individual cells. Validation studies with high cell numbers
(>10^2 CFU/package or unit) will inevitably have some cells with the shortest lag times, and daughter
cells will almost entirely originate from these cells. When low levels of contamination occur, it is
possible that none of these fastest cells are present in some of the packages and the apparent lag times
will become longer and more varied in those packages.
2.3.3.4 Exponential Growth Rate
The exponential growth rate (EGR) increases with storage temperature up to the pathogen’s optimum
temperature (typically 35–45°C for pathogens). The EGR depends on other intrinsic characteristics
of the food such as acidity, water activity and inhibitors in a complex manner that can be estimated
by models. However, challenge studies are required to demonstrate that the model’s prediction
is
accurate for a specific food. Once a model is validated, it can be used to estimate the impact of the
environmental factor changes (T, pH, aW etc.) on the EGR.
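As an illustration of such a secondary model, the sketch below uses the widely cited Ratkowsky square-root relationship between EGR and temperature in the suboptimal range; the parameter values are invented for illustration and are not measured values for any particular pathogen or food.

```python
# A minimal sketch of a secondary growth-rate model, using the widely cited
# Ratkowsky square-root form sqrt(EGR) = b * (T - Tmin), valid below the
# optimum temperature. Parameter values (b, Tmin) are illustrative only.
def egr(temp_c: float, b: float = 0.03, t_min_c: float = -1.0) -> float:
    """Exponential growth rate (per h) at `temp_c` (deg C); 0 at/below Tmin."""
    if temp_c <= t_min_c:
        return 0.0
    return (b * (temp_c - t_min_c)) ** 2

# EGR rises with the square of the distance above Tmin:
for t in (4, 8, 12):
    print(t, round(egr(t), 4))  # 0.0225, 0.0729, 0.1521
```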
2.3.3.5 Maximum Growth Level
A pathogen has a maximum level of growth that it achieves in a microbial medium or food. In broth
and in pure culture, this level is typically 10^8–10^9 CFU/mL; however, it is sometimes lower in a food.
The maximum in a food is also affected by storage temperature. For L. monocytogenes in the FDA-FSIS
risk assessment, the maximum growth levels (CFU/g) selected were 10^5 for temperatures
<5°C, 10^6.5 for 5−7°C and 10^8 for temperatures >7°C (FDA-FSIS 2003), based on various literature
sources.
2.3.3.6 Competition and the Spoilage Flora
Competition between the pathogen and spoilage microorganism is difficult to predict. For many
pathogen-spoilage microorganism pairs, growth of both groups is reasonably independent until the
spoilage microorganisms have grown significantly. Spoilage microorganisms may decrease the pH or
produce inhibitors such as bacteriocins. Pathogens are typically at low populations and do not interfere
with the spoilage microorganisms. Typical microbiota found in commercial settings should be
present in challenge studies. Pathogens should be inoculated in the appropriate physiological state,
location in the food (e.g., surface, interior or interface of components as appropriate) and concentrations
that will likely occur in the commercial setting.
Another important consideration in determining the safety of a food is the storage conditions
that lead to spoilage, particularly spoilage before the pathogen reaches the PO. Evaluation of
growth during storage requires knowledge of the typical times and temperatures characteristic of
that stage. This may be easy for the relatively short growth periods during the commercial phases
of the food chain. However, time and temperature are highly variable in the home or food service
operation. A temperature of moderate abuse should be selected and the maximum length of
the storage period before spoilage at that temperature ascertained for determination of the
amount of growth. Foods should be tested for 1.25−1.5 times their intended shelf life unless spoilage
occurs first.
2.3.3.7 Effect of Variation on Growth
In addition to determining the average increase in cell population during each growth period, it is
important to estimate the variation about that estimate (for example the 95% confidence interval).
This variation is the consequence of the different characteristics of various strains, fluctuations in the
environmental conditions within the food (pH, salt levels) and the ranges in times and temperatures
of storage. The challenge test can provide an estimate of the mean log value; varying the parameters
within a model can provide additional data to estimate the variation. This variation includes the differences
in growth from the factors calculated above but may also be increased by the analyst to
account for uncertainties because of a lack of high quality data.
2.3.4 Recontamination (SI)
If a food process includes a lethal step that eliminates the pathogen, then any pathogen present at
consumption is the result of recontamination. Foods receiving 6−8-log reductions rarely have a contaminated
package immediately after that step. For example, if a product initially has a homogeneous
contamination of 10^2 CFU/g in every 100-g package, after a 7-log reduction only one in 1,000 packages
will be contaminated and it will have ~1 CFU/package. When determining whether such a food
meets an FSO or PO at a further step, calculation begins after the lethal step. The frequency and level
of contamination represent the new H0.
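The package-level arithmetic in the example above can be reproduced with a short sketch that treats the surviving cells as Poisson-distributed among packages; the function name is illustrative.

```python
# A minimal sketch of the package-level arithmetic above, treating surviving
# cells as Poisson-distributed among packages; the function name is illustrative.
from math import exp

def frac_contaminated(initial_log_cfu_per_g: float, log_reduction: float,
                      package_g: float) -> float:
    """Expected fraction of packages containing at least one surviving cell."""
    cells_per_package = package_g * 10 ** (initial_log_cfu_per_g - log_reduction)
    return 1.0 - exp(-cells_per_package)

# 10^2 CFU/g in 100-g packages with a 7-log reduction: ~1 package in 1,000
print(frac_contaminated(2.0, 7.0, 100.0))  # -> ~0.000999
```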
Little literature exists on the frequencies and levels of recontamination and few applicable
models
have been developed to estimate the results of recontamination. Sufficient sampling of the
specific process at this step or at a subsequent step with a back calculation is the only way to
obtain valid data on recontamination. A food process without a lethal step and with several potential
points of additional recontamination is difficult to predict, especially since quantitative information
related to recontamination is usually not available. Sufficient sampling of the food after
the last point of recontamination is a possible way to validate whether a PO or FSO is being
achieved. Another approach is environmental monitoring and monitoring of food contact surfaces.
Other factors to consider are packaging integrity and proper training of employees on handling
practices.
2.4 Effect of Process Variability on FSO Compliance Validation
One way to demonstrate compliance with an FSO is by using the equation:

H0 − SR + SI ≤ FSO
By combining information on the initial level (H0), reductions (SR) and increases (SI ) of the
microbial hazard throughout the production and distribution chain, one can determine if the FSO or
PO will be reliably met. The variability of the microbial levels at different steps in the process and
food chain will influence the ability to meet the FSO.
The following examples illustrate the impact of including statistical distributions for
H0, SR and SI on the hazard level; the percent of nonconformance (% of product above the PO or
FSO) is calculated. First, a point estimate without considering variability is used; then the impact of
variability in the initial levels, the reductions delivered through processing, and the increases due to growth
during food distribution is included to evaluate the ability to meet the PO or FSO. Fresh cut, washed
and packaged lettuce is used as an example, with L. monocytogenes as the pathogen of concern. For
illustrative purposes, it is assumed that to reach an ALOP, a maximum exposure of L. monocytogenes
of 10^2 CFU/g (i.e., an FSO of 2 log CFU/g) for ready-to-eat foods is set.
2.4.1 Point Estimate Approach
Szabo et al. (2003) estimated the initial contamination level of L. monocytogenes on precut lettuce,
reduction using sanitized washing, and the increases after packaging and during storage and distribution.
For a given initial level of L. monocytogenes on lettuce and the expected level of growth (SI)
during storage and distribution, the necessary reduction level to achieve a given FSO can be determined.
From Szabo et al. (2003), the initial population was H0 = 0.1 log CFU/g and the potential increase
was SI = 2.7 log CFU/g during storage for 14 days at 8°C, so a reduction of SR ≥ 0.8 log CFU/g was
deemed necessary to achieve the FSO of 2 log CFU/g:

H0 − SR + SI ≤ FSO → 0.1 − 0.8 + 2.7 = 2.0

In this example, the process can be considered to achieve the FSO exactly. However, this calculation
does not consider the impact of process variation.
2.4.2 Including Variability in the Process
2.4.2.1 Variability for One Parameter
The next example illustrates the impact of variability on calculations using data from Szabo et al.
(2003). Assume the standard deviation for SI is 0.59, and assume the log increase of L. monocytogenes
is normally distributed. For ease of calculation and explanation, H0 and SR levels do not
include variation. Because of the distribution of SI, the producer must target a lower average level of
L. monocytogenes in the finished product to reliably meet the FSO. If the same average level were
targeted (i.e., a mean equal to the FSO of 2 log CFU/g), 50% of the products would be above the FSO
to some extent. The processor can consider other sanitizing wash methods that provide a greater
reduction step to help achieve the FSO through process control. The level of reduction needed to
achieve different levels of conformity is presented in Table 2.1. For example, if the SR is 2.62, the
proportion of product above 2 log CFU/g, for a log-normal distribution with mean log 0.18 and
standard deviation 0.59, is 0.1%.

Table 2.1 Results of various levels of reduction (SR) on the proportion of defective units (P), with a standard deviation for the increase of 0.59, assuming the log increase is normally distributed

Reduction (SR)   H0 − SR + SI               P(H0 − SR + SI > 2) (sd = 0.59)
0.8              0.1 − 0.8 + 2.7 = 2.0      0.5 (50%)
1.2              0.1 − 1.2 + 2.7 = 1.6      0.25 (25%)
1.77             0.1 − 1.77 + 2.7 = 1.03    0.05 (5%)
2.17             0.1 − 2.17 + 2.7 = 0.63    0.01 (1%)
2.62             0.1 − 2.62 + 2.7 = 0.18    0.001 (0.1%)

Note: The proportion above the FSO is determined by the cumulative normal distribution Φ(2; μ, σ²), calculated in Excel by 1−NORMDIST(2, mean, sd, 1). For example, for the last line, 1−NORMDIST(2, 0.18, 0.59, 1) = 0.001019.

2.4.2.2 Including Variability in the Process for all Process Stages

The example in 2.4.2.1 did not include estimates of variability for H0 or SR, but variation does exist.
This section assumes variation for H0, SI and SR (values in Table 2.2). The resulting total describes
the distribution of levels of L. monocytogenes in packages of fresh cut lettuce at the point of
consumption, and its mean log is equal to the sum of the log means for H0, SI and SR (with SR
subtracted). The mean is not a correct indicator of the risk without considering the variance. The
variance of the total distribution equals the sum of the variances, thus the standard deviation is the
square root of the sum of the squares of the standard deviations. The distributions are illustrated in
Fig. 2.1. Given this distribution of outcomes, the proportion of packages of lettuce not meeting an
FSO = 2 in this example is 0.2%.

Table 2.2 Results on the proportion of products that do not meet the FSO (packages of fresh cut lettuce calculated to have greater than 2 log CFU/g L. monocytogenes present at the point of consumption), with various mean log and standard deviation values for H0, SI and SR

           H0      SR     SI     Total(a)
mean log   −2.5    1.4    2.7    −1.2     (H0 − SR + SI)
sd         0.80    0.50   0.59   1.11     (sd = sqrt(sd1^2 + sd2^2 + sd3^2))
P(>FSO)                          0.2%
(a) The level (log CFU/g) of L. monocytogenes present in a package of lettuce at the point of consumption

Fig. 2.1 Probability distribution of the initial cell level (H0), reduction in concentration (SR) and increase in concentration (SI) of L. monocytogenes on fresh cut lettuce, and the resulting cell concentration distribution in packages of lettuce at the point of consumption, using the input values in Table 2.2 (axes: log cell concentration or concentration change vs. probability density; the FSO is indicated)
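The probabilities in Tables 2.1–2.5 can be reproduced with a short sketch equivalent to the Excel NORMDIST formula quoted in the note to Table 2.1, assuming the three terms are independent and normally distributed on the log scale; the function name is illustrative.

```python
# A minimal sketch equivalent to the Excel formula in the note to Table 2.1,
# assuming H0, SR and SI are independent and normally distributed on the log
# scale. The inputs below reproduce Tables 2.2 and 2.3.
from math import sqrt
from scipy.stats import norm

def prob_exceeding_fso(h0, sr, si, sd_h0, sd_sr, sd_si, fso=2.0):
    """Proportion of product with H0 - SR + SI above the FSO (log CFU/g)."""
    mean_total = h0 - sr + si
    sd_total = sqrt(sd_h0**2 + sd_sr**2 + sd_si**2)
    return norm.sf(fso, loc=mean_total, scale=sd_total)

# Table 2.2: H0 = -2.5 (sd 0.80), SR = 1.4 (sd 0.50), SI = 2.7 (sd 0.59)
print(prob_exceeding_fso(-2.5, 1.4, 2.7, 0.80, 0.50, 0.59))  # ~0.002 (0.2%)
# Table 2.3 (ineffective wash): SR = 0 with no variation
print(prob_exceeding_fso(-2.5, 0.0, 2.7, 0.80, 0.0, 0.59))   # ~0.035 (3.5%)
```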
2.4.2.3 Ineffective Washing Step
Assuming that the lettuce washing step (SR) is not effective in reducing the level of L. monocytogenes
(Table 2.3, Fig. 2.2), the overall effectiveness of the process can be determined. The mean log level
of L. monocytogenes in packages of fresh cut lettuce increases from –1.2 to 0.2 and the overall standard
deviation of the level decreases from 1.11 to 0.99. The proportion of packages that have
L. monocytogenes levels above the FSO (2 log CFU/g) at the point of consumption increases to 3.5
% (Table 2.3). Note that the standard deviation does not differ much, since the overall standard deviation
is influenced by its largest contributor, which is H0 in this example. Due to the ineffectiveness
of the washing procedure, a higher proportion (3.5%) of packages do not meet the FSO (2 log CFU/g).

Table 2.3 Impact of a lettuce washing step (SR) that does not reduce L. monocytogenes levels on the proportion of packages of fresh cut lettuce that do not meet the Food Safety Objective

           H0      SR     SI     Total(a)
mean log   −2.5    0      2.7    0.2      (H0 − SR + SI)
sd         0.80    –      0.59   0.99     (sd = sqrt(sd1^2 + sd2^2 + sd3^2))
P(>FSO)                          3.5%
(a) The level (log CFU/g) of L. monocytogenes present in a package of lettuce at the point of consumption

Fig. 2.2 Probability distribution of the initial cell level (H0), increase in concentration (SI) and resulting overall final distribution of the levels of L. monocytogenes in packages of lettuce at the point of consumption, for a process in which the washing step does not reduce the level of L. monocytogenes (SR = 0), following the input values in Table 2.3 (axes: log cell concentration or concentration change vs. probability density; the FSO is indicated)
2.4.2.4 Effect of Shortening the Shelf Life of the Packaged Lettuce
If the product contains pathogens and supports growth of the pathogen, the length of the shelf life can
influence the impact on public health. In this example, the effect of a shorter shelf life on the proportion
of packages of lettuce that do not meet the FSO is evaluated by reducing the predicted value for SI.
If the product is stored for 7 days at 8°C, rather than 14 days, the increase in
L. monocytogenes over 7 days is estimated to be 1.9 log CFU/g with a standard deviation of 0.56
(Szabo et al. 2003) (Table 2.4, Fig. 2.3). By decreasing the shelf life, which decreases the extent of
growth of L. monocytogenes, the proportion of packages of lettuce that do not meet the FSO is
decreased to 0.01% compared to 0.2%, over a 10-fold decrease in risk.
Table 2.4 The impact of shortening the shelf life of the product from 14 to 7 days, thus reducing the level of growth (SI), on the proportion of packages of fresh cut lettuce that do not meet the Food Safety Objective

           H0      SR     SI     Total(a)
mean log   −2.5    1.4    1.9    −2.0     (H0 − SR + SI)
sd         0.80    0.50   0.56   1.10     (sd = sqrt(sd1^2 + sd2^2 + sd3^2))
P(>FSO)                          0.01%
(a) The level (log CFU/g) of L. monocytogenes present in a package of lettuce at the point of consumption

Fig. 2.3 Probability distribution of the initial level (H0), reduction in concentration (SR), increase in concentration (SI) and resulting final distribution of L. monocytogenes levels in packages of lettuce at the point of consumption, for a product with a shortened shelf life (see Table 2.4) (axes: log cell concentration or concentration change vs. probability density; the FSO is indicated)

2.4.2.5 Meeting the FSO by Changing Levels or Variability

The same proportion of products can meet an FSO by reducing the variability of one of the inputs.
For example, if the standard deviation of the initial levels of L. monocytogenes on the raw materials is reduced
from 0.8 to 0.4, the level of L. monocytogenes reduction required during the lettuce washing step (SR)
could be decreased from 1.4 to 0.7 with the same proportion of product meeting the FSO (Table 2.5).
While reducing the standard deviation for a raw agricultural commodity such as lettuce may be
difficult to achieve with the control measures available at this time, this strategy may be applicable
to other product types.
2.4.3 Log Mean Value, Standard Deviation and Meeting the FSO
The proportion of products in which the level of the microorganism of concern is above the FSO or PO
is determined by both the mean log levels and the standard deviation of the combined distributions for
H0, SR and SI. Different combinations of the mean and standard deviation resulting in the same overall
proportion of products not meeting the FSO can be calculated. The results are shown in Fig. 2.4.
The examples presented in this chapter illustrate the impact of both the mean log level and the
variability of H0, SR and SI on the proportion of product meeting the FSO. With this deeper level of
understanding of the influence of both the levels and variability of the initial microbiological load on
the incoming materials, the steps in the process that reduce the level of the microorganism of concern
and the increase of the pathogen of concern during storage and distribution, a food manufacturer can
determine where they can have the biggest impact on ensuring that the appropriate proportion of
product meets the FSO. Control strategies can focus on decreasing variability of the process, decreasing
the initial level of the microorganism of concern on the raw materials, or other parameters, based on
the levels or variability observed for a particular situation. Calculations used for Fig. 2.4 are presented
in Appendix B.

Table 2.5 Effect of reducing the variability of H0 and lowering SR during washing on the proportion of packages of fresh cut lettuce that do not meet the FSO (compare to Table 2.2)

           H0      SR     SI     Total(a)
mean log   −2.5    0.7    2.7    −0.5     (H0 − SR + SI)
sd         0.40    0.50   0.59   0.87     (sd = sqrt(sd1^2 + sd2^2 + sd3^2))
P(>FSO)                          0.2%
(a) The level (log CFU/g) of L. monocytogenes present in a package of lettuce at the point of consumption

Fig. 2.4 Various combinations of mean log cell levels and standard deviations of the combined distributions for H0, SR and SI resulting in a particular proportion of product that does not meet the FSO = 2 log CFU/g. Lines represent the percent of products not meeting the FSO: 0.1%, 0.2%, 0.5%, 1.0% and 2.0% defective (axes: log cell concentration vs. standard deviation)
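A short sketch of the relationship plotted in Fig. 2.4: for a chosen tolerable proportion of defective product, the required mean log level is a straight-line function of the combined standard deviation; the function name is illustrative.

```python
# A minimal sketch of the trade-off in Fig. 2.4: for a chosen proportion of
# defective product p, the mean log level and combined standard deviation
# must satisfy mean = FSO - z(1 - p) * sd, a straight line in Fig. 2.4.
from scipy.stats import norm

def required_mean_log(sd: float, p_defective: float, fso: float = 2.0) -> float:
    """Mean log level putting exactly `p_defective` of product above the FSO."""
    return fso - norm.ppf(1.0 - p_defective) * sd

# For 0.2% defective, a tighter process (smaller sd) may target a higher mean:
for sd in (0.5, 1.0, 1.5):
    print(sd, round(required_mean_log(sd, 0.002), 2))  # 0.56, -0.88, -2.32
```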
The following assumptions are made with these calculations:
• All variables are assumed log normally distributed, therefore the log of the variables as used in the
FSO equation is normally distributed. This also makes their sum in the FSO equation have a normal
distribution. If values have other distributions, Monte-Carlo type calculations are needed to determine
the statistical distribution of the sum. While a normal distribution for log initial level, log increase
and log reduction is often described in the literature, in real life the distribution of pathogens may be
highly heterogeneous and not possible to describe by a log normal distribution.
• These examples assume that calculations hold even for very low levels. This may have further
implications in some situations. For example, if a 6D inactivation step is applied to containers with
a 100-g unit size and an initial concentration of 2 log CFU/g, the calculated level in each unit after
inactivation is −4 log CFU/g. If each CFU contains only one microorganism, then this process
would actually yield one microorganism in one 100 g unit (i.e., −2 log CFU/g) for every 100 units
produced (1% of the units). The other 99% of the units would be free of the microorganism. For
some microorganisms, a CFU may contain more than one cell, thus a greater percentage
of units
could theoretically contain a contaminant. This illustrates the importance of using these calculations
as general principles to compare the relative effect of changes to a food safety management
strategy rather than as absolute figures.
• If no data on the standard deviation are available but minimum and maximum values are known,
representing the range within which 95% of the data lie, the standard deviation can be estimated by
sd = 0.5 × (maximum − minimum)/1.96, since minimum = mean − 1.96 sd and maximum =
mean + 1.96 sd, so maximum − minimum = 2 × 1.96 sd.
2.5 Validation of Cleaning and Other GHP Control Measures
Effective application of GHP provides the foundation upon which HACCP systems are developed
and implemented. Failure to maintain and implement GHP can invalidate a HACCP system and result
in the production of unsafe food.
Effective control of a hazard in a food necessitates consideration of the components of GHP likely
to have significant impact in controlling the hazard. For example, incoming material requirements are
very important to control the risks of certain hazards in seafood (e.g., paralytic shellfish poisoning,
ciguatera toxin, scombroid poisoning). Incoming material requirements are of lesser importance for
a food that will be cooked sufficiently to eliminate vegetative pathogens (e.g., salmonellae in raw
meat or poultry) that may be present. Thus, the various components of GHP do not carry equal weight
in all food operations. It is necessary to consider the hazards most likely to occur and then apply the GHP measures that will effectively control them. This does not mean that the other components of GHP, such as equipment maintenance or calibration, are ignored; some are very important to ensure a food meets established safety and quality requirements.
In certain situations selected components of GHP may carry particular significance and should be
incorporated into the HACCP plan. For example, equipment maintenance and calibration are important
for large continuous ovens used in cooking meat products. In this example, the procedure and
frequency (e.g., monthly, quarterly) for conducting checks on heat distribution during cooking could
be incorporated into the HACCP plan as a verification procedure. In addition, it is necessary to verify
the accuracy of the thermometers used for monitoring oven temperatures during cooking.
Information on hygienic design of facilities and equipment, cleaning and disinfection, health and
hygiene of personnel, and education and training was discussed previously (ICMSF 1988). Preventing
contamination or recontamination of the product during processing is a critical component of a control
program. Validation demonstrates that the facilities and equipment, the choice of cleaners and sanitizers, and the conduct of the operations achieve the necessary level of control. Initial considerations
in designing the sanitation program include food characteristics, equipment construction and materials,
and microorganisms of concern for safety and spoilage. Validation of the program ensures all parts
of the system are properly treated to remove food soil and inactivate microorganisms. Residual food
soil in wet environments not only provides a source of nutrients for subsequent microbial growth,
but also can reduce the effectiveness of sanitation steps. Clean-in-place (CIP) systems require careful
verification that all parts are treated and that the system operates as intended.
The effectiveness of many sanitizers is affected by the presence of organic residues from the food
and processing environment. Scientific criteria needed to determine a sanitizer’s immediate and
residual effect include:
• Concentration of the sanitizer and conditions for efficacy (e.g., temperature).
• Immediate and long term antimicrobial effectiveness (stability of the sanitizer).
• Microorganism susceptibility to the sanitizer.
• Characteristics of the surfaces to be sanitized (temperature, organic load).
• Impact of processing steps (thermal treatments, packaging conditions).
As with validation of other components of the food process, validation of the sanitation program is the accumulation of knowledge from laboratory, pilot plant and commercial facility studies. Sufficient information of increasing specificity needs to be acquired to ensure that the functioning of the process is understood. In laboratory studies, pathogens can be inoculated into media or product. Specialized pilot plant studies might use pathogens if exposure to food and humans can be controlled; however, GMP plants must use surrogates. In commercial facilities, data are acquired using surrogates when pathogen presence is a rare event, or from monitoring when naturally occurring pathogens are present in sufficient frequencies and numbers (e.g., in slaughter operations). Appropriate pathogen strains or surrogates must be used. Chemical agents must be tested according to their directions for use, with potable water of appropriate hardness and at the specified concentration, pH, temperature and contact time. Variations in the food and process must be considered, the critical factors that determine the margin of safety identified, and the minimum lethal treatment specified to ensure that appropriate control will always be achieved. Periodic verification is necessary to ensure that efficacy is not lost over time (e.g., due to development of resistance).
2.6 Shelf Life Determination
One approach to management of the safety of the food is to have the food spoil and be rejected by
the consumer for poor quality before pathogens that might be present grow to levels that become a
public health threat. In the absence of spoilage, other means of limiting shelf life such as use-by
labeling or time–temperature indicators could be employed. These issues are discussed below and in
more detail in NACMCF (2005).
Distribution and storage conditions may include moderate time and temperature abuse. Process design and validation should take these conditions into account when demonstrating that the products meet the FSO. Decisions about temperature abuse can be based in part on survey databases of retail and home storage temperatures, e.g., EcoSure (2008), in which retail display temperatures varied by product type and 5% of home refrigerators exceeded 7.2°C (0.7% exceeded 10°C). For some products and
regions, a shelf life short enough to cope with growth at abusive temperatures may result in times that do not permit normal commercial handling or meet consumers' expectations. Specifying the maximum storage temperature is a public health risk management decision.
Shelf life validation would include determining the distribution of contamination at the end of
processing and establishing a PO at that point. The allowable amount of growth that potentially could
occur for the food to still meet the FSO can then be determined. With specification of the maximum
abuse temperature, laboratory and challenge testing can determine the length of time for repair/lag
and growth before exceeding the FSO as explained in previous examples.
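As a concrete illustration of this calculation, the sketch below derives a maximum shelf life from a PO and an FSO using a simple lag-plus-linear-growth model on the log scale. The PO, FSO, lag time and growth rate are hypothetical values chosen only for illustration.

# Sketch: maximum shelf life from a PO and an FSO, assuming a lag phase
# followed by linear growth on the log scale at the maximum abuse
# temperature. All numbers are hypothetical.
fso = 2.0   # log CFU/g at consumption
po = -2.0   # log CFU/g at the end of processing (the PO)

allowable_growth = fso - po  # 4 log increase permitted before the FSO

lag_days = 3.0     # repair/lag time at the maximum abuse temperature
growth_rate = 0.5  # log CFU/g per day after the lag phase

shelf_life_days = lag_days + allowable_growth / growth_rate
print(f"Allowable growth:   {allowable_growth:.1f} log")
print(f"Maximum shelf life: {shelf_life_days:.1f} days")  # 3 + 4/0.5 = 11

In practice the lag time and growth rate would come from challenge studies or validated predictive models run at the specified maximum abuse temperature, as described above.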
For foods that are continually refrigerated from manufacture to consumption, the use-by date can
be estimated by the manufacturer. Times for commercial and retail periods and home storage are
included in the determination and a calendar date can be applied by the manufacturer. If a food is
frozen and then thawed at retail, the growth time is the remaining retail and home storage time.
For this product, a label indicating the number of days after purchase is appropriate.
Time temperature integrators (TTI) for retail packages produce a noticeable color change at the end
of the allowable storage based on a biological, physical or chemical reaction. The kinetics of the reaction
varies among devices and end points may be set for specific time/temperature standards, for quality
concerns or, theoretically, for growth in a specific food-pathogen combination. As of 2010, TTIs were not widely used on consumer packages; high cost, the complexity of reaction kinetics for different food/microorganism combinations, and a lack of consumer awareness and understanding have limited their use. TTIs have the potential benefit of indicating the end of the permissible shelf life because the ongoing
reaction rate is continuously affected by the temperature. If the temperature is below the designated
optimum, the rate is correspondingly slowed and the time before the indicating color change is lengthened.
If the temperature exceeds the designated optimum, the TTI reaction rate appropriately shortens
the storage time. Future developments may make it possible to choose a TTI that continuously monitors
the temperature during the entire storage period and provides an end point specific to the conditions
that a specific individual package experiences.
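The temperature dependence described above can be illustrated with a simple kinetic model. The sketch below integrates an assumed Arrhenius-type TTI reaction over a hypothetical hourly temperature history; the activation energy, reference temperature and color-change threshold are illustrative assumptions only, not parameters of any commercial device.

# Sketch: cumulative TTI response over a temperature history, assuming
# Arrhenius-type kinetics. All parameters are hypothetical.
import numpy as np

R = 8.314       # gas constant, J/(mol K)
EA = 80_000.0   # activation energy, J/mol (hypothetical)
K_REF = 1.0     # relative reaction rate at the reference temperature
T_REF = 277.15  # reference temperature, K (4 degrees C)

def rate(temp_k: float) -> float:
    """Relative TTI reaction rate at temp_k (Arrhenius equation)."""
    return K_REF * np.exp(-EA / R * (1.0 / temp_k - 1.0 / T_REF))

# Hourly temperature log for one package (degrees C), including a
# 12-hour abuse excursion to 10 degrees C:
temps_c = [4.0] * 48 + [10.0] * 12 + [4.0] * 48
response = sum(rate(t + 273.15) for t in temps_c)  # cumulative extent

THRESHOLD = 120.0  # response at the color change (hypothetical)
print("Color change triggered:", response >= THRESHOLD)

With these assumed parameters, 108 hours held steadily at 4°C stays below the threshold, while the same period containing the 12-hour excursion to 10°C triggers the color change, mirroring how abuse appropriately shortens the indicated storage time.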
2.7 When to Revalidate
Validation data should be periodically reviewed to determine whether new scientific data or changes
in operating conditions would alter the previous validation conclusions. Emergence of a new pathogen
requires re-evaluation of the process based on the characteristics of the pathogen. A change in
the initial contamination of the ingredients, the formulation of the product, processing parameters or
the storage conditions of a food may require the process be revalidated. The impact of specific
changes on the concentration, homogeneity or frequency of contamination for the affected step
should be estimated. This information may be obtained from the literature, models, and laboratory or
pilot plant experiments. The magnitude of the change can be compared to the corresponding mean
log and standard deviation of the validated process. If the change is within the values of the original
validation, there may be no need for further validation. The final impact of the change at the point of
consumption can be estimated and compared to the FSO. For example, a 0.2 log increase in the contamination
of an ingredient may increase the contamination by 0.2 log for all subsequent steps to
consumption. If this increase does not result in exceeding the FSO, further validation is not needed.
However, if the change in the process were an increase in pH that permitted a 1 log increase in pathogen concentration at consumption, the process would likely require revalidation, and perhaps redesign to compensate elsewhere for the increased growth, followed by validation of the new process.
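To show how such a screening comparison might look numerically, the sketch below shifts the mean of an assumed validated distribution and recomputes the fraction of product exceeding the FSO. The validated mean, standard deviation and FSO are hypothetical.

# Sketch: screening a process change against the FSO, assuming the
# validated process is summarized by a normal distribution of log
# levels at consumption. All numbers are hypothetical.
from scipy.stats import norm

FSO = 2.0              # log CFU/g
MEAN_VALIDATED = -1.0  # validated mean log level at consumption
SD_VALIDATED = 0.8

def fraction_exceeding(mean: float, sd: float, limit: float) -> float:
    """Fraction of product above the limit for a normal log level."""
    return norm.sf(limit, loc=mean, scale=sd)

baseline = fraction_exceeding(MEAN_VALIDATED, SD_VALIDATED, FSO)

# A 0.2 log increase in ingredient contamination shifts the mean by 0.2:
shifted = fraction_exceeding(MEAN_VALIDATED + 0.2, SD_VALIDATED, FSO)

print(f"Baseline fraction over FSO:  {baseline:.2e}")
print(f"After a +0.2 log mean shift: {shifted:.2e}")

If the shifted fraction still meets the tolerable proportion of defective units, further validation may not be needed; a change permitting a 1 log increase, by contrast, would move the outcome well outside the validated range and trigger revalidation.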