Risk Models for Listeria monocytogenes in Dairy Products

Cheese as a source of listeriosis has tended to be studied in quantitative risk assessment (QRA) models under a full farm-to-table approach because of the many contamination factors and forces that can act along the chain, namely: on-farm environmental contamination sources such as silage, soil, water, and inadequate sanitation and housing conditions; extensive manipulation of milk after heat treatment (when applied); the potential for recontamination after pasteurization and for cross-contamination events during processing; the possible presence of contamination niches in processing and retail facilities; the ability of L. monocytogenes to grow during refrigerated storage; the long shelf-life of ripened cheeses; and the wide consumption of cheese.

  • exposure assessment
  • simulation
  • raw milk
  • cheese

1. Introduction

Listeria monocytogenes is a Gram-positive, non-spore-forming, facultatively anaerobic, rod-shaped bacterium, pathogenic to both humans and animals and of great concern with regard to human foodborne illness [1]. Foodborne listeriosis is one of the most severe foodborne diseases and, although relatively rare, the high mortality rate associated with this infection makes it a significant public health concern. In the European Union (EU), listeriosis was the fifth most commonly reported zoonosis in 2020, with 1876 confirmed cases in 27 EU Member States, and it showed the highest case fatality (13%) and hospitalization (97.1%) rates [1]. Recently, a meta-analysis of case–control studies of sporadic listeriosis was conducted to summarize the evidence on associations (odds ratios, OR) between risk factors and sporadic cases [2].
In the past 20 years, many quantitative risk assessment (QRA) models have been developed worldwide to guide decision making on food safety and risk management of L. monocytogenes. These models, from simple to more complex ones, attempt to assess, in a structured manner, the possible routes of contamination with L. monocytogenes at different points along the farm-to-fork chain of food products. Regardless of the scope of the QRA models (i.e., farm-to-table, processing-to-table, end-of-processing-to-table, retail-to-table, or consumption), their ultimate goals are to quantify the public health risk associated with the consumption of the food products of interest and to evaluate scenarios or potential risk reduction measures.
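
In generic terms, a retail-to-table exposure and risk calculation of this kind samples, at each Monte Carlo iteration, the prevalence and concentration of the pathogen, simulates its growth during storage, computes the dose ingested in a serving, and converts it into a probability of illness with a dose–response model. The following is a minimal sketch of that structure; all distributions and parameter values are illustrative assumptions, not values taken from any of the cited models.

```python
# Minimal, illustrative Monte Carlo sketch of a retail-to-table QRA chain.
# All distributions and parameters are assumptions for demonstration,
# not values from any of the cited models.
import numpy as np

rng = np.random.default_rng(123)
n = 200_000                                    # Monte Carlo iterations (servings)

prevalence = 0.02                              # assumed prevalence at retail
contaminated = rng.random(n) < prevalence
logN_retail = rng.normal(-1.0, 1.0, n)         # assumed log10 CFU/g at retail

temp_C = rng.uniform(4, 10, n)                 # assumed home storage temperature, °C
time_h = rng.uniform(24, 21 * 24, n)           # assumed home storage time, h
egr = 0.005 * (temp_C + 1.18) ** 2 / 24        # toy growth rate, log10 CFU/g per h
logN_cons = np.minimum(logN_retail + egr * time_h, 8.0)   # cap at 8 log10 CFU/g

serving_g = rng.normal(25, 5, n).clip(5, None)             # assumed serving size, g
dose = np.where(contaminated, 10 ** logN_cons * serving_g, 0.0)

r = 1e-12                                      # assumed exponential dose-response parameter
p_ill = 1 - np.exp(-r * dose)                  # probability of illness per serving

print(f"mean risk per serving: {p_ill.mean():.2e}")
print(f"mean risk per contaminated serving: {p_ill[contaminated].mean():.2e}")
```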

2. Risk Models for Listeria monocytogenes in Dairy Products

2.1. Risk Factors and Control Measures Assessed at Primary Production

Many sources of contamination exist in the farm environment, such as silage, soil, water, and inadequate sanitation and housing conditions, which promote dissemination to and between animals. In addition, L. monocytogenes mastitis is an important source of contamination that increases the risk of raw milk contamination: although its between- and within-herd prevalence is extremely low, affected animals may shed the bacteria in milk for prolonged periods. L. monocytogenes is transmitted from animal to animal through fecal–oral routes, usually via manure contamination of pasture or silage. Bulk tank milk, milk filters, milking machines, milk handlers, and poor on-farm hygiene during milking are also considered sources of contamination. The QRA model of Bemrah et al. [3], reflecting French on-farm conditions at the time, showed that environmental contamination contributed much more than animal mastitis to the presence of L. monocytogenes in raw milk soft cheese. In a scenario representing lower environmental contamination, which reduced the mean prevalence of contaminated farms from 3% to 2%, the median concentration of L. monocytogenes in raw milk cheese decreased by 99% (from 2.53 to 0.024 CFU/g). However, in another scenario assuming the absence of mastitis, the median concentration of L. monocytogenes in raw milk cheese was reduced by only 26%, from 2.54 CFU/g (baseline scenario assuming a 10% probability of herds carrying L. monocytogenes) to 1.87 CFU/g.
Comparable results concerning the relative importance of mastitis, in this case in sheep, were obtained with a QRA model from Italy [4], whose scenario simulations showed that the median concentration of L. monocytogenes in bulk tank raw milk from mastitis-free flocks decreased by only 24% (from 0.56 CFU/mL to 0.43 CFU/mL) compared with the baseline scenario of randomly contaminated flocks.
In the listeriosis QRA models available for cheese, no sensitivity analysis comparing the contributions of environmental contamination and mastitic animals has been conducted. Only the study of Tiwari et al. [5] estimated correlation coefficients, of 0.27 and 0.15, between fecal/silage/farm contamination factors and the L. monocytogenes counts in raw and pasteurized milk, respectively. Nevertheless, regardless of their relative importance, on-farm contamination sources, either from shedding animals or from the broader environment, have been demonstrated by different QRA models to impact the exposure dose and the risk of listeriosis, in particular for raw milk cheeses. For instance, Condoleo et al. [4] estimated that sheep's raw milk cheeses from mastitis-free flocks presented 0.07 times the risk per contaminated serving of those from randomly contaminated flocks, whereas raw milk cheeses from family flocks of at most 10 animals each could present an 8 times higher risk per contaminated serving. Similarly, increasing the initial L. monocytogenes population in raw milk at the farm level (between 0.03 and 10 CFU/mL under Irish conditions) up to a maximum of 100 CFU/mL (worst-case contamination scenario) would increase the final mean concentration of the pathogen by 35% in raw milk cheese and by 45% in pasteurized milk cheese [5]. Latorre et al. [6] tested a scenario in which increasing the prevalence of L. monocytogenes in bulk tank milk from 6% (baseline) to 25% would produce a four-fold increase in the risk per serving. In the same line, according to the QRA model of FDA-Health Canada [7], a 3 log/mL reduction in the L. monocytogenes concentration in raw milk at the beginning of cheese manufacturing (which can be interpreted as the result of applying animal husbandry strategies to mitigate the contamination of bulk milk used as raw material for cheese-making) can reduce the mean risk per serving by a factor of 7–10.
One such on-farm strategy to control the risk of listeriosis associated with raw milk is bulk tank and tank truck milk testing, aimed at reducing the concentration of L. monocytogenes in dairy silo milk. Latorre et al. [6] estimated that a five-fold decrease in the median annual listeriosis cases among raw milk consumers would occur if a raw milk testing program were put in place (i.e., monthly testing of one milk sample and recall of positive milk), whereas FDA-Health Canada [7] estimated that, for raw milk soft-ripened cheeses, testing milk at collection would reduce the mean risk per serving by a factor of 24–37 relative to no testing at all.
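
As an illustration of how such a testing-and-diversion scheme can lower exposure, the following minimal sketch (with assumed prevalence, concentration distribution, and sampling plan, not the values of the cited models) uses a Poisson detection model for a single 25 mL test portion per lot and withdraws lots that test positive.

```python
# Illustrative sketch (assumed inputs, not any published model) of the effect
# of bulk tank milk testing on the prevalence and mean concentration of
# contaminated lots released to cheese-making.
import numpy as np

rng = np.random.default_rng(42)
n_lots = 100_000
prevalence = 0.03                               # assumed prevalence of contaminated lots
contaminated = rng.random(n_lots) < prevalence
# assumed concentration in contaminated lots, CFU/mL (lognormal, median 0.1)
conc = np.where(contaminated,
                rng.lognormal(mean=np.log(0.1), sigma=1.5, size=n_lots), 0.0)

sample_vol_ml = 25.0                            # assumed test portion analysed per lot
p_detect = 1.0 - np.exp(-conc * sample_vol_ml)  # Poisson probability of >=1 cell in the portion
detected = rng.random(n_lots) < p_detect
released = ~detected                            # lots testing positive are withdrawn

print(f"prevalence among all lots     : {contaminated.mean():.4f}")
print(f"prevalence among released lots: {contaminated[released].mean():.4f}")
print(f"mean concentration released   : {conc[released].mean():.4f} CFU/mL")
```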

2.2. Risk Factors and Control Measures Assessed at Processing

It is widely known that pasteurization of milk is effective in destroying L. monocytogenes. The effectiveness of milk pasteurization as a key mitigation strategy to reduce the risk from the consumption of cheese was quantified by FDA-Health Canada [7] and FDA-FSIS [8]. The former estimated that consuming an artisanal raw milk soft-ripened cheese increased the mean risk per serving 157-fold, compared with consuming its pasteurized counterpart, in the general population. The latter estimated that the risk per serving of queso fresco made from raw milk is 43 times greater for the perinatal population and 36 times greater for the elderly population than that of queso fresco made from pasteurized milk. Another strategy to control L. monocytogenes during processing is the use of bacteriocinogenic lactic acid bacteria (LAB). Nevertheless, only one QRA model [9] investigated the effect of an anti-listerial cocktail of indigenous LAB on the risk of listeriosis from cheese. These authors estimated that the addition of 6 log CFU of such a LAB cocktail per mL of raw milk reduced the concentration of L. monocytogenes in raw milk semi-hard cheese ripened for 22 days from 7.7 log CFU/g (baseline scenario without added LAB) to 1.1 log CFU/g, which in turn reduced the risk by over 6 log. In the case of pasteurized milk soft cheeses, the addition of the same LAB cocktail to pasteurized milk inoculated with 1 log CFU/mL of L. monocytogenes reduced the risk to 0.22 times the baseline in both the general and vulnerable populations. Other current processing strategies, such as smearing cheeses with plant-based extracts having antimicrobial properties or the use of antimicrobial packaging, were not tested as what-if scenarios in any of the QRA models collected.
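
The mapping between a log reduction achieved at processing and the resulting risk reduction can be made explicit with a dose–response model. In the low-dose region of the exponential dose–response, risk is approximately proportional to dose, so a k-log reduction in concentration translates into roughly a k-log reduction in risk per serving, provided no regrowth occurs before consumption; when growth to high levels is possible afterwards, the benefit can be much smaller. A minimal numerical sketch with illustrative (assumed) values follows.

```python
# Minimal sketch with illustrative (assumed) values: under an exponential
# dose-response P_ill = 1 - exp(-r * dose), risk is roughly proportional to
# dose at low doses, so a 5-log reduction in concentration (e.g., from
# pasteurization) maps to roughly a 5-log reduction in risk per serving,
# provided no regrowth occurs before consumption.
import numpy as np

r = 1e-12                  # assumed dose-response parameter
serving_g = 25.0           # assumed serving size, g
conc_raw = 1e2             # assumed CFU/g at consumption, raw milk cheese
log_reduction = 5.0        # assumed log reduction achieved by pasteurization

dose_raw = conc_raw * serving_g
dose_past = dose_raw * 10.0 ** (-log_reduction)

p_raw = -np.expm1(-r * dose_raw)    # 1 - exp(-r*dose), numerically stable form
p_past = -np.expm1(-r * dose_past)
print(f"risk ratio raw/pasteurized ~ {p_raw / p_past:.3g}")   # ~1e5 at these low doses
```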

2.3. Cross-Contamination during Processing

Despite the effectiveness of pasteurization in inactivating L. monocytogenes, post-pasteurization contamination and cross-contamination can occur within processing plants; they are exacerbated by the pathogen's capacity to grow at normal refrigeration temperatures and its ability to find damp spots or niches where it can reside and proliferate. Furthermore, if mechanical cleaning, disinfection, and rinsing are not well executed, the bacteria can form biofilms on food contact surfaces, which then become difficult to remove by standard sanitation protocols [10]. Nonetheless, despite the relevance of cross-contamination, only two QRA models for dairy products included cross-contamination modules: Tenenhaus-Aziza et al. [11] and Tiwari et al. [5]. The study of Tenenhaus-Aziza et al. [11], conducted on pasteurized milk soft cheese produced in France, proposes new methods for modeling cross-contamination and recontamination events. It uses six contamination event modules:
  (1) the primo-contamination event at the cheese-making phase, whereby milk or products can be contaminated, for example, by cells from the environment or by cells arising from pasteurization failure;
  (2) the primo-contamination event at the ripening phase, whereby the environment of the ripening room and the smearing machine can be initially contaminated;
  (3) cross-contamination during smearing, whereby a whole colony from the surface of a cheese can be transferred, with a given probability, to the machine or to its immediate surroundings through the smearing solution and the cheese matter detached from the surface of the product;
  (4) cross-contamination during packaging, modeled with the same approach but in a simplified form in which the compartments are the cheeses and the packaging machine;
  (5) the transfer of colonies from the smearing room to the ripening room, whereby colonies located in the environment of the smearing room are assumed not to adhere, since they come from the smearing machine and the duration between contamination of the environment and transit of the batch is not long enough to allow the cells to adhere to environmental surfaces; and
  (6) recontamination during ripening, whereby, during the transit of a batch inside or outside the ripening room, colonies from the environment of the ripening room can be transferred to the surface of products present in the room.
This model corroborated the need for frequent hygiene operations in the facilities by showing that the concentration of contaminated products correlates with the total number of cells in the ripening-room environment. In this respect, two of the what-if scenarios estimated that when the initial number of cells in the ripening room environment decreases from 2000 to 500 cells, the mean risk of listeriosis is divided by 3.7, whereas when the primo-contamination event occurs on the smearing machine with 500 cells, instead of during cheese-making, the mean risk is divided by 350. In the listeriosis QRA model for Irish cheeses, Tiwari et al. [5] borrowed the cheese-smearing cross-contamination model from Tenenhaus-Aziza et al. [11] and found a low correlation between cross-contamination from the smearing machine and the counts in raw and pasteurized milk cheeses (r = 0.05 and 0.12, respectively), which supported a scenario in which, if no further contamination occurred during the retail phase and only cross-contamination due to smearing took place, the L. monocytogenes counts would decrease by 24% in raw milk cheeses and by 97% in pasteurized milk cheeses. None of the QRA models estimates the contribution of cross-contamination in processing plants to the final listeriosis risk. Nonetheless, cross-contamination is widely recognized as an important factor, as suggested by the many surveys throughout the world that have reported prevalence levels of up to 25.0% in the environments of dairy processing plants, and as implied by the listeriosis outbreaks due to contaminated dairy products directly linked to cross-contamination from processing facilities [12]. Floor drains, floors, coolers, and areas of pooled water, such as washing machine areas, are sites of frequent recovery of L. monocytogenes, from which food contact surfaces are often contaminated [13]. This reinforces the importance of including sound modules representing cross-contamination prevalence, patterns, and events in QRA models for both raw and pasteurized milk dairy products.
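
To illustrate the kind of compartment-based transfer logic used in such cross-contamination modules, the following is a simplified two-compartment sketch (cheese surfaces and a smearing machine) with binomial cell transfers at each smearing pass. The structure and all parameter values are assumptions for demonstration and do not reproduce the published model.

```python
# Simplified two-compartment sketch (assumed structure and parameters, not
# the published model): binomial transfer of L. monocytogenes cells between
# cheese surfaces and a smearing machine as a batch is smeared sequentially.
import numpy as np

rng = np.random.default_rng(7)
n_cheeses = 500
p_cheese_to_machine = 0.05     # assumed per-cell transfer probability per pass
p_machine_to_cheese = 0.01     # assumed per-cell transfer probability per pass

cheese = np.zeros(n_cheeses, dtype=np.int64)
initially = rng.random(n_cheeses) < 0.02                 # ~2% of cheeses start contaminated
cheese[initially] = rng.poisson(1000, initially.sum())   # assumed cells on contaminated surfaces
machine = 0                                              # cells on the smearing machine

for i in range(n_cheeses):                               # cheeses smeared one after another
    to_machine = rng.binomial(cheese[i], p_cheese_to_machine)
    to_cheese = rng.binomial(machine, p_machine_to_cheese)
    cheese[i] += to_cheese - to_machine
    machine += to_machine - to_cheese

print("cheeses contaminated before smearing:", int(initially.sum()))
print("cheeses contaminated after smearing :", int((cheese > 0).sum()))
print("cells remaining on the machine      :", int(machine))
```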

2.4. Risk Factors and Control Measures at Retail and Home

Ten dairy QRA models (58.8%) covered the contamination of L. monocytogenes along shorter chains (two end-of-processing-to-table, six retail-to-table, and two consumption-only) of dairy products, namely, raw milk (×2), pasteurized milk (×2), ice cream, yogurt, soft and semi-soft cheeses, queso fresco cheese, cultured milk, and various dairy products. Because of their narrower scope, these models consider the dairy foods entering the distribution chain or the retail establishment with a certain level of L. monocytogenes contamination, which is prone to multiply during prolonged storage, even from low initial concentrations. Moreover, the dairy industry's trend toward refrigerated products with longer shelf lives further aggravates this problem. Other equally important risk-increasing factors can take place in products that permit the growth of L. monocytogenes during transport and distribution, retail, and home consumption, namely, retail/home refrigerator temperature fluctuations and abuse, long-term storage, cross-contamination, and inadequate handling practices at retail and at home. The high frequency and amounts of dairy food consumption also contribute to increasing the risk of listeriosis. Many dairy QRA models compared the importance of retail/home storage temperature and retail/home storage time, and they unanimously found that the effect of higher temperature is stronger than that of longer time. The FDA-FSIS QRA model of 10 different RTE dairy products [8] found that if the maximum refrigerator temperature were set at 7 °C (instead of 16 °C in the baseline), the mean number of listeriosis cases would be reduced by 69%, whereas further limiting the refrigerator temperature to a maximum of 5 °C would reduce the number of cases by >98%. On the other hand, if the maximum storage time were reduced from 14 days (baseline) to an (unrealistic) 4 days, the annual incidence of listeriosis cases would decrease by 43.6%. Along the same lines, the listeriosis QRA model of pasteurized milk from FAO-WHO [14] undertook what-if scenarios that demonstrated the greater relative importance of temperature over time: when the temperature distribution was shifted so that the median increased from 3.4 °C to 6.2 °C, the mean rate of illnesses increased over 10-fold for both the healthy and susceptible populations; however, when the storage time distribution was extended from a median of 5.3 days to 6.7 days, the mean rate of illnesses increased 4.5-fold and 1.2-fold for the healthy and susceptible populations, respectively. Similarly, FDA-Health Canada [7] predicted that an increase of only 1 °C in the home refrigerator temperature increases the mean risk per contaminated serving by a factor of 1.7, whereas halving the maximum duration of home storage from 56 days to 28 days reduces the same risk by a factor of 1.4. In a more recent listeriosis QRA model for soft and semi-soft cheeses, Pérez-Rodríguez et al. [15] also compared the impact of increasing and decreasing the storage temperature: when the storage temperature was increased by 3–4 °C, the number of cases per million servings increased by 530%, whereas a decrease in the storage temperature by 3–4 °C produced only a 4% decrease in the number of cases, since the baseline temperature conditions did not allow for growth of L. monocytogenes in cheese.
By contrast, decreasing the time to consumption (storage time) by 25% produced a 33% decrease in the incidence of listeriosis cases per million servings. An interesting scenario was tested by Koutsoumanis et al. [16], consisting of storing pasteurized milk cartons away from the door shelf of the fridge; according to their simulation, the proportion of cartons with no growth of L. monocytogenes increased from 55% to 62%. Sensitivity analysis outputs of the dairy QRA models also agree on the relative importance of storage temperature over storage time. In a Spearman rank sensitivity analysis, the probability of illness from the consumption of raw milk was more strongly correlated with the temperature of the home refrigerator (r = 0.55–0.77) than with the storage time in the home refrigerator (r = 0.27–0.36) [6]. Similarly, Tiwari et al. [5] found higher correlations of the counts of L. monocytogenes in raw and pasteurized milk cheeses with temperature at retail (r = 0.65 and 0.75, respectively) than with storage time at retail (r = 0.15 and 0.20, respectively). Nevertheless, despite the strong contribution of temperature to the risk, as well as the well-known importance of maintaining the cold chain to control risks, most of the dairy QRA models utilized variability distributions of average temperatures in the supply chain. To enable a better assessment of growth, time–temperature profiles with credible trajectories and oscillations should be used; only three QRA models [11][15][16] solved L. monocytogenes growth for dynamic temperature profiles at every iteration, as sketched below. None of the retail-to-table QRA models included cross-contamination or poor-handling modules, despite the potential for cross-contamination during retail and home handling. Only Pérez-Rodríguez et al. [15], when comparing the risk of listeriosis from non-sliced and sliced soft/semi-soft cheeses, indirectly determined that the slicing step doubled the risk of infection, suggesting that cross-contamination occurs during slicing.
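
To illustrate how growth can be solved over a dynamic temperature profile rather than a single average temperature, the following minimal sketch integrates a Ratkowsky-type square-root secondary model over an hourly time–temperature trajectory; the parameter values and the temperature profile are illustrative assumptions, not those used in the cited models.

```python
# Minimal sketch, assuming a Ratkowsky-type square-root secondary model with
# illustrative parameters, of integrating L. monocytogenes growth over a
# dynamic time-temperature profile instead of a single average temperature.
import numpy as np

b, T_min = 0.023, -1.18      # assumed parameters: sqrt(mu_max) = b * (T - T_min), mu_max in ln/h
N_max_log = 8.0              # assumed maximum population density, log10 CFU/g

hours = np.arange(0, 14 * 24)                        # 14 days in hourly steps
temps = 5.0 + 3.0 * np.sin(2 * np.pi * hours / 24)   # assumed profile: 2-8 °C daily cycle

mu = (b * np.clip(temps - T_min, 0.0, None)) ** 2    # specific growth rate, ln units per h
log10_increase = mu.sum() / np.log(10)               # 1-h steps: summed mu gives the ln increase

N0_log = 0.0                                         # assumed initial level, log10 CFU/g
N_final_log = min(N0_log + log10_increase, N_max_log)
print(f"predicted increase: {log10_increase:.1f} log10 over 14 d; "
      f"final level: {N_final_log:.1f} log10 CFU/g")
```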

2.5. Contributions of Retail and Consumer Practices to the Final Risk of Listeriosis

Although not explicitly reported as such, results from the QRA models, taken together, point towards a higher contribution of the consumer module than of the retail module to the risk of listeriosis from dairy foods. For instance, in the QRA model of Koutsoumanis et al. [16], the storage time at home (r = 0.482) had a stronger effect on the counts of L. monocytogenes in pasteurized milk at consumption than both the retail storage temperature (r = 0.181) and the retail storage time (r = 0.174). Latorre et al. [6] also showed that the temperature of the home refrigerator (r = 0.55–0.77) can have a stronger effect than the temperature of the retail/farm refrigerator (r = 0.55) on the probability of illness from raw milk. In the case of soft-ripened cheeses [15], the risk per serving was more heavily driven by the L. monocytogenes counts in cheese after home storage (r = 0.95) than by the counts after retail storage (r = 0.83) and, in turn, than by the counts after transport (r = 0.75). An interesting scenario performed in FAO/WHO [14] illustrated the strong contribution of consumers' practices to the risk of listeriosis by estimating that, if all milk were consumed immediately after purchase at retail, the number of cases in both the susceptible and healthy populations would decrease 1000-fold. All of the QRA models above agree that consumers' practices can be more determinant of the risk of listeriosis than retail practices or conditions. Consumption variables, such as serving size or frequency, have also been investigated in the dairy QRA models, although to a lesser extent. Their impact on the risk is more variable, although, in general, sensitivity analysis has ranked consumption-related variables lower than risk factors such as the prevalence of the pathogen, storage temperature, and storage time, making them less effective targets for risk mitigation. According to Latorre et al. [6], the correlation of the probability of illness with serving size, for raw milk purchased directly from milk tanks and milk consumed on farms, was low, ranging between 0.19 and 0.30, whereas, in Yang and Yoon [17], the amount of yogurt consumed had practically no effect on the risk of illness associated with drinking and regular yogurt (r = 0.08 and 0.02, respectively). In the early model of Bemrah et al. [3], reducing the number of servings of raw milk cheese per person per year from 50 to 20 was shown to reduce the incidence of listeriosis cases by 60%, which was less effective than other strategies such as eliminating the mastitis source or reducing the mean prevalence of contaminated farms.
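
The Spearman rank sensitivity analyses cited above can be reproduced, in principle, by correlating the Monte Carlo input samples with the simulated output. The following sketch does this for a toy growth model with assumed input distributions; the correlation values it prints are purely illustrative and are not those of the cited studies.

```python
# Minimal sketch (toy model and assumed input distributions, not any published
# QRA) of a Spearman rank sensitivity analysis: correlate Monte Carlo inputs
# with the simulated output (log10 counts at consumption).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 50_000
temp_C = rng.uniform(2, 12, n)        # assumed home refrigerator temperature, °C
time_d = rng.uniform(1, 21, n)        # assumed home storage time, days
serving_g = rng.normal(25, 5, n)      # assumed serving size, g (does not drive counts)

# toy growth model: square-root-type rate applied over the storage time
log_counts = -2.0 + (0.023 * (temp_C + 1.18)) ** 2 * 24 * time_d / np.log(10)

for name, x in [("storage temperature", temp_C),
                ("storage time", time_d),
                ("serving size", serving_g)]:
    rho, _ = spearmanr(x, log_counts)
    print(f"Spearman r with log10 counts, {name}: {rho:5.2f}")
```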

2.6. L. monocytogenes Growth Kinetic Parameters as Drivers of the Final Risk

Finally, some of the QRA models have shown that the kinetic parameters of the pathogen have a strong impact on the estimated risk. For instance, in their model for soft-ripened cheese, FDA-Health Canada [7] found moderate correlations between the risk per serving and the lag phase duration (r = −0.54) and the exponential growth rate at 20 °C (EGR20) (r = 0.45); furthermore, halving the EGR20 of L. monocytogenes decreased the mean risk per contaminated serving by a factor of ~8, and doubling the EGR20 multiplied the mean risk by a factor of ~4. In Tenenhaus-Aziza et al. [11], when the generation time of L. monocytogenes in the environment was extended from 24 h (baseline) to 48 h, the risk of listeriosis from pasteurized milk soft cheeses was divided by ~550, whilst Pérez-Rodríguez et al. [15] showed that incorporating the lag time effect into the baseline model produced a 30% reduction in the number of cases per million servings. These findings reinforce the importance of obtaining good estimates of the microbial kinetic parameters to model the changes in microbial concentration between the point of contamination and human exposure to the pathogen. To avoid assuming that L. monocytogenes populations are homogeneous and that their kinetic parameters represent average population behavior, the common strategy in the QRA models was to represent strain variability in parameters such as growth rate, minimum growth temperature, minimum growth pH, and lag phase duration, derived from growth challenge data obtained with a cocktail of L. monocytogenes strains [4][15][16][17][18][19]. A different approach for modeling strain variability, aimed at a more precise exposure assessment, was proposed by Njage et al. [20]; it uses whole-genome sequencing (WGS) data to unravel the biological variability that underlies the diverse responses of microorganisms to different environmental conditions. These authors employed finite mixture models to distinguish the number of L. monocytogenes sub-populations for each of the stress phenotypes: acid, cold, salt, and desiccation. Based on a performance assessment of several machine learning methods, they selected the support vector machine approach for the prediction of the acid stress response and the random forest approach for the cold, salt, and desiccation stress responses. They used WGS data from a collection of 166 L. monocytogenes strains from Canada and Switzerland, together with associated data on growth phenotypes under the different stress conditions.
It is crucial to bear in mind that the outcomes of a risk assessment are context-specific and influenced by factors such as the country and population under consideration. Moreover, risk assessment is inherently linked to the questions posed by a risk manager. Consequently, the presentation of assessment results should be tailored to the specific question at hand, whether it involves estimating risk at the population level to gauge the overall burden or assessing risk per portion to evaluate the impact of control measures. Furthermore, certain nuances are challenging to convey accurately. For instance, the term "cheese" encompasses a diverse array of processes and microflora, rendering the transfer of models from one country to another a complex task. Traditionally, in risk assessment, it is assumed that the food matrix influences exposure, in that growth or inactivation is linked to the properties of the matrix.
Conversely, it is presumed that the food matrix does not influence the virulence of strains, although several studies suggest that it might [21][22]. It is also generally assumed that the matrix does not affect the variability of virulence profiles among strains; consequently, the distribution of values characterizing intraspecific variability in the dose–response relationship is presumed to be identical regardless of the food under consideration. Nevertheless, available data demonstrate that the distribution of L. monocytogenes sequence types differs among food categories [23][24]. Employing a dose–response approach that considers the diversity of virulence profiles would enable a more accurate assessment of the role of cheese and dairy products in the risk of listeriosis; a preliminary proposal along these lines was made by Fritsch et al. [25].
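
As a simple illustration of how strain-level variability in virulence could be propagated into the dose–response step, the following sketch compares an exponential dose–response with a fixed r parameter against one in which r varies between servings according to an assumed lognormal distribution; all values are illustrative assumptions and do not correspond to any published dose–response model for L. monocytogenes.

```python
# Minimal sketch (illustrative assumed values, not any published dose-response
# model): exponential dose-response with the r parameter varying between
# servings to reflect strain/virulence heterogeneity, compared with a single
# fixed "average" r.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
dose = rng.lognormal(mean=np.log(100), sigma=2.0, size=n)        # assumed CFU per serving

r_fixed = 1e-12                                                  # assumed median dose-response parameter
r_var = rng.lognormal(mean=np.log(r_fixed), sigma=2.3, size=n)   # assumed strain variability in r

p_fixed = -np.expm1(-r_fixed * dose)    # 1 - exp(-r*dose), numerically stable form
p_var = -np.expm1(-r_var * dose)

print(f"mean risk per serving, fixed r          : {p_fixed.mean():.2e}")
print(f"mean risk per serving, strain-variable r: {p_var.mean():.2e}")
```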

2.7. Availability of Models

Sharing risk assessment models is essential to ensure the transparency of the approach and ease of re-use. This is particularly important in scientific research, where reproducibility and open science are increasingly valued [26]. By sharing models, researchers allow others to scrutinize their work, identify potential biases, and apply the models to their own data. This can help improve the accuracy and reliability of risk assessments and ultimately lead to better decision-making. In addition to ensuring transparency, sharing models also facilitates re-use, which is especially beneficial for researchers who may not have the resources to develop their own models [27]. Of all the studies analyzed, four provide access to their codes or spreadsheets [15][18][19][20], and one proposes to make the models used available on request [7]. For the other models, no indication is given as to their availability. One study dating from 2004 refers to a website that no longer exists [8], which illustrates the challenge of reproducibility: after a few years, as software evolves and resources disappear (e.g., websites are no longer maintained), it becomes difficult to reproduce the calculations that were made [28][29].
 