Participants’ appreciation for a successful Second MoniQA International Symposium on Food Fraud Prevention and Effective Food Allergen Management, 7-8 June 2018

The 2nd International MoniQA Symposium on Food Fraud Prevention and Effective Food Allergen Management in Vösendorf-Vienna, Austria, 7-8 June 2018, attracted 77 delegates representing all aspects of the agrifood sector. The event was organised by Roland Poms, Secretary General, MoniQA Association, Austria, and his team along with Richard Cantrill, President, MoniQA Association, Canada. The book of abstracts is published as volume 10, supplement 1, of the journal QAS - ‘Quality Assurance and Safety of Crops and Foods’, and can be accessed free of charge at www.wageningenacademic.com/foodfraud2018. The 3rd MoniQA International Symposium of this series is planned for the latter half of 2019.

A detailed summary of the meeting is given below, thanks to Michael Walker, LGC, United Kingdom, who provided his personal report.

The organisers thank all speakers and poster presenters for their excellent contributions. Furthermore, we thank the sponsors of this event and all participants for their active participation!

Roland Poms and Richard Cantrill welcomed the delegates and introduced the conference, which offered an exciting programme of talks mixing cutting-edge science with examples of litigation and enforcement. Strategies and methods for detecting and combating food fraud were presented, as well as discussions of food allergy and the coeliac condition, including precautionary allergen labelling, ‘free from…’ claims, thresholds, laboratory accuracy and class action litigation.

The speakers represented the United Nations’ FAO/IAEA, IFS, LGC, USP and MoniQA, as well as industry including Nestlé, SQTS and Imprint Analytics, together with law firms, food research institutions, regulators, academics and non-governmental organisations.

The keynote speaker was Andrew Cannavan, Laboratory Head, Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture, International Atomic Energy Agency, United Nations, Austria. Andrew discussed the global perspective of food fraud and a "systems" approach to dealing with it. His examples were drawn from his Division’s international research projects and capacity building in the developing world. The focus of the projects is mainly on the development and application of analytical methods for food authenticity to underpin traceability and to increase confidence that food commodities reaching local consumers, and those destined for international trade, are safe and authentic. Andrew’s conclusions included that positive criteria for enforcement lead to good analytical methods, and he advocated using the simplest methods that are fit for purpose and validated, alongside advanced confirmatory techniques.

John O’Brien, Deputy Head, Nestlé Research Centre, Switzerland, gave some key insights on food authenticity from an industry perspective. He highlighted challenges such as ‘free from’, ‘natural’, alternative proteins, new technologies, lab-grown meat, ‘grass-fed’, and packaging reduction. All stakeholders need to recognise that many of these are pre-competitive issues. Systemic vulnerability analysis, targeted audits, and early warning tools are now in use in both the private and public sectors to detect and to manage food fraud risks. Some facilities are employing sophisticated analytical tools to distinguish between ‘normal’ and ‘abnormal’ fingerprints, which enables follow-up action on a targeted basis. Such approaches have been shown to assist greatly in early management of issues, affording greater consumer protection. However, there remain gaps in global coverage and in consistency of use of such tools.

Building on this, Thomas Gude, Deputy Head, SQTS, Switzerland, discussed the value of food profiling to assure food authenticity rather than to investigate for food fraud. Food profiling is a systematic approach applying targeted and non-targeted analysis with modern high-resolution mass spectrometry. This is a promising area that needs much more work and international collaboration. Thomas illustrated his concepts with food contact materials, where analysis covers both known targets (additives) and non-targets, i.e. NIAS (non-intentionally added substances).

Steven Gendel, Senior Science Manager – Food, United States Pharmacopeia (USP), speaking on assuring food authenticity from a standards developer’s perspective, noted that assuring food authenticity is a complex problem that includes the need for information integrity and communication. Consumers expect foods to contain all of the constituents that should be present, to not contain anything inappropriate or harmful, and to have accurate and complete labels. These same expectations apply to all the participants in the supply chain. For industry, authenticity starts with the ingredient. Thus ingredient standards and reference materials are needed, along with identity procedures, acceptance criteria, and technically rigorous methods. Steven described with examples how these requirements are met by the Food Chemicals Codex. His final message was that future standards will need to address complex ingredients whose composition can vary depending on environmental, agricultural, and other unpredictable factors. This will create a need to share information, such as sets of spectra, that is not amenable to publication as traditional documents. These changes mean that standards development organizations, and the food industry, will need to develop a new understanding of what constitutes an ingredient standard, of how to ensure data integrity, and of how to communicate and use these new standards along the supply chain.

Michael Walker, Laboratory of the Government Chemist, UK, discussed honey authenticity, in particular when official controls are questioned. Michael described a series of food fraud cases involving honey in the U.S., China, New Zealand and Denmark. There are many means of adulterating honey, including the addition of cheap sugars and syrups after collection from hives, overfeeding bees with saccharides or invert derivatives, and falsification of the floral or geographical origin. In the face of media reports that more mānuka honey is sold worldwide than is produced in New Zealand, a set of high-level characteristics for mānuka-type honey was developed, which Michael reviewed and critiqued. He described the European Commission control plan on honey, in which over 2,000 samples of honey were collected, some 20% of which were non-compliant with authenticity criteria. Honey authentication requires a multifaceted approach which can be costly and time consuming. As well as classical analysis, δ13C EA/LC-IRMS is required. But is this sufficient? Nuclear Magnetic Resonance (NMR) can provide quantitative data and molecular structural information on key components in honey with little sample preparation, and over the last five years a small but significant literature has emerged on this approach. However, in 2015 the UK Food Standards Agency wrote to UK enforcement authorities to state ‘… 1H NMR …screening method gives indicative results and does not definitively prove that added sugar is present … no enforcement action should be taken in relation to the NMR results alone with regards to added sugar at the present time …’. The reasons for this were examined, along with the outcomes of the 2018 European Commission Joint Research Centre Technical Round Table on Honey Authentication.
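To make the classical isotopic check concrete, the sketch below illustrates the apparent C4 sugar calculation used in internal standard δ13C methods of the AOAC 998.12 type, in which the δ13C of the bulk honey is compared with the δ13C of its extracted protein. All measurement values and the acceptance threshold shown are hypothetical examples; exact constants and criteria depend on the protocol applied.

```python
# Illustrative sketch of the classical C4-sugar check used in honey
# authenticity testing (internal standard delta-13C logic).
# All numeric inputs below are hypothetical example values.

def apparent_c4_sugar_percent(delta13c_protein: float,
                              delta13c_honey: float,
                              delta13c_c4_syrup: float = -9.7) -> float:
    """Apparent C4 sugar content (%) from the protein/honey delta-13C difference.

    delta13c_protein : delta-13C of protein isolated from the honey (per mil)
    delta13c_honey   : delta-13C of the bulk honey (per mil)
    delta13c_c4_syrup: assumed mean delta-13C of C4-plant syrups (per mil)
    """
    return ((delta13c_protein - delta13c_honey)
            / (delta13c_protein - delta13c_c4_syrup)) * 100.0


if __name__ == "__main__":
    # Hypothetical measurements for one sample
    c4 = apparent_c4_sugar_percent(delta13c_protein=-25.4, delta13c_honey=-23.9)
    print(f"Apparent C4 sugar: {c4:.1f} %")
    # A threshold around 7 % is commonly applied before a sample is reported
    # as adulterated; the exact criterion depends on the protocol in force.
    print("Flag for follow-up" if c4 >= 7.0 else "Within typical acceptance range")
```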

Cesare Varallo, Vice President - Business and Regulatory Affairs EU, INSCATECH, Italy, gave a lawyer’s perspective on how technologies and blockchain could be used to mitigate the risk of lawsuits and recalls. His presentation examined some of the most recent developments and applications of artificial intelligence, the internet of things and blockchain in the food supply chain. These technologies must be properly understood before being applied: benefits and vulnerabilities should be carefully considered. For example, systems’ interoperability cannot be ignored, but such technologies can contribute to strengthening the supply chain and the quality of the data, especially in the face of litigation.

Litigation was again the theme taken up by Riëtte van Laack, Director, Hyman, Phelps & McNamara, P.C., USA. With a PhD in food science and law qualifications, Riëtte was well placed to discuss the U.S. response to food fraud: FSMA, litigation, and the national organic program. Although the U.S. Food and Drug Administration’s regulations implementing the Food Safety Modernization Act specifically mention food fraud, they address food fraud only to the extent that the fraud is a safety concern. Compositional standards help prevent and combat fraud to the extent that such standards can be verified. However, frequently, standards cannot be verified by testing alone. In the United States, private litigation by competitors and consumers is used frequently to combat alleged fraud. Competitors know the market and more easily recognize circumstances of possible fraud. Consumer protection organizations can highlight potential fraud situations. In addition, many state laws provide for monetary recovery in consumer class actions for food fraud. Examples of food fraud litigation include lawsuits regarding pomegranate products and cases regarding extra virgin olive oil. In some cases, private litigation has resulted in the development of standards, either by independent third parties or by federal agencies.

Riëtte went on in a second talk to discuss the failure of analytical tests. Fraud is often committed with knowledge of what companies generally test for. For analytical testing to be useful in detecting fraud, at least three requirements must be met: (1) a clearly defined standard related to the chemical composition of the authentic product has been established; (2) there is a known compositional difference between the authentic and fraudulent food; and (3) the analytical test must be validated. Riëtte illustrated the failure of testing with the litigation brought by the New York Attorney General against retailers accused of selling herbal dietary supplements that allegedly did not contain what they were represented to contain. DNA testing of these supplements allegedly revealed the absence of the labelled botanical substance. However, the investigators had failed to consider that the extraction processes applied to botanicals could have removed or destroyed genetic material.

Day 1 of the conference concluded with a panel discussion on interpreting legal limits in the context of laboratory data. Moderated by Richard Cantrill, the panel of Cesare Varallo, John O’Brien, Michael Walker, Riëtte van Laack, Thomas Gude and Steven Gendel discussed the value of data in litigation and disputes, with lively input from the audience.

Day two of the conference began with Beatriz Torres Carrió, Senior Quality Assurance Manager, International Featured Standards (IFS), Germany, who gave an informative talk on IFS strategies of quality management against food fraud and new IFS certification. Much of the IFS documentation is freely available on their website.

David Psomiadis, Head of Laboratory, Imprint Analytics, Austria, then discussed forensic science and digital techniques in food authenticity and traceability testing. He illustrated analytical tools, good practice and future trends with several interesting examples, including isotope analysis (LC/EA-IRMS) of ‘coconut water’ for added sugar, detection of synthetic vanillin, and fruit juice origin.

Steven Gendel returned to the podium to talk on opportunities and challenges in targeted vs non-targeted methods. Unlike microbial hazards, chemical hazards in foods cannot usually be controlled by processing. This means that control of chemical hazards in a prevention-based food safety system is focused on the supply chain. Dilution is a poor solution and may be illegal. The food industry is turning to the use of non-targeted methods to characterize foods or ingredients. The advantage of well-designed non-targeted methods is that they can indicate whether a particular sample is ‘out of range’ without needing to know why. The disadvantage of these methods is that they require a great deal of data to determine the expected ‘range’ under a variety of growing, harvesting, and handling practices. In addition, each sample test can generate significantly more data than presence-absence or threshold-based tests. The volumes of data involved and the need for standards on how to generate and use these data present unique challenges and opportunities for standard development organizations and for food manufacturers. Steven then led a discussion on the acceptability of untargeted analysis and the acceptance of the results by industry, regulators and standard setters.
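The ‘out of range’ idea behind non-targeted methods can be sketched in a few lines of code. The example below is purely illustrative, not any specific commercial or regulatory system: it fits a multivariate reference model to fingerprints of authentic samples and flags new samples whose Mahalanobis distance from that reference cloud exceeds a data-driven threshold. The data are synthetic stand-ins for binned spectra.

```python
import numpy as np

# Illustrative sketch of a non-targeted "out of range" check:
# fit a multivariate reference model on fingerprints of authentic samples,
# then flag new samples that sit far outside that reference cloud.
# The data are synthetic; real applications use large curated spectral libraries.

rng = np.random.default_rng(0)

# Synthetic "fingerprints" (e.g. binned spectra) for 200 authentic samples, 10 features
authentic = rng.normal(loc=1.0, scale=0.1, size=(200, 10))

mean = authentic.mean(axis=0)
cov = np.cov(authentic, rowvar=False)
cov_inv = np.linalg.inv(cov)

def mahalanobis_sq(x: np.ndarray) -> float:
    """Squared Mahalanobis distance of one fingerprint from the reference model."""
    d = x - mean
    return float(d @ cov_inv @ d)

# Threshold taken from the empirical distribution of the authentic set (99th percentile)
threshold = np.percentile([mahalanobis_sq(x) for x in authentic], 99)

# A new sample with one shifted feature (e.g. an unexpected adulterant signal)
suspect = rng.normal(loc=1.0, scale=0.1, size=10)
suspect[3] += 0.8

for name, sample in [("typical", authentic[0]), ("suspect", suspect)]:
    score = mahalanobis_sq(sample)
    status = "OUT OF RANGE" if score > threshold else "within range"
    print(f"{name}: score={score:.1f} threshold={threshold:.1f} -> {status}")
```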

Roland Poms, Secretary General, MoniQA Association, Austria, then described method validation and reference materials to assure the reliability of analytical results. There is always a need for laboratories to be able to demonstrate that their methods are fit for purpose in their hands, give equivalent results to the reference method and can be viewed with confidence by the customer. The requirements for the quality of an analytical method are best assessed in a validation study, which usually involves some 8-16 laboratories to provide at least 8 valid results for statistical analysis. Parameters assessed include limit of detection, limit of quantitation, repeatability and reproducibility (variability of results within and between laboratories), accuracy, specificity, and false positives or negatives. Additional information that can be drawn from a validation study concerns robustness, the acceptability and handling of the method in the hands of different operators, and the identification of possible influences on the results in a routine setting. Necessary steps towards assuring the reliability of analytical results in any laboratory are the preferred use of validated methods, the use of reference materials where available, method verification, participation in proficiency tests, training, consideration of the requirements for laboratory accreditation and following Good Laboratory Practice. Reliable analytical results are the basis for appropriate decision-making processes concerning product safety and adequate food safety management measures.
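To make the repeatability/reproducibility distinction concrete, the sketch below applies the standard one-way ANOVA decomposition used in ISO 5725-style collaborative studies to synthetic data: the within-laboratory variance gives the repeatability standard deviation s_r, and adding the between-laboratory variance component gives the reproducibility standard deviation s_R. The study design and numbers are invented for illustration.

```python
import numpy as np

# Illustrative sketch of repeatability (s_r) and reproducibility (s_R)
# estimation from a balanced collaborative study, using the standard
# one-way ANOVA decomposition (ISO 5725-style). Data are synthetic:
# 10 laboratories, each reporting 2 replicate results for one material.

rng = np.random.default_rng(1)
n_labs, n_reps = 10, 2
true_value = 5.0
lab_bias = rng.normal(0.0, 0.3, size=n_labs)          # between-laboratory effect
results = true_value + lab_bias[:, None] + rng.normal(0.0, 0.2, size=(n_labs, n_reps))

lab_means = results.mean(axis=1)
grand_mean = results.mean()

# Mean squares from one-way ANOVA (laboratories as groups)
ms_within = ((results - lab_means[:, None]) ** 2).sum() / (n_labs * (n_reps - 1))
ms_between = n_reps * ((lab_means - grand_mean) ** 2).sum() / (n_labs - 1)

s_r = np.sqrt(ms_within)                               # repeatability SD
s_L_sq = max(0.0, (ms_between - ms_within) / n_reps)   # between-laboratory variance
s_R = np.sqrt(s_r ** 2 + s_L_sq)                       # reproducibility SD

print(f"grand mean          = {grand_mean:.2f}")
print(f"repeatability s_r   = {s_r:.3f}")
print(f"reproducibility s_R = {s_R:.3f}")
```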

Samuel Godefroy, Professor of Food Risk Analysis and Regulatory Systems, Food Security, Laval University, Canada, described the AOAC INTERNATIONAL standard method performance requirements (SMPRs) for allergen detection methods. Samuel described how analytical methods are part of the Codex regulatory provision – they do not exist in isolation but to support a Codex standard. SMPRs for food allergen methods using ELISA-based techniques describe the minimum recommended performance characteristics to be used during the evaluation of a method. SMPRs for egg and milk are available, and those for tree nuts will rely upon the ability of the food allergen community to develop agreed-upon reference materials for the selected priority tree nut(s).

Karin Hoffmann-Sommergruber, Group Leader, Department of Pathophysiology and Allergy Research, Medical University of Vienna, Austria, spoke on precautionary allergen labelling. Karin described how the majority of allergic reactions involve skin, respiratory, gastrointestinal and systemic (anaphylaxis) effects. Legislative allergen labelling and Precautionary Allergen Labelling (PAL), which is voluntary, were described, and the disadvantages of the latter explored. Probabilistic allergen risk management was discussed, along with thresholds of elicitation. Unfortunately, allergic consumers often assume PAL is regulated. Karin described how PAL is approached by various countries and recommended that PAL should only be used if cross-contamination is unavoidable and represents a real risk.

Clare Mills, Professor of Biological Sciences, Division of Infection, Immunity and Respiratory Medicine, University of Manchester, UK, delivered her talk, ‘Free-from foods – what does it mean for allergens?’ via Skype, which worked well. There is currently no consensus as to what constitutes a ‘free-from’ food with regard to IgE-mediated food allergies. Regulators and food manufacturers alike have to rely on analytical testing to demonstrate the absence of an allergen in a food product. The lack of agreed reference doses that are considered safe for the majority of allergic consumers means it is unclear how sensitive test methods need to be, although dose distribution modelling can provide guidance with regard to the levels of allergens that are unlikely to cause a reaction. Inter-laboratory comparisons of immunoassay test methods for foods such as milk, egg and peanut have shown wide variations in test method performance regarding sensitivity and reproducibility of results. This leaves the possibility that manufacturers and enforcement bodies may obtain conflicting test results. The development of appropriate certified reference materials for allergen analysis can help reconcile some of these differences, especially for test methods showing reproducible and consistent differences. Mass spectrometry (MS) methods have much to offer as a complementary, confirmatory technique to the currently favoured immunoassay test methods. Clare described the iFAAM multi-centre study using an MS method for determination of peanut in chocolate dessert. The trial showed significant divergence in the ability of ELISA tests to quantify peanut allergens and demonstrated that MS has the potential to detect and quantify peanut protein at similar levels to ELISA. Identified gaps will be taken forward in the recently EFSA-funded project ThRAll.
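The dose distribution modelling mentioned above can be sketched simply: if minimum eliciting doses in the allergic population are assumed to follow a log-normal distribution, then an eliciting dose such as ED05 (the dose predicted to provoke a reaction in 5% of that population) is just a quantile of the distribution. The parameters in the sketch below are hypothetical and are not published reference doses.

```python
from statistics import NormalDist
import math

# Illustrative sketch of dose distribution modelling for allergen risk assessment:
# minimum eliciting doses (MEDs) in the allergic population are assumed to be
# log-normally distributed, so an EDp value (dose predicted to elicit a reaction
# in p % of that population) is a quantile of that distribution.
# The parameters below are hypothetical, not published reference doses.

def ed_p(geo_mean_mg: float, geo_sd: float, p_percent: float) -> float:
    """EDp in mg protein for a log-normal MED distribution.

    geo_mean_mg : geometric mean of the MED distribution (mg protein)
    geo_sd      : geometric standard deviation (dimensionless, > 1)
    p_percent   : percentage of the allergic population predicted to react
    """
    z = NormalDist().inv_cdf(p_percent / 100.0)
    return math.exp(math.log(geo_mean_mg) + z * math.log(geo_sd))


if __name__ == "__main__":
    # Hypothetical distribution: geometric mean 30 mg protein, geometric SD 6
    for p in (1, 5, 10):
        print(f"ED{p:02d} ~ {ed_p(30.0, 6.0, p):.2f} mg protein")
```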

Katharina Scherf, Research Group Leader, Leibniz-LSB, Technical University of Munich, Germany, followed with a well thought out presentation on improved reference materials for gluten analysis. Katharina described the standards for gluten-free products and the difficulties of gluten analysis. Well-characterized reference materials are essential to help address these challenges, and Katharina summarised a considerable amount of work carried out by an international consortium in which she is a leading participant. This has resulted in the identification of wheat cultivars that are representative of the multitude of wheats grown worldwide. Selection criteria for representative wheat cultivars as a basis for the development of a new reference material for gluten(-free) analysis were defined. Grains of wheat cultivars from different geographical origins were collected, milled into white flours and characterized for chemical composition, wet and dry gluten content, ELISA response using two different antibodies, and protein composition assessed by gel-permeation and reversed-phase high performance liquid chromatography and polyacrylamide gel electrophoresis. Based on the results, qualitative and quantitative selection criteria were defined and five wheat cultivars from four continents were selected. These cultivars were further investigated, and two reference materials were found to be suitable: the single cultivar Carberry and a blend of the five cultivars.

René Crevel, Director at René Crevel Consulting Limited, UK, formerly of Unilever, gave a cogent overview of new tools for allergen risk assessment (RA) and management arising from the iFAAM project (Integrated Approaches to Food Allergen and Allergy Risk Management). The project team developed an allergen tracking tool, together with a tiered risk assessment approach, and the tools and their application were thoroughly described. Tier 1 is based on point estimates and is designed to be used by those without deep expertise in the allergen field, although they do need sufficient knowledge of their own processes. The Tier 1 RA is conservative and hence unlikely to produce an unwise assessment. The Tier 2 RA is built on previous models and uses distributions of minimum eliciting doses, food consumption data and unintended allergen concentrations, combined with advanced statistics and modelling. Tier 1 will be freely available; Tier 2 will not be publicly available, in part owing to the level of expertise required to use it. René also described the iFAAM risk management (RM) toolbox, an extension to the tracking tool: RM options were collected and collated, and a decision tree approach was developed. iFAAM finished in 2017 and the tools will be available on the website. The iFAAM consortium is currently looking at a governance structure for the website dissemination of the tools to ensure continuing relevance. The information can also be accessed via the MoniQA website.
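As a minimal sketch of what a Tier-1-style point estimate involves (an assumed structure for illustration, not the actual iFAAM tool), the example below compares the allergen protein intake implied by a single serving and a worst-case unintended allergen concentration against a reference dose chosen by the assessor. All numbers are hypothetical.

```python
# Minimal sketch of a Tier-1-style point-estimate allergen risk assessment
# (assumed structure for illustration; not the actual iFAAM tool).
# Exposure = serving size x unintended allergen concentration, compared
# with a reference dose. All numbers below are hypothetical.

def tier1_assessment(serving_size_g: float,
                     allergen_conc_mg_per_kg: float,
                     reference_dose_mg: float) -> dict:
    """Compare a worst-case point estimate of allergen protein intake with a reference dose."""
    intake_mg = serving_size_g / 1000.0 * allergen_conc_mg_per_kg  # mg allergen protein per serving
    return {
        "intake_mg_protein": intake_mg,
        "reference_dose_mg": reference_dose_mg,
        "exceeds_reference_dose": intake_mg > reference_dose_mg,
    }


if __name__ == "__main__":
    # Hypothetical scenario: 50 g serving, 4 mg/kg unintended allergen protein,
    # 1 mg protein reference dose chosen by the assessor.
    result = tier1_assessment(serving_size_g=50.0,
                              allergen_conc_mg_per_kg=4.0,
                              reference_dose_mg=1.0)
    for key, value in result.items():
        print(f"{key}: {value}")
```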

Bert Popping, Managing Director and Co-owner, FOCOS, Germany, then discussed the benefits and challenges of consumer analytical devices. Against a backdrop of growing distrust of food in some quarters, can point-of-use devices help, especially for people with food allergies? Available, and affordable, in the U.S., they have given rise to a number of issues, including sampling, misinterpretation of instructions and outputs, and specificity. Bert described the 2018 AOAC stakeholders’ guidance document for consumer analytical devices with a focus on gluten and food allergens, published in the Journal of AOAC International, 101(1), pp. 185-189.

Ronald Niemeijer, Director Global Marketing Food & Feed Analysis, R-Biopharm, Germany, next discussed socio-economic aspects of food allergens. Ronald reviewed the burdens on people with allergy and autoimmune conditions and their prevalence, and discussed the provision of ‘free-from’ food on their behalf. ‘Free-from’ is a growing business opportunity but means increased production costs, logistic separation of ingredients, dedicated facilities or lines, or much stricter cleaning. Management and testing costs are higher. Ronald concluded with a review of current and potential future tools in allergen management and food allergen testing.

Adrian Rogers, Senior Research Scientist, Romer Labs UK, UK, discussed allergen analysis and three case studies which demonstrated how particular challenges faced by different users were overcome to ensure that immunoassay-based testing for food allergens best suited their needs. These were (1) an intensive validation programme of an ELISA for peanut in confectionery after the kit the customer had previously been using was withdrawn, (2) a study of casein determination in hydrolysed whey-based baby formula milk, and (3) findings of almond and peanut in a spice/seasoning sample where different kits gave different results.

Two awards for best poster at the conference were made, with the researchers giving short talks on their work. Patricia Galan-Malo presented ‘Survey on the occurrence of allergens on food-contact surfaces from school canteen kitchens’ and Andreas Zitek presented ‘Determination of origin and authenticity of fish by elemental and Sr-isotopic fingerprints’.

Finally, Richard Cantrill and Roland Poms gave feedback on the two MoniQA Task Force workshops, on (a) Food Fraud Prevention and Authenticity and (b) Food Allergen Reference Materials. The next task force meetings will be held in Toronto at an AOAC allergen meeting in August 2019. The 3rd MoniQA International Symposium in this series will take place in the latter half of 2019.

 
