
Combined exposure to multiple chemicals ("chemical mixtures"): methods for assessing human and ecological risks

Introduction

    Human and ecological risk assessments of combined exposure to multiple chemicals (“chemical mixtures”) pose several challenges to scientists, risk assessors and risk managers, particularly due to the large number of chemicals involved and to their different exposure patterns and toxicological profiles in humans and other species present in the environment.

    The development of harmonised methodologies for the risk assessment of combined exposure to multiple chemicals is an important element of the European Food Safety Authority's (EFSA) Science Strategy, and a number of activities have been undertaken at EFSA over the years to support such harmonisation.

    In this context, an EFSA report (2013) reviewed the available "international frameworks dealing with human risk assessment of combined exposure to multiple chemicals" and made a number of recommendations for future work in this area to move towards harmonisation of methodologies. On this basis, EFSA organised a scientific colloquium on the subject, which focused on four key issues around the development of models for hazard identification and exposure assessment1 for chemical mixtures, and on how experimental tools and statistical and risk assessment methods can be used to predict joint effects:

    1. The mechanistic models for hazard identification and assessment;
    2. The harmonisation of methods for combined exposure assessment;  
    3. The use of so-called “OMICs”2 and in silico3 methods for risk assessment; 
    4. The application of science-based uncertainty factors and of mechanistic approaches for risk characterisation. 

    1 Exposure assessment: Process of measuring or estimating the concentration (or intensity), duration and frequency of exposure to an agent present in the environment or, if estimating hypothetical exposures, that might arise from the release of a substance, or radionuclide, into the environment. Reference: IUPAC Glossary of Terms Used in Toxicology 2007 https://envirotoxinfo.nlm.nih.gov/toxicology-glossary-a.html
    2 Omics: Any of several biochemical or genetic studies that aim to identify the totality of a certain type of compound, gene, etc. in a specific organism.
    3 In silico: literally 'in silicon', i.e. 'in the computer'; referring to analysis or experimentation carried out in a computer environment rather than in the laboratory.

    What are the tools available to support hazard identification and risk assessment of chemical mixtures?

      First of all, it is essential to differentiate between hazard identification, which refers to the identification of the intrinsic toxic properties of a chemical, and risk assessment, which evaluates the probability of (adverse) effects in relation to the level of exposure, both in intensity and in duration4.

      A number of tools are available to support human and ecological hazard and exposure assessments for chemical mixtures. These include biologically based models, in silico (computer-based) tools, and the so-called "omics" tools that look at biological systems as a whole, either by considering the whole genome (genomics), the full set of expressed and active proteins (proteomics), or the metabolic products in the cells (metabolomics), as well as physiologically based pharmacokinetic (PBPK) models, which are flexible enough to account for exposures that vary with time and life stage.

      4 On the fundamental distinction to be made between the notions of hazard and risk, see: www.youtube.com/watch?v=PZmNZi8bon8. Subtitles available in English, French, German, Dutch, Spanish, Chinese and Russian. A French-language version is also available: https://youtu.be/wRmfvFYDNr8 

      Is the method of addition of potential effects generally used for the prediction of chemical mixture effects valid?

        When different components of a mixture produce the same effect on the same endpoint, key studies have confirmed that simply adding the exposures to the individual components of the mixture is a valid default (conservative) prediction approach for a simple risk assessment.

        For realistically occurring environmental mixtures, experimental evidence indicates that the number of chemicals that are relevant for the determination of a cumulative effect is indeed relatively low. Usually, 90% of the potency of a complex mixture is determined by no more than 3 or 4 individual chemicals. This also contributes to reducing the prediction window between concentration addition and independent action, and simplifies the prediction of the response.
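
        As a purely illustrative sketch of the two reference models mentioned here (all EC50 values, slopes and concentrations below are hypothetical, and both models assume no interaction between the components), the concentration-addition and independent-action predictions for a simple three-component mixture could be compared as follows:

            # Minimal sketch comparing concentration addition (CA) and independent action (IA)
            # predictions for a three-component mixture. All EC50s, slopes and concentrations
            # are hypothetical; both models assume no interaction between the components.

            # Log-logistic dose-response: effect = 1 / (1 + (EC50 / c)**slope)
            components = {            # EC50 (mg/L), slope, concentration in the mixture (mg/L)
                "chem_A": {"ec50": 2.0, "slope": 1.5, "conc": 0.5},
                "chem_B": {"ec50": 5.0, "slope": 2.0, "conc": 1.0},
                "chem_C": {"ec50": 0.8, "slope": 1.2, "conc": 0.1},
            }

            def effect(conc, ec50, slope):
                return 1.0 / (1.0 + (ec50 / conc) ** slope) if conc > 0 else 0.0

            # Independent action: combine the probabilities of effect of each component
            ia_effect = 1.0
            for p in components.values():
                ia_effect *= 1.0 - effect(p["conc"], p["ec50"], p["slope"])
            ia_effect = 1.0 - ia_effect

            # Concentration addition: sum of toxic units (here expressed relative to the EC50);
            # a sum of 1 corresponds to a mixture expected to cause a 50% effect.
            toxic_units = sum(p["conc"] / p["ec50"] for p in components.values())

            print(f"Independent-action predicted effect: {ia_effect:.2f}")
            print(f"Concentration-addition toxic units (vs EC50): {toxic_units:.2f}")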

        In ecotoxicological risk assessment, many studies have also established the validity of the concentration addition model for similarly acting "narcotic chemicals". Only a few cases of strict independent action by each chemical of a mixture have been observed. The likelihood of combined (or synergistic) effects is then dose-dependent, and single substances have always been assumed to add to a background of other chemicals.

        Substances for combined assessment should thus be selected based on the likelihood of common exposure and combined effects. With respect to synergistic effects, the discussion group noted that, for human risk assessment, it is assumed that synergism plays little role at exposures below the Tolerable or Acceptable Daily Intake (TDI/ADI), as there is little evidence of it in practice at such low levels.

        How can synergistic effects5 of different chemicals be assessed?

          The evidence for the likelihood of synergism occurring in the context of combined exposure to chemicals should be analysed, and the magnitude of synergism should be assessed in the cases where it has been identified.

          Systematic analysis might provide information that would serve as a basis to decide whether or not, or under which conditions, an additional assessment factor6 to cover synergisms might be feasible and to define the magnitude of such a factor.

          There is, however, sometimes considerable confusion about what 'synergy' is, and reference was made to the definitions given by the International Programme on Chemical Safety (IPCS): chemicals may interact to produce an effect such that their combined effect "departs" from dose additivity. Such departures comprise "synergy", where the effect is greater than that predicted on the basis of simple additivity, and "antagonism", where the effect is less than that predicted on the basis of additivity.
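
          As a purely numerical illustration of this definition (the effect values are hypothetical), a departure from additivity can be quantified by comparing the observed mixture effect with the additivity-based prediction:

              # Purely illustrative numbers: quantifying departure from an additivity-based
              # prediction. A ratio clearly above 1 indicates synergy, clearly below 1
              # antagonism; values close to 1 are consistent with simple additivity.

              predicted_additive_effect = 0.20   # e.g. predicted from dose/concentration addition
              observed_mixture_effect = 0.45     # hypothetical experimental observation

              deviation_ratio = observed_mixture_effect / predicted_additive_effect
              print(f"Observed/predicted ratio: {deviation_ratio:.1f}")  # > 1 suggests synergy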

          Systems biology studies, for example, have revealed that a single chemical can have different, and possibly multiple, modes of action in different taxa, depending on the physiology of the species.

          Evidence of synergy with a mechanistic basis, for instance when a substance inhibits an enzyme that contributes to the metabolic degradation of another substance, clearly shows that further tests may need to be applied so that such interactions can be incorporated within specific assessments. However, there is a lack of understanding of the mechanisms and modes of action and of their relevance to joint effects. This complexity challenges the application of concentration addition and independent action as the only mechanistic possibilities, as there is still a lack of scientific criteria for the grouping of chemicals.

          Furthermore, evidence of multiple modes of action for single chemicals, and of different modes of action for two different chemicals from the same class in the same species, has also been found. This is influenced by the dynamic nature of exposures, which can vary in space and time. Therefore, reliable scientific mechanistic criteria for the grouping of chemicals are still needed.

          5 Synergism, synergistic effect in toxicology: Pharmacological or toxicological interaction in which the combined biological effect of exposure to two or more substances is greater than expected on the basis of the simple summation of the effects of each of the individual substances. Reference: IUPAC Glossary of Terms Used in Toxicology 2007 https://envirotoxinfo.nlm.nih.gov/toxicology-glossary-a.html
          6 Assessment factor: Numerical adjustment used to extrapolate from experimentally determined (dose-response) relationships to estimate the agent exposure below which an adverse effect is not likely to occur. Reference: IPCS Risk Assessment Terminology. Part 1: IPCS/OECD Key Generic Terms used in Chemical Hazard/Risk Assessment; Part 2: IPCS Glossary of Key Exposure Assessment Terminology. https://www.opentoxipedia.org/index.php/Assessment_factor

          How does a multistep or “tiered approach” to risk assessment of chemical mixtures work?

            Pragmatic options for simplification are needed to support a scientifically valid approach to risk assessment that includes consideration of combined exposure when appropriate. Although both models continue to be applied, the complexity of independent modes of action underlying mixture effects has led to the development of the concept of a 'prediction window' between simple concentration addition and independent action.

            Differences in protection goals are a challenge for both human and ecological risk assessments. Even though synergistic effects can be a real phenomenon, the overall concept of a multistep or tiered hazard assessment7, built on the default assumption of dose addition, can be applied to harmonise mixture risk assessment frameworks, and interactive effects can be incorporated within the assessments, although this remains a challenge. The problem is that data gaps are common in both human and ecological hazard assessments, and how to bridge these gaps is a related issue.

            EFSA uses a multi-stage assessment framework in which the studies at each further stage are triggered by specific findings at an earlier one; a simplified sketch of this triggering logic is given after the list below. For instance, if genotoxicity is suspected in in vitro studies in Tier 1, then the Tier 2 assessment is triggered:

            Tier 1, Tier 2 and Tier 3 cover, among others, the following studies:

            • Absorption, distribution, metabolism, excretion (ADME) (repeated dose, volunteer studies) (…);
            • Carcinogenicity (1st and 2nd species);
            • Specialised studies (immunotoxicity, neurotoxicity, …);
            • Reproductive toxicity studies (two generations).
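
            As a simplified sketch of this triggering logic (the tier contents and trigger conditions below are illustrative assumptions, not EFSA's actual decision rules):

                # Illustrative sketch of a tiered (multistep) assessment workflow in which a
                # finding at one tier triggers studies at the next. Tier contents and trigger
                # conditions here are simplified examples, not EFSA's actual decision rules.

                def tiered_assessment(findings: dict) -> list:
                    """Return the list of tiers that would be carried out for a substance."""
                    tiers_performed = ["Tier 1: base set (e.g. in vitro genotoxicity, basic toxicity)"]

                    if findings.get("genotoxicity_suspected_in_vitro"):
                        tiers_performed.append("Tier 2: extended studies (e.g. ADME, in vivo follow-up)")

                        if findings.get("adverse_effects_in_tier_2"):
                            tiers_performed.append(
                                "Tier 3: specialised studies (e.g. carcinogenicity, reproductive toxicity)"
                            )
                    return tiers_performed

                # Example: an in vitro genotoxicity alert triggers the Tier 2 assessment
                print(tiered_assessment({"genotoxicity_suspected_in_vitro": True}))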

            7 The four stages of a tiered test approach in risk assessment
             www.efsa.europa.eu/sites/default/files/event/documentset/120921-p05.pdf
            8 Toxicokinetics
            A. Generally, the overall process of the absorption (uptake) of potentially toxic substances by the body, the distribution of the substances and their metabolites in tissues and organs, their metabolism (biotransformation), and the elimination (excretion) of the substances and their metabolites from the body.
            B. In validating a toxicological study, the collection of toxicokinetic data, either as an integral component in the conduct of non-clinical toxicity studies or in specially designed supportive studies, in order to assess systemic exposure.
            9 Toxicogenetics: study of the influence of hereditary factors on the effects of potentially toxic substances on individual organisms. Reference: IUPAC Glossary of Terms Used in Toxicology 2007

            Are the uncertainty factors and the risk characterisation used for human and for ecological risk assessment harmonised?

              Both types of risk assessment aim to protect a certain proportion of cases; however, the protection goals are not the same: for humans, the aim is to protect individual health by preventing adverse effects, whilst for ecosystems the goal is to protect and preserve populations through the protection of individuals. This difference is a challenge for harmonisation, even if certain characteristics of the framework for exposure assessment10 are amenable to potential harmonisation.

              Common issues exist for both frameworks, including the definition of the degree of conservatism, the use of uncertainty factors, and the need for a common basis for mode-of-action assessment. Substances for combined assessment should then be selected based on the likelihood of common exposure and combined effects.

              However, in relation to the assessment of risk to human health, it is regarded as important also to consider life stages (pregnancy, lactation, early life) and susceptible populations (polymorphisms) in the modelling.

              10 Assessment factor: Numerical adjustment used to extrapolate from experimentally determined (dose-response) relationships to estimate the agent exposure below which an adverse effect is not likely to occur. Reference: IPCS Risk Assessment Terminology. Part 1: IPCS/OECD Key Generic Terms used in Chemical Hazard/Risk Assessment; Part 2: IPCS Glossary of Key Exposure Assessment Terminology. www.opentoxipedia.org/index.php/Assessment_factor 

              How can the present limits of the additive approach for evaluating the toxicity of chemical mixtures be overcome?

                To resolve these issues, the development of alternative methods to animal testing is encouraged, since they would provide informative data for establishing groupings and for bridging or filling data gaps in both toxicological and ecological hazard assessment. For ecological risk assessment in particular, the EFSA discussion group considered that harmonisation of the principles for grouping chemicals was still hardly feasible.

                The challenges of characterising complex mixtures would benefit from rapid analysis that can only be performed in vitro. Indeed, large-scale in vivo testing for human health effects of mixtures is unlikely to be feasible, except to confirm results from in vitro/in silico studies.

                In this context, the identification of Adverse Outcome Pathways (AOPs) using in vitro (laboratory bench) and in silico (computer model) approaches would help. The use of in vitro assays, in particular genomics research, is anticipated to be extremely valuable for the testing of mixtures, making it possible to investigate interactions where in vivo studies would be impractical, and to test and refine mixture interaction models.

                Pragmatically, probabilistic methods11 will also be valuable for prospective assessments, making it possible to include both the prediction of joint effects and the characterisation of the related uncertainties within a single assessment.

                11 Probabilistic method or model: based on the theory of probability, i.e. on the fact that randomness plays a role in predicting future events. The opposite is deterministic, meaning that an outcome can be predicted exactly, without the added complication of randomness. www.statisticshowto.datasciencecentral.com/probabilistic/ 

                What is the standard method used for assessing the risk of exposure to a single chemical?

                  To extrapolate from test animal species to humans, uncertainty factors (UF), sometimes called safety factors, are used, for example by the US Environmental Protection Agency (US EPA) for human health non-carcinogenic dose-response assessment of individual environmental contaminants. These uncertainty factors cover the extrapolation12 from animals to humans, including inter-individual variability, based on the Lowest Observed Adverse Effect Level (LOAEL) or the No Observed Adverse Effect Level (NOAEL) observed in sub-chronic and chronic toxicity studies in animals. They also take toxicokinetic and toxicodynamic variability13 into account.

                  To account both for the extrapolation from animal data to humans and for intra-species variability within the human population, a general default uncertainty factor of 100 is usually applied when deriving health-based guidance values from a point of departure, usually the NOAEL. This means that the daily dose considered "safe" for humans (Tolerable Daily Intake, TDI, or Acceptable Daily Intake, ADI) is at least 100 times lower than the highest dose at which no adverse effect is observed (NOAEL) in animal or human studies. In some cases, however, this default assessment factor might not be adequate.
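
                  As a simple worked illustration of this derivation (the NOAEL value below is hypothetical and not taken from any real assessment):

                      # Illustrative only: deriving a health-based guidance value from a NOAEL using the
                      # default 100-fold uncertainty factor (10 for animal-to-human extrapolation x 10 for
                      # inter-individual variability in humans). The NOAEL value below is hypothetical.

                      def health_based_guidance_value(noael, interspecies_uf=10.0, intraspecies_uf=10.0):
                          """Return a TDI/ADI estimate in mg per kg body weight per day."""
                          return noael / (interspecies_uf * intraspecies_uf)

                      noael = 5.0  # hypothetical NOAEL from a chronic animal study, mg/kg bw/day
                      print(f"ADI = {health_based_guidance_value(noael)} mg/kg bw/day")  # 5.0 / 100 = 0.05

                  With these numbers, a NOAEL of 5 mg/kg body weight per day divided by the default factor of 100 gives an ADI of 0.05 mg/kg body weight per day.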

                  12 Extrapolation: Calculation, based on quantitative observations in exposed test species or in vitro test systems, of predicted dose-effect and dose-response relationships for a substance in humans and other biota including interspecies extrapolations and extrapolation to susceptible groups of individuals. Note: The term may also be used for qualitative information applied to species or conditions that are different from the ones in which the original investigations were carried out. Reference: IUPAC Glossary of Terms Used in Toxicology 2007 https://envirotoxinfo.nlm.nih.gov/toxicology-glossary-a.html 
                  13 Toxicokinetics (often abbreviated as 'TK') describes both the rate at which a chemical enters the body and how the compound is metabolised and excreted once it is in the body.
                  Toxicodynamics, termed pharmacodynamics in pharmacology, describes the dynamic interactions of a toxicant with a biological target and its biological effects. (Wikipedia)

                  Besides extrapolation from animal data, are there other methods available for the risk assessment of exposure to chemical mixtures?

                    Quantitative structure-activity relationship (QSAR) models14, which compare molecules, are also important for chemical screening and grouping and can help to group chemicals with a similar Mode of Action (MoA) in order to assess their relative potency. This approach may, however, lead to too narrow a 'grouping' of chemicals that cause a particular toxicity, and hence to an underestimation of toxicity. The participants in the EFSA scientific colloquium agreed that information on the mode of action is desirable and critical to reducing uncertainties, and that, from this perspective, assessments should ideally go further than the mode of action alone in order to fully understand physiologically based responses to chemical mixtures.
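
                    As an illustration of the relative-potency idea behind such grouping (all chemical names, exposures, potencies and the ADI below are hypothetical), a minimal sketch of a relative potency factor calculation for chemicals assumed to share a mode of action might look like this:

                        # Minimal sketch of a relative potency factor (RPF) approach for chemicals grouped
                        # by a common mode of action. All names and numbers are hypothetical.

                        # Exposure estimates (mg/kg bw/day) and relative potencies versus an index chemical
                        exposures = {"chem_A": 0.010, "chem_B": 0.002, "chem_C": 0.0005}
                        relative_potency = {"chem_A": 1.0, "chem_B": 5.0, "chem_C": 0.1}  # index chemical = chem_A

                        # Express the mixture as an equivalent exposure to the index chemical
                        index_equivalent_exposure = sum(exposures[c] * relative_potency[c] for c in exposures)

                        adi_index_chemical = 0.05  # hypothetical ADI of the index chemical, mg/kg bw/day
                        risk_quotient = index_equivalent_exposure / adi_index_chemical
                        print(f"Index-chemical-equivalent exposure: {index_equivalent_exposure:.4f} mg/kg bw/day")
                        print(f"Risk quotient vs ADI: {risk_quotient:.2f}")  # > 1 would flag a potential concern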

                    The increasing availability of information on the structure of cellular receptors, based on genome sequences, provides knowledge that can support understanding of specific species sensitivity. Mechanistic studies combining whole-"omics" (transcriptome, proteome and metabolome) methods with targeted studies of individual receptor structure and binding can support these model assessments for species read-across.

                    However, a concern was noted in regard to how to interpret genomic results in terms of differentiating beneficial, adaptive, or adverse effects. Any use in deducing mode of action is also likely to be prone to subjectivity, and would need a systems biology modelling approach to overcome this limitation. “Omics”-based methods would also be more difficult to apply to an Ecological Risk Assessment (ERA) due to the inability to identify the population relevance of the individual genomic responses.

                    The in silico profilers provided with the QSAR Toolbox of the Organisation for Economic Co-operation and Development (OECD) are also valuable for characterisation of an untested chemical but at the same time need further development to improve predictive power and reliability.

                    As already mentioned, probabilistic approaches are well accepted in ecology, e.g. for species sensitivity distributions, and these can be (and have been) applied to mixtures using the multi-substance potentially affected fraction of species. For human hazard assessment, the use of probabilistic methods is not well explored yet.
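
                    As a hedged sketch of one common variant of this calculation (log-normal species sensitivity distributions combined by response addition; all SSD parameters, concentrations and substance names are hypothetical):

                        # Minimal sketch of a multi-substance potentially affected fraction (msPAF)
                        # calculation under the response-addition (independent action) assumption,
                        # using log-normal species sensitivity distributions (SSDs).
                        # All chemical names, SSD parameters and concentrations are hypothetical.
                        from math import log10
                        from statistics import NormalDist

                        # SSD parameters: mean and standard deviation of log10(EC50) across species
                        ssd_params = {"chem_X": (1.2, 0.7), "chem_Y": (0.5, 0.9)}   # log10(ug/L)
                        env_conc   = {"chem_X": 3.0, "chem_Y": 1.0}                 # ug/L

                        def paf(conc, mu, sigma):
                            """Potentially affected fraction of species for a single substance."""
                            return NormalDist(mu, sigma).cdf(log10(conc))

                        # Response addition: combine single-substance PAFs assuming independent action
                        ms_paf = 1.0
                        for chem, (mu, sigma) in ssd_params.items():
                            ms_paf *= (1.0 - paf(env_conc[chem], mu, sigma))
                        ms_paf = 1.0 - ms_paf
                        print(f"msPAF = {ms_paf:.3f}")  # fraction of species potentially affected by the mixture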

                    Dynamic energy budget (DEB) modelling describes effects on growth and reproduction as resulting from energy deficits arising from chemical stress. These models are in theory applicable to various species and can potentially model multiple types of stressors (food, oxygen, chemicals, etc.).

                    Ultimately, the development of relevant tools requires the assembly of organised databases on species traits for physiologically based pharmacokinetic (PBPK) modelling15, comparative genomics, protein structures and quantitative structure-activity relationships (QSARs); in addition, it will advance the current state of the art in bioinformatics, systems biology and data management. This information provides the underpinning knowledge needed to establish Adverse Outcome Pathways (AOPs)16 relevant to a specific mixture.

                    Finally, the relatively limited requirements for the ecotoxicological assessment of industrial chemicals are very different from those for the ecotoxicological assessment of pesticides and biocides. These differences have meant that methods for environmental exposure assessment which take co-exposure to multiple chemicals into account are lacking compared with human health assessment.

                    14 Structure-activity relationship (SAR): Association between specific aspects of molecular structure and defined biological action. Reference: IUPAC Glossary of Terms Used in Toxicology 2007 https://envirotoxinfo.nlm.nih.gov/toxicology-glossary-a.html
                    15 Physiological pharmacokinetic model and toxicologically based pharmacokinetic (TBPK) modelling: Mathematical modelling of the kinetic behaviour of a substance, based on measured physiological parameters.
                    16 Adverse outcome pathways (AOPs): depict the long, sometimes complex chain of processes and events from the first interaction of any chemical with a molecular target (such as the interaction with a receptor or the inhibition of an enzyme) through to the perturbed function at cell and tissue level, leading finally to an appreciable disturbance of the organism’s function and/or structure causing health disorders in humans or animals.
                    www.openaccessgovernment.org/understanding-adverse-outcome-pathway-concept/36458/ 

                    How are assessments of the combined levels of exposure to multiple chemicals derived for humans from their presence in the environment?

                      Analytical methods and databases can be used to identify concentrations in air, water and soil through multimedia fate assessment. The capacity to turn these estimates of environmental fate and external concentrations of the chemicals into an assessment of the internal exposure of the target species depends on knowledge of the receptor. A complicating aspect of ecological risk assessment is that the composition of a chemical mixture in the environment is expected to vary over time.

                      To translate external exposure into an internal dose, PBPK models can be developed using whatever data are available. The complexity of the exposure models will vary according to both the physiology and the data available. Initial insights from PBPK models on internal exposure levels can be validated using biomarkers, although the application of these tools, and the value given to such measurements, differ between humans and ecological species.
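
                      As a highly simplified illustration of this translation step (a one-compartment toxicokinetic sketch rather than a full PBPK model; all parameter values are hypothetical):

                          # Highly simplified one-compartment toxicokinetic sketch (not a full PBPK model)
                          # illustrating how an external daily intake can be translated into an internal
                          # body concentration at steady state. All parameter values are hypothetical.
                          import math

                          daily_intake = 0.02              # external exposure, mg per kg body weight per day
                          absorbed_fraction = 0.8          # fraction of the intake actually absorbed
                          volume_of_distribution = 5.0     # apparent volume of distribution, L per kg body weight
                          elimination_half_life_days = 3.0

                          k_elim = math.log(2) / elimination_half_life_days   # first-order elimination rate, 1/day

                          # Steady state: absorbed intake rate is balanced by first-order elimination
                          body_burden_ss = daily_intake * absorbed_fraction / k_elim    # mg per kg body weight
                          internal_conc_ss = body_burden_ss / volume_of_distribution    # mg/L

                          print(f"Steady-state body burden: {body_burden_ss:.3f} mg/kg bw")
                          print(f"Steady-state internal concentration: {internal_conc_ss:.4f} mg/L")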

                      The models can identify key routes of uptake and biomarkers in humans and ecological species. Even when data exist, the ease with which these data (e.g. on traits such as diet or home range) can be extrapolated depends greatly on the standard of database curation. Since pesticides are evaluated routinely in both human and ecological risk assessment, they can provide excellent case studies for framework development. For example, the effect seen in an efficacy test for a pesticide, in which the whole formulation consisting of the active substance and co-formulants is tested, could provide information on the internal dose achieved in the target organism, and the result could be extrapolated to other species.

                      What are the main remaining data gaps and research needs?

                        More information on the modes of action and on the interactions (cross-talk) between modes of action of different chemicals (also with respect to adverse outcome pathways) is required in order to increase confidence in their hazard characterisation. Also, more dose-response data for single substances and mixtures are required.

                        One EFSA discussion group highlighted the paucity of information on combined exposure as a key data gap. Actual (external and internal) co-exposure to multiple chemicals including magnitude of exposures needs to be defined. Bio-monitoring was regarded as giving some indications.

                        What are the main conclusions on the "state of the art" in evaluating the risks of exposure to chemical mixtures?

                          Overall, identifying the chemical substances that are drivers of mixture effects still remains a challenge. While the complexity of mixture toxicity assessment has always presented a challenge, there have nonetheless been significant advances in the field. The interplay between initial biochemical interactions and the downstream nature of the physiological (and thus potentially toxic) response has been highlighted in a number of ways. Systems biology studies have revealed different modes of action of a single chemical in different taxa, depending on the physiology of the species.

                          Although the issues and challenges related to exposure to chemical mixtures are now widely recognised, current regulatory policies still generally remain focused on the assessment of each individual chemical in commercial mixtures. However, mixture approaches appear more and more in prospective regulation and in retrospective effect-assessment studies, although the mandate for their inclusion is clear only in some regulatory contexts.

                          At the same time, to solve the key issues around the development of models for hazard and exposure assessment for mixtures, the existing experimental tools and statistical and risk assessment methods can be used to provide a prediction of joint effects.

                          In ecotoxicological risk assessment for example, many studies have established the validity of the concentration addition model for similarly acting chemicals with a narcotic effect, ranging from binary mixtures to complex mixtures of 50 or more chemicals and to mixtures in complex products such as oil and petroleum products.

                          In this context, the major gaps in knowledge and challenges that remain to be addressed include:

                          • A common basis for the assessment of the mode of action;
                          • Information on the interactions between these modes of action of different chemicals, also with respect to adverse outcome pathways;
                          • The availability of dose-response data for single substances and mixtures in databases;
                          • Separating statistical uncertainty from variability, which are both inherent within any risk assessment.

                          Ultimately, risk assessments may also need to be supported by biomonitoring data, which should capture the specificity of responses to toxic mixtures.

                          Pragmatically, some probabilistic methods previously mentioned will also be valuable for prospective assessment to incorporate the prediction of joint effects and the associated uncertainty within a single assessment.

