Committee report November 4-5, 2014 - Chemicals Management Plan Science Committee
The Science Committee (SC) was requested to prepare a response to the following charge question:
- Health Canada and Environment Canada are seeking input on best practices for deriving a sufficient rationale for read-across within the context of risk assessments conducted under the Chemicals Management Plan (CMP). The departments are also seeking input on challenges with respect to the approach and considerations on how best to address them. The SC is requested to consider the challenges faced by the departments in using "read-across" to address data needs pertaining to deriving a sufficient rationale for regulatory decision making, including:
- Uncertainties with respect to identification and choice of appropriate analogues;
- Impact of missing information or deficiencies in empirical data on source chemical(s);
- Mechanism/Mode of Action supporting the similarity rationale;
- Novel testing methods to support the similarity hypothesis; and
- Reading across the absence of toxicity.
Note: the order and wording of these five elements have been modified by the SC.
The SC offers the following responses for consideration by the departments.
Background
Read-across is a data-gap-filling technique within analogue/category approaches and is conducted when limited measured data are available to support evaluation of toxicity for the endpoint(s) of concern for the target substance(s). Read-across methodology draws on data for source chemicals to address a number of questions, which may be stated as follows:
- Is the substance under consideration (target) a member of an analogue/category that is already established?
- Can a chemical grouping be formed?
- Is the category toxicologically relevant for the endpoint(s) of concern?
- Are source chemical(s) with relevant and good quality data available?
- Is the uncertainty associated with the read-across acceptable for the decision context under consideration, and can that uncertainty be expressed quantitatively?
- What additional information is needed to address the residual uncertainty associated with the proposed read-across?
Framework
The SC suggests that a framework of guiding principles for developing, evaluating and documenting read-across analyses be applied. An effective framework to facilitate application of read-across to support CMP decisions should include the following elements:
- An identification of the decision context for the specific exercise;
- A transparent description of the similarity rationale for the grouping approach and the associated read-across;
- Clarification of the roles of endpoint-specific and/or endpoint-nonspecific factors impacting the assessment;
- Documentation of the logic and supporting data used in developing the read-across prediction so it can be recreated by the end user; and
- A description of the uncertainties and separation of data uncertainty (confidence in the available information associated with the source chemical[s]) from toxicological uncertainty (confidence in the prediction of hazard information for the target chemical).
The read-across assessment should be fit-for-purpose to support a specific decision. As such, the SC was aware of the balance that needs to be struck between having a framework in place to provide for consistency and transparency in the use of methods and allowing sufficient flexibility to enable practitioners to derive "best practices" as the use of read-across continues to evolve. These elements are discussed below and considered in the context of the three case studies presented to the SC.
Decision Context for the Use of Read-Across
What is the "decision context"?
Read-across analyses are being used increasingly under the CMP to determine the Point-of-Departure (POD) and support risk-based decisions and management. Use of the read-across approach has implications with respect to the type of information that is needed and the framework required to support CMP decisions.
The proposed framework (1 to 5) allows for the development of guiding principles for undertaking and evaluating read-across for discrete organic compounds. An important limitation of existing frameworks, in general, is that they are not normally tailored to address substances of unknown or variable composition, complex reaction products or biological materials (UVCBs). Evaluation of read-across for these types of substances involves additional complexity in comparison to that for discrete organic chemicals. Although some UVCB groups comprise, for example, a homologous series of organic chemicals and are readily amenable to read-across, the scientific community has not reached consensus on how to perform read-across for many of these types of substances.
What is the scope of the problem? What is the decision? What are the data gaps for making this decision?
Read-across is only one of several data-gap-filling techniques. Read-across is endpoint specific. Although an overarching similarity hypothesis may provide an initial basis for grouping substances into categories, a different justification and/or different source chemical(s) may be required for each endpoint where read-across is being proposed. For example, different criteria for category membership may involve different source chemicals and/or different weights of evidence. A guiding principle in the proposed framework requires that, during the initial scoping and problem-formulation stage, the full range of data-gap-filling techniques be considered to determine whether or not read-across is the best approach.
Suggestions
Identify up front the risk context, type of information and level of certainty required for the CMP to make a decision using a read-across approach. Decisions should be made based on transparent criteria.
Depending on the scope of the problem and the data gaps, consider whether a read-across approach is merited or whether a quantitative structure-activity relationship (QSAR) approach will provide information that is fit-for-purpose.
(1) Uncertainties with respect to identification and choice of appropriate analogues; role of endpoint-specific and/or endpoint-nonspecific factors impacting the assessment.
Scope of the read-across: What is/are the hypothesis/hypotheses for similarity?
See the Organization for Economic Co-operation and Development's (OECD) Guidance on Grouping of Chemicals for details on a similarity rationale.
There are a number of potential read-across scenarios that can be used as the basis for analysis. Four established scenarios are listed in Table 1.
Table 1: Established read-across scenarios

| Scenario | General Description |
| --- | --- |
| 1 | Chemical similarity of compounds that do not require (or do not undergo) metabolism to exert a potential adverse human health effect (in other words, direct-acting toxicants with a similar mode of toxic action) |
| 2 | Chemical similarity involving metabolism and resulting in exposure to the same/similar toxicant (in other words, indirect-acting toxicants with a similar mode of toxic action based on metabolites with the same mechanism of action) |
| 3 | Similar chemicals with generally low or no toxicity (in other words, toxicants with no obvious reactive or specific Mode of Action) |
| 4 | Distinguishing chemicals in a structurally similar category with variable toxicities based on a Mode of Action hypothesis (in other words, toxicants with high structural similarity but markedly different potency) |

Data taken from the Safety Evaluation Ultimately Replacing Animal Testing (SEURAT-1) workshop: "The read-across case study for safety assessment contributing to the SEURAT-1 Proof of Concept." Joint Research Centre, Ispra, Italy, April 29-30, 2014.
Suggestions
Explicitly articulate data gaps, endpoints and plausible hypotheses for grouping substances.
Can a category be formed? How do you build similarity?
Evaluate, justify and document the basis for all major considerations (in other words, similarities in chemical structure, in chemical transformation [in both environmental and biological systems], in toxicokinetics and in bioactivity). Increasing strength in similarity is provided in Scenarios 1 to 3 of Table 1. The assessor must consider and communicate the strength (or the uncertainty) associated with each proposed analogue or category.
- Basis 1: Chemical similarity (consensus predictions of physical/chemical properties)

  A description of structural and physical-chemical property similarities (and differences) among the category members, and of how these similarities and differences are linked to the read-across hypothesis. These descriptions need to be supported by a data matrix of key structural and chemical properties (a minimal computational sketch of such a matrix follows this list). Uncertainty and variability in estimates and/or measurements of physical-chemical properties should be documented and, if there is large uncertainty, explained.
- Basis 2: Transformation similarity (absorption, distribution, metabolism and excretion [ADME]; degradation)

  A description of similarities and differences in expected toxicokinetics and in abiotic and biotic modification among the category members. Are these similarities/differences based on experimentally determined or modelled pathways? What is the uncertainty associated with the experimental or modelled information? A description of how these similarities and differences are linked to the read-across hypothesis is also needed and should be supported by a data matrix of abiotic and biotic modification properties, including a summary of metabolic pathways and metabolites. This summary should describe the information sources used in determining metabolic similarities/differences.
- Basis 3: Biological/toxicological similarity (mode [or mechanism] of action [MoA], adverse outcome pathway [AOP])

  A description of measured and predicted biological and toxicological property similarities (and differences) among the category members, including an assessment of the uncertainties associated with these measured and/or predicted properties, and a description of how these similarities and differences are linked to the read-across hypothesis. This description needs to be supported by a data matrix of biological and toxicological properties, including a summary of structurally linked toxicological trends within the category.
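To illustrate the kind of structural and physical-chemical data matrix referred to under Basis 1, the following minimal sketch (not part of the departments' workflow) shows how a structural similarity score and predicted properties might be tabulated for a hypothetical target and two candidate source analogues, assuming the open-source RDKit toolkit; the substances, SMILES strings and descriptors are illustrative placeholders only.

```python
# Illustrative sketch only: tabulate a small data matrix of structural
# similarity (Morgan fingerprint Tanimoto) and predicted physical-chemical
# properties for a hypothetical target and candidate source analogues.
# Assumes the open-source RDKit toolkit; all substances are placeholders.
from rdkit import Chem
from rdkit.Chem import AllChem, DataStructs, Descriptors

substances = {
    "target":   "CCOC(=O)c1ccccc1",   # hypothetical target (ethyl benzoate)
    "source_A": "COC(=O)c1ccccc1",    # candidate analogue (methyl benzoate)
    "source_B": "CCOC(=O)C1CCCCC1",   # candidate analogue (ethyl cyclohexanecarboxylate)
}

mols = {name: Chem.MolFromSmiles(smi) for name, smi in substances.items()}
fps = {name: AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048)
       for name, m in mols.items()}

print(f"{'substance':<12}{'Tanimoto':>10}{'MolWt':>10}{'logP':>8}")
for name, m in mols.items():
    sim = DataStructs.TanimotoSimilarity(fps["target"], fps[name])
    print(f"{name:<12}{sim:>10.2f}{Descriptors.MolWt(m):>10.1f}"
          f"{Descriptors.MolLogP(m):>8.2f}")
```

In practice, consensus predictions from multiple tools and measured values, where available, would populate such a matrix, with the uncertainty in each entry recorded alongside it.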
In summary, for each of these bases, the consistent requirement is for the assessor to clearly evaluate and subsequently communicate the basis of, or lack of, similarity for each argument and the uncertainty associated with the evidence. Examples of considerations relating to "similar properties" are provided in Table 2.
Table 2: Examples of considerations relating to "similar properties"

| Consideration | General Description |
| --- | --- |
| 1 | Structural similarity, including common chemical class and sub-class(es), similar range in carbon atom number, similar molecular scaffolding (see Bemis and Murcko, 1996) and common constituents in the form of key substituent(s), structural fragment(s) and extended structural group(s). While the latter three structural entities may appear similar in design (in other words, 2-dimensional [2D] molecular substructures), they are often sufficiently different to be considered separately. |
| 2 | Similar physico-chemical and molecular properties, especially those that are linked to key factors that affect toxicity (for example, volatility, solubility, reactivity). |
| 3 | Similar toxicokinetics. |
| 4 | Same key abiotic transformations (for example, hydrolysis, auto-oxidation). |
| 5 | Same key metabolic pathway(s) or pathway inhibition. |
| 6 | Activation to the same or similar reactive chemical species. |
| 7 | Degradation to the same or similar chemical species. |
| 8 | Similar structural alerts or toxicophores (in other words, structural fragment[s] and extended structural group[s] experimentally demonstrated to be associated with a specific toxic effect that is causally linked with the in vivo endpoint being read across). |
| 9 | Adverse Outcome Pathway (AOP)-based Molecular Initiating Event (MIE) and/or key intermediate event(s) causally linked to the in vivo endpoint that is the basis of the read-across. |
| 10 | Other data (for example, in vitro) relevant to the apical endpoint or to the in vivo endpoint that is the basis of the read-across. |
Traditional OECD Guideline toxicology studies typically do not provide the detailed biochemical information necessary to support considerations 3-7 and 9 in Table 2. Thus, these considerations must be evaluated using QSAR/expert system tools or by stand-alone, non-guideline biochemical studies. These biochemical criteria and the data underpinning their use must be explained and placed within the context of the decision. When suitable biochemical data are not available, the impact on the strength-of-association argument should be characterized.
Suggestions
Develop transparent criteria for evaluating similarity to justify use of the read-across approach. Document the rationale for a grouping approach. Clarify the roles of endpoint-specific and/or endpoint-nonspecific factors impacting the assessment.
"Is the endpoint toxicologically relevant?"
The justification for similarity (in other words, why the assessor expects substances to be similar for each endpoint being addressed) is a highly complicated exercise and must be carried out separately for each endpoint evaluated. This may involve the identification of an overarching hypothesis, elucidation of the most sensitive/relevant endpoints and consideration of the extent to which the overarching hypothesis holds for all relevant endpoints that need to be assessed. Different endpoints will often require the use of different source chemicals, and in this case, each relationship and associated read-across hypothesis must be separately reasoned. Links from lower tier tests to the interpretation of potential for higher tier effects should be drawn explicitly. The weight of evidence should be described.
Suggestions
Evaluate analogues on the basis of both general and endpoint-specific considerations.
Consideration of the departments' three case studies
Are there sufficient data to fill gaps (read-across) to answer questions (scenarios and endpoints)?
- Case Study 1 (3,3'-dimethoxybenzidine [DMOB] direct dyes): Yes, there is a "short bridge" between sources and targets.
- Case Study 2 (SDPEs): Probably, but requires one additional level of support for a high-confidence read-across.
- Case Study 3 (Dechlorane versus other organochlorine substances): No, not a read-across as presented because some of the properties of the molecules are substantially different. The SC recognizes that this is a difficult case. The substance is outside the domain of existing models (superhydrophobic, etc.). As read-across may not be appropriate, is there additional information that can be useful in the risk assessment? Are there substances with similar patterns of exposure (for example, polybrominated biphenyls)? Alternatively, this could be an instance when the government may be justified in requesting industry to provide toxicity data.
Strategies to address the challenges presented in the cases
- Emphasize decision context fit-for-purpose: decisions based on questions and interpretation of results, as per Wu et al. (2010) decision tree (clearly qualify results).
- Identify/develop bridging data (for example, Case Study 2); consider use of high-throughput (HTP) data streams when available (for example, based on analysis of the responses of cell lines).
- Where read-across cannot be applied because of limitations associated with current approaches and information (for example, Case Study 3), the following approaches may be used to build strength statistically, to identify analogues and to support a similarity argument:
- Computational chemistry and chemical features to support similarity (available in the OECD QSAR Toolbox).
- Transformation prediction simulation to support similarity.
- Emerging data streams to inform and support similarity based on bioactivity and MoA (See Environmental Protection Agency Tox21).
- Targeted testing to provide the minimal data required to support similarity.
- Transparently describe the rationale for the conclusion of the risk assessment.
- Benchmarking and evaluation of read-across analyses.
Suggestions
Develop a systematic presentation to build the case. Provide interpretation of data across analogues, across endpoints and to support weight of evidence. Note where a read-across approach is not appropriate because the bridge is too long.
(2) Impact of missing information or deficiencies in empirical data on source chemical(s)
Ideally, data gaps should be filled without recourse to "uncertain" or "poor" analogues (for example, Case Study 3). Such circumstances might arise if there are good empirical data showing toxicity on one endpoint for all target substances at levels likely to be found in the Canadian environment: if so, the substance is toxic under the Canadian Environmental Protection Act, 1999 (CEPA 1999) and data gaps for other endpoints need not be filled through analogues. Alternatively, it may be possible to fill (some) data gaps through QSAR, without read-across/analogues. The uncertainty associated with the QSAR prediction needs to be documented and reviewed to determine if it is reasonable given the decision (in other words, fit-for-purpose).
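As an aside on documenting QSAR uncertainty, the following minimal sketch (with entirely hypothetical training values, not CMP data) shows one simple way a QSAR prediction can be reported together with an approximate prediction interval rather than as a bare point estimate.

```python
# Illustrative sketch only: a toy one-descriptor QSAR (log LC50 regressed on
# log Kow) whose gap-filled prediction is reported with an approximate 95%
# interval so that the uncertainty accompanies the value. All numbers are
# hypothetical placeholders.
import numpy as np

log_kow  = np.array([1.5, 2.0, 2.8, 3.5, 4.1, 4.9])        # hypothetical training descriptors
log_lc50 = np.array([-0.4, -0.9, -1.6, -2.3, -2.8, -3.6])   # hypothetical measured responses

slope, intercept = np.polyfit(log_kow, log_lc50, 1)         # ordinary least-squares fit
residuals = log_lc50 - (slope * log_kow + intercept)
resid_se = residuals.std(ddof=2)                            # residual standard error

x_target = 3.0                                              # hypothetical target log Kow
prediction = slope * x_target + intercept
# Rough 95% prediction interval (ignores leverage for brevity)
print(f"predicted log LC50 = {prediction:.2f} +/- {1.96 * resid_se:.2f}")
```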
Given the requirement to identify analogues for data-poor substances in order to use read-across, it is necessary to generate (based on structural data) a hypothesis about the relevant endpoints. Do the structural data imply one or more endpoints in particular? Is there a need to consider one or many analogue groups? If more than one, can they be prioritized with respect to the severity and likelihood of effect?
Having made an initial choice of analogue group/category, this choice must be validated. For example, do the substances in the group include critical functional groups for a particular endpoint? Similarity tools and expert judgment should be of value. Within any category group, how should outliers be treated? These should not simply be excluded, as they may be relevant to the target chemical(s) (or to the mechanism[s] of toxicity). Rather, an analysis of outliers may provide insight into the assessment. Validation of categories may need to be resolved by input from experts.
To bolster confidence and consistency, a framework to characterize valid category choices should be developed/adapted (for example, based on Figure 1 in Wu et al., 2010).
There is also the question of how to characterize the degree of uncertainty introduced by the choice of the analogues (as discussed in Table 1 of Blackburn and Stuard, 2014). This raises the issue of how to treat high uncertainty in the choice of analogues/grouping. When very high uncertainty in the choice of analogue(s) is recognized, several alternatives can be envisaged. It may be difficult to conclude that the substance is CEPA Toxic in the absence of pertinent experimental testing. Even if the absence of information encourages a conclusion under CEPA of Not Toxic, under some circumstances it may be appropriate to:
- Request/demand additional data before drawing a conclusion,
- Come to a provisional decision while encouraging further data be generated, or
- Apply the precautionary principle and conclude that the substance is CEPA Toxic.
Suggestions
The departments must consistently capture uncertainty and communicate its implications for the interpretation of results. The CEPA regulatory framework allows for more options than are available based on purely "scientific" considerations (in other words, conclusions that can be drawn based on the available scientific data). Information that can supplement limited data and/or be used instead of missing information should be identified. If necessary and feasible, actions should be taken to promote the generation of such data. How a decision would change if uncertainties were reduced (the value of additional information) should be discussed.
(3) Mechanism/Mode of Action supporting similarity rationale, and (4) Novel testing methods to support similarity hypothesis
Classical toxicology does not require MoA data, but the lack of a transferable MoA introduces uncertainty into read-across. There are circumstances when MoA is required for a decision using read-across; for example, MoA is relevant when exposures are similar to those at which effects are observed for the source chemicals.
While many existing substances have been evaluated without MoA or AOP information, this information can strengthen the read-across. It is important to start with the toxicity endpoint most relevant to the source/anchor chemical of interest. The available screening methods proposed for this toxicologically relevant endpoint under various AOPs should be considered, and given the rapid development of novel testing methods in this area, new procedures should be incorporated once they have been validated. For example, once the relevant AOPs have been identified, it may be possible to use high-throughput screening (HTS) methods or other novel techniques to detect changes in critical events related to the AOP. HTS data can be used to build bridges between the source chemical and analogues. HTS approaches can also be used to investigate ADME/exposure. It will be important to determine when "lab data" are needed, as opposed to modelling or software approaches.
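By way of illustration of how HTS results could "bridge" a target to candidate analogues, the sketch below compares hypothetical assay hit-call profiles using a simple Jaccard similarity; the assay names and hit-calls are invented for illustration and are not drawn from Tox21 or any departmental dataset.

```python
# Illustrative sketch only: compare hypothetical HTS hit-call profiles between
# a target chemical and candidate source analogues using Jaccard similarity of
# the sets of active assays.
def jaccard(a: set, b: set) -> float:
    """Fraction of assays active for either chemical that are active for both."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

target_hits = {"ER_agonist", "AhR_activation", "CYP1A1_induction"}  # hypothetical
analogue_hits = {
    "source_A": {"ER_agonist", "AhR_activation", "CYP1A1_induction", "PPARg"},
    "source_B": {"AR_antagonist", "PPARg"},
}

for name, hits in analogue_hits.items():
    print(f"{name}: bioactivity-profile similarity to target = "
          f"{jaccard(target_hits, hits):.2f}")
```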
The committee is cognizant that while mechanism of action data are desirable, there are many instances when there is no information available on toxicity endpoint–specific targets; indeed, multiple chemicals that are toxic as assessed by one endpoint may each have a unique target. Furthermore, the "library" of available MoA or AOPs is limited.
The generation of MoA data is likely to require considerable resources; therefore, there is a workload issue. It will be important to follow a menu-driven approach (from QSAR, to hypothesis, to MoA or AOP predictions) for priority chemicals.
Suggestions
The chemicals for which MoA data are required should be prioritized based on the degree of confidence needed for assessment (for example, their effect concentrations versus exposure and the uncertainty associated with exposure). This recommendation does not take into consideration those instances in which multiple overlapping MoAs may lead to a non-monotonic dose-response curve, such as those reported for some endocrine-disrupting chemicals.
A workable process could be composed of the following components:
- Identify the "critical" toxicity endpoint(s) from empirical data or structural features/structure-activity relationships. List the chemical substructures identified with the specific toxicity endpoint(s). Define the relationship between effect and exposure/dose.
- Determine whether the probable margin of exposure requires detailed MoA support (a rough numerical sketch follows this list).
- Develop an MoA hypothesis.
- Assess the need for additional MoA/bridging data to reduce uncertainty.
- Develop a targeted test strategy to test the MoA hypothesis; this may include consideration of structure-activity relationships, the availability of appropriate AOPs, critical targets and/or HTS data.
- Assess the weight of evidence.
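A rough numerical sketch of the margin-of-exposure screening step mentioned above is given below; the point of departure, exposure estimate and screening threshold are hypothetical placeholders rather than departmental values.

```python
# Illustrative sketch only: margin-of-exposure (MoE) screen used to decide
# whether detailed MoA support is warranted. All numbers are hypothetical.
pod_mg_per_kg_bw_day = 10.0        # point of departure read across from a source chemical
exposure_mg_per_kg_bw_day = 0.002  # estimated daily exposure for the target substance
screening_threshold = 1000         # illustrative MoE below which more MoA work is triggered

moe = pod_mg_per_kg_bw_day / exposure_mg_per_kg_bw_day
if moe >= screening_threshold:
    print(f"MoE = {moe:.0f}: wide margin; detailed MoA support may not be needed")
else:
    print(f"MoE = {moe:.0f}: narrow margin; develop an MoA hypothesis and bridging data")
```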
(5) Reading across the absence of toxicity
The Committee observed that this is a difficult issue. If a study on a source substance demonstrates no adverse effect, even at the highest exposure/dose tested, this level (in other words, the no-observed-adverse-effect level [NOAEL]) should be used as the basis for risk assessment because "real data are real data." However, the SC also recognized that there can be somewhat greater uncertainty in extrapolating from a NOAEL than from an effect level (EL) due, for example, to residual uncertainty as to whether the source study was sufficiently rigorous. Equally, however, it is not appropriate to "penalize" a sound study with a valid NOAEL relative to an equivalent study with a clear EL. At this time, the SC concluded that in the absence of an AOP and/or an MoA, some additional factor of uncertainty might need to be included. The SC expressed a wish to revisit this issue at a subsequent meeting because the subject captures a number of important scientific considerations.
Additional Considerations
The SC discussed some additional considerations with respect to the charge question that fell outside of the above points:
- It is important to note the relevance of:
- Weight of evidence (WOE), from both the quality/expert judgment and the quantity of data perspectives. Using Case Study 3 as an example, is there a potential role for WOE when the close structural analogues needed to justify read-across are not available?
- Peer review and re-review—different degrees of review may be required depending on the "amount" of (or impact of) missing data.
- The SC discussed the "exposure" driver for CEPA Toxic. Although it is not normally a read-across issue, it is a real-world issue. Because of CEPA's construction, the issue of "proving" significant exposure is relevant to all assessments under the CMP and requires consideration when determining whether read-across is needed. In some cases with low exposure/wide margins of exposure (MoE), a rapid screening approach may be appropriate to avoid a detailed read-across evaluation.
- The impact of missing information can vary from minor to major. For some assumed toxicology endpoints, read-across may support a robust decision developed by the use of a QSAR approach. For others (for example, Silicone D5), additional data can alter a previous decision under CEPA from Toxic to Not Toxic. Yet how far can a decision made for a single substance (for example, a silicone) apply to other members of its "category" when similarities in silicone chemistry and toxicology are not as well understood as those for carbon chemistry? Last, on this point, it was observed that there is an ongoing need to consider revisiting decisions in the light of additional information.
- The impact of missing information can be minimized (in other words, a Type 1 error avoided), and the pressure on read-across approaches reduced, by using "deficient" (uncertain/less suitable) empirical data on source substances together with careful use of risk quotient (RQ) or MoE results obtained during the assessment. This is contingent on "good" (adequate) exposure data.
- Assessment of the impact of the uncertainty, which may take the form of retrospective analysis in an independent assessment, may be needed.
Suggestions
The SC suggests that decisions made by the departments on the basis of read-across in the absence of toxicity should be fully transparent.
References
- Wu, Shengde, Karen Blackburn, Jack Amburgey, Joanna Jaworska and Thomas Federle. 2010. A framework for using structural, reactivity, metabolic and physicochemical similarity to evaluate the suitability of analogs for SAR-based toxicological assessments. Regulatory Toxicology and Pharmacology 56: 67-81.
- Bemis, Guy W. and Mark A. Murcko. 1996. The Properties of Known Drugs. 1. Molecular Frameworks. Journal of Medicinal Chemistry 39(15): 2887-2893.
- Respectfully submitted, Barbara Hales and Geoff Granville (Co-chairs) on behalf of the CMP Science Committee, 12 January 2015.