Risk assessment is a science and an art – there are scientifically-sound methods for evaluating data and estimating risk, and these methods are continually being improved and enhanced. There are areas of uncertainty where professional scientific judgment is necessary and competent experts may disagree. Engaging experts to assist in determining the most scientifically sound approach and answer can help strengthen the science and results.
In recent years, government agencies and authoritative organizations have increasingly used outside experts to peer review their important work products, to evaluate their scientific and technical defensibility and to judge the strength of the assumptions and conclusions (OMB, 2004; Health Canada, 2006; US EPA, 2006; IARC, 2006; IPCS, 2005). While peer review is the most common use of outside experts, other forms of peer involvement can assist in developing risk assessments and documents, often at an earlier stage. These other techniques can be referred to as peer input and peer consultation; definitions of these terms are provided below.
TERA has spent the last 10 years refining procedures for the expert peer review of risk assessment methods and documents. We have provided expert panel and letter review services to both government agencies and private parties, including industry. A number of key principles define high-quality reviews – independence, inclusion of appropriate expertise, transparency, and a robust scientific process. These principles are discussed further below.
In recent years, we have extended the principles and practices of peer review into earlier stages of work product development to use peer input and peer consultation to strengthen draft work products. The table below summarizes the three stages of risk assessment development, some of the types of peer involvement techniques that may be considered, and the issues and questions appropriate for that stage. Note that various types of peer involvement may be used at each stage. For example, peer review could be used at any stage and peer consultation may be used on an intended final document, as is the case for the Voluntary Children's Chemical Evaluation Program (VCCEP). (Table from Meek et al., 2007)
Stage: Problem Formulation, Issue Identification, and Data Gathering
Types of peer involvement:
• Meetings, informal or formal
• Expert elicitation to fill data gaps or address uncertainties
Questions and issues to be addressed:
• Is there an accepted standard approach available?
• Are there previous relevant examples to follow?
• Are there data or analytical tools to suggest?
• Do outside parties have additional data or information?
• Are there outstanding science or science policy issues that must be resolved or addressed?
• Should additional studies be conducted or data collected?
• What is the available budget and timeline?

Stage: Draft Work Product
Types of peer involvement:
• Requests for written comments or review
• Panel meetings or conference calls
• On single issues or the entire work product
Questions and issues to be addressed:
• Were all the appropriate data identified?
• Were the data interpreted correctly and presented in sufficient detail?
• Are there alternative approaches that should be considered?
• How can the work be strengthened and improved?

Stage: Final Draft Work Product
Types of peer involvement:
• Written or letter review
• Panel meetings or conference calls
• On a near-final work product
Questions and issues to be addressed:
• Focused and formal charge questions covering:
• The completeness and strength of the data presented
• The defensibility of the assumptions
• The use of appropriate analyses and methods
• The strength and defensibility of the conclusions
• The strength and scientific defensibility of the rationales provided for choices of study, effect, level, models, uncertainty factors, etc.
• More specific questions regarding key chemical- or document-specific issues
Most scientists informally discuss their work at some point with colleagues inside their organization or seek opinions and data from others outside it. More formalized approaches to peer input may be beneficial to identify issues, acquire needed or missing data, or solicit opinion on the appropriate focus. For example, the Existing Substances Division of Health Canada recently invited peer input to help guide the development of the Complex Exposure Tool (ComET). ComET is a tool developed for use in Health Canada’s program to evaluate all 23,000 substances on the Domestic Substances List. TERA organized a workshop in late 2004 to solicit input and data from risk assessment and exposure experts on the proposed structure and information base for ComET. (More information on ComET and the peer workshop is available at http://www.tera.org/peer/Exposure/ExposureWelcome.htm.)
Peer consultation is a formal or informal process to gather independent expert peer opinion and advice on a work product during its development. Peer consultation is most helpful when the document is complete enough to benefit from a review, but the analyses may still be in flux, allowing the experts’ comments to be readily considered and to influence future direction of the work. Consultation may involve evaluation of an entire work product or focus on key specific issues or analyses. The emphasis of a peer consultation is on scientific expert opinion and advice, rather than data acquisition. Peer consultations can range from informal discussions with peers within one’s organization to large formal independent panels of experts meeting in public.
Recently, peer consultation has been used to review industry-prepared assessments of children’s risks from industrial chemicals under the US EPA’s Voluntary Children’s Chemical Evaluation Program (VCCEP). TERA has organized a VCCEP peer consultation program, with public meetings of panels of expert scientists representing a broad range of perspectives, to evaluate the need for additional toxicity and exposure data for characterizing risks to children. (Learn more about VCCEP peer consultations.)
The concept of external peer review is well known to most risk assessment scientists, as they are familiar with the practice of peer review of manuscripts for publication and grant proposal submissions. TERA has developed and formalized many peer review procedures to ensure that the peer reviews it organizes are independent, include appropriate scientific experts, are scientifically robust, and are transparent with regard to process and results. The key principles of expert peer review are discussed more fully in Meek et al. (2007).
Definitions of peer input, peer consultation, and peer review (taken from Meek et al., 2007).
Peer input – soliciting information, data, or opinion from scientific peers, generally at an early stage of a work product’s development. For peer input, the emphasis is on appropriate focus, data acquisition, and identification of issues. The process may be formal or informal. The experts may be internal or external and may or may not be independent of the authors or of the subject. For example, while not “peers” per se, scientists from stakeholder groups may provide input at this early stage in specific areas.
Peer consultation – a formal or informal process to gather independent expert peer opinion and advice on a work product during its development. Peer consultation is most helpful when the document is complete enough to benefit from a review, but the analysis may still be in flux, allowing the experts’ comments to be readily considered and influence future direction. Peer consultations may be conducted on an entire work product or on specific issues or analyses. The emphasis is on scientific expert opinion and advice, rather than data acquisition.
Peer review – a formal, external, and independent review of an intended final work product. The intent of a peer review is to gain agreement from a group of external expert peers regarding a document’s conclusions and the scientific basis for those conclusions. The emphasis is on agreement among the experts on the approach and conclusions, with consensus among the experts providing additional support for, and defensibility of, the results.
TERA has identified four key principles for peer review. These principles also apply, to varying degrees, to other peer involvement activities. The four principles – independence, inclusion of appropriate expertise, scientific robustness, and transparency – are briefly described below and are discussed more fully in Meek et al. (2007).
Independence
• Independence is defined as both distance from the development of the work product and freedom from institutional or ideological conflict of interest or bias.
• Independence applies both to the organization conducting the peer involvement activity and to the experts who participate.
• Independence may be less critical at the project scoping or data/issue identification stages, when broad input from many sources is sought.
• For peer review, independence is essential.
Appropriate Expertise
• Success of a peer review hinges on involvement of the right experts – those qualified through training and experience to offer scientific opinions on the questions and issues at hand.
• For risk assessment products, it is essential to identify and involve experts from fields such as toxicology (including subdisciplines such as pathology), epidemiology, biochemistry, statistics, and modeling.
• Other expertise may be required for issues unique to each peer involvement activity.
Transparency
• Transparency is a philosophy that encourages open communication about how the activity was coordinated, as well as the basis for and nature of the important decisions made during the review process.
• Enough information should be provided so that all interested persons can evaluate and judge the adequacy and credibility of the process and results.
• Transparency is most important for peer review of a near-final product, where expert judgments are being made on the adequacy of the product; it is also good practice for other peer involvement activities.
Robust Scientific Process
• Following the other three principles contributes to a scientifically robust process.
• Peer involvement should focus on science - the robustness of the available data, the analyses, and the defensibility of the conclusions.
• Policy and implementation aspects of risk assessment should be addressed in a separate process.
• The charge to reviewers is critical – it must ask focused questions while allowing participants to raise unidentified issues.