Measuring Environmental Exposure Workshop Report
Montréal, QC
November 28-30, 2011
Table of Contents
- Executive Summary
- Keynote Presentation
- Session 1: Measurement Issues
- Session 2: Synthesizing Data on Environmental Exposure
- Panel Discussion
- Small-Group Brainstorming Session
- Keynote Presentation
- Session 3: Technological Perspectives
- Small-Group Brainstorming Session
- Keynote Presentation
- Where Do We Go from Here?
- Wrap Up
The Institute of Human Development, Child and Youth Health, of the Canadian Institutes of Health Research, in partnership with the British High Commission, hosted an international workshop on Measuring Environmental Exposure from November 28 to 30, 2011, in Montréal.
More than 60 government and academic researchers from Canada, the United Kingdom, and the United States with expertise in the measurement of environmental contaminants in the context of human health took part in the workshop. Its objectives were to synthesize expertise and identify gaps and priorities in this field of research; plan future research collaborations and partnerships across disciplines; and discuss the development of new measurement methods.
The themes of the workshop included general measurement issues, synthesizing data on environmental exposure measurement, and technological perspectives. Presenters and keynote speakers set the stage for lively discussions in plenary and small groups. The workshop concluded with discussions on future priorities and opportunities.
Priorities and Opportunities
Workshop participants identified several priority areas that needed to be addressed in order to advance the measurement of environmental exposure in human health research in Canada. They also suggested ways to foster greater collaboration, both at home and internationally, and specific subjects for future research efforts.
Capacity: Participants stressed the need to build support for Canada's work in this area by promoting the value of environmental health research and long-term studies to both policy-makers and the public and approaching non-traditional funders with converging research interests for financial support. They suggested improving Canada's capacity in toxicology and ability to analyze complex environmental data by creating a centre of expertise and focusing on ways to recruit young researchers into the field.
Harmonization: A theme that came up often over the course of the workshop was the need to harmonize study designs, methodologies, and the collection, storage, and analysis of data—and to link existing studies and data, where possible.
New Technologies: Participants agreed to the importance of promoting the creation and use of new technologies for capturing and analyzing data, including low-cost mobile sensors for measuring individual exposure, imaging technologies, geographic information systems, teleconsultation and social media, and wireless technologies.
Study Designs and Methodologies: The importance of engaging policy-makers and end-users in setting priorities and designing studies was emphasized. It was also suggested that proof-of-concept studies be launched, and that the use of low-cost, low-volume, and high-sensitivity assessments for biological samples be encouraged to reduce costs.
Data: Other priorities identified at the workshop were the need to improve the quality of the data used for modeling and validation; encourage more and better modeling at the population level; and increase the capacity for data linkage, analysis, and storage (e.g., quality assurance/quality control procedures, bioinformatics capacities).
Collaboration: Greater collaboration and sharing of information and best practices across disciplines and sectors, within Canada and internationally, was identified as paramount. Participants suggested launching collaborative research efforts to data-mine existing studies and develop a methodological infrastructure. The need to promote collaboration between public health policy-makers and researchers, epigeneticists and modelers, and researchers from multiple disciplines (e.g., through student-exchange opportunities and team and training grants) was also emphasized.
Study Areas: Participants identified the following priority areas for future research:
- Links with infectious diseases, chronic diseases, and mental health
- Vulnerable people and populations
- Individual susceptibility
- The capacity of biological organisms to accumulate/take up pollutants
- New molecules and emerging issues related to new substances
- Exposure gradients
- Biomarkers of long-term, cumulative exposures
- Exposure to chemicals with short half-lives, exposures via multiple media, and multiple-chemical mixtures
- Sources of contaminants and the mechanisms by which they enter the body
- Factors that cause epigenetic imprints and how to diminish harmful effects
- The effects of gene-environment interactions on population health
IHDCYH, the British High Commission, and other potential funding partners expressed interest in pursuing future collaborations, partnerships, and granting opportunities in this area and are maintaining contact in order to elaborate on them.
The Institute of Human Development, Child and Youth Health (IHDCYH), of the Canadian Institutes of Health Research (CIHR), hosted an international workshop on Measuring Environmental Exposure from November 28 to 30, 2011, in Montréal. The event was organized in collaboration with the British High Commission in Ottawa.
More than 60 government and academic researchers with expertise in the measurement of environmental contaminants in the context of human health took part. Although most participants were from Canada or the United Kingdom, scientists from the United States were also in attendance.
The workshop had five main objectives:
- To synthesize research expertise in the measurement of exposure to environmental contaminants in human populations, which is currently scattered both geographically and across disciplines.
- To identify gaps and priorities in the current research landscape.
- To discuss areas for future research and synergies across research disciplines.
- To provide a forum in which to discuss the development of new measurement methods.
- To network and plan future collaborations and partnerships.
The workshop agenda was structured around three sessions: general measurement issues, synthesizing data on environmental exposure measurement, and technological perspectives. Each session opened with three or four expert presentations that set the stage for a question-and-answer period and a small-group brainstorming session, which engaged participants in creative thinking around future challenges, priorities, and opportunities.
Keynote speakers provided valuable insights on issues related to the importance of environmental exposure to human health, exposure assessment, and lessons learned from longitudinal studies.
This report is a synthesis of outcomes from the small group work and plenary discussions that took place on research priorities, opportunities for collaboration, and next steps in measuring exposure to environmental contaminants. It will be used by IHDCYH to help identify potential future partnerships and granting opportunities.
Effort has been made to standardize the format of written input transcribed from electronic and hard-copy worksheets from participants in order to improve the flow and readability of this document. Original wording has, however, been retained to the fullest extent possible in order to ensure that intended meaning has not been affected.
Brief summaries of main points from each presentation are included in this document, with the complete PowerPoint slide presentations available through IHDCYH. The key questions posed and comments made during the plenary discussions that followed each of the presentations and small-group sessions are provided in Appendix III, Key Discussion Points.
In his welcoming remarks, Dr. Michael Kramer, Scientific Director of IHDCYH, thanked the British Consulate-General in Montréal for initiating contact with the Institute about possible collaborations around environmental exposure and human health. The workshop, he said, was an important opportunity for Canada and the UK to address two underlying questions: how to measure environmental exposure in a cost-effective and scientifically rigorous manner; and what must happen to make this possible.
Dr. Kramer explained that existing and emerging threats to health were a strategic priority of the CIHR, and that IHDCYH itself had already funded studies on the impact of indoor air on genetic factors and the effects of endocrine disruptors on reproductive health. He noted that the presence of other CIHR Institutes at the workshop—including Cancer Research; Gender and Health; Genetics; Infection and Immunity; Nutrition, Metabolism and Diabetes; and Population and Public Health—was evidence of the CIHR's interest in this area of research. He thanked everyone involved in organizing and executing the workshop, which he hoped would be a prelude to future research collaborations.
Patrick Holdich, the British Consul General, welcomed participants on behalf of the British government and, in particular, the UK Science and Innovation Network—a global network of nearly 100 science specialists that is aimed at fostering collaboration among countries. He commented that the workshop's focus on health science and environmental research overlapped with two of the network's priorities, underlining the value and importance of cross-cutting and collaborative work in these fields.
Mr. Holdich said that both he and his colleagues at the British High Commission also wanted this workshop to be part of the process that would help fulfill the objectives of a joint declaration, signed by the Prime Ministers of Canada and Britain in Ottawa two months earlier, recognizing and encouraging joint work—particularly in the field of health research. He expressed his gratitude for the commitment and enthusiasm of Dr. Kramer and his team, as well as the involvement of his UK colleagues.
The Assessment of Exposures to Environmental Contaminants: The State of the Art and Research Needs
- Martin Williams, Science Policy Environmental Research Group, King's College, London
- Fixed-site monitors (FSMs) are the most common way of measuring outdoor exposure; however, the challenge is that people move from place to place.
- Longitudinally, there is good correlation between FSMs and individual exposure (measured using portable monitors).
- Despite the attraction of measurements from personal exposures, FSMs can offer very sophisticated, detailed measurements—especially of atmospheric chemistry and particulate composition.
- Having particulate matter (PM) measurement components in personal monitors is a goal.
- Measurement challenges with portable units include costs and battery limitations.
- Quantifying exposure is not possible by measurement alone; models are also needed.
- There are challenges to studying the impacts of PM:
- What are the harmful constituents of the particle mix?
- What are the main sources?
- How do people get exposed to them?
- What policy interventions would improve things?
- The question of time averaging is significant, because health outcomes can relate to different exposure times. Never quote a concentration without specifying the averaging time.
- The concept of moving to smaller spatial scales needs to be pursued for modeling, as does the use of hybrid space-time models of real personal exposures.
- Goals include better time-response, precision, accuracy, and specificity of personal analyzers.
- Ultimately, better exposure assessment means better control policies.
Session 1: General Measurement Issues
Chair: Donna Mergler, Interdisciplinary Research Centre for Biology, Health, Environment and Society, Université du Québec à Montréal
Environmental Sources of Exposure
- Michael Brauer, School of Population and Public Health and Department of Medicine, University of British Columbia
- We regulate/control/monitor environmental concentrations but not exposure.
- Environmental measurements are necessary but insufficient. The behavioural component must also be characterized.
- Biomarkers are important; however, interpretation requires environmental measurements.
- Improved exposure assessment pays dividends.
- Understanding determinants of exposure can lead to reduced exposure and risk.
- Future directions:
- Emphasis on personal monitoring
- Better understanding of exposure, which lags behind understanding of genetic influences on disease
- Links with genetic information and biomarkers to inform individual risk
Measuring and Interpreting Exposure Biomarkers
- Jay Van Oostdam, Epidemiological Advisor, Chemicals Surveillance Bureau, Health Canada
- The Canadian Health Measures Survey (CHMS) is nationally representative.
- There are no methodological limits, because methods are changing rapidly.
- Because methodologies are changing, we need to be certain that data are comparable both now and in the future.
- Quality assurance/quality control (QA/QC) is essential. We need to think about inter-laboratory round robins as one way of doing this.
- Interpretation is important.
Timing and Frequency of Exposure Estimation in Longitudinal Studies
- Kees de Hoogh, Imperial College, London
- ESCAPE (European Study of Cohorts for Air Pollution Effects) is investigating effects on human health of long-term exposure to air pollution in Europe:
- Nitrogen oxides and fine particles were measured at different locations across Europe, and land use regression (LUR) was used to model pollutant concentrations at a fine spatial scale.
- Back-extrapolation is used to predict pollutant levels back in time.
- Evidence suggests the LUR model is effective at predicting small-scale variations.
- Comparisons are carried out between LUR and dispersion models in some of the ESCAPE study areas in order to gain a better understanding of the differences.
- A common European model at a 100 m resolution is being developed using accurate road data and satellite data.
- The mast study was a register-based matched case-control study of proximity to mobile phone base stations and cancer incidence among children ages 0-4 years in Great Britain, 1999-2001:
- Three exposure metrics were estimated for each birth address: distance (m) to the nearest mobile phone base station; total power output (kW) summed across all base stations within 700 m; and modelled power density (mW/m²) computed at each address for base stations within 1400 m, using a three-dimensional propagation model.
- The model was calibrated and validated using field measurements of power density around base stations representing urban, suburban, and rural areas.
- No association was found between the risk of childhood cancers and mobile phone base station exposures during pregnancy.
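The land use regression approach summarized above can be illustrated with a minimal sketch: pollutant concentrations measured at monitoring sites are regressed on land-use covariates, and the fitted model then predicts concentrations at unmonitored addresses. The covariates, coefficients, and concentrations below are hypothetical toy values (constructed from a known linear relationship so the fit is exact), not data from the ESCAPE study.

```python
import numpy as np

def fit_lur(predictors, concentrations):
    """Fit a land use regression (LUR) model by ordinary least squares.

    predictors: (n_sites, n_features) array of land-use covariates
                (e.g., major-road length near the site, population density).
    concentrations: (n_sites,) measured pollutant levels at the sites.
    Returns (intercept, coefficient vector).
    """
    X = np.column_stack([np.ones(len(concentrations)), predictors])
    beta, *_ = np.linalg.lstsq(X, concentrations, rcond=None)
    return beta[0], beta[1:]

def predict_lur(intercept, coefs, predictors):
    """Predict pollutant concentration at an unmonitored location."""
    return intercept + np.asarray(predictors) @ coefs

# Toy data: NO2 (ug/m3) at 6 monitoring sites, two hypothetical covariates
# (road length in km within 100 m; population density proxy).
covs = np.array([[1.2, 300.0],
                 [0.4, 120.0],
                 [2.0, 500.0],
                 [0.9, 250.0],
                 [1.5, 400.0],
                 [0.2, 80.0]])
no2 = np.array([32.0, 15.0, 50.0, 26.5, 40.0, 11.0])

b0, b = fit_lur(covs, no2)
est = predict_lur(b0, b, [1.0, 280.0])  # a new, unmonitored address -> 29.0
```

In practice, LUR work of the kind described in ESCAPE typically adds supervised covariate selection and cross-validation; this sketch shows only the core fit-and-predict step.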
The three presenters from this session took part in a panel discussion in which they provided their thoughts on the limits of using various measurement methods to look at the connection between environmental exposure and health outcomes. They offered the following comments:
- Air-pollution monitoring is limited when it comes to the effects of very short duration exposures (e.g., a blast of diesel exhaust at a bus stop).
- Existing methods are effective at monitoring exposure to persistent chemicals but not to those that remain in the body for a shorter period of time.
- The same source can influence multiple exposures via multiple pathways. Although some factors overlap (e.g., traffic-related air pollution, noise) for the same outcomes, some also have independent health effects.
- Back-casting over the long term is difficult because of changes that have occurred over time (e.g., traffic flow, composition, roads), which causes a loss in confidence.
- Residential history is important.
- Satellite technology is not capable of providing ground-level exposure measurements for outdoor pollution concentrations in near real-time. The best use of satellite estimates is to couple them with chemical transfer models.
- Daily measurements aren't realistic for satellites because data are only collectible on sunny days. Dispersion models, however, can create daily or hourly maps of air pollution in real time.
- Variability among laboratories is a concern: we need to be more critical about the data we receive.
- We need to look more closely at the relative importance of behavioral changes in reducing exposure.
- Spatial methods work well for outdoor sources and air pollutants and are being explored for other potential applications (e.g., noise, electric and magnetic fields, chlorination byproducts in water).
- We need to develop better methods of looking at the totality of sources (e.g., by coupling approaches).
- Multi-media exposure assessments are important. We need to know what new chemicals are out there and which ones to focus on.
- There is no consensus in the research community about which biomarkers or biosignatures can be trusted and which are problematic. There are still many environmental exposures that cannot be captured using these methods.
- The problem with using cell phones to collect data is not the potential of the technology but the incentive for people to use it.
- While biomarkers of effect might yield some information, source-relevant information is also critical.
Session 2: Synthesizing Data on Environmental Exposure Measurement
Chair: Carol Dezateux, Director of the MRC Centre of Epidemiology for Child Health, Institute of Child Health, University College, London
Combining Data from Environmental Sources and Biomarkers
- Bruce Lanphear, Professor of Children's Environmental Health, Simon Fraser University
- Biomarkers can
- reduce exposure misclassification and the sample size needed to conduct research,
- enhance measures of biologically effective dose,
- increase the estimated strength of association, and
- allow effects to be examined at lower levels and, if indicated, prevention efforts or therapy to be extended.
- Limits of biomarkers:
- Few are fully validated.
- They require an in-person (clinic) visit.
- Cost often limits the size of the study.
- They are only an approximation of biologically effective dose.
- Emerging contaminants exhibit high variability.
Refining Exposure Assessment of POPs with Physiologically Based Pharmacokinetic Modeling
- Sami Haddad, Associate Professor, Department of Occupational and Environmental Health, Université de Montréal
- Physiologically-based pharmacokinetic (PBPK) modeling can provide additional information on internal exposure (e.g., different times in life, target tissues, toxic moiety of parent compound or toxic metabolite).
- Combining PBPK simulations and epidemiological statistical analysis
- increases sensitivity for finding exposure-effect associations,
- allows the determination of dose-response relationships as a function of age (e.g., type of threshold: AUC vs. Cmax), and
- allows the identification of periods of increased susceptibility (e.g., facilitates mechanistic understanding of the onset of adverse effects).
- This approach is applicable to other chemicals; however, it can be more challenging for those that are short-lived.
- Longitudinal data are needed for model validation in adults.
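A full PBPK model links multiple physiological compartments; as a purely illustrative stand-in, the one-compartment sketch below (with hypothetical intake and elimination rates, not taken from the presentation) shows how a simulated concentration-time profile yields the AUC and Cmax metrics referred to above.

```python
def simulate_exposure(dose_rate, k_elim, t_end, dt=0.01):
    """Minimal one-compartment kinetic model (a stand-in for full PBPK):
    dC/dt = dose_rate - k_elim * C, integrated with forward Euler.

    Returns (concentration trajectory, AUC, Cmax), so that
    area-under-the-curve and peak-concentration exposure metrics
    can be compared, as in the AUC vs. Cmax threshold choice.
    """
    c = 0.0       # internal concentration
    auc = 0.0     # running area under the concentration-time curve
    cmax = 0.0    # running peak concentration
    concs = []
    for _ in range(int(t_end / dt)):
        c += (dose_rate - k_elim * c) * dt  # Euler step
        auc += c * dt                       # accumulate AUC
        cmax = max(cmax, c)                 # track the peak
        concs.append(c)
    return concs, auc, cmax

# Hypothetical constant intake at rate 1.0, elimination rate 0.5:
# the concentration approaches the steady state dose_rate / k_elim = 2.0.
concs, auc, cmax = simulate_exposure(dose_rate=1.0, k_elim=0.5, t_end=20.0)
```

Two exposure histories with the same AUC can have very different Cmax values (and vice versa), which is why the choice of metric matters when relating internal dose to effect.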
Harmonizing Health Outcomes and Risk Factors Data Across Studies
- Isabel Fortier, Director of Research and Development, P3G Consortium
- Harmonization and integration processes are required to permit valid comparison or integration of data across studies or databases.
- Success of harmonization efforts depends on
- the quality of the study-specific data,
- the access to data collected, and
- the rigour of the harmonization and integration processes.
- Harmonization raises particular ethical-legal, scientific, and technical challenges.
- Efforts should be pursued to develop standards, methods, and resources to support harmonization programs.
The presenters assembled for a panel discussion on issues related to data synthesis. The session chair noted that the heterogeneity of various studies and study designs, while problematic, could improve the evidence base concerning environmental exposures and their human health effects. She suggested that thought also be given to ways in which current studies could add value to past ones. Key comments from the discussion include the following:
- Some research questions require detailed answers and others simply a “no.” It is important to allow researchers to develop their own standards and not to impose common measures on the cohorts of the world.
- Using an average value for a population can represent a significant range of variability (e.g., mercury in hair). It is important to understand what causes that variability and model it.
- Facilitating the exchange of information among cohorts is important, so cohort developers can raise standards at the outset rather than settle for the lowest common denominator later on. Useful tools include the P3G catalogue and the PhenX Toolkit.
- A number of international collaborations are bringing together cohorts to look at specific outcomes (e.g., the UK data archive pulls together information from different studies using an international metadata method; the World Health Organization has an initiative to harmonize children's cohorts that are involved in studies related to the environment).
- The harmonization of outcome measures is a challenge.
- Data linkages between the census, mortality, health care utilization, etc. offer much potential for study. Ongoing census collection should include core health questions that would be useful for assessing exposures.
- Harmonization is needed at the provincial registry level.
- Canada has failed to keep up with epidemiologic trends and needs to think more broadly in terms of surveillance.
- Adult studies are important, but research in children studies the issues of tomorrow.
- Linking longitudinal data to serial cross-sectional surveys that incorporate health measures can be both valuable and cost-effective.
Small-Group Brainstorming Session
Moderator: Hans Schleibinger, National Research Council of Canada
Participants self-divided into small groups, each of which addressed challenges in one of four areas: the study of environmental sources of exposure; measuring biomarkers; epidemiology/biostatistics; and sex and gender. The results of their discussions are summarized as follows:
1. Challenges in Studying Environmental Sources of Exposure
- How do we deal with very acute exposures that are likely to have a lot of heterogeneity in time and space? Is this a concern and how could it be assessed on an ongoing basis?
- How do we develop retrospective exposure assessments (e.g., cancer and Parkinson's have a long latency and we don't have the capacity to address this issue)? One way may be through ecological records (e.g., lichens or diatoms). In the Amazon, researchers have gone back in time by looking at mercury in sediments.
- On the human side, there may be quite a lot of other information available on changes in traffic patterns, land use, consumer products, etc. that hasn't been well archived.
- Many studies on air-pollution exposure have not been shared, and there is no mechanism for sharing exposure information taken on a person or at a house because of ethical issues. We lack an institutional infrastructure to address this.
- We need to leave a legacy for the future, so people 20 years from now won't be facing the same problems with regard to retrospective exposures. We need to have complete residential histories as a first step.
- The indoor environment versus the outdoor environment is also an issue. It is much more difficult to examine indoor environments on a population level because there is a lot of heterogeneity in indoor behavior. One strategy for tackling this issue would be to use property-tax data from land parcel records (e.g., information on types of heating used).
- There are considerable opportunities to get more information on people's purchasing habits from retailers who collect that information through shopping cards. In the US, some have been forced to provide their records to public health researchers to better understand dietary behaviors—but this could also extend to other consumer products.
- There are lots of opportunities to use cell phone-based and other sensors that people carry routinely to better understand the impacts of the timing of activities and travel patterns. Most major municipalities collect travel origin data, but they are not shared with health researchers.
2. Challenges in Measuring Biomarkers
- Life-stage considerations are a challenge.
- We need to develop non-invasive techniques/methods of bio-monitoring that are more applicable to large-scale studies (e.g., hair, dried blood spots, teeth). These offer some advantages for sample storage, as well.
- In terms of which aspects of exposure are characterized by data collection protocol, spikes of exposure may be more important than long-term averages.
- Large volumes of material are needed to measure levels of contaminants in bio-specimens; however, bio-banks are expensive, and it is a challenge to prioritize analyses with a limited supply of specimen material.
- There are contamination issues related to the process of collecting specimens (and in the laboratory), where they are exposed to the environment.
- The possibility of using adduct formation to assess exposure to certain compounds should be further explored.
- We need a series of samples of variation in levels.
- Cohort studies of environmental exposure are scientifically challenging and expensive.
- This workshop should recommend the development of standard operating procedures for the collection of certain types of specimens.
- The general trend has been for detection limits to decrease over time for certain assays (e.g., cotinine detection levels have had to be lowered).
- How important is it to use the same labs for all specimens? Inter-laboratory studies to ensure comparability are important for collaboration.
- Ethical issues can arise when shipping specimens out of the country.
- Short half-life specimens are a problem.
- The liability of studying vulnerable groups is that some “high-exposure” groups have a wide exposure range.
- Effect measures tend to be non-specific, but some pathways have system specificity (e.g., cardiovascular).
- There is a tendency to do large cohort studies, but we haven't spent a lot of effort figuring out how best to allocate resources and design studies. We need to find a way to give experts the incentive to answer the design questions.
- When we are talking about biomarkers, we are talking not only about chemical measurements but also about proteomics and genomics, so the whole issue of data quality assurance/quality control has to be set out and documented from the start.
3. Epidemiologic/Biostatistical Challenges
- Large cohort studies can lead to the investigation of multiple hypotheses, but we need to consider ways to avoid false positives.
- Large, general-purpose cohort studies may not have sufficient information to control for important confounders of specific exposure-outcome associations.
- We need to consider the frequency and timing of exposure at the study design stage.
- We should link timing of exposure measurement to windows of susceptibility (possibly defined by life stage, including the pre-natal period).
- For chronic diseases, the critical exposure time window may be in the distal past (challenge is to assess exposure retrospectively, or plan appropriate long term studies).
- We need exposures in different periods and of different durations to understand temporal patterns of exposure and risk.
- Understanding biological mechanisms can help elucidate critical exposure-time windows.
- Intervention studies can be useful in understanding the health effects of different exposures by providing evidence of causation rather than just association.
- Because it is not feasible to take repeated biological samples from the same subject, it may be possible to combine data from different studies in which biological samples have been taken at different times.
- We need to be as broad as possible in selecting exposure metrics, to allow for the assessment of new hypotheses that may only emerge in the future.
- Although broad-based multi-purpose prospective cohort studies can be extremely valuable, obtaining the significant resources needed to conduct them can be challenging (can we leverage data from other cohorts, including those outside Canada?).
- We need to consider temporal relationships between exposure and outcome and ensure that adequate monitoring is in place to ensure meaningful results five to ten years from now.
- Academic investigators cannot wait for decades to get their first publication from a prospective cohort study.
- Because we are generally looking at small associations with multi-factorial health outcomes, it may be difficult to tease out the environmental contribution, which may vary from population to population.
- We need sensitive indicators of exposure to investigate subtle health effects.
- Interactions (environment-environment as well as gene-environment) must be considered.
- It is generally quite difficult to get good measures of environmental exposures.
- Biomarkers (direct measurement of contaminants in cord blood and amniotic fluid) may provide accurate indicators of exposure; however, procedures like amniocentesis are done quite selectively.
- Pharmacokinetic models may also be exploited to get more accurate measures of tissue dose.
- Bio-banks might prove useful in storing samples until appropriate analytic techniques are developed in the future.
- It is important to validate the exposure measures used.
- Most air-pollution studies have been based on cohorts established for other reasons, making it difficult to construct appropriate exposure measures after the fact.
- Diverse cohorts (such as EPIC) can provide strong exposure gradients, thereby increasing sensitivity—even in the presence of appreciable exposure measurement error.
- Similar diversity might also be achieved by pooling data from related cohort studies.
- Are there good methods for adjusting for exposure measurement error that can help in interpreting epidemiological findings?
- Statistical techniques, such as regression calibration, have proven useful in environmental epidemiology.
- Working with smaller geographic areas can lead to better ecologic indicators of exposure to ambient air pollutants.
- Sensitivity analysis restricted to the ‘best quality' exposure data can be informative with respect to the possible impacts of exposure measurement error.
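The regression calibration technique mentioned above can be sketched on simulated data: an error-prone exposure measurement attenuates the naive slope estimate by the reliability ratio, and replacing the measurement with its calibrated expectation restores the estimate. All quantities below are simulated; in real studies the reliability ratio must be estimated from replicate or validation measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
true_beta = 2.0                      # true exposure-outcome slope

x = rng.normal(0.0, 1.0, n)          # true (unobserved) exposure
u = rng.normal(0.0, 1.0, n)          # classical measurement error
w = x + u                            # error-prone exposure measurement
y = true_beta * x + rng.normal(0.0, 0.5, n)  # health outcome

def slope(a, b):
    """OLS slope of b regressed on a (both centered)."""
    a = a - a.mean()
    return (a @ (b - b.mean())) / (a @ a)

# Naive analysis: regress y on w; the slope is attenuated toward zero
# by the reliability ratio lambda = var(X) / (var(X) + var(U)).
naive = slope(w, y)

# Regression calibration: replace w with E[X | W] = lambda * w.
# Here lambda is computed from the simulated truth; in practice it
# would come from a validation or replicate-measurement substudy.
lam = x.var() / w.var()
calibrated = slope(lam * w, y)
```

With var(X) = var(U) = 1 the reliability ratio is about 0.5, so the naive slope lands near 1.0 while the calibrated slope recovers the true value of 2.0, illustrating why adjustment for exposure measurement error matters when interpreting epidemiological findings.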
4. Challenges Related to Sex and Gender
- It is important not only to analyze males and females separately but also to think about the biological and social mechanisms involved.
- A life-course perspective is needed. We do not often think of infants and children as sexed, but there are important sex-based differences that continue throughout the life course.
- There is variation within each sex (e.g., hormones, body size) throughout the lifespan. So beyond thinking about sex-based differences, we have to think about the biological mechanisms underlying these differences.
- Pharmacodynamics and toxicokinetics differ by sex and throughout the lifespan. Some modeling assumptions may differ for males and females (e.g., metabolism), which will influence the way subjects are affected.
- The “typical person” does not exist.
- Metrics of particular biomarkers need to be examined by sex.
- Even cells have a sex (with the exception of red blood cells and platelets).
- Most animal work has been done on males, and work-related environmental exposures have focused on male-dominated occupations. Are the outcomes we consider “sensitive” gender-biased? Are we missing half the story?
- Issues of sex and gender are not a women's political agenda; they are central to good science.
- We need to think about sex as more than just dividing the world into males and females.
- While regulations need to take into account the entire population, including the most vulnerable, direct actions require better knowledge of the relationship between the environment and effects in males and females throughout the lifespan.
- We need to identify and examine those who continue to bear the burden of environmental degradation.
- There are many biological and social variables characterizing sex and gender, and multiple comparisons are a challenge. We need to find ways to do better science around this.
- Shopping patterns, travel patterns, etc. are all highly gendered, so it is important to think about gender-based differences. The same goes for some biomarkers, as norms are very different for males and females (e.g., hemoglobin).
- Moving forward, we should think about sex and gender in everything we do.
Environmental Contaminants and Ecosystem Approaches to Human Health: Successes and Challenges
- Jean Lebel, Director of Agriculture and Environment, International Development Research Centre
- Eco-health research projects may be overly ambitious and research questions ill-defined at the outset.
- Time and effort are required to refine and scale initiatives to be feasible.
- Design flaws hinder the fully integrated analysis of environmental, social, economic, and health data.
- Demonstrating impact on health can be a challenge.
- Exposures are quantified and reduced; health improvements are not always easily or well documented.
- Demonstrating value for money of proposed changes in policy or practice can also be difficult.
- Across the board, economic analysis is missing.
- Impactful eco-health projects are almost always very long-term (i.e., 10 years or more).
- Are there scaling-up possibilities with lower investments in time and cost?
Session 3: Technological Perspectives
Chair: Frank Kelly, Medical Research Council, HPA Centre for Environment and Health
Techniques for Monitoring Environmental Exposure: Focus on Air Pollution
- Roderic Jones, Professor, Centre for Atmospheric Science, University of Cambridge
- We are working on observational and modeling tools that look at air quality not only in the context of today but also of the future.
- There are ultra-small, low-cost sensors for measuring gases and particulate matter (PM) with the sensitivity (parts per billion) to work in urban environments.
- These methods are a complement to fixed site networks.
- Field experiments show good correlation between co-located sensors and good repeatability between sensing methods.
- These sensors offer the possibility of validating model-derived personal exposure rather than pollution fields.
- We are moving toward combining technologies to provide a sensitive, low-cost sensor network system. Potential applications include
- high-density statistical evaluations of air quality,
- source attribution in urban environment, and
- industrial emissions.
- The potential exists to produce a personal gas exposure monitor that can fit on a person's lapel. If it does not use a GPS, the battery can last hundreds of hours.
- We are hoping to extend these technologies to other species (e.g., ozone) and other environments (e.g., indoor).
- This method provides no chemical information on PM, just information on particle size. We suggest some focused, heavily instrumented studies to identify proxies that might be more readily measurable than PM properties.
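The co-location checks described above can be sketched as follows. This is an illustrative example with synthetic readings and hypothetical function names, not the Cambridge group's actual analysis pipeline.

```python
import numpy as np

def colocation_check(series_a, series_b):
    """Compare two co-located sensor time series: Pearson correlation
    (agreement in shape) and mean bias (systematic offset)."""
    a = np.asarray(series_a, dtype=float)
    b = np.asarray(series_b, dtype=float)
    r = float(np.corrcoef(a, b)[0, 1])
    bias = float(np.mean(a - b))
    return r, bias

# Synthetic example: sensor B tracks sensor A but reads 2 units high.
rng = np.random.default_rng(7)
truth = 30 + 10 * np.sin(np.linspace(0, 6, 500))   # "true" pollutant level
sensor_a = truth + rng.normal(0, 1, 500)
sensor_b = truth + 2.0 + rng.normal(0, 1, 500)     # systematic +2 offset

r, bias = colocation_check(sensor_a, sensor_b)
```

A high correlation with a stable bias, as here, suggests the sensors agree on temporal structure and differ only by a correctable calibration offset.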
Proteomic and Metabolomic Markers of Exposure to Environmental Contaminants
- Neil Dalton, Professor of Paediatric Biochemistry, King's College, London
- Electrospray mass spectrometry/mass spectrometry is a powerful technology for both proteomic and metabolomic analysis.
- Rapid metabolite profiling can be done using
- multiple scanning (positive and negative ion mode) on a single sample preparation, and
- comprehensive, class compound, and MRM quantitative scanning
- We can monitor modulation of specific metabolic pathways and perturbations of fundamental metabolic processes.
- Rapid protein/peptide profiling can be done based on endopeptidase (trypsin) digestion using comprehensive and MRM quantitative scanning.
- Monitoring of post-translational/environmental exposure modification of proteins and perturbations of fundamental metabolic processes is also possible.
- There is potential to use dried blood spots and/or dried urine spots.
Epigenomic Markers of Exposure to Environmental Contaminants
- Moshe Szyf, James McGill Professor of Pharmacology and Therapeutics, McGill University
- Mitotic and post-mitotic cells, including neurons, are responsive to the environment and undergo DNA methylation changes that might impact health; adult brain and mature organs could be affected, not just embryos.
- Environment should be understood in its broadest definition (e.g., not just chemicals).
- DNA methylation changes might have long-term impacts that cannot be detected in traditional toxicity assays and might affect future generations.
- It is possible to screen environmental exposures for epigenetic effects in vitro and in vivo; new regulatory policies should be considered.
- Changes in DNA methylation are genome-wide and system-wide; this needs to be taken into account in the development of methylotoxicity assays.
Storage and Preservation of Biological Samples in Biomonitoring Studies
- Mario Marchand, Institut National de Santé Publique du Québec
- The laboratory should be involved as early as possible in the design of biomonitoring studies.
- Each sample has its place; each sample should be in its place.
- Never underestimate the importance of evaluating the stability of samples.
- Include QC samples that are stored with the study samples and undergo the same treatment and movement, so that long-term stability can be assessed.
Small-Group Brainstorming Session
Moderator: Ross Anderson, Professor, Epidemiology and Public Health, St. George's, University of London
Participants broke into small groups to discuss one of three topics related to technological perspectives: new measurement methods; priorities for new technology development; and challenges in studying gene-environment interactions. The moderator asked participants to keep in mind that measurement techniques were an important driver of paradigm shifts in environmental epidemiology. The following are summaries of their deliberations:
1. New Measurement Methods: Issues/Areas
- There needs to be a balance between being motivated by a reasonably specific question and being a resource for the future.
- We should consider all exposure routes in directing research.
- New measurement methods for health measures are an area for advancement (e.g., ambulatory lung function).
- Physiologically-based pharmacokinetic (PBPK) models can help us work backward to identify the most informative measurements to acquire.
- We need to determine the base exposure measurements that can be acquired for all subjects, particularly in larger cohorts.
- We must consider methods that are state-of-the-art, now and in the near future.
- A sophisticated package of sensors should be rotated among study subjects.
- Considerations in selecting a sub-population for exposure-method development include the following:
- Proof-of-principle studies are important.
- For more intensive, expensive, or intrusive measurements, independent study cohorts make more sense; however, for application in current cohorts, the actual study population may be the most appropriate/relevant.
- Models are an inevitable part of estimating exposures, from simple to sophisticated.
- Models need input, which is not easy to obtain. Sources include
- ecological variables, and
- vast, untapped data from social networks and voluntary participation (could these advance models significantly?).
- New communication tools have the potential to acquire behavioural data more frequently and systematically (e.g., cell phones). What are the costs?
- There are many issues related to biosamples:
- How do we maintain focus when building a future resource, taking into account the desire to collect as much as possible for future analysis?
- What are the best types of samples for exposure biomarkers?
- whole blood
- blood spots
- urine spots
- hair and nails
- Laboratory analytical methods need to be less expensive to be more widely applicable to exposure measurement.
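As a toy illustration of the back-calculation idea raised above (using PBPK-style models to work backward from measurements to intake), here is a single-compartment model with first-order elimination. It is far simpler than a real PBPK model, and all names and numbers are illustrative.

```python
import math

def steady_state_conc(intake_rate, clearance):
    """Steady-state concentration under constant intake:
    C_ss = intake rate / clearance."""
    return intake_rate / clearance

def back_calculate_intake(c_measured, clearance):
    """Work backward from a measured biomarker concentration to the
    average intake rate that would produce it at steady state."""
    return c_measured * clearance

def conc_after_single_dose(dose, volume, half_life_h, t_h):
    """Concentration t_h hours after a single dose, with first-order
    elimination (rate constant k = ln 2 / half-life)."""
    k = math.log(2.0) / half_life_h
    return (dose / volume) * math.exp(-k * t_h)
```

For example, a chemical cleared at 2 L/day and measured at 5 µg/L in blood would, under this toy model, imply an average intake of about 10 µg/day; the single-dose function shows why sampling time matters so much for short half-life chemicals.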
2. Priorities for New Technology Development
Priorities for an array of environmental sensors:
- Expand to include volatile organic compounds (VOCs), particles, carcinogens, mould spores, and allergens.
- Combine with smart monitors and (trans-) portable units.
- Use a teddy bear with human-type skin to monitor dermal absorption.
- Use tap-water samples to collect drinking water exposures.
- Choose different averaging times.
- Add cell phone and satellite features.
- Consider use of fixed-position (geostationary) satellites to take repeated measures of pollutants over time.
Priorities for improving tools and models for studying pharmacokinetics:
- Develop models to better understand internal dose exposure.
- Gain insights about and comparisons with external exposures.
- Gain insights into the effects of development stage/age.
Priorities for models and common platforms:
- Standardize protocols to make bio-banking data more sharable.
- Account for confounders (even if they become known only later).
- Conduct qualitative structural analysis for toxicity, including features (e.g., affinity to lipids).
- Collect and use samples for acute episodes (including protocols).
- Develop and use website portals for chronic and acute situations.
3. Challenges/Issues in Studying Gene-Environment Interactions
- There are issues related to the need for such a large sample size (e.g., 50,000 to 100,000 people).
- Cohorts should be combined.
- Are there similar environmental measures or can similar exposure measures be created?
- What is the best approach?
- The cohort study is still the preferred method (e.g., rigorous design, many outcomes, sub-sample cases).
- Resources can be concentrated on cases and controls within the cohort.
- Focus on high/low exposures (nested study); use extremes and look for a signal, as the grey zone is hard to interpret.
- Timing of exposure/effects is a challenge and depends on the hypothesis. This needs to be thought through using a systems-based approach.
- The genome approach alone has not produced much thus far. Gene-environment interaction is key (e.g., epigenetics, expression). For example, key factors to examine include not just smoking but also socio-economic status and stress (powerful, but we lack measures of them across cohorts).
- People want to know if they have genes that could make them sensitive to some exposure, and what proportion of the disease risk is due to these specific genes. This needs to be considered in the study approach.
- Elements that need to be considered moving forward include cost, bioinformatics availability, exposure measures (accuracy), and relevance to range (variation).
- Strong support is needed for multi-generational or trans-generational studies over the long term and measuring epigenetic effects. Again, much depends on the initial hypothesis (e.g., determining time of sampling, lifecycle exposure, puberty).
- Individual exposures carry risks and benefits (e.g., therapeutic products). These need to be kept in mind.
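The "extremes" design mentioned above can be sketched as a selection rule on a simulated cohort. The data and cut-points are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
exposure = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # simulated cohort

# Nested "extremes" design: keep the bottom and top deciles of exposure
# and drop the hard-to-interpret grey zone in between.
lo, hi = np.quantile(exposure, [0.10, 0.90])
extreme_mask = (exposure <= lo) | (exposure >= hi)
selected = exposure[extreme_mask]
```

Roughly 20% of the cohort is retained, concentrating measurement resources where the exposure contrast, and hence the expected signal, is strongest.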
Making Tough Choices in Longitudinal Studies: Deciding What to Measure and When—Lessons Learned from the National Children's Study
- Michael Dellarco, National Children's Study, National Institute of Child Health and Human Development
- The National Children's Study (NCS) is a congressionally mandated, integrated system of activities aimed at studying the effects of environmental exposures and genetics on growth, development, and health.
- The NCS examines conventional environmental media, as well as social and cultural settings, health care access, and socio-economics.
- A large volume of information is collected using interviews, self-administered questionnaires, observational instruments, and a variety of sampling procedures.
- Efforts are being made to develop consistent terminology for categorizing the various stages of growth and development and other aspects of pediatric care.
- Sampling concerns include cost, collection, storage and analysis, ability to do all measurements during a visit, participant burden, and acceptable analytical-method performance.
- We have to ensure that, for any given specimen, the sample volume is adequate for the desired analyses for a broad range of chemicals and biomarkers. In the current series of analyses, prioritization is based on concerns for reproductive and developmental toxicity. We also liaise with chemists to ensure analyses are being done in the appropriate specimens using the lowest volume possible.
- Study-visit methodology should balance value, efficiency, and economy. We are now analyzing the success of different collection strategies.
- Major challenges include terminology; variability of environmental sample collection (techniques, logistics); lack of predictive value of questionnaires; detection of early pregnancy; serial collections of specimens; data quality; generalizability of study population sample; interaction and roles of authorities and institutions; and costs per visit.
- The real value of many longitudinal studies will be seen well into the future.
Where Do We Go from Here?
Summary of Themes
Moderator: Bruce Lanphear, Professor, Children's Environmental Health at Simon Fraser University
A plenary discussion was held on themes that surfaced during the workshop and opportunities/priorities going forward. The following suggestions were made by participants about where to go next in order to advance research on measuring environmental exposure and its application to health outcomes:
- Exploit the potential for real-time, individual air-quality monitoring.
- Use mobile technologies for environmental exposure measurement.
- Build capacity to analyze complex environmental data.
- Launch proof-of-concept studies to inform cost-effective biosample and biomarker data collection.
- Take advantage of Canadian/UK strengths in health-record linkages to environmental data.
- Use novel technologies to improve exposure assessments for cohorts, sensors, proteomics, etc.
- Take advantage of new, real-time technologies for measuring exposure.
- Address the challenges of biological specimen storage and the critical need for QA/QC procedures.
- Ensure that new technologies are reproducible and valid.
- Be bold about selling environmental health research to policy makers and the public; convince them that long-term studies are important.
- Build into study designs the factors that cause epigenetic imprints and the factors we can act on to diminish harmful effects.
- Think about how to mitigate the effects of toxics at the same time as we are learning more about those effects.
- Apply this thinking to longitudinal studies that are starting up in Canada, so we harmonize what we collect in those studies to the fullest extent possible.
- Start thinking about creating a centre of expertise in environmental exposure assessment and developing exposure assessment technologies in Canada.
- Get non-traditional funders who have converging requirements to support this kind of research (e.g., the Canadian Space Agency, National Defence), including other agencies and disciplines (e.g., chemists, engineers, physicists).
- Promote cross-disciplinarity.
- Engage early on with policy makers and end-users to ensure that the right information is emerging from this work. Involve the public in priority setting.
- Share knowledge and make it more accessible (i.e., in the UK, cohort resource facilities serve as a platform for sharing information with others conducting environmental studies).
- Make careers in this field interesting to young people.
- See how far we can push blood-spot methodology.
Small-Group Brainstorming Session
The final activity of the workshop once again had participants self-select into small groups to identify priorities for measuring environmental exposure and opportunities for collaboration. Summaries of their discussions are grouped below under the main themes.
Capacity
- Increase funding, but identify where real cost-savings can be made.
- Increase Canada's capacity in toxicology.
- Engage the community's support for environmental exposure studies.
Priority Research Areas
- Early exposures and their links to eventual chronic disease.
- Interactions between the physical environment and infectious disease (e.g., infections and allergens).
- More measurements on children.
- Nutrition and contaminants in food.
- Vulnerable people and populations.
- Individual susceptibility.
- Consumer products.
- Dust as an important medium for a whole suite of biological and chemical pollutants.
- Interventions: efficiency and effectiveness.
- Multi-media exposure and multiple-chemical mixtures.
- The global burden of disease and the mental health link to environmental stressors.
- The use of biological organisms, such as plants, to take up and accumulate pollutants (e.g., metals and organics).
- New molecules and emerging issues related to new substances in the environment.
- International studies on exposure gradients, especially in highly exposed sub-populations.
- Biomarkers of long-term, cumulative exposures.
- The measurement of exposure to chemicals with short half-lives.
- Harmonize biological and questionnaire data to increase capacity for modeling.
- Create a comprehensive register of cohort studies to maximize sharing (e.g., P3G).
- Examine ethical requirements for repeating studies in other countries and mining old datasets.
- Harmonize/link UK and Canadian data.
New Technologies
- Apply new imaging technologies—e.g., infrared cameras to detect water infiltration and mould in homes—to clinical outcomes and public health.
- Further develop geographic information systems (GIS) technology, including spatial referencing of pollution source/environment.
- Apply the use of low-technology sensors (e.g., apps/wipes) that enable the public to do its own sampling (i.e., easier applications longitudinally, combined with representativeness of the population).
- Support proof-of-concept studies to develop inexpensive, miniature sensors for multi-media use, independent of health research (i.e., NIEHS did this for epigenetic research).
- Exploit teleconsultation/social media (e.g., individual auto data on travel, shopping, and other interactions).
- Exploit the use of satellite imaging technologies.
- Promote greater interaction with participants on the use of new technologies.
- Maximize the use of wireless technologies for environmental data capture (e.g., noise, activities, location/footprint, images, personal air monitors with download capacity, biomonitoring [e.g., heart rates]).
Study Designs and Methodologies
- Harmonize study designs (from the perspective of inclusion in future meta-analyses).
- Simplify study designs to use models and indicators that provide sufficient information for public decision-making (e.g., using PM 2.5 as a rough indicator for outdoor air components).
- Link measurements more closely to sources and the mechanisms by which they enter the body.
- Give more thought to interventions (e.g., the source and how to change exposure).
- Increase the use of low-cost, low-volume, and high-sensitivity assessments for biological samples (e.g., blood spot, saliva, urine spot).
Data and Modelling
- Validate exposure models with empirical data.
- Improve data used for modeling (e.g., combine factors multi-dimensionally in time and space to look at environmental effects).
- Promote more modeling and better models at the population level.
- Improve tools for the meta-analysis of multiple data, especially with regard to data quality.
- Increase capacity for data linkage, data analysis, and data storage, including the development of new bioinformatics tools and capacities.
- Maximize use of individual-level record linkages across diverse domains (e.g., health, shopping, telephone).
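A minimal sketch of the kind of pseudonymized record linkage mentioned above, using toy data. Real linkage across health and other domains requires keyed hashing, governance approvals, and far more careful matching than this illustration.

```python
import hashlib

def pseudonym(person_id, salt="demo-salt"):
    """Derive a stable pseudonym from a personal identifier so records
    can be joined across domains without sharing the raw ID."""
    return hashlib.sha256((salt + person_id).encode("utf-8")).hexdigest()

# Toy datasets keyed by pseudonym rather than by name or health number.
health = {pseudonym("patient-17"): {"asthma": True}}
exposure = {pseudonym("patient-17"): {"no2_ugm3": 41.2}}

# Link records that appear in both domains.
linked = {
    key: {**health[key], **exposure[key]}
    for key in health.keys() & exposure.keys()
}
```

Because both datasets derive the key the same way, records for the same individual line up without either holder revealing the underlying identifier.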
Opportunities for Collaboration
- Capitalize on similarities in the Canadian and UK health-care systems by networking and sharing information on common issues (e.g., cross-border, airborne pollution, cohorts, linking to ministries/provincial agencies).
- Consult with others around the world (e.g., IDRC) who have studied Arctic communities to learn from and apply their findings to benefit the study of Arctic populations in Canada (e.g., INAC has some good programs).
- Collaborate with organizations such as the Canadian Space Agency, the Department of National Defence, Environment Canada, and Health Canada on research related to environmental exposures.
- Involve provincial agencies, given their jurisdictional responsibilities for health delivery.
- Launch strategic team grants to encourage researchers to collaborate with, rather than compete against, one another (e.g., require participation from a specific number of geographic areas or institutes/entities, while at the same time ensuring the right combination of disciplines).
- Promote the exchange of information and best practices among countries (e.g., Canada and the UK) and provinces and mine existing data sets.
- Apply experience/data from the UK and other countries in Canada, particularly for interventions and to validate models.
- Improve collaboration between public health policy makers and researchers.
- Foster collaborations between epigeneticists and modelers through training and multi-disciplinary research.
- Create centre-based programs to bring trans-disciplinary researchers together in a common space.
- Create student exchange opportunities (e.g., multi-national scientist exchange programs, conference travel expenses, pre-doc/post-doc specialty training), especially for research on environmental exposure (e.g., NSERC's Collaborative Research and Training Experience Program).
- Create and fund a short course on exposure assessment for trainees.
- Conduct an international study (North America, Europe) on gene-environment interactions and their effects on population health, funded by NIH, CIHR, and EU FP7, that would
- look at obesity and metabolic syndrome; food/epigenetics (including developing countries)
- use blood spots and urine spots
- include a work plan to develop the technologies for environmental exposure and biomarker assessment
- include the pharmaceutical industry and its data sets
- Initiate an international collaboration to data-mine existing cohort studies (e.g., P3G).
The Way Forward
The moderator commented that the world was beginning to recognize the strong link between the environment and human health. He noted that the latest WHO report showed that deaths from communicable disease no longer topped those from chronic disease, suggesting that the environment plays a key role.
He wrapped up the session by asking participants to provide their thoughts on ways to muster concentrated efforts around the key priorities and opportunities identified in the previous exercise. They offered the following suggestions:
- Encourage collaboration and build up experience by creating training exchanges and working on shared datasets.
- Launch a collaborative funding initiative to build methodological infrastructure.
- Share UK work on measurement technologies and pilot it in Canada.
- Take advantage of individual champions to help drive programs and initiatives.
- Incorporate proof-of-concept into ongoing studies (e.g., comparison of mobile sensors to the gold standard).
- Foster private-sector involvement (e.g., the Berkeley Center for Information Technology Research in the Interest of Society, where the private sector helps fund the development of new instruments with potential commercial applications).
- Encourage training grants and pre-doc and post-doc exchanges.
- Standardize methods.
- Take a joint approach (e.g., Canada, UK, US) to solving common problems.
Dr. Kramer thanked his staff at IHDCYH and members of the organizing committee for coordinating the workshop, and the chairs, moderators, speakers, and presenters for their roles in executing such a full program, which, he said, was time well spent. He thanked participants for attending, in particular those who had come from far away, and his CIHR colleagues for what he hoped would be ongoing support for research in this area. He closed by saying that the workshop had provided a good head start for the funders' meeting that would take place over lunch, and expressed his confidence that the workshop would spawn future collaborations in this area.
Appendix I: Agenda
Day 1 – Monday, November 28, 2011: State of Science / Setting the Stage / What Do We Know?
- 8:00–9:00: Breakfast and registration
- Welcome: Patrick Holdich, British Consul General, Montréal; Michael Kramer, Scientific Director, CIHR-IHDCYH
- 9:20: Keynote: The assessment of exposures to environmental contaminants: the state of the art and research needs (Martin Williams, Science Policy, Environmental Research Group, King's College London)
- 10:00: Session 1: Measurement Issues (Chair: Donna Mergler, Interdisciplinary Research Centre for Biology, Health, Environment and Society, Université du Québec à Montréal)
- 10:30: Nutrition and networking break
- 11:00: Session 1: Measurement Issues (continued)
- 12:00: Panel discussion with presenters from Session 1 and plenary discussion
- 13:30: Session 2: Synthesizing Data on Environmental Exposure (Chair: Carol Dezateux, Director, MRC Centre of Epidemiology for Child Health, Institute of Child Health, University College London)
- 15:00: Panel discussion with presenters from Session 2 and plenary discussion
- 15:30: Nutrition and networking break
- 16:00: Small-group brainstorming session (Moderator: Hans Schleibinger, Ventilation and Indoor Air Quality, National Research Council of Canada); participants chose one of several topics to discuss
- 17:00: Feedback to plenary and discussion
- 19:00: Dinner and keynote: Science, the environment, and public policy (Karen Dodds, Assistant Deputy Minister, Science and Technology Branch, Environment Canada)
Day 2 – Tuesday, November 29, 2011: Areas for Development, Challenges and Opportunities
- 8:00–9:00: Breakfast
- 9:00: Keynote: Environmental contaminants and ecosystem approaches to human health: successes and challenges (Jean Lebel, Director, Agriculture and Environment, International Development Research Centre (IDRC))
- 9:45: Session 3: Technological Perspectives (Chair: Frank Kelly, Medical Research Council, HPA Centre for Environment and Health)
- 10:45: Nutrition and networking break
- 11:15: Session 3: Technological Perspectives (continued)
- 12:15: Panel discussion with presenters from Session 3 and plenary discussion
- 14:00: Small-group brainstorming session (Moderator: Ross Anderson, Professor of Epidemiology and Public Health, St George's, University of London); participants chose one of several topics to discuss
- 15:00: Nutrition and networking break
- 15:30: Report back to plenary and group discussion
- Evening: “Free” night to explore Montréal (see participant package for suggestions)
Day 3 – Wednesday, November 30, 2011: What's Next? Working Together: Where Do We Go from Here?
- 9:00: Keynote: Making Tough Choices in Longitudinal Studies: Deciding What to Measure and When—Lessons Learned from the NCS (Michael Dellarco, National Children's Study, National Institute of Child Health and Human Development (NICHD))
- 9:45: Summary of main themes from Days 1 and 2 (Bruce Lanphear, Professor of Children's Environmental Health, Simon Fraser University)
- 10:00: Small-group brainstorming session (Moderator: Bruce Lanphear, Professor of Children's Environmental Health, Simon Fraser University); participants divided into four groups to discuss two questions
- 10:45: Nutrition and networking break
- 11:15: 5-minute overview from each group, followed by plenary discussion on next steps (Michael Kramer, Scientific Director, CIHR-IHDCYH)
- 12:15: Adjourn and lunch
Appendix II: List of Participants
Professor of Epidemiology
St George's, University of London
Environmental Health Scientist,
National Collaborating Centre for Environmental Health and BC Centre for Disease Control
Université de Montréal
University of British Columbia
Senior Research Scientist
Environment Canada and Adjunct Professor,
University of Toronto
Professor and Canada Research Chair
University of Ottawa
University of California, Berkeley
Director E&OH, Public Health Ontario and Associate Professor,
University of Toronto
R. Neil Dalton
Professor of Paediatric Biochemistry
King's College, London
Kees de Hoogh
Senior Research Officer
Imperial College London
Senior Scientist and Project Officer
The National Children's Study
Eunice Kennedy Shriver
National Institute of Child Health and Human Development (NICHD)
MRC Centre of Epidemiology for Child Health
Institute of Child Health,
University College London
Assistant Deputy Minister
Science & Technology Branch
Research Officer
Science and Innovation Officer
British Consulate-General Montréal
Research Institute of the McGill University Health Centre
Assistant Director, Partnerships and
Air Pollution Group
Health Protection Agency
(England and Wales)
Université de Montréal
Patrick G H Holdich
British Consul General
University of California, Berkeley
Leslie Jones Communications
Professor of Atmospheric Science
University of Cambridge
Professor of Environmental Health
King's College, London
Michael S. Kramer
Director and Professor
University of Ottawa, McLaughlin Centre for Population Health and Risk Assessment
Clinician-Scientist and Professor
Child & Family Research Institute and Simon Fraser University
Director, Agriculture and Environment
International Development Research Centre (IDRC)
Research Admin/Project Manager
National Institute of Environmental Health Sciences, NIH
Samuel Lunenfeld Research Institute
Centre de Toxicologie du Québec
Institut national de santé publique du Québec (INSPQ)
CIHR – RMNI
Université du Québec à Montréal
Université Laval et Centre de recherche du CHUQ
Chair, CIHR Institute of Cancer Research
Canadian Cancer Society Chair in Population Cancer Research
Assisted Human Reproduction
Santé publique Montréal
Canadian Partnership for Children's Health & Environment
CIHR Institute of Genetics
Hans W. Schleibinger
National Research Council Canada
Ventilation & Indoor Air Quality
Professor of Medicine
McMaster University/ St. Joseph's Healthcare
CIHR-IPPH
Clinical Professor
Université de Montréal
Université de Moncton
Université de Sherbrooke
Jay Van Oostdam
Senior Epidemiological Advisor
Chemicals Surveillance Bureau
Karolinska Institutet (Sweden)
Senior Research Scientist
Strategy and Planning Manager
UK Medical Research Council
Amanda J. Wheeler
Professor, Air Quality
King's College London
Appendix III: Key Discussion Points
The key questions posed and comments made during the plenary discussions that followed each of the presentations and small-group sessions are provided in this Appendix.
Keynote Presentation by Martin Williams
- There has been a lot of buzz around the “exposome”, the totality of environmental exposures received by a person over his or her lifetime. Where might this guide the field over the next five years? A. The quantification of total lifetime exposure presupposes a high degree of knowledge about individual components, which we don't yet have.
- If there were a simple blood-test biomarker that could identify all particulate matter (PM) or gas exposure over a specific time period, could it replace some of the things you're measuring? A. The problem is that once a pollutant gets into the body, it goes through processing over a range of time scales that makes the link to the source fuzzy. It might be different for specific biomarkers of specific carcinogens, but what we're looking at here may not be in that league.
- Trying to measure environmental exposure using fixed-site monitors is very hard, because people go everywhere. Wouldn't the individual be a better testing station for global exposure? A. Yes, but we are still left with the challenge of how many individuals have to be sampled to base policy on the results. It would be an attractive option if it were possible to work back to figure out what the source of those components were.
- Are there biomarkers for non-carcinogenic pollutants? A. There is no simple answer, or we would be doing it. There are a lot of biomarkers that indicate certain sources, but none are dependable enough to be used in a large study. We can measure metals in urine and are doing studies on that, but we have a long way to go to identify anything reliable enough to take forward to help understand long-term effects associated with exposures to air pollution.
Session 1 Presentations
- There is interest in putting together the Canadian Health Measures Survey and the biomonitoring equivalents (BEs) you developed. Are there data on some of the compounds and, if so, can they be used as a benchmark for gauging population exposures at this time? A. Yes, there are national data on a few substances, including BPA.
Session 2 Presentations
- Not every study needs to separate the cohort by sex, but researchers should still be thinking about it and how it applies to what they are looking at. We need to ensure that we are measuring the right things and that they are being applied adequately to both sexes and genders.
- With regard to measurement in urine of exposures that are non-persistent, it is often only a one-spot sample that is used to estimate exposure over a period of months. The fact that some studies find relatively consistent results using this method raises the question of whether there is some systematic bias involved. Developing better methodologies (e.g., cumulative biomarkers) would be one way to check on this; another might be to see if the effects are consistent with animal studies.
- If very expensive measures still have limitations, maybe we should be thinking of ways to improve the prospective statistical design of our studies. For example, we could focus on repeat measurements in a small subset and make inferences based on some of the other information, or conduct randomized trials to try and reduce exposure by random assignment.
- Most comments have been related to questionnaire-based measures of exposure. In terms of bio-banking, what attempts have there been to standardize sample types, laboratories, etc. among cohorts? A. A number of programs are working on harmonization, including BioSHaRE and the Canadian Partnership for Tomorrow, which has tried to document all of the factors that could have an influence on biochemical measures and developed a data schema for samples.
- The ESCAPE project has also tried to bring together 25 cohorts to standardize the approach to estimating air-pollution exposure and is working on a protocol to try and synthesize things.
- Deciding when data can and can't be used as equivalent is a real struggle. Are there any principles on which to base those decisions? A. Situations are context-specific, so instead of saying “here's how to do it”, we have been trying to agree on certain steps in the decision-making process. The first is to document the methods and procedures used to collect the information (e.g., how and where was it collected); the second is to sit down with the experts and determine what you want to have for the final analysis, and then evaluate whether the cohort can provide that information.
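The repeat-measurement design suggested in this session — measure a subset of participants twice, estimate how noisy a single measurement is, and correct the attenuated exposure-effect estimate — can be sketched with hypothetical numbers (all values below are illustrative, not from any workshop dataset):

```python
# Sketch of "regression calibration" with repeat measurements in a subset.
# Hypothetical simulation: a single noisy exposure measurement attenuates
# the estimated effect; replicates in a 10% subset let us undo that.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
true_exposure = rng.normal(0.0, 1.0, n)           # unobserved long-term exposure
outcome = 2.0 * true_exposure + rng.normal(0.0, 1.0, n)
meas_sd = 1.0                                     # within-person measurement error
measured = true_exposure + rng.normal(0.0, meas_sd, n)

# Naive regression slope is biased toward zero by measurement error.
naive_beta = np.cov(measured, outcome)[0, 1] / np.var(measured)

# Repeat measurements in a small subset give an estimate of the error variance:
# Var(first - second) = 2 * error variance for independent errors.
subset = rng.choice(n, size=n // 10, replace=False)
replicate = true_exposure[subset] + rng.normal(0.0, meas_sd, len(subset))
err_var = np.var(measured[subset] - replicate) / 2.0

# Reliability ratio = signal variance / total variance; divide it out.
reliability = 1.0 - err_var / np.var(measured)
corrected_beta = naive_beta / reliability
```

With these settings the naive slope comes out near half the true effect, and the corrected slope recovers roughly the true value of 2.0 — illustrating why a small, well-designed replicate subset can substitute for expensive repeat measurements on the whole cohort.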
Session 2 Small-Group Brainstorming Session
- We live in a cell phone society, but wealthier people have more cell phones than poor people, so using them for monitoring would represent more what wealthier people do than society as a whole.
- In the US, many cell phones are carried by people of lower socio-economic status (SES), and cell-phone use is leapfrogging over traditional phone service in certain parts of the developing world because they are cheaper or more accessible than land lines. This raises interesting questions about the bias this could introduce, but it is not as much of a high-SES phenomenon as it used to be.
- It could be useful to pool data from studies that take measurements at different times. If we are careful and thoughtful, we could get good results.
- In terms of young children, there are limits to how much blood you can collect, for ethical reasons, which makes it difficult to collect samples at different times from the same person. If you have a large enough cohort, however, you can solve the problem by collecting samples from different individuals at different times.
Keynote Presentation by Jean Lebel
- What disciplinary groups have you engaged to understand how best to influence behaviour? A. Natural scientists, social scientists (broadly speaking), biologists, medical groups, biochemists, anthropologists, psychologists, and sociologists. Even developing a common language takes time and poses many challenges.
- Given that urbanization is a big change in developing countries, how could this approach be applied to urban areas? A. There has been a lot of work sponsored in the urban domain; not necessarily on contaminants, but on diseases. Taking this approach has proven much faster in terms of research and intervention (e.g., the proximity of institutions and density of actors means researchers can reach out more easily, and they are also closer to the power circles where change can take place). At the same time, multiple interests sometimes arise and that can be difficult to manage. People sometimes expect more than the research can deliver, so it is important from the beginning to build a trusting relationship and ensure that people understand what you can and cannot provide. It is important to develop capacity locally to carry out this type of work.
- Ecosystem health fits well with people who are tied to local environments, and it could apply well in Arctic Canada. But people in big cities whose food is from thousands of miles away and who travel widely are not as exposed to their local environment, so how could ecosystem health be useful in that context? A. There is more research being done on the nexus between agriculture, health, nutrition, and non-communicable disease. While people in cities are remote from the field, they have huge power in terms of changing situations in the field (e.g., the spread of organic food) by putting pressure on producers for better quality goods. We are struggling with bringing ecosystem health to a higher-than-local scale and have limited evidence of it working at that level. We need to push for other funding sources and get them interested in this approach; if we're serious, we need some big programs and long-term investments.
Session 3 Presentations
- The most exciting idea of the afternoon was the concept of using new technology to transform the way we do exposure assessment in a variety of environments. If I use my cell phone to take a photo of what I'm eating, for example, and send it to the local laboratory, that is real-time, dietary exposure assessment with no recall bias. And it could be used for measuring things like UVB, PM, etc. as well. This may not be ready for prime time just yet, but there are a lot of proof-of-concept studies that could be done to move this forward.
- There are already a number of dietary websites where people type in what they ate and it comes up with a list of comparable products (e.g., research council of epidemiology) and their nutritional information.
- This is an interesting approach to looking at micro- and macronutrients, but we are more interested in chemical contaminants ingested through food—and photos can't tell you whether there are pesticides on what you ate. If diet is a major source of exposure, we need to be more innovative about how to look at that.
- We have come up with lots of futuristic ideas over the last couple of days, but that bit of grandiosity is good, because we have to be comfortable standing up and being bold about what we study and what we know we can achieve by doing a better job of measuring, quantifying, and modifying the environment.
- Foods come from all over the world, so contamination differs vastly, even within products. It can be measured in blood, but that doesn't tell you what nutrients are also coming in from food and whether they are useful. For example, there is mercury in fish, but fish is also highly nutritious. What we want to do is maximize nutrition and minimize risk, because a lot of pollutants are persistent and will be in our bodies for a long time. The question is what we can do to reduce the impact.
- One of the challenges people working in environmental health face is not only to identify harmful substances and their effects but also to determine the pathways from source to effect and the factors that can influence those pathways. That should be part of our measures and linked to developing better models.
- The more we can tap into public enthusiasm, the better off we will be. How to link this to chronic disease control is another issue.
- We not only have to measure contaminant concentrations in food but also need to know intake.
- We need a grand vision of where we want to go, because it is technology-forcing. Without something to work toward, we won't get there fast enough.
- Another example of how GPS technology could enhance assessment would be to use it to keep track of the environments where people go on a daily basis and link that to various indicators of health and SES.
- Everybody carries a cell phone nowadays, so even if they are not perfect for research yet, if people start using them anyway you will soon wind up with something that is a game-changing device. We have to keep our eyes open for potentially “disruptive” technologies (likely something with a mobile computing platform) and take advantage of the opportunity to promote their use and improvement.
- People with diabetes already have wireless technologies that provide information to doctors, so we could conceivably get biomarkers in the field for certain sub-samples, and information on exposure, stress levels, etc. at that very moment. That would relieve a lot of the ambiguity of statistical errors because exposure measurement error would be virtually eliminated.
- There are cell phone apps that can be downloaded to measure noise levels.
- More studies, big numbers, more data, more harmonization: these are all very ambitious ideas, but many require proper development (e.g., proof of concept). No granting body is going to give millions of dollars to a project that is going to use an untested methodology. We need to start thinking about developing a program to turn some of these ideas into pilot projects, so they can be justifiably included in new cohort studies, etc. We have to look at things in a practical way before we can sell them.
- What do we know about characteristics of exposures that are not epigenetically active? A. We only know about exposures that lead to epigenetic consequences, because nobody reports negative results. Nobody has done a methodical study of compounds we consider safe and tested them.
- What about tissue variation in terms of activity? A. Methylation could be specific to an individual cell type, but we still see responses that are detectable in spite of the variations that go around. If an agent has an epigenetic effect, whether we take blood or saliva, we will see the consequences.
- Are there any data from longitudinal samples? A. We have some data from longitudinal studies in humans now, and for animals we have certainly followed the time course from infancy to later years. When we store samples, we cannot think solely of today's technologies but also of tomorrow's. Imagine the questions we could answer if we had stored blood in the 1958 cohort.
- We have heard that storage can spark epigenetic changes if it is not done right. A. There are two principles in storage: store specific cell types (e.g., white blood cells) and store DNA, which is extremely stable once it has been made.
- What are your thoughts on the equity implications of this work? A. Studies suggest a mechanism whereby we can actually trace the chemical stages between poverty and what happens to the DNA. This emphasizes that economic toxicity is as bad or worse than all other toxicities.
- Are any regulatory agencies doing or requiring a screening analysis? A. Health Canada is thinking about it, but nobody is doing it—here, or in Europe, as far as we know. There are two philosophical ways to approach the idea: the academic view that it is too complicated, and the opposing view that we have no right to deprive the public of what we already know.
- Is it possible to specify particular volatile organic compounds (VOCs)? A. It is possible to use photo ionization to ionize families of VOCs, but the technique is not VOC-specific. It fits into the same footprint: small, light, and low-cost.
- In terms of personal monitoring equipment, the results reported here are extremely promising. We have been looking at modifying occupational-type sensors, but have encountered problems with them only being used once or twice or being very consumptive. We hope this encourages collaborations, because nobody else has these capabilities right now.
- Ozone is one of the most difficult contaminants to measure in an ambient environment. How close are you to getting that operational? A. Ozone interferes with other measurements we're making because it is such an effective oxidant, but it can be measured quite precisely in terms of how it affects others. We have good specificity in the laboratory and are optimistic about the atmosphere too.
- Yesterday, there was talk about mechanisms for PM interaction with human beings and recent studies showing ambient PM potentially affecting DNA. Are there any potential biomarkers or metabolites that might offer a clue about PM exposure? A. There haven't been any studies suggesting DNA methylation or not. Changing the sequence of DNA is a mutagenic effect; the epigenetic response has not been examined yet.
- The difference in specificity when measuring or assessing concentrations of nitrogen or nitrogen dioxide and the specificity shown in various proteomic and metabolic markers is striking. How can measuring the effect of a contaminant on a proteome or metabolome be used with regard to markers of exposure? A. Virtually all components can be measured quite accurately; the question is whether certain changes imply a common pathway or are entirely specific. Both cases are likely, in that some will cause a specific effect (e.g., nitration) while others may elicit a common change in relation to a certain group of infections or group of environmental contaminants.
- What we get with exposures, whether they are social or chemical, is a landscape response from which many things can be detected. If you look at the genome, you will get an exquisite signature that will tell the history of the exposure and also be able to differentiate between exposures.
- As the leader of a large cohort study banking hundreds of thousands of blood and urine samples, should we be doing these tests now and having the banks ready for data mining—or hang onto them because it will be cheaper in five years and the technology presumably more refined? A. It will likely be cheaper down the road. The question is how urgent the questions are and the timeframe in which to answer them. These techniques are seen as “fishing” expeditions, so the best approach is to do some hypothesis-driven work as well, so you have an idea of what you are trying to look for.
- Questions and data drive technology. If we want technology 20 years down the line, we need to do experiments now and ask the right questions. As we encounter obstacles, somebody will develop technologies to address them.
- There are enormous technological requirements before us with respect to these various approaches that we need to encourage government to invest in—not just because we want to use new machinery for sample analysis, but because it will drive some growth in our economy, the same way the genome project did. We need to bring to the attention of government that if we can stay at the head of this field, we will receive major economic advantages.
- To what extent is epigenetics being used for the whole field of individualized medicine? A. There have been major technological advances in what we are doing, which has been driven by medical needs. The potential is enormous, but the investment community has not fully appreciated that. Also, although we are on the verge of a scientific revolution, there has been a loss of trust in high-tech and technology and a loss of appetite to invest in research. So there is a misfit in terms of the tremendous advances we are poised to make and the risk-averse environment toward which we are moving.
- There are tremendous changes in methylation caused by immortalization, because it is a tremendous change in environment. Oxygen concentration in the human body is around 7-8% and in tissue cultures it is about 25%. The first thing the epigenome sees is oxygen, so all DNA methylation patterns change as soon as they are in a hyper-oxygenic environment. That's why it is essential to store DNA properly, so it can be analyzed later on, if not today.
- What do we know about the deterioration of trichlorophenyl in plastic as opposed to glass, or general freeze-thaw effects—or do all analytes have to go through testing for stability, different containers, etc.? A. We know that plastic or glass vials, depending on the compound, change the way we can keep samples. In general, don't store anything at minus 20 for any period of time. You will lose whole classes of proteins in urine during that first freeze overnight. The key is to get the temperature to at least minus 40.
- Dried blood spot and urine samples are useful because they are consistent; once you separate a sample, you have introduced significant variables. If you want to keep them, they should be stored at minus 80, ideally, as those left at room temperature suffer significant deterioration within 12 months.
- A special supplement in the International Journal of Epidemiology, published about five years ago, featured summaries of storage conditions for samples. There is also quite extensive literature on dried blood spot stability and, more recently, there has been work by the Centers for Disease Control and Prevention on the systematic review of storage conditions.
- If the aspiration is to get technology and methods out into these large studies of environment and human health, what is the key obstacle you would address if you had funding? A. Data handling. We can generate huge datasets on a whole host of different compounds, but we may never have the confidence to use them. We need some way to break them down into something we can understand.
- There is also a huge bottleneck in North America in mathematical training, and the mathematicians we have are generally useless for our purposes because they have no sense of the larger biological context. We are struggling to get funding for bio-mathematicians, yet there is a huge need for it.
- Another more practical problem is that activities generally have to be very hypothesis-driven: you almost need to know the answer before you put forth the proposal. Yet, some of the small-census work is much more speculative, so there has to be recognition that a lot of new techniques developed this way could result in some big surprises. Funding agencies are uniformly risk-averse, but there needs to be a measure of risk-taking in order to get a potentially high reward.
- The CIHR is reforming its funding streams to protect or create a separate stream identified as high-risk and potentially high-reward, so those applications will not have to compete with low-risk, safe research.
- Another important message to funders is that in order to make advances, various disciplines have to work together. Characterizing the problem in a cross-disciplinary way (e.g., as was done in the US and the UK for the genome project) makes the connection with the need for capacity to address multiple issues.
- There is a disconnect between 90% of methylation being over by birth and the dynamism of the system with regard to response after birth. Does dynamism decrease with age? A. There are different DNA methylation changes in the genome. Ninety percent of them identify cellular identity, so that is etched in stone, but there are other sites on the genome that define environmental context identity, and they are up for tweaking. DNA evolution works at multiple time scales—and with 24 million CpG sites and another 24 million non-CpG sites on the genome, there is a lot of potential for addressing them.
- We need to make the private sector realize the potential to make money from this work. That means developing economic models to show the financial benefits of doing these studies. We have to formalize these thoughts and make reasonable arguments to defend them. We also need to redefine risk, because the public wants to be told what a safe environment is and what new drugs or diagnostic tools there are—and if our research doesn't meet those needs, that is a risk.
- The pharmaceutical industry is going to produce rich data sets from the personalized medicine agenda. If they are in the public space, they can be used to take forward some of our issues and challenges. There are lots of ways we can think laterally and smarter to make things happen in our lifetime.
Session 3 Small-Group Brainstorming Session
- There is not much currently available for airborne particle measurement, so it is measured discontinuously. It is typical, for example, to take desk samples over the course of a week and then shift to the laboratory, so there is poor time-sensitivity with regard to results.
- Counting and speciating mold spores is still fairly labour intensive, but this is a very important factor and one we control.
- There are a number of instruments capable of distinguishing between inactive and biologically active airborne particles. They may not be routine, but the military has been promoting their development for application in biohazards, bioterrorism, etc.
- In the realm of airborne biologicals, there is a lot more being learned about DNA profiling of samples and the diversity of exposures in the microbe realm.
- Diet seems to be missing from the discussion and is an incredibly important way in which we're exposed to contaminants.
- One possibility discussed was to explore new technologies to obtain more longitudinal information in a more objective way (e.g., calling subjects on their cell phones at a specific time for instantaneous reporting on what they just ate).
- There are a lot of food-frequency questionnaires out there as well.
- With regard to high versus low exposures, it seems that when you divide a population in that way, you lose other information. A. It is a pragmatic first step. If you want to do an expensive test, you take a stratified sample, look at the two extremes, and see if there is a main effect. If there is no difference, you do something else. If there is, you can still look at the grey area in between.
- You don't know if contaminant exposure is high or low before you measure it, so how do you decide who is high and who is low? A. This is not meant to measure exposure but rather biological interaction or biological effect.
- This raises an interesting biological phenomenon, whereby you get a bell-shaped dose-response curve, which has been observed in a number of areas. So you have to understand the association to begin with.
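The two-extremes design described in this exchange — rank a cohort by a cheap proxy measure, run the expensive assay only on the top and bottom strata, and look for a main effect before spending on the middle — can be sketched as follows (all numbers are hypothetical, chosen only to illustrate the pragmatic first step being described):

```python
# Sketch of an extreme-groups (stratified two-extremes) screening design.
# Hypothetical data: a cheap proxy score for 2,000 people; the costly
# assay is run only on the bottom and top deciles.
import numpy as np

rng = np.random.default_rng(1)
cohort = rng.normal(50.0, 10.0, 2000)            # cheap proxy exposure score

lo_cut, hi_cut = np.quantile(cohort, [0.1, 0.9])
low_group = cohort[cohort <= lo_cut]             # bottom decile
high_group = cohort[cohort >= hi_cut]            # top decile

# Pretend the expensive assay responds weakly to exposure, with noise.
def expensive_assay(scores, rng):
    return 0.05 * scores + rng.normal(0.0, 1.0, len(scores))

low_assay = expensive_assay(low_group, rng)
high_assay = expensive_assay(high_group, rng)

# Simple two-sample comparison: is there a main effect at the extremes?
diff = high_assay.mean() - low_assay.mean()
pooled_se = np.sqrt(low_assay.var(ddof=1) / len(low_assay)
                    + high_assay.var(ddof=1) / len(high_assay))
z = diff / pooled_se   # a large |z| justifies following up on the middle strata
```

If no difference shows up at the extremes, the expensive assay is dropped; if one does, the grey area in between can still be examined later — which is the cost-saving logic of the design. Note the caveat raised in the discussion: a non-monotonic (bell-shaped) dose-response can hide a real effect from this comparison, so the presumed shape of the association matters before committing to it.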
Keynote Presentation by Michael Dellarco
- Working in small, multi-disciplinary teams and trying to decide methodology is very complicated. What process do you use for making relatively quick decisions? A. We have very transparent operations and believe in open dialogue. We encourage our study centres, staff, and experts to tell us if we are missing something or if they have any recommendations for improvement—then we look to see if there is an opportunity to bring it into the study or test whether it would be useful. The ultimate decision is made by the directors of the study and the Institute, based on the results of the test and other evidence.
- With regard to data and data entry, what methods have you looked at in terms of interview-directed versus participant-completed surveys, and electronic entry versus paper? A. Information management systems and informatics are critical. We chose proprietary software systems early on and are now backtracking to open-source, modular systems. The concern with a proprietary system is if you want to go forward and invest in changes, you are stuck if the developer isn't interested. It is also problematic if you want to partner. An open-architecture, open-source system solves those problems, although there are not a lot available. Initially, we also had a central coordinating system, which has advantages in terms of data entry but is a programming nightmare if you make changes in protocol. We suggest developing a modular design around various domains of interest, so if you want to make a change you don't have to disrupt the entire platform. The other important aspect is to have electronic data-checks to ensure quality.
- What are your thoughts about storage standards? A. Store samples at minus 80. We are looking at introducing more real-time measurements to reduce the need for storage itself and at different methodologies (e.g., desiccation) that might be cheaper and would improve quality—given that freezing and thawing invites opportunities to compromise quality.
- What do you require in terms of participant consent, and do you have to go back to children for re-consent at later points in life? Also, do families/individuals get any results back and, if so, how? A. We have very detailed procedures to ensure proper consent, and participants are allowed to opt out at any point, and can even opt out for a few visits. As children age, there are laws and protocols that we follow. Agreement to submit to all measures is very high, however. In terms of results disclosure, we don't have a procedure in place—but if there are legal issues (e.g., abuse) or recognized environmental or clinical situations, we are required to inform participants and local authorities.