
General enquiries on this form should be made to:
Defra, Science Directorate, Management Support and Finance Team
Telephone No. 020 7238 1612
E-mail: [email protected]

SID 5 Research Project Final Report


Note
In line with the Freedom of Information Act 2000, Defra aims to place the results of its completed research projects in the public domain wherever possible. The SID 5 (Research Project Final Report) is designed to capture the information on the results and outputs of Defra-funded research in a format that is easily publishable through the Defra website. A SID 5 must be completed for all projects.

A SID 5A form must be completed where a project is paid on a monthly basis or against quarterly invoices. No SID 5A is required where payments are made at milestone points. When a SID 5A is required, no SID 5 form will be accepted without the accompanying SID 5A.

This form is in Word format and the boxes may be expanded or reduced, as appropriate.

ACCESS TO INFORMATION
The information collected on this form will be stored electronically and may be sent to any part of Defra, or to individual researchers or organisations outside Defra for the purposes of reviewing the project. Defra may also disclose the information to any outside organisation acting as an agent authorised by Defra to process final research reports on its behalf. Defra intends to publish this form on its website, unless there are strong reasons not to, which fully comply with exemptions under the Environmental Information Regulations or the Freedom of Information Act 2000.

Defra may be required to release information, including personal data and commercial information, on request under the Environmental Information Regulations or the Freedom of Information Act 2000. However, Defra will not permit any unwarranted breach of confidentiality or act in contravention of its obligations under the Data Protection Act 1998. Defra or its appointed agents may use the name, address or other details on your form to contact you in connection with occasional customer research aimed at improving the processes through which Defra works with its contractors.

Project identification

1. Defra Project code PL0548

2. Project title

Addressing uncertainty in pesticide exposure modelling

3. Contractor organisation(s)

Cranfield University, Silsoe, Bedfordshire, MK45 4DT

4. Total Defra project costs £

5. Project: start date................ 01 October 2002

end date................. 30 September 2004


6. It is Defra's intention to publish this form. Please confirm your agreement to do so ................. YES  NO

(a) When preparing SID 5s, contractors should bear in mind that Defra intends that they be made public. They should be written in a clear and concise manner and represent a full account of the research project which someone not closely associated with the project can follow. Defra recognises that in a small minority of cases there may be information, such as intellectual property or commercially confidential data, used in or generated by the research project, which should not be disclosed. In these cases, such information should be detailed in a separate annex (not to be published) so that the SID 5 can be placed in the public domain. Where it is impossible to complete the Final Report without including references to any sensitive or confidential data, the information should be included and section (b) completed. NB: only in exceptional circumstances will Defra expect contractors to give a "No" answer. In all cases, reasons for withholding information must be fully in line with exemptions under the Environmental Information Regulations or the Freedom of Information Act 2000.

(b) If you have answered NO, please explain why the Final Report should not be released into the public domain.

Executive Summary

7. The executive summary must not exceed 2 sides in total of A4 and should be understandable to the intelligent non-scientist. It should cover the main objectives, methods and findings of the research, together with any other significant events and options for new work.

Current environmental risk assessments for pesticide registration in Europe rely on a comparison between exposure and ecotoxicological endpoints (surface water) or a legal threshold concentration (groundwater). Mathematical models are often used to calculate exposure concentrations. Traditionally, deterministic approaches have been applied where a single combination of model input parameters is used to calculate a single set of predicted environmental concentrations. The parameter combination is often selected so as to be protective of the actual range of use conditions. Such ‘realistic worst-case’ approaches are useful at the lower tiers of the regulatory assessment process because they are quick to deploy and act as a screening step. However, the likelihood that the predicted risk will occur under real environmental and usage conditions is not assessed and the degree of environmental protection provided by deterministic approaches is largely unknown. There is increasing interest in the use of probabilistic techniques to better quantify the likelihood and magnitude of the risk involved.

Uncertainty and variability in exposure modelling arise from various sources, including the spatial and temporal variability in factors influencing pesticide behaviour and the uncertainty associated with the measurement, calculation or estimation of input parameters. Previous Defra-funded research identified and prioritised sources of uncertainty in the assessment of environmental exposure of pesticides, established the relative strengths and weaknesses of available tools, identified significant issues with reproducibility of Monte Carlo analyses, and provided case studies to inform the debate in the UK on probabilistic modelling (Dubus et al., 2002).

The aim of the current project was to continue the development of approaches to incorporate variability and uncertainty within pesticide exposure modelling and to assess the confidence that should be assigned to outputs from probabilistic exposure assessments. The specific objectives were:

- To further investigate the use of Monte Carlo analysis in exposure assessment;
- To investigate approaches to estimate landscape-level variability in pesticide exposure;
- To support the development of environmental risk assessment at national and European levels and to contribute to the generation of case studies.

A case study with the hypothetical herbicide 'herburon' was developed to show how probabilistic methods can be used to refine and strengthen the assessment of both exposure and effects for aquatic organisms. The case study specifically demonstrates the use of probabilistic approaches with the base-set of environmental fate and effects measurements and how variability in exposure across an agricultural landscape can be built into the assessment. The Framework V Concerted Action EUFRAM (www.eufram.com) is developing a European framework for probabilistic risk assessment of the environmental impacts of pesticides. The herburon case study has been used to test procedures and recommendations developed within the project and to provide a benchmark example for dissemination to the scientific and regulatory community. The study report will form an integral part of the framework document to be produced by EUFRAM. Key conclusions from the case study are:

- Probabilistic methods can be applied without necessarily generating large amounts of additional data.
- There are significant opportunities to develop standard probabilistic scenarios for exposure calculations that will reduce data requirements and increase harmonisation and transparency. Such scenarios offer the best opportunity to incorporate aspects of landscape variability into risk assessment.
- There are established methodologies for expressing uncertainty associated with extrapolating from small datasets.
- Probabilistic methods can be applied at various stages during refinement of the risk assessment. It can be useful to compare probabilistic exposure with deterministic effects or vice versa.
- In this example, there was good consistency between the first-tier deterministic estimate of exposure and a probabilistic refinement. This provides a much-needed validation of the first-tier methodology. Further examples of such comparisons should be generated.
- The case study provided the basis for feedback on approaches to communication from the Environmental Panel and is being used to train end-users and stakeholders within EUFRAM.

The most widely used technique to account for uncertainty and variability in pesticide fate modelling is termed Monte Carlo analysis (multiple calculations are undertaken, each with a different set of possible input parameters; results are collated into a distribution of possible outcomes). A ring test with six relatively experienced modellers showed that there are many subjective choices to be made during MC analysis and that this will yield differences in results from different modellers addressing the same problem. The differences are greatest at the tails of the distribution that are normally the focus during regulatory decision-making. The best way to reduce subjectivity would be to develop standard protocols or rigid modelling tools for MC analysis that would be analogous to the FOCUS procedures and tools for deterministic exposure modelling. More experience of probabilistic methods and applications to regulation are required before such instruments could be put in place. In the interim, it is essential that all methodology is documented and justified in detail to improve transparency and ensure that results can be reproduced if necessary. Guidance on how to report the rationale, methodology and results of a MC analysis has been prepared and incorporated into the EUFRAM framework document for dissemination to end-users at the European level.

A key decision in setting up a probabilistic analysis is whether there is correlation between any of the inputs to the analysis. Pesticide properties are often the most sensitive inputs in exposure assessment. Analysis of the literature demonstrates that there are examples of both positive and negative correlation between the sorption and degradation behaviour of a pesticide, but general trends are not apparent at present. The inclusion of even weak correlation between sorption and degradation was found to greatly influence a probabilistic analysis of leaching through soil. Including a positive correlation reduces the predicted risk of leaching, and it is recommended to set the correlation to zero unless the experimental data support an alternative assumption (i.e. where the correlation is significant with at least 90% probability and experimental artefacts can be excluded).

Choosing which shape of distribution to fit to data for model input parameters can be difficult for small datasets and will influence the results of probabilistic analysis. Eleven distributions were fitted to data for sorption and degradation of pesticides measured in >10 soils. Skewed distributions (e.g. lognormal, Weibull, beta) tend to fit the data best. It is recommended that a lognormal distribution is fitted to small datasets for sorption and degradation unless there is strong evidence to support an alternative (>10 data and a clear difference in goodness-of-fit statistics). Work with a dataset of 49 Koc values for atrazine demonstrated that distributions fitted to small datasets (four or ten data) can deviate strongly from those resulting from a larger number of measurements. The resulting uncertainty will significantly influence results from a probabilistic analysis and it is recommended that this uncertainty be included in the assessment. A method to estimate uncertainty arising from generating distributions from small datasets that has been widely applied for species sensitivity distributions was found to be applicable within exposure assessment. However, the time involved in running mechanistic fate models may prohibit its use for some calculations.

Predicted environmental concentrations will not vary linearly with application rate for pesticides with non-linear sorption (Freundlich exponent, nf ≠ 1). Detailed sensitivity analyses have been produced for a range of scenarios to support decisions on how to modify PECsw and PECgw in response to changes in application rate.

Model error arises from an imperfect mathematical representation of the system being described. It is routinely ignored in deterministic exposure assessment, but this is probably justified given the worst-case assumptions and partial validation of the FOCUS scenarios and models. Model error should not be ignored for assessments outside of the FOCUS framework, particularly where these challenge some of the worst-case assumptions. A methodology based on sampling of an error term from a uniform distribution within 2-D Monte Carlo is proposed to incorporate model error into probabilistic calculations.


Probabilistic methods offer significant advantages if properly applied to risk assessment for pesticides as they can: quantify variability and uncertainty; produce outputs with more ecological meaning; make better use of available data; and promote better science (EUPRA, 2001). The herburon case study clearly demonstrates this potential to improve the rigour and transparency of risk assessment and regulatory decision-making. Nevertheless, the detailed analysis of issues related to probabilistic approaches raises significant obstacles to full implementation. Methods are available to address some of these issues and procedures to standardise the analysis will develop in time. The challenge over the next five to seven years will be to gradually incorporate probabilistic approaches into regulatory procedures in such a way that confidence in the methods is not undermined. It is recommended that this be done as it was with fate modelling submissions during the early 1990s, whereby modelling results were considered alongside experimental data as supporting information rather than as the key endpoint supporting a decision. Probabilistic calculations will need to be considered in parallel with deterministic approaches so that the two can be compared and to allow confidence in the new approaches to develop. Such an approach would also allow time for risk assessors within industry and regulatory bodies to gain experience with the new methods. This body of experience and worked examples of probabilistic calculations applied to real regulatory problems is now the most important requirement for regulatory implementation.

Project Report to Defra

8. As a guide this report should be no longer than 20 sides of A4. This report is to provide Defra with details of the outputs of the research project for internal purposes; to meet the terms of the contract; and to allow Defra to publish details of the outputs to meet Environmental Information Regulation or Freedom of Information obligations. This short report to Defra does not preclude contractors from also seeking to publish a full, formal scientific report/paper in an appropriate scientific or other journal/publication. Indeed, Defra actively encourages such publications as part of the contract terms. The report to Defra should include: the scientific objectives as set out in the contract; the extent to which the objectives set out in the contract have been met; details of methods used and the results obtained, including statistical analysis (if appropriate); a discussion of the results and their reliability; the main implications of the findings; possible future work; and any action resulting from the research (e.g. IP, Knowledge Transfer).

1. INTRODUCTION

Current environmental risk assessments for pesticide registration in Europe rely on a comparison between exposure and ecotoxicological endpoints (surface water) or a legal threshold concentration (groundwater). Mathematical models including PELMO (Klein, 1995; Jene, 1998), PRZM (Carsel et al., 1998), PEARL (Tiktak et al., 2000) and MACRO (Jarvis, 1994) are often used to calculate exposure concentrations. Traditionally, deterministic approaches have been applied where a single combination of model input parameters is used to calculate a single set of predicted environmental concentrations. The parameter combination is often selected so as to be protective of the actual range of use conditions. Such 'realistic worst-case' approaches are useful at the lower tiers of the regulatory assessment process because they are quick to deploy and act as a screening step. However, the likelihood that the predicted risk will occur under real environmental and usage conditions is not assessed and the degree of environmental protection provided by deterministic approaches is largely unknown. There is increasing interest in the use of probabilistic techniques to better quantify the likelihood and magnitude of the risk involved (e.g. ECOFRAM www.epa.gov/oppefed1/ecorisk, EUPRA (2001), EUFRAM, www.eufram.com).

Uncertainty and variability in exposure modelling arises from various sources (Dubus et al., 2003), including the spatial and temporal variability in factors influencing pesticide behaviour (Rao and Wagenet, 1985) and the uncertainty associated with the measurement, calculation or estimation of input parameters (Loague and Green, 1991). Variability is an inherent property of natural systems and cannot be reduced by further measurement. Uncertainty is, crudely, the sum of what we do not know; it includes, for example, sampling bias, measurement error and phenomena which remain unknown and/or unquantified. The most widely used technique to account for uncertainty and variability in pesticide fate modelling is the Monte Carlo approach (Miao et al., 2004; Trevisan and Vischetti, 2005) although other techniques such as fuzzy logic have also been applied (Freissinet et al., 1998).

DEFRA project PL0548 (Dubus et al., 2002) identified and prioritised sources of uncertainty in the assessment of environmental exposure of pesticides, established the relative strengths and weaknesses of available tools, identified significant issues with reproducibility of Monte Carlo analyses, and provided case studies to inform the debate in the UK on probabilistic modelling. The research presented here is an extension to project PL0548.


2. AIM AND OBJECTIVES

The aim of this project was to continue the development of approaches to incorporate variability and uncertainty within pesticide exposure modelling and to assess the confidence that should be assigned to outputs from probabilistic exposure assessments.

The specific objectives were:
- To further investigate the use of Monte Carlo analysis in exposure assessment;
- To investigate approaches to estimate landscape-level variability in pesticide exposure;
- To support the development of environmental risk assessment at national and European levels and to contribute to the generation of case studies.

3. MONTE CARLO MODELLING

The Monte Carlo approach involves running a model many times for a large number of different input values or modelling scenarios followed by a statistical analysis of the model output. A Monte Carlo analysis typically involves six stages (Saltelli et al., 2000): (i) selection of outputs of interest for which the analysis will be carried out, (ii) selection of model input parameters which will be varied during the analysis, (iii) attribution of statistical distributions and ranges to these parameters and specification of dependencies and/or correlations, (iv) the generation of samples from the distributions (Monte Carlo sampling), (v) running the model for each of the sampled values, and (vi) examination of the output (e.g. calculation of percentiles and construction of cumulative distribution charts). A Monte Carlo analysis allows the user to evaluate the uncertainty in model predictions arising from uncertainty in the input parameters (e.g. the likelihood of obtaining a particular result) and to estimate the confidence that should be assigned to modelling results.
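To make the six stages concrete, the following minimal Python sketch illustrates the workflow with a toy screening-level leaching index standing in for a full fate model such as PELMO. The distribution parameters, the index itself and the threshold are illustrative assumptions, not values used in this project.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000  # stage (iv): number of Monte Carlo samples

# Stages (ii)-(iv): assign lognormal distributions to DT50 and Koc and sample them.
# Geometric means and spreads below are illustrative only.
dt50 = rng.lognormal(mean=np.log(15.0), sigma=0.5, size=n)  # days
koc = rng.lognormal(mean=np.log(80.0), sigma=0.6, size=n)   # L/kg

# Stage (v): run the model for each sampled parameter set. A crude leaching
# index replaces the mechanistic model: it increases for slower degradation
# (long DT50) and weaker sorption (small Koc).
def leach_index(dt50, koc, rate_kg_ha=1.0):
    return rate_kg_ha * dt50 / (1.0 + koc)

output = leach_index(dt50, koc)

# Stage (vi): examine the output distribution.
for p in (50, 80, 90, 95, 99):
    print(f"{p}th percentile: {np.percentile(output, p):.3g}")
print("Probability of exceeding an (arbitrary) threshold of 0.1:",
      np.mean(output > 0.1))
```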

Dubus et al. (2002) investigated the merits and shortcomings of Monte Carlo modelling for predicting pesticide exposure from a conceptual, technical and registration point of view. The authors identified significant issues with reproducibility of Monte Carlo analyses. The present research evaluated how Monte Carlo approaches could be standardised in order to improve the transparency and reproducibility of probabilistic modelling studies submitted for regulatory purposes. Implications of subjective assumptions made for the outcome of the analysis are investigated. Guidance is given on how to handle correlation between model input parameters and on the reporting of Monte Carlo analyses of pesticide exposure.

3.1 Ring test to evaluate user-subjectivity in Monte Carlo sampling

Monte Carlo analysis requires a number of key decisions by the modeller. Many of these will be based on previous experience and are thus subjective. Different modellers might therefore be expected to generate different results for a given problem. A ring test was undertaken to evaluate whether or not user-subjectivity is a significant issue. The study investigated the influence of decisions on the type of statistical distribution (e.g. normal, lognormal) assigned to variable or uncertain model input parameters, the upper and lower limits of the distribution within which samples are taken (truncation), correlation between parameters, the tool and method used for sampling and the number of samples generated. Six pesticide fate modellers with experience in the implementation of Monte Carlo approaches were provided with data on degradation and sorption of metamitron measured in 18 soils by Allen and Walker (1987). Each modeller was asked to analyse the data, assign statistical distributions and sample values from these distributions. A leaching assessment was then carried out using a harmonised scenario and modelling protocol for each of the six sets of sampled degradation and sorption data. The six modellers were (in alphabetical order) Sabine Beulke (CSL, UK), Igor Dubus (BRGM, France), Hector Galicia (Springborn Smithers Laboratories, Switzerland), Nicholas Jarvis (SLU, Sweden), Dieter Schaefer (Bayer CropScience, Germany) and Marco Trevisan (Università Cattolica del Sacro Cuore, Italy).

3.1.1 Material and Methods

Each participant received data on degradation and sorption of metamitron measured in laboratory studies with 18 soils with a range of properties (Allen and Walker, 1987). The soils were incubated at 20 °C and a moisture content equivalent to a tension of -33 kPa. The authors of the original paper derived first-order degradation rate constants (k) and Freundlich sorption coefficients from their experimental results. DT50 values (50% disappearance time = time at which half the initial amount of the pesticide has degraded) were calculated for the present study from degradation rate constants as DT50 = ln(2)/k. Sorption coefficients normalised to organic carbon content (Koc values) were calculated from the Freundlich coefficients (Kf values) and organic carbon contents of the soils (OC, in %) as Koc = Kf / OC * 100. Information on the Freundlich exponent was not available and the commonly used default value of 0.9 was used for this study. The variability in the laboratory values provided to the modellers originated from the use of soils with different texture, pH, organic carbon contents and microbial activity (Allen and Walker, 1987). Although the modellers were not asked to consult the original paper, there were no explicit restrictions and one modeller chose to account for relationships between pesticide and soil properties in the sampling of modelling input data.
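For reference, the two conversions described above can be written as a short helper; the example values in the final lines are hypothetical and only demonstrate the arithmetic.

```python
import numpy as np

def dt50_from_k(k):
    """First-order half-life (days) from a degradation rate constant k (1/day): DT50 = ln(2)/k."""
    return np.log(2) / k

def koc_from_kf(kf, oc_percent):
    """Sorption coefficient normalised to organic carbon: Koc = Kf / OC * 100 (OC in % of soil mass)."""
    return kf / oc_percent * 100.0

# Hypothetical example: k = 0.05 per day, Kf = 1.2 L/kg, OC = 1.5 %
print(dt50_from_k(0.05))      # about 13.9 days
print(koc_from_kf(1.2, 1.5))  # 80 L/kg
```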


Each modeller was asked to (i) analyse the data on sorption and degradation of metamitron and assign a statistical distribution to the data, (ii) sample sorption and degradation data from the distributions, and (iii) where necessary convert these data to Koc values and degradation rate constants for direct use as input values for the leaching assessment. No further instructions were given on the assumptions, methods or tools to be used in the study. A leaching assessment was then carried out for each of the six sets of input data using a harmonised protocol. Leaching to depth was simulated using the leaching model PELMO 3.00 SP 2 (Klein, 1995; Jene, 1998) and a modelling scenario that is commonly used within pesticide registration (‘Borstel’ soil, ‘Hamburg normal’ weather data, single weather year repeated to give a 20-year simulation period).

3.1.2 Results and discussionAll modellers analysed the data on metamitron sorption and degradation for correlations and selected statistical distributions. Correlation coefficients were positive between DT50 and Kd/Koc (r = 0.20 to 0.26; P = 0.29 to 0.42) and negative between k and Kd/Koc (r = -0.35 to -0.38; P = 0.12 to 0.16). The correlations were not significant at the 10% level. Most modellers evaluated the fit of different distributions and retained either the normal or lognormal distribution. The histograms for Koc values and degradation rate constants were clearly skewed to the left and a lognormal distribution gave a better fit to the data than a normal distribution. The DT50 values for metamitron were described slightly better by a lognormal distribution.
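The correlation screening described above can be reproduced with standard statistical software. The sketch below uses scipy's Pearson correlation on invented DT50/Koc pairs (the Allen and Walker data are not reproduced here) and applies the significance cut-off recommended later in this report.

```python
import numpy as np
from scipy import stats

# Illustrative paired DT50 (days) and Koc (L/kg) values standing in for
# laboratory measurements in a set of soils; the numbers are invented.
dt50 = np.array([9.5, 12.1, 14.8, 10.2, 18.3, 22.6, 11.4, 13.9, 16.5, 20.1])
koc = np.array([55.0, 92.0, 130.0, 48.0, 160.0, 75.0, 110.0, 60.0, 140.0, 85.0])

r, p = stats.pearsonr(dt50, koc)
print(f"Pearson r = {r:.2f}, p = {p:.2f}")

# Decision rule recommended in Section 3.2.3: only impose the correlation in
# the sampling if it is significant with at least 90% probability.
print("include correlation in sampling:", p < 0.10)
```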

Sampling of degradation and sorption parameters

Degradation and sorption parameters were sampled from the selected distributions. Table 1 summarizes the subjective choices made by the six modellers. Figure 1 shows scatter plots for sampled sorption (Koc) and degradation (DT50) data for each of the six modellers.

Table 1. Subjective assumptions made by the six modellers in sampling degradation and sorption parameters

Assumption | Modeller 1 | Modeller 2 | Modeller 3 | Modeller 4 | Modeller 5 | Modeller 6
Variables considered | Koc, k | Koc, DT50 | Koc, k | Koc, k | Koc, DT50 | Koc, DT50
Distributions assigned | lognormal | lognormal | lognormal | triangular | Koc: lognormal; %clay: triangular; DT50: from regression | normal
Selection criterion / goodness-of-fit statistic | Kolmogorov-Smirnov | Anderson-Darling | Shapiro-Wilk | statistical criteria, expert judgement | Anderson-Darling | Shapiro-Wilk
Original paper consulted | no | no | no | no | yes | no
Correlation | no | no | yes | no | no | no
Truncation | no | yes | yes | N/A | yes | no
Sample size | 1000 | 5000 | 150 | 5000 | 1000 | 400
Software used | MS Excel | Crystal Ball | UNCSAM | @RISK | Crystal Ball | MS Excel
Sampling method | Monte Carlo | Latin Hypercube | Latin Hypercube | Latin Hypercube | Latin Hypercube | Monte Carlo

Figure 1. Scatter plots of Koc values vs. DT50 values sampled by the six modellers

[Six scatter-plot panels, one per modeller; x-axis: DT50 (days), 0–100; y-axis: Koc (L/kg), 0–800]

Probabilistic leaching assessment

Figure 2 shows cumulative frequency curves for maximum annual average concentrations in leachate simulated with PELMO using the six sets of sorption and degradation input parameters. Selected statistics, percentiles and the probability of exceeding the European regulatory threshold concentration of 0.1 μg L-1 are given in Table 2.

Figure 2. Cumulative frequency distributions of simulated maximum annual average concentrations in leachate at 1-m depth generated from input data sampled by the modellers (only 90th percentile to maximum shown)

[Cumulative frequency curves for Modellers 1–6; x-axis: concentration in leachate in year 20 (µg/L), 0–150; y-axis: frequency (%), 90–100]

Table 2. Percentiles of the cumulative frequency distributions for simulated maximum annual average concentrations in leachate at 1-m depth and probabilities of exceeding 0.1 μg L-1 generated from input data sampled by the modellers

Statistic | Modeller 1 | Modeller 2 | Modeller 3 | Modeller 4 | Modeller 5 | Modeller 6
50th percentile | 0.004 | 0.004 | 0.003 | <0.001 | <0.001 | 0.002
80th percentile | 1.2 | 0.88 | 0.39 | 0.009 | <0.001 | 0.990
90th percentile | 7.0 | 5.6 | 2.0 | 0.25 | <0.001 | 8.4
95th percentile | 22.8 | 16.5 | 7.8 | 1.7 | <0.001 | 48.9
99th percentile | 87.8 | 57.2 | 12.7 | 21.7 | <0.001 | 214.2
Exceedance probability (%)¹ | 35.7 | 33.2 | 27.0 | 12.8 | 0.0 | 29.4

¹ Probability that the concentration in leachate exceeds 0.1 μg L-1

Simulated concentrations in leachate and the probability of exceeding 0.1 μg L-1 differed considerably between the modellers (Figure 2, Table 2). The discrepancies were largest at the upper percentiles. Simulated concentrations based on data provided by modeller 5 were much smaller than for the remaining five sets of input data and did not exceed 0.001 μg L-1 for any of the 1000 model runs. Modeller 5 sampled clay contents of the Borstel soil and Koc values for metamitron and calculated DT50 values for each run from these two variables using a regression equation. Since the error term from the regression was ignored, the resulting range of DT50 values (8.6-18.5 days) was much narrower than the ranges of values derived by the other modellers from sampled distributions.
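The effect of ignoring the residual error of such a regression can be illustrated with a small sketch; the regression coefficients, residual standard error and input distributions below are entirely hypothetical and are not those used by modeller 5.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical regression DT50 = a + b*clay + c*Koc with residual standard error resid_sd.
a, b, c, resid_sd = 5.0, 0.15, 0.05, 4.0

clay = rng.triangular(2.0, 8.0, 25.0, size=n)   # % clay, illustrative distribution
koc = rng.lognormal(np.log(80.0), 0.4, size=n)  # L/kg, illustrative distribution

dt50_no_error = a + b * clay + c * koc                           # residual error ignored
dt50_with_error = dt50_no_error + rng.normal(0.0, resid_sd, n)   # residual error included

# Ignoring the residual error narrows the spread of the derived DT50 values.
print("DT50 range, error ignored: ", dt50_no_error.min().round(1), "-", dt50_no_error.max().round(1))
print("DT50 range, error included:", dt50_with_error.min().round(1), "-", dt50_with_error.max().round(1))
```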

Concentrations at the upper percentiles calculated using the data provided by modeller 4 were smaller than those for modellers 1, 2 and 6 and were also smaller than those for modeller 3 except for the upper 2% of values (Figure 2, Table 2). This is because the distributions of parameters sampled by modeller 4 did not fit the data well and were shifted towards shorter DT50 values and larger Koc values.


Modeller 3 was the only participant who included a correlation between the degradation and sorption parameters. As a result, combinations of small Koc values and long DT50 values are sampled less frequently by modeller 3 than by modellers 1, 2 and 6 (Figure 1). The model simulates greater potential for the pesticide to leach to depth for these extreme combinations. Concentrations at the upper percentiles of the cumulative frequency curve were, thus, larger when the parameters were sampled from uncorrelated distributions (Figure 2, Table 2). The 50th percentile concentration in leachate and the probability of exceeding 0.1 μg L-1 were similar with or without correlation. The correlation between Koc and the degradation rate coefficient was only significant at p=0.11, and might therefore be justifiably ignored. However, even weak correlations that are not significant at confidence levels normally used in statistics may have important effects on predictions from leaching models, and it is not clear that such correlations should be ignored. This illustrates one important subjective choice.

The 95th and 99th percentile concentration in leachate calculated from the data provided by modeller 6 were much larger than those calculated for the remaining five sets of input parameters. This is because the parameters were sampled from untruncated normal distributions which generated a number of very small Koc values (Figure 1). Truncation can be very useful to avoid sampling of unrealistic values. The upper percentiles of the output distribution are influenced strongly by truncation of the input distributions. This is another important subjective choice in Monte Carlo modelling.

Modellers 1 and 2 sampled Koc values and DT50 values from almost identical lognormal distributions. This resulted in very similar combinations of sampled Koc and DT50 values. However, modeller 1 sampled from untruncated distributions whereas modeller 2 used the 0.5th and 99.5th percentiles as cut-off values. A larger number of small Koc values in combination with long DT50 values were, thus, sampled by modeller 1, giving somewhat larger simulated concentrations at the upper percentiles (Figure 1, 2).

3.1.3 Conclusions

This study investigated the influence of user-subjectivity in Monte Carlo sampling on the results of probabilistic pesticide exposure modelling. Six modellers made subjective decisions on the type of statistical distribution (e.g. normal, lognormal, triangular) assigned to degradation and sorption data for an example pesticide, the upper and lower limits of the distribution within which samples are taken (truncation), correlation between parameters, the tool and method used for sampling (random vs. Latin Hypercube sampling) and the number of samples generated. The main findings are:
- User-subjectivity can significantly influence the results of probabilistic exposure assessments. The resulting uncertainty in results is greatest at the extremes of the output distributions. All modellers had already undertaken Monte Carlo modelling in the past and the results presented above are likely to reflect subjectivity under actual conditions of use.
- Standard protocols for Monte Carlo analyses would help to reduce user-subjectivity. However, more experience with this technique is required before detailed recommendations can be made.
- Until such protocols become available, it is very important to clearly document the methodology used to improve transparency and the reproducibility of the results. Draft guidance on how to report probabilistic exposure calculations has been prepared within this project (see overview in Section 3.3) and will be included within a framework document on probabilistic approaches that is being prepared by EUFRAM.

3.2 Parameter correlation in Monte Carlo sampling

Monte Carlo modelling involves sampling a large number of values for each variable or uncertain parameter from a distribution. Often, values for one parameter are sampled independently of the values for another parameter such that any combination is possible. However, many model parameters are inter-related and certain parameter combinations may be unrealistic. This can be accounted for in the sampling by (i) specifying mathematical relationships between parameters (i.e. only one parameter is sampled and the second parameter is calculated as a function of the first) or (ii) specifying correlations. If large values of one parameter are associated with large values of another parameter, the parameters are positively correlated and a correlation coefficient can be calculated with standard statistical tools. Negative correlation coefficients imply that large values of one parameter are associated with small values of another. A correlation coefficient of 0 indicates no correlation, 1 perfect positive correlation, and -1 perfect negative correlation. Values between 0 and 1 or between 0 and -1 indicate that there is a relationship, but there is scatter in the data and the correlation is not perfect. The closer the correlation coefficient is to zero, the weaker the correlation.

Positive or negative correlations can be included in Monte Carlo or Latin Hypercube sampling. This will restrict the sampling to certain combinations. A decision on whether or not to include a correlation between two parameters is likely to influence the results of a probabilistic calculation. Fewer values are sampled from the top-left corner of Figure 3 (long DT50 values in combination with small Koc values) when correlation is included. These combinations lead to an increased leaching risk.

Figure 3: Combinations of DT50 values and Koc values sampled with Crystal Ball assuming no correlation (left) and a correlation coefficient of 0.7 (right). The risk for leaching increases from the bottom right (faster degradation and stronger sorption) to the top left (slower degradation and weaker sorption).

[Two scatter-plot panels: left r = 0, right r = 0.7; x-axis: Koc (L/kg), 0–100; y-axis: DT50 (days), 0–40; leaching risk increases from bottom right to top left]
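A minimal sketch of one way to impose such a correlation when sampling (a Gaussian copula over lognormal marginals) is shown below; the marginal parameters mirror the axes of Figure 3 but are otherwise illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 5000
r = 0.7  # correlation to impose between DT50 and Koc (as in the right-hand panel)

# Draw correlated standard normal pairs, convert to correlated uniforms (Gaussian copula).
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, r], [r, 1.0]], size=n)
u = stats.norm.cdf(z)

# Transform the uniforms through the inverse CDFs of the chosen marginals.
# The lognormal parameters below are illustrative, not fitted to any dataset.
dt50 = stats.lognorm.ppf(u[:, 0], s=0.4, scale=15.0)  # days
koc = stats.lognorm.ppf(u[:, 1], s=0.5, scale=60.0)   # L/kg

rho, _ = stats.spearmanr(dt50, koc)
print(f"rank correlation of the sampled pairs: {rho:.2f}")
```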

3.2.1 Literature review on the relationship between degradation and sorption

Pesticide molecules are often sorbed to the walls of small pores inside soil aggregates or organic macromolecules that are inaccessible to microorganisms. Thus, it is often assumed that degradation is limited to pesticide available in the soil solution and that sorbed pesticide is protected from degradation. A number of studies support this assumption (Anderson, 1981; Ogram et al., 1985; Guo et al., 1999).

Published studies where degradation and sorption were measured on the same soil are numerous. However, correlations can only be calculated where a significant number of soils was tested and very few datasets meet these criteria. Table 3 summarises studies where measurements were available for at least nine soils.

Table 3. Correlations between degradation and sorption derived from literature data

Ionic character | Compound | Type of compound | n | Correlation a (DT50, Kd) | Correlation a (DT50, Koc) | Reference
Weak acid | Mesotrione | Triketone herbicide | 15 | 0.75 ** | 0.81 *** | Dyson et al. (2002)
Weak acid | Flumetsulam | Sulfonanilide herbicide | 21 | 0.92 *** | 0.61 ** | Lehmann et al. (1992)
Weak base | Metribuzin b | Triazinone herbicide | 18 | 0.73 *** | 0.62 ** | Allen & Walker (1987)
Weak base | Simazine | Triazine herbicide | 18 | -0.11 c | 0.11 | Walker & Thompson (1977)
Neutral | Metazachlor | Anilide herbicide | 10 | 0.90 *** | 0.86 ** | Beulke (1998)
Neutral | Metazachlor | Anilide herbicide | 18 | 0.03 | 0.33 | Allen & Walker (1987)
Neutral | Isoproturon | Uron herbicide | 108 | -0.03 | 0.09 | Price (2003)
Neutral | Isoproturon | Uron herbicide | 20 | 0.41 | 0.69 *** | Walker et al. (2001)
Neutral | Linuron | Uron herbicide | 18 | -0.77 c *** | -0.16 c | Walker & Thompson (1977)
Neutral | Metamitron | Triazinone herbicide | 18 | 0.20 | 0.26 | Allen & Walker (1987)
Neutral | Propyzamide | Amide herbicide | 18 | -0.47 c * | -0.64 c ** | Walker & Thompson (1977)
Neutral | Napropamide | Amide herbicide | 9 | 0.67 * | 0.24 | Walker et al. (1985)

a *, **, *** significant at p < 0.001, 0.01 and 0.05, respectively
b derived from degradation rate constants per unit microbial activity
c negative correlation between degradation and sorption probably due to increased microbial activity in soils with larger sorption capacity

A relationship between degradation and sorption is sometimes found for ionisable compounds (Table 3). Weak acids (sulfonylureas, phenols) or other ionisable compounds (e.g. imidazolinone herbicides) exist predominantly as undissociated molecules at low pH and in anionic form at pH values greater than their pKa. The neutral form is more strongly sorbed than the anion because it is not repelled by negative charges on soil constituents. As a result, sorption of weak acids and other ionisable compounds decreases with increasing pH. For example, this was found for 2,4-D, imazaquin, imazethapyr and sulfonylurea herbicides (Wehtje et al., 1987; Renner et al., 1988; Stougaard et al., 1990; Loux and Reese, 1992; Johnson et al., 1995). Degradation is also influenced by pH. Soil acidity may influence the degradation of a pesticide directly if its stability is pH-dependent or indirectly via changes in pesticide sorption and soil microbial biomass or activity. If degradation is influenced indirectly by pH, it tends to proceed faster at high pH. A positive correlation between sorption coefficients and degradation half-lives is thus expected when degradation and sorption of ionisable compounds are investigated in soils with a range of pH values.

Although positive relationships between DT50 values and sorption coefficients are sometimes found, degradation of sorbed chemicals cannot be excluded (Guerin & Boyd, 1993; Guo et al., 2000). Degradation of sorbed molecules by microorganisms attached to solid surfaces may occur for some compounds. Microorganisms are generally more abundant at or near soil particle surfaces and sorption may thus concentrate the pesticide in regions of greatest microbial activity, thereby facilitating degradation. Degradation mediated by chemical reactions can be accelerated by enhanced adsorption due to catalytic effects of solid surfaces.

Conflicting results are sometimes found for the same compound. Samples taken from different sites within a single field were treated with isoproturon and incubated at standard conditions in the laboratory by Walker et al. (2001) and Price (2003) to determine degradation. Adsorption of the pesticide was measured for each sample. Walker et al. (2001) found a positive correlation between DT50 values and Koc values of isoproturon (r = 0.69) whereas there was no relationship between degradation and sorption data measured by Price (2003).

3.2.2 Implications of parameter correlation for probabilistic modelling of pesticide exposure

The use of independently sampled degradation and sorption parameters (i.e. no correlation) in probabilistic risk assessments generally results in larger concentrations at the upper percentiles of the exposure distribution than the sampling of positively correlated DT50 and Koc values. This is due to the increased likelihood of sampling extreme combinations (i.e. slow degradation and weak sorption). Table 4 shows modelling results based on the methodology and scenario used in the ring test on user-subjectivity (Section 3.1). Note that the results presented here are based on an application rate of 0.1 kg ha-1 whereas a rate of 3.0 kg ha-1 was used in the ring test. The results presented are maximum annual average concentrations in leachate calculated following Latin Hypercube sampling of 5000 DT50 values and Koc values without any correlation (r = 0) and with positive correlation (r = 0.26). A larger number of long DT50 values in combination with small Koc values (top-left corner of the scatterplot shown in Figure 4) was sampled when degradation and sorption were assumed to be uncorrelated. This resulted in a larger simulated risk of leaching to depth (Table 4).


Figure 4: Combinations of DT50 values and Koc values of metamitron sampled with and without correlation

[Two scatter-plot panels: left r = 0, right r = 0.26; x-axis: Koc (L/kg), 0–800; y-axis: DT50 (days), 0–100]

Table 4: Maximum annual average concentrations of metamitron in leachate on the basis of DT50 values and Koc values sampled with and without correlation (for details see Section 3.1)

Statistic | no correlation (r = 0) | positive correlation (r = 0.26)
90th percentile concentration (μg L-1) | 0.022 | 0.007
95th percentile concentration (μg L-1) | 0.113 | 0.037
99th percentile concentration (μg L-1) | 0.679 | 0.284
% runs exceeding 0.1 μg L-1 | 5.4 | 2.4

3.2.3 Conclusions

- It is impossible to draw general conclusions about correlation between degradation and sorption as the number of studies is limited.
- A positive relationship is found for some compounds, but the strength of correlation is influenced by many co-variables. Negative correlations are probably due to an increased microbial activity in soils with larger sorption capacity.
- The predicted risk of leaching to depth is generally smaller where a positive correlation between DT50 values and sorption parameters is included in Monte Carlo sampling. It is recommended to set the correlation to zero unless the experimental data support an alternative assumption (i.e. where the correlation is significant with at least 90% probability and experimental artefacts can be excluded).

3.3 Guidance for reporting Monte Carlo modelling of pesticide exposure

To ensure transparency and reproducibility of probabilistic assessments, each step should be reported in detail. A brief overview of the main components is given below. Full guidance has been passed to EUFRAM for inclusion in a framework document for probabilistic risk assessment at the European level. It forms part of a document that has been disseminated to end-users for comment.

Introduction: The objective of the probabilistic exposure assessment should be defined. A summary should be provided of previous deterministic modelling activity and the regulatory context of the exposure assessment (e.g. FOCUS exposure assessment at the European level)

Justification for the selection of a specific exposure model: Often, a number of different models are available to calculate exposure concentrations in the environmental compartment of interest. The selection of a particular model can influence the result of a probabilistic modelling exercise. Simplifications in the model descriptions introduce further uncertainty into the exposure assessment and the suitability of the selected model for the situation at hand should be evaluated.

Model parameterisation: Details of the initial parameterisation and modelling assumptions should be given.

Sensitivity analysis for the model: A sensitivity analysis should be performed wherever possible to guide the selection of parameters to be included in the probabilistic assessment. Only parameters that have a significant influence on the model output should be included. Other parameters can be kept at their initial values. A sensitivity analysis specific to the modelled scenario is preferred over a generalised analysis as the sensitivity of one model parameter depends on the settings of other parameters.
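As an illustration of the kind of screening that can guide this selection, the sketch below varies each input of a toy leaching index one at a time around a baseline and reports a relative sensitivity; the index, baseline values and ±10% perturbation are assumptions for demonstration, not part of the reported methodology.

```python
# One-at-a-time sensitivity sketch around a baseline parameter set.
def leach_index(dt50, koc, rate_kg_ha=1.0):
    # Toy screening index standing in for a mechanistic fate model.
    return rate_kg_ha * dt50 / (1.0 + koc)

baseline = {"dt50": 15.0, "koc": 80.0, "rate_kg_ha": 1.0}
base_out = leach_index(**baseline)

for name, value in baseline.items():
    for factor in (0.9, 1.1):  # perturb each input by -10% and +10%
        out = leach_index(**{**baseline, name: value * factor})
        sensitivity = ((out - base_out) / base_out) / (factor - 1.0)
        print(f"{name:11s} x{factor:.1f}: relative sensitivity = {sensitivity:+.2f}")
```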


Justification of the inputs selected for probabilistic analysis: Which inputs were varied and why? What type of variability and/or uncertainty is reflected (e.g. spatial variability of properties within an agricultural field; uncertainty introduced by differences in experimental and analytical conditions in the laboratory)?

Definition and sampling of distributions (e.g. for Monte Carlo analysis): Probabilistic exposure assessment involves the analysis of available data for the parameter of interest (e.g. degradation rate). In Monte Carlo analysis, a statistical distribution is assigned to these data. A large number of values is sampled from the distribution and used as input data for modelling. The methodology selected by individual modellers can strongly influence the result and each step should be clearly documented.

- Type and parameterisation of the statistical distribution and details of how these were derived
- Details of the methodology used to sample from the distributions (e.g. software used, number of values sampled, correlations)

Method used to run the model: Simulation models provide a range of output variables. These include concentrations of the pesticide in different compartments (e.g. soil, water), at different locations within each compartment, and at different times within the simulation period. A large number of simulations are carried out using the sampled values for each parameter as model input and the selected output is analysed for each run. Sometimes, the output provided by the model needs to be processed, for example to calculate averages or maxima over selected periods. The handling of model input and output and running the model can be time-consuming and automated methods are available to facilitate the procedure. The methodology used for this purpose should be documented.

Evaluation of model output: Probability distributions of the output are generated. From these distributions, concentrations at selected percentiles or the probability of exceeding a certain threshold concentration can be derived. The results of a probabilistic assessment may depend on the assumptions made during the analysis and settings in the tools used for Monte Carlo sampling. Where possible, the repeatability of the assessment should be evaluated.
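A sketch of such a summary step is given below: percentiles and an exceedance probability are computed from the processed model output, and the whole chain is repeated with a second random seed as a simple repeatability check. The lognormal output used here is a stand-in for real model results.

```python
import numpy as np

def summarise(conc, percentiles=(50, 80, 90, 95, 99), threshold=0.1):
    """Percentiles and exceedance probability of a set of simulated concentrations."""
    summary = {f"p{p}": float(np.percentile(conc, p)) for p in percentiles}
    summary["P(>threshold)"] = float(np.mean(np.asarray(conc) > threshold))
    return summary

# Repeat the analysis with two seeds and compare; large differences at the upper
# percentiles suggest that more Monte Carlo samples are needed.
for seed in (1, 2):
    rng = np.random.default_rng(seed)
    conc = rng.lognormal(mean=np.log(0.01), sigma=2.0, size=5000)  # stand-in output
    print(seed, {k: round(v, 3) for k, v in summarise(conc).items()})
```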

Discussion: The discussion should include a critical evaluation of the probabilistic risk assessment, its representativeness and its limitations. Unquantified sources of variability and uncertainty (e.g. those associated with model inputs that were varied or held constant in the assessment, model error) should be listed and a qualitative evaluation of how they might affect the assessment outcome should be made.

3.3.1 Conclusion

Draft guidance on how to report a probabilistic exposure calculation has been prepared. The guidance is currently undergoing peer review and will form part of a European framework document for probabilistic risk assessment that will be the main deliverable from the EUFRAM project.

4. PARAMETER UNCERTAINTY

4.1 Default distributions for degradation and sorption parameters

Monte Carlo modelling involves assigning a statistical distribution to each uncertain and/or variable parameter. Different types of distributions exist (see Figure 5 below) and the type of distribution will influence the results of the modelling. Software programmes are available that fit different distributions to the measurements for key model input data (such as degradation and sorption properties of the compound) and compare the goodness of fit. However, the number of measurements is often too small (often less than 10) to allow a robust identification of the underlying distribution and its parameters. In this case, the selection and parameterisation of the statistical distribution cannot be based on the experimental data alone and some generic guidance is needed. The objectives of this study were to:
- Identify default distributions that are most appropriate for DT50 values, degradation rate constants, sorption coefficients (Kd or Kf values) and Koc values;
- Give recommendations on how to parameterise these distributions based on a small number of datapoints;
- Develop procedures that explicitly take into account the uncertainty in parameterising distributions based on a small number of samples and visualise the resulting uncertainty in the model outcome.

Some of the distributions included in Crystal Ball are shown in Figure 5. The probability is plotted against the values for the variable of interest (e.g. the DT50 value). For each value on the x axis, its likelihood of occurring can be read from the y axis. The area under the curve is equivalent to the total cumulative probability of 1. There are two basic shapes of distributions: the data are either symmetrically distributed around a mean or skewed to one side. For example, the normal distribution is symmetrical (i.e. an equal number of data is smaller and larger than the mean) whereas the lognormal distribution is skewed to the left (a larger number of values is smaller than the mean). Some distributions can be either skewed to the left or the right (e.g. beta distribution). The parameters of normal and lognormal distributions are easy to estimate from experimental data and they have been used frequently in probabilistic modelling. It is, however, not clear how well these distributions describe degradation and sorption data in general and whether one of the two distributions is clearly more appropriate than the other. Alternative distributions are to date rarely used in probabilistic modelling studies for pesticides.


Figure 5. Types of statistical distributions typically assigned to uncertain or variable model input parameters

Published studies where degradation and/or sorption were measured in at least eleven different soils were evaluated to identify the type of distribution that best described these data. Eleven statistical distributions were fitted to the data with Crystal Ball (beta, gamma, logistic, extreme value, normal, lognormal, triangular, uniform, exponential, Weibull and Pareto). The distributions were ranked according to their goodness of fit based on the Anderson-Darling index. The analysis was made for DT50 values, degradation rate constants (k), Kd or Kf values and Koc values. Table 5 and Table 6 give the distributions that fitted the data best; they also show whether a lognormal or normal distribution fitted the data better.

Table 5. Results of fitting 11 statistical distributions to published experimental DT50 values or degradation rate constants (k)

Compound | n | Best distribution (DT50) | Lognormal better than normal? (DT50) | Best distribution (k) | Lognormal better than normal? (k) | Reference
Mesotrione | 15 | Lognormal | Yes | Lognormal | Yes | Dyson et al. (2002)
Metazachlor | 18 | Weibull | Yes | Beta | No | Allen & Walker (1987)
Metamitron | 18 | Extreme value | Yes | Weibull | Yes | Allen & Walker (1987)
Metribuzin¹ | 18 | Beta | Yes | Lognormal | Yes | Allen & Walker (1987)
Flumetsulam | 21 | Weibull | Yes | Lognormal | Yes | Lehmann et al. (1992)
Simazine | 18 | Logistic | No | Extreme value | Yes | Walker & Thompson (1977)
Linuron | 18 | Beta | No | Pareto | Yes | Walker & Thompson (1977)
Propyzamide | 18 | Lognormal | Yes | Lognormal | Yes | Walker & Thompson (1977)
Isoproturon | 20 | Beta | No | n/a | n/a | Walker et al. (2001)
Isoproturon | 80² | Extreme value | Yes | Weibull | Yes | Price (2003)
Isoproturon | 28² | Gamma | No | Beta | Yes | Price (2003)

¹ derived from degradation rate constants per unit microbial activity
² degradation data fell into two distinct groups, distributions fitted to each subset separately


Table 6. Results of fitting 11 statistical distributions to published experimental sorption data

Compound | n | Best distribution (Kd) | Lognormal better than normal? (Kd) | Best distribution (Koc) | Lognormal better than normal? (Koc) | Reference
Atrazine | 49 | Lognormal | Yes | Weibull | No | Giddings et al. (2005)
Atrazine | 12 | Logistic | No | Weibull | Yes | Gottesbüren (1991)
Chlorotoluron | 12 | Logistic | No | Weibull | No | Gottesbüren (1991)
Ethofumesate | 11 | Lognormal | Yes | Weibull | Yes | Gottesbüren (1991)
Methabenzthiazuron | 15 | Gamma | Yes | Extreme value | Yes | Gottesbüren (1991)
Mesotrione | 15 | Lognormal | Yes | Lognormal | Yes | Dyson et al. (2002)
Metazachlor | 18 | Beta | No | Weibull | No | Allen & Walker (1987)
Metamitron | 18 | Lognormal | Yes | Weibull | Yes | Allen & Walker (1987)
Metribuzin | 18 | Weibull | Yes | Gamma | Yes | Allen & Walker (1987)
Flumetsulam | 21 | Beta | No | Lognormal | Yes | Lehmann et al. (1992)
Simazine | 18 | Weibull | Yes | Weibull | Yes | Walker & Thompson (1977)
Linuron | 18 | Gamma | Yes | Weibull | Yes | Walker & Thompson (1977)
Propyzamide | 18 | Lognormal | Yes | Logistic | No | Walker & Thompson (1977)
Isoproturon | 20 | Triangular | Yes | Logistic | No | Walker et al. (2001)
Isoproturon | 108 | Logistic | No | Logistic | No | Price (2003)

The distribution of DT50 values was described better by a lognormal than by a normal distribution in seven out of eleven cases (64%). No single distribution consistently fitted all the datasets best (Table 5). The distribution of degradation rate constants was described better by a lognormal than by a normal distribution in all except one of the ten studies. In four cases, the lognormal distribution described the data best. It is recommended that the lognormal distribution be used as a default where fewer than ten measurements are available. In all other cases, the type of distribution should be selected based on a visual and statistical assessment of the goodness of fit to the experimental data. Alternative distributions should be used where they clearly match the data better than the lognormal distribution.

Kd and Koc values from 15 studies were evaluated. The lognormal distribution described Kd values better than the normal distribution in 67% of the cases. Koc values were fitted better by the lognormal distribution in 60% of the cases. No distribution consistently fitted all the data best. Wherever possible, the type of distribution should be selected after fitting a number of distributions to the data. If the number of datapoints is small (< 10), a lognormal distribution should be used.
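For illustration, the procedure of fitting several candidate distributions and ranking them by goodness of fit can be sketched in a few lines of Python. Crystal Ball was used for the actual analysis; the DT50 values, the subset of distributions and the helper function below are illustrative only, and the Anderson-Darling statistic is used purely to rank the fits:

import numpy as np
from scipy import stats

# Hypothetical DT50 values (days) measured in different soils
dt50 = np.array([12.0, 18.5, 22.0, 28.0, 31.0, 35.5, 41.0, 47.0, 55.0, 63.0, 80.0])

candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
    "Weibull": stats.weibull_min,
    "logistic": stats.logistic,
}

def anderson_darling(data, dist, params):
    # Anderson-Darling statistic against the fitted CDF; used here only to rank
    # the fits (critical values change when parameters are estimated from the data)
    x = np.sort(data)
    n = x.size
    cdf = np.clip(dist.cdf(x, *params), 1e-12, 1.0 - 1e-12)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(cdf) + np.log(1.0 - cdf[::-1])))

ranking = sorted((anderson_darling(dt50, d, d.fit(dt50)), name)
                 for name, d in candidates.items())
for ad, name in ranking:  # smallest statistic = best fit
    print(f"{name:10s}  A-D = {ad:.3f}")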

The magnitude of the differences between the distribution that fitted the data best and the normal or lognormal distribution depended on the dataset. An example is shown in Figure 6 for the fits of a normal, lognormal and best-fit (beta) distribution to the data measured by Allen and Walker (1987) for metazachlor. Such differences in the distributions may be trivial around the median value, but can have significant effects on the outcome of probabilistic modelling for very sensitive model parameters and where results are close to regulatory thresholds.

Figure 6. Kd values for metazachlor measured by Allen and Walker (1987) and three distributions fitted to the data. Full curve (left), detail (right)

[Chart: cumulative probability (-) plotted against Kd (L/kg), showing the fitted lognormal, normal and beta distributions; left panel full curve, right panel detail for Kd up to about 1.2 L/kg.]

4.2 Selection of parameters to describe distributions

After a decision has been made on the shape of a distribution (e.g. normal, lognormal), key parameters must be calculated to define the distribution. For example, a normal distribution is defined by its mean and standard deviation. The mean and standard deviation of the measured data are often used as an estimate of the parameters of the true underlying distribution. The larger the number of measurements, the closer the estimated parameters will be to the true values. However, the number of measured sorption and degradation data is small for most pesticides. Measurements for four soils must be provided as a standard regulatory requirement. Additional studies are carried out for some compounds, but the total number of measurements rarely exceeds 10. It is not clear how well parameters estimated from small datasets describe the true underlying distribution. We compared distributions derived from a relatively large dataset consisting of 49 Koc values for atrazine from a review by Giddings et al. (2005) with those derived from either four or ten of the 49 measured values.

A histogram of the 49 measured Koc values for atrazine and a lognormal distribution that was fitted to the data are shown in Figure 7a. A cumulative expression of the same data and lognormal distribution is shown in Figure 7b. The data were described well by a lognormal distribution, particularly at the bottom tail of the distribution.

Figure 7. (a) Histogram of 49 measured Koc values for atrazine (Giddings et al., 2005) and a lognormal distribution fitted to the data, (b) Cumulative probability plot

[Charts: (a) probability (-) against Koc (L kg-1) for the measured histogram and the fitted lognormal distribution; (b) cumulative probability (-) against Koc (L kg-1) for the measured data and the fitted lognormal distribution.]

4.2.1 Comparison between distributions derived from small and large numbers of measurements

Either four or ten of the 49 values were sampled randomly. The aim was to compare the lognormal distribution derived from the sampled values with that of the original data. Normal distributions are easier to calculate, and fitting a lognormal distribution to the Koc values is equivalent to fitting a normal distribution to log-transformed data. Therefore, all sampled Koc values were log-transformed and the mean and standard deviation of the log(Koc) values were used as an estimate of the parameters of the underlying normal distribution. The Excel function NORMINV(probability, mean, standard deviation) was used to calculate log(Koc) values for probabilities between 0 and 1, and Koc values were then calculated as 10^log(Koc). The procedure was repeated 250 times to give 250 distributions. Figure 8 shows three examples of subsets of four and ten sampled values and the resulting probability distribution functions. These were the subsets with the smallest, median and largest mean log(Koc) value. The distribution of the 49 measurements is shown for comparison.
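A minimal sketch of this resampling exercise is given below; the 49-value array is a randomly generated placeholder rather than the measured atrazine data, and scipy's normal percent-point function plays the role of the Excel NORMINV function:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
koc_all = rng.lognormal(mean=np.log(160.0), sigma=0.45, size=49)  # placeholder for the 49 measured Koc values (L/kg)

probs = np.linspace(0.01, 0.99, 99)   # probabilities at which the distribution is evaluated
subset_distributions = []

for _ in range(250):                  # 250 random subsets of ten values
    sample = rng.choice(koc_all, size=10, replace=False)
    log_sample = np.log10(sample)     # fitting a lognormal = fitting a normal to log-transformed data
    mu, sd = log_sample.mean(), log_sample.std(ddof=1)
    log_koc = stats.norm.ppf(probs, loc=mu, scale=sd)   # equivalent to NORMINV(probs, mu, sd)
    subset_distributions.append(10.0 ** log_koc)        # back-transform to Koc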

4.2.2 Probabilistic modelling of pesticide leaching based on a small number of Koc measurements

It was shown above that distributions of pesticide parameters estimated from a small number of measurements are uncertain. The implications of this uncertainty for the results of probabilistic pesticide fate modelling were investigated. Movement of atrazine to depth was first calculated using Koc values sampled from the distribution of all 49 measured Koc values. The results were compared with those derived from three smaller subsets of only 10 values (those with the smallest, median and largest mean log(Koc) value; Figure 8).

Maximum annual average concentrations of atrazine in leachate at 1-m depth were simulated using the model and scenario described in Section 3.1. Atrazine was applied to a maize crop on 12 May in each of the 20 years simulated at a rate of 1 kg/ha. The DT50 value was set to the median of 15 laboratory values measured at 20-25°C (41 days; from Giddings et al. (2005) and data provided by Syngenta). The Koc value was varied during the analysis and set to (i) the 49 measured Koc values and (ii) the median Koc value of each of the three subsets. Attempts were then made to account for the uncertainty in the distributions resulting from the availability of only a small number of measurements.

Concentrations in leachate calculated on the basis of median Koc values of the three subsets are compared with those calculated for the 49 measured Koc values in Figure 9. The concentration derived for subset 1 (median Koc = 93.6 L kg-1) corresponded to the 81st percentile concentration of the distribution based on 49 measured values. Subset 2 (median Koc = 182.5 L kg-1) and subset 3 (median Koc = 213.4 L kg-1) resulted in 41st and 24th percentiles, respectively.


Figure 8. Probability plots for subsets of four (top) and ten (bottom) Koc values and fitted distributions with the smallest, median and largest mean log(Koc). The distribution of the complete dataset consisting of 49 measured values is shown for comparison.

[Charts: six panels of cumulative probability (-) against Koc (L kg-1); the top row shows subsets of four sampled values and the bottom row subsets of ten sampled values, each with its fitted lognormal distribution and with the complete dataset and its lognormal distribution for comparison; panels correspond to the subsets with the minimum, median and maximum mean log(Koc).]

Figure 9. Concentrations in leachate at 1-m depth simulated using 49 measured Koc values (open circles) and the median of three subsets of 10 sampled Koc values (squares)

[Chart: cumulative probability (-) against concentration in leachate (microg L-1, log scale) for the 49 measured Koc values and for Subset 1 (Koc = 93.6 L kg-1), Subset 2 (Koc = 182.5 L kg-1) and Subset 3 (Koc = 213.4 L kg-1).]

The mean and standard deviation of the ten measured log(Koc) values for subsets 1, 2 and 3 were taken as an estimate for the mean and standard deviation of the true underlying distribution. However, these parameters are uncertain if the sample size is small. This uncertainty was taken into account using the methodology described by Vose (2000). According to theoretical considerations, the true mean and standard deviation of a normal distribution follow statistical distributions based on the initial estimates. A total of 50 values for the mean and standard deviation were sampled. Normal distributions were then generated from these parameters and 500 log(Koc) values were sampled from each of the 50 distributions. This resulted in a total of 50 x 500 = 25,000 values. The sampling was carried out for subsets 1, 2 and 3. All sampled log(Koc) values were converted to Koc values and used as input parameters for PELMO. Concentration in leachate was recorded for each model run, yielding 50 distributions of concentration for each subset. The median and 95% confidence interval of the 50 distributions are compared with concentrations in leachate calculated from the 49 measured data in Figure 10.
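The two-stage sampling can be sketched as follows, using the classical result that, for a normal sample of size n with mean xbar and standard deviation s, the true mean follows xbar + (s/sqrt(n))*t(n-1) and the true variance follows (n-1)*s^2/chi-square(n-1); the ten log(Koc) values below are hypothetical and the call to the fate model is indicated only by a comment:

import numpy as np

rng = np.random.default_rng(2)
log_koc = np.array([1.85, 1.95, 2.05, 2.10, 2.18, 2.22, 2.30, 2.35, 2.42, 2.55])  # hypothetical subset of ten log10(Koc) values
n = log_koc.size
xbar, s = log_koc.mean(), log_koc.std(ddof=1)

koc_sets = []
for _ in range(50):                                        # outer loop: parameter uncertainty
    mu_true = xbar + s / np.sqrt(n) * rng.standard_t(n - 1)
    sd_true = np.sqrt((n - 1) * s**2 / rng.chisquare(n - 1))
    koc_sets.append(10.0 ** rng.normal(mu_true, sd_true, size=500))  # inner loop: variability
    # each set of 500 Koc values would then be passed to the fate model (e.g. PELMO)
    # to produce one of the 50 output distributions of concentration in leachate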


Figure 10. Concentrations in leachate at 1-m depth simulated using 49 measured Koc values (open circles) and median and 95% confidence intervals of simulated concentrations accounting for the uncertainty in generating distributions from a small sample

[Charts: cumulative probability (-) against concentration in leachate (microg L-1, log scale) for Subsets 1, 2 and 3, each showing the median and 95% confidence interval of the 50 simulated distributions together with the distribution obtained from the complete dataset.]

The confidence bounds of the output distributions for the three subsets included the distribution of leachate concentrations calculated from measured values over a wide range of percentiles (Figure 10). This suggests that the upper limit of the confidence interval (i.e. the 97.5th percentile output distribution) may give a conservative estimate of the true distribution in the majority of cases. However, the analysis carried out here involved running the exposure model 25,000 times to generate 50 output distributions, each consisting of 500 values. For the PELMO model, this resulted in a computing time of several days for each subset. Other models may require even longer running times. It would be useful to investigate whether the number of distributions or the number of datapoints per distribution could be reduced further without compromising a robust estimation of the confidence bounds.

4.2.3 Conclusions
- Estimation of the true distribution of pesticide properties from a limited number of measurements is uncertain.
- Probabilistic modelling of pesticide exposure based on a small number of measured values for key input parameters can give misleading results.
- A conservative estimate may be possible if the uncertainty in generating distributions from small samples is considered in the assessment.
- There is a standard method that is routinely used to calculate uncertainty bounds for species sensitivity distributions. Its application to pesticide exposure modelling is, however, much more time-consuming due to the complex calculations involved.

4.3 Effects of uncertainties in the Freundlich exponent on modelling results

4.3.1 Introduction

The Freundlich isotherm is often used in pesticide fate models to characterise pesticide sorption, and the parameterisation of this relationship is another source of uncertainty in modelling pesticide fate. The Freundlich isotherm has the generalised form S = Kf · C^nf, where S is the sorbed amount (mg kg-1), C is the dissolved concentration (mg L-1), Kf is the Freundlich sorption coefficient (L kg-1) and nf is the dimensionless Freundlich exponent. Figure 11 shows the relationship between the sorbed amount of pesticide and the dissolved concentration for different nf values (Kf = 1.5 L kg-1). The Freundlich exponent determines the curvature of the isotherm: the smaller the value of nf, the stronger the non-linearity. A larger fraction of the total pesticide in the soil is available in solution at nf = 1 than at nf < 1 for dissolved concentrations below 1000 μg L-1. This is illustrated in Figure 12 for Kf values of 0.15 L kg-1 and 1.5 L kg-1. The smaller the concentration, the larger the discrepancy between the available fractions calculated from linear and non-linear isotherms. The discrepancy is also larger for more strongly sorbed compounds. Models generally predict less leaching of pesticides to depth for smaller nf values and can be very sensitive to changes in nf if concentrations in soil solution are small (Dubus et al., 2000).
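The behaviour shown in Figure 12 can be reproduced with a few lines of code. The volumetric water content (0.3 L/L) and bulk density (1.3 kg/L) used below are illustrative assumptions only, since the values underlying the figure are not stated in the text:

import numpy as np

theta, rho = 0.3, 1.3   # assumed volumetric water content (L/L) and bulk density (kg/L)
kf = 1.5                # Freundlich coefficient (L/kg)

def available_fraction(c_mg_per_l, nf):
    s = kf * c_mg_per_l ** nf        # sorbed amount (mg/kg), S = Kf * C^nf
    dissolved = theta * c_mg_per_l   # mg per litre of soil
    sorbed = rho * s                 # mg per litre of soil
    return dissolved / (dissolved + sorbed)

for c in [0.001, 0.01, 0.1, 1.0]:    # mg/L, i.e. 1 to 1000 microg/L
    print(f"C = {c:7.3f} mg/L   linear: {available_fraction(c, 1.0):.3f}   non-linear: {available_fraction(c, 0.9):.3f}")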

Freundlich sorption coefficients and nf values are usually derived by fitting the Freundlich equation to sorbed and dissolved concentrations measured in standard batch studies, after shaking soils with aqueous pesticide solutions at a range of initial concentrations. The Freundlich exponent nf is often difficult to estimate accurately because of the inherent variability in the measured data. The exponent differs between soils and studies, and an average value is usually used in the modelling. In the absence of any information on the Freundlich exponent, a default value of 0.9 is recommended by FOCUS (2000). The nf value is thus both variable and uncertain, and this introduces uncertainty into the risk assessment.
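A minimal sketch of deriving Kf and nf from batch data by linear regression on the log-transformed Freundlich equation (log S = log Kf + nf log C) is given below; the concentration pairs are hypothetical:

import numpy as np

C = np.array([0.05, 0.1, 0.5, 1.0, 5.0])       # dissolved concentrations (mg/L)
S = np.array([0.09, 0.16, 0.65, 1.20, 4.60])   # sorbed amounts (mg/kg)

nf, log_kf = np.polyfit(np.log10(C), np.log10(S), 1)   # slope = nf, intercept = log10(Kf)
kf = 10.0 ** log_kf
print(f"Kf = {kf:.2f} L/kg, nf = {nf:.2f}")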


It is relatively common for the maximum recommended application rate of a pesticide to change over the course of a regulatory review. A key question is then whether it is possible to adjust PEC values directly to reflect a change in the application rate. Strictly, this is only correct for a linear sorption isotherm. The analysis below investigates the conditions under which the relationship between application rate and PEC is approximately linear.

Figure 11. Relationship between sorbed and dissolved pesticide according to the Freundlich isotherm (Kf = 1.5 L kg-1). Figure 12. Fraction of the total amount of pesticide available in soil solution for a linear (nf = 1) and a non-linear (nf = 0.9) isotherm.

[Charts: Figure 11 plots the sorbed amount S (mg/kg) against the dissolved concentration C (microg/L) for nf = 0.6, 0.8 and 1.0; Figure 12 plots the available fraction (-) against C (microg/L, log scale) for nf = 1.0 and nf = 0.9 at Kf = 0.15 and Kf = 1.50 L/kg.]

4.3.2 Linearity of the relationship between concentrations in leachate and pesticide application rate

Leaching to depth of pesticides with a range of degradation and sorption properties (DT50 = 10 days and Koc = 10, 50 or 100 L kg-1; DT50 = 30 days and Koc = 50 or 100 L kg-1) and application rates (0.01, 0.05, 0.10, 0.50, 1.00 and 3.00 kg ha-1) was simulated with PELMO to investigate the relationship between the total amount applied and the concentration in leachate at different nf values. The soil, weather and crop scenario were those described in Section 3.1. Figure 13a shows the relative change in the maximum annual average concentration in leachate for a change in the application rate from 0.1 to 1.0 kg ha-1. For nf = 1, the concentration in leachate increased by a factor of 10 when the application rate increased tenfold (i.e. a proportional increase). The range of nf values over which the relationship between concentrations in leachate and the application rate is approximately linear (i.e. the relative change is close to 10) depends on the scenario and the properties of the pesticide (Figure 13a). For the scenario used in this study, the concentration in leachate at nf = 0.9 increased by a factor of 16.1 for a pesticide with a DT50 value of 10 days and a Koc value of 10 L kg-1. The deviation from linearity was much stronger for a pesticide with a DT50 value of 30 days and a Koc value of 100 L kg-1 (factor of 68.8).

Figure 13. Relative change in simulated concentrations for (left) a change in the application rate from 0.1 to 1.0 kg ha-1 (factor 10) and (right) a change in the application rate from 0.5 to 1.0 kg ha-1 (factor 2).

[Charts: relative change in concentration (log scale) against nf (-) for a change in application rate from 0.1 to 1.0 kg/ha and from 0.5 to 1.0 kg/ha; a further panel plots values on a log scale against application rate (0-3 kg/ha) for the five combinations of DT50 (10 or 30 days) and Koc (10, 50 or 100 L/kg).]

The relative change in the simulated concentration in leachate for a change in the application rate from 0.5 to 1.0 kg ha-1 is shown in Figure 13b (note that the scale of the y-axis differs from that in Figure 13a). The range of nf values over which the relationship between concentrations in leachate and the application rate is approximately linear (i.e. the factor is close to 2) is wider in this example. The concentration in leachate at nf = 0.90 increased by a factor of 2.3 – 3.5 when the application rate increased by a factor of 2.

4.3.3 Conclusions
- Simulated concentrations of pesticides in leachate are very sensitive to small changes in nf. It is often very difficult to measure nf accurately and this can introduce large uncertainties into the modelling.
- Results presented in Figure 13 provide information on likely changes in PEC values in response to changes in application rate.

5. MODEL ERROR

5.1 Description of the issue

Models are simplified representations of a real-world system. They are based on scientific or technical assumptions and are subject to approximations. Model error, also known as model inadequacy or conceptual error (Beck et al., 1997), arises from the imperfect mathematical representation of a system; it is the inability of a model to represent reality accurately even when adequate model inputs are used. Parameter error, on the other hand, is caused by uncertainty in the model parameter values due to measurement error, temporal and spatial variability and scaling issues (Beckie, 1996). It is difficult to isolate and evaluate model error and parameter error separately, and both contribute to the total model uncertainty.

The level of uncertainty in probabilistic risk assessments for pesticides arising from model error is largely unknown. Model error has to date been widely ignored in this discipline because user-friendly approaches to account explicitly for and quantify this type of uncertainty are not available. A review has been undertaken to identify promising techniques in related disciplines, including hydrological modelling and human exposure assessment, and the applicability of these techniques to pesticide exposure modelling has been evaluated.

5.2 Level of uncertainty arising from model error

It is difficult to quantify the general performance of pesticide fate models and to estimate the contribution of model error. Although a reasonable number of studies have assessed the accuracy of existing models against field measurements, the results are variable and general conclusions cannot be drawn. Often, the uncertainty in the model prediction caused by model error cannot be separated from that caused by parameter uncertainty and variability: an exact evaluation of parameter error requires a perfect model and, similarly, the parameter values must be known perfectly to evaluate the effect of model error (Gaganis and Smith, 2001).

One way of testing a model is to compare the maximum concentration measured in the runoff or drainage water with the maximum value of the predicted concentrations, irrespective of the time point at which the concentration was observed. The performance of the MACRO model was evaluated in this way for isoproturon leaching from different soil types in the field and through lysimeters (Beulke et al., 2001). Maximum concentrations predicted for a heavy clay soil were under-estimated by a factor of seven during one season and over-estimated by factors of nine and ninety-one for subsequent seasons. Predictions were more accurate for coarser soils where simulated maximum concentrations were within a factor of four of those observed. Overall, there was a tendency for the MACRO model to over-estimate the observed concentrations, introducing a margin of safety into the modelling.

5.3 Methods to account for model error in probabilistic risk assessments

Probabilistic analysis of pesticide exposure aims to evaluate the effects of variability and uncertainty on the assessment. Model error is classified as uncertainty. Often, variability is expressed in the form of a probability distribution in which cumulative probability is plotted against the variable of interest, and uncertainty is expressed as bounds around these distributions: the wider the bounds, the larger the uncertainty. Uncertainty can be accounted for in 2-D Monte Carlo analyses, provided it can be quantified in the form of a statistical distribution (e.g. as a normal distribution around a mean). This and alternative methods to account for model error in probabilistic pesticide fate modelling were explored within this project.

5.3.1 Basic methods

The most common way of dealing with model error is to minimise its causes by evaluating the underlying theories and assumptions of the models, improving transparency and communication about the models, and improving awareness of the limitations of model applicability. Model error can also be reduced by additional measurements to improve the relationships described in the model. However, model error can never be eliminated completely. In principle, differences between outputs from various models simulating the same scenario can be taken as a measure of model error. However, regulatory pesticide models are based on very similar concepts and differences in their outputs do not give a valid indication of the discrepancies between model and field behaviour.


A simple way to deal with the uncertainty caused by model error is to introduce a safety factor to cover the uncertainty in the predicted value. Arbitrary uncertainty factors (typically 10 or 100) are applied to the comparison between exposure and ecotoxicological endpoints in European regulation. These factors are largely regarded as accounting for the uncertainty arising from inter-species variation in sensitivity to a toxicant; they are not considered to account for uncertainties in the exposure estimate. Instead, realistic worst-case assumptions are built into the modelling scenarios to generate conservative estimates of exposure. The models and scenarios used for FOCUS groundwater and surface water simulations have been validated to some degree and no major discrepancies occurred between the estimated potential for leaching and that observed under actual usage conditions. It is therefore likely that the FOCUS models and scenarios give a sufficient margin of safety and that model error can be ignored within standard regulatory modelling. It is, however, important to consider the implications of model error when modelling studies are undertaken outside the FOCUS framework.

Probabilistic assessments aim to replace arbitrary safety factors and worst-case assumptions by a quantification of the likelihood that a risk will occur. It may, however, be necessary to lump all uncertainty that cannot be explicitly accounted for into a factor that accounts for model error. This factor can be included in a simple 2-D Monte Carlo analysis. Second-order or 2-D Monte Carlo sampling is a technique that allows variability and uncertainty to be separated. In first-order Monte Carlo analysis, values are repeatedly sampled from input distributions to produce an output distribution. The input distributions or their parameters are estimated and hence subject to sampling error, but they are assumed to be known and fixed in first-order Monte Carlo. Second-order Monte Carlo overcomes this difficulty by explicitly considering parameter uncertainty in the outer loop of the simulation. The parameter values sampled in the outer loop are then fixed and values are sampled in the inner loop for each variable parameter. An output distribution reflecting variability is calculated for each set of parameter values sampled in the outer loop. This results in a large number of output distributions and allows the uncertainty in the result of the 1-D Monte Carlo analysis to be quantified (e.g. the probability of a model outcome occurring and the 95% confidence interval around this probability).

If 2-D Monte Carlo analysis were used to account for model error, a factor would first be sampled from a distribution in the outer loop. Given the current state of knowledge, a uniform distribution would seem most appropriate (this distribution can be used when only the minimum and maximum values are known but the shape of the distribution is unknown). For example, if a model under- or over-estimates exposure by a maximum factor of four, then a value would be sampled from a uniform distribution between 0.25 and 4.0. This value is then fixed and multiplied with each value of the exposure distribution calculated in the inner loop. This allows the uncertainty in the exposure distribution to be quantified. Further research is required to identify the upper and lower limits of the uniform distribution to be used. General recommendations cannot currently be made because model error depends on the modelling scenario and the type of output considered. For example, the ability of a model to simulate the average annual loss of a pesticide to depth may be better than its ability to match observed maximum concentrations in drainflow (or vice versa). Further research is needed on this topic.
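A sketch of this procedure is given below. The factor range of 0.25-4.0 follows the illustrative example in the text, and the inner-loop exposure calculation is replaced by a placeholder distribution rather than actual fate-model runs:

import numpy as np

rng = np.random.default_rng(3)

def simulate_exposure(n):
    # placeholder for the inner loop (e.g. PEC values from repeated model runs), microg/L
    return rng.lognormal(mean=-2.0, sigma=1.0, size=n)

output_distributions = []
for _ in range(100):                    # outer loop: model-error uncertainty
    factor = rng.uniform(0.25, 4.0)     # sampled once, then fixed for this iteration
    pec = simulate_exposure(500)        # inner loop: variability
    output_distributions.append(factor * pec)

# the spread of a given percentile across the outer loop expresses the uncertainty due to model error
p90 = [np.percentile(d, 90) for d in output_distributions]
print(f"90th percentile PEC ranges from {min(p90):.4f} to {max(p90):.4f} microg/L")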

5.3.2 Advanced methods

Methods to account for or reduce model error have been used in other disciplines. Neuman (2003) proposes a Maximum Likelihood Bayesian Model Averaging method to construct alternative conceptual models by selecting and joining existing interpretations and mathematical descriptions. The method incorporates site characterisation and site monitoring data for an optimum combination of prior information (scientific knowledge plus data) and model predictions. Standard statistical practice ignores the uncertainty in model selection, leading to over-confident inferences and decisions. Bayesian model averaging (BMA) provides a coherent mechanism for accounting for model uncertainty; several methods are discussed by Hoeting et al. (1999).

When alternative models are available and there is no basis to select any particular one, Morgan and Henrion (1990) suggest presenting the results of all models as possible outcomes, or combining the outputs of the models to represent a range of outcomes. It may be possible to assign probabilities to the likelihood of each possible model description being the correct one, for example by using a probability tree to reflect the possible model descriptions and their likelihood (e.g. Evans et al., 1994). However, it is difficult and subjective to assign a probability to each model option, and pesticide fate models have reached a large degree of consensus, with similarities of approach greatly outweighing the differences. The differences between the various models are too small to warrant a complex and technically demanding analysis and the result would not be very meaningful.

5.4 Conclusions
- Model error is an important source of uncertainty in exposure modelling. It is very difficult to separate model error from parameter error and to quantify the contribution of model error to the total uncertainty.
- A number of realistic worst-case assumptions are included in the modelling at lower tiers of the regulatory procedure to ensure that a conservative estimate of exposure is made. The FOCUS scenarios and models have been validated to some extent and appear to provide a sufficient margin of safety.
- Model error should not be ignored when assessments are made outside of the FOCUS framework.
- Model error could be included within probabilistic calculations using 2-D Monte Carlo analysis. Further research on levels of predictive accuracy for regulatory models would be helpful.
- None of the advanced methods that have been proposed in other fields of research are likely to have application within pesticide exposure modelling within the next 5 years.

6. PROBABILISTIC MODELLING ACCOUNTING FOR VARIABILITY IN LANDSCAPE FEATURES

The Framework V Concerted Action EUFRAM (www.eufram.com) is developing a European Framework for probabilistic risk assessment of the environmental impacts of pesticides. EUFRAM aims to ensure a strategic, co-ordinated and harmonised approach to the development of probabilistic risk assessment. Close links were maintained between this project and EUFRAM to: (i) maximise benefit from the activities under EUFRAM and prevent duplication of effort; (ii) access a range of scientific expertise, including the participants of the ring test (Section 3.1); and (iii) provide input from PL0548 into EUFRAM. European case studies were collated to test procedures and recommendations developed within the project and to provide benchmark examples for dissemination to the scientific and regulatory community. Research for one case study was carried out within PL0548 to build on work undertaken under PN0931 and the first phase of PL0548.

The case study presented below aims to demonstrate probabilistic methods for aquatic risk assessment and, in particular, how an assessment could be undertaken using the base set of regulatory fate and ecotoxicity tests. A probabilistic estimate of exposure is developed to refine FOCUS surface water modelling and this is compared with basic ecotoxicity data, a species sensitivity distribution and endpoints from a mesocosm study. The case study has been written up as a full report of the analysis (Brown et al., 2005) and only a summary version intended for communication of the main results is provided below. The case study was presented to the Environmental Panel to seek its opinion on preferred approaches to communication. The study has also been used for dissemination of key features of probabilistic risk assessment to a range of stakeholders and end users at a EUFRAM training workshop (Brussels, March 2005). The report of the case study is undergoing formal comment from EUFRAM participants and will form an integral part of the framework document to be produced by EUFRAM. It will be used for end-user training at European workshops with ca. 100 participants to be held in October 2005 and July 2006.

6.1 Executive summary

A deterministic risk assessment for non-target aquatic organisms in UK water bodies following use of the hypothetical urea herbicide ‘herburon’ showed a potential for effects on aquatic macrophytes. This assessment was based on a number of worst-case assumptions and did not provide any information on the likelihood of the effects occurring. In this study, a probabilistic assessment of exposure concentrations in a natural agricultural landscape is undertaken, considering the uncertainty and variability in landscape features and pesticide entry into water bodies. Exposure distributions are compared with expressions of effects on macrophytes at the species and community level. Results are summarised below:

Expression of exposure | Expression of effects | Key results of the risk assessment
Maximum PECSW from deterministic FOCUS Step 4 modelling including a 5-m no-spray buffer | Smallest ecotoxicity endpoint for standard test species (120-hr EC50 for Lemna gibba = 7 μg L-1) | Smallest TER < 10 in all six scenarios simulated (a)
Probabilistic distribution of instantaneous concentrations in the ditch arising from drift, accounting for variability and uncertainty in landscape features and pesticide deposition; distribution calculated with and without a 5-m no-spray buffer (results in this table are for exposure with the buffer) | Smallest ecotoxicity endpoint (120-hr EC50 for Lemna gibba = 7 μg L-1) | TER < 10 in 68% of the exposure events considered (95% uncertainty interval 47-75%)
As above | Probabilistic expression accounting for uncertainty in generating species sensitivity distributions from a small number of species (9 aquatic macrophytes); median HC5 = 7 μg L-1; 95% uncertainty interval = 4.1-11.1 μg L-1 | Median HC5 exceeded in 3.8% of the exposure events considered (95% uncertainty interval 0.8-7.7%); > 5% of species affected in 4.6% of the exposure events considered (95% uncertainty interval 0.2-13.4%)
As above | Lowest mesocosm NOEC for effects on community metabolism (5 μg L-1) | NOEC for clear effects exceeded in 7.6% of exposure events (95% uncertainty interval 2.5-13.2%)
As above | Lowest mesocosm NOEC for effects on community metabolism with no recovery within 8 weeks (15 μg L-1) | NOEC for clear effects with no recovery within 8 weeks exceeded in 0.2% of exposure events (95% uncertainty interval 0-0.7%)

(a) The smaller TER value was taken where two water bodies were associated with a scenario.

6.2 Description of the issue

‘Herburon’ is a urea herbicide that is proposed for use as an SC formulation (500 g a.i. L-1) on a wide range of crops including cereals and vegetables. The worst-case GAP involves a single pre-emergence application to winter oilseed rape at 1500 g a.i. ha-1. Initial deterministic modelling shows that spray drift is the primary route of exposure for aquatic habitats under UK conditions.

6.3 Probabilistic exposure calculation

A probabilistic exposure assessment is undertaken for the Coleshill catchment in Oxfordshire, UK, the area immediately surrounding the location of the FOCUS SW D2 weather station. A distribution of instantaneous concentrations of herburon in agricultural ditches arising from spray drift is calculated, accounting for variability in individual spray events, the morphology of individual ditches and the bankside vegetation associated with these ditches. Uncertainty in the calculation arising from sampling uncertainty in drift experiments, the interception of spray drift by a particular type of bankside vegetation and the pesticide application rate is expressed using 95% uncertainty intervals around the median distribution of exposure concentrations. These are derived within a 2-D Monte Carlo framework.

6.4 Comparison of deterministic and probabilistic exposure assessment

Figure 14. Comparison of deterministic and probabilistic exposure assessment

[Chart: percentage of exposure events where PECSW < Cx against exposure concentration Cx (microg/L, log scale), showing the median and 95% confidence interval of the probabilistic distribution together with the deterministic FOCUS Step 1 to Step 4 concentrations.]

The analysis suggests that the concentration predicted for the D2 scenario using the FOCUS surface water scenarios at Step 4 (5-m no-spray buffer) corresponds to the 83.8th percentile exposure concentration (95% uncertainty interval 75.9-92.4%) for winter oilseed rape fields in the Coleshill catchment, assuming that every field is treated and has a ditch adjacent.

6.5 Species-level risk assessment

The exposure distribution is compared with a fixed endpoint for a single macrophyte test species (7 μg L-1, the 120-hr EC50 value for Lemna gibba) or with probabilistic expressions of effects on macrophytes at the species level. 120-hr EC50 values for herburon are available for nine higher aquatic plant species. A species sensitivity distribution (SSD) is generated assuming a normal distribution. A probabilistic expression of effects is then calculated accounting for the uncertainty in generating species sensitivity distributions from a small number of measurements.
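A minimal sketch of the species sensitivity distribution calculation is given below, with nine hypothetical EC50 values standing in for the herburon macrophyte data; the HC5 is taken as the 5th percentile of a normal distribution fitted to the log-transformed EC50 values:

import numpy as np
from scipy import stats

ec50 = np.array([4.0, 5.5, 7.0, 9.0, 12.0, 16.0, 22.0, 30.0, 45.0])   # hypothetical EC50 values (microg/L)
log_ec50 = np.log10(ec50)
mu, sd = log_ec50.mean(), log_ec50.std(ddof=1)
hc5 = 10.0 ** stats.norm.ppf(0.05, loc=mu, scale=sd)   # concentration hazardous to 5% of species
print(f"median-estimate HC5 = {hc5:.1f} microg/L")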

Figure 15. Comparison between the cumulative distribution of PECSW values and the HC5 derived from a species sensitivity distribution (95% uncertainty intervals are shown for estimates of exposure and effects)


[Chart: percentage of exposure events where PECSW < Cx against exposure concentration Cx (microg/L, log scale), showing the median and 95% confidence interval of the exposure distribution together with the median HC5 and its 95% confidence interval.]

The analysis indicates that growth of more than 5% of the macrophyte species will be affected in 3.8% of the exposure events (95% uncertainty interval 0.8-7.6%). The assessment is based on 120-hr laboratory EC50 values. The assessment ensemble is winter oilseed rape fields in the Coleshill catchment assuming that every field is treated, has a ditch adjacent and that a 5-m no-spray buffer is imposed.

6.6 Community-level risk assessment

The assessment endpoint at the community level is prevention of unacceptable effects on communities of aquatic organisms. The probabilistic expression of exposure is compared with NOEC values from two replicate outdoor mesocosm experiments with herburon (lowest NOEC for clear effects on community metabolism = 5 μg L-1; lowest NOEC for clear effects on community metabolism with no recovery within 8 weeks = 15 μg L-1).

[Chart: percentage of exposure events where PECSW < Cx against exposure concentration Cx (μg/L, log scale), showing the median and 95% confidence interval of the exposure distribution together with the mesocosm NOECs for short-term effects (5 μg L-1) and long-term effects (15 μg L-1).]

Figure 16. Comparison between the cumulative distribution of PECSW values and endpoints from an outdoor mesocosm experiment (the chart includes 95% uncertainty intervals for estimates of exposure and effects)

The analysis indicates that the instantaneous concentration of herburon in a ditch will exceed the NOEC for effects on community metabolism in outdoor mesocosm experiments (5 μg L-1) in 7.6% of the exposure events (95% uncertainty interval 2.5-13.2%). The lowest NOEC for clear effects with no recovery within 8 weeks (15 μg L-1) will be exceeded in 0.2% of the exposure events (95% uncertainty interval 0.2-0.7%).
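The exceedance percentages quoted above are simple tail probabilities of the exposure distribution; a minimal sketch with a hypothetical set of instantaneous concentrations:

import numpy as np

rng = np.random.default_rng(4)
pec = rng.lognormal(mean=np.log(0.8), sigma=1.5, size=10_000)   # hypothetical PEC values (microg/L)

for label, noec in [("short-term NOEC", 5.0), ("long-term NOEC", 15.0)]:
    print(f"{label} ({noec} microg/L) exceeded in {100.0 * np.mean(pec > noec):.1f}% of exposure events")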

The assessment ensemble is winter oilseed rape fields in the Coleshill catchment assuming that every field is treated, has a ditch adjacent and that a 5-m no-spray buffer is imposed.

Figure 17 shows the frequency with which the initial PEC is predicted to exceed the different effects endpoints. Results are shown for exposure calculated either with or without a 5-m no-spray buffer.

Figure 17. Overall frequency that exposure of ditches in the Coleshill catchment would result in clear effects on community metabolism in a mesocosm study (mean (bars) and 95% uncertainty interval)


[Bar chart: percentage of exposure events with initial PEC greater than the effects endpoint (median HC5 from SSD = 6.7 μg/L; short-term NOEC from mesocosm = 5.0 μg/L; long-term NOEC from mesocosm = 15.0 μg/L), shown for standard conditions and for a 5-m no-spray zone, with 95% uncertainty intervals.]

6.7 Uncertainties not considered within the analyses

A number of possible sources of variability and uncertainty in exposure and effects were not included in the assessment. Factors likely to increase the estimated risk are (i) the variability in exposure concentrations arising from the variation of drift deposition across the water surface, and (ii) uncertainties in the vertical distribution of pesticide in the water column. Factors likely to decrease the estimated risk are (i) the deviation of drift deposition onto a surface below ground level from measured deposition at the same level as the treated area, (ii) interception by riparian vegetation or aquatic plants, and (iii) the reduction in initial exposure concentration due to sorption to sediment or macrophytes. Overall, the net effect of these factors is expected to decrease the estimated risk, and the assessment made in this study is thus considered conservative.

7. MAIN FINDINGS AND IMPLICATIONS

The next two years are likely to be critical in determining the extent to which probabilistic methods are taken up as tools within regulatory assessment of pesticides. The European Concerted Action EUFRAM has drafted a framework document for the implementation of probabilistic approaches into regulatory procedures and this will be debated and refined at two end-user workshops. The intention is that the finalised document will be reviewed by the EFSA PPR panel and may form the basis for guidance on probabilistic risk assessment.

Work undertaken within this Defra-funded research demonstrates that probabilistic approaches have the potential to strengthen risk assessment, but that there are still gaps in our understanding, tools and procedures that present significant obstacles to implementation. The work has been written up as three refereed journal papers as well as contributing directly to the EUFRAM guidance document. There has been extensive dissemination through the EUFRAM network of practitioners and stakeholders and through presentation at conferences. The main findings of the research and the implications for use of probabilistic approaches are summarised below.

The case study with the hypothetical herbicide ‘herburon’ demonstrates how probabilistic methods can be used to refine and strengthen assessment of both exposure and effects to aquatic organisms. Key conclusions are:
- Probabilistic methods can be applied without necessarily generating large amounts of additional data. There are significant opportunities to develop standard probabilistic scenarios for exposure calculations that will reduce data requirements and increase harmonisation and transparency. Such scenarios offer the best opportunity to incorporate aspects of landscape variability into risk assessment.
- There are established methodologies for expressing uncertainty associated with extrapolating from small datasets.
- Probabilistic methods can be applied at various stages during refinement of the risk assessment. It can be useful to compare probabilistic exposure with deterministic effects or vice versa.
- In this example, there was good consistency between the first-tier deterministic estimate of exposure and a probabilistic refinement. This provides a much-needed validation of the first-tier methodology. Further examples of such comparisons should be generated.
- The case study provided the basis for feedback on approaches to communication from the Environmental Panel and is being used to train end-users and stakeholders within EUFRAM.

Monte Carlo analysis is likely to remain a key tool for probabilistic assessments. The ring test with six relatively experienced modellers showed that there are many subjective choices to be made during MC analysis and that these will yield differences in the results obtained by different modellers addressing the same problem. The differences are greatest at the tails of the distribution, which are normally the focus during regulatory decision-making. The best way to reduce subjectivity would be to develop standard protocols or rigid modelling tools for MC analysis that would be analogous to the FOCUS procedures and tools for deterministic exposure modelling. More experience of probabilistic methods and applications to regulation is required before such instruments could be put in place. In the interim, it is essential that all methodology is documented and justified in detail to improve transparency and ensure that results can be reproduced if necessary. Guidance on how to report the rationale, methodology and results of an MC analysis has been prepared and incorporated into the EUFRAM framework document for dissemination to end-users at the European level.

A key decision in setting up a probabilistic analysis is whether there is correlation between any of the inputs to the analysis. Pesticide properties are often the most sensitive inputs in exposure assessment. Analysis of the literature demonstrates that there are examples of both positive and negative correlation between the sorption and degradation behaviour of a pesticide, but general trends are not apparent at present. The inclusion of even weak correlation between sorption and degradation was found to greatly influence a probabilistic analysis of leaching through soil. Correlation will reduce the predicted risk of leaching for pesticides and it is recommended that the correlation be set to zero unless the experimental data support an alternative assumption (i.e. where the correlation is significant with at least 90% probability and experimental artefacts can be excluded).

Choosing which shape of distribution to fit to data for model input parameters can be difficult for small datasets and will influence the results of probabilistic analysis. Eleven distributions were fitted to data for sorption and degradation of pesticides measured in more than 10 soils. Skewed distributions (e.g. lognormal, Weibull, beta) tended to fit the data best. It is recommended that a lognormal distribution is fitted to small datasets for sorption and degradation unless there is strong evidence to support an alternative (more than 10 data points and a clear difference in goodness-of-fit statistics). Work with a dataset of 49 Koc values for atrazine demonstrated that distributions fitted to small datasets (four or ten data points) can deviate strongly from those resulting from a larger number of measurements. The resulting uncertainty will significantly influence the results of a probabilistic analysis and it is recommended that this uncertainty be included in the assessment. A method to estimate the uncertainty arising from generating distributions from small datasets, which has been widely applied for species sensitivity distributions, was found to be applicable within exposure assessment. However, the time involved in running mechanistic fate models may prohibit its use for some calculations.

Predicted environmental concentrations will not vary linearly with application rate for pesticides with non-linear sorption (Freundlich exponent nf ≠ 1). Detailed sensitivity analyses have been produced for a range of scenarios to support decisions on how to modify PECsw and PECgw values in response to changes in application rate.

Model error arises from an imperfect mathematical representation of the system being described. It is routinely ignored in deterministic exposure assessment, but this is probably justified given the worst-case assumptions and partial validation of the FOCUS scenarios and models. Model error should not be ignored for assessments outside of the FOCUS framework, particularly where these challenge some of the worst-case assumptions. A methodology based on sampling of an error term from a uniform distribution within 2-D Monte Carlo is proposed to incorporate model error into probabilistic calculations.

Probabilistic methods offer significant advantages if properly applied to risk assessment for pesticides as they can quantify variability and uncertainty, produce outputs with more ecological meaning, make better use of available data and should promote better science (EUPRA, 2001). The herburon case study clearly demonstrates this potential to improve the rigour and transparency of risk assessment and regulatory decision-making. Nevertheless, the detailed analysis of issues related to probabilistic approaches raises significant obstacles to full implementation. Methods are available to address some of these issues and procedures to standardise the analysis will develop in time. The challenge over the next five to seven years will be to gradually incorporate probabilistic approaches into regulatory procedures in such a way that confidence in the methods is not undermined. It is recommended that this be done as with fate modelling submissions during the early 1990s, whereby modelling results were considered alongside experimental data as supporting information rather than as the key endpoint supporting a decision. Probabilistic calculations will need to be considered in parallel with deterministic approaches so that the two can be compared and to allow confidence in the new approaches to develop. Such an approach would also allow time for risk assessors within industry and regulatory bodies to gain experience with the new methods. This body of experience and worked examples of probabilistic calculations applied to real regulatory problems is now the most important requirement for regulatory implementation. For this reason, we do not regard further fundamental research into probabilistic methods as a priority at the present time. Instead, effort should be directed at the generation of further case studies, particularly those that directly compare deterministic and probabilistic results. Greater application of the methods that are currently available will doubtless raise requirements for new research in due course. It is hoped that it will also give rise to a process of standardisation under the umbrella of an organisation such as FOCUS.

8. PUBLICATIONS ARISING FROM THE PROJECT

8.1 Refereed journal papers

Beulke S, Brown CD, Dubus IG, Galicia H, Jarvis N, Schaefer D, Trevisan M. User-subjectivity in Monte Carlo modeling of pesticide exposure. Environmental Toxicology and Chemistry (submitted).


Beulke S, Brown CD. Parameter correlation in Monte Carlo sampling. Pest Management Science (submitted).

Brown CD, Beulke S. Incorporating landscape factors into probabilistic assessments of pesticide exposure. Integrated Environmental Assessment and Management (in prep).

8.2 Contributions to the EUFRAM framework document

Brown CD, Beulke S, van den Brink P. 2005. Case study 3: risk to aquatic organisms arising from exposure to ‘herburon’ via spray drift.

Maund S, Byrne P, van den Brink P, Beulke S, Galicia H, Hart A, Posthuma L, Solomon K. 2005. Reporting probabilistic assessments.

8.3 Conference presentations

Brown CD, Maund SJ, Holmes C, Dubus IG, Turner NL, Hendley P, Sweeney PJ. 2003. Landscape-level risk assessment for pesticides. Proceedings of the XII Symposium of Pesticide Chemistry, Piacenza, Italy, June 2003, p. 633-640.

Brown CD, Turner N, Dubus I, Arnold D, Biggs J, Maund S. 2003. Incorporating landscape factors into probabilistic assessments of pesticide exposure for UK surface waters. SETAC Europe 13th Annual Meeting, Hamburg, Germany, 27 April – 1 May 2003.

Brown CD, Beulke S, van den Brink, P. 2005. Probabilistic case study for the risk to aquatic organisms arising from pesticide exposure via spray drift. SETAC Europe 15th Annual Meeting, Lille, France, 22-26 May 2005.

9. ACKNOWLEDGEMENT

We are grateful to Igor Dubus, Hector Galicia, Nicholas Jarvis, Dieter Schaefer and Marco Trevisan for their participation in the ring test (Section 3.1). The participation of Marco Trevisan in this exercise was partly funded through the Italian national project MIUR (COFIN 2004) ‘Pesticide fate models use: regional scenario definition and uncertainty analysis’. Work by Igor Dubus was supported by the European-funded FP6 AquaTerra project (Integrated modelling of the river-sediment-soil-groundwater system: advanced tools for the management of catchment areas and river basins in the context of global change; project GOCE-CT-2004-505428).

References to published material

This section should be used to record links (hypertext links where possible) or references to other published material generated by, or relating to, this project.

Allen R, Walker A. 1987. The influence of soil properties on the rates of degradation of metamitron, metazachlor and metribuzin. Pest. Sci. 18:95-111.

Anderson JPE. 1981. Soil moisture and the rates of biodegradation of diallate and triallate. Soil Biol. Biochem. 13:155-161.

Beck MB, Ravetz JR, Mulkey LA, Barnwell TO. 1997. On the problem of model validation for predictive exposure assessments. Stochastic Hydrology and Hydraulics 11:229-254.

Beckie R. 1996. Measurement scale, network sampling scale, and groundwater model parameters. Water Resour. Res. 32(1):65-76.

Beulke S. 1998. Untersuchung und mathematische Beschreibung des Abbaus von Herbiziden im Boden in Abhängigkeit von Wirkstoffverfügbarkeit, mikrobieller Biomasse und Aktivität. Ph.D. diss. Technical University Braunschweig, Germany.

Beulke S, Brown CD, Jarvis NJ. 2001. MACRO: a preferential flow model to simulate pesticide leaching and movement to drains. In: JBHJ Linders (ed.): Modelling of environmental chemical exposure and risk, Kluwer Academic Publishers, pp 117-132.

Carsel RF. 1984. PRZM-3, A Model for predicting pesticide and nitrogen fate in the crop root and unsaturated soil zones: User's manual for Release 3.0. U.S. Environment Protection Agency, Athens, GA 30605-2720

Dubus IG, Brown CD, Beulke S, Turner NL. 2002. Uncertainty and probabilistic approaches to pesticide fate modelling. Cranfield University report to DEFRA for project PL0548.

Dubus IG, Brown CD, Beulke S. 2003. Sources of uncertainty in pesticide fate modelling. Sci. Tot. Environ. 317:53-72.

Dubus IG, Brown CD. 2002. Sensitivity and first-step uncertainty analyses for the preferential flow model MACRO. J. Environ. Qual. 31:227-240.

Dubus IG, Janssen PHM. 2003. Issues of replicability in Monte Carlo modelling: A case study with a pesticide leaching model. Environ. Toxicol. Chem. 22:3081-3087.

Dubus IG, Brown CD, Beulke S. 2000. Sensitivity analyses for leaching models used for pesticide registration in Europe. SSLRC research report to MAFF.

Dyson JS, Beulke S, Brown CD. 2002. Adsorption and degradation of the weak acid mesotrione in soil: behaviour and implications. J. Environ. Qual. 31:613-618.

EUPRA. 2001. Probabilistic risk assessment for pesticides in Europe. Implementation and research needs. Report from the European Workshop on Probabilistic Risk Assessment for the Environmental Impacts of Plant Protection Products, The Netherlands, June 2001. Central Science Laboratory, York, UK.

Evans JS, Gray GM, Sielken RL, Smith AE, Valdez-Flores C, Graham JD. 1994. Use of probabilistic expert judgment in distributional analysis of carcinogenic potency. Reg. Tox. Pharm. 20:15-36.

FOCUS. 2000. FOCUS groundwater scenarios in the EU review of active substances - The report of the work of the Groundwater Scenarios Workgroup of FOCUS (FOrum for the Co-ordination of pesticide fate models and their USe). Sanco/321/2000 rev.2, 202 pp.

Freissinet C, Erlich M, Vauclin M. 1998. A fuzzy logic-based approach to assess imprecisions of soil water contamination modelling. Soil Tillage Res. 47:11-17.

Gaganis P, Smith L. 2001. A Bayesian approach to the quantification of the effect of model error on the predictions of groundwater models. Water Resour. Res. 37(9):2309-2322.

Giddings JM, Anderson TA, Hall LW Jr, Hosmer AJ, Kendall RJ, Richards RP, Solomon KR, Williams WM. 2005. A Probabilistic Aquatic Ecological Risk Assessment of Atrazine in North American Surface Waters. Society of Environmental Toxicology and Chemistry, Pensacola, FL. (in press)

Gottesbüren B. 1991. Konzeption, Entwicklung und Validierung des wissensbasierten Herbizid-Beratungssystems HERBASYS. Ph.D. diss. Hanover University, Germany.

Guerin WF, Boyd SA. 1993. Bioavailability of sorbed naphthalene to bacteria: Influence of contaminant aging and soil organic carbon content. p. 197-208. In D.M. Linn et al. (ed.) Sorption and degradation of pesticides and organic chemicals in soil. SSSA Spec. Pub. 32, ASA and SSSA Madison, WI.

Guo L, Wagenet RJ, Jury WA. 1999. Adsorption effects on kinetics of aldicarb degradation: equilibrium model and application to incubation and transport experiments. Soil Sci. Soc. Am. J. 63:1637-1644.

Guo L, Jury WA, Wagenet RJ, Flury M. 2000. Dependence of pesticide degradation on sorption: nonequilibrium model and application to soil reactors. J. Contam. Hydrol. 43:45-62.

Jarvis NJ. 1994. The MACRO model (Version 3.1). Technical description and sample simulations. Reports & Dissertations 19, Department of Soil Science, Swedish University of Agricultural Sciences, Uppsala.

Jene B. 1998. PELMO 3.00 Manual extension. Staatliche Lehr- und Forschungsanstalt für Landwirtschaft, Weinbau und Gartenbau, D-67435 Neustadt/Wstr.

Johnson WD, Lavy TL, Gbur EE. 1995. Sorption, mobility and degradation of triclopyr and 2,4-D in four soils. Weed Sci. 43:678-684.

Klein M. 1995. PELMO: Pesticide Leaching Model, User manual version 2.01. Fraunhofer-Institut für Umweltchemie und Ökotoxikologie, D-57392.

Lehmann RG, Miller JR, Fontaine DD, Laskowski DA, Hunter JH, Cordes RC. 1992. Degradation of a sulfonamide herbicide as a function of soil sorption. Weed Res. 32:197-205.

Loague KM, Green RE. 1991. Statistical and graphical methods for evaluating solute transport models: overview and application. J. Contam. Hydrol. 7:51-73.

Loux MM, Reese KD. 1992. Effect of soil pH on adsorption and persistence of imazaquin. Weed Sci. 40:490-496.

Miao Z, Trevisan M, Capri E, Padovani L, Del Re AAM. 2004. Uncertainty Assessment of the Model RICEWQ in Northern Italy. J. Environ. Qual. 33:2217-2228.

Morgan MG, Henrion M. 1990. Uncertainty: A guide to dealing with uncertainty in quantitative risk and policy analysis, Cambridge University Press, New York.

Neuman SP. 2003. Maximum likelihood Bayesian averaging of uncertain model predictions. Stochastic Environ. Res. Risk Assessment 17:291-305.

Ogram AV, Jessup PP, Ou LT, Rao PSC. 1985. Effects of sorption on biological degradation rates of 2,4-dichlorophenoxy acetic acid in soils. Appl. Environ. Microbiol. 49:582-587.

Price OR. 2003. Spatial Variation in Pesticide/Soil Interactions at the Field Scale. PhD Thesis. The University of Reading.

Rao PSC, Wagenet RJ. 1985. Spatial variability of pesticides in field soils: methods for data analysis and consequences. Weed Sci. 33:18-24.

Renner KA, Meggitt WF, Penner D. 1988. Effect of soil pH on imazaquin and imazethapyr adsorption to soil and phytotoxicity to corn (Zea mays). Weed Sci. 36:78-83.

Saltelli A, Chan K, Scott EM. 2000. Sensitivity analysis. John Wiley & Sons, Chichester, UK.

Stougaard RN, Shea PJ, Martin AR. 1990. Effect of soil type and pH on adsorption, mobility and efficacy of imazaquin and imazethapyr. Weed Sci. 38:67-73.

Tiktak A, van den Berg F, Boesten JJTI, Leistra M, van der Linden AMA, van Kraalingen D. 2000. Pesticide Emission Assessment at Regional and Local Scales: User Manual of Pearl version 1.1. RIVM Report 711401008, Alterra Report 28, RIVM, Bilthoven, 142 pp.

Trevisan M, Vischetti C. 2005. Assessment of uncertainty associated with the extent of simulation processes from point to catchment: Application to 1D-pesticide leaching models. In Soil-Water-Solute Process Characterization: An Integrated Approach (Javier Alvarez-Benedi, Rafael Munoz-Carpena, editors), CRC Press, pp 673-692.

Vose D. 2000. Risk analysis. A quantitative guide. John Wiley & Sons Ltd, Chichester.

Walker A, Jurado-Exposito M, Bending GD, Smith VJR. 2001. Spatial variability in the degradation rate of isoproturon in soil. Environ. Poll. 111:407-415.

Walker A, Brown PA, Mathews PR. 1985. Persistence and phytotoxicity of napropamide residues in soil. Ann. Appl. Biol. 106:323-333.

Walker A, Thompson JA. 1977. The degradation of simazine, linuron and propyzamide in different soils. Weed Res. 17:399-405.

Wehtje G, Dickens R, Wilcut JW, Hajek BF. 1987. Sorption and mobility of sulfometuron and imazethapyr in five Alabama soils. Weed Sci. 35:858-864.
