Introduction

Unifying principles document cover

Environmental monitoring is critical for tracking, understanding, and responding to environmental change, and for underpinning environmental regulation. Monitoring programmes tend to be established to meet specific purposes. While it is often possible to adapt or link them over time to address emerging environmental questions, increasing diversification of policy agendas and methodologies in recent years has made this more challenging, particularly with respect to assessing change at a UK scale. A more strategic approach to the UK’s monitoring evidence base is therefore needed to maximise the scientific and policy value of the outputs.

This document sets out a scientifically robust set of nine principles, agreed by the UKEOF partnership, to provide guidance to those involved in the establishment or development of environmental monitoring programmes, from contracted professional surveyors to community scientists. The principles may be followed in any order. 

Application of these principles from the outset should help ensure that any new or revised monitoring programme is effective and efficient in delivering its primary objectives. Where appropriate and feasible, it should also enable the programme to contribute to, and draw from, other initiatives with overlapping aims, and hence provide added value to the UK environmental monitoring capability from regional to UK scales.

You can read our nine principles below. A PDF version of Unifying principles for environmental monitoring is also available to download.

The principles at a glance

1. Clarity of purpose

Establish clear goals to answer specific questions.

Kestrel in flight

At the outset, clearly state the desired impact of the monitoring programme and how this may be achieved. It may be helpful to adopt a ‘theory of change’ approach and/or draw up a conceptual model to provide a clear overview and address whether the programme design will meet the objectives and have the desired impact. The monitoring programme should be regularly evaluated against its original goals, and any potential new goals, to confirm it is delivering its objectives while maintaining continuity and responding to innovation.

2. Continuity and alignment

Build on existing monitoring programmes.

River with small waterfall

Irreplaceable insights into the nature of our changing environment are provided by long-term environmental monitoring programmes in which measurements have been consistently applied and repeated over time. They enable the evaluation of regulatory and management actions and policy over long time frames. At the start of any project, scope relevant current and historic monitoring activities and monitoring locations (using resources such as the UKEOF catalogue of monitoring observations) that could be either supplemented or restarted to help meet the requirements of a new programme. Consider whether such methodologies and locations are fit for current purposes and meet good practice standards. Where possible, harmonise any new design to facilitate robust analysis of anticipated data alongside existing datasets.
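
As a simple illustration of harmonising a new survey with an existing record, the sketch below (Python; the column names, units and values are hypothetical) renames fields and converts units so that new data can be analysed alongside a legacy dataset. It is an assumption about how such records might be structured, not a prescribed workflow.

    # Illustrative only: align a new survey's records with an existing
    # long-term dataset so the two can be analysed together.
    import pandas as pd

    legacy = pd.DataFrame({"site": ["A"], "date": ["2001-06-01"], "temp_c": [14.2]})
    new = pd.DataFrame({"station": ["A"], "sampled": ["2024-06-01"], "temp_f": [58.6]})

    # Map the new column names onto the legacy schema and convert units.
    harmonised = new.rename(columns={"station": "site", "sampled": "date"})
    harmonised["temp_c"] = (harmonised.pop("temp_f") - 32) * 5 / 9
    combined = pd.concat([legacy, harmonised], ignore_index=True)
    print(combined)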

3. Good design

The design of the new programme must meet the requirements of the goals and be underpinned by professional statistical advice.

Spider's web covered in dew drops

Consider how the data will be collected (e.g. by surveyors or volunteers, and with what type of technology) and who will collect it, ensuring an appropriate balance of trained and experienced surveyors (professional or amateur) and other members of the community. Where fieldwork is needed, practicality, health and safety, and feasibility must be considered as part of good design.

Measurements should be:

  • statistically objective for the purpose, and collected with minimal bias
  • representative and diagnostic of the system of interest
  • made at the appropriate resolution, accuracy and spatial and temporal scales to answer the required questions, accounting for measurement uncertainty (e.g. with confidence levels; a simple sample-size sketch follows this list)
  • based on current good practice, repeatable and well documented.
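
To illustrate the kind of statistical reasoning this calls for, the sketch below (Python; all numbers are hypothetical) uses a standard normal-approximation formula to estimate how many samples per group would be needed to detect a given difference at a stated significance level and power. It is a simplification intended only to show the idea; professional statistical advice should shape the actual design.

    # Illustrative only: approximate samples per group needed to detect a
    # hypothetical difference 'delta' between two means, given an assumed
    # standard deviation 'sigma', significance level and statistical power.
    import math
    from scipy.stats import norm

    def samples_per_group(delta, sigma, alpha=0.05, power=0.8):
        """Normal-approximation sample size for a two-sample comparison."""
        z_alpha = norm.ppf(1 - alpha / 2)  # two-sided test
        z_beta = norm.ppf(power)
        return math.ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

    # e.g. detect a 0.5 unit change when variability is 1.2 units: ~91 samples
    print(samples_per_group(delta=0.5, sigma=1.2))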

Consider the ‘data journey’, using standards and good practice processes where appropriate, including:

  • the needs of data providers (e.g. sampling and analytical logistics, and data delivery pathways)
  • the needs of data users (e.g. data accessibility and analysis)
  • quality assurance and quality control processes, noting that when and how these occur will differ depending on whether the data come from contracted surveyors, community and citizen science, sensors, or sample analysis (see the sketch after this list)
  • data storage and management
  • data analytical processes.
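
As a minimal illustration of one step in this journey, the sketch below (Python; the column names and thresholds are hypothetical) flags out-of-range readings during quality control while keeping the raw values, so downstream users can see what was questioned and why.

    # Illustrative only: a simple quality-control check that flags
    # implausible sensor readings rather than silently dropping them.
    import pandas as pd

    PLAUSIBLE_RANGE = {"water_temp_c": (-1.0, 35.0), "ph": (0.0, 14.0)}

    def flag_out_of_range(df: pd.DataFrame) -> pd.DataFrame:
        """Add a qc_flag column ('ok' or 'range_fail'), leaving raw values intact."""
        df = df.copy()
        df["qc_flag"] = "ok"
        for col, (lo, hi) in PLAUSIBLE_RANGE.items():
            df.loc[~df[col].between(lo, hi), "qc_flag"] = "range_fail"
        return df

    readings = pd.DataFrame({"water_temp_c": [12.4, 48.0], "ph": [7.1, 7.3]})
    print(flag_out_of_range(readings))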

4. Collaborative development

Identify and involve key interested parties from the design stage through to end use and re-use of the data.

Flock of geese flying in formation

Interested parties could range from other organisations collecting measurements in the locality, to other users of the data and community stakeholders. Apart from the potential for collaboration, it is important to consider whether the data could feed into other regional, national, or international initiatives or networks, even if this is not the primary purpose. If so, consider whether there are minimum data and metadata collection standards that must be met for the data to be compatible.

5. Maximised value

Value for money and efficiency should be considered at every stage.

Bright sun shining through green oak leaves

Seek to enhance value for money by aligning design with compatible monitoring programmes to enable data to be exchanged and combined. This may result in greater use of the data. Consider how the project contributes to the needs and interests of the wider scientific community, policy makers, and other relevant users, by collecting new data or evidence, and by sharing any new methods that could be used by others.

6. Planned outputs

Determine the intended programme outputs and their uses from the start.

Ripe blackberries

These will influence issues such as measurement selection, storage and licensing. Apply Q-FAIR data principles (i.e. that the data are of sufficient Quality, and are Findable, Accessible, Interoperable and (Re)usable) at every stage, and ensure outputs adhere to the UK Code of Practice for Statistics. Clear and sufficiently detailed metadata and a technical summary should be provided to ensure methods are fully explained and can be repeated, and that the data can be interpreted and re-used appropriately. Uncertainty estimates should be provided where relevant. Providing there are no ethical or contractual issues to the contrary, data should be made freely available through Open Access repositories. Use existing standards to maximise compatibility between datasets, including those for data formatting and metadata analysis and storage. Ensure appropriate licensing is in place for any other data you may plan to build into your outputs.
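
One illustrative way to keep methods, units, uncertainty and licensing attached to the data is to publish a machine-readable metadata record alongside each file, as in the sketch below (Python; the field names are hypothetical and follow no particular schema). In practice, use the agreed metadata standard for your domain.

    # Illustrative only: write a small dataset together with a metadata
    # 'sidecar' file so that method, units, uncertainty and licence
    # information travels with the data.
    import csv
    import json

    metadata = {
        "title": "Example river temperature observations",
        "method": "Handheld probe, weekly spot samples",
        "units": {"water_temp_c": "degrees Celsius"},
        "uncertainty": {"water_temp_c": "+/- 0.2 (stated probe accuracy)"},
        "licence": "OGL-UK-3.0",
        "contact": "data@example.org",
    }

    with open("observations.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["date", "site_id", "water_temp_c"])
        writer.writerow(["2024-05-01", "SITE-01", 12.4])

    with open("observations.metadata.json", "w") as f:
        json.dump(metadata, f, indent=2)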

7. Transparency

Good science is transparent and open to challenge.

A drop of water on the tip of a grass leaf

Ideally the setting of purpose, design and methodological development should be open to peer review. The caveats, assumptions, methodologies, and uncertainties should be published alongside outputs. This includes analytical pipelines and models as well as primary data collection methods, to enable scrutiny and repeatability at every stage. Data should be available to others (Open Access where possible) and archived appropriately, alongside methodologies, and with the correct metadata. Full transparency also ensures no accidental replication of effort when developing new methods.
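
One small, practical contribution to repeatability is to archive a record of the software environment used by an analytical pipeline alongside the code and data. The sketch below (Python; the output file name is arbitrary) captures interpreter, platform and package versions; pinned environment files or container images serve the same purpose.

    # Illustrative only: snapshot the analysis environment so others can
    # recreate it when re-running the pipeline.
    import json
    import platform
    import sys
    import importlib.metadata as md

    env = {
        "python": sys.version,
        "platform": platform.platform(),
        "packages": {d.metadata["Name"]: d.version for d in md.distributions()},
    }

    with open("analysis_environment.json", "w") as f:
        json.dump(env, f, indent=2)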

8. Innovation

New methodologies, technologies and data science capabilities (including artificial intelligence, AI) have huge potential to improve our monitoring.

Bolts of lightning

There are also risks, particularly with respect to the comparability of data generated using different approaches, and especially when these change over time. Consider the appropriate levels of calibration and benchmarking to ensure novel types of data are fit for purpose. New technologies should not be adopted for their own sake, so carefully consider the associated benefits and trade-offs. New technologies and approaches can complement and augment existing data collection methods rather than replace them. Where new measurement methodologies are being adopted, standard approaches must be developed collaboratively; the UKEOF Working Groups are examples of communities that can support this. The principles here still apply to novel technologies (i.e. methods must be transparent, repeatable, and robust, so that bias is understood).
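
As an illustration of such benchmarking, the sketch below (Python; the paired values are made up) quantifies the average offset and the linear relationship between a new method and an established one from paired observations, the kind of comparison needed before novel data are combined with a long-term record.

    # Illustrative only: compare paired measurements from a new method
    # against an established method to estimate bias before combining data.
    import numpy as np
    from scipy import stats

    established = np.array([5.1, 6.3, 4.8, 7.2, 5.9, 6.6])
    new_method = np.array([5.4, 6.8, 5.1, 7.5, 6.1, 7.0])

    bias = np.mean(new_method - established)  # average offset
    fit = stats.linregress(established, new_method)

    print(f"mean bias = {bias:.2f}, slope = {fit.slope:.2f}, r = {fit.rvalue:.2f}")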

9. Sustainability

Long-term monitoring programmes involve a commitment to collect repeated and consistent measurements, enabling detection of subtle changes against the noise of short-term variation; this requires planning and long-term investment.

Stonehenge

Patterns in observations may only start to emerge after one or two decades of data collection, so a long-term view to financing should be taken from the outset. Incorporate scalability and flexibility into your design to allow for changing budgets and set out the long-term funding needs and timescales for seeking additional support, including for maintenance of datasets and/or infrastructure. Use appropriate communication to emphasise the value of the data collected from an early stage to increase the impact and visibility of the monitoring programme to funders, data users and participants, and maximise opportunities for future support. New techniques or technologies may be brought in over time, but these should be run alongside existing methods for a period of time to enable calibration.
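
To illustrate why sustained records matter, the sketch below (Python; the annual series is synthetic) fits a linear trend and reports an approximate 95% confidence interval on the slope. With short records or large interannual variability, that interval typically spans zero and a subtle trend cannot be separated from noise.

    # Illustrative only: estimate a long-term trend and its approximate
    # 95% confidence interval from a synthetic annual series.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    years = np.arange(2000, 2025)
    true_trend = 0.03  # units per year
    series = 10 + true_trend * (years - years[0]) + rng.normal(0, 0.4, years.size)

    fit = stats.linregress(years, series)
    half_width = 1.96 * fit.stderr
    print(f"trend = {fit.slope:.3f} +/- {half_width:.3f} units per year")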


Feedback

We welcome your feedback on these principles. We'd especially like to know whether they are useful to you and, if so, in what context. Please contact us to share your comments and suggestions.


More information

Development of these principles

These principles have been produced by the UKEOF ‘Best Practice in Environmental Monitoring Task and Finish Group’, which was formed, and produced its first draft, in November 2023. Several iterations have since been produced by the Task and Finish Group in consultation with the wider UKEOF membership. Unifying principles for environmental monitoring was published in November 2024 and will be reviewed and updated by the UKEOF Management Group annually in March.

How to cite

UKEOF (2024). Unifying principles for environmental monitoring. UK Environmental Observation Framework, Lancaster, UK.
DOI: 10.5281/zenodo.14054886.

Image credits

Cover: © Daniel Hauck. Kestrel: ’doncoombez’ on Unsplash. Waterfall: Grahame Jenkins on Unsplash. Spider’s web: Gordon Beagley on Unsplash. Geese in flight: Heather Wilde on Unsplash. Oak leaves: ‘Kimona’ on Unsplash. Blackberries: Amanda Hortiz on Unsplash. Dewdrop on leaf: Aaron Burden on Unsplash. Lightning: © Andy Sier. Stonehenge: ‘Priyank V’ on Unsplash.