Analytical Reports and Documents
24 March 2025

Glossary

ESS – Environmental Standards Scotland

ET – Executive Team

SA – Strategy and Analysis

ISC – Investigations, Standards and Compliance

CSC – Corporate Services and Communications

DPSIR – Drivers, Pressures, State, Impact, Responses

GSS – Government Statistical Service

SIO – Senior Investigations Officer

eNGO – environmental non-governmental organisation


1. The Guidance

1.1 About the Guidance

Environmental Standards Scotland (ESS) is an independent public body, set up to ensure environmental laws and standards are adhered to in Scotland, replacing the European Union’s scrutiny and enforcement role after Brexit. The Strategy and Analysis (SA) team undertake a range of monitoring and analytical work to ensure that we are well informed about environmental performance and to identify issues of potential concern.

We may consider, assess and review data on the quality of the environment in Scotland, keep under review the implementation of any international obligation of the UK relating to environmental protection, and follow developments in international environmental protection legislation.

 

1.2 How to use this guidance

The processes outlined here should guide all of ESS’ analytical work. This guidance:

i) describes roles and responsibilities of individuals and teams

ii) sets out the expectations of different analytical products

iii) provides links to key documents and further advice/guidance

It is essential that the analysis we produce is of the highest quality and that we minimise the risk of errors. This process is therefore central to the reputation and credibility of ESS.

The process is mainly aimed at staff in the SA team. However, it is important that other staff in ESS are also familiar with the process as they may commission analysis and/or utilise its outputs.

Separate guidance applies to the undertaking of investigations and consideration of representations submitted to ESS.

 

1.3 Review and updates to procedure

This procedure will be kept under review by the Head of SA with revisions to be incorporated bi-annually if required. Members of staff are encouraged to flag any suggestions/required changes which they believe can improve the way ESS operates.


2. Standards and Principles

2.1 General ESS Powers and Obligations

i) Delegated Authority

Only the Board is given statutory authority by the UK Withdrawal from the European Union (Continuity) (Scotland) Act 2021 (the 2021 Act); however, as the Board would not have the time to physically review or authorise every decision that requires to be taken, it has agreed to delegate areas of responsibility to ESS staff. In legal terms, this means the actions of a member of staff acting with delegated authority are the actions of the Board.

ii) Legal Framework

The Board draws its authority from the 2021 Act, which both enables the actions of the Board and limits its powers. Each section of this guidance sets out the parts of the 2021 Act that are relevant to our work. As a public body, ESS is also subject to the Freedom of Information (Scotland) Act 2002 (FOISA), the Environmental Information (Scotland) Regulations 2004 (EIR) and the Data Protection Act 2018 (DPA).

 

2.2 Standards and Principles for producing Analysis

ESS use the principles of the Aqua Book, the government’s guidance on producing quality analysis.

i) proportionality of response: the extent of the analytical quality assurance effort should be proportionate to the risks associated with the intended use of the analysis, including financial, legal, operational and reputational impacts. In addition, analysis that is frequently used to support a decision-making process (or the conclusions of which are then used in future work) may require a more comprehensive quality assurance response

ii) assurance throughout development: quality assurance should be considered throughout the life cycle of the analysis and not just at the end. Effective communication is crucial when understanding the problem, designing the analytical approach, conducting the analysis and relaying the outputs

iii) verification and validation: analytical quality assurance is more than checking that the analysis is error-free and satisfies its specification (verification). It must also include checks that the analysis is appropriate, i.e. fit for the purpose for which it is being used (validation)

iv) analysis with RIGOUR: quality analysis needs to be Repeatable, Independent, Grounded in reality, Objective, have understood and managed Uncertainty, and the Results should address the initial question robustly. It is important to accept that uncertainty is inherent within the inputs and outputs of any piece of analysis. It is important to establish how much we can rely upon the analysis for a given problem.

 

2.3 Voluntary compliance with the Code of Practice for Statistics

ESS follow the principles of intelligent transparency and are currently developing our approach for voluntary compliance with the Code of Practice for Statistics. We will publish a statement on this on our website once ready.


3. SA Analytical Products

The Strategy and Analysis team produce a range of analytical products, as illustrated in Figure 3.1, and the purpose and scope of each of these products is summarised in this section.

The default audience for analytical products is assumed to be interested in, and semi-informed on, environmental topics: a mix of stakeholders including MSPs, public bodies and environmental non-governmental organisations (eNGOs).

 

Figure 3.1 Analytical products produced by SA team

 

3.1 Scoping reports

Purpose. Broad but shallow review of a topic leading to one of four possible outcomes (see below). If the outcome is that further analysis is required, scoping should identify the specific issues to focus on.

Audience. Internal. Unpublished.

Type of report. Short, summary report explaining why the topic is important (in a page, with links to other sources), followed by some visuals (e.g. legislation/policy map, DPSIR summary, data sources, trends). It should set out the next focus, with information explaining why other topics were not included, to manage risk. If proposing further analytical work, this should include a project specification/proposal to give a sense of what resource would be required. While formal theories of change are developed alongside recommendations, there should be consideration at this earlier stage of how ESS is likely to be able to add value through changing behaviour or processes.

Out of scope. Budget (unless scoping is commissioned out), communications – lines to take etc. ESS’ legal advisor should be consulted at the outset and updates should be provided to Corporate Services and Communications (CSC) and ISC through quarterly C band meetings.

Key roles. Small project team to discuss key findings and next steps. Commissioner – one of the C1 leads for Policy, Science or Quantitative Analysis. Analytical coordinator – mostly at B band.

Duration. 2-3 months.

Quality assurance and fact checking. Light touch quality assurance of references. No external fact-checking.

External engagement. Yes. To understand what stakeholders are concerned about and where the risks are. However, stakeholder views should not drive priorities unless supported by evidence.

Board/ET engagement. Updates on progress and conclusions provided through SA update papers for information only. Decisions on next steps lie with Head of SA.

Possible outcomes. 1) No further action. 2) Further detailed analysis of one or more (prioritised) topics. 3) Issue passed to investigations (occasional). 4) Issue raised by letter with Parliament or a public authority (occasional).

 

3.2 Legislative rapid reviews

Purpose. Horizon scanning.

Audience. Internal. Unpublished.

Type of report. Legislative rapid review.

Out of scope. Budget (unless scoping commissioned out), communications – lines to take etc. Detailed legal advice generally not required but may be sought on more complex areas and updates should be provided through quarterly C band meetings.

Key roles. Commissioner – Head of Policy Analysis and Horizon Scanning. Analytical coordinator/lead analyst – B band from policy team.

Duration. 2-4 weeks.

Quality assurance and fact checking. Light touch quality assurance within policy team. No external fact-checking.

External engagement. Not normally required.

Board/ET engagement. Updates on progress provided through SA update papers for information only.

Possible outcomes. None. To provide background/context for other analytical products.

 

3.3 Analytical reports

Purpose.  In-depth analysis of a defined, narrow topic. Will always follow scoping work, and the area of environmental law and focus (compliance and/or effectiveness) should be clear.

Audience. External. Published.

Type of report. Analytical report, synthesising evidence from a range of sources (quantitative, science, policy) and reaching conclusions relating to compliance with and efficacy of environmental law.

Scope. Most will not require a budget, but the option should be available. Communications and legal advice should be sought at the outset and then at key stages in the process.

Key roles. Full project team with regular meetings and input on key findings and conclusions. Commissioner – one of the C1 leads for Policy, Science or Quantitative Analysis. Analytical coordinator/lead analyst – mostly at B3 level.

Duration. Varies but likely to be at least six months and often a year or more.

Quality assurance and fact checking. Full internal quality assurance approach. External fact-checking.

External engagement. Yes. Detailed stakeholder engagement. However, stakeholder views should not drive priorities unless supported by evidence.

Board/ET engagement. Detail of specification shared at start for information. Nominated Board members engaged at key points. Updates on progress and conclusions provided through SA update papers to ET and Board. Role in agreeing final recommendations (if any).

Possible outcomes. 1) No further action. 2) Further or ongoing analysis of one or more topics (prioritised). 3) Recommendations made to public authorities. 4) Issue passed to investigations (occasional).

 

3.4 Consultation responses/call for evidence responses

Purpose.  To provide high quality analysis to inform ESS’ position in response to a consultation or call for evidence.

Audience. External. Published.

Type of report. Consultation response.

Scope. Generally does not require a budget. Legal and communications updates should be provided and advice sought as appropriate.

Key roles. Commissioner – Head of Policy Analysis and Horizon Scanning. Analysis delegated as required. A lead analyst and a second quality assurance (QA) role are normally sufficient, though more roles may be included for a multi-team response.

Duration. 4-6 weeks.

Quality assurance and fact checking. Within-team quality assurance, though a fuller approach may be needed if the consultation requires a multi-team response. No external fact-checking.

External engagement. Not normally required though often part of wider engagement on associated topic.

Board/ET engagement. CEO sign-off. Board engagement if significance factors apply. Updates on progress and conclusions provided through SA update papers to ET and Board.

Possible outcomes. ESS response to consultation questions. Potential oral evidence sessions with Scottish Parliament Committees.

 

3.5 Investigations support

Purpose.  To provide high quality, in-depth analysis of a narrow topic to support an investigation or pre-investigation.

Audience. Internal reports, potentially contributing to eventual external investigation report.

Type of report. Templates for analysis from individual teams e.g. quantitative analysis, scientific evidence review or policy analysis. Occasionally a synthesis of a multi-team response may be required.

Scope. Generally does not require a budget unless work is commissioned out. Legal advice if appropriate.

Key roles. Commissioner – Head of Investigations, Standards and Compliance (ISC). Analytical coordinator – Identified within SA by C2. Project team depends on complexity. If single team response, then informal team of commissioner, coordinator and assurer normally sufficient. Multi-team response may require more formal team.

Duration. Variable.

Quality assurance and fact checking. Within-team quality assurance, though a fuller approach may be needed if the work requires a multi-team response. No external fact-checking.

External engagement. Not normally required although may feed into requests made of stakeholders by the investigations team.

Board/ET engagement. As per ISC’s own ET/Board engagement on the wider investigation. Updates on progress and conclusions provided through SA update papers to ET and Board.

 

3.6 Advice and Briefing notes

Purpose.  To provide high quality analysis in response to Board/ET requests.

Audience. Generally internal reports but potential to be published in some cases.

Type of report. Likely to be based on templates for analysis from individual teams e.g. quantitative analysis, scientific evidence review or policy analysis. However, the format may change depending on the scope of the request. Occasionally a synthesis of a multi-team response may be required. May feed into Board/ET papers.

Scope. Generally does not require a budget. Legal advice should be sought if appropriate. Communications advice should be sought if the output is to be published.

Key roles. Commissioner – any part of ESS. Analytical coordinator identified by Head of SA. Need for project team depends on complexity. If single team response, then informal team of commissioner, coordinator and assurer normally sufficient. Multi-team response may require more formal team.

Duration. Variable.

Quality assurance and fact checking. Within-team quality assurance, though a fuller approach may be needed if a multi-team response is required. Generally no external fact-checking, but this may be required if the report is intended for publication.

External engagement. Not normally required but depends on scope of request.

Board/ET engagement. Detail will vary and this should be agreed with Head of SA.

 

3.7 Post-intervention analysis – in development

As ESS matures, we anticipate the requirement for post-intervention analysis on an environmental topic to examine if the situation has changed. Our process for this is in development. This section of the process guidance will be updated once this methodology is agreed.

 

3.8 Long-term outcome indicators – in development

In the first Strategic Plan ESS set out its long-term outcome: “Scotland’s people and nature benefit from a high quality environment and are protected from harm”. The plan indicates that this will be monitored via ESS’ assessment of Scotland’s progress against environmental indicators. Development work is underway to establish a suitable methodology for measuring progress. This section of the process guidance will be updated once this methodology is agreed.


4. Scoping and Carrying out Analysis Projects

4.1 Commissioning analysis

Analysis is commissioned following the schematic in Figure 4.1. Emerging priorities can arise through requests from Scottish Parliament Committees, new Government/Parliament consultations and topics proposed by the ESS Board. All issues are assessed and scored using an analytical prioritisation tool. This tool considers a range of factors including (but not limited to): the environmental and health impacts, the level of public concern, current or planned action and the length of time an issue has persisted. An estimate of the level of confidence in our assessment is also made, alongside an assessment of what value ESS’ involvement could bring. The process for commissioning analysis outlined here applies once an issue has been prioritised for analysis.
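
To make the scoring concrete, the sketch below (Python) shows how a weighted prioritisation score of this kind could work. It is illustrative only: the factor names, weights and 1-5 scale are hypothetical assumptions, not the actual ESS prioritisation tool.

# Illustrative sketch only: a hypothetical weighted-sum prioritisation score.
# Factor names, weights and the 1-5 scale are assumptions for illustration;
# they are not taken from the actual ESS prioritisation tool.
WEIGHTS = {
    "environmental_impact": 0.30,
    "health_impact": 0.25,
    "public_concern": 0.15,
    "action_gap": 0.15,   # absence of current or planned action
    "persistence": 0.15,  # how long the issue has persisted
}

def priority_score(scores: dict) -> float:
    """Combine 1-5 factor scores into a single weighted priority score."""
    return sum(WEIGHTS[factor] * scores[factor] for factor in WEIGHTS)

# Example issue scored 1 (low) to 5 (high) on each factor.
issue = {
    "environmental_impact": 4,
    "health_impact": 2,
    "public_concern": 3,
    "action_gap": 5,
    "persistence": 4,
}
print(f"Priority score: {priority_score(issue):.2f}")  # 3.50 for this example

In practice, the confidence estimate described above would be recorded alongside such a score, so that low-confidence assessments can be revisited.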

Where a piece of analysis has significant resource implications and/or entails significant reputational risk it may be necessary to agree the scope, focus and proposed approach to carrying out the work with the Executive Team.

 

Figure 4.1 Schematic of process for commissioning analysis

 

4.2 Scoping analysis

A scoping project should precede any larger analytical project and is expected to take around 2-3 months (see Figure 4.2).

A scoping project will lead to one of four possible outcomes: 1) a decision that no further work is currently required, 2) a referral to investigations, 3) a letter to parliament or a public body, or 4) initiation of a larger analytical project leading to an analytical report.

 

Figure 4.2 Illustrative timeline for scoping analysis

 

4.3 Steps in conducting detailed analysis

Every project should allow time up front to scope the work required and identify a realistic timeframe. The time taken to conduct detailed analysis will vary according to the specifics of the project, but plans should allow for the steps and rough timeframes outlined in Figure 4.3.

Engagement with nominated Board members will occur through a briefing meeting with the project team at the outset of the project, and then a meeting to discuss conclusions with the project team and the Head of SA prior to analytical sign-off.

 

Figure 4.3 Illustrative timeline for detailed analytical report

 

4.4 Drafting final report for an analysis project

The analytical coordinator will write the final report, bringing together wider context and synthesising quality assured findings of each analysis strand.

The report should provide some information on methodology, strengths, limitations and uncertainty (and its sources). This can be described in qualitative terms e.g. moderate uncertainty, high confidence etc. It should include a statement of assurance from the analytical assurer in the QA record. See analytical assurer within Table 6.1 Roles and responsibilities within analytical projects.

The report will include conclusions and key findings that have been discussed and agreed with the project team.

It should use the ESS analysis report template and be subject to internal Quality Assurance of Analysis before sharing with stakeholders.

External fact-checking will occur before recommendations are developed. It should be made clear in the request that we are seeking a fact check and comments on any misrepresentation, but not wider commentary on the information or language used.

Following the external fact-check further quality assurance will be undertaken of any changes that result and their impact on key messages.

When sending for comments, ask for feedback in clear categories e.g. critical/factual/to note.

Make use of the proof-reading checklist at appropriate points, e.g. before sending to stakeholders/Board.

 

4.5 Development of recommendations, Theory of Change and key indicators to monitor

Key findings will be agreed between the project team and the commissioner and discussed with the nominated board members.

Recommendations will not be made until the report is finalised with quality assurance and fact-checking having been undertaken.

The SA team leads and the analytical coordinator will discuss and agree recommendations. This group has responsibility for ensuring that there is a sufficient threshold of evidence and that the issue is serious enough to warrant a recommendation.

Theories of change should be developed alongside the recommendations, and two to three outcome indicators for monitoring should be identified (see 4.7 Monitoring following analysis).

Recommendations will be of the form: review/change/do. If the issue doesn’t fit within our review/change/do approach, this may indicate the need for further work to allow it to be passed to investigations.

Not all analysis projects will result in recommendations.

 

4.6 Indicative timeline to publication

As illustrated in Figure 4.3, once the report has been finalised with key findings, conclusions and external fact-checking, around three months should be allowed prior to publication. The activities which need to be completed during this time are outlined in Table 4.1.

 

Table 4.1 Steps to publication once report is finalised

Activity | Who? | How long to allow
Development of recommendations, theories of change and outcome indicators | Agreed by project team and the commissioner along with other C band staff in SA, followed by discussion with nominated Board members, legal and communications leads | 4 weeks
Approval by Board/nominated members of strategic risks and messaging (dependent on significance) | SA paper to ET and then the Board | 3-4 weeks
Final sign-off by CEO and confirmation of planned publication date | SA summary of changes to CEO | 2 weeks
News release and lines to take | CSC communications lead with input from SA leads | 2-4 weeks
Comms plan produced by Senior Communications Officer and discussed/agreed at Comms meeting | CSC Comms to lead with input from SA leads | 2-3 weeks before publication
Communications meeting with CEO and Chair | CEO, Chair, CSC C2, Senior Comms Officer, SA C2, project analytical coordinator | 2 weeks before publication
Proofreading and preparing for html | Proofreading to be carried out by a member of SA who hasn’t been heavily involved in drafting the report; conversion to html to be carried out by CSC | 2 weeks
 

4.7 Monitoring following analysis

Not all analytical topics will result in monitoring.

Where a topic does result in monitoring it should be clearly defined and related to a specific topic or action. It should also include details of what is being monitored, when and how often and by which function within SA.

Recommendations will be of the form: review/change/do and will be supported by Theories of Change. Monitoring requirements for recommendations should be clearly set out at this stage and agreed as part of the development of these elements.

Note that this is non-statutory monitoring of developments or changes to policy, legislation or evidence related to analytical topics. This is different to any future statutory monitoring that ESS may be asked to undertake or monitoring following interventions made by the ISC team.


5. Ways of Working

5.1 Within Strategy and Analysis

In addition to project team meetings (Table 6.2) we hold six-monthly analytical review meetings for the SA team.

At the analytical review meetings all ongoing and planned projects will be discussed. This gives us the opportunity to share information across projects and across the three analytical teams (policy analysis, quantitative analysis and data, scientific analysis), including:

  • progress, conclusions and recommendations
  • discussion of lessons learned
  • commonalities across projects

These meetings will also be used to review the analytical process as needed and the forward work plan.

 

5.2 Information sharing across ESS

While this guidance is primarily aimed at the SA team, we work closely with ISC and CSC.

As part of this we hold quarterly ESS C band analytical review meetings. The purpose of these meetings is to:

  • share information and best practice
  • keep updated without need to attend every project team discussion
  • provide an update on every analytical project and (pre-) investigation
  • update on progress and expected timelines towards outputs
  • understand any implications for forward planning (e.g. clashes with publication dates)
  • understand and plan for the involvement of other teams in projects (e.g. the need for legal, communications or financial expertise)

 

5.3 Passing issues to ISC

ISC can commission work from SA (Figure 4.1 Schematic of process for commissioning analysis). Similarly, while undertaking analytical work SA may identify issues which fall within the remit of ISC (Figure 5.1).

Although this should be a consideration in every project team meeting, issues are most likely to be identified towards the end of the analysis and may require further work to evidence the problem.

 

Figure 5.1 Decision tree for passing issues to ISC

 

5.4 Executive team and board progress updates

ET are provided with a paper fortnightly by the Head of SA updating on progress and current timelines to completion. A progress paper is also provided for each Board meeting.

Key messages/recommendations from projects are included in these papers at appropriate points in the project (see Figure 4.3 Illustrative timeline for detailed analytical report and Table 4.1 Steps to publication once report is finalised).

The Head of SA has the delegated authority to sign off analysis reports. However, there should first be a discussion with nominated Board members of conclusions and recommendations and, if significance factors are present, the report may need to be discussed at a Board meeting. See 6.4 Analytical sign-off.


6. Roles and Responsibilities

6.1 Analysis roles

Analysis roles are set out in Table 6.1. These roles mirror those set out in the Aqua Book.

Table 6.1 Roles and responsibilities within analytical projects

Role | Description
Commissioner
  • senior person, accountable for the product meeting its objectives
  • providing leadership
  • responsible for signing off analysis specification and accountable for governance i.e. ensuring appropriate project documentation, managing risk
  • must ensure that those undertaking analysis understand context, are clear on likely risks and can determine what the appropriate analytical and quality assurance response should be
  • must understand strengths, limitations and uncertainty of analysis undertaken and be able to interpret and communicate results correctly
  • for detailed analysis projects, this will be one of the Strategy and Analysis team leads. For major investigations it will be the Head of ISC
  • for smaller projects this may be anyone in ESS who is seeking analysis to be undertaken
Analytical coordinator
  • analyst within SA assigned to coordinate and bring together the work (potentially including internal and external contributions) into one coherent final report
  • usually a member of the policy team
  • usually also working on own analysis to feed into the report
  • involved in project managing others’ contributions
  • liaises across analytical teams to keep project on track and delivering to deadlines
  • main point of contact with commissioner
Analytical assurer

 

  • SA team member responsible for providing overall assurance.
  • typically a senior analyst not directly involved in conducting the analysis
  • signs off analytical plan
  • the analytical assurer will complete an assurance checklist and provide a statement on the overall quality of the work to the commissioner when signing it off. The statement will set out the scope and level of quality assurance undertaken, key uncertainties, residual risk and confirmation that the analysis is fit for purpose
  • the scale and scope of analytical assurance will be proportionate to the scale of the project, but for detailed analysis projects leading to published analytical reports it is expected that assurance will be provided that:
    • the analysis undertaken aligns with the specification
    • the quality of sources used is sufficient
    • appropriate methodologies have been employed
    • that within-team quality checks have been undertaken to ensure accuracy
    • that information on assumptions, limitations and uncertainty is presented clearly
    • risk-based checks of references within the final report – checking that they are used correctly, with a light touch review of the reliability of each source. Spot-check a sample of references used for background information; check in detail those that are fundamental to recommendations
    • that the analytical coordinator has involved the quantitative/science teams where calculations have been made or scientific evidence summarised by other teams, particularly where a recommendation relies on it
    • that the evidence supports recommendations made, including in relation to future investigation or analysis
  • the quality assurance statement should be reviewed after the external fact-check
Lead analyst(s)

 

  • the person(s) responsible for delivering discrete pieces of analysis and producing proportionate documentation, including on the strengths, limitations and uncertainty in the analysis
  • for most projects relating to the analytical priorities, a lead analyst will be required from each of the policy, science and quantitative teams. The analytical coordinator will be one of the lead analysts involved
  • for smaller, more discrete pieces of work, only one analyst may be required and in this case the lead analyst and coordinator roles are the same
Investigations lead

 

  • where the ESS ISC team is not the commissioner, a member of the ISC team will be nominated by its C2 to be involved in the analysis project
  • they will be the lead point of contact on the work and represent ISC’s views on project teams. Their role is to provide a link to representations received and investigations underway
  • they will assist in assessing whether the analysis should lead to an issue being considered for (pre-) investigation
  • where the lead has relevant background knowledge and expertise, this is also welcome
CSC lead(s)

 

  • required where an analysis project is intended for publication or where there may be budget and/or legal implications
  • at appropriate points in the project, the CSC communications, governance and legal representatives should be included. Their roles on the project team relate to their particular areas of expertise e.g. the communications required around a published report
  • ESS’ legal advisor should be consulted at the outset and updated at regular points as appropriate

 

6.2 Project teams

Not every role is required for every analysis project but as a minimum the commissioner, analytical assurer and analytical coordinator are required. For larger projects (e.g. leading to a publication) CSC leads (legal and communications) will need to be included in the project team. If any work is being externally commissioned CSC should be included for budget and procurement purposes. For analytical reports an investigations lead will be required.

The commissioner and coordinator should discuss from the outset which other roles will be needed and invite these to form a project team. At an early stage, an initiation meeting should be set up to ensure a common understanding of the problem/purpose, type of product required, research questions (if required), context, scope, limitations, complexities and outputs required.

The team should agree on documentation requirements, proportionate to scope of project. Project teams remain in place to the point of agreeing key findings and conclusions.

 

Example project teams[1]

Example Project team 1: Detailed analysis project leading to analytical report – Sewage discharge into the aquatic environment

    • commissioner – Head of SA
    • analytical assurer – Head of Policy Analysis and Horizon Scanning
    • analytical coordinator – Head of Data and Quantitative Analysis
    • lead analysts – 3 x senior analysts from science, quantitative and policy teams.
    • additional quality assurance – Principal Scientific Advisor + B2 Data Analysts
    • ISC lead – Senior Investigations Officer
    • communications – Senior Communications Officer
    • CSC – Head of CSC

 

Example Project team 2: Investigations support for climate change local authority duties investigation.

    • commissioner – Senior Investigations Officer
    • analytical assurer – Head of Data and Quantitative Analysis
    • lead analysts – Senior Analyst from policy team, Principal Scientific Advisor
    • communications – Senior Communications Officer

 

Example Project team 3: Investigations support for by-catch investigation

    • commissioner – Senior Investigations Officer
    • analytical assurer – Head of Data and Quantitative Analysis
    • analytical coordinator – Senior Analyst from Quantitative team

 

6.3 Project team meetings

The type and frequency of these meetings will vary according to the product being developed. For detailed analysis projects with a published output, the project team should meet regularly and no less frequently than every three months. For other products, the frequency of meetings should be discussed and agreed at the outset. Table 6.2 shows a schematic for expected project team meetings for scoping and analysis projects.

Table 6.2 Suggested key meetings within a project lifespan

Meeting | Suggested agenda | For which products?
Initiation
  • agree specification (signed off by Commissioner)
  • agree purpose of analysis and type of product required
  • context, scope, limitations and complexities
  • clearly defined research questions (if required)
  • analysis required e.g. data analysis, literature review, policy analysis, legal analysis or combinations
  • whether this is to be undertaken in-house or whether it should be commissioned out/additional expertise sourced; if the latter, advice from CSC C2 is required
  • any training needs for analysts involved
  • risks (to be kept under review)
  • timeframes taking account of timetable requirements
Products: All
Progress meeting(s)
  • review progress against plan
  • advise on managing risks and challenges
  • discuss emerging findings from analysis
  • consider whether any issues should be passed to ISC
Products: Scoping, analysis, others if needed.
Key finding meeting(s)
  • to discuss in detail all the evidence produced and agree the conclusions and key findings of the work
Products: Scoping, analysis, others if needed.
Post fact-check meeting
  • to discuss the implications of any feedback from stakeholders post fact-checking.
Products: Analysis, others if needed.
Concluding meeting
  • lessons learned – feedback & reflections
  • agree any future monitoring required for recommendations made to public authorities or of new data available
  • agree any potential future analysis
Products: All
Recommendations meeting
  • SA C band plus analytical coordinator to discuss key messages and develop recommendations
  • to agree next steps for report
Products: Analysis, others if needed.

Every project team meeting should consider whether any issues have been identified with sufficient evidence to be passed to the ISC team for consideration. Even where a potential topic of investigation is identified, further analysis may be required before it can be passed over. See section 5.3 Passing issues to ISC.

For every product, there should be a concluding project team meeting which considers lessons learned. These should then be shared with the wider SA team. For larger projects CSC can help with a lessons learned meeting.

 

6.4 Analytical sign-off

In addition to the quality assurance and analytical assurer roles, the three analytical function leads in SA (Head of Policy Analysis, Principal Scientific Advisor and Head of Data and Quantitative Analysis) should sign off every product produced (if relevant to their areas of expertise).

For detailed analytical projects it is likely that they will be involved in project teams throughout. However, this is not necessarily the case, and is unlikely for other products.

Therefore, relevant SA function leads should be involved at key points in analysis:

  • setting up the framework for undertaking policy analysis/quantitative analysis/literature reviews
  • reviewing and agreeing the output of that analysis at completion
  • discussing and agreeing how the analysis informs key messages and recommendations

This is particularly important where the original analysis is undertaken by someone outwith that function, and they should ensure the relevant team leads are engaged.

When sending reports for comment, areas for the analytical team leads to comment on should be clearly flagged. This should cover both where detailed analysis is presented and other parts of the report where quantitative analysis/science/policy have been referenced.

 


[1] These project team examples pre-date the appointment of the ESS legal advisor; we would now expect the legal advisor to be included in the project team for analytical projects.


7. Best Practice for Conducting Analysis

7.1 General principles

The lead analyst(s) on the project will deliver the analysis with RIGOUR in mind:

  • Repeatable: For the “same” inputs and constraints the analysis produces the “same” outputs
  • Independent: as far as reasonably possible free of prejudice or bias. Taking care to appropriately balance views across all stakeholders and experts
  • Grounded in reality: views and perceptions are challenged, and connections are made between the analysis and its real consequences
  • Objective: effective engagement and suitable challenge reduce potential bias and provide clarity about the interpretation of the analytical results
  • Uncertainty-managed: uncertainties have been identified, managed and communicated throughout the analytical process
  • Robust: provide the analytical result in the context of residual uncertainty and limitations to ensure it is used appropriately

QA should be ongoing throughout the project. Regular meetings with the analytical assurer are encouraged to keep them updated. Within-team QA should occur as analysis is produced, with the analytical assurer kept informed and adding further QA when the report is produced.

 

7.2 Use of DPSIR Framework

The ‘Drivers, Pressures, State, Impact, Responses’ (DPSIR) framework is a problem structuring method that can be used to help conceptualise, prioritise and communicate an area of focus for analytical work.

The initial entry point for scoping work may be a particular state (e.g. statutory target has not been met), impact (e.g. loss of biodiversity, reduced carbon sequestration) or response (e.g. specific legislation) that ESS has decided to scrutinise.

Considering the other categories within the DPSIR framework allows us to scrutinise whether, for example, a specific piece of legislation (‘response’) is effectively targeting the pressures that drive the environmental state or impact, or the extent and effectiveness of responses to particular environmental states or impacts.

A simple DPSIR diagram showing how the proposed prioritised area of focus fits within the broader context of a particular environmental issue can serve as a useful communication tool to summarise the prioritisation process and justification for what is in/out of scope for an analytical project.

We are developing detailed guidance and training on DPSIR.

 

7.3 Quantitative analysis

ESS voluntarily apply the principles of the Code of Practice for Statistics and will consider Trust, Quality and Value at the start of any quantitative analysis. Permanent members of the quantitative analysis team are badged by the Government Statistical Service (GSS) and follow professional guidance and best practice.

There will be a clear agreement between the commissioner, the project team and the Head of Quantitative Analysis and Data on the purpose of data analysis. The data team will then carry out initial scoping of data availability, quality and relevance to the question. Following this, next steps will be agreed with the commissioner, the project team and the Head of SA considering the value of the analysis and the resource requirement.

We will apply a proportionate approach to quality assurance, with the minimum being QA within the quantitative analysis team, and external QA as appropriate. See Table 8.1 Key steps for QA of quantitative analysis.

 

7.4 Reviewing scientific evidence

The scientific analysis team can support analytical projects by providing scientific evidence reviews. These can be provided in one of three formats, depending on needs and available time:

  1. Quick Scoping Review: provides an indication of the volume and characteristics of an evidence base, and a rapid synthesis of what that evidence indicates in relation to a question – no critical appraisal. Prioritises review articles. These will take one week to two months and can be used for responses to Committee queries or as input into scoping reports
  2. Rapid Evidence Assessments: identify relevant evidence available on a topic, summarise and provide a critical assessment of the evidence. Follows standard systematic review procedures based on steps modified to achieve rapid findings. It is time-sensitive and undertaken to quickly find useful information or data on a subject/topic. These will take two to six months and provide short contributions to analytical reports in relation to defined specific questions e.g. sources of marine litter
  3. Systematic reviews: attempt to find all published and unpublished evidence related to a specific scrutiny question – using literature search methodologies that are designed to be transparent, unbiased and reproducible. They seek to categorise the quality of research and attempt to explain discrepancies in findings across research studies. These will take six to 12 months and are used to produce a full analytical report focused on understanding the evidence on a particular environmental topic e.g. the Antimicrobial Resistance baseline evidence review

Scientific evidence reviews will consider the credibility, transferability and dependability of the research using the following questions as a guide.

Credibility:

  • has the source been peer reviewed, are any biases declared and considered?
  • are the uncertainties considered and communicated appropriately?

Transferability: this is an assessment of the context of the research and its methods and the ability to transfer research findings to the setting, group or geography being considered in ESS’ analysis, for example:

  • have the limitations, dependencies and conditions of the study been considered appropriately when conclusions/findings are used out of the context of the original study?
  • is the context of the assessment or conclusion appropriate to the level of uncertainty?

Dependability:

  • can the decision trail of the researcher be followed? e.g. sampling techniques/presentation of findings?
  • are any assumptions made reasonable and well documented?
  • are any inferences properly caveated?

See also Table 8.2 Key steps for QA of scientific evidence reviews.

 

7.5 Policy analysis

Policy and legislative analysis outputs include:

  • legislative rapid reviews (previously called ‘Phase 1s’)
  • consultation responses
  • briefing notes
  • information notes
  • letters to stakeholders (including Parliament, etc)
  • policy or legislative reviews for other ESS departments

These should normally be undertaken by the policy analysis team using standard templates with QA and sign-off within team.

See Table 8.3 Key steps for QA of policy analysis.

 

7.6 Commissioned analysis

Where analysis is to be commissioned externally, early discussion with the Head of CSC should take place to agree the most appropriate way to source the work, potential timescales and resource implications.

Budget implications should be discussed with the Head of SA initially and then (depending on cost) the CEO, ET and/or Board.

Where analysis is to be commissioned externally it is vital that the reason for external commissioning is clear and that the purpose and expected output from the work are made clear at the outset. The detailed methodology and timeline to deliver the product should be proposed by the contractor and agreed at the initiation meeting – but, first and foremost, ESS must be clear on what question(s) it is seeking answers to and the scope of the analysis.


8. Quality Assurance of Analysis

8.1 Our general approach to quality assurance

Much of our analysis will involve synthesising and interpreting existing literature and assessing its applicability to new contexts and situations, rather than new primary research.

Communication of the results and the associated uncertainties and limitations is very important. The analytical assurer (see Table 6.1 Roles and responsibilities within analytical projects) must be content that the final report presents a true representation of the analysis undertaken and evidence reviewed.

Analytical quality assurance involves verifying and validating the analysis – i.e. that the analysis has been conducted as planned and that it is the right analysis. The scale and scope of these activities need to be proportionate to the purpose and constraints of the analysis.

 

8.2 Quantitative quality assurance

Table 8.1 Key steps for QA of quantitative analysis

Lead analyst(s) – throughout | Within profession QA – throughout | Analytical assurer – engaged throughout but adds further QA at report stage
Adherence to relevant standards (e.g. coding standards) | Checks that standards are adhered to | Is assured that standards have been applied
Quality assures the datasets feeding into analysis using the template, taking account of quality issues, limitations and uncertainty identified | Spot checks that it agrees with the lead analyst’s assessment of quality | Checks that the final report includes information on quality, limitations and uncertainty and agrees with the assessment
Undertakes QA throughout analysis* | Undertakes further QA checks of any data transformations, code and outputs | Reviews the analyst QA log for assurance
Maintains a log of verification and validation checks undertaken, at what points in the process and any resulting changes | Updates the log with further QA undertaken, at what points in the process and any resulting changes | Checks that the QA undertaken is appropriate to the complexity and risk of the analysis**
Sets up detailed peer review if analysis is especially complex | Supports the lead analyst to review and implement the results of peer review | Checks that peer review has been undertaken and comments implemented
Documents the analysis as it is undertaken (either in code with a brief summary in the report, or in methodology notes if more complex analysis is undertaken) | Checks the code and/or methodology notes produced | Checks with the lead analyst that appropriate information has been produced
* For example, this may involve following through an input example to check they get the same result; coding the problem in different ways to check the outcome is the same or appropriately comparable; or comparisons to other analysis/historical data. A minimal sketch of such a check is given after this table.
** It is expected that more effort will be required when complex analytical techniques are used, when a novel approach is adopted, when the issue is particularly critical or controversial, when results are required to a particularly high level of precision and accuracy, when data sources are uncertain or of poorer quality, or when there is limited evidence to provide challenge of the results.
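
As a minimal sketch of the dual-implementation check described in the first footnote above (the data, tolerance and log structure are illustrative assumptions, not ESS code), an analyst might compute the same statistic in two independent ways, confirm the results agree, and record the check in a QA log:

# Illustrative verification sketch: compute the same statistic two ways and
# confirm the results agree. Data, tolerance and log format are hypothetical.
import math

observations = [2.1, 3.4, 2.9, 4.0, 3.3]  # hypothetical input data

def mean_direct(values):
    """Mean computed directly from its definition."""
    return sum(values) / len(values)

def mean_incremental(values):
    """Mean computed a second, independent way (running update)."""
    mean = 0.0
    for i, v in enumerate(values, start=1):
        mean += (v - mean) / i
    return mean

qa_log = []  # simple log of verification checks and their outcomes

a = mean_direct(observations)
b = mean_incremental(observations)
passed = math.isclose(a, b, rel_tol=1e-12)
qa_log.append({
    "check": "mean computed two independent ways",
    "result": "pass" if passed else "FAIL",
    "values": (a, b),
})
assert passed, f"Verification failed: {a} != {b}"

The same pattern extends to the other checks in the footnote: comparisons against historical data or an alternative implementation can be appended to the same log, giving the analytical assurer a QA log to review.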

 

 

8.3 Scientific evidence reviews quality assurance

Table 8.2 Key steps for QA of scientific evidence reviews

Lead analyst(s) – throughout | Within profession QA – throughout
Checks that literature is robust, timely and relevant to the project | –
Considers the uncertainty, risk, limitations and constraints of the research reviewed and the implications for applying the research to a new context | –
Checks that it has been sourced from the agreed academic databases | –
Keeps a record of any issues or concerns about the quality, relevance or lack of supporting evidence in each source | –
Assesses the credibility, transferability and dependability of each source* | –
Maintains a log of quality assurance undertaken, at what points in the process and any resulting changes | Updates the log with further QA undertaken, at what points in the process and any resulting changes

 

8.4 Quality assurance of policy analysis

Table 8.3 Key steps for QA of policy analysis

Lead analyst(s) – throughout | Within profession QA – throughout | Analytical assurer – engaged throughout but adds further QA at report stage
Checks that sources are robust, timely and relevant to the project | Spot checks that it agrees with the lead analyst’s assessment of sources | Is assured that a robust approach to the evidence used has been applied
Refers to, updates or, where required, develops a legislative rapid review on the topic, ensuring that it includes all relevant legislation and is up to date | Spot checks that it agrees with the lead analyst’s assessment of legislation and policy | Reviews the analyst and QA approach for assurance as required
Undertakes QA throughout analysis | Undertakes further QA checks of key elements of the work | Reviews the analyst and QA approach for assurance as required
Keeps a record of any issues or concerns about policy, regulatory or legislative evidence used | Updates the log with further QA undertaken, at what points in the process and any resulting changes | Checks that the QA undertaken is appropriate to the complexity and risk of the analysis
Ensures that an appropriately rigorous review of policy, regulatory and legislative developments has been undertaken to support the project through horizon scanning and targeted research | Spot checks that it agrees with the lead analyst’s assessment of legislation and policy | Reviews the analyst and QA approach for assurance as required

 

8.5 Quality assurance of grey literature

ESS’ analysis of topics regularly considers grey literature as part of the evidence base. This includes a range of material that is produced outside of traditional academic (or commercial) publication routes such as reports, media articles, strategies, etc.

Many of ESS’ analytical projects will bring together and synthesise analysis from multiple teams into an overall report with additional background/contextual information and conclusions.

Whilst each of the individual products will be subject to within-team quality assurance, we need to ensure that the analytical coordinator applies QA thinking while producing the final report, before this is passed to the analytical assurer.

The coordinator should consider the sources used and make a reasonable assessment of their relevance and potential for bias. The assurer should check that they would reach the same conclusions from the same evidence.

Particular red flags would be conclusions/recommendations based on only one evidence source (where we should be confident it is good enough) and conclusions/recommendations based only on grey literature from non-official sources.

Analysts should use the grey literature QA checklist to inform which sources are included.

 
