Design an evaluation process and impact assessment

Designing Educational Programmes

Ensure that outcomes are based on the content of the evaluation and impact assessment. Skills to connect evaluation and impact assessments with relevant conclusions for further learning.

How can the findings be reported and their use supported?

Make the outcomes of any evaluation and impact study more visible and clear so that they can be put to better use in future training.

Introduction:

The evaluation report should be structured in a way that reflects the purpose and questions of the evaluation, and it should clearly propose changes and adaptations for future iterations of the training model.

Specific evaluative rubrics should be used to ‘interpret’ the evidence and to determine which considerations are critically important or urgent. Evidence on multiple dimensions should then be synthesized to generate answers to the high-level evaluative questions; the sketch below illustrates this two-step logic.
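
The following is a minimal sketch of that logic, not a method prescribed by the source: the rubric thresholds, dimension names, and scores are all illustrative assumptions. It shows how a rubric can translate raw evidence into an evaluative rating, and how ratings on multiple dimensions can then be gathered for synthesis.

```python
# Minimal sketch of rubric-based interpretation and synthesis.
# All thresholds, dimensions, and scores are illustrative assumptions,
# not taken from the source article.

# Hypothetical rubric: maps an evidence score (0-1) to an evaluative rating.
RUBRIC = [
    (0.90, "excellent"),
    (0.75, "good"),
    (0.50, "adequate"),
    (0.00, "poor"),
]

def rate(score: float) -> str:
    """Interpret a single evidence score against the rubric."""
    for threshold, rating in RUBRIC:
        if score >= threshold:
            return rating
    return "poor"

# Hypothetical evidence on multiple dimensions of merit.
evidence = {"relevance": 0.92, "effectiveness": 0.78, "sustainability": 0.55}

# Step 1: interpret each dimension; step 2: gather the ratings so they can
# be synthesized into answers to the high-level evaluative questions.
ratings = {dimension: rate(score) for dimension, score in evidence.items()}
print(ratings)
# {'relevance': 'excellent', 'effectiveness': 'good', 'sustainability': 'adequate'}
```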

Content:

The structure of an evaluation report can do a great deal to encourage the succinct reporting of direct answers to evaluative questions, backed up by enough detail about the evaluative reasoning and methodology to allow the reader to follow the logic and clearly see the evidence base.

A facilitator should be able to design and plan the evaluation with future steps already in mind, rather than focusing only on the present or on a short-term view of how the evaluation data will be used.

The following recommendations will help to set clear expectations for evaluation reports that are strong on evaluative reasoning:

  1. The executive summary must contain direct and explicitly evaluative answers to the questions used to guide the whole evaluation.
  2. Explicitly evaluative language must be used when presenting findings (rather than value-neutral language that merely describes findings), and examples of such language should be provided.
  3. Clear and simple data visualizations should be used to present easy-to-understand ‘snapshots’ of how the intervention has performed on the various dimensions of merit (see the sketch after this list).
  4. The findings section should be structured with the evaluation questions as subheadings (rather than by types and sources of evidence, as is frequently done).
  5. There must be clarity and transparency about the evaluative reasoning used, with explanations clearly understandable both to non-evaluators and to readers without deep content expertise in the subject matter. These explanations should be broad and brief in the main body of the report, with more detail available in annexes.
  6. If the evaluative rubrics are relatively small, they should be included in the main body of the report. If they are large, a brief summary of at least one or two should be included in the main body, with all rubrics included in full in an annex.
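
As an illustration of point 3, here is a minimal sketch of such a ‘snapshot’ visualization using matplotlib; the dimensions, ratings, and file name are hypothetical, and any charting tool the audience already uses would serve equally well.

```python
# Minimal sketch of a performance 'snapshot' across dimensions of merit.
# The dimensions, ratings, and output file name are illustrative assumptions.
import matplotlib.pyplot as plt

dimensions = ["Relevance", "Effectiveness", "Efficiency", "Sustainability"]
ratings = [4, 3, 2, 3]  # hypothetical rubric ratings: 1 = poor ... 4 = excellent

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(dimensions, ratings, color="steelblue")
ax.set_xlim(0, 4)
ax.set_xlabel("Rubric rating (1 = poor, 4 = excellent)")
ax.set_title("Performance on each dimension of merit")
fig.tight_layout()
fig.savefig("performance_snapshot.png")  # one-page 'snapshot' for the report
```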

Exercises:

How can you apply this in everyday work?

As you prepare your next evaluation, could you start thinking about the final format of the report?

When preparing your next training, it is important to address the following points while designing the evaluation:

  • How does the audience prefer to receive information – text, graphics, numbers, written, visual or a mixture of all of these?
  • What is the preferred length (or duration if an audio/visual presentation)?
  • What access does the audience have to information technology (this may inform whether you use web-based formats)?
  • What is the purpose of the report and how does this inform the choice of format? Purposes may include:
    • keeping stakeholders engaged during an evaluation
    • providing feedback to and maintaining the commitment of people collecting data during implementation
    • flagging emerging findings and implications for ongoing program development and for the evaluation
    • presenting interim recommendations
    • seeking feedback on draft reports to assist in identifying causal factors
    • informing planning, funding or policy decisions
    • broader dissemination of findings to support their use

Reflection Questions:

  • How often do I think in advance about the final use I will make of a training evaluation?
  • Do I plan strategically, or only one activity ahead?
  • Am I able to prepare a report that connects all of the above elements in one short, simple document?

Federica de Micheli

A training focusing on participation as a methodology (not only as a topic) rests on a value premise: belief in the empowerment of all learners and in supporting the equal participation of those with fewer opportunities or in situations of disadvantage (temporary or long-term). The focus of participatory training is not just about ‘knowing more’ but about…

Source: betterevaluation.org
