Effective evaluation of ICT for education in Africa

Today I attended a pre-conference workshop on the effective evaluation of ICT for education in Africa, hosted by David Hollow, a doctoral researcher at the University of London (Royal Holloway). Below are my notes.

In groups we discussed the following questions related to monitoring and evaluation (M&E):

  1. What is the most significant challenge facing effective evaluation of ICT for education programmes?

  2. Why do you think this?

  3. How do you think the challenge can be addressed?

Some suggestions for what must change in M&E:

  • It is important to give M&E teeth, in other words to tie it to funding, staff promotion, performance appraisals of project stakeholders, etc.

  • Funders, donors and administrative organs must drive effective M&E. It must be required and included in the project — not summatively at the end but throughout the project.

  • Multidisciplinary approaches must be taken in M&E, especially within universities.

  • Develop skills of monitors and evaluators.

  • When running a project across a diverse range of schools, it can be problematic to apply a single evaluation approach. Well-resourced schools will respond differently from poorly resourced ones, yet we tend to apply the same impact measurements to both.

Presentation 1: John Traxler, of the University of Wolverhampton, presented on some of the legal and ethical challenges of M&E:

Informed consent:

  • Obtaining informed consent can be difficult, e.g. when parental permission is required.

  • Participants may be exposed to risk, e.g. by being given expensive mobile phones in a poor community, or by being asked potentially embarrassing questions in a focus group discussion.

  • Participant withdrawal: How does this affect the project results?

  • Financial or in-kind compensation: What? How much?

  • Data may be used by other organisations not initially part of the project.

Power, class, difference:

  • Evaluation often works across differentials in power and class.

Presentation 2: Til Schoenherr, inWEnt, presented on Capacity building in elearning: unintended outcomes. inWEnt runs a number of elearning capacity building projects. Below are some of the unintended outcomes that he put down to elearning:

  • In the alumni network of trainees, cultural and religious diversity has not led to any racism or discrimination.

  • Changed teaching/learning patterns, e.g. teachers coach rather than instruct.

Presentation 3: Bjorn Everts, Education Manager, Eduvision, presented on Ethiopia XO-5000 and spoke about the benefits of conducting M&E throughout the project life cycle. In the Ethiopia XO-5000 project, 5,000 XO laptops were introduced into five Ethiopian schools. Eduvision developed the software for the laptops. They conducted a three-stage evaluation (before, during and after) to assess the feasibility and impact of introducing “innovative learning” in Ethiopian schools, following a multi-method approach that combined quantitative and qualitative methods. Some findings:

  • A challenge was that there was no initial consensus among project partners about the aims of the M&E, or even what M&E is.

  • It is important to create feedback loops throughout the project and to constantly revise your plan. When initially refining the plan, it is useful to take the M&E plan to each project stakeholder and talk it through with them; this can prevent mistakes.

  • Remember to keep focussing on the aims and objectives.

Important lessons learned:

  • Spend as much time as possible building local capacity for feedback, input and self-reflection. The teachers were best suited to monitor the project.

  • Conduct as many of the methods in the local language as possible.

  • Ensure all parties understand M&E before proceeding. Don’t assume everyone knows what M&E is. Explain everything in layman’s terms.

  • Very important to emphasise that you don’t want canned feedback. The most valuable feedback is honest feedback.

  • Be flexible with multiple partners.

Two of the more obvious points are:

  • Ethics and buy-in: comprehensively inform your participants what the M&E process is about and how important their participation is.

  • Document the indirect effects of M&E and communicate these to managers.

Presentation 4: Prof. Tim Unwin presented findings from a recent one-day workshop on M&E processes of ICT in education projects in the Middle East.

  • You can’t change education overnight. So, what are the short-term and long-term indicators for M&E? What are we looking to change and therefore measure?

  • Dissemination of M&E findings is very important, but we need to overcome the fear of sharing negative results. The real issue is not always the findings, but the process.

  • Tim asked how we can encourage a culture of M&E across the whole project team. It was suggested that M&E be included in the training of the team, and that we keep pushing the message that M&E is about identifying gaps in projects or organisations: the end result is not to lay blame but to improve.

  • An interesting question was: should we run pilot projects at all, or start at scale? Challenges and problems often only emerge when we go to scale, so why not go big from the start?

Presentation 5: A Canadian academic presented a case study of a palliative care training project in Canada. She noted that in the input → process → output framework of projects, inputs are both individual and organisational, especially around the goals of the project. For example, what does the organisation want (e.g. x number of bums on seats in a training intervention), and what do the beneficiaries of the project want out of it (to learn how to provide palliative care)? Defining these different stakeholder goals upfront, and making sure that all stakeholders are aware of the full set of goals, is very important for effective M&E.

Final comments:

  • M&E of ICT for education projects is hugely challenging. Because of this, effective M&E is often marginalised or trivialised.

  • Every situation has a unique context. There is no one-size-fits-all approach to M&E – but that doesn’t mean that there aren’t good practices and lessons to learn from other efforts.

  • One of the participants asked for practical advice on how to conduct M&E by non-experts, with no budget. An “M&E for Dummies”.

  • Think about how best to communicate the findings to the stakeholders.

  • There aren’t that many experts in this space. Have the courage to try your own approaches, or the approaches of others, even if those others are not formal experts in the field.

  • Important to have a framework: plan, process, indicators, partnerships, etc.

  • David Hollow listed the four Ps of M&E:
    • Stakeholder participation
    • Partnership
    • Plurality of methods
    • Focus on process (and not outputs)

For me, a key affirmation from the final discussion was that we should not be afraid to follow M&E methods that might not be well known but are good fits for our projects. There are not many true experts in this field, and with constant budget and time constraints, we often have to conduct M&E ourselves.
