
Archive for the ‘Toolkit’ Category

D3-final evaluation

We have spent the last 18 months working together, and working hard at that. We wanted to ensure participants had a chance to thank each other for all the hard work, sharing of learning, and tremendous effort that has gone into the Standing Team. Therefore, all participants were asked to write their name on a sheet of paper.

They then passed it to another person at the table, who wrote a brief, anonymous note about what they had learned from this person.

The paper was then passed on until every participant had written something on it.

At the end of the workshop, each participant received their letter, as a way to thank them for their contribution to the workshop and to the Standing Team.

Read Full Post »

D3-zip zap zup

To start the day, we did a quick and active energizer: zip! zap! zup!

Read Full Post »

D2-final energizer

Participants stand in a circle.

The facilitator says…

1…2…3… Look down

(and everyone looks at the shoes of someone)

1…2…3… Look UP!

(and everyone looks at the eyes of the person whose shoes they were looking at)

If two people look at each other, they are out!

This is repeated until only the winners remain in the circle.

Read Full Post »

IMG_2830

Adhong helped us to recap the first day.

Participants stood in a circle and had to say…

“My name is …”

“I am going to the moon”

“I am bringing [something that begins with the first letter of the name]”

The challenge was to bring to the moon something that was linked to yesterday’s activities…

but we ended up bringing to the moon many things connected to the Standing Team, but not to yesterday’s activities!

So a recap by Adhong followed.

Read Full Post »

DSC_3451

We first shared some of the feelings and ideas we had when looking back at our work.

timeline

We then went through the whole timeline, looking at how activities unfolded, connected and developed.

And we then looked at the individual activities, focusing on the key learnings and challenges around these.

After looking at the timeline, participants discussed in small groups.

  • What is a surprise to you?
  • What didn’t you know? What do you want to know more about?

A plenary discussion on this followed, and we captured the main highlights on a flipchart.

It was amazing to see how much good work had been done in the last year!

Read Full Post »

DSC_3284

Sharp on time, we start the workshop!
Sarah gives an inspirational introduction to the workshop.
Lex Kassenberg, Country Director of CARE Nepal, welcomes us here.

Read Full Post »

The CARE Standing Team are committed to sharing learning with all of us, and we are lucky that they have submitted another dispatch about their meeting! Angela submits this piece to us:

The Standing Team meets just once a year, so how do we make the most of our time together? How do we learn from each other? How do we train up the newer members? How do we move accountability practices forward? I’ve captured here some of the tricks that help us make great use of a short time together.

Talking accountability over dinner

Our workshop has a multi-purpose design beyond the accountability theme, and allows each and every one of us to design and deliver a session across the week.  We pair up a more seasoned member with a newer one in a “buddy system”.  Typically, ahead of the workshop, a needs assessment is done to determine the interests, challenges and expectations around the theme, and an interactive session is designed around this.  We learn a lot about various aspects of accountability, with an emphasis on practice.  Today we looked at feedback, complaints and response mechanisms, as well as ways in which we review CARE’s performance in emergencies and against our humanitarian accountability framework. What is different from other workshops – and a key feature of ours – is that facilitators also receive feedback on their session design and delivery, so as to build their skills in facilitation and workshop design too.

prop from workshop session

Today I’m left astounded by the thought and creativity that goes into the design.  Modelled on the famous TV programme, we played “Who Wants To Be An After Action Review Expert”.  Whilst the million dollar prize may not materialise for some time (where’s the accountability there?!), we were left with a brilliant example of smart session design.  Well tailored to the afternoon hump session, participants were lured in turn into the bright orange corona of a hot seat to answer questions on conducting after action reviews.  The questions were thoughtfully designed and drew out rich conversation on what after action reviews are (and are not!), the lessons learned from previous reviews conducted by the facilitators, the key questions being debated within the organisation (to draw comments and insight from the participants to feed into the debate), and tips and tricks for facilitating after action reviews.  A great example of how a cleverly designed and delivered session can achieve a lot in a short time and still be fun: everyone’s a winner.

Read Full Post »

The April 2012 research report, Building a Better Response: Gaps & Good Practice in Training for Humanitarian Reform, by Andy Featherstone, discusses NGO initiatives in training staff in humanitarian reform.  This includes humanitarian leadership, the cluster approach, pooled funding and general coordination.  The study found that current humanitarian reform training methodology, i.e. the teaching style, is not meeting the needs of those trained.

Key Findings

Humanitarian workers tend to prefer learning-by-doing and simulations. Examples of learning-by-doing include one staff member coaching or mentoring another, or placing staff in emergencies as part of their training. The report states:

While there are no easy solutions, existing knowledge certainly suggest the use of innovative and creative approaches to learning rather than formal techniques such as classroom-based methods.

The study also found international and national NGO field staff receive the least training in humanitarian response, while middle and senior managers and technical coordinators from the UN and international NGOs participate in training the most. Thus, training needs to be made available at the local level—not just in capital cities—for front-line humanitarian staff.

The report did, however, acknowledge ECB and the Consortium of British Humanitarian Agencies for their ENHAnce project, an in-country training program for national staff. The project “addresses some of the more frequent criticisms of training in the sector, using a mixture of methods which includes learning-by doing through on-the-job coaching and distance learning.”

Tips for Adult Learning

A four-day workshop held by the Humanitarian Accountability Partnership (HAP) in April on how to facilitate accountability trainings also addressed this issue of learning style. The Training of Trainers on the 2010 HAP Accountability Standard workshop included a session on adult learning in the classroom. While this learning style is not learning-by-doing in the field, it is more participatory than the traditional classroom-based method.

Here are a few pointers on adult learning:

Learning should be engaging and participatory

  • Encourage participants to share their thoughts and experiences
  • Change activity every 30 minutes
  • Use examples to which participants can relate through their lives or work experience

Use a variety of education styles, media and activities, such as

  • Interactive lectures (ask questions, encourage discussion between participants, promote participant sharing of their knowledge and experience)
  • Group discussions/exercises
  • Role play (learners practice using new knowledge or skills in a simulated situation, can be scripted or improvised, is discussed afterwards)
  • Quizzes (reinforce learning and serve as a different presentation of the information)
  • Questions (to determine participants’ knowledge and understanding)
  • Energizers (a short, fun activity that provides a break, can be related or unrelated to the topic of the learning, can build rapport between participants, can involve moving around)

Ask participants to

  • Explain complex issues
  • Describe how they would apply the learning to their jobs
  • Repeat key ideas during the reviews

Check out this blog on the findings of the Building a Better Response report, as well as this blog, submitted by Standing Team member Piva (Mercy Corps), for more on the HAP workshop.

Read Full Post »

Hopefully you are familiar with ECB’s Good Enough Guide (GEG) (see this previous blog post) and its communication materials. This week, we interviewed Lucy Heaven Taylor, an AIM Advisor from Oxfam GB, to hear the fascinating story of how these materials were developed! Lucy co-managed this project, and this is what we found out:

Soon after the publication of the GEG, ECHO announced a call for proposals for developing inter-agency capacity. Given the popularity of the Guide, the Accountability and Impact Measurement (AIM) Advisors and Oxfam decided to propose a project to develop materials to communicate the important principles of the GEG to agency staff and beneficiaries.

First, the project co-managers, Lucy and Julian Srodecki (a former AIM Adviser for World Vision), conducted a large survey of practitioners through Survey Monkey and key informant interviews in order to find out which forms of communication were preferred. From hundreds of responses, they found that posters and leaflets were the most popular materials used to communicate key messages.

They then moved on to conduct a literature review on the practice of communication across different cultures. This uncovered useful information, such as the fact that the color red does not universally signify “stop.” They learned that in order to create materials and images to which people will respond and relate, the materials needed to be developed with the people themselves.

Five regions were chosen in which to develop the materials: Latin America, Sub-Saharan Africa, the Middle East, South Asia and Southeast Asia. The idea was to produce context-specific images in each of the regions as examples for humanitarian organizations, so that they could develop materials appropriate to their own geographic and linguistic context. ECB staff went to Bolivia, Kenya, Lebanon, Bangladesh and Myanmar to work on the materials with disaster affected people and local artists. The artist in each country created images to represent the people’s perceptions of disasters, of their rights and of themselves. The community members provided feedback on the images until the artist got it right. For example, in Bangladesh the artist created an image of somebody pointing, but the community thought he was holding a gun!  As you can see, their feedback was crucial!  Once the drawing was approved, the image was printed and tested in the same community.

It was interesting for the ECB staff to find that the community members preferred colored drawings to line drawings or photographs. They also preferred figures of people looking at them with recognizable facial features. In addition, it was discovered that people like to see images of themselves not exactly how they look, but instead represented in a more positive light.

Initially it was planned that the posters and leaflets would have no words because of a largely illiterate audience, but it proved too difficult to portray the messages without them. Thus it was decided that the materials would include words, and a literate person could relay the message to those needing assistance. The specific wording for the posters and leaflets was agreed upon by the steering committee for the project. Half of the posters were designed for beneficiaries, to be displayed in public to raise awareness of people’s rights. Other posters were developed for agency staff, to raise awareness of the practice of accountability and to be posted in offices. The leaflets were designed to teach the principles of accountability and the GEG to agency staff. Both were printed in English, Arabic, Bangla, Burmese, Spanish and French.

 The videos were developed in a similar consultative fashion, with disaster affected communities in Bangladesh, Ethiopia, and Bolivia. These videos show staff and beneficiaries talking about the principles of the GEG, and they are designed to be viewed by agency staff for training purposes.

The project was truly a collaboration of member agencies of the ECB. It was co-led by Oxfam and World Vision, with a Steering Committee comprising a cross-section of members, including CARE, Mercy Corps and ECB secretariat staff. The field work was undertaken by World Vision, CARE, Oxfam and ECB staff, and drew on experience from different agencies’ programmes.

The other successful component of this project was that it not only sought to promote the practice of accountability in emergencies – accountability was also practiced while developing the materials! The collaboration of ECB agencies and consultation with the communities were the key to its success.

Read Full Post »

In June of 2011, the ECB Project published the latest version of What we know about joint evaluations of humanitarian action: Learning from NGO Experiences. This paper aims to share the experiences and learnings of NGO staff who have conducted joint evaluations and to serve as a resource for agencies considering joint evaluations in the future.

The Guide section of the booklet can be considered a ‘how‐to’ for those closely involved in joint evaluations. It discusses the benefits and disadvantages of the process, and what to do before, during and after a joint evaluation.

The Stories section shares three case studies from the ECB Project’s experiences.

  1. Joint Independent Evaluation of the Humanitarian Response of CARE, Catholic Relief Services, Save the Children and World Vision to the 2005 Food Crisis in the Republic of Niger
  2. Multi‐Agency Evaluation of the Response to the Emergency Created By Tropical Storm Stan in Guatemala – CARE, Catholic Relief Services, Oxfam
  3. CARE, Catholic Relief Services, Save the Children and World Vision Indonesia Joint Evaluation of their Responses to the Yogyakarta Earthquake in Indonesia

The Tools section includes templates and tools that can be adapted for evaluations, including sample terms of reference, agreement documents, a joint evaluation readiness checklist, and suggested topics for discussion with prospective partner agencies.

Advantages of a Joint Evaluation

  • Like a single‐agency evaluation, a joint evaluation provides an opportunity to learn from past action so as to improve future decision‐making.
  • It allows agencies to see a bigger picture of the collective response and what gaps still exist.
  • By looking at the separate responses of different agencies side by side, you can see where a coordinated effort would have been beneficial and can plan accordingly for the next response.

“Evaluation reports repeatedly show that better coordination would have led to a more effective response.”

  • When agencies open up to one another by sharing weaknesses and strengths, they increase transparency and make it easier for them to hold one another accountable for acting upon the recommendations.
  • Conducting the evaluation with other agencies allows sharing of perspectives and technical knowledge and builds trust for future cooperation.

Disadvantages

  • It takes more time, funding and skills for agencies to agree to and conduct a joint evaluation.
  • The work of each agency is covered in less depth.

So check out What we know about Joint Evaluations and tell us what you think!

Read Full Post »

Older Posts »