In addition to testing, trainers need to observe what is happening in the training. This is particularly useful for skill testing, for seeking feedback from the trainees, and for analyzing all of this information. Observational evaluation can be built on checklists. When we developed our training objectives, we used words such as behavior, observable, and measurable; the checklist enables and encourages us to make these observations, record them, and thus measure trainee behavior. A checklist spells out the specific steps or tasks that one would expect a trainee to perform at the end of the program in order to achieve the objectives. What are the steps involved in organizing an immunization session in the community? What steps are involved in preparing salt-sugar solution? Each of these steps can be identified and put on a checklist. Here we see an example of a patient counseling checklist. We are trying to encourage participatory counseling, open-ended interviewing, and client involvement, some of the many tasks we would hope to observe during role-plays or practice counseling sessions organized as part of the training. Items include whether the trainee established a cordial relationship with the client through greetings and introductions. A checklist can simply record yes or no, or it can be somewhat more nuanced: trainees may greet the person but not introduce themselves, so you would note whether they performed the task fully, partially, or not at all. Does the trainee use open-ended interviewing to encourage the client to speak fully for him- or herself? The trainee may say, "Tell me about your health problem," and then switch into direct questions: "Did you have fever? Did you have this?" In that case the task was done partially. Another counseling skill you might look for on a checklist is that the trainee practices active listening: is attentive and does not interrupt.
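The fully/partially/not-at-all ratings described above can be tallied into a simple score per trainee. A minimal sketch, assuming a hypothetical three-point scheme (2 = fully, 1 = partially, 0 = not at all) and illustrative task names, not the actual checklist used:

```python
# Hypothetical counseling-skills checklist: each observed task is rated
# "fully", "partially", or "not at all" and mapped to 2/1/0 points.
SCORES = {"fully": 2, "partially": 1, "not at all": 0}

def score_checklist(observations):
    """Sum the points for one trainee's observed tasks."""
    return sum(SCORES[rating] for rating in observations.values())

# Example observation of one role-play (task names are illustrative)
obs = {
    "greets and introduces self": "partially",
    "uses open-ended questions": "partially",
    "practices active listening": "fully",
}
total = score_checklist(obs)   # 1 + 1 + 2
maximum = 2 * len(obs)         # best possible score
print(f"{total}/{maximum}")    # prints "4/6"
```

A score like 4/6 summarizes one role-play; the per-task ratings themselves remain the more useful feedback for the trainee.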
Obviously, if the client is talking and the trainee is saying "mm-hm, yes, tell me more," or simply listening, you know that the trainee is practicing active listening and encouraging the client to continue speaking and explaining. Another example comes from the visual acuity training for teachers. We have mentioned measuring their knowledge and their self-confidence; we also had a checklist to learn whether they could perform visual acuity testing. Just as we talked about providing the materials necessary to carry out our training objectives and methods, we also need to make sure that materials are available for evaluating the achievement of those objectives. During the training, we would have shown the teachers a Snellen visual acuity chart, an opaque cover to place over one eye or the other, a frame with lenses that could be used to see whether the pupil could see better, and a meter rule to measure the distance between where the pupil stood and where the chart was placed. For the pretest observation, all of these were put on a table, volunteer pupils were brought in, and the teacher was asked to assess their visual acuity using the materials provided. The trainers would then sit back, watch, and record their findings. Did the teacher put the chart at the right height on the wall? Did she place the pupil at the right distance from the chart? How did she use the opaque cover: one eye first, then the other? Step by step, what did the person do? We found that people performed very poorly at pretest; many of them had never seen the materials before, let alone had a chance to touch them. So we scored each teacher on whether they performed each of the eight steps involved.
Now, interestingly enough, just as we saw in the knowledge area, teachers improved simply by being exposed to the questions; even those in the control group were able to seek out information. Having been exposed to the materials, some of the control teachers may have been curious and asked friends who were nurses or doctors what it was all about. So at post-test some of the control teachers did improve their skills slightly: maybe they were able to set the chart on the wall correctly or get the right distance. But they did not improve much, whereas the intervention teachers, on average, were able to perform at least six of the eight steps correctly. We have talked about measuring content issues such as knowledge, attitudes, self-confidence, and performance, behavior, or skills. We also want to get feedback from trainees on their opinions about the training, its adequacy, and their satisfaction with it; this information is important in addition to measuring knowledge and skills. Even if people show an increase in knowledge, if they do not perceive that what was provided was appropriate to their jobs or of good quality, then the training program has not succeeded. So the numbers from pre-post test feedback alone are not enough. We want the trainees to rate aspects of the program. A simple one-page form should be adequate, and these forms should be filled out anonymously. We can use simple check-type forms: "check the column that best shows your opinion." We can ask whether they felt they learned something, whether it was useful, whether they will be able to use it, whether the program was presented in an interesting or challenging manner, and whether the training facilities were adequate for their needs in terms of comfort, temperature, and so on.
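The pre/post checklist scores described above can be compared across groups with a simple mean of steps performed correctly. A minimal sketch with invented scores out of the eight steps, purely illustrative and not the study's actual data:

```python
# Hypothetical pre/post scores: steps performed correctly, out of 8.
# These numbers are illustrative only, not the study's actual results.
pre_control,  post_control  = [0, 1, 0, 1], [1, 2, 1, 2]
pre_interv,   post_interv   = [1, 0, 1, 0], [6, 7, 6, 7]

def mean(xs):
    """Arithmetic mean of a list of scores."""
    return sum(xs) / len(xs)

print(f"control:      {mean(pre_control):.1f} -> {mean(post_control):.1f}")
print(f"intervention: {mean(pre_interv):.1f} -> {mean(post_interv):.1f}")
```

The pattern in the lecture, a slight gain in the control group and a much larger gain in the intervention group, shows up directly in the two before/after means.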
Did the program cover the promised objectives? Did they feel they learned what was stated at the beginning? And did the trainers encourage participation and questions; in other words, was it a participatory learning process? We can ask various questions like these to get feedback on how well the program itself was organized and how it was perceived by the trainees. In addition to questions rated on a scale, like the one we see here running from one, never done, to five, done often, we can include open-ended questions asking what people found most useful about the program, what they found least useful, and their specific suggestions for improving the next round of training. In addition to forms that people can fill out, we can hold discussions, getting trainees involved in talking and giving feedback directly to the trainers. This serves a dual purpose: it provides information to the trainers, and by talking about what they have learned, it allows the trainees to integrate their experiences, pull together what they have learned, and think about how relevant it was and how they might use it when they get back. Generally, whenever you are doing focus groups, whether in the community or at the end of a training program, you should have no more than about six members. The more people you have, the less opportunity each member has to speak; if you have too few, of course, you do not get a good spectrum of opinion. These feedback FGDs should be organized toward the end of the training program, perhaps over lunch on the last day or the evening of the day before. Make sure there is a separate room or space to offer privacy; trainees may not want everyone hearing what they have to say if complaints are made about other trainees or trainers.
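Returning to the rating forms for a moment: the one-to-five responses on the anonymous forms can be averaged per question to spot weak areas of the program. A minimal sketch, with hypothetical question labels and responses:

```python
# Hypothetical tally of anonymous feedback forms: each form rates
# questions on a 1 (never done) to 5 (done often) scale.
# Question labels and responses are illustrative only.
forms = [
    {"learned something": 4, "encouraged participation": 5, "facilities adequate": 3},
    {"learned something": 5, "encouraged participation": 4, "facilities adequate": 2},
    {"learned something": 4, "encouraged participation": 5, "facilities adequate": 3},
]

def average_ratings(forms):
    """Mean rating per question across all returned forms."""
    questions = forms[0].keys()
    return {q: sum(f[q] for f in forms) / len(forms) for q in questions}

for question, avg in average_ratings(forms).items():
    print(f"{question}: {avg:.2f}")
```

A low average on one question (here, facilities) points the training committee at a specific aspect to fix in the next round, which the open-ended answers can then explain.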
One way we have used focus groups recognizes that some people are more active during the training program: they speak up, they are always asking questions, they are always volunteering to do the role-plays, and so on. But as in any group, some people are quieter. An FGD is an opportunity to bring these quiet members together to find out what they are thinking. Are they quiet because that is simply their nature, or are they quiet because they do not understand what is going on? This gives them an opportunity to speak their minds and talk about the program, and it also gives feedback on whether the program is really reaching them: is there evidence that they have been learning something? Focus groups also have value, as I said, for integration and reflection. They can be used to encourage the trainees to think about how they are going to apply the knowledge and skills they have gained when they get home: what obstacles they might face, and how they are going to integrate this into their existing jobs. So it is important not just to document that new knowledge, attitudes, and skills have been acquired before the trainee leaves, but to make sure that the trainee considers how they will apply this new information back on the job when they return. Otherwise the training has simply been a vacation, and it has not really helped improve performance. Thinking about life back on the job, we want to ensure that this discussion turns into an evaluation of the practicality and feasibility of the new knowledge, attitudes, and skills we have been introducing during the training. It is also a way to enhance problem-solving skills. For example, when talking about the cold-chain requirements we are trying to learn for an immunization program: what is involved? Do we have adequate electricity? What other sources can we use as backup?
Getting people to think about these things means engaging them in realistic discussion about the obstacles they might face and how they could overcome those obstacles. Through a training program on planning and management skills for health workers in child survival, we discovered that the groups, coming from different health departments, different states, and different countries in sub-Saharan Africa, could write beautiful plans with all the necessary components. This was a four-week workshop, so they could formulate reasonable objectives, identify appropriate educational strategies, and list all the resources and logistics they would need to implement their plans. But when we visited them after the first round of training, we realized that they were disappointed. They did not meet the kind of reception from their superiors they had hoped for; some colleagues were jealous that they had attended, and the process of passing their plans up the chain of command to the decision makers was often thwarted. Whether this was politics or just the normal lethargy in the ministry of health, we were not sure. But we realized that the planning process, the training process, had not been completed before they left the workshop. So in subsequent years we encouraged them to develop, in addition to their main intervention plan, an action plan that would say how they would get the activity they had planned implemented. In other words, they had gained skills in program planning, but the important skill was how to implement those skills when they got back home. So we developed that component toward the end of the workshop: here you have a plan; how are you going to advocate for it? Where are the pressure points? Who are the influential people? How are you going to get your new ideas implemented?
Again, as we have noted, we may equip people with the knowledge, skills, and attitudes to carry out new tasks and improve their performance, but this may not be enough. We mentioned one way of addressing this: getting them to develop an action plan for how they will carry out their new skills when they get home. But we also need to make sure that they have the resources to carry out those skills when they get back. A simple example: when we were training village health workers to manage common conditions such as malaria, aches and pains, and diarrhea, they needed the resources to do that. We discovered it was important at the end of training to provide the trainees with a drug box containing start-up doses, as it were, of the medicines we had taught them about. That let them start their work immediately and gain the confidence of the community, and then the community would patronize them. People would pay a small amount for the drugs, which actually got a revolving fund started. Without this, the trainees went back to the village, people said, "What are you going to do? My child is sick," and if they did not have the resources, they could not respond. So this is another part. It is not necessarily the specific responsibility of the trainers, but again, you want to avoid frustrating your trainees by sending them back into a situation where they cannot practice. This returns to a point from the very first lecture: training is part of organizational management. While we are preparing health workers to carry out tasks, we also want to consult with the management side to make sure that the workers have the resources, whether transportation or equipment, to perform the duties they are learning during the training. Finally, we want to make sure we use all of these results, whether from pretests, observational checklists, focus group feedback, or the satisfaction and opinion forms we distribute to the trainees.
The training committee should therefore sit down, look at these results, and summarize them as soon as possible. That way, we can identify gaps that need attention during follow-up visits, or through correspondence or telephone, sending information out to people to fill in those gaps, and clearly improve the design of the next program.