As the COVID-19 pandemic continues to impact development work globally, Latin America has not been spared, and mobility across the region has been brought to a standstill. DevTech recently faced this challenge after being awarded a contract by the Department of Labor (DOL) to conduct a performance evaluation of the Youth Pathways in Central America (YPCA) project, implemented in Honduras and El Salvador. Just as the evaluation team prepared to mobilize data collection, travel in both countries came to a halt, threatening delivery of the evaluation.

DevTech’s lead evaluator, based in El Salvador, and a research associate, based in Bluefields, Nicaragua, sprang into action by developing a solution to conduct the fieldwork remotely. Although the evaluation methodology relied heavily on individual interviews, small group interviews, and focus group discussions, the team adapted its tools to conduct these conversations online and over mobile telephone calls. DOL and Catholic Relief Services (CRS), the YPCA implementer, agreed to the approach, and CRS provided valuable support in reaching out to evaluation participants. The sample included youth, school principals, senior management of implementing organizations and the technical personnel who delivered project services, as well as employers and government officials. Over a period of four weeks, DevTech conducted approximately 49 remote interviews and focus groups involving 91 YPCA participants and stakeholders in Honduras, El Salvador, and the US.

Reflecting on the challenges that remote fieldwork can present, the evaluation team identified, and implemented, the following key recommendations for conducting remote data collection:

  1. Identify available contact information and define the sample. Working from their desks, the DevTech evaluation team collected key respondents’ contact information in coordination with YPCA local implementers. After analyzing this information, the team defined a purposive sample and applied a quota approach to schedule data collection with potential informants. Respondents were included based on whether introductory contact could be established and whether they were available and willing to participate in the evaluation; the selection was finalized once each quota was reached.

  2. Use communication technologies that are easily accessible. Common platforms used for data collection included Skype, Zoom, WhatsApp, and regular phone calls, all of which were accessible to respondents in both Honduras and El Salvador. When using Zoom, the evaluation team sent instructions to participants ahead of the interview so informants could connect using the free version. Most focus group discussions were conducted on Zoom with an average of five respondents participating at a time, which allowed for better management of the conversation and easier identification of who was speaking during the session. Evaluation budgets should allocate enough funding to use these platforms without time restrictions, and facilitators should explain to participants how to use the platform. Most other interviews were conducted using Skype and WhatsApp. Contrary to popular belief, high-speed internet access is not required, but the connection does need to be stable. If the connection is unstable, video cameras can be deactivated and interviews conducted by voice call only, allowing for a more fluid conversation. The evaluation team should also be flexible and reduce the number of participants where connectivity may be an issue.

  3. Verify information collection procedures. It is necessary to verify that the protocols for each data collection tool are adapted to the new communication channels. It is also advisable to build extra time into each interview beyond the scheduled slot: interviewees need time to connect, and respondents tend to expand on their answers when free to talk on these platforms. The team found that interviewees feel more comfortable responding precisely because the interview is not in person. DevTech designed the questionnaires and protocols so that both members of the evaluation team were present in every interview. This technique facilitated communication during the interview and increased inter-rater reliability in the final analysis for the evaluation report. It also meant that if one interviewer briefly lost connectivity, the other could continue taking notes and asking questions.

  4. Have the implementer introduce the evaluation team to participants. The implementer, in this case CRS and its partners, made the initial contact with evaluation informants and, to the extent possible, was available to introduce the evaluation team at the beginning of each interview. This helped secure the willingness, availability, and participation of beneficiaries. After that initial introduction, the implementer would leave the meeting, as part of the protocol to reduce potential bias. The evaluation team then provided instructions for the meeting, made clear how participation would benefit similar programs in the future, and assured respondents that their answers would be kept confidential.
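As a purely illustrative aside, the quota approach described in recommendation 1 can be sketched in a few lines of Python. All names, stakeholder groups, and quota sizes below are hypothetical and are not data from the YPCA evaluation:

```python
# Illustrative sketch of quota-based selection: for each stakeholder
# group, include reachable and willing contacts until the group's
# quota is met. Names, groups, and quotas are made up for this example.

def quota_sample(contacts, quotas):
    """Select respondents per stakeholder group until each quota is met.

    contacts: list of dicts with 'name', 'group', and 'reachable'
              (True if introductory contact was established and the
              person is available and willing to participate).
    quotas:   dict mapping group name -> number of respondents needed.
    """
    selected = {group: [] for group in quotas}
    for person in contacts:
        group = person["group"]
        if group in quotas and person["reachable"]:
            if len(selected[group]) < quotas[group]:
                selected[group].append(person["name"])
    return selected

# Hypothetical contact list compiled with the local implementer.
contacts = [
    {"name": "A", "group": "youth", "reachable": True},
    {"name": "B", "group": "youth", "reachable": False},
    {"name": "C", "group": "youth", "reachable": True},
    {"name": "D", "group": "employer", "reachable": True},
]
print(quota_sample(contacts, {"youth": 2, "employer": 1}))
# {'youth': ['A', 'C'], 'employer': ['D']}
```

In practice the quotas would reflect the purposive sampling design (e.g., separate quotas for youth, principals, implementer staff, employers, and government officials), and unreachable contacts would simply be passed over until each quota is filled.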

The evaluation team found that conducting a performance evaluation using qualitative data collection techniques through virtual platforms has important advantages:

  • Interviews can be scheduled in rapid succession regardless of participants’ location, which saves time.
  • For future evaluations, these remote data collection methods can be used as an alternative, cost-saving approach.
  • Participants interviewed remotely through these platforms tend to be more willing to engage in longer in-depth interviews than in face-to-face settings, and are more likely to offer insights on the project.