Table 1 Overview of the MRC guidance for working with implementers (Moore 2014) and summary of how these recommendations were operationalized in CaPSAI

From: Research and implementation interactions in a social accountability study: utilizing guidance for conducting process evaluations of complex interventions

 

Interaction area 1: Working with implementers

MRC recommendations:

Balance the need for sufficiently good working relationships, which allow close observation, against the need to remain credible as an independent evaluator. Sustaining good working relationships while remaining sufficiently independent for the evaluation to remain credible is a challenge that evaluators must take seriously.

To avoid the evaluation being seen as threatening, researchers need to ensure that the process evaluation is understood as a means of informing efforts to improve the intervention, rather than as a pass-or-fail assessment.

To avoid conflicts of interest, especially where stakeholders with a vested interest in portraying an intervention positively exert undue influence on the evaluation, agreeing on the parameters of these relationships early on may prevent problems later; transparency about the relationship between the evaluation and the intervention is critical.

It is important to remain reflexive and to question continuously whether productive or unproductive relationships between researchers and other stakeholders are leading to an overly positive or negative assessment of the intervention. It may be useful to seek occasional critical peer review from a more detached researcher with less investment in the project, who may be better placed to identify where the researcher's position has compromised the research.

Operationalization within CaPSAI:

• Co-development and use of a standard operating procedure on addressing interactions between researchers and implementers

• Development of a study implementation manual that describes the different components and steps of the interventions

• Development and use of communication tools, such as delegation logs, a team directory, and an online collaboration site

• Conducting training with the research and implementing teams, both together and separately, on various components of the project

• Conducting regular check-in calls and meetings

Interaction area 2: Communication of emerging findings between evaluators and implementers

MRC recommendations:

Agree whether evaluators will play an active role in communicating findings, such as incorrect implementation practices or contextual challenges, as they emerge (and in helping correct implementation challenges), or a more passive role. At the stage of feasibility and piloting, which aims to test the feasibility of the intervention and its intended evaluation, the researchers may play an active role in addressing 'problems'. In an evaluation that aims to establish effectiveness under real-world conditions, it may be appropriate to assume a more passive role to avoid interfering with implementation and changing how the intervention is delivered.

Systems for communicating process information to critical stakeholders should be agreed upon at the outset of the study to avoid perceptions of undue interference, or that vital information was withheld.

Operationalization within CaPSAI:

• Co-development and use of a standard operating procedure on addressing interactions between researchers and implementers

• Co-development and use of a standard operating procedure for identifying and addressing social harms

• Conducting trainings and continued follow-ups

Interaction area 3: Overlapping roles of the intervention and evaluation

MRC recommendations:

In some cases, the most efficient means of gathering data from an intervention across multiple settings may be to ask implementers to assist with data collection. There is a need to clarify data collection instructions, correct errors in paperwork at the earliest possible stage, ensure that data collection instructions are easy to follow, and minimize the research burden on busy implementers.

To minimize reporting bias, implementation staff could discuss the aims of the research and the best ways to achieve them. There is a need to emphasize that the data are sought on how the intervention might operate, not to assess staff performance.

Operationalization within CaPSAI:

• Incorporation of workbooks in the study implementation manual to record work plans and document implementation experience

• Sharing of documents and materials used in the intervention with the research teams