International development programmes are projects implemented by international NGOs to improve the quality of life and health of a population. In an increasingly data-driven world, these projects generate vast amounts of useful qualitative and quantitative data, which we can use to indicate success, provide feedback on the design of a programme, and highlight areas to improve or adapt.
Data helps us make sense of endless streams and sources of information. It offers guidance on the direction of a programme, its progress, and its conclusions, and it further elucidates the most effective approaches and methods. Collecting valid, quality data from the field is therefore critical. Unfortunately, data collection is often the most challenging part of the process, hindered by difficult environments and static paper forms. These paper-based systems are often plagued by manual input errors, inadequately trained users, and delays in submission and analysis. In recent years, therefore, technology, including mobile data collection platforms like Teamscope, has aimed to overcome some of these challenges, making it easy to extract, capture and transmit data so that it can be cleaned, analysed and reported in a safe, reliable, and almost instantaneous manner. It is an exciting and promising time.
The problem is that at times we expect quite a lot from those using the technology. More often than not, the users are not the specially trained data collectors or research assistants commonly found in research interventions. In international health programmes, the users are largely clinical service providers. There often isn't time to undertake lengthy user training, nor to assess ability before or after.
Therefore, simplicity is key. It has to be simple for providers who are busy delivering services to the people the project or programme is designed and funded to assist. It has to be simple because access to technology, broadband internet or even electricity might be scarce or unreliable. It has to remain simple so that these technologies and their learning curve do not lose out to the attraction of reverting to paper forms. Above all, simplicity ensures that the cost to those service providers does not outweigh the gains.
Speaking of the gains, let us consider what these end users actually gain. In most programmes, data is fed from the field level to coordination and then on to the donor. This upward trajectory of the data is often accompanied by a generous dose of pressure (read stress) and expectation, to be able to demonstrate to the donor how well it is all going. At times we feed back to those who collected the data, but this feedback often stops with the coordination teams on the ground and doesn't always reach the users themselves.
In my experience, users seldom see data collection as a good use of their time; they treat it as something they have to do, and as a result are less likely to use the system compliantly or to raise issues with it. A feedback loop is therefore an area that deserves more focus from teams and from developers. In turn, it will likely act as a motivational lever to improve quality and commitment to data collection. What goes up should come down.
The simplicity I mentioned is often lost at coordination or donor level, where data capture tools are frequently designed to fit those needs and to feed into organisational systems more easily.
Yet what might work for donors may not be ideal for a midwife working in a busy clinic in a remote village. End users need to be involved in the design of tools from the start, and in the testing too, with opportunities for modification based on their recommendations built in. What are they already collecting through other systems, and how could these be combined? How much time do they have to collect data on each patient or client, and at what time of day? How can they benefit from collecting data in this way? What support is available? What has not worked before?
I have worked with clinical teams who are asked to deliver data reports on a monthly or quarterly basis to meet the needs and demands of a funder. They have shared with me the difficulties of collecting data in systems separate from those they normally use, sometimes for multiple funders on slightly different indicators and with different reporting timeframes; all of this on top of national service statistic reporting to the Ministry of Health. The funders' needs are understandable, but we must consider how to make reporting most effective, for the benefit of all.
Several healthcare providers over the years have commented that they don't know where the data goes, what happens to it, or what it says. It feels like a burdensome task that offers no return for them. Whilst this might have been harder to address with paper forms, the increased use of technology we are now witnessing offers a window of opportunity to get this right and to demonstrate the purpose and the benefits, with relative ease.
Data tool developers should ensure that tools and systems improve data quality without increasing the burden on busy healthcare professionals. The greater the level of effort required to complete data forms, the lower the data quality is likely to be. Consider the time necessary to complete forms and whether all of the data being gathered is essential.
Tools should offer a workable solution fit for the context and local infrastructure, taking into account internet connectivity and reliability, the provision of smartphones or tablets that can withstand climatic challenges, and platforms that operate offline.
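The offline requirement above usually comes down to one design principle: record first, sync later. As a minimal sketch of that idea, the class below (all names are illustrative, not any real platform's API) queues each record in a local SQLite store so the clinician is never blocked by the network, and flushes the queue upstream only when a connection is available.

```python
import json
import sqlite3


class OfflineQueue:
    """Illustrative offline-first capture queue: records are saved
    locally first and flushed upstream only when a connection exists."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS pending (id INTEGER PRIMARY KEY, record TEXT)"
        )

    def capture(self, record):
        # Always write locally; never block form entry on the network.
        self.db.execute(
            "INSERT INTO pending (record) VALUES (?)", (json.dumps(record),)
        )
        self.db.commit()

    def pending_count(self):
        return self.db.execute("SELECT COUNT(*) FROM pending").fetchone()[0]

    def sync(self, upload, is_online):
        # Flush queued records via `upload` only when `is_online()` is true;
        # each record is deleted locally only after a successful upload.
        if not is_online():
            return 0
        rows = self.db.execute("SELECT id, record FROM pending").fetchall()
        for row_id, payload in rows:
            upload(json.loads(payload))
            self.db.execute("DELETE FROM pending WHERE id = ?", (row_id,))
        self.db.commit()
        return len(rows)
```

The key design choice is that capture and transmission are decoupled: a day's clinic data survives a power cut or a dead signal on the device itself, and reaches coordination level whenever connectivity returns.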
Developers and funders should involve end users at the beginning, middle and end of tool development, considering everyone's needs, top down and bottom up. There should be a system for responding to user feedback and suggestions for improvement, and for evaluating the tools.
A data feedback loop is needed, with data visualisation for end users that enables them to read the story the data tells and to see what is being reported upwards on the back of all their hard work. This helps people to feel involved, valued and included in discussions about progress.
We need to listen to those who will use the tool and hear how it will be most effective for them. Simple, isn’t it?