DEVTA Trial - will it help or harm our treatment programs?

Children getting treatment at an Anganwadi Centre, India.

Kerry Gordon

8 May 2013

In poverty-stricken countries with limited funds and resources, it is critical that governments invest in the most cost-effective treatment programs. The decision about which programs are the most beneficial needs to be driven by solid, swift science.

It is tragic that in developing countries like South Africa, the majority of deaths of children under five are preventable. Causes of death like diarrhoea and measles are often the result of malnutrition and a lack of access to good healthcare. This is why groups like UNICEF and the World Health Organization have been developing guidelines for simple, cost-effective health interventions that save lives.

The question is: What are the simplest, most effective things that health professionals can do to save the most children? A major way to allow informed decisions is to run clinical trials in these communities. A clinical trial involves performing a particular intervention, such as giving people nutritional supplements, and seeing how it affects their health compared to a similar group that doesn't receive it. These trials are fraught with problems, mainly because of natural variation between people. A medicine might make three people better and have no effect on the fourth. If a clinical trial tested only the fourth person, it would look like the medicine did nothing. To overcome this natural variation, trials need to test as many people as possible, so that random factors have less of an effect on the overall result.
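The effect of sample size on random variation can be sketched in a few lines of code. This is a toy simulation with made-up numbers, not data from any real trial: we pretend a treatment helps 75% of patients, then "run" tiny and large trials and watch how much the observed recovery rate swings.

```python
import random

random.seed(1)

def observed_recovery_rate(n, true_rate=0.75):
    """Simulate n treated patients; each recovers with probability true_rate."""
    return sum(random.random() < true_rate for _ in range(n)) / n

# With only 4 patients, the observed rate swings wildly between repeats...
small = [observed_recovery_rate(4) for _ in range(5)]
# ...but with 4000 patients it settles very close to the true 75%.
large = [observed_recovery_rate(4000) for _ in range(5)]

print(small)  # rates can land anywhere from well below to well above 75%
print(large)  # rates cluster tightly around 0.75
```

This is why a trial with two million children carries so much statistical weight: at that scale, chance alone can no longer produce a large apparent effect.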

One of the most powerful ways of seeing through the random variation is to perform a meta-analysis. This combines the data from many different clinical trials carried out on different groups of people.

This is not as simple as lumping all the results together. A meta-analysis accounts for the fact that some studies are larger than others, and that studies may have been done on different sets of people, in different ways, or at different times, while also checking and double-checking that each study is accurate and legitimate.
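One common way of doing this weighting, sketched below with entirely invented numbers, is a fixed-effect inverse-variance pool: each trial's effect estimate is weighted by how precise it is, so big, precise trials count for more than small, noisy ones.

```python
import math

# (log risk ratio, variance) -- invented numbers for illustration only.
# A risk ratio below 1.0 means the treatment reduced deaths.
studies = [
    (math.log(0.70), 0.04),   # small trial: big apparent benefit, imprecise
    (math.log(0.80), 0.01),   # medium trial: moderate benefit
    (math.log(0.95), 0.002),  # huge trial: little benefit, very precise
]

# Weight each study by the inverse of its variance (more precise = heavier).
weights = [1 / var for _, var in studies]
pooled_log_rr = sum(w * lrr for (lrr, _), w in zip(studies, weights)) / sum(weights)
pooled_rr = math.exp(pooled_log_rr)

print(f"pooled risk ratio: {pooled_rr:.2f}")
```

Note how the pooled result sits much closer to the huge trial's estimate than to the small one's. This is exactly why a giant trial like DEVTA, once added to a meta-analysis, can tip the overall conclusion.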

The Cochrane Collaboration is one of the world’s most renowned sources of meta-analyses. Their analysis of 17 clinical trials with almost 200 000 children showed that giving children aged between 6 months and 5 years doses of vitamin A reduced the number of child deaths by 25%. Because giving children vitamin A is simple and cost-effective, it has been welcomed in developing countries, and our own Department of Health rolled out a vitamin A supplementation program in 2008. There is also evidence to suggest that de-worming programs prevent weight loss in children, thereby increasing their chances of survival.

However, the latest clinical trial says something different. The De-worming and Enhanced Vitamin A (DEVTA) trial took place in rural districts in Northern India, recruiting children who visited Anganwadi Centres (AWCs); Anganwadi is Hindi for “courtyard shelter”. These centres provide basic medical treatment to children in rural Indian villages as part of the public healthcare program, much like our rural and township clinics. AWC workers are minimally paid women from the communities, often with just a few months’ medical training. They do basic health checks, give nutrition advice and administer government-provided supplements like vitamins and de-worming tablets. By using these well-established centres, the costs of the trial were greatly reduced.

Almost 2 million children were involved in the DEVTA study over 5 years, making it the largest study of its kind - 10 times larger than the previous 17 trials combined. The children were divided into four groups: the first got no treatment at all, the second got vitamin A only, the third got albendazole (a de-worming drug) and the fourth got both vitamin A and albendazole. Monitors would motorcycle between neighbouring AWCs and record how many deaths there had been each year. When the researchers compared the number of deaths in the untreated group with those in the groups given vitamin A or de-worming tablets, they found no difference. The DEVTA study is bound to make waves with policymakers, because it suggests that vitamin A and de-worming drugs are not the best solution for reducing child mortality.
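The comparison at the heart of a trial like this boils down to simple arithmetic: divide the death rate in a treated group by the death rate in the untreated group. The counts below are invented purely to show the calculation, not DEVTA's actual figures.

```python
def risk_ratio(deaths_treated, n_treated, deaths_control, n_control):
    """Ratio of death rates between two groups; 1.0 means no detectable effect."""
    return (deaths_treated / n_treated) / (deaths_control / n_control)

# Hypothetical numbers: ~500,000 children per group, roughly 2,500 deaths in each.
rr_vitamin_a = risk_ratio(2500, 500_000, 2540, 500_000)
print(f"risk ratio: {rr_vitamin_a:.2f}")  # close to 1.0 -> no clear benefit
```

A 25% mortality reduction, as the earlier Cochrane analysis reported, would correspond to a risk ratio of about 0.75; a ratio hovering around 1.0 is what "no difference" looks like in these terms.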

When the DEVTA trial is combined with the 17 previous trials in a meta-analysis, the scales are tipped in favour of the DEVTA findings - vitamin A has a much smaller benefit than previously thought, and de-worming has no benefit at all. Because DEVTA contradicts the previous studies, it is being treated with skepticism and careful scrutiny.

The controversy surrounding DEVTA gets to the heart of a fundamental question in science. On one hand, it’s quite right to question DEVTA’s results and try to find fault with its methods, because they go against the majority of earlier evidence. It’s just one trial, even though it’s a large one, and there’s no point in throwing out all the older data if there’s a chance that there was an error in the trial’s design. On the other hand, the disbelief in DEVTA’s results could be confirmation bias - people are happy to accept new information if it supports what they already believe, but will challenge data that disagrees with their beliefs. Confirmation bias slows down scientific progress. It makes it harder for new information to be accepted and for opinions to change. Publication of the DEVTA data was itself delayed by 7 years, in part due to the doubt with which it was received.

The logical solution is to be critical of DEVTA, and to apply the same uncertainty and detailed examination to the trials that tell us what we already believe to be true. It might mean that the previous trial results get thrown out, making the DEVTA trial more believable. The problem is that while the publication delays and the debate around what the results tell us are important for the integrity of science, they don’t help policymakers. The danger is that we may be putting children at risk, either by following misinformed treatment guidelines or by dragging our heels in getting good, reliable data to support our decisions.