Angus Deaton was awarded the Nobel Memorial Prize in Economic Sciences this year. His work on measurement and poverty estimation is well known among economists (and now non-economists), who understand that good data is essential to doing good economics. But Deaton also wrote about the role of evidence itself in policy analysis. For example, with respect to the use of Randomised Control Trials (RCTs) as a policy tool in development economics, Deaton argues: "Randomised controlled trials cannot automatically trump other evidence, they do not occupy any special place in some hierarchy of evidence, nor does it make sense to refer to them as 'hard' while other methods are 'soft'." Here, Deaton is pointing to a broader problem within academia, practice and policymaking, where certain methods get prioritised as the right, or only, way to analyse problems.
Deaton probably wasn't speaking about communal violence, but his words are useful in the current context of debates in the Indian media over the deaths of three men over the possession (or lack thereof) of beef, and the return of Sahitya Akademi awards by a number of Indian writers in response to the deaths of fellow writers. The broader debate centres on incidents of violence against minorities and a growing intolerance of the freedom of speech, the common thread that connects these different incidents. The general reader accesses these debates largely through the media (presumably via experts), and the media therefore plays a significant role in mediating our understanding of what is happening in these two realms. But experts and readers seem to have positioned themselves in two clear camps - one that is data driven and objective, and another that is subjective and therefore presumably "agenda" driven. This draws fundamentally from what Deaton has to say about the hierarchy of evidence, and a nuance similar to his is required in debates around public matters: what metrics serve as objective measures of a problem?
Regardless of which side of the debate one is on, data has been used to justify either the claim that communal violence is rising (the piece itself is from 2014, but has been cited recently) or the claim that there is no statistical evidence for such a rise. A counter to data being arbitrarily used to "prove" a rise in violence is justified. However, using the same type of data to disprove the point, while disregarding other forms of evidence entirely, misses the boat. The simplest trick is to reduce a complex problem to a reductive debate around numbers. Graphs and numbers are visually appealing, easy to digest and easy to forward to others, but they also mislead the public into lapping them up as hard evidence.
Experts have been using these tools, often with complete awareness of their limited value, to promote more "objective" debates around public values. As the public, we have to be critical of the form in which data is presented to us, and acknowledge the ocean of other forms of evidence it seeks to negate. The mere existence, or absence, of data is not, by itself, an argument won. In the analysis of complex problems, data is interpreted and politicised, and in that process it becomes an opinion - as subjective, therefore, as a piece that cites other types of evidence.
Let us, in any case, look at the data in these articles - because we know it will be used time and again to prove a point. The data itself is not "wrong"; the question is whether it tells us everything we need to know, and whether it is sufficient for the question we are trying to ask. Data focussing on casualties from communal violence is just one source of information, regardless of the point you are trying to prove. Communal violence is hardly captured by a simple count of casualties, because violence itself takes many forms, from systematic exclusion to threats that result in no casualties at all.
Secondly, this data aggregates the people of this country into monolithic entities, identified only by the state they live in (alongside millions of others) and by the political party that is "governing" them - an assumption that oversimplifies the lives of people in a country as large and diverse as ours, simply in order to arrive at "objective" conclusions. Human geographies matter when issues such as communal violence are discussed and debated.
Furthermore, the use of data in building a narrative is strongly tied to the construction of a metric that allows that narrative to be told. The stronger the metric, the stronger the narrative. There will also always be a gap between the information or data that is available and what one wishes were available, and no matter what data source is used to build a narrative, there will be critics who question it. The aim, perhaps, should instead be to improve the quality of the metric - to design something that explains more and adds value to the debate. For example, an alternative metric could be the number of incidents of communal violence per capita, weighted by the severity of the incidents, as measured by the number of deaths or injuries per incident. This is possible with the data recently released in the Lok Sabha, and a severity-weighted, per-capita measure of communal violence builds a vastly different narrative.
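To make the idea of such a metric concrete, here is a minimal sketch of how a severity-weighted, per-capita measure might be computed. All of the figures, state labels and weights below are invented for illustration; they are not drawn from the Lok Sabha data mentioned above, and the choice of weights is itself a subjective judgment of the kind this article discusses.

```python
# A hypothetical severity-weighted, per-capita measure of communal-violence
# incidence. All numbers and weights here are invented for illustration.

def weighted_incidence_per_million(incidents, deaths, injuries,
                                   population,
                                   death_weight=3.0, injury_weight=1.0):
    """Severity-weighted incidents per million people.

    Each incident is scaled by the average harm it caused, so that ten
    minor incidents no longer count the same as ten deadly ones. The
    weights are arbitrary choices, not established constants.
    """
    if incidents == 0 or population == 0:
        return 0.0
    avg_severity = (death_weight * deaths + injury_weight * injuries) / incidents
    return incidents * avg_severity / (population / 1_000_000)

# Two invented states: State B records fewer incidents than State A in
# absolute terms, but has a much smaller population and deadlier incidents.
state_a = weighted_incidence_per_million(incidents=100, deaths=10,
                                         injuries=200, population=200_000_000)
state_b = weighted_incidence_per_million(incidents=40, deaths=30,
                                         injuries=60, population=30_000_000)

print(round(state_a, 2))  # State A's weighted per-capita score
print(round(state_b, 2))  # State B's weighted per-capita score
```

Under this invented metric, State B scores higher than State A despite recording fewer raw incidents - exactly the kind of reversal that shows how the choice of metric, not just the underlying data, shapes the narrative.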
In the future, if appropriate data were collected, the injuries themselves could be disaggregated into those caused to women, children and other vulnerable sections of society. Additional parameters such as these would deepen our understanding of the character of communal violence.
This positivist worldview - one that prioritises scientific and mathematical proof - has a long history. Its fundamental purpose is to reduce human beings to rational actors, to formulae that can be solved. And wouldn't we all like to believe we live in a rational, predictable world, rather than confront some of the more complicated problems? Fortunately, this dominant paradigm is being supplanted by more complex ways of thinking, in academia but also in public policy. Risk analysis today, for example, is moving from purely scientific and technical methods towards concerns around trust, social equity and values. Unfortunately, the same might not yet be true for public debates in the media, and as long as data-driven analysts are the only ones accorded the power to deconstruct our lives for us, the results will be limited and biased.
Data analysis is very valuable, and has been used to address many of society's most pressing problems, from the development of vaccines to the integration of more solar electricity into the grid. Perhaps that is why we respect it so much. However, data can also be used to oversimplify an issue and avoid a deeper debate that is subjective and perhaps makes us uncomfortable. Secondly, the distinction between evidence and knowledge itself is important. As the scholar Kevin Krizek and colleagues explain, evidence may include data, facts and information, but it does not constitute knowledge: "'Knowledge' implies a larger theory of the ways in which certain types of information mesh with other kinds of information, perhaps in the context of causal relationships or in a social learning context in which knowledge is jointly created."
But more problematic is the negation of other forms of evidence and the tendency to brand them as "agendas", or as value-laden and biased "selective outrage". The question being asked of writers is whether they have an agenda; the implicit charge is that they are not "experts" armed with the right tools.
But no sensible debate on the wicked problems of society will be rigorous and "objective" enough until we accept the diversity of evidence itself. And diversity, we've heard, is something Indians are proud of.
(K Rahul Sharma researches on science, technology and environmental policy. Arindam Jana is a researcher working on urban data.)