Obtaining reliable data and actionable information is difficult. Every step of the process, from survey design to implementation to analysis, leaves margin for error. Survey designers are human, with inherent viewpoints and unconscious habits of thought. We all rely on and benefit from mental shortcuts, but pre-established understandings of the world can distort survey outcomes, and even seasoned professionals are susceptible to errors driven by bias. Watch out for these pitfalls, which can discredit even the most well-designed surveys.
Assuming that Common Vocabulary is Non-Technical
Experienced survey designers know to use common words and simple vocabulary. But some words that are common among development workers are confusing, or carry a different meaning, for the target population. As a reader of this blog, you would not mentally catalogue the word “internet” as technical jargon. Yet some people in developing economies understand Facebook to be separate from the internet, or believe Facebook to be the whole of the internet.
Facebook’s prominence here is not exaggerated. As Sheryl Sandberg put it, “People actually confuse Facebook and the internet in some places.”
This anomaly was first noticed by Helani Galpaya of LIRNEasia, a pro-poor, pro-market ICT think tank in Sri Lanka: Indonesian survey respondents indicated that they did not use the internet, but later spoke of spending time on Facebook during focus group sessions.
In December 2014, Quartz commissioned a survey in Indonesia and Nigeria to test these observations. GeoPoll administered the mobile surveys with a sample of 500 people per country; the median age of respondents was 25 in Indonesia and 22 in Nigeria. The surveys asked respondents whether they had used the internet in the last thirty days and whether they had used Facebook in the last thirty days. At first glance, the data shows more internet users than Facebook users. A closer look reveals that, among those who use Facebook, 11% of Indonesians and 9% of Nigerians indicated that they did not use the internet. This is not surprising given how often the internet is first introduced via Facebook in these areas.
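The Quartz finding comes from cross-tabulating two yes/no questions rather than reading each topline alone. Here is a minimal sketch of that disaggregation; the responses below are invented for illustration and are not GeoPoll's data:

```python
# Cross-tabulate two yes/no survey questions to find conflicting answers.
# These responses are invented for illustration; they are not GeoPoll's data.

respondents = [
    {"uses_internet": True,  "uses_facebook": True},
    {"uses_internet": False, "uses_facebook": True},  # "no internet," yet uses Facebook
    {"uses_internet": True,  "uses_facebook": False},
    {"uses_internet": False, "uses_facebook": False},
    {"uses_internet": True,  "uses_facebook": True},
]

facebook_users = [r for r in respondents if r["uses_facebook"]]
conflicted = [r for r in facebook_users if not r["uses_internet"]]

# Share of Facebook users who report not using the internet.
share = len(conflicted) / len(facebook_users)
print(f"{share:.0%} of Facebook users say they do not use the internet")
```

Each topline looks sensible on its own; only the cross-tabulation surfaces the respondents whose answers conflict.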
As more data is collected to inform policy decisions regarding tech infrastructure, communication, and education, it will be crucial for survey designers to understand colloquial language. Target populations first experience the internet and new technologies in a unique context. With one mobile device, they get decades of technological development all at once, without a comprehensive introduction or any guidebook. Development workers have had the advantage of a more gradual introduction to technology: they experienced new websites and web-based products as they were created. Development workers should not take for granted their broader foundation and longer history of communicating in technical vocabulary.
No matter the subject, reliable data depends on a shared understanding of the questions and terms at hand.
Confusing Access and Behavior
In a poll, teachers will honestly indicate that detailed lesson plans are beneficial. However, this belief does not necessarily translate into action. Despite believing in the benefits of pre-planning lessons, and despite honest intentions, teachers might still fail to act: lack of training, time constraints, or other barriers can keep them from planning lessons in advance.
As fallible human beings, we have plenty of personal experience of opinions diverging from behavior. If belief equated to behavior, most of us would eat healthier and exercise more often. Because we all distinguish between belief and behavior, researchers know not to extrapolate a teacher’s beliefs into a description of consistent lesson-planning behavior.
Even though we know better, researchers often end up equating access to resources with behavior. While our background and experiences make it easier to differentiate between opinion and action, they bias us about what is desirable and preferred.
Let us take an example from the Western world: a grandmother notices her granddaughter does not have a microwave and therefore does not use one. The grandmother thinks that her granddaughter cannot afford one. Actually, the granddaughter does not have a microwave because she prefers not to use one. What was a celebrated and time-saving invention for the grandmother is a source of less nutritious food and a potential health risk for the granddaughter. The grandmother buys a microwave for her granddaughter. The microwave sits unused in the closet.
In a development scenario, we can look at efforts to reduce child mortality. Development and government agencies decide to survey villages to determine how many homes are without toilets. Their intention is to end open defecation and thereby reduce waterborne illness. But the survey design confuses access to toilets with sanitation behavior. Access to toilets does not correspond with their use, since not all rural Indian households use toilets installed on their property. Dean Spears of RICE, a non-profit research organization dedicated to understanding the lives of the poor and promoting their well-being, explains his research findings in Livemint: “A range of qualitative and statistical evidence agrees: such latrines do not fit well with the culture of purity and pollution that also underpins the caste system. People are reluctant to accumulate feces in latrine pits near their homes; they believe that latrine pits will fill up more quickly than is actually the case; and they are worried about how latrine pits will be emptied.”
This is similar to the unwanted microwave. For the toilets to be useful, the recipients need their concerns to be addressed. Dean Spears expands on the issue in a working paper: “We find an association between local practice of untouchability and open defecation that is robust; is not explained by economic, educational, or other observable differences; and is specific to open defecation, rather than other health behavior or human capital investments more generally.” In short, the cultural context in rural India complicates the adoption of latrine use.
This differentiation between access and use matters because providing technology and infrastructure without educating recipients will not deliver the desired result. In this case, access without a change in behavior yields no improvement in health outcomes.
In the same light, having a smartphone and access to the internet does not mean that someone understands how to access resources on their phone. Illiteracy creates a barrier to learning phone functionality and taking full advantage of the internet. Early adopters in rural areas also lack the broad professional and social networks, and the general exposure, that we rely on to discover new resources and tools online. Without this peer-to-peer education, we would use only the pre-installed applications on our phones. Another gray area between access and behavior is individual versus household data. Imagine sharing a phone with your parents and siblings: how much time would you get on the device, if any?
To better inform public policy and resource allocation, survey designers need to measure behavior directly and treat measures of access alone with caution.
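The access-versus-use gap in the latrine example can be made concrete with a toy tabulation. The numbers here are hypothetical, not Spears's findings; the point is only that an access question and a behavior question yield very different rates:

```python
# Hypothetical household survey: one question measures access (is there a
# latrine on the property?) and another measures behavior (is it used?).
# All figures are invented for illustration.

households = [
    {"has_latrine": True,  "uses_latrine": True},
    {"has_latrine": True,  "uses_latrine": False},  # access without use
    {"has_latrine": True,  "uses_latrine": False},  # access without use
    {"has_latrine": False, "uses_latrine": False},
]

access_rate = sum(h["has_latrine"] for h in households) / len(households)
use_rate = sum(h["uses_latrine"] for h in households) / len(households)

# An access-only survey would report 75% coverage, while a behavior
# question reveals that only 25% of households actually use a latrine.
print(f"access: {access_rate:.0%}, use: {use_rate:.0%}")
```

A survey that asks only the first question would declare this village mostly covered and miss the behavioral gap entirely.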
Researcher Confirmation Bias
Beyond survey design, confirmation bias also threatens how researchers interpret survey results. Confirmation bias is the tendency to selectively interpret data so that the results align with one’s existing ideology. This mental error is not intentional but inherent. Nevertheless, giving more weight to data that supports one’s position while overlooking contradictory evidence is dangerous to survey outcomes and development decisions.
The World Bank conducted a study to test whether its professional staff would fall into this trap. The team used identical data sets to describe the efficacy of a skin cream and the impact of minimum wage policy on poverty rates. Controlling for seniority and cognitive ability, respondents given the neutral skin cream scenario interpreted the data more accurately than those given the controversial minimum wage scenario.
In the minimum wage scenario, ideology correlated with accuracy in interpretation. Participants were asked to identify with one of two statements: “Incomes should be made more equal” or “We need larger income differences as incentives for individual effort.” When the data indicated a rise in poverty from minimum wage laws, those who support income equality were less accurate.
Given this study, professionals of all levels must be cautious not to let their ideology cloud the data.
Avoiding Professional Pitfalls
Is there any way to prevent such biases from contaminating survey research? Fortunately, open discourse can offset the failures of individual judgment and groupthink when people with different perspectives are invited to the conversation.
To prevent errors in question design, facilitate a small focus group or personal interviews with the target population. Go through the topics on the proposed survey in conversation form, using open-ended questions. Ask someone from the target population to paraphrase each question and response field in their own words. Specifically ask whether anything in the survey is confusing or could be misunderstood. Provide definitions and examples in the survey for clarification. After verifying vocabulary comprehension and the response fields for multiple-choice questions, you can go to the field with confidence.
To avoid errors in assumption and interpretation, create a team whose job is to poke holes in and question survey design and analysis. Especially in societies or organizations where offering unsolicited feedback or questioning authority is not socially well received, a dedicated team that critiques research design and interpretation is indispensable. These Red Teams (as termed by the U.S. military) force conversation. In addition to internal Red Teams, invite honest discussion with individuals who bring different perspectives or approaches to the same issue, to prevent bias relating to ideology.
Keeping these professional pitfalls at the forefront of your mind and conversation will reduce margins for error.
Assuming that Common Vocabulary is Non-Technical
LIRNEasia study, GeoPoll study, and Sheryl Sandberg quote from:
Leo Mirani. “Millions of Facebook users have no idea they’re using the internet.” Published by Quartz, February 2015.
Confusing Access and Behavior
Dean Spears quotes from:
Dean Spears and Amit Thorat. “Caste, purity, and pollution and the puzzle of open defecation in India: Evidence from a novel measure in a nationally-representative survey.” Published by RICE, September 2015.
Dean Spears. “Needed: Quantitative evidence of Swachh Bharat Abhiyan.” Published by Livemint, October 2015.
Details on the World Bank study on ideology and data interpretation by professional staff from:
World Bank. “Chapter 10: Biases of Development Professionals” of World Development Report 2015: Mind, Society, and Behavior.
We know that collecting good-quality data is the first step toward data-driven decision making. In the last two years, we’ve helped over 150 partners collect over 20 million data points through our mobile data collection app, Collect. We love Collect, but we realize that building a product isn’t enough. We’d like to share the lessons we’ve learned about designing a stellar data collection plan in our first-ever ebook. This 30-page guide contains everything you need to know to improve the way you collect data! DOWNLOAD NOW!