Cognitive bias and data: how human psychology impacts data interpretation

In our modern data-driven world, we are quick to assume that data is the ultimate arbiter of truth: always accurate and always objective. This perception can amount to blind faith, however, and in our quest for data-driven strategies, we can easily overlook the many ways in which human thought processes shape seemingly objective data.

These assumptions can make us feel invulnerable to the biases that psychologists have drawn attention to for decades. In reality, though, we are always susceptible to bias. Given our increased reliance on data, it is more crucial than ever to recognize the ways in which our thoughts, feelings, and perceptions shape the meaning derived from data. Our imprint is left on every aspect of the data life cycle, determining which variables are measured and which algorithms are selected.

By recognizing the numerous biases that influence data interpretation, we can better design strategies that reflect the intricacies of behavioral data science.

Data isn’t always as objective as it seems

Data seems like the ultimate source of objectivity. After all, at face value, the facts that form data sets are observable and measurable in a way that opinions are not. Yet those facts must still be filtered through the transformative lens of human interpretation.

With every decision and even ostensibly minor action, our thoughts or emotions are embedded into data sets and allowed to influence our responses to that data. As such, it is important to acknowledge that data itself may seem objective—but through our very use of that data, it becomes subjective.

The myth of neutral data in a human-centered world

Given the variable nature of human emotion, it's easy to assume that hard data provides a more objective or reliable way of understanding our world. Such data may seem neutral because, in its raw form, it does not appear to have an agenda.

This assumption reflects the widely held belief that information exists independently of observation. Even if this were true, though, humans cannot access information without observing it, so we will inevitably shape its meaning even if we are unaware of our influence.

Why even the best datasets are shaped by perspective

Human perspective shapes every dataset. It determines why we seek data in the first place, defining the problems we hope to solve through the analysis and dissemination of that data. Perspective also determines whether we deem various collection or analysis methods valid, as well as how we use the results to make decisions or drive action.

The influence of human perception is most easily observable when data or analysis strategies are clearly flawed. For example, major, quickly recognizable oversights may involve narrow data sources or clearly identifiable agendas.

Even when we make active efforts to mitigate the effects of bias, human perception will always remain influential. This is, after all, what allows us to interpret patterns in a way that makes data personally meaningful or universally useful.

The intersection of psychology, perception, and analytics

Once we acknowledge that data is not neutral or strictly objective, we can begin to explore the many ways in which psychology (the science of human thoughts, emotions, and behaviors) and perception (how we identify and interpret sensations) influence our understanding of data analytics.

At this intersection lies a powerful opportunity: to build humans' interpretive strengths into our analyses while also accounting for various cognitive challenges or limitations. This is where an emerging interdisciplinary field comes into play: behavioral data science, which explores how humans interact with data and uses this understanding to advocate for data-driven strategies that limit bias while upholding ethical principles.

What are cognitive biases—and why do they matter in data work?

Cognitive biases are at the heart of everything we think, say, or do. Although often described as "errors" in thinking, these biases simply reveal how we make sense of complex information, relying on built-in thought processes to create clarity and cope with uncertainty.

Defining cognitive bias in the context of data analysis

Cognitive biases can look different depending on the situation at hand. They are typically referred to as systematic, revealing the divide between human intuition and logic, or what the Oxford Review describes as "deviation[s] from rationality in judgment." In the context of data analysis, this could involve the shortcuts we take when gathering or interpreting data—at the risk of distorting insights or sparking misguided conclusions.

How our brains use shortcuts that distort logic

The human brain craves the easiest path to understanding, as this can help us overcome the mental discomfort of confusion. When logic is not immediately discernible, we may turn to shortcuts to derive our own sense of meaning—even if that meaning is not truly accurate. Known as heuristics, these shortcuts can play a valuable role in our everyday lives, allowing for efficient decision-making so we can make necessary choices while conserving limited mental energy.

When bias isn’t intentional—but still skews outcomes

The term 'bias' tends to evoke a negative reaction. This connotation is especially strong in political contexts, with accusations of bias regularly directed at public officials or media outlets. However, bias is distinct from prejudice and thus not necessarily deliberate; it's simply a tool our brains use to make sense of the world.

Softening the language surrounding bias can help us better understand why it exists and what purpose it serves. Still, this bias (although often unintentional) tends to skew outcomes, with seemingly subtle distortions potentially coalescing to have a dramatic impact on everything from business decisions to public health.

Common biases that show up in data collection and interpretation

Biases take numerous forms that can yield dramatically different outcomes. This diversity exists, in part, because the situations that call for mental shortcuts are so diverse. Bias can emerge at any point in our information processing and may also differ according to the environments or circumstances in which we strive to process complex information. Common examples of cognitive biases in decision-making include:

Confirmation bias: finding what we already believe

Confirmation bias reflects the human tendency to support existing beliefs when seeking or interpreting information. Rather than actively challenging preconceived notions, confirmation bias reinforces them in an effort to enhance intellectual comfort. Unfortunately, this can prevent individuals from actively considering new and potentially valuable ideas. This can also limit critical thinking and informed decision-making, even when a wealth of data is available.

Anchoring bias: letting first impressions skew analysis

Often centered on first impressions, anchoring bias involves an excessive reliance on initial information, with individuals neglecting follow-up insights in favor of previously explored details. It can occur even if subsequent evidence is better rooted in research or is, simply put, more accurate. This can fuel flawed judgments, with updated information either undervalued or outright ignored. As a result, those prone to anchoring bias may not be sufficiently adaptable.

Availability heuristic: overvaluing what’s most recent or visible

The availability heuristic relies on how easily details about situations or events come to mind. These may seem more salient or impactful if they are recent or carry emotional weight. This heuristic leads us to overemphasize the most memorable information while neglecting details that, though less vivid, remain relevant.

A common example involves the fear of flying after seeing headlines about plane crashes; these stories may convince individuals to avoid buying plane tickets when, in reality, they are far more likely to be harmed in a car accident.

Survivorship bias: ignoring what isn’t in the data

As a common type of sampling bias, survivorship bias reflects a tendency toward optimism, in which we crave success stories and, in turn, are more likely to focus on situations involving positive outcomes. Meanwhile, we overlook examples involving eliminations or failures, not even registering these situations as relevant. If not curbed, survivorship bias can lead to overconfidence and chasing strategies with limited evidence of success merely because the few examples of positive outcomes are so compelling.

A common example of this involves the enduring belief in the millionaire success story. Some people assume that dropping out of college will help them reach their most ambitious professional goals simply because this worked out for Bill Gates and Mark Zuckerberg. In this instance, the overlooked reality (as referenced in a wealth of data) is that very few college dropouts achieve such incredible financial success and, rather, degrees are closely tied to substantial increases in both immediate income and lifetime earnings.

Framing effect: how presentation shapes perception

The framing effect reveals the power of presentation. This idea is best summed up by the cliché about the glass being half empty or half full. Technically speaking, the glass holds the same volume of liquid either way—but the specific words used to convey this volume can deliver a drastically different impression. This includes gain frames (focusing on positive outcomes) along with loss frames (highlighting potential downsides).

How bias impacts decision-making across fields

The biases highlighted above can have a considerable impact on decision-making, potentially causing leaders to overlook crucial details—or, conversely, to fixate on insights that are realistically of little significance to the problem at hand. Awareness is the first step to overcoming these issues, but it's especially essential to understand that the impacts of bias can play out differently from one field to the next.

To that end, below we outline a few of the most noteworthy examples of cognitive biases in decision-making, revealing how these can inadvertently cause harm even amidst sophisticated data-driven initiatives.

Business and marketing: misreading consumer trends

From consumer surveys to social media metrics, many data-driven insights help marketing experts understand how their messaging resonates with the public. Cognitive biases, however, can produce inaccurate takeaways and prompt marketing analysts to misinterpret consumers' thoughts or opinions.

That being said, marketing teams have also been known to harness cognitive biases when shaping campaigns, frequently tapping into scarcity to drive sales or even survivorship bias to shape brand perceptions.

Healthcare: skewed diagnostics and risk assessments

In healthcare, poorly analyzed data can prove downright devastating, leading to misdiagnoses along with oversights in treatment planning. These potentially deadly errors can occur even among well-trained professionals who may improperly interpret clinical information due to extraordinary pressures that cloud their judgment.

The peer-reviewed journal BJA Education reveals the alarming prominence of numerous cognitive biases within this field, detailing specifically how they relate to decision-making in anesthesia and intensive care. For instance, anesthetists may be prone to confirmation bias when actively seeking information to support their diagnoses. This could play out as a mistaken assumption that a patient's blood pressure is normal when no reading can be obtained, with the failure blamed on a wrong-sized cuff rather than on the pressure actually becoming "too low to detect."

Public policy: designing interventions based on incomplete insights

Today's policymakers favor data-driven solutions because they offer the impression of credibility and even an element of political cover when these leaders must make potentially unpopular decisions. This approach can be risky, though, as the information presented to policymakers could be filtered through numerous layers of interpretation. This filtering may even be actively incentivized throughout the political process, with lobbyists and special interest groups seeking to present the most favorable data—in the most favorable terms—while minimizing the uncertainties or seemingly undesirable information built into data sets.

Education: misinterpreting assessment and achievement gaps

Educational assessments can provide key indicators of student learning, helping students understand where knowledge gaps still exist while also allowing instructors to identify where their teaching methods might fall short. Formative assessments offer an ongoing approach to discerning student progress, while summative assessments detail achievements at the end of instructional units.

Oversights in these assessments may cause educators or administrators to neglect underserved populations and assume, for example, that low test scores indicate minimal effort—or that limited participation suggests disinterest.

As educational leadership experts Seth Weitzman and Robert Feirsen point out, this can hold grave implications beyond the classroom, "impair[ing] collaborative problem-solving and exacerbat[ing] conflict," and ultimately "fragment[ing] groups into us-versus-them divisions and sharply narrow[ing] the range of solutions" that educational leaders propose or consider.

Spotting bias in your own data work

Every individual is prone to bias. This cannot be eliminated entirely, as biases are built into our everyday thinking processes and frequently unconscious. That being said, bias can be minimized. This begins with simple awareness and self-reflection—making the effort to examine your own work and pinpoint any instances in which psychological processes may have undermined the true meaning behind the data.

Critical thinking techniques for analysts and decision-makers

Critical thinking is an in-demand competency within today's data-focused workforce. In the context of data-driven decision-making, this calls for the intentional questioning of assumptions, with evidence carefully evaluated under the recognition that it may have already been shaped by human perceptions. Strategies to enhance critical thinking amid bias might include expanding exposure to diverse perspectives or reframing problems to see if conclusions change.

Designing better surveys and data collection tools

Data collection can have a profound impact on cognitive biases and the ongoing effort to combat them. While all data sets will ultimately be influenced by human thought processes, these tendencies can be mitigated via randomized sampling, standardized measurements, and transparent documentation.

Techniques for improving survey design may include:

  • Emphasizing neutral language
  • Randomizing question orders (illustrated in the sketch following this list)
  • Conducting pilot tests to pinpoint instances of confusion
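
As a minimal illustration of question-order randomization, the following Python sketch shuffles a shared question pool independently for each respondent so that order effects average out across the sample. The question wording and respondent IDs here are hypothetical, not drawn from any particular survey:

    import random

    # Hypothetical survey questions; the wording is illustrative only.
    QUESTIONS = [
        "How satisfied are you with the product overall?",
        "How likely are you to recommend it to a colleague?",
        "How easy was the product to set up?",
    ]

    def ordered_questions_for(respondent_id):
        """Return the question pool in a random order unique to each respondent.

        Seeding the generator with the respondent ID keeps each order
        reproducible for auditing while still varying it across respondents.
        """
        rng = random.Random(respondent_id)
        shuffled = QUESTIONS[:]   # copy so the master list stays intact
        rng.shuffle(shuffled)
        return shuffled

    # Respondents 1 and 2 see the same questions in different orders.
    print(ordered_questions_for(1))
    print(ordered_questions_for(2))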

Questioning outliers, averages, and visualizations

Outliers can contribute to biases by skewing averages. Therein lies the value of comparing the mean and the median and carefully evaluating any significant differences between these metrics. Active efforts to identify and explain outliers can also counter tendencies such as survivorship bias.
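
As a minimal sketch of this check, using made-up values rather than real observations, the Python snippet below compares the mean and the median and flags outliers with the common 1.5 × IQR rule:

    from statistics import mean, median, quantiles

    # Made-up figures; the extreme value stands in for an outlier.
    values = [12, 14, 15, 13, 16, 14, 15, 98]

    avg = mean(values)
    mid = median(values)
    print(f"mean = {avg:.1f}, median = {mid:.1f}")  # a large gap hints at skew

    # Flag outliers using the 1.5 * IQR rule.
    q1, _, q3 = quantiles(values, n=4)   # quartiles
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    outliers = [v for v in values if v < low or v > high]
    print("outliers:", outliers)  # each one deserves an explanation, not silent removal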

Thoughtfully designed visualizations can make these outliers easier to recognize and understand, thereby guarding against misleading conclusions when exceptional cases are present.

Tools and strategies to minimize bias in data practices

Because cognitive bias is so deeply ingrained in our everyday thought processes, it can easily be integrated into the tech-driven systems and solutions we now use to make sense of data. However, through deliberate strategies, we can identify core sources of bias and strive to minimize or mitigate these concerns as they appear throughout the data lifecycle.

Peer review and cross-functional analysis teams

Peer reviews encourage experts to weigh in on data-driven decision-making, evaluating everything from data collection methods to analyses and outcomes, examining closely for errors or biases at every step. Cross-functional analysis teams support this effort by highlighting diverse perspectives so that data interpretations are not limited by narrow viewpoints.

Blind testing, randomization, and control groups

Blind testing keeps various participants purposefully unaware of certain details, with the goal of limiting bias so that outcomes are driven by specific variables rather than participants' attitudes or experiences. In the context of data practices, this allows for objective evaluations.

Randomization is a powerful safeguard, too, with participants assigned to experimental or control groups by chance. This limits bias because chance assignment keeps pre-existing differences between participants from systematically influencing the results. A blend of randomization and blind testing can isolate the impact of targeted variables, considerably improving both the validity and reliability of ensuing insights.
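
To make the idea concrete, here is a minimal Python sketch of chance-based group assignment. The participant IDs are hypothetical, and the coded labels ("A"/"B") echo the blinding idea described above by hiding which group is the treatment arm:

    import random

    # Hypothetical participant identifiers.
    participants = [f"P{i:03d}" for i in range(1, 21)]

    def assign_groups(ids, seed=2024):
        """Randomly split participants into two equally sized groups.

        A fixed seed makes the assignment reproducible for auditing, while
        the shuffle keeps the split free of any deliberate ordering.
        """
        rng = random.Random(seed)
        shuffled = ids[:]
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        # Coded labels rather than "treatment"/"control" help keep
        # downstream analysis blind to group identity until unblinding.
        return {"A": shuffled[:half], "B": shuffled[half:]}

    groups = assign_groups(participants)
    print(groups["A"])
    print(groups["B"])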

Training in data ethics and psychological awareness

Moral principles known as data ethics govern the use of data in contemporary decision-making. These principles reveal the value of data privacy, integrity, transparency, and consent within our data-driven landscape. Psychological awareness can be built into data ethics training to help us understand when (or why) we may overlook these values, along with the cognitive biases that may prevent us from using data responsibly.

The role of emotional intelligence in data interpretation

Emotional intelligence (EI or EQ) describes whether (and to what extent) individuals can manage their own emotions while also understanding how others feel and behave. This plays a central role in data interpretation, as poor emotional management—and an accompanying lack of empathy—can increase the potential for bias-driven distortions as well as limit our willingness to actively combat these issues.

Staying aware of personal biases and motivations

While all humans are prone to cognitive bias, some individuals may be more susceptible to certain forms of bias. Self-awareness helps us determine where we are most vulnerable and why certain mental shortcuts skew our thinking. Through reflective journaling, we can monitor our thought processes for signs of bias, while mindfulness helps us notice snap judgments as they occur.

Using empathy to understand audience reactions to data

Colleagues, clients, or stakeholders may show dramatically different responses to the same data, even when it's presented using similar language or visualizations. Through empathy, however, it is possible to anticipate reactions and tailor communication accordingly. Insights into cognitive biases may support empathy by reminding us that we are all prone to mental shortcuts.

Practicing reflective thinking before drawing conclusions

Data-driven solutions support fast-paced operations and real-time decision-making—but there are still times when a slower, more reflective approach to thinking may be preferable. This involves a conscious analysis of inner thoughts and beliefs, exploring what exactly drives these and why. Reflective thinkers pause to consider other perspectives, actively challenging their own beliefs to reveal potential blind spots.

Final insight: data is only as honest as its interpreter

Data may be king, but it's only as powerful as the humans who interpret it. With the right frameworks and skill sets, today's data-driven professionals can mitigate biases while uncovering deeper truths and ultimately promoting the ethical use of data.

Gain an edge in a data-driven workforce

As you explore opportunities in data-driven industries, be mindful of the potential for bias in your own research. A well-rounded education can help you understand and address data bias in its various forms.

This begins with developing data literacy—a clear priority within the Penn LPS Online Data Analytics and Psychological Sciences bachelor’s degree concentration. Get in touch to discover how this interdisciplinary program can help you become a responsible data steward while gaining an edge in today's fast-evolving workforce. Feel free to explore our other concentrations as you plan for a passion-guided educational journey.
