Cybercriminal Exploitation of Cognitive Biases: A Brain Capital Perspective

How prioritizing brain health and brain skills could offer new solutions to this challenge.

Cybercrime may sound like science fiction to some, but it is sadly a firsthand experience for a growing number of people. As more of our social lives, work, and money move online, everyone needs some understanding of cybersecurity. Curiously, just as computers can be hacked, our brains can be “hacked” by social engineering. Estimates suggest that 84% to 98% of “hacks” rely on exploiting psychological vulnerabilities rather than flaws in computer code.1

Brain capital conceptualizes brain health (eg, the lack of mental illness and neurodegenerative disease) and brain skills (eg, education) as essential to the digital economy.2 This concept assumes that our brains are our greatest asset and provides a framework to define brain issues, quantify them, and track them. Brain capital can drive new workplace policies, practices, and investments.

Our Cyber World

As more of our time and our lives move from physical spaces into cyberspace, the opportunities for working, connecting, and communicating online have expanded. And as our lives move online, those who seek to profit from crime and exploitation move online, too.

When our activities move online, all actions, links, and connections create a larger digital footprint—a traceable and permanent history of digital activity. Your digital footprint consists of all information that you leave behind as part of your online activity. When you search for products, services, or information online, you leave traces of what you were looking for. Social media websites track the individuals you communicate with, along with the topics you tend to be interested in and the types of websites and advertisements you click. Advertisers also create profiles, tracking information across websites to build up profiles of individuals, groups, and cohorts. The more you communicate, bank, invest, work, shop, chat, game, and watch films and TV online, the bigger your digital footprint grows—and your digital identity can be shared between many online platforms and is often on-sold.

Much of this information is stored privately, but many individuals also leave large digital footprints that are publicly available. For most people, if you know their name, the area they live in, and their job title, you can find a significant amount of information that is publicly available. Professional and amateur sports teams often have public rosters and schedules. Professional profiles are available on most company websites and LinkedIn. Personal information is often shared on sites like Instagram, Facebook, and Twitter. All this information has connections with real people and real-life activities.

Hacking Code vs Hacking Our Brains

When we talk about being targets of cybercrime online, we usually refer to it as being “hacked.” Hacking used to refer to some sort of exploit or vulnerability in computer code, but many digital scams and thefts do not bother trying to hack computer code—instead, they opt to exploit vulnerabilities that exist within our brains.

The more appropriate term and the more common exploit is what is called “social engineering.” If someone logged into your Facebook account, gained access to your online banking, or accessed a website that has access to your credit card details, it is far more likely that they managed to find out enough information about you to gain access to that account—not that they exploited some sort of vulnerability in the computer code.

Most individuals still use relatively simple passwords (eg, password, secret, abc123, 123456, and variations on words related to current affairs remain the most common) or self-referential passwords (eg, surnames, kids' names, pets, street addresses, birthdates) that are fairly easy to guess using publicly available information. Most individuals also reuse the same password across many websites, so when 1 password is leaked, every account becomes vulnerable.
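To make this concrete, the short Python sketch below flags passwords that match common choices or contain obvious personal details. It is purely illustrative: the word list and the personal details are hypothetical examples, not a real breach corpus or an actual security tool.

```python
# Purely illustrative sketch: flag passwords that match common choices or
# contain obvious personal details. The word list and details below are
# hypothetical examples, not a real breach corpus or a production check.

COMMON_PASSWORDS = {"password", "secret", "abc123", "123456", "qwerty"}

def is_easy_to_guess(password: str, personal_details: list[str]) -> bool:
    """Return True if the password is a common choice or contains personal info."""
    lowered = password.lower()
    if lowered in COMMON_PASSWORDS:
        return True
    # Self-referential passwords: surnames, kids' names, pets, birthdates, streets.
    return any(detail and detail.lower() in lowered for detail in personal_details)

# A surname plus a birth year is easy to guess from publicly available information.
print(is_easy_to_guess("Smith1984", ["Smith", "1984", "Rover"]))         # True
print(is_easy_to_guess("plume-quartz-ladder-novel", ["Smith", "1984"]))  # False
```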

Most cybercriminals are not exploiting the vulnerabilities in hardware or software—rather, they are exploiting the vulnerabilities in people. A large proportion of cybercriminals are not necessarily computer geniuses or master hackers. They are con artists who seek out the locations and spaces where individuals are vulnerable to being taken advantage of. Increasingly often, these are digital spaces.

Consider 2 extremely common social engineering techniques that take advantage of people online. Your email spam filters probably catch and block much of Type 1, but Type 2 is also quite common.

Type 1: Phishing

Phishing is when a malicious actor masquerades as a trusted entity to gain information. Phishing typically uses widely available information and mass communication to target large groups of people. For example, in a common type of phishing scam, a cybercriminal sends out emails masquerading as a bank or utility company. The email mimics a typical communication from the bank but seeks to obtain personal information like passwords, account information, or security details by prompting people to log on to a bogus but genuine-looking website. Spam filters eliminate a large percentage of phishing messages, but a disturbing number still get through to your inbox.

You can often spot phishing attempts by some common features. First, they often threaten you, indicating that your software is out of date or your warranty is about to expire. Second, they may offer you an incentive such as a free gift card or windfall money. Third, they have a clear call to action asking you to click on a link or open an attachment. Doing so can install malware on your machine, compromising your computer.
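As a rough illustration of those 3 warning signs, the sketch below scans a message for a threat, an incentive, and a call to action. The keyword lists are hypothetical examples; real phishing filters use far richer signals than keyword matching.

```python
# Purely illustrative sketch of the three warning signs above: a threat, an
# incentive, and a call to action. The keyword lists are hypothetical examples;
# real phishing filters use much richer signals than keyword matching.

THREATS = ("out of date", "warranty is about to expire", "account suspended")
INCENTIVES = ("free gift card", "you have won", "windfall")
CALLS_TO_ACTION = ("click the link", "open the attachment", "verify now")

def phishing_warning_signs(message: str) -> list[str]:
    """Return which of the three warning signs appear in the message text."""
    text = message.lower()
    signs = []
    if any(phrase in text for phrase in THREATS):
        signs.append("threat")
    if any(phrase in text for phrase in INCENTIVES):
        signs.append("incentive")
    if any(phrase in text for phrase in CALLS_TO_ACTION):
        signs.append("call to action")
    return signs

print(phishing_warning_signs(
    "Your warranty is about to expire. Click the link to claim a free gift card."
))  # ['threat', 'incentive', 'call to action']
```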

Type 2: Spear phishing

Spear phishing is a particularly sinister form of attack. These attacks specifically target an individual using personal information, often gathered from social media or from the individual’s publicly available profile on the web. For example, a scammer might imitate an email from a different department or an external organization to gain information from a targeted individual or group within a company. An increasingly common variant is a scammer impersonating someone in a leadership position at an organization and pressuring an employee to complete an online task.

Another approach uses some publicly available information to obtain other information. For example, if your family’s plan to spend several days at an out-of-town football tournament can be pieced together online, someone masquerading as a representative of the local sports association could use that information to confirm that your house will be unoccupied on certain days. Or it could be someone attempting to collect sensitive data from individuals within a company, either for malicious purposes or to gain further access to that company’s IT systems.

Neither of these are hacks targeting computer code—they are attempts to gather more information using psychological and social strategies to target victims.

When should you worry? Be suspicious anytime you are contacted by an unfamiliar individual or agency offering you an improbable gift or help avoiding a potential crisis related to your finances or business. These messages sound plausible at first glance, but digging a bit deeper can reveal other telltale features. If you are asked to click a link, or worse, to open an attachment, be wary. Always check the email address or the web address of the sender. These often look like real companies on the surface but have odd prefixes or suffixes that do not add up. If things look at all illegitimate, do not proceed and do not click the link. When in doubt, check with a colleague or an IT professional in your organization.
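As a concrete illustration of that advice, the Python sketch below compares a sender’s domain against a small allowlist and flags lookalike addresses. The domain names are hypothetical, and real email security relies on much stronger checks than a simple allowlist.

```python
# Purely illustrative sketch of the "check the sender's address" advice above.
# The trusted-domain list is a hypothetical example; real mail systems rely on
# far stronger checks (eg, SPF, DKIM, and DMARC), not a simple allowlist.

TRUSTED_DOMAINS = {"examplebank.com", "example-utility.com"}

def sender_domain(address: str) -> str:
    """Return the domain portion of an email address, lowercased."""
    return address.rsplit("@", 1)[-1].strip().lower()

def looks_suspicious(address: str) -> bool:
    """Flag senders whose domain is not an exact match for a trusted domain.

    Lookalike addresses such as alerts@examplebank-security.com or
    support@examplebank.com.ru contain a familiar name but add odd prefixes
    or suffixes, so only an exact domain match is accepted.
    """
    return sender_domain(address) not in TRUSTED_DOMAINS

print(looks_suspicious("alerts@examplebank.com"))           # False: exact match
print(looks_suspicious("alerts@examplebank-security.com"))  # True: odd suffix
print(looks_suspicious("support@examplebank.com.ru"))       # True: odd suffix
```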

Cognitive Biases and Exploiting Social Norms

Research into individual differences and dispositional vulnerabilities has found only limited evidence linking them to cybercrime victimization.3 Much of the research on personality correlates of fraud susceptibility is atheoretical or fails to use validated personality instruments, and the correlations it reports tend to be modest at best. Loneliness, well-being, trust, and cognitive abilities have all been hypothesized to relate to cybercrime vulnerability, but the associations are weak and the mechanisms are poorly understood; they likely interact with decision-making strategies and environmental cues. Falling victim to cybercrime is highly situational and contextual. The most effective cybercrime does not target groups based on dispositional factors; it targets individuals based on personal, situational factors.

The best protective factors are experiential factors. Knowledge of cybersecurity, understanding of common scams, and experience using digital technology can all be protective factors. However, these interact with dispositional factors. For example, those with a high degree of technical knowledge about online environments may be overconfident about their ability to spot scams.

The best framework we can use for understanding how individuals fall victim to cybercrime is understanding the cognitive biases and the social norms that are exploited by cybercriminals. Williams, Beardmore, and Joinson4 detail the most common norms that are exploited.

  1. Authority: Most individuals tend to be responsive to people or institutions with an established level of authority or power, like government departments, banks, police, managers, or public figures. Imitating authority figures draws on that trust in authority.
  2. Liking: Cybercriminals may create fake identities (either individuals or institutions) that appear kind or prosocial, especially philanthropic figures, charitable organizations, or warm and caring individuals.
  3. Conformity: Cybercriminals often indicate that many others have benefited from responding in a certain way, hoping to create the expectation that similar behavior will lead to the same result.
  4. Commitment and consistency: This strategy relies on the sunk cost trap. Extracting a small amount of money as an initial investment, processing fee, or other small cost can lead individuals to send more money to achieve the promised result.
  5. Reciprocity: Cybercriminals often initially offer a favor or prize, setting up an expectation of the victim providing something in return.
  6. Scarcity: Cybercriminals often create time or resource pressures to encourage the individual to participate in the scam urgently: “limited-time offer.”
  7. Reward: Cybercriminals often promise some sort of reward that could be psychological, financial, or physical (eg, cure illness, lose weight, yield high investment returns, feel good about yourself, etc).
  8. Loss: Cybercriminals often indicate that a failure to act or respond will result in some type of penalty, loss, end of relationship, or loss of status.

Cognitive Biases and Brain Health

These cognitive biases can be directly affected by impaired brain health, making individuals more susceptible to exploitation by cybercriminals. Brain health is vulnerable to stress and exhaustion, such as that resulting from mounting uncertainty in the world. Friedman5 describes us as living in an “age of accelerations” driven by rapid technological innovation, globalization, and climate change. The confluence of these accelerations is leading to rapid societal change and uncertainty.

Building Brain Capital to Reduce Cybercrime

To combat these potential issues in the workplace, it is essential to foster a sense of community, purpose, and value in each individual. If individuals feel that their work is valued, including in tangible ways such as a raise or promotion, they are more likely to stay with a company and to help foster a positive community. Clear and direct communication, along with tangible goals to strive toward, makes individuals more likely to accomplish what they set out to do in exemplary and innovative ways.

There are ways to mitigate cybercriminals’ ability to exploit individuals’ cognitive biases. One is for leaders within companies to emphasize brain capital, including increasing the value and accessibility of care for employee mental health. From a cybersecurity perspective, the boundaries between personal and business lives are blurry or nonexistent: your computer is equally vulnerable whether the attack targets a personal account or a work account, and stressed, pressured individuals are more likely to fall for an attack. Thus, ensuring that insurance covers visits with mental health professionals, allowing time away from work for mental health crises, and encouraging open communication about mental health in company culture can all contribute to security. Leaders can model this by showing their own vulnerabilities and fostering a positive, ongoing dialogue about mental health.

Another strategy to increase resilience to cybercrime—especially social engineering attacks—is to emphasize training and education on the cognitive biases and social norms that are exploited by cybercriminals. We recognize the quality work of groups such as the National Cybersecurity Center in Colorado Springs, the Cyber Security Research and Education Institute at the University of Texas at Dallas, and digital literacy workshops provided by organizations like Senior Planet.

Concluding Thoughts

Many individuals think of being “hacked” as something rooted in software and hardware, without realizing that cybercriminals are far more likely to exploit our cognitive biases, in effect hacking our brains, than to exploit vulnerabilities in computer code.

Mr MacRae is author of the book, Dark Social: Understanding the Dark Side of Work, Personality, and Social Media. Dr Ojha is a psychiatry resident physician at the UCSF Fresno Psychiatry residency program. Ms Smith is an Atlantic Fellow for Brain Health Equity at the Global Brain Health Institute and a Stanford Thiel Fellow. She is on the steering committee for the OECD NIPI. Dr Krawczyk is deputy director of the Center for BrainHealth in Dallas and holds the Debbie and Jim Francis Chair at the University of Texas at Dallas. Dr Berk is Alfred Deakin Professor of Psychiatry at Deakin University and director of the Institute for Mental and Physical Health and Clinical Translation (IMPACT). He holds honorary professorial appointments with ORYGEN Youth Health, the Florey Institute for Neuroscience and Mental Health, and the Department of Psychiatry at the University of Melbourne. Dr Eyre is lead of the Brain Capital Alliance and co-lead of the OECD Neuroscience-Inspired Policy Initiative (NIPI). He is also senior fellow for brain capital with the Meadows Mental Health Policy Institute. He holds adjunct appointments with Deakin University’s IMPACT, and the departments of psychiatry at Baylor College of Medicine and the University of Texas Health Sciences Center at Houston. He holds advisory positions with the Women’s Brain Project, the Heka Fund (a collaboration of Newfund Capital and FondaMental Fondation), the Davos Alzheimer’s Collaborative, the Brain Health Nexus of Cohen Veterans Bioscience, BrainLat, and the Euro-Mediterranean Economists Association. 

References

1. Goldman J. Fully 84 percent of hackers leverage social engineering in cyber attacks. eSecurity Planet. February 28, 2017. Accessed June 10, 2022. https://www.esecurityplanet.com/threats/fully-hackers-leverage-social-engineering-in-cyber-attacks/

2. Smith E, Ali D, Wilkerson B, et al. A brain capital grand strategy: toward economic reimagination. Mol Psychiatry. 2021;26(1):3-22.

3. Norris G, Brookes A, Dowell D. The psychology of internet fraud victimisation: a systematic review. J Police Crim Psychol. 2019;34:231-245.

4. Williams EJ, Beardmore A, Joinson AN. Individual differences in susceptibility to online influence: a theoretical review. Comput Hum Behav. 2017;72:412-421.

5. Friedman TL. Thank You for Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations. Farrar, Straus and Giroux; 2016.
