I received the following email alert from a cybersecurity client of mine:
“6x increase in cyber attacks over the last 4 weeks.”
“Information about COVID-19 should only come from a legitimate source. Don’t trust unsolicited emails or open unknown links.”
“Really?” I thought to myself. “We’re on lockdown, stressed about family and friends, not to mention business and jobs, and I’m getting cybersecurity alerts?” Frankly, I usually ignore them even when I’m not distracted, but who has time for this now?
However, the more I thought about it, the more I realized that’s exactly what cybercriminals are thinking too and why people need to stay alert and resist the temptation to click on those compelling links.
The truth is, despite the fancy hardware and software solutions available, most cybersecurity breaches occur due to human error or phishing attacks. Unless you have relatively sophisticated automated solutions, the people IN your organization may represent your greatest internal threat.
While companies see high risks from external threat actors, such as unsophisticated hackers (59%), cyber criminals (57%), and social engineers (44%), the greatest danger, cited by 9 out of 10 firms, lies with untrained general (non-IT) staff. In addition, more than half see data sharing with partners and vendors as their main IT vulnerability. Nonetheless, less than a fifth of firms have made significant progress in training staff and partners on cybersecurity awareness (ESI ThoughtLab/WSJ Pro Cybersecurity, 2018).
And this was before COVID hit us between the eyes. Let’s take a quick look at the psychology at play that makes us even more vulnerable during a crisis.
The Neuroscience of Crisis
As humans, we are prewired for crisis.
Whether you think of this brain system as the “reptilian brain,” attributed to Paul MacLean and his Triune Theory (Sagan, 1977), or the fight-flight reaction of the sympathetic nervous system (System 1) which is our immediate, emotional reaction (Kahneman, 2011), it is clear that our brain protects us in times of danger.
This system, which is buried deep in the interior of the human brain, is both evolutionarily older and more immediate than deliberate cognitive thought; it is pre-cognitive. When the danger is ambiguous, System 2 thinking (which, in contrast with System 1, is slower, more deliberative, and more logical) has time to work: go through your options, take your time, don’t rush.
But when there is a perception of crisis, the need to ACT is immediate.
The fight-flight response makes us want to DO something, and now! From an evolutionary point of view, in times of danger, those who acted first were often safer than those who took their time.
The COVID-19 pandemic is, of course, a crisis.
Have you noticed how much more tired you are these days, even though you’re barely leaving the house? It’s because crisis mode requires more energy. During a crisis, the thoughtful, reflective parts of our brain shut down. In other circumstances, we might hover over a suspicious link while we process whether it seems risky or not.
But that requires fully functional frontal lobes, or executive functioning, which need time and undivided attention to work properly. In crisis mode, frontal lobe functioning is significantly diminished, or may go offline altogether, in favor of a quick (albeit less considered) action or reaction.
To make matters worse, cybercriminals know this: They know what emotional buttons to push to make you afraid (just click the link) or try to help (just click the link), or maybe even register your opinion (just click the link).
But if you do click that unfamiliar or disguised link, you may have just let criminals into your personal computer and, by default, into your company’s IT system.
Wait, consider, relax. Let System 2 kick in before you commit yourself, your computer, and your company to whatever those “black hat” cybercriminals have in mind.
Motivation During a Crisis
After the fear comes a desire to help.
This is one of the ways that cybercriminals trick well-meaning people. Whether it’s a donation, or a message of support, or some other activity to help, we are again motivated in ways that leave us open to online criminal behavior.
McClelland’s Social Motive Theory suggests there are three primary social motives: Achievement, Affiliation, and Power (McClelland, 1987).
We all have the capacity for all three, and genetics and socialization as well as cognitive choice determine which motive wins the day in a given situation. In times of individual crisis, needs for achievement (e.g., successful social distancing) or needs for power (e.g., controlling the situation) may come to the fore.
But in a social crisis, many of us are “hard-wired” to help, triggering a need for affiliation.
That desire to help may cause people to act impulsively in what they believe is a pro-social, affiliative manner. Just click the link to make your donation, just click the link to show your support, and on and on, the cybercriminals never stop trying. Like the very best advertisers, they are clever about pushing your emotional (non-cognitive, pre-cognitive) buttons to get you to act in ways that benefit them.
I am assuming everyone reading this has the best of motives. Those very motives make you susceptible to the manipulation of cybercriminals.
If your current impulse is to put this away, turn to something else, then you have experienced exactly what cybercriminals are counting on.
Information fatigue, too much bad news, or just a desire to put some positive energy back out into the world, may all leave you vulnerable.
Don’t click suspicious links, or even links that look well-meaning, without doing some simple checks and reviews first.
- Hover over a link and check that the URL matches the domain of whoever the email purports to be from.
- Don’t provide any information, on any social media, whether at work or elsewhere, that can be used against you.
- Hackers are clever and unscrupulous, so check and double-check links that look suspicious in any way.
- Do a bit of research before you agree to anything and certainly before sending money or private information.
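The first check above, comparing a link’s real destination against the sender’s claimed domain, can be sketched in a few lines of Python. This is an illustrative example only (the function name and sample addresses are mine, not from any real tool), and real phishing detection must also handle lookalike domains, redirects, and shortened URLs:

```python
from urllib.parse import urlparse

def domains_match(sender_address: str, link_url: str) -> bool:
    """Rough check: does a link's domain match the sender's email domain?

    Illustrative sketch only -- it catches the crudest mismatches, not
    lookalike domains (mybank-support.com), redirects, or URL shorteners.
    """
    # Domain the email claims to come from (text after the last "@").
    sender_domain = sender_address.rsplit("@", 1)[-1].lower()
    # Domain the link actually points to.
    link_domain = (urlparse(link_url).hostname or "").lower()
    # Accept an exact match or a subdomain of the sender's domain.
    return link_domain == sender_domain or link_domain.endswith("." + sender_domain)

# A link that matches the sender passes; one pointing elsewhere fails.
print(domains_match("alerts@mybank.com", "https://mybank.com/login"))      # True
print(domains_match("alerts@mybank.com", "https://mybank.evil-site.ru/"))  # False
```

Even this crude comparison would flag many of the COVID-themed phishing emails described above, where the display text says one thing and the underlying link says another.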
What’s Your Story?
Narrative is the final pillar in this little tripartite approach to cybercrime. I have come to believe that personality is a story we tell ourselves (and the world) about ourselves (Bruner, 1986).
This story comprises our identity; it is who we think we are, and these beliefs about who we are often dictate how we behave in the world and how we process information.
For example, as a psychologist (not to mention a human being), I think of myself as a helpful person. I try to be kind and considerate. I don’t like to walk past beggars without giving them something (yes, yes, I know that would cost me points on the WAIS IQ test, but there you go: despite my cognition telling me this could be a trick, or that he or she will just buy cigarettes and beer, I often give in anyway).
Cybercriminals will use these ideal images we have of ourselves to manipulate our thoughts, emotions, and purse-strings.
- I am good, so I give to the sick and needy.
- I love children, so I’ll give to those orphaned by COVID.
- I support healthy behaviors, so I’ll do most anything to protect my health.
- I’m a good parent, so I will click the link that shows me 10 ways to protect my family from infection.
Your personal narrative is the core of your personal identity. We sometimes value it more than life itself (think of martyrs).
If a clever cybercriminal hacks your social media and learns what makes you “tick,” that information can be used against you in a cybercrime.
The threats are real and so are the psychological levers cybercriminals pull to manipulate your fear.
We are all overwhelmed, trying our best to hang in there, and help each other where we can. Don’t let your best intentions, and fatigue, allow you to be manipulated to behave unsafely online. COVID is real, and so is cybercrime. We must be alert to both.
Written by: Dr. Mark Sirkin, CEO at Sirkin Advisors
References
Bruner, J. (1986). Actual minds, possible worlds. Cambridge, MA: Harvard University Press.
ESI ThoughtLab/WSJ Pro Cybersecurity (2018). The cybersecurity imperative: Managing cyber risks in a world of rapid digital change. New York: Author.
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
McClelland, D. (1987). Human motivation. Cambridge: Cambridge University Press.
Sagan, C. (1977). The dragons of Eden. New York: Random House.
Tags: Cybersecurity, Employment