Hacker psychology: Understanding the 4 emotions of social engineering
While technological know-how certainly plays a large role in enabling attackers to hack any given system, corporation or individual, what is often overlooked is that some tricks of the trade, like social engineering, are also psychological games. That means defending against these kinds of attacks is partly a mental exercise as well.
It’s important for IT professionals to understand how social engineers take advantage of human emotion to carry out their attacks. Let’s examine the four human emotions and behaviors hackers most commonly exploit in social engineering campaigns, the distinct characteristics of campaigns built around each one, and some key considerations for helping your employees and your organization avoid falling prey to these types of attacks in the future.
* Fear. Defined as an unpleasant emotion caused by the belief that someone or something is dangerous, likely to cause pain, or a threat.
As one of our most powerful motivators, fear is arguably the most commonly manipulated emotion in social engineering campaigns. Whether it arrives as a phony email claiming your online bank account has been compromised and requires a password change, or as an urgent security notice from your “bank,” these scams leverage a specific threat to the targeted recipient or group of recipients, pressuring them to act quickly to avoid or rectify a dangerous or painful situation.
As an example, cybercriminals recently took advantage of tax season, using information stolen from the IRS to call and threaten U.S. residents filing their taxes. After reaching victims on the phone, the attackers would immediately turn aggressive, threatening police action if money was not wired to a fake IRS account to rectify a supposed tax irregularity.
* Obedience. Defined as compliance with an order, request or law, or submission to another’s authority.
Social engineering scams that prey on obedience are often disguised as an email, instant message or even a phone call or voicemail from a person or group of superior authority, such as law enforcement or an executive at one’s company. Because we’re taught from a young age to trust authorities, we are not conditioned to question the validity of their correspondence and tend to comply with their instructions, requests and guidance.
But when it comes to phishing campaigns, that innate trust in authority can have serious consequences. Just ask toy maker Mattel, which wired $3 million to a cybercriminal posing as the company’s CEO after he emailed a finance executive with instructions to approve a payment to a vendor in China. While this particular scam had a happy ending for Mattel, as Chinese authorities helped recover the funds, it’s a hard lesson in the power of authority and obedience when it comes to phishing attacks.
* Greed. Defined as an intense and selfish desire for something, especially wealth or power.
Campaigns that exploit greed routinely offer a reward, usually monetary, for performing a specific action. A classic example is the “419” Nigerian scam, named for the section of the Nigerian Criminal Code that covers fraud, in which a cybercriminal claiming to be a Nigerian official or agency contacts the target by phone or email and promises a handsome reward in exchange for a small favor or an advance payment, so long as the target eventually shares their bank account information in order to receive the reward.
In 2013, a Nigerian scam victim from Australia spoke out at the AusCERT Conference and revealed she had been swindled out of $300,000 over the course of four years. As the saying goes, the love of money is the root of all evil, and in the case of phishing campaigns, letting greed outweigh one’s better judgment can prove it painfully true.
* Helpfulness. Defined as a willingness to help other people.
Not all cybercriminals take advantage of negative human tendencies to carry out social engineering campaigns. In fact, the fourth behavior commonly exploited is a willingness to help another person or group. These campaigns often target customer support or customer service departments, as attackers are betting that these employees’ propensity to lend a hand and keep people happy will lead them to divulge or accept more information than they should.
Take the Amazon.com customer service backdoor recently disclosed on Medium, for example. In this case, a hacker armed with nothing more than a shopper’s name, email address and an incorrect mailing address was able to pose as the account holder in an online chat with customer support and, through a series of calculated questions, obtain his target’s correct personal information. The hacker ultimately gained access to the shopper’s credit card information and made a purchase via his Amazon account. The customer support reps were simply doing their jobs, but the hacker in question knew just how to use their helpfulness against them.
In an enterprise setting, as with many aspects of security, a large part of defending against social engineering comes down to setting policies and educating employees. Insider threats are arguably the most common and dangerous threat an organization faces: recent research found internal actors responsible for 43% of data breaches, roughly half of which were accidental rather than malicious.
It’s important not only that IT and security leaders understand hackers’ evolving tactics, but also that they continuously adjust policies and share their knowledge by educating their colleagues and training them to stay vigilant against nefarious activity. For example, employees need to be taught to take a step back when they receive a suspicious email or instant message and consider which emotion the message is trying to elicit and how that might indicate foul play. While it may be obvious to you as an IT professional that an unexpected email provoking an urgent emotional or behavioral response, whether fear, obedience, greed or helpfulness, is an automatic red flag, to the average employee it likely is not.
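If you want to reinforce that habit with tooling, the same four-emotion framing can inform a rough first-pass screen on inbound mail. The sketch below is only an illustration, not a production filter: the cue phrases, the `EMOTION_CUES` table and the `flag_emotional_cues` function are hypothetical names chosen for this example, and real detection would rely on far more than keyword matching.

```python
# Illustrative sketch: surface messages that lean heavily on the four
# emotions attackers exploit (fear, obedience, greed, helpfulness).
# The cue-phrase lists below are examples only, not a vetted filter.

EMOTION_CUES = {
    "fear": ["compromised", "suspended", "immediately", "legal action"],
    "obedience": ["per the ceo", "wire transfer approval", "do not discuss"],
    "greed": ["you have won", "claim your reward", "inheritance"],
    "helpfulness": ["quick favor", "can you help", "verify my account"],
}

def flag_emotional_cues(message: str) -> dict[str, list[str]]:
    """Return any cue phrases found in the message, grouped by emotion."""
    text = message.lower()
    hits = {}
    for emotion, cues in EMOTION_CUES.items():
        found = [cue for cue in cues if cue in text]
        if found:
            hits[emotion] = found
    return hits

if __name__ == "__main__":
    sample = "Your account has been compromised. Act immediately or face legal action."
    print(flag_emotional_cues(sample))
    # {'fear': ['compromised', 'immediately', 'legal action']}
```

A hit is not proof of an attack; it is simply a cue to route the message for human review, mirroring the “stop and consider the emotion” habit you want employees to build.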
Regular employee awareness programs may seem daunting, but there are resources from the likes of the SANS Institute and other leading organizations to help guide you through social engineering training for employees.
Reminding your employees that they can forward any questionable emails and communications to the IT department can go a long way toward stopping a scam before it wreaks havoc. No matter the size of your organization, make sure you’re doing your part to ensure your colleagues are an effective line of defense against social engineering – and most importantly, don’t forget the psychology of it all.
Perhaps the most useful piece of advice is to stop, consider the request or correspondence with a clear head, and ask whether it could be part of something nefarious before proceeding.
BetterCloud provides critical insights, automated management, and intelligent data security for cloud office platforms, and is trusted by IT teams in over 50,000 organizations representing 32 million users around the world.