Use Cyberpsychology to Know the Enemy; Gathering Dark Web Secrets
By Kim S. Nash
Oct. 15, 2018
Good day. Companies can boost cybersecurity defenses by understanding the psychology of online criminals, says Mary Aiken, who holds a doctorate in the little-known research field of forensic cyberpsychology.
Common artificial intelligence tools can pick out unusual activity such as repeated attempts to access a network without authorization, but companies lack tools for identifying signs that an employee could go rogue or that an outsider is preparing to attack, she tells WSJ Pro Cybersecurity’s Catherine Stupp. The first step is appreciating that “human behavior mutates online,” Dr. Aiken says.
Security researchers who prowl the Dark Web can help companies stay alert to burgeoning cyberattacks and the buying and selling of customer data, reports Jeff Stone. Wells Fargo & Co. and U.S. Bancorp say such services have helped them strengthen their security posture and sometimes avoid incidents.
Cyberpsychologist Says Companies Must Rethink Approach to Cyber Defense
By Catherine Stupp
Amid the endless analysis, there’s one dimension of cybersecurity that companies overlook: the role of psychology. Cyberpsychologist Mary Aiken looks at the internet in terms of human behavior, taking into account motivation and conduct, and comparing how they differ in the online and physical worlds.
Dr. Aiken, who holds a doctorate in the little-known research field of forensic cyberpsychology, introduced her theories to a broader audience when the TV show “CSI: Cyber” aired in 2015, featuring a character based on her. This was just before the rise of large-scale cyberattacks, such as last year’s WannaCry and NotPetya viruses. The show ended in 2016 and Dr. Aiken continues to raise alarm about the unique risks of the internet for incubating dangerous activity.
“Human behavior mutates online, driven by the power of anonymity, driven by the online disinhibition effect,” she said in a phone interview from Dublin. She is an adjunct associate professor at University College Dublin. The internet can make it easier for people with antisocial tendencies to find one another, facilitating and normalizing aberrant behavior, according to Dr. Aiken.
A leading expert in the psychology of cybercrime, Dr. Aiken has worked with an eclectic mix of clients, including the Federal Bureau of Investigation, European police agency Europol, and the cybersecurity-focused private-equity firm Paladin Capital Group.
The FBI declined to comment on its work with Dr. Aiken. Philipp Amann, head of strategy at Europol’s cybercrime center, said that Dr. Aiken advised the agency on analyzing factors that draw young people to online crime. “There are a lot of questions that we need to understand when it comes to combating cybercrime that require input from academics like Mary. How do people behave online? How is that different from their normal behavior?”
Internet brings out potentially bad behavior
Companies may use artificial intelligence to monitor suspicious activity, such as attempts to access their systems. But Dr. Aiken says that common AI tools don’t analyze human behavior for signals that crime is imminent. And that, she says, is critical to building effective cybersecurity defenses.
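The kind of AI-backed monitoring described above often starts from something far simpler than behavioral modeling: counting anomalous events. As a minimal, hypothetical sketch (not any specific vendor’s tool), flagging accounts with repeated failed access attempts might look like this:

```python
from collections import Counter

def flag_repeated_failures(events, threshold=5):
    """Flag accounts with an unusual number of failed access attempts.

    events: list of (user, outcome) tuples, where outcome is
    "success" or "failure". Returns the set of users whose failure
    count meets or exceeds the threshold.
    """
    failures = Counter(user for user, outcome in events if outcome == "failure")
    return {user for user, count in failures.items() if count >= threshold}

# Toy access log: one legitimate login, one burst of failures, one stray failure.
log = [("alice", "success")] + [("mallory", "failure")] * 6 + [("bob", "failure")]
print(flag_repeated_failures(log))  # prints {'mallory'}
```

A rule like this catches the repeated unauthorized-access attempts Dr. Aiken mentions, but, as she argues, it says nothing about motive or state of mind, which is the gap cyberpsychology aims to fill.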
It’s trickier to pick up on likely hackers before they commit crimes and, failing that, to identify them after they carry out attacks. Dr. Aiken says that’s where cyberpsychologists come in. There are international conferences and universities offering degrees in the field, but cyberpsychology hasn’t received wide attention outside of academia.
As cyberattacks become more frequent and damaging, Dr. Aiken warns that the biggest threats come from how the internet shapes human psychology, bringing out new patterns of criminal behavior.
Dr. Aiken views the internet as an environment in which users might be more willing to break the law because they feel there is no authority policing them, and it’s easy for them to find other people who want to commit crimes, making their own behavior seem common. That applies to employees who might commit cybercrimes as well as outside hackers, she said.
Disgruntled employee or hacker from North Korea?
When Dr. Aiken studies a hacker’s motivation, her theories often don’t line up with mainstream assessments.
She contends, for example, that an upset employee was most likely behind the 2014 hack of Sony Pictures Entertainment, and that the release of thousands of emails from the studio’s then co-chairman Amy Pascal was not the work of a North Korean hacker whom the U.S. government indicted earlier this year. Dr. Aiken said the fact that Ms. Pascal was targeted suggests the attacker was a staff member who sought revenge.
If North Korea had ordered the hack to prevent the studio’s release of the movie “The Interview,” as prosecutors alleged, the attacker would have instead published emails from the film’s actors or the director, she said.
“If you looked at who paid a price, Amy Pascal lost her job, that’s it. Then you work back,” Dr. Aiken said.
“She comes at it from so many different perspectives that we often just completely miss and overlook,” said Ralph Echemendia, an entrepreneur based in Estonia whom Dr. Aiken advised on the development of a security app, Seguru, that evaluates users’ behavior and informs them about cyber risks.
Mr. Echemendia said he drew on Dr. Aiken’s understanding of behavioral anomalies in designing the app’s algorithm. She also warned him that criminals might use the app to hide their identities online.
“There’s no doubt that Mary has been useful in helping us identify and reduce how even Seguru may be used inappropriately,” he said.
Applying psychology to get to the root of a cyber incident is new territory for many business leaders, but doing so, she said, could help companies sidestep attackers from both outside and inside.
She described categories of employees who may be insider threats, including people who ignore security rules, intentionally sabotage the company or collude with outside criminals.
Corporate security officers looking for sophisticated measures to counter such threats could hire cyberpsychologists to develop tests to monitor employees’ online behavior. They could build personality profiles using questionnaires and logs of their workplace internet activity and flag potential motivation to commit crimes, she said.
A tool backed by AI could measure these indicators and determine if they signal feelings of depression, anxiety, anger or guilt, she said.
Traditional threat exercises, in which employees pose as hackers to test their colleagues’ reactions to phishing emails and other attacks, fall short, she warned. Employees who only play hackers lack real motives and don’t fear being caught, she said.
“What you want is AI systems that are actually designed to model human irrationality,” she said. “There isn’t anything like that at the moment.”