The social psychology of online groups and how connectivity undermines order and brings out our worst instincts — and sometimes our violent sides.
As the authorities in New Zealand work to identify and cut off the drivers of last week’s mass shooting, they have moved to restrict access to guns, rebuked international white nationalism and pushed social media platforms to block hate speech.
But there is another force at play in this and in other, otherwise unconnected acts of violence and extremism around the globe. In a wide range of online groups and communities, there is a tendency — that seems to be growing — to drift toward hatred, which can eventually turn to targeted harassment and sometimes real-world violence.
These internet-based swarms have often been dismissed as little more than mischief-making by basement-dwellers. When their harassment becomes more serious, tormenting victims or even driving them from their homes, they are often written off as trolls. When one of their number turns to violence, the attack is attributed to ideology or to personal causes.
But research suggests that fundamental human psychology, as much as any ideological cause, can lead online groups to organize around some focus of hate — and can encourage the group’s members to make ever-escalating attacks, less for political ends than as a kind of performance of solidarity with the group and its ideals. Doing so, research shows, gives the participant a sense of strength and security and belonging.
Normal constraints on this behavior, such as social cues that tell us not to hate or the risk of consequences like social disapproval or punishment, tend to be much weaker online, letting those impulses run rampant.
It is a pattern that repeats, with chilling consistency, even in online communities dedicated to apparently anodyne topics like pop culture. A milder version of this dynamic will be familiar to any user of Twitter, where everyday debates often devolve into mass pile-ons and waves of derision, if not hate. Users who are polite in real life can be driven to rage at the target of that day’s collective ire. And even when grievances are legitimate, the response is often wildly out of proportion to the actual offense.
Those consequences are becoming harder to ignore, particularly for the victims: women driven from their homes by death and rape threats that emerged from online gaming groups; left-wing Brazilian politicians forced into exile by online mobs; bystanders killed last year by followers of the internet-based incel community.
Golriz Ghahraman, a lawmaker with New Zealand’s Green Party, who is the first refugee to be elected to the country’s Parliament, has been the target of such severe online harassment that parliamentary security has had to implement special measures for her protection, and at one point elevated the matter to Interpol.
“The online hate is violence in and of itself,” she told us. “But also a lot of the offline violence does grow from it.”
“We haven’t reckoned with that online culture at all.”
Mob Psychology Is Just Human Psychology
Humans are social animals; being part of strong communities makes us feel more secure. “Performing” group membership, by publicly espousing group values through words and actions, is a way for people to strengthen ties to the communities they value and prove that they are members in good standing.
Anyone who has spent time on social media has probably absorbed one insider dialect or another: slang and memes that serve as shibboleths, signs of who is in the know for a particular group. And that’s usually harmless — in-jokes about puppy gifs aren’t a threat to public safety.
But researchers have found that discussing an issue with like-minded people leads them to hold more extreme views about it. So if, say, a handful of people with moderate views on a political matter discuss the issue with each other, they are likely to emerge with much stronger and more radical opinions about it.
The result is that, over time, group beliefs and norms tend to become more extreme, a process known as group polarization.
And research also shows that vilifying outsiders and attacking the group’s perceived enemies is a way for people to signal loyalty to their group and show that they belong.
Together, those dynamics can quickly turn a group toxic, even if it’s a community formed around something that seems trivial or fun. Movie studios now routinely confront bouts of coordinated online harassment, with trolls recently targeting the female stars of “Captain Marvel” and “Star Wars: The Last Jedi.”
And the internet has brought another change that makes this group dynamic potentially far more dangerous. Isolated, alienated individuals can now come together online, forming communities around their rage at a society that they believe has cast them out.
The need to belong and to feel strong can exert a powerful pull, especially on angry young men eager to show off for one another. And few things tighten the bonds of community — or make the weak feel powerful — like attacking an apparent outsider.
“That’s one of the consequences of the internet,” said Michael Kimmel, a sociologist who runs the Center for the Study of Men and Masculinities at Stony Brook University. “Formerly these people felt isolated and alone, but now they find their mates, they find their people.”
The result, he said, is communities of “aggrieved entitlement,” where people bond over, and heighten, one another’s feelings of group-based grievance.
As members try to one-up one another’s shows of retribution against a perceived enemy, what starts as online trolling can escalate into something more severe.
In 2014, a group of video game enthusiasts became convinced that feminists were collaborating with game reviewers to undermine online game culture, a perceived scandal they called GamerGate. Discussion turned to anger, then rapidly escalated into a campaign of threats against female game designers and critics so severe that it drove at least one target from her home.
At the time, GamerGate seemed like a weird fluke, online culture gone awry. In retrospect, it was a harbinger.
Raw Group Impulse, Unconstrained
Ordinarily, friends and family set social cues that discourage this sort of bad behavior. Employers require compliance with broadly held standards for behavior and beliefs. Reputations matter and, if damaged, are difficult to repair.
But in closed online communities, where many members are anonymous and details of their offline lives remain hidden, there is less opportunity for those kinds of social pressures to come into play.
And our sense of right and wrong is set, perhaps more than we would like to admit, by social cues.
Betsy Paluck, a social psychologist at Princeton University, has found that people intuit their communities’ beliefs and standards through social cues, which powerfully shape an individual member’s own internal sense of morality.
In one study, Ms. Paluck found that schoolchildren base their views on bullying on what they think their classmates think. And they pay special attention to a handful of influential peers as a shortcut to figuring that out.
Offline, this development is often a steadying, moderating process. But in online communities, particularly those mobilized around fears or grievances, the influential figures tend to be those who post most frequently, with the most radical beliefs.
That can give people the impression that there is widespread support for the most aggressive behavior sanctioned by the group — sometimes including violence — and that they would win community approval by attacking others.
They are often right. Elliot Rodger, who identified with the online “incel” community of “involuntarily celibate” men who blame women for their misery, murdered six people and injured 14 others in Isla Vista, California, in 2014. He is lauded on incel forums as the “Supreme Gentleman.”
Growing Connectivity, Growing Consequences
Incidents of online harassment and outrage taken too far appear to be growing. There is an ever-accelerating pace of seemingly isolated incidents. On a message board, insults escalate to threatening calls against some member. Video game players make hoax phone calls about fake crimes, provoking real police, guns raised, to storm into homes. Or on internet forums awash in online conspiracy theories and hatred, a man takes matters into his own hands.
More socializing takes place online all the time, with ever more communities cut off from real-world norms and left to indulge self-affirming, self-escalating cycles of anger.
And as the groups grow in size and power, the consequences are escalating as well. Some political operatives have tried, with some apparent success, to harness these online mobs, ever eager for their next target, to their own ends.
In Brazil, there is a growing trend of far-right trolls deploying harassment and threats serious enough to force their perceived political opponents on the left into hiding or fleeing the country.
Jean Wyllys, the country’s first openly gay lawmaker, left the country in January amid online death threats, saying he had no plans to return. A month earlier, Marcia Tiburi, who had run for governor of Rio de Janeiro State, had fled for Pittsburgh under similar conditions.
In New Zealand, Ms. Ghahraman, the lawmaker, has been targeted by similar tactics. Online harassment is often orchestrated by paid operatives who also work for major political parties, she said. “But the culture they foster of course then feeds an organic, intensified racism that ends up affecting ordinary minorities and women online.”
Still, rights groups say, mainstream perceptions of the threat have lagged behind its impact, with communities of hate too often seen as raucous chat groups that do little more than troll.
That may now be beginning to change. “Yesterday, for the first time, a senior minister stopped and asked if I thought they could do anything,” Ms. Ghahraman said. “I’ve spoken about it a lot, but no one has ever stopped and said that’s actually not O.K. It should be stopped.”
“I take it as a sign that this terror attack has meant people are taking online harassment and racism seriously,” she said. “Now they see it as leading to actual violence.”