Players and teams are developing strategies to spread awareness and limit their exposure to offensive users as social media continues to fuel abuse.
For three non-white players on England’s national squad, missing penalties in a crucial international football final was already a terrible experience. It got worse when they were hit with a barrage of racial abuse on social media. From monkey emojis to the N-word, the players endured ruthless humiliation.
What’s even more tragic? Everyone saw it coming.
Statement by Onuoha
“It’s stupid,” said Nedum Onuoha, a player of colour who spent 16 years in the top divisions of English and U.S. football. “But are we surprised?”
It’s the most recent iteration of racism: visible, persistently obtrusive, and available around the clock. In the age of social media, it’s a chilling throwback to the monkey chants and banana-throwing of the 1980s.
And on platforms where anonymity is the golden ticket for bigots, it is out of control.
“Every time it happens, it knocks you back and floors you,” Onuoha told The Associated Press. “Just when you think everything is OK, it’s a reminder that it’s not. It’s a reminder of how some people actually see you.”
Confronting online racism in football
The statistics compiled by Kick It Out over the past three seasons in English football confirm that racism is the most prevalent form of abuse reported on social media. Additionally, a FIFA report from last year highlighted that over 50% of players at the 2021 Africa Cup of Nations and European Championship received some form of discriminatory abuse on social media. Of the more than 400,000 posts analysed, over a third were racist in nature.
The issue lies in the lack of accountability and the ease with which individuals can engage in such abusive behaviour. With just a mobile phone, one can easily find a player’s social media handle and send a racist message without consequences.
Former Premier League striker Mark Bright, who experienced racial abuse inside stadiums in the 1980s, was discussing the match with friends in a WhatsApp group when England’s three non-white penalty takers, Bukayo Saka, Marcus Rashford, and Jadon Sancho, missed in the shootout loss to Italy in the Euro 2020 final. The wave of racist messages that followed underscored the urgent need to address online racism in football.
Statement by Bright
“We all messaged each other and said, ‘Oh God, here we go.’ Because we know what’s around the corner,” Bright told the AP. “That’s what we expected and this is where, once again, you say, ‘What can be done about it?’”
Football’s paradox: players of colour and online racial slurs
Despite facing racial abuse on social media, players of colour continue to use these platforms because of their essential role in marketing and connecting with fans. It creates a paradox: footballers are subjected to abuse on the very platforms they rely on for promotion and engagement.
Prominent players like Kylian Mbappe, who has a massive following of 104 million on Instagram and over 12 million on Twitter, and Vinicius Junior of Real Madrid, with 38 million Instagram followers and nearly 7 million on Twitter, have been targets of racial abuse. Even Bukayo Saka, who has more than 1 million followers on Twitter, has chosen to remain on social media despite the abuse he received after England’s loss in the Euro 2020 final and, more recently, a Twitter post that racially insulted him by depicting him as a monkey.
The persistence of these players on social media highlights the complex dynamics they face. While they endure online abuse, they recognise the importance of using these platforms to connect with fans, share their experiences, and promote their personal brands. It underscores the need for better measures to combat racism and discrimination on social media platforms to create a safer environment for all users, including football players.
AI filters empower players to combat online abuse
A business called GoBubble uses artificial intelligence (AI) algorithms to create filters that stop social network users from seeing offensive comments. Its customers span Europe and Australia, from the Premier League down to the fourth tier of English football.
Statement by Henry Platten
“Yes, tech has caused the issue,” GoBubble founder Henry Platten told the AP, “but tech can actually solve the issue, and this is what we are seeing as one of those pieces of the jigsaw.”
The company’s AI technology hooks into players’ accounts and, using a traffic-light system, scans for offensive and potentially damaging phrases, photos, and other messages so they can be blocked.
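GoBubble has not published its implementation, but the traffic-light idea can be sketched as a simple comment classifier. Everything below is a hypothetical illustration, not the company's actual system; the term lists and function names are invented placeholders:

```python
# Illustrative sketch of a traffic-light comment filter.
# Hypothetical; NOT GoBubble's actual system. Real deployments use
# trained AI models, not static word lists like these placeholders.

BLOCK_TERMS = {"slur1", "slur2"}      # placeholder "red" terms
REVIEW_TERMS = {"useless", "loser"}   # placeholder "amber" terms

def classify(comment: str) -> str:
    """Label a comment green (show), amber (hold), or red (block)."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    if words & BLOCK_TERMS:
        return "red"      # hidden from the player entirely
    if words & REVIEW_TERMS:
        return "amber"    # held back for human review
    return "green"        # displayed normally
```

Comments labelled red never reach the account holder, which matches Platten's framing of the tool as protection rather than censorship: the post still exists, but the targeted player is never shown it.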
“This isn’t about censorship, about sportswashing, or about creating that fuzzy world,” Platten said. “This is about protection, not just for the players and their families but also the wider fan community.”
Platten mentioned that several players who approached him had faced mental health challenges that affected their performance. Liverpool took a significant step in January by hiring a mental health consultant specifically focused on safeguarding young players from online trolling, making them the first Premier League club to do so.
FIFA and FIFPRO join forces
The governing bodies in football are also taking action. During the previous World Cup held in Qatar, FIFA and the players’ union, FIFPRO, implemented a dedicated moderation service that prevented players and their followers from encountering racist and hateful speech online. This service will also be provided for the upcoming Women’s World Cup.
In response to racist abuse, football authorities in England, including the Premier League, organised a four-day social media boycott in 2021 across platforms like Twitter, Facebook, and Instagram. This boycott gained support from various sports in England and was later embraced by FIFA and UEFA, the governing bodies for European football.
Despite these efforts, abuse persists on social media platforms, which have been accused of being slow to block racist posts, delete offenders’ accounts, and improve their verification processes. Critics argue platforms must respond faster, impose stricter penalties on offenders, and tighten user verification so that accurate identification is required and banned individuals cannot simply register new accounts.
“It needs to be regulated, you need to be accountable,” Bright said. “Everyone’s been complaining about this for a long time now. Some players have set up meetings with these social media companies. It seems to me that they’re not serious enough about it.”
Statement by Meta
“No one should have to experience racist abuse, and we don’t want it on our apps,” Meta, which owns Instagram and Facebook, said in a statement to the AP. “We take action whenever we find it and we’ve launched several ways to help protect people from having to see it in the first place.”
According to the statement, this includes “Limits,” which hides comments and DMs from users who don’t follow you or who have only recently started following you, as well as “Hidden Words,” which filters inappropriate comments and direct messages and is enabled by default for creator accounts.
“We know no one thing will fix abusive behavior,” Meta said, “but we’re committed to continuing working closely with the football industry to help keep our apps a safe place for footballers and fans.”
When the AP requested a comment, Twitter responded automatically with a faeces emoji.
According to Platten, platforms must strike a balance between keeping a sizable user base for financial reasons and appearing tough on racism.
“There’s always going to be a position where they may move closer to solving the problem,” he said, “but are never going to go the full hog that we all want them to, in terms of really cracking down and solving it.”
Promoting morality in online spaces
Some teams and players are choosing alternative platforms to promote not only their own brands but also more ethical online conduct.
These include Striver, a user-generated multimedia portal backed by Brazil’s 2002 World Cup winners Gilberto Silva and Roberto Carlos, and PixStory, a site with around 1 million members that rates users on the integrity of their posts and aims to build “clean social” by putting safety first in a way the major digital firms do not.
The women’s teams of Paris Saint-Germain, Juventus, and Arsenal all work with PixStory, whose founder, Appu Esthose Suresh, claims that teams and athletes are caught in a “Catch-22 situation.”
Statement by Suresh
“They want to live in this space because it’s a way to reach out and interact with their fans, but there’s not enough safety,” Suresh told the AP. “There is an alternative way — and that’s change the business model.”
Accountability and consequences
In the end, the biggest shift will probably come from legislation. Last month the European Union reached a provisional agreement on the Digital Services Act, which will require large tech corporations to better protect European users from harmful online content or risk fines running into billions of dollars. The British government has proposed the Online Safety Bill, with potential fines of up to 10% of a platform’s annual global turnover.
In the meantime, more people are being charged with crimes for online racist harassment. In March, the man who racially abused England striker Ivan Toney was banned from all British football stadiums for three years, in what police called a “landmark ruling.”
While Onuoha welcomed these developments, he continues to keep his social media accounts private.
“There will be lots of good people who won’t be able to connect with me, but it’s a consequence of not having enough trust and faith in enough good people being allowed to enter the account,” he said. “It’s the 1% who offset the entire experience.”