
Ngola lawsuit: Could Facebook be held responsible for hateful comments about a former N.B. doctor?

The case of a doctor who says he was driven out of New Brunswick by racism and bullying on social media points to a need for further regulation in an increasingly digital age, according to an expert in law and technology.

Dr. Jean-Robert Ngola said he was forced to leave his community of Campbellton, N.B., because of the shunning he faced after he was falsely accused of sparking a COVID-19 outbreak in May 2020. He now lives in Quebec.

Last week, he launched a lawsuit against the province of New Brunswick and the RCMP, alleging they colluded to investigate and blame him for the outbreak.


But in an unexpected move, the lawsuit also named Facebook as a defendant, alleging the social media giant — which recently rebranded as “Meta” — was responsible for the rapid spread of misinformation about his case and permitted hateful comments about him to flourish.


“I think it’s a really interesting lawsuit and I think it brings up a lot of the problems we see when it comes to addressing online harms,” said Suzie Dunn, an assistant professor in law and technology at Dalhousie University in Halifax.

Ngola, who is from Congo, said he was harassed over Facebook after New Brunswick Premier Blaine Higgs told a news conference that an “irresponsible” medical professional was to blame for the outbreak, which infected 40 people and killed two.


[Photo: Dr. Jean-Robert Ngola says he was driven out of his former community of Campbellton, N.B., by the racist abuse he faced after he was falsely accused of spreading COVID-19. Credit: Global News]

While Higgs did not name Ngola, the lawsuit said he had already been identified on social media when Higgs made those comments, so it was easy for the public to connect the dots.

The lawsuit alleges the doctor was doxxed, stalked, and mobbed by death threats and racist insults. According to the statement of claim, he received lynching threats, was called a refugee and told to go back to Africa.



“Facebook exacerbated the mistreatment of Dr. Ngola due to its reluctance to adequately monitor, stop, remove malicious and misleading content from the platform, in direct violation of its community standards,” said the statement of claim.

Ngola’s legal team is seeking three per cent of Facebook’s net worldwide profit, arguing that “punitive damages are required to catch the attention of the social media industry.” During the July-September period of 2021, the company’s revenue grew by 35 per cent to $29.01 billion.

Profiting from hate

While this is the first legal case of this nature in Canada that Dunn is aware of, people being harassed over Facebook is nothing new.

She pointed to the case of Gretchen Roedde, a northern Ontario physician who plans to close her practice due to the cyberbullying she faced after she was falsely accused by a local councillor of denying care to unvaccinated patients.

“This isn’t the first time that this has happened, where misinformation around COVID — which people are feeling very sensitive about — has led to somebody being driven out of town,” Dunn said.

“Especially doctors right now, who are so overworked, who are facing an incredible amount of pressure … and then you have the community that you’re in come after you over something that’s untrue. I can see how it would be incredibly intolerable.”



[Photo: Suzie Dunn is an assistant professor in law and technology at Dalhousie University in Halifax. Credit: Submitted by Suzie Dunn]

Dunn noted that — as was revealed during the Facebook congressional hearings in October 2021 — there is profit to be made from allowing hateful content to fester online, since profit is tied to engagement.

And what tends to attract engagement is “not neutral content, it’s extreme content,” said Dunn — whether it’s a controversial post that inspires bickering in the comments section, or extremely positive content like an engagement or pregnancy announcement.

In Ngola’s case, where so many people in the small community were interested in discussing the matter, that was bound to drum up some online engagement as well — with much of it turning nasty.


“You know these companies are profiting in some ways from some of the negative engagements that are happening on Facebook and Twitter and various companies,” she said.


“And so you are starting to see this question: at what point does content on Facebook or other social media cross the line from being something that shouldn’t be Facebook’s responsibility to manage, to them actually being an agent of harm?”

Dunn doesn’t believe Facebook should hold responsibility for “every mean or nasty comment” made on its platform, but she said when those comments cross into hate campaigns, “at a certain point I think social media companies have some responsibility to behave well and recognize their role in creating healthy dialogue on the internet.”


[Image: Examples of some of the racist comments made about Ngola over Facebook. Credit: Facebook/Submitted by Joel Etienne]

There isn’t much oversight in the online world, either. Canada has not yet passed substantial legislation to rein in powerful tech giants — though the federal government did propose a new Digital Safety Commission in July 2021, shortly before calling a federal election.

In Nova Scotia, the Intimate Images and Cyber-Protection Act is the first law of its kind in Canada. So far, it has been used to curb harassment by individuals, rather than to hold to account the companies whose platforms allow the cyberbullying to happen.


“When you think about the ways our laws are framed, it’s often like an individual to an individual,” said Dunn.

“When it comes to online harassment or cyberbullying outside of the criminal context, there’s not a lot of options for people.”


Dunn said that while any regulations or laws governing social media must be weighed against freedom of expression, more oversight is needed.

“People are recognizing that social media companies have an incredible amount of power, and them not regulating even what’s forbidden on their own content moderation policies is causing real, severe social harms that need to be addressed,” she said.

“Clearly, these large social media companies have not taken full responsibility themselves and aren’t engaging in adequate content moderation, and so at a certain point if they’re not going to do it themselves, we need government regulations to moderate some of this more harmful behaviour.”


Facebook/Meta declined to comment on the case when reached by Global News last week. “We can’t comment on pending litigation,” a spokesperson said in an email.


Dunn said she will be watching how the Ngola lawsuit plays out, and whether it has an impact on future regulation of big social media companies.

“It’ll be interesting to see. I think people are making innovative arguments,” she said, though she added that people in other jurisdictions have tried to sue the social media giant in the past without much luck.

“But I do think at a certain point someone’s going to make a creative enough argument, and a compelling enough argument, that social media companies will be responsible for tortious acts.”




© 2022 Global News, a division of Corus Entertainment Inc.
