The federal government should appoint a regulator with the power to force social media companies to disclose information to help fight far-right extremism, an anti-hate group told MPs Tuesday.
Evan Balgord, executive director of the Canadian Anti-Hate Network, said an ombudsperson could put more pressure on tech companies to do more to reduce online harms.
“The basic idea is that you have an ombudsperson, a regulator, a well-resourced one, with investigatory powers so they can kick down the door of Facebook and take their hard drives,” Balgord told members of the Commons public safety and national security committee studying “ideologically motivated violent extremism.”
“I’m being a bit hyperbolic here but we know that these platforms hide data from us and lie to us, so we do need broad investigatory powers to investigate them.”
Balgord said the regulator should be empowered to issue recommendations about the algorithms social media platforms use to engage with their audiences, and to take cases to court. He said platforms should face the threat of fines if they refuse to follow the regulator’s recommendations.
Balgord was one of three experts who testified before the committee on Tuesday. All three described the rise of far-right extremism in Canada, enabled by social media.
Balgord drew a direct line from anti-Muslim groups through the Yellow Vests Canada protests to the convoy protest that paralyzed downtown Ottawa for three weeks and blocked border crossings. He pointed to the Jan. 6, 2021 mob assault on the Capitol Building in Washington, DC, as an example of where such movements can lead.
“They’re not all racist, they’re not all violent,” said Balgord. “Not all people on January 6 were either. There were groups in those midsts that decided that they were going to try to do a coup and they swept up a lot of the other people there.
“The same thing is kind of happening here. We have more extreme elements of our far-right movement than others, but as a whole, they are becoming a threat to our democracy.”
Barbara Perry, director of Ontario Tech University’s Centre on Hate, Bias and Extremism, said the convoy protest showed “the risks and threats associated with the right-wing movement in Canada.”
Perry said the convoy protest demonstrated a capacity to organize on a large scale through encrypted and unencrypted social media platforms.
“That was the venue through which they were able to display this adeptness that they really have in terms of their ability to exploit the broader popular concerns, grievances, anxieties, and weave them into their own narratives,” she said.
Perry called for better law enforcement intelligence, saying police failed to properly evaluate the nature of the convoy protest. She also pointed out that some officers donated to the convoy or shared online conspiracy theories and misinformation.
Wendy Via, co-founder of the US-based Global Project Against Hate and Extremism, told MPs that social media platforms are major drivers of hate speech and conspiracy theories and called on the government to hold them to account.
“The United States, Canada and many other countries are currently awash in hate speech and conspiracy theories like QAnon, anti-vax, election disinformation and the Great Replacement, spreading on poorly moderated social media,” she said.
Via said American militia groups have established themselves on both sides of the border and people like former US president Donald Trump have “legitimized hate and other extremist ideas.”
“Research shows that Trump’s campaign and politics galvanized Canadian white supremacist ideologies and his endorsement of the trucker convoy, along with media personalities like Tucker Carlson, undoubtedly contributed to the influx of American donations to the trucker siege,” she said.
Representatives of Facebook’s owner Meta, meanwhile, told the committee that it monitored groups and accounts related to the truck convoy 24/7 once the convoy began and did not see hate speech or violent content in association with the protest.
“We did not see dangerous organizations, a significant amount of dangerous organizations and individual involvement in the convoy blockade and protest in Canada,” said David Tessler, public policy manager for Meta.
Rachel Curran, public policy manager for Meta Canada, said some content that violated Facebook’s community standards was removed but Facebook users are allowed to criticize the government online.
“Expressing opposition to government mandates is not against our community standards and so we allow that on our platforms,” she said.
Michele Austin, Twitter’s director of public policy for Canada and the US, said her company also monitored the truck convoy protest.
“We knew when it was arriving in Ottawa, we knew when it was taking place in Alberta and we exercised and enforced our rules where it was appropriate,” Austin told CBC News after the committee hearing.
Austin said Twitter received reports from users, and convoy organizers were also discussing their plans openly on Twitter Spaces.
Tuesday’s hearing came as speculation swirled over how billionaire Elon Musk’s decision to buy Twitter and his pledge to promote free speech could change the social media platform.
Austin told MPs it is too early to know what might change and it could take months for Musk’s purchase of Twitter to go through.
Both companies defended their actions related to extremism, saying they have invested money and hired staff to watch for it on their platforms. Curran said that, for example, 250 white supremacist groups have been banned from Facebook and the company works with law enforcement and intelligence agencies.
Curran said less than $10,000 was raised for the convoy protest on Facebook.