Culture Minister Marc Miller rises during question period on Parliament Hill in Ottawa on Tuesday, March 24, 2026. THE CANADIAN PRESS/Adrian Wyld

Online harms bill should cover AI chatbots, say some on Ottawa’s advisory group

Apr 20, 2026 | 10:43 AM

OTTAWA — Some of the experts Ottawa has tasked with giving it direction on the upcoming online harms bill say the legislation should cover AI chatbots, while opinion on the idea of age restrictions for access to social media is more varied.

Emily Laidlaw, a law professor at the University of Calgary and a member of Ottawa’s online harms advisory panel, said she doesn’t see how the government can reintroduce online harms legislation without addressing a technology that is “facilitating some of the most harm to vulnerable adults and children.”

In March, the government reconvened an expert group it previously consulted on an earlier iteration of that bill, which did not become law before last year’s election was called.

Since then, safety issues linked to artificial intelligence-based chatbots and the idea of age restrictions for social media have both emerged as global political issues.

In Canada, questions have been asked about how the person behind the mass shooting in Tumbler Ridge, B.C., used OpenAI’s ChatGPT. The 18-year-old shooter was banned from using ChatGPT due to worrisome interactions, but OpenAI did not alert law enforcement, and the shooter got around the ban by using a second account.

Culture Minister Marc Miller said last week the government is “very seriously” considering a social media ban for kids, and would leave it to the expert group to weigh in on whether the bill should cover AI chatbots.

Other members of the expert panel — including Taylor Owen, founding director of the Centre for Media, Technology and Democracy at McGill University — agreed the new bill should cover AI chatbots.

Lianna McDonald, panel member and executive director for the Canadian Centre for Child Protection, said in a media statement it would be “short-sighted” not to account for AI chatbots in a future online safety regime.

But opinions about the idea of age-restricting access to social media are more varied among the experts the government is consulting.

McDonald said the Canadian Centre for Child Protection “unequivocally supports a social media delay for children.” She said there is “endless evidence” showing children are experiencing developmental harm.

“Children don’t stand a chance on their own in the face of these completely unregulated and extractive environments as they are simply not developmentally ready. We all have the responsibility to protect them just as we do from tobacco, alcohol, R-rated films and gambling,” McDonald said in the statement.

Owen said the government should consider a temporary age restriction on access to social media that would remain in place until a regulator is set up and companies show they are in compliance.

“And if they do that, then they can get access to that market again. That’s sort of a more balanced approach to this, rather than an outright ban forever,” he said.

Owen said there is an “incredible public appetite” for age restrictions which “represents a legitimate frustration, particularly from parents, that the government or the companies have not addressed these challenges that they’re facing on a daily basis.”

The government’s move to reintroduce online harms legislation comes after experts and advocates urged the Liberals to bring back the bill. Last November, a coalition of child advocates and medical organizations said the dangers children face online constitute a national emergency. They said children are being exploited, extorted and bullied, and in some cases have died as a result of online harms.

In December, Australia became the first country to pass a law enforcing age limits on social media accounts, and other countries are now considering or have put in place legislation that sets a minimum age for social media use. Earlier this month, Liberal party members voted in favour of such a ban at the party policy convention.

Laidlaw said the popular consensus is in favour of social media bans “and so I think, realistically, that ship might have sailed, right?”

Social media interaction is important for youth and there is a big difference between banning access for kids under 13 and extending the ban to cover all kids under 16, she said.

“I’m not against some form of kind of age-gating for much younger kids, but only if it’s done carefully,” Laidlaw said.

Panel member Vivek Krishnamurthy, an associate professor of law at the University of Colorado and an associate member of the University of Ottawa’s Centre for Law, Technology and Society, said an age restriction would be the wrong approach.

Kids and teenagers are able to get around restrictions, he said — and the harms that social media causes affect adults as well.

“My argument is that we should be regulating the design characteristics of these platforms that are harmful, such as … algorithmic curation that is designed to maximize engagement and time spent on the platform,” he said.

“We should make social media a better place for all of us.”

Krishnamurthy said there may be challenges associated with extending legislation designed for social media to a chatbot because the harms can be different, citing cases of users developing an emotional dependence on a chatbot.

He also warned of potential problems if the bill targets AI chatbots specifically, since the capabilities found in chatbots “are becoming much more broadly available and much more easy to integrate into existing applications.”

“We need to think about, for example, what happens if the AI-generated features in Microsoft Word start misbehaving,” he said.

This report by The Canadian Press was first published April 20, 2026.

Anja Karadeglija, The Canadian Press