Federal consultations on AI regulations heavily skewed toward businesses, industry groups, say critics


Minister of Innovation, Science and Industry Francois-Philippe Champagne leaves after a media availability on legislation to modernize the Investment Canada Act, in West Block on Parliament Hill in Ottawa, on Dec. 7, 2022. Justin Tang/The Canadian Press

Critics of the federal government’s bill to regulate artificial intelligence and mitigate its potential harms say Ottawa’s consultations have been heavily skewed toward businesses and industry groups, even as AI applications could negatively affect wide swaths of society.

Innovation, Science and Economic Development Canada (ISED) held 216 consultations with businesses and industry representatives between June, 2022, and September, 2023, on Bill C-27, which contains the Artificial Intelligence and Data Act (AIDA). The bill also updates the country’s consumer privacy and data protection regime.

In contrast, ISED held 28 meetings with academia and nine with civil society groups such as the Canadian Civil Liberties Association.

The numbers were obtained and analyzed by Andrew Clement, an emeritus computer science professor at the University of Toronto, who requested the data from ISED.

“Given that AIDA is specifically aimed at protecting Canadians against a wide range of potentially harmful AI impacts, it is concerning that the representatives of those who might be affected, and especially those at risk, are so under-represented,” he wrote in a letter to the chair of the standing committee on industry and technology in the House of Commons, which is studying Bill C-27.

The federal government introduced AIDA as part of Bill C-27 in June, 2022. Some academics and critics have faulted the government for not conducting adequate consultations before introducing AIDA and contend the act lacks crucial details.

AIDA, which would come into force no sooner than 2025, would apply to “high-impact” AI systems. That includes algorithms used to make determinations related to employment, health care and content moderation on search engines and social media, among other areas. The most serious violations could result in fines of up to $25-million or 5 per cent of the offending company’s global revenue.

ISED spokesperson Hans Parmar said the department has tapped a wide variety of stakeholders to help shape Bill C-27.

“The ongoing commitment to engage on policy issues, including those related to privacy, AI, and the broader digital environment, have been integral to the department’s efforts, and have not been constrained to just those identified as part of this particular period of time,” he said.

Innovation, Science and Industry Minister François-Philippe Champagne has said it is crucial to pass AIDA swiftly to deal with the fast-paced nature of AI, while ISED has maintained its approach to crafting the act allows it to respond to new technological developments without stifling innovation.

Some critics, such as former BlackBerry co-CEO Jim Balsillie, have said that AIDA needs to be “scrapped completely” in part because it does not create a truly independent regulator for AI systems. Other academics and interest groups have said the act should be separated from the other components of Bill C-27 to allow for more thorough study and to not hold up the rest of the bill.

In the wake of criticism of AIDA, Mr. Champagne told the House of Commons industry committee in September that his office and department “have had more than 300 meetings with academics, businesses and members of civil society regarding this bill.”

According to Mr. Clement’s interpretation of the data from ISED, officials from the department met with Microsoft Corp., which has invested billions of dollars into OpenAI, 15 times. The Canadian Bankers Association and the Canadian Marketing Association each met with officials 12 times, while Google and Cohere – a Toronto-based competitor to OpenAI – participated in at least five meetings each.

The department recorded one meeting with OpenAI on May 15. Officials met with chief executive Sam Altman that day when he was in Toronto for an event, according to Mr. Parmar with ISED.

The department met with six different civil society groups, including the University of Toronto’s Citizen Lab and the International Civil Liberties Monitoring Group, for a total of nine consultations.

ISED held 39 meetings with government bodies, including 15 with the Office of the Privacy Commissioner and two with the Canadian Human Rights Commission.

“ISED, with its goal of promoting industrial development of AI, is inclined to emphasize the private sector over civil society,” said Renee Sieber, an associate professor at McGill University who researches civic participation around AI issues. “Given the potential negative impacts on society, that makes me question whether ISED was the right agency to draft this bill.”

Teresa Scassa, a law professor at the University of Ottawa, said the proposed groups of high-impact AI systems do not include any mention of the use of algorithms to screen people for housing and rental accommodations. “It seems to me that broader consultation with civil society might have surfaced something like that,” she said.

ISED’s list of meetings also contains workshops, panel discussions and forums, and it’s questionable whether these sessions should count as consultations, according to Mr. Clement. Excluding such meetings and those without an obvious connection to AI issues, he totalled 253 consultations as opposed to more than 300, as stated by Mr. Champagne. “This suggests that the minister’s statement may be exaggerated and hence misleading,” he wrote in his letter.

Mr. Parmar said that the department makes “significant efforts” to meet with stakeholders wherever they gather. “These events provide a key venue for robust engagements with stakeholders, including academics and civil society representatives,” he said.

Mr. Parmar also said that some of the meetings could be categorized differently. While Mr. Clement labelled Mila, the Quebec machine learning institute, as a business, “those engagements in fact included representation from a broad group of stakeholders and experts, including academics.”

Gillian Hadfield, a University of Toronto law professor and member of the government’s AI advisory council, said that Canadians should look at the substance of the proposed legislation and not just the number of meetings, which can distract from the serious policy issues at stake. “Lots of consultation can produce poor legislation,” she said. Low rates of consultation “can nonetheless lead to high-quality legislation if government is smart and takes its duties to the public seriously.”

Mr. Champagne proposed a number of amendments to AIDA in a letter to the industry committee in late November, partly to account for generative AI, which was not on the government’s radar when the bill was first introduced.

Under the amendments, companies building general-purpose AI systems that can produce text and other media would be required to “make best efforts” to ensure people can identify computer-generated content.

In instances where a person could mistake an AI system for a human being, companies would be required to disclose to users that they are interacting with AI, even if these applications are not considered high-impact under the legislation.

“It is pivotal that we pass AIDA now,” Mr. Champagne wrote. “We are at a turning point in the history of AI, both in Canada and globally, and the costs of delay to Canadians would be significant.”