
Canada battles AI-generated fraud in asylum applications

The Government of Canada website, including its immigration and citizenship page, is displayed under a magnifying glass in Montreal, Canada, Nov. 17, 2017. (Adobe Stock Photo)
April 07, 2026 10:26 AM GMT+03:00

Canadian federal officials say artificial intelligence is being used to generate fake information in immigration and refugee applications, even as government agencies deploy the same technology to detect and prevent fraud.

Both Immigration, Refugees and Citizenship Canada (IRCC) and the Immigration and Refugee Board (IRB), an independent tribunal that rules on asylum applications, say they have identified cases in which AI was used to fabricate details in submissions.

"We have observed instances where AI has been used to help generate fraudulent applications," IRCC spokesperson Isabelle Dubois told the Globe and Mail.

"As we work to detect and prevent fraud, publicly sharing these specific examples could inadvertently help fraudulent claimants identify alternative methods to circumvent detection."

The IRB said the trend poses a new challenge for its staff: appeal submissions are growing longer, but the added volume does not necessarily strengthen the arguments, officials said.

In some cases, submissions have cited nonexistent court decisions or referenced legal precedents for positions they do not actually support.

"This adds unnecessary complexity and time to our work," the IRB said in a statement.

Ghost consultants go digital

Toronto immigration lawyer Max Berger said AI is on track to become what he described as "the new ghost consultant" in asylum cases.

Ghost consultants are individuals who generate fraudulent documentation or narratives on behalf of claimants, bypassing legal requirements.

"Currently, ghost consultants who make up stories for some claimants are the scourge of the refugee determination process," Berger told the Globe and Mail in an email.

"Instead of paying ghost consultants, the minority of refugee claimants trying to game the system can now ask AI to make up a history of persecution for them at no cost."

Berger said oral hearings, where IRB members can directly question applicants, remain the most reliable safeguard.

"The antidote is in holding an oral refugee hearing where credibility is tested by the IRB board member," he said.

Foreign nationals confirmed to have used misrepresentation or fraudulent documents face a five-year ban from entering Canada.

The Canada Border Services Agency, IRCC, and the Royal Canadian Mounted Police (RCMP) jointly investigate immigration fraud.

A visa applicant submits a passport and application form to a consular officer at an embassy office in an undated stock photo. (Adobe Stock Photo)

Government also turning to AI

At the same time, IRCC's AI strategy, published earlier this year, outlines how the department is using machine-learning tools to detect false narratives, spot anomalies in applications, and flag irregular travel patterns that may indicate a claimant misrepresented their country of origin.

AI systems have also been trained to identify manipulated documents, including academic records and bank statements, and detect artificially altered photographs that could be used for identity fraud.

Both IRCC and the IRB said AI is not used for adjudicative decisions, such as whether someone should be permitted to remain in Canada.

The IRB's departmental plan for 2026–27 said it intends to introduce tools to support faster file preparation and streamline decision-making, while keeping human decision-makers in place.

The tribunal already uses speech-to-text transcription of refugee hearings and AI-generated draft summaries of Federal Court decisions, which are reviewed by paralegals or lawyers before being finalized.

The scrutiny over AI in the immigration system comes as IRCC faces broader criticism over anti-fraud controls.

Canada's auditor general, Karen Hogan, concluded last year that the department's fraud detection had "critical weaknesses" after it failed to investigate more than 149,000 international students flagged for not complying with study permit terms.

In 2024, the Federal Court issued a policy directive requiring lawyers and litigants to disclose any use of AI in court submissions, including immigration cases.
