The Canada Border Services Agency (CBSA) is gearing up to introduce a facial recognition app aimed at tracking individuals ordered for deportation. Set to launch this fall, the app, called ReportIn, represents a shift in how the CBSA monitors those under immigration enforcement conditions. But while it promises efficiency, it also raises significant questions about privacy, consent, and the broader implications of facial recognition technology.
What Is the ReportIn App?
The ReportIn app is designed to streamline the monitoring process for individuals who have been ordered to leave Canada. The app will allow users to check in via their smartphone, using facial recognition to verify their identity and location. The data gathered will be sent to the CBSA’s system, enabling officers to track compliance more efficiently.
According to CBSA documents obtained through access-to-information requests, this initiative has been in development since 2021. With over 2,000 individuals failing to comply with deportation orders each year, the agency sees the app as an "ideal solution" to cut down on resource-heavy efforts to locate and detain non-compliant individuals.
How It Works: Biometrics and Location Tracking
The app will record a user’s facial biometrics and capture their location through the device’s GPS, confirming that they are where they claim to be when checking in. If the app cannot confirm a facial match, CBSA officers will investigate. According to the agency, the app will not track users continuously; it will collect location data only each time they check in.
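The reported check-in flow can be sketched as a minimal, hypothetical routine. Everything here is an assumption for illustration: the function names, the similarity score, and the threshold are invented, since the actual matching algorithm is a trade secret that the CBSA has not published.

```python
from dataclasses import dataclass

# Assumed similarity cut-off for a face match -- the real
# threshold and scoring method are proprietary and unknown.
MATCH_THRESHOLD = 0.90

@dataclass
class CheckIn:
    similarity: float  # score from comparing the check-in selfie to the enrolled template
    latitude: float    # one-time GPS fix captured at check-in, not continuous tracking
    longitude: float

def process_check_in(check_in: CheckIn) -> str:
    """Sketch of the reported flow: a confirmed face match records the
    check-in along with its location fix; a failed match is flagged
    for investigation by CBSA officers."""
    if check_in.similarity >= MATCH_THRESHOLD:
        return "recorded"           # compliance logged with location data
    return "flagged_for_review"     # triggers an officer investigation

print(process_check_in(CheckIn(0.97, 45.42, -75.70)))  # recorded
print(process_check_in(CheckIn(0.40, 45.42, -75.70)))  # flagged_for_review
```

The key point the sketch captures is that location is collected as a snapshot at each check-in rather than streamed continuously, and that a failed match routes the case to a human officer rather than producing an automatic enforcement action.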
The CBSA believes that this automated system will keep individuals engaged and aware of the agency's visibility into their cases, leading to better compliance. If someone fails to report, the data collected by the app will provide critical leads for locating them.
Why the Concern?
As with many uses of facial recognition technology, the ReportIn app has sparked concerns among experts and privacy advocates.
One of the key issues is the power imbalance between the government agency and the individuals being monitored. While the CBSA claims that participation in the app will be voluntary, some experts question whether true consent can be given when individuals are under immigration enforcement conditions. Petra Molnar, associate director of York University’s Refugee Law Lab, points out that this vast power differential may prevent individuals from fully understanding, or freely agreeing to, the app’s use.
Another major concern is the risk of bias and errors in facial recognition technology, particularly when it comes to racialized individuals. Research has shown that these systems can have higher error rates for people with darker skin tones. Kristen Thomasen, an associate professor at the University of Windsor, warns of the potential for serious consequences if errors occur, especially in a system where trade secrets and proprietary algorithms may prevent people from understanding how decisions about them are made.
The CBSA, however, has emphasized that the app will undergo continuous testing to assess accuracy and performance. Credo AI, a company specializing in responsible AI, has reviewed the software for demographic bias, and the agency claims a 99.9% facial match rate across six different demographic groups. Despite these assurances, critics note that no facial recognition system is infallible.
Privacy and Transparency: The Debate
One of the more controversial aspects of the app is that the algorithm behind the facial recognition technology is considered a trade secret. Experts like Thomasen argue that when decisions have life-altering implications, individuals should have the right to understand how those decisions are made. The use of proprietary algorithms may block individuals from gaining insight into how the app evaluates their data, potentially leaving room for errors to go unchallenged.
Additionally, while the CBSA says it worked closely with the Office of the Privacy Commissioner to develop the app, some experts remain skeptical that adequate safeguards are in place to protect individual privacy.
What Happens If People Don’t Use the App?
While the app is being promoted as a voluntary alternative to in-person reporting, individuals who do not consent to use ReportIn will still have the option to report in person at a CBSA office. The question remains, however, whether those under enforcement conditions will feel they can freely opt out without facing repercussions.
The Role of Technology in Immigration Enforcement
The CBSA's move to use facial recognition as part of its immigration enforcement strategy raises larger questions about the role of technology in government surveillance. While the app may provide the agency with more efficient ways to monitor compliance, the risks of errors and the potential for privacy violations are significant concerns that cannot be overlooked.
Even though CBSA officers will oversee all submissions and make the final decision on deportation cases, there is a well-documented psychological tendency, often called automation bias, for people to defer to the judgment of technology. As facial recognition becomes more integrated into everyday life, it is important to keep asking whether these technologies are being used in a way that is fair, transparent, and accountable.
Final Thoughts
The ReportIn app represents a significant development in the CBSA’s approach to monitoring deportations. On the surface, it offers a convenient and efficient way to track individuals who have been ordered to leave the country. However, the broader implications of using facial recognition technology — especially in high-stakes situations like immigration enforcement — cannot be ignored.
As the app rolls out this fall, it will be important to monitor its impact and ensure that it does not infringe on individuals' rights. The balance between innovation, efficiency, and human rights should remain at the forefront of this conversation.
Source: Global News