NEWS

CYBER AND PRIVACY LAW, MAY 2023

In the online legal news magazine SLAW on May 31, 2023, authors Sharon D. Nelson, John W. Simek and Michael C. Maschke wrote about the theft of data by unscrupulous employees or former employees of law firms. The data is stolen mainly by downloading documents and slipping them into a personal Dropbox account. The article, which discusses various shenanigans perpetrated by lawyers at law firms, can be found here: https://www.slaw.ca/2023/05/31/law-firm-employees-allegedly-misbehaving-make-headlines/  

The Office of the Privacy Commissioner of Canada (OPC) recently announced the findings of its investigation into the erroneous COVID-19 quarantine notifications generated by the ArriveCAN app. See the following web link: https://priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-federal-institutions/2022-23/pa_20230529_arrivecan/  The announcement made the following observations and conclusions:  

“On June 28th, 2022, version 3.0 of ArriveCAN was released. An error in this version caused approximately 10,000 fully vaccinated Apple device users to receive erroneous messages to quarantine, despite respecting all the conditions of the quarantine exemption for fully vaccinated travellers. CBSA indicated that it identified the defect on July 14th, 2022 and resolved it on July 20th, 2022. 

Given that the information and instructions generated by ArriveCAN were inaccurate for certain Apple device users, the complainant alleges that the CBSA had failed to take all reasonable steps to ensure that the personal information used to determine an individual’s quarantine requirements was as accurate as possible. 

Ultimately, we found that the CBSA did not meet the requirements of the Privacy Act, as it did not take all reasonable steps to ensure the accuracy of the information that it used for an administrative decision-making process. Accordingly, our Office finds that the CBSA failed to respect its obligations under subsection 6(2) of the Privacy Act, and this complaint is therefore well-founded.” 

In spite of the thorough review of the programming errors in the ArriveCAN app, CBSA has refused to follow the strong recommendations made by the OPC. The announcement states further: “CBSA disagreed with our finding that it failed to take all reasonable steps to ensure accuracy. It also refused to implement our recommendation to correct the inaccurate and sensitive information it holds for the affected travellers concerning quarantine status. We call on CBSA to reconsider its refusal to correct the erroneous data generated by the ArriveCan error and to put in place all necessary measures should it decide to proceed with similar tools in the future.” 

Part 3 of Bill C-27 includes draft legislation entitled the Artificial Intelligence and Data Act (AIDA). One glaring issue arising from the current wording is that the government has legislated itself out of the application of the Act. The provisions stipulate that government institutions, as defined in the Privacy Act, are not entities to which AIDA will apply. If this remains the case once the bill receives royal assent, one of the possible issues in the future will be what legal standard the government will have to meet in terms of accuracy and decision-making in its use of AI, since the private sector will have to abide by the provisions of AIDA. 

Further, there also appears to be the issue of whether the definition of privacy in one’s data necessarily includes “negative inferential analytics.” One article that discusses this new legal issue is the well-researched piece by Sandra Wachter and Brent Mittelstadt entitled “A RIGHT TO REASONABLE INFERENCES: RE-THINKING DATA PROTECTION LAW IN THE AGE OF BIG DATA AND AI,” which appeared in the Columbia Business Law Review, Vol. 2019, Issue 2, at pages 1-130.  

The authors write at page 84 as follows: “As the novel risks of automated decision-making and profiling suggest, these systems disrupt traditional concepts of privacy and discrimination by throwing the potential value and sensitivity of data into question. A question thus becomes apparent: Are the fundamental aims of data protection law still being met in the age of Big Data, or is a re-alignment of the remit of data protection required to restore adequate protection of privacy?” 

If a person has a right to reasonable inferences made from their own data, what about the situation where errors are made? What legal recourse does the individual have? Perhaps the individual has a right to reject, delete or rectify incorrect inferences. The authors argue at page 57 as follows: “However, inferences can also be probabilistic assumptions that cannot be verified currently, or perhaps ever. While some inferences can be verified through “ground truth,” for example by asking the data subject whether her predicted income range is correct, others are inherently subjective (e.g. the data subject is a “high-risk borrower”) or predictive (e.g. the data subject will apply for a mortgage within the next two years) and thus cannot be verified as such.” The authors state further that: “Some argue that only data that can be verified counts as personal data and thus falls within the scope of the right to rectification, excluding unverifiable inferred data.”

One example of AI implementation by the government is in immigration application processing. If a visa officer makes a negative inference that leads to a visa refusal for a foreign national, it would appear the only recourse is to apply for leave and judicial review at the Federal Court and argue “reasonableness,” applying the SCC decision in Vavilov to the particular facts of the case. The fact that AIDA will not apply to government institutions such as IRCC will make it problematic for a foreign national to have their case heard in the Federal Court, given the threshold for obtaining leave in the first place.  

David H. Davis of Davis Cyber Law specializes in strategic risk management, incident response, privacy & data protection, and advocacy. He can be reached by email at david@daviscyberlaw.com or by telephone at 204-956-2336. We are also on the web at www.daviscyberlaw.com