Crying “Fake News!” Does Not Make It True

Image-based evidence is an increasingly common feature of criminal prosecutions. This evidence may take the form of videos derived from CCTV, smartphones, body worn cameras, and in-car cameras, as well as still images. Occasionally, the defence will also tender image-based evidence as part of their case. What happens if the opposing party claims that the images have been altered or have been generated by AI? Is crying “Fake News!” enough to exclude tendered images? A recent Canadian case addressed this issue and provided some helpful guidance.

The Canadian Case

As cases go, R. v. Medow (December 11, 2025, Ontario Court of Justice, unreported) is unexceptional. What makes it interesting, though, is the defendant’s allegation that the police altered the video evidence that was tendered at trial. The defendant was charged with obstructing and assaulting a police officer in the execution of his duties. The officer was in the middle of an unrelated traffic stop when the defendant, who was merely a bystander and evidently had some mental health challenges, engaged in a physical altercation with the officer. Once other officers were called to assist, the defendant was taken into custody. The entire incident was recorded on the officer’s body worn camera and the in-car camera in his police vehicle.

The case for the Crown consisted of the evidence of the involved officers and the video recordings. The defendant was a very difficult litigant to manage, and the trial judge showed great patience in ensuring he received a fair trial. The defendant asserted, amongst other things, that there was a police conspiracy against him and alleged that the video recordings had been altered and were deepfakes. Though he said that he had evidence to prove this (or would find such evidence), no evidence of alteration or lack of authenticity was ever provided. The trial judge therefore had to determine how to deal with bare assertions of impropriety in the absence of evidence.

The Law

The admissibility of digital evidence in Canada is largely addressed by s. 31.1 of the Canada Evidence Act, which provides:

31.1 Any person seeking to admit an electronic document as evidence has the burden of proving its authenticity by evidence capable of supporting a finding that the electronic document is that which it is purported to be.

The Newfoundland and Labrador Court of Appeal, in R. v. Martin, 2021 NLCA 1, noted that the provisions of the CEA, first introduced in 2000, did not create a new standard for admission but rather simply codified existing common law. The majority decision noted that the threshold for the admission of authenticated “electronic documents” is low, in keeping with the general principle that relevant evidence in a criminal trial is admissible unless it is subject to an exclusionary rule or its probative value is outweighed by its prejudicial effect. The majority stated that proof of authenticity is not held to the beyond a reasonable doubt standard or even the balance of probabilities. Nor must the evidence be shown to determine or be capable of determining a finding of authenticity. Rather, as per s. 31.1, the evidence tendered need only be capable of supporting a finding of authenticity. Evidence led for this purpose can be direct or circumstantial in nature. In so ruling, the majority drew support from similar appellate rulings in Ontario (R. v. C.B., R. v. Colosie, R. v. Farouk), Alberta (R. v. Bulldog), Saskatchewan (R. v. Hirsch, R. v. Durocher), and New Brunswick (R. v. Richardson).

The majority was careful to note that satisfying the low threshold test for admissibility does not equate with a finding of genuineness. Those are issues for the trier of fact to resolve in the context of all the evidence presented by the parties. There is a significant difference between assessing evidence for threshold admissibility and using it in support of a conviction or acquittal. A comparatively lower standard is required at the threshold stage, where the tendering party must produce some evidence on the issue of authenticity but only enough to show that the evidence can support a finding that the electronic document is that which it is purported to be. It follows that even if evidence is admitted under the low standard for authentication, the prosecution must still prove that the images have sufficient integrity that they should be utilized by the court in making findings of fact.

Application to Case

At trial, the police officer who interacted with the defendant from the outset, and whose body worn camera and in-car camera video recordings were tendered as evidence, testified that the video recordings accurately depicted the event. The assisting officers testified that the video recordings accurately depicted the incident from the time they arrived and were shown on camera. All officers testified that they did not alter the recordings. The main officer testified that he uploaded the body worn video at a docking station and thereafter had no control over it. No evidence was led as to what happened to the recordings once uploaded to the police video repository, who had access to them, or how they were prepared for disclosure. It was evident that the face of the unrelated motorist was blurred out, but no evidence was led as to how this was done. This necessary redaction helped to fuel the defendant’s allegations of evidence tampering.

The trial judge agreed with the defendant that the existence and widespread proliferation of AI-generated video recordings was a proper subject of judicial notice. He further noted that the tendering party must establish that the video recording is reliable and fit for the tendered purpose.

[55]…I agree that the existence of deepfakes presents a potentially serious concern to the integrity of our justice system. However, the admission of digital evidence does not mean that its ultimate reliability should be presumed. By contrast, a rigorous analysis of the ultimate reliability of any type of digital evidence is always required. The admissibility standard associated with the authentication of digital evidence must never be confused with the Crown’s heavy burden to prove its case against an accused person beyond a reasonable doubt. Furthermore, a trial court must consider not only any disputed digital evidence when determining if the Crown has met its burden, but the totality of the evidence presented at the conclusion of the trial.

The trial judge was satisfied that the evidence of the officers was sufficient to not only authenticate the video recordings but also to establish that they were reliable and entitled to consideration by the court in the fact-finding process. Responding to the defendant’s allegation of image alteration, the judge said:

[59]…I acknowledge that Mr. Medow does not face any burden in this trial, and it was the Crown that tendered the video evidence. Nevertheless, mere speculation by an accused person that digital evidence has been falsified to deceive the viewer is insufficient to preclude its admissibility or to diminish its weight…

[61]…With respect to the BWC footage at the centre of this case, I cannot infer that the video was intentionally altered to falsely incriminate Mr. Medow, simply given the measures taken to protect the unknown motorist’s identity. Nor does Mr. Medow’s firmly held belief that the police in general have a motive to frame him provide the court a reason to disregard the content of the videos. The specific officers who testified provided credible and reliable testimony, demonstrating a strong, independent recollection of the offences they witnessed and confirming each other’s accounts regarding Mr. Medow’s acts of violence…

The Court was careful to note that some forms of digital evidence may require the Crown to present expert evidence to establish the authenticity and reliability of the evidence. Further, given the ability of generative AI to produce images that may deceive viewers, ‘…courts must ensure that the authentication voir dire required for digital evidence is not rendered meaningless’ (para. 73).

Commentary

Though Medow is not binding authority, it serves as a useful reminder of the burden the tendering party bears when submitting image-based evidence to the court for consideration. While the authentication threshold is a low evidential burden, the burden of establishing that the images have integrity and are reliable for deciding factual issues in a case is considerably higher. In this case, the Crown led the evidence (the officers’ testimony) that was necessary to meet both objectives. It was not necessary to lead evidence about what happened to the recordings once they were uploaded from the body worn camera or the in-car camera, absent any reason to suspect that improper alterations had occurred. Similarly, the redaction procedure would rarely require explanatory testimonial evidence. That said, there will be cases where this approach will not suffice and where technical or expert evidence will be required to dispel any perceived concerns about image integrity and reliability. In such cases, a proactive approach is far better than being reactive and perhaps ill-equipped to address these issues.

Where there is a legitimate concern about the introduction of images that have been generated by AI or have otherwise been altered, the necessity for expert evidence is clear. Two articles have been written on this site about American cases wherein AI-enhanced video was tendered by the defence in a criminal case (State v. Puloka) and deepfakes were tendered by the plaintiff in a civil case (Mendones et al. v. Cushman and Wakefield, Inc. et al.). The first case involved expert evidence while the second did not. Image alteration was also discussed in State v. Rittenhouse, also on this site.

Crying “Fake News!” is a typical response from someone who simply doesn’t like the truth. However, there will be cases where claims of generative AI and image alteration may have merit and should be fully investigated by people trained and qualified to do so. Similarly, a party tendering image-based evidence should not blindly assume that the images they rely upon are genuine. That should be confirmed in advance by people qualified to make that assessment. After all, the goal of a criminal investigation and prosecution should be to find the truth.


Discover more from Jonathan W. Hak KC PhD