Canada Aiming at Improving Cybersecurity of Federally Regulated Industries Through Bill C-26

Canada recently began considering a new piece of legislation that seeks to strengthen the cybersecurity of businesses and organizations whose activities fall within the ambit of what the Federal government can directly regulate.

Interestingly, contrary to most Canadian legislation touching on cybersecurity so far, the focus this time is not on whether an organization collects, uses or discloses personal information. Rather, the bill at issue would cover whole swaths of certain industries, whether or not the organizations operating in them deal with personal information. This is a new approach in Canada which may signify that the government is finally realizing we collectively need to take cybersecurity more seriously, and that it is more than an issue of personal information.

Bill C-26 proposes to impose on telecommunication providers a new regime that would force them to adopt better cybersecurity practices, with a view to better protecting Canadians who rely on their services for things like cell phone and Internet services.

More generally, the bill would also empower the Canadian government to force federally regulated businesses to clean up their act (so to speak), cybersecurity-wise, especially when lax practices may jeopardize national security or public safety. As you may know, in Canada, federally regulated businesses include, for example, those that deal with:

  • radio, television and telecommunications, such as Internet providers;
  • air transportation, including airlines and airports; marine transportation, including ports, shipping and boats; as well as railways and road transportation services that cross borders;
  • banks;
  • certain forms of energy and their transport, such as pipelines.

Bill C-26 would allow the Federal government to require organizations operating in those areas to take cybersecurity more seriously, in particular when public safety may be involved. For example, this may allow the government to dictate that operators of pipelines better protect and monitor their computer systems, with a view to avoiding major catastrophes that may eventually result from cyber-attacks.

In addition to eventually requiring organizations in those industries to adopt and apply cybersecurity programs and to better protect their systems, C-26 would also require the organizations at issue to report cybersecurity incidents, something they are currently not generally required to do.

Bill C-26 is currently at the First Reading stage.

LaMDA: Simple Chat Bot or Ghost in the Machine?

The Internet and social media have been buzzing for a couple of days about a Google engineer who seems to think an AI program developed by Google and called LaMDA is sentient. Huh?

Interestingly, the engineer at issue has since been placed on administrative leave, leaving us to ponder this. Are we really already THERE?

Though the story makes for a good one, the overall feeling around the digital campfire is that, NO, we are not there yet. At this point, no AI is really capable of thinking for itself and coming up with mental leaps, ideas, preferences or opinions in a way that truly approximates what we do as humans. Sure, AI can make connections, but real ideas of its own? Feelings? Opinions? A personality? No, sadly, it seems all that, for the time being anyway, remains science fiction.

The reason I mention the story, though, is that if we keep collectively investing in AI, there may come a time when something does come out of it that approximates pretty well what it is to be a person. If and when that happens, we may have to reexamine how, legally, we define a person and what rights we may want to give such digital personalities. Though this may not be a real problem for a while yet, at some point it may very well become an issue we’re collectively forced to contend with.

Sure, for now keyboard conversations with chat bots like LaMDA are more like parlor tricks, but it may not always remain so. Shouldn’t we collectively start thinking about this eventuality, including as to how the law may want to handle it? This kind of story certainly raises the question.

Google Photos Class Action in Québec Derailed Off the Bat

The Québec Superior Court recently refused to authorize a proposed class action involving Google Photos and the alleged misuse of biometric data resulting from this Google service. In the decision at issue, Homsy v. Google (2022 QCCS 722), the court declined to authorize the proposed class action because the plaintiff failed to show he had even a mere colour of right. In short, he failed to demonstrate that he had a case or, rather, what could reasonably be considered a real case.

Legally, the out-of-hand rejection of this (proposed) class action stems from the requirement that any such proceeding in Québec must, at the very least, seem to hold water, if you will. To authorize the action, the court must be able to conclude, looking at the claim as presented, that if the alleged facts were true, the case would justify a Québec court indeed awarding the remedy requested by the plaintiff.

Though one might think this means anyone could sue this way simply by alleging X, Y and Z, it is not so, as that would force unfounded and/or unworthy proceedings on the Québec justice system, something we collectively definitely do not need.

Indeed, jurisprudence now teaches us that mere allegations in initial proceedings (to institute a class action) may NOT suffice to allow a class action in Québec to stand. In effect, simply alleging a bunch of suppositions and theories isn’t sufficient to introduce a valid class action before Québec courts. You need more; maybe not tons more, but more.

Thus, given the lack of even a modicum of evidence in the case at issue, the court agreed to throw it out (or, rather, refused to authorize the class action against Google); this case simply did not pass muster. As cases such as this one demonstrate, even though Québec rules generally seek to facilitate class actions (as compared to ordinary proceedings, anyway), you do need more than mere conjecture, theories, suppositions and inferences. If that is all you have initially (as was the case in Homsy), then the court may simply refuse to authorize your action; sorry.