“False Light” – Canada’s Newest Tort

A tort recognized by the Ontario Superior Court of Justice last month expands privacy protections for Canadians by adopting a well-established US cause of action.

 

Torts are an essential element of common law. A tort is a wrongful act or injury that leads to physical, emotional, or financial damage to a person, for which another person can be held legally responsible. Torts may be either intentional or unintentional (a tort may be caused by negligence, for example). Law in this area usually develops through legal precedent, as decisions by the highest courts expand the scope and application of a tort.

 

The Ontario case, V.M.Y. v. S.H.G., concerned cyberbullying of a particular nature: in an ongoing campaign of harassment, a father posted images, petitions, and videos of his ex-wife and her parents, along with comments that accused them of numerous illegal acts, including kidnapping, child abuse, assault, and fraud.

 

The tort, new to Canadian law, is that of “publicity placing a person in a false light.” The court found that “the wrong is in publicly representing someone, not as worse than they are, but as other than they are. The value at stake is respect for a person’s privacy right to control the way they present themselves to the world.”[1] 

 

This tort is already well-established in courts in the United States. Indeed, over recent years, three of four key US privacy-related torts have been adopted into Canadian law.  These common-law torts, originally catalogued by American jurist William L. Prosser, are:

  1. Intrusion upon seclusion or solitude or private affairs;
  2. Public disclosure of embarrassing private facts;
  3. Publicity which places the plaintiff in a false light in the public eye; and
  4. Appropriation, for the defendant’s advantage, of the plaintiff’s name or likeness.

The first three have been adopted in Canada in the following decisions:

Tort | Case | Jurisdiction and year
Intrusion upon seclusion | Jones v. Tsige | Ontario, 2012
Public disclosure of private facts | Doe v. ND | Ontario, 2016
False light | V.M.Y. v. S.H.G. | Ontario, 2019

 

All three Canadian decisions referenced Prosser’s work.

 

As I pointed out in my book Privacy In Design: A Practical Guide to Corporate Compliance (page 62):

 

Liability in these four US privacy torts often hinges upon whether the violation can be considered “highly offensive to a reasonable person.” It isn’t always easy to predict how broadly this category may be defined by a US court, but following prior judgements, certain trends can be noted:

  • “Highly offensive” actions include: snooping into mail, secretly recording conversations, illegally accessing financial records, disclosure of autopsy photos or medical data, and disclosure of debts.
  • “Not highly offensive” actions include: disclosure of reputation-enhancing information, union membership, non-embarrassing facts, minor injuries, or information that causes “minor and moderate annoyance.”

The court in V.M.Y. v. S.H.G. followed the US definition of the tort by making culpability hinge on whether the act in question is “highly offensive.” Justice Kristjanson wrote: “It is enough for the plaintiff to show that a reasonable person would find it highly offensive to be publicly misrepresented as they have been.”[2]

 

This case is a victory for privacy rights in Canada. Those who use the Internet to harass others have had their wings clipped; this case and the tort it recognizes expand the legal remedies available to their victims.

 

Privacy In Design: A Practical Guide to Corporate Compliance (2019) is available on Amazon: https://www.amazon.ca/dp/B07S9Q2CYZ/ref=cm_sw_em_r_mt_dp_U_fVonEbWHGC7ZS

 

 

[1] V.M.Y. v. S.H.G., [2019] O.J. No. 6702.

[2] Ibid.

Smart Privacy Auditing – An Ontario Healthcare Case Study

IPCS Smart Privacy Auditing Seminar

On September 13, Dr. Waël Hassan was a panelist at the Innovation Procurement Case Study Seminar on Smart Privacy Auditing, hosted by the Mackenzie Innovation Institute (Mi2) and the Ontario Centres of Excellence (OCE). The seminar attracted leaders from the health care sector, the private information and technology industry, and privacy authorities. It explored the concept of innovative procurement through competitive dialogue, and demonstrated the power and benefits of using artificial intelligence to automate the auditing of all PHI accesses within a given hospital or health network.

What are the benefits of participating in an innovative procurement process, particularly competitive dialogue?

An innovative procurement partnership between Mi2, Mackenzie Health, Michael Garron Hospital, and Markham Stouffville Hospital was supported by the OCE’s REACH grant and sought to identify an innovative approach to auditing that could be applicable to the privacy challenges faced by numerous hospitals with different practices, policies, and information systems. Rather than focus on how the solution should operate, the partners collaboratively identified six outcome-based specifications the procured audit tool would be required to meet.

By identifying key priorities and specifying the outcomes a solution should achieve, Competitive Dialogue establishes a clear and mutual understanding of expectations. This can help the private sector narrow down solution options to a model best-suited for the contracting authority’s unique context. The feedback loop provided by the iterative rounds (if used) enables vendors to clarify any confusion and customize proposals to the contracting authority’s unique needs, staff workflows, and policy contexts.

Competitive Dialogue is an opportunity for transparent communication that gives vendors the opportunity to learn the finer details of what the contracting authority, in this case Mackenzie Health, needs from a solution. Because hospitals are not tech or security experts, they often struggle to accurately identify and define the solutions they need for a particular issue; a traditional procurement process is rarely ideal, since it leaves little to no room for clarification or feedback. Competitive Dialogue is more flexible than traditional procurement and thereby allows for more creativity and innovative thinking during initial proposal development. Encouraging creativity, and creating a competitive environment in which vendors can bounce ideas off one another, results in higher-quality proposals and final solutions.

 Mackenzie Health Case Study

Mackenzie Health employs over 450 physicians and 2,600 other staff members, processes nearly 55,000 patient medical record accesses every day, and has just one privacy officer to monitor everything. Mackenzie Health’s privacy needs far outweigh its capacity, so they turned to the private sector for an innovative solution.

Section 37(1) of PHIPA outlines the possible uses of personal health information, and these guidelines are based on the purpose underlying the activities. Because the legal framework is centred on purpose, KI Design’s approach is to explain the purpose for accessing a given medical record. The core of this technology is more commonly known as an explanation-based auditing system (EBAS) designed and patented by Dr. Fabbri of Maize Analytics.

To detect unauthorized accesses, the technology identifies an intelligible connection between the patient and the employee accessing the patient’s records. AI changes the fundamental question underlying auditing tools from “who is accessing patient records without authorization?” to “for what purpose are hospital staff accessing patient records?” Asking this question helps the technology break down staff workflows and identify common and unique purposes for accessing any given medical record; each access is then categorized as either authorized or unexplained, and unexplained accesses may be flagged as potentially unauthorized behaviour. The technology filters out authorized accesses, usually 98% to 99% of all accesses, so that the privacy officer can focus on the much smaller number of unexplained and flagged accesses.
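
To make this concrete, here is a minimal sketch of how an explanation-based filter might triage access logs. The log fields, rule set, and names below are illustrative assumptions, not Maize’s actual implementation:

```python
# Minimal sketch of explanation-based triage of EMR access logs.
# All field names and rules are hypothetical; a real EBAS draws on far
# richer clinical and operational context (schedules, orders, billing).
from typing import Optional

def explain(staff_id: str, patient_id: str,
            treatment_pairs: set, appointments: set) -> Optional[str]:
    """Look for an intelligible connection between staff member and patient."""
    if (staff_id, patient_id) in treatment_pairs:
        return "treating clinician"
    if (staff_id, patient_id) in appointments:
        return "scheduled appointment"
    return None  # no explanation found

def triage(accesses, treatment_pairs, appointments):
    """Split accesses into explained (authorized) and unexplained (flagged)."""
    explained, flagged = [], []
    for staff_id, patient_id in accesses:
        reason = explain(staff_id, patient_id, treatment_pairs, appointments)
        (explained if reason else flagged).append((staff_id, patient_id, reason))
    return explained, flagged

accesses = [("nurse01", "patientA"), ("clerk09", "patientB")]
explained, flagged = triage(accesses, {("nurse01", "patientA")}, set())
print(f"{len(explained)} explained; {len(flagged)} for the privacy officer")
```

The point of the design is the inversion described above: rather than hunting for bad accesses directly, the system explains away the routine ones and leaves only the residue for human review.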

Why is the private sector interested in health care?

Health care is an extremely complex system operated by the province and service providers. The province specializes in governance and regulation, and service providers specialize in medicine; neither are experts in privacy or security. Companies such as KI Design are interested in filling this expertise gap by working in tandem with health care providers and the Information & Privacy Commissioner to adapt privacy and security solutions to providers’ working realities. There is undeniable value in having a privacy and security expert work directly with hospitals and other health service providers to refine privacy best practices and implement a privacy tool that improves privacy and security outcomes without restricting the workflows of health practitioners.

To learn more about how AI solutions improve auditing, visit https://phipa.ca/

 GDPR Responsibilities of Controllers and Processors


What are controllers and processors under the GDPR?

  • Controllers determine the purpose and means of processing personal data and are usually the collectors of the data. They need not be located in the EU. Controllers are additionally responsible for monitoring processors’ compliance.
  • Processors are engaged to process data on behalf of the controller.

Both controllers and processors are responsible and liable for compliance under the GDPR.

Responsibilities of Controllers

The primary responsibility of controllers is to data subjects. Controllers must also demonstrate their own compliance with the GDPR and ensure their processors’ compliance as well. Controllers outside the EU that regularly process personal data pertaining to people within the EU should designate a representative within the EU to help manage compliance.

 

Responsibilities of Processors

Processors are governed by a contract that addresses how data will be processed, how requests from data subjects will be fulfilled, and whether data will be transferred to any other geographical locations. The processor makes information available to the controller to demonstrate compliance and notifies the controller in the event of a breach. It is also the processor’s responsibility to ensure that authorization is given prior to engaging a sub-processor, and that all data is deleted or returned at the end of their service provision.

The GDPR introduces direct statutory obligations on processors, as well as severe sanctions for compliance failures. This is especially relevant for non-EU data processors, who need to be aware that if their clients are based in the EU, they are responsible for complying with the GDPR. Processors face the same risk of fines as controllers.

 

Required Data Protection Practices

  • Data protection by design and default

Protecting data by design means integrating data protection principles into the design of the systems that manage personal data. Protecting data by default means putting safeguards in place so that processing is limited to the data necessary for a given purpose.

  • Safeguards

Generally, it is recommended to put in place practices and technologies appropriate to the level of risk. Some of the best safeguards are quite simple: for instance, appointing a data protection officer and consulting with supervisory authorities on high-risk projects. Other examples include breach notification procedures and data protection impact assessments (DPIAs) for high-risk projects.

  • Breach notification

Breaches must be reported within 72 hours of discovery, unless the breach is unlikely to pose a risk to the rights and freedoms of data subjects. High-risk breaches must also be communicated to the affected data subjects without delay.

  • Documentation

Companies with 250+ employees and those that handle certain special categories of data are required to document: contact information, purpose of processing, categories of data, data transfers to other countries, timelines for erasure of different categories of data and, where possible, a description of technical and organizational security measures.
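
As a rough illustration, such documentation could be kept as a simple structured record. The field names and values below are our own hypothetical examples; the GDPR prescribes the content of these records, not their format:

```python
# Illustrative record of processing activities.
# Field names and values are hypothetical, not a prescribed GDPR format.
processing_record = {
    "controller_contact": "privacy@example.com",  # hypothetical contact
    "purpose_of_processing": "payroll administration",
    "categories_of_data": ["names", "bank details", "salaries"],
    "transfers_to_other_countries": ["Canada"],
    "erasure_timelines": {"bank details": "7 years after contract ends"},
    "security_measures": "encryption at rest; role-based access control",
}
print(processing_record["purpose_of_processing"])
```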

What Is The GDPR?


The GDPR is new legislation that replaces the EU’s Data Protection Directive, which had been in place since 1995. The arrival of the digital age means that the way people understand and interact with data is changing rapidly. The GDPR helps to clarify individual rights in the digital age, as well as creating a “digital single market” within the EU. With the GDPR in place, it will be easier for citizens to have control over their personal data, representing a shift in power.

The underlying principle of the GDPR is that the protection of personal data is a fundamental right, and organizations that handle personal data are responsible for those rights. “Processing” data means collecting, sharing, distributing, structuring, storing, or otherwise using an individual’s data. In this relationship, there are controllers and processors. A controller determines the purpose and means of processing personal data and is usually the collector of the data. Processors are engaged to process data on behalf of the controller, but the controllers are responsible for monitoring processors’ compliance.

The GDPR affects the North American market because any organization that offers goods or services to the EU or that monitors the behaviour of people within the EU is responsible for complying with the GDPR.

There are three key principles of the regulation:

  1. Limitation of processing means that: data must be processed only for specified, explicit and legitimate purposes; data must not be further processed in ways inconsistent with the initial purposes; data should be adequate, relevant, and necessary; data should be accurate and kept up-to-date; data should be kept only as long as necessary.
  2. Informed consent refers to freely given and clearly affirmative consent that must be intelligible, easily accessible, and written in plain language. Participants have the right to withdraw consent, and services cannot be withheld on condition of consent.
  3. Lawful processing requires that at least one of the following conditions be met:
    1. Consent from the data subject
    2. Processing is necessary for a contract
    3. Processing is necessary for compliance with EU laws
    4. Processing is necessary to protect a person’s vital interests
    5. Processing in the public interest or exercise of official authority
    6. Legitimate interests of the controller or a third party that are not overridden by the data subject’s rights and freedoms

Key Terms

  • Right to be Forgotten

This concept refers to personal data being deleted when the data subject no longer wants it to be processed. The exception to this is when there is legitimate reason to retain the data, for instance, in the case of completing a contract or complying with legal obligations.

  • Informed Consent

Information is made readily available and is communicated in clear, plain language. Informed consent will especially be enforced regarding services for children.

  • Right to Data Portability

Data subjects have a right to a copy of their personal data in an appropriate format and, where possible, they can transfer that data directly from one service provider to another. For example, individuals should be able to transfer photos from one social network to another.

  • Data Protection by Design and Default

This aspect helps protect users’ data by design, for instance by implementing technical safeguards like anonymization, pseudonymization, and encryption, as well as organizational safeguards; a short pseudonymization sketch follows this list.

  • Mandatory Data Protection Officer

A DPO fills the need for an organization to help monitor privacy and data protection. A DPO is an expert in their field, and is required if an organization’s core activities consist of regular and systematic monitoring of data subjects on a large scale. This position helps ensure compliance and awareness of privacy legislation. The DPO may also monitor internal data protection activities, train staff, and conduct internal audits. If data subjects have inquiries, these will go through the DPO as well.
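
Returning to the “Data Protection by Design and Default” entry above: a minimal pseudonymization sketch (our own illustration, not a GDPR-prescribed technique) shows the idea of replacing direct identifiers with keyed hashes, so records remain linkable without exposing identities:

```python
# Minimal pseudonymization sketch: replace a direct identifier with a keyed
# hash. The key must live in a separate key store; values here are made up.
import hashlib
import hmac

SECRET_KEY = b"keep-me-in-a-separate-key-store"  # hypothetical key

def pseudonymize(identifier: str) -> str:
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": pseudonymize("jane.doe@example.com"), "diagnosis": "flu"}
print(record["patient_id"][:16], "...")
```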

 

Company Response

Companies are responding to the GDPR in several ways:

  1. Stop buying and selling personal data
  2. Know where your clients live, or implement EU requirements regardless of location
  3. Prepare to respond to requests from data subjects
  4. Audit sub-contractors for compliance
  5. Reconsider cloud services

eDiscovery and Audits: The Solution to Unauthorized Access


Electronic medical records (EMRs) contain sensitive personal information that is strongly protected in many jurisdictions. In Ontario, EMRs are protected under the Personal Health Information Protection Act (PHIPA), which limits authorized access to professionals who are currently providing healthcare services to the patient or who otherwise have the patient’s consent to access their records. Despite PHIPA’s definition of and limits to authorized access, many of Ontario’s healthcare organizations (such as hospitals and clinics) operate open-access EMR database systems, meaning all healthcare staff have access to all records. Despite the responsibility of healthcare organizations to protect patients’ information from unauthorized access and distribution, the risk of unprofessional conduct is not well mitigated.

 

Unauthorized access to EMRs, colloquially termed snooping, constitutes the access and/or disclosure of private medical records without consent from the patient. Not all snooping is committed with malicious intent; in several cases in recent years, offenders have cited curiosity or genuine concern as the reason for the unauthorized access.

 

Regardless of the intention behind the act, snooping can have severe consequences for both the healthcare professional and the patient, even if the security breach is small in scale.

An offending healthcare professional could face loss of employment, relinquishment of their medical licence, a civil lawsuit, and a maximum fine of $50,000 for violating PHIPA regulations. As for the patient, the cases with the most severe consequences are usually smaller, isolated incidents involving social media and the malicious dissemination of private medical records. A 2014 case in Indiana offers such an example: a young woman’s positive STD test results were posted to Facebook by a former high school peer who was working as a nurse at the young woman’s local hospital. Needless to say, the reputational and emotional damage was irreversible.

 

While these cases are extremely serious, they are still relatively rare among healthcare-related lawsuits such as malpractice cases. However, the gradual increase in EMR privacy-related lawsuits, and the understanding that electronic records can be manipulated, have created a demand for reliable software tools that can efficiently identify, collect, and make accurate EMR data available to attorneys. This includes metadata such as the identities of the healthcare providers who accessed the records and the corresponding timestamps, both of which are very important for litigation. The practice of using these tools to find and freeze relevant data (meaning the data cannot be modified or deleted) is called eDiscovery.

 

Essentially, eDiscovery produces an “audit trail” from EMRs that is more accurate and reliable than simply accepting whatever records the healthcare provider’s attorney produces, as done in the past.

Using technology to sort and extract relevant data provides attorneys and healthcare organizations the advantage of being able to sift through more data faster. This is extremely useful given that the average Ontario hospital will generate approximately one million EMR access logs per week, or approximately 52 million in a given year.

 

However, challenges remain in assessing access logs and determining the purpose and validity of each. Some healthcare organizations use random audits to examine EMRs that have been accessed two or three times (or more) in a week; this method works like a spot check, identifying anomalies and catching offenders off-guard. Another approach is running regular conditions-based audits: if a given patient’s EMR access logs meet pre-determined criteria set by the healthcare organization, the logs are flagged for further examination. While this approach may seem more methodical and likely to identify questionable activity, it also tends to generate many false alarms. Both random audits and conditions-based audits make good use of available data analytics technology; however, both also require a level of human judgement and oversight that is not easily replaced by software.
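
As a sketch of the second approach, a conditions-based rule can be as simple as counting accesses per staff-patient pair per week and flagging those at or above a threshold. The fields and threshold below are hypothetical:

```python
# Illustrative conditions-based audit rule: flag (staff, patient, week)
# combinations whose access count meets a pre-set threshold.
from collections import Counter

def conditions_based_audit(access_logs, weekly_threshold=3):
    counts = Counter(
        (log["staff_id"], log["patient_id"], log["week"]) for log in access_logs
    )
    return [key for key, n in counts.items() if n >= weekly_threshold]

logs = [
    {"staff_id": "s1", "patient_id": "p9", "week": 12},
    {"staff_id": "s1", "patient_id": "p9", "week": 12},
    {"staff_id": "s1", "patient_id": "p9", "week": 12},
    {"staff_id": "s2", "patient_id": "p3", "week": 12},
]
print(conditions_based_audit(logs))  # [('s1', 'p9', 12)] flagged for review
```

A rule this blunt also illustrates why the approach generates false alarms: a care team legitimately treating a patient will trip the same threshold as a snooper.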

 

So, what can be done to reduce, prevent, and prosecute unauthorized access to EMRs? Ontario’s Information and Privacy Commissioner has identified several methods of minimizing the risk of snooping, including the development of a comprehensive training program on privacy responsibilities for healthcare providers, and mandatory confidentiality agreements that are required prior to healthcare providers gaining access to medical record databases. These are feasible options, but they are only as strong as each healthcare organization’s resolve to implement them. Implementing advanced analytic technology could offer a positive way forward in better understanding this behaviour and improving prosecution against offenders.

 

As software developers create new methods of working with big data and continue to push analytic software and artificial intelligence to the next level, new solutions to complex issues like EMR privacy violations will emerge. The human judgement and oversight still required by the aforementioned auditing approaches can be reduced, if not eliminated, as long as the new audit solution has sophisticated data-mining algorithms and well-integrated process triggers. KI Design’s Audit Solutions, powered by Maize, have superior data-mining and filtering abilities that not only identify suspicious behaviour but also connect employee data access with potential clinical or operational reasons for accessing the data. If an access record is flagged as particularly suspicious, the audit software triggers the risk assessment and incident response processes and provides relevant information to the healthcare organization’s privacy officer for manual review.

It is unnecessary for healthcare organizations and attorneys to remain dependent on manually sifting through EMRs and their access logs. Audit technology is more advanced and reliable than ever, and will likely play an important role in improving the eDiscovery process and leading to better outcomes in snooping lawsuits.

Artificial Intelligence and Privacy: What About?

Inference

How AI impacts privacy and security implementation

Big Data analytics is transforming all industries including healthcare-based research and innovation, offering tremendous potential to organizations able to leverage their data assets. However, as a new species of data – massive in volume, velocity, variability, and variety – Big Data also creates the challenge of compliance with federal and provincial privacy laws, and with data protection best practices.

Stemming from internationally-recognized Privacy Principles, data protection regulations tend to follow a specific format, focusing particularly on the collection, retention, use, and disclosure of personal information or personal health information, provided there is a specified purpose and explicit consent from the individual.

From a conceptual standpoint, the evolution of Big Data brings in a new element of analytical functions: inference – the extraction of new knowledge from parameters in mathematical models fitted to data – which captures commonly-known functions such as data linking, predictive analytics, artificial intelligence (AI), and data mining. Inference allows analysts to create new data based on reasoning and extrapolation, which adds a greater dimensionality to the information already in their possession.
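
A toy example of inference in this sense, using made-up data: a model is fitted to attributes an organization already holds, and its parameters are then used to derive an attribute that was never collected:

```python
# Toy inference example: parameters of a model fitted to held data are used
# to create new, derived data. All numbers and names are invented.
import numpy as np

ages = np.array([25, 35, 45, 55, 65])
annual_spend = np.array([1200, 1800, 2600, 3100, 3900])  # observed attribute

slope, intercept = np.polyfit(ages, annual_spend, 1)  # fitted parameters

# "New data" created by extrapolation, for a person whose spend was never collected:
inferred_spend = slope * 40 + intercept
print(round(float(inferred_spend), 2))
```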

From a corporate governance perspective, the addition of inference will impact how an organization complies with the above-mentioned Privacy Principles, and how it meets its legal obligations regarding consent, access, auditing, and reporting.

In terms of privacy practices, inference can be understood as both a collection and a use of data. For organizations to be compliant with data protection principles, inferences gleaned from data analysis must meet the requirements of applicable privacy laws. This means that new data created from the inference process must be collected and used for a specified reason and with the consent of the individual; it must be accurate; the data collector must disclose all collected information on an individual should they request it; and the data’s use, disclosure, and retention must be limited to the specified reason of collection.
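
One way to operationalize that requirement, sketched here under our own assumptions rather than as a prescribed mechanism, is to gate every inferred attribute through the same purpose-and-consent check applied to directly collected data:

```python
# Sketch: treat each inferred attribute as a fresh collection that must match
# a purpose the individual consented to. All names here are hypothetical.
consents = {"user42": {"credit_scoring"}}  # consented purposes, per person

def record_inference(user_id, attribute, value, purpose):
    if purpose not in consents.get(user_id, set()):
        raise PermissionError(f"no consent from {user_id} for '{purpose}'")
    return {"user": user_id, "attribute": attribute, "value": value,
            "purpose": purpose, "source": "inference"}

print(record_inference("user42", "predicted_default_risk", 0.12, "credit_scoring"))
```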

If inference is used to generate new data outside the original data’s specified purpose, the collecting organization will not be complying with privacy laws and could become subject to individual complaints and auditing from the Office of the Privacy Commissioner. So while inference can seem like the dawn of a new age in Big Data analytics, it is still restricted by privacy laws, and must be used only within the present data collection and use principles.

 

Social Media Analytics Drivers

By Aydin Farrokhi and Dr. Wael Hassan

Today, the public has remarkable power and reach: people can share news and express opinions about any product or service, or react to an existing state of affairs, especially regarding social or political issues. For example, in marketing, consumer voices can have an enormous influence in shaping the opinions of other consumers. Similarly, in politics, public opinion can influence loyalties, decisions, and advocacy.

While increasingly organizations are adopting and embracing social media, the motive for each establishment to use social media varies. Some of the key drivers for adopting social media include:

Economic drivers:

  • Market research and new product development
  • Need for better consumer insights
  • Need to gain competitive advantage
  • Need to improve customer service
  • Need to develop new products and services
  • Need to increase Return on Marketing Investment (ROMI)

Top strategic actions to maximize social media spend:

  • Improve ability to respond to customers’ wants and needs
  • Build social media measurement into marketing campaigns and brand promotions
  • Maximize marketing campaign effectiveness
  • Align social media monitoring capabilities to overall business objectives

 

Political drivers:

  • Public opinion research and new motto
  • Need for better public insights
  • Need to gain competitive advantage
  • Need to improve public relations
  • Need to develop new policies
  • Need to increase Return on Campaigning Investment (ROCI)

Top strategic actions to maximize social media spend:

  • Improve ability to respond to the public’s wants and needs
  • Build social media measurement into political campaigns and publicity promotions
  • Maximize political campaign effectiveness
  • Align social media monitoring capabilities to the overall political agenda

 

In general, there are three major categories of methods for analyzing social media data. These analytical tools can be grouped as Content Analysis tools, Group and Network Analysis tools, or Prediction tools.

 

 

 

Overcoming the Challenges of Privacy of Social Media in Canada

By Aydin Farrokhi and Dr. Wael Hassan

In Canada, data protection is regulated by both federal and provincial legislation, and organizations that capture and store personal information are subject to several laws. The federal Personal Information Protection and Electronic Documents Act (PIPEDA), which governs the handling of personal information in the course of commercial activities, came fully into force in 2004. PIPEDA requires organizations to obtain consent from individuals whose data is being collected, used, or disclosed to third parties. By definition, personal data includes any information that can be used to identify an individual, other than information that is publicly available. Personal information can only be used for the purpose for which it was collected, and individuals have the right to access their personal information held by an organization.

Amendments to PIPEDA 

PIPEDA’s compliance and enforcement mechanisms may not be strong enough to address the privacy challenges of big data. The Digital Privacy Act (also known as Bill S-4) received Royal Assent and is now law. Once its provisions are fully in force, the Privacy Commissioner can bring a motion against a violating company, which can face fines of up to $100,000.

The Digital Privacy Act amends and expands PIPEDA in several respects:

 

  1. The definition of “consent” is updated: the DPA adds to PIPEDA’s consent and knowledge requirement by requiring a reasonable expectation that the individual understands the nature, purpose, and consequences of the collection, use, or disclosure of their personal data. Children and vulnerable individuals receive specific consideration.

There are some exceptions to this rule: managing employees, fraud investigations, and certain business transactions, to name a few.

  2. Breach reporting to the Commissioner is mandatory (not yet in force).
  3. Timely breach notifications must be sent to the impacted individuals: the mandatory notification must explain the significance of the breach and what can be done, or has been done, to lessen the risk of harm.
  4. Breach record-keeping is mandated: records must be kept of all breaches affecting personal information, whether or not there was a real risk of significant harm. These records may be requested by the Commissioner, be required in discovery by litigants, or be requested by insurance companies to assess premiums for cyber insurance.
  5. Failure to report a breach to the Commissioner or the impacted individuals may result in significant fines.

Cross-Border Transfer of Big Data

The federal Privacy Commissioner’s position on personal information transferred to a foreign third party is that the transferred information is subject to the laws and regulations of the foreign country, and no contract can override those laws. Consent is not required for transferring personal data to a foreign third party. Depending on the sensitivity of the personal data, however, organizations may need to notify the affected individuals that their information may be stored or accessed outside of Canada, and of the potential impact this may have on their privacy rights.

Personal Information – Ontario Privacy Legislation

The Freedom of Information and Protection of Privacy Act, the Municipal Freedom of Information and Protection of Privacy Act, and the Personal Health Information Protection Act are the three major pieces of legislation that organizations such as government ministries, municipalities, police services, health care providers, and school boards must comply with when collecting, using, and disclosing personal information. The Office of the Information and Privacy Commissioner of Ontario (IPC) is responsible for monitoring and enforcing these acts.

In big data projects, the IPC works closely with government institutions to ensure compliance with the laws. In such projects, information collected for one reason may be used together with information acquired for another reason. If not properly managed, big data projects may be contrary to Ontario’s privacy laws.

7 Mandatory Breach Reporting Requirements and Examples — Ontario

Mark your calendars: on October 1st, 2017, mandatory breach reporting requirements kick in.

THERE ARE 7 SITUATIONS WHERE YOU MUST NOTIFY THE ONTARIO PRIVACY COMMISSIONER OF A PRIVACY BREACH

  1. Use or disclosure without authority: looking at the records of a family member, a celebrity, or a politician out of curiosity or with malicious intent. Limited exceptions: accessing a record by mistake, or mailing a letter to the wrong address.
  2. Stolen information: theft or loss of a laptop, tablet, or paper records, as well as exposure to malware or ransomware.
  3. Extended use or disclosure: following a reported breach, a sales company used records to market its products or services.
  4. Pattern of similar breaches: letters are repeatedly sent to the wrong address, or employees repeatedly access a patient’s record.
  5. Disciplinary action against a college member: a college member resigns, is suspended, or has their licence revoked following, or in connection with, a breach.
  6. Disciplinary action against a non-college member: resignation, suspension, or firing of an employee following or during a breach.
  7. Significant breach: the information is sensitive, the volume of information is large, a large number of individuals are affected, and more than one custodian or agent is involved.

Custodians will be required to start tracking privacy breach statistics as of January 1, 2018, and will be required to provide the Commissioner with an annual report of the previous calendar year’s statistics, starting in March 2019.

Bill S-201: The Genetic Non-Discrimination Act

Following a majority vote in the House of Commons passing the bill and final approval by the Senate, Bill S-201 received royal assent on May 4th. The bill, also referred to as the Genetic Non-Discrimination Act, is a preliminary step towards amending the Canadian Human Rights Act, as it aims to prevent discrimination on the basis of genetic history. The enactment of Bill S-201 also amends the Canada Labour Code, with the intention of protecting employees from involuntary genetic testing or disclosure of genetic test results.

What Bill S-201 Will Accomplish

With medical technology undergoing rapid change, genetic testing is predicted to become increasingly reliable in predicting illness. The enactment of Bill S-201 protects individuals from having to reveal the results of a genetic test as a precondition for receiving services or entering contracts and agreements. In simpler terms, service providers that require participants to undergo genetic testing will need to re-examine their current practices and bring them into line with the new act. Exempt from Bill S-201’s conditions are health care practitioners, such as physicians and pharmacists, and medical or pharmaceutical researchers in instances where the individual is a participant in a study. With the passage of the Genetic Non-Discrimination Act, goods and service providers, such as employers and insurance companies, that require genetic testing results will be in contravention of the Personal Information Protection and Electronic Documents Act (PIPEDA).

Bill S-201 is considered an important step for human rights and privacy in Canada, highlighting the significance of privacy protection for sensitive personal information. Given the current pace of genetic discoveries, which are poised to revolutionize the field of medicine and reveal individuals’ likelihoods of developing diseases, common public concerns include barriers to receiving insurance coverage, employment, and social acceptance. With the adoption of Bill S-201, insurance companies – which have generally required individuals to disclose lifestyle and genetic background – are prohibited from using genetic test results to set policy rates.

Opposition & Support

Opposing stakeholders such as insurance providers have challenged the legislation over fears that it is unfair, as it requires insurers to pay out claims for individuals with genetic predispositions to debilitating and fatal illnesses. During a vote in the House of Commons in March, Prime Minister Justin Trudeau urged MPs to reject the bill, declaring it unconstitutional. Notably, insurance regulation falls under provincial and territorial jurisdiction, which may give the insurance sector leeway to challenge the law in the Supreme Court of Canada on the grounds that it violates the division of powers.

Supporters of Bill S-201 have confidence in its ability to prevent genetic discrimination in the private sector and remove obstacles to comprehensive insurance coverage. Currently, restrictions to obtaining individual genetic testing results have been implemented in France while the United Kingdom has adopted an agreement that limits insurance providers in their use of genetic testing. As global concerns of personal privacy emerge, the arena for political discourse on these matters will undoubtedly be dynamic and populated with many stakeholders.