Categories: Privacy

Best-Practice Data Transfers for Canadian Companies – I – Outsourcing


In our digitally interconnected world, most organizations that handle personal information will transfer it to a third party at some stage of the data life cycle. Your company may send personal information (PI) to an external service provider such as PayPal to process customer payments – that’s a data transfer. Perhaps you hired a data disposal company to destroy data at the end of its life span – that’s a data transfer. Your company may outsource payroll – that means you’re transferring employee data. Any sharing or transmitting of data, electronic or hard copy, is considered a transfer.

But remember: all transfers of personal information must be compliant with the Personal Information Protection and Electronic Documents Act (PIPEDA) and any relevant provincial and sector-specific privacy laws. So, be aware that the many business advantages of data outsourcing are offset by increased security risks, as we’ll see below. And when PI flows into another jurisdiction, the situation becomes more complex.

The key take-away is this:

When you transfer personal information, even if it passes into another jurisdiction, you retain accountability for its care.

A common type of data transfer is outsourcing: handing over aspects of the provision and management of data computing and storage to a third party. A cloud database managed by a third party is a common example of data outsourcing. Within a data outsourcing design, data sets are often stored together with an application – this connects to an external server, which can then assume data management.

There are many advantages to delegating a business process to an external service provider; these can include efficiency, lower labour costs, and enhanced quality and innovation. (Data processing is often outsourced offshore, to foreign businesses: this raises other issues, which are addressed in Part II: Cross-border Data Transfers.)

However, data outsourcing brings its own challenges and security risks. Can you guarantee that your data processor will not misuse the data in its care? Can you ensure that access controls will be enforced, and policy updates supported, by your processor? Will the processor commit to as rigorous a Privacy Framework as your company has?

The greatest danger with data outsourcing is the risk of a security breach. According to Trustwave’s 2013 Global Security Report, in 63% of global data breach investigations, “a third party responsible for system support, development and/or maintenance introduced the security deficiencies exploited by attackers.”[i] Patrick Thibodeau, senior editor of Computerworld, stresses that companies utilizing the advantages of data outsourcing “need to go through an exhaustive due-diligence process and examine every possible contingency.”[ii]

Encrypting the data to be outsourced can prevent both outside attacks and inappropriate access from the server itself. It’s also helpful to combine authorization policies with encryption methods, so that access control requirements are bundled together with the data.
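The idea of bundling access-control requirements with the data can be illustrated with a minimal sketch. The Python example below (standard library only; the field names, policy format, and key are hypothetical) attaches an authorization policy to a payload and seals both under an HMAC, so the policy travels with the data and tampering is detectable. A real deployment would additionally encrypt the payload itself, for example with AES-GCM; this sketch shows only the bundling and integrity step.

```python
import base64
import hashlib
import hmac
import json

SECRET_KEY = b"example-key-kept-by-the-data-owner"  # hypothetical key


def seal_envelope(payload: dict, policy: dict) -> dict:
    """Bundle data with its access-control policy and authenticate both."""
    body = json.dumps({"payload": payload, "policy": policy}, sort_keys=True)
    tag = hmac.new(SECRET_KEY, body.encode(), hashlib.sha256).hexdigest()
    return {"body": base64.b64encode(body.encode()).decode(), "tag": tag}


def open_envelope(envelope: dict) -> dict:
    """Verify the seal before honouring the bundled policy."""
    body = base64.b64decode(envelope["body"])
    expected = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, envelope["tag"]):
        raise ValueError("envelope has been tampered with")
    return json.loads(body)


sealed = seal_envelope(
    {"customer_id": 1017, "email": "jane@example.com"},
    {"allowed_use": "payment processing only", "retention_days": 30},
)
opened = open_envelope(sealed)
```

Because the processor must verify the seal before reading the data, the access-control requirements cannot be silently stripped away from the personal information they govern.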

Before transferring data, think carefully: is the personal information component actually needed? If you can ensure that the data is (irreversibly) anonymized, and keep careful records of having done so, the personal information will disappear and data protection principles will no longer apply.
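As an illustration, here is a minimal sketch (in Python, with hypothetical field names) of stripping the personal information component from a record before transfer: direct identifiers are removed outright, and quasi-identifiers are generalized. Whether the result is truly irreversible depends on the full data set and its re-identification risk, not on any single record, so this is a starting point rather than a guarantee of anonymity.

```python
def anonymize(record: dict) -> dict:
    """Strip direct identifiers and generalize quasi-identifiers.

    Field names are illustrative; a real assessment would consider the
    whole data set and its re-identification risk, not single records.
    """
    DIRECT_IDENTIFIERS = {"name", "email", "phone", "sin"}
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "age" in cleaned:
        # Generalize exact age into a ten-year band.
        cleaned["age_band"] = f"{cleaned.pop('age') // 10 * 10}s"
    if "postal_code" in cleaned:
        # Keep only the forward sortation area (first three characters).
        cleaned["postal_code"] = cleaned["postal_code"][:3]
    return cleaned


anonymized = anonymize({
    "name": "Jane Doe",
    "email": "jane@example.com",
    "age": 34,
    "postal_code": "M5V 2T6",
    "purchase_total": 89.50,
})
```

Keeping a record of which fields were removed or generalized, and when, supports the careful documentation the text above recommends.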

PIPEDA doesn’t prevent organizations from outsourcing the processing of data, but the Office of the Privacy Commissioner cautions that organizations outsourcing PI need to take “all reasonable steps to protect that information from unauthorized uses and disclosures while it is in the hands of the third-party processor.”[iii]

Legal Requirements

CANADA: Under PIPEDA, the “transfer” of data is considered a “use” by a company, as opposed to a “disclosure” – this is because the processing of information by a third party is still done for the purposes for which the PI was originally collected. “Processing” is interpreted as any use of the information by a third party for its intended purpose at the time of collection.

PIPEDA’s first Privacy Principle, Accountability, states:

“An organization is responsible for personal information in its possession or custody, including information that has been transferred to a third party for processing. The organization shall use contractual or other means to provide a comparable level of protection while the information is being processed by a third party.”

This statement has three key clauses; we’ll look at each in turn.

1) “An organization is responsible for personal information in its possession or custody, including information that has been transferred to a third party for processing.” The onus of responsibility lies with your organization, even once information has been transferred to a third party; you cannot outsource legal liability. This means that you’ll need to know exactly what data protection safeguards your data processor has in place, and be able to monitor them during the transfer process.

2) An organization needs to ensure a “comparable level of protection while the information is being processed by a third party.” According to the Office of the Privacy Commissioner, this means that the third party must provide a level of data protection comparable to the protection that would have been in place had the data not been transferred.[iv] (The protection should be generally equivalent, but it doesn’t necessarily have to be exactly the same across the board.)

3) “The organization shall use contractual or other means” to comply with legal privacy requirements. There should be a written agreement in every instance where personal information is transferred to a third party. A contract cannot transfer responsibility, but it can describe necessary measures a data processor must take to optimally safeguard personal information, and clearly delineate the responsibilities of each party.

In an effort to protect PI and reduce risks, PIPEDA’s restrictions encourage organizations to minimize data transfers, and only to use them for reasonable purposes.

Quebec has passed legislation[v] that imposes strict rules on private-sector organizations using, transferring, or disclosing personal information outside Quebec, even if the PI is being transferred to another Canadian province. Under the law, data transfer or disclosure is prohibited unless it can be guaranteed that the PI will not be used or disclosed for purposes other than those for which it was transferred, nor disclosed to third parties without consent.

UNITED STATES: While no federal law creates a general requirement for data owners regarding data protection during transfer, sectoral laws may do so: for example, the Health Insurance Portability and Accountability Act imposes strict regulations on covered entities seeking to disclose personal health information to a service provider. State laws may also impose security standards; for example, California requires organizations transferring data to third parties to contractually oblige those third parties to maintain reasonable security protocols.

EUROPE: Free transfer of personal data within member states is integral to the founding principles of the EU. As long as the data is transferred in compliance with the strict requirements of the General Data Protection Regulation, the Regulation does not restrict data flows within the European Union or European Economic Area.

For further information on data transfers, and privacy compliance matters generally, see Waël Hassan’s book Privacy in Design: A Practical Guide to Corporate Compliance, available on Amazon.

A three-part series from KI Design:


[i] Trustwave 2013 Global Security Report, p. 10, online at:

[ii] Patrick Thibodeau, “Offshore risks are numerous, say those who craft contracts,” Computerworld, 3 November 2003, p. 12, online at:–say-those-who-craft-contracts.html.

[iii] For more information, see the OPC’s “Privacy and Outsourcing for Businesses” guidelines, online at:

[iv] Office of the Privacy Commissioner, “Guidelines for Processing Personal Data Across Borders,” January 2009, online at:

[v] P-39.1 – Act respecting the protection of personal information in the private sector, online at:

Categories: Privacy

“False Light” – Canada’s Newest Tort

A tort recognized by the Ontario Superior Court of Justice last month expands privacy protections for Canadians by adopting a well-established US cause of action.


Torts are an essential element of common law. A tort is a wrongful act or injury that leads to physical, emotional, or financial damage to a person, for which another person can be held legally responsible. Torts may be either intentional or unintentional (a tort may be caused by negligence, for example). Law in this area usually develops through legal precedent, as decisions by the highest courts expand the scope and application of a tort.


The Ontario case, V.M.Y. v. S.H.G., concerned cyberbullying of a particular nature: in an ongoing campaign of harassment, a father posted images, petitions, and videos of his ex-wife and her parents, along with comments that accused them of numerous illegal acts, including kidnapping, child abuse, assault, and fraud.


The tort, new to Canadian law, is that of “publicity placing a person in a false light.” The court found that “the wrong is in publicly representing someone, not as worse than they are, but as other than they are. The value at stake is respect for a person’s privacy right to control the way they present themselves to the world.”[1] 


This tort is already well-established in courts in the United States. Indeed, over recent years, three of four key US privacy-related torts have been adopted into Canadian law.  These common-law torts, originally catalogued by American jurist William L. Prosser, are:

  1. Intrusion upon seclusion or solitude or private affairs;
  2. Public disclosure of embarrassing private facts;
  3. Publicity which places the plaintiff in a false light in the public eye; and
  4. Appropriation, for the defendant’s advantage, of the plaintiff’s name or likeness.

The first three have been adopted in Canada in the following decisions:

Tort | Case | Jurisdiction and year
Intrusion upon seclusion | Jones v. Tsige | Ontario, 2012
Public disclosure of private facts | Doe v. ND | Ontario, 2016
False light | V.M.Y. v. S.H.G. | Ontario, 2019


All three Canadian decisions referenced Prosser’s work.


As I pointed out in my book Privacy In Design: A Practical Guide to Corporate Compliance (page 62):


Liability in these four US privacy torts often hinges upon whether the violation can be considered “highly offensive to a reasonable person.” It isn’t always easy to predict how broadly this category may be defined by a US court, but following prior judgements, certain trends can be noted:

The court in V.M.Y. v. S.H.G. followed the US definition of the tort by hinging culpability on whether or not the act in question is “highly offensive.” Justice Kristjanson wrote: “It is enough for the plaintiff to show that a reasonable person would find it highly offensive to be publicly misrepresented as they have been.”[2]


This case is a victory for privacy rights in Canada. Those who use the Internet to harass others have had their wings clipped; this case and the tort it creates expand the legal remedies available to their victims.


Privacy In Design: A Practical Guide to Corporate Compliance (2019) is available on Amazon:



[1] V.M.Y. v. S.H.G., [2019] O.J. No. 6702.

[2] Ibid.

Categories: About Waël, Privacy

Privacy in Design: A Practical Guide to Corporate Compliance

I am pleased to announce that Privacy in Design is now available for preorder on Amazon and Kindle.

The book describes the journey to achieving corporate compliance while maintaining and improving business objectives.

Privacy in Design is an essential resource for privacy officers, executive leadership, and boards of directors.

From developing a privacy program, through data handling and protection, to auditing and monitoring privacy maturity, Waël Hassan presents his expert advice, clearly delineating applicable legal obligations and increased regulatory requirements, including the GDPR. He explores privacy best practices on numerous topical issues, such as workplace and employment privacy practices, data transfer and cloud computing, data analytics, and data breach avoidance and management.

The book is divided into four sections:

  1. Part I: Navigating the Legal Landscape of Privacy Protection
  2. Part II: Bringing Your Organization into Best-practices Privacy
  3. Part III: How to De-identify Personal Information
  4. Part IV: Privacy and Big Data

By implementing the principles and practices outlined in this book, you’ll make privacy compliance benefit your business and become a competitive advantage.

About Waël Hassan:

Dr. Waël Hassan is the founder of KI Design – his full bio is available at About

Categories: Privacy

Urban Data Responsibility – The Battle for Toronto

The initial excitement over Alphabet’s smart city may be dwindling, owing to the perception that the tech giant will use the new Harbourfront development to collect personal data. The special attention interest groups have given to a project that has actually engaged the public and shown good faith may be teaching companies the wrong lesson: don’t engage the public, and no one will care.

For several years, Turnstyle, now Yelp Wifi, has captured, linked, and shared confidential consumer data with no public engagement and no protest from advocates.

By protesting against companies that do engage the public, interest groups may be doing privacy a disservice

The project, run by Sidewalk Labs, is set to be an ambitious feat that combines innovation with sustainability to build a city that is ‘smart’: technology that responds to users to create a highly efficient and responsive landscape. On the one hand, the public is excited at the opportunity to live in a highly efficient neighbourhood built around sustainability and innovation; on the other, it is alarmed by advocates’ claims about the project’s data collection and sharing. The graph below illustrates how feelings of excitement are progressively being overtaken by feelings of fear.

The real question we should be asking is whether the data being collected is much different from the surveillance we interact with daily. Traffic cameras, the low-tech counterpart to Alphabet’s sensors, already track our movements. Our Presto cards, now increasingly necessary to use public transportation, store our travel data and can reveal where we live, where we work, and who we travel with. Yelp Wifi is a known data predator, which indiscriminately and without consent tracks Torontonians’ entry into and exit from 600+ local businesses. We sign onto unsecured servers to gain access to Wi-Fi at cafés and shopping malls, and most of us already give access to more information than we realize via the cellphones we carry and use to share personal information at all times. Yelp’s retention policy is effectively indefinite, and opting out of its services is not accessible even to the tech-savvy.

Here is an excerpt of Yelp wifi data retention:

We (Yelp) retain all Personally Identifiable Information and Non-Personally Identifiable Information until the date you first access the Services or the time at which you instruct us in accordance with the terms hereof to remove your Personally Identifiable Information or Non-Personally Identifiable Information.

I have been following Yelp Wifi’s record on privacy since it was a startup on King West called Turnstyle. Its CEO was quoted as saying, “I want to desensitize people’s need for privacy”. The company’s record on privacy has been disappointing. Reviewing its policies over the years, I found that:

Yelp Wifi’s retention policy is confusing and inaccessible; it violates the reasonable expectation of privacy

In my opinion, and despite all the noise, Sidewalk Labs’ proposal is reasonable. Their Digital Governance Proposal has principles that demonstrate good faith. Meanwhile, advocates are pushing for anonymization, a technique that allows the removal of personal identity from any sensor data.

In this discussion, some argue that the issue is not that Sidewalk Labs will collect data, it is that a corporation is cementing itself—literally—in the place of local government. What access will it have and who will it share our information with?

Like any good government, in order for citizens to have a voice and prevent any giant from taking over—political or corporate—there needs to be checks and balances in place to ensure compliance. In reviewing their proposal, I found that:

The Sidewalk proposal is reasonable; however, it is missing an important tenet of data protection: audit.

Much like any public corporation that exposes its financial documents to a third party for a financial audit, Sidewalk Labs’ proposal misses the opportunity for a technical audit by a third-party assessor.

I also find the counterpoints made by advocates to be lacking. The argument that anonymization offers protection in big data is misplaced:

Anonymization may be moot because data will be released to companies that have other sources to blend

The current negotiations are important, but unless we understand how they will be enforced and regulated over time, they remain policy when action is needed. This is new territory from a legal, political, and business standpoint, and the truth is that Canadians do not have robust protections in place to safeguard them from privacy exploitation. As the law unfortunately lags behind, we must be proactive in how we build our security governance. Privacy audit companies have long been in the business of protecting our data: they ensure information is being stored and shared responsibly, and in the way it’s intended.

As we continue to debate the Harbourfront project, we must resist falling back on tropes of progress versus preservation of the norm. First, we must realize that our existing norms most likely share more of our data than we would like. Then, we must understand that change is inevitable, but that we have a chance to be part of that change and direct its course. Privacy auditing gives us the opportunity to consistently ensure that our data is being used in the ways we intend.

Now is not the time to step away from negotiations, particularly with a company that is welcoming feedback. How the project is developed and instituted will set a precedent and influence not only the Harbourfront area, but also what we can expect from corporate governance and the future of privacy laws. It is in our utmost interest to take full interest and engage as extensively as we can to ensure an outcome that keeps its promise of innovation and sustainability.

As a private citizen, I welcome businesses that are open to listening and that engage the public to express its opinions. The effort by Sidewalk Toronto and its partners is a work in progress that will need more attention and third-party attestation.

Stay tuned for our upcoming pieces that continue to inform on Privacy by Design in the Big Data environment. 

Smart Privacy Auditing – An Ontario Healthcare Case Study

IPCS Smart Privacy Auditing Seminar

On September 13, Dr. Waël Hassan was a panelist at the Innovation Procurement Case Study Seminar on Smart Privacy Auditing, hosted by the Mackenzie Innovation Institute (Mi2) and the Ontario Centres of Excellence (OCE). The seminar attracted leaders from the health care sector, the private information and technology industry, and privacy authorities. It explored the concept of innovative procurement via the avenue of competitive dialogue, and demonstrated the power and benefits of using artificial intelligence to automate the auditing of all PHI accesses within a given hospital or health network.

What are the benefits of participating in an innovative procurement process, particularly competitive dialogue?

An innovative procurement partnership between Mi2, Mackenzie Health, Michael Garron Hospital, and Markham Stouffville Hospital was supported by the OCE’s REACH grant and sought to identify an innovative approach to auditing that could be applicable to the privacy challenges faced by numerous hospitals with different practices, policies, and information systems. Rather than focus on how the solution should operate, the partners collaboratively identified six outcome-based specifications the procured audit tool would be required to meet.

By identifying key priorities and specifying the outcomes a solution should achieve, Competitive Dialogue establishes a clear and mutual understanding of expectations. This can help the private sector narrow down solution options to a model best-suited for the contracting authority’s unique context. The feedback loop provided by the iterative rounds (if used) enables vendors to clarify any confusion and customize proposals to the contracting authority’s unique needs, staff workflows, and policy contexts.

Competitive Dialogue is an opportunity for transparent communication that gives vendors the chance to learn the more intimate details of what the contracting authority, in this case Mackenzie Health, needs from a solution. Because hospitals are not technology or security experts, they often struggle to accurately identify and define the solutions they need for a particular issue; a traditional procurement process is rarely ideal, since it leaves little to no room for clarification or feedback. Competitive dialogue is more flexible, allowing for more creativity and innovative thinking during initial proposal development. Encouraging creativity, and creating a competitive environment in which vendors may be bouncing ideas off each other, results in higher-quality proposals and final solutions.

Mackenzie Health Case Study

Mackenzie Health employs over 450 physicians and 2,600 other staff members, processes nearly 55,000 patient medical record accesses every day, and has just one privacy officer to monitor everything. Mackenzie Health’s privacy needs far outweigh its capacity, so they turned to the private sector for an innovative solution.

Section 37(1) of PHIPA outlines the possible uses of personal health information, and these guidelines are based on the purpose underlying the activities. Because the legal framework is centred on purpose, KI Design’s approach is to explain the purpose for accessing a given medical record. The core of this technology is more commonly known as an explanation-based auditing system (EBAS) designed and patented by Dr. Fabbri of Maize Analytics.

To detect unauthorized accesses, the technology identifies an intelligible connection between the patient and the employee accessing the patient’s records. AI changes the fundamental question underlying auditing tools from “who is accessing patient records without authorization?” to “for what purpose are hospital staff accessing patient records?” Asking this question helps the technology break down staff workflows and identify common and unique purposes for accessing any given medical record; accesses are then categorized as either authorized or unexplained, and unexplained accesses may be flagged as potentially unauthorized behaviour. The technology is able to filter out the authorized accesses, usually 98% to 99% of the total, so that the privacy officer can focus on the much smaller number of unexplained and flagged accesses.
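As an illustration only (not Maize Analytics’ actual implementation), the filtering logic of an explanation-based audit can be sketched in a few lines of Python. Each access is kept off the review queue only if some plausible explanation, such as a treatment relationship or a scheduled appointment, connects the employee to the patient; the record structures and explanation rules here are hypothetical.

```python
# Hypothetical access-log records: (employee_id, patient_id).
ACCESS_LOG = [
    ("dr_lee", "pat_001"),
    ("nurse_kim", "pat_002"),
    ("clerk_ray", "pat_001"),
]

# Hypothetical relationship data an EBAS-style system would mine
# from scheduling and clinical systems.
TREATMENT_TEAM = {("dr_lee", "pat_001"), ("nurse_kim", "pat_002")}
APPOINTMENTS = {("clerk_ray", "pat_003")}

# Each explanation is a named rule connecting employee to patient.
EXPLANATIONS = [
    ("treatment relationship", lambda e, p: (e, p) in TREATMENT_TEAM),
    ("scheduled appointment", lambda e, p: (e, p) in APPOINTMENTS),
]


def audit(log):
    """Split accesses into explained and unexplained (flagged) lists."""
    explained, flagged = [], []
    for employee, patient in log:
        reasons = [name for name, rule in EXPLANATIONS if rule(employee, patient)]
        (explained if reasons else flagged).append((employee, patient, reasons))
    return explained, flagged


explained, flagged = audit(ACCESS_LOG)
```

In this toy run, the clerk’s access to a patient with no appointment or treatment link is the only one left for the privacy officer to review, mirroring how the filtering described above shrinks the audit workload.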

Why is the private sector interested in health care?

Health care is an extremely complex system operated by the province and by service providers. The province specializes in governance and regulation, and service providers specialize in medicine; neither is an expert in privacy or security. Companies such as KI Design are interested in filling the expertise gap in the health care sector by working in tandem with health care providers and the Information and Privacy Commissioner to adapt privacy and security solutions to their working realities. There is undeniable value in having a privacy and security expert work directly with hospitals and other health service providers to refine privacy best practices and implement a privacy tool that improves privacy and security outcomes without restricting the workflows of health practitioners.

To learn more on how AI solutions improve Audit visit

Categories: Privacy

Are Malls “Grasping at Straws”?

Cadillac Fairview is tracking the public using facial recognition technology!

The news that the privacy commissioners of Canada and Alberta are launching an investigation into facial recognition technology used at Cadillac Fairview did not come as a surprise to many. The investigation was initiated by Commissioner Daniel Therrien in the wake of numerous media reports raising questions and concerns about whether the company is collecting and using personal information without consent.

@CadFairview has stated that it is using the technology for the purposes of monitoring traffic, as well as the age and gender of shoppers. The company contends it is not capturing images of individuals.

Cadillac Fairview has now suspended its use of facial recognition software while the investigation is underway.

I applaud the efforts of both @ABoipc and @PrivacyPrivee. In my opinion, there are three questions that need to be answered by the audit report:

  1. Do citizens have a reasonable expectation of being free from historic tracking?
  2. Will posting signs at mall entrances be sufficient notice that CF can use the technology to infer information about individuals?
  3. With a handful of information security staff, is CF a trusted custodian of public data?

There is no doubt that the investigators from both offices are more than capable of performing this investigation; my fear, however, is that

Current legislative instruments may not adequately defend citizen privacy against biometric big data systems

In previous work, we wrote about the advent of big data and explained that current laws do not deal with inferences. We have also written extensively on how a lack of privacy is an existential threat to any business.

While the investigation is ongoing, I, as a member of the public, encourage CF to:

  1. Update your outdated (2016) Privacy Policy.
  2. Provide the name of your privacy officer to the public.
  3. Disclose a Privacy Impact Assessment showing your due diligence.

@CadFairview is fully aware that the future is here and that the new generation of shoppers enjoys online shopping. Breaching citizen privacy will likely hasten the end of the mall era.

Categories: Privacy

Parliament Responds to the Standing Committee’s Report on Access to Information, Privacy and Ethics

The Honourable Navdeep Bains, P.C., M.P., extends his gratitude for the report of the Standing Committee on Access to Information, Privacy and Ethics titled Privacy by Design: Review of the Personal Information Protection and Electronic Documents Act. His response is summarized below.

He shows his appreciation for the OPC and the other witnesses that supported this study, and states that the recommendations provide valuable guidance. The Government of Canada agrees that changes are required to ensure that the use of personal information in a commercial context is governed by clear rules that support the expectations of Canadians.

A critical step was taken with the announcement of new regulations under PIPEDA on April 18, 2018, to assure Canadians that they will be informed about risks associated with the distribution of their personal information. The next step is to engage Canadians in national conversations about data and digital issues.

Consent under the PIPEDA

The Government agrees that consent should remain a core element of PIPEDA, as it provides individuals with control over how their personal information is shared. Maintaining a progressive view on consent also ensures that Canada’s standards align with internationally recognized ones. However, there is work to be done to ensure that consent remains meaningful under PIPEDA, as it can be enhanced in a variety of ways. Furthermore, the Government is committed to maintaining PIPEDA’s principles-based approach, which has been noted as a source of the Act’s strength.

In response to a recent incident involving unintended uses of personal information from social media, the Government acknowledges the need to closely consider redefining “publicly available” information for the purposes of PIPEDA. Amendments to PIPEDA’s consent requirements mean that consent is only considered valid if the individual can understand the consequences of providing it. This was aimed at prohibiting the deceptive collection of a child’s personal information; however, it presents unique challenges, as it involves the definition of a minor, which falls within provincial jurisdiction.

Online Reputation and Respect for Privacy

The Government acknowledges public concerns about the accumulation of personal information online and agrees that it poses a risk to privacy protection. Furthermore, the Government acknowledges the OPC’s work in this area, and that there are legitimate concerns about the impacts of this position on other rights. The OPC has therefore called for further study to strike an appropriate balance between these competing rights.

Public commentary reveals divergent views on these matters, so further certainty on how PIPEDA applies in various contexts is necessary to ensure a “level playing field”. The Government agrees that the appropriate destruction of information once it is no longer needed prevents unintended future uses that could harm individuals’ reputations.

Enforcement Powers of the Privacy Commissioner

In agreement with the Committee, the Government states that the time has come to closely examine how PIPEDA’s enforcement model can be improved to meet the objectives of supporting innovation and growth in the digital economy while providing robust protections for personal privacy. Similar recommendations were made by the Senate Standing Committee on Transport and Communications.

In order to determine an optimal model for compliance and enforcement, the Government must assess all options that could strengthen the Act’s compliance and enforcement regime. As part of this assessment, the Government must look at other models of compliance and enforcement to consider potential impacts on the mandate of the OPC, the principles of fundamental justice, and the countervailing risks of increased powers. Options for change, including those regarding consent, must also be assessed.

Impact of the European Union (EU) General Data Protection Regulation (GDPR)

The Government supports maintaining Canada’s adequacy status (ref. recommendations 17-19) and acknowledges that data flows are a significant enabler of a growing digital economy. In discussions with trade partners, including EU nations and institutions, the key is to work towards harmonization of different frameworks to ensure data protection is comparable across jurisdictions. Officials are using a cross-government approach and working closely with the European Commission to understand the requirements for maintaining Canada’s adequacy standing under the EU GDPR.

The Committee’s study has made a significant contribution to this work by providing the Government with recommendations to ensure the effectiveness of PIPEDA in light of international developments.

New Rights to Align with the GDPR

In recognition of the importance of interoperability between privacy regimes, the EU has introduced in the GDPR the concept of “essential equivalence” for examining the adequacy of non-member regimes. It is therefore not clear that PIPEDA’s requirements must mirror each of the GDPR’s rights and protections in order to maintain Canada’s adequacy standing.

Moving forward, the Government will engage Canadians in a conversation on making Canada more data-savvy, focusing on how companies can use personal information to innovate and compete while protecting privacy. This is a value that Canadians hold dear.

Once again, thank you to the Committee on behalf of the Government for the careful examination of these important issues.

Categories: Privacy

GDPR Responsibilities of Controllers and Processors

Responsibilities of Controllers and Processors

What are controllers and processors under the GDPR?

A controller determines the purposes and means of processing personal data, while a processor processes personal data on the controller’s behalf. Both controllers and processors are responsible and liable for compliance under the GDPR.

Responsibilities of Controllers

The primary responsibility of controllers is to data subjects. Controllers must also demonstrate their own compliance with the GDPR and ensure their data processors’ compliance. Controllers outside the EU that regularly process personal data pertaining to people within the EU should designate a representative within the EU to help manage compliance.


Responsibilities of Processors

Processors are governed by a contract that addresses how data will be processed, how requests from data subjects will be fulfilled, and whether data will be transferred to other geographical locations. The processor makes information available to the controller to demonstrate compliance and notifies the controller in the event of a breach. It is also the processor’s responsibility to obtain authorization before engaging a sub-processor, and to delete or return all data at the end of the service provision.

The GDPR introduces direct statutory obligations on processors, as well as severe sanctions for compliance failures. This is especially relevant for non-EU data processors: if their clients are based in the EU, the processors themselves are responsible for complying with the GDPR and face the same risk of fines as controllers.


Required Data Protection Practices

Data must be protected by design, meaning that data protection principles are integrated into the design of the systems that manage personal data, and by default, meaning that safeguards are in place to limit the processing of data to what is necessary.

Generally, it is recommended to put in place practices and technologies appropriate to the level of risk. Some of the best safeguards are quite simple: for instance, appointing a data protection officer and consulting with supervisory authorities on high-risk projects. Other examples include breach notifications and data protection impact assessments (DPIAs) for high-risk projects.

Breaches must be reported within 72 hours of discovery unless they pose a low risk to the rights and freedoms of data subjects. High-risk breaches must also be communicated to the affected data subjects without undue delay.
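The 72-hour reporting window can be tracked programmatically. Below is a minimal sketch, assuming discovery is timestamped at the moment the organization becomes aware of the breach; the function and variable names are illustrative, not drawn from any specific compliance tool.

```python
from datetime import datetime, timedelta

# GDPR breach notification: report to the supervisory authority within
# 72 hours of becoming aware of the breach (unless the breach is low risk).
REPORTING_WINDOW = timedelta(hours=72)

def notification_deadline(discovered_at: datetime) -> datetime:
    """Return the latest time by which the supervisory authority must be notified."""
    return discovered_at + REPORTING_WINDOW

def is_overdue(discovered_at: datetime, now: datetime) -> bool:
    """True if the 72-hour reporting window has already elapsed."""
    return now > notification_deadline(discovered_at)

discovered = datetime(2024, 3, 1, 9, 0)
print(notification_deadline(discovered))                  # 2024-03-04 09:00:00
print(is_overdue(discovered, datetime(2024, 3, 3, 9, 0)))  # False
```

In practice the clock starts at awareness, not at the moment the breach occurred, which is why the sketch keys everything off the discovery timestamp.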

Companies with 250 or more employees, and those that handle certain special categories of data, are required to document: contact information; the purpose of processing; categories of data; data transfers to other countries; timelines for erasure of different categories of data; and, where possible, a description of technical and organizational security measures.
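These documentation requirements amount to a structured record of processing activities. A hypothetical sketch of one such record as a simple data structure follows; the field names and example values are illustrative only.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ProcessingRecord:
    """One entry in a record of processing activities."""
    controller_contact: str                  # contact information
    purpose: str                             # purpose of the processing
    data_categories: List[str]               # categories of personal data
    transfers_outside_eu: List[str]          # countries data is transferred to
    erasure_timeline: str                    # retention / erasure schedule
    security_measures: Optional[str] = None  # documented where possible

record = ProcessingRecord(
    controller_contact="privacy@example.com",
    purpose="payroll administration",
    data_categories=["employee names", "bank details"],
    transfers_outside_eu=["Canada"],
    erasure_timeline="7 years after end of employment",
    security_measures="encryption at rest; role-based access",
)
print(record.purpose)  # payroll administration
```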

Categories: Privacy

What Is The GDPR?

What is the GDPR?

The GDPR is new legislation that is destined to replace the EU’s Data Protection Directive, which has been in place since 1995. The arrival of the digital age means that the way people understand and interact with data is changing rapidly. The GDPR can help to clarify individual rights in the digital age, as well as creating a “digital single market” within the EU. With the GDPR in place, it will be easier for citizens to have control over their personal data, representing a shift in power.

The underlying principle of the GDPR is that the protection of personal data is a fundamental right, and organizations that handle personal data are responsible for those rights. “Processing” data means collecting, sharing, distributing, structuring, storing, or otherwise using an individual’s data. In this relationship, there are controllers and processors. A controller determines the purpose and means of processing personal data and is usually the collector of the data. Processors are engaged to process data on behalf of the controller, but the controllers are responsible for monitoring processors’ compliance.

The GDPR affects the North American market because any organization that offers goods or services to the EU or that monitors the behaviour of people within the EU is responsible for complying with the GDPR.

There are three key principles of the regulation:

  1. Limitation of processing means that: data must be processed only for specified, explicit and legitimate purposes; data must not be further processed in ways inconsistent with the initial purposes; data should be adequate, relevant, and necessary; data should be accurate and kept up-to-date; data should be kept only as long as necessary.
  2. Informed consent refers to freely given and clearly affirmative consent that must be intelligible, easily accessible, and written in plain language. Participants have the right to withdraw consent, and services cannot be withheld on condition of consent.
  3. Lawful processing requires that at least one of the following conditions be met:
    1. Consent from the data subject
    2. Processing is necessary for a contract
    3. Processing is necessary for compliance with EU laws
    4. Processing is necessary to protect a person’s vital interests
    5. Processing in the public interest or exercise of official authority
    6. Legitimate interests of the controller or a third party that are not overridden by the data subject’s rights and freedoms
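The lawful-processing test above is a disjunction: processing is lawful if at least one recognized basis applies. A minimal sketch of that check, with illustrative basis names and function signature:

```python
# The six lawful bases listed above, as short illustrative labels.
LAWFUL_BASES = {
    "consent",               # consent from the data subject
    "contract",              # necessary for a contract
    "legal_obligation",      # compliance with EU laws
    "vital_interests",       # protecting a person's vital interests
    "public_interest",       # public interest or official authority
    "legitimate_interests",  # interests not overridden by the data subject's rights
}

def processing_is_lawful(claimed_bases: set) -> bool:
    """Processing is lawful if at least one recognized basis applies."""
    return bool(claimed_bases & LAWFUL_BASES)

print(processing_is_lawful({"consent"}))  # True
print(processing_is_lawful(set()))        # False
```

Note that whether a claimed basis actually holds (e.g. whether consent was freely given) is a legal question; the sketch only models the "at least one" structure of the rule.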

Key Terms

Right to Erasure (“Right to Be Forgotten”)

This concept refers to personal data being deleted when the data subject no longer wants it to be processed. The exception is when there is a legitimate reason to retain the data, for instance to complete a contract or comply with legal obligations.

Transparency and Informed Consent

Information is made readily available and is communicated in clear, plain language. Informed consent will be especially strictly enforced for services directed at children.

Data Portability

Data subjects have a right to a copy of their personal data in an appropriate format and, where possible, to transfer that data directly from one service provider to another. For example, individuals should be able to transfer photos from one social network to another.

Data Protection by Design

This aspect helps protect users’ data by design, for instance by implementing technical safeguards such as anonymization, pseudonymization, and encryption, as well as organizational safeguards.
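Pseudonymization, one of the technical safeguards mentioned above, can be as simple as replacing a direct identifier with a keyed hash, so records remain linkable without exposing the identity. A minimal sketch using Python's standard library; the hard-coded key is illustrative only and would come from a key management system in practice.

```python
import hashlib
import hmac

# Secret key kept separately from the pseudonymized data set.
# Illustrative only: never hard-code real keys.
SECRET_KEY = b"replace-with-managed-secret"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed, one-way token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input always maps to the same token, so records stay linkable
# across data sets without revealing the underlying identity.
token = pseudonymize("alice@example.com")
assert token == pseudonymize("alice@example.com")
assert token != pseudonymize("bob@example.com")
```

Because the hash is keyed, an attacker who obtains the pseudonymized data set alone cannot recompute tokens from guessed identifiers, which is what distinguishes this from a plain unsalted hash.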

Data Protection Officer (DPO)

A DPO fills an organization’s need for someone to monitor privacy and data protection. A DPO is an expert in the field and is required if an organization’s core activities consist of regular and systematic monitoring of personal data on a large scale. The position helps ensure compliance with, and awareness of, privacy legislation. The DPO may also monitor internal data protection activities, train staff, and conduct internal audits. Inquiries from data subjects also go through the DPO.


Company Response

Companies are responding to the GDPR in several ways:

  1. Stop buying and selling personal data
  2. Know where your clients live, or implement EU requirements regardless of location
  3. Prepare to respond to requests from data subjects
  4. Audit sub-contractors for compliance
  5. Reconsider cloud services

Categories: Privacy

Designing Smart Cities – A Design Thinking Approach

Privacy, Data Management, and Risk Mitigation

While no clear definition of, or requirements for, a “smart city” exist, the general consensus is that it is an innovative development initiative that combines urban planning with creative digital infrastructure. Areas of focus often include reducing traffic congestion, improving sustainable energy use, and making public spaces more accessible and adaptable to residents’ needs and desires. To achieve these goals, these initiatives incorporate innovative methods of data collection to improve service provision for local residents; however, this inevitably raises concerns about consent, privacy, and data protection.

When Sidewalk Labs announced its interest in developing a 12-acre property along Toronto’s eastern waterfront to be North America’s most advanced smart city neighbourhood, many people were concerned about what kinds of data would be collected and how it would be used. Sidewalk Labs is a subsidiary of Alphabet Inc., the parent company of Google, so there is no doubt that this project could bring both incredible innovation as well as possible data exploitation or breaches. This said, the project developers have been vigilant in consulting with the community and releasing updated data privacy frameworks to calm tech-induced fears.

An exciting aspect of smart city development is the opportunity to build new collaborations between municipal and provincial governments, innovation hubs, entrepreneurs and their startups, research institutions, leading educational institutions, and local residents. Combined, these various actors and organizations can collectively source the innovative ideas, design thinking, policy frameworks, and financial investment required to ensure that new ideas take hold.


There are many approaches to planning and developing a smart city project, but all projects involve basic issues: privacy, data management, and risk mitigation.

Multi-Domain Privacy Impact Assessments

The combination of information sharing initiatives and innovative approaches to service delivery, such as smart city projects, has led to a growing need for multi-institutional and multi-jurisdictional PIAs. Guidelines from the Office of the Privacy Commissioner recommend that such PIAs include a clear business case for information sharing, a common communications strategy to inform the public of information sharing, and a set of expected privacy practices shared by all institutions participating in the data sharing initiative.

Our unique approach builds on these basic requirements to define a clear, seven-step process that we use both to guide our clients as they develop privacy policy prior to developing a smart city project, and to conduct PIAs after a smart city project has been completed.

1. Purpose: We begin by defining the reasons for which smart city projects collect, use, retain, and disclose personal information.

2. Custodianship: A key next step in ensuring private information is protected is to adopt a custodianship model. In the context of a smart city initiative, a custodian is designated to review and revise policies, processes, and procedures to ensure any sensitive information is shared securely.

3. Liability: To establish liability, we help define the roles, responsibilities, and accountabilities of smart city project participants. We define different participants’ rights and ability to manage (collect, retain, disclose, and correct) personal information.

4. Data Management: We define policies for management of data quality, records management, assurance of accuracy, retention and archiving, and secondary use of data.

5. Controls: We define policies for the application of legislative requirements, including management of information safeguards, compliance auditing, identity validation and management, implementation of consent rules, breach management, and proactive and reactive monitoring of technology assets. Controls also include frameworks such as provider agreements, resident disclaimers, and mandatory and discretionary requirements that define the roles of smart city participants.

6. Process: We apply privacy policy to workflows and interactions throughout service delivery processes, including the service model, delivery model, management of consent, reporting procedures, incident management, and resident feedback mechanisms.

7. Adoption: In this final step we develop instruments for the implementation of privacy policy during the planning and ongoing development of the smart city project, such as provider agreements, resident disclaimers, mandatory and discretionary requirements, and system feedback.
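During a project review, the seven steps above can be tracked as a simple checklist. A hypothetical sketch follows; the step names mirror the process above, while the function name is illustrative.

```python
# The seven PIA steps, in process order.
PIA_STEPS = [
    "Purpose", "Custodianship", "Liability", "Data Management",
    "Controls", "Process", "Adoption",
]

def outstanding_steps(completed: set) -> list:
    """Return the PIA steps not yet completed, preserving process order."""
    return [step for step in PIA_STEPS if step not in completed]

print(outstanding_steps({"Purpose", "Custodianship"}))
# ['Liability', 'Data Management', 'Controls', 'Process', 'Adoption']
```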


Recommendations for Smart City Risk Mitigation

Given the opportunities and challenges associated with developing a functional and advanced smart city project, we recommend planners and project managers consider the following six areas of risk and mitigation.

1. Role of AI: Artificial intelligence is still very much uncharted territory, meaning there are abundant opportunities for leading-edge technological development, but there is also a policy void. Governments, software developers, and researchers will need to collaborate and actively engage with each other’s sectors to better understand their goals, practices, and needs, which will help foster secure but innovative development.

2. Handling Personal Information: The policies and practices that guide how personal information is collected by smart city initiatives are fundamental to maintaining the trust of community members and ensuring the initiatives do not violate privacy laws. The data that new smart technologies collect and analyze come from many sources, including sensors and cameras. These technologies may be able to interact with people or their personal devices without any positive action by the individual (i.e., consent) or an opportunity to opt out.

The vast amounts of data that can be collected could lead to negative practices (or suspicions of such practices) such as surveillance, profiling, or using personal information for different purposes than originally stated, either without consent or without public input. These practices are to be avoided wherever possible, so whatever body is responsible for smart city data management must be vigilant about data minimization: collecting, using, or disclosing personal information only where it is necessary for the initiative’s outcomes and there are no other alternatives. Lastly, smart city operations should have meaningful consent agreements where required by law and/or opt-out opportunities to ensure participants are able to make informed decisions.

3. Privacy Governance and Oversight: Technology has thus far kept a faster pace than the policy regulating it. Smart city initiatives must be supported by updated data governance and privacy management policies. These policies should address a wide range of privacy and security requirements, including: appointing a privacy lead; monitoring and auditing for regulatory and legal compliance; responding to, and maintaining transparency during, breaches; and contractual protections and accountability frameworks for all the diverse actors and organizations involved in the initiative. This last requirement is particularly important for encouraging strong partnerships, as it helps mitigate the risks of entering into the collaboration from the outset.

4. Transparency and Public Notice: For smart city projects to succeed, a thorough level of community engagement is required, not only to collect and make use of residents’ experiences and ideas, but also to maintain proper feedback channels and project transparency. Project goals and practices should be transparent and easily understandable so that community members understand how they might be affected.

5. Privacy Impact Assessments: Collaborating partners responsible for the security of smart city data must conduct privacy impact assessments and threat risk assessments to ensure privacy and security risks are identified and adequately addressed in the design and implementation of new technologies and programs.

6. Safeguarding Data: Any smart city endeavour that makes use of data collection must include appropriate measures to secure all personal information. Given the diverse formats of technology implemented in the smart city context, establishing effective safeguards is especially difficult. Generally speaking, more points of data collection, processing, and access mean more points of vulnerability, and therefore greater risk of a security breach. To mitigate this serious risk, smart city data systems must de-identify personal information at the earliest possible stage in the collection process and reduce the risk of re-identification that is inherent in connected devices. Lastly, smart cities should only retain, use, and disclose de-identified information, particularly in aggregated format where possible.
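The de-identify-early, aggregate-where-possible principle can be illustrated with a small sketch: raw sensor readings are stripped of direct identifiers at ingestion, and only aggregated counts are retained. The field names (`device_id`, `zone`, `hour`) and functions are hypothetical.

```python
from collections import Counter

def ingest(readings):
    """Drop direct identifiers (here, the device ID) at the earliest stage,
    keeping only the attributes the initiative actually needs."""
    return [{"zone": r["zone"], "hour": r["hour"]} for r in readings]

def aggregate(deidentified):
    """Retain only aggregate counts per (zone, hour), not individual records."""
    return Counter((r["zone"], r["hour"]) for r in deidentified)

raw = [
    {"device_id": "cam-17", "zone": "plaza", "hour": 9},
    {"device_id": "cam-04", "zone": "plaza", "hour": 9},
    {"device_id": "cam-17", "zone": "park", "hour": 10},
]
counts = aggregate(ingest(raw))
print(counts[("plaza", 9)])  # 2
```

Keeping only the aggregate also addresses re-identification risk: once individual rows are discarded, linking a count back to a specific person or device is far harder than with record-level data.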

Smart cities offer an incredible opportunity to exercise creative design thinking and harness the entrepreneurial spirit. However, government policy must be in line with the best interests of the public, particularly those who will be directly impacted by the programs and new technologies these innovative initiatives introduce. Two-way, open, and transparent discussions and partnerships between the innovative research and design sectors, government, and affected communities will be required to ensure smart cities are designed and implemented in a way that advances technology and urban planning while improving the lives and experiences of those within the communities. It is clear that following privacy and security best practices is absolutely paramount to the success of these initiatives.

To learn more about smart cities, follow @drwhassan