Here are my unfiltered answers to questions surrounding AI.
Beyond the Jargon – what problems can AI solve for me?
AI can solve problems where a judgement call is needed. In essence, it is suited to contexts where an opinion or an attribute needs to be determined based on patterns. A prime example is sentiment analysis. If an airport wants to learn what people think of its services, it can train an AI engine to detect sentiment. Using AI, here is an example of how sentiment can be compared between two airports: Vancouver International Airport @yvrairport and Toronto Pearson International Airport @TorontoPearson.
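As a toy illustration of the idea (not a production approach: the posts and the tiny hand-built lexicon below are made up, where a real engine would use a trained model), sentiment comparison between two airports can be sketched in a few lines:

```python
# Minimal sketch of lexicon-based sentiment comparison between two airports.
# The sample posts and the tiny word lists are illustrative, not real data.

POSITIVE = {"great", "fast", "friendly", "clean", "easy"}
NEGATIVE = {"slow", "lost", "crowded", "rude", "delayed"}

def sentiment_score(text):
    """Return (#positive words - #negative words) for one post."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def average_sentiment(posts):
    """Average score across all posts mentioning one airport."""
    return sum(sentiment_score(p) for p in posts) / len(posts)

yvr_posts = ["Security was fast and staff were friendly",
             "Clean terminal easy connections"]
yyz_posts = ["Baggage delayed again so slow",
             "Crowded gates and rude announcements"]

print(average_sentiment(yvr_posts))  # positive on this toy sample
print(average_sentiment(yyz_posts))  # negative on this toy sample
```

A real engine would use a trained classifier over far larger datasets, but the comparison logic, scoring each mention and averaging per airport, is the same.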
Are there any successful AI implementation Models?
In terms of models, AI solutions are not cookie-cutter. A model has to be built on a particular project's premises. There may be some lessons learned, but the models are generally not recyclable.
What are a few steps that can ensure success implementing AI?
It's always important to think of the need or the opportunity that we are trying to address.
It's important to think of AI as a means to an end, not the other way around.
What is an Applicable AI Scenario?
Scenario: People needing to commute before flying are generally unhappy. How can we make their journey or experience better?
- Step 1: Identify a problem or an opportunity. In this case our goal is to improve user experience by identifying their pain points, e.g., customers who commute to fly have a negative sentiment towards the airport.
- Step 2: Interview stakeholders and ask them about possible solutions. It's extremely important to ask the stakeholders involved for their assumptions. Suggestions from stakeholders could include: a parking-assist solution, better signage and public transit information, or a mobile application that provides a holistic experience.
- Step 3: Use Big Data (Social/Sales/Flight/Traffic/Weather) to understand sentiment. The public is generally vocal about its experiences in the public services domain. Your lead scientist can design an approach to assess sentiment and to extrapolate reasons for negative sentiment. This step ought to include both quantitative and qualitative data analysis.
A) Quantitative data analysis will statistically identify the top challenges. For example, unavailability of parking spaces could contribute to negative sentiment. Using quantitative analysis, a scientist can extract the top 10 reasons for negative sentiment.
B) Qualitative analysis is also needed. In this step, a human can train the AI engine to categorize challenges met by travellers. The AI can then sift through data en masse and provide more refined statistics. Using qualitative analysis, a team of analysts can understand and break down user stories into patterns. Once the patterns are defined, the AI can quickly categorize all data into them.
- Step 4: Compare assumptions with data. Comparing assumptions to data findings is key. Assumptions made by stakeholders may remain valid despite a lack of data support; the gap could be due to data access or data quality.
- Step 5: Implement the solution and remeasure in 6 months or when appropriate. Sentiment may change in the order of days or weeks. Once a solution is chosen and implemented, sentiment results will start to change.
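The quantitative pass in Step 3 can be sketched as a simple frequency ranking. Everything below, the categories and the tagged posts, is hypothetical example data:

```python
# Sketch of the quantitative pass: rank pain-point categories by frequency.
# The categories and tagged posts are invented examples.

from collections import Counter

tagged_posts = [
    ("no parking spots again", "parking"),
    ("parking lot was full", "parking"),
    ("signage confusing near departures", "signage"),
    ("couldn't find the transit stop", "transit"),
    ("parking took 40 minutes", "parking"),
]

def top_challenges(posts, n=10):
    """Count how often each pain-point category appears and rank them."""
    counts = Counter(category for _, category in posts)
    return counts.most_common(n)

print(top_challenges(tagged_posts))
# parking ranks first in this toy sample
```

In practice, the category labels would come from the qualitative step, where analysts train the engine on patterns; the ranking itself is just counting.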
Can AI be applied to all industries?
Indeed. Here is another example of how AI has been used to detect cell phone user sentiment in Canada among @bell, @rogers, & @telus.
Why is there so much confusion about what AI is?
Most people explaining AI lack the technical and scientific rigor. I am told that companies with multi-billion-dollar budgets are failing at the basics.
Infographic representing key issues concerning the future of data, broken down by country.
Link to source https://waelhassan.com/wp-content/uploads/2019/01/Future-Value-of-Data-World-Map-Infographic-2018-002.pdf
The initial excitement over Alphabet's Smartcity may be dwindling because of the perception that the tech giant will use the new Harbourfront development to collect personal data. The special attention given by interest groups to a project that has actually engaged the public and shown good faith may be teaching companies the wrong lesson: don't engage the public and no one will care.
For several years, Turn Style, now Yelp Wifi, has captured, linked, and shared consumer confidential data with no public engagement and no protest from advocates.
By protesting against companies that are engaging the public, interest groups may be doing privacy a disservice
The project, run by Sidewalk Labs, is set to be an ambitious feat which combines innovation with sustainability to build a city that is 'smart': technology that responds to users to create a highly efficient and responsive landscape. On the one hand, the public is excited about the opportunity to live in a highly efficient neighbourhood built around sustainability and innovation; on the other, the public is unsettled by advocates who raise alarms about the project's data collection and sharing. The graph below illustrates how feelings of excitement are progressively being overtaken by feelings of fear.
The real question we should be asking is whether the data being collected is much different from the surveillance techniques we interact with daily. Traffic cameras, the low-tech version of Alphabet's sensors, already track our movements. Our Presto cards, now increasingly necessary to use public transportation, store our travel data and can reveal where we live, where we work, and who we travel with. Yelp Wifi is a known data predator, which indiscriminately and without consent tracks Torontonians' entry into and exit from 600+ local businesses. We sign onto unsecured servers to gain access to Wi-Fi at a café or shopping mall, and most of us already give access to more information than we realize via the cellphones we carry and use to share personal information at all times. Yelp's retention policy is effectively indefinite, and opting out of its services is not accessible even for the tech savvy.
Here is an excerpt of Yelp wifi data retention:
We (Yelp) retain all Personally Identifiable Information and Non-Personally Identifiable Information until the date you first access the Services or the time at which you instruct us in accordance with the terms hereof to remove your Personally Identifiable Information or Non-Personally Identifiable Information.
I have been following Yelp Wifi's record on privacy from when it was a startup on King West called Turnstyle. Their CEO was quoted as saying, "I want to desensitize people's need for privacy". Their record on privacy has been disappointing. Reviewing their policies over the years, I found that:
Yelp Wifi's retention policy is confusing and inaccessible; it violates the reasonable expectation of privacy
In my opinion, and despite all the noise, Sidewalk Labs’ proposal is reasonable. Their Digital Governance Proposal has principles that demonstrate good faith. Meanwhile, advocates are pushing for anonymization, a technique that allows the removal of personal identity from any sensor data.
In this discussion, some argue that the issue is not that Sidewalk Labs will collect data, it is that a corporation is cementing itself—literally—in the place of local government. What access will it have and who will it share our information with?
Like any good government, in order for citizens to have a voice and prevent any giant from taking over—political or corporate—there needs to be checks and balances in place to ensure compliance. In reviewing their proposal, I found that:
The Sidewalk proposal is reasonable; however, it is missing an important tenet of data protection, namely audit.
Much like any public corporation that exposes its financial documents to a third party for a financial audit, Sidewalk Labs' proposal omits a technical audit by a third-party assessor.
I also find the counterpoints made by advocates to be lacking. The argument that anonymization offers protection in big data is misplaced:
Anonymization may be moot because data will be released to companies that have other sources to blend it with
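A minimal sketch of why blending defeats anonymization: joining an "anonymized" release with an auxiliary dataset on quasi-identifiers (postal code and birth year here, with entirely invented toy records) can re-identify individuals.

```python
# Toy demonstration of re-identification by linkage. The "anonymized" trip
# data has names removed but keeps quasi-identifiers; a second dataset from
# another source restores the identity. All records are fictional.

anonymized_trips = [
    {"postal": "M5V", "birth_year": 1985, "route": "home->airport"},
]
auxiliary = [  # e.g., a marketing list acquired from another company
    {"name": "A. Smith", "postal": "M5V", "birth_year": 1985},
]

def link(trips, aux):
    """Re-identify trip records by matching quasi-identifiers."""
    matches = []
    for t in trips:
        for a in aux:
            if (t["postal"], t["birth_year"]) == (a["postal"], a["birth_year"]):
                matches.append((a["name"], t["route"]))
    return matches

print(link(anonymized_trips, auxiliary))  # [('A. Smith', 'home->airport')]
```

The point is not the few lines of code but the join itself: once two datasets share quasi-identifiers, removing the name column offers little protection.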
The current negotiations are important, but unless we understand how they will be enforced and regulated over time, they remain policy when in fact action is needed. This is new territory from a legal, political, and business standpoint, and the truth is that Canadians do not have robust protections in place to safeguard them from privacy exploitation. As the law unfortunately lags behind, we must be proactive in how we build our security governance. Privacy audit companies have long been in the business of protecting our data: they ensure information is being stored and shared responsibly and the way it is intended.
As we continue to debate the Harbourfront project, we must resist falling back onto tropes of progress versus preservation of the norm. First, we must realize that our current norms most likely share more of our data than we would like. Then, we must understand that change is inevitable, but we have a chance to be part of that change and direct its course. Privacy auditing gives us the opportunity to consistently ensure that our data is being used in the ways we intend.
Now is not the time to step away from negotiations, particularly with a company that is welcoming feedback. How the project is developed and instituted will set a precedent and influence not only the Harbourfront area, but what we can expect from corporate governance and the future of privacy laws. It is in our utmost interest to engage as extensively as we can to ensure an outcome that keeps its promise of innovation and sustainability.
As a private citizen, I welcome businesses that are open to listening and that engage the public to express their opinions. The effort by Sidewalk Toronto and their partners is a work in progress that will need more attention and third-party attestation.
Stay tuned for our upcoming pieces that continue to inform on Privacy by Design in the Big Data environment.
IPCS Smart Privacy Auditing Seminar
On September 13, Dr. Waël Hassan was a panelist at the Innovation Procurement Case Study Seminar on Smart Privacy Auditing, hosted by the Mackenzie Innovation Institute (Mi2) and the Ontario Centres of Excellence (OCE). The seminar attracted leaders from the health care sector, the private information and technology industry, and privacy authorities. It explored the concept of innovative procurement via the avenue of competitive dialogue, in addition to demonstrating the power and benefits of using artificial intelligence to automate the process of auditing all PHI accesses within a given hospital or health network.
What are the benefits of participating in an innovative procurement process, particularly competitive dialogue?
An innovative procurement partnership between Mi2, Mackenzie Health, Michael Garron Hospital, and Markham Stouffville Hospital was supported by the OCE’s REACH grant and sought to identify an innovative approach to auditing that could be applicable to the privacy challenges faced by numerous hospitals with different practices, policies, and information systems. Rather than focus on how the solution should operate, the partners collaboratively identified six outcome-based specifications the procured audit tool would be required to meet.
By identifying key priorities and specifying the outcomes a solution should achieve, Competitive Dialogue establishes a clear and mutual understanding of expectations. This can help the private sector narrow down solution options to a model best-suited for the contracting authority’s unique context. The feedback loop provided by the iterative rounds (if used) enables vendors to clarify any confusion and customize proposals to the contracting authority’s unique needs, staff workflows, and policy contexts.
Competitive Dialogue is an opportunity for transparent communication that gives vendors the chance to learn the more intimate details of what the contracting authority, in this case Mackenzie Health, needs from a solution. Because hospitals are not tech or security experts, they often struggle to accurately identify and define the solutions they need for a particular issue; a traditional procurement process is rarely ideal, since it leaves little to no room for clarification or feedback. Competitive Dialogue is more flexible, allowing for more creativity and innovative thinking during initial proposal development. Encouraging creativity, and creating a competitive environment in which competing vendors can bounce ideas off each other, results in higher-quality proposals and final solutions.
Mackenzie Health Case Study
Mackenzie Health employs over 450 physicians and 2,600 other staff members, processes nearly 55,000 patient medical record accesses every day, and has just one privacy officer to monitor everything. Mackenzie Health’s privacy needs far outweigh its capacity, so they turned to the private sector for an innovative solution.
Section 37(1) of PHIPA outlines the possible uses of personal health information, and these guidelines are based on the purpose underlying the activities. Because the legal framework is centred on purpose, KI Design’s approach is to explain the purpose for accessing a given medical record. The core of this technology is more commonly known as an explanation-based auditing system (EBAS) designed and patented by Dr. Fabbri of Maize Analytics.
To detect unauthorized accesses, the technology identifies an intelligible connection between the patient and the employee accessing the patient's records. AI changes the fundamental question underlying auditing tools from “who is accessing patient records without authorization?” to “for what purpose are hospital staff accessing patient records?” Asking this question helps the technology break down staff workflows and identify common and unique purposes for accessing any given medical record; accesses are then categorized as either authorized or unexplained, and unexplained accesses may be flagged as potentially unauthorized behaviour. The technology is able to filter out the authorized accesses, usually 98% to 99% of all accesses, so that the Privacy Officer can focus on the much smaller number of unexplained and flagged accesses.
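A rough sketch, in the spirit of an explanation-based auditing system but not Maize Analytics' actual implementation, of how authorized accesses are filtered out. The staff names, records, and rule set below are invented for illustration:

```python
# Toy explanation-based filtering: an access with a plausible clinical
# connection between staff and patient is filtered out; the small residue
# of unexplained accesses goes to the privacy officer. Hypothetical data.

access_log = [
    {"staff": "dr_lee", "patient": "p1", "reason_link": "attending_physician"},
    {"staff": "nurse_kim", "patient": "p1", "reason_link": "same_care_unit"},
    {"staff": "clerk_ray", "patient": "p2", "reason_link": None},
]

# Connections the engine accepts as explanations for an access.
AUTHORIZED_LINKS = {"attending_physician", "same_care_unit", "billing_task"}

def unexplained_accesses(log):
    """Keep only accesses with no intelligible staff-patient connection."""
    return [rec for rec in log if rec["reason_link"] not in AUTHORIZED_LINKS]

flagged = unexplained_accesses(access_log)
print(flagged)  # only the access with no explanation remains for review
```

The real system infers the staff-patient connections from hospital data rather than reading them from a field, but the filtering principle, explain first and flag what cannot be explained, is the same.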
Why is the private sector interested in health care?
Health care is an extremely complex system operated by the province and service providers. The province is a specialist in governance and regulation, and the service providers are specialists in medicine; neither is an expert in privacy or security. Companies such as KI Design are interested in filling the expertise gap within the health care sector by working closely with health care providers and the Information & Privacy Commissioner to adapt privacy and security solutions suitable for their working realities. There is undeniable value in having a privacy and security expert working directly with hospitals and other health service providers to assist in refining privacy best practices and implementing a privacy tool that will improve privacy and security outcomes without restricting the workflows of health practitioners.
To learn more on how AI solutions improve Audit visit https://phipa.ca/
Cadillac Fairview is tracking the public by using facial recognition technology!
The news of the privacy commissioners of Canada and Alberta launching an investigation into facial recognition technology used at Cadillac Fairview did not come as a surprise to many. The investigation was initiated by Commissioner Daniel Therrien in the wake of numerous media reports that have raised questions and concerns about whether the company is collecting and using personal information without consent.
@CadFairview has stated that it is using the technology for the purposes of monitoring traffic, as well as the age and gender of shoppers. The company contends it is not capturing images of individuals.
Cadillac Fairview has now suspended its use of facial recognition software while the investigation is underway.
I applaud the efforts of both @ABoipc and @PrivacyPrivee. In my opinion there are three questions that need to be answered through the audit report:
- Do citizens have a reasonable expectation of being free from historic tracking?
- Is posting signs at mall entrances sufficient notice for CF to infer information about individuals?
- With a handful of information security staff, is CF a trusted custodian of public data?
There is no doubt that the investigators from both offices are more than capable of performing this investigation; my fear, however, is that
Current legislative instruments may not adequately defend citizen privacy against biometric big data systems
In previous work, we wrote about the advent of Big Data and explained that current laws do not deal with inferences. We have also written extensively on how the lack of privacy is an existential threat to any business.
While the investigation is ongoing, I, as a member of the public, encourage CF to:
- Provide the name of your privacy officer to the public.
- Disclose a Privacy Impact Assessment showing your due diligence
@CadFairview is fully aware that the future is here and that the new generation of shoppers enjoys shopping online. Breaching citizen privacy will likely hasten the end of the mall era.
A daily routine that includes continuously scrolling through Instagram, sipping kale smoothies, drinking Starbucks coffee, hitting the gym, and hanging out with friends, while still managing to fit in a full day of work, most likely belongs to a Millennial.
Millennial. The four-syllable word that makes thousands of Generation Xers roll their eyes and cringe at the so-called "entitled" and "privileged" group born after the '80s.
Not all, but most Millennials share the features of a short attention span, an obsession with social media, and a love to socialize. Although this may drive a crowd of Generation Xers to angrily grunt in agreement, from a managerial-perspective, these aren’t negative characteristics. In fact, they are actually valuable elements of a workplace.
In order to be an effective manager, as with all employees, it is important to understand the Millennials in the workplace. Clearly I have a different daily routine than they do, as I hardly scroll through Instagram and don't think I could even get through an entire kale smoothie. I started to wonder: if even our daily lives are so different, how different are their expectations and interests in the work that they're doing?
After discussing with the Millennials that I work with, they’ve explained to me their main priorities and interests. I believe it’s important to integrate these things into the workplace and foster an innovative environment for both them, and myself, as I know that I have a lot to learn from them.
From what I’ve gathered, Millennials’ priorities include: hanging out with their friends, finding a work/life balance, being passionate about the work they’re doing, and using social media to connect with people.
In my experience, these often overlooked interests allow Millennials to be valuable assets in the workplace. Millennials are conditioned to immediacy and will find solutions to get work done quickly and efficiently, with the ability to do several things at once. They are fluent in media, and natives of the digital world, creating innovation in technology. With their constant posting and use of social media, Millennials are naturals in communications and marketing. They thrive on community, and naturally build cohesiveness and team spirit within a workplace.
Unlike many of us Generation Xers, Millennials aren't as interested in climbing the ladder or making large amounts of money, because they value these other priorities. Some may not be interested in becoming a leader or gaining status whatsoever; they may simply be trying out different positions for the sake of having new experiences. It's important to ensure that they are passionate and interested in their work, and that they aren't doing repetitive, boring tasks. Some of us have spent years doing jobs for the sole purpose of getting a promotion or making money; those born in this new generation focus on pursuing their passions and on the present.
Most Millennials grew up in an environment where they were given independence from a young age rather than kept under strict authority. This translates into giving millennial workers lots of independence and creative freedom in the workplace. Rather than constantly correcting them or giving strict guidelines, allow them to work on projects where they can implement their own ideas and strategies.
Millennials are conditioned with an ethical value system that Generation Xers weren’t naturally exposed to. Surrounded by ethnic diversity, planet-saving initiatives, socio-economic rallies, and an overall environment that strives for equality, Millennials are aware of the social responsibilities of the companies they work for. They have a balance between their need to excel in their work and their ingrained moral ethics.
Ultimately, we all have a lot to learn from the Millennials in our workplace, and they have unique perspectives that should be heard. Acknowledge and understand the differences you have, and incorporate them into the workplace to create a challenging and thriving environment.
By Wael Hassan and Tessa Barclay
The Honourable Navdeep Bains, P.C., M.P., extends his gratitude for the report of the Standing Committee on Access to Information, Privacy and Ethics titled Privacy by Design: Review of the Personal Information Protection and Electronic Documents Act. The following summarizes his response.
He shows his appreciation for the OPC and the other witnesses that supported this study, and states that the recommendations provide valuable guidance. The Government of Canada agrees that changes are required to ensure that the use of personal information in a commercial context is governed by clear rules that support the expectations of Canadians.
A critical step was taken with the announcement of new regulations under PIPEDA on April 18, 2018, to assure Canadians that they will be informed about risks associated with the distribution of their personal information. The next step is to engage Canadians in conversations about data and digital issues on a national level.
Consent under the PIPEDA
The Government agrees that consent should remain a core element of PIPEDA, as it provides individuals with control over how their personal information is shared. Maintaining a progressive view on consent additionally ensures that Canada's standards align with internationally recognized ones. However, there is work to be done to ensure that consent remains meaningful under PIPEDA, as it can be enhanced in a variety of ways. Furthermore, the Government is committed to maintaining the principles-based approach of PIPEDA, which has been noted as a source of the Act's strength.
In response to recent incidents involving unintended uses of personal information from social media, the Government acknowledges the need to closely consider redefining "publicly available" information for the purposes of PIPEDA. Amendments to PIPEDA's consent requirements mean that consent is only considered valid if the individual can understand the consequences of providing it. This was aimed at prohibiting the deceptive collection of a child's personal information; however, it presents unique challenges, as it involves the definition of a minor, which falls within provincial jurisdiction.
Online Reputation and Respect for Privacy
The Government acknowledges public concerns about the accumulation of personal information online and agrees that it poses a risk to privacy protection. It also acknowledges the work by the OPC in this area, and that there are legitimate concerns about the impacts of the OPC's position on other rights. The OPC has therefore called for further study to strike an appropriate balance between these competing rights.
Public commentary reflects divergent views on these matters; further certainty on how PIPEDA applies in various contexts is necessary to ensure a "level playing field". The Government agrees that the appropriate destruction of information after it is no longer needed prevents unintended future uses that could harm individuals' reputations.
Enforcement Powers of the Privacy Commissioner
In agreement with the Committee, the Government states that the time has come to closely examine how PIPEDA's enforcement model can be improved to meet the objectives of supporting innovation and growth of the digital economy while providing robust protections for personal privacy. Similar recommendations were made by the Senate Standing Committee on Transport and Communications.
In order to determine an optimal model for compliance and enforcement, the Government must assess all options that can strengthen the compliance and enforcement regime of the Act. As part of this assessment, the Government must look at other models of compliance and enforcement to consider potential impacts on the mandate of the OPC, the principles of fundamental justice, and the countervailing risks with increased powers. Options for change must also be assessed, including those regarding consent.
Impact of the European Union (EU) General Data Protection Regulation (GDPR)
The Government supports Canada's adequacy status (ref. recommendations 17-19) and acknowledges that data flows are a significant enabler of a growing digital economy. In discussions with trade partners, including EU nations and institutions, the key is to work towards harmonization of the different frameworks to ensure comparable levels of data protection across all jurisdictions. Officials are using a cross-government approach and working closely with the European Commission to understand the requirements for maintaining Canada's adequacy standing under the EU GDPR.
The Committee's study has made a significant contribution to this work by providing the Government with recommendations to ensure the effectiveness of PIPEDA in light of international developments.
New Rights to Align with the GDPR
In recognition of the importance of interoperability between privacy regimes, the EU has added to the GDPR the concept of "essential equivalence" for examining the adequacy of non-member regimes. It is therefore not clear that PIPEDA's requirements must mirror each of the GDPR's rights and protections in order to maintain adequacy standing.
Moving forward, the Government will engage Canadians in a conversation on making Canada more data-savvy, focusing on how companies can use personal information to innovate and compete while protecting privacy. This is a value that Canadians hold dear.
Once again, thank you to the Committee on behalf of the Government for the careful examination of these important issues.
Responsibilities of Controllers and Processors
What are controllers and processors under the GDPR?
- Controllers determine the purpose and means of processing personal data and are usually the collectors of data. They do not necessarily need to be located in the EU. Controllers are additionally responsible for monitoring processors’ compliance.
- Processors are engaged to process data on behalf of the controller
Both controllers and processors are responsible and liable for compliance under the GDPR.
Responsibilities of Controllers
The primary responsibility of controllers is to data subjects. Controllers also demonstrate compliance with the GDPR and ensure data processors’ compliance as well. Controllers outside the EU that regularly process personal data pertaining to people within the EU should have a designated representative within the EU to help manage compliance.
Responsibilities of Processors
Processors are governed by a contract that addresses how data will be processed, how requests from data subjects will be fulfilled, and whether data will be transferred to any other geographical locations. The processor makes information available to the controller to demonstrate compliance and notifies the controller in the event of a breach. It is also the processor’s responsibility to ensure that authorization is given prior to engaging a sub-processor, and that all data is deleted or returned at the end of their service provision.
The GDPR introduces direct statutory obligations on processors, as well as severe sanctions for compliance failures. This is especially relevant for non-EU data processors, who need to be aware that if their clients are based in the EU, they too are responsible for complying with the GDPR. The processor faces the same risk of fines as the controller.
Required Data Protection Practices
- Data protection by design and default
Data protection by design means that data protection principles are integrated into the design of the systems that manage personal data. Data protection by default means putting safeguards in place to limit the processing of data.
Generally, it is recommended to put in place practices and technologies appropriate to the level of risk. Some of the best safeguards are quite simple: for instance, appointing a data protection officer and consulting with supervisory authorities concerning high-risk projects. Other examples include breach notifications and data protection impact assessments (DPIAs) for high-risk projects.
Breaches must be reported within 72 hours of discovery unless there is a low risk to the rights and freedoms of the data subjects. High risk breaches should be communicated to data subjects without delay.
Companies with 250+ employees and those that handle certain special categories of data are required to document: contact information, purpose of processing, categories of data, data transfers to other countries, timelines for erasure of different categories of data and, where possible, a description of technical and organizational security measures.
What is the GDPR?
The GDPR represents new legislation that is destined to replace the Data Protection Directive, which has been in place since 1995. The arrival of the digital age means that the way people understand and interact with data is changing rapidly. The GDPR can help to clarify individual rights in the digital age, as well as creating a "digital single market" within the EU. With the GDPR in place, it will be easier for citizens to have control over their personal data, representing a shift in power.
The underlying principle of the GDPR is that the protection of personal data is a fundamental right, and organizations that handle personal data are responsible for those rights. “Processing” data means collecting, sharing, distributing, structuring, storing, or otherwise using an individual’s data. In this relationship, there are controllers and processors. A controller determines the purpose and means of processing personal data and is usually the collector of the data. Processors are engaged to process data on behalf of the controller, but the controllers are responsible for monitoring processors’ compliance.
The GDPR affects the North American market because any organization that offers goods or services to the EU or that monitors the behaviour of people within the EU is responsible for complying with the GDPR.
There are three key principles of the regulation:
- Limitation of processing means that: data must be processed only for specified, explicit and legitimate purposes; data must not be further processed in ways inconsistent with the initial purposes; data should be adequate, relevant, and necessary; data should be accurate and kept up-to-date; data should be kept only as long as necessary.
- Informed consent refers to freely given and clearly affirmative consent that must be intelligible, easily accessible, and written in plain language. Participants have the right to withdraw consent, and services cannot be withheld on condition of consent.
- Lawful processing requires that at least one of the following conditions be met:
- Consent from the data subject
- Processing is necessary for a contract
- Processing is necessary for compliance with EU laws
- Processing is necessary to protect a person’s vital interests
- Processing in the public interest or exercise of official authority
- Legitimate interests of the controller or a third party that are not overridden by the data subject’s rights and freedoms
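The "at least one condition" rule above can be sketched as a simple check. This is an illustrative sketch, not a legal compliance tool; the `LawfulBasis` enum and function names are hypothetical, chosen here to mirror the six conditions listed.

```python
from enum import Enum

class LawfulBasis(Enum):
    """The six lawful bases for processing under the GDPR (Article 6)."""
    CONSENT = "consent"
    CONTRACT = "contract"
    LEGAL_OBLIGATION = "legal_obligation"
    VITAL_INTERESTS = "vital_interests"
    PUBLIC_INTEREST = "public_interest"
    LEGITIMATE_INTERESTS = "legitimate_interests"

def processing_is_lawful(bases: set[LawfulBasis]) -> bool:
    """Processing is lawful only if at least one recognized basis applies."""
    return len(bases) > 0

# A record processed under consent alone is lawful; one with no basis is not.
print(processing_is_lawful({LawfulBasis.CONSENT}))  # True
print(processing_is_lawful(set()))                  # False
```

In practice each basis carries its own documentation and balancing requirements (especially legitimate interests), so a real system would record *which* basis applies per processing purpose, not just a yes/no flag.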
- Right to Erasure (Right to be Forgotten)
This right refers to personal data being deleted when the data subject no longer wants it to be processed. The exception is when there is a legitimate reason to retain the data, for instance, in the case of completing a contract or complying with legal obligations.
- Transparency and Informed Consent
Information is made readily available and is communicated in clear, plain language. Informed consent will be especially enforced regarding services for children.
- Right to Data Portability
Data subjects have a right to a copy of their personal data in an appropriate format and, where possible, they can transfer that data directly from one service provider to another. For example, individuals should be able to transfer photos from one social network to another.
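The portability requirement above amounts to exporting a subject's data in a structured, machine-readable format. A minimal sketch, assuming JSON as the interchange format (the regulation does not mandate a specific format; the record fields here are invented for illustration):

```python
import json

def export_personal_data(record: dict) -> str:
    """Serialize a data subject's record to a structured, machine-readable
    format (JSON here) suitable for transfer to another service provider."""
    return json.dumps(record, indent=2, ensure_ascii=False)

# Hypothetical profile: e.g. photos to be moved between social networks.
profile = {"name": "Alex Example", "photos": ["beach.jpg", "city.jpg"]}
portable = export_personal_data(profile)
```

A real export would also need to cover data held across backups and third-party processors, which is where most of the engineering effort lies.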
- Data Protection by Design and Default
This aspect helps protect users’ data by design, for instance by implementing technical safeguards like anonymization, pseudonymization, and encryption, as well as organizational safeguards.
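Of the technical safeguards named above, pseudonymization is the most straightforward to illustrate: a direct identifier is replaced with a token that is meaningless without a separately held key. A minimal sketch using a keyed hash (HMAC-SHA256); the key name and value are placeholders, and a production system would manage the key in a separate, access-controlled store:

```python
import hashlib
import hmac

# Hypothetical secret key — in practice, stored and rotated separately
# from the pseudonymized data set.
SECRET_KEY = b"rotate-and-store-separately"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a keyed
    hash. Without the key, the mapping cannot feasibly be reversed, yet
    the same input always maps to the same token, so records can still
    be linked for analysis."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

token = pseudonymize("alex@example.com")
```

Note that under the GDPR pseudonymized data is still personal data (the controller holds the key), whereas properly anonymized data falls outside the regulation's scope.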
- Mandatory Data Protection Officer
A DPO fills the need for an organization to help monitor privacy and data protection. A DPO is an expert in their field, and is required if an organization’s core activities consist of regular and systematic monitoring of personal data on a large scale. This position helps ensure compliance and awareness of privacy legislation. The DPO may also monitor internal data protection activities, train staff, and conduct internal audits. If data subjects have inquiries, these will go through the DPO as well.
Companies are responding to the GDPR in several ways:
- Stop buying and selling personal data
- Know where your clients live, or implement EU requirements regardless of location
- Prepare to respond to requests from data subjects
- Audit sub-contractors for compliance
- Reconsider cloud services
Privacy, Data Management, and Risk Mitigation
While no clear definition of or requirements for a “smart city” exist, the general consensus is that it is an innovative development initiative that combines urban planning with creative digital infrastructure. Areas of focus often include reducing traffic congestion, improving sustainable energy use, and making public spaces more accessible and adaptable to residents’ needs and desires. To achieve these goals, these initiatives incorporate innovative methods of data collection to improve service provision for local residents; however, this inevitably sparks concerns surrounding consent, privacy, and data protection.
When Sidewalk Labs announced its interest in developing a 12-acre property along Toronto’s eastern waterfront to be North America’s most advanced smart city neighbourhood, many people were concerned about what kinds of data would be collected and how it would be used. Sidewalk Labs is a subsidiary of Alphabet Inc., the parent company of Google, so there is no doubt that this project could bring incredible innovation as well as possible data exploitation or breaches. That said, the project developers have been vigilant in consulting with the community and releasing updated data privacy frameworks to calm tech-induced fears.
An exciting aspect of smart city development is the opportunity to build new collaborations between municipal and provincial governments, innovation hubs, entrepreneurs and their startups, research institutions, leading educational institutions, and local residents. When combined, these various actors and organizations can collectively source the innovative ideas, design thinking, policy frameworks, and financial investment required to ensure that new ideas take hold.
Past and potential future efforts include:
- Partnerships with mobile apps that map out traffic congestion, motor vehicle accidents, and other on-road incidents (Toronto/Waze);
- Providing free Wi-Fi in public spaces to connect residents (Vancouver);
- Improving innovation procurement through research and entrepreneurial partnerships (Guelph/MaRSDD);
- Establishing a network of digital and physical engagement and innovation hubs targeted for youth (Ottawa); and,
- Building a technology-enabled “Circular Food Economy” (Guelph/Wellington County).
There are many approaches to planning and developing a smart city project, but all projects involve basic issues: privacy, data management, and risk mitigation.
Multi-Domain Privacy Impact Assessments
The combination of information sharing initiatives and innovative approaches to service delivery, such as smart city projects, has led to a growing need for multi-institutional and multi-jurisdictional PIAs. Guidelines from the Office of the Privacy Commissioner recommend that such PIAs include a clear business case for information sharing, a common communications strategy to inform the public of information sharing, and a set of expected privacy practices shared by all institutions participating in the data sharing initiative.
1. Purpose: We begin by defining the reasons for which smart city projects collect, use, retain and disclose personal information.
2. Custodianship: A key next step to ensuring private information is protected is to adopt a custodianship model. In the context of a smart city initiative, a custodian will be designated to review and revise policies, processes, and procedures to ensure any sensitive information is shared securely.
3. Liability: In order to establish liability, we help to define the roles, responsibilities, and accountabilities of smart city project participants. We define different participants’ right and ability to manage (collect, retain, disclose, and correct) personal information.
4. Data Management: We define policies for management of data quality, records management, assurance of accuracy, retention and archiving, and secondary use of data.
5. Controls: We define policies for the application of legislative requirements, including management of information safeguards, compliance auditing, identity validation and management, implementation of consent rules, breach management, and proactive and reactive monitoring of technology assets. Controls also include frameworks such as provider agreements, resident disclaimers, and mandatory and discretionary requirements that define the roles of smart city participants.
Recommendations for Smart City Risk Mitigation
Given the opportunities and challenges associated with developing a functional and advanced smart city project, we recommend planners and project managers consider the following six areas of risk and mitigation.
1. Role of AI: Artificial intelligence is still very much uncharted territory, meaning there are abundant opportunities for leading-edge technological development, but there is also a policy void. Governments, software developers, and researchers will need to collaborate and actively engage with each other’s sectors to gain a better understanding of their goals, practices, and needs, which will help foster secure but innovative development.
2. Handling Personal Information: The policies and practices that guide how personal information is collected and handled by smart city initiatives are fundamental for maintaining the trust of community members and ensuring the initiatives do not violate privacy laws. The data that the new smart technologies collect and analyze come from many sources, including sensors and cameras. These technologies may be able to interact with people or their personal devices without any positive action required by the individual (i.e. consent) or an opportunity to opt out.
The vast amounts of data that can be collected could lead to negative practices (or suspicions of such practices) such as surveillance, profiling, or using personal information for different purposes than originally stated, either without consent or without public input. These practices are to be avoided wherever possible, and so whatever body is responsible for smart city data management must be vigilant in data-minimization practices by only collecting, using, or disclosing personal information where it is necessary for the initiative’s outcomes and there are no other alternatives. Lastly, smart city operations should have meaningful consent agreements where required by law and/or opt-out opportunities to ensure participants are able to make informed decisions.
3. Privacy Governance and Oversight: Technology has thus far kept a faster pace than the policy regulating it. Smart city initiatives must be supported by updated data governance and privacy management policies. These policies should address a wide range of privacy and security requirements, including: appointing a privacy lead; monitoring and auditing for regulatory and legal compliance; responding to breaches and maintaining transparency during them; and establishing contractual protections and accountability frameworks for all the diverse actors and organizations involved in the initiative. This last requirement is particularly important for encouraging strong partnerships, as it helps mitigate the risks of entering into the collaboration from the outset.
4. Transparency and Public Notice: For smart city projects to be most successful, a thorough level of community engagement will be required to not only collect and make use of residents’ experiences and ideas, but to also maintain proper feedback channels and project transparency. Project goals and practices should be transparent and made easily understandable so that community members will understand how they might be affected.
5. Privacy Impact Assessments: Collaborating partners responsible for the security of smart city data must conduct privacy impact and threat risk assessments to ensure privacy and security risks are identified and adequately addressed in the design and implementation of new technologies and programs.
6. Safeguarding Data: Any smart city endeavours that make use of data collection must include appropriate measures to secure all personal information. Given the diverse formats of implemented technology in the smart cities context, it is especially difficult to establish effective safeguards. Generally speaking, more points of data collection, processing, and access also mean more points of vulnerability and therefore greater risks of a security breach. To mitigate this serious risk, smart city data systems must de-identify personal information at the earliest possible stage in the collection process and reduce the risk of re-identification that is inherent with connected devices. Lastly, smart cities should only retain, use, and disclose de-identified information, particularly in an aggregated format when possible.
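The "aggregate and suppress" practice described above can be sketched as follows. This is an illustrative sketch only; the function, field names, and the minimum-group-size threshold `k` (a simple k-anonymity-style cutoff) are hypothetical, and real smart-city pipelines would need far stronger re-identification controls:

```python
from collections import Counter

def aggregate_counts(events: list[dict], key: str, k: int = 5) -> dict:
    """Aggregate raw sensor events into per-category counts and suppress
    any group smaller than k, so that small (more re-identifiable)
    groups never leave the collection stage."""
    counts = Counter(event[key] for event in events)
    return {category: n for category, n in counts.items() if n >= k}

# Hypothetical foot-traffic sensor events, already stripped of identifiers.
events = [{"zone": "arrivals"}] * 6 + [{"zone": "parking"}] * 2
print(aggregate_counts(events, "zone"))  # {'arrivals': 6} — 'parking' suppressed
```

Suppressing small groups matters because a count of one or two in a rarely visited zone can single out an individual even when no name is attached.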
Smart cities offer an incredible opportunity for exercising creative design thinking and harnessing the entrepreneurial spirit. However, government policy must be in line with the best interests of the public, particularly those who will be directly impacted by the programs and new technologies introduced by these innovative initiatives. Two-way, open and transparent discussions and partnerships between the innovative research and design sectors, the government, and affected communities will be required to ensure smart cities are designed and implemented in a way that advances technology and urban planning while improving the lives and experiences of those within the communities. It is clear that following privacy and security best practices is absolutely paramount for the success of these initiatives.
To learn more about smart cities, follow @drwhassan