
Urban Data Responsibility – The Battle for Toronto

The initial excitement over Alphabet’s smart city may be dwindling, owing to the perception that the tech giant will use the new Harbourfront development to collect personal data. The special attention interest groups have given to a project that has actually engaged the public and shown good faith may be teaching companies the wrong lesson: don’t engage the public, and no one will care.

For several years, Turnstyle, now Yelp Wifi, has captured, linked, and shared confidential consumer data with no public engagement and no protest from advocates.

By protesting against companies that engage the public, interest groups may be doing privacy a disservice.

The project, run by Sidewalk Labs, is set to be an ambitious feat that combines innovation with sustainability to build a city that is ‘smart’—technology that responds to users to create a highly efficient and responsive landscape. On the one hand, the public is excited at the opportunity to live in a highly efficient neighbourhood built around sustainability and innovation; on the other, it is alarmed by advocates’ warnings about the project’s data collection and sharing. The graph below illustrates how feelings of excitement are progressively being overtaken by feelings of fear.

The real question we should be asking is whether the data being collected is much different from the surveillance we interact with daily. Traffic cameras, the low-tech version of Alphabet’s sensors, already track our movements. Our Presto cards, now increasingly necessary to use public transportation, store our travel data and can reveal where we live, where we work, and who we travel with. Yelp Wifi is a known data predator that indiscriminately, and without consent, tracks Torontonians’ entry into and exit from 600+ local businesses. We sign onto unsecured servers to gain access to Wi-Fi at cafés and shopping malls, and most of us already give away more information than we realize via the cellphones we carry and use to share personal information at all times. Yelp’s retention policy is effectively indefinite, and opting out of its services is not accessible even to the tech savvy.

Here is an excerpt of Yelp Wifi’s data retention policy:

We (Yelp) retain all Personally Identifiable Information and Non-Personally Identifiable Information until the date you first access the Services or the time at which you instruct us in accordance with the terms hereof to remove your Personally Identifiable Information or Non-Personally Identifiable Information.

I have been following Yelp Wifi’s record on privacy since it was a startup on King West called Turnstyle. Its CEO was quoted as saying, “I want to desensitize people’s need for privacy”. The company’s record on privacy has been disappointing. Reviewing its policies over the years, I found that:

Yelp Wifi’s retention policy is confusing and inaccessible; it violates the reasonable expectation of privacy.

In my opinion, and despite all the noise, Sidewalk Labs’ proposal is reasonable. Their Digital Governance Proposal sets out principles that demonstrate good faith. Meanwhile, advocates are pushing for anonymization, a technique that removes personal identifiers from sensor data.

In this discussion, some argue that the issue is not that Sidewalk Labs will collect data; it is that a corporation is cementing itself—literally—in the place of local government. What access will it have, and who will it share our information with?

As with any good government, there need to be checks and balances in place to ensure compliance, so that citizens have a voice and no giant—political or corporate—can take over. In reviewing Sidewalk Labs’ proposal, I found that:

The Sidewalk proposal is reasonable; however, it is missing an important tenet of data protection: audit.

Much like any public corporation that opens its financial documents to a third-party financial audit, Sidewalk Labs should undergo a technical audit by a third-party assessor; this possibility is missing from its proposal.

I also find the counterpoints made by advocates to be lacking. The argument that anonymization offers protection in big data is misplaced:

Anonymization may be moot, because data will be released to companies that can blend it with other sources.
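To see why, consider a minimal sketch of a record-linkage attack, written in Python. The datasets, field names, and rows below are invented for illustration; the point is simply that a join on quasi-identifiers, such as location and habitual timing, can re-attach identities to “anonymized” records:

    # Hypothetical illustration: re-identifying "anonymized" sensor records
    # by blending them with an auxiliary dataset on shared quasi-identifiers.
    # All field names and rows are invented for this sketch.

    anonymized_sensor_data = [
        {"postal_prefix": "M5V", "entry_time": "08:42", "route": "King-Spadina"},
        {"postal_prefix": "M4C", "entry_time": "09:15", "route": "Danforth"},
    ]

    marketing_dataset = [
        {"name": "A. Resident", "postal_prefix": "M5V", "usual_commute": "08:42"},
        {"name": "B. Resident", "postal_prefix": "M4C", "usual_commute": "09:15"},
    ]

    # A simple join on quasi-identifiers is often enough to re-attach names.
    for record in anonymized_sensor_data:
        for person in marketing_dataset:
            if (record["postal_prefix"] == person["postal_prefix"]
                    and record["entry_time"] == person["usual_commute"]):
                print(person["name"], "likely matches route", record["route"])

The more auxiliary sources a data recipient holds, the more reliable this blending becomes, which is why anonymization alone is a weak safeguard.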

The negotiations currently under way are important, but unless we understand how they will be enforced and regulated over time, they remain policy when action is needed. This is new territory from a legal, political, and business standpoint, and the truth is that Canadians do not have robust protections in place to safeguard them from privacy exploitation. As the law unfortunately lags behind, we must be proactive in how we build our security governance. Privacy audit companies have long been in the business of protecting our data: they ensure information is being stored and shared responsibly, and in the way it is intended.

As we continue to debate the Harbourfront project, we must resist falling back on tropes of progress versus preservation of the norm. First, we must realize that our existing norms most likely share more of our data than we would like. Then, we must understand that change is inevitable, but that we have a chance to be part of that change and to direct its course. Privacy auditing gives us the opportunity to consistently verify that our data is being used in the ways we intend.

Now is not the time to step away from negotiations, particularly with a company that welcomes feedback. How the project is developed and instituted will set a precedent and influence not only the Harbourfront area, but also what we can expect from corporate governance and the future of privacy laws. It is in our utmost interest to engage as extensively as we can to ensure an outcome that keeps its promise of innovation and sustainability.

As a private citizen, I welcome businesses that are open to listening and that engage the public to express its opinion. The effort by Sidewalk Toronto and its partners is a work in progress that will need continued attention and third-party attestation.

Stay tuned for our upcoming pieces, which will continue to inform on Privacy by Design in the big data environment.


Smart Privacy Auditing – An Ontario Healthcare Case Study

IPCS Smart Privacy Auditing Seminar

On September 13, Dr. Waël Hassan was a panelist at the Innovation Procurement Case Study Seminar on Smart Privacy Auditing, hosted by the Mackenzie Innovation Institute (Mi2) and the Ontario Centres of Excellence (OCE). The seminar attracted leaders from the health care sector, the private information and technology industry, and privacy authorities. It explored the concept of innovative procurement via the avenue of competitive dialogue, and demonstrated the power and benefits of using artificial intelligence to automate the auditing of all PHI accesses within a given hospital or health network.

What are the benefits of participating in an innovative procurement process, particularly competitive dialogue?

An innovative procurement partnership between Mi2, Mackenzie Health, Michael Garron Hospital, and Markham Stouffville Hospital, supported by the OCE’s REACH grant, sought an approach to auditing that could address the privacy challenges faced by numerous hospitals with different practices, policies, and information systems. Rather than focusing on how the solution should operate, the partners collaboratively identified six outcome-based specifications the procured audit tool would be required to meet.

By identifying key priorities and specifying the outcomes a solution should achieve, competitive dialogue establishes a clear and mutual understanding of expectations. This can help the private sector narrow down solution options to a model best suited to the contracting authority’s unique context. The feedback loop provided by the iterative rounds (if used) enables vendors to clarify any confusion and customize proposals to the contracting authority’s unique needs, staff workflows, and policy contexts.

Competitive dialogue is an opportunity for transparent communication, giving vendors the chance to learn the more intimate details of what the contracting authority, in this case Mackenzie Health, needs from a solution. Because hospitals are not technology or security experts, they often struggle to accurately identify and define the solutions they need for a particular issue, so a traditional procurement process is rarely ideal: it leaves little to no room for clarification or feedback. Competitive dialogue is more flexible and thereby allows for more creativity and innovative thinking during initial proposal development. Encouraging creativity, and creating a competitive environment in which vendors can bounce ideas off one another, results in higher-quality proposals and final solutions.

Mackenzie Health Case Study

Mackenzie Health employs over 450 physicians and 2,600 other staff members, processes nearly 55,000 patient medical record accesses every day, and has just one privacy officer to monitor everything. Its privacy needs far outweigh its capacity, so it turned to the private sector for an innovative solution.

Section 37(1) of PHIPA outlines the permitted uses of personal health information, and these guidelines are based on the purpose underlying each activity. Because the legal framework is centred on purpose, KI Design’s approach is to explain the purpose behind each access to a given medical record. The core of this technology is an explanation-based auditing system (EBAS), designed and patented by Dr. Fabbri of Maize Analytics.

To detect unauthorized accesses, the technology identifies an intelligible connection between the patient and the employee accessing the patient’s records. AI changes the fundamental question underlying auditing tools from “Who is accessing patient records without authorization?” to “For what purpose are hospital staff accessing patient records?” Asking this question helps the technology break down staff workflows and identify common and unique purposes for accessing any given medical record; each access is then categorized as either authorized or unexplained, and unexplained accesses may be flagged as potentially unauthorized behaviour. The technology filters out the authorized accesses, usually 98% to 99% of the total, so that the privacy officer can focus on the much smaller number of unexplained and flagged accesses, as the sketch below illustrates.
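As a rough illustration of the explanation-based approach, and not Maize Analytics’ actual implementation, here is a minimal Python sketch; the access logs, relationship table, and field names are assumptions invented for this example:

    # Sketch of explanation-based audit filtering: an access is "explained"
    # if the employee can be connected to the patient through a clinical or
    # operational relationship. All data here is invented for illustration.

    access_logs = [
        {"employee": "nurse_a", "patient": "p1001"},
        {"employee": "clerk_b", "patient": "p1001"},
    ]

    # Known employee-patient connections (e.g., appointments, admissions).
    treatment_relationships = {
        ("nurse_a", "p1001"),  # nurse_a is on p1001's care team
    }

    def explain(access):
        """Return a reason if the access can be explained, else None."""
        key = (access["employee"], access["patient"])
        if key in treatment_relationships:
            return "care-team relationship"
        return None  # unexplained: escalate to the privacy officer

    flagged = [a for a in access_logs if explain(a) is None]
    print(len(flagged), "of", len(access_logs), "accesses need manual review")

Presumably, a production system derives such relationships from scheduling, admission, and billing data; the principle, though, is the same: explain the bulk of accesses automatically and surface only the remainder.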

Why is the private sector interested in health care?

Health care is an extremely complex system operated by the province and by service providers. The province specializes in governance and regulation; the service providers specialize in medicine. Neither are experts in privacy or security. Companies such as KI Design are interested in filling the expertise gap within the health care sector by working in tandem with health care providers and the Information & Privacy Commissioner to adapt privacy and security solutions to their working realities. There is undeniable value in having a privacy and security expert working directly with hospitals and other health service providers to help refine privacy best practices and implement a privacy tool that improves privacy and security outcomes without restricting the workflows of health practitioners.

To learn more about how AI solutions improve auditing, visit https://phipa.ca/



eDiscovery and Audits: The Solution to Unauthorized Access


Electronic medical records (EMRs) contain sensitive personal information that is strongly protected in many jurisdictions. In Ontario, EMRs are protected under the Personal Health Information Protection Act (PHIPA), which limits authorized access to professionals who are currently providing healthcare services to the patient or who have otherwise been given consent to access the patient’s records. Despite PHIPA’s definition of and limits to authorized access, many of Ontario’s healthcare organizations (such as hospitals and clinics) operate open-access EMR database systems, meaning all healthcare staff have access to all records. So although healthcare organizations are responsible for protecting patients’ information from unauthorized access and distribution, the risk of unprofessional conduct is not well mitigated.

Unauthorized access to EMRs, colloquially termed snooping, is the access and/or disclosure of private medical records without the patient’s consent. Not all snooping is committed with malicious intent; in several recent cases, offenders have cited curiosity or genuine concern as the reason for the unauthorized access.


Regardless of the intention behind the act, snooping can have severe consequences for both the healthcare professional and the patient, even if the security breach is small in scale.

An offending healthcare professional could face loss of employment, relinquishment of their medical licence, a civil lawsuit, and a maximum fine of $50,000 for violating PHIPA. As for patients, the cases with the most severe consequences are usually smaller, isolated incidents involving social media and the malicious dissemination of private medical records. A 2014 case in Indiana offers such an example: a young woman’s positive STD test results were posted to Facebook by a former high school peer who was working as a nurse at the young woman’s local hospital. Needless to say, the reputational and emotional damage was irreversible.

While these cases are extremely serious, they are still relatively rare amongst healthcare-related lawsuits, compared with malpractice cases, for example. However, the gradual increase in EMR privacy-related lawsuits, together with the understanding that electronic records can be manipulated, has created demand for reliable software tools that efficiently identify, collect, and make accurate EMR data available to attorneys. This includes metadata such as the identities of the healthcare providers who accessed the records and the corresponding timestamps, both of which are very important for litigation. The practice of using these tools to find and freeze relevant data (meaning the data cannot be modified or deleted) is called eDiscovery.
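As a simplified sketch of what such a tool does, consider the Python fragment below; the log format, field names, and the hashing step are assumptions for illustration, not any specific product’s design:

    import hashlib
    import json

    # Hypothetical access-log entries; a real tool would read these from
    # the EMR system's audit tables.
    access_logs = [
        {"patient": "p2002", "provider": "dr_c", "timestamp": "2019-03-01T10:05"},
        {"patient": "p2002", "provider": "clerk_d", "timestamp": "2019-03-02T23:41"},
        {"patient": "p3003", "provider": "dr_c", "timestamp": "2019-03-02T09:00"},
    ]

    def collect_for_litigation(patient_id):
        """Collect all access metadata for one patient and 'freeze' it by
        computing a digest, so that later tampering is detectable."""
        relevant = [log for log in access_logs if log["patient"] == patient_id]
        export = json.dumps(relevant, sort_keys=True).encode("utf-8")
        digest = hashlib.sha256(export).hexdigest()  # tamper-evidence seal
        return relevant, digest

    records, seal = collect_for_litigation("p2002")
    print("Collected", len(records), "entries; integrity seal", seal[:16])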


Essentially, eDiscovery produces an “audit trail” from EMRs that is more accurate and reliable than simply accepting whatever records the healthcare provider’s attorney produces, as was done in the past.

Using technology to sort and extract relevant data lets attorneys and healthcare organizations sift through more data, faster. This is extremely useful given that the average Ontario hospital generates approximately one million EMR access logs per week, or roughly 52 million per year.

However, challenges remain in assessing access logs and determining the purpose and validity of each. Some healthcare organizations use random audits to examine EMRs that have been accessed two or three times (or more) in a week; this method works like a spot check, identifying anomalies and catching offenders off guard. Another approach is to run regular condition-based audits: if a given patient’s EMR access logs meet pre-determined criteria set by the healthcare organization, the logs are flagged for further examination, as the sketch below illustrates. While this approach may seem more methodical and more likely to identify questionable activity, it also tends to generate many false alarms. Both random audits and regular condition-based searches make good use of available data analytics technology; however, both also require a level of human judgement and oversight that is not easily replaced by software.
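To make the condition-based approach concrete, here is a minimal Python sketch. The two criteria shown, a shared surname between employee and patient and an access-frequency threshold, are common examples, but the thresholds and field names are invented for illustration:

    from collections import Counter

    # Hypothetical one-week window of access logs.
    weekly_logs = [
        {"employee": "smith_j", "employee_surname": "Smith",
         "patient": "p4004", "patient_surname": "Smith"},
        {"employee": "lee_k", "employee_surname": "Lee",
         "patient": "p4004", "patient_surname": "Smith"},
        {"employee": "lee_k", "employee_surname": "Lee",
         "patient": "p4004", "patient_surname": "Smith"},
        {"employee": "lee_k", "employee_surname": "Lee",
         "patient": "p4004", "patient_surname": "Smith"},
    ]

    MAX_WEEKLY_ACCESSES = 2  # assumed threshold for this sketch

    def flag_suspicious(logs):
        """Flag accesses meeting pre-set criteria for manual review."""
        counts = Counter((log["employee"], log["patient"]) for log in logs)
        flagged = []
        for log in logs:
            same_name = log["employee_surname"] == log["patient_surname"]
            frequent = counts[(log["employee"], log["patient"])] > MAX_WEEKLY_ACCESSES
            if same_name or frequent:
                flagged.append(log)
        return flagged

    for item in flag_suspicious(weekly_logs):
        print("Review:", item["employee"], "->", item["patient"])

Note that the surname rule would also flag an employee legitimately treating a patient who happens to share their name, which is exactly the kind of false alarm described above.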


So, what can be done to reduce, prevent, and prosecute unauthorized access to EMRs? Ontario’s Information and Privacy Commissioner has identified several ways to minimize the risk of snooping, including a comprehensive training program on privacy responsibilities for healthcare providers, and mandatory confidentiality agreements signed before healthcare providers gain access to medical record databases. These are feasible options, but they are only as strong as each healthcare organization’s resolve to implement them. Implementing advanced analytic technology could offer a positive way forward in better understanding this behaviour and improving the prosecution of offenders.

As software developers create new methods of working with big data and continue to push analytic software and artificial intelligence to the next level, new solutions to complex issues like EMR privacy violations will emerge. The human judgement and oversight still required by the auditing approaches described above can be reduced, if not eliminated, provided the audit solution has sophisticated data mining algorithms and well-integrated process triggers. KI Design’s Audit Solutions, powered by Maize, have superior data mining and filtering abilities that can not only identify suspicious behaviour but also connect an employee’s data access with potential clinical or operational reasons for that access. If an access record is flagged as particularly suspicious, the audit software triggers the risk assessment and incident response processes and provides the relevant information to the healthcare organization’s privacy officer for manual review.

Healthcare organizations and attorneys need not remain dependent on manually sifting through EMRs and their access logs. Audit technology is more advanced and reliable than ever, and it will likely play an important role in improving the eDiscovery process and lead to better outcomes in snooping lawsuits.