Smart Privacy Auditing – An Ontario Healthcare Case Study

IPCS Smart Privacy Auditing Seminar

On September 13, Dr. Waël Hassan was a panelist at the Innovation Procurement Case Study Seminar on Smart Privacy Auditing, hosted by the Mackenzie Innovation Institute (Mi2) and the Ontario Centres of Excellence (OCE). The seminar attracted leaders from the health care sector, the private information technology industry, and privacy authorities. It explored the concept of innovative procurement through competitive dialogue, and demonstrated the power and benefits of using artificial intelligence to automate the auditing of all PHI accesses within a given hospital or health network.

What are the benefits of participating in an innovative procurement process, particularly competitive dialogue?

An innovative procurement partnership between Mi2, Mackenzie Health, Michael Garron Hospital, and Markham Stouffville Hospital was supported by the OCE’s REACH grant and sought to identify an innovative approach to auditing that could be applicable to the privacy challenges faced by numerous hospitals with different practices, policies, and information systems. Rather than focus on how the solution should operate, the partners collaboratively identified six outcome-based specifications the procured audit tool would be required to meet.

By identifying key priorities and specifying the outcomes a solution should achieve, Competitive Dialogue establishes a clear and mutual understanding of expectations. This can help the private sector narrow down solution options to a model best-suited for the contracting authority’s unique context. The feedback loop provided by the iterative rounds (if used) enables vendors to clarify any confusion and customize proposals to the contracting authority’s unique needs, staff workflows, and policy contexts.

Competitive Dialogue is an opportunity for transparent communication that gives vendors the chance to learn in detail what the contracting authority, in this case Mackenzie Health, needs from a solution. Because hospitals are not technology or security experts, they often struggle to accurately identify and define the solutions they need for a particular issue, so a traditional procurement process, with little to no room for clarification or feedback, is rarely ideal. Competitive Dialogue is more flexible than the traditional process and therefore allows for more creativity and innovative thinking during initial proposal development. Encouraging creativity, and creating a competitive environment in which vendors can bounce ideas off each other, results in higher-quality proposals and final solutions.

Mackenzie Health Case Study

Mackenzie Health employs over 450 physicians and 2,600 other staff members, processes nearly 55,000 patient medical record accesses every day, and has just one privacy officer to monitor everything. Mackenzie Health’s privacy needs far outweigh its capacity, so they turned to the private sector for an innovative solution.

Section 37(1) of PHIPA outlines the permitted uses of personal health information, and these guidelines are based on the purpose underlying the activities. Because the legal framework is centred on purpose, KI Design’s approach is to explain the purpose for accessing a given medical record. The core of this technology is an explanation-based auditing system (EBAS), designed and patented by Dr. Fabbri of Maize Analytics.

To detect unauthorized accesses, the technology identifies an intelligible connection between the patient and the employee accessing the patient’s records. AI changes the fundamental question underlying auditing tools from “who is accessing patient records without authorization?” to “for what purpose are hospital staff accessing patient records?” Asking this question helps the technology break down staff workflows and identify common and unique purposes for accessing any given medical record. Each access is then categorized as either authorized or unexplained, and unexplained accesses may be flagged as potentially unauthorized behaviour. The technology filters out the authorized accesses, usually 98% to 99% of all accesses, so that the privacy officer can focus on the much smaller number of unexplained and flagged accesses.
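As a rough illustration of the idea only, and not Maize’s patented implementation, the sketch below shows how an explanation-based triage step could work in principle. The data structures and the two example explanations (an appointment near the access time, membership in the patient’s care team) are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Access:
    staff_id: str
    patient_id: str
    timestamp: datetime

def explain_by_appointment(access, appointments):
    """Return a reason if the staff member had an appointment with the patient
    close to the access time (hypothetical two-day window)."""
    for appt in appointments.get((access.staff_id, access.patient_id), []):
        if abs(access.timestamp - appt) <= timedelta(days=2):
            return "appointment within 2 days of access"
    return None

def explain_by_care_team(access, care_teams):
    """Return a reason if the staff member is on the patient's care team."""
    if access.staff_id in care_teams.get(access.patient_id, set()):
        return "member of patient's care team"
    return None

def triage(accesses, appointments, care_teams):
    """Split accesses into explained (authorized) and unexplained (for review)."""
    explained, unexplained = [], []
    for a in accesses:
        reason = explain_by_appointment(a, appointments) or explain_by_care_team(a, care_teams)
        if reason:
            explained.append((a, reason))
        else:
            unexplained.append(a)
    return explained, unexplained
```

In practice, a production system would draw on many more explanation types and on the hospital’s own scheduling, admission, and staffing data; the point of the sketch is simply that every access is either given a purpose or set aside for review.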

Why is the private sector interested in health care?

Health care is an extremely complex system operated by the province and service providers. The province specializes in governance and regulation, and service providers specialize in medicine; neither is an expert in privacy or security. Companies such as KI Design are interested in filling the expertise gap within the health care sector by working closely with health care providers and the Information & Privacy Commissioner to adapt privacy and security solutions to their working realities. There is undeniable value in having a privacy and security expert working directly with hospitals and other health service providers to help refine privacy best practices and implement a privacy tool that improves privacy and security outcomes without restricting the workflows of health practitioners.

To learn more about how AI solutions improve auditing, visit https://phipa.ca/

eDiscovery and Audits: The Solution to Unauthorized Access

Electronic medical records (EMRs) contain sensitive personal information that is strongly protected in many jurisdictions. In Ontario, EMRs are protected under the Personal Health Information Protection Act (PHIPA), which limits authorized access to professionals who are currently providing healthcare services to the patient or who have otherwise been given consent to access the patient’s records. Despite PHIPA’s definition of and limits to authorized access, many of Ontario’s healthcare organizations (such as hospitals and clinics) operate open-access EMR database systems, meaning all healthcare staff have access to all records. Despite the responsibility of healthcare organizations to protect patients’ information from unauthorized access and distribution, the risk of unprofessional conduct is not well mitigated.

Unauthorized access to EMRs, colloquially termed snooping, constitutes the access and/or disclosure of private medical records without consent from the patient. Not all snooping is committed with malicious intent; in several cases in recent years, offenders have cited curiosity or genuine concern as the reason for their unauthorized access.

Regardless of the intention behind the act, snooping can have severe consequences for both the healthcare professional and the patient, even if the security breach is small in scale.

An offending healthcare professional could face loss of employment, relinquishment of their medical licence, a civil lawsuit, and a maximum fine of $50,000 for violating PHIPA regulations. As for the patient, the cases with the most severe consequences are usually smaller, isolated incidents involving social media and the malicious dissemination of private medical records. A 2014 case in Indiana offers such an example: a young woman’s positive STD test results were posted to Facebook by a former high school peer who was working as a nurse at the young woman’s local hospital. Needless to say, the reputational and emotional damage was irreversible.

While these cases are extremely serious, they are still relatively rare amongst healthcare-related lawsuits such as malpractice cases. However, the gradual increase in EMR privacy-related lawsuits and the understanding that electronic records can be manipulated have created a demand for reliable software tools that will efficiently identify, collect, and make accurate EMR data available to attorneys. This includes metadata such as which healthcare providers accessed the records and the corresponding timestamps, both of which are very important for litigation. The practice of using these tools to find and freeze relevant data (meaning the data cannot be modified or deleted) is called eDiscovery.

Essentially, eDiscovery produces an “audit trail” from EMRs that is more accurate and reliable than simply accepting whatever records the healthcare provider’s attorney produces, as done in the past.
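As a purely illustrative sketch of what assembling such an audit trail might look like, the snippet below collects a single patient’s access events from raw logs and writes a chronological export. The column names are assumptions, and the hash is simply one way of detecting later modification of the export, standing in for a formal legal hold:

```python
import csv
import hashlib

def export_audit_trail(access_log_rows, patient_id, out_path):
    """Collect every access event for one patient, write a chronological export,
    and return a SHA-256 digest so any later modification is detectable.

    access_log_rows: iterable of dicts with hypothetical keys
    'patient_id', 'staff_id', 'action', and 'timestamp'.
    """
    relevant = [r for r in access_log_rows if r["patient_id"] == patient_id]
    relevant.sort(key=lambda r: r["timestamp"])

    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(
            f,
            fieldnames=["timestamp", "staff_id", "action", "patient_id"],
            extrasaction="ignore",
        )
        writer.writeheader()
        writer.writerows(relevant)

    with open(out_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()
```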

Using technology to sort and extract relevant data gives attorneys and healthcare organizations the ability to sift through more data, faster. This is extremely useful given that the average Ontario hospital generates approximately one million EMR access logs per week, or approximately 52 million in a given year.

However, challenges remain in assessing access logs and determining the purpose and validity of each. Some healthcare organizations use random audits to examine EMRs that have been accessed two or three times (or more) in a week; this method works like a spot check, identifying anomalies and catching offenders off guard. Another approach is running regular condition-based audits: if a given patient’s EMR access logs meet pre-determined criteria that the healthcare organization has set, the access logs are flagged for further examination. While this approach may seem more methodical and more likely to identify questionable activity, it also tends to generate many false alarms. Both random audits and regular condition-based searches make good use of available data analytics technology; however, both also require a level of human judgement and oversight that is not easily replaced by software.
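For illustration, a condition-based audit pass over one week of access logs could be expressed as a set of simple rules like the ones below. The two rules and the field names are hypothetical examples of criteria an organization might set, not criteria any particular hospital uses:

```python
from collections import Counter

def rule_repeated_access(access, weekly_counts, threshold=3):
    """Flag a record the same staff member accessed `threshold` or more times this week."""
    return weekly_counts[(access["staff_id"], access["patient_id"])] >= threshold

def rule_same_surname(access, staff_surnames, patient_surnames):
    """Flag accesses where staff and patient share a surname (possible family snooping)."""
    s = staff_surnames.get(access["staff_id"])
    p = patient_surnames.get(access["patient_id"])
    return s is not None and s == p

def run_condition_based_audit(week_of_accesses, staff_surnames, patient_surnames):
    """Return the accesses meeting any pre-determined criterion, for manual review."""
    counts = Counter((a["staff_id"], a["patient_id"]) for a in week_of_accesses)
    flagged = []
    for a in week_of_accesses:
        if rule_repeated_access(a, counts) or rule_same_surname(a, staff_surnames, patient_surnames):
            flagged.append(a)
    return flagged
```

A rule set like this is easy to run on a schedule, but, as noted above, every flag it raises still lands on a human reviewer’s desk, which is where the false alarms become costly.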


So, what can be done to reduce, prevent, and prosecute unauthorized access to EMRs? Ontario’s Information and Privacy Commissioner has identified several methods of minimizing the risk of snooping, including a comprehensive training program on privacy responsibilities for healthcare providers, and mandatory confidentiality agreements signed before healthcare providers gain access to medical record databases. These are feasible options, but they are only as strong as each healthcare organization’s resolve to implement them. Implementing advanced analytic technology could offer a positive way forward in better understanding this behaviour and improving prosecution of offenders.


As software developers create new methods of working with big data and continue to push analytic software and artificial intelligence to the next level, new solutions to complex issues like EMR privacy violations will emerge. The human judgement and oversight still required by the auditing approaches described above can be reduced, if not eliminated, provided the audit solution has sophisticated data mining algorithms and well-integrated process triggers. KI Design’s Audit Solutions, powered by Maize, have superior data mining and filtering abilities that can not only identify suspicious behaviour but also connect employee data access with potential clinical or operational reasons for accessing the data. If an access record is flagged as especially suspicious, the audit software will trigger the risk assessment and incident response processes and provide relevant information to the healthcare organization’s privacy officer for manual review.
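As a sketch of this kind of hand-off, and not the product’s actual interface, flagged accesses might be routed along the following lines; the risk threshold and the two callbacks into the organization’s workflow tools are assumptions:

```python
def handle_flagged_access(access, risk_score, open_incident, notify_privacy_officer,
                          risk_threshold=0.8):
    """Route a flagged access into risk assessment and, above an (arbitrary)
    threshold, into the incident response process.

    `open_incident` and `notify_privacy_officer` stand in for callbacks into the
    organization's existing incident-management and review workflows.
    """
    if risk_score >= risk_threshold:
        ticket = open_incident(access, risk_score)      # start incident response
        notify_privacy_officer(access, ticket=ticket)   # escalate for manual review
    else:
        notify_privacy_officer(access, ticket=None)     # queue for routine review
```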

Healthcare organizations and attorneys no longer need to depend on manually sifting through EMRs and their access logs. Audit technology is more advanced and reliable than ever, and will likely play an important role in improving the eDiscovery process and leading to better outcomes in snooping lawsuits.