eDiscovery and Audits: The Solution to Unauthorized Access
Electronic medical records (EMRs) contain sensitive personal information that is strongly protected in many jurisdictions. In Ontario, the Personal Health Information Protection Act (PHIPA) protects EMRs by limiting authorized access to professionals who are currently providing healthcare services to the patient, or who have otherwise been given consent to access the patient’s records. Despite PHIPA’s definition of and limits on authorized access, many of Ontario’s healthcare organizations (such as hospitals and clinics) operate open-access EMR database systems, meaning all healthcare staff have access to all records. So although healthcare organizations are responsible for protecting patients’ information from unauthorized access and distribution, the risk of unprofessional conduct is not well mitigated.
Unauthorized access to EMRs, colloquially termed snooping, is the access or disclosure of private medical records without the patient’s consent. Not all snooping is committed with malicious intent; in several recent cases, offenders have cited curiosity or genuine concern as the reason for their unauthorized access.
Regardless of the intention behind the act, snooping can have severe consequences for both the healthcare professional and the patient, even if the security breach is small in scale.
An offending healthcare professional could face loss of employment, revocation of their professional licence, a civil lawsuit, and a maximum fine of $50,000 for violating PHIPA regulations. As for the patient, the cases with the most severe consequences are usually smaller, isolated incidents involving social media and the malicious dissemination of private medical records. A 2014 case in Indiana offers such an example: a young woman’s positive STD test results were posted to Facebook by a former high school peer who was working as a nurse at the young woman’s local hospital. Needless to say, the reputational and emotional damage was irreversible.
While these cases are extremely serious, they are still relatively rare amongst healthcare-related lawsuits such as malpractice cases. However, the gradual increase in EMR privacy-related lawsuits, and the understanding that electronic records can be manipulated, have created a demand for reliable software tools that will efficiently identify, collect, and make accurate EMR data available to attorneys. This includes metadata such as the identities of the healthcare providers who accessed the records and the corresponding timestamps, both of which are very important for litigation. The practice of using these tools to find and freeze relevant data (meaning the data cannot be modified or deleted) is called eDiscovery.
Essentially, eDiscovery produces an “audit trail” from EMRs that is more accurate and reliable than simply accepting whatever records the healthcare provider’s attorney produces, as done in the past.
Using technology to sort and extract relevant data provides attorneys and healthcare organizations the advantage of being able to sift through more data faster. This is extremely useful given that the average Ontario hospital will generate approximately one million EMR access logs per week, or approximately 52 million in a given year.
However, challenges remain in assessing access logs and determining the purpose and validity of each. Some healthcare organizations use random audits to examine EMRs that have been accessed two or three times (or more) in a week; this method works similarly to a spot check, identifying anomalies and catching offenders off guard. Another approach is to run regular conditions-based audits: if a given patient’s EMR access logs meet pre-determined criteria that the healthcare organization has set, the logs are flagged for further examination. While this approach may seem more methodical and more likely to identify questionable activity, it also tends to generate many false alarms. Both random audits and regular conditions-based audits make good use of available data analytics technology; however, both also require a level of human judgement and oversight that is not easily replaced by software.
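To make the conditions-based approach concrete, here is a minimal sketch of how such a flagging rule might work. All names, criteria, and data are hypothetical (the staff IDs, the care-team roster, and the "three accesses in seven days by someone outside the care team" rule are illustrative assumptions, not any specific product's logic):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical access-log entries: (staff_id, patient_id, timestamp)
ACCESS_LOGS = [
    ("nurse_01", "patient_A", datetime(2017, 3, 1, 9, 0)),
    ("nurse_01", "patient_A", datetime(2017, 3, 2, 9, 0)),
    ("nurse_01", "patient_A", datetime(2017, 3, 3, 9, 0)),
    ("doctor_02", "patient_B", datetime(2017, 3, 1, 10, 0)),
]

# Hypothetical care-team roster: staff authorized for each patient.
CARE_TEAM = {"patient_A": {"doctor_02"}, "patient_B": {"doctor_02"}}

def flag_suspicious(logs, threshold=3, window=timedelta(days=7)):
    """Flag (staff, patient) pairs where someone outside the patient's
    care team accessed the record `threshold` or more times within a
    rolling `window` -- one example of a pre-set audit criterion."""
    by_pair = defaultdict(list)
    for staff, patient, ts in logs:
        by_pair[(staff, patient)].append(ts)
    flagged = []
    for (staff, patient), times in by_pair.items():
        if staff in CARE_TEAM.get(patient, set()):
            continue  # authorized access; not flagged
        times.sort()
        # Slide a window over the sorted timestamps.
        for i in range(len(times) - threshold + 1):
            if times[i + threshold - 1] - times[i] <= window:
                flagged.append((staff, patient))
                break
    return flagged

print(flag_suspicious(ACCESS_LOGS))  # [('nurse_01', 'patient_A')]
```

A real audit system would of course need richer criteria (role, department, time of day) precisely because a single rule like this one generates the false alarms described above; every flag still ends up in front of a human reviewer.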
So, what can be done to reduce, prevent, and prosecute unauthorized access to EMRs? Ontario’s Information and Privacy Commissioner has identified several methods of minimizing the risk of snooping, including the development of a comprehensive training program on privacy responsibilities for healthcare providers, and mandatory confidentiality agreements that are required prior to healthcare providers gaining access to medical record databases. These are feasible options, but they are only as strong as each healthcare organization’s resolve to implement them. Implementing advanced analytic technology could offer a positive way forward in better understanding this behaviour and improving prosecution against offenders.
As software developers create new methods of working with big data and continue to push analytic software and artificial intelligence to the next level, new solutions to complex issues like EMR privacy violations will emerge. The human judgement and oversight still required by the auditing approaches described above can be reduced, if not eliminated, provided the audit solution has sophisticated data mining algorithms and well-integrated process triggers. KI Design’s Audit Solutions, powered by Maize, have superior data mining and filtering abilities that can not only identify suspicious behaviour but also connect employee data access with potential clinical or operational reasons for accessing the data. If a particular access record is flagged as especially suspicious, the audit software will trigger the risk assessment and incident response processes and provide relevant information to the healthcare organization’s privacy officer for manual review.
It is unnecessary for healthcare organizations and attorneys to remain dependent on manually sifting through EMRs and their access logs. Audit technology is more advanced and reliable than ever, and will likely play an important role in improving the eDiscovery process and leading to better outcomes in snooping lawsuits.
How AI Impacts Privacy and Security Implementation
Big Data analytics is transforming all industries including healthcare-based research and innovation, offering tremendous potential to organizations able to leverage their data assets. However, as a new species of data – massive in volume, velocity, variability, and variety – Big Data also creates the challenge of compliance with federal and provincial privacy laws, and with data protection best practices.
Stemming from internationally recognized Privacy Principles, data protection regulations tend to follow a specific format, focusing particularly on the collection, retention, use, and disclosure of personal information or personal health information, which are permitted only for a specified purpose and with the explicit consent of the individual.
From a conceptual standpoint, the evolution of Big Data brings in a new element of analytical functions: inference. Inference – the extraction of new knowledge from parameters in mathematical models fitted to data – captures commonly known functions such as data linking, predictive analytics, artificial intelligence (AI), and data mining. Inference allows analysts to create new data based on reasoning and extrapolation, which adds a greater dimensionality to the information already in their possession.
From a corporate governance perspective, the addition of inference will impact how an organization complies with the above-mentioned Privacy Principles, and how it meets its legal obligations regarding consent, access, auditing, and reporting.
In terms of privacy practices, inference can be understood as both a collection and a use of data. For organizations to be compliant with data protection principles, inferences gleaned from data analysis must meet the requirements of applicable privacy laws. This means that new data created from the inference process must be collected and used for a specified reason and with the consent of the individual; it must be accurate; the data collector must disclose all collected information on an individual should they request it; and the data’s use, disclosure, and retention must be limited to the specified reason of collection.
If inference is used to generate new data outside the original data’s specified purpose, the collecting organization will not be complying with privacy laws and could become subject to individual complaints and auditing from the Office of the Privacy Commissioner. So while inference can seem like the dawn of a new age in Big Data analytics, it is still restricted by privacy laws, and must be used only within the present data collection and use principles.
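A toy sketch may help show why inference counts as a new collection of data. In this hypothetical example (the survey data, the "likes", and the scoring rule are all invented for illustration), a label the user never disclosed is manufactured from data collected for another purpose:

```python
from collections import Counter

# Hypothetical survey: respondents disclosed one attribute, and we
# observe their page "likes". Inference then creates a *new* data
# point -- a predicted attribute -- for users who never disclosed it.
SURVEY = [
    ({"running", "salads"}, "non-smoker"),
    ({"running", "yoga"}, "non-smoker"),
    ({"bars", "poker"}, "smoker"),
    ({"bars", "salads"}, "smoker"),
]

def infer_attribute(likes):
    """Score each label by how often the user's likes co-occurred
    with it in the survey data; return the best-scoring label."""
    scores = Counter()
    for observed_likes, label in SURVEY:
        scores[label] += len(likes & observed_likes)
    return scores.most_common(1)[0][0]

# This user never said whether they smoke -- the label is inferred data,
# and under the principles above its creation needs its own legal basis.
print(infer_attribute({"running", "salads"}))  # non-smoker
```

Even a crude extrapolation like this produces information the individual never consented to provide, which is exactly why inferred data must be held to the same purpose and consent requirements as directly collected data.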
Implementing PPM Successfully
The web is full of articles on PPM tools and how difficult a PPM implementation can be. That said, it is well established that companies using PPM tools generally have a better vision of their future markets and of how to manage their investments.
PPM tools are good for business
If your organization is in the market for a PPM tool, you are already looking at the latest industry research reports; at the same time, you may be searching the web.
You may be asking one of two questions (or both):
Q1: “How do I avoid failure?” This is a good question if your approach is risk mitigation.
Our response: implementing a PPM solution is like starting a new business, in that you should have a good idea of how your business will fit into the market, especially at the start-up phase. Similarly, your organization should have a clear sense of how a PPM solution will fit into its future.
Q2: “I am transforming my organization to improve decision-making and I need a technology to fit this process – how can PPM tools help?” This is a better question in my mind.
There is a wide range of PPM solutions and each will fit each company differently, so it is important for companies to approach PPM implementation with an open mind. Having gone through several implementations, some successful and some not, here are some rules of thumb:
- Tier 1 Project and Portfolio Management (PPM) solutions do not fail, but their implementations can.
- Organizations that think of PPM as a business transformation technology do better than those who treat it simply as a singular tool.
- Organizations that balance their focus between tool sophistication and adoption are the most successful.
- Organizations that have in-house PPM business champions do better than those who lack business champions and rely on external SMEs specializing in select tools.
While a Facebook page leads people to believe that their account has been deleted, in actual fact the information is never deleted. What’s their excuse? Someone else may have liked your picture or article.
From Facebook’s pages:
If you don’t think you’ll use Facebook again, you can request to have your account permanently deleted. Please keep in mind that you won’t be able to reactivate your account or retrieve anything you’ve added. Before you do this, you may want to download a copy of your info from Facebook. Then, if you’d like your account permanently deleted with no option for recovery, log into your account and let us know.
When you delete your account, people won’t be able to see it on Facebook. It may take up to 90 days from the beginning of the deletion process to delete all of the things you’ve posted, like your photos, status updates or other data stored in backup systems. While we are deleting this information, it is inaccessible to other people using Facebook.
Some of the things you do on Facebook aren’t stored in your account. For example, a friend may still have messages from you even after you delete your account. That information remains after you delete your account.
Doctors Without Borders (MSF) is an organization I believe in, and they have an opening for a Medical Liaison – Medical Team Manager.
POSITION OBJECTIVE: The Medical Liaison – Medical Team Manager establishes a referent point for work that requires the support, input, and direction of a doctor who holds significant and recognized medical and operational experience with MSF.
BACKGROUND: The MSF Canada Program Unit (PU) exists to support MSF’s social mission by empowering and serving field needs. The PU provides operational support to partners through Learning Services (consulting, production) and Telemedicine (expertise, implementation, and adoption).
The nature of MSF interventions has translated into a demand to test and implement TM solutions to bring additional clinical support, education and mentoring to the MSF missions.
Under established governance, MSF Canada executes the implementation (piloting, testing, coordinating, administrating), development and maintenance of TM solutions (Real Time and Non-Real Time). The approach is to adapt the services to the specific needs, desires and skill-sets of MSF stakeholders. The Telemedicine Program also includes professionalizing the MSF Store and Forward (S&F) platform and solidifying MSF Canada’s position as a movement referent for Telemedicine.
Telemedicine Store and Forward services continue to develop: S&F is about to reach 7,000 patients, with about 2,500 cases posted in 2017. We continue to adapt our organizational structure to meet the growing needs of MSF’s Telemedicine services.
Any questions or concerns regarding accommodation and accessibility can be addressed to firstname.lastname@example.org or Kathy Mahinpou at 1-800-982-7903 ext. 3454
By Aydin Farrokhi and Dr. Wael Hassan
Organizations are using big data to help consumers. The question is what steps organizations can take to avoid inadvertently harming consumers. In the event of legal action, an organization’s preparedness is a determining factor in how well the case will move forward.
In court matters, electronically stored information is generally subject to the same preservation and discovery requirements as other relevant information. Preserving evidence or fulfilling a court production order for social media records raises technical challenges and evidentiary issues. As described by the Sedona report:
“social media data is often hosted remotely with third parties, is dynamic and collaborative by nature, may include several different data types, and is often accessed through unique interfaces.”
In addition, social media sites may terminate accounts or delete account contents, mobile applications can shut down, and the mobile device itself can impose restrictions.
For these reasons, early engagement in an agreement for the preservation and collection of social media data is beneficial when preparing a defence in a legal action.
Social media analytics is a relatively new topic and may not always fit within existing laws and regulations. There is no universal agreement on how to regulate it, and many countries are still at the stage of developing laws that protect their citizens’ private information from unethical use or misuse. Moreover, current data protection legislation is not always comprehensive, and sometimes different laws do not work well together. In other cases, these regulations were not designed with social media analytics in mind.
Big data will continue to grow in use, and will benefit growing areas such as education, health, social services, employment, finance, forensic investigation, and many more. While big data analytics continues to provide benefits and opportunities to consumers, protecting consumers’ privacy has become a growing challenge. The privacy commissioner of each jurisdiction continues to monitor areas where big data practices could violate existing laws, and brings enforcement actions where appropriate. In addition, raising awareness about the negative impact of big data practices on low-income and underserved populations remains a priority for the commissions.
Since big data analytics can have significant consequences, organizations and legislators should work together to minimize the risks it presents. Therefore, organizations and companies that are already using, or are considering engaging in, big data analytics should:
- Consider if their data sets are missing information from particular populations and if so, take necessary steps to address the problem
- Review their data sets and algorithms for hidden biases that may have unintended negative impact on certain groups
- Remember that correlations discovered in big data must be used with the risks of relying on those results in mind
- Consider fairness and ethical use of big data
It may be worthwhile to have human supervision of data and algorithms when using big data tools to make important decisions.
By Aydin Farrokhi and Dr. Wael Hassan
The following are real examples of areas in which big data has helped to improve organizational processes. In education, some institutions have used big data to identify student candidates for advanced classes. In finance, big data has been used to provide access to credit through non-traditional methods; for example, LexisNexis created an alternative credit scoring system (RiskView) which provides alternative ways to score creditworthiness. In healthcare, tailored medicine is a new approach to disease treatment and prevention based on an individual’s environment and lifestyle. In human resources, Google is using big data to help promote a more diverse workforce.
All that said, a concern arises that certain groups of people will be categorized and excluded through the use of big data. In some cases, customers’ credit limits have been lowered not because of their payment history but because of where they had shopped.
Also of concern is the exposure of people’s sensitive data. The results of a study that combined data on Facebook “Likes” with limited survey information were staggering. The researchers were able to accurately predict:
- Male users’ sexual orientation – 88% of the time
- Users’ ethnic origin – 95% of the time
- Users’ religion (Christian or Muslim) – 82% of the time
- Democrat or Republican affiliation – 85% of the time
- Use of alcohol, drugs, or cigarettes – 65% to 75% of the time
Big data may even increase the incidence of fraud. Fraudsters can target vulnerable consumers and offer disingenuous services or goods for scamming purposes. Big data analytics allows organizations (or fraudsters) to more easily and accurately identify people who are drawn to sweepstakes offers or who are otherwise vulnerable prospects.
Other malicious uses could occur when companies offer consumers quotes, or infer misleading conclusions, based on a like-minded, preselected group of people that big data has provided them.
By Aydin Farrokhi and Dr. Wael Hassan
Today, the public has remarkable power and reach by which they can share their news and express their opinions about any product or service, or react to an existing state of affairs, especially regarding social or political issues. For example, in marketing, consumer voices can have an enormous influence in shaping the opinions of other consumers. Similarly, in politics, public opinion can influence loyalties, decisions, and advocacy.
While increasingly organizations are adopting and embracing social media, the motive for each establishment to use social media varies. Some of the key drivers for adopting social media include:
- Market research and new product
- Need for better consumer
- Need to gain competitive
- Need to improve customer
- Need to develop new products and
- Need to increase Return on Marketing Investment (ROMI)
- Top strategic actions to maximize social media spend:
- Improve ability to respond to customer’s wants and needs
- Build social media measurement into marketing campaigns and brand promotions
- Maximize marketing campaign and effectiveness
- Align social media monitoring capabilities to overall business objectives
- Public opinion research and new motto
- Need for better public
- Need to gain competitive
- Need to improve public
- Need to develop new
- Need to increase Return on Campaigning Investment (ROCI)
- Top strategic actions to maximize social media spend:
- Improve ability to respond to the public’s wants and needs
- Build social media measurement into political campaign and publicity promotions
- Maximize political campaign and effectiveness
- Align social media monitoring capabilities to overall political agenda
In general, there are three major categories of methods for analyzing social media data. These analytical tools can be grouped as Content Analysis tools, Group and Network Analysis tools, or Prediction tools.
By Aydin Farrokhi and Dr. Wael Hassan
In Canada, data protection is regulated by both federal and provincial legislation. Organizations and other companies that capture and store personal information are subject to several laws. The federal Personal Information Protection and Electronic Documents Act (PIPEDA), fully in force since 2004, applies to personal information handled in the course of commercial activities. PIPEDA requires organizations to obtain consent from the individuals whose data is being collected, used, or disclosed to third parties. By definition, personal data includes any information that can be used to identify an individual, other than information that is publicly available. Personal information can only be used for the purpose for which it was collected, and individuals have the right to access their personal information held by an organization.
Amendments to PIPEDA
PIPEDA’s compliance and enforcement provisions may not be strong enough to address the privacy aspects of big data. The Digital Privacy Act (also known as Bill S-4) received Royal Assent and is now law. Once fully in force, it will allow the Privacy Commissioner to bring a motion against a violating company, with fines of up to $100,000.
The Digital Privacy Act amends and expands PIPEDA in several respects:
- The definition of “consent” is updated: the DPA adds to PIPEDA’s consent and knowledge requirement a reasonable expectation that the individual understands what they are consenting to – that is, the nature, purpose, and consequences of the collection, use, or disclosure of their personal data. Children and vulnerable individuals have specific considerations. There are some exceptions to this rule: managing employees, fraud investigations, and certain business transactions, to name a few.
- Breach reporting to the Commissioner is mandatory (not yet in force)
- Timely breach notifications must be sent to the impacted individuals: the mandatory notification must explain the significance of the breach and what can be done, or has been done, to lessen the risk of harm.
- Breach record keeping is mandated: records must be kept of all breaches affecting personal information, whether or not there was a real risk of significant harm. These records may be requested by the Commissioner, required in discovery by a litigant, or requested by an insurance company to assess premiums for cyber coverage.
- Failure to report a breach to the Commissioner or the impacted individuals may result in significant fines.
Cross-Border Transfer of Big Data
The federal Privacy Commissioner’s position on personal information transferred to a foreign third party is that the transferred information is subject to the laws and regulations of the foreign country, and no contract can override those laws. Consent is not required for transferring personal data to a foreign third party. However, depending on the sensitivity of the personal data, affected individuals may need to be notified that their information may be stored or accessed outside of Canada, and of the potential impact this may have on their privacy rights.
Personal Information – Ontario Privacy Legislation
The Freedom of Information and Protection of Privacy Act, the Municipal Freedom of Information and Protection of Privacy Act, and the Personal Health Information Protection Act are the three major statutes that organizations such as government ministries, municipalities, police services, health care providers, and school boards must comply with when collecting, using, and disclosing personal information. The Office of the Information and Privacy Commissioner of Ontario (IPC) is responsible for monitoring and enforcing these acts.
In big data projects, the IPC works closely with government institutions to ensure compliance with the laws. In such projects, information collected for one reason may be used together with information acquired for another reason. If not properly managed, big data projects may be contrary to Ontario’s privacy laws.
Mark your calendars: on October 1, 2017, mandatory breach reporting requirements kick in.
THERE ARE 7 SITUATIONS WHERE YOU MUST NOTIFY THE ONTARIO PRIVACY COMMISSIONER OF A PRIVACY BREACH
- Use or disclosure without authority: looking at a family member’s, a celebrity’s, or a politician’s records out of curiosity or with malicious intent. Limited exceptions: accessing a record by mistake, or mailing a letter to the wrong address.
- Stolen Information: laptop, tablet, or paper theft or loss, as well as exposure to malware or ransomware.
- Extended Use or Disclosure: Following a reported breach, a sales company used records to market its products or services.
- Pattern or Similar Breaches: Letters are being sent to the wrong address, employees are repeatedly accessing a patient’s record.
- Disciplinary action against a college member: a college member resigns, is suspended, or has their licence revoked following, or in connection with, a breach.
- Disciplinary action against a non-college member: resignation, suspension, or firing of an employee following or during a breach.
- Significant Breach: the information is sensitive, the volume is large, a large number of individuals is affected, and/or more than one custodian or agent is involved.
Custodians will be required to start tracking privacy breach statistics as of January 1, 2018, and will be required to provide the Commissioner with an annual report of the previous calendar year’s statistics, starting in March 2019.