Categories: Innovation

Using AI to Combat AI-Generated Disinformation

Can AI impact election outcomes? How can this be combatted?


Categories: social

AI – Q&A Series – Aviation – Mobility

Here are my unfiltered answers to questions surrounding AI.

Beyond the Jargon – What problems can AI solve for me?

AI can solve problems where a judgement call is needed. In essence, it is suited to contexts where an opinion or an attribute needs to be determined based on patterns. A prime example is sentiment analysis. If an airport wants to learn what people think of its services, it can train an AI engine to detect sentiment. Here is an example of how AI can compare sentiment between two airports, Vancouver International Airport @yvrairport and Toronto Pearson International Airport @TorontoPearson.

Toronto And Vancouver Airport Sentiment Comparison 24 hours 29-4-2019
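The kind of comparison shown above can be sketched with a toy lexicon-based scorer. This is a minimal Python illustration, assuming hypothetical sample posts and a hand-picked word list; a real engine would use a trained model rather than this word list:

```python
from statistics import mean

# Toy sentiment lexicon; a production system would use a trained model
# or a full research lexicon instead of a hand-picked word list.
POSITIVE = {"great", "fast", "friendly", "clean", "love"}
NEGATIVE = {"delayed", "slow", "rude", "crowded", "lost"}

def score(text: str) -> int:
    """Score a post: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Hypothetical sample posts mentioning each airport handle.
posts = {
    "@yvrairport": ["great staff and clean terminal", "security was slow"],
    "@TorontoPearson": ["flight delayed again", "rude agent and crowded gates"],
}

def compare(posts_by_airport):
    """Average sentiment score per airport handle."""
    return {handle: mean(score(p) for p in texts)
            for handle, texts in posts_by_airport.items()}
```

With the hypothetical posts above, the averages separate the two airports cleanly; over a real 24-hour window the same aggregation would run on thousands of posts.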

Are there any successful AI implementation Models?

In terms of models, AI solutions are not cookie-cutter. A model has to be built on a particular project's premises. There may be some lessons learned, but models are generally not reusable across projects.

What are a few steps that can ensure success implementing AI?

It's always important to think of the need or the opportunity that we are trying to address.

It's important to think of AI as a means to an end, not the other way around.

What is an Applicable AI Scenario?

Scenario: People needing to commute before flying are generally unhappy. How can we make their journey or experience better?

A) Quantitative data analysis will identify the top 10 challenges statistically. For example, unavailability of parking spaces could contribute to negative sentiment.

Using quantitative analysis, a scientist can extract the top 10 reasons for negative sentiment.

B) Qualitative analysis is also needed. In this step, a human can train the AI engine to categorize the challenges travelers meet. The AI can then sift through data en masse and provide more refined statistics.

Using qualitative analysis, a team of analysts can understand and break down user stories into patterns. Once the patterns are defined, the AI can quickly categorize all data into them.
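The two steps above can be sketched together in a few lines of Python. In this minimal illustration the comments, categories, and keywords are all hypothetical; the keyword patterns stand in for what a trained AI engine would learn from the analysts' categorization:

```python
from collections import Counter

# Hypothetical traveler comments (the raw data).
comments = [
    "no parking spaces left at terminal 1",
    "parking lot was full again",
    "security line took over an hour",
    "could not find parking near departures",
    "baggage took forever to arrive",
]

# Step B (qualitative): analyst-defined patterns the engine is trained on.
PATTERNS = {
    "parking": ["parking"],
    "security wait": ["security", "line"],
    "baggage": ["baggage", "luggage"],
}

def categorize(comment: str) -> str:
    """Assign a comment to the first matching analyst-defined category."""
    text = comment.lower()
    for category, keywords in PATTERNS.items():
        if any(k in text for k in keywords):
            return category
    return "other"

# Step A (quantitative): rank the drivers of negative sentiment by frequency.
top_reasons = Counter(categorize(c) for c in comments).most_common()
```

On the sample data, parking surfaces as the top reason; at scale, the same counting over millions of categorized comments yields the "top 10 challenges" statistically.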

Can AI be applied to all industries?

Indeed. Here is another example of how AI has been used to detect cell phone user sentiment in Canada among @bell, @rogers, and @telus.


Why is there so much confusion about what AI is?

Most people explaining AI may lack the technical and scientific rigor to do so. I am told that companies with multi-billion-dollar budgets are failing at the basics.


Future of Data

Infographic representing key issues concerning the future of data, broken down by country.

Link to source https://waelhassan.com/wp-content/uploads/2019/01/Future-Value-of-Data-World-Map-Infographic-2018-002.pdf


Categories: Privacy

Urban Data Responsibility – The Battle for Toronto

The initial excitement over Alphabet's smart city may be dwindling because of the perception that the tech giant will use the new Harbourfront development to collect personal data. The special attention interest groups are giving to a project that has actually engaged the public and shown good faith may be teaching companies the wrong lesson: don't engage the public, and no one will care.

For several years, Turnstyle, now Yelp Wifi, has captured, linked, and shared confidential consumer data with no public engagement and no protest from advocates.

By protesting against companies that are engaging the public, interest groups may be doing privacy a disservice.

The project, run by Sidewalk Labs, is set to be an ambitious feat that combines innovation with sustainability to build a city that is "smart": technology that responds to users to create a highly efficient and responsive landscape. On the one hand, the public is excited at the opportunity to live in a highly efficient neighborhood built around sustainability and innovation; on the other, advocates have alarmed the public with claims about the project's data collection and sharing. The graph below illustrates how feelings of excitement are progressively being overtaken by feelings of fear.

The real question we should be asking is whether the data being collected is much different from the surveillance we interact with daily. Traffic cameras, the low-tech precursor to Alphabet's sensors, already track our movements. Our Presto cards, now increasingly necessary to use public transportation, store our travel data and can reveal where we live, where we work, and who we travel with. Yelp Wifi is a known data predator, which indiscriminately and without consent tracks Torontonians' entry into and exit from 600+ local businesses. We sign onto unsecured servers to gain access to Wi-Fi at a café or shopping mall, and most of us already give access to more information than we realize via the cellphones we carry and use to share personal information at all times. Yelp's retention policy is effectively indefinite, and opting out of its services is not accessible even to the tech-savvy.

Here is an excerpt of Yelp Wifi's data retention policy:

We (Yelp) retain all Personally Identifiable Information and Non-Personally Identifiable Information until the date you first access the Services or the time at which you instruct us in accordance with the terms hereof to remove your Personally Identifiable Information or Non-Personally Identifiable Information.

I have been following Yelp Wifi's record on privacy since it was a startup on King West called Turnstyle. Its CEO was quoted as saying, "I want to desensitize people's need for privacy." The company's record on privacy has been disappointing. Reviewing its policies over the years, I found that:

Yelp Wifi's retention policy is confusing and inaccessible; it violates the reasonable expectation of privacy.

In my opinion, and despite all the noise, Sidewalk Labs’ proposal is reasonable. Their Digital Governance Proposal has principles that demonstrate good faith. Meanwhile, advocates are pushing for anonymization, a technique that allows the removal of personal identity from any sensor data.

In this discussion, some argue that the issue is not that Sidewalk Labs will collect data, it is that a corporation is cementing itself—literally—in the place of local government. What access will it have and who will it share our information with?

Like any good government, in order for citizens to have a voice and prevent any giant from taking over—political or corporate—there needs to be checks and balances in place to ensure compliance. In reviewing their proposal, I found that:

The Sidewalk proposal is reasonable; however, it is missing an important tenet of data protection: audit.

Much like any public corporation, which opens its financial documents to a third party for a financial audit, Sidewalk Labs should provide for a technical audit by a third-party assessor; its proposal does not.

I also find the counterpoints made by advocates to be lacking. The argument that anonymization offers protection in big data is misplaced:

Anonymization may be moot because data will be released to companies that have other sources to blend it with.
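A minimal Python sketch of that blending risk, using entirely hypothetical records: a dataset stripped of names can still be re-identified by joining it with an auxiliary source on quasi-identifiers such as postal code and birth year:

```python
# Hypothetical "anonymized" sensor records: names removed,
# but quasi-identifiers (postal code, birth year) retained.
anonymized = [
    {"postal_code": "M5V 1J1", "birth_year": 1985, "trips_per_week": 9},
    {"postal_code": "M4C 2B4", "birth_year": 1972, "trips_per_week": 3},
]

# A second source the data holder can blend in (e.g., a loyalty-card list).
auxiliary = [
    {"name": "A. Smith", "postal_code": "M5V 1J1", "birth_year": 1985},
    {"name": "B. Jones", "postal_code": "M4C 2B4", "birth_year": 1972},
]

def reidentify(anon_rows, aux_rows):
    """Join the two sources on (postal_code, birth_year) quasi-identifiers."""
    index = {(r["postal_code"], r["birth_year"]): r["name"] for r in aux_rows}
    matched = []
    for row in anon_rows:
        key = (row["postal_code"], row["birth_year"])
        if key in index:
            # The "anonymous" behavioral record now carries a name again.
            matched.append({**row, "name": index[key]})
    return matched
```

The fewer people who share a given quasi-identifier combination, the more certain the match, which is why anonymization alone is weak protection once data is released to parties holding other sources.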

The current negotiations are important, but unless we understand how they will be enforced and regulated over time, they remain policy when action is needed. This is new territory from a legal, political, and business standpoint, and the truth is that Canadians do not have robust protections in place to safeguard them from privacy exploitation. As the law unfortunately lags behind, we must be proactive in how we build our security governance. Privacy audit companies have long been in the business of protecting our data: they ensure information is being stored and shared responsibly and in the way it's intended.

As we continue to debate the Harbourfront project, we must resist falling back on tropes of progress versus preservation of the norm. First, we must realize that our current norms most likely share more of our data than we would like. Then, we must understand that change is inevitable, but we have a chance to be part of that change and direct its course. Privacy auditing gives us the opportunity to consistently ensure that our data is being used in the ways we intend.

Now is not the time to step away from negotiations, particularly with a company that is welcoming feedback. How the project is developed and instituted will set a precedent and influence not only the Harbourfront area but also what we can expect from corporate governance and the future of privacy laws. It is in our utmost interest to engage as extensively as we can to ensure an outcome that keeps its promise of innovation and sustainability.

As a private citizen, I welcome businesses that are open to listening and that engage the public in expressing its opinions. The effort by Sidewalk Toronto and its partners is a work in progress that will need more attention and third-party attestation.

Stay tuned for our upcoming pieces that continue to inform on Privacy by Design in the Big Data environment. 


Categories: Privacy

Are Malls “Grasping at Straws”?

Cadillac Fairview is tracking the public using facial recognition technology!

The news that the privacy commissioners of Canada and Alberta are launching an investigation into facial recognition technology used at Cadillac Fairview did not come as a surprise to many. The investigation was initiated by Commissioner Daniel Therrien in the wake of numerous media reports raising questions and concerns about whether the company is collecting and using personal information without consent.

@CadFairview has stated that it is using the technology to monitor traffic, as well as the age and gender of shoppers. The company contends it is not capturing images of individuals.

Cadillac Fairview has now suspended its use of facial recognition software while the investigation is underway.

I applaud the efforts of both @ABoipc and @PrivacyPrivee. In my opinion, there are three questions that need to be answered through the audit report:

  1. Do citizens have a reasonable expectation of being free from historic tracking?
  2. Will posting signs at mall entrances be sufficient notice for CF to infer information about individuals?
  3. With a handful of information security staff, is CF a trusted custodian of public data?

There is no doubt that the investigators from both offices are more than capable of performing this investigation; my fear, however, is that

Current legislative instruments may not adequately defend citizen privacy against biometric big data systems

In previous work, we wrote about the advent of big data and explained that current laws do not deal with inferences. We have also written extensively on how the lack of privacy is an existential threat to any business.

While the investigation is ongoing, I, as a member of the public, encourage CF to:

  1. Update your outdated (2016) privacy policy.
  2. Provide the name of your privacy officer to the public.
  3. Disclose a Privacy Impact Assessment showing your due diligence.

@CadFairview is fully aware that the future is here and that the new generation of shoppers enjoys online shopping. Breaching citizen privacy will likely hasten the end of the mall era.


Overcoming the Challenges of Privacy of Social Media in Canada

By Aydin Farrokhi and Dr. Wael Hassan

In Canada, data protection is regulated by both federal and provincial legislation, and organizations that capture and store personal information are subject to several laws. The federal Personal Information Protection and Electronic Documents Act (PIPEDA), which governs personal information handled in the course of commercial activities, came fully into force in 2004. PIPEDA requires organizations to obtain consent from individuals whose data is being collected, used, or disclosed to third parties. By definition, personal data includes any information that can be used to identify an individual, other than information that is publicly available. Personal information can only be used for the purpose for which it was collected, and individuals have the right to access their personal information held by an organization.

Amendments to PIPEDA 

Compliance and enforcement under PIPEDA may not be strong enough to address the privacy aspects of big data. The Digital Privacy Act (also known as Bill S-4) received Royal Assent and is now law. Once it is fully in force, the Privacy Commissioner will be able to bring a motion against a violating company, with fines of up to $100,000.

The Digital Privacy Act amends and expands PIPEDA in several respects:

 

  1. The definition of "consent" is updated: it adds to PIPEDA's consent and knowledge requirement. The DPA requires a reasonable expectation that the individual understands what they are consenting to, that is, the nature, purpose, and consequences of the collection, use, or disclosure of their personal data. Children and vulnerable individuals have specific considerations.

There are some exceptions to this rule: managing employees, fraud investigations, and certain business transactions, to name a few.

  2. Breach reporting to the Commissioner is mandatory (not yet in force).
  3. Timely breach notifications must be sent to the impacted individuals: the mandatory notification must explain the significance of the breach and what can be done, or has been done, to lessen the risk of harm.
  4. Breach record keeping is mandated: records must be kept of all breaches affecting personal information, whether or not there was a real risk of significant harm. These records may be requested by the Commissioner, required in discovery by a litigant, or requested by an insurance company assessing premiums for cyber coverage.
  5. Failure to report a breach to the Commissioner or the impacted individuals may result in significant penalties.

Cross-Border Transfer of Big Data

The federal Privacy Commissioner's position on personal information transferred to a foreign third party is that the transferred information is subject to the laws and regulations of the foreign country, and no contract can override those laws. Consent is not required for transferring personal data to a foreign third party; however, depending on the sensitivity of the personal data, affected individuals may need to be notified that their information may be stored or accessed outside of Canada, and of the potential impact this may have on their privacy rights.

Personal Information: Ontario Privacy Legislation

The Freedom of Information and Protection of Privacy Act, the Municipal Freedom of Information and Protection of Privacy Act, and the Personal Health Information Protection Act are the three major pieces of legislation that organizations such as government ministries, municipalities, police services, health care providers, and school boards must comply with when collecting, using, and disclosing personal information. The Office of the Information and Privacy Commissioner of Ontario (IPC) is responsible for monitoring and enforcing these acts.

In big data projects, the IPC works closely with government institutions to ensure compliance with these laws. In such projects, information collected for one reason may be combined with information acquired for another. If not properly managed, big data projects may be contrary to Ontario's privacy laws.


Categories: social

Building a Social Media Pipeline

Social Media Boston University

A Presentation by Dr. Waël Hassan at Boston University School of Media & Communications

Abstract: Companies that develop success criteria, establish their style, decide on their sources, set up a business process, and survey their results are winning big on social media. The least understood part of building an enterprise social media service is how to build a social media pipeline. This presentation describes how to do that.

 

[flipbook pdf=”https://waelhassan.com/wp-content/uploads/2017/06/Building-Your-Social-Media-Pipeline-by-@Drwhassan.pdf”]

 


Big Data Everywhere

Two arguments could not be more dangerous: "People don't care about privacy" and "Get all the data you can."


Categories: Privacy

Can Big Data Be Wrong? An Election Post-Mortem

That's a good question; everyone is asking today what happened with the election, thinking that everything we knew and heard from media outlets was wrong. Big Data is subject to a few simple rules, which often get ignored.

  1. Subjects (people) involved ought to be connected, i.e., they are feeding data into the machine.
  2. Subjects must be willing to express their opinions. Without the express consent of the individual, it is questionable to correlate behavioral data, such as someone clicking on an article in favor of a candidate, and conclude that they will vote for that candidate.
  3. Interpretation ought to be accurate. All big data offers is a set of data points; interpretation cannot be wishful thinking.

When the next election or event comes along, there is one thing to remember.

Big Data has a human side, do not forget it.

Source: https://kidesignmagazine.com/can-big-data-wrong-election-post-mortem/

About Waël Hassan:

Dr. Waël Hassan is the founder of KI Design – his full bio is available at About

Data Protection in Design

Time for a New Vision

Up until now, we have viewed privacy and security on the same sliding scale, through which it appears to be impossible to have one without hurting the other. Envisioning a country where privacy is prioritized over security and surveillance seems absurd. However, it is time that we disrupt this traditional way of thinking.

How? Through Data Protection in Design. By developing and building data protection into the design of private, public, and political systems, citizens would have the ability to express their desires, change the system, and influence government, all the while minimizing the risk to national or public safety. Instead of pitting the forces for privacy and the forces for security against one another, the two forces should be integrated in order to reap the benefits of both.

It is no longer about balancing privacy freedoms against security, but about achieving both outcomes in an effective way.