Ottawa’s Privacy Analytics launches Risk Monitor assessment tool

Ottawa’s Privacy Analytics has introduced a privacy risk assessment tool called Risk Monitor, designed to give organizations the ability to identify, measure and manage the level of risk exposure associated with their data sharing practices.

Until now, Privacy Analytics has focused exclusively on healthcare, a field it continues to specialize in, owing to the vast research potential that patient data could offer the general population if that data were properly anonymized.
This new product signals the company’s intent to provide its data security platform in a way that can be used by any vertical that deals in data privacy, which is basically every company that gathers and stores customer data for any purpose.

“Organizations in healthcare, insurance, financial services and other industries are seeking to maximize the usefulness and value of their data assets for analysis to derive new insights, discover new opportunities and improve the bottom line,” said Privacy Analytics Vice President of Product Management Pamela Neely Buffone. “When that data contains information that could be used to identify an individual, it is imperative that responsible privacy measures are in place to ensure the highest level of compliance and the lowest possible levels of legal, financial and reputational risk to the organization.”

Risk Monitor provides companies and organizations the assurance that their data protection standards comply with major regulations, such as HIPAA and the EU Data Protection Directive 95/46/EC, as well as globally accepted data sharing standards and guidelines, including those from the Institute of Medicine (IOM), Health Information Trust Alliance (HITRUST), PhUSE, and the Council of Canadian Academies.

In the wake of well-publicized corporate hacks at Sony Pictures, Target, Slack, Ashley Madison, U.S. healthcare provider Premera, health insurer Anthem, and the IRS, what’s becoming clear is that these publicized cases represent only the tip of a data security shame iceberg.

For every case of hacking against a corporation or government agency that we hear about, there are many more that we don’t, because the bad optics and publicity around hacking prevent companies from divulging that they have been breached.

The Global Business Outlook Survey co-published by Duke University and CFO Magazine is the world’s longest running and farthest reaching survey of its kind, having been conducted for 77 consecutive quarters.

For Q2 2015, it reported that 80% of U.S. companies admit “that they have been seriously hacked (with the hack intended to steal, change, or make public important data)”.

That percentage climbs to 85% for smaller firms (those with fewer than 1,000 employees), and is similarly high for companies in Asia (85%), Europe (92%), Africa (87%) and Latin America (87%).

In other words, everyone is either getting hacked now or has been hacked in the past.

And so, when you speak to companies about this, most engage in a kind of herd-protection attitude to data security. Basically: sure, we were hacked, but it wasn’t as big or as bad a hack as what happened at Target or the U.S. Public Service. We’re operating to the same security standard as everyone else, is what they’ll tell you, which only indicates how incredibly low that standard is.

“The reality is that responsibly sharing data for secondary use is inherently a risk management exercise that every organization should be undertaking. But many are not,” said Neely Buffone. “We’ve taken years of research and know-how from operating in the most restrictive data compliance environments and created an easy to use tool that identifies the privacy risks organizations need to be aware of so they can start to effectively manage those risks. It is the only product of its kind on the market.”

In an interview with Cantech Letter last year, Privacy Analytics CEO Khaled El Emam memorably described most companies’ approach to data protection as “Mickey Mouse”, saying, “You have to be transparent about this, you have to be serious about it, you have to do it properly. You can’t put in place Mickey Mouse anonymization or Mickey Mouse governance. And I think with that, you can maintain consumer trust, you can meet regulator expectations. You will have constraints on what you can do with the data, but these are good constraints. They’ll keep you out of trouble.”
