On this Data Protection Day, privacy and security are in the hearts and minds of the public more than ever before, and with good reason. Consumers have been subjected to identity theft so often that a new industry has cropped up to insure against exactly these concerns.
At the same time, companies and government agencies around the world remain under constant threat of cyber-attack and theft of sensitive data, and sometimes even its exposure by once-trusted employees. Consumers are beginning to understand that their personal information has value.
Meanwhile, technology drives ever forward. Walgreens is piloting a new line of “smart coolers”: fridges equipped with cameras that scan shoppers’ faces and make inferences about their age and gender. While Walgreens’ smart coolers don’t know from whom they are collecting data, large technology companies are increasingly investing in technology that does. In-home smart devices capture some of the most intimate details of our lives, and social media platforms “feed” consumers information that confirms their previously held beliefs, serving up newsfeed items and news stories chosen by algorithms that analyze what users have clicked on before.
This kind of targeting technology is evolving, perhaps from the “convenience” we have come to expect when Amazon recommends a book we may like based on a previous purchase, into something more insidious and creepy: giant corporations influencing our very ideas and our understanding of “the facts.” The suggestive power of social media platforms, Facebook “likes,” Google and YouTube “autoplays,” and the seemingly endless stream of targeted online advertising and political commentary flooding our inboxes all feed into this.
I have often cautioned “buyer beware” when consumers are presented with a “free” service. If companies like Google, Facebook, and Twitter are providing a service for free, they are almost certainly being paid elsewhere, by their advertisers. Those advertisers are paying not only for the ability to sell products to consumers early and often, but to learn what consumers like and dislike. And what about the sale of that data to push an “idea” or a political opinion?
What about when it begins to affect our politics and possibly our elections? What if it infiltrates not only the public trust, but also the technology we use inside our companies? Where is the ethical line, and how much should we be expected to accept? Facebook’s new plan to integrate several of its social media platforms will give it access to even more consumer information. If everything is for sale, nothing is free.
Consumers are at risk, and not only because personal information like credit card numbers, passwords, and security-question answers can be stolen and exposed. They’re also at risk because their information has become a valuable commodity, sought by eager data brokers and captured by devices as mundane as their automobiles and thermostats. We are undoubtedly living in a “data-driven” society, a world of globalizing economies, constant data transfer, and ubiquitous access to everything from everywhere. At the same time, over the past year we have seen an influx of compliance and data-security stories flood the news.
So, who’s responsible? With the emergence of legislation like the EU GDPR and the CCPA, and the increasingly likely possibility of a US federal privacy law, companies around the globe face heightened demands for data privacy and regulatory compliance. From Facebook to Google, from the NSA to Apple, there is a continuing balancing act between sharing information and protecting the information we wish to keep private.
Living in our increasingly social world has presented, and will continue to present, a paradox for personal privacy. Information placed on the internet and available publicly can be used in unintended ways, regardless of your original intent. This is true for public sector organizations, businesses, and individuals alike.
Are Chief Privacy Officers data stewards and advocates for the privacy rights of employees, customers, and citizens of the world? The reality is that companies are in business to make money, and it’s the job of compliance professionals, be they privacy officers, attorneys, or security officers, to help them do so: to fully realize the potential of the data they obtain while making sure that information is protected at the same time.
We continue to move toward a data-driven society, with self-driving cars and IoT devices not only collecting data about us but also making decisions for us. Who should oversee this? I would suggest the Chief Privacy Officer. By nature, the Chief Privacy Officer’s role is intended to balance the collection of data, but it typically does not cover the flow of data from a company to a customer, nor the algorithms used to make automated decisions about individuals (other than testing whether they fall within the boundaries of a given law).
So who makes those decisions? And who raises those questions of ethics, the “right thing to do” versus the “legal thing to do”? As very young children we are taught to “share” with others, but not to “take” something without asking permission. Notice that choice and consent, the foundations of many global privacy frameworks and regulations, mimic these playground rules.
Companies must be transparent about why they want to collect data, give their customers a true choice about whether to provide it, and then follow through by ensuring they use the data they collect only for the purpose, and within the boundaries of consent, that the consumer provided. These are the rules of society; most of us learn them on the playground, in the classroom, and at home. But unlike on the playground, our monitors are regulators and consumers.
Trust is something that businesses must work to establish with their customers every day. Once lost, it is very difficult to regain. Consumers can and will reward businesses that they trust and will punish those that they don’t, as will regulators.
Regardless of whether your business MUST have a privacy officer, you probably SHOULD have one: a person responsible for helping your company make informed decisions about risk and reward, including what you should do with data versus what you can do with data. Keep in mind that your security and privacy officers, as well as general counsel, can become true “partners” to your IT and business colleagues; empower them with key executive sponsorship and cooperation from the lines of business.