Data and privacy

Nurturing innovation does more for privacy than strict regulation

We embrace new technology – despite privacy flaws – because of what it makes possible. Policymakers should work with business and encourage innovation to meet privacy challenges.


Every day, privacy commentators and activists point to new technology using our private data in ways we don’t expect. It feels like innovation and privacy are in a constant power struggle. To move forward, we need both. If privacy policy changes tack, we can have just that.

Privacy has come a long way

Privacy has evolved into the social norm we know today, and it still varies between cultures and countries. Before industrialization, there was little social privacy – extended families shared one house (often one room) and neighbors entered each other’s homes freely. Other people probably knew about everything you did.

Our expectations of social privacy have increased alongside technological development, and in many cases, the technology itself has given us more privacy. We may even feel more comfortable sharing personal information online than offline because online, we feel we have more ways to manage our privacy.

Is regulation the right way to protect privacy?

European Union (EU) policymakers seem sure that regulation like the General Data Protection Regulation (GDPR) guards online privacy and increases public trust in, and use of, digital technology. In 2018, Věra Jourová, then EU Commissioner for Justice, Consumers and Gender Equality, said, “The new rules are beginning to set a global standard for privacy. They will help to bring back the trust we need to be successful in a global digital economy.”

The evidence doesn’t seem to agree: regulation makes little or no difference to whether people use technology. According to Eurobarometer, the European Commission’s opinion poll series, Europeans’ trust as online consumers fell in 2018 – the year GDPR came into effect – to its lowest level in a decade. The Commission says giving consumers new rights will empower them to control their data. Still, in 2019, Eurobarometer showed four in five Europeans felt they had only partial control, or no control at all, over the information they give online.

If regulation isn’t the answer, we need businesses to approach their privacy challenges proactively.

How do people really behave around privacy?

Kaspersky’s Global Privacy Research 2020 found that about a third of people surveyed had had their private information improperly accessed. Nearly a third of those cases led to consequences such as financial loss or emotional distress.

What are people doing to protect themselves? Not as much as they could. Only 41 percent protect their web browsing and 37 percent prevent others from using their devices. More than one in four store passwords in ways cybercriminals can easily find.

15 percent of us write down our passwords, some even on a sticky note next to the computer. Surprisingly, the group doing it most is 25- to 34-year-olds.

Asking users to confirm they’ve read and accepted terms every time they visit a website would frustrate people more than improve their understanding. A 2019 Eurobarometer survey of Europeans’ views and behaviors around data sharing and protection found just over one in five say they’re always informed about the conditions under which their personal data is collected and used. But only 13 percent said they read privacy statements in full.

If regulation won’t improve our privacy, what will?

Education is key to real empowerment. Developing our sense of how to become ‘good digital citizens’ would likely improve trust and awareness of privacy.

Each business can ensure its digital user experience – how easily users can find and use features of a site or application – enables privacy. People respond well to a shared visual trust language: lock icons to show a secure process, warning labels to flag suspicious content and ticks for verified profiles.

Well-communicated standards also help businesses avoid the unintended effects of adopting new technologies, such as compliance and legal costs.

Rather than reaching for heavy-handed regulation or banning new technologies, policymakers could limit behaviors that have serious privacy consequences.

For example, if there are concerns insurance companies might use genetic information to set premiums or deny coverage, policymakers could simply prohibit them from using it. Then insurers could focus on protecting consumers, knowing they’re behaving responsibly in the eyes of regulators.

Technology changes faster than social values. It makes sense for regulators to create rules that reflect our values and protect individuals from harm, rather than blocking technology and its benefits. Regulators must ask questions about new technologies, like where you can fly your drone or how you can use neural implants. Businesses and trade associations are increasingly held accountable and required to explain what new technologies will achieve and how they’ll use and protect personal data.

Can we have biometric authentication as well as privacy?

Consumers like the convenience of biometric authentication, such as MasterCard’s Selfie Pay and Samsung’s iris scanning. Facial recognition is good for controlling access to the likes of public transport, smart buildings and devices. It’s faster and more convenient than codes or keycards and prevents ‘tailgating’ – when an uninvited person slips in behind an authenticated visitor.

Commercial demand for ways to apply biometric authentication keeps growing, from securing homes to ensuring hospitals give patients the right treatment. It has strong public support – in the US, only one in four (26 percent) wants the government to limit use of facial recognition.

Some systems perform better than others. An often-cited claim is that facial recognition systems perform worse on women and certain ethnicities. But claims about inaccuracy in facial recognition, particularly by race, can be misleading: many of the best systems have virtually no error and outperform humans at the same task. Critics also conflate facial detection, facial analysis and facial recognition, which have different functions – error rates for one type do not necessarily apply to the others. Unfortunately, media stories conflating these technologies heighten misperceptions about the risks.
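To make that distinction concrete, here is a minimal sketch in Python. Every function and type name is hypothetical, invented purely for illustration – none belongs to any real library – but together they show that each technology answers a different question.

```python
# Hypothetical interfaces, for illustration only - not a real library.
# Each function answers a different question, and each carries its own,
# independent error rate.
from dataclasses import dataclass
from typing import Dict, List, Optional


@dataclass
class Face:
    bounding_box: tuple  # (x, y, width, height) in pixels


def detect_faces(image: bytes) -> List[Face]:
    """Detection: is there a face in the image, and where?
    No identity is involved at this stage."""
    raise NotImplementedError


def analyze_face(image: bytes, face: Face) -> Dict[str, object]:
    """Analysis: what attributes does the face appear to have
    (estimated age, glasses, expression)? Still no identity."""
    raise NotImplementedError


def recognize_face(image: bytes, face: Face,
                   enrolled: Dict[str, bytes]) -> Optional[str]:
    """Recognition: whose face is this? Matches against enrolled
    identities - the only step that actually identifies a person."""
    raise NotImplementedError
```

An error rate measured for one of these steps says nothing about the others: a system that misjudges age (analysis) may still almost never mistake one person for another (recognition), and vice versa.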

Understandably, people expect regulators to set limits on surveillance uses of facial recognition and to safeguard human rights. The GDPR and the California Consumer Privacy Act (CCPA) have imposed some restrictions. But with the technology in its infancy, legislation that restricts or bans it may halt development that could benefit society. Instead, regulating to limit potential misuse or abuse will improve public trust and encourage the commercial sector to develop, test and share best practices.

There are cybersecurity concerns as biometrics evolve, and these must be considered and tested as part of security design. Better policy guidance and clear, common cybersecurity standards can help companies address security. Policymakers should encourage those developing innovative solutions to protect data, including using artificial intelligence (AI).

Integrated with products like medical implants, autonomous vehicles and drones, AI’s use in healthcare, education and government is growing fast. We face familiar cybersecurity challenges alongside new threats: adversaries can poison the data used to train machine learning models, or craft inputs that fool facial recognition to gain access. Vulnerabilities like these need investigating.
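As a toy illustration of the first of those threats, the short Python sketch below shows training-data poisoning by label flipping. It uses scikit-learn on synthetic data and assumes an attacker who can silently corrupt a share of the training labels – a simplified scenario, not a depiction of any real system.

```python
# Toy demonstration of training-data poisoning via label flipping.
# Synthetic data only - illustrates the threat, not a real attack.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

# Clean model: trained on unmodified labels.
clean = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Poisoned model: the attacker silently flips 30% of the training labels.
rng = np.random.default_rng(0)
flip = rng.choice(len(y_train), size=int(0.3 * len(y_train)), replace=False)
y_poisoned = y_train.copy()
y_poisoned[flip] = 1 - y_poisoned[flip]
poisoned = LogisticRegression(max_iter=1000).fit(X_train, y_poisoned)

print(f"accuracy trained on clean labels:    {clean.score(X_test, y_test):.2f}")
print(f"accuracy trained on poisoned labels: {poisoned.score(X_test, y_test):.2f}")
```

The exact gap varies by run, but the model trained on corrupted labels generally scores worse on clean test data – and nothing in the training pipeline itself signals that anything went wrong.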

We need regulators and businesses to work together

Businesses need regulation to innovate in ways that serve society. But some outdated, disproportionate laws mean companies using emerging technologies fear sanctions for non-compliance with rules designed to address past problems.

Business leaders must keep up to date with proposed legislation and public sentiment. They must work with regulators to evolve laws, and proactively tackle privacy and security concerns. Now that digital transformation is business as usual, it’s critical that clear guidance helps businesses know how to respect customers’ privacy and protect their data.

This article reflects the opinion of its author. Article published in May, 2020.

Kaspersky’s Global Privacy Research 2020

We examine consumer attitudes toward online privacy and what they’re doing to keep their private information safe.

About the author

Eline Chivot is Senior Policy Analyst at the Center for Data Innovation in Brussels, Belgium. She has been published in the Financial Times, has appeared on France24 and BloombergTV, and speaks regularly at events.