Data protection
The new GDPR privacy law: Four letters, lots of question marks

The new regulation was conceived by lawmakers who rely on fax machines and think Facebook is for kids. That's a major blunder, writes the head of Handelsblatt's digital desk.

Almost 10 years ago, politicians from all over the EU came together to do something great. They wanted to update data protection laws and harmonize them across Europe so that internet users could finally understand what companies were doing with their data. The law was to be modern, a set of rules for Europe's digital future.

After thousands of amendments, the European project has finally become a law. And, wow, what a law: The General Data Protection Regulation, or GDPR. Yet instead of clarity, the four letters create a lot of question marks. Companies have been complaining that their IT departments have been clogged up for months in preparation for the new rules. Freelancers are considering closing semi-private blogs. And even wedding photographers wonder who they can still photograph without having to fear cease-and-desist letters.

A hopeless mess, with plenty of blame to go around. Sure, companies sometimes haven't respected the privacy of their customers and employees enough. But the GDPR shows that those who pass such laws all too often live in a world where the fax machine still plays a far too important role. Millions of citizens have long lived in a different, much more digital world than their politicians.

Ms. Vosshoff does not have Facebook, to the surprise of no one. (Source: dpa)

A few days ago, Federal Data Protection Commissioner Andrea Vosshoff boasted in Handelsblatt that she had neither a Twitter nor a Facebook account, because it could not be reconciled with her "understanding of fundamental rights and with her office." And so now we have a law where it's unclear whether users need permission to store information from other people's business cards in their mobile phones.

Much of what the GDPR wants to regulate is targeted at Google and Facebook, whose business model is based on the evaluation of users' most private information. And users should get more control over their data. But now the major players are just asking users to once again permit the use of their data, because the services wouldn't function as well otherwise. Who wants that?

It's not so simple for others. What about a freight forwarder who stores travel times and movement data for his own vehicles on his computer – is he violating his employees' privacy? What are accountants, who have always informed their clients about current developments via e-mail, now allowed to do? Even law firms are unsure about the law. And doctors' offices: Will they need a data protection officer because they handle sensitive data? Unclear.

The GDPR is meant to hit the big players – but it will also become a huge burden for the small ones: companies with a few dozen employees, a server in the basement, and no compliance department. A Bitkom survey shows that only 9 percent of startups have fully implemented the GDPR, largely due to a lack of resources.

It's a fax from 1990! It says the GDPR is pretty neat!

Of course, companies share the blame: Too many have taken the subject of data protection lightly. The law was discussed for years, so they could have dealt with it long ago. But that wouldn't have changed the basic problem. The law is too bureaucratic and does not answer the important questions ahead: not a word about artificial intelligence, networked machines, autonomous cars or smart health – all based on data technologies that will be part of everyday life in the next few years. With the GDPR, the state wants to reduce the amount of data that citizens divulge and has enshrined the principle of data minimization in Article 5. But is this the right approach in an age of intelligent computers, adaptive algorithms and big data?

It would have been better to clarify the really important questions – for example, how a law could force companies to make their algorithms transparent. Berlin internet lawyer Niko Härting suggests a type of algorithm audit. Soon, self-learning computers will make important decisions for millions of people: in traffic, when shopping and when looking for a job. Corporations already use algorithms to screen job applications today. How can we ensure that they don't discriminate? Which authority could even check such procedures? What guidelines should the state follow? Answers to these questions would increase people's confidence in the new technologies.

Yes, the GDPR deals with the important topic of data, but it does not regulate the important data topics of the future. Still, there is at least one positive aspect to the law: Europe is finally talking about data again.

To contact the author: [email protected]