Tim Cook's Speech on European Data Protection Day

Last Thursday – on "European Data Protection Day" – Apple CEO Tim Cook delivered the opening keynote of the European Union's virtual "Computers, Privacy and Data Protection" conference ("CPDP" for short). In his address, he condemned the privacy-hostile business models of companies like Facebook – without naming the social network directly – and underscored Apple's commitment to better user privacy.

Stefan Rechsteiner

"We should not look away from the bigger picture. At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement – the longer the better – and all with the goal of collecting as much data as possible," Cook said during his speech. "It is long past time to stop pretending that this approach doesn't come with a cost – of polarization, of lost trust and, yes, of violence."

A "social dilemma," Cook said, must not be allowed to become a "social catastrophe."

As he had already said in Brussels two years ago, it was time not only for a comprehensive privacy law in the United States, but also for worldwide laws and new international agreements "that enshrine the principles of data minimization, user knowledge, user access and data security across the globe". The EU's General Data Protection Regulation (GDPR), he said, has provided an important foundation for privacy rights around the world, and its implementation and enforcement must continue.

During his speech, Cook pointed to two recent measures Apple has taken to better protect user privacy. First, he highlighted the new disclosures on the App Store's app detail pages, which lay out in a visually clear, plain-language format what data-protection and privacy practices each app follows. Introduced in December and officially called "App Privacy Details", these disclosures are often compared to the nutrition labels on food packaging.

Tracking Opt-in with iPadOS and iOS 14.5

Cook also mentioned the new "App Tracking Transparency" rule, a prominent dialog that appears at app launch and offers a tracking opt-in. To give users more control over tracking in the apps they use, iPad and iPhone apps must, starting with iPadOS 14.5 and iOS 14.5, first ask users for permission to access the device's advertising identifier ("IDFA"). The IDFA enables tracking across apps and even across providers; with the new rule, users can block such tracking. The feature is part of the upcoming pre-release versions of iPadOS and iOS 14.5, and the new OS versions are expected to ship this spring.

PDF: A Day in the Life of Your Data

To mark European Data Protection Day, Apple also published a PDF titled "A Day in the Life of Your Data".

It is an easy-to-understand report showing how some companies track users across websites and apps. Many of the apps "we use every day contain an average of six trackers", Cook said. These trackers often exist "to surveil and identify users across apps, watching and recording their behavior". The PDF highlights Apple's privacy principles and provides further details on the upcoming tracking opt-in in the iPhone and iPad operating systems.

Tim Cook's CPDP 2021 Speech

Tim Cook's speech at the CPDP 2021 conference (from 3:50) (CPDP / Apple)
Tim Cook's full speech 📝

Good afternoon.

John, thank you for the generous introduction and for hosting us today.

It’s a privilege to join you — and to learn from this knowledgeable panel — on this fitting occasion of Data Privacy Day.

A little more than two years ago, joined by my good friend, the much-missed Giovanni Buttarelli, and Data Protection regulators from around the world, I spoke in Brussels about the emergence of a data-industrial complex.

At that gathering we asked ourselves: “what kind of world do we want to live in?”

Two years later, we should now take a hard look at how we’ve answered that question.

The fact is that an interconnected ecosystem of companies and data brokers, of purveyors of fake news and peddlers of division, of trackers and hucksters just looking to make a quick buck, is more present in our lives than it has ever been.

And it has never been so clear how it degrades our fundamental right to privacy first, and our social fabric by consequence.

As I’ve said before, “if we accept as normal and unavoidable that everything in our lives can be aggregated and sold, then we lose so much more than data. We lose the freedom to be human.”

And yet this is a hopeful new season. A time of thoughtfulness and reform. And the most concrete progress of all is thanks to many of you.

Proving cynics and doomsayers wrong, the GDPR has provided an important foundation for privacy rights around the world, and its implementation and enforcement must continue.

But we can’t stop there. We must do more. And we’re already seeing hopeful steps forward worldwide, including a successful ballot initiative strengthening consumer protections right here in California.

Together, we must send a universal, humanistic response to those who claim a right to users’ private information about what should not and will not be tolerated.

As I said in Brussels two years ago, it is certainly time, not only for a comprehensive privacy law here in the United States, but also for worldwide laws and new international agreements that enshrine the principles of data minimization, user knowledge, user access and data security across the globe.

At Apple, spurred on by the leadership of many of you in the privacy community, these have been two years of unceasing action.

We have worked to not only deepen our own core privacy principles, but to create ripples of positive change across the industry as a whole.

We’ve spoken out, time and again, for strong encryption without backdoors, recognizing that security is the foundation of privacy.

We’ve set new industry standards for data minimization, user control and on-device processing for everything from location data to your contacts and photos.

At the same time that we’ve led the way in features that keep you healthy and well, we’ve made sure that technologies like a blood-oxygen sensor and an ECG come with peace of mind that your health data stays yours.

And, last but not least, we are deploying powerful, new requirements to advance user privacy throughout the App Store ecosystem.

The first is a simple but revolutionary idea that we call the privacy nutrition label.

Every app — including our own — must share their data collection and privacy practices, information that the App Store presents in a way every user can understand and act on.

The second is called App Tracking Transparency. At its foundation, ATT is about returning control to users — about giving them a say over how their data is handled.

Users have asked for this feature for a long time. We have worked closely with developers to give them the time and resources to implement it. And we’re passionate about it because we think it has the great potential to make things better for everybody.

Because ATT responds to a very real issue.

Earlier today, we released a new paper called “A Day in the Life of Your Data.” It tells the story of how apps that we use every day contain an average of six trackers. This code often exists to surveil and identify users across apps, watching and recording their behavior.

In this case, what the user sees is not always what they get.

Right now, users may not know whether the apps they use to pass the time, to check in with their friends, or to find a place to eat, may in fact be passing on information about the photos they’ve taken, the people in their contact list, or location data that reflects where they eat, sleep or pray.

As the paper shows, it seems that no piece of information is too private or personal to be surveilled, monetized, and aggregated into a 360-degree view of your life. The end result of all of this is that you are no longer the customer, you’re the product.

When ATT is in full effect, users will have a say over this kind of tracking.

Some may well think that sharing this degree of information is worth it for more targeted ads. Many others, I suspect, will not, just as most appreciated it when we built a similar functionality into Safari limiting web trackers several years ago.

We see developing these kinds of privacy-centric features and innovations as a core responsibility of our work. We always have, we always will.

The fact is that the debate over ATT is a microcosm of a debate we have been having for a long time — one where our point of view is very clear.

Technology does not need vast troves of personal data, stitched together across dozens of websites and apps, in order to succeed. Advertising existed and thrived for decades without it. And we’re here today because the path of least resistance is rarely the path of wisdom.

If a business is built on misleading users, on data exploitation, on choices that are no choices at all, then it does not deserve our praise. It deserves reform.

We should not look away from the bigger picture.

At a moment of rampant disinformation and conspiracy theories juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement — the longer the better — and all with the goal of collecting as much data as possible.

Too many are still asking the question, “how much can we get away with?” when they need to be asking, “what are the consequences?”

What are the consequences of prioritizing conspiracy theories and violent incitement simply because of their high rates of engagement?

What are the consequences of not just tolerating, but rewarding content that undermines public trust in life-saving vaccinations?

What are the consequences of seeing thousands of users join extremist groups, and then perpetuating an algorithm that recommends even more?

It is long past time to stop pretending that this approach doesn’t come with a cost — of polarization, of lost trust and, yes, of violence.

A social dilemma cannot be allowed to become a social catastrophe.

I think the past year, and certainly recent events, have brought home the risk of this for all of us — as a society, and as individuals as much as anything else.

Long hours spent cooped up at home, the challenge of keeping kids learning when schools are closed, the worry and uncertainty about what the future would hold, all of these things threw into sharp relief how technology can help — and how it can be used to harm.

Will the future belong to the innovations that make our lives better, more fulfilled and more human?

Or will it belong to those tools that prize our attention to the exclusion of everything else, compounding our fears and aggregating extremism, to serve ever-more-invasively-targeted ads over all other ambitions?

At Apple, we made our choice a long time ago.

We believe that ethical technology is technology that works for you. It’s technology that helps you sleep, not keeps you up. That tells you when you’ve had enough, that gives you space to create, or draw, or write or learn, not refresh just one more time. It’s technology that can fade into the background when you’re on a hike or going for a swim, but is there to warn you when your heart rate spikes or help you when you’ve had a nasty fall. And that all of this, always, puts privacy and security first, because no one needs to trade away the rights of their users to deliver a great product.

Call us naive. But we still believe that technology made by people, for people, and with people’s well-being in mind, is too valuable a tool to abandon. We still believe that the best measure of technology is the lives it improves.

We are not perfect. We will make mistakes. That’s what makes us human. But our commitment to you, now and always, is that we will keep faith with the values that have inspired our products from the very beginning. Because what we share with the world is nothing without the trust our users have in it.

To all of you who have joined us today, please keep pushing us all forward. Keep setting high standards that put privacy first. And take new and necessary steps to reform what is broken.

We’ve made progress together, and we must make more. Because the time is always right to be bold and brave in service of a world where, as Giovanni Buttarelli put it, technology serves people, and not the other way around.

Thank you very much.
