Interesting People mailing list archives

Kasparov, "Pursuing transparency and accountability for both humans and machines"


From: "Dave Farber" <farber () gmail com>
Date: Mon, 3 Jul 2017 17:04:36 -0400






https://blog.avast.com/pursuing-transparency-accountability-for-humans-and-machines


In his latest post for Avast, Garry Kasparov examines the intersection of privacy, transparency, security, human 
rights, and institutions in the age of AI.

Pursuing transparency and accountability for both humans and machines

 by Garry Kasparov


I was honored recently to be one of the recipients of the Electronic Privacy Information Center’s (EPIC) 2017 Champions of Freedom award. Tech and privacy giant Bruce Schneier was among the presenters, and my fellow recipients—attorney Carrie Goldberg, cryptographer Ron Rivest, and Judge Patricia Wald—are all doing important work in protecting privacy online and off in an age when it feels like we are always being watched. EPIC president and executive director Marc Rotenberg made a powerful statement about the vital, and often unsung, role that transparency plays in a healthy democracy.

Privacy and transparency—two sides of the security coin

To say that users are concerned as well would be an understatement. The term “privacy” currently has 58,400,000 Google News mentions, which is reason enough to point out that it means different things to different people in different frameworks. As EPIC well understands, the issue isn’t just end-user privacy and oversharing on social media; it’s also what happens at the top levels of the governments and corporations that have the capabilities to monitor our actions.

This is why privacy and transparency are two sides of the security coin. Companies need to know something about us to provide the services we want. Government agencies need to monitor activity to provide essential security services. The eternal question is how to balance these needs with the rights of citizens to maintain their privacy and control over their own data, even as we produce more of it every day. Transparency is required to establish limits—and accountability for overstepping them.

This adversarial process is part of the checks and balances that democratic republics are based on, especially in the 
United States. The idea is to use pressure and conflict to expose and fix the cracks in the system, to shine light 
and take gradual steps for the greater good. It’s important to contrast this with the nature of surveillance and 
privacy in the unfree world. The method, motive, and mission behind data collection and violations of privacy cannot 
be ignored. In authoritarian regimes, privacy is only for the rulers while the people have none. The behavior of the 
regime and its most powerful citizens is shielded from the people. Data collection is used for repression and 
persecution of innocent citizens. Real people are in real physical danger when privacy protections fail in an 
autocracy; it’s not a legal exercise, or done to protect people from hackers or terrorists. There could be no 
organization like EPIC in Putin’s Russia—and if there were, it would just be another branch of the security services.

"Watching the watchers"

Our desire to maintain privacy is inseparable from our concerns about the ethical underpinnings of our institutions. If we believe that the authorities’ motivations are morally sound and that they are operating in our interest and the interests of society, we are more comfortable relinquishing a degree of privacy. You might call this “watching the watchers”: those who are in charge of establishing surveillance protocols must be monitored themselves. Their justifications must be subject to scrutiny, and there must be a chain of responsibility and accountability. Ideally, we want a system that allows for the collection of information needed for economic, security, sociological, and other purposes, but deters arbitrary collection that does not serve a specific and well-articulated need. Governments and corporations must be transparent in expressing their goals and strategies, and consumers have a duty to question any that make them skeptical.

Where does AI fit in?

I am, of course, always interested in how AI comes into the picture. We worry about the internal decision-making processes of powerful bodies when those processes are hidden from the public eye. How can we be sure that a giant corporation, deciding how much user information to disclose to foreign governments, has the best interests of its users in mind? If we worry about this, and rightly so, we ought to think carefully about how we will delegate similar tasks to AI in the future. Increasingly, algorithms are making decisions with weighty consequences for individuals, companies, and societies. With the field of machine learning growing at a rapid pace, we will often find ourselves having to evaluate the results of processes that are unknowable to humans. A neural net that reached a particular endpoint did so through a series of steps that its human designers did not engineer; in effect, it coded its own path. These trends are leading us toward a future of unpredictable results and even less clear chains of responsibility for those results.

Unsurprisingly, algorithmic transparency and accountability is a growing field in privacy law; EPIC has launched a campaign in this area. If we cannot trace the processes of an AI and ascertain how it reached a certain decision, we must at the very least make transparent all elements of the process that were controlled by humans. As I argued above, we must demand clear and morally backed justification from the human institutions responsible for broad surveillance. The same is true here: we must demand that AI be programmed according to the most rigorous transparency and ethics standards.

There must also be a framework to determine responsibility—human responsibility. Where does the liability belong when an Internet of Things-connected toaster joins a botnet that brings down a chunk of the internet? Is the manufacturer responsible for making an unsafe product? As with fake news and other symptoms of our powerful technology run amok, we have to do a better job of assigning sources and responsibility to people instead of blaming the technology.

Toward more trustworthy institutions

Government and industry standards can make a big difference, however. After the EPIC award ceremony on June 5, I had a brief chat with encryption pioneer Whit Diffie. He was dismayed by my comment that people should know better by now than to click on unsafe email attachments. He pointed out that it’s ridiculous that users have the option to do something so potentially destructive at all, and he has a point. Modern skyscrapers don’t have windows that can open more than a crack—to avoid accidents and suicides. You can never completely prevent people from inflicting damage on themselves, but there is clearly a lot that could be done to improve our digital structures in ways that would help protect us from ourselves.

The crucial link is institutions, public and private, and whether there is enough transparency and accountability within them to trust them with our privacy and other forms of security. Whether we are thinking of a government making decisions about domestic surveillance programs or a company determining how its products will collect information from users, if the values and norms governing human (or machine) action are strong enough, we can be optimistic about reaching favorable outcomes—even if that comes after years of trial and error. If institutions are lacking, we make ourselves vulnerable to threats from nefarious actors or from neutral technologies that can cause harm, whether by accident or by malicious intent.

In order to be strong, institutions must have continuity across time, and that includes across changes in leadership and party. In today’s hyper-partisan landscape, the latter can be hard to stomach, but I pride myself on being a vocal critic of the hypocrisy that results from extremism on both sides of the political spectrum. In my acceptance speech at EPIC, I emphasized that if you are okay with surveillance and executive orders when you like the president, but you hate surveillance powers and executive orders when you don’t like the president, then you are part of the problem! Building a system that works for all of its members requires compromise from all sides. A framework complete with common-sense transparency laws and robust checks and balances on monitoring authorities will never materialize from finger-pointing and retreating into the safe, but dead-end, corners of total privacy or unbridled surveillance.

In the meantime, as we work to build the institutions that will shape our personal experiences and interactions in the digital world, we must be responsible as individual users acting within an imperfect system. The online world is far from a civilized one. It’s more like the old American Wild West, where the laws are poorly established and frequently abused. The ideal safeguards do not exist, so each consumer must be careful about the information he shares and the measures he takes to protect it. Each of us must continue to push for the laws and protections needed for collective security, while reconciling ourselves to the current reality and remaining vigilant.

The lines of accountability are still weakly drawn, so companies have little incentive to put protections in place. And as popular as it is to demonize “big government” as a road to “Big Brother,” and to target ever-expanding data collection by some of the world’s largest companies, these giant institutions are actually the most accountable ones we have. Larger companies with greater name recognition and a reputation to defend are obliged to concern themselves with public opinion and user satisfaction.

In the same way, many governments simply do not care about the optics of violating their citizens’ privacy, while those that are concerned about their standing on the world stage will do better. Protect yourself as best you can, using the most reputable tools available—the Colt .45s and Winchester rifles of the digital Wild West armory—but do not stop pushing for the broader institutional change that will lead to a safer, fairer, and more transparent online world for everyone—hopefully without a shot being fired.




