Interesting People mailing list archives

Computer science faces an ethics crisis. The Cambridge Analytica scandal proves it. - The Boston Globe


From: "Dave Farber" <farber () gmail com>
Date: Fri, 23 Mar 2018 07:45:31 -0400




Begin forwarded message:

From: Dave Farber <farber () gmail com>
Date: March 23, 2018 at 7:11:31 AM EDT
To: David Farber <dave () farber net>
Subject: Computer science faces an ethics crisis. The Cambridge Analytica scandal proves it. - The Boston Globe


https://www.bostonglobe.com/ideas/2018/03/22/computer-science-faces-ethics-crisis-the-cambridge-analytica-scandal-proves/IzaXxl2BsYBtwM4nxezgcP/story.html

Computer science faces an ethics crisis. The Cambridge Analytica scandal proves it.
March 22, 2018

[Photo caption: Facebook founder Mark Zuckerberg speaks at a conference in San Jose, Calif., in 2017. Cambridge Analytica scraped up Facebook data from more than 50 million people. Credit: New York Times]
Cambridge Analytica built a weapon. It did so understanding what uses its buyers had for it, and the weapon worked exactly
as intended. To help clients manipulate voters, the company built psychological profiles from data that it 
surreptitiously harvested from the accounts of 50 million Facebook users. But what Cambridge Analytica did was hardly 
unique or unusual in recent years: a week rarely goes by when some part of the Internet, working as intended, doesn’t 
cause appreciable harm.

I didn’t come up in computer science; I began my career as a physicist. That transition gave me a specific perspective on this situation: that the field of computer science, unlike other sciences, has not yet faced serious negative consequences for the work its practitioners do.

Chemistry had its first reckoning with dynamite; horror at its consequences led its inventor, Alfred Nobel, to give 
his fortune to the prize that bears his name. Only a few years later, its second reckoning began when chemist Clara 
Immerwahr committed suicide the night before her husband and fellow chemist, Fritz Haber, went to stage the first 
poison gas attack on the Eastern Front. Physics had its reckoning when nuclear bombs destroyed Hiroshima and 
Nagasaki, and so many physicists became political activists — some for arms control, some for weapons development. 
Human biology had eugenics. Medicine had Tuskegee and thalidomide. Civil engineering, a series of building, bridge, 
and dam collapses. (My thanks to many Twitter readers for these examples.)


These events profoundly changed their respective fields, and the way people come up in them. Before these crises, 
each field was dominated by visions of how it could make the world a better place. New dyes, new materials, new 
sources of energy, new modes of transport — everyone could see the beauty. Afterward, everyone became painfully aware 
of how their work could be turned against their dreams.

Each field dealt with its reckoning in its own way. Physics and chemistry rarely teach dedicated courses on ethics, 
but the discussion is woven into every aspect of daily life, from the first days of one’s education. As a graduate 
student, one of the two professors I was closest to would share stories of the House Un-American Activities Committee 
and the anti-war movement; the other would talk obliquely about his classified work on nuclear weapons. Engineering, 
like medicine, developed codes of ethics and systems of licensure. Human biology, like psychology, developed strong 
institutional review boards and processes.

None of these processes, of course, prevent all ethical lapses, and they neither require nor create agreement about 
which choices are right. Many physicists, for example, began avoiding work on problems with military applications in the years after the McCarthy hearings and the Vietnam War. But many others continue to do such research, and the issue is frequently and hotly debated.

Computer science is a field of engineering. Its purpose is to build systems to be used by others. But even though it 
has had its share of events which could have prompted a deeper reckoning — from the Therac-25 accidents, in which 
misprogrammed radiation therapy machines killed three people, up to IBM’s role in the Holocaust — and even though the 
things it builds are becoming as central to our lives as roads and bridges, computer science has not yet come to 
terms with the responsibility that comes with building things which so profoundly affect people’s lives.

Software engineers continue to treat safety and ethics as specialties, rather than as the foundations of all design; young engineers believe they just need to learn to code, change the world, disrupt something. Business leaders focus on getting a product out fast, confident that they will not be held to account if that product fails catastrophically. Simultaneously imagining their products as changing the world and not being important enough to require safety precautions, they behave like kids in a shop full of loaded AK-47s.


* * *

What would a higher standard of care look like? First of all, safety would be treated as a principal concern at all stages, even when “just trying to get something out the door,” and engineers’ education would equip them to treat it that way. If
safety came first, the Facebook Graph API used by Cambridge Analytica, which raised widespread alarm among engineers 
from the moment it first launched in 2010, would likely never have seen the light of day.

Tech companies focus intensely on preventing crashes. A rigorous effort to anticipate what could go wrong is already 
standard practice for specialists in system reliability, which deals with “what-ifs” around computer failures. A 
higher standard for safety would simply do the same for “what-ifs” around human consequences. This would not imply 
that all systems should be built to the same safety standards; nobody expects a tent to be built like a skyscraper. 
But the civil engineer’s approach would require a substantial shift of priorities.

Such a shift would sometimes be resisted for business reasons, but working codes of ethics give engineers (and 
others) more power to say “no.” If breaking ethics rules would mean the end of someone’s career, an employer couldn’t 
easily replace someone who refuses to cheat. If the systems for enforcement are well-built, a competitor couldn’t 
easily work around those standards. Uniform codes of ethics give engineers more of a voice in protecting the public.

Underpinning all of these need to be systems for deciding on what computer science ethics should be, and how they 
should be enforced. These will need to be built by consensus among the stakeholders in the field, from industry to academia to capital, and, most importantly, among the engineers and the public, who are ultimately most affected. It must be done with particular attention to diversity of representation. In computer science, more than in any other field, system failures tend to affect people in different social contexts (race, gender, class, geography,
disability) differently. Familiarity with the details of real life in these different contexts is required to prevent 
disaster.


There are many methods by which different fields enforce their ethics, from the institutional review boards that 
screen life-sciences experiments on humans and animals, to the mid-career certification of professional engineers who 
then oversee projects used by the unsuspecting public, to the across-the-board licensure of doctors and lawyers. Each 
of these approaches has advantages, and computer science would need to combine ideas and innovate on them to build 
something suited to its specific needs. What would not be acceptable is the consequence of inaction. The public would 
lose trust in technology, and computer scientists would face a host of practical, commercial, and regulatory 
consequences.

Computers have made having friends on the other side of the world as normal as having them next door, have put the 
sum of human knowledge in our pockets, and have made nearly every object we encounter more reliable and less 
expensive. Yet their failure, whether by accident or by unthinking design, can have catastrophic consequences for 
individuals and society alike.

What stands between these two outcomes is attention to the core questions of engineering: to what uses might a system be put? How might it fail? And how will it behave when it does? Computer science must step up to the bar set by its sister fields, before its own bridge collapse — or worse, its own Hiroshima.

Yonatan Zunger, now at the startup Humu, is a former distinguished engineer in security and privacy at Google. Follow 
him on Twitter @yonatanzunger.



