THE TECH INDUSTRY is having a moment of reflection. Even Mark Zuckerberg and Tim Cook are talking openly about the downsides of software and algorithms mediating our lives. And while calls for regulation have been met with increased lobbying to block or shape any rules, some people around the industry are entertaining forms of self-regulation. One idea swirling around: Should the programmers and data scientists massaging our data sign a kind of digital Hippocratic oath?
Microsoft released a 151-page book last month on the effects of artificial intelligence on society that argued “it could make sense” to bind coders to a pledge like that taken by physicians to “first do no harm.” In San Francisco Tuesday, dozens of data scientists from tech companies, governments, and nonprofits gathered to start drafting an ethics code for their profession.
The general feeling at the gathering was that it’s about time the people whose powers of statistical analysis target ads, advise on criminal sentencing, and accidentally enable Russian disinformation campaigns woke up to their power and used it for the greater good.
“We have to empower the people working on technology to say ‘Hold on, this isn’t right,’” DJ Patil, chief data scientist for the United States under President Obama, told WIRED. (His former White House post is currently vacant.) Patil kicked off the event, called Data For Good Exchange. The attendee list included employees of Microsoft, Pinterest, and Google.
Patil envisages data scientists armed with an ethics code throwing themselves against corporate and institutional gears to prevent things like deployment of biased algorithms in criminal justice.
It’s a vision that appeals to some who analyze data for a living. “We’re in our infancy as a discipline and it falls to us, more than anyone, to shepherd society through the opportunities and challenges of the petabyte world of AI,” Dave Goodsmith, from enterprise software startup DataScience.com, wrote in the busy Slack group for Tuesday’s effort.
Others are less sure. Schaun Wheeler, a senior data scientist at marketing company Valassis, followed Tuesday’s discussions via Slack and a live video stream. He arrived skeptical, and left more so. The draft code looks like a list of general principles no one would disagree with, he says, and is being launched into a field that lacks authorities or legislation to enforce rules of practice anyway. Although the number of formal training programs for data scientists is growing, many at work today, including Wheeler, are self-taught.
Tuesday’s discussions yielded a list of 20 principles that will be reviewed and released for wider feedback in coming weeks. They include “Bias will exist. Measure it. Plan for it,” “Respecting human dignity,” and “Exercising ethical imagination.” The project’s organizers hope to see 100,000 people sign the final version of the pledge.
“The tech industry has been criticized recently, and I think rightfully so, for its naive belief that it can fix the world,” says Wheeler. “The idea you can fix an entire complex problem like data breaches through some kind of ethical code is to engage in that same kind of hubris.”