Facebook CEO Mark Zuckerberg’s testimony on Capitol Hill this week is only the latest focal point in a national conversation about the perils of technology.
As the ethical questions surrounding technology become unavoidable, three Stanford professors are thinking about the kind of ethics education that future engineers will need.
Pushing the boundaries of students’ thinking
On a Tuesday afternoon at the Stanford campus, the professors are sitting around a table crowded with laptops and papers. Mehran Sahami, a computer science professor famous on campus for getting Mark Zuckerberg to visit his class a few years in a row, is at the head of the table. Next to him sits political science professor Rob Reich, infamous for lectures that force students to think critically about their own ethical choices.
Together with political science professor Jeremy Weinstein, they’re creating a new computer science ethics course. It’s an idea they’ve talked about for years, but now feel they can’t put off any longer.
“I'm tired of having engineers figure stuff out on our behalf,” Reich says. “Allowing ourselves to inhabit a world where engineers are increasingly deciding our future, without the engineers being aware of the ethical responsibilities and considerations of the very technologies they're pioneering, is a world that I don't think any of us wants to live in.”
The professors will be developing the curriculum for the class over the next few months. On this day they’re hashing out a unit about data collection and privacy.
Research assistant Michael Dworsky gets them started with a case study. “Facebook has been using facial recognition for a while,” says the fourth-year student, who’s planning to take a job with the social media giant after he graduates.
He explains that Facebook is developing technologies that can identify users by clothing, posture or gait; others allow advertisers to target people based on their facial expressions.
The professors start riffing about the different issues this brings up. Sahami points out that this case study raises questions about when public information becomes private.
“If I'm at a supermarket and I'm in line to buy something, the people around me can see what I'm buying in that one basket and I probably don't care,” Sahami says. “But if I just take that information now aggregated over my shopping lifetime, there is data that I probably would have a reasonable belief to be private.”
Then Reich jumps in: “I'm imagining that part of my task in this section will be to remind students that privacy was a value worth caring about in the first place.”
Sahami says this kind of interdisciplinary collaboration between computer science and other disciplines is rare. There have been CS ethics courses since he was a student at Stanford some 30 years ago, but he says the courses haven’t always kept pace with innovation. They usually focus on philosophy or public policy with a touch of tech, or on the engineering side of things with a hint of ethics. The professors hope this class will bring the disciplines together.
“We just want to push the boundaries of students’ thinking,” Sahami says.
Students rise up
Across campus, a handful of Stanford undergrads are sitting around a dorm lounge, thinking about all this, too. They meet a couple of times a month and try to figure out how to give computer science a moral compass.
Senior Sawyer Birnbaum helped start the group. He explains that at Stanford, CS students must fulfill a “Technology in Society” requirement, but it’s really broad.
“Probably by far the majority of CS students at Stanford go through their four years here without like really taking any classes that actually force them to think about ethical questions or engage in that kind of ethical analysis,” he says.
The students float different ways to address that, including putting in place more stringent requirements and creating ethics courses tailored to specific fields within computer science, but they spend most of the time talking about the need for interdisciplinary instruction.
Student Noah Arthurs says he became frustrated by what he saw as a trade-school mindset in his computer science classes. “They were so industry-oriented,” he says. “You're just learning skills.”
He says the CS department lacked the kind of intellectual rigor he found in other departments. He came to see it as an extension of the tech industry, rather than an academic discipline.
“The professors will always tell you, ‘You're learning the algorithm that Facebook uses to do facial recognition, or you're learning the algorithm that Google uses to tell you search results,’ ” he says. “So it's like there's always this looming industry that everything is pointing you towards.”
Biology majors can consider a career in bioethics, he points out, so why not a similar specialty for computer science students? He wants to see some option for CS students, aside from working as an engineer at one of the big Silicon Valley firms or launching a startup.
For his part, Reich believes there's support for interdisciplinary collaboration. He says the computer science faculty he knows don't want Stanford University to become a technical university.
“They want the liberal arts tradition to remain at Stanford,” he says. “So as nearly 50 percent of the undergraduate body is majoring in computer science, the computer science faculty themselves want to ensure that (those students) have a broad education.”
The new course takes a step in that direction, Reich says, but it's only a piece of what's needed.
“Sometimes I dream of the possibility that we could develop an interdisciplinary major,” he says. Or at the least a concentration for people who are interested in the intersection of ethics, public policy and engineering, “rather than deciding that the engineers do what they do, the ethicists do what they do, and the public policy people can catch up to where the engineers are a decade from now.”
The professors are hoping to roll out the class in January. One title they're considering? Practice Safe CS.