"I was extremely excited about the power of social media to advance democracy all over the world," Parakilas says.
But his optimism would be tempered by the reality of Facebook's hunger for raw data about its users. He didn't like the direction it was going.
"They have a business model that is going to push them continuously down a road of deceiving people," he says. "It's a surveillance advertising business model."
Parakilas says he tried to warn his managers at Facebook that they were at risk of putting private information into the wrong hands. But the company was growing fast and making money. Its leaders believed connecting people was inherently good.
Many of its earliest investors believed in its mission, too. But now Roger McNamee, who helped mentor Zuckerberg, says he feels bad about what's happened, "because at the end of the day these were my friends. I helped them be successful. I wanted them to be successful."
As part of his penance, McNamee helped found the Center for Humane Technology. The center is trying to "realign technology with humanity's best interests." Parakilas has also joined the effort as an adviser.
While Facebook may be in the headlines now, there is plenty of regret going around Silicon Valley from people who were part of other companies.
Guillaume Chaslot joined Google/YouTube in 2010. He, too, started as a true believer. "We could only make things better if people were more connected," he says. "If everybody could say what he wanted to say, things would naturally get better."
But Chaslot says he noticed the main goal at YouTube wasn't to inform people; it was to keep people watching videos for as long as possible. "This goal has some very bad side effects and I started to notice the side effect as I worked at YouTube," he says.
Among the side effects he noticed: People tended to get only one point of view on a topic — and not always the right one. For example, a search for "moon landing" might bring up videos from conspiracy theorists arguing that NASA faked the whole event.
Chaslot tried to create an algorithm that would show people different points of view. But, he says, his bosses weren't interested.
A YouTube spokesperson says the company has updated its algorithms since Chaslot left. According to the company, it no longer simply tries to keep people on the site for as long as possible; instead, it uses surveys to measure how satisfied users are with the time they spend there.
Chaslot left in 2013. But he continued to lose sleep over what was happening on YouTube. From the outside, he watched the site fill up with conspiracy theories and divisive content. He privately met with former colleagues and tried to warn them. But nothing began to change until after the 2016 presidential election, when news of Russian interference brought more attention to the kinds of videos circulating on YouTube.
Chaslot now says he wishes he'd gone public sooner. "Now that's what I'm doing but with a bit of a delay," he says. He even started a site to track what kinds of videos surface when you search terms like "Is the Earth flat or round?" and "vaccine facts." The results bring up plenty of factually incorrect conspiracies.
Of course, it may be easier for many techies to speak out now — their investments have paid off and they were well compensated for their work. Still, it's probably good news that the very people who helped create the problem are now using their inside knowledge to try to fix it.
Copyright 2018 NPR. To see more, visit http://www.npr.org/.