https://www.eff.org/deeplinks/2022/05/podcast-episode-philosopher-king

Computer scientists often build algorithms with a keen focus on “solving the problem,” without considering the larger implications and potential misuses of the technology they’re creating. That’s how we wind up with machine learning that prevents qualified job applicants from advancing, or blocks mortgage applicants from buying homes, or creates miscarriages of justice in parole and other aspects of the criminal justice system.

James Mickens—a lifelong hacker, perennial wisecracker, and would-be philosopher-king who also happens to be a Harvard University professor of computer science—says we must educate computer scientists to consider the bigger picture early in their creative process. In a world where much of what we do each day involves computers of one sort or another, the process of creating technology must take into account the society it’s meant to serve, including the most vulnerable.

Mickens speaks with EFF’s Cindy Cohn and Danny O’Brien about some of the problems inherent in educating computer scientists, and how fixing those problems might help us fix the internet.

James: …become the captain of it.

And I think about that a lot, because a lot of engineers want to abdicate responsibility for being the captain of their own boat. They say, I’m just going to focus on the boat and that’s it. But in this metaphor, society and its built-in biases are the winds. They’re the currents. They’re going to push your product, push your software, toward some shore, and that’s going to happen whether or not you think it will. So we really have this responsibility to choose and decide.

Danny: I hate to follow Kierkegaard with Stan Lee, but isn’t that “with great power comes great responsibility”? And I wonder if part of these ethical discussions is whether that isn’t the problem: that you are asking engineers and the creators of this technology to make ethical decisions that will affect the rest of society, when actually it should be the rest of society that makes those decisions, not the engineers. Maybe the harder work is to spread that power more equally and give everyone a little element of being an engineer, so that they can change the technology in front of them.

James: I think that what you’re talking about, at a broad level, is governance. How do we do governance of online systems? And it’s a mess right now. It’s a combination of internal company policies, which are not made public; external regulation, that is to say, publicly visible policies; and the behavior of individual users on the platform. And it’s a big mess, because right now, a lot of times what happens is a disaster happens, then all of a sudden there’s some movement by both the companies and maybe regulators to change something, and then that’ll be it for a bit. Things creak along until another disaster happens. So it’d be nice to think, in a more systemic way, about how we should govern these platforms.

Cindy: As a free speech and Fourth Amendment lawyer, I’d note that giving governments more say over what we say and over our privacy hasn’t always worked out all that well for individual rights either, right? But we have these gigantic companies. They have a lot of power, and it’s reasonable to think: well, what else has a lot of power that might be able to be a check on them? There’s government. And that’s all true, but the devil really is in the details, and we worry as much about bad corporate behavior as we do about bad governmental behavior. You have to think about both.

Cindy: So let’s say you’re the philosopher king. In your great new world, what does it look like for me as a user?

James: I think one important aspect is more transparency about how your data is used, who it gets shared with, and what value companies are getting from it. And we’re moving a little bit in that direction, slowly but surely. Laws like the GDPR and the CCPA are trying to slowly nudge us in this direction. It’s a very hard problem though, as we all know. I mean, engineers may not fully understand what their systems do, so then how are they going to explain that in a transparent way to users? But in this utopia, that’s an important aspect of online services: there’s more transparency in how things work. I think there’s also more consent in how things work. These things go hand in hand. Users would have more of an ability to opt into or opt out of various manipulations or sharings of their data.
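To make that idea concrete, here is a minimal sketch of what per-user consent gating could look like in code. The class names and flags are hypothetical illustrations of the opt-in/opt-out principle James describes, not any real platform’s API:

```python
# A minimal sketch of per-user consent gating: users opt into or out
# of specific uses of their data, and the service checks those flags
# before acting. All names here are hypothetical illustrations.

from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Per-user flags for distinct uses of personal data."""
    ad_targeting: bool = False         # default to opted out
    third_party_sharing: bool = False  # default to opted out
    product_analytics: bool = True

@dataclass
class User:
    user_id: str
    consent: ConsentRecord = field(default_factory=ConsentRecord)

def share_with_partner(user: User, payload: dict) -> bool:
    """Refuse to share data unless the user has opted in."""
    if not user.consent.third_party_sharing:
        return False  # a transparent, auditable refusal
    # ... actual sharing logic would go here ...
    return True

alice = User("alice")
assert share_with_partner(alice, {"email": "a@example.com"}) is False
```

The point of a structure like this is that the consent check is explicit and inspectable, which is what makes the transparency James calls for even possible.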

Once again, we’re starting to go a little bit closer towards that. I think we can do much, much more. In terms of content moderation, and this is going to be tricky, it’s going to be hard, this speaks to Cindy’s observations about not being able to fully trust government or the companies. But in my opinion, and I’m the philosopher king in this experiment, what I want is a floor that defines minimal standards for protections against hate speech, harassment, things like that. Of course the devil’s in the details. But I think that’s something we don’t really have right now. There’s also this important aspect of having educated citizens, right? Having more technical education and technical literacy for laypeople, so that they can better understand the consequences of their actions.

Cindy: Knowing what choices we’re making, being in charge of those choices, and having actual choices, those are all tremendously important. EFF has worked a lot on adversarial interoperability and other things that are really about being able to leave a place that isn’t serving you. And to me, that’s got to be a piece of the choice. A choice that doesn’t really let you leave is not actually a choice.

James: As you may know, there have been some recent proposals that want to solve this portability issue essentially by saying, let’s have users store all their data on user-owned machines, and then the companies have to come to us for permission to use that data. There’s a push and pull there. On the one hand, you want to give people literal power over their data, such that it’s actually their machines that are storing it. On the other hand, if I look at the computers administered by my relatives, for example, who are not computer scientists, these computers are offline all the time. They’ve got terrible, ridiculous programs on them. They’re not reliable. In contrast, look at a data center: it’s administered by paid professionals whose job it is to keep those machines online. So there’s an advantage to using that model.

Do we want to still keep our data in centralized places, but make sure there’s plumbing to move stuff between those centralized places? Or do we want to, in the extreme, go towards this peer-to-peer decentralized model, and then lose some of the performance benefits we get from the data center model?
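As a rough illustration of the first option, the “plumbing” James describes could be a shared export/import interface that every centralized service implements. The sketch below uses hypothetical names, loosely inspired by efforts like the Data Transfer Project; it is not a description of any existing system:

```python
# A sketch of "plumbing between centralized places": each service
# implements a common export/import interface, so a user's data can
# move between professionally run data centers without the user
# hosting it themselves. All interface and class names are assumed.

import json
from abc import ABC, abstractmethod

class PortableService(ABC):
    @abstractmethod
    def export_user_data(self, user_id: str) -> str:
        """Return the user's data in a shared JSON format."""

    @abstractmethod
    def import_user_data(self, blob: str) -> None:
        """Ingest data exported by any other PortableService."""

class PhotoServiceA(PortableService):
    def __init__(self):
        self.photos = {"alice": ["beach.jpg", "dog.jpg"]}

    def export_user_data(self, user_id):
        return json.dumps({"user": user_id,
                           "photos": self.photos.get(user_id, [])})

    def import_user_data(self, blob):
        data = json.loads(blob)
        self.photos[data["user"]] = data["photos"]

class PhotoServiceB(PhotoServiceA):
    def __init__(self):
        self.photos = {}  # a competing service, starting empty

# Leaving service A for service B is one export plus one import.
a, b = PhotoServiceA(), PhotoServiceB()
b.import_user_data(a.export_user_data("alice"))
assert b.photos["alice"] == ["beach.jpg", "dog.jpg"]
```

In this model the user never has to run a server; the trade-off is that they are still trusting some centralized operator, which is exactly the tension James is pointing at.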

Cindy: That’s a good articulation of some of the trade-offs here. And of course the other way to go, on the lawyer side of things, is a duty of care: people who hold your data have a fiduciary or similar duty to you, in the same way that your accountant or lawyer might. They have your data, but they don’t have the freedom to do with it what they want. In fact, they’re very limited in what they can do with it. I feel optimistic, in a certain way, that there are mechanisms on the technical side and the non-technical side to try to get us to this kind of control. Again, none of them are without trade-offs, but they exist all across the board.

James: Yes. And I think an interesting area of research, one that I’m a bit interested in myself, is what specific technical things software developers can do to demonstrate compliance with legal regulations. Because these laws are just like any human creation: they can be vague or ambiguous in some cases, and they can be difficult to implement.

And I think that part of this gets down to having these different communities talk to each other. One reason it’s difficult for computer scientists to write code that complies with legal requirements is that we don’t understand some of these legal requirements. The lawyers need to learn a little bit more about code and the computer scientists need to learn a little bit more about the law.
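One hedged sketch of what that research direction might look like: encoding a single reading of a data-retention requirement as an executable check that runs against stored records. The 30-day limit and the field names are assumptions for illustration, not what any actual law requires:

```python
# A sketch of "compliance as code": one interpretation of a retention
# rule, expressed as a check a service could run continuously. The
# 30-day figure and record fields are illustrative assumptions.

from datetime import datetime, timedelta, timezone

RETENTION_LIMIT = timedelta(days=30)  # assumed policy, not legal advice

def overdue_records(records: list, now: datetime) -> list:
    """Return records held longer than the retention limit.

    Each record is expected to carry a 'collected_at' timestamp and a
    'purpose_expired' flag set when its original purpose has ended.
    """
    return [
        r for r in records
        if r["purpose_expired"] and now - r["collected_at"] > RETENTION_LIMIT
    ]

now = datetime(2022, 5, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "collected_at": now - timedelta(days=90), "purpose_expired": True},
    {"id": 2, "collected_at": now - timedelta(days=5), "purpose_expired": True},
]
assert [r["id"] for r in overdue_records(records, now)] == [1]
```

Of course, as James notes, the hard part is agreeing on the interpretation in the first place, which is precisely where the lawyers and the computer scientists need each other.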

Cindy: It’s also the case, of course, that sometimes laws get written without a clear idea of how one might reduce them to ones and zeros. That may be a bug if you’re a computer scientist, but it might be a feature if you’re a lawyer, right? Because then we let judges sort out, in the context of individual situations, what things really mean.

James: So one of the gifts of the philosopher king is to lead people out of these semantic morasses.

Cindy: Thank you so much, king.

James: No problem, of course. It’s been great sitting here chatting with you. Let me return to my kingdom.

Danny: James Mickens, thank you very much.

James: Thank you.

Cindy: Well, James teaches computer science at Harvard, so it’s right that his focus is on education, personal ethics, and transparency. This is the work of the computer scientists. I appreciate that he’s working and thinking hard about how we build more ethical builders, and that he recognizes we need to move beyond the silos that computer science often finds itself in and reach out to people with other kinds of expertise, especially philosophy. But we also heard from him about the importance of the role of the impacted community, something we’ve heard over and over again on this podcast, and the need to make sure that the people who are impacted by technology understand how it works and have a voice.

Danny: It wasn’t just a literally academic kind of discussion. He had some practical points too. For instance, if we do need to improve and fix things, we’ve found some ways of making incremental security improvements, like HTTPS, but some fixes really have to overcome a lot of tech debt. And I don’t think we’re going to be in a situation where we can ask people not to book airplane tickets while we fix the fundamentals, which again points to what he’s saying: we need to get this stuff right earlier rather than later in the process.

Cindy: And I loved hearing about this embedded ethics program that he’s working on at Harvard and other places. The idea that we need to build ethics into every class and every situation, not just tack it on separately at the end, is a very good start. And of course, if it leads to a line of students who want to do ethical tech beating a path to EFF’s doors, that would be an extra bonus for us.

Danny: It does make everything a little bit more complicated to think about ethics and the wider impact. I did take on board his comparison of the ease of building a centralized internet, which might have deleterious effects on society, with the obvious solution, which is to decentralize things. But you have to make decentralization just as easy for the end user, and as somebody who’s hacking away trying to build a decentralized web, that’s something I definitely took personally and will take on board.

Cindy: There’s trade-offs everywhere you go. And I think in that way, James is just a true educator, right? He’s requiring us all to look at the complexities in all directions so that we can really bring all those complexities into thinking about the solutions we embrace. After this conversation, I kind of want to live in the world where James is our philosopher king.

Danny: Thanks to you, James Mickens, our supreme leader, and thank you for listening today. Please visit eff.org/podcast for other episodes, or to become a member. Members are the only reason we can do this work. Plus you can get cool stuff like an EFF hat or an EFF hoodie, or even an EFF camera cover for your laptop. Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower. This podcast is licensed Creative Commons Attribution 4.0 International and includes music licensed under the Creative Commons Attribution 3.0 Unported license by their creators. You can find those creators’ names and links to their music in our episode notes or on our website at eff.org/podcast. How to Fix the Internet is supported by the Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. I’m Danny O’Brien.

Cindy: And I’m Cindy Cohn.

James Mickens is a professor of computer science at the Harvard School of Engineering and Applied Sciences and a director at the Berkman Klein Center for Internet and Society. He studies how to make distributed systems faster, more robust, and more secure; much of his work focuses on large-scale web services, and how to design principled system interfaces for those services. Before Harvard, he spent seven years as a researcher at Microsoft; he was also a visiting professor at MIT. Mickens received a B.S. from the Georgia Institute of Technology and a Ph.D. from the University of Michigan, both in computer science.
