For decades, Brown CS has been examining the computer scientist’s impact in a world of constant technological acceleration. The Socially Responsible Computing program, which puts societal and ethical issues at the heart of the department’s undergraduate experience, is its most comprehensive effort yet. Below, we look at how CS 19 (Accelerated Introduction to Computer Science) implemented one of the key aspects of the program in the Fall 2020 semester: embedding real-world content in courses instead of treating ethical responsibilities as simply a box to check.
(If you don’t have time to read the entire article below, you might want to look at this reflection document, which looks back on the course’s Socially Responsible Computing efforts and makes several recommendations for what to do and not to do when designing such assignments.)
“I knew it was easy to create cut-and-paste, read-and-respond assignments,” says Nick Young, the Socially Responsible Computing Teaching Assistant (STA) for the course, “but Shriram [Krishnamurthi, the course’s instructor] wanted to do better than that. In our very first meeting, his opening lines were something like this: ‘We’re not teaching ethics, we’re doing something much grander. The word ethics implies that we’re working in some normative philosophical role with absolute truths, but that’s only a subset of what we’re doing. We’re trying to create software that works for everyone, to look at how people are trying to break our systems.’ That was interesting to me.”
Nick explains that CS 19 is a highly optimized course that covers a large amount of content. As with other course material created by Shriram and his TAs, developing the socially responsible content was an iterative process. DocDiff was the course’s first assignment. It focuses on the similarity between text entries, which lends itself naturally to thinking about plagiarism, a concept very salient to students. However, rather than asking students to comment about plagiarism at large, which Nick and Shriram felt would produce dozens of very similar responses, they had students try to break a plagiarism detector so they could learn how brittle tools like this can be and how dangerous it can be to use them in automated decision-making.
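The article doesn’t describe DocDiff’s actual similarity metric, but as a hypothetical sketch of the kind of text-similarity measure a plagiarism detector might build on, a naive word-overlap score could look like this (all names here are illustrative, not from the assignment):

```python
def overlap_similarity(doc1: str, doc2: str) -> float:
    """Naive similarity: fraction of distinct words shared between
    two documents (intersection size over the smaller set's size)."""
    words1 = set(doc1.lower().split())
    words2 = set(doc2.lower().split())
    if not words1 or not words2:
        return 0.0
    return len(words1 & words2) / min(len(words1), len(words2))

# Two three-word documents sharing two words score 2/3.
score = overlap_similarity("the cat sat", "the cat ran")
```

A measure this simple is exactly the kind of brittle tool the assignment asks students to probe: anything that changes how words tokenize changes the score.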
Student responses included a variety of interesting ideas: substituting Cyrillic characters that look identical to English letters but have different Unicode code points, or padding documents with zero-width whitespace characters that can be inserted anywhere. “If we can break the detectors so easily,” Nick asks, “what does that mean for plagiarism detection at large? This was all about getting students to think like an adversary, like someone trying to break software, so they’ll think about how someone will try to break their software.”
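Both evasions the students found exploit the gap between how text renders and how it compares. A minimal demonstration, assuming a detector that relies on exact string matching (the `naive_match` function is hypothetical, not from the course):

```python
def naive_match(a: str, b: str) -> bool:
    """A hypothetical naive plagiarism check: exact string equality."""
    return a == b

original = "plagiarism"

# Swap the Latin 'a' (U+0061) for the Cyrillic 'а' (U+0430);
# the two render identically in most fonts but compare unequal.
homoglyph = "pl\u0430giarism"

# Insert a zero-width space (U+200B), invisible when rendered.
padded = "plagia\u200brism"

print(naive_match(original, homoglyph))  # False
print(naive_match(original, padded))     # False
```

Real detectors are more sophisticated than equality checks, but the same principle applies: any normalization step the tool skips is an attack surface.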
A later assignment, Updater, took a different approach: the socially responsible computing content (a talk by Brown CS alum Karen Catlin on building workplaces where people are assets because of their differences) was studied in parallel with the coding part of the assignment. In this case, Shriram and Nick asked students to think about how a homogenous engineering team might produce an inferior payment processing app. Once again, student responses were inventive: one wrote that minimum withdrawal amounts can discriminate against low-income people, and another noted that being able to decline payments matters in some Asian cultures, where refusing gifts of money is a sign of respect.
“Students are used to being presented with material that asks philosophical questions about why diversity and inclusion are important,” Nick says, “but that can lead them to lose interest when they see something that doesn’t look like it’s connected to computing – they’re much more likely to brush it aside. With this project, we didn’t ask them why diversity is useful or important, we asked them to think concretely about how having a diverse team can lead to a better product.”
Nick describes the SRC component of MST (short for Minimum Spanning Tree) as his personal favorite assignment. It challenged students to look beyond the typical ways that malefactors use a system to harm users or the system itself. For example, GitHub users have their contributions plotted for them in a grid-like heatmap. By varying their contribution frequency, someone hoping to cause harm could use the graph to spell out profanities or threats visible to the general public. As another example, an adversary could spam a YouTube video with likes, causing YouTube to conclude that the video’s owner was generating artificial traffic.
Because students had already been using Campuswire, a Piazza-like product, for class questions, Nick and Shriram asked them to identify some of its non-obvious attack vectors. Even a simple premise (the idea that a professor would answer the questions from the class with the most upvotes) yielded a variety of responses: for example, students could form a clique and bully a fellow student by upvoting other questions, essentially silencing them. This in turn could create the false impression that someone else was asking the good questions. “For this assignment,” Nick says, “we tied it to a kind of software that we’re already using – students were really good at making connections with something that we see every day.”
One common concern about adding socially responsible computing content is that CS classes are already overflowing with material. But it’s only difficult to integrate, Nick says, if it’s treated as an add-on. “Do you just give students a bunch of reflection questions to answer? No, you have them write code and then critique it, make tools, experiment with online standards, learn about web accessibility. The goal should be to teach the code we need to teach through socially responsible computing content, doing more with less.”
When asked about the impact that socially responsible computing content had on students, Nick explains that he was fortunate to be both an STA and an Undergraduate Teaching Assistant (UTA) for CS 19: “It’s a big time commitment, but it really puts you much more in tune with what students are feeling about assignments. By being there when students came to UTA hours, I could really see which ones didn’t care about the socially responsible computing content and which did. Students who were engaged would often walk me through their entire assignment, giving me a lot of insight into their thought process – I could really see that large numbers of them were dealing seriously with the material.” At the end of the semester, he and Shriram put together this reflection document, which captures their process and some lessons learned, including student feedback on the Socially Responsible Computing content.
As a postscript, Nick adds that he found being an STA to be much less about making assignments for other people than about educating oneself, with other people benefiting as a result. “People do best,” he says, “when they have a vested interest in what they’re UTAing. If anyone reading this is even remotely interested in the program, I hope they’ll interview to be an STA. It’s a really interesting intersection of computing and society, and I can’t wait to see it flourish.”
For more information, contact Brown CS Communication Outreach Specialist Jesse C. Polhemus.