Diana Freed Joins Brown CS And DSI As Assistant Professor
- Posted by Jesse Polhemus
- on Oct. 17, 2023
Next fall, Diana Freed joins Brown CS and Brown’s Data Science Institute as an assistant professor. She’s the latest hire in the multi-year CS With Impact campaign, our largest expansion to date.
Diana works in an emerging area of computer science focused on designing and building technologies to improve online safety and well-being for vulnerable and marginalized populations worldwide. She notes that one theme has been present in her work from the beginning: “The core focus of my research is helping people and improving society, trying to support at-risk and underserved communities. Technology has allowed me to develop tools that help people improve their online privacy and security. An important part of this work is understanding how contextual and human factors shape privacy decisions and protective practices, and understanding the resource needs of at-risk populations so we can develop digital literacy tools for intervention and prevention. It’s what motivates me and what drew me here. My work and my core values really align with Brown University.”
Originally interested in neuroscience (she holds a Master’s degree in Biology from New York University), Diana was convinced of the need to bring technology into everyday clinical work by her experiences communicating illness to children and working with young adults coping with eating disorders. Following a Master’s degree in Psychology from Columbia University and another in Interactive Telecommunications from New York University’s Tisch School of the Arts, a 2016 fellowship at Data & Society cemented her interest in security and privacy, eventually leading to a Master’s degree and a doctorate in Information Science from Cornell University.
Currently, Diana’s research focuses on digital security and privacy, as well as on developing and evaluating technologies for mental health and chronic conditions. She combines techniques from computer security, human-computer interaction, psychology, and law to develop new tools, technologies, and theories to detect and mitigate technology-enabled abuse. Aimed in part at helping people make informed choices, her work is both far-reaching (“it covers Internet of Things devices and potentially all technology that people interact with”) and highly collaborative, with a strong emphasis on policy.
“One example of my policy contributions,” Diana says, “was in the development of the Safe Connections Act, which allows survivors of intimate partner violence to leave the phone plans they share with abusive (ex)partners. I generally collaborate with government as well as survivor advocacy organizations, and I actively engage with professionals from the survivor support ecosystem, such as caseworkers, attorneys, the survivors themselves, and families. What we didn’t understand fully at the start of this project was how technology was being used to track and surveil survivors of intimate partner violence. We had to identify those threat models in order to help survivors safely disconnect and escape from their abusers.”
Responsible use of technology is one of Diana’s key interests, and when asked about her hopes for the future of the Brown CS Socially Responsible Computing program, her reply shows both an attention to detail and a dislike for simple answers.
“To mitigate digital risks and harms,” she says, “we need to understand how vulnerable people experience challenges to their digital and physical safety. Taking a multidisciplinary perspective can have a great impact, as it allows us to understand the complexity of the social factors that play into resolving these problems. Intimate partner violence experienced by an elderly person can be very different from online stalking experienced by a young person, and I don’t want to generalize. Part of socially responsible computing is to address these issues through innovation and research, but we begin as a community, and we need to look at each individual’s experience in these ecosystems. Often, you solve one issue and another arises.”
“There is a great opportunity to expand this research,” Diana says, “and grow the work of the newly formed Center for Technological Responsibility, Reimagination, and Redesign. This includes engaging with communities through Brown’s involvement with Public Interest Technology New England (PITNE.org). Through applied research, we have the opportunity to engage in experiential learning with stakeholders, including at-risk populations, and to inform policy that provides protections that evolve with emerging technologies. We can take an interdisciplinary approach to understanding the complex sociotechnical factors contributing to disparities in safety and access, and design technical and policy mitigations that address digital security inequities. We have an opportunity to employ a holistic approach in developing technological solutions that help people better manage their safety and well-being.”
Diana’s work puts her in perpetual contact with some of the most challenging aspects of the human condition. Given that, is she an optimist, or perhaps more importantly, how does she stay one? She considers for a moment.
“This is my life’s work,” she answers. “Optimism is part of how I approach that work, and optimism is necessary to try and figure out what someone else is experiencing, to make change. What helps me stay optimistic is using my trauma-informed training and a survivor-centered approach. They’re tools that I use to help navigate these digital safety experiences, and optimism is part of what drives my research: understanding a problem, creating a solution, collaborating to improve things and keep people safer.”
But lay people with children, faced with a daily intake of news that can often be sensational, even fear-mongering, may find optimism challenging. Are there reasonable steps that parents can take to protect their children from digital harm?
“A large focus of my work,” Diana says, “is helping people understand how technology works and helping them make informed decisions. More knowledge is beneficial. Sometimes what parents do to keep their kids safe doesn’t match what they’re actually concerned about, or they may perceive online gaming as risky but not see the same risks in Snapchat. Providing training and access to information is an important part of letting people make their own decisions. It’s not my role to say, ‘this is how to keep your child safe’, but I can provide peer-reviewed research and partner with youth to educate them and involve them in developing new technology. We know from experience that young people will find ways around safety measures when they aren’t involved in creating those tools.”
Diana agrees with many of her peers that there’s a need for guardrails against technology’s misuse, but insists that we need to operate within the reality where people are living: “In some spaces, the advice that survivors of abuse receive is to leave social media or get rid of their phone, and that’s not a solution. Sometimes, the solutions people want don’t match the innovations we can deliver. As devices like AirTags enter the market, there are positive use cases as well as misuse cases, and we need to work broadly to help organizations, advocates, survivors, and others understand how to protect themselves when these devices are used to surveil or track. As you scale from rural to urban to global, stakeholders have different needs and experience different threat models. We need to understand the ecosystems in which people experience digital risks and how we can work within communities to help establish online safety.”
“Ultimately,” she says, “we don’t want to create technology that’s isolated from its users; we want to create tools that people can use safely. My experience working in human-computer interaction and human-centered security and privacy has shown me that, in a sense, part of the work to be done is to understand the risks and harms vulnerable populations experience, and continue to design and develop tools and resources that protect them against technology-facilitated abuse.”
Diana is currently a joint Postdoctoral Fellow at the Center for Research on Computation and Society (CRCS) at the Harvard John A. Paulson School of Engineering and Applied Sciences and at the Berkman Klein Center for Internet & Society at Harvard University.
For more information, contact Brown CS Communications Manager Jesse C. Polhemus.