Meet the Fellows 2019: Emily Paul

When I applied to master’s programs in information science, I wrote in my application essays that I wanted to “ensure that information systems promote equity, openness, and accessibility.” I wanted to make good technology: technology that brings us closer to a more equitable and just society. After spending the last five years researching and working on technology in academia and industry, I am convinced that to do this we need to make more collective decisions about how technology is designed, built, and used. These decisions must involve not only the users of digital products and services but also the people who are affected indirectly. In many cases, the technological infrastructure affecting people’s lives is hard to see and thus hard to question and debate. People’s access to credit or new job opportunities is increasingly shaped by algorithms they never experience firsthand and may not be aware of at all. Users of social media platforms don’t know the extent of the personal data collected about them or how it is used. To make collective decisions, we need frameworks and incentives for involving the people who are affected by a digital product or service in choices about how it is created and used.

At their best, technology companies take a human-centered approach: addressing real problems, providing enjoyable experiences for their users, and improving people’s well-being. We see this in the many ways that technological innovation improves our daily lives, whether through medical advancements or the convenience of a smartphone. I have consistently brought this human-centered approach to the teams I work on as a user experience researcher. However, as technological innovations rapidly restructure our daily lives and our society, the companies behind the applications and products we interact with every day also make business decisions with negative consequences, both for their users and for society more broadly. Without frameworks for assessing the impacts of these decisions and incentive structures that promote equity, we can’t respond until after harm has been done, and even then we are left with limited tools to do so.

I applied to spend the year as a Congressional Innovation Fellow because I want to better understand the policymaking process and bring a user-centered approach to technology policy. I believe policy is a powerful tool for facilitating collective decisions about the role of technology in our society. During my fellowship, I want to help shape policy on issues such as privacy, automated decision-making, and access to technology, areas essential to advancing more equitable, inclusive, and ethical technology.