Info 213: Intro to Human-Computer Interaction Design
Info 290: HCI research
Info 290: Advanced HCI research
25-30 Apr
20-21 Feb
Impact Fund Class Action Conf.
AI and algorithms 101 [slides]
Design at Large, UCSD. View talk.
Social Justice Program for Undergraduate Students, College of Engineering, UC Berkeley
Feminist Data Studies Workshop, University of Michigan
16-19 July
20-21 June
UIST PC meeting
Simon Fraser University
Digital Democracies Conference: Artificial Publics, Just Infrastructures, Ethical Learning
Niloufar Salehi is an Assistant Professor at the School of Information at UC Berkeley, with an affiliated appointment in EECS. Her research interests are in social computing, participatory and critical design, human-centered AI, and, more broadly, human-computer interaction (HCI). Her work has been published and received awards in premier HCI venues, including ACM CHI and CSCW. Through building computational social systems in collaboration with existing communities, controlled experiments, and ethnographic fieldwork, her research contributes to the design of alternative social configurations online.
Random, Messy, Funny, Raw:
Finstas as Intimate Reconfigurations of Social Media
Best Paper Honorable Mention
Sijia Xiao, Danaë Metaxa, Joon Sung Park, Karrie Karahalios, Niloufar Salehi, ACM CHI 2020
Feminist Data Manifest-No
Marika Cifor, Patricia Garcia, TL Cowan, Jasmine Rault, Tonia Sutherland, Anita Say Chan, Jennifer Rode, Anna Lauren Hoffmann, Niloufar Salehi, Lisa Nakamura, 2019
Agent, Gatekeeper, Drug Dealer:
How Content Creators Craft Algorithmic Personas
Eva Yiwei Wu, Emily Pedersen, Niloufar Salehi, ACM CSCW 2019
Hive: Collective Design Through Network Rotation
Niloufar Salehi, Michael Bernstein, ACM CSCW'18
My group studies and designs social computing systems. Ongoing projects:
What would a Restorative and Transformative Justice approach to moderation and governance of online platforms look like?
Why don't school assignment algorithms live up to their theoretical promises of efficiency and equity in practice?
What could go wrong when affect recognition is used to evaluate and compare people?
Hive explores how social systems can help build strong networks by organizing a collective into small teams, then intermixing viewpoints by gradually rotating team membership. I deployed this project with Mozilla to reimagine accessible web browsing with disability advocates online.
In this paper, we introduce a new class of collective design system that intermixes people instead of ideas: instead of receiving mere exposure to others' ideas, participants engage deeply with other members of the collective who represent those ideas, increasing engagement and influence. We thus present Hive: a system that organizes a collective into small teams, then intermixes people by rotating team membership over time. At a technical level, Hive must balance two competing forces: (1) networks are better at connecting diverse perspectives when network efficiency is high, but (2) moving people diminishes tie strength within teams. Hive balances these two needs through network rotation: an optimization algorithm that computes who should move where, and when. A controlled study compared network rotation to alternative rotation systems which maximize only tie strength or network efficiency, finding that network rotation produced higher-rated proposals. Hive has been deployed by Mozilla for a real-world open design drive to improve Firefox accessibility.
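The trade-off the abstract describes can be made concrete with a toy greedy version of the idea: score each candidate member move by a weighted sum of global network efficiency (average inverse shortest-path length over the co-membership graph) and within-team tie strength (how many past rounds current teammates have shared), and pick the best move. This is a minimal sketch under assumed definitions, not the paper's actual optimization algorithm; the function names, the scoring weights, and the greedy single-move search are all illustrative assumptions.

```python
from itertools import combinations

def efficiency(edges, nodes):
    """Global network efficiency: average inverse shortest-path length (BFS per node)."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    total, pairs = 0.0, 0
    for src in nodes:
        dist, queue = {src: 0}, [src]
        while queue:
            nxt = []
            for u in queue:
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        nxt.append(v)
            queue = nxt
        for dst in nodes:
            if dst != src:
                pairs += 1
                if dst in dist:
                    total += 1.0 / dist[dst]
    return total / pairs if pairs else 0.0

def tie_strength(teams, shared_rounds):
    """Mean number of past rounds that current teammates have already shared."""
    scores = [shared_rounds.get(frozenset((a, b)), 0)
              for team in teams
              for a, b in combinations(sorted(team), 2)]
    return sum(scores) / len(scores) if scores else 0.0

def best_rotation(teams, shared_rounds, alpha=0.5):
    """Greedy one-step rotation: try moving each person to each other team and
    keep the move maximizing alpha * efficiency + (1 - alpha) * tie_strength."""
    nodes = [p for team in teams for p in team]
    base_edges = {pair for pair, n in shared_rounds.items() if n > 0}
    best_move, best_score = None, float("-inf")
    for i, src_team in enumerate(teams):
        if len(src_team) <= 1:
            continue  # don't empty a team
        for j in range(len(teams)):
            if i == j:
                continue
            for person in sorted(src_team):
                new_teams = [set(t) for t in teams]
                new_teams[i].discard(person)
                new_teams[j].add(person)
                # Co-membership edges: all historical ties plus the new assignment.
                edges = set(base_edges)
                for t in new_teams:
                    edges.update(frozenset(p) for p in combinations(sorted(t), 2))
                score = (alpha * efficiency([tuple(e) for e in edges], nodes)
                         + (1 - alpha) * tie_strength(new_teams, shared_rounds))
                if score > best_score:
                    best_move, best_score = (person, i, j), score
    return best_move, best_score

# Example: three teams of three after one shared round.
teams = [{"a", "b", "c"}, {"d", "e", "f"}, {"g", "h", "i"}]
shared = {frozenset(p): 1 for t in teams for p in combinations(sorted(t), 2)}
move, score = best_rotation(teams, shared, alpha=0.7)
```

A higher `alpha` favors connecting the collective (efficiency), while a lower `alpha` favors keeping familiar teammates together; the real system schedules such moves over time rather than one at a time.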
Dynamo shows how structured human labor can help move efforts forward when they stall. I undertook this project in collaboration with worker rights advocates on Amazon Mechanical Turk.
Dynamo is a platform to support the Mechanical Turk community in forming publics around issues and then mobilizing. We are researching new approaches, systems, and labor mechanisms for collective action online.
"Despite their variety, Turkers have something in common—a lack of power. They operate in a realm largely untouched by legislation, unions, and guilds. As a result, the inexperienced can find themselves earning well below minimum wage, or abused by underhanded employers. But a project out of Stanford University [and UC San Diego] is hoping to grant Turkers agency—and might begin to revolutionize the industry. Dynamo is a platform that gives Turkers a collective voice and, consequently, the chance to drive change." -The Daily Beast
Amazon's Turker Crowd Has Had Enough, Wired, 8/23/2017
The Vanishing 9-to-5: Ruthless and Liberating, YES! Magazine, 9/3/2016
On Demand, and Demanding Their Rights, The American Prospect magazine, 6/28/2016
Intellectual Piecework, The Chronicle, 2/16/2015
Amazon's Mechanical Turk workers protest: 'I am a human being, not an algorithm', The Guardian, 12/3/2014
Amazon’s Turkers Kick Off the First Crowdsourced Labor Guild, The Daily Beast, 12/3/2014
Amazon's Mechanical Turk workers want to be treated like humans, Engadget, 12/3/2014
Amazon service workers publish open letters to Bezos saying "We are not an algorithm", Gigazine, 12/4/2014 (Japanese)
Amazon Mechanical Turk workers begin letter-writing campaign, Crowdsourcing.org, 12/5/2014
Amazon Mechanical Turk: Artificial Artificial Intelligence, Rhizome Today, 12/5/2014
The Web proletariat attains class consciousness and launches its first collective action to improve its working conditions, Slate, 12/4/2014 (French)
Mechanical Turk workers protest, Xakep, 12/8/2014 (Russian)
Plenty of academic research passes through AMT or is about Turkers, but the ethics boards (IRBs) that review and approve research protocols often don't know how workers want to be treated. Turkers have collectively authored these guidelines to help educate researchers and to let Turkers hold them accountable to a higher standard.
Email us at email@example.com to add your support.
All around the world, people are writing letters about themselves to a distant, all-powerful figure who can make their dreams come true. But this year, many of those letters will be addressed to Amazon boss Jeff Bezos rather than Father Christmas. - The Guardian
The goals of this campaign are to publicly state that: