How To Actually *Do* Data Privacy

Abstract: Data privacy and AI ethics have been the subject of satire in The Onion. This is a sign that we've spent more time talking about data privacy than enacting practices that empower our data subjects to control what they share, for how long, and for which purposes.

In this talk, I will walk through four key practices for dealing with data ethically, in workshop style:
0. Define your data stakeholders.
1. Empower your data stakeholders to see, amend, selectively delete, and globally delete data that represents them. We will sort through legal tensions between retention and deletion requirements and discuss deploying automated decay-by-design tools to expunge privacy-sensitive data that has little value to the business.
2. Accompany model accuracy testing with model fairness assessments. We'll walk through schemas for assessing which fairness threshold is appropriate for a given intervention.
3. Add annual fairness, privacy, and transparency reports to board meeting agendas. Share publicly. A template will be provided.
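To make step 1 concrete, here is a minimal sketch of a decay-by-design sweep. The record schema (`sensitive`, `expires_at` fields) and the `expire_records` helper are assumptions for illustration, not a reference to any specific tool discussed in the workshop: each record carries a retention deadline set at collection time, and an automated job drops records once that deadline passes.

```python
from datetime import datetime, timedelta, timezone

def expire_records(records, now=None):
    """Return only the records still inside their retention window.

    `records` is a list of dicts, each with 'data', 'sensitive', and
    'expires_at' (an aware datetime). Anything past its deadline is
    expunged; in a real deployment the job would also log the deletion
    for audit purposes.
    """
    now = now or datetime.now(timezone.utc)
    return [r for r in records if r["expires_at"] > now]

# Example run with a fixed clock so the result is deterministic.
now = datetime(2024, 1, 1, tzinfo=timezone.utc)
records = [
    {"data": "customer email", "sensitive": True,
     "expires_at": now - timedelta(days=1)},    # past deadline: expunged
    {"data": "aggregate count", "sensitive": False,
     "expires_at": now + timedelta(days=365)},  # still retained
]
kept = expire_records(records, now=now)  # only the aggregate count survives
```

The design choice to attach the deadline to the record itself, rather than keeping a separate retention table, is what lets deletion run as a dumb periodic sweep instead of a policy lookup.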
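For step 2, one common fairness check is the demographic parity gap: the difference in positive-outcome rates between groups, compared against a threshold. The sketch below is a simple illustration under assumed names (`demographic_parity_gap`, a 0.1 threshold); as the workshop emphasizes, the right metric and threshold depend on the intervention being assessed.

```python
def demographic_parity_gap(outcomes, groups):
    """Largest absolute difference in positive-outcome rates across groups.

    `outcomes` is a list of 0/1 model decisions; `groups` gives each
    subject's group label, aligned by index.
    """
    rates = {}
    for g in set(groups):
        selected = [o for o, gg in zip(outcomes, groups) if gg == g]
        rates[g] = sum(selected) / len(selected)
    vals = list(rates.values())
    return max(vals) - min(vals)

outcomes = [1, 0, 1, 1, 0, 1, 0, 0]
groups   = ["a", "a", "a", "a", "b", "b", "b", "b"]
gap = demographic_parity_gap(outcomes, groups)  # 0.75 vs 0.25 -> gap of 0.5
THRESHOLD = 0.1  # assumed value; choose per intervention
passes = gap <= THRESHOLD
```

Running this check alongside accuracy testing, rather than after deployment, is the practice the workshop advocates.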

Participants should leave the workshop with a clear set of technical and corporate tasks to move them into an active practice around data privacy and the ethical deployment of automated decision making. Some of these objectives align with requirements in privacy regulations such as GDPR and CCPA, but this is not a compliance workshop. It is an engineering, product development, and ethics workshop.

Bio: Laura Norén is a data science ethicist and researcher currently working in cybersecurity at Obsidian Security in Newport Beach. She holds undergraduate degrees from MIT and a PhD from NYU, where she recently completed a postdoc in the Center for Data Science. Her work has been covered in The New York Times, Canada's Globe and Mail, and on American Public Media's Marketplace program, as well as in numerous academic journals and at international conferences. Dr. Norén is a champion of open source software and those who write it.