
Sample Syllabus

This page presents the syllabus, assignments, and grading breakdown used to teach the advanced discussion section of UC Berkeley’s semester-long CS195 course (Social Implications of Computer Technology) in Fall 2021. You are free to use this syllabus as is or modify it to better suit your teaching needs.

Schedule

Module | Topic | Assigned Materials

N/A | Introduction and Logistics | N/A

Ethics and Values | Professional Conduct
* ACM, “Code of Ethics and Professional Conduct”, 2018.
* Ethical OS, “Risk Mitigation Checklist”, 2018.
* Davide Castelvecchi, “Prestigious AI meeting takes steps to improve ethics of research”, 2020.
* Dan Munro, “Feynman’s Error: On Ethical Thinking and Drifting”, 2018.
* (Skim) Helen Nissenbaum, “Accountability in a computerized society”, 1996.

Ethics and Values | Teaching Ethics
* (Watch) MIT, “Teaching Ethics and Policy in Computer Science”, 2019.
* Shashi Krishna, “Teaching Ethical Computing”, 2020.
* Kay G. Schulze and Frances S. Grodzinsky, “Teaching ethical issues in computer science: what worked and what didn’t”, 1996.
* Jimmy Wu, “Optimize What?”, 2019.
* Paul Karoff, “Embedding ethics in computer science curriculum”, 2019.
* Tom Abate, “How the Computer Science Department is teaching ethics to its students”, 2020.
* Robert Florida, “Weaving Ethics Into Columbia’s Computer Science Curriculum”, 2019.

Privacy, Surveillance, and Free Speech | Overview
* Andrew Yang, “Op-Ed by Andrew Yang: Make tech companies pay you for your data”, 2020.
* Louis Menand, “Why Do We Care So Much About Privacy?”, 2018.
* Douglas MacMillan and Nick Anderson, “Student tracking, secret scores: How college admissions offices rank prospects before they apply”, 2019.
* Jack Poulson, “I Used to Work for Google. I Am a Conscientious Objector.”, 2019.
* BBC News, “Microsoft says error caused ‘Tank Man’ Bing censorship”, 2021.

Privacy, Surveillance, and Free Speech | Vulnerable Populations
* (Watch) Tom Simonite, “CryptoHarlem’s Founder Warns Against ‘Digital Stop and Frisk’”, 2020.
* Joseph Cox, “I Gave a Bounty Hunter $300. Then He Located Our Phone”, 2019.
* Allie Funk, “Apple’s AirTag offers convenience but poses serious threats — and it’s not alone”, 2021.
* Sara Morrison and Adam Clark Estes, “How protesters are turning the tables on police surveillance”, 2020.
* (Skim) William R. Marczak et al., “When Governments Hack Opponents: A Look at Actors and Technology”, 2014.
* (Skim) Sam Biddle, “Police Surveilled George Floyd Protests With Help From Twitter-Affiliated Startup Dataminr”, 2020.

Privacy, Surveillance, and Free Speech | Human Factors
* Alessandro Acquisti, Laura Brandimarte, and George Loewenstein, “Privacy and human behavior in the age of information”, 2015.
* Ivano Bongiovanni, Karen Renaud, and Noura Aleisa, “The privacy paradox: we claim we care about our data, so why don’t our actions match?”, 2020.
* Arielle Pardes, “How Facebook and Other Sites Manipulate Your Privacy Choices”, 2020.
* Genia Kostka, “What do people in China think about ‘social credit’ monitoring?”, 2019.
* (Skim) Institute of Global Health Innovation (IGHI), “Covid-19: Perceptions of Contact Tracing Global Report”, 2020.
* (Skim) Laura Brandimarte and Alessandro Acquisti, “The Economics of Privacy”, 2012.

Software Risks and Case Studies | Case Studies
* Gregory Travis, “How the Boeing 737 MAX Disaster Looks to a Software Developer”, 2019.
* Spencer Woodman, “Palantir Provides the Engine for Donald Trump’s Deportation Machine”, 2017.
* Phil Koopman, “A Case Study of Toyota Unintended Acceleration and Software Safety”, 2014.

Software Risks and Case Studies | Attention and Engagement
* (Watch) Nir Eyal / TED Institute, “What makes some technology so habit-forming?”, 2015.
* Nir Eyal, “The Hooked Model: How to Manufacture Desire in 4 Steps”, 2012.
* Nielsen Norman Group, “The Attention Economy”, 2019.
* Monica Rozenfeld, “How Persuasive Technology Can Change Your Habits”, 2018.
* Trevor Haynes, “Dopamine, Smartphones & You: A battle for your time”, 2018.
* Kevin Roose, “Do Not Disturb: How I Ditched My Phone and Unbroke My Brain”, 2019.
* Chaim Gartenberg, “How do Apple’s Screen Time and Google Digital Wellbeing stack up?”, 2018.
* (Skim) Center for Humane Technology, “How does technology use design to influence my behavior?”, 2021.

Algorithmic Decision-Making | Bias and Fairness
* (Background) Mehran Sahami, “A Very Brief Introduction to Probability and Machine Learning with the Perceptron Algorithm”, 2021.
* Julia Angwin et al., “Machine Bias”, 2016.
* Julia Angwin, “Make Algorithms Accountable”, 2016.
* Sam Corbett-Davies et al., “A computer program used for bail and sentencing decisions was labeled biased against blacks. It’s actually not that clear”, 2016.
* (Skim) Solon Barocas, Moritz Hardt, Arvind Narayanan, “Fairness and Machine Learning”, 2022.
* (Explore) Karen Hao and Jonathan Stray, “Can you make AI fairer than a judge? Play our courtroom algorithm game”, 2019.
* (Explore) Martin Wattenberg et al., “Attacking discrimination with smarter machine learning”, 2016.

Algorithmic Decision-Making | Propaganda and Misinformation
* Verizon, “A Guide to Misinformation: How to Spot and Combat Fake News”, 2020.
* Akos Lada, Meihong Wang, Tak Yan, “How machine learning powers Facebook’s News Feed ranking algorithm”, 2021.
* Karen Hao, “The Facebook whistleblower says its algorithms are dangerous. Here’s why.”, 2021.
* Chris Meserole, “How misinformation spreads on social media—And what to do about it”, 2018.
* Filippo Menczer and Thomas Hills, “Information Overload Helps Fake News Spread, and Social Media Knows It”, 2020.
* (Skim) The Wall Street Journal, “How to Fix Social Media”, 2021.
* (Skim) Renée DiResta, “It’s Not Misinformation. It’s Amplified Propaganda”, 2021.

Technology Law and Policy | Intellectual Property, Copyright, and Patents
* (Background) CS Department at Duke University, “Intellectual Property for CS Students”, 2002.
* (Watch) Computer History Museum, “Software Patent Debate”, 2011.
* Nilay Patel, “The ‘broken patent system’: how we got here and how to fix it”, 2012.
* Richard Stallman, “Misinterpreting Copyright—A Series of Errors”, 2021.
* Cory Doctorow, “America’s broken digital copyright law is about to be challenged in court”, 2016.
* (Skim) Daniel Oberhaus, “The Internet Was Built on the Free Labor of Open Source Developers. Is That Sustainable?”, 2019.

Technology Law and Policy | Government and Technology
* Kim Zetter, “Of Course Congress Is Clueless About Tech—It Killed Its Tutor”, 2016.
* Frank Nagle, “Digital infrastructure is more than just broadband: What the US can learn from Europe’s open source technology policy study”, 2021.
* Dante Disparte and Tomicah Tillemann, “To Move Forward, Federal IT Infrastructure Needs Resiliency”, 2021.
* Brian Naylor, “‘The Fifth Risk’ Paints A Portrait Of A Government Led By The Uninterested”, 2018.
* Technology and Operations Management at Harvard University, “The Failed Launch Of www.HealthCare.gov”, 2016.
* (Skim) Kearney, “IT infrastructure: pillar of digital government”, 2015.

N/A | Conclusion and Wrap-up | N/A

Assessment

In addition to attending the regular weekly lecture, students participate in weekly discussions and write reading responses, lead discussions and prepare presentations informed by their understanding of the material, and engage in a series of experiential and real-world assignments.

  1. Attendance and Discussion Participation (10%)
  2. Reading Responses (10%)

Students are expected to demonstrate their understanding of weekly readings by submitting responses to questions based on the assigned material. Reading responses are due each week before the start of the discussion. Students assigned to lead the discussion are not required to submit the reading response for that week (more information below).

  3. Discussion Leadership (20%)

Each week, we will assign a student to lead the discussion of that week’s material. Discussion leaders will also prepare a 10-minute presentation, which they will deliver to the class at the beginning of the session. Finally, discussion leaders are expected to prepare a set of questions, shared with the other students 24 hours before class, to guide the discussion of the presented material. We strongly encourage you to attend office hours during the week you lead the discussion.

If you are one of the discussion leaders for a class, your primary responsibility is to identify and examine the central point of the reading. Your discussion and presentation should be structured around an exploration of the following questions:

  • What is the central point of the reading? What are the authors trying to say?
  • What did you learn from this reading?
  • How do the authors make their case?
  • How does the reading relate to the theme of the class?

Whether or not you’re the discussion leader, think about these questions as you read and as you prepare for class. Resist the temptation to focus on whether you agree or disagree with the author. Focus instead on the ideas the author is articulating and what you can learn from them.

  4. Assignments (15% each, 60% in total)

Students will complete 4 assignments that focus on the practical implications of the topics covered in this class. Some of these projects will be technical in nature, while others will assess your writing and analytical skills.

Assignment 1: Surveillance, Privacy, and Free Speech

For this assignment, you will be asked to come up with a set of requirements for a fictional app that aims to address a given real-world issue concerning the right to privacy and free speech. As part of this assignment, you will conduct at least two interviews with prospective users of the app, paying special attention to the use cases, desired functionality, and concerns brought up by the interviewees.

You will summarize the interview findings in a short report, describe the main functional requirements of the app, and provide general recommendations to the app developers on how to implement this functionality while respecting user privacy and the right to free speech.

In order to balance the workload, we will form pairs of students to complete this assignment.

Assignment 2: Ethics and Technology

You must pick an article that describes a computer technology (e.g., a platform, an app, or an algorithm) presenting a moral dilemma in its intended or unintended use cases. The article must come from a reputable source: a newspaper, a science or engineering journal or magazine, a specialized publication from another discipline, or similar.

In your paper, you will be asked to identify the moral dilemma and the ethical considerations of using this technology, explaining its benefits and dangers. You will then provide 2-3 independent, persuasive arguments for using the technology and 2-3 arguments against its use. Based on your arguments, you will provide a recommendation as to whether this technology should be used (explaining how to mitigate the potential dangers) or not (explaining how to achieve similar benefits in another way). In drawing your conclusion, be sure to explain which actors should have a say in how this technology is distributed, employed, and regulated.

You will need to complete this assignment individually.

Assignment 3: Software Risks and Algorithmic Bias

In this assignment, you will gain experience developing a machine learning algorithm and assessing the resulting model against a number of criteria. Your investigation will include not only inspecting the code of the machine learning algorithm itself (the kind of transparency some lawmakers argue is a necessary safeguard when deploying algorithms for high-stakes decision-making), but also assessing the “fairness” of the model the algorithm produces. You will be asked to determine what you would do to make the resulting decision-making model more “fair,” and to justify this position in a short writeup.
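To make the fairness assessment concrete, here is a minimal, hypothetical Python sketch (not the assignment’s starter code; all data, group labels, and thresholds below are synthetic assumptions). It trains a perceptron-style classifier, in the spirit of the Sahami background reading, and then compares false positive, false negative, and selection rates across a made-up sensitive group, which is the kind of group-conditioned criterion you will be asked to report and discuss.

```python
# Minimal sketch: train a simple classifier on synthetic data, then compare
# error rates across a synthetic sensitive attribute. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset: two features, a binary sensitive attribute, and a label.
n = 2000
group = rng.integers(0, 2, size=n)                 # hypothetical demographic group (0 or 1)
X = rng.normal(size=(n, 2)) + group[:, None] * 0.5
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0.5).astype(int)

# Perceptron-style training (cf. the Sahami background reading).
w = np.zeros(X.shape[1] + 1)                       # weights plus bias term
Xb = np.hstack([X, np.ones((n, 1))])
for _ in range(20):                                # a few passes over the data
    for xi, yi in zip(Xb, y):
        pred_i = int(xi @ w > 0)
        w += (yi - pred_i) * xi                    # update only on mistakes

pred = (Xb @ w > 0).astype(int)

# Group-conditioned metrics: the "fairness" criteria to compare and discuss.
for g in (0, 1):
    mask = group == g
    fpr = np.mean(pred[mask][y[mask] == 0] == 1)   # false positive rate in group g
    fnr = np.mean(pred[mask][y[mask] == 1] == 0)   # false negative rate in group g
    sel = np.mean(pred[mask])                      # rate of positive predictions in group g
    print(f"group {g}: FPR={fpr:.2f}  FNR={fnr:.2f}  selection rate={sel:.2f}")
```

A gap in these per-group rates is the starting point for your writeup: you would explain which criterion you prioritize and what change to the data, algorithm, or decision threshold you would make to narrow it.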

In order to balance the workload, we will form pairs of students to complete this assignment and ensure that at least one team member has prior programming experience.

Assignment 4: Final Reflection

In this assignment, you will write a final reflection essay responding to the provided prompt.

You will need to complete this assignment individually.