Course: CS 3001A Computing, Society, and Professionalism
Term: Spring 2024
Location: Scheller College of Business 100
Lecture Time: 11:00 am - 11:50 am, Mondays and Wednesdays
Section Times: Check Oscar or Buzzport for your discussion section time; all sections held in-person
Discussion: Piazza, link on course Canvas
Virtual Office Hours with Instructor: By Appointment; meeting link on course Canvas
Virtual Office Hours with Section TA: Check your section Canvas
Head Teaching Assistant:
Sadie Palmer
Teaching Assistants:
Aashna Doshi [A17, A35]
Akhilesh Sivaganesan [A18]
Alexander Goebel [A16, A10]
Anish Manne [A12, A22]
Ayman Ismail [A07, A15]
Chinar Dankhara [A08, A27]
Jonathan Satterfield [A09, A13]
Mehar Johal [A34]
Niharika Gali [A05, A29]
Parth Desai [A19, A26]
Ritvik Ramakrishnan [A33, A23]
Rithvik Rajavelu [A21, A25]
Shreya Santhanagopalan [A20, A28]
Sushree Pani [A02, A32]
Swathi Ram [A04, A06]
Vedic Panda [A03, A24]
Yash Gupta [A01, A11]
Yesha Jigneshkumar Thakkar [A14, A31]

Learning Objectives

In this class, you will learn about:

  • Ethics: What do "right" and "wrong" mean anyway? How is "ethical" different from "legal"? We'll learn about several philosophical approaches to ethics including utilitarianism, Kantianism, social contract theory, and virtue ethics. The goal is for students to be able to address ethical dilemmas with reasoned arguments, grounded in a combination of these ethical theories.
  • Professional Ethics: What special responsibilities do we have as computing professionals? What do the Software Engineering Code of Ethics and ACM Code of Ethics say, and how can we use these in our daily practice?
  • Computing and Society: In what ways does computer technology impact society? We'll talk about a host of issues including privacy, intellectual property, freedom of speech, and social justice.

Assignments and Grading

Recitation Section Attendance and Participation - 15%
Homework Assignments (4 in all) - 25%
      Homework Assignment I | Homework Assignment II | Homework Assignment III | Homework Assignment IV
Term Paper - 25%
      Term Paper Proposal | Term Paper Outline | Full Term Paper
In-Person Midterm - 15%
In-Person Final Exam - 20%
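For the curious, the weighted breakdown above can be sketched as a small computation. This is only an illustration, not the official grading tool; the component names and the example scores below are made up, and it assumes each component is scored on a 0-100 scale:

```python
# Course grade weights, taken from the breakdown above.
WEIGHTS = {
    "participation": 0.15,
    "homework": 0.25,
    "term_paper": 0.25,
    "midterm": 0.15,
    "final_exam": 0.20,
}

def final_grade(scores):
    """Weighted average of component scores (each on a 0-100 scale)."""
    return sum(WEIGHTS[name] * score for name, score in scores.items())

# Hypothetical example scores:
example = {
    "participation": 100,
    "homework": 88,
    "term_paper": 92,
    "midterm": 80,
    "final_exam": 85,
}
print(round(final_grade(example), 2))
```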

Sample midterm exercises are here, a sample final exam is here, and a sample term paper titled "Ethics of Workplace Surveillance" is here (shared with the student's permission).

Attendance and Participation. Attendance and participation are core aspects of this course. Attendance in the recitation sections is required and will be noted by your section TA. If you need to miss a recitation section for a legitimate reason, please send an email to the Head TA and copy the TA of your section, preferably before class.

You may miss up to two recitation sections without it affecting your grade. Legitimate excused reasons include illness or health emergencies, religious observances, attending a professional conference, or a job interview.

Please note that exams draw heavily on material covered in the lectures and in the recitation section discussions. Each week's recitation discussions build on that week's lectures. If you miss a recitation section, please get notes from a classmate and follow along using the recitation class exercises posted on the course website.

Reference Format. Please use APA format for all references. APA format is described here.

ESL. If English is not your first language, you may request to not be graded on your writing for a particular individual assignment, including the term paper. This means you won't be penalized for bad writing, but you also won't get credit for good writing. To take advantage of this option, you must mark "ESL" (English as a Second Language) on the first page of your assignment/paper. This option is not available for group assignments. We still of course expect you to try to write in correct English, and will do our best to offer useful feedback on your writing.

Late Policy. Students must submit all materials on or before the deadline to qualify for full credit. Submissions up to 24 hours late incur a 25% penalty; submissions up to 48 hours late incur a 50% penalty. Materials submitted more than 48 hours after the deadline will not be accepted and will receive a grade of zero.
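The late policy is effectively a step function of hours past the deadline. As a minimal sketch (the function name and the handling of exact 24/48-hour boundaries are assumptions, not official policy):

```python
def late_credit(hours_late):
    """Fraction of credit earned under the late policy.

    On time: 100%; up to 24 hours late: 75%; up to 48 hours
    late: 50%; beyond 48 hours: not accepted (zero credit).
    """
    if hours_late <= 0:
        return 1.00
    elif hours_late <= 24:
        return 0.75
    elif hours_late <= 48:
        return 0.50
    return 0.0
```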

Honor Code. This class abides by the Georgia Tech Honor Code. All assigned work is expected to be individual, except where explicitly stated otherwise. You are encouraged to discuss the assignments with your classmates (or AI; more below); however, what you hand in should be your own work. We will be thoroughly checking for plagiarism (using built-in tools on Canvas), and any instance will be handled according to the Institute's recommended protocol, including reporting to the Office of Student Integrity.

Policy on Using AI Technologies. This policy is adapted from one by David Joyner.

We treat AI-based assistance, such as ChatGPT, the same way we treat collaboration with other people: you are welcome to talk about your ideas and work with other people, both inside and outside the class, as well as with AI-based assistants. However, all work you submit must be your own. You should never include in your assignment anything that was not written directly by you without proper citation (including quotation marks and in-line citation for direct quotes). Including anything you did not write in your assignment without proper citation will be treated as an academic misconduct case.

If you are unsure where the line is between collaborating with AI and copying from AI, we recommend the following heuristics:

- Never hit "Copy" within your conversation with an AI assistant. You can copy your own work into your conversation, but do not copy anything from the conversation back into your assignment. Instead, use your interaction with the AI assistant as a learning experience, then let your assignment reflect your improved understanding.

- Do not have your assignment and the AI agent open at the same time. Similar to above, use your conversation with the AI as a learning experience, then close the interaction down, open your assignment, and let your assignment reflect your revised knowledge. This heuristic includes avoiding using AI directly integrated into your composition environment: just as you should not let a classmate write content or code directly into your submission, so also you should avoid using tools that directly add content to your submission.

Deviating from these heuristics does not automatically qualify as academic misconduct; however, following these heuristics essentially guarantees your collaboration will not cross the line into misconduct.

Use of grammar checkers and spelling verifiers is allowed without restriction.

Mental Health. It can be hard to prioritize your health as a college student, especially when you are pushed to prioritize academics, work, and extracurricular activities. The instructor and Head TA are happy to talk to you privately if you need mental health related accommodations. Please also refer to the various campus resources to access timely, professional help as well as self-care tips.

Required Texts

  • Ethics for the Information Age, by Michael Quinn (You may rent an electronic copy rather than buying it; you will need the 8th edition)
  • Articles available online.

The GT Library's Web Localizer is useful for research you need to do for this class. You may also need it to access some assigned readings, such as those on the ACM Digital Library.

Class Schedule *

Week 1 8-Jan Welcome and Overview Lecture Slides
10-Jan Case Study: Therac-25 Lecture Slides
Section Handout
Section Discussion Guide
Week 2 15-Jan MLK Day - No Class
17-Jan Do Artifacts Have Politics? Lecture Slides
Recorded Lecture; No In-Person Class
Section Handout
Section Discussion Guide
Week 3 22-Jan Utilitarianism Lecture Slides
Homework 1 Due
24-Jan Deontology Lecture Slides
Section Handout
Section Discussion Guide
Week 4 29-Jan Social Contract Theory Lecture Slides
31-Jan Stakeholder Analysis Lecture Slides
Homework 2 Due
Section Handout
Section Discussion Guide
Week 5 5-Feb Virtue Ethics Lecture Slides
7-Feb Professional Ethics Lecture Slides
Section Handout
Section Discussion Guide
Week 6 12-Feb Privacy I Lecture Slides
14-Feb Privacy II Lecture Slides
Term Paper Proposals Due
Section Handout
Section Discussion Guide
Week 7 19-Feb Privacy and Governmental Surveillance Lecture Slides
21-Feb The USA Patriot Act Lecture Slides
Section Handout
Section Discussion Guide
Week 8 26-Feb Midterm Office Hours
No in-person class
28-Feb In-Class Midterm
No sections
Week 9 04-Mar Freedom of Speech Lecture Slides
06-Mar Regulating Online Speech Lecture Slides
Section Handout
Section Discussion Guide
Week 10 11-Mar Online Radicalization and Polarization Lecture Slides
13-Mar Intellectual Property Lecture Slides
Homework 3 Due
Section Handout
Section Discussion Guide
Week 11 18-Mar Spring Break - No Class
20-Mar Spring Break - No Class
Week 12 25-Mar Algorithmic Surveillance Lecture Slides
27-Mar Algorithmic Manipulation Lecture Slides
Term Paper Outline Due
Section Handout
Section Discussion Guide
Week 13 01-Apr Future of Work, Automation, Predictive Technologies Lecture Slides
03-Apr Algorithmic Bias and Fairness Lecture Slides
Section Handout
Section Discussion Guide
Week 14 08-Apr Generative AI Lecture Slides
10-Apr Research Ethics Lecture Slides
Homework 4 Due
Section Handout
Section Discussion Guide
Week 15 15-Apr The Big Picture
17-Apr Bonus: Implications of Social Media on Health
Material not on exam
Section Handout
Week 16 22-Apr Term Paper Presentations
Final Term Papers Due


Weekly Readings*

Week 1 (Jan 10): Case Study: Therac-25
• Quinn Chapter 8.5
• Leveson, N. (1995). Medical devices: The therac-25. Appendix of: Safeware: System Safety and Computers. [pdf]
     - Radiation Offers New Cures, and Ways to Do Harm, The New York Times [link]
• Ganesh, T. (2014). Incident reporting and learning in radiation oncology: Need of the hour. Journal of Medical Physics/Association of Medical Physicists of India, 39(4), 203. [pdf]

Week 2 (Jan 17): Do Artifacts Have Politics?
• Winner, L. (1980). Do artifacts have politics?. Daedalus, 121-136. [pdf]
     - Planet Money episode 924: Stuck in China's Panopticon (time: 26:36) [link]
• Gillespie, T. (2014). The relevance of algorithms. Media technologies: Essays on communication, materiality, and society, 167, 167. [pdf]
• [Optional] Facebook's own data is not as conclusive as you think about teens and mental health, NPR, by Anya Kamenetz [link]

Week 3 (Jan 22): Utilitarianism
• Quinn Chapter 2.1-2.3, 2.7-2.8

Week 3 (Jan 24): Deontology
• Quinn Chapter 2.6
• Rachels, J (2011). The Elements of Moral Philosophy (7th Ed), Chapter 9: “Are there absolute moral rules?” [pdf]

Week 4 (Jan 29): Social Contract Theory
• Quinn Chapter 2.9
• Doody, R (2015). Criticisms of Rawls' Principles of Justice. [pdf]
• [Optional] An excerpt adapted from The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, by Shoshana Zuboff [link]

Week 4 (Jan 31): Stakeholder Analysis
• Quinn 2.12
• Gilligan, C. (1993). In a different voice. Harvard University Press, pp. 24-39 [PDF on Canvas>Files]

Week 5 (Feb 5): Virtue Ethics
• Quinn Chapter 2.10-2.11
• Hursthouse, Rosalind and Pettigrove, Glen, "Virtue Ethics", The Stanford Encyclopedia of Philosophy (Winter 2018 Edition), Edward N. Zalta (ed.) [pdf]
• [Optional] AI At The Crossroads: Navigating Job Displacement, Ethical Concerns, And The Future Of Work, Forbes, by Benjamin Laker [link]

Week 5 (Feb 7): Professional Ethics
• Quinn Chapter 9
• ACM Code of Ethics and Professional Conduct
• Anderson, R. E., Johnson, D. G., Gotterbarn, D., & Perrolle, J. (1993). Using the new ACM code of ethics in decision making. Communications of the ACM, 36(2), 98-107. [pdf]

Week 6 (Feb 12): Privacy I
• Quinn Chapter 5
• O'Hara, K. (2013). Are we getting privacy the wrong way round?. IEEE Internet Computing, 17(4), 89-92. [pdf]
     - Zuckerberg: I know that people don't want privacy, CNet [link]

Week 6 (Feb 14): Privacy II
• Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514. [pdf]
• Ohm, P. (2009). Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review, 57, 1701. [pdf]
     - "Anonymized" data really isn't and here’s why not, Ars Technica [link]
• The Solace of Oblivion: In Europe, the right to be forgotten trumps the Internet, The New Yorker [link]
     - Google wins landmark right to be forgotten case, BBC [link]

Week 7 (Feb 19): Privacy and Governmental Surveillance
• Quinn Chapter 6 (except 6.6)
• Botsman, R. (2017). Big data meets Big Brother as China moves to rate its citizens. Wired UK, 21. [pdf]
• The NSA's Secret Role in the US Assassination Program, The Intercept [link]

Week 7 (Feb 21): The USA PATRIOT Act
• Quinn Chapter 6.6
• Wikipedia articles: “The Foreign Intelligence Surveillance Act” and “National Security Letters”
• Asinari, M. V. P., & Poullet, Y. (2004). The airline passenger data disclosure case and the EU-US debate. Computer Law & Security Review, 20(2), 98-116. [pdf]

Week 9 (Mar 4): Freedom of Speech and Networked Communications
• Quinn Chapter 3.2, 3.5-3.8; skim the rest of the chapter
• Warf, B. (2011). Geographies of global Internet censorship. GeoJournal, 76(1), 1-23. [pdf]

Week 9 (Mar 6): Regulating Online Speech
• Chandrasekharan, E., Pavalanathan, U., Srinivasan, A., Glynn, A., Eisenstein, J., & Gilbert, E. (2017). You can't stay here: The efficacy of reddit's 2015 ban examined through hate speech. Proceedings of the ACM on Human-Computer Interaction, 1(CSCW), 1-22. [pdf]
     - Reddit bans 'Fat People Hate' and other subreddits under new harassment rules, The Verge [link]
• Chancellor, S., Pater, J. A., Clear, T., Gilbert, E., & De Choudhury, M. (2016, February). #thyghgapp: Instagram content moderation and lexical variation in pro-eating disorder communities. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (pp. 1201-1213). ACM. [pdf]
• Nott, L (2019) Does the First Amendment Protect Speech Made By Artificial Intelligence? Freedom Forum Institute [link]

Week 10 (Mar 11): Online Radicalization and Polarization
• Garrett, R. K. (2009). Echo chambers online?: Politically motivated selective exposure among Internet news users. Journal of computer-mediated communication, 14(2), 265-285. [pdf]
• Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130-1132. [pdf]
• Eslami, M., Rickman, A., Vaccaro, K., Aleyasen, A., Vuong, A., Karahalios, K., ... & Sandvig, C. (2015, April). I always assumed that I wasn't really that close to [her]: Reasoning about Invisible Algorithms in News Feeds. In Proceedings of the 33rd annual ACM conference on human factors in computing systems (pp. 153-162). ACM. [pdf]

Week 10 (Mar 13): Intellectual Property
• Quinn Chapter 4.1-4.10

Week 12 (Mar 25): Algorithmic Surveillance
• Tufekci, Z. (2014). Engineering the public: Big data, surveillance and computational politics. First Monday, 19(7). [pdf]
     - Facebook Increasingly Reliant on A.I. To Predict Suicide Risk, NPR [link]
     - Policing the Future: In the aftermath of Ferguson, St. Louis cops embrace crime-predicting software, The Verge [link]
     - Colleges are turning students’ phones into surveillance machines, tracking the locations of hundreds of thousands, Washington Post [link]

Week 12 (Mar 27): Algorithmic Manipulation
• Starbird, K. (2017, May). Examining the alternative media ecosystem through the production of alternative narratives of mass shooting events on Twitter. In Eleventh International AAAI Conference on Web and Social Media. [pdf]
• Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151. [pdf]
• Bessi, A., & Ferrara, E. (2016). Social bots distort the 2016 US Presidential election online discussion. First Monday, 21(11-7). [pdf]

Week 13 (Apr 1): Future of Work, Automation, and Predictive Technologies
• Rotman, D. (2013). How technology is destroying jobs. Technology Review, 16(4), 28-35. [pdf]
     - Imagine Discovering That Your Teaching Assistant Really Is a Robot, The Wall Street Journal [link]
• Goldhahn, J., Rampton, V., & Spinas, G. A. (2018). Could artificial intelligence make doctors obsolete?. BMJ, 363, k4563. [pdf]
• Lazer, D., Kennedy, R., King, G., & Vespignani, A. (2014). The parable of Google Flu: traps in big data analysis. Science, 343(6176), 1203-1205. [pdf]

Week 13 (Apr 3): Algorithmic Bias and Fairness
• Edelman, B., Luca, M., & Svirsky, D. (2017). Racial discrimination in the sharing economy: Evidence from a field experiment. American Economic Journal: Applied Economics, 9(2), 1-22. [pdf]
• Kay, M., Matuszek, C., & Munson, S. A. (2015, April). Unequal representation and gender stereotypes in image search results for occupations. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (pp. 3819-3828). [pdf]
• Kosinski, M., Stillwell, D., & Graepel, T. (2013). Private traits and attributes are predictable from digital records of human behavior. Proceedings of the National Academy of Sciences, 110(15), 5802-5805. [link]
• Wang, Y., & Kosinski, M. (2018). Deep neural networks are more accurate than humans at detecting sexual orientation from facial images. Journal of personality and social psychology, 114(2), 246. [pdf]
    - Researchers use facial recognition tools to predict sexual orientation. LGBT groups aren’t happy, Washington Post [link]

Week 14 (Apr 8): Generative AI
• Moor, M., Banerjee, O., Abad, Z. S. H., Krumholz, H. M., Leskovec, J., Topol, E. J., & Rajpurkar, P. (2023). Foundation models for generalist medical artificial intelligence. Nature, 616(7956), 259-265. [link]
• Zhou, J., Zhang, Y., Luo, Q., Parker, A. G., & De Choudhury, M. (2023). Synthetic Lies: Understanding AI-Generated Misinformation and Evaluating Algorithmic and Human Solutions. In Proceedings of CHI, 2023. [pdf]

Week 14 (Apr 10): Research Ethics
• Kramer, A. D., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788-8790. [pdf]
• Vitak, J., Shilton, K., & Ashktorab, Z. (2016, February). Beyond the Belmont principles: Ethical challenges, practices, and beliefs in the online data research community. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (pp. 941-953). ACM. [pdf]
• Chancellor, S., Birnbaum, M. L., Caine, E. D., Silenzio, V., & De Choudhury, M. (2019, January). A taxonomy of ethical tensions in inferring mental health states from social media. In Proceedings of the Conference on Fairness, Accountability, and Transparency (pp. 79-88). ACM. [pdf]




* Topics to be covered and the corresponding readings are subject to change. Please always check the online schedule and the list of class readings.

Acknowledgments: Class materials adapted from the offering by Amy Bruckman.