About 2 hours
- Lesson: ~1 hour
- Guided Practice: ~30 min
- Independent Practice: ~20 min
- Check for Understanding: ~10 min
Software is advancing rapidly, both in raw technological capability and in its ubiquity in everyday human life. The pace of development makes it hard to understand potential impacts early enough to pass laws or enact measures that protect against negative side effects. This content is meant to help participants identify problematic technology and consider their own involvement with it.
- Understand why software ethics is one of the most important topics today
- Identify situations where software could do harm
- Consider one's own responsibility
- Privacy
- Human and Machine Bias
- Violence
- Agency
- ACM Code of Ethics and Professional Conduct
- Introduction to Software Engineering Ethics
- 10-min article: Can you program ethics into a self-driving car?
- 5-min article: Germany Creates Ethics Rules for Autonomous Vehicles
- 10-min interactive ethics tool: Moral Machine by MIT
- Spend 10 minutes reading the ACM Code of Ethics and Professional Conduct. Start by reading the preamble and all the headers before getting into the explanations. Once the 10 minutes are up, move on to part 2.
- Spend 45 minutes working through pages 1-33 of this Intro to Ethics lesson with exercises by Santa Clara University.
- Video: “Challenges in Ethics and Computing” at ACM
- 30-min article: Why Machine Ethics?
- Joy Buolamwini's "How I'm fighting bias in algorithms" TEDx talk
Tracking and collecting user information is a double-edged sword: it helps software better serve user needs, but at the expense of user privacy. It's probably fine when there's clear consent, but oftentimes people have no idea that they're being tracked, or that they themselves are the marketable product of the software they use. Targeted ads seem harmless, but imagine walking into a shopping mall and being confronted by a robotic agent that takes advantage of your internet cookies, your map locations, and body vitals like pupil dilation and breathing rate to make a sale. In one well-known case, Target figured out that a teenage girl was pregnant before she had told her parents; her parents found out because the household started getting ads for diapers and formula.
There is an argument that humans are too emotional to make difficult decisions, like determining bail amounts or recidivism risk in the justice system, and that machines running our code might be better suited for the job. Of course, the burning issue is that responsibility gets transferred to electrical boxes that amplify our biases in an infinite loop. Remember when Amazon finally admitted, after a few years, that its secret AI hiring tool was sexist as hell? The rules that machines and AI use to operate are written by humans, so humans can and do insert their biases into machines.
Software can be better than humans at certain things, and being murderously calculating is probably one of them. Weaponized drones and robots are the obvious examples, but tools like mass surveillance also have oppressive potential. Cambridge Analytica and bullying on Twitter are examples of software companies neglecting concerning activity on their platforms in the name of user growth and engagement.
It's easy for the individual to feel powerless, but it took just seven engineers to compel Google to let its lucrative Pentagon contract expire. Do not assume the people who assign tasks have considered their ethical implications; they are often most concerned with fixing a bug or rolling out a new feature, and it's very possible that you are the only one on the team who has thought about the long-term consequences. Speak your mind, and be prepared to walk away. Life's too short to be screwing people over.
Software Ethics concerns people, not code quality. Touting proper coding practices can be helpful but is not the point of this exercise.
Form small groups to discuss the following questions:
- Why is ethics important in software?
- What are some examples of how a breach of ethics in software can cause harm?
- What can you do as an individual or as part of a group to maintain a high ethical standard in software?
- What do you think is the best way to bring up personal ethical concerns with members of your team?
- Read these two short articles:
- Can you program ethics into a self-driving car?
- Germany Creates Ethics Rules for Autonomous Vehicles
- Work through the Moral Machine by MIT. It's a heavy subject, but there are no "right" or "wrong" answers. Check out your final results - they will show you what you valued as you made your choices.
- Choose 5 principles from the ACM Code of Ethics that you think especially apply to the autonomous vehicle dilemma, and write them down.
- Imagine you're at your first engineering job, and you've been asked to create a feature that will auto-replace people's names with the name they had during elementary school. What groups could this feature negatively affect, and how? Brainstorm what you would say to your team lead to explain the ethical implications of this feature.
Form small groups and spend 10 minutes discussing your 5 chosen principles, and what you encountered in the articles and the exercise from Independent Practice.