4000246 Advanced Theories | Socioinformatics | Computer says no - why algorithms discriminate and what we can do about it

Course offering details

Instructors: Prof. Dr. Katharina Zweig

Event type: Seminar

Org-unit: Graduate School | ZUGS

Displayed in timetable as: Advanced Theories

Credits: 4.0

Location: Campus der Zeppelin Universität

Language of instruction: English

Min. | Max. participants: 5 | 15

Course content:
In the USA, software is used to predict whether a convicted criminal will reoffend. It has been shown that this software wrongly places African Americans in the high-risk group twice as often as Whites – this sounds fairly discriminatory, doesn't it? In Austria, software was recently tested to predict the job market success of unemployed persons. It turned out to give a higher risk score to women, persons over 50, and caregivers. Surely, this is discriminatory! Should this software nevertheless be used?
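
To make the headline statistics concrete: both comparisons above are, at their core, comparisons of error rates between groups. A minimal Python sketch, using invented numbers that merely mimic the reported pattern (not the actual COMPAS or AMAS data):

# Illustrative only: hypothetical counts, not real data.
# The false positive rate answers: of the people who did NOT reoffend,
# what share was wrongly labelled high-risk?

def false_positive_rate(false_positives: int, true_negatives: int) -> float:
    return false_positives / (false_positives + true_negatives)

# Hypothetical per-group counts among people who did not reoffend:
groups = {
    "group A": {"false_positives": 450, "true_negatives": 550},
    "group B": {"false_positives": 230, "true_negatives": 770},
}

for name, counts in groups.items():
    print(name, round(false_positive_rate(**counts), 2))
# group A 0.45
# group B 0.23
# A ratio near 2 means one group is wrongly placed in the high-risk
# group about twice as often as the other -- the pattern described above.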

Both examples belong to the large class of so-called algorithmic decision making systems (ADM systems), which are becoming more and more common. The examples show that – although the results are computed – they are not objective. On the contrary, they can be outright discriminatory. However, evaluating the whole picture is much more complicated. In this course, Prof. Dr. Katharina Zweig, bestselling author and designer of the new field of study called "Socioinformatics", will explain the basic technology behind ADM systems. She will explain how discrimination gets into these systems, who is responsible for it, and how the responsible persons can be held accountable. She will also explain why, from a computer science perspective, there is (almost) no discrimination-free system, and when it can even be necessary to have a discriminatory one. The course uses interactive elements to let you experience what an algorithm is and to show you how to compute so-called fairness measures.
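
The fairness measures mentioned at the end can take many forms; as a hedged preview (the course may use different definitions), here are two measures that are common in the ADM literature, again with invented numbers:

# A sketch of two common fairness measures; definitions vary across the
# literature, so treat this as one possible formalization.

def statistical_parity_difference(pos_rate_a: float, pos_rate_b: float) -> float:
    # Difference in how often each group receives the "high-risk" label,
    # regardless of the true outcome; 0.0 means equal labelling rates.
    return pos_rate_a - pos_rate_b

def equal_opportunity_difference(tpr_a: float, tpr_b: float) -> float:
    # Difference in true positive rates: among people who really did have
    # the predicted outcome, how often was each group labelled correctly?
    return tpr_a - tpr_b

print(round(statistical_parity_difference(0.61, 0.34), 2))  # 0.27 -> very unequal labelling
print(round(equal_opportunity_difference(0.72, 0.69), 2))   # 0.03 -> nearly equal

When the base rates of the predicted outcome differ between groups, it is mathematically impossible to equalize all such measures at once – one way to read the claim that there is (almost) no discrimination-free system.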

No prior knowledge of the subject is required. The course is open to students from all disciplines – the more, the better. There is only one prerequisite: be prepared to bring in the perspective of your own field of study. It is very much appreciated.

Schedule (20 × 45 minutes):

1st lesson (45 minutes): Introduction of the instructor, the participants, and their main motivations for attending the course. Depending on the size of the course: an introductory definition of discrimination and a first interactive exercise: define discrimination.
2nd lesson (90 minutes): The ABC of computer science: A as in Algorithm, B as in Big Data, and C as in Computer Intelligence.
3rd lesson (90 minutes): How does machine learning work? Correlation finders and how they can go wrong (see the sketch after this list). Can they be better than humans?
4th lesson (135 minutes): How discrimination can be evaluated and measured – a computer science perspective.
5th lesson (45 minutes): How does discrimination get into the software?
6th lesson (90 minutes): COMPAS – an ADM system to predict recidivism.
7th lesson (90 minutes): AMAS – an ADM system to predict success on the job market.
8th lesson (90 minutes): Establishing accountability for ADM systems.
9th lesson (45 minutes): Explanation of the final assignment.
10th lesson (90 minutes): Science communication.
11th lesson (90 minutes): Guided research to find a suitable ADM system for the final assignment and first ideas on how to present it.
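
The "correlation finders" of lesson 3 can be previewed with a toy sketch: an analysis that never sees a protected attribute can still pick up a statistical proxy for it. The data below is entirely synthetic (invented here, not course material) and only illustrates the mechanism:

# Synthetic example of a correlation finder "going wrong": the label
# reflects biased historical practice, and a seemingly neutral feature
# (postal district) is a proxy for group membership.
import random
import statistics  # statistics.correlation requires Python 3.10+

random.seed(0)

districts, labels = [], []
for _ in range(1000):
    group = random.random() < 0.5                                       # protected attribute
    district = int(group) if random.random() < 0.9 else 1 - int(group)  # ~90% aligned proxy
    label = int(random.random() < (0.6 if group else 0.3))              # biased historical labels
    districts.append(district)
    labels.append(label)

# The analysis never uses 'group', yet the proxy carries the bias:
print(round(statistics.correlation(districts, labels), 2))  # clearly positive

A learner that simply picks the strongest correlate would select the district feature and thereby import the historical bias – one concrete answer to lesson 5's question of how discrimination gets into the software.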

Educational objective:
At the end of this course, you will have the following competencies:

1) You will be able to explain what an algorithm is and how it differs from a heuristic (a toy contrast follows this list).
2) You will know what big data is and why it is the basis for machine learning.
3) You will know at least three different machine learning methods whose basic technical principles you can explain.
4) You will be able to name five different ways in which a statistical model can make discriminatory decisions.
5) You will be able to apply a model of who is responsible for which decisions in the long chain of responsibilities – from data collection to the use of an algorithmic decision making system in a social process.
6) You will be able to research the technology behind a (possibly) discriminatory ADM system, identify the main problems that caused the discrimination (if any), and evaluate whether that discrimination is likely to be useful or harmful in the way it is embedded in a social process. You will be able to identify the main actors and describe a way in which they can be held accountable.
7) You will be able to communicate this knowledge to the general public in an engaging video.
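
As a toy illustration of objective 1 (invented here, not course material): the coin change problem contrasts an exact algorithm, which is guaranteed to find the optimum, with a greedy heuristic, which is simpler but carries no such guarantee:

# Algorithm vs. heuristic on the coin change problem:
# how few coins are needed to pay a given amount?

def optimal_coin_count(coins: list[int], amount: int) -> float:
    # Exact algorithm: dynamic programming, guaranteed minimal count.
    best = [0.0] + [float("inf")] * amount
    for a in range(1, amount + 1):
        best[a] = min((best[a - c] + 1 for c in coins if c <= a),
                      default=float("inf"))
    return best[amount]

def greedy_coin_count(coins: list[int], amount: int) -> int:
    # Heuristic: always take the largest coin that still fits.
    # Fast and usually fine, but with no guarantee of optimality.
    n = 0
    for c in sorted(coins, reverse=True):
        n += amount // c
        amount %= c
    return n

coins = [1, 3, 4]
print(optimal_coin_count(coins, 6))  # 2.0 -> 3 + 3
print(greedy_coin_count(coins, 6))   # 3   -> 4 + 1 + 1, plausible but not optimal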

Further information about the exams:
You can choose from a range of situations in which software was used that may or may not have made discriminatory decisions – alternatively, you can use any other example you find interesting after discussing your choice with the instructor. Your first task is to research the situation as thoroughly as possible. Find out which technology was actually used and how the discrimination was detected or measured. Could it ultimately be proven that the decisions were discriminatory? In your view, is the discrimination necessary, useful, or harmful?

Your second task is to explain the situation to other people working in your own field by producing a video of 10-12 minutes. Find the right level of detail so that they can understand the technology and the accusation of discrimination (at least 2 minutes for the technology; 2-3 minutes to explain how the discrimination was measured). Evaluate the degree of discrimination and whether you consider it necessary, harmful, or even useful (2-3 minutes). Finally, there is an open question for you: give us your viewpoint on the situation from the perspective of your own field. What came to mind while you researched it? What method from your field could be applied to alleviate or solve one of the problems you found? What research question arises from the situation that could be answered with methods from your field? Feel free to elaborate on anything from your field (approx. 2 minutes).

The video needs to adhere to scientific standards, i.e., all cited facts need to be referenced with credible sources. The video is evaluated along the following dimensions:
1) Correctness of the presented facts;
2) Creativity in the presentation;
3) The didactic concept.

The video should not be longer than 15 minutes.

Mandatory literature:
[1] Katharina A. Zweig: "Network Analysis Literacy", Springer Verlag Wien, 2016
[2] Katharina A. Zweig, Tobias D. Krafft, Anita Klingel, Enno Park: "Sozioinformatik", Hanser Verlag München, 2021
[3] Katharina Zweig: "Ein Algorithmus hat kein Taktgefühl", Heyne Verlag München, 2019

Appointments
Date From To Room Instructors
1 Wed, 31. Aug. 2022 13:30 17:00 Z | NICHT BUCHEN | Cor | SMH | LZ 5 Prof. Dr. Katharina Zweig
2 Thu, 1. Sep. 2022 09:00 17:00 Z | NICHT BUCHEN | Cor | SMH | LZ 5 Prof. Dr. Katharina Zweig
3 Fri, 2. Sep. 2022 09:00 17:00 Z | NICHT BUCHEN | Cor | SMH | LZ 5 Prof. Dr. Katharina Zweig
Course specific exams
Description Date Instructors Compulsory pass
1. Andere Prüfungsleistung (other type of assessment) No Date Yes