Dr. Goda Klumbytė

Lecturer
Research areas
Comparative Politics and Political Sociology
Political Theory
Additional info
goda.klumbyte@uni-kassel.de


Goda Klumbytė is an interdisciplinary scholar working at the intersection of informatics and the humanities and social sciences. Her research engages feminist new materialism, posthumanism, human-computer interaction, and algorithmic systems design. She is currently working on feminist approaches to explainability in AI and machine learning within the project “AI Forensics” (funded by the Volkswagen Foundation) at the University of Kassel.

In her doctoral research, conducted at the Participatory IT Design department of the University of Kassel, Germany, she investigated the epistemic premises of machine learning as a tool of knowledge production and proposed ways of working with intersectional feminist and new materialist epistemologies towards more contextualized and accountable machine learning systems design.

She co-edited “More Posthuman Glossary” with R. Braidotti and E. Jones (Bloomsbury, 2022) and has published work in “Posthuman Glossary” (Braidotti & Hlavajova, 2018), “Everyday Feminist Research Praxis” (Leurs & Olivieri, 2015), and the journals Online Information Review, Digital Creativity, and ASAP/Journal, as well as presenting at informatics conferences such as ACM CHI, NordiCHI, and FAccT. She is one of the editors of the critical computing blog “engines of difference”.

Research projects and grants

  1. AI Forensics: Accountability through Interpretability in Visual AI Systems (co-lead). University of Kassel and partners (2022-2025).
  2. CF+: Reconfiguring Computing Through Cyberfeminism and New Materialism (co-lead). University of Kassel (2018-2019).

Courses taught

AI Ethics

Research interests

Critical Algorithm Studies

Science and Technology Studies

Human-Computer Interaction Design

Feminist Epistemology

Critical Theory

Posthumanism

New Materialism

Publications

Braidotti, Rosi; Jones, Emily; Klumbytė, Goda (eds.) (2022). More Posthuman Glossary. London: Bloomsbury Academic.

Klumbytė, Goda; Piehl, Hannah; Draude, Claude (2023). “Towards Feminist Intersectional XAI: From Explainability to Response-Ability.” Workshop paper presented at the Workshop on Human-Centred Explainable AI, ACM CHI ’23. https://doi.org/10.48550/arxiv.2305.03375

Klumbytė, Goda; Draude, Claude; Taylor, Alex (2022). “Critical Tools for Machine Learning: Working with Intersectional Critical Concepts in Machine Learning Systems Design.” In Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’22), June 21–24, 2022, Seoul, Republic of Korea. ACM, New York, NY, USA, 1–14. https://doi.org/10.1145/3531146.3533207

Draude, Claude; Klumbytė, Goda (2022). “Hybrid Spaces, Hybrid Methodologies: Finding Ways of Working with Social Sciences and Humanities in Human-Computer Interaction.” In Human-Computer Interaction (HCI 2022), June 26–July 1, 2022, Proceedings, Part I. Springer-Verlag, Berlin, Heidelberg, 40–56. https://doi.org/10.1007/978-3-031-05311-5_3

Klumbytė, Goda; Draude, Claude (eds.) (2022). Special issue “Prospects for a New Materialist Informatics.” Matter: Journal of New Materialist Research, Vol. 3, No. 1, February 2022. https://revistes.ub.edu/index.php/matter/issue/view/2720

Klumbytė, G.; Lücking, P.; Draude, C. (2020). “Reframing AX with Critical Design: The Potentials and Limits of Algorithmic Experience as a Critical Design Concept.” In Proceedings of NordiCHI ’20, October 25–29, 2020, Tallinn, Estonia. https://doi.org/10.1145/3419249.3420120

Klumbytė, G.; Britton, L. (2020). “Abstracting Otherwise: In Search for a Common Strategy for Arts and Computing.” ASAP/Journal, 5(1): 19–43. https://doi.org/10.1353/asa.2020.0001

Britton, L.; Klumbytė, G.; Draude, C. (2019). “Doing Thinking: Revisiting Computing with Artistic Research and Technofeminism.” Digital Creativity, special issue on Hybrid Pedagogies, 30(4): 313–328. https://doi.org/10.1080/14626268.2019.1684322

Draude, C.; Klumbytė, G.; Lücking, P.; Treusch, P. “Situated Algorithms: A Sociotechnical Systemic Approach to Bias.” Online Information Review, Vol. ahead-of-print, No. ahead-of-print. https://doi.org/10.1108/OIR-10-2018-0332