
Gernot Rieder

@aktant

Research Associate @ University of Bergen, Centre for the Study of the Sciences and the Humanities. Digital STS & Ethics

Followers: 111 · Following: 112 · Posts: 1 · Joined: 12.12.2023

Latest posts by Gernot Rieder @aktant

After Harm: A Plea for Moral Repair after Algorithms Have Failed - Science and Engineering Ethics. In response to growing concerns over the societal impacts of AI and algorithmic decision-making, current scholarly and legal efforts have mainly focused on identifying risks and implementing safeguard...

New paper is out! After Harm: A Plea for Moral Repair after Algorithms Have Failed.

@aktant.bsky.social & I show that post-harm scenarios have not received enough attention and argue why attending to them is essential for a satisfactory account of AI ethics and governance.

doi.org/10.1007/s119...

22.09.2025 05:15 πŸ‘ 12 πŸ” 2 πŸ’¬ 1 πŸ“Œ 0
Reassembling Politics through Sensory Power? Digital Contact Tracing and the Infrastructuring of Governance When the COVID-19 pandemic hit, governments worldwide swiftly mobilized digital capabilities and infrastructures to combat the spread of the virus (see ...

#CfP: Topical Collection "Reassembling Politics through Sensory Power?" in Digital Society, co-edited by @nicbaya.bsky.social, Kjetil Rommetveit, CΓ©line Cholez & me.

Abstract deadline: Oct. 1, 2025
Submission deadline: Dec. 31, 2025

More: link.springer.com/collections/...

#STS #philtech

02.09.2025 15:49 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Call for Abstracts – 7th Nordic STS Conference 2025

CfP open for 7th Nordic #STS Conference in πŸ‡ΈπŸ‡ͺ:
www.nordicsts.se/call-for-abs...

17.01.2025 16:18 πŸ‘ 3 πŸ” 2 πŸ’¬ 0 πŸ“Œ 0

Hard copies of the Critical Data Studies book have finally arrived. The postal schedule for parcels travelling from England to Ireland seems to have reverted to that of the medieval period. Great to have the book in hand: reading the printed book always feels like a different experience from reading the pdf copy.

14.01.2025 15:38 πŸ‘ 40 πŸ” 1 πŸ’¬ 1 πŸ“Œ 0
MIT researchers release a repository of AI risks | TechCrunch. A group of researchers at MIT and elsewhere have compiled what they claim is the most thorough database of possible risks around AI use.

"The AI risk repository, which includes over 700 AI risks grouped by causal factors (e.g. intentionality), and domains (e.g. discrimination), was born out of a desire to understand the overlaps and disconnects in AI safety research"
#AIEthics

techcrunch.com/2024/08/14/m...

05.01.2025 21:03 πŸ‘ 41 πŸ” 16 πŸ’¬ 2 πŸ“Œ 1

Article by M. Ruckenstein about the necessity & practice of engaging collaboratively w. technology & data in ways that create space for critical inquiry, anticipation & revision. Our Data Ethics Decision Aid is featured as an exemplary practice.
doi-org.utrechtuniversity.idm.oclc.org/10.1177/2976...

08.01.2025 13:19 πŸ‘ 6 πŸ” 2 πŸ’¬ 0 πŸ“Œ 0

I'll do a workshop next week in Vienna on "everything artists and cultural practitioners need to know about this strange thing called AI", titled "debunking the tech-bro AI cult". Lecture+discussion.

Fri 17.1.2025, 14:00-16:00, brut Wien, please spread & register here:
brut-wien.at/en/Programme...

08.01.2025 19:29 πŸ‘ 33 πŸ” 17 πŸ’¬ 1 πŸ“Œ 1

to all search engine researchers out there!!

07.01.2025 17:14 πŸ‘ 4 πŸ” 2 πŸ’¬ 0 πŸ“Œ 0

Still a few more days to submit an abstract for this workshop

05.01.2025 08:31 πŸ‘ 6 πŸ” 5 πŸ’¬ 0 πŸ“Œ 0
Neither fair nor legal. How and why untrustworthy digital ecosystems evolve. Untrustworthy technologies create systemic harms for their users, often by design, and at the societal level. However, the nature of these technologies and the reasons why they evolve have not yet bee...

'Neither fair nor legal. How and why untrustworthy digital ecosystems evolve' by Catherine Thompson, Daniel Samson and Sherah Kurnia in the Scandinavian Journal of Information Systems. About the Robodebt social welfare scandal: aisel.aisnet.org/sjis/vol36/i...

02.01.2025 09:47 πŸ‘ 9 πŸ” 5 πŸ’¬ 0 πŸ“Œ 0