UWV wants to use algorithms for WIA recovery. As an algorithm expert and a victim of the UWV, I am very skeptical. 👇

I am on vacation, trying to break away from the algorithm madness of the day. But as a disabled person, a victim of the UWV, and an algorithm expert, I have something to say about this: the UWV wants to use AI to correct its mistakes with the WIA.

Let me provide some context. The trust of people who receive benefits from the UWV in that same UWV is low. Very low. A lot of that has to do with fear. Fear that you will suddenly have to pay back a large sum. Fear about the lack of transparency. Fear that a messy life, one that does not fit into the neat bureaucratic boxes, can be classified as fraud at any given moment – or so it feels. Even if you try to act properly on all sides.

Three years ago I wrote a blog for the Digital Freedom Fund about the fear I – as an expert by experience and a professional – have of AI experiments at the UWV (link in comments). I know how algorithms and AI work. I know roughly how the procedures at the UWV work. I know what a mess my data points are, because I know that my messy life does not fit neatly into the system. It is a recipe for disaster.

Three years later, the UWV has made major mistakes with the WIA, and AI is the holy grail that has to clean up the mess because there are not enough employees. But lives are extremely messy and algorithms very straightforward. So the question is: how do you design and develop an algorithmic system in which the human dimension is paramount? After all, the citizen must be able to trust that the UWV/the government is doing its job correctly (the good old ABBB, the general principles of good administration, are still relevant).

And the thing is: designing and developing such a system is not easy. It does not go quickly. It requires thorough assessment, reflection, and the application of the right safeguards. AI is not a quick miracle cure that you can stick on a major problem.
In fact, if you are not careful, it will only make the problems worse. I am certainly not reassured.

And is this an exception? I don’t think so. We know that disabled people, like other marginalized groups, are often the canaries in the coal mine. We saw it with SyRI, the Toeslagenaffaire and DUO. And that is not so strange. Because how can we expect a very quickly implemented system, with a focus on efficiency, to ‘automagically’ generate equitable outcomes in an unequal society?

Miss Ank would say: ‘We don’t think that’s strange, we just think it’s very special.’

#Algorithms #AI #UWV #ResponsibleAlgorithmUse #AlgorithmicAccountability https://lnkd.in/eCMeyi-r