“No, I don’t use it. It’s part of my conscious consumption. Partly because of everything I know, but also because I work with people who worked on ChatGPT training, and it has turned out very badly. It has destroyed their lives, and OpenAI has never compensated them. So when you work closely with those people, or they’re friends, it feels like a betrayal.”
With this statement in an interview with National Geographic, Dr. Milagros Miceli explained why she does not use ChatGPT and similar artificial intelligence (AI) tools. Her words carry weight because Dr. Miceli is a computer scientist, researcher, and sociologist specializing in data and AI. She has published multiple research papers, is a frequent keynote speaker on expert panels, and holds research positions at two institutions.
AI has risen sharply in recent years, showing up in apps, search engines, websites, and more. Companies pour money into staying on the cutting edge of AI, often while facing a public that isn’t fully behind the technology. Dr. Miceli’s remarks highlight the growing problems and fears surrounding the rapid spread of AI, pointing to the mistreatment of workers, the loss of jobs, and the replacement of human creativity.
Why Miceli doesn’t like ChatGPT
In the interview, Dr. Miceli described how companies lay off workers because they believe advancements in AI can replace them. She noted that companies are happy to boast about their AI advancements while downplaying the human cost: “The problem is that it is not in the news as often as when people are kicked out. Saying ‘we create a fully autonomous AI’ sells a lot more.”
Dr. Miceli gave a specific example: Amazon supermarkets used technology to replace cashiers. That new technology, however, required a monitoring system, so instead of frontline workers like cashiers, workers were needed behind the scenes because “the technology was not as autonomous as we were told”. She went on to say that the monitoring jobs are sent to poorer countries, where the working conditions are less stable and the pay is lower. This raises questions about the ethics of AI and the treatment of workers around the world.
Dr. Miceli also argued that the frequent use of AI is displacing human intelligence and creativity. She warned that reliance on tools like ChatGPT makes people less capable of thinking on their own, resulting in a less skilled workforce, and a less skilled workforce commands lower pay. She added that the future of work under AI could produce jobs that are “less interesting and more precarious”: jobs that neither challenge workers in a rewarding way nor offer reliable, long-term, stable employment.
Why Miceli’s words are important
Dr. Miceli is an expert in this field, so her thoughts about AI resonate with those who have their own ethical concerns while also getting attention from those who have never questioned AI before. She is the research lead at the Distributed AI Research (DAIR) Institute. Its work is focused on exposing AI harm while working toward non-exploitative technology that helps communities. She is also the research leader of Data, Algorithmic Systems, and Ethics at the Weizenbaum Institute in Berlin.
Dr. Miceli also publishes research and papers on the topics of AI and technology ethics. Recent publications include “What Knowledge Do We Produce from Social Media Data and How?”, “The Making of Performative Accuracy in AI Training: Precision Labor and Its Consequences”, and “The Exploited Labor Behind Artificial Intelligence”.
Her professional focus is on exposing the dark side of technologies while working toward the ethical and pragmatic use of future tech. With people like her driving change, public fears, such as whether AI has already become conscious, can feel less overwhelming. In the interview, she pointed out that technology can have a dangerous side: the drones used for fun are also “capable of killing people thousands of kilometers away … The horrific face and the entertaining face are two sides of the same coin. One serves as a distraction from the other.”
