We need to challenge the notions about automation that seem to be carried along in conversations about ChatGPT and other forms of "artificial intelligence" — notions including that AI acts on its own and that it threatens to replace humans.

The results of a Google image search for "artificial intelligence" or "machine learning" are telling: lots of pictures of brains, robots and humanoid figures. Anthropomorphizing automated technologies reveals our fascination with them, but it gets in the way of a meaningful understanding of how they work and how they affect us.

As an academic, formerly a professor at UC Berkeley and now director of research at the nonprofit research institute Data & Society, I've devoted my career to studying the relationship between digital technology and society. I'm committed to using my research to nudge policymakers and experts of all stripes toward a more humane and human-centered approach to computing technology.

In fact, human labor plays an important role in AI tools. It's human labor that trains these models, based on data produced by humans: we teach them what we know. And around the world, it's on-call human workers who fix errors in the technology, respond when tools get stuck, moderate content, and even guide robots along city streets.

Source: It's time to challenge the narrative about ChatGPT and the future of journalism – Poynter
