Privacy Please

S6, E247 - The EU's Bold Move on AI Training

Cameron Ivey


The European Parliament has released a groundbreaking 175-page study concluding that AI companies' practice of training on copyrighted material without permission constitutes mass reproduction not covered by current laws. This study recommends transforming the landscape through an opt-in system, radical transparency requirements, and fair compensation models for creators whose work trains AI systems.

• EU study reveals AI companies are treating the internet like a free "all-you-can-eat buffet" of creative content
• Recommendation to shift from opt-out to opt-in system requiring AI companies to request permission
• Call for mandatory transparency about what data AI models are trained on
• Proposal for fair licensing models similar to Spotify where creators get paid when their work trains AI
• New EU AI Act regulations taking effect in August will incorporate some of these protections

Stay safe, stay informed, and always question the code.



Speaker 1:

Is your favorite AI creating art from stolen goods? It's a huge question nobody was ready for, but Europe is tackling it head on. Welcome to Privacy Please, where we decode the digital world for you. I'm your host, Cameron Ivey, and today we're diving into a bombshell 175-page study from the European Parliament that could change the future of AI forever. So you know those amazing AI tools that can write a poem, design a logo, or even generate a photorealistic image of your dog on the moon? They seem like magic, but here's the not-so-magical secret: to learn how to do that, they have to study. A lot. And what are they studying? The entire internet. That includes your favorite artist's portfolio, the photographer you follow on Instagram, best-selling novels, and investigative journalism. They are, in essence, reading everything. The big problem? They're not asking for permission.

Speaker 1:

This new EU study basically says: hold on a second. It concludes that the way these AI models are trained on copyrighted material is a form of mass reproduction, and the current laws just aren't built for this. Imagine a robot grabbing books and art and photos from a digital library. So what did the EU's deep dive uncover? 175 pages is a lot, so if you don't have the time, I'm going to break it down for you into three main bombshells. Number one: the all-you-can-eat buffet is over. Right now, AI companies are essentially treating the internet like a free all-you-can-eat buffet.

Speaker 1:

The EU has a rule called the text and data mining exception, or TDM, which was meant for research, but this study argues that training a massive commercial AI model is not the same as academic research. The big recommendation? Flipping the script from an opt-out system to an opt-in one. Think of it like this: opt-out is like a restaurant automatically adding a 20% tip to your bill, where you have to notice it and ask to have it removed. Opt-in is when they ask you first if you'd like to leave a tip. The EU is suggesting that AI companies should have to ask for permission to use creative work for training, not just take it until someone tells them to stop. This is a game changer for creators.

Speaker 1:

Number two: we need to see the receipts. Right now, what these AI models are trained on is a total black box. It's a secret recipe. But what if that recipe includes your private photos, biased information, or pirated content? The study demands radical transparency. It proposes that AI companies must provide a detailed summary of what their models have been trained on. This isn't just about copyright; it's a massive win for privacy and fighting bias. It means we could finally see if our personal data is being used to train the next big AI.

Speaker 1:

And finally, number three: fair pay for fair play. If an AI learns its style from thousands of artists, shouldn't those artists get a piece of the pie? The study says a resounding yes. It calls for new and fair licensing models. Imagine a system, maybe like a Spotify for AI training, where creators get paid every time their work is used to teach a machine. This could ensure that the humans who create the foundational culture and knowledge behind AI are actually compensated.

Speaker 1:

So what does this all mean for you, on your phone right now, on your computer? This is more than just a legal document. It's a battle for the future of creativity and, yes, your privacy. The EU's AI Act is already putting some of these ideas into motion, with new rules set to take effect in August, and this study will fuel the fire for even stronger protections. It pushes for a world where AI innovation doesn't come at the cost of human creators. It's a call for an AI that is not only smart but also fair and transparent. And in a world where our digital lives are constantly being mined for data, that's a privacy conversation we all need to be a part of. So next time you ask an AI to create something for you, think about what it learned from. Thanks to the EU, the answer to that question might soon be public knowledge. That's all the time we have for this episode of Privacy Please. I'm Cameron Ivey. Stay safe, stay informed, and always question the code.
