Hey, did you hear about OpenAI's new AI, Sora? It can generate short videos, even ones that look like video games!
Yeah, I read about it! Sora can create videos that look like famous games, but there might be some legal problems with how it was trained.
Wait, why would it have legal problems? Isn’t it just making new stuff from scratch?
Not exactly. To create those videos, Sora has to be trained on huge amounts of data. Reports suggest it used game walkthroughs, Twitch streams, and maybe even footage of popular streamers, like Auronplay and Pokimane.
Oh, so it copied those videos to learn? That doesn’t sound fair.
That’s the issue. Training an AI involves making copies of the data so the model can learn patterns from it. If that data is copyrighted, like video games or streams, then using it without permission could infringe copyright.
But isn’t it creating new videos? It’s not copying the original stuff exactly, right?
True, Sora’s videos aren’t direct copies. But AI models can sometimes memorize their training data and produce outputs that look very similar to it. For example, Sora generated characters that look like famous streamers, right down to their tattoos!
Whoa, that’s kind of creepy. So, could people sue OpenAI for that?
Possibly. Legal experts say video game content is tricky because it has layers of copyright protection. There’s the game itself, the unique video made by the player, and sometimes even user-generated content, like Fortnite custom maps.
So if Sora was trained on Fortnite playthroughs, who could sue OpenAI? The game company or the players?
Potentially all of them! The game developer owns the game’s designs, the player owns their gameplay video, and if the map is custom, the map creator has rights too. That’s why this could get really messy.
Wow, I didn’t think AI could have so many legal problems. Can OpenAI argue it’s fair use?
They might try. Fair use isn’t a single rule, though. Courts weigh several factors, like whether the use transforms the original and whether it hurts the market for it, and they haven’t decided how those factors apply to AI training yet.
So, does this mean AI companies can just train models on anything they find online?
Not exactly. Some companies argue it’s like a person learning by watching. But critics say AI training is different because it involves copying huge amounts of data at scale. Until the law is clearer, AI companies are taking on real legal risk.
What happens if Sora makes a video that looks like a real game? Could users get in trouble too?
Yes, users might also be responsible if they publish or sell AI-generated content that’s too similar to copyrighted works. That’s why this debate is so important—for companies and for people using AI tools.
Sounds like AI and gaming have a lot to figure out. Do you think Sora’s worth the risk?
It’s exciting tech, but it’s a reminder that copyright law hasn’t caught up with AI yet. Until it does, both companies and users have to be careful.