>the ai analyzes existing work to make future work

Hmm, and how is this any different from people learning by viewing what others have done? It's a fun argument, but fundamentally it can't go forward without calling into question everything humans ever make.

This is generally right. The issue with these systems isn't going to be with the training. Maybe in some cases if someone like github trains on private data that they only had incidental access to, it might be a problem, but I can't see how training on anything it can legally get its hands on, like images of paintings or library books, will ever be a problem.

Where there will be a problem is in what the model generates. Humans can learn from whatever, but they can't spit out large chunks of what they learned without citation, and even with citation they can't commercialize that. I haven't used co-pilot but have seen many accounts of it reproducing almost verbatim large chunks of code it was trained on. That is not going to :marseypass:

Uh what? People don't cite things they were trained on either. No coder on earth is citing the book that taught him basic coding while he was in college.

>Maybe in some cases if someone like github trains on private data that they only had incidental access to, it might be a problem, but I can't see how training on anything it can legally get its hands on, like images of paintings or library books, will ever be a problem.

That's completely backwards from reality but cool take. Weirdly enough a private set of data that you design the terms and conditions for is more legally yours than things you find at the library.

>I haven't used co-pilot but have seen many accounts of it reproducing almost verbatim large chunks of code it was trained on.

That sounds made up, and it's exactly what the artcels claim about AI art. How would they even know what it was trained on?

What the frick are you even talking about? Of course you don't cite things you were trained on when you produce something completely original. You cite things when you reproduce pieces of things you have read. Models aren't just 'trained' in some metaphysical sense. They can also memorize what they're fed, which means they can reproduce it, which is not ever going to be ok.
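
(As a concrete aside on what "reproducing near-verbatim" would even look like as a check: here's a minimal, hypothetical sketch in Python. The helper names and the toy snippet are made up for illustration and have nothing to do with Copilot's internals or the lawsuit; it just flags when a completion's line n-grams overlap heavily with a known reference.)

```python
# Hypothetical illustration: a crude way to flag near-verbatim reproduction is
# to measure overlap of normalized multi-line windows between a model's
# completion and a known snippet.

def line_ngrams(text: str, n: int = 3) -> set:
    """Return the set of n-line windows, with whitespace normalized."""
    lines = [" ".join(l.split()) for l in text.splitlines() if l.strip()]
    return {tuple(lines[i:i + n]) for i in range(len(lines) - n + 1)}

def overlap_ratio(completion: str, reference: str, n: int = 3) -> float:
    """Fraction of the completion's n-line windows that also appear in the reference."""
    comp, ref = line_ngrams(completion, n), line_ngrams(reference, n)
    return len(comp & ref) / len(comp) if comp else 0.0

if __name__ == "__main__":
    # Toy data only; a real check would compare against the actual training corpus.
    reference = "def fast_inverse_sqrt(x):\n    threehalfs = 1.5\n    x2 = x * 0.5\n    return 1 / (x ** 0.5)"
    completion = "def fast_inverse_sqrt(x):\n    threehalfs = 1.5\n    x2 = x * 0.5\n    return 1 / (x ** 0.5)"
    print(f"overlap: {overlap_ratio(completion, reference):.2f}")  # ~1.0 -> likely memorized
```

Whitespace normalization plus multi-line windows is a blunt instrument, but it's the kind of check that separates "learned a style" from "copied a block."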

The concern in the lawsuit is over training data, not spitting out large portions of stolen code. That's an accusation I've seen no evidence of.

I am basically agreeing with you that the lawsuit is meritless, unless github trained on private repos, which is a blurrier line. That's more like learning some shit from a company's trade secrets, which is not ok even if you don't use them to produce the exact same thing.

The real fight will be in the future over reproduction, if it's happening (I have only hearsay; I won't claim for sure it is). Microsoft has already said they will not indemnify users of co-pilot against copyright claims. If I was running a large company, no fricking way I let anyone use that on my codebase without indemnification. In a small, closed-source company, who cares. No one will ever know even if it did spit out someone else's shit.

AI art is less straightforward because you can't really copyright a style of art. Just because your AI outputs something that looks like a Dr. Seuss drawing doesn't mean that's wrong, but if it includes the Cat in the Hat, well, that's a trademarked character. It doesn't even have to be exactly the same. You can't substantially imitate a trademarked character, at least not commercially. There are a lot of issues to settle, but some of it will be pretty clear.

Ma'am, we've been over this before. You need to stop.
