Latest AI Challenges Are Copyright Infringement, Theft



Artificial intelligence companies are hitting some hiccups as content producers prepare challenges to the use of their words and ideas in AI outputs, paving the way for copyright infringement and plagiarism lawsuits.

After all, AI is not necessarily making stuff up as it goes along. Its intelligence was scraped from someplace, repackaged, and repurposed.

Authors, artists, content originators, and internet publishers may seek a cut for the use of their work by AI tools, potentially throwing a wrench into the phenomenon as the technology ramps up, The Wall Street Journal reported Sunday.

“This will not end well for creatives,” bestselling author James Patterson told the Journal.

Patterson is lashing out at the “frightening” reality that his work is being used to train AI tools without permission or compensation, putting his income-earning writing to work for free while the tools’ developers and their companies potentially get paid for it.

Elon Musk has already sniffed it out, moving to limit the number of tweets a user can view as a way to keep AI bots from scraping Twitter – now to be called X – to grab, if not steal, content for free.

Earlier this month, thousands of authors, including Patterson and Margaret Atwood, signed an open letter urging AI companies to obtain permission and provide compensation before scraping their work.

Comedian Sarah Silverman is a party to a lawsuit against OpenAI and Meta for allegedly using pirated copies of books taken from “shadow libraries” on the internet.

The Associated Press has signed a deal with OpenAI to use its news archive for its AI tools, while News Corp., The New Yorker, Rolling Stone, and Politico are among the companies seeking compensation, sources told the Journal.

OpenAI, which launched ChatGPT, and Alphabet, the parent of Google, say they use “publicly available” content for their AI tools, but these are still the wild west days of AI, experts told the Journal.

“The cases are new and dealing with questions of a scale that we haven’t seen before,” Yale Law School’s Information Society Project’s Mehtab Khan told the Journal. “The question becomes about feasibility. How will they reach out to every single author?”

Tech companies are seeking to use the “fair use” legal doctrine and AI advocates are pleading to keep the free flow of ideas running through AI developments.

“If a person can freely access and learn from information on the Internet, I’d like to see AI systems allowed to do the same, and I believe this will benefit society,” Stanford University’s Andrew Ng, an AI investor and researcher, told the Journal.

But AI is unfair to the originators, according to Silverman’s lawyer Matthew Butterick, because the systems depend “entirely on having a data set of quality work, made by humans, and if they collapse that market, their systems are going to collapse too.

“They can’t bankrupt artists without bankrupting themselves,” he told the Journal.

Palantir CEO Alex Karp also warned in The New York Times last week that the technology’s limitless pathways could be dangerous, potentially even revealing, if not producing, a direct threat to humanity.

“It is not at all clear — not even to the scientists and programmers who build them — how or why the generative language and image models work,” Karp warned in the Times.

© 2023 Newsmax. All rights reserved.


