
FTC reportedly looking into OpenAI over ‘reputational harm’ caused by ChatGPT

by WeeklyAINews

The FTC is reportedly in at least the exploratory phase of investigating OpenAI over whether the company's flagship ChatGPT conversational AI made "false, misleading, disparaging or harmful" statements about people. It seems unlikely this will lead to a sudden crackdown, but it shows that the FTC is doing more than warning the AI industry of potential violations.

The Washington Post first reported the news, citing access to a 20-page letter to OpenAI asking for information on complaints about disparagement. The FTC declined to comment, noting that its investigations are nonpublic.

In February, the regulator announced a new Office of Technology to take on tech sector "snake oil," and shortly afterward warned companies making claims around AI that they are subject to the same truth requirements as anyone else. "Keep your AI claims in check," it wrote, or the FTC will do it for you.

Though the letter reported by the Post is hardly the first time the agency has taken on any of AI's many forms, it does seem to signal that the world's current undisputed leader in the field, OpenAI, must be ready to justify itself.

This kind of investigation doesn't simply appear out of thin air; the FTC doesn't look around and say, "That seems suspicious." Usually a lawsuit or formal complaint is brought to its attention, and the practices it describes imply that regulations are being ignored. For example, a person might sue a supplement company because the pills made them sick, and the FTC will launch an investigation on the back of that because there's evidence the company lied about the side effects.


In this case, there's a good chance that a lawsuit like this one, in which an Australian mayor complained to OpenAI that ChatGPT said he had been accused of bribery and sentenced to prison, among other things, could prompt an investigation. (That matter is ongoing, and of course the jurisdiction is wrong, but there are almost certainly more like it.)

Publishing such things could amount to defamation or libel, or simply "reputational harm," as the FTC's current letter to OpenAI reportedly calls it. It's almost certainly ChatGPT at issue because it's the only truly public-facing product in OpenAI's portfolio that could do such a thing; GPT-4 and the other APIs are locked down a bit too much (and are too recent) to be considered.

It's hardly a slam dunk: the technical issues alone call into question whether this counts as publishing or speech, or even anything beyond a private communication. All of that would have to be established.

But it's also not a wild thing to ask a company to explain. It's one thing to make a mistake, another to systematically and undetectably invent details about people, at huge scale, and say nothing about it. If Microsoft Word's spell-checker occasionally added "convicted felon" in front of people's names, you'd better believe there would be an uproar.

Although the FTC has been handed a couple of high-profile defeats currently within the type of its anti-merger efforts directed at Meta and Microsoft being shot down, it has additionally nailed tech corporations for privateness points and even AI-adjacent violations.


