
FCC aims to investigate the risk of AI-enhanced robocalls

by WeeklyAINews

As if robocalling weren’t already enough of a problem, the arrival of easily accessible, realistic AI-powered writing and synthetic voices could supercharge the practice. The FCC aims to preempt this by looking into how AI-generated robocalls might fit under existing consumer protections.

A Notice of Inquiry has been proposed by Chairwoman Jessica Rosenworcel, to be voted on at the agency’s next meeting. If the vote succeeds (as it almost certainly will), the FCC would formally look into how the Telephone Consumer Protection Act empowers it to act against scammers and spammers using AI technology.

But Rosenworcel was also careful to acknowledge that AI represents a potentially powerful tool for accessibility and responsiveness in phone-based interactions.

“While we are aware of the challenges AI can present, there is also significant potential to use this technology to benefit communications networks and their customers, including in the fight against junk robocalls and robotexts. We need to address these opportunities and risks thoughtfully, and the effort we are launching today will help us gain more insight on both fronts,” she said in a statement.

Any industry that involves a lot of voice work, like customer service, is likely already looking into how automation and generative AI can be used to improve human agents’ effectiveness. Instead of replying with a canned response, for instance, a call center worker could have an AI consult a knowledge base and supply a script customized to a customer’s actual situation. Or an AI-powered triage system could improve the laborious “If you are calling for this, press 1… for this, press 2…” process that few enjoy.


But the same technologies that could make a tedious task more efficient, or an interface more intuitive, could be deployed in other ways to trick or inconvenience people. One can imagine (and indeed some likely don’t need to imagine) robocalls tailored to one’s profession, age and location: the kind of targeted scams that took time to craft before but can now be automated.

It’s an emerging threat, and the FCC is ostensibly the cop on the beat; while it has hit robocallers with record fines before (though those aren’t always collected), it needs to stay ahead of the game, and this inquiry is meant to help it do that.

Specifically, Rosenworcel said the effort would look at:

  • How AI technologies fit into the Commission’s statutory responsibilities under the Telephone Consumer Protection Act (TCPA);
  • Whether and when future AI technologies fall under the TCPA;
  • How AI affects existing regulatory frameworks and future policy formulation;
  • Whether the Commission should consider ways to verify the authenticity of legitimately generated AI voice or text content from trusted sources; and,
  • What next steps, if any, are necessary to advance this inquiry.

If this sounds a little woolly, just remember that inquiry-type efforts like this are what the agency and others like it rely on when performing actual rulemaking and justifying themselves in court.
