FCC officially declares AI-voiced robocalls illegal


The FCC’s war on robocalls has gained a new weapon in its arsenal with the declaration that AI-generated voices are “artificial,” and therefore by definition against the law when used in automated calling scams. It may not stop the flood of fake Joe Bidens that will almost certainly trouble our phones this election season, but it won’t hurt, either.

The new rule, contemplated for months and telegraphed last week, isn’t actually a new rule — the FCC can’t just invent them with no due process. Robocalls are just a new term for something largely already prohibited under the Telephone Consumer Protection Act: artificial and pre-recorded messages sent out willy-nilly to every number in the phone book (something that still existed when the law was drafted).

The question was whether an AI-cloned voice speaking a script falls under those proscribed categories. It may seem obvious to you, but nothing is obvious to the federal government by design (and sometimes for other reasons), and the FCC needed to look into it and solicit expert opinion on whether AI-generated voice calls should be outlawed.

Last week, the agency telegraphed its decision, likely spurred by the high-profile (yet silly) case of a fake President Biden calling New Hampshire citizens and telling them not to waste their vote in the primary. The shady operations that tried to pull that one off are being made an example of, with Attorneys General, the FCC, and perhaps more authorities to come more or less pillorying them in an effort to deter others.

As we’ve written, the call would not have been legal even if it were a Biden impersonator or a cleverly manipulated recording. It’s still an illegal robocall and likely a form of voter suppression (though no charges have been filed yet), so there was no problem fitting it to existing definitions of illegality.

But these cases, whether they’re brought by states or federal agencies, must be supported by evidence so they can be adjudicated. Before today, using an AI voice clone of the President may have been illegal in some ways, but not specifically in the context of automated calls — an AI voice clone of your doctor telling you your appointment is coming up wouldn’t be a problem, for instance. (Importantly, you likely would have opted into that one.) After today, however, the fact that the voice in the call was an AI-generated fake would be a point against the defendant during the legal process.

Here’s a bit from the declaratory ruling:

Our finding will deter negative uses of AI and ensure that consumers are fully protected by the TCPA when they receive such calls. And it also makes clear that the TCPA does not allow for any carve out of technologies that purport to provide the equivalent of a live agent, thus preventing unscrupulous businesses from attempting to exploit any perceived ambiguity in our TCPA rules. Although voice cloning and other uses of AI on calls are still evolving, we have already seen their use in ways that can uniquely harm consumers and those whose voice is cloned. Voice cloning can convince a called party that a trusted person, or someone they care about such as a family member, wants or needs them to take some action that they would not otherwise take. Requiring consent for such calls arms consumers with the right not to receive such calls or, if they do, the knowledge that they should be cautious about them.

It’s an interesting lesson in how legal concepts are sometimes made to be flexible and easily adapted — although there was a process involved and the FCC couldn’t arbitrarily change the definition (there are barriers to that), once the need is clear, there is no need to consult Congress or the President or anyone else. As the expert agency in these matters, the FCC is empowered to research and make these decisions.

Incidentally, this extremely important capability is under threat from a looming Supreme Court decision which, if it goes the way some fear, would overturn decades of precedent and paralyze U.S. regulatory agencies. Great news if you love robocalls and polluted rivers!

If you receive one of these AI-powered robocalls, try to record it, and report it to your local Attorney General’s office — they’re probably part of the anti-robocalling league recently established to coordinate the battle against these scammers.




Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.
