As cyber attackers find ways to use technology to clone voices and further their deceptive practices, others are attempting to detect their misdeeds, with varying degrees of success. (Photo: Debalina/Adobe Stock)

The media reports a lot on "deepfakes" and, more recently, "cheap fakes": video and still images that do not reflect reality but are, in essence, AI-generated facsimiles that look and sound frighteningly like the real thing.
