
05/26/2023 9:54 AM | Scott Merritt (Administrator)

As you likely already know, AI stands for Artificial Intelligence. But did you know that AI can generate a clone of someone else's voice that sounds deceptively like the real person? The technology has gotten so good that, to the human ear, a simulated voice is nearly indistinguishable from an authentic one. Moreover, it has become relatively inexpensive and widely available, so ill-intentioned persons can access it from the internet.

While perhaps not all persons using AI have bad intentions, those who do now have the ability to impersonate someone else's voice to perpetrate a scam, and the scams we are seeing, on both personal and business fronts, are limited only by imagination, determination, and opportunity. For example, recent news reports described an incident in which a fraudster used AI to simulate a child's voice, claim the child had been kidnapped, and demand a ransom payment; similarly, a fraudster swindled a couple out of a large sum by using AI to convince them that their son was asking them for bail money. The possibilities are limitless, and the technology behind these scams is so new that victims are typically unsuspecting and extraordinarily vulnerable.


We all have distinctive voices. How often do you already know who you are talking to simply by hearing their voice? Let's think about how deepfake AI could affect the closing transactions being handled at your office.

Deepfake AI allows anyone to spoof a voice: a seller calls with a change of disbursement instructions; a broker calls to tell you it is okay to pay the realtor at closing; the president of your company calls and tells you to release a wire or pay an invoice. What would your team do if a realtor called to request additional sensitive information about their customer, information that could later be used to defraud them? What if a seller or buyer called to ask about the net proceeds from a sale, or to confirm financial information from lenders or employment details? It is even plausible that your office could fall prey to ghost fraud, in which the persona of a deceased person is used to convey property without going through probate. Do you have a plan in place to verify any or all of the above? How can you be sure the caller really is who they claim to be, especially if you speak with them frequently and the voice sounds like them?
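One way to think about such a verification plan is as a simple policy rule: a voice on the phone is never treated as proof of identity, and any high-risk request is held until it is confirmed out of band, by calling the party back at a number already in the file. The sketch below is purely illustrative; the request names and the `verify_request` helper are hypothetical, not part of any real system, and an actual office procedure would of course be a written policy, not code.

```python
# Hypothetical sketch of an out-of-band verification policy.
# Voice alone never authorizes anything; high-risk requests are
# held until confirmed via a contact method already on file.

HIGH_RISK_REQUESTS = {
    "change_disbursement",
    "release_wire",
    "pay_invoice",
    "share_customer_info",
    "confirm_net_proceeds",
}

def requires_callback(request_type: str) -> bool:
    """Any high-risk request must be verified out of band."""
    return request_type in HIGH_RISK_REQUESTS

def verify_request(request_type: str, confirmed_via_number_on_file: bool) -> str:
    """Approve a high-risk request only after calling the party back at the
    number already on file -- never at a number the inbound caller provides."""
    if not requires_callback(request_type):
        return "proceed"
    return "proceed" if confirmed_via_number_on_file else "hold"
```

The key design choice this illustrates is that the callback must go to independently held contact information; a fraudster who can clone a voice can just as easily supply their own callback number.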

Think about RON (Remote Online Notarization) technology and how this new scam may affect digital closings. Speak to your RON vendors and verify what they are doing to keep their platforms secure.

Unfortunately, these are all questions and concerns that we now have to be aware of and plan for before getting caught by one of these deepfakes.


As deepfake technology becomes more advanced, it is increasingly difficult to distinguish between real and fake video calls. However, several signs can help you recognize a deepfake video call and protect yourself.

1. Pay attention to the quality of the video. Deepfake videos often have subtle distortions or inconsistencies that can be difficult to detect but may be noticeable upon closer inspection. Look for unnatural or jerky movements, blurry edges around the skin and hair, shifts in lighting or skin tone, lips poorly synced with speech, and strange blinking or no blinking at all. Also watch for hair and teeth that do not look real, as algorithms may not be able to generate frizzy or flyaway hair or individual teeth. One simple way to detect a live deepfake is to ask the person on the video call to turn their profile to the camera; a person turning 90 degrees to the camera will create a distortion that is detectable in real time.

2. Pay attention to the audio. Deepfake videos often use synthetic voices or manipulated audio to create a more convincing illusion. Listen for unusual or robotic-sounding speech patterns, or any discrepancies between the audio and the video.

3. Pay attention to the context. Watch for any inconsistencies or contradictions in the video that may indicate it is being manipulated.

4. Educate yourself and your staff. Make sure you teach your employees about the various types and serious nature of cyber threats. Make cybersecurity training part of onboarding and provide continuous training.

In conclusion, recognizing a deepfake video call can be difficult, but by paying attention to the quality of the video, the audio, and the context, you can protect yourself from bad actors who may be using deepfake technology to perpetrate fraudulent title insurance transactions.

Message from FLTA Cyber Security Committee

