this post was submitted on 13 Jan 2024
580 points (95.7% liked)

People Twitter

5295 readers
275 users here now

People tweeting stuff. We allow tweets from anyone.

RULES:

  1. Mark NSFW content.
  2. No doxxing people.
  3. Must be a tweet or similar.
  4. No bullying or international politics.
  5. Be excellent to each other.

founded 1 year ago
[–] [email protected] 52 points 10 months ago (2 children)

The 'old' way of faking someone's voice, like you saw in 90s spy movies, was to get enough sample data to capture every possible speech sound someone could make, so that those sounds could be combined to form any possible word.

With AI training, you only need enough data to capture what someone sounds like 'in general' in order to extrapolate a reasonable model.

One possible source of voice data is spam calls.

You get a call and say "Hello?" Someone launches into trying to sell you insurance or some rubbish, so you say, "Sorry, I'm not interested, take me off your list please. Okay, bye," and hang up.

And that is already enough data to replicate your voice.
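
To give a sense of how low the bar is, here's a rough sketch (not any scammer's actual tooling, just an illustration) using the open-source Coqui TTS library and its XTTS v2 model, which can clone a voice from a few seconds of reference audio. The file names and text below are made up:

```python
# Illustrative sketch only: clone a voice from a short reference clip
# using the open-source Coqui TTS library (XTTS v2 model).
# "spam_call_answer.wav" is a hypothetical few-second recording of the target.
from TTS.api import TTS

# Load a zero-shot voice-cloning model
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Generate arbitrary speech in the cloned voice
tts.tts_to_file(
    text="Any sentence you like, spoken in the cloned voice.",
    speaker_wav="spam_call_answer.wav",  # the short sample of the target's voice
    language="en",
    file_path="cloned_output.wav",
)
```

The point isn't the specific library; it's that publicly available commodity tools already work from samples that short.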

When scammers make the call using your fake voice, they usually use a crappy-quality line, background noise, or other tricks to cover up any imperfections in the voice replica. And of course they make it really emotional, urgent, and high-stakes to override your family member's logical thinking.

Educating your family to be prepared for this stuff is really important.

[–] [email protected] 14 points 10 months ago

Clutch explanation. I just sent a screenshot of your comment to my folks.

[–] [email protected] 7 points 10 months ago (2 children)

Soo.... use a funny voice when answering the phone? Poison the data

[–] [email protected] 11 points 10 months ago

Keep a helium balloon on you at all times, just in case.

[–] [email protected] 2 points 10 months ago

Suddenly your parents get a desperate plea for money from that Simpsons waiter who had a stroke.