• gastationsushi@lemmy.world · 10 months ago

    Are we sure it’s AI?

    I’ve heard of this scam happening maybe a decade ago with my extended family. The voice was a real person overseas with a lot of experience tricking grandparents. The scammers only had basic information.

    They act like a freaked-out kid and the victim gets roped in. They scam for thousands of dollars each time; even succeeding a few times a day would net a big profit. Also, cell connections are low fidelity, which I bet aids their ability to trick the victim.

    • TORFdot0@lemmy.world · 10 months ago

      Yeah, this happened to my grandparents. The scammers just say, “I sound like shit because I’ve been crying.”

    • Chetzemoka@startrek.website · 10 months ago

      Same. Years ago my grandfather received a call from a guy claiming to be my younger, male cousin saying he was in jail for something and needed bail. Luckily (?), my grandfather was an asshole and told him to call his mother.

    • OceanSoap@lemmy.ml · 10 months ago

      Yeah, my dad called me one day asking if my brother was out of the country, because our grandma got a call saying he’d been kidnapped in Mexico and she needed to put up money for his release.

      It’s wild.

    • AFaithfulNihilist@lemmy.world · 10 months ago

      I don’t know, the “Spanish prisoner” is a scam that seems to be reinvented every few years every time we see a little bit of a change in technology. It wouldn’t take much to fake a person’s voice with a trained model, especially if that person has an online profile open to the public where they post content in their own voice.

    • flying_sheep@lemmy.ml · 10 months ago

      Yeah, it has some sus vibes. I’m usually far too trusting, but even my bullshit detector went off here.

      • Mr_Blott@lemmy.world · 10 months ago

        You know that old adage “Never attribute to malice that which can be easily explained by stupidity”?

        We need a new one along the lines of “Never attribute to truth that which can be easily explained by attention-starved teenagers”

    • I could easily conceive some tricks to get clips of a person’s voice without them realizing. I’d write them out but… that would be stupid of me. Humans have more vulnerabilities than computers.

    • PLAVAT🧿S@sh.itjust.works · 10 months ago

      Yeah, unless this person runs a YouTube channel or a podcast, it seems implausible. For a normal person, what would you even train the AI on?

      I could see a situation where you hack a phone, get the contacts and call history, pick the 1st or 2nd most dialed number, have a bot call that person to get samples, then go back to the original phone and try this… I mean, eventually you’d get a hit?

    • Szymon@lemmy.ca · 10 months ago

      You can train AI with just a single voice clip. You can do this on your desktop. Microsoft doesn’t need to sell shit; you put that clip on TikTok yourself.

      • brrt@sh.itjust.works · 10 months ago

        You don’t even need to upload anything. They can call you, have a short convo and then just say “oh sorry wrong number” or something. You’d never know.

        • SomeGuy69@lemmy.world · 10 months ago

          Yup. You need like 5 to 15 seconds of talking, that’s it. I’ve done this myself to confirm it, and it actually works quite well.

      • unexposedhazard@discuss.tchncs.de · 10 months ago

        Well, they said they don’t share their voice anywhere; if that’s true, it would be concerning. I for one just don’t use any centralized unencrypted services that could scrape my voice, but I would assume most people think that if they don’t publish anything, they’re safe…

        • Overzeetop@lemmy.world · 10 months ago

          You don’t talk to anyone on the phone through a PBX? Never call your bank? Your doctor? Your credit card company? Any of your insurance companies? Even on private systems, all of those calls are recorded for legal reasons. And all of them will eventually be compromised.

          • unexposedhazard@discuss.tchncs.de · 10 months ago

            I make regular phone calls maybe twice a year; everything can be done by email or web forms in Germany. But generally the people who have access to all the phone lines are the feds of whichever country you’re in. And they, unlike big tech, aren’t super interested in selling that data.

        • tiramichu@lemm.ee · 10 months ago

          The ‘old’ way of faking someone’s voice like you saw in 90s spy movies was to get enough sample data to capture each possible speech sound someone could make, such that those sounds can be combined to form all possible words.

          With AI training you only need enough data to know what someone sounds like ‘in general’ to extrapolate a reasonable model.

          One possible source of voice data is spam-calls.

          You get a call, say “Hello?”, and then someone launches into trying to sell you insurance or some rubbish. You say “Sorry, I’m not interested, take me off your list please. Okay, bye” and hang up.

          And that is already enough data to replicate your voice.

          When scammers make the call using your fake voice, they usually use a crappy quality line, or background noise, or other techniques to cover up any imperfections in the voice replica. And of course they make it really emotional, urgent and high-stakes to override your family member’s logical thinking.

          Educating your family to be prepared for this stuff is really important.

        • Szymon@lemmy.ca · 10 months ago

          Yeah I’m gonna go ahead and not give that knowledge out.

  • abbadon420@lemm.ee · 10 months ago

    When I was a kid, my parents had “the talk” with me. It was about sex. Now I’m older and my parents are too. I have to have “the talk” with them. It’s about scams.

    • theangryseal@lemmy.world · 10 months ago

      My uncle got divorced a few years back and it nearly crushed him. He was a ridiculously handsome, young, successful man, so women chased him. At any point when he was younger, he had at least a handful of women actively pursuing him. Now he was older and divorced. Those women were long gone, all having married and carried on with their lives. He didn’t expect to struggle with dating like he did, and that made the whole thing even harder.

      I set him up on all of the big dating sites. I didn’t know how bad it was, I’d never used them.

      He was talking to at least 10 scammers a day, probably more.

      He’s kind of a miser so no one was going to get any of his money, but his hobbies showed his wealth and oh boy did they try.

      It was so bad that he gave up on the dating sites entirely. He’s had a few girlfriends since then but he only met one person in over a year on the dating sites.

      It blows my mind just how many people are out there making a living scamming people.

      • nickwitha_k (he/him)@lemmy.sdf.org · 10 months ago

        The sad thing is that, in the current era, virtually all dating sites are scams riddled with bots, and have been for over a decade. Their goal is to make money, not produce matches.

        • theangryseal@lemmy.world · 10 months ago

          He really struggled with it.

          I’d do a reverse image search and find the actual person, he’d thank me and move on. Some of them got in his head. One even faked a Skype call with a video of a beautiful woman and somehow scouted his Facebook profile to really dive deep into his personality. The video was like 13 pixels and she’d say, “I’m sorry, I live way out in the wilderness and we have bad satellite internet.”

          He said, “She’s too good to be true. No one is this agreeable.” I told him to ask her to make a specific gesture, because of all the scammers. He did; he asked her to hold her hands above her head in the shape of a triangle. She refused, and said something like, “I can’t believe you don’t trust me. That breaks my heart. You know me.” She stopped talking to him. A couple weeks later she messaged, “I’m so sorry, my mom is in the hospital and I have no money to eat. I wouldn’t ask you, but I have been alone so long you’re all I know.” He told her that if it was that desperate, she could prove who she was and he’d help. Nope. Nothing. Radio silence. That one really hurt him. Whoever it was played the scam game real good.

          You might not believe this, but I have a cousin (my mom’s first cousin actually) who fell in love with her scammer. He conned her out of thousands of dollars, turned out to be from Nigeria when he said he was from somewhere else. About 6 months in he said, “Listen. I am not who I have said I am. My real name is John, I am truly in love with you. I know you and I want you to really know me.”

          She was in her 50s, he was in his 30s. She was not an attractive woman. She was short, fat, walked with a limp and was born with physical deformities (her nickname to us kids was “old bat”, playfully of course). He wasn’t attractive either, but still. 20 years younger, from another country, had, I don’t know, scammed her. My mom tried her best to stop it, but she flew to Nigeria and married the dude. She stayed over there a few months and then he flew back with her. He stayed with her about 4 years I guess, and he ran around with any woman who would have him. He finally drained her money and rolled out.

          The last thing I heard about him, he was arrested for breaking into the grocery store he worked at and robbing the safe.

          It is crazy to me just how much money can motivate people to do absurd and crazy things. I can’t relate at all. I’ve been broke as shit and all I had to do was sell a few things to get back on track and I couldn’t motivate myself to even do that. Money just doesn’t mean enough to me to go to any trouble to get it haha.

  • cum@lemmy.cafe · 10 months ago

    Uh, there’s zero chance these big tech companies are selling voices like this. Also, this sounds very targeted and planned, so there must be more context to this. Also, why the hell are they on Bluesky?

  • rbesfe@lemmy.ca · 10 months ago

    The only way to train an AI voice model is to have lots of samples. As scummy as they are, neither Microsoft nor Apple is selling your voice recordings with enough info to link them to you specifically. This person probably just forgot about an old social post where they talk for enough time for a model to be trained. Still super scary stuff.

    • altasshet@lemmy.ca · 10 months ago

      Not true anymore. You can create a reasonable voice clone with like 30 seconds of audio now (11labs for example doesn’t do any kind of authentication). The results are good enough for this kind of thing, especially in a lower bandwidth situation like a phone call.

    • nifty@lemmy.world · 10 months ago

      This person probably just forgot about an old social post…

      Or recordings made during customer service calls, maybe a disgruntled employee decides to repurpose the data.

    • Wirlocke@lemmy.blahaj.zone · 10 months ago

      True for creating voices at all, but that work has already been done.

      Now we’re just taking these large AIs trained to mimic voices and giving them a 30-second audio clip to tell them what to mimic. It can be done quickly and gives convincing results, especially when hidden by phone-call quality.

  • ɔiƚoxɘup@sh.itjust.works · 10 months ago

    All those TV shows that taught us how to spot which twin was the evil one by asking about life history were just training us to beat AI

  • voxel@sopuli.xyz · 10 months ago

    you don’t even need to fake a voice for these scams tho; it’s very difficult to differentiate a voice while someone’s crying

  • Zeshade@lemmy.world · 10 months ago

    Do a lot of people put their voice on the internet “as much as they’re able to”? It sounds like that person may post their voice online more than the average person…

    • Powerpoint@lemmy.ca · 10 months ago

      Discord just automatically opted you in to having your voice recorded for clips.

    • maynarkh@feddit.nl · 10 months ago

      I imagine even discounting social media self-posts, there are VoIP calls, etc.

      Don’t assume a call with your mom through Facebook Messenger or Zoom or FaceTime or whatever is not somehow packaged and sold.

  • LaunchesKayaks@lemmy.world · 10 months ago

    My grandmother got a call from scammers pretending to be me. They didn’t use my name, but I was the only adult granddaughter at the time lol. Anyway, the scammers said that they needed money for hospital bills and a bus ticket home. They said they got into a fight at a friend’s funeral in New Jersey and had to go to the hospital. And then after that their car got stolen. My grandmother knew that I was not in New Jersey, and told the scammers that she’d call them back once she got to the bank. She then informed my parents, who told me. It was hilarious.

  • DirkMcCallahan@lemmy.world · 10 months ago

    Waiting for the comment that’s going to say something like, “Joke’s on you, my parents don’t even talk to me.”

  • M500@lemmy.ml · 10 months ago

    I advise everyone to contact their loved ones and inform them of this possibility. I also advise having some code word that would be used if there was an emergency and money needed to be sent.

    For example: if more than $100 is being asked for, the code word has to be shared or we don’t transfer money.

      • get_off_the_phone@sh.itjust.works · 10 months ago

        Hey is me, your cousin. I’m kinda in a jam and need your help. The contractor cut me a huge check after the fire that took everything but the bank won’t cash it without proof. I just need $101 for proof and I can pay back immediately.

        This is for real.

        Thanks.

        Donkeyballs

        • Restaldt@lemmy.world · 10 months ago

          Hey its me your 2nd uncle

          I need about $3.50 to pay for shipping on a package from a Nigerian prince who needed my help

          I know we said over 100 for this but

          Donkeyballs

          • DrRatso@lemmy.ml · 10 months ago

            Well, that’s about the time I noticed my uncle was eight stories tall and a crustacean from the Paleozoic era.

  • paulcdb@sh.itjust.works · 10 months ago

    Tbh it’s not that hard to stop scams. Treat EVERY call you get as a scam!

    Either phone back on a known number (not some shit they give you), or, if they claim someone needs bail, ask for a reference number and the place they’re being held, then phone that place after looking up the number yourself. If they get pissed, it’s a scam!

    No real police force is going to care (or shouldn’t care) if you call back. It’s not like cops get a percentage of the bail money, but scammers always seem desperate to get you to pay, and they lose it pretty quick.