After being scammed into thinking her daughter was kidnapped, an Arizona woman testified in the US Senate about the dangers of artificial intelligence technology in the hands of criminals.

  • DarkThoughts@kbin.social
    2 years ago

Is no one questioning how the alleged kidnappers managed to create a voice profile of a random 15-year-old girl to produce such a convincing AI voice? The only source claiming this was potentially an AI scam was in fact just another parent:

    But another parent with her informed her police were aware of AI scams like these.

    Isn’t it more likely that dad & daughter did this and it backfired?

    • Stumblinbear@pawb.social
      2 years ago

It’s pretty easy to create voice clones now. As long as you tailor the speech you want it to produce and don’t have it speak for too long, it can get pretty convincing even with very little input.

    • davidhun@lemmy.sdf.org
      2 years ago

Given the prevalence of social media platforms where people post videos of themselves, it seems pretty easy to get enough voice samples to generate a convincing clone. Depending on how much personal info she and her family members put out on social media, it’s trivial to connect the dots and concoct a plausible scenario to scam someone.

      Now whether or not it was “just a prank, bro” from family or whomever, I don’t know.

      • PlantJam@beehaw.org
        2 years ago

All it takes is a three-second sample, according to “The AI Dilemma” on YouTube. It’s about an hour long, but it has a lot of really good information.