The time has come for us to make passwords for identifying each other…
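A pre-shared code word between friends is effectively a tiny shared-secret protocol: agree on a word in person, then ask for it whenever a call seems off. A minimal sketch of checking one (the helper name and normalization are my own assumptions, not anything from the thread):

```python
import hmac

# Hypothetical helper: compare an offered code word against the one agreed
# on in person. Normalizing case/whitespace tolerates sloppy typing;
# hmac.compare_digest does a constant-time comparison, though over a phone
# call the real protection is simply that the scammer doesn't know the word.
def code_word_matches(offered: str, agreed: str) -> bool:
    return hmac.compare_digest(
        offered.strip().lower().encode(),
        agreed.strip().lower().encode(),
    )
```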

  • FuglyDuck@lemmy.world · 2 years ago

    Remember, if it’s truly life threatening, the hospital is going to do the surgery and gouge you for it later.

    The time pressure is meant to prevent you from looking into it.

    Hang up and call them back. Don't just hand money over the phone; use an excuse like needing to call your bank or something.

  • Semi-Hemi-Demigod@kbin.social · 2 years ago

    This was common advice for parents in the 80s and 90s. If someone had to pick me up from school unexpectedly, my parents gave them a code word to tell me, so I'd know it wasn't a child abduction.

  • gk99@lemmy.world · 2 years ago

    Fortunately, I hate videocalls and have no reason to use them, so if my friend videocalled me I’d ask what the fuck they were doing and immediately be suspicious.

  • meat_popsicle@sh.itjust.works · 2 years ago

    Easy solution: Never give money that’s requested like this. Give the money in person or not at all.

    If the friend doesn't like it, they can go to the bank and pay interest there instead.

    Sorry people, I’m not your fuckin loan officer and scams are just too easy.

  • preasket@lemy.lol · 2 years ago

    Here’s hoping for popularising secure communication protocols. It’s gonna become a must at some point.

      • Takumidesh@lemmy.world · 2 years ago

        But key exchanges work.

        Signal, for example, will warn you when the person you are talking to is using a new device.

        As long as the user heeds the warning, it is an effective stop, and at the very least gives the user pause.

        If the Signal safety number changes but the conversation stays on track (that is, the context is the same), it's unlikely to be a problem. But if the safety number changes and the next message asks for money, that's a very simple situation to recognize.
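The safety-number idea described above can be sketched simply: both parties derive a short fingerprint from the two identity keys and compare it out of band. This is a loose illustration of the concept, not Signal's actual algorithm; the function name and display format are my own assumptions.

```python
import hashlib

# Hypothetical sketch of a "safety number": hash both public identity keys
# together and render a short numeric fingerprint. If either key changes
# (new device, or an attacker in the middle), the fingerprint changes too.
def safety_number(key_a: bytes, key_b: bytes) -> str:
    # Sort the keys so both sides compute the same value regardless of order.
    digest = hashlib.sha256(b"".join(sorted([key_a, key_b]))).hexdigest()
    digits = str(int(digest, 16))[:30]
    # Display in groups of five digits, loosely like Signal's UI.
    return " ".join(digits[i:i + 5] for i in range(0, 30, 5))
```

Both parties read the number aloud (or compare in person); a mismatch means the keys they hold for each other differ.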

  • kn33@lemmy.world · 2 years ago

    I got one of these a few months ago. I could tell it was fake before I even answered, but I was curious so I pointed my own camera at a blank wall and answered. It was creepy to see my friend’s face (albeit one that was obviously fake if you knew what to look for) when I answered.

    • Kodemystic@lemmy.kodemystic.dev · 2 years ago

      How do these scammers know who our friends are? And how are they able to get pictures or video of said friend to create the fake?

      • kn33@lemmy.world · 2 years ago

        In my case, the friend’s facebook account was compromised. So they were able to get his pictures and call me from his account.

  • redcalcium@c.calciumlabs.com · edited · 2 years ago

    Right now, deepfakes don't work well when the face is viewed from extreme angles, so you can ask them to slowly turn their face to the side, or up and down, as far as they can until the face is no longer visible. They also don't work well when something obstructs the face, so ask them to put a hand in front of their face. They also can't seem to render the mouth right if you open it too wide or stick out your tongue.

    I base this on a deepfake app I tried: https://github.com/s0md3v/roop . But as the tech improves, it might be able to handle those cases in the future.

    Edit: chances are the scammer uses a live deepfake app like this one: https://github.com/iperov/DeepFaceLive . It also supports the Insight model, which only needs a single well-lit photo to impersonate someone.
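The angle, occlusion, and mouth challenges listed above amount to a simple liveness test: pick a couple at random so a scammer can't pre-record responses. A tiny illustrative sketch (the challenge list and function are hypothetical, just restating the comment's suggestions):

```python
import random

# Challenges that current face-swap tools (e.g. the roop / DeepFaceLive
# family mentioned above) reportedly handle poorly. Illustrative only.
CHALLENGES = [
    "Turn your head slowly until your face is in full profile",
    "Look straight up, then straight down",
    "Cover part of your face with your hand",
    "Open your mouth as wide as you can",
    "Stick out your tongue",
]

def pick_challenges(n: int = 2, seed=None) -> list[str]:
    # Random selection means the caller can't rehearse a canned response.
    rng = random.Random(seed)
    return rng.sample(CHALLENGES, k=n)
```

Of course, as the parent edit notes, this only holds as long as the models keep failing these cases.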

    • 14th_cylon@lemm.ee · 2 years ago

      Right now, deepfakes don't work well when the face is viewed from extreme angles, so you can ask them to slowly turn their face to the side, or up and down, as far as they can until the face is no longer visible.

      Or, you know, you can just pick up the phone and call them.

  • AmbientChaos@sh.itjust.works · edited · 2 years ago

    I’m in the US and have a well off friend who had his Facebook hacked. The bad actors sent messages to his friends asking to borrow $500 until tomorrow because his bank accounts were locked and he needed the cash. Someone who was messaged by the bad actors posted a screenshot of a deepfaked video call he received that caused him to fall for it. Wild times we live in!

      • djmarcone@lemmy.world · 2 years ago

        I routinely get emails from the owner of the company I work for asking me to kindly purchase several large gift cards and forward them and the receipt to him for prompt reimbursement.

        • graphite@lemmy.world · 2 years ago

          asking me to kindly purchase several large gift cards

          kindly give me your money, thanks

  • Margot Robbie@lemmy.world · 2 years ago

    With deepfake technology being so advanced nowadays, how will we ever know if the person we are talking with on the internet is who they say they are?