  • Ok, so not great, but not terrible.

    Firstly, you had to fall for social engineering to get the dodgy app via TestFlight. Later on, you had to fall for it again by installing an MDM profile on your own device. In the future, you’ll doubtless be able to get socially engineered into sideloading it.

    Currently, in the UK (I don’t know what this is like in other countries), we get regular prompts from our banks not to share one-time codes with anyone, not even bank employees, and not to transfer money to ‘safe’ accounts, even if someone claiming to be the bank or the police tells you to. They’ll just need to update those to also say “We will never ask you to install test or special versions of our app, or to update it anywhere other than the official Apple or Google app store”.

    This is a social engineering problem, not really an iOS (or Android) technical one.

    EDIT: The article is suspiciously vague on one point:

    Once installed on either an iPhone or an Android phone, GoldPickaxe can collect facial recognition data, identity documents and intercepted text messages, all to make it easier to siphon off funds from banking and other financial apps. To make matters worse, this biometric data is then used to create AI deepfakes to impersonate victims and access their bank accounts.

    What ‘facial recognition data’ is it gathering, and how? As I understand it, FaceID is processed in a secure enclave, and regular apps don’t have access to that. They send a ‘verify this person’ request, the phone itself triggers a FaceID scan, does the verification on-device, and sends back a ‘yes, all good’ reply - the app itself never gets FaceID or biometric data. So unless it’s just doing something like using the camera to take some photos or videos of the user, I’d like to know what the article is talking about there…
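
    For context, here’s a minimal Swift sketch of the flow I’m describing, using Apple’s LocalAuthentication framework (the reason string and print statements are just illustrative). The completion handler only ever receives a pass/fail flag and an error; the scan and matching happen in the Secure Enclave, and no face map or biometric template is handed to the app:

    ```swift
    import LocalAuthentication

    let context = LAContext()
    var error: NSError?

    // Check whether biometrics (FaceID/TouchID) are available and enrolled.
    if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Log in to your account") { success, evalError in
            if success {
                // All the app ever learns: "yes, this is the enrolled user".
                print("Authenticated")
            } else {
                // Failure or cancellation - still no biometric data exposed.
                print("Not authenticated: \(evalError?.localizedDescription ?? "unknown")")
            }
        }
    } else {
        // Biometrics unavailable or not enrolled on this device.
        print("Cannot use biometrics: \(error?.localizedDescription ?? "unknown")")
    }
    ```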