April 28, 2024

The mountain of shit theory

Uriel Fanelli's blog in English


Clubhouse and biometrics.


Friends wondering whether Clubhouse is safe have been digging through the literature, and so far they have understood that yes, Clubhouse leaves metadata here and there. That seems to be their main concern, but they forget that Clubhouse also hands out data of another kind: biometric data.

Imagine that a social network asked you to leave your fingerprints to get an account. So far, nothing wrong: after all, it is like a signature. We could even imagine a social network asking you to leave a scanned signature in order to let you in.

You would be very perplexed, I guess: with a signature you can do many things. Checks can be signed, contracts concluded, credit card payments authorized, and so on. So you wouldn't like a social network asking for high-resolution images of your signature. A signature is serious business, right?

So, let's talk about biometrics.

Biometrics is the set of techniques and technologies that associate your identity with some (at least statistically) unique trait of your biological body. For example:

  • DNA-PCR
  • Facial recognition.
  • Fingerprints.
  • Retinal scan.
  • Voice recognition.
  • Signature on paper.
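
At its core, every method in the list above works the same way: compare a fresh sample against an enrolled template and accept if they are close enough. A minimal sketch of that idea (all feature values and the threshold are invented for illustration):

```python
# Toy sketch of biometric verification: compare a fresh sample's
# feature vector against an enrolled template, and accept if the
# distance falls under a tuned threshold. All numbers are invented.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(sample, template, threshold=0.5):
    # Biometric matching is never exact: two scans of the same
    # finger (or two recordings of the same voice) differ slightly,
    # so acceptance is statistical, not an equality check.
    return euclidean(sample, template) < threshold

enrolled = [0.12, 0.80, 0.33]   # template stored at sign-up
fresh    = [0.14, 0.78, 0.35]   # sample presented at login
impostor = [0.90, 0.10, 0.60]

print(verify(fresh, enrolled))     # small distance: accepted
print(verify(impostor, enrolled))  # large distance: rejected
```

The statistical nature of the match is exactly what makes imitation attacks possible: an attacker does not need your real trait, only a sample that lands under the threshold.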

Now, there are some considerations to make about these methods, and the one I want to focus on concerns social engineering, that is, cyber attacks on "layer 8".

Suppose I call my team assistant on the phone, asking whether she could email me a certain document that sits on the intranet, which for some mysterious reason I cannot access (say, I only have my cell phone and not my laptop with me, and the intranet is not reachable from cell phones).

So my heroic team assistant (Covid confirmed it: she is heroic), what will she do?

  • She'll recognize me by my voice.
  • She will send me the file.

Very well. Or rather, very badly, because what we have just seen looks like an attack on the so-called "layer 8". Or rather: no. The team assistant is perfectly authorized to give me that file, provided she can recognize me by voice.

Here is the problem: she is using a voice recognition system.

You will have understood the point: the problem comes when someone can imitate my voice. Let's go step by step and rank these methods by dangerousness. Let us ask how much damage someone able to forge each identity trait could do.

  • Signature on paper (the most dangerous, because it is accepted for contracts of all kinds).
  • Voice recognition (telephone contracts exist, even if only for a few things).
  • Facial recognition (the guard of a building recognizes you by your face).
  • Fingerprints (here we enter the field of criminology).
  • DNA-PCR (again, criminology).
  • Retinal scan (I don't know of many commercial applications that use the retina as an ID).

If we rely on this ranking, it is clear that an attack on "layer 8" is much more dangerous if it forges your signature. In second place I would put voice recognition, because it covers everything you can do in a phone call in which you are recognized. Facial recognition is also dangerous, but it requires physical presence, so the attacker would have to mimic the whole body. True, a video call can be limited to the face, but in a video call they would also have to imitate your voice.

Clearly the most explosive vector is the signature, but a signature is usually given in person, which would also require imitating the face, the voice, and often the whole body.

Voice recognition is a tremendous weakness in the world of layer 8 attacks. The voice is, even if we rarely think about it, the most widely used remote recognition tool.

"Cyber attacks on layer 8" refers to attacks on computer systems that do not go through the network stack (layers 1 to 7) but gain access by attacking the people who administer those systems.

And it's a problem, yes.

Check out these videos:

These are two examples of an "attack on layer 8", and the first is notable because it is hardly IT at all: playing a crying baby in the background served to increase the sense of empathy and the pressure on the telephone operator.

And the problem grows:

Now, I think, you have all the pieces to understand one thing: giving your voice away today is dangerous. Giving it to strangers is dangerous: by combining it with data they can extract from you, data they find on the internet, and more, they can call the company reception with "can I send my wife to pick up the package that arrived for me?", and the whole thing works.

Now let's go back: how easy is it to imitate a voice once it has been heard?

You may have heard of techniques like deepfakes, but those concern images. Are there deepfakes for the voice too?
Yes, there are.

And as you can hear, they are even more effective. Above all, they can be done in real time, because the amount of information in sound is much smaller than in video.
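The bandwidth argument can be checked with back-of-the-envelope numbers (all figures below are typical, rounded assumptions, not measurements):

```python
# Rough data-rate comparison: why real-time voice synthesis is far
# cheaper than real-time video synthesis. Figures are typical,
# rounded assumptions (uncompressed, to keep the comparison simple).

# Telephone-quality speech: 8 kHz sampling, 16 bits per sample, mono.
audio_bps = 8_000 * 16 * 1

# Modest video-call picture: 640x480 pixels, 24 bits per pixel, 30 fps.
video_bps = 640 * 480 * 24 * 30

print(f"audio: {audio_bps:,} bit/s")   # 128,000 bit/s
print(f"video: {video_bps:,} bit/s")   # 221,184,000 bit/s
print(f"ratio: {video_bps // audio_bps}x")  # ~1700x more data for video
```

Three orders of magnitude less data per second means a voice model has far less to generate per frame of real time, which is why voice cloning runs live while convincing live video forgery is much harder.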

And these systems are already in circulation:

For this reason I regard Clubhouse as a very dangerous system. It brings together two very dangerous things:

  • the ability to hear things spoken by me.
  • the possibility of extracting data about my person during a dialogue.

For example, even a podcast could be used for this purpose: but in a podcast nobody says which bank they rely on, who their GP is (whom an attacker could target to extract information), where exactly they work (so someone can call the concierge in their name), which phone company they use, their home number, and so on.

Instead, in a highly interactive environment it is possible to ask for all of this information. And then you have provided everything needed for an attack on layer 8.
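A toy sketch of how this accumulation works: each scrap of information extracted in casual conversation is harmless on its own, but together they form a ready-made pretext (every name and value below is invented):

```python
# Sketch: how innocuous answers, collected across casual chats,
# accumulate into a usable pretext profile. All data is invented.
profile = {}

def record(key, value):
    """Store one scrap of information learned in conversation."""
    profile[key] = value

# Each answer is harmless in isolation...
record("city", "Duesseldorf")       # small talk
record("bank", "Examplebank")       # "what bank do you recommend?"
record("employer", "Acme GmbH")     # "where exactly do you work?"
record("voice_sample", "room.wav")  # recorded from the room itself

# ...but together they are enough to script a phone call:
# a cloned voice plus the right bank name and employer.
ready = {"bank", "voice_sample"} <= profile.keys()
print(ready)  # True: enough material for a voice-spoofed call
```

The point of the sketch is that no single question raises an alarm; the danger is the aggregation, which an interactive voice platform makes trivial.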

This information is dangerous, because voice recognition via a telephone is the most widely used remote recognition system. It can be used to build (or destroy) alibis, and more.

So, don't give your voice to strangers.

Never. The number of possible scams is unimaginable: you risk fueling phone scams to a monstrous level.

It is true that you also give out your voice on WhatsApp, but you normally give it to people you know. On a system that is practically public, everything changes. When a stranger has your voice, you're in the shit.

All it takes is for me to ask you: ah, you're from Düsseldorf? I'm looking for a good bank to open an account, what do you recommend? It doesn't take much to see that even if you give two or three names, one of them will be your bank. And the guy has your voice. From tomorrow he can call your bank with your voice.

So NO, NO, NO: do not give your voice to strangers, let alone on a social network.
