April 27, 2024

The mountain of shit theory

Uriel Fanelli's blog in English

Cialtroprivacy.

By now politics remains the only social elevator in Italy, so when a topic appears on the horizon, politicians appropriate it. When this happens to a technological topic, the politicians' incompetence usually turns the topic into pure magic, a magic made of superstitious gestures: "use Tor", "install uBlock", and so on.

And let's be clear: all of these tools do what they say they do. Header filters filter headers, VPNs do VPN things, and so on. But pay attention to one thing: nobody, beyond the bland GDPR, is trying to introduce a real data discipline, that is, to teach users what to do with their data.

One thing must be very clear: at some point, you will have to give your data to someone. As long as the network was only made up of things that didn't affect your real life, you could keep your data for yourself. But in a world where you work remotely, e-commerce goes mainstream, you have an online bank, and you have online insurance, etc., your data goes to someone.

And if you want a world where you can take a selfie without waiting for the place to be deserted, you have to accept that your face, and therefore your position at a given moment, ends up on the network. Ditto for the sounds you make: if you are on the bus and someone makes a call or records a voice message, they are recording you too.

If we then move on to profiling, well, there is a lot of information you send and a lot you don't: if one person in a family makes a purchase with a spouse's credit card, you are profiled as the kind of family for which one card is enough, meaning that probably only one person works and the income is medium-low, whereas families that earn much more have a credit card per person.

In general, the precautions you take to be "untraceable" are themselves a method of profiling, because they help identify your computer. If you use Privacy Badger and uBlock instead of AdBlock, one can estimate your proximity to the EFF; but you could use other software, and another antivirus, and all of this contributes to making your browser unique, that is, to tracking you.
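
To make the idea concrete, here is a toy sketch in Python (the attribute values are invented, and no real tracker works exactly like this) of how the very settings you chose "for privacy" can be folded into a stable identifier: the rarer the combination, the more precisely it points at you.

```python
import hashlib

# Hypothetical attributes a site can observe or infer; every value here is made up.
attributes = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64; rv:125.0) Gecko/20100101 Firefox/125.0",
    "screen": "2560x1440",
    "timezone": "Europe/Berlin",
    "blockers_detected": ["Privacy Badger", "uBlock Origin"],  # inferred from their side effects
    "do_not_track": True,
}

# Hashing the whole combination yields a stable identifier for this exact setup.
fingerprint = hashlib.sha256(repr(sorted(attributes.items())).encode()).hexdigest()
print(fingerprint[:16])
```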

The problem is not to stop offering data. The problem is deciding WHICH data.

For example, when you connect to the internet, you don't just give your ISP your IP address. You also give it your latencies. Latencies that make your computer unique: how many tenths of a second does it take to respond? And what's the point? The point is that the so-called telemetry every telco collects on the access network is considered acceptable because it is made necessary by the service the ISP provides you.
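
If you are curious what such a timing signature looks like from your side of the wire, a minimal sketch is enough (the host and port are arbitrary placeholders; the ISP sees this kind of number constantly, from its side, without asking):

```python
import socket
import statistics
import time

def tcp_rtt_ms(host: str = "example.com", port: int = 443, samples: int = 10) -> list[float]:
    """Measure TCP connect round-trip times, in milliseconds."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass  # connection established; we only care about the timing
        times.append((time.perf_counter() - start) * 1000)
    return times

rtts = tcp_rtt_ms()
print(f"median {statistics.median(rtts):.1f} ms, stdev {statistics.stdev(rtts):.1f} ms")
```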

Similarly, when you buy something online you are giving data: the seller will have to know what to send you and where.

The second myth to debunk: "yes, but then I get to decide what they do with my data". This is nonsense. When you leave the house dressed in a certain way (which allows anyone to trace you, profile you and all that), do you worry about it? Do you realize that every time you leave the house with your face uncovered, you expose it to a huge number of people?

But even if we limit ourselves to the digital world, whatever contract you have with A stays between you and A. When at some point A is sold to B, because Tumblr gets bought, or because Yahoo resells it or shuts it down, what happens to that data? You have no control over the data. And whatever you do to protect it provides a UNIQUE portrait of you.

The illusion of data control must be dispelled immediately: it is a unicorn. It does not exist. It never has.

Let's go to the important things instead. Which are two: risk assessment, and value assessment.

Let's start with the risk assessment. Okay, Amazon knows everything about me. So (hypothetically, because it almost never gets it right) "it knows my tastes" and it knows what to offer me.

Good.

Apart from the fact that by now the same could be said of the butcher in Erkrath where I buy my supplies, the point is that if I ask myself "what is the risk?", honestly I know that at most I would get more interesting advertising instead of less interesting advertising.

That's it? This is what I get today:

[image: the advertisement I am shown today]

Now, obviously someone knows what I do and thinks I'm interested in using 5G for Ericsson's power-grid technologies. Ok, fine. Is that all the risk?

Truth be told, if Amazon had asked me "what advertising would you like to see? What are you interested in?", I would have answered the questionnaire: there was no need to "steal" the information. And what happens now? My advertising is more interesting than when they offered me a test for vaginal candidiasis?

It is clear that I would not want Amazon to have photographs of me while I am (hypothetically) with a lover, or in other situations that lend themselves to, say, revenge porn. And this is the risk assessment. But when I see people using Tor to read the newspaper, honestly I laugh: what risk have you mitigated?

The risk assessment of the data forces you to ask yourself two questions (a toy sketch follows the list):

  • How hostile / dangerous is the entity that collects the data? What are its intentions? Why would it want to hurt me?
  • How dangerous is the data itself? What if Zuckerberg knows where I live? And what if he knows which party I'm voting for? What do I gain by handing over the data, and what does Facebook gain by having it?
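
A toy way to put those two questions together (my own sketch, not anyone's standard methodology): score the counterparty and the data separately on a small scale and multiply.

```python
def privacy_risk(counterparty_hostility: int, data_sensitivity: int) -> int:
    """Toy risk score: each factor on a 0-3 scale, 0 = harmless, 3 = could ruin your life."""
    return counterparty_hostility * data_sensitivity

# Amazon knowing I like sushi: mildly interested counterparty, trivial data.
print(privacy_risk(counterparty_hostility=1, data_sensitivity=0))  # 0
# An ex holding an intimate photo: motivated counterparty, explosive data.
print(privacy_risk(counterparty_hostility=3, data_sensitivity=3))  # 9
```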

Once upon a time (when photographs were developed by a photographer) there was a photographer in Bologna who one day started selling, in his shop/studio, a CD-ROM of "swinger couples, the secret Bologna". What was the heart of the problem? The risk assessment: those photos were meant for swinger magazines, where they appeared censored, so as to make the ladies, the houses, etc. unrecognizable. But the photographer saw the actual photographs.

Even a simple risk assessment would have made it clear that:

  • the images were dangerous. Both possession by the photographer and their dissemination were dangerous.
  • The photographer wanted, like all economic entities, to make money. So he could use those images to make money. There was the motive.

This is the point. Today the privacy risk is completely flattened. The risk of revenge porn that comes with sending a semi-pornographic or pornographic photo to your boyfriend is equated with the risk you take when Amazon knows you like to cook sushi.

No, I'm sorry: Amazon can NEVER be as dangerous as your boyfriend.

This flattening of the dangerousness of the data is the reason why NONE of the solutions offered by the so-called "privacy experts" (usually politicians trying to make ends meet) will ever work.

A typical example is the fact that many students are using Teams for distance learning (DAD). What information do they hand over?

  • that a 12-year-old boy attends middle school.
  • that he has geography class from 10 to 11.
  • that his teacher is called Alemagna Alpina.
  • that he has some trouble pointing out where sugar beet is grown in Morocco.

What's the risk? What can Microsoft do with this data, and what does it want to do with it? Even if Microsoft had embarrassing data on him (he peeked at a classmate's boobs), what are we talking about?

On the contrary, I would have serious problems if it were a business meeting where we discuss offers to a customer and the topic is "Microsoft vs Linux". I would have serious problems using it to call a customer when the subject is "Azure vs AWS".

Thinking that Microsoft Teams raises more privacy issues in distance learning than in the corporate world is a clear example of provincial self-righteousness.

Let's go to the second point: an evaluation of value.

If the market consists ONLY of me and foo, and I know that diabetes is endemic in our village, I can easily beat foo by offering only products with no added sugar. But if we both know it, the value of this information drops to zero. And all modern profiling systems are pretty good at figuring out what has value and what doesn't.

What does that mean? It means that, to have any value, information must be an advantage over the competition. If everyone has the information, a race to the bottom on price begins, and once it is widespread its value drops to zero.
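
If you want it as a formula, a toy model is enough (the numbers are mine and purely illustrative): the value of a piece of information scales with the leverage it gives and collapses as the number of holders grows.

```python
def information_value(leverage: float, holders: int) -> float:
    """Toy model: value grows with leverage and collapses as copies spread."""
    return leverage / holders

# Only I know that diabetes is endemic in the village: I can price accordingly.
print(information_value(leverage=10.0, holders=1))   # 10.0
# Both foo and I know it, along with every profiling broker: race to the bottom.
print(information_value(leverage=10.0, holders=50))  # 0.2
```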

Another point is "who has a copy". For example, if someone says "you were in the square molesting women on December 8, 2017", they have a notable weapon. Or they don't, if I have a camera in my living room that shows me there, my resolution is better than theirs, and it is clear that I was in my living room, as the data in the cloud shows.

For example, I know American consultants (the market is hell there) who go around with this in their pocket:

[image: a pocket-sized recording device]

There are even smaller ones to secure an office temporarily:

[image: an even smaller recording device, for temporarily securing an office]

When they get into an elevator with a woman, or are alone in a room with a woman, they usually record everything, to avoid "strange" accusations.

Now, the privacy point of all this is very simple: a data log that I own (and only I own) is VERY powerful and has VERY high value.

Therefore, an evaluation of VALUE is required.

If I go to a sex shop and leave behind the information that I like Bad Dragons so big the neighbors mistake them for porch pillars, that information is VERY valuable, because the shop is the only one that knows it. If you want to know what a photo of Valentina Nappi's ass is worth, the answer is: zero, because everyone already has one.

But conversely, if a paparazzo takes a picture of a politician coming out of Valentina Nappi's house, he has a terrible weapon in his hands. And if Nappi replies "I have a camera in my bedroom and I record EVERYTHING that happens there", Nappi has the bigger weapon.

How does this translate into practice? It means that you need to make an evaluation of VALUE, and decide that information of little value can be given away, provided it is given to many entities. Information of great value that goes to a few entities requires two things: trust in that entity, TOGETHER with keeping a COPY of your own.

Only by evaluating the value of the information can you decide whether to give it away, and to whom to give it away. You can't just say "I hide ALL the information", since in general:

  • information is worth more the fewer people have it. It is the information you give to FEW entities that normally has the most value. If you use several e-commerce websites, say Amazon, Otto, Zalando, Idealo, Lidl, and leave your preferences with all of them, the most valuable information is the one you gave to only one of them, say to "make an avatar with my face".
  • in proportion, the winner is whoever holds the information that is worth more: you have almost certainly given your screen resolution to ALL the websites you visit. It has no value. Instead, the answer to the security question "What was your mother's maiden name?" that you left on only ONE site is valuable.

In the value decision, the point is not deciding WHAT the website may do with your data. The question in this case is: between the two of you, who wins?

If Amazon says the package was delivered to me intact while I have footage from my own camera showing the courier playing football with it, I win, no matter how much more information Amazon has about me.

Instead, we focus on tools: "noooo, no outdoor surveillance cameras to protect the house". And why not? Because I would be able to see who walks past in a public place, i.e. outside my gate? It is a public place. Then it's "use a VPN, you will protect your IP". Aha. So what? What exactly could some forum do with your IP, which is dynamic anyway? If, on the other hand, you host a VPN in your home and connect your cell phone to it, to anyone tracking IPs it looks as if the phone never leaves the house. You hold a copy of the logs, and you can prove that you were NOT at home, because YOU have them.

Ultimately, therefore, instead of suggesting to everyone "use TOR", "use this or that product", the thing that should be done is an awareness campaign, that is:

  • a serious risk assessment, based on the counterparty the data is handed to and on the type of data. The CIA has no use for your waistline.
  • a serious evaluation of the value, that is, of the POWER transferred to third parties by handing over information. You can't do "revenge porn" to Valentina Nappi.

Politicians will turn up their noses, because they want political solutions, but the point is simple:

"There are no political solutions to technical problems. But the opposite is also true".

For instance, I'm not a big fan of solutions like "Tutanota" or "ProtonMail". Because I can evaluate that my emails in Google's hands may well help it understand who I am with excellent precision. And then, with that information… it doesn't hurt me; at most it improves the advertising.

On the contrary, I know little about either Tutanota or ProtonMail. I know that, trusting in their secrecy, I will probably feel freer to speak. But I don't know who they are, and I don't know what power I'm giving them: the poor fool who thought he was hiding behind ProtonMail must have been disappointed to see the French public prosecutor unpack his "end to end" encrypted email; in the end, the data was simply too precious. All the advice like "use Matrix", "use Signal", "use Tutanota" is advice from politicians.

The correct advice is: ask yourself what is the maximum POWER you are willing to hand over to others when you write an email. This is not an endorsement of Gmail, if anything of self-hosting; the point is that telling people to ALWAYS AND ONLY use ProtonMail or Tutanota and NEVER Gmail is stupid: it teaches no one to evaluate risk and value, that is, power.

The only sensible privacy policy is the following (a toy sketch follows the list), and it has nothing to do with technologies:

  • if you are not vulnerable to a given piece of information, you can give it to everyone.
  • if you are vulnerable to a given piece of information, but the counterparty cannot do anything dangerous with it because the balance of power does not change, or changes in your favor, you can give it to that counterparty and only to it.
  • if you are vulnerable and EVERY counterparty could do something dangerous with it because the balance of POWER is tilted against you, then self-host it or do not send it at all.
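
Put as code, the three rules above boil down to something like this toy decision function (a sketch of the reasoning, not a real policy engine):

```python
def should_share(vulnerable: bool, counterparty_can_exploit: bool) -> str:
    """Toy version of the three rules above."""
    if not vulnerable:
        return "give it to anyone"                   # rule 1
    if not counterparty_can_exploit:
        return "give it to this counterparty only"   # rule 2
    return "self-host it, or don't send it at all"   # rule 3

print(should_share(vulnerable=False, counterparty_can_exploit=False))  # e.g. screen resolution
print(should_share(vulnerable=True,  counterparty_can_exploit=False))  # e.g. delivery address
print(should_share(vulnerable=True,  counterparty_can_exploit=True))   # e.g. intimate photos
```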

The final purpose of this evaluation is to understand precisely how the balance of POWER between you and whoever stores the data changes, combined with an assessment of how vulnerable you are to the data itself.

And Tutanota can't do it. You have to learn how to do it.

Obviously this is not the politician's solution: the politician does not want YOU to decide what to do with your data, he wants to be the one who decides.
