Employees paid to read along

9 September 2021 – 13:51

WhatsApp chats are not as secure as expected

That nobody can read our chats when we send them via WhatsApp is a point Facebook is keen to stress. A current report now shows: there are exceptions.

Facebook has not kept its promise regarding user privacy

When it comes to privacy, Facebook does not have the best reputation, as even founder and CEO Mark Zuckerberg has had to admit. But because users need to feel comfortable with a messenger, the company has placed great emphasis on the fact that, thanks to end-to-end encryption, messages on WhatsApp cannot be read by anyone. "Not even by Facebook," the chats themselves declare. And yet the company employs over 1,000 people who do just that.

This is reported by the US news site ProPublica. According to the report, these contractors commissioned by Facebook sit in offices around the world and review millions of messages, photos and videos sent via WhatsApp. The goal: to find "inappropriate content".

Facebook employs people to read messages

Content can be reported on WhatsApp just as on Facebook or Instagram. If a user marks a message, photo or video as "inappropriate", the content and the surrounding chat are forwarded to Facebook. External contractors then assess whether it is a case of attempted fraud, child pornography or possible terrorist plans, according to ProPublica.

According to the site's research, the job closely resembles moderation for other online services. For a wage of $16.50 an hour or more, the 29 moderators interviewed work through incoming complaints in offices and, during the pandemic, from home as well. They handle up to 600 such tickets a day, which corresponds to an average processing time of less than a minute per case. Their working speed is monitored.

How WhatsApp checks the messages

The process is always the same. If a user flags a message as inappropriate, it and the four preceding messages are sent unencrypted to Facebook – including any photos and videos. The tickets are then assigned to individual moderators.
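The reporting flow described by ProPublica can be sketched in a few lines. All function and field names below are illustrative assumptions, not WhatsApp's actual implementation:

```python
# Sketch of the reporting flow: when a user flags a message, it and up to
# four preceding messages leave the end-to-end encrypted channel and are
# forwarded in readable form for review. Names here are hypothetical.

def build_report(chat_history, flagged_index):
    """Collect the flagged message plus up to four preceding ones."""
    start = max(0, flagged_index - 4)
    return {
        "flagged": chat_history[flagged_index],
        "context": chat_history[start:flagged_index],  # up to 4 prior messages
    }

chat = ["msg1", "msg2", "msg3", "msg4", "msg5", "scam link"]
report = build_report(chat, flagged_index=5)
print(report["flagged"])       # "scam link"
print(len(report["context"]))  # 4
```

Everything outside the report – the rest of the chat – would stay within the encrypted channel under this model.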

"Proactive" tickets are somewhat more problematic. Here the content is flagged not by a user but by an artificial intelligence that automatically evaluates chats and compares them against known problematic material. The AI weighs not only unencrypted account information but also the frequency of messages sent, the terms used, and media already known to be problematic. This can catch fraud attempts or the sharing of illegal recordings. Apple faced massive PR problems over precisely such automatic scanning on the iPhone, even though the scope of its scanning was considerably smaller than what is now described at Facebook.
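Matching media against material that is already known to be problematic is typically done by comparing fingerprints (hashes) rather than the files themselves. A much simplified sketch, not Facebook's actual system – real deployments use perceptual hashes that survive re-encoding, whereas SHA-256 here only demonstrates the principle:

```python
# Simplified illustration of matching media against a list of content
# already known to be problematic. SHA-256 only matches byte-identical
# files; production systems use perceptual hashing. Data is invented.
import hashlib

KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known illegal image bytes").hexdigest(),
}

def is_known_problematic(media_bytes: bytes) -> bool:
    """Check a file's fingerprint against the known-bad list."""
    return hashlib.sha256(media_bytes).hexdigest() in KNOWN_BAD_HASHES

print(is_known_problematic(b"known illegal image bytes"))  # True
print(is_known_problematic(b"holiday photo"))              # False
```

The appeal of this approach is that the matching side never needs to store or transmit the offending files themselves, only their fingerprints.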

Once a ticket reaches the moderators, they have to evaluate the message – and decide whether the user should be monitored or even banned. The job is not easy. ProPublica reports that when child pornography was suspected, the age of the person shown had to be estimated; for a possible beheading video, moderators were supposed to judge whether the corpse was real. On top of that come hurdles such as language barriers, which the tools provided do not always bridge to the necessary extent. One moderator reported, for example, that the software tried to translate Arabic text as if it were Spanish.

Allegations against WhatsApp confirmed

Confronted with the allegations, Facebook communications chief Carl Woog confirmed to the site that the teams reviewing content exist. The aim is to catch and ban "the worst" abusers, he explained. However, Woog emphasized that the company does not regard this as content moderation: "We don't normally use that term on WhatsApp," he said. According to the company, the point is to operate the service reliably and prevent abuse while protecting privacy.

In fact, this kind of moderation does not necessarily contradict the statements about end-to-end encryption. The encryption only guarantees that messages cannot be read in transit. They must be decrypted on the device at the latest – otherwise they could not be displayed to users at all. If only the content reported as inappropriate is passed on, the rest of the communication remains protected. But that cannot be verified.

In addition to the chats, other data is also collected

In any case, the company seems well aware that users might question this account. While clear figures on content moderation for Instagram are published as part of the transparency report, no such report exists for WhatsApp. The signal that the company can read along, even in very limited cases, is unlikely to help the messenger's fragile reputation.

Even now, WhatsApp is considered a data octopus compared with other messengers. By evaluating the so-called metadata that accumulates around the chats, Facebook can infer far-reaching details about the relationships between chat partners. Someone who only occasionally exchanges a message during working hours plainly means something different to a person than someone who regularly spends evenings in the same apartment and occasionally receives a video call at night.
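How much even simple metadata reveals can be illustrated with a toy heuristic over nothing but message timestamps. The thresholds and labels here are invented for the example and bear no relation to any real system:

```python
# Toy illustration of metadata analysis: guessing a relationship type
# from when (not what) two people chat. Thresholds/labels are invented.
from datetime import datetime

def guess_relationship(timestamps):
    """Classify a contact from the hours at which messages are exchanged."""
    hours = [datetime.fromisoformat(t).hour for t in timestamps]
    night = sum(1 for h in hours if h >= 22 or h < 6)
    work = sum(1 for h in hours if 9 <= h < 17)
    if night > len(hours) / 2:
        return "close personal contact"
    if work > len(hours) / 2:
        return "work contact"
    return "unclear"

colleague = ["2021-09-06T10:12", "2021-09-06T14:30", "2021-09-07T11:05"]
partner = ["2021-09-06T23:40", "2021-09-07T01:15", "2021-09-07T22:30"]
print(guess_relationship(colleague))  # "work contact"
print(guess_relationship(partner))    # "close personal contact"
```

Real metadata analysis additionally has location, call duration, group membership and contact graphs to draw on, which makes the inferences correspondingly sharper.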

In addition, there is data such as the profile picture, group names and access to the entire address book. "Another aspect is forwarded messages and images. Here, too, Facebook can track who shared them and when. Even if the content of the messages themselves is not known, that reveals a lot about the users," explains security researcher Paul Rösler. "If you have enough metadata, you don't really need the content," a former NSA general counsel once said. Former CIA and NSA director General Michael Hayden put it even more drastically: "We kill people based on metadata."

Terms of service adjusted after users jumped ship

Just this spring it became clear that users are unhappy even with the current level of data collection. With a change to its terms of service, Facebook wanted to regulate communication with businesses. Word quickly spread that the real purpose was merging customer data across WhatsApp, Facebook and Instagram. Users ran away in such droves that Facebook postponed the consent deadline twice and ultimately made accepting the change voluntary. No wonder the company wants to avoid the impression that it can read the chats at any time.

There are, however, two levers Facebook could use to easily increase trust in WhatsApp: competitors such as Signal, which take data protection seriously, collect significantly less data and have also disclosed their source code – and can thus credibly demonstrate that they cannot see users' chats. (stern.de/mma)

Note: This article first appeared on stern.de.


