How Metadata From Encrypted Messages Can Keep Everyone Safe

The future is encrypted. Real-time, encrypted chat apps like Signal and WhatsApp, and messaging apps like Telegram, WeChat, and Messenger — used by two out of five people around the world — help protect privacy and support our rights to organize, speak freely, and stay connected to our communities.

These apps are built for convenience and speed, for person-to-person communication as well as large group conversations. But those same qualities give rise to abusive and illegal behavior, disinformation and hate speech, and scams and deception, all to the detriment of the majority of their users. In 2018, investigative reports examined the role these apps played in multiple deaths in India and Indonesia, as well as in elections in Nigeria and Brazil. The ease with which users can forward messages without verifying their accuracy lets disinformation spread quickly, covertly, and at significant scale. Some apps allow very large groups — up to 200,000 members — or host organized encrypted propaganda machinery, a departure from the original vision of the private "sala." And some platforms have proposed profit-driven policy changes that would allow business users to exploit customer data in new and invasive ways, ultimately eroding privacy.

In response to the harms done through these apps, prominent governments have pressed platforms to implement so-called backdoors or to deploy client-side automated scanning of messages. But such measures undermine everyone's basic freedoms and put many users at greater risk. As recent research by Riana Pfefferkorn of Stanford University shows, these invasive measures — and other traditional moderation approaches that rely on access to content — are rarely effective at combating online abuse.

Product design changes, not backdoors, are the key to reconciling the competing uses and misuses of encrypted messaging. While the content of individual messages can be harmful, it is the scale and virality with which they spread that presents the real challenge, turning sets of harmful messages into a groundswell of destructive social force. Researchers and advocates are already analyzing how changes such as forwarding limits, better labeling, and smaller group sizes could greatly reduce the prevalence and severity of problematic content, organized propaganda, and criminal behavior. However, such work currently relies on workarounds such as tiplines and public groups. Without good data from the platforms, auditing the real-world effectiveness of any of these changes is hampered.
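Why do forwarding limits matter so much? A back-of-the-envelope model makes the argument concrete: if each recipient forwards a message to a fixed number of chats, reach grows exponentially with the number of forwarding hops, and capping per-hop forwards collapses that growth. The sketch below is illustrative only — the `fanout` and `cap` figures are assumptions, not platform data.

```python
# Minimal branching-process sketch of message virality.
# Assumption: every recipient forwards the message to `fanout` chats
# per hop; a platform-imposed `cap` limits that fanout.

def reach(fanout, hops, cap=None):
    """Total recipients after `hops` rounds of forwarding."""
    f = min(fanout, cap) if cap is not None else fanout
    total = 0
    recipients = 1  # one seed message
    for _ in range(hops):
        recipients *= f  # each current recipient forwards to f chats
        total += recipients
    return total

# Uncapped: forwarded to 20 chats per hop, 4 hops deep.
print(reach(fanout=20, hops=4))         # 168420
# Capped at 5 forwards per hop.
print(reach(fanout=20, hops=4, cap=5))  # 780
```

Even this toy model shows a two-orders-of-magnitude difference in reach, which is why design changes like forwarding caps can matter more than content-level moderation.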

Much more can be done by the platforms. For such significant product changes to be most effective, platforms need to share "metadata about the metadata" with researchers: aggregated datasets showing how many users are on a platform, where and when accounts are created, how information travels, which types and formats of messages spread fastest, which messages are most often reported, and how (and when) users are booted off. To be clear, this is not the information commonly referred to as "metadata," which usually means information about a particular individual and can be deeply personal, such as a person's name, email address, mobile number, intimate contacts, and even payment information. Protecting that kind of personal metadata is essential, which is why the United Nations Office of the High Commissioner for Human Rights rightly considers a user's metadata to be subject to the right to privacy online.
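The distinction between personal metadata and the aggregated "metadata about the metadata" described above can be sketched in a few lines. All record fields and values below are hypothetical; the point is only that identities are dropped before anything leaves the platform.

```python
# Illustrative sketch: personal metadata vs. aggregated metadata.
# Field names ("user", "action", "region") are assumptions for the example.
from collections import Counter

# Per-event records a platform might hold internally.
# This is personal metadata: it names individuals.
events = [
    {"user": "alice@example.com", "action": "forward", "region": "BR"},
    {"user": "bob@example.com",   "action": "forward", "region": "BR"},
    {"user": "carol@example.com", "action": "report",  "region": "NG"},
]

# Aggregated metadata: counts by action and region, identities dropped.
# This is the kind of dataset researchers could audit safely.
aggregate = Counter((e["action"], e["region"]) for e in events)
print(dict(aggregate))
# {('forward', 'BR'): 2, ('report', 'NG'): 1}
```

The aggregate answers questions like "how many forwards happened in each region?" without containing a single name or address — the property that makes sharing it compatible with the right to privacy.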

Luckily, we don’t need that level or type of data to start seriously addressing these harms. But companies must first be transparent with researchers and regulators about the nature and scale of the metadata they collect, with whom they share it, and how they analyze it to shape product design and revenue models. We know that many private messaging platforms already collect vast amounts of information that yields useful insights into how they design and test new product features and how they attract investors and advertisers.

The aggregated, anonymized data they collect can, without compromising encryption or privacy, be used by platforms and researchers alike to shed light on key questions. Such aggregated metadata could be a game changer for trust and safety, enabling better features and design choices.
