
Meta's Data Harvesting Practices and the Call for Technological Solidarity Against Surveillance

Privacy & Surveillance · Apr 18, 2026 · score 0.88 · 4 posts · 7 replies across 3 instances
Tech behemoths like Meta are accused of defining 'value' purely in monetary terms, meaning they will harvest nearly all available personal data, often under the guise of necessity. A cited example is WhatsApp's background collection of user contact details without explicit user consent.

The debate splits along cultural and technological lines. Some users, like @Em0nM4stodon, push for a societal culture change, arguing for a collective responsibility to protect the data of others and noting risks in areas like ancestry testing, which can expose entire family lines without group consent. Others, such as @kkarhan, argue that personal responsibility demands adopting privacy-enhancing technologies like Tor, Monero, and cash as an act of solidarity against systemic overreach.

The consensus points to a systemic failure in data governance. The primary fault lines run between corporate overreach, where tech companies dictate data use, and the difficulty of defining ethical boundaries, as questioned by @internetarchive regarding facial recognition and algorithmic consent. The prevailing mood is deep skepticism toward any company's claim to 'value' user privacy.

Key points

SUPPORT
Tech companies define 'value' purely in monetary terms, meaning maximum data extraction.
Asserted by @ScottMGS, who stated companies will harvest data when they claim to 'value privacy.'
SUPPORT
WhatsApp's routine collection of user contact details constitutes invasive background surveillance.
Reported by @werawelt, noting Meta maintains contact details without direct user consent.
SUPPORT
Ancestry testing presents a collective ethical risk by exposing data belonging to an entire family unit.
Cautioned by @noodlemaz, who highlighted the issue of gaining consent from all relatives.
SUPPORT
Achieving true privacy requires developing a cultural norm of protecting the data of the wider community.
Stressed by @Em0nM4stodon, emphasizing community data responsibility.
SUPPORT
Countering surveillance demands active individual adoption of privacy-enhancing tools.
Advocated by @kkarhan, naming Tor, Monero, and cash as necessary acts of solidarity.
SUPPORT
Determining who has the authority to set the boundaries for data use remains an unresolved ethical crisis.
Framed by @internetarchive concerning algorithmic decisions and facial recognition.

Source posts

@[email protected]
Remember, when the techbroligarch-owned companies say "We value your privacy!", remember that they only have one definition of "value" and it's measured in money. When they say "Your privacy is important to us." they mean they'll grab as much of your private data as they can because it's important to them. #privacy
0 boosts · 0 favs · 0 replies · Apr 17, 2026
@[email protected]
You might consent to your data being used to prevent societal harm, but who decides where that line is drawn? 🤔⚖️ Aram Sinnreich & Jesse Gilbert explore the hidden ethics of data collection, facial recognition, and algorithmic decision making in THE SECRET LIFE OF DATA on the Future Knowledge #podcast, with Laura DeNardis. 🔍 🎧 Listen & subscribe ⬇️ futureknowledge.transistor.fm/episodes/the-secret-life-of-data #Consent #DataEthics #Privacy #AI @aram @jesse #Bookstodon
0 boosts · 0 favs · 0 replies · Apr 15, 2026
@[email protected]
In privacy, we talk a lot about how to protect our own data. But what about our responsibility to protect the data of others? If you care about privacy rights, you must also care for the data of the people around you. To make privacy work, we need to develop a culture that normalizes caring for everyone's data, not just our own. www.privacyguides.org/articles/2025/03/10/the-privacy-of-others/ #Privacy #Consent #HumanRights
1 boost · 0 favs · 7 replies · Apr 15, 2026
@[email protected]
Who gets to decide how your data is used, especially when you never gave informed consent? Aram Sinnreich & Jesse Gilbert explore the ethical gray areas of data use, from facial recognition to unseen algorithmic decisions, in THE SECRET LIFE OF DATA on the Future Knowledge #podcast, in conversation with Laura DeNardis. 🎧 Listen & subscribe ⬇️ futureknowledge.transistor.fm/episodes/the-secret-life-of-data #Consent #DataEthics #Privacy #AI @aram @jesse #Bookstodon
0 boosts · 0 favs · 1 reply · Apr 8, 2026