It’s disappointing that Ring has chosen this moment, when police and the government are infringing civil liberties left and right, to revive its tech’s most invasive feature: helping police get videos captured by home security cameras. https://www.consumerreports.org/electronics/privacy/ring-community-requests-lets-police-ask-for-user-videos-a2437818485/
Install Privacy Badger organization-wide to protect your community from online surveillance and malvertising. https://www.eff.org/deeplinks/2025/09/libraries-schools-why-organizations-should-install-privacy-badger
ICE reactivated its contract with spyware manufacturer Paragon Solutions. You can read more about it here, but what does that mean for using encrypted chat apps like Signal? 🧵(1/8) https://www.eff.org/deeplinks/2025/09/eff-statement-ice-use-paragon-solutions-malware
“Social media sites and chat apps normally strip metadata by default,” which protects privacy, EFF’s Jacob Hoffman-Andrews told Straight Arrow News. “If XChat is failing to strip metadata, it’s putting its users at risk.” https://san.com/cc/not-so-secret-xs-new-encrypted-chat-feature-puts-users-at-risk-experts-say/
Age verification mandates are a dream come true for Big Tech platforms, which will end up with more traffic and a whole lot more data. But these laws are a nightmare for users. https://www.eff.org/deeplinks/2025/09/age-verification-windfall-big-tech-and-death-sentence-smaller-platforms
Privacy Badger isn’t just for individuals. Libraries and schools can install Privacy Badger organization-wide to make private and secure browsing the default on their computers. https://www.eff.org/deeplinks/2025/09/libraries-schools-why-organizations-should-install-privacy-badger
After years of activist pressure, lawsuits, and bad press, Ring made much-needed reforms. Now it’s pivoting back to mass police surveillance as a business model. https://www.eff.org/deeplinks/2025/07/amazon-ring-cashes-techno-authoritarianism-and-mass-surveillance
A federal judge ruled that fair use allowed Anthropic to train Claude on copyrighted books—but decided to send misguided “piracy” claims to trial. Facing copyright's ridiculous statutory penalties, Anthropic agreed to pay a record-breaking $1.5 billion to settle. https://www.nytimes.com/2025/09/05/technology/anthropic-settlement-copyright-ai.html
While digital ID is being pushed as the solution to users having to upload their IDs to every site they access, its security and privacy vary based on implementation. And when privacy is at stake, regulators must leave room for negotiation. https://www.eff.org/deeplinks/2025/09/verifying-trust-digital-id-still-incomplete
Thanks to Mississippi’s sweeping age verification law, residents just lost access to Bluesky and Dreamwidth. These mandates don’t rein in Big Tech—they entrench it. https://www.eff.org/deeplinks/2025/09/age-verification-windfall-big-tech-and-death-sentence-smaller-platforms
Axon’s AI police report writing tool is designed to not keep a record of which parts of a final report were written by AI and which parts were written by the officer. https://www.eff.org/deeplinks/2025/07/axons-draft-one-designed-defy-transparency
"The unsettling feeling that your device is spying on you is real — but the culprit isn't a secret microphone. It's the data broker industry," EFF’s @evacide tells CNET. https://www.cnet.com/tech/services-and-software/features/no-your-iphone-isnt-listening-to-you-heres-whats-really-happening/
“It’s important to avoid uploading photos (to ChatGPT or other AI) that you want to make sure nobody but you ever looks at,” EFF’s Jacob Hoffman-Andrews told @WSJ. Too many AI users assume a level of privacy that might not actually be there. https://www.wsj.com/tech/ai/chatgpt-photos-safety-83dd9b5b