A New Digital Dawn for Syrian Tech Users
For decades, U.S. sanctions on Syria have not only restricted trade and financial transactions; they have also severely limited Syrians’ access to digital technology. From software development tools to basic cloud services, Syrians were locked out of the global internet economy—stifling innovation, education, and entrepreneurship.
EFF has for many years pushed for sanctions exemptions for technology in Syria, as well as in Sudan, Iran, and Cuba. While civil society had early wins in securing general licenses for Iran and Sudan allowing the export of communications technologies, the conflict in Syria that began in 2011 made loosening of sanctions a pipe dream.
But recent changes to U.S. policy could mark the beginning of a shift. In a quiet yet significant move, the U.S. government has eased sanctions on Syria. On May 23, the Treasury Department issued General License 25, effectively allowing technology companies to provide services to Syrians. This decision could have an immediate and positive impact on the lives of millions of Syrian internet users—especially those working in the tech and education sectors.
A Legacy of Digital Isolation

For years, Syrians have found themselves barred from accessing even the most basic tools. U.S. sanctions meant that companies like Google, Apple, Microsoft, and Amazon—either by law or by cautious decisions taken to avoid potential penalties—restricted access to many of their services. Developers couldn’t access GitHub repositories or use Google Cloud; students couldn’t download software for virtual classrooms; and entrepreneurs struggled to build startups without access to payment gateways or secure infrastructure.
Such restrictions can put users in harm’s way; for instance, not being able to access the Google Play store from inside the country means that Syrians can’t easily download secure versions of everyday tools like Signal or WhatsApp, thus potentially subjecting their communications to surveillance.
These restrictions also compounded the difficulties of war, economic collapse, and internal censorship. Even when Syrian tech workers could connect with global communities, their participation was hampered by legal gray zones and technical blocks.
What the Sanctions Relief Changes

Under General License 25, companies will now be able to provide services to Syria that have never officially been available. While it may take time for companies to catch up with any regulatory changes, it is our hope that Syrians will soon be able to access and make use of technologies that will enable them to more freely communicate and rebuild.
For Syrian developers, the impact could be transformative. Restored access to platforms like GitHub, AWS, and Google Cloud means the ability to build, test, and deploy apps without the need for VPNs or workarounds. It opens the door to participation in global hackathons, remote work, and open-source communities—channels that are often lifelines for those in conflict zones. Students and educators stand to benefit, too. With sanctions eased, educational tools and platforms that were previously unavailable could soon be accessible. Entrepreneurs may also finally gain access to secure communications, e-commerce platforms, and the broader digital infrastructure needed to start and scale businesses. These developments could help jumpstart local economies.
Despite the good news, challenges remain. Major tech companies have historically been slow to respond to sanctions relief, often erring on the side of over-compliance to avoid liability. Many of the financial and logistical barriers—such as payment processing, unreliable internet, and ongoing conflict—will not disappear overnight.
Moreover, the lifting of sanctions is not a blanket permission slip; it’s a cautious opening. Any future geopolitical shifts or changes in U.S. foreign policy could once again cut off access, creating an uncertain digital future for Syrians.
Nevertheless, by removing barriers imposed by sanctions, the U.S. is taking a step toward recognizing that access to technology is not a luxury, but a necessity—even in sanctioned or conflict-ridden countries.
For Syrian users, the lifting of tech sanctions is more than a bureaucratic change—it’s a door, long closed, beginning to open. And for the international tech community, it’s an opportunity to re-engage, responsibly and thoughtfully, with a population that has been cut off from essential services for too long.
EFFecting Change: Pride in Digital Freedom
Join us for our next EFFecting Change livestream this Thursday! We're talking about emerging laws and platform policies that affect the digital privacy and free expression rights of the LGBT+ community, and how this echoes the experience of marginalized people across the world.
EFFecting Change Livestream Series: Pride in Digital Freedom
Thursday, June 12th
4:00 PM - 5:00 PM Pacific - Check Local Time
This event is LIVE and FREE!
Join our panel featuring EFF Senior Staff Technologist Daly Barnett, EFF Legislative Activist Rindala Alajaji, Chosen Family Law Center Senior Legal Director Andy Izenson, and Woodhull Freedom Foundation Chief Operations Officer Mandy Salley while they discuss what is happening and what should change to protect digital freedom.
We hope you and your friends can join us live! Be sure to spread the word, and share our past livestreams. Please note that all events will be recorded for later viewing on our YouTube page.
Want to make sure you don’t miss our next livestream? Here’s a link to sign up for updates about this series: eff.org/ECUpdates.
Congress Can Act Now to Protect Reproductive Health Data
State, federal, and international regulators are increasingly concerned about the harms they believe the internet and new technology are causing to users of all categories. Lawmakers are currently considering many proposals that are intended to provide protections to the most vulnerable among us. Too often, however, those proposals do not carefully consider the likely unintended consequences or even whether the law will actually reduce the harms it’s supposed to target. That’s why EFF supports Rep. Sara Jacobs’ newly reintroduced “My Body, My Data” Act, which will protect the privacy and safety of people seeking reproductive health care, while maintaining important constitutional protections and avoiding any erosion of end-to-end encryption.
Privacy fears should never stand in the way of healthcare. That's why this common-sense bill will require businesses and non-governmental organizations to act responsibly with personal information concerning reproductive health care. Specifically, it restricts them from collecting, using, retaining, or disclosing reproductive health information that isn't essential to providing the service someone requests.
These restrictions apply to companies that collect personal information related to a person’s reproductive or sexual health. That includes data related to pregnancy, menstruation, surgery, termination of pregnancy, contraception, basal body temperature or diagnoses. The bill would protect people who, for example, use fertility or period-tracking apps or are seeking information about reproductive health services.
We are proud to join Center for Democracy and Technology, Electronic Privacy Information Center, National Partnership for Women & Families, Planned Parenthood Federation of America, Reproductive Freedom for All, Physicians for Reproductive Health, National Women’s Law Center, National Abortion Federation, Catholics for Choice, National Council for Jewish Women, Power to Decide, United for Reproductive & Gender Equity, Indivisible, Guttmacher, National Network of Abortion Funds, and All* Above All in support of this bill.
In addition to the restrictions on company data processing, this bill also provides people with necessary rights to access and delete their reproductive health information. Companies must also publish a privacy policy, so that everyone can understand what information companies process and why. It also ensures that companies are held to public promises they make about data protection and gives the Federal Trade Commission the authority to hold them to account if they break those promises.
The bill also lets people take on companies that violate their privacy with a strong private right of action. Empowering people to bring their own lawsuits not only places more control in the individual's hands, but also ensures that companies will not take these regulations lightly.
Finally, while Rep. Jacobs' bill establishes an important national privacy foundation for everyone, it also leaves room for states to pass stronger or complementary laws to protect the data privacy of those seeking reproductive health care.
We thank Rep. Jacobs and Sens. Mazie Hirono and Ron Wyden for taking up this important bill and using it as an opportunity not only to protect those seeking reproductive health care, but also to highlight why data privacy is an important element of reproductive justice.
Oppose STOP CSAM: Protecting Kids Shouldn’t Mean Breaking the Tools That Keep Us Safe
A Senate bill re-introduced this week threatens security and free speech on the internet. EFF urges Congress to reject the STOP CSAM Act of 2025 (S. 1829), which would undermine services offering end-to-end encryption and force internet companies to take down lawful user content.
Tell Congress Not to Outlaw Encrypted Apps
As in the version introduced last Congress, S. 1829 purports to limit the online spread of child sexual abuse material (CSAM), also known as child pornography. CSAM is already highly illegal. Existing law already requires online service providers who have actual knowledge of “apparent” CSAM on their platforms to report that content to the National Center for Missing and Exploited Children (NCMEC). NCMEC then forwards actionable reports to law enforcement agencies for investigation.
S. 1829 goes much further than current law and threatens to punish any service that works to keep its users secure, including those that do their best to eliminate and report CSAM. The bill applies to “interactive computer services,” which broadly includes private messaging and email apps, social media platforms, cloud storage providers, and many other internet intermediaries and online service providers.
The Bill Threatens End-to-End Encryption

The bill makes it a crime to intentionally “host or store child pornography” or knowingly “promote or facilitate” the sexual exploitation of children. The bill also opens the door for civil lawsuits against providers for the intentional, knowing or even reckless “promotion or facilitation” of conduct relating to child exploitation, the “hosting or storing of child pornography,” or for “making child pornography available to any person.”
The terms “promote” and “facilitate” are broad, and civil liability may be imposed under a low “recklessness” state-of-mind standard. This means a court could find an app or website liable for hosting CSAM even if the service did not know the material was there, for example because the provider employed end-to-end encryption and could not view the content its users uploaded.
Creating new criminal and civil claims against providers based on broad terms and low standards will undermine digital security for all internet users. Because the law already prohibits the distribution of CSAM, the bill’s broad terms could be interpreted as reaching more passive conduct, like merely providing an encrypted app.
Due to the nature of their services, encrypted communications providers who receive a notice of CSAM may be deemed to have “knowledge” under the criminal law even if they cannot verify and act on that notice. And there is little doubt that plaintiffs’ lawyers will (wrongly) argue that merely providing an encrypted service that can be used to store any image—not necessarily CSAM—recklessly facilitates the sharing of illegal content.
Affirmative Defense Is Expensive and Insufficient

While the bill includes an affirmative defense that a provider can raise if it is “technologically impossible” to remove the CSAM without “compromising encryption,” it is not sufficient to protect our security. Online services that offer encryption shouldn’t have to face the impossible task of proving a negative in order to avoid lawsuits over content they can’t see or control.
First, by making this protection an affirmative defense, providers must still defend against litigation, with significant costs to their business. Not every platform will have the resources to fight these threats in court, especially newcomers that compete with entrenched giants like Meta and Google. Encrypted platforms should not have to rely on prosecutorial discretion or favorable court rulings after protracted litigation. Instead, specific exemptions for encrypted providers should be addressed in the text of the bill.
Second, although technologies like client-side scanning break encryption, members of Congress have misleadingly claimed otherwise. Plaintiffs are likely to argue that providers who do not use these techniques are acting recklessly, leading many apps and websites to scan all of the content on their platforms and remove any content that a state court could find, even wrongfully, is CSAM.
Tell Congress Not to Outlaw Encrypted Apps
The Bill Threatens Free Speech by Creating a New Exception to Section 230

The bill allows a new type of lawsuit to be filed against internet platforms, accusing them of “facilitating” child sexual exploitation based on the speech of others. It does this by creating an exception to Section 230, the foundational law of the internet and online speech. Section 230 provides partial immunity to internet intermediaries when sued over content posted by their users. Without that protection, platforms are much more likely to aggressively monitor and censor users.
Section 230 creates the legal breathing room for internet intermediaries to create online spaces for people to freely communicate around the world, with low barriers to entry. However, creating a new exception that exposes providers to more lawsuits will cause them to limit that legal exposure. Online services will censor more and more user content and accounts, with minimal regard as to whether that content is in fact legal. Some platforms may even be forced to shut down or may not even get off the ground in the first place, for fear of being swept up in a flood of litigation and claims around alleged CSAM. On balance, this harms all internet users who rely on intermediaries to connect with their communities and the world at large.
Despite Changes, A.B. 412 Still Harms Small Developers
California lawmakers are continuing to promote a bill that will reinforce the power of giant AI companies by burying small AI companies and non-commercial developers in red tape, copyright demands, and, potentially, lawsuits. After several amendments, the bill hasn’t improved much, and in some ways has actually gotten worse. If A.B. 412 is passed, it will make California’s economy less innovative and less competitive.
The Bill Threatens Small Tech Companies

A.B. 412 masquerades as a transparency bill, but it’s actually a government-mandated “reading list” that will allow rights holders to file a new type of lawsuit in state court, even as the federal courts continue to assess whether and how federal copyright law applies to the development of generative AI technologies.
The bill would require developers—even two-person startups—to keep lists of training materials that are “registered, pre-registered or indexed” with the U.S. Copyright Office, and help rights holders create digital “fingerprints” of those works—a technical task with no established standards and no realistic path for small teams to follow. Even if it were limited to registered copyrighted material, that’s a monumental task, as we explained last month when we examined the earlier text of A.B. 412.
The bill’s amendments have made compliance even harder, since it now requires technologists to go beyond copyrighted material and somehow identify “pre-registered” copyrights. The amended bill also has new requirements that demand technologists document and keep track of when they look at works that aren’t copyrighted but are subject to exclusive rights, such as pre-1972 sound recordings—rights that, not coincidentally, are primarily controlled by large entertainment companies.
The penalties for noncompliance are steep—up to $1,000 per day per violation—putting small developers at enormous financial risk even for accidental lapses.
The goal of this list is clear: for big content companies to more easily file lawsuits against software developers, big and small. And for most AI developers, the burden will be crushing. Under A.B. 412, a two-person startup building an open-source chatbot, or an indie developer fine-tuning a language model for disability access, would face the same compliance burdens as Google or Meta.
Reading and Analyzing the Open Web Is Not a Crime

It’s critical to remember that AI training is very likely protected by fair use under U.S. copyright law—a point that’s still being worked out in the courts. The idea that we should preempt that process with sweeping state regulation is not just premature; it’s dangerous.
It’s also worth noting that copyright is governed by federal law. Federal courts are already working to define the boundaries of fair use and copyright in the AI context—the California legislature should let them do their job. A.B. 412 tries to create a state-level regulatory scheme in an area that belongs in federal hands—a risky legal overreach that could further complicate an already unsettled policy space.
A.B. 412 is a solution in search of a problem. The courthouse doors are far from closed to content owners who want to dispute the use of their copyrighted works. Multiple high-profile lawsuits over the copyright status of AI training are working their way through trial and appeals courts right now.
Scope Creep

Rather than narrowing its focus to make compliance more realistic, the latest amendments to A.B. 412 actually expand the scope of covered works. The bill now demands documentation of obscure categories of content like pre-1972 sound recordings. These recordings have rights that are often murky, and largely controlled by major media companies.
The bill also adds “preregistered” and indexed works to its coverage. Preregistration, designed to help entertainment companies punish unauthorized copying even before commercial release, expands the universe of content that developers must track—without offering any meaningful help to small creators.
A Moat Serving Big Tech

Ironically, the companies that will benefit most from A.B. 412 are the very same large tech firms that lawmakers often claim they want to regulate. Big companies can hire teams of lawyers and compliance officers to handle these requirements. Small developers? They’re more likely to shut down, sell out, or never enter the field in the first place.
This bill doesn’t create a fairer marketplace. It builds a regulatory moat around the incumbents, locking out new competitors and ensuring that only a handful of companies have the resources to develop advanced AI systems. Truly innovative technology often comes from unknown or small companies, but A.B. 412 threatens to turn California—and anyone who does business there—into a fortress where only the biggest players survive.
A Lopsided Bill

A.B. 412 is becoming an increasingly extreme and one-sided piece of legislation. It’s a maximalist wishlist for legacy rights-holders, delivered at the expense of small developers and the public. The result will be less competition, less innovation, and fewer choices for consumers—not more protection for creators.
This new version does close a few loopholes, and expands the period for AI developers to respond to copyright demands from 7 days to 30 days. But it seriously fails to close others: for instance, the exemption for noncommercial development applies only to work done “exclusively for noncommercial academic or governmental” institutions. That still leaves a huge window to sue hobbyists and independent researchers who don’t have university or government jobs.
While the bill nominally exempts developers who use only public or developer-owned data, that’s a carve-out with no practical value. Like a search engine, nearly every meaningful AI system relies on mixed sources—and developers can’t realistically track the copyright status of them all.
At its core, A.B. 412 is a flawed bill that would harm the whole U.S. tech ecosystem. Lawmakers should be advancing policies that protect privacy, promote competition, and ensure that innovation benefits the public—not just a handful of entrenched interests.
If you’re a California resident, now is the time to speak out. Tell your legislators that A.B. 412 will hurt small companies, help big tech, and lock California’s economy in the past.
35 Years for Your Freedom Online
Once upon a time we were promised flying cars and jetpacks. Yet we've arrived at a more complicated timeline where rights advocates can find themselves defending our hard-earned freedoms more often than shooting for the moon. In tough times, it's important to remember that your vision for the future can be just as valuable as the work you do now.
Thirty-five years ago, a small group of folks saw the coming digital future and banded together to ensure that technology would empower people, not oppress them—and EFF was born. While the dangers of corporate and state forces grew alongside the internet, EFF and supporters like you faithfully rose to the occasion. Will you help celebrate EFF’s 35th anniversary and donate in support of digital freedom?
Protect Online Privacy & Free Expression
Together we’ve won many fights for encryption, free speech, innovation, and privacy online. Yet it’s plain to see that we must keep advocating for technology users whether that’s in the courts, before lawmakers, educating the public, or creating privacy-enhancing tools. EFF members make it possible—you can lend a hand and get some great perks!
Summer Swag Is Here

We love making stuff for EFF’s members each year. It’s our way of saying thanks for supporting the mission for your rights online, and I hope it’s your way of starting a conversation about internet freedom with people in your life.
Celebrate EFF's 35th Anniversary in the digital rights movement with this EFF35 Cityscape member t-shirt by Hugh D’Andrade! EFF has a not-so-secret weapon that keeps us in the fight even when the odds are against us: we never lose sight of our vision for a better future. Choose a roomy Classic Fit Crewneck or a soft Slim Fit V-Neck.
And enjoy Lovelace-Klimtian vibes on EFF’s new Motherboard Hooded Sweatshirt by Shirin Mori. Gold details and orange poppies pop on lush forest green. Don't lose the forest for the trees—keep fighting for a world where tech supports people irl.
Join the Sustaining Donor Challenge (it’s easy)

You'll get a numbered EFF35 Challenge Coin when you become a monthly or annual Sustaining Donor by July 10. It’s that simple.
If you're already a Sustaining Donor—THANKS! You too can get an EFF 35th Anniversary Challenge Coin when you upgrade your donation. Just increase your monthly or annual gift and let us know by emailing upgrade@eff.org. Get started at eff.org/recurring or go to your PayPal account if you used one.
Support internet freedom with a no-fuss automated recurring donation! Over 30% of EFF members have joined as Sustaining Donors to defend digital rights (and get some great swag every year). Challenge coins follow a long tradition of offering a symbol of kinship and respect for great achievements—and EFF owes its strength to technology creators and users like you.
With your help, EFF is here to stay.
Protect Online Privacy & Free Expression
NYC Lets AI Gamble with Child Welfare
The Markup revealed in its reporting last month that New York City’s Administration for Children’s Services (ACS) has been quietly deploying an algorithmic tool to categorize families as “high risk.” Using a grab-bag of factors like neighborhood and mother’s age, this AI tool can put families under intensified scrutiny without proper justification or oversight.
ACS knocking on your door is a nightmare for any parent, with the risk that any mistakes can break up your family and have your children sent to the foster care system. Putting a family under such scrutiny shouldn’t be taken lightly and shouldn’t be a testing ground for automated decision-making by the government.
This “AI” tool, developed internally by ACS’s Office of Research Analytics, scores families for “risk” using 279 variables and subjects those deemed highest-risk to intensified scrutiny. The lack of transparency, accountability, or due process protections demonstrates that ACS has learned nothing from the failures of similar products in the realm of child services.
The algorithm operates in complete secrecy, and the harms from this opaque “AI theater” are not theoretical. The 279 variables were derived solely from cases in 2013 and 2014 in which children were seriously harmed. However, it is unclear how many cases were analyzed, what kind of auditing and testing (if any) was conducted, and whether including data from other years would have altered the scoring.
What we do know is disturbing: Black families in NYC face ACS investigations at seven times the rate of white families and ACS staff has admitted that the agency is more punitive towards Black families, with parents and advocates calling its practices “predatory.” It is likely that the algorithm effectively automates and amplifies this discrimination.
Despite the disturbing lack of transparency and accountability, ACS’s usage of this system has subjected families that this system ranks as “highest risk” to additional scrutiny, including possible home visits, calls to teachers and family, or consultations with outside experts. But those families, their attorneys, and even caseworkers don't know when and why the system flags a case, making it difficult to challenge the circumstances or process that leads to this intensified scrutiny.
This is not the only instance in which the use of AI tools in the child services system has run into systemic bias. Back in 2022, the Associated Press reported that Carnegie Mellon researchers found that from August 2016 to May 2018, Allegheny County in Pennsylvania used an algorithmic tool that flagged 32.5% of Black children for “mandatory” investigation compared to just 20.8% of white children, all while social workers disagreed with the algorithm's risk scores about one-third of the time.
The Allegheny system operates with the same toxic combination of secrecy and bias now plaguing NYC. Families and their attorneys can never know their algorithmic scores, making it impossible to challenge decisions that could destroy their lives. When a judge asked to see a family’s score in court, the county resisted, claiming it didn't want to influence legal proceedings with algorithmic numbers, which suggests that the scores are too unreliable for judicial scrutiny yet acceptable for targeting families.
Elsewhere these biased systems were successfully challenged. The developers of the Allegheny tool had already had their product rejected in New Zealand, where researchers correctly identified that the tool would likely result in more Māori families being tagged for investigation. Meanwhile, California spent $195,273 developing a similar tool before abandoning it in 2019 due in part to concerns about racial equity.
Governmental deployment of automated and algorithmic decision making not only perpetuates social inequalities, but removes mechanisms for accountability when agencies make mistakes. The state should not be using these tools for rights-determining decisions and any other uses must be subject to vigorous scrutiny and independent auditing to ensure the public’s trust in the government’s actions.
Criminalizing Masks at Protests Is Wrong
A growing number of states have been attempting to criminalize the wearing of face coverings while attending protests. Now the President has demanded, in the context of ongoing protests in Los Angeles: “ARREST THE PEOPLE IN FACE MASKS, NOW!”
But the truth is: whether you are afraid of catching an airborne illness from your fellow protestors, or you are concerned about reprisals from police or others for expressing your political opinions in public, you should have the right to wear a mask. Attempts to criminalize masks at protests fly in the face of a right to privacy.
In terms of public health, wearing a mask while in a crowd can be a valuable tool to prevent the spread of communicable illnesses. This can be essential for people with compromised immune systems who still want to exercise their First Amendment-protected right to protest.
Moreover, wearing a mask is a perfectly legitimate surveillance self-defense practice during a protest. There has been a massive proliferation of surveillance camera networks, face recognition technology, and databases of personal information. There is also a long history of law enforcement harassing and surveilling people for publicly criticizing or opposing law enforcement practices and other government policies. What’s more, non-governmental actors may try to identify protesters in order to retaliate against them, for example, by limiting their employment opportunities.
All of this may chill our willingness to speak publicly or attend a protest in a cause we believe in. Many people would be less willing to attend a rally or march if they know that a drone or helicopter, equipped with a camera, will take repeated passes over the crowd, and police later will use face recognition to scan everyone’s faces and create a list of protest attendees. This would make many people rightfully concerned about surveillance and harassment from law enforcement.
Anonymity is a fundamental human right. EFF has long advocated for anonymity online. We’ve also supported low-tech methods to protect our anonymity from high-tech snooping in public places; for example, we’ve supported legislation to allow car owners to use license plate covers when their cars are parked to reduce their exposure to ALPRs.
A word of caution. No surveillance self-defense technique is perfect. Technology companies are trying to develop ways to use face recognition technology to identify people wearing masks. But if somebody wants to hide their face to try to avoid government scrutiny, the government should not punish them.
While members of the public have a right to wear a mask when they protest, law enforcement officials should not wear a mask when they arrest protesters and others. An elementary principle of police accountability is to require uniformed officers to identify themselves to the public; this discourages officer misconduct, and facilitates accountability if an officer violates the law. This is one reason EFF has long supported the First Amendment right to record on-duty police, including ICE officers.
For these reasons, EFF believes it is wrong for state legislatures, and now federal law enforcement, to try to criminalize or punish mask wearing at protests. It is especially wrong in moments like the present, when the government is taking extreme measures to crack down on the civil liberties of protesters.
Privacy Victory! Judge Grants Preliminary Injunction in OPM/DOGE Lawsuit
NEW YORK–In a victory for personal privacy, a New York federal district court judge today granted a preliminary injunction in a lawsuit challenging the U.S. Office of Personnel Management’s (OPM) disclosure of records to DOGE and its agents.
Judge Denise L. Cote of the U.S. District Court for the Southern District of New York found that OPM violated the Privacy Act and bypassed its established cybersecurity practices under the Administrative Procedure Act. The court will decide the scope of the injunction later this week. The plaintiffs have asked the court to halt DOGE agents’ access to OPM records and for DOGE and its agents to delete any records that have already been disclosed. OPM’s databases hold highly sensitive personal information about tens of millions of federal employees, retirees, and job applicants.
“The plaintiffs have shown that the defendants disclosed OPM records to individuals who had no legal right of access to those records,” Cote found. “In doing so, the defendants violated the Privacy Act and departed from cybersecurity standards that they are obligated to follow. This was a breach of law and of trust. Tens of millions of Americans depend on the Government to safeguard records that reveal their most private and sensitive affairs.”
The Electronic Frontier Foundation (EFF), Lex Lumina LLP, Democracy Defenders Fund, and The Chandra Law Firm requested the injunction as part of their ongoing lawsuit against OPM and DOGE on behalf of two labor unions and individual current and former government workers across the country. The lawsuit’s union plaintiffs are the American Federation of Government Employees AFL-CIO and the Association of Administrative Law Judges, International Federation of Professional and Technical Engineers Judicial Council 1 AFL-CIO.
The lawsuit argues that OPM and OPM Acting Director Charles Ezell illegally disclosed personnel records to DOGE agents in violation of the Administrative Procedure Act and the federal Privacy Act of 1974, a watershed anti-surveillance statute that prevents the federal government from abusing our personal information. In addition to seeking to permanently halt the disclosure of further OPM data to DOGE, the lawsuit asks for the deletion of any data previously disclosed by OPM to DOGE.
The federal government is the nation’s largest employer, and the records held by OPM represent one of the largest collections of sensitive personal data in the country. In addition to personally identifiable information such as names, Social Security numbers, and demographic data, these records include work information like salaries and union activities; personal health records and information regarding life insurance and health benefits; financial information like death benefit designations and savings programs; nondisclosure agreements; and information concerning family members and other third parties referenced in background checks and health records.
OPM holds these records for tens of millions of Americans, including current and former federal workers and those who have applied for federal jobs. OPM has a history of privacy violations—an OPM breach in 2015 exposed the personal information of 22.1 million people—and its recent actions make its systems less secure.
With few exceptions, the Privacy Act limits the disclosure of federally maintained sensitive records on individuals without the consent of the individuals whose data is being shared. It protects all Americans from harms caused by government stockpiling of our personal data. This law was enacted in 1974, the last time Congress acted to limit the data collection and surveillance powers of an out-of-control President.
A number of courts have already found that DOGE’s activities at other agencies likely violate the law, including at the Social Security Administration and the Treasury Department.
For the preliminary injunction: https://www.eff.org/document/afge-v-opm-opinion-and-order-granting-preliminary-injunction
For the complaint: https://www.eff.org/document/afge-v-opm-complaint
For more about the case: https://www.eff.org/cases/american-federation-government-employees-v-us-office-personnel-management
Contacts:
Electronic Frontier Foundation: press@eff.org
Lex Lumina LLP: Managing Partner Rhett Millsaps, rhett@lex-lumina.com
Victory! Austin Organizers Cancel City's Flock ALPR Contract
Austin organizers turned out to rebuke the city’s misguided contract with Flock Safety—and won. This successful pushback from the community means that at the end of the month, Austin police will no longer be able to use the surveillance network of automated license plate readers (ALPRs) across the city.
Two years ago, Austin City Council approved this controversial contract despite strong local opposition. We knew then that these AI-driven surveillance systems weren’t just creepy; they are prone to misuse and mistakes that have a real human toll.
In the years since, this concern has materialized time and time again, and the risks have only heightened with the potential use of the data against immigrants and people seeking trans or reproductive healthcare. Most recently, Texas authorities were implicated in a 404 Media report on the use of these cameras to target abortion seekers.
Just a few days before the scheduled vote, an audit of the Austin Police Department program also revealed that over 20% of ALPR database searches lacked proper documentation or justification, in violation of department policy. The audit also revealed contract language allowed for data retention beyond council-mandated limits on retention and potential sharing with outside agencies.
Fortunately, more than 30 community groups, including Electronic Frontier Alliance member EFF-Austin, joined forces to successfully prevent contract renewal.
EFF-Austin Executive Director Kevin Welch told us that, "Today's victory in Austin is a tribute to what happens when a coalition of activist groups come together in common cause and stand in solidarity against the expansion of the surveillance state.” He went on to say, “But the fight is not over. While the Flock contract has been discontinued, Austin still makes use of ALPRs via its contract with Axon, and [the] council may attempt to bring this technology back [...] That being said, real progress in educating elected officials on the dangers of these technologies has been made.”
This win in a city as large as Austin lends momentum to the larger trend across the country where local communities are pushing back against ALPR surveillance. EFF continues to stand with these local efforts, and encourages other organizers to reach out at organizing [at] eff.org in the fight against local surveillance.
Speaking to this trend, Kevin added, “As late as Monday, it didn't look like we had the votes to make this victory happen. While these are dark times, there are still lights burning in the dark, and through collective action, we can burn bright."
EFF to Department of Homeland Security: No Social Media Surveillance of Immigrants
EFF submitted comments to the Department of Homeland Security (DHS) and its subcomponent U.S. Citizenship and Immigration Services (USCIS), urging them to abandon a proposal to collect social media identifiers on forms for immigration benefits. This collection would mark yet a further expansion of the government’s efforts to subject immigrants to social media surveillance, invading their privacy and chilling their free speech and associational rights for fear of being denied key immigration benefits.
Specifically, the proposed rule would require applicants to disclose their social media identifiers on nine immigration forms, including applications for permanent residency and naturalization, impacting more than 3.5 million people annually. USCIS’s purported reason for this collection is to assist with identity verification, as well as vetting and national security screening, to comply with Executive Order 14161. USCIS separately announced that it would look for “antisemitic activity” on social media as grounds for denying immigration benefits, which appears to be related to the proposed rule, although it is not expressly included in it.
Additionally, a day after the proposed rule was published, Axios reported that the State Department, the Department of Justice, and DHS confirmed a joint collaboration called “Catch and Revoke,” using AI tools to review student visa holders’ social media accounts for speech related to “pro-Hamas” sentiment or “antisemitic activity.”
If the proposed rule sounds familiar, it’s because this is not the first time the government has proposed the collection of social media identifiers to monitor noncitizens. In 2019, for example, the State Department implemented a policy requiring visa and visa waiver applicants to the United States to disclose the identifiers they used on some 20 social media platforms over the last five years—affecting over 14.7 million people annually. EFF joined a large contingent of civil and human rights organizations in objecting to that collection. That policy is now the subject of ongoing litigation in Doc Society v. Blinken, a case brought by two documentary film organizations, who argue that the rule affects the expressive and associational rights of their members by impeding their ability to collaborate and engage with filmmakers around the world. EFF filed two amicus briefs in that case.
What distinguishes this proposed rule from the State Department’s existing program is that most, if not all, of the noncitizens who would be affected currently legally reside in the United States, allowing them to benefit from constitutional protections.
In our comments, we explained that surveillance of even public-facing social media can implicate privacy interests by aggregating a wealth of information about both an applicant for immigration benefits, and also people in their networks, including U.S. citizens. This is because of the quantity and quality of information available on social media, and because of its inherent interconnected nature.
We also argued that the proposed rule appears to allow for the collection and consideration of First Amendment-protected speech, including core political speech, and anonymous and pseudonymous speech. This inevitably leads to a chilling effect because immigration benefits applicants will have to choose between potentially forgoing key benefits or self-censoring to avoid government scrutiny. That is, to help ensure that a naturalized citizenship application is not rejected, for example, an applicant may avoid speaking out on social media about American foreign policy or expressing views about other political topics that may be considered controversial by the federal government—even when other Americans are free to do so.
We urge DHS and USCIS to abandon this dangerous proposal.
EFF to Court: Young People Have First Amendment Rights
Utah cannot stifle young people’s First Amendment rights to use social media to speak about politics, create art, discuss religion, or to hear from other users discussing those topics, EFF argued in a brief filed this week.
EFF filed the brief in NetChoice v. Brown, a constitutional challenge to the Utah Minor Protection in Social Media Act. The law prohibits young people from speaking to anyone on social media outside of the users with whom they are connected or those users’ connections. It also requires social media services to make young people’s accounts invisible to anyone outside of that same subgroup of users. The law requires parents to consent before minors can change those default restrictions.
To implement these restrictions, the law requires a social media service to verify every user’s age so that it knows whether to apply those speech-restricting settings.
The law therefore burdens the First Amendment rights of both young people and adults, the friend-of-the-court brief argued. The ACLU, Freedom to Read Foundation, LGBT Technology Institute, TechFreedom, and Woodhull Freedom Foundation joined EFF on the brief.
Utah, like many states across the country, has sought to significantly restrict young people’s ability to use social media. But “Minors enjoy the same First Amendment right as adults to access and engage in protected speech on social media,” the brief argues. As the brief details, minors use social media to express political opinions, create art, practice religion, and find community.
Utah cannot impose such a severe restriction on minors’ ability to speak and to hear from others on social media without violating the First Amendment. “Utah has effectively blocked minors from being able to speak to their communities and the larger world, frustrating the full exercise of their First Amendment rights,” the brief argues.
Moreover, the law “also violates the First Amendment rights of all social media users—minors and adults alike—because it requires every user to prove their age, and compromise their anonymity and privacy, before using social media.”
Requiring internet users to provide their ID or other proof of their age could block people from accessing lawful speech if they don’t have the right form of ID, the brief argues. And requiring users to identify themselves infringes on people’s right to be anonymous online. That may deter people from joining certain social media services or speaking on certain topics, as people often rely on anonymity to avoid retaliation for their speech.
Finally, requiring users to provide sensitive personal information increases their risk of future privacy and security invasions, the brief argues.
Keeping the Web Up Under the Weight of AI Crawlers
If you run a site on the open web, chances are you've noticed a big increase in traffic over the past few months, whether or not your site has been getting more viewers, and you're not alone. Operators everywhere have observed a drastic increase in automated traffic—bots—and in most cases attribute much or all of this new traffic to AI companies.
Background

AI—in particular, Large Language Models (LLMs) and generative AI (genAI)—relies on compiling as much information from relevant sources (e.g., “texts written in English” or “photographs”) as possible in order to build a functional and persuasive model that users will later interact with. While AI companies in part distinguish themselves by what data their models are trained on, possibly the greatest source of information—one freely available to all of us—is the open web.
To gather up all that data, companies and researchers use automated programs called scrapers (sometimes referred to by the more general term "bots") to "crawl" over the links available between various webpages and save the types of information they're tasked with as they go. Scrapers are tools with a long, and often beneficial, history: services like search engines, the Internet Archive, and all kinds of scientific research rely on them.
When scrapers are not deployed thoughtfully, however, they can contribute to higher hosting costs, lower performance, and even site outages, particularly when site operators face many of them operating at the same time. In the long run, all this may lead some sites to shut down rather than bear the brunt.
For-profit AI companies must ensure they do not poison the well of the open web they rely on in a short-sighted rush for training data.
Bots: Read the Room

There are existing best practices that those who use scrapers should follow. When bots and their operators ignore these guideposts, they signal to site operators, sometimes explicitly, that their access can or should be cut off or impeded; in the worst case, such behavior may take a site down for all users. Some companies appear to follow these practices most of the time, but we see increasing reports and evidence of new bots that don't.
First, scrapers should follow instructions given in a site's robots.txt file, whether those are to back off to a certain crawling rate, exclude certain paths, or not to crawl the site at all.
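For illustration, a minimal robots.txt might look like the following (the bot name and paths are hypothetical, and the nonstandard Crawl-delay directive is honored by some, but not all, crawlers):

```
# Hypothetical example; bot name and paths are illustrative
User-agent: ExampleAIBot
Crawl-delay: 10      # ask this bot to wait 10 seconds between requests
Disallow: /search    # keep it off expensive, dynamically generated pages

User-agent: *
Disallow: /admin
```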
Second, bots should send their requests with a clearly labeled User Agent string which indicates their operator, their purpose, and a means of contact.
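A well-behaved crawler might, for example, send a header along these lines (the name, URL, and email address are hypothetical):

```
User-Agent: ExampleAIBot/1.2 (+https://example.com/bot.html; crawl-feedback@example.com)
```

An operator who sees this in their logs immediately knows who is crawling, for what purpose, and where to send a back-off request or complaint.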
Third, those running scrapers should provide a process for site operators to request back-offs, rate caps, and exclusions, and to report problematic behavior, via the contact information or response forms linked from the User Agent string.
Mitigations for Site Operators

Of course, if you're running a website dealing with a flood of crawling traffic, waiting for those bots to change their behavior for the better might not be realistic. Here are a few suggested, if imperfect, mitigations based in part on our own sometimes frustrating experiences.
First, use a caching layer. In most cases a Content Delivery Network (CDN) or an "edge platform" (essentially a newer iteration of a CDN) can provide this for you, and some services offer a free tier for non-commercial users. There are also a number of great projects if you prefer to self-host. Some of the tools we've used for caching include Varnish, memcached, and Redis.
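As a rough sketch of the self-hosted route, a minimal Varnish configuration like the one below caches backend responses so bursts of crawler traffic are served from memory instead of hitting your application (the backend address and TTL are assumptions to adapt):

```
# Minimal Varnish VCL sketch; host, port, and TTL are illustrative
vcl 4.1;

backend default {
    .host = "127.0.0.1";   # your application server
    .port = "8080";
}

sub vcl_backend_response {
    # Cache responses for five minutes so repeated bot fetches
    # are absorbed by the cache rather than the backend
    set beresp.ttl = 5m;
}
```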
Second, convert to static content to prevent resource-intensive database reads. In some cases this may reduce the need for caching.
Third, use targeted rate limiting to slow down bots without taking your whole site down. But know that this can get difficult when scrapers try to disguise themselves with misleading User Agent strings or by spreading a fleet of crawlers out across many IP addresses.
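One common form this takes, sketched here with nginx (the zone name, rate, and upstream address are illustrative assumptions):

```
# Illustrative nginx rate limiting; tune the values for your site.
# limit_req_zone belongs in the http {} context.
limit_req_zone $binary_remote_addr zone=perip:10m rate=2r/s;

server {
    listen 80;
    location / {
        # Allow short bursts, then hold each client IP to 2 requests/second
        limit_req zone=perip burst=20 nodelay;
        proxy_pass http://127.0.0.1:8080;
    }
}
```

Because this keys on IP address, a distributed crawler fleet can still slip under the limit, which is one reason caching remains the first line of defense.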
Other mitigations such as client-side validation (e.g. CAPTCHAs or proof-of-work) and fingerprinting carry privacy and usability trade-offs, and we warn against deploying them without careful forethought.
Where Do We Go From Here?

To reiterate, whatever one's opinion of these particular AI tools, scraping itself is not the problem. Automated access is a fundamental technique of archivists, computer scientists, and everyday users that we hope is here to stay—as long as it can be done non-destructively. However, we realize that not all implementers will follow our suggestions for bots above, and that our mitigations are both technically advanced and incomplete.
Because we see so many bots operating for the same purpose at the same time, it seems there's an opportunity here to provide these automated data consumers with tailored data providers, removing the need for every AI company to scrape every website, seemingly, every day.
And on the operators' end, we hope to see more web-hosting and framework technology that is built with an awareness of these issues from day one, perhaps building in responses like just-in-time static content generation or dedicated endpoints for crawlers.
EFF to the FTC: DMCA Section 1201 Creates Anti-Competitive Regulatory Barriers
As part of a multi-pronged effort towards deregulation, the Federal Trade Commission has asked the public to identify any and all “anti-competitive” regulations. Working with our friends at Authors Alliance, EFF answered, calling attention to a set of anti-competitive regulations that many don’t recognize as such: the triennial exemptions to Section 1201 of the Digital Millennium Copyright Act, and the cumbersome process on which they depend.
Copyright grants exclusive rights to creators, but only as a means to serve the broader public interest. Fair use and other limitations play a critical role in that service by ensuring that the public can engage in commentary, research, education, innovation, and repair without unjustified restriction. Section 1201 effectively forbids fair uses where those uses require circumventing a software lock (a.k.a. technological protection measures) on a copyrighted work.
Congress realized that Section 1201 had this effect, so it adopted a safety valve—a triennial process by which the Library of Congress could grant exemptions. Under the current rulemaking framework, however, this intended safety valve functions more like a chokepoint. Individuals and organizations seeking an exemption to engage in lawful fair use must navigate a burdensome, time-consuming administrative maze. The existing procedural and regulatory barriers ensure that the rulemaking process—and Section 1201 itself—thwarts, rather than serves, the public interest.
The FTC does not, of course, control Congress or the Library of Congress. But we hope its investigation and any resulting report on anti-competitive regulations will recognize the negative effects of Section 1201 and that the triennial rulemaking process has failed to be the check Congress intended. Our comments urge the FTC to recommend that Congress repeal or reform Section 1201. At a minimum, the FTC should advocate for fundamental revisions to the Library of Congress’s next triennial rulemaking process, set for 2026, so that copyright law can once again fulfill its purpose: to support—rather than thwart—competitive and independent innovation.
You can find the full comments here.
The Dangers of Consolidating All Government Information
The Trump administration has been heavily invested in consolidating all of the government’s information into a single searchable, or perhaps AI-queryable, super database. The compiling of all of this information is being done with the dubious justification of efficiency and modernization. However, in many cases this information was originally siloed for important reasons: to protect your privacy, to prevent different branches of government from using sensitive data to punish or harass you, and to preserve trust in, and the legitimacy of, important civic institutions.
This process of consolidation has taken several forms. The purported Department of Government Efficiency (DOGE) has been seeking access to the data and computer systems of dozens of government agencies. According to one report, access to the data of these agencies had given DOGE, as of April 2025, hundreds of pieces of personal information about people living in the United States—everything from financial and tax information to health and healthcare information, and even computer IP addresses. EFF is currently engaged in a lawsuit against the U.S. Office of Personnel Management (OPM) and DOGE for disclosing personal information about government employees to people who don’t need it, in violation of the Privacy Act of 1974.
Another key maneuver in centralizing government information has been to steamroll the protections that were put in place to keep this information away from agencies that don’t need it or could abuse it. This has been done by ignoring the law, as the Trump administration did when it ordered the IRS to make tax information available for the purposes of immigration enforcement. It has also been done through the creation of new (and questionable) executive mandates that all executive branch information be made available to the White House or any other agency. Specifically, this has been attempted with the March 20, 2025 Executive Order, “Stopping Waste, Fraud, and Abuse by Eliminating Information Silos,” which mandates that the federal government, as well as all 50 state governments, allow other agencies “full and prompt access to all unclassified agency records, data, software systems, and information technology systems.” But executive orders can’t override privacy laws passed by Congress.
Not only is the Trump administration trying to consolidate all of this data institutionally and statutorily, it is also trying to do so technologically. A new report revealed that the administration has contracted with Palantir—the surveillance and security data-analytics firm—to fuse data from multiple agencies, including the Department of Homeland Security and Health and Human Services.
The consolidation of government records means more government power that can be abused. Different government agencies necessarily collect information to provide essential services or collect taxes. The danger comes when the government begins pooling that data and using it for reasons unrelated to the purpose for which it was collected.
Imagine, for instance, a scenario where a government employee could be denied health-related public services or support because of information gathered about them by an agency that handles HR records. Or imagine a person’s research topic, as recorded in federal grant records, being used to weigh whether or not that person should be allowed to renew a passport.
Marginalized groups are most vulnerable to this kind of abuse, including the use of tax records to locate individuals for immigration enforcement. Government records could also be weaponized against people who receive food subsidies, apply for student loans, or take government jobs.
Congress recognized these dangers 50 years ago when it passed the Privacy Act to put strict limits on the government’s use of large databases. At that time, trust in the government eroded after revelations about White House enemies’ lists, misuse of existing government personality profiles, and surveillance of opposition political groups.
There’s another important issue at stake: the future of federal and state governments that actually have the information and capacity to help people. The more people learn to distrust the government, because they worry the information they give certain agencies may one day be used to hurt them, the less likely they will be to participate or seek the help they need. And the fewer people who engage with these agencies, the less likely those agencies are to survive. Trust is a key part of any relationship between the governed and the government, and when that trust is abused or jettisoned, the long-term harms are irreparable.
EFF, like dozens of other organizations, will continue to fight to ensure that personal records held by the government are used and disclosed only as needed, and only for the purposes for which they were collected, as federal law demands.
Related Cases: American Federation of Government Employees v. U.S. Office of Personnel Management
Judges Stand With Law Firms (and EFF) Against Trump’s Executive Orders
“Pernicious.”
“Unprecedented... cringe-worthy.”
“Egregious.”
“Shocking.”
These are just some of the words that federal judges used in recent weeks to describe President Trump’s politically motivated and vindictive executive orders targeting law firms that have employed people or represented clients or causes he doesn’t like.
But our favorite word by far is “unconstitutional.”
EFF was one of the very first legal organizations to publicly come out in support of Perkins Coie when it became the first law firm to challenge the legality of President Trump’s executive order targeting it. Since then, EFF has joined four amicus briefs in support of other targeted law firms, and in all four cases, judges from the U.S. District Court for the District of Columbia have indicated they’re having none of it. Three have issued permanent injunctions deeming the executive orders null and void, and the fourth seems to be headed in that same direction.
Trump issued his EO against Perkins Coie on March 6. In a May 2 opinion finding the order unconstitutional and issuing a permanent injunction, Senior Judge Beryl A. Howell wrote:
“By its terms, this Order stigmatizes and penalizes a particular law firm and its employees—from its partners to its associate attorneys, secretaries, and mailroom attendants—due to the Firm’s representation, both in the past and currently, of clients pursuing claims and taking positions with which the current President disagrees, as well as the Firm’s own speech,” Howell wrote. “In a cringe-worthy twist on the theatrical phrase ‘Let’s kill all the lawyers,’ EO 14230 takes the approach of ‘Let’s kill the lawyers I don’t like,’ sending the clear message: lawyers must stick to the party line, or else.”
“Using the powers of the federal government to target lawyers for their representation of clients and avowed progressive employment policies in an overt attempt to suppress and punish certain viewpoints, … is contrary to the Constitution, which requires that the government respond to dissenting or unpopular speech or ideas with ‘tolerance, not coercion.’”
Trump issued a similar EO against Jenner & Block on March 25. In a May 23 opinion also finding the order unconstitutional and issuing a permanent injunction, Senior Judge John D. Bates wrote:
“This order—which takes aim at the global law firm Jenner & Block—makes no bones about why it chose its target: it picked Jenner because of the causes Jenner champions, the clients Jenner represents, and a lawyer Jenner once employed. Going after law firms in this way is doubly violative of the Constitution. Most obviously, retaliating against firms for the views embodied in their legal work—and thereby seeking to muzzle them going forward—violates the First Amendment’s central command that government may not ‘use the power of the State to punish or suppress disfavored expression.’ Nat’l Rifle Ass’n of Am. v. Vullo, 602 U.S. 175, 188 (2024). More subtle but perhaps more pernicious is the message the order sends to the lawyers whose unalloyed advocacy protects against governmental viewpoint becoming government-imposed orthodoxy. This order, like the others, seeks to chill legal representation the administration doesn’t like, thereby insulating the Executive Branch from the judicial check fundamental to the separation of powers. It thus violates the Constitution and the Court will enjoin its operation in full.”
Trump issued his EO targeting WilmerHale on March 27. In a May 27 opinion finding that order unconstitutional, Senior Judge Richard J. Leon wrote:
“The cornerstone of the American system of justice is an independent judiciary and an independent bar willing to tackle unpopular cases, however daunting. The Founding Fathers knew this! Accordingly, they took pains to enshrine in the Constitution certain rights that would serve as the foundation for that independence. Little wonder that in the nearly 250 years since the Constitution was adopted no Executive Order has been issued challenging these fundamental rights. Now, however, several Executive Orders have been issued directly challenging these rights and that independence. One of these Orders is the subject of this case. For the reasons set forth below, I have concluded that this Order must be struck down in its entirety as unconstitutional. Indeed, to rule otherwise would be unfaithful to the judgment and vision of the Founding Fathers!”
“Taken together, the provisions constitute a staggering punishment for the firm’s protected speech! The Order is intended to, and does in fact, impede the firm’s ability to effectively represent its clients!”
“Even if the Court found that each section could be grounded in Executive power, the directives set out in each section clearly exceed that power! The President, by issuing the Order, is wielding his authority to punish a law firm for engaging in litigation conduct the President personally disfavors. Thus, to the extent the President does have the power to limit access to federal buildings, suspend and revoke security clearances, dictate federal hiring, and manage federal contracts, the Order surpasses that authority and in fact usurps the Judiciary’s authority to resolve cases and sanction parties that come before the courts!”
The fourth case in which EFF filed a brief involved Trump’s April 9 EO against Susman Godfrey. In that case, Judge Loren L. AliKhan is still considering whether to issue a permanent injunction, but on April 15 gave a fiery ruling from the bench in granting a temporary restraining order against the EO’s enforcement.
“The executive order is based on a personal vendetta against a particular firm, and frankly, I think the framers of our Constitution would see this as a shocking abuse of power,” AliKhan said, as quoted by Courthouse News Service. “The government cannot hold lawyers hostage to force them to agree with it, allowing the government to coerce private business, law firms and lawyers solely on the basis of their view is antithetical to our constitutional republic and hampers this court, and every court’s, ability to adjudicate these cases.”
And, as quoted by the New York Times: “Law firms across the country are entering into agreements with the government out of fear that they will be targeted next and that coercion is plain and simple. And while I wish other firms were not capitulating as readily, I admire firms like Susman for standing up and challenging it when it does threaten the very existence of their business. … The government has sought to use its immense power to dictate the positions that law firms may and may not take. The executive order seeks to control who law firms are allowed to represent. This immensely oppressive power threatens the very foundations of legal representation in our country.”
As we wrote when we began filing amicus briefs in these cases, an independent legal profession is a cornerstone of democracy and the rule of law. As a nonprofit legal organization that frequently sues the federal government, EFF understands the value of this bedrock principle and how it–and First Amendment rights more broadly–are threatened by President Trump’s executive orders. It is especially important that the whole legal profession speak out against these actions, particularly in light of the silence or capitulation of a few large law firms.
We’re glad the courts agree.
Statement on California State Senate Advancing Dangerous Surveillance Bill
In the wake of the California State Senate’s passage of S.B. 690, the Electronic Frontier Foundation (EFF), TechEquity, Consumer Federation of California, Tech Oversight California, and ACLU California Action issued a joint statement warning that the bill would put the safety and privacy of millions of Californians at serious risk:
“SB 690 gives the green-light to dystopian big tech surveillance practices which will endanger the privacy and safety of all Californians. SB 690 would allow companies to spy on us to get our sensitive personal information, such as our immigration status or what healthcare we’ve received. And once they have our sensitive personal information, SB 690 places no limits on how that business can use or share that information, allowing them to share it with data brokers, immigration officials, or law enforcement officials in states that restrict reproductive or gender-affirming care.
“At a time where agencies of the federal government are actively targeting individuals based on information collected from businesses about their political beliefs, religious affiliations, or health decisions, we cannot risk sharing even more sensitive information with them. The legislature should be doing all it can to protect Californians, not make it easier for the federal government to secretly obtain our sensitive information.”
Podcast Episode: Why Three is Tor's Magic Number
Many in Silicon Valley, and in U.S. business at large, seem to believe innovation springs only from competition, a race to build the next big thing first, cheaper, better, best. But what if collaboration and community breed innovation just as well as adversarial competition?
(You can also find this episode on the Internet Archive and on YouTube.)
Isabela Fernandes believes free, open-source software has helped build the internet, and will be key to improving it for all. As executive director of the Tor Project – the nonprofit behind the decentralized, onion-routing network providing crucial online anonymity to activists and dissidents around the world – she has fought tirelessly for everyone to have private access to an uncensored internet, and Tor has become one of the world's strongest tools for privacy and freedom online.
Fernandes joins EFF’s Cindy Cohn and Jason Kelley to discuss the importance of not just accepting technology as it’s given to us, but collaboratively breaking it, tinkering with it, and rebuilding it together until it becomes the technology that we really need to make our world a better place.
In this episode you’ll learn about:
- How the Tor network protects the anonymity of internet users around the world, and why that’s so important
- Why online privacy is NOT only for “people who have something to hide”
- The importance of making more websites friendly and accessible to Tor and similar systems
- How Tor can actually benefit law enforcement
- How free, open-source software can power economic booms
Isabela Fernandes has been executive director of the Tor Project since 2018; she had been a project manager there since 2015. She has also served since 2023 as a board member of both European Digital Rights – an association of civil and human rights organizations aimed at building a people-centered, democratic society – and The Engine Room, a nonprofit that supports social justice movements in using technology and data in safe, responsible, and strategic ways while actively mitigating the vulnerabilities created by digital systems. Earlier, Fernandes worked as a product manager for Twitter; as Latin America project manager for North by South, which offered open-source technology integration to companies using the expertise of Latin American free software specialists; as a project manager for Brazil’s President, overseeing the IT department’s migration to free software; and as a technical advisor to Brazil’s Ministry of Communications, creating and implementing new features and free-software tools for the National Digital Inclusion Program serving 3,500 communities. She’s a former member of the board of the Calyx Institute, an education and research organization devoted to studying, testing, developing, and implementing privacy technology and tools to promote free speech, free expression, civic engagement, and privacy rights on the internet and in the mobile telephone industry. And she was a cofounder and longtime volunteer with Indymedia Brazil, an independent journalism collective.
Resources:
- “We Are Making the Tech We Want” panel discussion at The Tech We Want online summit (Oct. 17, 2024)
- Help Net Security: “Delivering privacy in a world of pervasive digital surveillance: Tor Project’s Executive Director speaks out” (Aug. 2, 2023)
- EFF: “EFF Now Has Tor Onions” (Apr. 26, 2023)
- Marco Civil Law of the Internet in Brazil
- EFF’s Tor University Challenge
What do you think of “How to Fix the Internet?” Share your feedback here.
Transcript
ISABELA FERNANDES: If Tor is successful, the internet would be built by its heart, right? Like the elements that Tor carries, which is community, which is decentralization. Instead of having everything focused on a small number of companies, it would be more distributed. I come from the free software world, so I am always excited with it, and I have lived moments in my life where I saw I could touch it, I could touch the moment where the source code would be shared and multiple areas of society would benefit from it. Collaboration allows amazing innovation. We are here today because of free software. If it wasn't for that, we would not be here today.
CINDY COHN: That's Isabela Fernandes, head of the Tor Project, describing the beautiful promise of collaboration, community and innovation that is instilled in the free software world – and the important role it plays as we look forward to that better future we’re always talking about on this show.
I'm Cindy Cohn, the executive director of the Electronic Frontier Foundation.
JASON KELLEY: And I'm Jason Kelley, EFF's Activism Director. This is our podcast series, How to Fix the Internet.
CINDY COHN: The idea behind this show is that we're trying to make our digital lives BETTER. You know, a big part of our job at EFF is to envision the ways things can go wrong online, and then of course to jump into action to help when things DO go wrong.
But this show is about optimism, hope and solutions – we want to share visions of what it looks like when we start to get it right.
JASON KELLEY: And our guest today is someone whose vision of getting it right is stronger than most.
CINDY COHN: Isabela Fernandes has been an important presence in the security and free software communities for a really long time now.
She's been the executive director at the Tor project since 2018, and before that she was a product manager there. And I'm happy to say that when I was on the board of the Tor project, I was one of the board members that strongly recommended Isa for the executive director role.
She was and continues to be not only a brilliant mind, but a skilled executor. With the Tor Project providing a model for an open source tool that works, is trusted and literally saves lives around the world. We are so thrilled to have her here. Welcome, Isa.
ISABELA FERNANDES: Hi. Thank you.
JASON KELLEY: We're really excited to talk to you and I wanted to start, if I could, with some basics. I think a lot of our audience, you know, has heard of Tor. Maybe they know what the Tor browser is. But some of these things pop up and I think, you know, some people don't know the difference between a torrent and a Tor browser, and what the Tor Project actually works on. So what is the Tor Project? What are the tools that you're sort of responsible for creating and maintaining there?
ISABELA FERNANDES: So the Tor Project is actually a non-profit. And our mission is to advance human rights through the technology that we build, right?
So Tor is very similar to a VPN, but much better. We have a decentralized network that is run by volunteers, that whenever you are making a connection to a service or a website, our network will route you through three servers and it's gonna encrypt it every step of the way. And because of this architecture, it's not centralized on anyone or any entity.
It's completely decentralized to thousands of servers around the world. And we also have the Tor browser, which is a fork of Firefox, and what the Tor browser does, besides making it easier for you to connect to the Tor network, is protect your privacy on the device level. So it blocks third-party cookies, and it also protects you against fingerprinting and other ways that your device identity can leak.
JASON KELLEY: Okay, so just to dig in and make sure I understand, if I'm on a VPN I'm, you know, basically connecting to another server. And all of my connections are going through that. And usually I can, like, pick where they are from a short list of, you know, potential servers in cities and countries.
But with Tor, I don't choose where I'm going or what those three connections are, but it adds that extra layer of protection because three is better than one. But, but why is that? I'm just like, I wonder for the audience who might not know, you know, why three, why not five, or why not two?
ISABELA FERNANDES: Right, so, three, mainly because, so it works like this, right? Like the first server will know who you are because you're connecting to Tor –
JASON KELLEY: Sure, ok.
ISABELA FERNANDES: But it does not know what you are requesting. The middle one does not know who you are and has no idea what you are requesting, and the exit one, the third one, only knows that someone on the internet is requesting to open a website.
JASON KELLEY: Got it.
ISABELA FERNANDES: So that count is great, because if it was only two, you would still have some way to discover that information and to understand where it is coming from and where it is going.
So three, it is indeed a sweet number for you to have the level of privacy that we want without building more latency into the connection.
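(To make the three-hop design concrete, here is a minimal, hypothetical Python sketch of the layered "onion" encryption Isa describes, using the third-party cryptography package's Fernet cipher as a stand-in for Tor's real circuit cryptography. Real Tor negotiates each layer's key through a telescoping circuit handshake and uses its own relay-cell format, so treat this purely as an illustration of why each relay can peel exactly one layer.)

    # Toy illustration of onion routing's three layers -- not the real Tor protocol.
    # Requires the third-party "cryptography" package: pip install cryptography
    from cryptography.fernet import Fernet

    # In real Tor, the client negotiates a separate key with each relay in the circuit.
    relays = ["guard", "middle", "exit"]
    keys = {name: Fernet(Fernet.generate_key()) for name in relays}

    request = b"GET https://example.com/"

    # Build the onion inside-out: the exit's layer is innermost, the guard's outermost.
    onion = request
    for name in reversed(relays):
        onion = keys[name].encrypt(onion)

    # Each relay peels exactly one layer:
    hop1 = keys["guard"].decrypt(onion)      # guard knows your address; payload is still ciphertext
    hop2 = keys["middle"].decrypt(hop1)      # middle knows neither you nor the destination
    plaintext = keys["exit"].decrypt(hop2)   # exit sees the request, but not who sent it
    assert plaintext == request

(The middle relay is what keeps the relay that knows who you are well separated from the relay that knows what you asked for.)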
JASON KELLEY: Okay. And then I'm gonna ask one more question, um, about the technical aspect. Over the last, like, decade plus, most of the web has become encrypted. The HTTP level has become HTTPS, and that's something that EFF has worked on with our Certbot project and Let's Encrypt. And I'm not super familiar with the difference between, you know, how HTTPS is encrypted versus what Tor is doing.
Why do I still need to use Tor? What is it saving me from? How is it protecting my privacy if, if, quote unquote, the web is already encrypted?
ISABELA FERNANDES: Um, let's give this example, right? Like, HTTPS would be encrypting your connection to the website. So when you type your username or password, that information is encrypted. However, the server, and whoever is watching the connection, would still know who you are and where you're coming from. With Tor, you gain that other layer of protection, right? Like, nobody would know who you are and what you are requesting except for the website. So it protects you from outside watchers who might be surveilling your connection, and it also protects you from other tracking mechanisms on the internet.
So the ideal scenario is for you to use both. Right, like it's for you to not only use Tor, but make sure that you're connected to a website that has HTTPS as well.
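(For readers who want to try the "use both" setup Isa recommends, here is a small, hypothetical Python sketch that sends an HTTPS request through a locally running Tor client. It assumes Tor is already running and exposing its default SOCKS proxy on 127.0.0.1:9050 (Tor Browser uses port 9150 instead); the port and test URL are illustrative assumptions, not something discussed in the episode.)

    # Route an HTTPS request through a local Tor client's SOCKS proxy.
    # Requires: pip install "requests[socks]" -- and a Tor daemon listening on 127.0.0.1:9050.
    import requests

    proxies = {
        "http": "socks5h://127.0.0.1:9050",   # "socks5h" lets Tor resolve DNS too, avoiding leaks
        "https": "socks5h://127.0.0.1:9050",
    }

    # check.torproject.org reports whether your request arrived via the Tor network.
    resp = requests.get("https://check.torproject.org/", proxies=proxies, timeout=60)
    print("Congratulations" in resp.text)     # True if the request really went through Tor

(Here HTTPS encrypts the request's contents end to end to the website, while Tor's three hops hide from the website, and from anyone watching your connection, where the request came from.)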
JASON KELLEY: Wow. Okay. That's really helpful. Thank you so much. I feel like I'm getting tech support from the literal executive director of the Tor Project. But I think a lot of people, you know, that come to us at EFF for privacy or security recommendations really do not understand some of these, you know, somewhat basic things that you're describing, about the difference between proxies and encrypted sites and VPNs and Tor. And, um, I think it's just really important for people to know how these different tools work, because, you know, different tools function for different purposes, right?
CINDY COHN: Yeah. And it's, you know, security is hard. It actually requires, I mean, it would be great if there was a one size fits all security. And I think that if you look at all the pieces that Tor’s building, they're, they're moving towards that.
I want us to talk a little bit more about the why of Tor, 'cause we've really outlined the how of Tor, and I wanna give you a chance to kind of debunk one of the arguments that we hear all the time at EFF, which is, you know, why do people need all this security? If you're not doing anything wrong, you know, why should you worry? Um, or is it all just hopeless and shouldn't I just give up?
But let's start with the first one, and I know that you've done a lot of work at Tor trying to really think hard about who needs these tools, who uses these tools in a way that's privacy protective. So I wonder if you could outline a little bit of kind of what you guys know about who uses Tor and why.
ISABELA FERNANDES: There is a spectrum, and I always like to give examples from the two sides of this spectrum. We collect a lot of anonymous stories from users, and let's call this one Brian. So we have Brian. He’s a father and he has two teenage kids at home.
And, uh, you know, as teenagers, they have questions about everything, right? Like about sex, gender, drugs, everything. So he recommends his kids to use Tor when they're searching for those topics on the internet. And sometimes he needs to search some topics himself. You know, like the kids bring a topic that he had no idea what it is about.
So they use Tor to make sure that those searches do not follow those teenage kids for the rest of their lives. Right. Like it's not tagged to them for the rest of their life. So he recommends his kids to use Tor. And then you have, on the other side of the spectrum, um, let's call her Carolina.
Carolina is a woman in Uganda. Uh, she's a lesbian. And in Uganda, you can face criminal charges for that. So Carolina just wants to have a normal social online life. And because it's so dangerous in Uganda for her, she really needs to make sure that she's protected and anonymous online when she's interacting with her friends or just looking for topics related to her lifestyle. So she uses Tor to be safe online, uh, to just have a normal social life on the internet. We did some research, which I thought was very interesting. We put a question on the browser; it was anonymous and anyone could answer.
And we had, like, almost 55,000 people answering that question, which was: how often do you use the Tor browser? And actually more than half of them said that they use it a few times a day or a few times a week. And that for me says a lot, right? Like, it's for those moments where you're like, okay, this, I will want to do on Tor. I don't want the rest of the internet to collect this information and store it and attach that to my behavior profile.
And that for me is what is important, right? Like, people may think that everything is lost and there is no reason to do that. And I think the other way around: I think, uh, it is possible for you to create black spots about your behavior online. And that's what tools like Tor can allow you to do, right? Like, you can, uh, create some black spots about you on the internet that protect your privacy.
I think today people do care a lot about their privacy, and one example related to privacy that I always bring to people is how dangerous it is to compare the need and the right to be anonymous with the need to hide something that you don't want others to know, or some illegal activity, because anonymity is actually one of the pillars of our democracy. Your vote is anonymous for a reason. So for you to exercise your citizen rights, you need to have privacy.
CINDY COHN: Yep. I think that's exactly right. And I also think we're living in times when things are moving so fast about who's at risk and who's not at risk, that a lot of people are waking up to the fact that just because you might not need privacy in one zone of your life or in one time that we're living in, doesn't mean that that can't change really quickly.
And having the tools available and ready and working is one of the things that we can do – we meaning people in tech – to make sure that as times change, people have the tools that they need to stay safe and to stay protected, and to, to organize, you know, opposition, to organize for change.
Music transition
CINDY COHN: I think that this is happening a lot, but I'm wondering how you think about helping people reclaim the idea that privacy isn't something we should be ashamed of, that privacy is something that we should be proud of.
I hear you say, and I think that's totally right, it's a pillar of an open and self-governing world. How do you help convince people about that?
ISABELA FERNANDES: Let me step back for a second. You might have heard this from multiple people, right? Like, they complain about ads following them. To give an example: oh, I was talking with my friend about bicycles and now all these bicycle ads are showing up, my phone is listening to me.
Right? Like, so I think those are the perfect moments for you to go deeper into the matter of privacy, right? Like, imagine if it was not bicycles; imagine if it was a government decision that you were talking about, right? That is the moment, right? Like, you need to connect with people when they are presenting the problem to you.
So it's, it's fundamental, right? Like, that makes it super, super easy for someone to understand. But the next step is, like, they ask you: how, how do I protect myself? And sometimes I feel like, uh, our work at Tor is not only to create tools, but to make them easier for people to use. It needs to be friendly, it needs to be familiar, right?
Like, uh, that's why the Tor browser is super nice because it's just like any other browser. People hear about Tor and they think it's like, oh, it is this hacker tool that I need to have a special excuse to use. No, it's just like any other browser that you open and you use and you can use on your phone, you can use anywhere.
So it is extremely important to bring awareness when people are identifying the problem, even if it is in an informal conversation or in a more, uh, global conversation, right? Like, sometimes those problems arise in global news. Uh, we have multiple, uh, examples of that. Cambridge Analytica was one of them.
And, um, at those, those moments, we need to learn how to connect. But when we connect, we need to also be able to provide solutions that are easy and familiar to people so they can have hope. They can look at it and they can say, okay, I can control this, right? Like, I can control, I can protect myself, I can protect my privacy.
And those elements come in all together, right? Like, it's not, uh, one, uh, catchphrase that will make it happen. You need to combine all those elements in the process, right? So it doesn't seem too hard, and people feel empowered and have agency to take action.
CINDY COHN: Yep. I think that's right. So what we try to do in this podcast is kind of flip the script and think about what the world would look like if we got it all right. So what would the world look like if Tor was immensely successful? What's your vision of the world where we get this right?
ISABELA FERNANDES: In the case of Tor, I think, uh, one thing would be that service and websites, they are friendly to Tor. So if a user is coming to connect to an application, or to a website, that website would know it and would be friendly to it. This is one of the biggest problems right now, right? Like some websites are not friendly to Tor or solutions like Tor. So that would be number one, right?
CINDY COHN: Yep.
ISABELA FERNANDES: So if Tor is successful, we would have an internet, or a world with technology, right? Like, to go a little bit beyond the internet, where technology is driven by sharing, by collaboration, and by that model. It's not about the data. And the business model of it could be unique to each kind of service, but would not necessarily be the typical one, the “easy” one, between quotes, of let's collect all the data, either to use it for advertising or to sell it to, uh, data brokers so we can make some money out of it. Right?
Like, uh, I think that if Tor were successful, we would have the philosophy of Tor being part of the heart of what builds technology worldwide.
CINDY COHN: Yeah, I think that's great. Um, in some ways, you know, Tor wouldn't need to exist as a separate project because the Tor values would be built into everything. And what I hear there is that that also includes the way Tor has been developed, the open source collaborative, transparent process by which tools were developed would be part of what gets baked in - it's a good vision.
Music transition
JASON KELLEY: Let’s take a quick moment to say thank you to our sponsor.
How to Fix the Internet is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
We also want to thank EFF members and donors. You are the reason we exist. We will talk a little later in this episode about how important funding from the community is to the work that we do. You can become a member of EFF’s community for just $25.
The more members we have, the more power we have - in statehouses, courthouses, and on the streets. EFF has been fighting for digital rights for decades, and that fight is bigger than ever, so please, if you like what we do, go to eff.org/pod to donate.
And now, back to our conversation with Isabela Fernandes and the impact that free software and the community of people making it has had on her life.
ISABELA FERNANDES: I was raised by free software. Everything I know comes from this, from not only the software itself but the community. So all my skills come from it. So, uh, in the early 2000s, I joined a volunteer network called Indymedia, an independent news website network worldwide. Uh, I built the one for Brazil and we did everything, uh, with free software, right?
Like, it was an open publishing website. When I explain it to people nowadays: you didn't need a username or password to publish your article on that platform. And we had hundreds of those sites around the world, and I did this work for 10 years because I really believe in the democratization of information, and I saw the internet as the root of it. And I saw how powerful that was, because it was the beginning of the internet in many parts of the world; here in Brazil it was just starting, and not everybody had a connection, and here we were with this powerful tool to democratize, uh, communication in Brazil.
And through that experience, I was invited to work for the federal government on a, uh, series of free software initiatives, one of them in digital inclusion. We would build solutions for communities, um, a basket of solutions, and that went from online stores, uh, so they could sell their own products online, uh, to Voice over IP. At the time there were no cell phones. I'm talking about 2003, uh, 2004. There was no cell phone.
I arrived at a community one month after they had power for the first time in their lives, and I was bringing internet, you know? Like, and I'm like, okay, you have internet. What do you want? They didn't have a phone number, a phone line, so I set up Voice over IP. I was like, you can call anywhere in Brazil with this computer. And there was a line of 20 people right away to make phone calls. That's how we were doing this. We were, like, going to, uh, the favelas in Brazil and gathering all the teenagers and asking, 'what do you want?' Like, 'oh, I wanna record my music.' And we were recording the music on CDs using, everything, free software. Some communities were like, we wanna, uh, document, because they have a lot of, uh, folklore stories that are only oral and they wanted to document them. We created a wiki for them to document it. So, education, going to public schools – we did a lot of that with free software. And at the same time, why was this possible, right? Like, it was because the culture was changing. The culture was: okay, we are not gonna use proprietary software anymore. We are not gonna use the money that the country barely has to pay for these big, super expensive licenses.
Instead, we are gonna use this money to invest in the people, to invest in computer science students, to invest in conventions, free software meetups, uh, to invest in install fests. And we started to do that. And we had, like, a huge technology boom in Brazil: from the private sector, like I said, from the government, from universities, everybody was collaborating.
There were a lot of companies being created to provide different types of service or to maintain software; there was a lot of different business being generated out of it as well.
So I could touch it, I could see it. It is possible. We can do it, right? Like, I actually am always very excited when I see it, and right now I'm seeing a movement again in Brazil. It's not too public yet, but it is a movement like this, with hundreds of organizations debating and building a strategy to recreate that inside the country.
So it is possible to build a better world with technology, right? Like better versions of technology for us. It's not a mission impossible thing. It is totally possible.
JASON KELLEY: And it's not the distant past, really. I mean, sometimes when you talk about it, like, I was alive at the time, but not, you know, not old enough to be involved in that. And it does sometimes sound like a kind of golden era that's lost forever to people. And it's, it's really great to hear that maybe it's something that's cyclical, or it's something that, you know, we lost for a brief period and can get back to. How did that movement that's happening now in Brazil get sort of reignited?
ISABELA FERNANDES: We're bringing back some things from that time. At that time we had, uh, Linux install fests. So you would bring your machine and you would install Linux. We want to combine that: anytime you have an event where you're talking about the internet, where you're talking about regulation, to have install fests – let's say, install Mastodon, let's install Signal. Let's have everybody come out of this event with it, and open it to the population, right? Like, because sometimes when you offer those options to people, they don't have a network within that option, so they don't tend to stay.
But if you're doing this at an event, and you, like, let's say, install Mastodon, and everybody can have their account on different Mastodon instances, but we are all following each other, then I'm seeing the content and I can see for real what that means. And I will leave the event already with a network of people that I can follow on Mastodon. Same thing every time I do a Signal training: I tell people, now that you're on Signal, let's copy each other's contacts. So we have each other on our Signal accounts, right? Like, so we have a community. So we are thinking about that combination.
Music transition
JASON KELLEY: I wonder, you know, again, you talk about some of this and I feel so jealous of, you know, being in this movement. I, I've never really been, you know, an engineer, so I'm sort of looking at the free software and open source communities from a distance. How did you end up getting involved in them? And, and do you have any advice for other people, you know, today, that want to be helpful, or, um, want to connect with other people to help build the kind of internet you're talking about?
ISABELA FERNANDES: I ended up in this out of necessity. When I was a teenager, I hated school, but I loved to learn. I got kicked out of, uh, school multiple times until my dad put me in a technical high school to learn computers. But at the same time, uh, in my house, my parents had to work from 7:00 AM to 11:00 PM. So, you know, the strong survive: in the house, it was me and my siblings.
And, uh, I could not touch the computer because my older brother would not let me. So I had to write code with pen and paper, and I hated it. And I started to go to my dad's office to connect at night: when he would leave at 11, I would arrive and stay till 7:00 AM, and that's how I started to learn about Linux.
We didn't have money for a license, so the more I wanted to do with the computer, the more I had to go to free software, you know? And like I said, after that, I joined this media network where we did everything. I learned how to build websites. I learned how to build data centers. We had to have security.
We had to build new products for journalists, because we wanted to use free software, but sometimes we didn't have everything, or the solutions we had were not good enough, so we had to improve them, to edit an audio or a video, right? Like, things like that. So I went through this whole phase because I would not accept the technology that the normal, uh, business model wanted to offer me. I didn't accept it as it is, and I thought something else could happen. And like, uh, every time I talk with young people, I tell them this: Don't accept the technology that is being offered to you as it is. Don't accept it. It is possible. The reason we have free software is because people did not accept the technology that was given to them. And I think that's the spirit.
Music transition
CINDY COHN: Thank you so much, Isa, for coming, and sharing your stories with us and your hope. Um, what a, what a hopeful conversation this was.
ISABELA FERNANDES: Thank you so much, Cindy and Jason. It was great to be here. Thank you.
JASON KELLEY: Well now I know how Tor works, which is great because I've been trying to figure that out for years. Um, three steps. I understand why there are three. This makes a lot more sense to me. And I'm honestly just a lot more hopeful than I was, which is always nice. It doesn't happen every time, but I feel like she's describing a future that actually not only is she, she and the Tor folks helping to build, but that other people can be a part of too, which is great.
CINDY COHN: Yeah, I think sometimes people envision privacy tools as the domain of people who are dark and worried and, and wanting to be self-protective all the time. And what was so refreshing about this, and refreshing about the way Isa and Tor operate in the world, is they're working with some pretty serious issues for people, but they're hopeful, they're building a future, they're very positive, and they have a vision of what the world looks like if we build privacy and security into everything. And, and in some ways it was a really light interview about something that protects people from very dangerous situations.
JASON KELLEY: Yeah. Yeah. And she talked a lot about, you know, what got her into free software. For her, it was kind of the necessity of having to write code on paper and not being able to buy software.
But I think we're coming to have that same necessity again, for some people, for a lot of different reasons. You know, the software is bloated, it's enshittified, as Cory would say. Um, it's, you know, often monopolized in some way. And, not that these are good things, but if it gets people back to the point she made, where you realize that you can build the things yourself, that you don't have to accept the software and the tech that you're given, you can make your own and edit it and things like that, I think that would be a great outcome. And it sounds like that's already happening.
CINDY COHN: I think the other piece was just, you know, really emphasizing community, the need for community and how important community is, both in terms of entry into this, but also in the supporting and maintaining and developing of things, and in how people use Tor. Right? You know, the Tor Project operates because of nodes all across the country that volunteer to, you know, carry other people's traffic. EFF has done a Tor challenge a few times where we've tried to get more people to run nodes, whether they're in the middle or at the end. But that community is kind of infused in the way Tor works, and it's infused in the vision that she has for a better future too. And that's just so consistent with, you know, what we've heard from people over and over again about how we fix the internet.
JASON KELLEY: And that’s our episode for today – thanks so much for joining us.
If you have feedback or suggestions, we'd love to hear from you. Visit EFF dot org slash podcast and click on listener feedback. While you're there, you can become a member, donate, maybe even pick up some merch and just see what's happening in digital rights this week and every week.
Our theme music is by Nat Keefe of BeatMower with Reed Mathis
And How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.
We’ll see you next time.
I’m Jason Kelley…
CINDY COHN: And I’m Cindy Cohn.
MUSIC CREDITS: This podcast is licensed creative commons attribution 4.0 international, and includes the following music that is licensed creative commons attribution 3.0 unported by its creators: Recreation by Airtone. Additional beds and alternate theme remixes by Gaetan Harris.
San Diegans Push Back on Flock ALPR Surveillance
As San Diego approaches the first annual review of the city’s controversial Flock Safety contract, a local coalition is calling on the city council to roll back this dangerous and costly automated license plate reader (ALPR) program.
The TRUST Coalition—a grassroots alliance including Electronic Frontier Alliance members Tech Workers Coalition San Diego and techLEAD—has rallied to stop the unchecked spread of ALPRs in San Diego. We’ve previously covered the coalition’s fight for surveillance oversight, a local effort kicked off by a “smart streetlight” surveillance program five years ago.
In 2024, San Diego installed hundreds of AI-assisted ALPR cameras throughout the city to document which cars are driving where and when, and to make that data accessible for 30 days.
ALPRs like Flock’s don’t prevent crime—they just vacuum up data on everyone who drives past. The resulting error-prone dragnet can then chill speech and be weaponized against marginalized groups, like immigrants and those seeking trans or reproductive healthcare.
Despite local and state restrictions barring the sharing of ALPR data with federal and out-of-state agencies, San Diego Police have reportedly disclosed license plate data to federal agencies, including Homeland Security Investigations and Customs and Border Protection.
Also, despite a local ordinance requiring city council approval before deployment of surveillance technology, San Diego police have reportedly deployed ALPRs and smart streetlights at Comic-Con and Pride without the required approval.
The local coalition is not alone in these concerns. The San Diego Privacy Board recently recommended that the city reject the Surveillance Use Policy for this technology. All of this cost the community over $3.5 million last year alone. That is why the TRUST Coalition is calling on the city to reject this oppressive surveillance system and instead invest in other essential services that improve day-to-day life for residents.
San Diegans who want to push back can get involved by signing the TRUST Coalition’s petition, following the campaign online, and contacting their council members to demand the city end its contract with Flock and start respecting the privacy rights of everyone who lives or works in, or travels through, their community.
Hell No: The ODNI Wants to Make it Easier for the Government to Buy Your Data Without a Warrant
New reporting has revealed that the Office of the Director of National Intelligence (ODNI) is attempting to create the Intelligence Community’s Data Consortium–a centralized online marketplace where law enforcement and spy agencies can peruse and buy very personal digital data about you collected by data brokers. Not only is this a massive escalation of the deeply unjust data broker loophole; it’s also another repulsive signal that your privacy means nothing to the intelligence community.
Imagine a mall where every store is run by data brokers whose goods include your information, collected by smartphone applications. Depending on your permissions and what applications are on your phone, this could include contacts, behavioral data, financial information, and even your constant geolocation. Now imagine that the only customers in this mall are federal law enforcement officers and intelligence agents, who should be going to a judge, presenting their evidence, and hoping the judge grants a warrant for this information. But they don’t need evidence, or any justification for why they need your data. They just need taxpayer money, and this newly centralized digital marketplace provides the buying opportunities.
This is what the Office of the Director of National Intelligence wants to build according to recently released contract documents.
Across the country, states are trying desperately to close the loophole that allows the government to buy private data it would otherwise need a warrant to get. Montana just became the first state to make it illegal for police to purchase data, like geolocation data harvested by apps on smartphones. At the federal level, EFF has endorsed Senator Ron Wyden’s Fourth Amendment is Not for Sale Act, which closes this data broker loophole. The bill passed the House last year, but was rejected by the Senate.
And yet, the federal government is doubling down on this very obviously unjust and unpopular policy.
An ODNI that wanted to minimize harms to civil liberties would be pursuing the opposite tack. It should not be looking for ways to formalize and institutionalize surveillance loopholes. That is why we not only call on the ODNI to reverse course and scrap the Intelligence Community’s Data Consortium–we also call on lawmakers to finish what they started and pass the Fourth Amendment is Not for Sale Act, closing the data broker loophole at the federal level once and for all. We urge all of our supporters to do the same and help us keep the government accountable.