Electronic Frontier Foundation
The Right to Repair Is Law in Washington State
Thanks in part to your support, the right to repair is now law in Washington.
Gov. Bob Ferguson signed two bills guaranteeing Washingtonians' right to access tools, parts, and information so they can fix personal electronics, appliances, and wheelchairs. This is the epitome of common-sense legislation. When you own something, you should have the final say about who fixes, adapts, or modifies it—and how.
Advocates in Washington have worked for years to pass a strong right-to-repair law in the state. In addition to Washington’s Public Interest Research Group, the consumer electronics bill moved forward with a growing group of supporting organizations, including environmental advocates, consumer advocates, and manufacturers such as Google and Microsoft. Meanwhile, advocacy from groups including Disability Rights Washington and the Here and Now Project made the case for including wheelchairs in the right-to-repair bill, bringing their personal stories to Olympia to show why this bill was so important.
And it’s not just states that recognize the need for people to be able to fix their own stuff. Earlier this month, U.S. Army Secretary Dan Driscoll issued a memo stating that the Army should “[identify] and propose contract modifications for right to repair provisions where intellectual property constraints limit the Army's ability to conduct maintenance and access the appropriate maintenance tools, software, and technical data – while preserving the intellectual capital of American industry.” The memo said that the Army should seek this in future procurement contracts and also to amend existing contracts to include the right to repair.
This is a bedrock of sound procurement with a long history in America. President Lincoln bought only rifles with standardized tooling to outfit the Union Army, for the obvious reason that it would be a little embarrassing for the Commander in Chief to have to pull his troops off the field because the Army’s sole supplier had decided not to ship this week’s delivery of ammo and parts. Somehow, the Department of Defense forgot this lesson over the ensuing century and a half, so that today, billions of dollars in public money are spent on materiel and systems that the US military can only maintain by buying service from a “beltway bandit.”
This recognizes what millions of people have said repeatedly: limiting people’s ability to fix their own stuff stands in the way of needed repairs and maintenance. That’s true whether you’re a farmer with a broken tractor during harvest, a homeowner with a misbehaving washing machine or a cracked smartphone screen, a hospital med-tech trying to fix a ventilator, or a soldier struggling with a broken generator.
The right to repair is gaining serious momentum. All 50 states have now considered some form of right-to-repair legislation. Washington is the eighth state to pass one of these bills into law—let’s keep it up.
The Federal Government Demands Data from SNAP—But Says Nothing About Protecting It
Last month, the U.S. Department of Agriculture issued a troubling order to all state agency directors of Supplemental Nutrition Assistance Programs (SNAP): hand over your data.
This is part of a larger effort by the Trump administration to gain “unfettered access to comprehensive data from all state programs that receive federal funding,” through Executive Order 14243. While the order says this data sharing is intended to cut down on fraud, it is written so broadly that it could authorize almost any data sharing. Such an effort flies in the face of well-established data privacy practices and places people at considerable risk.
A group of SNAP recipients and organizations has thankfully sued to try to block the data sharing granted through the Executive Order. And the state of New Mexico has even refused to comply with the order, “due to questions and concerns regarding the legality of USDA’s demand for the information,” according to Source NM.
The federal government has said very little about how they will use this information. Several populations targeted by the Trump Administration are eligible to be on the SNAP program, including asylum seekers, refugees, and victims of trafficking. Additionally, although undocumented immigrants are not eligible for SNAP benefits, their household members who are U.S. citizens or have other eligible immigration statuses may be—raising the distinct concern that SNAP information could be shared with immigration or other enforcement authorities.
EFF has long advocated for privacy policies that ensure that information provided in one context is not used for other reasons. People who hand over their personal information should do so freely and with full information about how their information will be used. Whether you're seeking services from the government or a company, we all deserve privacy rights. Accessing public benefits to feed yourself shouldn't require you to give those up.
It's particularly important to respect privacy in government programs, such as SNAP, that provide essential support services to vulnerable populations. SNAP supports people who need assistance buying food—arguably the most basic need. Often, fear of reprisal and of inappropriate government data sharing (such as disclosure of the immigration status of household members not receiving benefits) prevents eligible people from enrolling in food assistance despite need. Discouraging eligible people from enrolling in SNAP runs counter to the program's goals of reducing food insecurity, improving health outcomes, and benefiting local economies.
This is just the latest government data-sharing effort that raises alarm bells for digital rights. No one should worry that asking their government for help with hunger will get them in trouble. The USDA must promise it will not weaponize programs that put food on the table during times of need.
The PERA and PREVAIL Acts Would Make Bad Patents Easier to Get—and Harder to Fight
Two dangerous bills have been reintroduced in Congress that would reverse over a decade of progress in fighting patent trolls and making the patent system more balanced. The Patent Eligibility Restoration Act (PERA) and the PREVAIL Act would each cause significant harm on their own. Together, they form a one-two punch—making it easier to obtain vague and overly broad patents, while making it harder for the public to challenge them.
These bills don’t just share bad ideas—they share sponsors, a coordinated rollout, and backing from many of the same lobbying groups. Congress should reject both.
Tell Congress: Don't Bring Back The Worst Patents
PERA Would Legalize Patents on Basic Software—and Human Genes
PERA would overturn long-standing court decisions that have helped keep some of the worst patents out of the system. This includes the Supreme Court’s Alice v. CLS Bank decision, which bars patents on abstract ideas, and AMP v. Myriad, which correctly ruled that naturally occurring human genes cannot be patented.
Thanks to the Alice decision, courts have invalidated a rogue’s gallery of terrible software patents—such as patents on online photo contests, online bingo, upselling, matchmaking, and scavenger hunts. These patents didn’t describe real inventions—they merely applied old ideas to general-purpose computers.
PERA would wipe out the Alice framework and replace it with vague, hollow exceptions. For example: it would ban patents on “dance moves” and “marriage proposals,” but would allow nearly anything involving a computer or machine—even if it only mentions the use of a computer. This is the same language used in many bad software patents that patent trolls have wielded for years. If PERA passes, patent claims that are currently seen as weak will become much harder to challenge.
Adding to that, PERA would bring back patents on human genes—exactly what was at stake in the Myriad case. EFF joined that fight, alongside scientists and patients, to prevent patents that interfered with essential diagnostic testing. Congress should not undo that victory. Some things just shouldn’t be patented.
PERA’s provision that genes can constitute an invention if they are “isolated” is meaningless; every gene used in science is “isolated” from the human body. This legal wordplay was used to justify human gene patents for decades, and it’s deeply troubling that some U.S. Senators are on board with bringing them back.
PREVAIL Weakens the Public’s Best Defense Against Patent Abuse
While PERA makes it easier to obtain a bad patent, the PREVAIL Act makes it harder to get rid of one.
PREVAIL would severely limit inter partes review (IPR), the most effective process for challenging wrongly granted patents. This faster, more affordable process—administered by the U.S. Patent and Trademark Office—has knocked out thousands of invalid patents that should never have been issued.
EFF has used IPR to protect the public. In 2013, we challenged and invalidated a patent on podcasting, which was being used to threaten creators across the internet. Thousands of our supporters chipped in to help us bring that case. Under PREVAIL, that challenge wouldn’t have been allowed. The bill would significantly limit IPR petitions unless you’ve been directly sued or threatened—a major blow to nonprofits, open source advocates, and membership-based defense groups that act in the public interest.
PREVAIL doesn’t stop at limiting who can file an IPR. It also undermines the fairness of the IPR process itself. It raises the burden of proof, requiring challengers to overcome a presumption that the patent is valid—even when the Patent Office is the one reviewing it. The bill forces an unfair choice: anyone who challenges a patent at the Patent Office would have to give up the right to fight the same patent in court, even though key legal arguments (such as those involving abstract subject matter) can only be made in court.
It gets worse. PREVAIL makes it easier for patent owners to rewrite their claims during review, taking advantage of hindsight about what’s likely to hold up. And if multiple parties want to challenge the same patent, only the first to file may get heard. This means that patents used to threaten dozens or even hundreds of targets could get extra protection, just because one early challenger didn’t bring the best arguments.
These changes aren’t about improving the system. They’re about making it easier for a small number of patent owners to extract settlements, and harder for the public to push back.
A Step Backward, Not Forward
Supporters of these bills claim they’re trying to restore balance to the patent system. But that’s not what PERA and PREVAIL do. They don’t fix what’s broken—they break what’s working.
Patent trolling is still a severe problem. In 2024, patent trolls filed a stunning 88% of all patent lawsuits in the tech sector.
At the same time, patent law has come a long way over the past decade. Courts can now reject abstract software patents earlier and more easily. The IPR process has become a vital tool for holding the Patent Office accountable and protecting real innovators. And the Myriad decision has helped keep essential parts of human biology in the public domain.
PERA and PREVAIL would undo all of that.
These bills have support from a variety of industry groups, including those representing biotech firms, university tech transfer offices, and some tech companies that rely on aggressive patent licensing. While those voices deserve to be heard, the public deserves better than legislation that makes it easier to secure a 20-year monopoly on an idea, and harder for anyone else to challenge it.
Instead of PERA and PREVAIL, Congress should focus on helping developers, creators, and small businesses that rely on technology—not those who exploit it through bad patents.
Some of that legislation is already written. Congress should consider making end-users immune from patent threats, closing loopholes that allow certain patent-holders to avoid having their patents reviewed, and adding transparency requirements so that people accused of patent infringement can at least figure out who’s making the allegations.
But right now, EFF is fighting back, and we need your help. These bills may be dressed up as reform, but we’ve seen them before—and we know the damage they’d do.
The Defense Attorney’s Arsenal In Challenging Electronic Monitoring
In criminal prosecutions, electronic monitoring (EM) is pitched as a “humane alternative” to incarceration—but it is not. The latest generation of “e-carceration” tools are burdensome, harsh, and often just as punitive as imprisonment. Fortunately, criminal defense attorneys have options when shielding their clients from this overused and harmful tech.
Framed as a tool that enhances public safety while reducing jail populations, EM is increasingly used as a condition of pretrial release, probation, parole, or even civil detention. However, this technology imposes serious infringements on liberty, privacy, and due process for not only those placed on it but also for people they come into contact with. It can transform homes into digital jails, inadvertently surveil others, impose financial burdens, and punish every misstep—no matter how minor or understandable.
Even though EM may appear less severe than incarceration, research and litigation reveal that these devices often function as a form of detention in all but name. Monitored individuals must often remain at home for long periods, request permission to leave for basic needs, and comply with curfews or “exclusion zones.” Violations, even technical ones—such as a battery running low or a dropped GPS signal—can result in arrest and incarceration. Being able to take care of oneself and reintegrate into the world becomes a minefield of compliance and red tape. The psychological burden, social stigma, and physical discomfort associated with EM are significant, particularly for vulnerable populations.
For many, EM still evokes bulky wrist or ankle “shackles” that can monitor a subject’s location, and sometimes even their blood alcohol level. These devices have matured along with digital technology, however, and EM is increasingly imposed through more sophisticated means like smartwatches or mobile phone applications. Newer iterations of EM have also followed a trajectory of collecting much more data, including biometrics and more precise location information.
This issue is more pressing than ever, as the 2020 COVID pandemic led to an explosion in EM adoption. As incarceration and detention facilities became superspreader zones, judges kept some offenders out of these facilities by expanding the use of EM; so much so that some jurisdictions ran out of classic EM devices like ankle bracelets.
Today the number of people placed on EM in the criminal system continues to skyrocket. Fighting the spread of EM requires many tactics, but on the front lines are the criminal defense attorneys challenging EM impositions. This post will focus on the main issues for defense attorneys to consider while arguing against the imposition of this technology.
PRETRIAL ELECTRONIC MONITORING
We’ve seen challenges to EM programs in a variety of ways, including attacking the constitutionality of the program as a whole and arguing against pretrial and/or post-conviction imposition. However, it is likely that the most successful challenges will come from individualized challenges to pretrial EM.
First, courts have not been receptive to arguments that entire EM programs are unconstitutional. For example, in Simon v. San Francisco, 135 F.4th 784 (9th Cir. 2025), the Ninth Circuit held that although San Francisco’s EM program constituted a Fourth Amendment search, a warrant was not required. The court explained its decision by noting that the program was a condition of pretrial release, included the sharing of location data, and was consented to by the individual (with counsel present) by signing a form that essentially operated as a contract. This decision exemplifies the court’s failure to grasp the coercive nature of this type of “consent,” which is pervasive in the criminal legal system.
Second, pretrial defendants have more robust rights than they do after conviction. While a person’s expectation of privacy may be slightly diminished following arrest but before trial, the Fourth Amendment is not entirely out of the picture. Their “privacy and liberty interests” are, for instance, “far greater” than a person who has been convicted and is on probation or parole. United States v. Scott, 450 F.3d 863, 873 (9th Cir. 2006). Although individuals continue to retain Fourth Amendment rights after conviction, the reasonableness analysis will be heavily weighted towards the state as the defendant is no longer presumed innocent. However, even people on probation have a “substantial” privacy interest. United States v. Lara, 815 F.3d 605, 610 (9th Cir. 2016).
THE FOURTH AMENDMENT
The first foundational constitutional rights threatened by the sheer invasiveness of EM are those protected by the Fourth Amendment. This concern is only heightened as the technology improves and collects increasingly detailed information. Unlike traditional probation or parole supervision, EM often tracks individuals with no geographic limitations or oversight, and can automatically record more than just approximate location information.
Courts have increasingly recognized that this new technology poses greater and more novel threats to our privacy than earlier generations. In Grady v. North Carolina, 575 U.S. 306 (2015), the Supreme Court, relying on United States v. Jones, 565 U.S. 400 (2012), held that attaching a GPS tracking device to a person—even a convicted sex offender—constitutes a Fourth Amendment search and is thus subject to the inquiry of reasonableness. A few years later, the monumental decision in Carpenter v. United States, 138 S. Ct. 2206 (2018), firmly established that Fourth Amendment analysis is affected by the advancement of technology, holding that long-term cell-site location tracking by law enforcement constituted a search requiring a warrant.
As criminal defense attorneys are well aware, the Fourth Amendment’s ostensibly powerful protections are often less effective in practice. Nevertheless, this line of cases still forms a strong foundation for arguing that EM should be subjected to exacting Fourth Amendment scrutiny.
DUE PROCESS
Three key procedural due process challenges that defense attorneys can raise under the Fifth and Fourteenth Amendments are: inadequate hearing, lack of individualized assessment, and failure to consider ability to pay.
First, many courts impose EM without adequate consideration of individual circumstances or less restrictive alternatives. Defense attorneys should demand evidentiary hearings where the government must prove that monitoring is necessary and narrowly tailored. If the defendant is not given notice, a hearing, or the opportunity to object, that could arguably constitute a violation of due process. For example, in the previously mentioned Simon v. San Francisco, the Ninth Circuit found that individuals who were not informed of the details of the city’s pretrial EM program in the presence of counsel had their rights violated.
Second, imposition of EM should be based on an individualized assessment rather than a blanket rule. For pretrial defendants, EM is frequently used as a condition of bail. Although under both federal and state bail frameworks, courts are generally required to impose the least restrictive conditions necessary to ensure the defendant’s court appearance and protect the community, many jurisdictions have included EM as a default condition rather than individually assessing whether EM is appropriate. The Bail Reform Act of 1984, for instance, mandates that release conditions be tailored to the individual’s circumstances. Yet in practice, many jurisdictions impose EM categorically, without specific findings or consideration of alternatives. Defense counsel should challenge this practice by insisting that judges articulate on the record why EM is necessary, supported by evidence related to flight risk or danger. Where clients have stable housing, employment, and no history of noncompliance, EM may be more restrictive than justified.
Lastly, financial burdens associated with EM may also implicate due process where a failure to pay can result in violations and incarceration. In Bearden v. Georgia, 461 U.S. 660 (1983), the Supreme Court held that courts cannot revoke probation for failure to pay fines or restitution without first determining whether the failure was willful. Relying on Bearden, defense attorneys can argue that EM fees imposed on indigent clients amount to unconstitutional punishment for poverty. A growing number of lower courts have agreed, particularly where clients were not given the opportunity to contest their ability to pay. Defense attorneys should request fee waivers, present evidence of indigence, and challenge any EM orders that functionally condition liberty on wealth.
STATE LAW PROTECTIONS
State constitutions and statutes often provide stronger protections than federal constitutional minimums. In addition to state corollaries to the Fourth and Fifth Amendments, some states have also enacted statutes to govern pretrial release and conditions. A number of states have established a presumption in favor of release on recognizance or personal recognizance bonds. In those jurisdictions, the state has to overcome this presumption before the court can impose restrictive conditions like EM. Some states require courts to impose the least restrictive conditions necessary to achieve legitimate purposes, making EM appropriate only when less restrictive alternatives are inadequate.
Most pretrial statutes list specific factors courts must consider, such as community ties, employment history, family responsibilities, nature of the offense, criminal history, and risk of flight or danger to community. Courts that fail to adequately consider these factors or impose generic monitoring conditions may violate statutory requirements.
For example, Illinois's SAFE-T Act includes specific protections against overly restrictive EM conditions, but implementation has been inconsistent. Defense attorneys in Illinois and states with similar laws should challenge monitoring conditions that violate specific statutory requirements.
TECHNOLOGICAL ISSUES
Attorneys should also consider the reliability of EM technology. Devices frequently produce false violations and alerts, particularly in urban areas or buildings where GPS signals are weak. Misleading data can lead to violation hearings and even incarceration. Attorneys should demand access to raw location data, vendor records, and maintenance logs. Expert testimony can help demonstrate technological flaws, human error, or system limitations that cast doubt on the validity of alleged violations.
In some jurisdictions, EM programs are operated by private companies under contracts with probation departments, courts, or sheriffs. These companies profit from fees paid by clients and have minimal oversight. Attorneys should request copies of contracts, training manuals, and policies governing EM use. Discovery may reveal financial incentives, lack of accountability, or systemic issues such as racial or geographic disparities in monitoring. These findings can support broader litigation or class actions, particularly where indigent individuals are jailed for failing to pay private vendors.
Recent research provides compelling evidence that EM fails to achieve its stated purposes while creating significant harms. Studies have not found significant relationships between EM of individuals on pretrial release and their court appearance rates or likelihood of arrest. Nor do they show that law enforcement is employing EM on individuals they would otherwise put in jail.
To the contrary, studies indicate that law enforcement is using EM to surveil and constrain the liberty of those who wouldn't otherwise be detained, as the rise in the number of people placed on EM has not coincided with a decrease in detention. This research demonstrates that EM represents an expansion of government control rather than a true alternative to detention.
Additionally, as described above, EM devices may be rife with technical issues, including communication system failures that prevent proper monitoring and device malfunctions that cause electric shocks. Cutting off ankle bracelets is a common occurrence among users, especially when the technology is malfunctioning or hurting them. Defense attorneys should document all technical issues and argue that unreliable technology cannot form the basis for liberty restrictions or additional criminal charges.
CREATING A RECORD FOR APPEAL
Attorneys should always make sure they are creating a record on which the EM imposition can be appealed, should the initial hearing be unsuccessful. This will require lawyers to include the factual basis for the challenge and preserve the appropriate legal arguments. The modern generation of EM has yet to undergo the extensive judicial review that ankle shackles have been subjected to, making it essential to build an extensive record of the ways in which it is more invasive and harmful, so that it can be properly argued to an appellate court that the nature of the newest EM requires more than a perfunctory application of decades-old precedent. As we saw with Carpenter, the rapid advancement of technology may push the courts to reconsider older paradigms for constitutional analysis and find them wanting. Thus, a comprehensive record is critical to show EM as it is—an extension of incarceration—rather than a benevolent alternative to detention.
Defeating electronic monitoring will require a multidimensional approach that includes litigating constitutional claims, contesting factual assumptions, exposing technological failures, and advocating for systemic reforms. As the carceral state evolves, attorneys must remain vigilant and proactive in defending the rights of their clients.
The EU’s “Encryption Roadmap” Makes Everyone Less Safe
EFF has joined more than 80 civil society organizations, companies, and cybersecurity experts in signing a letter urging the European Commission to change course on its recently announced “Technology Roadmap on Encryption.” The roadmap, part of the EU’s ProtectEU strategy, discusses new ways for law enforcement to access encrypted data. That framing is dangerously flawed.
Let’s be clear: there is no technical “lawful access” to end-to-end encrypted messages that preserves security and privacy. Any attempt to circumvent encryption—like client-side scanning—creates new vulnerabilities, threatening the very people governments claim to protect.
This letter is significant not just for its content, but for who signed it. The breadth of the coalition makes one thing clear: civil society and the global technical community overwhelmingly reject the idea that weakening encryption can coexist with respect for fundamental rights.
Strong encryption is a pillar of cybersecurity, protecting everyone: activists, journalists, everyday web users, and critical infrastructure. Undermining it doesn’t just hurt privacy. It makes everyone’s data more vulnerable and weakens the EU’s ability to defend against cybersecurity threats.
EU officials should scrap any roadmap focused on circumvention and instead invest in stronger, more widespread use of end-to-end encryption. Security and human rights aren’t in conflict. They depend on each other.
You can read the full letter here.
245 Days Without Justice: Laila Soueif’s Hunger Strike and the Fight to Free Alaa Abd el-Fattah
Laila Soueif has now been on hunger strike for 245 days. On Thursday night, she was taken to the hospital once again. Soueif’s hunger strike is a powerful act of protest against the failures of two governments. The Egyptian government continues to deny basic justice by keeping her son, Alaa Abd el-Fattah, behind bars—his only “crime” was sharing a Facebook post about the torture of a fellow detainee. Meanwhile, the British government, despite Alaa’s citizenship, has failed to secure even a single consular visit. Its muted response reflects an unacceptable unwillingness to stand up for the rights of its own citizens.
This is the second time this year that Soueif’s health has collapsed due to her hunger strike. Now, her condition is dire. Her blood sugar is dangerously low, and every day, her family fears it could be her last. Doctors say it’s a miracle she’s still alive.
Her protest is a call for accountability—a demand that both governments uphold the rule of law and protect human rights, not only in rhetoric, but through action.
Late last week, after an 18-month investigation, the United Nations Working Group on Arbitrary Detention (UNWGAD) issued its Opinion on Abd el-Fattah’s case, stating that he is being held unlawfully by the Egyptian government. Egypt’s refusal to grant the United Kingdom consular access to its own citizen further violates Egypt’s obligations under international law.
As stated in a letter to British Prime Minister Keir Starmer by 21 organizations, including EFF, the UK must now use every tool it has at its disposal to ensure that Alaa Abd el-Fattah is released immediately.
CCTV Cambridge: Digital Equity in 2025
EFF has long advocated for affordable, accessible, and future-proof internet access for all. Digital equity, the condition in which everyone has access to technology that allows them to participate in society, is an issue that I’ve been proud to organize around. So, it’s awesome to connect with a group that's doing something to address it in their community.
Recently I got the chance to catch up with Maritza Grooms, Director of Community Relations at EFA member CCTV Cambridge, who told me about the results of their work and the impact it's having on their local community.
How’s your digital inclusion work going, and what have the results been within the community?
CCTV has had a year of transition and change. One of the biggest was establishing the Digital Navigator Pilot Program in collaboration with multiple partners, funded in part by MassHire Metro North Workforce Investment Board through the Mass Broadband Institute. This program has already had a great impact in Cambridge since its official launch in August 2024, serving 492 community members! This program demonstrates the clear need for digital navigator services in Cambridge and beyond. Our community has used this service to get devices that have allowed them to restart their career journeys or go back to school, and take digital literacy classes to gain new skills to help them along the way.
The Electronic Frontier Alliance works to uphold the principles of free expression, information security, privacy, creativity, and access to knowledge. What guides your organization, and how does digital equity tie into it?
CCTV's mission is to nurture a strong, equitable, and diverse community by providing tools and training to foster free speech, civic engagement, access to knowledge, and creative expression. The Digital Navigator program fulfills this mission not only for the community we serve, but in the ripple effects that generate from our community members having the tools to participate in our society. The Digital Navigator Pilot Program aims to bridge the digital divide in Cambridge, specifically supporting BIPOC, immigrant, and low-income communities to enhance economic mobility.
How can people support and plug in to what you’re doing?
We cannot do this alone. It takes a village, from partners in the work like our friends at EFF to supporters alike. We encourage anyone to reach out to maritza@cctvcambridge.org to find out how you can support this program, or visit cctvcambridge.org/support to donate today. Follow us on social media @cctvcambridge!
Thanks again to Maritza for speaking with us. If you're inspired by CCTV Cambridge's work, consider joining a local EFA ally, or bringing your own group into the alliance today!
She Got an Abortion. So A Texas Cop Used 83,000 Cameras to Track Her Down.
In a chilling sign of how far law enforcement surveillance has encroached on personal liberties, 404 Media recently revealed that a sheriff’s office in Texas searched data from more than 83,000 automated license plate reader (ALPR) cameras to track down a woman suspected of self-managing an abortion. The officer searched 6,809 different camera networks maintained by surveillance tech company Flock Safety, including states where abortion access is protected by law, such as Washington and Illinois. The search record listed the reason plainly: “had an abortion, search for female.”
Screenshot of data
After the U.S. Supreme Court’s 2022 Dobbs v. Jackson Women’s Health Organization decision overturned Roe v. Wade, states were given sweeping authority to ban and even criminalize abortion. In Texas—where the officer who conducted this search is based—abortion is now almost entirely banned. But in Washington and Illinois, where many of the searched Flock cameras are located, abortion remains legal and protected as a fundamental right up to fetal viability.
The post-Dobbs legal landscape has also opened the door for law enforcement to exploit virtually any form of data—license plates, phone records, geolocation data—to pursue individuals across state lines. EFF’s Atlas of Surveillance has documented more than 1,800 agencies that have deployed ALPRs, but at least 4,000 agencies are able to run searches through other agencies in Flock's network. Many agencies share the data freely with other agencies across the country, with little oversight, restriction, or even standards for accessing data.
While this particular data point explicitly mentioned an abortion, scores of others in the audit logs released through public records requests simply list "investigation" as the reason for the plate search, with no indication of the alleged offense. That means other searches targeting someone for abortion, or another protected right in that jurisdiction, could be effectively invisible.
This case underscores our growing concern: that the mass surveillance infrastructure—originally sold as a tool to find stolen cars or missing persons—is now being used to target people seeking reproductive healthcare. This unchecked, warrantless access that allows law enforcement to surveil across state lines blurs the line between “protection” and persecution.
From Missing Cars to Monitoring Bodies

EFF has long warned about the dangers of ALPRs, which scan license plates, log time and location data, and build a detailed picture of people's movements. Companies like Flock Safety and Motorola Solutions offer law enforcement agencies access to nationwide databases of these readers, and in some cases allow them to stake out locations like abortion clinics or create “hot lists” of license plates to track in real time. Flock's technology also allows officers to search for a vehicle based on attributes like color, make, and model, even without a plate number.
The threat is compounded by how investigations often begin. A report published by If/When/How on the criminalization of self-managed abortion found that about a quarter of adult cases (26%) were reported to law enforcement by acquaintances entrusted with information, such as “friends, parents, or intimate partners,” and another 18% through “other” means. This means that with ALPR tech, a tip from anyone can instantly escalate into a nationwide manhunt. And as Kate Bertash of the Digital Defense Fund explained to 404 Media, anti-abortion activists have long been documenting the plates of patients and providers who visit reproductive health facilities—data that can now be easily cross-referenced with ALPR databases.
The 404 Media report proves that this isn’t a hypothetical concern. In 2023, a months-long EFF investigation involving hundreds of public records requests uncovered that many California police departments were sharing records containing detailed driving profiles of local residents with out-of-state agencies, despite state laws explicitly prohibiting this. This means that even in so-called “safe” states, your data might end up helping law enforcement in Texas or Idaho prosecute you—or your doctor.
That’s why we demanded that 75 California police departments stop sharing ALPR data with anti-abortion states, an effort that has largely been successful.
Surveillance and Reproductive Freedom Cannot Coexist

We’ve said it before, and we’ll say it again: Lawmakers who support reproductive rights must recognize that abortion access and mass surveillance are incompatible.
The systems built to track stolen cars and issue parking tickets have become tools to enforce the most personal and politically charged laws in the country. What began as a local concern over privacy has escalated into a national civil liberties crisis.
Yesterday’s license plate readers have morphed into today’s reproductive dragnet. Now, it’s time for decisive action. Our leaders must roll back the dangerous surveillance systems they've enabled. We must enact strong, enforceable state laws to limit data sharing, ensure proper oversight, and dismantle these surveillance pipelines before they become the new normal, or simply eliminate these systems altogether.
California’s Cities and Counties Must Step Up Their Privacy Game. A.B. 1337 Can Do That.
“The right to privacy is being threatened by the indiscriminate collection, maintenance, and dissemination of personal information and the lack of effective laws and legal remedies,” some astute California lawmakers once wrote. “The increasing use of computers and other sophisticated information technology has greatly magnified the potential risk to individual privacy that can occur from the maintenance of personal information.”
Sound familiar? These words may sound like a recent pushback on programs that want to slurp up the information sitting in ever-swelling government databases. But they’re not. They come from a nearly 50-year-old California law.
The “Information Practices Act of 1977”—or the IPA for short—is a foundational state privacy law and one of several privacy laws directly responding to the Watergate scandal, such as the federal Privacy Act of 1974 and California’s own state constitutional right to privacy.
Now, as we confront a new era of digital surveillance and face our own wave of concern about government demands for data, it's time to revisit and update the IPA.
The IPA puts a check on government use of personal information by establishing guardrails for how state agencies maintain, collect, and disseminate data. It also gives people the right to access and correct their information.
While the need for the law has not changed, the rest of the world has. In particular, since the IPA passed in 1977, far more data collection now happens at the county and city level. Yet local and county government entities have no standard privacy protections in the state of California. And those entities have troves of data, whether it’s health data collected through vaccine programs or records held by county-administered food programs.
As demand for this type of local data grows, we need to tap back into the energy of the ‘70s. It’s time to update the IPA so it can respond to the world we live in today. That’s why EFF is proud to co-sponsor A.B. 1337, authored by Assemblymember Chris Ward (D-San Diego), with our close friends at Oakland Privacy.
Specifically, A.B. 1337, also known as the IPA Reform Act:
- Expands the definition of covered entities in the IPA to include local agencies, offices, departments and divisions.
- Prevents information collected from being used for unintended or secondary purposes without consent.
- Makes harmful negligent and improper release of personal information punishable as a misdemeanor.
- Requires that IPA disclosure records be kept for three years and cannot be destroyed prior to that period.
- Aligns the definition of personal information and sensitive personal information with the California Privacy Rights Act to include location data, online browsing records, IP addresses, citizenship status, and genetic information.
Privacy is foundational to trust in government. That’s part of the lesson we learned from the 1970s. (And trust in government is lower today than it was then.)
We need to be confident that the government is respecting our personal information and our privacy. More than ever, California residents face imminent danger of being targeted, persecuted, or prosecuted for seeking reproductive healthcare, their immigration status, practicing a particular religion, being of a particular race, gender identity, or sexual orientation—or simply for exercising their First Amendment rights.
California is a national leader on consumer privacy protections, having passed a landmark comprehensive privacy law and established the nation’s first state privacy agency. Now, its local governments must catch up.
We cannot afford to wait for these protections any longer. Passing A.B. 1337 is good governance, good policy, and just good sense. If you’re a California resident, tell your Assemblymember to support the bill today.
The Insidious Effort to Privatize Public Airwaves | EFFector 37.5
School is almost out for summer! You know what that means? Plenty of time to catch up on the latest digital rights news! Don't worry, though—EFF has you covered with our EFFector newsletter.
This edition of EFFector explains why efforts to privatize public airwaves would harm American TV viewers; goes over how KOSA is still a very bad censorship bill, especially for young people; and covers how Signal, WhatsApp, and other encrypted chat apps back up your conversations.
You can read the full newsletter here, and even get future editions directly to your inbox when you subscribe! Additionally, we've got an audio edition of EFFector on the Internet Archive, or you can view it by clicking the button below:
EFFECTOR 37.5 - The Insidious Effort to Privatize Public Airwaves
Since 1990 EFF has published EFFector to help keep readers on the bleeding edge of their digital rights. We know that the intersection of technology, civil liberties, human rights, and the law can be complicated, so EFFector is a great way to stay on top of things. The newsletter is chock full of links to updates, announcements, blog posts, and other stories to help keep readers—and listeners—up to date on the movement to protect online privacy and free expression.
Thank you to the supporters around the world who make our work possible! If you're not a member yet, join EFF today to help us fight for a brighter digital future.
Podcast Episode: Love the Internet Before You Hate On It
There’s a weird belief out there that tech critics hate technology. But do movie critics hate movies? Do food critics hate food? No! The most effective, insightful critics do what they do because they love something so deeply that they want to see it made even better. The most effective tech critics have had transformative, positive online experiences, and now unflinchingly call out the surveilled, commodified, enshittified landscape that exists today because they know there has been – and still can be – something better.
(You can also find this episode on the Internet Archive and on YouTube.)
That’s what drives Molly White’s work. Her criticism of the cryptocurrency and technology industries stems from her conviction that technology should serve human needs rather than mere profits. Whether it’s blockchain or artificial intelligence, she’s interested in making sure the “next big thing” lives up to its hype, and more importantly, to the ideals of participation and democratization that she experienced. She joins EFF’s Cindy Cohn and Jason Kelley to discuss working toward a human-centered internet that gives everyone a sense of control and interaction – open to all in the way that Wikipedia was (and still is) for her and so many others: not just as a static knowledge resource, but as something in which we can all participate.
In this episode you’ll learn about:
- Why blockchain technology has built-in incentives for grift and speculation that overwhelm most of its positive uses
- How protecting open-source developers from legal overreach, including in the blockchain world, remains critical
- The vast difference between decentralization of power and decentralization of compute
- How Neopets and Wikipedia represent core internet values of community, collaboration, and creativity
- Why Wikipedia has been resilient against some of the rhetorical attacks that have bogged down media outlets, but remains vulnerable to certain economic and political pressures
- How the Fediverse and other decentralization and interoperability mechanisms provide hope for the kind of creative independence, self-expression, and social interactivity that everyone deserves
Molly White is a researcher, software engineer, and writer who focuses on the cryptocurrency industry, blockchains, web3, and other tech in her independent publication, Citation Needed. She also runs the websites Web3 is Going Just Great, where she highlights examples of how cryptocurrencies, web3 projects, and the industry surrounding them are failing to live up to their promises, and Follow the Crypto, where she tracks cryptocurrency industry spending in U.S. elections. She has volunteered for more than 15 years with Wikipedia, where she serves as an administrator (under the name GorillaWarfare) and functionary, and previously served three terms on the Arbitration Committee. She’s regularly quoted or bylined in news media; speaks at major conferences including South by Southwest and Web Summit; guest lectures at universities including Harvard, MIT, and Stanford; and advises policymakers and regulators around the world.
Resources:
- XOXO Festival, Portland, OR: "Molly White: Fighting For Our Web" (Aug. 24, 2024)
- This Next Thing 2024, Pontresina, Switzerland: “Molly White: Magic, Creativity, and Meaning” (June 2024)
- EFF: Blockchain
- EFF: “Decentralization Reaches a Turning Point: 2024 in Review” (Jan. 1, 2025)
- Neopets
- Wikipedia
What do you think of “How to Fix the Internet?” Share your feedback here.
Transcript

MOLLY WHITE: I was very young when I started editing Wikipedia. I was like 12 years old, and when it said the encyclopedia that anyone can edit, “anyone” means me, and so I just sort of started contributing to articles and quickly discovered that there was this whole world behind Wikipedia that a lot of us really don't see, where very passionate people are contributing to creating a repository of knowledge that anyone can access.
And I thought, I immediately was like, that's brilliant, that's amazing. And you know that motivation has really stuck with me since then, just sort of the belief in open knowledge and open access I think has, you know, it was very early for me to be introduced to those things and I, I sort of stuck with it, because it became, I think, such a formative part of my life.
CINDY COHN: That’s Molly White talking about a moment that is hopefully relatable to lots of folks who think critically about technology – that moment when you first experienced how, sometimes, the internet can feel like magic.
I'm Cindy Cohn, the Executive Director of the Electronic Frontier Foundation.
JASON KELLEY: And I'm Jason Kelley, EFF’s Activism Director. This is our podcast, How to Fix the Internet.
CINDY COHN: The idea behind this show is that we're trying to make our digital lives BETTER. A big part of our job at EFF is to envision the ways things can go wrong online-- and jumping into action to help when things then DO go wrong.
But this show is about optimism, hope and solutions – we want to share visions of what it looks like when we get it right.
JASON KELLEY: Our guest today is Molly White. She’s a journalist and web engineer, and one of the strongest voices thinking and speaking critically about technology. Specifically, she’s been an essential voice on cryptocurrency and what people often call Web3, usually a reference to blockchain technologies. She runs an independent online newsletter called Citation Needed, and at her somewhat sarcastically named website “Web3 is going just great” she chronicles the latest, often alarming news, frequently involving scams and schemes that make those of us working to improve the internet pull our hair out.
CINDY COHN: But she’s not a pessimist. She comes from a deep love of the internet, and she holds the people building our digital world to account: clear-eyed about where things are going wrong, but also about the potential that exists if we can do it right. Welcome, Molly. Thanks for being here.
MOLLY WHITE: Thank you for having me.
CINDY COHN: So the theme of our show is what does it look like if we start to get things right in the digital world? Now you recognize, I believe, the value of blockchain technologies, what they could be.
But you bemoan how far we are from that right now. So let's start there. What does the world look like if we start to use the blockchain in a way that really lives up to its potential for making things better online?
MOLLY WHITE: I think that a lot of the early discussions about the potential of the blockchain were very starry-eyed and sort of utopian. Much in the way that early discussions of the internet were that way. You know, they promised that blockchains would somehow democratize everything we do on the internet, you know, make it more available to anyone who wanted to participate.
It would provide financial rails that were more equitable and had fewer rent seekers and intermediaries taking fees along the way. A lot of it was very compelling.
But I think as we've seen the blockchain industry, such as it is now, develop, we've seen that this technology brings with it a set of incentives that are incredibly challenging to grapple with. And it's made me wonder, honestly, whether blockchains can ever live up to the potential that they originally claimed, because those incentives have seemed to be so destructive that much of the time any promises of the technology are completely overshadowed by the negatives.
CINDY COHN: Yeah. So let's talk a little bit about those incentives, 'cause I think about that a lot as well. Where do you see those incentives popping up and what are they?
MOLLY WHITE: Well, any public blockchain has a token associated with it, which is the cryptocurrency that people are trading around, speculating on, you know, purchasing in hopes that the number will go up and they will make a profit. And in order to maintain the blockchain, you know, the actual system of records that is storing information or providing the foundation for some web platform, you need that cryptocurrency token.
But it means that whatever you're trying to do with the blockchain also has this auxiliary factor to it, which is the speculation on the cryptocurrency token.
And so time and time again, watching this industry and following projects, claiming that they will do wonderful, amazing things and use blockchains to accomplish those things, I've seen the goals of the projects get completely warped by the speculation on the token. And often the project's goals become overshadowed by attempts to just pump the price of the token, in often very inauthentic ways or in ways that are completely misaligned with the goals of the project. And that happens over and over and over again in the blockchain world.
JASON KELLEY: Have you seen that not happen with any project? Is there any project that you've said, wow, this is actually going well. It's like a perfect use of this technology, or because you focus on sort of the problems, is that just not something you've come across?
MOLLY WHITE: I think where things work well is when those incentives are perfectly aligned, which is to say that if there are projects that are solely focused on speculation, then the blockchain speculation works very well. Um, you know, and so we see people speculating on Bitcoin, for example, and, and they're not hoping necessarily that the Bitcoin ledger itself will do anything.
The same is true with meme coins. People are speculating on these tokens that have no purpose behind them besides, you know, hoping that the price will go up. And in that case, you know, people sort of know what they're getting into and it can be lucrative for some people. And for the majority of people it's not, but you know, they sort of understand that going into it, or at least you would hope that they do.
CINDY COHN: I think of the blockchain as, you know, when they say this'll go down on your permanent record, this is the permanent record.
MOLLY WHITE: That’s usually a threat.
CINDY COHN: Yeah.
MOLLY WHITE: I try to point that out as well.
CINDY COHN: Now, you know, look, to be clear, we work with people who do international human rights work. Saving the records before a population gets destroyed, in a way that can't be destroyed by the people in power, is one of the kind of classic things where you want a secure, permanent place to store stuff. And so there's, you know, there's that piece. So where do you point people to when you're thinking about like, okay, what if you want a real permanent record, but you don't want all the dreck of the cryptocurrency blockchain world?
MOLLY WHITE: Well, it really depends on the project. And I really try to emphasize that because I think a lot of people in the tech world come at things somewhat backwards, especially when there is a lot of hype around a technology in the way that there was with blockchains and especially Web3.
And we saw a lot of people essentially saying, I wanna do something with a blockchain. Let me go find some problem I can solve using a blockchain, which is completely backwards to how most technologists are used to addressing problems, right? They're faced with a problem. They consider possible ways to solve it, and then they try to identify a technology that is best suited to solving that problem.
And so, you know, I try to encourage people to reverse the thinking back to the normal way of doing things where, sure, you might not get the marketing boosts that Web3 once brought in. And, you know, it certainly it was useful to attract investors for a while to have that attached to your project, but you will likely end up with a more sustainable product at the end of the day because you'll have something that works and is using technology that is well suited to the problem. And so, you know, when it comes to where would I direct people other than blockchains, it very much depends on their problem and, and the problem that they're trying to solve.
For example, if you don't need to worry about having a, a system that is maintained by a group of people who don't trust each other, which is the blockchain’s sort of stated purpose, then there are any number of databases that you can use that work in the more traditional manner where you rely on perhaps a group of trusted participants or a system like that.
If you're looking for a more distributed or decentralized solution, there are peer-to-peer technologies that are not blockchain based that allow this type of content sharing. And so, you know, like I said, it really just depends on the use case more than anything.
JASON KELLEY: Since you brought up decentralization, this is something we talk about a lot at EFF in different contexts, and I think a lot of people saw blockchain and heard decentralized and said, that sounds good.
We want less centralized power. But where do you see things like decentralization actually helping if this kind of Web3 tech isn't a place where it's necessarily useful or where the technology itself doesn't really solve a lot of the problems that people have said it would.
MOLLY WHITE: I think one of the biggest challenges with blockchains and decentralization is that when a lot of people talk about decentralization, they're talking about the decentralization of power, as you've just mentioned, and in the blockchain world, they're often talking about the decentralization of compute, which is not necessarily the same thing, and in some cases is very much different.
JASON KELLEY: If you can do a rug pull, it's not necessarily decentralized. Right?
MOLLY WHITE: Right. Or if you're running a blockchain and you're saying it's decentralized, but you run all of the validators or the miners for that blockchain, then you, you know, the computers may be physically located all over the world, or, you know, decentralized in that sort of sense. But you control all the power and so you do not have a truly decentralized system in that manner of speaking.
And I think a lot of marketing in the crypto world sort of relied on people not considering the difference between those two things, because there are a lot of crypto projects that, you know, use all of the buzzwords around decentralization and democratization and, you know, that type of thing, that are very, very centralized, very similar to the traditional tech companies where, you know, all of Facebook's servers may be located physically all around the world, but no one's under the impression that Facebook is a decentralized company. Right? And so I think that's really important to remember, is that there's nothing about blockchain technology specifically that requires a blockchain project to be decentralized in terms of power.
It still requires very intentional decision making on the parts of the people who are running that project to decentralize the power and reduce the degree to which any one entity can control the network. And so I think that there is this issue where people sort of see blockchains and they think decentralized, and in reality you have to dig a lot deeper.
CINDY COHN: Yeah, EFF has participated in a couple of the sanctions cases and the prosecutions of people who have developed pieces of the blockchain world, especially around mixers. TornadoCash is one that we participated in, and I think this is an area where we have a kind of similar view about the role of the open source community and kind of the average coder, and when their responsibility should create liability and when they should be protected from liability.
And we've tried to continue to weigh in on these cases to make sure the courts don't overstep, right? Because the prosecution gets so mad. You're talking about a lot of money laundering and, and things like that, that the, you know, the prosecution just wants to throw the book at everybody who was ever involved in these kinds of things and trying to create this space where, you know, a coder who just participates in a GitHub developing some piece of code doesn't have a liability risk.
And I think you've thought about this as well, and I'm wondering, do you see the government overstepping and do you think it's right to continue to think about that, that overstepping?
MOLLY WHITE: Yeah, I mean, I think those are the types of questions that are really important when it comes to tackling problems around blockchains and cryptocurrencies and the financial systems that are developing around these products.
You have to be really cautious that, you know, just because a bad thing is happening, you don't come in with a hammer that is, you know, much too big and start swinging it around and hitting sort of anyone in the vicinity because, you know, I think there are some things that should absolutely be protected, like, you know, writing software, for example.
A person who writes software should not necessarily be liable for everything that another person then goes and does with that software. And I think that's something that's been fairly well established through, you know, cryptography cases, for example, where people writing encryption algorithms and software to do strong encryption should not be considered liable for whatever anyone encrypts with that technology. We've seen it with virus writers, you know, it would be incredibly challenging for computer scientists to research and sort of think about new viruses and protect against vulnerabilities if they were not allowed to write that software.
But, you know, if they're not going and deploying this virus on the world or using it to, you know, do a ransomware attack, then they probably shouldn't be held liable for it. And so similar questions are coming up in these cryptocurrency cases or these cases around cryptocurrency mixers that are allowing people to anonymize their transactions in the crypto world more adequately.
And certainly that is heavily used in money laundering and in other criminal activities that are using cryptocurrencies. But simply writing the software to perform that anonymization is not itself, I think, a crime. Especially when there are many reasons you might want to anonymize your financial transactions that are otherwise publicly visible to anyone who wishes to see them, and, you know, can be linked to you if you are not cautious about your cryptocurrency addresses or if you publish them yourself.
And so, you know, I've tried to speak out about that a little bit because I think a lot of people see me as, you know, a critic of the cryptocurrency world and the blockchain world, and I think it should be banned or that anyone trading crypto should be put in jail or something like that, which is a very extreme interpretation of my beliefs and is, you know, absolutely not what I believe. I think that, you know, software engineers should be free to write software and then if someone takes that software and commits a crime with it, you know, that is where law enforcement should begin to investigate. Not at the, you know, the software developer's computer.
CINDY COHN: Yeah, I just think that's a really important point. I think it's easy, especially because there's so much fraud and scam and abuse in this space, to try to make sure that we're paying attention to where are we setting the liability rules because even if you don't like cryptocurrency or any of those kinds of things, like protecting anonymity is really important.
It's kind of a function of our times right now where people are either all one or all the other. And I really have appreciated, as you've kind of gone through this, thinking about a position that protects the things that we need to protect, even if we don't care about 'em in this context, because we might in another, and law of course, is kind of made up of things that get set in one context and then applied in another, while at the same time being, you know, kind of no holds barred, critical of the awful stuff that's going on in this world.
JASON KELLEY: Let’s take a quick moment to say thank you to our sponsor.
“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
And now back to our conversation with Molly White
JASON KELLEY: Some of the technologies you're talking about when sort of separated out from, maybe, the hype or the negatives that have like, overtaken the story. Things like peer-to-peer file sharing, cryptography. I mean, even, let's say, being able to send money to someone, you know, with your phone, if you want to call it that, are pretty incredible at some level, you know?
And you gave a talk in October that was about a time that you felt like the web was magic, and you brought up a website that I'm gonna pretend that I've never heard of, so you can explain it to me, called Neopets. And I just wanna, for the listeners, could you explain a little bit about what Neopets was and sort of how it helped inform you about the way you want the web to work, and things like that?
MOLLY WHITE: Yeah, so Neopets was a kids game. When I was a kid, you could adopt these little cartoon pets and you could like feed them and change their colors and do things, you know, play little games with them.
JASON KELLEY: Like Tamagotchis a little bit,
MOLLY WHITE: a little bit. Yeah. Yeah. There was also this aspect to the website where you could edit your user page and you could create little webpages in your account. It was pretty freewheeling, you know, you could edit the CSS and the HTML and you could make your own little website, essentially. And as a kid, that was really my first exposure to the idea that the internet and these websites that I was seeing, you know, sort of for the first time were not necessarily a read-only operation. You know, I was used to playing maybe little games on the internet, whatever kids were doing on the internet at the time.
And Neopets was really my first realization that I could add things to the internet or change the way they looked or interact with it in a way that was, you know, very participatory. And that later sort of turned into editing Wikipedia and then writing software and then publishing my writing on the web.
And that was really magical for me because it sort of informed me about the platform that was in front of me and how powerful it was to be able to, you know, edit something, create something, and then the whole world could see it.
JASON KELLEY: There's a really common critique right now that young people are sort of learning only bad things online, or, like, only overusing the internet. And I mean, first of all, I think that's obviously not true. You know, every circumstance is different. But do you see places where the way you experienced the internet growing up is still happening for young people?
MOLLY WHITE: Yeah, I mean, as you mentioned, I think a lot of those critiques are very misguided, and they miss a lot of the incredibly powerful and positive aspects of the internet. I mean, the fact that you can go look something up and learn something new in half a second is revolutionary. But then I think there are participatory examples, much like what I was experiencing when I was younger. You know, people can still edit Wikipedia the way that I was doing as a kid. That is a very powerful thing to do when you're young, to realize that knowledge is not this thing that is handed down from on high from some faceless expert who wrote history, but it's actually something that people are contributing to and improving constantly. And it can always be updated and improved and edited and shared, you know, in this sort of free and open way. I think that is incredibly powerful, and is still open to people of any age who are, you know, able to access the site.
JASON KELLEY: I think it's really important to bring up some of these examples because something I've been thinking about a lot lately as these critiques and attacks on young people using the internet have sort of grown and even, you know, entered the state and congressional level in terms of bills, is that a lot of the people making these critiques, I feel like never liked the internet to begin with. They don't see it as magic in the way that I think you do and that, you know, a lot of our listeners do.
And it's a view that is a problem specifically because I feel like you have to have loved the internet before you can hate it. You know, you need to really be able to critique the specific things rather than just sort of throw out the whole thing. And one of the things, you know, I like about the work that you do is that you clearly have this love for technology and for the internet, and that lets you, I think, find the problems. And other people can't see through to those specific individual issues, and so they just wanna toss the whole thing.
MOLLY WHITE: I think that's really true. You know, there is this weird belief, especially around tech critics, that tech critics hate technology. It's so divorced from reality, because you don't see that in other worlds, where, you know, art critics are never told that they just hate all art. I think most people understand that art critics love art, and that's why they are critics.
But with technology critics, there's sort of this weird, you know, this perception of us as people who just hate technology, who wanna tear it all down. When in reality, you know, I know a lot of tech critics, and most of us, if not all of us, that I can think of come from a, you know, a background of loving technology, often from a very young age. And it is because of that love, and the desire to see technology continue to allow people to have those transformative experiences, that we criticize it.
And that's, for some reason, just a hard thing, I think for some people to wrap their minds around.
JASON KELLEY: I want to talk a little bit more about Wikipedia. The whole sort of organization, and what it stands for and what it does, has been under attack a lot lately as well. Again, I think a lot of that is people misunderstanding how it works, and, you know, maybe finding some realistic critiques of the fact that, you know, it's individually edited, so there's going to be some bias in some places and things like that, and then extrapolating a good-faith argument out to this other place. So I'm wondering if you've thought about how to protect Wikipedia, how to talk about it, how your experience with it has made you understand how it works better than most people do.
And also just generally, you know how it can be used as a model for the way that the internet should be or the way we can build a better internet.
MOLLY WHITE: I think this ties back a little bit to the decentralization topic where Wikipedia is not decentralized in the sense that, you know, there is one company or one nonprofit organization that controls all the servers. And so there is this sort of centralization of power in that sense. But it is very decentralized in the editing world where there is no editorial board that is vetting every edit to the site.
There are, you know, numerous editors that contribute to any one article and no one person has the final say. There are different language versions of Wikipedia that all operate somewhat independently. And because of that, I think it has been challenging for people to attack it successfully.
Certainly there has been no shortage of those attacks. Um, but you know, it's not a company that someone could buy out and take over in ways that we've seen, for example, Elon Musk do with Twitter. There is no sort of editorial board that can be targeted and pressured to change the language on the site. And, you know, I think that has helped to make Wikipedia somewhat resilient in ways that we've seen news organizations or other media publications struggle with recently, where, you know, they have faced pressure from their buyers, the, you know, the people who own those organizations, to be sure.
They've faced, you know, threats from the government in some cases. And Wikipedia is structured somewhat differently, in ways that I think help it remain more protected from those types of attacks. But, you know, I am cautious to note that, you know, there are still vulnerabilities.
The attacks on Wikipedia need to be vociferously opposed. And so we have to be very cautious about this, because the incredible resource that Wikipedia is, is something that doesn't just sort of happen in a vacuum, you know, outside of any individual's actions.
It requires constant support, constant participation, constant editing. And so, you know, it's certainly a model to look to in terms of how communities can organize and contribute to, um, you know, projects on the internet. But it's also something that has to be very carefully maintained.
CINDY COHN: Yeah, I mean, this is just a lesson for our times, right? You know, there isn't a magical tech that can protect against all attacks. And there isn't a magical, you know, nonprofit 501(c)(3) that can be resistant against all the attacks. And we're in a time where they're coming fast and furious against our friends at Wikimedia, along with a lot of other things.
And I think the onus is on communities to show up and, you know, not just let these things slide, or think that, you know, the internet might be magic in some ways, but it's not magic in these ways. Like, we have to show up and fight for them. Um, I wanted to ask you a little bit about, um, kind of big tech's embrace of AI.
Um, you've been critical of it. We've been critical of it as well in many ways. And I wanna hear kind of your concerns about it, and kind of how you see AI's, you know, role in a better world. But, you know, also think about the ways in which it's not working all that well right now.
MOLLY WHITE: I generally don't have this sort of hard and fast view of AI is good or AI is bad, but it really comes down to how that technology is being used. And I think the widespread use of AI in ways that exploit workers and creatives, and those who have decided to publish something online, for example, and did not expect for that publication to be used by big tech companies that are then profiting off of it, that is incredibly concerning. Um, as well as the ways that AI is marketed to people. Again, this sort of mirrors my criticism surrounding the crypto industry, but a lot of the marketing around AI is incredibly misleading. Um, you know, they're making promises that are not borne out in reality.
They are selling people a product that will lie to you, you know, that will tell you things that are inaccurate. So I have a lot of concerns around AI, especially as we've seen it being used at the broadest scale, and sort of by the largest companies. But you know, I also acknowledge that there are ways in which some of this technology has been incredibly useful. And so, you know, it is one of these things where it has to be viewed with nuance, I think, around the ways it's being developed, the ways it's being deployed, the ways it's being marketed.
CINDY COHN: Yeah, there is a kinda eerie familiarity between the hype around AI and the hype around crypto. It's just kind of weird. It feels like we're going through, you know, Groundhog Day. Like we're living through another hype cycle that feels like the last. I think, you know, for us at EFF, we've tried to focus a lot on governmental use of AI systems, and AI systems that are trying to predict human behavior, right?
The digital equivalent of phrenology, right? You know, let us do sentiment analysis on the things that you said, and that'll tell us whether you're about to be a criminal or, you know, the right person for the job. I think those are the places that we've really identified as, you know, problematic on a number of levels. You know, number one, it doesn't work nearly as well as,
MOLLY WHITE: That is a major problem!
CINDY COHN: It seems like that ought to be number one, right? And this, you know, especially spending your time in Wikipedia, where you're really working hard to get it right, right? And you see the kind of back and forth of the conversation. But the central thing about Wikipedia is it's trying to actually give you truthful information. And then watching the world get washed over with these AI assistants that are really not at all focused on getting it right, you know, or really focused on predicting the next word, or however that works, right? Like, um, it's gotta be kind of strange from where you sit, I suspect, to see this.
MOLLY WHITE: Yeah, it's, it's very frustrating. And, you know, I, I like to think we lived in a world at one time where people wanted to produce technology that helped people, technology that was accurate, technology that worked in the ways that they said it did. And it's been very weird to watch, especially over the last few years that sort of, uh, those goals degrade where, well, maybe it's okay if it gets things wrong a lot, you know, or maybe it's okay if it doesn't work the way that we've said it does or maybe never possibly can.
That's really frustrating to watch as someone who, again, loves technology and loves the possibilities of technology to then see people just sort of using technology to, to deliver things that are, you know, making things worse for people in many ways.
CINDY COHN: Yeah, so I wanna flip it around a little bit. You, like EFF, we kind of sometimes spend a lot of time in all the ways that things are broken, and how do you think about how to get to a place where things are not broken, or how do you even just keep focusing on a better place that we could get to?
MOLLY WHITE: Well, like I said, you know, a lot of my criticism really comes down to the industries and the sort of exploitative practices of a lot of these companies in the tech world. And so, to the extent possible, separating myself from those companies and from their control has been really powerful, to sort of regain some of that independence that I remember the web once enabling, where, you know, if you had your own website, you could kind of do anything you wanted. And you didn't have to stay within the 280 characters if you had an idea, you know, and you could publish, uh, you know, a video that was longer than 10 minutes long, or whatever it might be.
So sort of returning to some of those ideals around creating my own spaces on the web where I have that level of creative freedom, and certainly freedom in other ways, has been very powerful. And then finding communities of people who believe in those same things. I've taken a lot of hope in the Fediverse and the communities that have emerged around those types of technologies and projects where, you know, they're saying maybe there is an alternative out there to, you know, highly centralized big tech, social media being what everyone thinks of as the web. Maybe we could create different spaces outside of that walled garden where we all have control over what we do and say, and who we interact with. And we set the terms on which we interact with people.
And sort of push away the, the belief that, you know, a tech company needs to control an algorithm to show you what it is that you want to see, when in reality, maybe you could make those decisions for yourself or choose the algorithm or, you know, design a system for yourself using the technologies that are available to everyone, but have been sort of walled in by a large or many of the large players in the web these days.
CINDY COHN: Thank you, Molly. Thank you very much for coming on and, and spending your time with us. We really appreciate the work that you're doing, um, and, and the way that you're able to boil down some pretty complicated situations into, you know, kind of smart and thoughtful ways of reflecting on them. So thank you.
MOLLY WHITE: Yeah. Thank you.
JASON KELLEY: It was really nice to talk to someone who has that enthusiasm for the internet. You know, I think all of our guests probably do, but when we brought up Neopets, that excitement was palpable, and I hope we can find a way to get more of that enthusiasm back.
That's one of the things I'm taking away from that conversation: more people need to be enthusiastic about using the internet, whatever that takes. What did you take away from chatting with Molly that we need to do differently, Cindy?
CINDY COHN: Well, I think that the thing that made the enthusiasm pop in her voice was the idea that she could control things. That she was participating, not only in Neopets but on Wikipedia as well, right?
That she could be part of trying to make truth available to people, and recognizing that truth in some ways isn't an endpoint, it's an evolving conversation among people trying to keep getting it right.
That doesn't mean there isn't any truth, but it does mean that there is an open community of people who are working towards that end. And, you know, you hear that enthusiasm as well. It's, you know, the more you put in, the more you get out of the internet. And trying to make that a more common experience of the internet, that things aren't just handed to you or taught to you, but really it's a two-way street, that's where the enthusiasm came from for her, and I suspect for a lot of other people.
JASON KELLEY: Yeah, and what you're saying about truth, I think she sort of applies this in a lot of different ways. Even specific technologies, I think most people realize this, but you have to say it again and again, aren't necessarily right or wrong for everything. You know, AI isn't right or wrong for every scenario. It's sort of, things are always evolving. How we use them is evolving. Whether or not something is correct today doesn't mean it will be correct tomorrow. And there's just a sort of nuance and awareness that she had to how these different things work and when they make sense that I hope we can continue to see in more people instead of just a sort of, uh, flat across the board dislike of, you know, quote unquote the internet or quote unquote social media and things like that.
CINDY COHN: Yeah, or the other way around, like whatever it is, there's a hype cycle and it's just hyped over and over again. And that she's really charting a middle ground in the way she writes and talks about these things that I think is really important. I think the other thing I really liked was her framing of decentralization as thinking about decentralizing power, not decentralizing compute, and that difference being something that is often elided or not made clear.
But that can really help us see where, you know, where decentralization is happening in a way that's empowering people, making things better. You have to look for decentralized power, not just decentralized compute. I thought that was a really wise observation.
JASON KELLEY: And I think that could be applied to so many other things, where a term like decentralized may be used because it's accessible from everywhere, or something like that, right? And it's just, these terms have to be examined. And it sort of goes to her point about marketing, you know, you can't necessarily trust the way the newest fad is being described by its purveyors.
You have to really understand what it's doing at the deeper level, and that's the only way you can really determine if it's really decentralized, if it's really interoperable, if it's really, you know, whatever the new thing is. AI…
CINDY COHN: Mm-hmm. Yeah, I think that's right. And you know, luckily for us, we have Molly, who digs deep into the details of this for so many technologies. And I think we need to, you know, support and defend all the people who are doing that kind of careful work for us, because we can't do all of it, you know, we're humans.
But having people who will do that for us in different places, who are trusted and, you know, whose agenda is clear and understandable, that's kind of the best we can hope for. And the more of that we build and support and create spaces for on the, you know, uncontrolled open web, as opposed to the controlled tech giants and walled gardens, as she said, I think the better off we'll be.
JASON KELLEY: Thanks for joining us for this episode of How to Fix the Internet.
If you have feedback or suggestions, we'd love to hear from you. Visit EFF dot org slash podcast and click on listener feedback. While you're there, you can become a member, donate, maybe even pick up some merch and just see what's happening in digital rights this week and every week.
Our theme music is by Nat Keefe of BeatMower with Reed Mathis.
And How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.
We’ll see you next time.
I’m Jason Kelley…
CINDY COHN: And I’m Cindy Cohn.
MUSIC CREDITS: This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons 3.0 Unported by its creators: Drops of H2O (The Filtered Water Treatment) by J. Lang. Additional beds by Gaetan Harris.
Please Drone Responsibly: C-UAS Legislation Needs Civil Liberties Safeguards
Today, the Senate Judiciary Committee is holding a hearing titled “Defending Against Drones: Setting Safeguards for Counter Unmanned Aircraft Systems Authorities.” While the government has a legitimate interest in monitoring and mitigating drone threats, it is critical that those powers are narrowly tailored. Robust checks and oversight mechanisms must exist to prevent misuse and to allow ordinary, law-abiding individuals to exercise their rights.
Unfortunately, as we and many other civil society advocates have highlighted, past proposals have not addressed those needs. Congress should produce well-balanced rules that address all these priorities, not grant de facto authority to law enforcement to take down drone flights whenever they want. Ultimately, Congress must decide whether drones will be a technology that mainly serves government agencies and big companies, or whether it might also empower individuals.
To make meaningful progress in stabilizing counter-unmanned aerial system (“C-UAS”) authorities and addressing emerging issues, Congress should adopt a more comprehensive approach that considers the full range of risks and implements proper safeguards. Future C-UAS legislation should include the following priorities, which are essential to protecting civil liberties and ensuring accountability:
- Provide strong and explicit safeguards for First Amendment-protected activities
- Ensure transparency and require detailed reporting
- Provide due process and recourse for improper counter-drone activities
- Require C-UAS mitigation to use the least-invasive methods
- Maintain reasonable retention limits on data collection
- Maintain a sunset for C-UAS powers as drone uses continue to evolve
Congress can—and should—address public safety concerns without compromising privacy and civil liberties. C-UAS authorities should only be granted with the clear limits outlined above to help ensure that counter-drone authorities are wielded responsibly.
The American Civil Liberties Union (ACLU), Center for Democracy & Technology (CDT), Electronic Frontier Foundation (EFF), and Electronic Privacy Information Center (EPIC) shared these concerns with the Committee in a joint Statement For The Record.