A Black Woman Invented Home Security. Why Did It Go So Wrong?

There’s a well-known story among surveillance studies scholars and students of Black innovation: that of Marie Van Brittan Brown, a Black woman from Jamaica, Queens, New York, who is now recognized as having invented the home security system in 1966. Brown worked long hours as a nurse and often came home late at night. Her husband also worked “irregular hours,” and Brown worried about who might knock on her door if she were home alone. Similar versions of Brown’s story can be found at the MIT Lemelson Center and all around the internet, including on Wikipedia, the African American history site Blackpast, and the history site Timeline. It’s understandable that attention would be paid to Brown’s pioneering work as a Black woman inventor whose contribution has rightly been cited in the development of subsequent home security systems and as the origin point for a massive industry.

Brown’s inventor origin story is quite different from that of a similar technology’s creator—Jamie Siminoff, founder of DoorBot, which eventually became the Ring Doorbell. Siminoff started DoorBot in a garage in 2012, after he grew annoyed by people constantly ringing his doorbell. “I was like, how the fuck can there not be a doorbell that goes to your phone?” Siminoff told Digital Trends. As Caroline Haskins wrote in Vice, “DoorBot was thus posed as an answer to a question perhaps only he had ever asked.” Indeed, Brown’s patent is cited in Siminoff’s patent.

A Black woman who feared for her safety creates a system. A white guy later develops an iteration of this system because he is annoyed that people ring his doorbell too often. It becomes a loss-prevention tool for Amazon. Eventually, it leads to a boom not only in home security products like the Amazon suite and Google’s security cameras, among a variety of others, but also in measures to turn the home, the neighborhood, and all public and private spaces into a 24/7 watched fortress, complete with cameras, drones, security robots, and automated license plate readers. But amid this escalation, one urgent question arises: What are we defending ourselves against?

While the progression from Brown’s invention to Siminoff’s may seem unlikely or even paradoxical, it isn’t: Surveillance technology always “finds its level.” Its gaze is always going to wind up focused on Black folks—even if that was not the “intent” of the inventor. Surveillance, first and foremost, performs a carceral function by attempting the capture and control of marginalized populations. That it may serve additional functions is somewhat beside the point. Surveillance systems, no matter their origin, will always exist to serve power.

Earlier this year, at Amazon’s annual device launch, the company focused on how it would like us to think about security. A security robot named Astro, essentially a roving Alexa with a camera and big “eyes” to enhance the sense of “cuteness,” will roll around your house and scan the faces of people in your home. A security drone will fly around the house in a predetermined path. This is alongside a host of other initiatives built on existing products: Ring Alarm Pro, Ring Always Home Cam, Virtual Security Guard. Safety, Amazon would have us believe, comes in the form of cameras, or to be more precise, cameras everywhere pointed at everything all the time.

Amazon is not the only one. The same trend can be seen in the rise of automated license plate reader systems for individual neighborhoods, in Google’s partnership with ADT, and in Google’s launch of “smart” security cameras that can define “events” to record, recognize friendly faces, and detect noises such as glass breaking. As tech giants seek to saturate every aspect of our lives, home security has become a $50 billion business in the United States alone.

In keeping with its surveillance expansion over the years, Amazon’s Ring has partnered with more than 400 police departments across the country, after a successful multiyear strategy to turn law enforcement into part-time doorbell sales agents and cement the term “porch pirate” into our lexicon. The behemoth then cynically attempted to counter the obvious racial consequences of this in its own consumer-driven way. In 2020 it debuted the Ring dash cam with a Traffic Stop mode that allows drivers to say “Alexa, I’m being pulled over,” at which point Alexa begins recording the subsequent traffic stop. The company that has made so much hay enabling surveillance, supercharging the ability to blast out racist notions about who belongs in a neighborhood, and acting as a gentrifying force now throws a bone to people who may be guilty of “driving while Black.” This is very much the same logic that drove the push for body cams. In both cases, the results in terms of protecting Black lives have not lived up to advocates’ claims.

In Dark Matters: On the Surveillance of Blackness, Simone Browne, professor of Black Studies in the Department of African and African Diaspora Studies at the University of Texas at Austin, suggests that anti-Black racism is fundamentally coded into all our systems of vision, oversight, observation, and surveillance. She argues that there is no such thing as a system of surveillance, at least when human beings are involved, that does not add to anti-Blackness. According to Browne, “The historical formation of surveillance is not outside the historical formation of slavery.”

No amount of advances in technology will change the basic truth that surveillance and carceral technology exist to serve those in control. The narratives about police response times and accountability have remained the same, even though the 50-plus years since Brown’s patent have seen far more surveillance in both public and private spaces. This calls into question prevailing assumptions about what keeps communities safe—a point that’s been made repeatedly by community activists and police abolitionists. Brown’s invention is not evidence of some kind of conscious complicity with repressive technologies; rather, it demonstrates that the repressive function of technologies lies in their imbrication in pervasive notions of race.

Many of these tools have become agents of gentrification. They offload the “policing” of Black folks in public spaces to individuals who become de facto cops. Early advertisements for the Ring were explicit about this, even promising bounties in the form of free products. Though the company has toned this rhetoric down in recent years, a key aspect of Ring and Neighbors is still the assertion that by owning the device, you are doing your part to “fight crime.”

Narratives about how a given surveillance technology will improve the way policing works for and in Black communities have similarly remained relatively stable over time. Claims about improved police response times, increased safety, greater accountability, or better community relations continually mark the introduction of new surveillance technologies—from police body cams to Project Green Light in Detroit, Stingrays and surveillance planes in Baltimore, neighborhood automated license plate readers, and Ring doorbells. While this may be indicative of what communities demand from policing, there is an alternative reading: The promises remain both the same and undelivered because these technologies exist to further entrench the surveillance of Black and brown bodies as a practice foundational to how law enforcement operates in this country. Put another way, these technologies nibble around the edges of problems that are systemic. More and better forms of surveillance have not been, nor will they ever be, a solution to those problems.

Remarkably, like Amazon and other private providers, US cities and states assert that more surveillance produces more safety, despite the fact that other countries have already tested this idea and found it wanting. The United Kingdom has what is reputed to be the largest network of CCTV cameras in a democracy, with between 4 million and 5.9 million cameras in use as of 2015, many of them operated not by the government but by businesses and individuals. Yet even the Surveillance Camera Commissioner for England and Wales worried that the point of the cameras was to “build a surveillance society,” not to prevent crime, as there is little evidence that cameras deter crime, and the crimes they do affect tend to be property crimes rather than violent ones. This is as close to incontrovertible empirical proof as one could ask for that visual and audio surveillance of the environment does not create safer communities.

Yet as Browne argues clearly, what it does do is contribute to paranoia and anxiety, which, in the US, is always tied to race. The primary purpose of these devices is to entrench and promote racialized anxieties, advanced under cover of a “safety” that does not deliver and is itself implicated in racism.

There’s a 2020 Alexa Super Bowl spot that runs through a variety of scenarios in which Michael B. Jordan—who notably played Oscar Grant, who was killed by a BART police officer, in Fruitvale Station, and Killmonger, the “villain” in Black Panther—acts as Alexa. Jordan reads to the Alexa owner while she takes a bubble bath, dims the lights for her, and takes off his shirt while being showered by sprinklers. The symbolism is undeniable. Fetishizing Blackness—Black sexuality as well as Black “criminality”—as something to be captured, ingested, and subdued by technology is a centuries-old American project, of which surveillance is an essential part. But, as Tawana Petty, director of the Data Justice Program at the Detroit Community Technology Project, consistently asserts, “Surveillance ain’t safety.” Surveillance initiatives and technologies, no matter where they start, will end up directed at the most marginalized—particularly Black and brown populations. That is because these technologies are primarily exercises of power rather than efforts to enact systemic change.

As I’ve argued before, one reason it’s so difficult to push back on surveillance is that individuals who purchase these technologies often imagine themselves on the “right end” of the camera—and in many cases they are correct. The people who purchase Rings, for instance, imagine that the footage will only ever incriminate “outsiders,” never themselves. Yet, while it’s true that surveillance falls first and disproportionately on the most marginalized, it’s worth considering how the world that Amazon (and Google, and Flock) want to build, one covered in devices that watch and record our every move, will eventually come for us all.

Even a cursory look at the landscape proves troubling: Since the pandemic, there has been a meteoric rise in services that monitor white-collar employees, a form of scrutiny once largely confined to factory and warehouse workers. Automobiles are increasingly computers on wheels, built to record the driver’s every movement, a level of tracking previously applied mainly to truckers and delivery drivers. Apps whose stated purpose is to monitor children become tools for abusive partners and stalkers. As tech companies extend their tendrils into every aspect of our lives—so that your grocery store sells devices that diagnose your health while also acting as an arm of law enforcement and collecting sensitive, intimate data from every device in your home—those who are traditionally on the “right side” of the camera should consider what this might mean for their own autonomy and safety.

Brown was granted a patent in 1969 but never saw her invention brought to life. She hoped to catch the attention of home builders and manufacturers, but that never materialized in her lifetime. In our time, however, home builders are integrating “smart” security systems into their suites of amenities. The system Brown designed to spy on would-be burglars at her door now watches the entire family and anyone who crosses its threshold. Owners can send footage to law enforcement or upload it to social media, each instance bringing us closer to a day when we are endlessly watched while continually being told to feel less and less safe.

