Tag: privacy

Snoop Unto Them As They Snoop Unto Us

Abruptly returning to a previous topic, here’s a guest dispatch from Famicoman (AKA Mike Dank) on surveillance and privacy. Back to the new focus soon.


The letter sat innocently in a pile of mail on the kitchen table. A boring envelope, nondescript at a glance, that would become something of a Schrödinger’s cat before the inevitable unsealing. The front of it bore the name of the sender in bright, black letters — “U.S. Department of Justice — Federal Bureau of Investigation.” This probably isn’t something that most people would ever want to find in their mailbox.

For me, the FBI still conjures up imagery straight out of movies, like the bumbling group in 1995’s Hackers, wrongfully pursuing Dade Murphy and his ragtag team of techno-misfits instead of the more sinister Plague. While this reference is dated, I still feel like there is a certain stigma placed upon the FBI, especially by the technophiles who understand there is more to computing than web browsers and document editing. As laws surrounding computers become more sophisticated, we can see them turn draconian. Pioneers, visionaries, and otherwise independent thinkers can be reduced to little more than a prisoner number.

Weeks earlier, I had submitted a Privacy Act inquiry through the FBI’s Freedom of Information Act service. For years, the FBI and other three-letter agencies have allowed people to openly request information on a myriad of subjects. I was never particularly curious about the outcome of a specific court case or what information The New York Times has requested for articles; my interests were a bit more selfish.

Using the FBI’s eFOIA portal through their website, I filled out a few fields and requested my own FBI file. Creating a FOIA request is greatly simplified these days, and you can even use free services, such as getmyfbifile.com, to generate forms that can be sent to different agencies. I only opted to pursue the FBI at this time, but could always query other agencies in the future.

The whole online eFOIA process was painless, taking maybe two minutes to complete, but I had hesitations as my cursor hovered over the final “Submit” button. Whether or not I actually went through with this, I knew that whatever information the FBI already had on me wouldn’t change. They either have something or they don’t, and I was ready to find out. With this rationalization, I decided to submit — in more ways than one.

The following days went by slowly and my mind seemed to race. I had read anecdotes from people who had requested their FBI file, and knew the results could leave me with more questions than answers. I read one account of someone receiving a document with many redactions, large swathes of blacked-out text, giving a minute-by-minute report of his activities with a collegiate political group. A few more accounts mentioned documents of fully-redacted text, pages upon pages of black lines and nothing else.

What was in store for me? It truly astonishes me that a requester would get back anything at all, even a simple acknowledgement that some record exists. In today’s society, where almost everyone has a concern about their privacy, or at least an acknowledgement that they are likely being monitored in some way, the fact that I could send a basic request for information about myself seems like a nonsensical loophole in our current cyberpolitical climate. You would never see this bureaucratic process highlighted in the latest technothriller.

About two weeks after my initial request, there I was, staring at the letter sticking out from the mail stack on the kitchen table. All at once, it filled me with both gloom and solace. This was it, I was going to see what it spelled out, for better or worse. Until I opened it, the contents would remain both good and bad news. After slicing the envelope, I unfolded the two crisp pieces of paper inside, complete with FBI letterhead and a signature from the Record/Information Dissemination Section Chief. As I ingested the first paragraph, I found the line that I hoped I would, “We were unable to identify main records responsive to the FOIA.”

Relief washed over me, and any images I had of suited men arriving in black vans to take me away subsided (back down to the normal levels of paranoia, at least). It was the best information I could have received, but not at all what I had expected. For over ten years, I have been involved in several offbeat Internet subcultures and groups, and more than a few sound like reason enough to land me on someone’s radar. I was involved with a popular Internet-based hacking video show, held a role in a physical hacking group/meeting, hosted a Tor relay, experimented openly with alternative, secure mesh networks, sysop’d a BitTorrent tracker, and a few other nefarious things here and there.

I always tried to stay on the legal side of things, but that doesn’t mean that I don’t dabble with technologies that could be used for less than savory purposes. In some cases, just figuring out how something could be done was more rewarding than the thought of actually carrying out an exploit. Normal people (like friends and coworkers) might call me “suspicious” or tell me I was “likely on a list,” but from what I could gather from the response in front of me, I didn’t seem to be.

When I turned back to read the second paragraph, I eyed an interesting passage: “By standard FBI practice and pursuant to FOIA exemption… and Privacy Act exemption… this response neither confirms nor denies the existence of your subject’s name on any watch lists.” So maybe I was right to be worried. Maybe I am being watched. I would have no way of knowing. This “neither confirms nor denies” response is called a Glomar response, which means my information has the potential to be withheld as a matter of national security, or over privacy concerns.

Maybe they do have information on me after all. Even if I received a flat confirmation that there is nothing on me, would I believe it? What is to prevent a government organization from lying to me for “my own good”? How can I be expected to show any semblance of trust at face value? Now that all is said and done, I don’t know much more than I did when I started, and have little to show for the whole exchange besides an official request number and a few pieces of paper with boilerplate, cover-your-ass language.

If we look back at someone like Kevin Mitnick, the cunning social engineer who received a fateful knock on his hotel door right before being arrested in early 1995, we see a prime example of law enforcement pursuing someone not only for the actions they took, but for the skills and knowledge they possessed. Echoing Operation Sundevil only five years prior, government agencies wanted to make examples out of their targets and use scare tactics to keep others in line.

I can’t help but think of “The Hacker Manifesto,” written by The Mentor (an alias used by Loyd Blankenship) in 1986. “We explore… and you call us criminals. We seek knowledge… and you call us criminals,” Blankenship writes shortly after being arrested himself. Even if I received a page of blacked-out text in the mail, would I be scared and change my habits? What if I awoke to a hammering on my door in the middle of the night? I still don’t know what to make of my response, but maybe I’ll submit another request again next year.

Knock, knock.


Header artwork by Matt Brown.

I Hope You Like the NSA Because the NSA Sure Likes You

Today’s news about the NSA feels a little too spot-on. I hope the hackneyed scriptwriters for 2017 feel ashamed:

In its final days, the Obama administration has expanded the power of the National Security Agency to share globally intercepted personal communications with the government’s 16 other intelligence agencies before applying privacy protections.

The new rules significantly relax longstanding limits on what the N.S.A. may do with the information gathered by its most powerful surveillance operations, which are largely unregulated by American wiretapping laws. These include collecting satellite transmissions, phone calls and emails that cross network switches abroad, and messages between people abroad that cross domestic network switches.

The change means that far more officials will be searching through raw data. Essentially, the government is reducing the risk that the N.S.A. will fail to recognize that a piece of information would be valuable to another agency, but increasing the risk that officials will see private information about innocent people.

Really? Expanding the NSA’s power, so soon after the Snowden plotline? A move like this might be exciting in an earlier season, but at this point the show is just demoralizing its viewers. Especially after making the rule that no one can turn off their TV, ever, it just seems cruel.

At least the Brits have it worse? I dunno, that doesn’t make me feel better, since America likes to import UK culture. (It’s one of our founding principles!)

Now is a good time to donate to the Tor Project, is what I’m saying.

In other news, researchers can pull fingerprints from photos and use the data to unlock your phone, etc. Throwback: fingerprints are horrible passwords.

Remember, kids, remaining in your original flesh at all is a poor security practice.


Header photo via torbakhopper, who attributes it to Scott Richard.

Watch Yourself

Let’s talk about sousveillance again. For those not familiar with the word, it literally translates to “undersight” — as opposed to oversight. Surveillance is perpetrated by an authority; sousveillance is perpetrated by the people. The unwashed masses, if you will.

Steve Mann (no relation) led the paper that coined the term. It came out in 2003! They had no idea about Instagram! What’s interesting is how much the connotations of “sousveillance” have morphed since Mann and his colleagues first came up with it. Here’s their original conception:

Organizations have tried to make technology mundane and invisible through its disappearance into the fabric of buildings, objects and bodies. The creation of pervasive ubiquitous technologies — such as smart floors, toilets, elevators, and light switches — means that intelligence gathering devices for ubiquitous surveillance are also becoming invisible […]. This re-placement of technologies and data conduits has brought new opportunities for observation, data collection, and sur/sousveillance, making public surveillance of private space increasingly ubiquitous.

All such activity [until now] has been surveillance: organizations observing people. One way to challenge and problematize both surveillance and acquiescence to it is to resituate these technologies of control on individuals, offering panoptic technologies to help them observe those in authority. […]

Probably the best-known recent example of sousveillance is when Los Angeles resident George Holliday videotaped police officers beating Rodney King after he had been stopped for a traffic violation. The ensuing uproar led to the trial of the officers (although not their conviction) and serious discussion of curtailing police brutality […]. Taping and broadcasting the police assault on Rodney King was serendipitous and fortuitous sousveillance. Yet planned acts of sousveillance can occur, although they are rarer than organizational surveillance. Examples include: customers photographing shopkeepers; taxi passengers photographing cab drivers; citizens photographing police officers who come to their doors; civilians photographing government officials; residents beaming satellite shots of occupying troops onto the Internet. In many cases, these acts of sousveillance violate [either explicit or implicit rules] that ordinary people should not use recording devices to record official acts.

Sousveillance was supposed to be a way to Fight the Man, to check the power of the state. Unfortunately, many governments’ surveillance apparatuses* were poised to take advantage of the compulsive documenting habit that smartphones added to daily life.

For example, the NSA has wonderful SIGINT. Theoretically they can mine Facebook and its ilk for whatever insights they might want to extract. Encryption mitigates this problem, but it’s not clear by how much. Anything that’s publicly available online can be scraped.

So now you have n00bs posting photos of protests on Twitter and accidentally exposing people with open warrants. Elle Armageddon wrote a two-part “OPSEC for Activists” guide, but by default the attendees of unplanned, uncoordinated events aren’t going to follow the rules.

Welp ¯\_(ツ)_/¯


*I thought it would be “apperati” too, but as it turns out, no. See this and this.

Image credit: My Second or Third Skin by Claire Carusillo.

Therapy Bots and Nondisclosure Agreements

Two empty chairs. Image via R. Crap Mariner.


Let’s talk about therapy bots. I don’t want to list every therapy bot that’s ever existed — and there are a few — so I’ll just trust you to Google “therapy bots” if you’re looking for a survey of the efforts so far. Instead I want to discuss the next-gen tech. There are ethical quandaries.

If (when) effective therapy bots come onto the market, it will be a miracle. Note the word “effective”. Maybe it’ll be 3D facial models in VR, and machine learning for the backend, but it might be some manifestation I can’t come up with. Doesn’t really matter.

They have to actually help people deal with their angst and self-loathing and grief and resentment, but any therapy bots that are able to do that will do a tremendous amount of good. Not because I think they’ll be more skilled than human therapists — who knows — but because they’ll be more broadly available.

Software is an order of magnitude cheaper than human employees, so currently underserved demographics may have greater access to professional mental healthcare than they ever have before. Obviously the situation for rich people will still be better, but it’s preferable to be a poor person with a smartphone in a world where rich people have laptops than it is to be a poor person without a smartphone in a world where no one has a computer of any size.

Here’s the thing. Consider the data-retention policies of the companies that own the therapy bots. Of course all the processing power and raw data will live in the cloud. Will access to that information be governed by the same strict nondisclosure laws as human therapists? To what extent will HIPAA and equivalent non-USA privacy requirements apply?

Now, I don’t know about you, but if my current Homo sapiens therapist asked if she could record audio of our sessions, I would say no. I’m usually pretty blasé about privacy, and I’m somewhat open about being mentally ill, but the actual content of my conversations with my therapist is very serious to me. I trust her, but I don’t trust technology. All kinds of companies get breached.

Information on anyone else’s computer — that includes the cloud, which is really just a rented datacenter somewhere — is information that you don’t control, and information that you don’t control has a way of going places you don’t expect it to.

Here’s something I guarantee would happen: An employee at a therapy bot company has a spouse who uses the service. That employee is abusive. They access their spouse’s session data. What happens next? Who is held responsible?

I’m not saying that therapy bots are an inherently bad idea, or that the inevitable harm to individuals would outweigh the benefits to lots of other individuals. I’m saying that we have a hard enough time with sensitive data as it is. And I believe that collateral damage is a bug, not a feature.


Great comments on /r/DarkFuturology.

The Productive Attitude to Privacy

Instead of considering privacy to be a right that you deserve, think of it as a condition that you can create for yourself. Comprehensive privacy is difficult to achieve — aim to hide the pieces of information that matter to you the most. Even in countries that say their citizens are entitled to privacy, abstract guarantees are meaningless if you don’t take action to protect the information that you want to conceal. (Remember, you’re only one “national security emergency” away from losing all the rights you were promised.)

What is privacy? Photo by Cory Doctorow.


For the most part, protecting information with your actions means restricting access to it. As I wrote before, “when you trust third parties to protect your privacy (including medical data and financial access), you should resign yourself to being pwned eventually.”

The key to perfect privacy is to avoid recording or sharing any information in the first place. If you never write down your secret, then no one can copy-paste it elsewhere, nor brute-force any cipher that you may have used to obscure it. Thank goodness we haven’t figured out how to hack brains in detail! But unfortunately, some pieces of information — like passwords with plenty of entropy — aren’t useful unless you’re able to copy-paste them. Who can memorize fifty different diceware phrases? The key to imperfect-but-acceptable privacy is figuring out your limits and acting accordingly. How much risk are you willing to live with?
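To put a number on why diceware phrases are hard to memorize: each word is chosen by five dice rolls from a list of 7776 (6^5) words, so every word adds about 12.9 bits of entropy. A minimal back-of-the-envelope sketch (the function name is mine, just for illustration):

```python
import math

# Standard diceware wordlist: one word per five six-sided dice rolls.
WORDLIST_SIZE = 6 ** 5  # 7776 words

def diceware_entropy_bits(num_words: int) -> float:
    """Entropy in bits of a passphrase of `num_words` uniformly chosen diceware words."""
    return num_words * math.log2(WORDLIST_SIZE)

for n in (4, 6, 8):
    print(f"{n} words ≈ {diceware_entropy_bits(n):.1f} bits")
```

A six-word phrase lands around 77 bits — strong, but already a real memorization burden; multiply that by fifty accounts and the appeal of a password manager (and the third-party risk that comes with it) is obvious.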

The main argument against my position is that responsibilities that could be assigned to communities are instead pushed onto individuals, who are demonstrably ill-equipped to cope with the requirements of infosec.

“Neoliberalism insists that we are all responsible for ourselves, and its prime characteristic is the privatisation of resources — like education, healthcare, and water — once considered essential rights for everyone (for at least a relatively brief period in human history so far). Within this severely privatised realm, choice emerges as a mantra for all individuals: we can all now have infinite choices, whether between brands of orange juice or schools or banks. This reverence for choice extends to how we are continually pushed to think of ourselves as not just rewarded with choices in material goods and services but with choices in how we constitute our individual selves in order to survive.” — Yasmin Nair

Reddit user m_bishop weighed in:

“I’ve been saying this for years. Treat anything you say online like you’re shouting it in a crowded subway station. It’s not everyone else’s job to ignore you, though it is generally considered rude to listen in.

Bottom line, if you don’t want people to see you naked, don’t walk down the street without your clothes on. All the written agreements and promises to simply ‘not look’ aren’t going to work.”

Software Is Hungry

You may have heard that DeepMind’s machine-learning program AlphaGo beat reigning world champion Lee Sedol in the ancient and complex game of Go. (Technically, AlphaGo has only won two of five matches, but the writing on the wall is clear.) More and more lately, artificial intelligence is in the news, gaining on the analogue world by leaps and bounds. I’m glad of this, despite the accompanying proliferation of media fear-mongering. Hardworking programmers and data scientists are accelerating the future; they deserve recognition. (Shoutout to Francis Tseng!)

Illustration by Michele Rosenthal.


Unfortunately the present — I know Exolymph’s gimmick is the future-present, but in this case I mean the past-present — consists of tediously logging back in on website after website. Daily life is so mundane compared to the cutting edge. I restored my laptop to factory defaults, which is great because it’s not broken anymore, but I had to reenter my username and password(s) all over the place. It was a little disturbing to realize how many companies have dossiers of data about me. I don’t expect anything bad to happen to that information, but it’s an inherent vulnerability. What if I had a stalker? What if I want to pursue investigative journalism at some point?

The connecting thread between AlphaGo’s prowess and the way privacy keeps slipping away from individuals is that software is eating the world. We’re subsumed by technology, by the math that powers flashing lights behind screens. I’m okay with it. Human nature is fundamentally the same — all that’s changed is the conduit.

Cybersecurity Tradeoffs & Risks

Kevin Roose hired a couple of high-end hackers to penetration-test his personal cybersecurity setup. It did not go well, unless you count “realizing that you’re incredibly vulnerable” as “well”. In his write-up of the exercise, Roose mused:

“The scariest thing about social engineering is that it can happen to literally anyone, no matter how cautious or secure they are. After all, I hadn’t messed up — my phone company had. But the interconnected nature of digital security means that all of us are vulnerable, if the companies that safeguard our data fall down on the job. It doesn’t matter how strong your passwords are if your cable provider or your utility company is willing to give your information out over the phone to a stranger.”

There is a genuine tradeoff between safety and convenience when it comes to customer service. Big companies typically err on the side of convenience. That’s why Amazon got in trouble back in January. Most support requests are legitimate, so companies practice lax security and let the malicious needles in the haystack slip through their fingers (to mix metaphors egregiously). If a business like Amazon enacts rigorous security protocols and makes employees stick to them, the average user with a real question is annoyed. Millions of average users’ mild discomfort outweighs a handful of catastrophes.

Artwork by Michael Mandiberg.


In semi-related commentary, Linux security developer Matthew Garrett said on Twitter (regarding the Apple-versus-FBI tussle):

“The assumption must always be that if it’s technically possible for a company to be compelled to betray you, it’ll happen. No matter how trustworthy the company [seems] at present. No matter how good their PR. If the law ever changes, they’ll leak your secrets. It’s important that we fight for laws that respect privacy, and it’s important that we design hardware on the assumption we won’t always win”

Although Garrett is commenting on a different issue within a different context, I think these two events are linked. The basic idea is that when you trust third parties to protect your privacy (including medical data and financial access), you should resign yourself to being pwned eventually. Perhaps with the sanction of your government.

Expose Yourself / Still Trying To Hide

Sometimes I like to string quotes together to indicate a point. It’s akin to writing a very short essay using other people’s words.

“What makes crowdfunding possible now is the emergence of new communication platforms. The Internet allows us to surface niche communities that weren’t so obvious beforehand.” — Ellen Chisa, a former Kickstarter product manager

Image via Alan O’Rourke. Get that money.


“In some ways, we’re lucky that the first two decades involving the advent of the commercial Internet were largely a positive-sum game. The creation of digital space for self-expression, at near-zero cost, does not necessarily challenge or erode someone else’s right to space or resources.” — Kim-Mai Cutler on California’s housing and development problems (note the “not necessarily”)

“Now, as the stars begin to dim and humans dip and swerve in flocks of social media ephemera, responses are instantaneous and direct and physical, our nascent haptic helpers tugging gently at our sleeves to let us know that someone, somewhere, has had an opinion at us. […] I’ve started thinking of this as an attention lens: small, human amounts of individual attention are refracted through social media to converge on a single person, producing the effect of infinite attention at the focal point.” — Coda Hale on Twitter and related social dilemmas

“the world today is like living in a big field that is more illuminated than ever before” — Joseph Nye, quoted on government surveillance

There are pros and cons to being a figment of the open web. The freely visible web. The semi-universally accessible web. For my purposes, the pros outweigh the cons. But like most choices, it’s worth considering! How much do you want to participate?

Keep Your Head Down

Reading about operational security has turned my mind toward privacy rights. Opsec tactics are concerned with shielding information from enemy access — mostly through rigorous, consistent caution. As the Animal Liberation Front put it in one of their direct action guides, “True security culture requires a clear head, a rational mind, and personal self-control.” The assumption made by savvy opsec practitioners is that all data will be compromised eventually. Therefore, they aim to minimize the inevitable consequences.

I used to disregard privacy. My attitude was a classic: “If you’re not doing anything wrong, then you have nothing to hide!” (a viewpoint refuted very well by Robin Doherty). The problem is that even people who are acting ethically can run afoul of the law or be persecuted by the authorities. Consider how the FBI treated civil rights activists in the 1960s. Current mass surveillance by the NSA and similar government bodies is equally worrisome, as is the treatment of whistleblowers like Chelsea Manning. I’m not naive enough to think that this behavior will stop. People do anything that they are physically or technically capable of doing in order to access power — especially state agents.

Portrait of Edward Snowden by John Meyer of The Spilt Ink; $130.79 on Etsy.


I’m still not convinced that privacy should be a guaranteed legal right, or if so, to what extent. The best way to restrict your own information is simply to be secretive — stay quiet and maintain the impression of insignificance. After all, the vast majority of day-to-day privacy compromises are self-inflicted, simply because most people don’t care. That’s how Facebook and other social networks manage to compile detailed dossiers on their users.

So, what’s the essential takeaway here? I’m not sure. It’s interesting to ponder the consequences of a post-privacy society, until you realize that we already live in one. The results are quite mundane. Feels normal, right?

© 2017 Exolymph. All rights reserved.