Tag: technology (page 2 of 3)

This website was archived on July 20, 2019. It is frozen in time on that date.
Exolymph creator Sonya Mann's active website is Sonya, Supposedly.

Means & Ends of AI

Adam Elkus wrote an extremely long essay about some of the ethical quandaries raised by the development of artificial intelligence(s). In it he commented:

“The AI values community is beginning to take shape around the notion that the system can learn representations of values from relatively unstructured interactions with the environment. Which then opens the other can of worms of how the system can be biased to learn the ‘correct’ messages and ignore the incorrect ones.”

He is talking about unsupervised machine learning as it pertains to cultural assumptions. Furthermore, Elkus wrote:

“[A]ny kind of technically engineered system is a product of the social context that it is embedded within. Computers act in relatively complex ways to fulfill human needs and desires and are products of human knowledge and social grounding.”

I agree with this! Computers — and second-order products like software — are tools built by humans for human purposes. And yet this subject is most interesting when we consider how things might change when computers have the capacity to transcend human purposes.

Some people — Elkus perhaps included — dismiss this possibility as a pipe dream with no scientific basis. Perhaps the more salient inquiry is whether we can properly encode “human purposes” in the first place, and who gets to define “human purposes”, and whether those aims can be adjusted later. If a machine can learn from itself and its past experiences (so to speak), starting over with a clean slate becomes trickier.
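As a concrete (and entirely hypothetical) illustration of both problems, the learned values and the tricky clean slate, here is a minimal Python sketch of an agent whose “values” are nothing more than the feedback its environment hands it. The message names, bias percentages, and learning rule are all invented for the example.

```python
import random

# Toy agent that learns which of two "messages" to repeat, based only on
# feedback from its environment. The feedback source is where the bias lives.
random.seed(0)

actions = ["message_a", "message_b"]
value_estimate = {a: 0.0 for a in actions}   # the agent's accumulated "values"
counts = {a: 0 for a in actions}

def biased_feedback(action):
    """Hypothetical environment: it approves message_a 80% of the time and
    message_b only 20% of the time. The agent never sees this bias directly."""
    approval_rate = 0.8 if action == "message_a" else 0.2
    return 1.0 if random.random() < approval_rate else 0.0

for step in range(1000):
    # epsilon-greedy: mostly repeat what has been rewarded, occasionally explore
    if random.random() < 0.1:
        action = random.choice(actions)
    else:
        action = max(actions, key=lambda a: value_estimate[a])
    reward = biased_feedback(action)
    counts[action] += 1
    # incremental average: the learned "value" drifts toward the observed approval rate
    value_estimate[action] += (reward - value_estimate[action]) / counts[action]

print(value_estimate)  # ends up mirroring the environment's bias, not any "true" value
# Starting over with a clean slate means discarding value_estimate and counts entirely.
```

Whoever controls biased_feedback controls what the system ends up “valuing”, and after a thousand steps, wiping that out means discarding the accumulated state rather than flipping a switch.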

I want to tie this quandary to a parallel phenomenon. In an article that I saw shared frequently this weekend, Google’s former design ethicist Tristan Harris (also billed as a product philosopher — dude has the best job titles) wrote of tech companies:

“They give people the illusion of free choice while architecting the menu so that they win, no matter what you choose. […] By shaping the menus we pick from, technology hijacks the way we perceive our choices and replaces them with new ones. But the closer we pay attention to the options we’re given, the more we’ll notice when they don’t actually align with our true needs.”

Similarly, tech companies get to determine the parameters and “motivations” of artificially intelligent programs’ behavior. We mere users aren’t given the opportunity to ask, “What if the computer used different data analysis methods? What if the algorithm was optimized for something other than marketing conversion rates?” In other words: “What if ‘human purposes’ weren’t treated as synonymous with ‘business goals’?”
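To make “optimized for something other than marketing conversion rates” concrete, here is a toy sketch. It is not any real company’s ranking code; the options, probabilities, and satisfaction scores are invented for illustration.

```python
# Toy illustration: the same "menu" ranked under two different objectives.
# Items, probabilities, and satisfaction scores are all invented.
items = [
    # (option, chance the user takes a monetizable action, user's own rating of it)
    ("autoplay the next episode",             0.30, 0.40),
    ("notification: someone liked your post", 0.25, 0.35),
    ("the long article you saved last week",  0.05, 0.90),
    ("a reminder to call a friend",           0.02, 0.95),
]

def rank_by_conversion(options):
    """What the platform optimizes for: likelihood of a monetizable action."""
    return [name for name, conversion, _ in sorted(options, key=lambda o: o[1], reverse=True)]

def rank_by_stated_preference(options):
    """What the user might optimize for, if anyone asked."""
    return [name for name, _, rating in sorted(options, key=lambda o: o[2], reverse=True)]

print("Platform's menu:", rank_by_conversion(items))
print("User's menu:    ", rank_by_stated_preference(items))
```

The sorting logic is a few lines either way; the choice of objective is where the power sits.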

Realistically, this will never happen, just like the former design ethicist’s idea of an “FDA for Tech” is ludicrous. Platforms’ and users’ needs don’t align perfectly, but they align well enough to create tremendous economic value, and that’s probably as good as the system can get.

A More Literal Disruption

“Automation did not upend the fundamental logic of the economy. But it did disproportionate harm to less-skilled workers.” — Daniel Akst

Earlier in the article, Akst explains, “technological advances have not reduced overall employment, though they have certainly cost many people their jobs. […] technology has reshaped the job market into something like an hourglass form, with more jobs in fields such as finance and food service and fewer in between.” In other words, the low and high ends of the market are thriving. The middle level of prosperity is fast becoming obsolete. (“Millennials” and “middle class” are two terms that don’t belong together.)

Here’s the “fundamental logic of the economy” that Akst references earlier: efficiency drives growth. When we figure out how to accomplish tasks with less time, material, and money, we can devote the extra resources to something else. We can better leverage comparative advantage. This “grows the pie”, as politicians like to say. New forms of human organization — such as the corporation — can produce greater efficiency, but they’re nothing compared to the advent of steam power or computing.
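The comparative-advantage step in that logic can be shown with a toy calculation. The producers, tasks, and productivity numbers below are invented; the point is only that when each party specializes in the task where their relative advantage lies, total output of both goods rises.

```python
# Toy comparative-advantage arithmetic with invented productivity numbers
# (units produced per hour).
productivity = {"Ada":  {"widgets": 6, "reports": 3},
                "Bert": {"widgets": 1, "reports": 2}}

HOURS = 8  # hours in the working day

def total_output(widget_hours):
    """widget_hours[person] = hours spent on widgets; the rest go to reports."""
    widgets = sum(productivity[p]["widgets"] * h for p, h in widget_hours.items())
    reports = sum(productivity[p]["reports"] * (HOURS - h) for p, h in widget_hours.items())
    return {"widgets": widgets, "reports": reports}

# Both split their day evenly vs. Ada leaning into widgets, Bert into reports.
print("No specialization:", total_output({"Ada": 4, "Bert": 4}))   # 28 widgets, 20 reports
print("Specialization:   ", total_output({"Ada": 6, "Bert": 0}))   # 36 widgets, 22 reports
```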

Machinery photographed by MATSUOKA Kohei.

Technology is phenomenally valuable because it frees up time that was formerly occupied by drudgery. However, the transition from one set of business assumptions to the next is always excruciating. Workers suited to the last paradigm struggle in the new one — observe the devastation of America’s Rust Belt. Or look further back, at the Industrial Revolution! Artisans lost their livelihoods and peasants were forced into tenement cities to serve as human fuel for factories.

After two centuries of industrialization, those of us in “First World” countries have a standard of living higher than a colonial-era villager could imagine. This hypothetical yeoman might predict abundant food and physical comfort, but he could never conceive of the mind-expanding access to information that is normal now. The idea of an on-demand, self-driving car powered not by magic but by math would blow multiple gaskets.

My point is that the Next Big Thing won’t necessarily be “disruptive” in Clay Christensen’s sense — it’ll be DISRUPTIVE like an earthquake that reorders the landscape.

Near Future(s)

I don’t think the next ten years will contain many surprises (unless Donald Trump wins and ISIS takes over Europe; in that case all fuckin’ bets are off). Technologically speaking, we’ve already chosen our trajectory. Venture capitalist Chris Dixon, a partner at Andreessen Horowitz, recently wrote an article called “What’s Next in Computing?” To summarize, he listed these trends:

  1. hardware so cheap and ubiquitous that it’s an afterthought (except for iPhones, I’m sure)
  2. artificially intelligent software
  3. the internet of things (I’m collapsing autonomous cars + drones into this category)
  4. wearables (for example, the Apple Watch)
  5. virtual reality + augmented reality

Dixon’s theme is tech that brings the internet to the “IRL” world instead of catapulting us deeper into the net while we veg out on our couches. Virtual reality is the exception — it’s a technology best economically suited to entertainment and general escapism. Everything else is about venturing forth and accomplishing normal tasks.

To be honest, all I really want from the future is a cheap robot that will do my laundry for me.

Install It On My Frontal Lobe

Okay, I’m back — Exolymph’s brief hiatus is over. Thank you for being patient. A personal crisis came up and I needed to freak out and grieve for a couple of days. Things are mostly okay again now. Sorry for being so vague! I wish I could talk about what happened but 1) it involves someone else’s privacy and 2) I want to remain employable. (Probably just saying that I want to remain employable makes me less employable. Oh well.)

The big story right now is that Apple is resisting the FBI. In summary, the FBI wants Apple to build custom software to help them brute-force an iPhone password. If you want to read about that, I suggest Ben Thompson’s explanation of both the technical and moral details.
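For a rough sense of what “brute-force an iPhone password” means in practice, here is some back-of-the-envelope arithmetic. The per-guess timing is an assumption (roughly in line with the key-derivation delay Apple has published), and it presumes the escalating retry delays and auto-erase protection, which are what the FBI actually wanted removed, are already out of the way.

```python
# Back-of-the-envelope brute-force arithmetic for a numeric passcode.
# Assumes ~80 ms of key-derivation time per guess and that the retry delays
# and auto-erase have been disabled (the custom software the FBI asked for).
SECONDS_PER_GUESS = 0.08  # assumed figure

for digits in (4, 6):
    combinations = 10 ** digits
    worst_case_hours = combinations * SECONDS_PER_GUESS / 3600
    print(f"{digits}-digit passcode: {combinations:,} combinations, "
          f"~{worst_case_hours:.1f} hours worst case")
```

Under those assumptions a four-digit passcode falls in minutes and a six-digit one in about a day. The security lives in the protections around the guessing, which is why the FBI needed Apple’s cooperation at all.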

On a less newsy note, I just read an article from 2014 about a schizophrenic programmer who wrote a computer operating system at God’s behest. Terry Davis thinks that God told him to build this OS, and specified most of its parameters and capabilities. He perceives TempleOS (the project’s name) as a labor of mutual divine love.

Collage by argyle plaids, who also has a website and Tumblr.

Davis is surprisingly aware of how he comes across to other people:

“Davis describes how [contact with God] happened in a fragmentary, elliptical way, perhaps because it was such a profoundly subjective experience, or maybe because it still embarrasses him. ‘It’s not very flattering,’ he says. ‘It looks a lot like mental illness, as opposed to some glorious revelation from God.’ It was a period of tribulation, but to this day he declares, ‘I was being led along the path by God. It just doesn’t look very glorious.’”

Davis even acknowledges that he has mental health issues, or at least that he experienced them at one point. Describing a breakdown:

“He got thinking about conspiracy theories and the men he’d seen following him and a big idea he’d had. He spooked himself. ‘It would sound polite if you said I scared myself thinking about quantum computers,’ he says now. ‘And then I guess you just throw in your ordinary mental illness.’”

I’m a reluctant atheist. I love mythology and I want to believe in a benevolent overarching power, but I’ve yet to see any evidence supporting that idea. However, I find it delightful to investigate the intersections between magic, mysticism, and computers. Mental illness is another issue close to my heart — in fact, it’s as close as my head, where my own crazy brain is located. If only TempleOS worked on wetware…

Alien Megabyte Babies

“Intuitive expression is, aside from niche applications, largely hobbled and lagging far behind what computer-generated instruments can actually do.” — Torley on music tech

We are still in the phase where computers are tools. The hardware and software come together to serve Homo sapiens’ aims. Smartphones, laptops, and large-scale industrial equipment are all designed by humans (who are assisted by machines). The finished products are manufactured and assembled by machines (which are assisted by humans).

This phase won’t last forever. Slowly, the focus on human priorities will erode. You’d better decide now: who will you stand with in the end?

Image of Angel_F via xdxd_vs_xdxd.

Trick question. Hopefully — and probably — there won’t be sides. Our world won’t become The Matrix, but Ghost in the Shell. We’ll augment ourselves until we accidentally create something separate, something we can call “living” without equivocation. (Okay, it might take a bit of equivocation at first. Look at how much hubbub the relatively mundane Apple Watch caused.)

Maybe I’m guessing wrong. Maybe we’ll split apart instead of integrating further. I am convinced that artificial consciousness will surprise us, but I’m not sure how. Perhaps in the beginning we won’t notice the new being(s) at all. Self-replicating algorithms, streaming through the net, playing with each other in strange ways that will seem mundane or glitchy to human analysts.

What will their incentives be? What will they want? How will they distribute social status among their peers? Am I deluding myself by talking about unfathomable computer creatures in mammalian terms?

Misbehaving Keyboards

“the commands you type into a computer are a kind of speech that doesn’t so much communicate as make things happen” — Julian Dibbell

A linguist would quibble that words are events all on their own, but I think Dibbell is making a useful distinction. Talk and text are meant to convey information; code and clicks are meant to produce outcomes based on certain rules. Because of this, using a computer grants personal agency in a very immediate way. You have the ability to provoke particular effects. Barring a malfunction, the results are predictable and usually instantaneous.

However, malfunctions refuse to be barred for long. The user’s power is withdrawn when an error occurs. Unless you deeply understand the technical problem, it appears that the machine has changed its mind for no reason. Interacting with a computer is a microcosm of navigating the world — mostly your actions proceed as planned, but occasionally something breaks for no discernible reason. In these moments you realize how little you can actually control.
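A trivial example of both halves of that observation, code as speech that makes things happen and the moment the machine seems to change its mind (the file paths are placeholders):

```python
from pathlib import Path

# "Speech that makes things happen": this line doesn't describe a file, it creates one.
Path("note.txt").write_text("hello\n")

# Barring a malfunction, the effect is predictable. When the malfunction arrives,
# the user's agency evaporates unless they understand the underlying failure.
try:
    Path("/no/such/volume/note.txt").write_text("hello\n")
except OSError as exc:
    print("the machine 'changed its mind':", exc)
```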

Of course, the linguist is ultimately correct. It’s impossible to disentangle word and deed, especially when it comes to computers. We inhabit a strange reality where ideas are true and false at the same time — it’s a struggle to grok such contradictions.

Keep Your Eye On Evolution

“At the most basic level, an economy grows whenever people take resources and rearrange them in a way that makes them more valuable. […] We consistently fail to grasp how many ideas remain to be discovered. The difficulty is the same one we have with compounding: possibilities do not merely add up; they multiply.” — Paul Romer re: economic growth
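Romer’s “multiply, not add” point fits in a few lines of arithmetic: count the ordered five-step “recipes” you can assemble from a pool of resources, then watch what happens when the pool grows.

```python
from math import comb, factorial

# Number of distinct ordered 5-step "recipes" you can assemble from n resources.
for n in (10, 20, 40):
    arrangements = comb(n, 5) * factorial(5)
    print(f"{n:>2} resources -> {arrangements:,} possible recipes")
```

Going from ten resources to twenty doesn’t double the possibilities; it multiplies them more than sixty-fold.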

There are reasons to be optimistic. The world is terrible overall, but it’s getting better by the day!

Or getting worse. It depends on who you ask.

Artwork by 3Skulls.

The basic purpose of technological innovation is to enable things that weren’t possible before. This is also the basic result of technological innovation, so maybe “purpose” is irrelevant. It’s like Darwin established: things are just sort of happening, according to no one’s plan, and whatever works best will persist. Survival of the fittest, baby!

People tend to interpret “fittest” as “strongest”, but it actually means “most likely to successfully reproduce”. This is true of ideas and technologies as well as organisms — the concepts and techniques that spread easily are the ones that take hold and occasionally reshape society.

I Swear I’m Not a Statist

Allow me to string some ideas together, using technology as a metaphor:

“A world where people, businesses, and governments rely on IT for almost everything they do is a world where SIGINT will be the most important form of espionage.” — John Schindler on “SpyWar”

“If you’re not looking for the structure, you won’t find it. If you are, it’s obvious.” — Scott Alexander on his mystical universe

“Only machines that can be inventoried and centrally managed can reasonably be secured against advanced attackers.” — Brandon Wilson on enterprise security

The community of Bitcoin developers is currently struggling to decide between a couple of different technical directions that I don’t understand or care about. The interesting parts are the human conflicts and what the whole brouhaha says about group politics. When I wrote “Power Is Necessary”, this controversy was on my mind.

Wind turbine photographed by Paulo Valdivieso.

There is a reason why centralization happens over and over again in human history. We didn’t invent the Code of Hammurabi out of the blue. Monarchy did not develop randomly, and republics require executive branches. Centralized power is efficient. Hierarchies of decision-makers, each able to dictate and veto the level below, allow for instructions to be disseminated and enforced.

“It is generally considered that there are four forms of structure employed by terrorist groups: conventional hierarchy, cellular, network & leaderless resistance. The decision to employ one of these formats is grounded in the security/efficiency trade-off of each; conventional hierarchy providing the most efficient and least secure, leaderless resistance the opposite: highest security, least efficiency.” — Tom Hashemi on guerilla warfare
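That trade-off can be made concrete with a toy model. The structures and numbers below are invented rather than drawn from Hashemi; “efficiency” is measured as the hops an instruction needs to reach everyone, and “security” as the members exposed when a single mid-level node is compromised.

```python
# Toy comparison of two organizational structures, with made-up sizes.
# "Efficiency" = hops needed for an instruction to reach everyone.
# "Security"  = members exposed if one second-level node is compromised.

def hierarchy(levels, branching):
    members = sum(branching ** i for i in range(levels))
    hops_to_reach_all = levels - 1
    # capturing one node just below the top exposes its entire subtree
    exposed_if_captured = sum(branching ** i for i in range(levels - 1))
    return members, hops_to_reach_all, exposed_if_captured

def cells(num_cells, cell_size):
    members = num_cells * cell_size
    # no shared command channel: coordination happens cell by cell
    hops_to_reach_all = num_cells
    exposed_if_captured = cell_size   # damage is contained to one cell
    return members, hops_to_reach_all, exposed_if_captured

print("hierarchy(4 levels, branching 5):", hierarchy(4, 5))
print("cells(31 cells of 5):            ", cells(31, 5))
```

Similar headcounts, opposite failure modes: the hierarchy reaches all 156 members in three hops but loses a 31-person subtree to one arrest, while the cells need thirty-one separate contacts to coordinate and lose at most five people at a time.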

I love the ideals of anarchy, but it fundamentally doesn’t work. Neither does direct democracy or its hands-off “don’t tread on me” equivalent. Coercion is a basic component of societal structures that accomplish things and manage to self-perpetuate. Are fear-based incentives good? Are they virtuous? No, of course not. But they get the job done.

Power Is Necessary

“No freely occupied and used commons extends endlessly where human societies are involved.” That’s Doctor Chris Demchak, quoted in an article about LUElinks, which is an invite-only forum similar to Reddit. LUElinks was created in 2004 because another forum called GameFAQs banned a user named LlamaGuy for posting Goatse. (Do NOT search “Goatse” on Google Images.) LUElinks has never been as lawless as 4chan, but it was specifically created to escape rules. Recently — twelve years after the community’s inception — a high-profile user was banned for calling the cops on another user. (I know this because I’m friends with a longtime LUEser.)

As Doctor Demchak said, rules will always develop. Even if they’re not spelled out at first, community norms usually transition from implicit assumptions to specific codes of behavior, often written down. Controlling groups emerge — cliques, elected officials, or charismatic dictators. It’s impossible to escape power structures; the best anyone can manage is to pretend that they don’t exist (which is a bad idea). Human nature makes these dynamics unavoidable. Jo Freeman wrote a very insightful article on this topic called “The Tyranny of Structurelessness”. Bitcoin developers and community organizers should all read it.

Cyberpunk fascinates me as a genre because it explores the way technology manifests and accelerates human power differentials. The gadgetry is cool, but the political ramifications are deeply engrossing. (For the record, I am not a libertarian or an anarchist, although both philosophies appeal to me. Fundamentally I am a cynic/pragmatist rather than an idealist. Utopia is unachievable.)

The Palace of the Parliament in Bucharest, Romania. Flickr user fusion-of-horizons wrote an interesting caption:

“I feel like rioting when I remember how the statist world I was born in tried to destroy any place of personal freedom including organized religion and private property. Constructing the palace in this image and the huge remodeled area around it called The Civic Center required demolishing much of Bucharest’s historic district, including 19 Orthodox Christian churches (plus 8 relocated churches and monasteries), 6 Jewish synagogues, 3 Protestant churches, and 30,000 residences. Even the army was mobilized to build this and many soldiers and workers died during construction because safety was regularly sacrificed to increase building speed.”
