
Tag: economics (page 1 of 3)

Yup, Everything Will Definitely Be Fine Since No One Will Lose Their Job Ever

Here is a succinct and insightful comment, from Hacker News user AlisdairO, on the trend toward technology handling every kind of labor that can possibly be delegated to it:

The sad reality is that there’s a nontrivial chunk of the populace that isn’t able to pick up highly skilled roles. It also ignores the role of unskilled jobs in providing space for people whose job class has been destroyed and need to retrain (or mark time until retirement).

I’m not advocating slowing innovation to prevent job loss. I am advocating avoiding magic thinking (‘there’s always new jobs to go to’): we need to start a serious conversation about what we do with our society when we have the levels of unemployment we can expect in an AI-shifted world. Right now we’re trending much more towards dystopia than utopia.

I’m going to get around to the dystopian futurism part, but first, a long digression about intelligence! It’s a divisive topic but an important one.

Sometimes I get flak for saying this, but here goes: The average person is not very smart. Your intellect and my intellect probably exceed the average, simply by virtue of being interested in abstract ideas. We’re able to understand those ideas reasonably well. Most people aren’t. Remember what high school was like?

There’s that old George Carlin quip: “Think of how stupid the average person is, then realize that half of them are stupider than that.” This is not a very PC thing to talk about, especially because so many racists justify their hateful worldview with psychometrics. But it’s cruel to insist that everyone has the same level of ability, when that is clearly not true in any domain.

You and I may not be geniuses — I’m certainly not — but we have the capacity to be competent knowledge workers. Joe Schmo doesn’t. He may be able to do the kind of paper-pushing that is rapidly being automated, but he can’t think about things on a high level. He doesn’t read for fun. He can’t synthesize information and then analyze it.

That doesn’t mean that Joe Schmo is a bad person — if he were a bad person, we wouldn’t care so much that the economy is accelerating beyond his abilities. The cruel truth is that Joe Schmo is dumb. He just is. AFAIK there is no way to change this.

I hate that I have to make this disclaimer, and yet it’s necessary: I’m not in favor of eugenics. In theory selective breeding is a good idea, but I can’t think of a centrally planned way for it to be implemented among humans that wouldn’t be catastrophically unjust.

Also, while raw intellect may correlate with good decision-making, it doesn’t ensure it. Peter Thiel’s IQ is likely higher than mine, but I don’t want him to run the world. (Tough luck for me, I guess.) As Harvard professor and economist George Borjas told Slate:

Economic outcomes and IQ are only weakly related, and IQ only measures one kind of ability. I’ve been lucky to have met many high-IQ people in academia who are total losers, and many smart, but not super-smart people, who are incredibly successful because of persistence, motivation, etc. So I just think that, on the whole, the focus on IQ is a bit misguided.

It’s also notable that similarly high-IQ people disagree with each other often.

And now back to the topic of technological unemployment!

The two main responses to concerns along the lines of “all the jobs will disappear” are:

  1. Universal basic income, yay!
  2. No they won’t, look what happened after the Industrial Revolution!

The counterargument to universal basic income is, as Josh Barro put it:

UBI does nothing to replace the sense of reward or purpose that comes from a job. It gives you money, but it doesn’t give you the sense that you got the money because you did something useful. […] The robots have not taken our jobs yet. It is not time to surrender to a social change that is likely to further destabilize a world that is already troubled.

The counterargument to the Industrial Revolution parallel is that AI — alternatively called machine learning, or automation, if you prefer those terms — is different. Andrew Ng is the chief scientist at Baidu, and this is what he told the Wall Street Journal:

Things may change in the future, but one rule of thumb today is that almost anything that a typical person can do with less than one second of mental thought we can either now or in the very near future automate with AI.

This is a far cry from all work. But there are a lot of jobs that can be accomplished by stringing together many one-second tasks.

And then there are concerns about general AI, which I don’t want to get into here.

If you’re curious about my opinion, it’s this: We’re in for a difficult couple of decades. Most hard problems can’t be solved quickly.

Tachikoma artwork by Abisaid Fernandez de Lara.

Pointillism of Failure

One of the most interesting things that happened this week was an AWS outage. For those of you who aren’t familiar, Amazon Web Services is a sophisticated cloud host for websites and apps. It is very widely used, especially among startups. When it goes down, as it did on Tuesday, many tech workers can’t do their jobs. At least Twitter was still available, providing a convenient location for complaints. (Additional discussion took place on Hacker News.)

I wrote about the incident for work, first summing up reactions from Twitter and then making the case that AWS is not a monopoly and shouldn’t be regulated as such. In response to that argument, my friend Adam Elkus pointed out that decentralized infrastructure was a founding ideal of the internet. The beautiful new world of http://www was supposed to empower individuals at the expense of institutions, be they governmental or private.

It has done that — but as usual, the reality is more of a complex onion than the idealists seemed to expect. In my first Ribbonfarm essay, I wrote:

The internet enables more individual opportunity than ever before — how would my words manage to reach you otherwise? And the internet is more meritocratic than the landscape it took over, because anyone can distribute their own work to a potential audience of millions, but of course age-old power dynamics can’t be erased in one fell swoop. It also enables winner-take-all businesses, like Amazon’s dominance in ecommerce and Facebook’s reign over news media.

Centralization wins because it’s efficient, given the constraints and affordances of the internet. And yet this centralization can be penetrated — not dismantled, but surface segments can be peeled back. That’s what hackers do when they leak a database or whatever.

One of cyberpunk’s central insights, as an ethos, was that the internet gives individuals more power at the same time that amoral, corporatized institutions build up their strongholds. It’s funny that some of the same people — the cypherpunks, say — explicitly bridged cynical cyberpunk and sunny techno-utopianism.

In John Perry Barlow’s “A Declaration of the Independence of Cyberspace” manifesto, presented to “Governments of the Industrial World” at Davos, he said:

The global conveyance of thought no longer requires your factories to accomplish. […] We must declare our virtual selves immune to your sovereignty, even as we continue to consent to your rule over our bodies. We will spread ourselves across the Planet so that no one can arrest our thoughts.

No one can arrest our thoughts, unless they’re hosted on AWS — a factory of the information economy if there ever was one — in which case someone fat-fingering a command kicks your thoughts into the inaccessible nowhere of a disconnected server farm. It’s impossible not to be at someone’s mercy.

Header artwork by Igor Kirdeika.

Crowdsourced Common Sense

My friend Beau Gunderson showed me KmikeyM, a project in which “shareholders” get to vote on what Mike Merrill does with his life. Merrill describes his endeavor like this:

By “buying shares” in Mike Merrill you are in effect giving me money. In exchange, I am valuing your input on my choices based on how many of those shares you buy. As this mini-economy grows, my stock price will become a benchmark for my success; the higher the stock price, the more optimistic my shareholders are.

When Shareholder Questions come up, stock owners will be able to vote on significant choices in my life. The more shares you have, the more weight your vote has. You could be the only person voting to send me to night school, but with enough shares behind you, I’m going to enroll. Ostensibly, if we make the right choices, we both win.

Here are some recent decisions that Merrill crowdsourced: Should he accept any and all Facebook friend requests? (No.) Should he subscribe to Spotify? (Yes.) Should he un-register as a Republican? (Yes.) There are many, many more proposals and results.
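The voting mechanics are simple enough to sketch. Here is a minimal model of share-weighted voting; the function name and the numbers are mine, not anything from KmikeyM itself:

```python
from collections import defaultdict

def tally(votes):
    """Tally a share-weighted vote.

    votes: list of (shares_held, choice) pairs, one per shareholder.
    Returns the winning choice and the weighted totals per choice.
    """
    totals = defaultdict(int)
    for shares, choice in votes:
        totals[choice] += shares  # each share counts as one vote
    winner = max(totals, key=totals.get)
    return winner, dict(totals)

# A single large holder can outvote many small ones:
winner, totals = tally([(600, "yes"), (100, "no"), (150, "no"), (50, "no")])
print(winner, totals)  # yes {'yes': 600, 'no': 300}
```

The weighting is exactly what Merrill describes: one person can lose the headcount but win the vote, as long as enough shares sit behind their position.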

Is this performance art? Is it business? A sensible way to run your life? Markets are very good at aggregating and weighing preferences, but there’s no guarantee that Merrill’s shareholders will optimize for his values or goals. I suppose he protects himself by only offering up fairly trivial questions.

KmikeyM reminds me of Sarah Meyohas’ stock-trading paintings and Justin.tv, the precursor to Twitch. I deeply wish that I’d come up with the idea first! Of course, there’s no reason why I couldn’t copy him… Would you buy Sonya Shares?

Correction: Mike Merrill allowed the shareholders to dictate whether he would propose to his girlfriend. (They voted yes, but an inside source told me that he hasn’t done it yet.) That’s not a trivial question at all!

Political Economics, I Guess

“Silicon valley ran out of ideas about three years ago and has been warming up stuff from the ’90s that didn’t quite work then. […] The way that Silicon Valley is structured, there needs to be a next big thing to invest in to get your returns.” — Bob Poekert

Bob Poekert's avatar on Twitter.


I interviewed Bob Poekert, whose website has an unsurpassable URL. Perhaps “interviewed” is not the right word, since my queries weren’t particularly cogent. Mainly we had a disjointed conversation in which I asked a lot of questions.

Poekert is a software engineer who I follow on Twitter and generally admire. He says interesting contrarian things like:

“all of the ‘machine learning’/’algorithms’ that it’s sensical to talk about being biased are rebranded actuarial science” — 1

(Per the Purdue Department of Mathematics, “An actuary is a business professional who analyzes the financial consequences of risk. Actuaries use mathematics, statistics, and financial theory to study uncertain future events, especially those of concern to insurance and pension programs.”)

(Also, Poekert said on the phone with me, “[The label] AI is something you slap on your project if you want to get funding, and has been since the ’80s.” But of course, what “AI” means has changed substantially over time. “It’s because somebody realized that they could get more funding for their startup if they started calling it ‘artificial intelligence’.” Automated decision trees used to count as AI.)

“what culture you grew up in, what language you speak, and how much money your parents have matter more for diversity than race or gender” — 2

“the single best thing the government could do for the economy is making it low-risk for low-income people to start businesses” — 3

“globalization has pulled hundreds of millions of people out of poverty, and it can pull a billion more out” — 4

“the ‘technology industry’ (read: internet) was never about technology, it’s about developing new markets” — 5

Currently Poekert isn’t employed in the standard sense. He told me, “I’m actually working on a video client, like a Youtube client, for people who don’t have internet all the time.” For instance, you could queue up videos and watch them later, even when you’re sans internet. (Poekert notes, “most people in the world are going to have intermittent internet for the foreseeable future.”)

Poekert has a background in computer science. He spent two years studying that subject in college before he quit to work at Justin.tv, which later morphed into Twitch. Circa 2012, Poekert joined Priceonomics, but was eventually laid off when the company switched strategies.

I asked Poekert about Donald Trump. He said that DJT “definitely tapped into something,” using the analogy of a fungus-ridden log. The fungus is dormant for ages before any mushrooms sprout. “There’s something that’s been, like, growing and festering for a really long time,” Poekert told me. “It’s just a more visible version” of a familiar trend.

Forty percent of the electorate feels like their economic opportunities are decreasing. They are convinced that their children will do worse than they did. You can spin this with the Bernie Sanders narrative of needing to address inequality — or the Trump narrative of needing to address inequality. Recommended remedies are different but the emotional appeal is similar.

Poekert remarked, in reference to economists’ assumptions, “It would be nice if we lived in a world where everyone is a rational actor.” But that world doesn’t actually exist.

Hinting at Globalism

In response to my floundering last week, reader Michael Dempsey suggested:

I think that you could take a look at a weekly concept and go deeper as to the best case, worst case, and cyberpunk outcomes in each. Would allow you to avoid constant negativity while also writing about how our future very well could splinter based on outcomes.

And reader Jan Renner suggested:

Several millennia in the past Europe was the cradle of innovation and cultural development. In my opinion this came to be by chance, since the climate was always very balmy in middle Europe, which made survival much easier compared to other parts of the world. Alongside with some easy to domesticate animals this gave early Europeans a lot of free time for thinking, innovating and developing in all areas of life. This resulted in rich kingdoms and such, which lead to colonization of most of the world, which lead to various other things in turn.

So, I don’t agree with this entirely. Europe and its offspring did end up being globally dominant — see Guns, Germs, and Steel plus current American hegemony — but European empires weren’t the first of their kind, and there were other large-scale powers operating concurrently. Many scientific and cultural advances originated elsewhere before being co-opted by Europeans. That said, Renner is broadly correct. (This isn’t a reflection of the quality of European people, but rather luck and initial conditions snowballing into surprising end results.)

Tying the two suggestions together, this week I’m going to look at the best case, worst case, and cyberpunk case of today’s empires. I am definitely coming at this from an American perspective, since that’s where I live and what I know best. YMMV.

Image via Salon; originator of the ~cyber~ edit unknown. This is Frank Underwood from House of Cards, played by Kevin Spacey.


Let’s start the week on an optimistic note, eh? I actually think we’re pretty darn close to an optimal setup, assuming we can keep multinational trade deals intact. That may reflect my cynicism re: what the best-case scenario can be.

On a macro level, political outcomes are largely important to the extent that they affect economic outcomes, and I expect Hillary Clinton (the overwhelmingly likely winner, but please still vote) to be pretty pro-trade, whatever her stump-speech rhetoric. She’s a neoliberal and from what the disgusted leftists tell me, neoliberals like free markets.

The great thing about trade is that it’s win-win for the parties who are directly involved. From Nick Szabo’s long essay about the origins of money:

Because individuals, clans, and tribes all vary in their preferences, vary in their ability to satisfy these preferences, and vary in the beliefs they have about these skills and preferences and the objects that are consequent of them, there are always gains to be made from trade. Whether the costs of making these trades — transaction costs — are low enough to make the trades worthwhile is another matter.

One of the useful effects of the internet is pushing transaction costs lower and lower. Transaction costs are intimately tied to distribution, of both goods and ideas. The internet has “disrupted” the geography-bound analogue world in which distribution was slow and full of gatekeepers. We all bounce together so much more often now.
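Szabo’s condition can be put in a few lines of toy arithmetic: a trade creates surplus when the buyer values the good more than the seller does, but it only happens if that surplus covers the cost of transacting. The numbers below are invented purely for illustration:

```python
def gains_from_trade(buyer_value, seller_value, transaction_cost):
    """Surplus created by a trade, or None if the trade isn't worthwhile.

    The buyer values the good at buyer_value, the seller at seller_value.
    Trading creates (buyer_value - seller_value) in surplus, but the trade
    only happens if that surplus exceeds the cost of transacting.
    """
    surplus = buyer_value - seller_value
    if surplus > transaction_cost:
        return surplus - transaction_cost
    return None

# High transaction costs kill otherwise beneficial trades:
assert gains_from_trade(100, 60, 50) is None  # cost eats the surplus
assert gains_from_trade(100, 60, 5) == 35     # trade worthwhile
```

Push the transaction cost toward zero, which is roughly what the internet does to distribution, and trades that were previously not worth making start clearing.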

The unfortunate things about trade are 1) environmental externalities and 2) HR externalities.

Manufacturing wreaks a lot of environmental havoc that the perpetrating companies are never held accountable for, often in countries with nonfunctional governments. (Think mineral mining in the Democratic Republic of the Congo.) And then from the human resources perspective, a corporation moving to [insert country with lower labor costs] is good for both the corporation and the workers in the place they relocate to. But it’s hard for the place they relocate from, at least in the short term.

I don’t see a quick solution to either of these problems. We need strong governments so that we can pressure large companies not to do the heinous things that they love to do absent regulation, and we need free trade to fully express comparative advantage.
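Comparative advantage deserves a quick worked example, since it is the whole case for free trade. The per-unit labor costs below are Ricardo’s classic numbers (Portugal is better at producing both goods, yet trade still helps); the 3,600-hour labor supply is an arbitrary figure I chose for round math:

```python
# Ricardo's classic numbers: labor-hours needed per unit of each good.
# Portugal: wine 80, cloth 90. England: wine 120, cloth 100.
HOURS = 3600  # labor-hours available in each country (arbitrary round figure)

# Autarky: each country splits its labor evenly between the two goods.
autarky_wine  = HOURS / 2 / 80 + HOURS / 2 / 120   # Portugal + England
autarky_cloth = HOURS / 2 / 90 + HOURS / 2 / 100

# Trade: England specializes fully in cloth (its comparative advantage);
# Portugal makes just enough cloth to hold world cloth output constant
# and spends every remaining hour on wine.
england_cloth  = HOURS / 100
portugal_cloth = autarky_cloth - england_cloth
portugal_wine  = (HOURS - portugal_cloth * 90) / 80

print(autarky_wine, portugal_wine)  # 37.5 vs 42.75: same cloth, more wine
```

Same total labor, same amount of cloth, plus 5.25 extra units of wine. That surplus is what gets destroyed when trade is blocked.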

What’s really missing is easy movement of labor — if individual humans were able to migrate at will, they could go to wherever the jobs were until we reached a supply-and-demand equilibrium.

I said a few paragraphs ago, “political outcomes are largely important to the extent that they affect economic outcomes” — this is an example. A pro-immigration, not-explicitly-racist president is crucial because that kind of executive may ease restrictions on workers’ ability to relocate according to their financial prospects.

Does all of that make sense? Am I too callous, zooming out to focus on economics?

Reader JM Porup disagrees with me re: multinational trade deals. He previously wrote an article about his thoughts on the matter, which you should read if you’re interested!

The Internet, Globalization, and You

Beau Gunderson’s $10 Patreon reward prompt was, “How does living in a cyberpunk world affect our self-determination?” So first let’s talk about regular ol’ self-determination. There are a couple ways to interpret this: sovereign or individual.

The poli-sci version of self-determination is that the citizens of a country get to choose their own mode of government and get to define their constitution. Wikipedia says this “cardinal principle in modern international law […] states that nations, based on respect for the principle of equal rights and fair equality of opportunity, have the right to freely choose their sovereignty and international political status with no interference.”

The individual form of self-determination is a similar idea, but scaled down — the right and ability to direct your own life. If you examine this closely it’s an obvious illusion, but because free will doesn’t feel like an illusion, we pretend that it exists. I am the master of my fate! It’s a more practical attitude.

Sovereign Self-Determination

Europe and the United States are seeing a split in public sentiment between corporate elite globalism and protectionist plebeian nationalism. I frankly don’t know how this is playing out in South America, Asia, Africa, Australia, etc, etc — but whither goes the USA, the rest of the world tends to follow.

I’d bet on the elite winning over time, and thus the power of governments relative to giant transnational companies weakening and weakening. I mean, hey, at least Cthulhu swims left. But that might take a while, so perhaps global warming will force a sea change first? (Pun very intended.)

The internet is a globalizing force, and it’s so economically compelling that no country or group of people can resist it forever. The winner-take-all dynamics of internet businesses help create new hegemonies that transcend borders. I do want to note that there is significant upside! But upside is not my beat 😉

Personal Self-Determination

I said we pretend to have free will, so even though I don’t believe it exists in a philosophical sense, I’m just going to use conventional language.

Does a cyberpunk world erode the choices available to you? The internet substantially empowers huge companies (think Google and Facebook), but it also substantially empowers individuals.

You can talk to (almost) anyone, broadcast whatever you want (unless it’s child porn, but I’m okay with that restriction), and sell just about anything anonymously (provided a certain level of opsec prowess — unfortunately this one does apply to child porn). Those caveats don’t negate that more opportunities are available than ever before.

I do worry that I’m over-indexing on my own reality. I have lots of cultural capital, a middle-class safety net, and live in a place full of jobs. Elsewhere in my country and probably yours as well, there’s a demographic that is saturated with despair.

Opportunities are available. Being equipped to take the opportunities is another thing, yeah?

Header photo by Roel Hemkes.

Relentlessly Growth-Oriented & Profit-Seeking

Developer Francis Tseng, who made Humans of Simulated New York, is currently crowdfunding a dystopian business simulator called The Founder. You play as the head of a startup and your goal is to grow the company however you can. Little obstacles like other people’s lives shouldn’t bother you!

Artwork from dystopian video game The Founder. Image via the Kickstarter campaign.


Tseng writes in his crowdfunding pitch:

“How is the promise of technology corrupted when businesses’ relentlessly growth-oriented and profit-seeking logic plays out to its conclusion? What does progress look like in a world obsessed with growth, as measured only by sheer economic output?”

It looks a lot like San Francisco. That’s not a compliment.

“Winning in The Founder means shaping a world in which you are successful — at the expense of almost everyone else.”

Not so different from the real world of business, right?

Screenshot from The Founder's game website. "Change the world. Everything you do has a consequence. With your revolutionary new products, you have the power to shape a brave new world — one in which every facet serves your ceaseless expansion."


I don’t believe that economics is a zero-sum game, especially when it comes to technology. “Innovation” may be an over-fetishized buzzword, but it really is able to move the needle on people’s quality of life.

Unfortunately, that aspect of industry is not prioritized in practice. The profit motive should be a proxy for ~making the world a better place~ but it often gets treated as an end in and of itself.

The Founder interrogates this trend and hopefully makes the player feel uneasy about their own incentives. If you’re interested in playing, contribute!

“The best minds of my generation are thinking about how to make people click ads.” — Jeffrey Hammerbacher, data scientist and early Facebook employee

Cryptocurrencies Aren’t Fake, They’re Just Libertarian

Bitcoin-themed coaster. Photo by pinguino k.


A headline from the Miami Herald: “Bitcoin not money, Miami judge rules in dismissing laundering charges” — c’mon! Bitcoin is clearly money. I have mixed feelings about how cryptocurrencies should be regulated, but they are obviously currencies. The judge’s rubric for this question was weird and ahistorical:

“Miami-Dade Circuit Judge Teresa Mary Pooler ruled that Bitcoin was not backed by any government or bank, and was not ‘tangible wealth’ and ‘cannot be hidden under a mattress like cash and gold bars.’

‘The court is not an expert in economics; however, it is very clear, even to someone with limited knowledge in the area, the Bitcoin has a long way to go before it is the equivalent of money,’ Pooler wrote in an eight-page order.”

Most mainstream currencies are backed by governments, but that’s not an inherent feature of money, just a modern quirk. How do people think money got started? It grew out of bartering, and for a very long time it wasn’t regulated or centrally controlled at all. [Edited to add: David Graeber’s Debt asserts that money actually emerged before bartering. Does not change my larger point. See the note at the end.] Just as an example, per the Federal Reserve Bank of San Francisco:

“Between 1837 and 1866, a period known as the ‘Free Banking Era,’ lax federal and state banking laws permitted virtually anyone to open a bank and issue currency. Paper money was issued by states, cities, counties, private banks, railroads, stores, churches, and individuals.”

And that’s relatively recent! John Lanchester wrote a truly excellent overview of what money actually is and how it functions for the London Review of Books, and I wish I could make this judge read it.

Granted, legal definitions exist in a parallel reality, so maybe there’s some legislative reason why the US government can’t bestow official currency status on non-state-sponsored currencies. They’d certainly have to step up their game when it comes to regulating them, which would be a lot of work since so far their game has been practically nonexistent.

Just to top off the ridiculousness, Tim Maly drew my attention to this bit from the Miami Herald article: “‘Basically, it’s poker chips that people are willing to buy from you,’ said Evans, a virtual-currency expert who was paid $3,000 in Bitcoins for his defense testimony.”

As Maly quipped on Twitter, “Bitcoin isn’t money laundering because bitcoin isn’t money says bitcoin expert paid in bitcoin.”

Is this merely a question of semantics? Yes. But I’ve always come down on the side that language is important — it’s both my first love and my livelihood, after all — and it bothers me to see foundational economic concepts misapplied. Let’s at least describe our brave new world accurately.

Note on the origins of money: Facebook commenter Greg Shuflin mentioned David Graeber’s book Debt: The First 5,000 Years and its assertion that bartering came after money. It doesn’t change my larger point, but here’s the relevant Wikipedia passage:

“The author claims that debt and credit historically appeared before money, which itself appeared before barter. This is the opposite of the narrative given in standard economics texts dating back to Adam Smith. To support this, he cites numerous historical, ethnographic and archaeological studies. He also claims that the standard economics texts cite no evidence for suggesting that barter came before money, credit and debt, and he has seen no credible reports suggesting such. […] He argues that credit systems originally developed as means of account long before the advent of coinage, which appeared around 600 BC. Credit can still be seen operating in non-monetary economies. Barter, on the other hand, seems primarily to have been used for limited exchanges between different societies that had infrequent contact and often were in a context of ritualized warfare.”

Sounds like an interesting book!

International Labor Economics, Ugh

The "Bread and Roses" Lawrence textile strike of 1912. Photo via Library of Congress.


In recent musings about Las Vegas, I called myself bourgeois. Refresher: according to Karl Marx, the bourgeoisie are the capitalist class, prone to consumerism — also often associated with snobbery and intellectual affectations. Think New Yorker readers.

I bring this up because commenter gaikokumaniakku said, “There are a lot of folks who thought they were bourgeois, and then they woke up one morning to another rejected job prospect and realized that they were lower-class.” I agree with Scott Alexander that class does not solely hinge on money, but the point is a good one.

gaikokumaniakku also asked what I think of the term “precariat”, which is a play on “proletariat” (opposite of the bourgeoisie). The precariat are people without financial reserves or job security. Macmillan Dictionary’s BuzzWord blog published this in 2011:

“New, international labour markets, significantly expanding the available workforce, have weakened the position of workers and strengthened the position of employers. Increasingly, workers are in jobs which are part-time and/or temporary, have unpredictable hours, low wages and few benefits such as holiday or sick pay. This means that employers can follow what demand dictates and simply [fire people] if work is not available, and are also not obliged to pay anyone that isn’t actually working.”

I find it plausible that globalization is a big part of this. That’s been an ongoing trend: jobs once located in [country where labor is expensive] disappear offshore to [country where labor is cheap]. Workers don’t have the same freedom of movement that employers do, so they can’t easily respond to market changes. Larger companies especially, which rely on many people’s labor, can shift operations to wherever costs the least.

Demonstrators in New York City during the 1913 May Day parade. Signs feature Yiddish, Italian, and English. Photo via Library of Congress.


Priest and professor Giles Fraser wrote a Guardian editorial on this very topic:

“In this era of advanced globalisation, we believe in free trade, in the free movement of goods, but not in the free movement of labour. We think it outrageous that the Chinese block Google, believing it to be everyone’s right to roam free digitally. We celebrate organisations such as Médecins Sans Frontières for their compassionate universalism. But for all this talk of freedom from restriction, we still pen poor people into reservations of poverty. […] At present, globalisation is a luxury of the rich, for those of us who can swan about the globe with the flick of a boarding pass. The so-called ‘migrant crisis’ is globalisation for the poor.”

The other macro factor that might be creating (and provoking) the precariat is technological unemployment. The machines are taking our jobs!!!!! Ahhhhh!!!!! My guess is that we’ll adjust to new levels of productivity, like we did after the Industrial Revolution, but the transition phase will be very painful. (I basically stole this theory from Ben Thompson.)

Beyond that, I don’t have any particular insights. If you do, hit reply and let me know?