Can A.I. Eliminate Jobs That We Don't Actually Need?

Photo by Lyman Hansel Gerona (@lhgerona) on Unsplash

Posted on 18 May, 2023.

I recently read the famously excellent book "Bullshit Jobs" by the late, great David Graeber, about jobs that don't add any value to the world we live in (and indeed cause economic and societal harm), but persist due to mostly political and sociological (but not so much economic) reasons.

(I should add here that the vast majority of jobs, even for those of you who feel like "my job is bullshit", are not "Bullshit Jobs" per Graeber's definition, and are actually needed.)

Before your mind's associative mechanisms take you to the public sector, allow me to inform you that Graeber specifically focuses on—and in my opinion proves, surprisingly—how this is an even bigger phenomenon in the private sector, especially in larger organisations. This is a dangerous book, and well worth reading!

In parallel, like many of you, my email and chat apps are blowing up with numerous threads concerning the recent breakthroughs in generative AI (e.g. ChatGPT or Midjourney), and what it means for our future; implications mentioned range from a worrying decline in the quality of high school book reports, all the way to The Singularity.

Obviously, jobs are a hot topic. Will machines replace us at our jobs? It's a multifaceted and fascinating discussion that will take a long time to unfold in the real world. Some aspects are more certain, such as the implications for the factory and retail sectors, where—absent a social safety net—work elimination could lead directly to poverty. But while we get our heads around these developments and what they mean, I think Graeber's book provides us with an interesting and actionable thought exercise: identifying a group of jobs that AI could (and perhaps should) help to transition.

Graeber classifies Bullsh*t Jobs into five categories. Here they are with my crude assessment of risk:

  1. Flunkies: some leaders equate the number of their reports with power, even if those reports sit around and do nothing (think royal courtiers in history, or Gareth/Dwight from the UK/US versions of The Office). No AI is silly enough to try to recreate this task, because there is no task to recreate. LOW RISK.

  2. Goons: roles with an aggressive nature that have an adverse effect, like soldiers (the only reason we need them is that other people have them too), but also (per Graeber!) bank lobbyists and some corporate lawyers. This comes down to basic game theory, and unless we hand AI all the control (for an example, watch the hilarious S1E9 of Love, Death & Robots on Netflix), it won't change the underlying premise of this particular game. Going back to the warfare example: as warfare evolves, many military tasks are slowly being taken over by machines (e.g. drones, cyber-warfare, SIGINT), though not uniformly across countries, and sadly, those tasks are definitely not being eliminated. LOW RISK.

  3. Duct tapers: these people are busy manually fixing or bridging over inefficiencies or disconnects that shouldn't exist in the first place; if we built systems more thoroughly and consolidated around platforms, we wouldn't need these jobs. System design and modelling seem like a perfect place for AI to excel. HIGH RISK.

  4. Box tickers: people whose roles exist so that a company can claim it has done something, like an oil & gas company hiring an ESG Manager with no budget or authority. AI is not taking over the government institutions that create compliance standards anytime soon... but it could automate how we meet some of the requirements. MEDIUM RISK.

  5. Taskmasters: these are people whose job is to assign tasks (Type 1), or to both create bullshit tasks and then assign them (Type 2). Type 1 could easily be replaced by an orchestration mechanism, in the same way that Uber or Lyft assigns rides to drivers; Type 2 is usually a derivative of Type 1, so it would also be at risk, eventually, or merge with the Flunkies category. HIGH RISK.
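As a thought experiment, the Type 1 "assign tasks" role could be sketched as a simple greedy dispatcher, in the spirit of the ride-hailing analogy. Everything here (the function name, the worker names, the effort model) is a made-up illustration for this post, not anything from Graeber's book or a real dispatch system:

```python
# Hypothetical sketch of replacing a "Type 1" taskmaster: a greedy
# dispatcher that hands each incoming task to the least-loaded worker,
# roughly the way a ride-hailing app matches rides to nearby drivers.
import heapq

def assign_tasks(tasks, workers):
    """Assign (task, effort) pairs to workers, balancing load greedily.

    Returns a dict mapping each worker to the list of tasks assigned.
    """
    # Min-heap of (current_load, worker): the root is always the
    # least busy worker, so each pop is the next assignee.
    heap = [(0, w) for w in workers]
    heapq.heapify(heap)
    assignments = {w: [] for w in workers}
    for task, effort in tasks:
        load, worker = heapq.heappop(heap)
        assignments[worker].append(task)
        heapq.heappush(heap, (load + effort, worker))
    return assignments
```

For example, `assign_tasks([("report", 2), ("audit", 1), ("deck", 1)], ["ana", "bo"])` gives the heavier report to one worker and the two lighter tasks to the other. The point isn't the algorithm's sophistication; it's that pure task routing is mechanical enough that no human middle layer is needed.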

Final tally: 2.5 out of 5 isn't bad! :)

On a serious note, of course, prophecy was given to fools, and the likelihood of any of this coming to pass is greatly dependent on us as employees, managers, consumers, and citizen-voters. But if you're looking to read something that challenges your perception of the political, economic and professional systems around us—I highly recommend this book.
