#420 – WHEN THE JOBS DON’T COME BACK? – KURT CAGLE

Meta laid off another 5,000 people today, many from its content moderation division. Companies are framing these massive cuts as streamlining operations, but let’s be blunt – investors are driving this purge, trying to get a quick cash boost to the companies’ bottom lines so that they can cash out without taking a bath.

The AI recession continues apace, just as the dot-com recession did more than a generation ago. When you live long enough, you begin to develop a sense for recessions, much as certain vintages of wine were so spectacularly bad that they take on a life of their own. The Recession of 1991, for instance, was a relatively short hiccup, a brief slowdown that pundits of the time had predicted would be bad, yet for the most part was more of an inconvenience. Let me come back to the dot-com recession in a moment.

On the other hand, the Great Recession of 2007–2009 was deep and painful, primarily affecting the housing sector, construction, the bond market, and the broader economy. If you were in tech or made your living on the equity side of things, it had comparatively little impact, especially as mobile was beginning to heat up during that period.

The Pandemic Recession of 2020 technically barely qualified as a recession, as most of the job losses and corresponding job gains occurred over about a two-month period. It was one of the more novel recessions in that, before it, the unemployment rate was at near-record lows, and the people who were laid off generally weren’t rehired by their old employers but were snapped up by companies desperately short of talent.

The dot-com bust of 2001–2003 is worth studying more closely because it provides a good mirror of what we may be looking at today. The stock market had been climbing in the months leading up to that period, but most of the activity was concentrated in a handful of new tech stocks. Industry analysts were beginning to raise warning flags about churn in the labor market, and there was a sense that market valuations no longer reflected reality and would eventually collapse.

When earnings season hit in early 2001, companies capitulated by laying off workers, mainly in the tech sector. The realization had hit that the technology wasn’t yet at a point where it could deliver on the promises that had been made about when investors would see any return. Headcount was the easiest thing to trim.

This was a period when a lot of paper millionaires, kids loaded up with stock options, discovered that 1) options were usually only exercisable if you had been with a company long enough to vest, and 2) when you are unemployed and holding a bunch of options but very little actual money, those options are worthless, especially once said companies close their doors forever. At one point during that period, San Francisco became affordable.

Structurally, several key things were happening behind the scenes. Google and Amazon came into their own about then. Netscape would re-form as Mozilla and would eventually release the Firefox browser. Text-based services became normalized, and the first big text-based document data standard – XML – saw its most formative work. It was a similarly fruitful period for Linux, as developers didn’t have the money for the expensive software toolkits that Microsoft had released earlier, and technical innovations in the mobile space laid the groundwork for smartphones and the wireless revolution of the 2010s.

It was also a period that chased all but the most hardcore programmers out of the IT field and into other areas (including house flipping, which was at least partially responsible for the 2008 recession). Too much money may be worse for technology than too little, a lesson investors learn and forget over the years.

What was most remarkable was that, outside of places like San Francisco, Boston, and Seattle, you could have been excused for thinking that the economy had already recovered by 2002. Equity recessions tend to have this characteristic. The economy recovered quickly, but it’s worth noting that tech employment never returned to its former share of the labor market, though by the time of the 2008 recession, spending and hiring were up in the IT sector.

Waiting for Godot

I do not doubt that the 2023 recession will be known as the AI Recession, partly as a nod to the dot-com recession. It has many similarities to that earlier downturn; ordinarily, it would share the same trajectory. However, some aspects of this recession concern me a great deal.

That an AI recession would hit has been evident for a number of years. Artificial intelligence isn’t, in and of itself, a technology. Instead, it is usually a label applied to a tech stack that primarily impacts the knowledge sector, and it is fairly amorphous in that regard. Neural networks are merely the latest technology to bear the label, which has also been used to describe everything from natural language processing and inferential knowledge graphs to expert systems, computer vision, and gaming.

What’s most remarkable about this period is that so many of these technologies benefited from machine learning dramatically and simultaneously, to a degree most people are still unaware of. Large language models are a truly remarkable advance, diffusion-based image generators are stunning in what they can produce, the leap in computer vision and simulation was completely unexpected, and it is precisely because all of these technologies benefit from the same underlying innovations that the field seems to be advancing at breathtaking speed.

Even more amazing is that many of the significant areas of development that seemed stymied in the last decade are now on track to become reality by the end of this one. Autonomous vehicles needed massive investment in GPUs and the development of neural network “algorithms” to be feasible. It’s not hard to see cars by 2030 running large language models hosted on onboard GPU clusters and reprogrammable FPGAs, coordinated by cloud-based traffic control systems (even a couple of years ago, I would have put that at 2040 at the earliest).

I also see the key pieces for the metaverse now emerging as feasible: the generation of dynamic avatars (through generative AIs), the rise of autonomous agents (via AutoGPT and similar generative programmers), and the awakening of companions (via ChatGPT and intelligent filtering). None of these are QUITE here yet, but their necessary precursors are. Nodal cloud programming is the next big thing needed – arrays of specialized knowledge graphs, LLMs, and router agents that collectively store short and long-term memories and pass states from one system to another without direct human intervention while keeping such systems secure.
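As a thought experiment, here is a minimal Python sketch of what such a nodal arrangement might look like: a router agent passing shared state through a stubbed knowledge graph node (long-term memory) and a stubbed LLM node. Every class and name here is a hypothetical illustration of the pattern, not an existing framework.

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    """State passed from one system to another without direct human intervention."""
    query: str
    context: dict = field(default_factory=dict)

class Node:
    """Base class for a specialized node in the array."""
    def handle(self, msg: Message) -> Message:
        raise NotImplementedError

class KnowledgeGraphNode(Node):
    """Stub standing in for an inferential knowledge graph (long-term memory)."""
    def __init__(self, facts: dict):
        self.facts = facts
    def handle(self, msg: Message) -> Message:
        msg.context["facts"] = self.facts.get(msg.query, [])
        return msg

class LLMNode(Node):
    """Stub standing in for a hosted large language model."""
    def handle(self, msg: Message) -> Message:
        facts = "; ".join(msg.context.get("facts", []))
        msg.context["answer"] = f"Draft answer to '{msg.query}' using: {facts}"
        return msg

class RouterAgent:
    """Routes state through specialized nodes, acting as short-term memory."""
    def __init__(self, pipeline: list[Node]):
        self.pipeline = pipeline
    def run(self, query: str) -> str:
        msg = Message(query=query)
        for node in self.pipeline:  # each hop hands state to the next system
            msg = node.handle(msg)
        return msg.context["answer"]

router = RouterAgent([
    KnowledgeGraphNode({"solar panels": ["efficiency ~20%", "cost falling"]}),
    LLMNode(),
])
print(router.run("solar panels"))
```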

Decentralized identifiers (DIDs) and verifiable credentials will also be part of this security mix, and ONCE those are in place, a financial transaction layer will emerge as a stack. I have no real opposition to digital currency, but I have long felt that blockchain and the whole zero-trust approach were NOT the way to make that happen; by rushing that part of the stack, we let a few con artists become fabulously rich at the expense of just about everyone else. Get DIDs right, nail identity management, and digital currency should come along for free, sweeping most of the current DeFi stack out the door.
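For the curious, a minimal verifiable credential per the W3C VC Data Model looks roughly like the following, shown here as a Python dictionary. The issuer DID, subject DID, and signature value are placeholders for illustration, not real identifiers or keys.

```python
# A minimal sketch of the W3C Verifiable Credential shape (VC Data Model 1.x).
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:issuer-123",           # hypothetical issuer DID
    "issuanceDate": "2023-05-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:subject-456",          # hypothetical subject DID
        "role": "Verified contractor",
    },
    "proof": {
        "type": "Ed25519Signature2020",
        "verificationMethod": "did:example:issuer-123#key-1",
        "proofValue": "zPlaceholderSignature",    # placeholder, not a real signature
    },
}
print(credential["credentialSubject"])
```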

The AI Recession

Have you noticed something? There’s a clear roadmap here, one that will still require some heavy-duty lifting to make real, but that’s feasible. And yet, companies can’t fire their data scientists and machine learning specialists fast enough. What the hell is going on?

The answer is simple: Investors got greedy earlier and are panicking now.

The Pandemic caused a real economic disruption, forcing rapid improvisation that drove the AI revolution. Supply chain disruptions added to the costs of producing and distributing goods and services and caused the inflation that hit starting in 2021. Some of that inflation was inevitable: despite the monetarist viewpoint that has become gospel in economic circles over the last seventy years, inflation is not generally caused by economic policy; it is caused by supply chain disruption.

Inflation, however, becomes entrenched when companies raise the prices of goods and services without any pressure to do so, simply because the expectation is that they can. Prices, once raised, become sticky; vendors do not want to lower them again. There was pressure on the Fed to get rid of inflation, and it set to it with the most obvious tool it had available: raising interest rates, which had been at historic lows for more than twenty years.

However, when you raise interest rates, you also raise the cost of borrowing. The vast majority of money in the IT space is leveraged; it is based on loans (many of them variable-rate loans) that suddenly become MUCH more expensive. People make dubious investments when the cost of money is both cheap and stable, because risk can be diversified across multiple investments so that even one success pays for the failures of the rest. When rates rise unexpectedly, the cost of servicing those loans rises dramatically, and investors must find ways to cut costs.
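The arithmetic is simple but brutal. A sketch with purely hypothetical numbers:

```python
# Hypothetical figures: annual interest due on a $50M variable-rate loan
# when rates jump from cheap-money levels to post-hike levels.
principal = 50_000_000
for rate in (0.01, 0.05):
    print(f"At {rate:.0%}: ${principal * rate:,.0f} per year in interest")
# At 1%: $500,000/year -> At 5%: $2,500,000/year, a fivefold jump in debt
# service with no corresponding increase in revenue.
```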

Trimming labor has become the go-to means of doing so because decades of business-friendly governments have made labor cheap to cut. It also has the immediate effect of pushing wages back down: once that labor is released back into the available workforce, there are more candidates chasing fewer openings.

Here’s where things get weird. Three competing trends are underway right now:

Demographics and Quiet Quitting. Boomers are retiring, the GenX population is shrinking (it was a much smaller generation), Millennials are growing in number but slowly, and GenZ (now entering the workforce) looks to be one of the smallest generations yet. Couple this with more people suffering long Covid and similar trauma after the Pandemic, and the labor supply is declining and will continue to decline for the next thirty years.

Fewer Advanced Degrees. There are fewer people with advanced degrees, especially in STEM. It takes a minimum of 25 years of education, counted from childhood, to produce a Ph.D. or an MD, and the costs keep rising. These people are being fired right now, though the more thoughtful SMB companies are snapping them up quickly. This means that when the large companies go looking to rebuild their engineering forces in a couple of years to remain competitive, there will be no one left to hire, and the wages needed to attract them will be far higher than the wages at which they were fired.

The Job Replacement Factor of GPT. This is the wild card. There is a rumor (I haven’t been able to track down its source) that when IBM released several thousand workers recently, it announced that it would replace them with AI. Personally, I suspect the announcement was never actually made. Still, it raises an interesting question: how many companies are banking on the rapidly evolving and unpredictable current state of AI to get by with a reduced headcount? And if they are, how accurate will those bets prove to be?

I suspect many companies are salivating at the prospect of AI-driven headcount reductions, perceiving such reductions as likely permanent. The problem is that, based on my own experience and anecdotal interviews, AI isn’t the productivity driver companies believe it to be, especially since AI is already a component of most existing productivity tools. It can even be a time sink, because you are replacing a deliberate production process with a primarily stochastic, discovery-oriented one.

Similarly, most activities that people engage in today are not production-oriented. Instead, they are process- or organization-related: discovering requirements, designing, testing, promoting, or managing. People may be somewhat more likely to use generative tools instead of pre-existing ones, but for the most part they are not going to radically simplify the overall execution of their jobs with AI tools, which usually require domain-specific information that will not be available to the AI. Nor do I believe that people will voluntarily hand their own domain knowledge to an employer, knowing full well that this knowledge is the difference between being employed and not.

So the hope that AI will result in significant staff reductions may be premature at best. What it will do is make it easier for smaller companies to produce the products and services that the larger companies currently compete on. Put another way: because investors reacted so quickly to reduce headcount at a time when specialized human skills are needed more than ever, they are putting themselves at a significant business disadvantage at a critical juncture.

The End of the White-Collar Job

A very interesting cognitive reaction occurs when you tell someone that automation will replace jobs, but only blue-collar jobs: most people in corporate America consider this a good thing. Change the word “blue” to “white,” however, and they suddenly become anxious. Funny how the color and style of one’s shirt can make one so much more fearful.

However, the reality is that white-collar jobs are far more vulnerable to automation than blue-collar ones. Robots are expensive, complex, limited in functionality, and fragile. A skilled landscaper, a barista, a cook, a barber, a nurse – any of these positions will be difficult to replace with a robotic equivalent (I know the idea of a robotic barber terrifies me – cue the Sweeney Todd music).

I could go job by job and assess which are endangered, but I think what will actually happen is the disintegration of the corporation, a process already underway. Over the last twenty years, function after function has disappeared into the cloud.

For instance, human resources was once a department with dozens of employees at any moderate-to-large corporation. The department still exists, but its mandate has expanded dramatically, from managing payroll and related functions to managing training, security compliance, onboarding, offboarding, and the like, even with the assistance of an HR platform. An AI is not going to change that appreciably.

Sales, similarly, was dramatically changed by Salesforce as a tool, but the tool did not eliminate the salesperson. Once companions become more commonplace, part of the salesperson’s role will become harder, because there will be an “intelligent filter” advocating for the buyer, who may be hiding behind an avatar that serves the same role as a personal assistant or secretary. This means that sales and marketing will continue their merger from what had once been very distinct functions.

This is the more likely path I see for white-collar jobs: existing roles will disappear. The office manager is already disappearing as more offices become virtual, but a virtual office steward is emerging to manage the infrastructure of that virtual office. The same holds true for many formerly physical office functions. This virtualization, not necessarily AI itself, will be responsible for these changes, and yes, over time, it will erode “jobs” and job slots. Creatives, traditionally fairly independent, will become consultants, paid by the contract rather than by the hour.

Indeed, I think one of the most substantial impacts AI will have on white-collar jobs is that it will make wages a thing of the past, though it will take a few decades for that to happen fully. AIs will be developed as agents in the human sense, negotiating contracts that optimize available returns, possibly in conjunction with other agents. Put another way, we may be on the brink of AI-based unions.

Typically, creatives and entertainers at a certain level can hire agents to negotiate contracts, moving beyond hourly rates to getting points or percentages of the overall net of a product. This more entrepreneurial approach will become much more prevalent, primarily because good art, like good writing and good programming, tends to follow a power law rather than a normal distribution – with AI, good artists can become exceptional artists and be more in demand. From experience, it takes time, skill, and experimentation to become proficient in using the technology and to incorporate it with existing skills. I don’t believe this changes with AI.
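A toy simulation (hypothetical earnings figures, Python standard library only) shows what that power-law claim means in practice: under a normal distribution, the top 1% of creatives capture only a sliver of total earnings; under a Pareto (power-law) distribution, they capture a dominant share.

```python
import random

random.seed(42)
n = 10_000
# Normal world: most creative earnings cluster near the mean.
normal_earnings = [max(0.0, random.gauss(50_000, 15_000)) for _ in range(n)]
# Power-law world: a Pareto tail lets a few outliers dominate the total.
pareto_earnings = [50_000 * random.paretovariate(1.5) for _ in range(n)]

def top_share(earnings, frac=0.01):
    """Fraction of total earnings captured by the top `frac` of earners."""
    ranked = sorted(earnings, reverse=True)
    k = max(1, int(len(ranked) * frac))
    return sum(ranked[:k]) / sum(ranked)

print(f"Top 1% share (normal):    {top_share(normal_earnings):.1%}")
print(f"Top 1% share (power law): {top_share(pareto_earnings):.1%}")
```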

Wages are a form of social contract: an agreement in which people forgo a single large payment at the end of an engagement in exchange for regular payments that are cumulatively worth less but are predictable and carry an assumption of long-term employment. Seventy years ago, wages had far more buying power, a person had a near guarantee of work for life, and the social contract was arguably tilted towards labor.

Now, most work engagements have a mean lifespan of twelve to eighteen months, with the possibility of becoming unemployed at any moment a given. At the end of that period, the employee generally will have paid the bills, but that’s all. It increasingly takes two or more such engagements running simultaneously to make even that happen. Sweat equity sounds good in theory, but in practice, sweat counts for nothing when the pink slips start flying.

This is not sustainable. What I believe will most likely happen is that people will start using AI tools and services to better represent themselves to potential clients (not employers) and to better determine their actual worth in the market – and how best to capitalize on that value. Such AIs have the potential to be more objective in evaluating skills and marketability than most people are about themselves, something predatory employers take advantage of. They are also better placed to consolidate information about a person’s skills vis-à-vis the overall market, making it easier to see what comparable people are earning for the same work.

And the Rise of the Studio Model

This, in turn, will alter the business landscape in multiple ways. While there will always be corporate jobs, the corporation itself is dissolving into a sea (or graph) of different business relationships, each with various claims on the products for which the corporation is responsible. I call this the Studio Model, though I’ve heard it referred to elsewhere as the Hollywood Model. In this model, a producer or studio orchestrates a given production, and the work is divided by specialization: filming units, special effects, music, post-production, and so on. Each of these subordinate units typically works on multiple projects simultaneously, frequently for different clients, and they are quite often separate entities in their own right.

The advantage of the studio business model is that it reduces the amount of time that any given business unit (or any individual) sits unnecessarily idle. Salaries exist to keep people who would otherwise look for work focused on the same company, yet expensive talent can still often end up sitting idle (and hence more likely to seek out new work anyway).

Modern management theory holds that money incentivizes people to stay, but anecdotal evidence suggests that it is only one of several factors. Others include whether the work expands the person’s employability, is meaningful to that person’s long-term goals, is engaging, and involves mentorship from a positive boss. Companies can only really control the last, and even there, it is rarely a major priority. Moreover, when excessive salary costs become an excuse for trimming the workforce, even money becomes a liability.

Shifting to multiple client/contractor relationships via the Studio Model benefits both parties. First, it acknowledges reality: the wage model is no longer working. Even in the midst of an apparent recession, the labor market remains tight, and candidates are increasingly choosing entrepreneurial models (with greater ownership) over the guarantee of an income, especially as the wage model tends to flatten compensation. The studio model also discourages labor caching, where companies hire specialists (such as data scientists) to deny them to competitors rather than out of any particular need, as happened in the most recent cycle.

Ultimately, the driving factor will be a dire need for content. This may seem paradoxical: you would think that, with generative AI, you would need less content. However, while there may be a relatively short lull as systems catch up with one another, AI systems need novel content to stay current.

AT ITS CORE, an LLM or generative AI is simply another form of press. You can develop some incredible content using it, but almost all of that requires grist for the mill – photography, artwork, video, models, news, opinion, and narrative stories – which means creatives being PAID to create. The need is likely to grow as social media companies start being held responsible for compensating their creators, a move already underway in the European Union and likely to be seen in the United States later this year. Novelty is valuable, and that’s about to tip the equation so far towards the creative rather than the money side of the ledger that no one will know how to handle it.

Unfortunately, the benefits will not be uniform. Creative talent, in whatever manifestation, tends to follow power laws. AI will make people with solid skills and talent more productive and capable, but for most people, AI will merely replace other tools they already use. I follow a group of people who work fairly heavily with digital tools to produce artwork, and I’ve not seen a radical change in their numbers over the last several months. The quality of their work has improved noticeably, but these were already professional artists. Given access to Stable Diffusion or similar tools, the average person will produce two or three crappy drawings and conclude that all this talk about AI is hogwash.

In other words, the situation changes primarily because more talent will be needed to feed the engine while the quality of the product that talent creates also improves. This points to a shift as more (though not necessarily dramatically more) people opt to become involved in the design, media, gaming, and related areas. Similarly, marketing is becoming an analytical service (and subsuming sales), which means that the remaining sales management becomes business development. Documentation will eventually be fully automated, to the relief of just about everyone.

So: virtual development teams, less onsite infrastructure, and cloud-based business functions. The trend is fairly obvious – virtualization isn’t just about remote working but about the physical disintegration of businesses into their virtual equivalents, mediated by artificial intelligence. This is one reason I believe Return To Office efforts will fail: Covid accelerated a trend already underway by 2020, and there is no indication that the trend towards virtualization has been seriously countered. It’s also why I am not concerned about AI replacing knowledge workers soon. AI might replace the office worker, but only because AI virtualizes that office worker into a very different economic model.

But What If …

I realize I haven’t fully addressed the opening question. The reason is simple: it’s a nonsense question. Investors do not create value. This is not to say that what they do is without worth; the investors’ role is to ensure businesses can get past the critical phase of developing sufficient markets to become self-sustaining. AI reduces the costs of that phase, perhaps to the point where investors are no longer necessary for most projects.

At the same time, people have Maslow-pyramid needs. They need food, shelter, and clothing, and money is essential to all of this in that it abstracts away the need for barter. As wage income disappears, what is left is still a contractual exchange of goods or services for money, but ultimately this means making (nearly) everyone a business. As such, the economy will perforce shift to one where capital gains are taxed – and the larger the capital gain, the higher the tax.

It is ridiculous that there are citizens of the US with net worths greater than the economies of a large number of countries. No one needs that much wealth in a democracy, and it is antithetical to the survival of that democracy for anyone to hold the power that comes with it. Eliminate the distinction between wages and capital gains, enforce tax capture, and you solve the fundamental problem of AI eliminating jobs – because it is not the AI eliminating jobs, but businesses seeking more money at the expense of their employees who use the AI to eliminate the jobs.

Institute a Basic Living Income (BLI) from those taxes so that people can work to gain experience but aren’t forced to work to survive. A family should not have to work through a pregnancy (and indeed should be supported as its needs increase with a new child), and people with illnesses should not have to work, for any number of reasons.

The fundamental point is that, given the alternative, most people will work – humans are not naturally predisposed towards laziness. If given a share in the proceeds of the work upon completion, they will work harder, because they are building their own nest eggs.

At a minimum, increase the penalties on those who abuse their positions of fiduciary trust, enforce and harden bribery laws, and stop treating corporations as people and money as free speech. Once corporations become immortal and untouchable, the capitalist system becomes unstable and ultimately collapses because there is no accountability.

These measures have nothing to do with AI directly, yet everything to do with it. The economy will eventually collapse in states or countries without accountability, while those that value accountability will thrive. AI can be a tool to avoid responsibility, yes. But it can also be a tool to ensure that those who shirk the responsibilities they took on when given the privilege of power have those privileges revoked and the consequences of their actions applied.

Put another way – when everyone has access to AI, it is far harder to oppress those people than when it is a monopoly. We will reach this point long before knowledge workers are pushed out of existence.

BIO:

Kurt Cagle is the Editor in Chief of The Cagle Report, a former community editor for Data Science Central, and the principal for Semantical LLC, as well as a regular contributing writer for LinkedIn. He has written twenty-four books and hundreds of articles on programming and data interchange standards. He maintains a free Calendly consultation site at https://calendly.com/semantical – if you have a question, want to suggest a story, or just want to chat, set up a free consultation appointment with him there.
