One of the best ways I’ve ever seen class relations on modern American campuses described is Matthew Stewart’s model of the “New American Aristocracy.” According to Stewart, it makes more sense to divide people today not into two classes—the “haves” and the “have-nots,” or the 99% and the 1% of Occupy lore—but into three: the wealthiest 0.1% of Americans, the poorest 90%, and the 9.9% in the middle. Stewart deems this 9.9% America’s new aristocracy: “a well-behaved, flannel-suited crowd of lawyers, doctors, dentists, mid-level investment bankers, M.B.A.s with opaque job titles, and assorted other professionals—the kind of people you might invite to dinner.” Coming out of elite colleges, future members of this group are funneled into a limited number of jobs, ones with high enough compensation to qualify them as aristocrats—and to realize a return on investment for their skyrocketing college tuition—and enough prestige to legitimize their new status. For decades, there have been four such jobs: finance, management consulting, law, and medicine.
Make no mistake, these jobs are handsomely compensated. If you stay in one of these roles, perform decently well, and manage your money wisely, you are almost guaranteed a spot in the 9.9%. But there’s a catch: college graduates entering these fields find themselves working some of the longest hours in corporate America. Junior investment bankers and consultants at elite firms routinely work 70-80 hours per week for their first few years out of undergrad. Medical school and law school may be slightly better, though still no cakewalk, but aspiring doctors and lawyers can expect the same workload as their banking and consulting peers once they graduate and become residents and junior associates, respectively. For many years, top firms have relied on a near-endless supply of people willing to be indentured servants in their 20s in exchange for membership in the aristocracy.
In the early 2010s, though, a new path emerged: technology. Originally, this meant software engineering and computer programming, but over time, tech companies large and small seemed to add more and more roles with ever more nebulous responsibilities. These companies were able to match, and in some cases exceed, the compensation levels of the “legacy” prestige jobs, often by including equity compensation inflated by a decade of loose monetary policy. Even so, newly minted tech workers seemed to be having a much easier time of it than their counterparts in similarly compensated roles.
The hallmark of tech company culture, born and bred in San Francisco, was its gentle, supportive nature, a sharp contrast to the rigidly hierarchical, abrasive environments found along the other paths to aristocratic status. A software engineer might spend 40-50 hours per week in an amenity-filled office and receive mental health check-ins from their manager, rather than spending 80-hour weeks in deliberately over-chilled offices, hiding from short-fused, full-throated bosses.
As if this weren’t compelling enough, working in tech also became more prestigious than the legacy roles, particularly finance and consulting, in the aftermath of the Great Financial Crisis. There was a certain “halo” around tech that working on Wall Street no longer had: people really did believe that Silicon Valley was a force for good with the potential to change the world.
All good things, however, must come to an end. The music had been slowing for the tech industry since the end of 2021, but in the last six months it has ground to a halt. Perhaps the most public example was the failure of Silicon Valley Bank, which occurred when so-called “finance professionals” in venture capital failed to understand the basics of fractional-reserve banking and mark-to-market accounting, and ginned themselves up into a bank run.
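The mechanics those investors allegedly misunderstood are simple enough to sketch. Below is a toy calculation, with entirely made-up numbers rather than SVB’s actual book, of why rising interest rates force unrealized mark-to-market losses on long-dated bonds:

```python
def zero_coupon_price(face_value: float, annual_yield: float, years: float) -> float:
    """Present value of a zero-coupon bond discounted at the given annual yield."""
    return face_value / (1 + annual_yield) ** years

# A bank buys a 10-year bond when market yields are low (say 1.5%)...
purchase_price = zero_coupon_price(100.0, 0.015, 10)

# ...then the central bank hikes and market yields jump (say to 4.5%).
marked_price = zero_coupon_price(100.0, 0.045, 10)

# Marking to market means valuing the bond at today's price, not the
# purchase price, so the higher yield shows up as an unrealized loss.
unrealized_loss_pct = (purchase_price - marked_price) / purchase_price * 100

print(f"Bought at:       {purchase_price:.2f}")
print(f"Marked down to:  {marked_price:.2f}")
print(f"Unrealized loss: {unrealized_loss_pct:.1f}%")
```

With these illustrative numbers the markdown is roughly a quarter of the bond’s value, which is only a paper loss until depositors demand their money back at once and force the bank to sell.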
Although the FDIC stepped in to contain the risk of contagion, longer-term questions remain. The era of zero-interest-rate policy seems to be over in the United States, fueling concerns that tech companies and VC firms will be unable to keep rapidly expanding and systematically overhiring without easy access to cheap capital, or that they will no longer be able to compete with massive investment banks and law firms on compensation. Not only will this lack of funding depress salaries, it will also force companies to do more with less and extract more from their employees, as firms compete more intensely for pieces of a shrinking pie.
More significant than the changing macroeconomic conditions, however, may be the change in the public perception of the technology industry. The angel has fallen, and the halo has been lost. Thankfully, society has by now largely ditched the fantasy that Silicon Valley is a band of cutting-edge lone-wolf innovators, that social media brings the world together, and that massive corporations are your best friend. The tech industry is no longer a hotbed for idealistic college grads who want to “change the world,” and the internal attitudes and cultures of technology firms may change to reflect this. A veneer of meditation pods and in-house therapists may remain, but with harsher competition and a slower flow of funding, I would expect tech companies to become somewhat less tolerant of employees taking “mental health days” or working 30 hours per week.
By no means are we about to enter some sort of neo-Luddite period, but it feels likely that fewer tech roles will be available for recent college graduates, and that the positions that remain will be more competitive to obtain and to keep. It may be soon that the open offices of Palo Alto tech firms begin to feel a lot like an investment banking bullpen in Manhattan. Oh, the horror.