I recently vibe-coded a new website for myself. Even though I worked at a software company for many years, I’m not a developer. But a few hours with Claude Code, and I was able to create a website just by “talking” to Claude with phrases like, “Can you move that to the center?” or “Can you change the color here?”
Paying someone to create a website for me would likely have cost thousands of dollars. Now it takes a few hours with an AI coding tool, even without any specialized skills.
This is a small, personal example of something much larger happening across the labor market. AI is changing work from two directions at once. It’s changing how people get hired (AI screening, AI interviews, AI deciding whether your resume ever reaches a human). And it’s changing whether companies need to hire at all, as AI agents start replacing headcount entirely.
Both of these trends are quietly hollowing out the foundation of how careers have traditionally been built. Workers are rightly nervous because they see the seismic shift coming... but don’t yet know how it will impact them.
AI’s role in hiring: gatekeeper or equalizer?
AI is now embedded in the hiring process, but the results are a mixed bag. Sometimes AI reduces human bias; other times it reinforces the discrimination that has always made hiring problematic.
Derek Mobley, a Black man over 40 with a disability, applied to more than 100 jobs through Workday’s platform. He was rejected every time, often within minutes, sometimes in the middle of the night. He’s now suing Workday, alleging their AI screening tools discriminate based on race, age, and disability.
The numbers are staggering: Workday disclosed that 1.1 billion applications were rejected using its software during the relevant period. The class action lawsuit could potentially include hundreds of millions of members. Even though the case hasn’t been decided yet, the message is clear: companies may be liable for the actions of their AI systems if those systems violate existing laws.
But there’s another way to think about how AI can impact the hiring process.
PSG Global Solutions developed an AI interviewer called “Anna” and had researcher Brian Jabarian at the University of Chicago Booth School of Business run an experiment on 70,000 job applicants. Applicants interviewed by Anna were 12% more likely to get a job offer and 18% more likely to stay in the job for at least a month. Reports of gender discrimination were cut nearly in half compared to human-led interviews.
Surprisingly, when given a choice, 78% of candidates chose the AI interviewer over a human. Why? The theory is that people felt the AI was “less judgy.” For example, a human interviewer might penalize a candidate for being nervous (whether consciously or unconsciously). The AI also captured more job-relevant information: an average of nine topics per interview compared to five for human interviewers.
So which is it? AI can reduce human bias, but it can also perpetuate bias at scale. We don’t fully know the answer (yet), but the difference seems to lie in how the AI is designed and what data it’s trained on. For the time being, job seekers have no way of knowing whether the AI evaluating them is working for them or against them.
AI replacing humans: the appeal and the problem
In January 2025, I wrote “No, AI isn’t coming for all the jobs.” I truly believed what I wrote at the time. But it increasingly seems that AI can replace some of the work humans do, in ways we couldn’t have predicted even a year ago.
Jason Lemkin is the founder of SaaStr, the world’s largest community for software founders, and a veteran SaaS investor. After two of his salespeople quit, he made a decision that fundamentally changed how his company operates: he replaced his entire go-to-market team with AI agents. SaaStr now has 20 AI agents managed by 1.2 humans (one full-time, one part-time) doing the work that used to require 10 sales development representatives and account executives.
The result? Same revenue, 24/7 availability, the capacity to handle greater volume, and instant response times.
His framing is really telling: “The job that would have gone to a new rep now goes to an AI agent. The headcount just never gets added.”
But there’s a problem with this approach. You need seasoned, senior people to train AI agents. It works for SaaStr right now because the senior people already exist. But what happens five to ten years from now, when there are no more junior people becoming senior people?
According to data from Revelio Labs, entry-level tech hiring has declined about 35% since January 2023. While this data is specific to tech workers, the pattern is likely to extend to other industries as AI tools become more capable and accessible.
It’s a classic case of short-term thinking with long-term consequences. Companies are solving today’s labor struggles by creating a future talent crisis.
So, as workers, what can we do?
I see people online who are still opposed to AI. I get it: there’s still a lot to figure out. In addition to biases and job displacement fears, many of the ethical and environmental questions are very real and unresolved.
And yet... AI isn’t going anywhere. Continuing to resist will likely be a losing strategy.
Early in my career, I helped banks with digital transformation. Customers had bought enterprise software to replace a lot of manual internal processes. There were always employees who insisted that the “old ways” were better. But, after a period of time, the employees who were most successful were the ones who figured out how the software could help them do their jobs better.
AI is going to become as ubiquitous as a calculator or a spreadsheet. The question isn’t whether it will change your work. It’s whether you’ll be ready when it does.
For employees:
If you work for an employer, you’re probably going to need to learn on your own. Part of this is because companies consistently fail to train employees properly. If you wait for formal training, it may never come.
Lemkin’s advice is practical: “Pick an AI tool today. Train it yourself. Become the person at your company who knows how to make AI agents productive. That skill set is the new job security.”
For self-employed people:
AI can extend your skills without extending your costs or the amount of effort you put in. My vibe coding example is proof of that. But it also means your competitors can do the same. Staying ahead means staying curious and constantly testing new tools and new ways of working with AI.
Final thoughts:
The labor market is being reshaped, and we don’t fully know the outcome. The workers who will fare best are the ones who recognize this uncertainty and prepare themselves as best they can.
Want to build a life-first business? These reflections will help you determine your priorities.
If you want to support my work as a writer, you can subscribe to receive additional issues I publish.
Have a work story you’d like to share? Please reach out using this form. I can retell your story while protecting your identity, share a guest post, or conduct an interview.




