A few weeks ago, I saw the following comment on LinkedIn:
“Brand personality indeed creates connections that resonate! Let’s prioritize feelings just as much as features. 🚀 #CustomerExperience”
And a second example:
“If you treat a setback like a reset button instead of a lesson bank, you’ll keep repeating the same level in different outfits. The win? Extract the data. Apply it louder. Move with receipts, not regret.”
WTF does that even mean?
It took all of my willpower not to reply, “You know we can tell you used AI to generate that comment, right?”
I’ve also gotten AI-generated cold emails. Highly personalized, but clearly not written by a human. They make me run for the “Report Spam” button.
The motives are obvious. More comments lead to more profile views and engagement. Sending outbound emails by the thousands increases the odds that someone will reply.
But social platforms and our inboxes have become inundated with “AI slop” (low-quality, AI-generated content).
Yes, AI is fast. It can produce content at a dizzying speed. But that speed comes at a real reputational cost, one that many AI proponents haven’t fully considered.
People are skeptical of (and annoyed by) AI-generated content
A March 2025 YouGov poll found that 44% of Americans are skeptical of AI. That’s up from 36% just three months earlier. Respondents are concerned about misinformation and deepfakes (among other AI risks).
While using AI to post comments on LinkedIn may feel harmless, that anti-AI sentiment is real. No matter how good someone thinks their AI-generated content is, it sticks out like a sore thumb. It reads like the person took a shortcut, and it rubs people the wrong way.
The ability to leave dozens (or hundreds) of comments might generate likes, but that’s a hollow metric if it doesn’t lead to a genuine connection. There’s no shortcut to authenticity, and as AI slop continues to spread, people will crave genuine voices all the more. They’ll skim past the AI-generated content, looking for human insight.
Or, at worst, they’ll think less of you for adding to the noise created by AI content.
You can use AI as an assistant, but not as a substitute
Bullish as I am on human-written content, I think you can use AI to assist in creating content while still remaining authentic.
Have a hard time thinking of ideas for LinkedIn posts? Dictate some thoughts in your area of expertise and have AI give you some suggestions.
Need to turn a blog post into social posts? Feed your blog post to ChatGPT and have it generate some drafts.
You maintain authenticity when you use AI this way because your own ideas are the starting point, and the AI’s output is not the destination. You must always, always edit that output. If you don’t, you’ll alienate your intended audience rather than connect with them.
And if you’re using AI to generate comments on LinkedIn, I can safely speak for everyone when I say: please stop.
If you love this newsletter and look forward to reading it every week, please consider forwarding it to a friend or becoming a subscriber.
Most issues of this publication are free because I love sharing ideas and connecting with others about the future of work. If you want to support me as a writer, you can buy me a coffee.
Have a work story you’d like to share? Please reach out using this form. I can retell your story while protecting your identity, share a guest post, or conduct an interview.