Should I Start Using AI to Write My Job Descriptions?
Well, before I weigh in, let’s ask our new friend ChatGPT. Here’s the response I got:
“As of now, a person is generally better at writing a job description than AI. While AI tools can assist in generating job descriptions and even provide suggestions, they may lack the nuanced understanding, creativity, and context that a human can bring to the task.”
Yep. Can’t say that I disagree. But as you can plainly see, part of the intro to this very blog came courtesy of a quick prompt I entered into ChatGPT. The Pew Research Center reports that 16% of people who are aware of ChatGPT are regularly using it at work. Generative AI tools provide a much-needed jump-start to speed up and streamline tasks. And when you look at any list of “great” use cases of AI for recruiters, the task of job description rewrites almost always tops the list.
Leveraging AI to boost productivity by automating time-consuming work that feels like pure drudgery makes obvious sense. Applications of smart automation in our tech stack workflows and sourcing engines have already been delivering efficiencies and giving us hours back in our day for years. The boost of a job description rewrite in seconds, especially when so many of our job descriptions could seriously use one, also makes sense. However, the rewrite you get back is really just that: a boost.
Today, there’s still a need for plenty of human oversight with a generative AI tool writing your job descriptions. Here are three very real reasons why:
Misguided prompts
Out of date, jargon-filled, too much information. Too little information. For all or some of these reasons, your job descriptions may have been riding along on the struggle bus for years. An accurate portrayal of the job itself and the ideal attributes that sync with your company’s culture, values, and purpose might be sorely missing from them, too. So, if you want to get yours off that bus by running them through a generative AI tool, it’s not enough to ask “Hey ChatGPT, make this a better job description.” You will need to think about a well-constructed prompt before you begin.
Take a closer look at your EVP, employer brand guidelines, and personas. Tap into that treasure trove of informed research and include it in your initial prompt. Here’s a basic example of what I mean:
This [JOB DESCRIPTION] for a [JOB TITLE] needs to attract a person with the following attributes [INSERT ATTRIBUTE LIST] that align with our [COMPANY’S VALUES] while portraying a concise and realistic overview of what it takes to thrive in this role based on the following [PERSONA OVERVIEW] and the [EMPLOYEE VALUE PROPOSITION] our company offers in return.
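If you, or someone on your team who is comfortable with a little scripting, would rather run a prompt like this through the API than paste it into the chat window, here’s a minimal sketch of what that could look like, assuming the OpenAI Python SDK. The model name, file name, and every filled-in value below are illustrative placeholders, not recommendations:

```python
# A minimal sketch, assuming the OpenAI Python SDK (pip install openai)
# and an OPENAI_API_KEY set in your environment. All values are hypothetical.
from openai import OpenAI

client = OpenAI()

PROMPT_TEMPLATE = (
    "This job description for a {job_title} needs to attract a person with the "
    "following attributes: {attributes}. These attributes align with our company "
    "values: {values}. Rewrite it as a concise, realistic overview of what it "
    "takes to thrive in this role, based on this candidate persona: {persona}, "
    "and reflect the employee value proposition we offer in return: {evp}.\n\n"
    "Job description:\n{job_description}"
)

prompt = PROMPT_TEMPLATE.format(
    job_title="Senior Data Analyst",                      # hypothetical role
    attributes="curiosity, clear communication, SQL fluency",
    values="transparency, craftsmanship, customer focus",
    persona="analysts who want ownership of end-to-end reporting",
    evp="flexible hours, a learning stipend, and a clear growth path",
    job_description=open("job_description.txt").read(),   # your existing copy
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-capable model works for a draft like this
    messages=[{"role": "user", "content": prompt}],
)

# The output is a working draft for a human to review, not final copy.
print(response.choices[0].message.content)
```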
Even with a solid start, the resulting copy may warrant continued prompts for specific and necessary tweaks. Regardless, as you perform your review, there’s a very strong case to be made that you, or another real live person invested in your employer brand messaging, will want to finalize the copy. Why?
Quality is questionable
Even AI is aware that there’s a lack of “nuanced understanding, creativity, and context that a human can bring to the task” when writing job descriptions. Without a solid set of training data (think minimally of the fill-in-the-blank intel in the example prompt), a great job description rewrite isn’t going to happen.
More importantly, the human touch is missing in action. And that’s a big miss. Style and originality that more powerfully convey your employer brand won’t be coming from parsed copy. They will come from the perspectives of people.
And speaking of parsed copy, there’s a real chance that, over time, using generative AI tools like ChatGPT may homogenize our writing. Not so great when you’re trying to differentiate, is it?
Let’s talk about bias
AI has been positioned as a bias-buster for job descriptions. Its ability to identify stereotypical language and recommend gender-neutral terms appears to be a wonderful advantage, and there are solutions out there doing a great job of it. However, let’s not forget how a tool like ChatGPT gets its learnings. It gets them from existing data and language patterns. From us. That makes it an imperfect solution. So again, human oversight of what generative AI produces is a must.
While we’re strictly talking about copy creation here, there have been plenty of examples of generative AI text-to-image models playing into race and gender stereotypes, too. When reporting for Bloomberg Technology + Equality, Leonardo Nicoletti and Dina Bass prompted a text-to-image model to “create representations of workers for 14 jobs — 300 images each for seven jobs that are typically considered ‘high-paying’ in the US and seven that are considered ‘low-paying’ — plus three categories related to crime.”
After a review of the results, it is no surprise that their report is titled: Humans are biased. Generative AI is even worse. I encourage you to give it a read.
In the end, my answer to the question that this blog post poses isn’t a hard no. Not at all. Staring at a blank screen, or at a three-page-deep job description with no idea where to even begin, is in itself a time suck you don’t need to suffer through. With some consideration given to your prompts, your personas, and the people who know your employer brand, it’s a yes to having improved copy in a split second. Limit your reliance on generative AI to what it’s good at (a working draft) and tap into what only a human can bring to the written word for your job descriptions. That will be a winning combination for this very important task.