You’ve probably seen or heard plenty about using AI tools (ChatGPT, Bard, others) to help write job applications. It’s tempting, especially when you don’t have the time to write long selection criteria responses and you want your responses to “shine.” But for APS jobs, there are good reasons why leaning too heavily on AI is likely to backfire.
1. The APS is serious about authenticity and integrity
The APS values genuine, personal responses over polished wording. The selection criteria ask for examples of what you’ve done, how you did it, and why it mattered. Panels aren’t just looking for fancy language; they want evidence of your APS capabilities, such as sound judgement and decision-making.
While there’s no evidence that APS Agencies run AI detectors or checkers on your application, that doesn’t mean AI-written responses go unnoticed. Selection panels read hundreds of applications and can usually spot text that feels too generic or disconnected from the applicant’s experience. In other words, while there isn’t a technical check in place, a human review is often enough to reveal when an application has been produced by AI.
2. “Perfect” can make you look generic
AI tools can help tidy up grammar and structure. But they often generate fairly generic content. You risk having responses that:
- Lack personal detail — The APS wants specific examples, usually written in CAR (challenge, action, result) or STAR (situation, task, action, result) format. AI can create responses that look polished yet feel empty once you read closely. This makes it hard for the panel to assess whether your example really shows what’s required.
- Sound formulaic — e.g. using standard phrases and buzzwords. Selection panels know the common language of priority capabilities. If your responses read like many others, you won’t stand out.
- Miss nuance — The APS values things like ethical judgement, adaptability, collaboration, risk awareness, and integrity. Panels often see these in the way you tell your story, acknowledge challenges, and describe the lessons you took away. AI isn’t great at capturing your learning unless you feed it precise information, and even then the result can feel ‘flat.’
According to the APSC’s How to Spot an AI Applicant, responses with identical word choices and a tone that never changes can come across as unnatural. These traits can raise doubts about authenticity instead of strengthening your application.
For a detailed comparison of AI-generated pitches versus those crafted by professional writers, see our blog on AI vs Professional Pitch Writers.
3. You’ll likely struggle to back it up in interviews or tests
Even if your written application looks great (thanks to AI), the APS process usually involves more than just writing. There might be:
- Interviews, behavioural questions, scenario questions
- Work samples or tasks
- Referee checks, where your previous colleagues or supervisors are asked to confirm what you say you did
If your written documentation comes from AI and you don’t really ‘own’ it, you may stumble when asked to expand or explain. Interviewers can often tell when someone hasn’t personally lived the example, or when the story doesn’t quite add up.
4. APS has explicit policies and advice about AI use
From What APS applicants need to know about AI, it’s clear the Australian Government is being proactive about AI use in both service delivery and recruitment.
Similarly, the APSC’s guidance in How to Spot an AI Applicant shows that while Agencies don’t scan applications with AI detection tools, HR professionals are learning to recognise tell-tale signs of AI reliance. That means applications that read as overly polished or impersonal are less likely to make a strong impression.
5. Ethics, fairness, and risk
There are ethical issues in misrepresenting your own work. If you present something as your achievement when it isn’t, that can damage trust. The APS places a lot of emphasis on merit, integrity, and transparency. Submissions that mislead, even unintentionally, risk being assessed less favourably.
There’s also the fairness angle. If some applicants rely heavily on AI while others write their own responses, inequity creeps in. Agencies want to maintain a level playing field, so responses that appear machine-generated may attract more scrutiny.
6. The best way: Use AI as a helper, not a substitute
This isn’t to say you can’t ever use AI. If used carefully, it can help you:
- research and generate ideas
- check grammar or spelling
- compare wording or phrasing
But you should always personalise what it produces to make sure your unique voice comes through, and check it against your real experiences.
Use real stories from your experience. Talk about what you did, where the challenges were, and what you achieved. Writing in your own voice matters, and even minor imperfections can reassure the panel that your application is authentic.
Final thoughts
If you are applying for an APS job, keep in mind that the panel is not only looking at how neat the writing is. They want to understand whether you are the right person, with the right skills and APS values to succeed.
AI tools might help tidy up, but they won’t replace lived experience or authenticity. And while the APS doesn’t run AI checkers, seasoned Assessors know when a response doesn’t sound like it came from the applicant.
At the end of the day, panels respond most to applications that sound real and read as though they were written by someone capable and sincere.
That’s where we can help. Our pitch and selection criteria writing services are tailored by real people who understand the APS process inside out. We highlight your strengths in a way that keeps your application true to you and ready for the APS process.