When Job Specs Go Bad
Updated: Sep 2
As someone actively looking for a position, I’m becoming increasingly discouraged by job specifications that appear to have little to no relevance to hiring skilled testers. After reviewing job specifications for quality engineers (in all their multivariate word-salad constructions), it’s becoming increasingly clear that some organisations have little to no idea what they actually require, aided and abetted by ill-informed recruiters who amplify these misconceptions. Whether this is due to a misunderstanding of what good testing looks like, or a product of years of over-selling automation (in all its misunderstood glory) as a panacea for quality ills, I can’t say.
The net result is that applications are received, hiring decisions are made, and under-motivated QA are hired, who subsequently coast along, maintaining this accepted standard.
To highlight my concerns, I’ve cherry-picked lines from a job opening on LinkedIn that illustrate what I perceive as deficient: a lack of understanding of testing as a skilled activity, the conflation of separate and unrelated activities, and requirements that add no further information as to whether the applicant understands the role of skilled testing. I’m not going to name names; that would seem unnecessarily harsh and definitely not productive. Instead, I’ll explain *why* I think these lines are badly framed. If you see a line from one of your own job specifications, I hope you find my breakdown of use...or not.
I absolutely understand the difficulties in articulating the requirements for a specific role. I've written many tester job specs myself. I approach them from an applicant's perspective:
‘If I saw this role, why would I want to apply?’
‘Does the position give me an insight into their attitude towards testing?’
The Job Spec In Question
*Reviewing requirements in a variety of formats (user stories, functional specifications, flow diagrams, features) in support of test planning and preparation ensuring nothing enters a sprint unless it is fully ready for development and test*
This starts off well; a good QA will review requirements in whatever form they are presented. Then it sadly falls off a cliff: '...ensuring nothing enters a sprint unless it is fully ready for development and test'?
So what's bad about it?
This sentence starts off with reviewing requirements and then somehow conflates reviewing requirements with a *Definition Of Ready*.
No QA of any level or seniority will ever be able to fulfil this requirement, because it predicates a whole-team decision on the actions of an individual. It's lazy shorthand: a QA acting as a quality gatekeeper, with all the attendant baggage. A practice which I thought had passed its sell-by date many, many moons ago. The whole team decides if a story or backlog item is *ready* for development, and the actual requirements are only one small facet of the Definition of Ready. This is an impossible job requirement to fulfil.
*A minimum 3 years’ experience in a web testing role*
Why 3 years?
Not *experience of web testing*, but specifically 3 years.
Not 2 years and 11 months.
What oracle is this gem derived from? This always puzzles me. Someone, somewhere, has made a conscious decision to specify this timeframe.
Is this some kind of magic 8-ball for QA?
I genuinely don’t know how this informs the decision-making process as to whether the individual with 3 years’ experience (who may have been testing by rote) is a better fit than an individual with 1 or 2 years’ experience (who may have been taking an active part in driving the quality of the application).
*Proven history of writing test scripts*
Not ‘a proven history of adding value to the quality effort via accurate articulation of a test idea’, but a proven history of writing test scripts...sigh.
Notwithstanding the debatable usefulness of test scripts as static, unchanging artefacts, there is no information here to explain *why* this requirement is of use. It would, no doubt, appeal to drive-by testers who see their role, and their value within that role, as a singular function. This is typical of an organisation that equates the number of test scripts with quality delivered. That’s not how the real world of testing works.
*Experience with TFS, Jira or similar tooling*
I see a lot of these. I’ve mentored and worked with many testers, and I’ve yet to encounter one whose attitude and approach to quality has been hindered by a lack of exposure to a project management tool. It's just a tool. So, if I've never used TFS or Jira, does that mean my critical thinking, analytical approach and user-focused skills have no value?
There are many, many more in this style, both on LinkedIn and on the job boards. I can help you structure your job specs and articulate your expectations using the correct terminology. I can coach your QA people to write meaningful job specs that address the actual problem the opening is intended to solve.
If you’re an org or recruiter that wants help in attracting testers that add value, drop me a line at gesqa.com
If you recognise your post amongst my cherry-picking, feel free to message me. I’ll help you construct a meaningful spec free of charge.