Mar. 19th, 2013

pktechgirlbackup: (pktechgirl)
I am a software tester, but that's imprecise. "Tester" can mean anything from someone with a 12-week training course certificate from a community college who plays Xbox all day, to someone with a computer science degree from a top university who writes software that tests other software, with coding proficiency rivaling all but the best traditional developers. The latter are often called test automation engineers, or "[company's internal term for software developer] in Test".

Between the two, automated testing is far more prestigious and better paid, although frequently not as well paid or prestigious as development, which is frustrating when companies insist TAEs should be able to code as well as developers.* People who do more things get paid more. Anyway, within reason, I agree that automated testing deserves to be the more prestigious: automation scales and requires more planning. Because of that, the smarter people move into it, which means more of the interesting work goes to the automation side, and so on. But ego has begun to enter into it. A lot of software companies have a thing about only hiring the best and the brightest, and will either refuse to hire manual testers outright or only hire them on contract.

There are very few products that even contemplate doing entirely automated testing, and zero consumer products. Whether or not you hire manual testers, you're going to be doing manual testing. In an attempt to grab a halo of "only hiring the best," this work gets dumped on automated testers, support engineers, and maybe even developers: people who are both horribly overpaid for the task and likely not very good at it. A really good manual tester has an OCD-like or Asperger-ish focus on things being exactly right, a trait heavily discriminated against in computer science programs, where the emphasis is on doing things in the absolute laziest way possible. If manual testing is dumped primarily on "automation" testers, you'll push the best ones into development** and start a vicious cycle of losing your best engineers. If it's spread equally, then you're just not allocating resources very efficiently.

If you dump the testing on contractors, you lose any expertise they develop in the product every 12 months. Companies like to pretend there's no value in that expertise because it's harder to measure than code, but there are some exceptionally good manual testers out there who would provide a lot of value if people let them do what they were good at. And they're cheap relative to the people who would be doing the work otherwise, because it's low prestige and has a much lower barrier to entry.

But honestly, I think "hiring only the best" is kind of a bad strategy even for automated testers and developers. There is not a direct correlation between "requires intelligence" and "valuable." Smart people can get themselves involved in any number of hilarious low-payoff adventures, and average people can maintain Google Reader. This drive to hire only the best comes from something other than value.

*I once got a haircut from a woman who was doing a one-course-at-a-time programming degree at a community college. There was a detectable dismissive sniff when I told her I did software testing, because she was aiming for much bigger things. I didn't play the "I went to school in Boston" card, but I was thinking about it.

And it was a shitty haircut.

**Because they like coding, or because of the increasing pay gap, or because of the prestige. Nobody likes being thought of as settling for second.


Page generated Jul. 24th, 2017 10:40 pm
Powered by Dreamwidth Studios