Smarterer – a broken recruitment tool

Having taken a number of Smarterer tests I find it scary that people use it as a recruitment tool. The tests are unreliable and meaningless. First, I will explain what is wrong with my scores — and I am complaining more about scores being too high than too low, so there is no personal grudge here. Then a few quick thoughts on the reasons.

Smarterer is being used for recruitment, and its business model relies on that. The tests are free to take, but employers pay for access to job applicants' scores. Smarterer scores can play a large part in who gets hired.

Smarterer uses an 800-point scale, not unlike the GMAT. Scores close to 800 are supposed to be very impressive (I recall impressing people more with my 780 GMAT score than I did with any real life achievement) and get an appropriate “master” description. One assumes that “expert” means someone is pretty good at something, and even “proficient” must surely indicate that someone would be a competent employee in a job that relied on that skill. What else could these mean as an assessment of a potential employee?

My Smarterer scores right now (not all are shown on my public profile):

SEO (Master – 796)
C (Master – 796)
Corporate Finance (Master – 791)
Financial Analysis (Master – 791)
Management Accounting (Master – 785)
HTML (Master – 782)
Statistics & Probability (Expert – 775)
CSS (Expert – 767)
MySQL (Expert – 764)
Python (Expert – 746)
Linux (Expert – 733)
SQL (Expert – 717)
Apache (Expert – 713)
Financial Accounting (Proficient – 653)
Python Workshop (Proficient – 650)
Financial Markets & Securities (Proficient – 639)
PHP (Proficient – 594)
Web Design (Proficient – 561)
Investment Banking Fundamentals (Proficient – 487)
Django (Proficient – 483)
PostgreSQL (Proficient – 461)
Microeconomics (Familiar – 434)
HTML5 (Familiar – 338)
Trading Options (Beginner – 274)

My Management Accounting score is far higher than my Financial Accounting score. I have studied the two subjects to about the same level. My overall training in finance gives me a feel for the concepts behind management accounting, but I have a lot more experience of financial accounts, albeit interpreting and analysing them rather than preparing them. I would have expected to do much better in Financial Accounting, but Smarterer thinks quite the opposite. I am inclined to put this down to a mixture of luck and, possibly, a flaw in Smarterer’s adaptive system for picking questions.

There are many other inconsistencies in my scores. I got different scores in two different tests on Python, one only a little better than my score in the PHP test. I have used PHP only to write some simple WordPress and WolfCMS plugins, and similar light use, whereas I have used Python a lot over the last few years. My Django score is actually lower than my PHP score, even though I write Django all the time but rarely write PHP.

My scores in Statistics, SQL, and a number of other tests may be accurate if you interpret them in the correct context. I am no statistician or DBA, but I am far ahead of the average accountant or MBA graduate in my understanding of statistics. You cannot really interpret the results of these tests without spending time evaluating the level of the test.

For a number of tests, it is not clear what the objective is. What job does the skill relate to? Trading Options and Investment Banking Fundamentals are not tough enough to be useful to the employers to whom they may be relevant, and are too specialist to be of interest outside the speciality.

Now for the outrageously wrong tests. Sadly, this means I have to repudiate my two best scores.

There is no way I am an SEO master. I wish I were. I know the basics, but I also know my limits. My feeling is that the score reflects the generally low level of competence in an industry full of snake oil, guesswork, and fearful clients.

Even better is my equally high score in C. My total experience of C programming consists of an unsuccessful attempt to write a game for the Atari ST as a schoolboy and reverse engineering a few hundred lines of C (essentially working out the exact financial calculations being done by undocumented code) nearly 10 years ago. I also read about half the K & R book (the classic book on the language) about six months ago. This makes me a master?

The truth is that the C test is better described as an IQ test that assumes a knowledge of C. It tests for basic knowledge of the language, while the harder questions are essentially puzzles written in C. There is nothing that tests the skills that would be needed to write software in C.
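To give a flavour of what I mean, here is a made-up example in the same spirit (my own, not an actual Smarterer question) of the “what does this print?” style of puzzle:

```c
#include <stdio.h>

/* A puzzle-style question: it rewards knowing language trivia
   (precedence, pointer arithmetic, the a[i] == i[a] equivalence),
   not the ability to design, structure, or debug real C programs. */
int main(void)
{
    int a[] = {10, 20, 30, 40};
    int *p = a;

    printf("%d\n", *(p + 2));                          /* 30: pointer arithmetic */
    printf("%d\n", 2[a]);                              /* 30 again: 2[a] is legal and means a[2] */
    printf("%d\n", (int)(sizeof a / sizeof a[0]));     /* 4: element count */
    return 0;
}
```

Knowing that `2[a]` is legal C says nothing about whether someone can manage memory, structure a large code base, or track down a segfault.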

Part of the problem is that it is hard to devise multiple choice questions for most subjects. I have seen meaningful multiple choice questions all the way up to postgraduate level, but they are very hard to set. It requires both deep subject knowledge and the very specific skill of devising good questions.

Part of the problem is that expertise is, by its nature, hard to crowd source. It may work if Smarterer gains massive scale, but I doubt it, and it is certainly not working now. Even where there are multiple tests on a subject, there is no useful way of quantifying quality.

Smarterer is great fun and I enjoy it. Use it as a subject-specific trivia game. Relying on it in its current state would be a mistake.