zlacker

Being good at coding competitions correlates negatively with job performance

submitted by azhenl+(OP) on 2020-12-15 01:03:37 | 251 points 124 comments

4. jaredt+g2 2020-12-15 01:21:27
>>azhenl+(OP)
This is Berkson's Paradox. Even if coding competition performance correlates positively with job performance in the general population (which it certainly does, given that most people can't code), selecting for this attribute in the hiring process leads to a negative correlation among those hired.

Great write-up by Erik Bernhardsson, CTO of Better, here: https://erikbern.com/2020/01/13/how-to-hire-smarter-than-the....
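
A tiny simulation makes the flip concrete (just a sketch in Python with made-up numbers, not anything from the linked post): two signals that correlate positively in the whole population end up negatively correlated once you only look at the people who cleared a hiring bar that weighs both.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    ability = rng.normal(size=n)                  # shared "general ability"
    contest = 0.5 * ability + rng.normal(size=n)  # contest performance
    job     = 0.5 * ability + rng.normal(size=n)  # job performance

    # In the general population the two correlate positively (~0.2 here).
    print(np.corrcoef(contest, job)[0, 1])

    # Hiring looks at both signals and keeps only people above a bar.
    hired = (contest + job) > 2.5

    # Among the hired, the correlation flips negative (roughly -0.6 here).
    print(np.corrcoef(contest[hired], job[hired])[0, 1])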

19. dcolki+L3 2020-12-15 01:32:18
>>jaredt+g2
Simple analogy. There is no correlation between height and salary across NBA players.[1]

The naive conclusion would be that height has nothing to do with basketball ability. The real answer is that markets are efficient and have already balanced one important feature against the other predictors. Steph Curry wouldn't even be in the NBA if he had the shooting ability of Gheorghe Mureșan.

[1] https://rpubs.com/msluggett/189114

21. renewi+S3 2020-12-15 01:33:01
>>azhenl+(OP)
Berkson's, right? A perfect interview process would result in a hired population in which no measured attribute still correlates with job performance, i.e. all the information has been 'used up' by the selection filter. If a correlation remains among the hires, you can improve the selection filter, so it can't have been perfect.

This can have interesting outcomes. For instance, when Triplebyte published their blog post about which environments get the most hires⁰, it revealed the areas they haven't yet entirely accounted for in their quest to increase matching performance.

0: https://triplebyte.com/blog/technical-interview-performance-...
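
A small sketch of that last point (Python, hypothetical numbers): if some attribute the filter ignores still correlates with performance among the hires, folding it into the filter produces better hires on average, so the original filter wasn't using all the information.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000
    contest = rng.normal(size=n)
    other   = rng.normal(size=n)                # something the filter ignores
    perf    = 0.5 * contest + 0.5 * other + rng.normal(size=n)

    k = n // 20                                 # hire the top 5%
    hired_a = np.argsort(contest)[-k:]          # filter on contest alone
    hired_b = np.argsort(contest + other)[-k:]  # filter that also uses 'other'

    # 'other' still predicts performance among the first group of hires...
    print(np.corrcoef(other[hired_a], perf[hired_a])[0, 1])  # clearly positive

    # ...so the filter that uses it gets better hires on average.
    print(perf[hired_a].mean(), perf[hired_b].mean())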

38. jmchus+96 2020-12-15 01:52:09
>>permo-+D3
Hikaru Nakamura's take on the statement https://www.youtube.com/watch?v=IrN16c3DTzE&feature=youtu.be...

39. uyt+b6 2020-12-15 01:52:17
>>jaredt+g2
Depending on how they define "winner at programming contests", this might narrow down the population to just a handful of "sport programmers". The same handful of guys win all the contests.

The statement might as well be "tourist has bad job performance". (https://en.wikipedia.org/wiki/Gennady_Korotkevich) And that isn't surprising given how much he has to train every day to stay on top. He even turned down offers from Google/Facebook just to continue qualifying for the big annual competitions like Google Code Jam and Facebook Hacker Cup.

For a more in-depth account of how the top people train, you can check out this guy's advice on how to get two gold medals in IOI: https://codeforces.com/blog/entry/69100 and his training schedule: https://codeforces.com/blog/entry/69100?#comment-535272

Or this guy, who won IOI this year: https://www.youtube.com/watch?v=V_Cc4Yk2xe4&feature=youtu.be...

81. angry_+3f 2020-12-15 03:32:42
>>warabe+yb
Kaggle comps don't, to my knowledge, involve tasks like:

- Convincing someone to give you data.

- Convincing them to put great effort into changing their data collection and labelling practices.

- Explaining why a particular technique was used and why it is correct.

- Explaining why they can't expect magic from 'big data'.

- Making models that are robust and easily maintained, vs fragile spaghetti.

But I don't think being good at Kaggle implies being bad at data science soft skills. The technical skills are probably only weakly correlated with the soft skills -- that's my prior; it would be good to see a study.

I did find a paper examining the performance of TopCoder participants: https://doi.org/10.1007/s10664-019-09755-0

107. vacher+qy 2020-12-15 07:21:45
>>gwern+Ie
This made me reread the article[0] (if we were talking about the same one), and I don't see any mention of interview ratings versus job performance.

[0] - https://www.nytimes.com/2013/06/20/business/in-head-hunting-...

112. Clumsy+CQ 2020-12-15 10:43:19
>>dcolki+L3
- "The real answer is that markets are efficient"

Like the way it efficiently selects hockey players with birthdays from January to March? [0]

The real answer is people hire based on their biases and organisational restrictions more than they hire on objective metrics - and we have plenty of evidence for that.

0 - https://en.m.wikipedia.org/wiki/Relative_age_effect

121. sitkac+EX1 2020-12-15 18:29:14
>>jaredt+g2
That write-up is excellent, and I found this paragraph particularly interesting.

> An interesting paper [1] claims a negative correlation between sales performance and management performance for sales people promoted into managers. The conclusion is that “firms prioritize current job performance in promotion decisions at the expense of other observable characteristics that better predict managerial performance”. While this paper isn't about hiring, it's the exact same theory here: the x-axis would be something like “expected future management ability” and the y-axis “sales performance”.

[1] https://www.nber.org/papers/w24343.pdf
