I use a gig-to-hire process for this. It works very well. I've been meaning to write more about it, but I also get income for helping with hiring around this model, so I'm a bit conflicted. The whole answer is long and complex.
Basically: give the candidate a proper task on your project and evaluate them based on that work - the "interview" is less than an hour, the "work" is longer (many hours, and you pay for it) - which provides a solid evaluation across their skill-set.
That will work for some people but not everyone. At my current job I can't monetize work I do outside the company without approval. That means I can't go through your application process unless I quit first. Even if I were allowed to do this, I don't think I'd want to. The time commitment of an extra project on top of my normal workload is not something I want.
Well, then you can just work out a deal to not monetize it.
Your objection is of course valid - extra workload on top of a normal job probably has its limits. But the fact that one is willing to pay candidates who are willing to be paid for such an interview doesn't suddenly make it a strict requirement, ha.
I must say though, I also view the interview standard I've faced - quick phone screen, 1h tech screen, 4-6h onsite (probably online now) - as adding up to a lot of time very quickly, especially as the "onsite" time is awkward when it means taking PTO to go to an interview. In this sense, I'd probably rather have an ~8-12h coding task than a 4-6h onsite. Maybe even at a higher ratio.
An employment process which prohibits common methods of job searching might well be found anticompetitive and hence unenforceable. Or straight-up illegal.
All hiring processes have the some-not-all problem. This process just has a different set for "some". Remember, though, it's a some-some problem on both sides too. I think it's sometimes called the "secretary problem".
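(Aside: the "secretary problem" alluded to here is the classic optimal-stopping puzzle, where the well-known result is to reject the first ~n/e candidates outright, then hire the first later candidate who beats everyone seen so far; this succeeds about 37% of the time. A quick simulation, purely illustrative and not part of the process described above:)

```python
import random

random.seed(1)

def secretary_trial(n, k):
    """One trial: reject the first k candidates outright, then hire the
    first later candidate better than everyone seen so far.
    Returns True if the single best candidate (rank 0) got hired."""
    ranks = list(range(n))  # 0 = best, n-1 = worst
    random.shuffle(ranks)
    best_seen = min(ranks[:k]) if k else n
    for r in ranks[k:]:
        if r < best_seen:
            return r == 0  # hired this one; success only if truly the best
    return False  # ran out of candidates without hiring

def success_rate(n, k, trials=20000):
    return sum(secretary_trial(n, k) for _ in range(trials)) / trials

# Classic result: with the cutoff at ~n/e (about 37% of the pool),
# you land the single best candidate about 37% of the time.
print(success_rate(50, 18))  # roughly 0.37
```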
what if they "give" you a gift card after you submit? I've had that before, where the gift card was essentially PayPal credit.
Not wanting to do an extra project is the second part of this problem - the best people already have jobs and aren't looking, so they're not going to spend all their free time working on high-effort job applications.
I'm curious - what's the conversion ratio for this? I like the idea of this but can't help but think it's ripe for being taken advantage of (on both sides). Have you had trouble with converting to full time offers? Have you had candidates "waste" time/money by severely under-performing or not finishing tasks?
So unless you are paying an above local market rate, why would I as an in demand developer go through that when I could do one phone screen and one in person interview and get a job that pays just as much?
My entire interview process for BigTech was only six hours and because of Covid, it was fully remote.
There is a very wide range of openings and talent out there. And of course you're aware that one uses different tools for different needs, right? There's plenty of room for those who aren't as ace as you to land that AWS gig in six hours.
Before the AWS gig, I was a bog standard “Enterprise Developer” or Architect depending on the month of the year doing yet another software as a service CRUD app bumping around in mostly small local companies that no one had ever heard of.
When I was looking for a job in 2016 locally, I emailed a bunch of local external recruiters and was juggling eight or nine openings. I did six phone screens in one day from different companies.
I was no special snowflake. Anyone who could call themselves a “full stack developer” with 5 years of experience could do the same thing. Heck, I didn’t know any of the $coolkids front end frameworks.
The process was the same - one phone screen, then a half day in person where you spoke about your past projects, maybe did a coding test barely above FizzBuzz-level complexity, and showed that you knew SQL.
A local full stack developer who had any kind of experience could get the “right now” job within a month if they weren’t being picky.
Of course post-Covid, things are different.
So either way: if you are looking for framework developers (no insult intended - that's what I was as a developer until recently), the process is easy if you're any good. If you are looking for "smart people" (tm), they are able to do the LeetCode monkey dance. Who is left that is willing to spend hours doing take home tests? All of the best candidates know how to play one game or the other.
> Who is left that is willing to spend hours doing take home tests?
I've used this process to help place entry level all the way to C-Level -- maybe I've not provided enough details (it's more than just a simple test) or maybe you're over-simplifying the process?
It's not a perfect process (hiring in general is still looking for that) and clearly it's not the right process for you.
Two non-developer anecdotes from what I've heard or seen:
Solutions Architect - answer the soft-skill questions, aka "tell me about a time when...", and do a writing sample, either a sales blog post or an SOW.
Project Manager - same soft-skill questions, plus talking about processes, previous challenges, and outcomes. Maybe do a simplified project plan.
Are those the types of take home tasks you are referring to for non developer roles? Just curious.
I've done sales engineering and solutions architecture for years and never once written a sales blog post. I'm not sure how I'd respond to that if I was asked to do it.
SOW also seems tricky... I'm usually putting out 27+ SOWs, and there is a lot of "standard language" cut-n-paste.
It's not an unguided take-home. It's an interactive gig: imagine a process designed to let you demonstrate your skill. E.g., when hiring a PM, we'd have them demonstrate the work; same with an Architect. My whole process is about SHOW rather than tell.
But just think about all of those business people that want free work and to hire people that are willing to throw themselves in to the project for no extra comp. /s
I filter those jerks out before even letting them see my candidates. Step #1 is for me to work with the company to find the project slice - which they also have to comp my time.
So they spend more time and cash on the traditional method...then complain about a lack of talent. Resume keyword filters and non-technical recruiters are a (not) surprising waste.
I've seen great devs intern twice at the same company while in college and spend several years there full time right after graduating, only jumping ship when a friend at another company specifically reached out.
Yes, but the company hiring them took a chance on them earlier and was willing to invest for at least a year before getting a junior dev. And I find that the interview at intern level can be less strict.
I'm incredibly nervous about job interviews. I'll start sweating in the first two or three minutes, even over Zoom. I recently "fell into" a job because my university supervisor recommended me to a friend of his at a small business, where my only test was a code review I could do the day before. But I worry about any future job because I'm terrible at algorithm questions, and I suspect (and I doubt this is just impostor syndrome) that I'm simply not great at algorithms, despite a small cache of personal projects, which aren't that impressive either.
I'm five years into my career, and still experience this same kind of anxiety. Only recently did I revisit algorithms as I'm getting serious about my job search now, and am still super nervous.
Not to mention, having a family, I don't have a ton of time to study for the gauntlet decent tech companies put in place to vet candidates.
As for personal projects? See lack of time above. I'm lucky to get an hour a week to work on something I actually enjoy. Forget about building something unique/useful/worth public launch in any kind of reasonable time. Also, let's not forget the fun non-compete/anything you build while employed with X is owned by X clauses.
Hiring in general is broken. In tech it just seems amplified.
Most jobs come from who you know. So cultivate relationships with the kind of people who interview well and will quickly be trusted to, say, hire you.
At my first job, all open positions went to Jon (last name withheld). If he handed over a name, that person was called in, and the interview ran in reverse: unless you really did something stupid, you had the job; the purpose of the interview was to convince you to take it. If Jon didn't know anyone, a headhunter was called for a 6-month contract, with an option to hire if we liked you. So the moral of this story is: know Jon.
Where I work now we train people on how to put you at ease. It works okay, I think.
That's definitely something that seems to be reiterated often -- the power of networking is real. You'd think after hearing it since high school (some time ago, not gonna date myself lol) I'd actually spend more time nurturing the network I do have.
Luckily my current and last positions didn't require any kind of mental gymnastics (my soft skills are great, by software-engineer standards), but I'm still wondering what lies behind those gates kept locked by technical trivia and tricky buzzword algorithms that send CVs to the void before a human ever sees them.
My current situation is a bit different than previous (trying to transition tech stacks and job titles now) so the nerves are a bit different. The anxieties are just as real as before though, albeit the payoffs are higher with the new goals. Guess you could call em game day jitters.
Your interview process sounds improved over traditional corporate style interviews. My last employer had similar tactics (put the interviewee at ease, casual conversation), and it led to the best 18 months I've had in my career yet.
that would really only be a big competitive advantage if great people who don't interview well were more numerous than those who interview at least adequately, or if those great people were greater than the great people who do interview adequately.
I guess it would still be something of a competitive advantage, though: if you were a great person who got hired despite interviewing poorly, you might be more grateful than those who interview well.
But anyway, I don't know that anyone has the stats on how many great people interview poorly - I would of course only rate myself as average, albeit lucky, and I interview poorly half the time, but sometimes I interview really well, for some reason, and when it clicks like that I get the job.
So I assume if there is hope for as bad an interviewee as myself, there must be hope for all the other poor interviewees out there.
> that would really only be a big competitive advantage if great people who don't interview well were more numerous than those who interview at least adequately, or if those great people were greater than the great people who do interview adequately.
It sounds like you're making it a zero-sum game. Isn't it reasonable to assume that you could create an interview process that works well for both parties rather than one that is exclusionary? Isn't the issue with current interviewing practices that they are exclusionary and don't assess people correctly?
I guess maybe it's possible, but as I have not had any eureka moment recently leading me to pursue a method for doing it, and have not read anything where someone had a moment that sounded to me like "hey, that might work!", I am prone to cynicism - one of my failings.
If every variation of interviewing people try seems to lead to a sub-optimal zero-sum game, I worry that interviewing is zero-sum and what one should try to do is work the optimization angle. But like I said - cynic.
Usually it's the ones who do terribly in the interview but whose resume is solid. That person knows how to do their job but can't sell themselves. When they end up finding work, they rely on good work.
The opposite is the great interviewer. They will have an even better resume. They are rarely the best workers but can somehow stage what little they do into looking like top performance. They will likely move on when the going gets tough.
everyone also has a pretty shallow pool of quality talent, so it doesn't scale well.
Once you've been involved in hiring you see why a company will pass on a likely good candidate to avoid a maybe bad experience. In the longer term it's way less work.
Judgement, taste, sensibility, intuition - this is what human professionals are for. The value of an experienced worker is in the biases he has accumulated.
Do you really not understand that when I say "bias," I mean prejudices against people that are irrespective of their qualifications for the job?
Of course I want to hire people for their technical and professional judgement. That's the whole point of the interview process: to attempt to measure technical knowledge and its application (i.e., judgement). It's hard, but we have to try.
But if we google "bias in hiring [1]," all the results are around prejudices that have nothing to do with their performance on the job. In this context, bias means discrimination against certain genders, races, religious beliefs, cultural backgrounds and so on.
I'll give you the benefit of the doubt: perhaps you are unfamiliar with the generally understood definition of bias when talking about hiring practices.
Alternatively, you're selectively and intentionally feigning misunderstanding to advance a position that such prejudice and discrimination is acceptable.
> There are a number of factors driving these numbers. Simple population counts are one of them: There are more white people than black people in the United States, so it makes sense that the average American is going to have more white friends than black friends.
That's the major factor, but I guess if one tries hard enough one can see the angels dancing on the head of a pin. Why would they dismiss it in the 2nd paragraph to come up with some holier-than-thou babble?
Of course, and then one could do some math and figure out that in a graph where 75% of nodes are X, 25% are Y, and relationships form at random, the average Y will have a higher proportion of X friends than the average X will have of Y friends. Where does one jump from this to some kind of moral argument?
It's like saying the avg kitesurfer has more non-kitesurfer friends than the avg non-kitesurfer has kitesurfer friends.
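The arithmetic here is easy to check with a toy simulation (hypothetical numbers, purely illustrative): build a random graph where 75% of nodes are group X and 25% are group Y, and compare each group's average fraction of cross-group friends.

```python
import random

random.seed(0)
n = 2000
# hypothetical 75/25 split from the comment above
group = ['X'] * (3 * n // 4) + ['Y'] * (n // 4)

# random friendships: each node reaches out to 20 others uniformly at random
friends = {i: set() for i in range(n)}
for i in range(n):
    for j in random.sample(range(n), 20):
        if j != i:
            friends[i].add(j)
            friends[j].add(i)

def avg_cross_fraction(g):
    """Average, over nodes in group g, of the fraction of their friends
    that belong to the other group."""
    fracs = [sum(group[j] != g for j in friends[i]) / len(friends[i])
             for i in range(n) if group[i] == g and friends[i]]
    return sum(fracs) / len(fracs)

print(round(avg_cross_fraction('X'), 2))  # about 0.25
print(round(avg_cross_fraction('Y'), 2))  # about 0.75
```

Both numbers simply mirror the population shares, which is the point: the asymmetry falls out of the 75/25 split alone, with no behavioral story needed.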
F*, this grouping and deindividuation of people that the American left has degenerated to has truly become a post-modern religion, with its requisite self-flagellation and utopian symbolism.
How did a simple citation turn into a rant about the “liberals”?
This isn’t a novel observation. People live near people who look like them, and when they are voluntarily choosing whom to socialize with, they socialize with people who look like them. I’m not making any value judgments.
But even if you look beyond racial lines, how many 25-year-old singles hang out with 45-year-old married people? It even happens in reverse. My former manager was trying to build a department of developers in a remote office. After he got one or two good hires using the standard recruiting process, all the rest came from referrals.
No one in our office of 10 developers was under 35. How much different would the median age have been if he hadn’t used referrals?
Yeah that definitely helps a lot. But it means you won't get to hire great people from outside your two-layer deep professional circle (people known by people you know).
Do great people really interview _that_ poorly? It's certainly possible, but it seems almost an edge case that's not worth trying to exploit. Rather, I feel the main competitive advantage would be to become better at interviewing.
Learn how to evaluate people on their skills, rather than your own personal belief on how a problem should be solved. Understand how someone is thinking and approaching a problem, instead of a yes/no checkbox. Be able to determine what a candidate's strengths are and how you can best utilize them, rather than how to fit them into a pigeonholed role you have imagined.
There's a growing rift in software between employers saying "there's a talent shortage" and a rapidly growing population of devs who feel like they're locked out due to the technical interview process.
Many of the engineers not being hired are recent bootcamp grads but there are also tons of CS majors that can't seem to "crack" the interview process.
Part of my job is helping companies "fix" their hiring and one of the ideas that I've been putting forth for years that's slowly gaining steam is developing a "technical apprentice" role. This role would be responsible for tasks that are frequently de-prioritized like documentation, testing, QA, bug fixes, note taking, etc. and would be a foot in the door for entry-level engineers. The role is designed to focus on communication and soft-skills while also giving the person a chance to prove their "grit" on the technical side. Even a few months in an apprenticeship role is generally enough for companies to "take a chance" on someone as an entry-level engineer.
This has been a great way to shift interviews away from algorithms and more towards finding people who can add immense value to technical teams even without having on-the-job programming experience.
I'm curious what the HN crowd thinks about that role as a way to bridge the hiring gap.
Seems like a no-brainer to me. Most skilled professions provide some sort of apprenticeship track where you balance some mundane tasks with more complex tasks working with skilled professionals in the discipline to learn the trade better. You're not paid at industry high rates initially but provided a reasonable timeline for full competitive pay employment into the profession.
New and young surgeons don't start out day one on the job after six years or so expected to perform a successful open heart surgery, like software engineers are essentially expected to do. You also don't have civil engineers designing full dams or bridges on day one of their job after their bachelors or masters.
Software on the other hand thinks that for some reason, you transition from student to expert in the blink of an eye or can simply pick up what you need in a few weeks. It's completely unrealistic and businesses need to realize the value and necessity of apprenticeships and mentoring.
The problem with this, though, is that technology is so diverse (now more than ever) that a certain amount of the skill you learn at a business is going to be non-transferable, unlike other professions, where the craft is mostly constant.
This means employees are a little less interested in these commitments because it can tie their skillsets to a specific employer if the employer isn't keeping with popular industry trends. It's also a cost employers and employees often don't want to pay in a world where employer/employee loyalty is non-existent. I think for apprenticeships to work, they need to provide transferable skills/knowledge and or provide some basis of loyalty and long term commitment goals between an employer and employee. Both of these seem like incredible obstacles in the current development climate.
I have taught at the undergraduate level, which gave me the perspective of observing which undergrads got job offers and which didn't. Almost without exception, those students that did well in class and made an effort had no problem getting job offers. It was rare if they got a job at a FAANG company, but I don't think I knew anyone who wasn't able to secure an offer. One thing common to all of these students is that they practiced technical interview questions. Some had internships, some didn't. Some of the students were very bright from a theory perspective, but most had a relatively poor grasp on theory. However, one common trait was that they made an effort to grow as programmers.
There were students who really struggled getting jobs and it was not at all surprising. They struggled in their courses and were not proficient programmers. I never got the impression that they spent much time outside of class trying to work on these skills. No matter how hard I tried to motivate them it fell on deaf ears. I do not think an apprenticeship would have much value to these students other than possibly delaying the conclusion that this may not be the field for them.
I see the idea of apprenticeships discussed as a better alternative to the technical interview. My observation is that those that are proposing this have not worked in a "skilled profession" (not entirely sure how one defines that) that has a so-called apprenticeship. A common example I see on HN is that of doctors, where their residency serves as a sort of apprenticeship. The number and difficulty of tests doctors have to go through to practice as a doctor (at least in the US) is pretty incredible. Given the choice between going through that or studying months for the most difficult battery of technical interview questions, I would choose the technical interview route every time. This completely ignores the fact that doctors attend 4 years of post-graduate education (medical school) before even starting the residency. Imagine if companies required a PhD in CS before you could become an apprentice! And their boards are, without question, orders of magnitude more difficult than a PhD defense (I have a PhD in CS and my wife is a spine surgeon, so I am speaking from experience).
The counter argument I suppose is "well maybe not doctors, but what about accountants and actuaries?" I was a fully credentialed actuary in a prior life and can say those exams were way more stressful and difficult than preparing for technical interviews. Once you are through them it is very easy to move around and there are no technical interviews, but it also takes on average 7 years of intense studying and heartbreaking failures to get there.
I am not saying technical interviews are perfect. However, it seems the theme is that the grass is greener in other professions, and I don't think that is actually the case. In what other profession can you study your butt off for 6 months and land a job paying over $250k out of college?! Yes, preparing for technical interviews is difficult, but boy is it worth it (at least in my opinion). I personally think they are a great opportunity to grow as a developer as well. Anyways, I suppose I have gone on enough.
We have roughly this concept in my org (field consulting for a major cloud provider). I've been very involved on the training side and in having people directly on my team. Some notes from my experience:
1) We still have a pretty intense interview process. We largely aren't taking people from code camps, but people with programming experience from university who may not have been CS majors.
2) To work, it is necessarily VERY labor intensive. I ask teams taking on one of our early "apprentices" to expect a senior-level person to spend at least an hour a day with them for at least a month, and multiple hours a week for a year.
3) You need to have long timelines - on-boarding is hard even for well-trained people with lots of experience. In my experience, "apprentices" often take 6+ months to become value-adding. For a project that will be over in 8 months, this can be hard to swallow. The program and adoption have to be driven at a strategic level.
4) The benefits of the apprenticeship program often do not accrue to the group doing the investment. If it takes 6 months to get someone to the point where they are adding value on average, and then another 6 months to the point where they are more than covering their salary and cost, then they are a year into the position. We see a lot of our people choosing to transition at around 18-24 months in role. We just sunk a ton of time into getting people competent, but the benefits of that training go to another group or company; in those cases we would have been better off hiring no one.
5) Despite all the above, many of the best people in my org have come through the program. Beyond just being good at their jobs, they bring a diversity of background and experience that really adds to our ability to execute on problems.
I'm a big supporter of apprenticeship type programs and think they pay off in the end, but as I've described there are a lot of failure points along the way.
at the moment the comment immediately below this is advising folks to find an internship, and in many ways that's what you're describing - a role with a limited timeline and a low minimum bar, but within which some people will be able to demonstrate abilities and promise far beyond that minimum.
It's reasonable, but it doesn't always work out. One of the larger heartbreaks of my professional mentorship time was bringing in someone who seemed like they could be a "diamond in the rough" but was just unable to ever quite pull it together. Watching this person's limited successes and repeated failures and their understandable, almost always mature emotional responses, and being unable to help them get over the hump was pretty frustrating for all parties involved. I also think it was short-term bad for my career: I felt judged for putting so much energy into someone who wasn't quite working out, and I don't think that opened any doors for me personally.
So - it's still a risk! What I think is that when it works it's great, but the people doing the hiring, evaluation, and training are still taking a fairly time-and-resource intensive risk, and when it doesn't work it's no fun at all. Gives me a lot more sympathy for folks who take the safe route.
Maybe there's a paraphrase of "nobody ever got fired for choosing IBM" along the lines of, nobody ever gets dinged in performance review or passed over for promotion if they only hire candidates who can sail through whiteboard-algorithms questions.
As someone who is regularly helping companies improve the hiring process, I can say with absolute certainty that companies _do_ get dinged for "only hiring candidates who sail through whiteboard-algorithms questions". Companies know that these are not indicators of success and that they lead to uniform teams with homogenous backgrounds.
I will also say that crafting real-world "on the job style" interview problems is gaining steam, as it should be!
This is a great idea for apprentices/entry-level talent. However, what I see in FAANGs is that though we claim to have a shortage, the reality is that demand is low (internal headcount) and supply is high (for entry-level talent). LeetCode-style coding interviews are pushed very hard so that we can have a standardized process, whether or not that is what the job entails. After all, most of the work being done is on some custom stack anyway. The downside is that it ends up impacting experienced engineers, who are sadly evaluated on the standardized tests rather than their experience. I feel this may even be the desired outcome!
I like your approach. In fact, when I started in development, several people truly believed that for the first 2 years or so you were only supposed to fix bugs and things like that.
The one problem I see with the approach, and this is an issue with the company not your idea, is that they don't have people to train entry level developers. Most of the senior people I've met are incapable of doing it. They lack real expertise or an ability to effectively communicate and deal with people, or they are simply burned out from it. I've seen a number of places where the senior person is tasked with doing that and it takes up all of the person's time. They get punished for being promoted to that role. Heck I can say that I did for a while too. I took a lower position so I could write code and not deal with all of the politics and stress of leading people.
My experience, as someone who has advocated for hiring people along the same lines, is that this is a pitch that everyone will agree is a good idea. But then when it comes time to actually hire someone under it, people will hem and haw about whether or not they have the capacity to mentor people like this.
Usually it then won't happen unless they fit a very specific "comfortable" mold of person that they believe can be a "self-starter", which usually just means someone who's similar, in various ways, to the person/people deciding on the hiring.
Where I work, mentoring a junior person is almost mandatory once you reach a certain level. I've been told flat out that one reason I was passed over for a promotion was that I wasn't mentoring people. Even though my department didn't have any juniors at the time, it still held me back (my boss also started looking for opportunities for me to mentor outside the department, but I left before that got anywhere, for other reasons).
In my view, an apprenticeship with:

- a real chance and a clear path to 'upgrading' to a dev role
- generally good integration with the team (AKA not a person 'over there')
- mentoring & pair programming (for example for bug fixes)
- time scheduled for learning
Can be both attractive for an inexperienced applicant and very useful for a team. Just note that if you take an apprentice or even intern, then you have a responsibility of teaching and helping that person grow.
Also note that some of the things you mentioned, specifically QA and documentation, are quite hard to do well. There needs to be some guidance.
The danger is that people only give an apprentice tasks that nobody else wants to do. Some of that is fine, but I've seen people give interns/apprentices tasks which required expertise but were 'boring'. This is a failure in my eyes, because now your apprentice is overwhelmed and demotivated.
I'm an advocate for when it works, but it takes real commitment and effort. It takes engineers who want to mentor and see it as part of their responsibilities. It takes managers willing to mentor mentors and hold them accountable.
It takes all of that AND it takes the luxury of a company that has the excess in time and money to invest like that. Some companies don't. I've been in both companies that do and don't. When you're running a tight budget, you can't responsibly hire talent that is going to take a year to do a job you need done today.
There is a shortage of talent, but it's self-induced. There is no common baseline of competency. So when talent seems rare or absent, which is cheaper: waiting for talent to materialize, or lowering the level of acceptance?
The problem with that is that the new lower level of acceptance only provides temporary relief for the current shortage. Later developers see this new lower level of acceptance the same way prior developers saw the prior level of acceptance and thus the bar must be lowered further because the shortage rate remains unchanged.
I think this is the way that hiring will work in the future, and companies will be slow to adopt it... until a few companies show amazing results; then everyone will get on board, in order to lock down good results early.
I imagine a big part of the success of this approach, though, is that the companies have the proper orientation towards the technical apprenticeship.
So, if the technical apprenticeship goes well, of course the apprentices benefit, but I suspect the members of their teams that support them also derive massive benefit.
What a cool opportunity for less-senior developers, to play a role in helping the technical apprentices make a large impact.
I'd love to talk more about this with you. I didn't find your email address - would you be willing to share it, or send me an email at josh@josh.works?
this is a domain I've spent a lot of time on, and have only scratched the surface.
Internships are generally thought of as something you do during college (summers or otherwise). Most CS grads would scoff at getting an "internship" after graduating. Internships are also very structured and generally involve working on a specific project within a technical team (I know they're all different).
The apprentice role, the way I've been pitching it at least, is different. This is a role where you join an engineering org and learn the product by QAing it, join technical discussions and help out by taking notes for the team, show off your communication skills by documenting new features and, big picture, you find ways to add value to the team in whatever ways they need. Over time, the bugs fixed get bigger and the person can bite off small features, etc.
The problem is that companies _want_ to hire new grads (even bootcamp grads) but don't feel comfortable paying SWE rates for someone who hasn't worked as a SWE (often rightfully so). The comp for this is equivalent to a QA eng but has a clear path towards being an entry-level SWE (3-6 months maximum). If after 3-6 months it's not clear if the person can add value as an engineer then it's clearly not a good fit.
This is an interesting approach. My concern would be that companies too often consider their senior engineers to be purely technical and set them to work on "the hard problems," rather than intentionally setting aside part of their days for mentorship, exploration, and thought leadership.
That culture shift is going to be difficult for organizations that don't already embrace it. It's not something you can quantify easily for an executive board.
I think it’s a fantastic idea and applaud you for your work. The major hurdle is that leading companies only want to hire senior devs. Why spend money and implement a training program when you can outsource it? There seem to be three paths in. One, go to a top-10 CS college. As a hiring manager, our internship choices were broken down by school, with fewer than a dozen choices: Stanford, Harvard, MIT, Berkeley, CMU, etc., and a few diversity conferences. The candidates’ resumes were literally partitioned into folders for each school or “diversity”, so to peruse hundreds of candidates you had to go by school or conference, with no way to see who had interests or experience in distributed systems, front-end, AI, etc. without choosing a school or conference to look through. I couldn’t believe how biased it was. Two, go work at a startup or small company for below-market wages to get experience and try to work your way up to better and better companies. Three, work overseas for a couple years, pay full price for a masters in the US, and then work the H1B masters recruiting pipelines.
While there are exceptions, the bootcamps seem to be geared toward path two. I wish bigger companies had training programs. I was working with a veteran who recently graduated from a local college, helping him with interview prep. He couldn’t pass the phone screen for the role I had available and ended up working for a small local consultancy which seemed to be more IT-focused than dev. I couldn’t in good faith just hire him without support from the team. I wish companies had training programs to hire local people into these jobs, but there really isn’t a shortage of talent that requires them to set these programs up. The shortage is in quality senior devs who could get into a top company but are willing to work for less elsewhere. Anyone who sets up one of these training programs and doesn’t pay FAANG salaries is just training up candidates to move to FAANG. I guess in a way paths two and three really are the training programs you are talking about.
And the stuff you are talking about regarding documentation, testing, etc. also sounds like other tangential roles like TPM or quality, which unfortunately are just being put on developers themselves at many places. I think most companies should have more people in these roles with the aspiration of moving into dev. Maybe a “training” role focused on this could help, but you may also run into resistance from people who are in these roles in organizations.
> a rapidly growing population of devs who feel like they're locked out due to the technical interview process.
I’m not convinced all these articles about “hiring is broken” and “interviews are broken” actually help with this problem. They are pining for something that may not exist (and may not be possible), and failing to help candidates understand and excel within the current system’s imperfections.
I get the impression that many young devs have skewed expectations and aren’t practicing job interviews before attempting them, and believe that their coding skills are the only things that should matter. The article here reinforces those expectations by repeatedly talking about evaluating competency without acknowledging that competency in soft skills like writing and communication and attitude are often at least as important as competency in software engineering, for example.
That said, I would very much agree with your approach and that a technical apprentice is a nice way for both the candidate and the company to learn about each other. Internships naturally do this without necessarily having the expectation of hiring the intern - though I’ve seen a lot of interns hired because they prove themselves competent.
One problem with a technical apprentice role as an alternative to longer interviews is that you have to expect to not hire a large percentage of the apprentices (otherwise there’s no point). In that sense, the apprenticeship becomes a much more demanding interview, and the commitment and risks are much higher for both the candidate and the company. As someone who does a lot of interviewing and resume reading and hiring, I’m not sure I would change my interview process if I hired more apprentices (but I already spend a lot of time looking for potential over experience.)
I would argue that _none_ of the articles about “hiring is broken” and “interviews are broken” actually help with this problem.
There is near-universal consensus that technical hiring is atrocious, and yet very few people are putting forth possible solutions, and few companies are willing to experiment with the status quo.
My ideas are:
* Real-world interview questions
* Standardized testing for "soft" skills. It can be done.
* Dedicated onboarding resources
* Apprenticeships for entry-level engineers with dedicated training programs
* Remove all names from all inbound job applications
* Offer _more_ money for internal referrals. Make them on par with what you'd pay a recruiter.
> There is near-universal consensus that technical hiring is atrocious
I hear this a lot on Hacker News, but not inside any companies. And when asked why people think it's atrocious, I rarely get a set of answers that agree on the specifics. I'd love to hear more about what metrics show evidence that hiring is atrocious, as well as what is making it atrocious for you. What does that mean in the pre-coronavirus context? Are companies not finding people, and are people not getting jobs? I'm asking honestly and seriously, so I can improve my own hiring practices. I don't doubt there's a problem, I just don't have a handle on exactly what it is, and the reports here on HN aren't matching my own experience. I admit I don't match the profile of the average web developer, so I may be ignorant of what's happening in the broader tech world today, especially regarding recent hires from school.
In my own experience interviewing at and hiring for multiple companies over the last 20 years, what I've seen and done matches some of your ideas. We are getting and giving real world interview questions. The interviews are at least somewhat standardized. There are dedicated onboarding resources.
I haven't seen names redacted anywhere, and that's a good idea to normalize cultural or gender biases, but problematic if you want people to review your online portfolio, or if you know someone in the company who can vouch for you, both of which many candidates do want.
It is my experience that internal referrals have significantly higher chance of success, pay for the candidate is likely to be higher than for external candidates, and referrers are typically rewarded financially for the referral. Many people complain that the referral system is part of the problem under the logic that it can encourage nepotism and echo chamber behavior. I don't fully agree, but I don't think this is as obvious of a win as you suggest either, or that the majority of people would agree with it.
It's atrocious but most companies won't reveal specifics because _those_ people were hired via the same process. So it plays into survivorship bias. And nothing will change.
Could you elaborate on why you think it’s atrocious?
I’m not sure I agree that the processes are hidden, every company I’ve ever worked for (several large corps you’ve heard of + several startups) talks about hiring practices publicly. Glassdoor and other sites, including HN, have all kinds of information & stories about what happens during interviews.
Software engineering is a field that requires both breadth and depth, with a very high bar for acceptance. From the interviewee side, and across the many interviews I've been through, it often feels impossible to overcome. The breadth the field requires leaves gaps in knowledge: anecdotally, I've been asked to name a language I'm an expert in, to detail the deep particulars of React app development, or to solve problem X with maximum efficiency, with ease, while narrating my thought process. I don't mean to paint a picture where you're not allowed to test anyone on anything, but as an average dev, it feels like stabs in the dark, compounded by not having a clear path to improving, because everyone's interviews are different. You can improve on the fundamentals, but that only helps so much.
To me, communication and problem solving skills are the biggest factors of what makes a good SWE. I find that most technical interviews don't look for talent that's acceptable, but what's exceptional; not for who's capable, but for who's accomplished.
My firm belief is that many devs who struggle getting a job now, could have walked into any company pre 2010 and gotten a job with the skills they have now.
This is all with the caveat of the knowledge that high salary commands high talent, but my opinion is that the bar is set ridiculously high without regard for how arduous the process is.
I have had a few very fair, but challenging interviews, and I do think that's a step in the right direction.
Seems totally reasonable, thanks for sharing your perspective. I’d agree that a lot of interviews can range from hard to insane. I’m really curious what that means in terms of measurable outcomes... like how many fresh CS graduates are completely failing to land a job? How many companies are failing to find candidates? Stuff like that...
FWIW, I warn people I’m going to ask questions they don’t know, and I ramp the questions up until people can’t answer them. I know it can be uncomfortable, but I also like finding the bounds of what people know. I do ask a lot of easy questions, and the majority of the interview isn’t knowledge questions at all, it’s open-ended conversation, usually about experience. I don’t have a sense for whether asking a few questions that are too hard for the candidate puts my interviews in the category you’re describing, or whether you’re talking about an entirely different level of arduous.
Oh that's fascinating! Can you talk about what questions are scientifically proven to make a difference, and/or the methodology of proving it? I've read bits and pieces here and there, but not seriously followed such research. I imagine that style, specific words, amount of follow-up, and even mood in the room can play important factors here in the "success rates" of certain interview questions...?
I just ask the prepared questions... Interesting follow-up questions do come to mind, but I force myself to set them aside and concentrate on the prepared list.
They have said that if we want a new question added to the list, we need a properly controlled experiment: ask it of a random sample of candidates, don't use the answer in the hiring decision, and then evaluate after 6 months whether there is a difference between the two groups' performance on the job.
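The protocol described above is essentially an A/B validation of an interview question. As a hypothetical sketch with made-up performance numbers (the function name and the scores are mine, not from the thread), the six-month comparison could be done with a simple permutation test on the mean performance rating of hires who answered the trial question well versus those who didn't:

```python
import random
import statistics

def mean_diff_pvalue(group_a, group_b, iters=10_000, seed=0):
    """Two-sided permutation test on the difference of group means.

    Returns the fraction of random relabelings whose mean difference
    is at least as extreme as the observed one (an approximate p-value).
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(group_a) - statistics.mean(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(iters):
        rng.shuffle(pooled)  # randomly reassign scores to the two groups
        diff = abs(statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:]))
        if diff >= observed:
            extreme += 1
    return extreme / iters

# Hypothetical 6-month performance ratings, for illustration only.
# The trial question was asked but NOT used in the hiring decision.
answered_well = [3.8, 4.1, 3.9, 4.4, 4.0]
answered_poorly = [3.7, 4.2, 3.6, 4.3, 4.1]
p = mean_diff_pvalue(answered_well, answered_poorly)
# A large p-value would suggest the question adds no hiring signal.
```

A real version would need far larger samples and care about confounders, but the core idea — withhold the answer from the decision, then compare outcomes — is exactly what the experiment requires.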
Care to back that up with any reasons? Engineers are absolutely not immune to cultural and gender biases, the evidence may be pointing the other way, that we currently have greater than average biases in tech. Isn’t oversight likely to make it better, not worse? While it may be a problem to not be able to ask some kinds of technical questions, like the GP comment alluded to, what is wrong with the idea of trying to limit questions to those that are proven to be relevant to candidate performance? This seems similar to how I heard that graduate school performance and success in the sciences doesn’t correlate with GRE math & science scores anywhere near as much as it correlates with the language & writing tests... I wouldn’t be surprised if technical interview questions do not do a good job of identifying who you should hire...
Some technical questions are really sink or swim. [0]
I've seen interviews get totally derailed by a simple FizzBuzz question. [1]
I wonder if the GRE situation doesn't have more to do with selection bias, i.e., applications with a score lower than a certain threshold aren't considered at all, or folks who aren't convinced of their ability in math & science simply won't apply to graduate school.
I’d completely agree that asking some technical questions is pretty important, and my experience is that they don’t have to be particularly difficult at all, you can filter a lot of people with very basic questions. I have several first hand stories that match the one you heard about the FizzBuzz question. People who refuse to answer easy technical questions and/or get angry about being asked them are a HUGE red flag. Sometimes you have to do mundane tasks at work, and anyone who thinks they’re above it doesn’t deserve the job. Getting angry about easy questions is short-sighted, I mean they’re easy. It’s a sign the person doesn’t enjoy coding or work.
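For readers unfamiliar with the FizzBuzz question mentioned above: it is deliberately trivial, which is the point — it filters for basic willingness and fluency, not cleverness. A minimal solution (any working variant would pass such a screen) looks like this:

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings:
    multiples of 3 become "Fizz", of 5 "Buzz", of both "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

That a question this small can derail interviews is exactly why it remains a useful first filter.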
Anyway, you could be right, but I don’t think the GRE thing is selection bias, I looked this up a while back but I can’t find the study now - I’ll add a link if I can find it. Choices and scores were controlled for, and the takeaway was that people who are good at language truly did perform better in grad school. I think it’s plausible since success in grad school and primary output in grad school is papers & writing & a thesis. The same is true for working at a company, the majority of very successful people aren’t the coders (with some exceptions) but they are more often people who are good at communication, writing, planning, strategizing, and rallying others to work together.
I wouldn't jump to that conclusion. It might be true sometimes, but there are people who can code but get offended by such tests. I'm saying it doesn't matter if they can code or not, because getting angry about easy questions is a good reason to reject the candidate, aside from their technical skills. It's demonstrating a lack of willingness. They might be an amazing coder, but not a team player. They might be unfriendly. They might be defensive about their skills, or assuming a golden resume should exempt them from coding tests. They might not be able to code. No matter what the cause is, I have a reason to avoid hiring that person.
> What's missing from the picture is that a great communicator who isn't a chemist will never apply to a PhD program in chemistry.
While true, I'm not sure why that matters? PhD applicants do have a wide range of scores on their GREs, and you can still control for those scores and adjust, as well as look at all science fields, not just chemistry. If the correlation between outcomes and writing scores is higher than the correlation between outcomes and math/science scores, it doesn't really matter that you haven't measured people who weren't interested, does it?
BTW, googling around currently gets me a metric ton of results and studies concluding that GREs don't predict graduate school success beyond the first year. I'm not finding much on the subject tests vs math vs language, but it looks like the current consensus is that GREs are bad predictors and cause some gender/race/class bias problems.
> That's not really going to help fill technical positions.
I'm saying the successful coders are the ones who are better at communication, among the coders -- counting only coders who already have the job. And the successful grad students are the ones who are better writers, counting only grad students who are already enrolled in a science PhD. That's what I meant about controlling for the bias. Being able to communicate and write well is a major skill needed in technical positions, and the technical people who excel are the ones who are better at explaining, communicating, writing, publishing, etc.
I don't know how to find them, and leetcode and interview puzzles clearly aren't it. However, we've all worked with folks who get way more done than others. We've also worked with people who are a net negative for the company. Your paid internship only kinda works at the entry level, and doesn't fix that the filter is broken at the mid and senior levels.
At mid and senior level, those engineers' peers know who they are. I know we're culturally allergic to using reputation/personal experiences in hiring, but the information is there if you really want it.
> There's a growing rift in software between employers saying "there's a talent shortage" and a rapidly growing population of devs who feel like they're locked out due to the technical interview process.
Many of the engineers not being hired are recent bootcamp grads but there are also tons of CS majors that can't seem to "crack" the interview process.
I'm extremely skeptical of bootcamps, especially after learning that some of the TAs at these programs are students hired to help with teaching as little as two months in[0].
I'm afraid that a lot of these bootcamps train students on "practical and applied skills" that make them one-trick ponies. That is, they know how to do one thing and one thing only, and if the project's stack changes, it's unclear whether they will be able to adapt.
> This has been a great way to shift interviews away from algorithms and more towards finding people can add immense value to technical teams even without having on-the-job programming experience.
I'll play devil's advocate here and say that algorithms are a pretty good proxy for on-hire performance. I can, as an interviewer, expect most grads from serious CS programs to have had an algorithm class. Being able to demonstrate that they are capable of learning algorithms and applying them gives me confidence that they will be able to learn new stuff fast when joining the team.
It's a sink-or-swim situation where I believe that it's very hard to teach CS fundamentals on the job but relatively easy to teach new tools and new frameworks. So I would rather hire someone with strong fundamentals.
While I have never done an algorithm style interview in 25 years of being a developer across 8 jobs, I have softened my stance to it.
One of the biggest problems CS grads face is that they can’t break out of the cycle: they can’t get a job because they don’t have experience, and they can’t get experience because they can’t get a job. After the first job it gets easier.
You can practice for algorithm type interviews and get a job. It also doesn’t matter where you went to school. It’s the great equalizer if you can teach yourself.
If I were trying to get my first job today instead of in the mid 90s, I would have been spending time “grinding LeetCode”.
> One of the biggest problems CS grads face is that they can’t break out of the cycle: they can’t get a job because they don’t have experience, and they can’t get experience because they can’t get a job
Serious CS programs often have employment rates really close to 100%.
This makes sense and it's a wonder that most companies aren't doing it. There are a lot of people who can do the work with the right apprenticeship/training. IMO, there really isn't a shortage of people who can learn to do the work, but there is a shortage of companies willing to make the investment in people.
There's nothing wrong with apprenticeships but if senior hiring remains unfixed then what is the apprenticeship for? As long as going from one company to another is a major hardship with risks and drop out tech is not really a career path.
My previous employer did this pretty well. They had a paid internship program, but they also had started as an AS/400 shop writing RPG code, which none of the local colleges taught anymore.
So, they basically developed their own course, and kept it fairly consistent over several years. The first 20 or so hours was some computer based learning modules they had purchased to teach fundamentals and syntax. After that was a series of well documented training assignments.
For the first assignment, they'd be handed an existing report program with several bugs that they'd have to figure out. After that, they would start on a simple CRUD program that they would expand on. At the end was a capstone assignment that simulated a real project from interviewing a "stakeholder" to gathering requirements, writing documentation, and implementing the solution before finally promoting it through the change control process.
Altogether it was 150-200 hours of training, which corresponded to about 2-3 months for a student working part time.
The company recorded everyone's performance, so after the first dozen went through the program, they had a solid baseline for comparison to tell if someone was falling behind or blasting ahead. If a trainee proved they were solid, they might get to skip the last assignment. If they were horrible, it would quickly become apparent and after a few chances to turn things around, they'd be let go.
I ended up on the team responsible for running the training program. Even as development started shifting to Java, they still used the RPG training for a long time, since it was at least a gauge of a person's ability to pick up unfamiliar tech. Part of my job was to help develop a Java version of the course to eventually become the new default.
Reviewing the assignments was kind of fun and taught me to appreciate people in QA roles. Submissions didn't have to be perfect, but they had to meet a certain level of quality before the trainee could proceed.
Sometimes someone would bomb one of the early assignments, and that was okay, even if fixing the issues and resubmitting took longer than normal to finish. The real red flag was repeating the same mistakes over and over again.
Even though the training program was created for interns, a few of the dev managers would have all their new hires, including senior engineers, go through the same program. Normally the full time hires would "graduate" early, but I know of at least one mid-level dev who didn't make the cut at all. The hiring manager was thankful to have dodged a bullet, and have the historical performance data to back up his decision when he went to talk to HR.
> There's a growing rift in software between employers saying "there's a talent shortage" and a rapidly growing population of devs who feel like they're locked out due to the technical interview process.
I think there are ways for both of those things to be happening simultaneously. (Well, sort of- employers claim a talent shortage, but really it's a shortage of talent at prices they wish to pay.) The fundamental problems are that it's not possible to measure a broad swath of candidates' abilities, and that hiring an engineer is very expensive.
I interview people all the time who have jobs at "top" companies, who then proceed to do pretty poorly in the interview. Now that could be the interview's fault, or my fault, or bad luck/nerves; at the very least you can say the interview may not provide an accurate depiction of the candidate's abilities. And I am sympathetic to that: I've been on the other side of it and I know what that's like. But the two fundamental problems remain.
Related message for students / new-grads: Find an internship. Internship interviews tend to be much easier as the stakes are lower. If you perform well during your internship (arguably easier / more accurate indicator of success), the company will likely extend an offer. At big companies I've seen internship to offer rates exceed 30%.
I've noticed that in banking - and I'm not talking about investment banking, or high-finance in general, but regular consumer banking - there's been a trend to basically hire woefully overqualified people for the lowest positions around - I'm talking about bank greeters, customer support, and what not, and then train them from there.
This could very well be a local thing, but there are so many people today qualifying for these jobs - lots of BBA and MBA candidates out there, willing to do pretty much anything to get a foot inside.
When I was interning for a bank, even the greeters (basically the person that just greets the clients, and forwards them to the right people within the bank - a receptionist, really) had a Bachelors degree, many were working on their Masters.
The more sought after positions had been re-labeled "graduate programs" or "trainee programs", and were aimed at the top-shelf students. While the rest pretty much had to get a foot inside by working their way up from the bottom.
So you suddenly have a ton of highly educated candidates applying for jobs that only 15 years ago required a HS diploma, if that even.
Then when they're first inside, they tend to get moved around internally - as a lot of positions only get posted internally.
It's almost the same way with internships. Internships are there to practically train and select future company workers. If you do well, you get a return offer - if not, well, at least you have some experience.
With the rising number of graduates, I can foresee a future where candidates are divided into the regulars and the elites. The regulars, no matter how qualified they are, will have to start at rock bottom, proving themselves for $8.50/hr, while the elites are trained for leadership/management-track positions.
I'd go so far as to say, at least within tech, your primary goal during undergrad is securing one or more internships.
I look at internships before grades and the school.
When I interview fresh grads, if they have internships, that's what we talk about.
When I have interns on my team, my goal is to hire them. At one large tech firm I was at, we had a goal of hiring 50% of our internship pool. It was the primary way we brought in new talent.
In most of Wall Street, internship to offer rates effectively start at 100% but some people during the course of internship manage to cause the employer to rethink the offer, so the final result is anywhere between 90-98%
30% is very low for big companies, I’ve never heard of somewhere being lower than 50% and 75-90% seems more standard (though offer does not necessarily lead to an acceptance).
This message is expressed very well here [1]. Tl;dr internships are helpful to students for a variety of reasons, like making applicants more valuable to employers, making programming less of an abstract concept, and exposing them to interesting subfields of computer science.
Wow, that headline is a stick in the mud on a nuanced topic.
> What’s your biggest weakness? Where do you see yourself in five years? Why do you want this job? Why are you leaving your current job?
I do a lot of tech & business interviews, and I don't ask those questions (unless they've recently left A LOT of jobs). I ask situational questions to understand how they think, who they talk to, what research they did to understand the problem, and the solution (was it simple?). If they tell me they built a Rube Goldberg machine, I ask if they would have done anything different with hindsight. If they don't realize they built a Rube Goldberg machine, well, perhaps they won't be a good fit.
People who can solve problems, do their own research, ask for help when they get stuck, aren't afraid to attempt solutions (many that will knowingly fail), and have the introspection to identify failures and admit them are generally people you want to hire for senior positions.
Now I admit that it's a lot harder to hire this way for junior positions, when candidates have fewer examples, less job history, etc. Educational projects are a substitute, as is work on personal projects.
It is true, and it's not that I'm trying to be dishonest myself, but I am not the same person in an interview. It's sad that we can't be honest or humble, but in this system we have to sell ourselves: fake enthusiasm for the hiring company is a must, do whatever it takes to pass the interview, then decide later whether to take the position. If not, somebody else with the same capability, or less, will snatch that job. Plus, interviews, like tests, are gameable.
In an ideal world a trial period would ensure both the employees and employers are a fit. But that could be abused as well if it becomes the norm.
Someone always suggests this. The issue is that if you did this, 95% of the candidates that would agree to this kind of setup would be the kind you didn't want.
If I'm sitting on 2 offers, one is a hire and one is a "Let's see how this works out after 2 weeks of work", I'm going to take the first one. And that says nothing for the necessary benefits question in the states, where changing jobs often involves expensive (in money or time) changes to health care insurance.
We don't know for sure and it's hard to know how this works. What's obvious is that the current system is broken and we need a replacement of some sort.
Being able to try a company and see whether they like me and whether I like them and the type of projects I am supposed to work would be a major factor in finding the right marriage. We take jobs for the salary tag and quite often we do whatever we have to do to continue getting that nice paycheck but we're not happy with the work we do.
For sure, it's an empirical question what %age of people who would accept this kind of offer are "I have no other choice, so I accept your offer" or "I want to make sure I will actually want to work at your company".
I'm just saying that I would 100% not do this unless I had no other choice, or was independently wealthy.
As a company you're also competing against other companies for hires. Good candidates can get offers from multiple companies. If you're going to tell candidates to make an opportunity-cost sacrifice of several weeks of work just for the possibility of an offer, then you had better be offering something extremely valuable to the candidate. Otherwise why would the candidate take the chance on your interview process if they can secure an offer from other companies without paying the opportunity cost?
The result is that this interview process will likely only be pursued by candidates not able to get offers from other companies.
In an ironic cycle, many employers also market their services and products in this fashion - over the top marketing, fake enthusiasm for the customer, over-selling features, constant talk about "community", virtue-signalling with social media posts on hot button political topics, etc, etc. To be sure, I'm not saying everyone and everything is fake, but it's interesting to just climb out to the bank, clear the clutter and just observe how the river flows.
> It's sad that we can't be honest or humble, but in this system we have to sell ourselves: fake enthusiasm for the hiring company is a must, do whatever it takes to pass the interview, then decide later whether to take the position. If not, somebody else with the same capability, or less, will snatch that job.
As someone who’s given hundreds, maybe thousands of interviews, personally I would advise against this. Being fake about your interests is not more likely to help you get the job you want and one that fits you well, if it even works at all. Being honest about your interests with yourself and your potential employers is the best way to end up being where you want to go.
You can show excitement for specific topics that a company works on, and it’s okay to like the company’s products, or their work environment, but people who are enthusiastic fans of a company without specific reasons to want to work there are, in my experience, not more likely to get hired.
> Plus, interviews, like tests, are gameable.
This is true, and it’s good to understand so that your expectations don’t go crazy. You are competing against other people, and you have no idea how good they are. All you can do is your best.
Oh you know what, I somehow thought you were saying that interviews are a gamble, I misread "gameable". My reply only somewhat applies, but allow me to fix it...
This is a pretty common concern, I've heard it before. But in practice I've never actually seen it happen myself. It's a worry about what's possible, not what's likely. The problem with worrying about others gaming interviews is that it ignores what happens after they get hired. Someone who's faking it would probably be revealed later. It's usually easy to uncover exaggerations during interviews. And last but not least, the vast majority of people I've met haven't lied or cheated or tried to game the interviews.
I've had lots of interviews where people start by inflating or spinning their experience a little - there's nothing wrong with framing yourself from a positive angle. All it takes is a couple of follow-up questions to uncover the boundaries of their experience. This is one reason why the article here is wrong. Unstructured portions of interviews serve the super important function of conversation about the candidate's experience. Standardized questions are not as good as casual conversation and follow-up questions at helping me really get a sense of you, your experience, and what interests you.
> In an ideal world a trial period would ensure both the employees and employers are a fit. But that could be abused as well if it becomes the norm.
I was talking to our work HR director about this, and given that the US is mostly "at will" employment, either side can decide to end the employment at any time, so a trial period doesn't really mean much. Secondly, it's hard on the employee if we let them go on short notice, so we try to hire from a long-term perspective.
In the U.S. you can be fired for no reason, so the hypothetical trial period is basically just hiring people without interviewing them and then firing them as soon as you realize you made a mistake.
Work on a sample project, sample team for 2 weeks. See if you like the work and the company sees if they like you. This would only work if it were a widely accepted standard practice.
In the US, two weeks of vacation is pretty standard so a prospective employee would either have to a) forfeit all of their vacation for the year or b) quit their previous job first and risk having no job at all.
It was a suggestion for if the system weren't broken. The system is currently broken with respect to both employers and employees finding each other on multiple dimensions: not just salary, but basically the best possible fit.
I'm wondering if much of the discussion here is even about the article, which advocates for things we've long known work better:
* Structured interviews
* Blind auditions
* Competency-related evaluations
The title "Job interviews don't work" is rather click-baity when clearly they advocate that _some_ form of job interviews works. Or is at least better.
As a technical hiring manager for over a decade, here's where I'm at:
- The best interview is an internship. We can't always do that and often we need senior talent now.
- The next best interview would be a portfolio. I am so envious of artists with their public portfolios. If there's one thing I wish we as an industry could figure out, it would be some way to stop testing and retesting ourselves as if we constantly have to reprove what we've already done, and instead find a way to better showcase our work.
- The next best technical interview would be a "homework" project, but I've come around to the mentality that this just isn't fair to candidates. As a hiring manager I love it, but most folks just don't have the time to do a bunch of unpaid work. Even if you compensate them, it's unrealistic for many.
So we're mostly back to the suggestions in the article. They're good. A good hiring process is not easy but it's worth it.
And finally, a bit of anecdotal evidence: yes, there are folks out there you probably shouldn't hire. They aren't a good fit for the role. You want to set them and yourself up for success. That said, there are probably more people who can excel than you realize. A major factor in their success is the maturity of the team and leadership that's already in your company. Sometimes you'll get lucky and hire some rare talent, but if all you're doing is looking for "rare" talent, then you're likely poorly calibrated and relying too much on outside talent to come in and fix the mess already on your hands.
I tried using my (limited) open source hobby project portfolio as a substitute for coding interviews. Companies either didn't take me up on it, or still also required me to do their regular take-home. Twice now, I have had companies ask for the same take-home, although in the first case they asked me to re-do the work in their preferred language.
FWIW, as a hiring manager, if someone has a portfolio I definitely judge them on it, and if it's good it allows me to bypass huge swaths of technical/coding interview stuff and dig much deeper into where/what I want. I always take it as a positive, even if it's old stuff.
What do you find makes a portfolio better or worse for these purposes?
Not so much "more likely to get them the job", more... I felt like my projects weren't actually suitable to take the place of coding interviews, largely because I couldn't actually drop in and work on them in the way that coding interviews show me actually doing work.
Literally anything. If I see someone with a lot of relevant forked repos, even if they're old, I take that as interest and something I can bring up.
If I see a repo of rcfiles I know they care about working efficiently. If I see abandoned stuff with more than 1 commit that's OK, that's something that was cared about at one point. These are just two super generic examples. Almost everything is a positive.
The only thing I don't like to see is repos with 1 commit, and nothing other than a README with the repo title in it. Not really negative, more of a 'cmon gimme more.'
If you are creating a portfolio for the primary purpose of showcasing it to get a job, I don't think it's worth the effort. Maybe there is an exception if you are a high profile contributor to high profile projects.
Otherwise the number of companies that care about a candidate's portfolio is small. Just studying leetcode problems is still the most scalable way to a job, unfortunately.
A shame, because there are so many things I want to build, work on, or contribute to, but all of that is put on hold until I can land the job first via leetcode.
I think it's a great idea and you should keep trying. Lots of places (somewhat understandably) won't deviate from their policy because consistency of the interview process is a good goal.
But at places that are a bit more flexible, it may work.
> What’s the best way to test if someone can do a particular job well? Get them to carry out tasks that are part of the job. See if they can do what they say they can do. It’s much harder for someone to lie and mislead an interviewer during actual work than during an interview. Using competency tests for a blinded interview process is also possible—interviewers could look at depersonalized test results to make unbiased judgments.
Why would a practical test be more effective than a discussion about prior work experience? Short coding tests under pressure are extremely unrepresentative of real work and I'm not sure homework-style interviews are much better. I personally don't want to spend an entire weekend on your test and I've dropped opportunities for that reason in the past.
All these "competency tests" are good for is catching blatant lying. Is this a serious problem? In my view job interviews are mostly about finding a match between skills/interests and project needs.
Communication skills are a very big advantage in interviewing, and that feels unfair, but I'm not sure that's true. Communication and persuasion skills are extremely useful and important in all office jobs, even the most technical.
I'd bet that people who interview well also write beautiful code comments and commit messages.
> All these "competency tests" are good for is catching blatant lying. Is this a serious problem?
Of the people I interview that claim they have years of experience, mastery over multiple languages, and expertise in various frameworks, a solid 80% or more can't pass fizz-buzz in any language they choose.
Claims made on resume or in person? On the seeking side, I've felt pressure to put every technology I've used even in passing on my resume, to satisfy "buzzword bingo" and get through the initial screen, but in an interview I would give a (truthful) answer along the lines of "I don't have a ton of experience, but I know the basics; enough to be confident that I can quickly pick up whatever else I need to learn on the job."
Both. And it's not "putting every technology I've used even in passing on the resume" that bothers me. It's "I don't know how to write a loop or a function in any programming language."
You would be astonished, absolutely flabbergasted, at the number of "senior engineers" who interview well, talk great about projects, but cannot demonstrate basic programming skills (e.g., convert an integer to a string representation, or write fizzbuzz).
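For anyone unfamiliar with the bar being described, FizzBuzz is about as small as a programming screen gets. A minimal Python sketch (this is just the generic exercise, not any particular company's version):

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings:
    multiples of 3 become "Fizz", of 5 "Buzz", of both "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

The point of the exercise is not the modular arithmetic; it's simply whether the candidate can write a loop, a conditional, and a function at all.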
How does this happen? What (apparently huge) swath of industry am I insulated from where this can take root? Is this just a bunch of wannabes after tech salaries? Or is this a side-effect of industrial Java programming, where their "job" was basically filling in boilerplate? (I am probably unfairly slandering Java shops; it seems unlikely, but everything else seems even more unlikely.)
I think it usually involves 'helper' type folks. They end up on a project, and they 'help' the project get done. But the work they do tends to be just the boilerplate (or cut-and-paste style work), or just the gathering of test resources, or a surprising amount of work that you would assume that the PM or Manager would do, but instead is done by this person (organizing meetings, setting up checklists, emailing people about current issues/priorities).
This isn't to say that those people don't add value, but if they do this for a few years, they may find that their ability to do greenfield coding has significantly diminished, so much so that they struggle to write objectively simple code because they've fallen out of practice.
Source: I've been one of those people, and I've worked with many. It's an easy rut to fall into, particularly in Test Engineer role (what MS used to call SDET roles).
It boggles my mind too. I've conducted a fair number of interviews both phone and in-person, and I don't think I've ever encountered anyone remotely that bad.
I work in a non-tech company (bank), so I assume the general caliber of candidates is quite lower than what a bonafide Silicon Valley style tech company would see.
Would the typical candidate I interview be able to pass a DS&A leetcode medium or hard, getting the optimal solution on the first try in under 40 minutes? Probably not. Maybe most of them wouldn't be able to do so on a harder leetcode easy problem either.
But most are still able to code up an acceptable solution to a more "realistic" in-person coding exercise we give them.
There's the concept of the "expert beginner" who works in a role where they're never challenged, so they quickly assume they're great at the thing when, from a global perspective, they just stopped learning very early.
There's also the type that accidentally falls into a manager-type role way too soon, before they've actually had a proper chance to learn programming well. Then whatever basic skills they had atrophy quickly.
If you assume the top third of candidates only need 2 interviews to get an offer, the middle third need 5, and the bottom third need 15, then 15 / (2 + 5 + 15) = 68% of the candidates you interview will be from the bottom third.
This, of course, ignores every human factor but it is a starting point.
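The arithmetic behind that 68% figure can be sketched quickly (the 2/5/15 interview counts are the assumptions stated above, not data):

```python
# Assume equal numbers of candidates in each tier. Each tier needs a
# different number of interviews before landing an offer, so weaker
# candidates are over-represented in any given pool of interviews.
interviews_needed = {"top": 2, "middle": 5, "bottom": 15}

total = sum(interviews_needed.values())          # 22 interviews per cohort
share_bottom = interviews_needed["bottom"] / total

print(f"{share_bottom:.0%}")  # → 68%
```

The general point survives other assumed numbers: as long as weaker candidates need more attempts, they make up a disproportionate share of the interviews you conduct.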
I must be one hell of an outlier, then, because of the dozens of people I have interviewed over the years, exactly one could not demonstrate basic programming skills. Most of them couldn't get much further, but that is a different matter.
You are lucky. I end up doing 2-3 screens a week. Ask any of the typical questions you'd find by googling for Java/Spring/etc. interview questions, and you get the same canned answers. Push on any practical bits, and there are a lot of people who apparently don't know exception handling or some of the other more 'basic' programming things beyond flow control. Few things sadden me more than the senior developer who appears to have done junior coding for years.
> Why would a practical test be more effective than a discussion about prior work experience? Short coding tests under pressure are extremely unrepresentative of real work and I'm not sure homework-style interviews are much better. I personally don't want to spend an entire weekend on your test and I've dropped opportunities for that reason in the past.
These are good points, and the tests I have given are specifically designed to avoid these issues.
Personally I do this:
* Come up with a simple but realistic 1 or 2 hour task, and write up the request in plain English (do not write a tech spec). It should be a task you'd expect anyone to be able to easily do on day one without any assistance.
* Include a short narrative about the end user that clearly implies a specific requirement or two, but don't spell it out as an explicit requirement.
* Omit a small but important detail that any reasonable person would ask about.
* Instruct them to provide a solution to the user's problem, and to ask questions if something isn't clear.
These are pretty simple to evaluate too:
* Does the code run? (i.e. can the candidate write basic code?)
* Does it meet the user's explicit needs? (i.e. can the candidate follow basic instructions?)
* Does it meet the user's implicit needs? (i.e. does the candidate write code to spec, or do they think about the bigger picture?)
* Did the candidate ask a question about that obvious missing detail? (i.e. does the candidate speak up, or will they make assumptions?)
You would be surprised how many people can't apply their ability to write code to solving simple problems.
That said, they seem to work best when they're part of an overall interview process. I think it's a mistake to look at only the highest scores and interview those people, they should only weed out the bottom scores and follow a more typical (do they fit? can they learn it?) process for the rest.
> Competency tests have helped me get in the door.
How so? The door was already open if the company decided to give you a test.
My impression is that coding tests either offer neutral or negative feedback during an interview. It's not like a candidate can manipulate linked lists in a way that shows particular brilliance.
I imagine a good compromise might be a small piece of "contract work", where a company presents a small representative task as a project for a candidate to complete and the candidate actually gets some form of compensation for their time. There are definitely problems with this approach, but in the best case, the result is a win-win even if the candidate doesn't get hired. At the moment, every candidate who doesn't get hired represents a net loss for both parties (time for both, and opportunity for the candidate).
Agree on the test stuff. When talking to someone who can go deep into details, you mostly know that he's not a liar; then you have the trial period to fire him if he is.
I interview a lot of people and found much to complain about in this essay. This is the main thing:
> The key is to decide in advance on a list of questions, specifically designed to test job-specific skills, then ask them to all the candidates. In a structured interview, everyone gets the same questions with the same wording, and the interviewer doesn’t improvise.
This is good advice, if and only if you're interviewing an undifferentiated group of applicants, as in the cited examples (college entrance and army recruits). If you're hiring a QA III and you have three different applicants, it's terrible advice. You need to ask about the candidate's specific experience, and ask follow-ups.
More generally, I don't think the stated goal of an interview according to this essay (peering into the candidate's soul to suss out traits like "responsibility" or "sociability") is possible or reasonable. My goal is more modest - I just want to figure out whether you were good at your last job or not. If you say you're responsible, I can't prove you right or wrong in a one hour conversation. But if you say you're a whiz at Selenium UI automation, and you're lying, I will figure it out pretty easily.
My boss tasked me with hiring my assistant. HR filtered most of the applicants and I was left with three. So I ran it the way I'd like to be hired, which was to skip the useless small talk and other painful BS and just bring the applicants around the shop and show them what I did.
The first applicant seemed like his mother dressed him and reminded him to breathe that morning. The second guy was pretty sharp but uninterested during the walk and talk. The third applicant immediately stood out. He was excited and fascinated by our systems and kept asking technical questions - winner. Excellent co-worker until he moved on to greener pastures.
All that "wear your best suit" and "where do you see yourself in 5 years" (best answer: prison) nonsense sounds like it was lifted from one of those cheesy 1950s self-help shorts the MST3K crew routinely riffed.
This is a great article. An undiscussed issue I've personally seen when conducting interviews and being part of roundups is a lack of self-understanding. Lots of people who conduct interviews think they are way smarter than they are (me included!). They hold candidates to very high standards and then dismiss them at the slightest mistake, yet many of these same people make plenty of mistakes on the job. When I give interviews, I always ask the easiest and clearest questions I possibly can that are still relevant to the job, to minimize this bias.
> A job interview is meant to be a quick snapshot to tell a company how a candidate would be at a job.
> Unstructured interviews can make sense for certain roles. The ability to give a good first impression and be charming matters for a salesperson. But not all roles need charm, and just because you don’t want to hang out with someone after an interview doesn’t mean they won’t be an amazing software engineer.
If that's the attitude someone has towards interviews, then no wonder they draw the conclusion that they don't work.
The real issue is that most teams either don't give much attention to interviewing (because they have their primary job to attend to), or too much of the process is delegated/outsourced elsewhere (where people only tangentially understand the work area, and/or have no deep knowledge of the job role).
Lean on interviews for measuring soft skills, and lean on demonstrations (portfolios, code tests, pseudo-code, problem solving, etc) for measuring hard skills. Every job requires some balance of hard and soft skills. If you use the wrong tool for the evaluation, or if the person using the tool doesn't know how to use it, you get the wrong result. Interviews have their place, but technical evaluation is not it.
This hits home too hard. My job search is going so poorly, that I have started to doubt my technical value at all.
Confusingly, personal one-on-one interactions with companies or hiring agencies almost invariably result in positive feedback and comments: "you are extremely hirable," "you have a strong technical background," etc. However, none of these interactions has gone anywhere. Either they "move forward with someone else" or simply evaporate into thin air. A few have even evaporated after extensive interviews and claims that they wish to hire.
Is positive-sounding feedback just a polite way of avoiding some "elephant in the room" problem? Am I inadvertently projecting an image of ineptitude or hostility?
I have over 20 years of experience on Linux, tinker and program as a hobby, and also lightly contribute to open source projects. I believe I have what it takes, but geez, sometimes this job search is just soul crushing. I just want to offer my skills and talents, and to be a valuable member of a good team.
Hang in there.
I have a friend who's in a similar position, she's interviewed and been ghosted a few times and it has really gotten her down.
It can take time but eventually a company will come to their senses and realise your value.
In the summer before my senior year at college, I did a 3 month internship at a software development company. The interview for the internship was very soft, partly because it was only an internship, and also because I knew someone at the company who helped me get the position.
After I graduated, I applied for a full-time position there and have been there for 10 years. The interview process for the full-time position was also fairly soft and non-technical because I was hiring into a team that I worked with during my internship. I like to describe it as a 3-month long interview process. Not only did the company get to know me and what I was capable of, but I got to know the company and its people (in order to make a decision about if it was some place that I would like to work).
Surely this isn't scalable (internships are fairly rare, and generally not available to anyone except students or recent grads), but the whole internship process worked out very well for me. It allowed me to bypass the traditional tech interview, which is something I feel very fortunate about.
Finding good people to work with is hard. Nothing you do will find 100% of the good people (finding 50% of them is extremely good) and filter out 100% of the bad people. Nothing you do will be effective for everyone. Trying to develop a hiring strategy based on what you want interviews or hiring processes to be like will be biased.
Accept that you will make mistakes. Accept that many of the good ones will get away or be undiscovered. Accept that you will make hiring mistakes and have to fix those.
If you are interviewing, do you really want to spend a few years of your life at a place that does the bare minimum to vet you? They do that for everyone and guess what kind of coworkers you are going to get. Sure taking PTO and spending a day in an interview process is a lot of time. What is the alternative? Not really being vetted and working with horrible people.
Finding the right person or the right company to work with is hard. Take the time to be comfortable that it can work for both sides.
Silicon Valley interviews are worthless. The algorithm pop-quiz is just a way for the interviewer to beat his chest about some obscure facts and demonstrate his superior knowledge to the interviewee.
Has anyone found a better process than casual conversation? I've found it effective as long as engineers and non-engineers get a chance to participate. Usually I talk about what they want, what we need, expectations from both parties, and past experiences both good and bad. Once expectations are set, there's little opportunity for a "bad hire", because if they don't live up, it's a simple conversation to refer back to the expectations that were set and help them achieve them, or worst-case, offer them severance.
> because if they don't live up, it's a simple conversation to refer back to the expectations that are set and help them achieve them, or worst-case, offer them severance.
That is a bad hire. You are describing a PIP and then firing someone that you wasted time and resources recruiting, on-boarding, and training.
> They are in no way the most effective means of deciding who to hire because they maximize the role of bias and minimize the role of evaluating competency.
I can believe that badly planned and executed job interviews do the above but I've overseen or been directly involved in the interviewing of hundreds of candidates over the years, for dozens of roles, and the hit rate has been pretty good. Two probation failures, and that's about it.
We're looking to assess skill and character in our interview process. We are interested in whether you can do the job, and whether you're a reasonable human being, and that's it. We have strong structures and guidelines in place in terms of questions, answers, and evaluation. And inasmuch as it's possible we strive to make our hiring process a pleasant experience, regardless of whether a candidate is successful or not (obviously there's some level of stress inherent in going through a selection process). We also give feedback that we hope will help unsuccessful candidates in future (I realise this is unusual and even frowned upon in some circles but our experience has been that most people appreciate it enough that it's worthwhile to deal with the headaches caused by the odd person who wants to argue about it).
I get it. There's a cohort of people on HN who don't like job interviews. Honestly, I'm one of them. But done well, they work well.
Our process isn't perfect, and we're always looking for ways to improve it - there was quite a lot of tweaking early on, for sure - but for us it's worked well. We've spent a lot of time on it because - although my role is as a CTO in a mid-sized firm, and this might not fit with everybody's expectations of that role - literally my most important job has been and continues to be hiring, building, and maintaining a strong, effective team.
And I am very happy with the people we've hired. Just as important, I'm also happy with the decisions we've made about people we've chosen not to hire.
> I can understand social filters when it comes to friendships, sex and intimate relationship, but for jobs, I will never understand why they exist.
Because jobs are just as social as intimate relationships, if not more so. Up to about the 2010s, the majority of marriages came from getting to know someone at work.
Even if nothing intimate does come from work, it's still people you have to see 8 hours a day for years on end. If someone is repulsive/annoying/toxic, it WILL make you miserable at work, and people are very wary of disrupting a well-oiled collective.
Aside from abusive/toxic workers, do you think some diversity (not necessarily racial, but cultural as well, different age groups, etc.) would do any harm?
I personally enjoy working alongside older folks who I find invaluable, they have been around for a long time and have lots of stories to share, but are not as quick to jump on a new technology bandwagon. Lots of companies seem to want to keep the young vibe and avoid hiring these folks.
> Because jobs are just as social as intimate relationships, if not more so.
For very small or family companies, maybe. In small communities, maybe, but small communities could also include people from any horizon.
But in other cases, I disagree. Work and production is the blood of human civilization. Generally, the free market ideology says that if you're competent, it's the only relevant parameter. Social filters are arbitrary, unnecessary and backwards.
> If someone is repulsive/annoying/toxic
The Nazis sent people who were not desired to death camps. Today, those same people are being excluded from society through social filters. You cannot have a healthy society if you keep segregating people like this, even if it's not by race but by other traits like education, politics, or behavior. You're advocating social darwinism through a detour.
If you don't exclude socially toxic people from your company, talented people who don't want to work with that type of person will leave. I know multiple people who have left or transferred because of difficult coworkers, and more who have left because of difficult managers. If one toxic person can't outperform all the talented people who will leave, it is in the company's best interests not to hire that person.
Yes, you can technically call this social darwinism. If you are an irritating person that no one wants to be around, you will be socially excluded. I don't know of any movement that is going to champion your cause. There is a difference between preventing profiling / prejudice and avoiding manipulative, mean people.
> There is a difference between preventing profiling / prejudice and avoiding manipulative, mean people.
I see your point, and yes, there's a difference. I still believe those people can be worked with, in some way or another. Being too selective at the workplace isn't a healthy way to run society.
It's true that there's a difference between discriminating them and avoiding them, but the result and the intention are the same, in my view. Avoiding them is just politically correct and acceptable, but the truth is, it's the same process.
The Nazis lost the war, but they won the battle of social darwinism.
> If you are an irritating person that no one wants to be around, you will be socially excluded.
That's not really what I'm talking about. And that's an appeal-to-nature fallacy. The role of civilization has always been to fix the problems of nature. There are many ways to interpret social signs as "irritating". They're often normative or arbitrary.
> I don't know of any movement that is going to champion your cause.
Socialism, be it democratic or any other form of progressivism, aims at that cause. Europe is more progressive about this.
A rather click-baity headline. The article doesn't claim that job interviews don't work so much as it claims that subjective job interviews are more subject to bias than structured ones. I'd say that's true, but the caveat is that structured job interviews are more subject to people studying and honing skills specific to the interview process. Grinding leetcode definitely makes you better at solving algorithms problems on a whiteboard in 60 minutes but doesn't do much to improve working effectiveness.
I think there are 3 core trade-offs for the job interview process: logistical feasibility, consistency, and resemblance to the actual work experience.
Doing 2 rounds of coding interviews, and one round of systems design from a set list of questions with explicit rubrics is easy to implement and is very consistent. But you can leetcode your way to knowing most coding question archetypes. Systems design question archetypes are even smaller in problem space. These questions have moderate to low resemblance to the actual work experience. Sure, it can identify people who can't code or aren't experienced in systems design. But does it show how well someone takes feedback, or reviews other people's code? Not really.
One of my co-workers used to conduct 90 interviews that started with one question: "how would you build a text editor?" He didn't specify whether this text editor was WYSIWYG like Word, a web-based editor, a code editor, etc. It expanded and touched on a whole variety of questions. It could be traditional data structures, or UI design, or systems (e.g. implementing auto-saving text fields on the web). This was low consistency, since the interview was mostly unique to each candidate. It had moderate logistical feasibility since it was hard to train interviewers on these open ended questions. But I think it had better resemblance to the actual work experience, since it didn't just test coding ability. It tested thinking through the problem and what the desired end behavior for the user really was and navigating how those expectations influence implementation.
One interview process idea I have is to run it asynchronously through GitHub or another version control system. Give the candidate a task to open a PR on a mock codebase. See how they implement the task and justify their design decisions, and how thoroughly they test. Respond to the PR with comments and see how the candidate responds. Then have the candidate review another person's PR and see what they look for in a review. This potentially has even better logistical feasibility, since it doesn't depend on the candidate and the employee being active at the same time. I think it would have the most direct resemblance to actual work experience, since it emulates the workflow most developers actually use in their day-to-day work. Consistency may be difficult to achieve, but if the evaluation were broken into separately scored segments, it could probably be made reasonably consistent.
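To make this concrete, here's a hypothetical seed file for such a mock codebase (everything here is made up for illustration, not anyone's actual process). The candidate's PR task might be "add per-key limits to this rate limiter and extend the tests," which exercises reading existing code, design judgment, and testing in one shot:

```python
import time

class RateLimiter:
    """Allow at most `limit` calls per `window` seconds (sliding window)."""

    def __init__(self, limit, window):
        self.limit = limit
        self.window = window
        self.calls = []  # timestamps of recently allowed calls

    def allow(self):
        now = time.monotonic()
        # Drop timestamps that have aged out of the window.
        self.calls = [t for t in self.calls if now - t < self.window]
        if len(self.calls) < self.limit:
            self.calls.append(now)
            return True
        return False
```

A file like this is small enough to read in minutes but still has real design surface (data structure choice, clock source, thread safety) for PR comments to probe in both directions.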
In my hiring process, we use a number of filters to gather the data we're looking for to make a decision. That requires a bunch of different steps. By the time we're done, we've spent at least 8-10 hours talking with this person.
From a technical perspective, we do the following:
- A short screen to go over the resume and ensure we've got a rough fit to the right role.
- A simple, consistent coding exercise using CoderPad. (Surprisingly, some fail.)
- A series of consistent open-ended questions we ask everyone about their tech background, such as, what was one of the most difficult bugs you ever fixed?
- A set of consistent design/architecture problems: "given this design, what problems do you see? How would you fix them?"
- Another consistent, more involved coding exercise with an existing code base that is VERY much related to the work they'll be doing.
- A Q&A session on a wide range of technical topics. The goal is NOT for someone to know everything; it's to get a bit of a map of their strengths and weaknesses. We found we were making assumptions about what someone knows based on our background and their resume. We try to make this fun.
- Another set of behavioral and situational questions with a shared scoring rubric on values such as teamwork, collaboration, communication and leadership.
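For flavor, the "simple consistent coding exercise" step could be something at roughly this level (a made-up example, not the parent's actual question): trivial to state, quick to score the same way for every candidate, and it still filters.

```python
def dedupe_preserve_order(items):
    """Return items with duplicates removed, keeping first occurrences."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

print(dedupe_preserve_order([3, 1, 3, 2, 1]))  # prints [3, 1, 2]
```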
And then we have to take all of that data and look at it holistically and across a wide range of candidates. And even then we'll make mistakes, but we keep trying to optimize it, reduce bias, and make it better for candidates and us alike.
One last note: I like to finish my first interview with the question:
"Is there anything about yourself that you really want me to know that we haven’t discussed?"
Because I know I've only had ~40 minutes to get to know this person. I have my agenda of what I want to know, but I could easily miss a lot. So I want to give them a chance to represent themselves in the broadest way possible.
Yikes, I don't know about that. If somebody asked me if I could program in, say, Perl, I'd say I knew what it was but that was about it. That's like somebody asking if I can speak Chinese: I know what it sounds like, but no, I can't.
But on a serious note: if an interviewer asks you outright "Have you programmed in [x], and what's your opinion on it?", try to be positive.
I used to be very honest about my opinion of languages, but in my experience, being too honest (on the negative side) about a language you're asked about has few upsides.
For whatever reason, some devs are very personal about the tools and languages they use, and will even get hurt or agitated by criticism.
So whenever I get asked about that now, I just try to come up with the positive things I enjoy about [x] language.
Does your interview process perform better than a coin flip (a 50% chance of making the correct decision)? If the answer is yes, you have a useful process.
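One nit: the meaningful baseline usually isn't 50%, it's the base rate of good candidates in your pool, i.e. what hiring at random would get you. A back-of-envelope comparison with made-up numbers:

```python
# Made-up illustrative numbers: 30% of applicants would succeed on the job
# (the base rate, i.e. what random hiring gets you), while people the
# process actually hires succeed 60% of the time.
base_rate = 0.30
hired_success_rate = 0.60

lift = hired_success_rate / base_rate
print(f"Process lift over random hiring: {lift:.1f}x")  # prints 2.0x
```

By this measure a process can be well under 50% accurate in absolute terms and still be very useful, as long as it clearly beats the base rate.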