At first glance, it is easy to believe that programming is a profession in rude health, and one with an incredibly bright future. Increased automation, the mind-bending world of machine learning, and the ever more intuitive ways in which software impacts our lives all suggest that programming is the career to be in, and one of the few careers one can safely guarantee will still exist in 50 years, irrespective of automation or the many other issues that threaten the future workforce.
Many thousands of people have heeded the call. An entire industry has been rapidly built around getting budding young developers “job ready” in 12 weeks. The idea, in my experience as a student of an early self-driven remote course, is to festoon fertile young minds with just enough Rails/JS knowledge to get them through a technical interview, and that’s about it. These businesses thrive on selling dreams of working for Google and Facebook, and usually profit handsomely both at enrolment and at graduation (when placing their students in jobs). But this is not yet another blog post criticising the commoditisation of young programmers, as that topic has already been explored at length.
The problem comes when these budding young developers hit the jobs board. The halcyon days when a developer could simply get stuck in with a particular language, whether on the front end or the back end, are gone. The definition of a full stack developer is somewhat vague, and the requirements for the role depend entirely on who you talk to.
A better place to start may well be to ask: what makes a good professional developer, full stop? The short answer is clear: a professional developer produces good quality code on a regular basis. How a developer achieves this is a much more complex question. It’s not enough to be a savant about a given language or framework: such knowledge does not help with strategic decisions, and technology moves at such a pace that it may soon be useless in and of itself (who’s hiring Flash developers?).
Knowing the plethora of design and architecture buzzwords is all well and good in a theoretical sense, but it does little to help with a concrete implementation. Understanding design patterns is frequently cited as good advice for budding developers, but that again is not sufficient: patterns address particular challenges and are frequently misused, and the framework you use often makes virtually all of these decisions for you anyway. Instead, a good professional developer must have an understanding of all of these areas (in their mother tongue), alongside many others.
The code which appears in the IDE is simply the culmination of this work and of the consideration of dozens of technical details, which are often interconnected in tenuous ways.
And then you add to this the full stack, beyond one’s mother tongue:
The phrase “a jack of all trades is a master of none” can ring true here. Whilst nobody would be expected to know every item on this list, understanding one from each row would certainly be a prerequisite for being considered a “senior” full stack developer. This is the stack as of today, and every year additional prerequisites are added. There is also a clear opportunity cost when you deviate from your core competency to learn something new. Whilst some of these skills are trivial to learn, fully appreciating the idiosyncrasies of any major programming language or framework can take years.
So where did the idea of a full stack developer come from? Facebook, of all places, seems to have provided its genesis, or more specifically a Facebook engineer named Carlos Bueno. At the time his piece was written, Facebook only employed full stack developers, which makes a lot more sense when viewed in the context of its time. It had a relatively simple PHP backend and did not have the massive technical demands it has now. Early iterations of Facebook certainly did not require 2-3 years of professional front end design skills.
Personally I think the idea of the full stack developer stems from the age-old idea of the 10x developer, who (whether or not they exist, again a topic for another blog post) has come to represent the Ark of the Covenant for startups and smaller businesses that cannot afford to hire specialised developers for every aspect of delivering a web application. The two terms seem to be used as synonyms for each other, but I think the idea hidden beneath all of the advertising is that companies want to hire super effective engineers.
I would argue that hiring engineers into these roles is significantly counterproductive to developing them. Excellent full stack developers do exist (I work with several), but very few of them started out as such. The conventional wisdom is that a good programmer is a good programmer irrespective of language, but as the programming world splinters into ever more complex combinations of languages, frameworks and even programming paradigms (functional, anyone?), it is perhaps pertinent to take a moment to consider the best way of acquiring good problem solving skills. Whilst critical thinking and problem solving skills do develop along the way, are we really helped by having to learn the intricacies of Chef at the same time?
What makes this even worse is that the companies seeking full stack developers most ardently are often startups, where you can add the responsibilities of a project manager to the list of requirements. But what is happening in programming is symptomatic of a wider cultural shift within the workforce. Employers want full stack employees, because why hire dozens of people if you can get one person to do all of their jobs to a higher standard?
If a brilliant full stack developer exists, as defined in Bueno’s piece, are they achieving maximum utility by writing code at all? Someone of that talent, with that level of expertise across the stack, should sit in a CTO role rather than making rudimentary code changes in the trenches. For the avoidance of doubt, I am not advocating siloed programming where everyone is ignorant of the rest of the stack and of what the rest of the team is working on.
The path to being successful at anything is knowing what you know, knowing it well, and, more importantly, knowing what you don’t know and where to improve. Being thrown into the deep end of development by having to learn 8 (possibly more) disciplines simultaneously is not the way to do this. The underlying truth is that there are not enough unicorns in the forest to fill all of these roles to the level at which they are advertised, which leaves a large number of job “vacancies” permanently unfilled, which in turn pushes more people into the cycle of thinking there are millions of tech jobs available.
Instead we should focus on allowing our junior developers to grow and to develop their core competencies before branching out into the various layers of the stack. A key part of a developer’s growth is confidence, generated via a series of “wins”, but that is difficult to achieve when you are seeking wins in sometimes drastically different areas. As a case in point, Chef is written almost entirely in Ruby, yet navigating its source code as a novice programmer is a project in and of itself. The most important skill in programming is learning how to learn, and that can only be improved by learning one thing and learning it well. The best programmers I know have a rapacious thirst for knowledge, built on a solid bedrock of understanding of the fundamentals they use. It’s difficult to have that when you are standing on 8 separate bedrocks, each made of sand.