Do they scale linearly? Did Facebook need ten times the coding staff when it went from 10,000,000 to 100,000,000 users?
I mean look, it's not a matter of dispute that tech companies employ far fewer people relative to their market cap than older industrial companies once did. GM employs almost 200,000 people in the United States alone, along with a vast satellite of jobs in parts manufacturing and service. Facebook employs fewer than 60,000 people worldwide. The scales are just different. And that matters because the immense policy pressure to push everyone into college is a reaction to the decline of industry and manufacturing as bases of mass employment.
Fair enough
From my experience, what is far more common is 10 developers, plus a comparably sized group of database/network/unix/cloud admins, writing an app designed for 100m people and then sitting back and hoping that marketing does its job.
These are obviously not average engineers, but there are actual real world examples of what Freddie said. Instagram had about 10 engineers when they sold to Facebook and I'm sure they had close to 100m users at the time. Similarly, WhatsApp had ~50 engineers and 1b users when they were acquired.
There is one counterexample I can think of here: WhatsApp before the FB acquisition.
From my experience, what happens in the real world is that when you scale from 100 to 1,000,000 people, you hire a couple of DBAs, network admins, etc.
Oracle, for example, still has a development staff of thousands. Therefore there is zero reason for anybody else to hire thousands of developers to build their own database. Instead you buy their product and hire a much smaller support staff of administrators.
As somebody who works with very large parallel databases for a living I can tell you that you are incorrect. Oracle has seen steady revenue growth for decades and is in no danger of dying out. PostgreSQL is probably the nicest open source platform but in terms of features there is just no comparing it to Oracle. That is why Oracle made more in 2022 than at any other time in its history.
As for traditional RDBMSes, one of the systems I work on now scoops up a few billion rows (gigs of data), analyzes them, and returns the results in about 2-3 seconds. Of course, the tradeoff is that the hardware alone is very, very expensive and the software even more so. Performance isn't the issue; cost is.
"I've never seen (say) a startup choose an Oracle backend."
Because Oracle is expensive. But again, you get what you pay for. If you need Oracle you know it because open source alternatives like PostgreSQL shit the bed.
Nobody wants to pay for Oracle. More subtly, nobody likes Oracle. But people do end up paying for Oracle, simply because they have no choice.
"...any reasonable price point."
Again, the issue is cost, not capacity. A few years back Oracle liked to boast in their marketing material that you could run all of Facebook off of two of their cluster appliances.
I think you're super overstating this. Yes, scaling an app from 1000 to 100m users is a lot of work but:
* It's way less work than it was 10 years ago.
* It's way, way, *way* less work than it is to scale a physical product from 1000 to 100m users.
In my experience, load scaling requires a number of engineers that grows with the log of the number of users.
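To put rough numbers on that intuition, here's a minimal Python sketch of log-scaled headcount. The base team size and the per-order-of-magnitude increment are invented parameters for illustration, not industry data:

```python
# Toy model of the claim above: headcount grows with the log of users.
# base_team and per_decade are made-up illustrative parameters.
import math

def engineers_needed(users: int, base_team: int = 5, per_decade: int = 3) -> int:
    """Base team plus a few engineers per order of magnitude beyond 1,000 users."""
    decades_beyond_1k = max(0.0, math.log10(users) - 3)
    return base_team + round(per_decade * decades_beyond_1k)

for users in (1_000, 1_000_000, 100_000_000):
    print(f"{users:>11,} users -> ~{engineers_needed(users)} engineers")
# 1,000 -> ~5; 1,000,000 -> ~14; 100,000,000 -> ~20
```

Under these (arbitrary) parameters, going from a thousand users to a hundred million doesn't even quadruple the team.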
Anyway, to separate this out - there really is a fundamental difference between businesses where you do something and sell it once versus doing something and selling it over and over. Surgeons, tax accountants, lawyers, and hot dog vendors do something once and sell it once. Authors, movie studios, and Substack authors do something once and sell it over and over. In this respect, software development is much more like writing a book than it is like selling a hot dog or performing a surgery.
Plus scaling is only a concern for a specific type of application where concurrency and latency are issues. There are any number of examples of small teams (2-3 developers) who write a game that takes off and sees tens of millions of installs.
Agreed - I think this argument that "scaling is actually hard" forgets that web application development is only one type of programming among many.
This is definitely something you see when you're inside the industry. There are people on my team who easily do the work of 5-6 others because they're just that well suited to coding. The performance differences that can exist even within a single team are shocking.
From my outside observation of other professions, I would say there is a much tighter aptitude bell curve outside software.
Agreed and I wonder why it is so. Perhaps in writing there are writers who are orders of magnitude more productive and effective than other writers (e.g. Freddie). Geniuses in the hard sciences. Songwriters. These are (mostly) solitary activities?
Programmers are born, not made. I've known plenty of people with advanced degrees in CS who couldn't code their way out of a paper bag.
This!
Some people have graduate degrees in CS when their bachelor's was in some completely unrelated field. I once knew a woman who had a B.A. in, I think, English, and then somehow got into a CS graduate program. Her master's thesis was on cellular automata, and she freely admitted that although she knew a lot about cellular automata, she didn't know a lot about coding. She worked as a tech writer.
To me, it seems weird to get a graduate degree in a field completely unrelated to your bachelor's, but then again, I have a brother who got a bachelor's in physics and then went on to get a master's and Ph.D. in political science. He's now a professor of political science at one of the top universities in the US.
For what it's worth, I don't think your anecdote says anything negative about your English BA acquaintance's ability to code. If she knows a lot about cellular automata, she already has the central idea of a complex system in which rigid, highly predictable rules produce very unpredictable behavior at the macro level. Understanding that unpredictability is one of the major barriers to coding for the general population - you have to teach people that their intent is irrelevant once the computer starts reading the bits.
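The point is easy to demonstrate in a few lines of Python. This is a minimal sketch of Rule 30, an elementary cellular automaton: each cell is updated by a fixed eight-entry lookup table, yet a single live cell erupts into famously chaotic structure. (Grid width and step count here are arbitrary choices.)

```python
# Rule 30: a rigid, fully deterministic local rule with chaotic global output.
RULE = 30  # Wolfram's rule number; its bits are the lookup table

def step(cells: list[int]) -> list[int]:
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        pattern = (left << 2) | (center << 1) | right  # neighborhood as 0..7
        out.append((RULE >> pattern) & 1)  # look up the new cell state
    return out

width = 63
row = [0] * width
row[width // 2] = 1  # start from a single live cell
for _ in range(20):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Nothing about the rule hints at the triangles and noise the printout produces; you have to run it to find out.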
She was an intelligent person and it's possible she could have become a decent coder, but she knew very little of the basics you'd learn in a good undergraduate CS program. As things stood, she wasn't a coder and she knew it; she had earned a master's in CS by learning a lot about a very small corner of the field without having learned the fundamentals.
She needn't know a lot about programming to do research on CA; programming is simply not necessarily relevant to studying such a mathematical concept. CS != programming/coding. I don't expect most of my CS colleagues to be proficient in any language.
Here is an anecdote that may help distinguish what CS/CE people actually do and know from what people think they do. My undergrad embedded programming course (CE) was taught by a professor who specialized in computer architecture. He could design (and literally had designed) the chips his computer was running on, down to the transistor level, and yet he was completely incapable of using his computer (Windows) for all but the most rudimentary tasks (essentially, he knew how to click around to open email or a word processor, but if anything unexpected came up he'd call IT for support).
Early in my career, we hired a guy with a master's degree in CS, and he was shockingly incompetent. It was a real eye-opener for me because until then I had always assumed "more degrees == more intelligent".
`Programmers are born, not made.'
Not in my experience. It took me several semesters to acquire basic competency in programming. I now teach programming and have watched students struggle with coding semester after semester (and not just after taking my course, smart guy =)), but it finally clicks for them after repeated effort.
`plenty of people with advanced degrees in CS who couldn't code their way out of a paper bag.'
I think that you're making the same error here as most of the comments and, indeed, Freddie himself. Computer Science is *not* primarily about programming/coding. Most CS programs are an absolutely terrible place to learn how to code, given that they focus on the theoretical (hence the term science) use of computers. If you want coders, then look to Computer and/or Software Engineering.
"Born not made" always seems like a false dichotomy to me, at least if taken literally. Nobody is born with professional skills, but there does seem to be some innate potential that some people have and others don't for any given type of skill, and I think that's what people really mean when they say "born not made".
Eh, I think a lot of programs just don't do the things that would turn their students into good programmers. I'm not denying the existence of talent, but I'm certain that most university CS programs aren't even remotely close to optimal.
One of many big motivations behind my current path was learning that something like 70% of CPAs are going to retire in the next decade. Accountants, while not paid poorly, are consistently paid less than MBAs and finance people, so those two disciplines siphoned off many students, especially MBAs, because the academic knowledge is much less rigorous. It also helped that I was already working as an accountant and just wanted to remove the ceiling on advancement.
Anyhow, liberal "learn to code" rhetoric is a complete crock for the very simple reason that you can't make income distribution more equal without actually doing so. If we live in a highly stratified society, it won't make a lick of difference if you hand out PhDs in CompSci to every man, woman, and child. Someone will still have to be poor. This was never a solution, and I'm sure you know it.
Sure, but there is no solution to that problem. Someone is always going to be poor and someone else is always going to be rich, no matter how you organize society.
This is just flat out wrong. There will always be differences in wages, sure, but you can vastly reduce them via taxation. We currently do so, and did to a far greater extent in the past. Higher marginal tax rates and other changes to the tax code that specifically target high-net-worth individuals are how you fix it. This isn't fantasy either; the only thing preventing it is a combo of regulatory capture and lack of political will.
No, it's not wrong. You can reduce the differential between the richest and poorest, but there will still be a difference between them, and the poorest will still be considered to be poor. Generally speaking, being "poor" just means your standard of living is worse than most people's, so as long as there is a range of economic outcomes, there will be poor people.
Depends on what definition of poverty you're using. Some are relative to the rest of the population's income, while others look more at material deprivation/your ability to pay for basic needs.
Sure, but the concept of "basic needs" varies over time. Electricity and indoor plumbing were luxury items for the rich a century ago. If you make a list of all the things you consider "basic needs" today, at some point in the future they'll be almost universal, but people will be complaining about all the people who are "living in poverty" because they don't have something that doesn't even exist today, or that we consider a luxury for the ultra-rich. So there will still be poverty because the bar keeps moving.
Electricity and indoor plumbing are not absolute basic needs in the way that warmth and water are, sure. But it's pretty impossible to legally meet those basic needs in a country like Canada without those prerequisites, much less participate in society.
You could live a subsistence lifestyle, but you need the necessary savings to purchase enough property to build your own well, shit in an outhouse, warm yourself with a wood-burning furnace, and live off the land via hunting, fishing, and/or agriculture (which is pretty fucking hard, especially with wildlife dwindling due to deforestation and the prime agricultural land already owned by existing farmers or developed over). This is partially why remote Indigenous communities have such a hard time, even though their ancestors had previously lived off the land.
It's also hard to do this at scale without disease, mass deforestation, and further accelerating climate change, especially with the world population being what it is. If you want to live without electricity (or natural gas) and indoor running water in a city, you're basically left huddling around a hearth during a Canadian winter (which is not something most homes are built with anymore) and shitting in a chamberpot and dumping it...somewhere? A hole in your backyard? And harvesting your water from a rain barrel or a lake of dubious water quality. And either cooking your food over said hearth or living on a raw-food-only diet. And to even have that shelter with a fireplace and backyard to live in, and to purchase the food, you need a job. Good luck applying for and keeping a job if you don't have access to a shower, modern transportation to get there, and some means of communication.
Not to mention, a country like Canada or the US absolutely has the means and resources to ensure that every person living there has electricity and clean running water.
It's true that there's an aptitude moat that is very pronounced in software development. However, I still like CS as a major: even if you fall short, it's still just an undergraduate degree with undergraduate debt and time commitment; there is a big range of roles hiring for it, demanding different skill levels; and last I checked, the number of job openings still outpaced the number of graduates (that may be outdated now). Compare this to the "go to law school" instinct, which thankfully seems to be subsiding.
I think there is some method to the madness of policymakers hyping STEM degrees and CS in particular. These degrees offer the biggest productivity increase for students and in rare cases lead to innovations which grow the pie. I wouldn't call it a virtuous cycle as there are economic and aptitude limits on how much CS majors can grow the economy, but it's still a much better bet than a more traditional degree which largely prepares students to occupy a slot in an established, static field and only grow the economy via consumption.
This is what deterred me from doing it when other job prospects weren't panning out.
If the lowest wage for replacement-level programmers is, say, $10 per hour, then tens of millions of Indians will learn to code because that's very good money. Coding is inherently remote work in most cases, so why would replacement-level programmers in other countries be paid more than that?
Learn to nurse, I guess?
People have been talking about offshoring for decades. Billions upon billions of dollars have been spent in building campuses in places like Hyderabad and Shanghai and employing millions of people in India and China. The result?
Soaring salaries in the US and desperate companies expanding their search for workers beyond China and India to places like Ireland and Hungary. Demand is insane because tech is expanding faster than the economy at large.
I’ve worked in the field for 24 years. In that time, salaries have increased by at least 2-3x at entry-level, and by much more at elite levels - 5x and more. And hiring has never been so tight.
I think the premise makes sense: even in our capitalist society we often don't apply a basic supply-and-demand analysis to what can basically be boiled down to an economic issue. (Conversely, sometimes we over-apply economic considerations: I'd rather be appreciably poorer and living in a cohesive nation than keep my current wealth living in the globalist economic zone we call a country.)
On the other hand, the average IQ of a computer science major is 128. We're already talking about the end of the tail, here. And even in the set of CS graduates, how many are actually suited to a career as a programmer? (See https://letterstoanewdeveloper.com/2019/08/23/the-surprising-number-of-programmers-who-cant-program/ ) I am a professional programmer and I used to tell my friends who were lacking direction or a good career, "learn to code!" I found out through experience that basically none of my female friends were interested, and while some of my male friends were willing to give it a go, the vast majority failed to get the hang of it--some even after going through bootcamps or university programs.
Well this is the paradox, right - for the right individual, learning to code can be a great idea; as a mass fix for our employment prospects, it's lunacy.
Everyone, including those pushing "just learn to code!" already knows this.
It's just a flip excuse to justify non-action, the 21st century equivalent of "If the peasants have no bread, then let them eat cake!"
People will struggle through years of high school math, complain about it the entire time, never use it again, and then go on to say everyone should learn to code.
Coding is really hard. Most people won't be able to do it at the level of proficiency required for a career. Getting all students to learn it just sets them up for more failure.
The kids who have aptitude and interest in it will come across it on their own and be just fine. There is an insane amount of free resources out there already. And it's all much higher quality than a school will provide.
One small quibble - you can’t really run an app with 100,000,000 users with the same dev team you would use for 1,000 users. There’s a lot more involved than just paying for more servers. You have to use different technologies, and have people with special knowledge that doesn’t come into play for a 1,000-user app.
But to your main point, I’m saying this as an adaptable person who is ten years into a successful career as a developer, and my college degree is in... “Jazz Studies” 😆
Hey the jazz studies just proves you're adaptable. Life is jazz, man.
Dig
one of the best engineers I ever worked with had a degree from Berkeley in Peace Studies
Ha, amazing! I wouldn't trade my impractical arts-focused education, either. It gave me some wider perspectives that I think many people who studied strictly STEM are missing.
programmers need a lot of reps thinking about what humans are going to do with their code!
Same. Did a stint in VC. every day I used the same skills as in grad school for comp lit: study, analyze, synthesize, present/persuade.
A human I know suggested that I learn to code. It takes a couple days, a week tops, he said.
"No, it takes *you* a couple days. Me, um...." This human is a high school dropout and certified genius, now worth several decamillions running a software company.
Came here to say the same thing as a software engineer of six years… with a degree in anthropology 😂
Software engineer here with a primary degree in English. Lots of people come to Computer Science in their second life in college. I work with other programmers who used to be airplane mechanics, chefs and biologists.
Former art major, current software engineer. I love how diverse the academic backgrounds are in this field!
…I thought you were a wealthy wife of a disgraced businessman? (Gene Parmesan, how ya doing)
That was before I went to art school! It's never too late to chase your dreams
god, you are so inspiring
Still less is it too late to watch them recede over the horizon
You can usually build/maintain an app for 100,000,000 users with maybe only 10 to 100x the number of devs as the 1,000-user app, depending on the domain. Instagram was bought by FB for billions when it had a staff of 20 or so. Obviously, if you write tax software, it won't scale nearly as well, since there are so many different tax circumstances and jurisdictions, plus the laws change all the time.
Makes you wonder if the "learn to code" and "STEM shortage" mantras were discreetly juiced by larger firms as part of a longer game to drive down the cost of labor.
Of course they are! That’s what all the support for classroom STEM and Scratch coding for kids is all about. It’s not hidden, and it’s not particularly sinister.
Today, animating cartoons in Scratch. Tomorrow, optimizing the control loop in a missile guidance system.
Sinisterness is a function of time. 😉
(but on the real, no, it is not at all sinister)
It's not exactly a secret. On the more K-12 side, the push for Next Generation Science Standards (NGSS) and more widespread engineering education are well funded by corporate money: https://www.nextgenscience.org/sponsors lists the Carnegie Corporation, GE, Intel, Cisco, and DuPont. Surprised Gates/Microsoft isn't in there, but they're all over the place in education (Common Core), so give them a pass on this one. Tons of "learn to code" things are sponsored by big names like Google, Oracle, Accenture, AT&T, Microsoft, etc., especially the "Women in STEM" ones.
The problem with the "this is a corporate social engineering ploy to drive down the cost of labor" isn't that it's wrong or tinfoil-hatty. It's that driving down the cost of STEM labor is unironically good for society, because we'll have better stuff at lower prices. When the market gets so flooded with CS grads that they have to settle for 50k salaries making copy-paste websites for every restaurant in their city, then every restaurant in my city will have a working digital portal.
These are also the same firms that have had difficulty hiring enough qualified programmers to fill their job openings for many years and that saw an alarming drop in the number of CS graduates in the late 2000s and early 2010s. Trying to correct a long-term labor shortage is not particularly sinister, though I'm sure they will be more than happy to stop paying the extra labor costs from that shortage.
Good point! We haven't talked about the precipitous drop in CS graduates after the dotcom crash. If you look at the first chart in this article about women in STEM, for example, you can see that the total number of CS degrees awarded peaked in 2003-04 (graduates who would have started just before the crash) and didn't hit that same level again until 2014-15: https://www.wm.edu/news/stories/2018/disrupting-the-brogrammer-culture.php. By that point smartphones and social media had become ubiquitous, e-commerce had destroyed brick-and-mortar retail, and streaming had taken over music and TV. This is why coding bootcamps really took off in 2013-2015; it was the peak of the imbalance between supply and demand for software engineers. At that time it was insanely easy to get a foot in the door as a self-taught dev or a bootcamp grad. That has definitely changed and the job market for entry level engineers reflects it, but the overall demand for people who can code has continued to increase. More companies need more software than ever before, outsourcing largely failed, and AI isn't ready to take over software engineering jobs yet.
I think this is one of those things where the econ 101 story is straightforward (when Q goes up, P goes down) but the econ 304 story is more muddled. A surge in programmer supply could also raise demand for certain types of code.
People who work in tech but who have a limited upside on their coding abilities have more room to specialize and find a niche in the ecosystem where there is a huge rush to computer science. Your typical team of data scientists is now being supported by a team of data engineers who probably weren't in that sought-after group of 50 Purdue students. But because of the growth of the field, there is now a niche that allows two different groups of non-top-tier CS people to add marginal revenue.
This surge in supply is probably very bad for those at the very margin of the field. It is definitely good for those who are code-competent but have a marketable complementary skill, and very good for the mid-tier CS people who can sell tools to that second group. (I don't think that's an insignificant chunk of jobs for computer science humans, given how much of software is about keeping the damn site up and deploying new bits to the machines.)
Pharmacists are also notably capped at the total demand of the pharmaceutical industry. Coders are not. Pharmacy is a Malthusian profession, in a sense.
More concisely:
Yes, if the supply of coders went to infinity, then the wages of coders would drop to zero.
But along the way, because certain types of coding demand a specialized market that mostly consists of other people who code, surges in supply will sometimes be wage-neutral to positive for incumbent coders.
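A toy model makes the distinction concrete. In the sketch below, all coefficients are invented for illustration: a supply surge alone drops the market-clearing wage (the econ 101 story), but if each new coder also induces demand for more code (tooling, maintenance, platforms), the wage can come out roughly neutral (the econ 304 story):

```python
# Linear supply/demand toy: demand Qd = D0 - Dw*w, supply Qs = S0 + Sw*w.
# All numbers are made up for illustration.

def clearing_wage(d0: float, dw: float, s0: float, sw: float) -> float:
    # Solve d0 - dw*w = s0 + sw*w for the wage w that clears the market.
    return (d0 - s0) / (dw + sw)

base    = clearing_wage(1000, 5, 100, 4)  # baseline market
surge   = clearing_wage(1000, 5, 400, 4)  # supply surge, demand held fixed
induced = clearing_wage(1280, 5, 400, 4)  # surge plus induced demand

print(f"baseline wage:       {base:.0f}")     # 100
print(f"after supply surge:  {surge:.0f}")    # ~67 -- econ 101
print(f"with induced demand: {induced:.0f}")  # ~98 -- roughly wage neutral
```

Whether the demand shift is actually that large in software is the empirical question the thread is arguing about.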
Yeah. Isn’t the total number of Twitter employees, like, under 4,000 or something like that? It’s not like there’s a fixed employee-to-user ratio when it comes to social media & apps. Haha
By coincidence the number of Twitter users who are not bots is also 4000 so there's a one to one staff to user ratio.
😂😂
The country is in need of law enforcement officers, teachers, and nurses. One of those pays pretty well. None of them has cultural cachet.
It's also in need of sanitation workers, who get paid more than most teachers.
They should. I'm sure there is zero personal fulfillment in being a sanitation worker.
You do get to be outside and hang on to a truck, which seems fun. But it does smell like garbage, and you're breathing in the truck's diesel exhaust.
It's also dangerous. Those guys get hurt a lot.
Yeah, I'm sure. In the comment you were responding to, I was going to say "and they get to do physical work," but I assume that's very much a double-edged sword. I was a landscaper in my 20s during the financial crisis, and it remains my favorite job ever. But the guys who were doing it in their 50s, who had done manual labor all their lives, had back issues, knee issues, etc.
I had a 60-year-old plumber at my house recently, and he was telling me his litany of injuries (his movement showed it) and how he dreamed of retirement, but it was still a ways off. These are the things the "learn a trade" people don't talk about when they insist college is for suckers.
I agree they should but I'm not so sure about that - my dad was a janitor for years and loved the freedom of it. Not necessarily true if you're like, driving the trash truck though.
I'm a DevOps consultant with 20+ years of programming experience under my belt, I have a stable gig, my salary puts me in the top 5% - and I did it all with a bachelor's in English Lit. Fortunately for me, I graduated in the late '90s, when the internet and IT were just starting to take off and anyone who could spell "Java" was given a job. There was no skills assessment at my first interview because the people interviewing me had no idea how to assess my skills. It was entry-level, but I self-taught for years (I still am!) and slowly and relentlessly climbed up the ladder. But it's very different now for young grads. Just saying you need to learn to code to get a job in this field is like saying you need to learn the alphabet to be a professional author. The field also now requires communication skills (you often need to work across time zones with people in different cultures), solutions architecture (as more and more stuff goes to the cloud and is automated, making the widget isn't as important as designing the factory that makes the widgets), and some of the adaptability you mentioned (basically, you'll need to accept that you'll be learning new tech, languages, and design patterns for the rest of your career). I don't envy any of these kids coming up today, tbh.
I’m a SWE and my wife is a pharmacist, so I feel qualified to comment on the job markets in both fields. I don’t think they are as comparable as you are making them out to be.
In pharmacy, a prescription filled is one fewer prescription to be filled. More pharmacists do not create the opportunity for more pharmacy activity. In software, more code creates the opportunity for more code. If I make something new (or even keep something old updated) it means that others can use it in novel ways, extend it, etc. It’s inherently generative. Additionally, something like 80% of software costs are maintenance. Software, if it’s being used, has to be updated all the time. The more of it that’s written, the more maintenance work there is.
In pharmacy, you graduate about as good as you’ll ever be, and your pay reflects this. You graduate basically capped out on your hourly pay. In software, an average-ish engineer can expect to double their pay in 5 years. Honestly, they are typically a much better bargain at 2x pay after 5 years, because the skill ceiling in the field is very high and people continue to improve for a long time.
Also the skills taught in CS are not the same ones used in a typical SWE job, so new grads have to learn a lot on the job (pharmacy is less like this). This means companies hesitate to hire new grads, lose money training them, then have them jump ship for a 20-30% pay raise after a year or two.
So while getting hired as a new grad is hard, I expect the market for SWE to be quite good and continue to improve over time, even if there’s occasional down periods in the technology sector generally.
I’m still telling people they should try learning to code.
Every line of code you write today is a line of code someone else is going to have to maintain tomorrow, or five years from now, or ten years from now, etc etc. Software engineering certainly isn't immune to supply and demand but I think Freddie just doesn't realize that the demand actually is increasing in tandem with supply (or even outpacing it). You're right about new grads though - that job market is brutal and has been for a while, but once you get a foot in the door you're pretty much set for life, assuming you have some baseline level of competence that the average state school CS grad certainly has. You may not be working at Google or Apple but there are software jobs at practically every company these days, which is another difference between the job market for software engineers and pharmacists. Every institution that employs pharmacists probably also employs software engineers but the reverse is not true.
You nailed it