keiferski 5 hours ago

I hope I don’t come across as too harsh here, but I think a lot of developers are finally being forced to understand that their high salaries and above-average job security were fundamentally predicated on business models that largely didn’t have a ton of competition. In that kind of environment, there is space for a focus on the actual fundamentals, the things-in-themselves, the theory behind the action. Most of this stuff is good and it was a beneficial situation to have that buffer space to allow it.

But ultimately business reality has changed, largely because achieving business goals is dramatically easier with AI tools. This undercuts a lot of the focus on building solid fundamentals, and in a lot of cases that’ll come back to bite the business. But in many scenarios it won’t, and the industry will rumble on.

Those of us working in marketing or journalism or education were forced to accept this new reality decades ago, largely because of inventions by software developers. Now devs are just late to their own party.

  • hnthrow0287345 2 hours ago

    >I hope I don’t come across as too harsh here, but I think a lot of developers are finally being forced to understand that their high salaries and above-average job security were fundamentally predicated on business models that largely didn’t have a ton of competition.

    Would love to see the business and manager types manage software and infrastructure. What's the worst that could happen? Go on, do it. Every time a foot gun goes off it'll be followed by a condescending chuckle.

    I used to see 'passion' as the defining factor of how to stay in the field and do well, and that was advice given to people who wanted to join the industry -- who showed the minimum of interest. Now we're going to have these non-technical people who definitely aren't interested and definitely don't have passion for it try to make and manage quality software?

    • pjc50 an hour ago

      There's a lot of value to be extracted in the period between "we fired all the qualified staff" and "oops, we lost all our customers due to unreliability". In physical industries that may happen sooner, or in a more alarming way: you discover the loss of your safety personnel in the form of, say, a refinery explosion. But in software you can just... break stuff, and leak personal data, and deliver a service which is down quite a lot (see GitHub discussions passim, or the endless complaining about Windows 11), and nobody goes away. Partly because software switching costs are so high, partly because the alternatives have the same problems.

      This sort of thing happened to, for example, Maplin.

      The big poster child is sadly Twitter. A lot of people said it would collapse without 90% of the staff, and that hasn't materialized. I suspect they can't deploy huge changes to the backend, but they never did that much anyway.

      (also, those of us not in the US and not in FAANG always wondered how such a steep salary differential could have been maintained forever; more than doctors and lawyers? Comparable to finance bros or the fabled quants? All of those are much more onerous jobs with much harder entrance criteria!)

  • pjmlp 4 hours ago

    In many European countries there aren't high salaries or above-average job security for developers; you are considered an office worker like everyone else. This isn't Silicon Valley over here, especially if you come from the Southern European countries.

    • dandellion 2 hours ago

      I'm in Southern Europe and developer salaries here are definitely above average. Sure, much less exaggerated than in the US, but still above the average salary in the country. Even if you limit the comparison to office workers in the same city, dev is still in the upper half, at least for now.

      • pjmlp 33 minutes ago

        I am Portuguese, and to get an above-average salary you really need to be lucky enough to work for one of the top companies in either Porto or Lisbon.

        If I returned to Portugal today, moving back to my home town, I would probably earn less after taxes than I did in the dotcom days working for Altitude Software.

        Sure, it is above minimum wage, yet plenty of office workers get similar salary levels, provided they have a university background.

        I am also aware that IT salaries in Greece and Italy aren't that great either, compared with other office workers with higher-education backgrounds, and all three of us enjoy our unpaid overtime.

      • grim_io 2 hours ago

        There is a huge difference between above average and multiples of the average.

  • dominicrose 5 hours ago

    I've worked 10+ years as a developer in France, where salaries weren't too high to begin with, but I certainly noticed the added competition, as it became harder to find a job. I stopped "fighting" for a high-paying role, so my experience didn't provide net gains, but it still protected me from inflation. The net "gains" came rather from spending less, by moving from rent to a mortgage and then shrinking the mortgage.

    I'm OK with this now, it is what it is, but these years weren't smooth as there were ups and downs and a down after an up can be stressful if you're not ready for it.

  • trgn an hour ago

    nice first-principles analysis, but little connection to material reality; you're looking back at a mere 1-2 years. it is the outsourcing of dev labor that has killed the domestic market. engineering roles have been moving overseas (latin america, southern europe, india). any american dev now will have international colleagues, something that was rarely the case 15 years ago.

    not to say AI tools don't contribute, they lower the bar of entry to the profession after all, but any C-suite/hiring manager is much more arbitraging labor expense than buying AI subscriptions.

  • agentultra 2 hours ago

    Ah the threat of no work in order to depress wages. A scenario we’ve seen play out time and again. Typical capitalism.

    That doesn’t mean we should accept mediocrity. Businesses might not care. Few businesses have bought a product based on how many lines of code it has or how easy the code is to maintain.

    Even after building software for businesses for nearly 3 decades, it became apparent early on that they don’t care. It has always been a point of contention: the struggle between shipping now, faster, and making sure we ship the right thing and do it well. We had to learn when to give ground and when to pull hard… because in the end there are times when it absolutely matters.

    Just because business can’t recognize when it’s about to shoot itself in the foot doesn’t mean we should let them.

    This has been the excuse of mediocre developers for decades too. It’s how we ended up with sloppy code in production. Terminals that can’t scroll without flickering or handle much data. Apps that have loading screens on supercomputers. Software that sometimes works. Ship fast and break stuff.

  • armchairhacker 5 hours ago

    “The actual fundamentals, the things-in-themselves, the theory behind the action” don’t go away, they change.

    Programmers used to work with punch cards, then assembly, then low-level languages with odd quirks. Today few developers even think about first-party code size, micro-optimizations, register allocation, etc. LLMs are just another abstraction.

    A developer with the ideal AI code writer (which we’re not at yet) must still think about idea, design, scope, etc. like a product owner or manager. And these concepts have theory, sometimes even math (e.g. time complexity).

    EDIT to comment on the article: all abstractions are leaky, but sometimes the leak barely matters. Today we do still need to understand code quality and architecture when working with LLMs, or the software will get bad enough that it affects the company. But maybe not next year. An analogy: stack vs. heap, memory allocations, etc. still matter in high-performance software, which isn’t uncommon, but programmers almost never think about register allocation.

    • lionkor 4 hours ago

      LLMs are not another abstraction. ALL OTHER LAYERS you named are fully deterministic, understood, debuggable, etc.

      You cannot be serious.

      • true_religion 32 minutes ago

        A non-deterministic layer seems like exactly the thing that would need a competent professional to ensure a good outcome, so it doesn't follow that LLM usage will depress wages any more than high-level languages did when they opened up programming to tens of millions of people who could never grok assembly.

      • ai_critic an hour ago

        Counter-point: most developers have neither the knowledge nor the eagerness to actually do that debugging, so it doesn't really matter.

        • lionkor an hour ago

          It DOES matter, because the claim that LLMs are a layer of abstraction implies that they're somehow more than a random word generator. They do a great job at generating words in the right order, and often, given enough time, datacenter resources, money, and training, they can produce code that runs and does things as expected.

          However, there is absolutely nothing stopping an LLM from "deciding" tomorrow that a fix it built a week ago is no longer real, because not only has that fix left its context, but also the bug was not obvious.

          • ai_critic 29 minutes ago

            > However, there is absolutely nothing stopping an LLM from "deciding" tomorrow that a fix it built a week ago is no longer real, because not only has that fix left its context, but also the bug was not obvious.

            Yeah, and we've never had deterministic tools like GCC suddenly fuck up commonly-relied-on undefined behavior between releases. Sure.

            I get what you're saying, but again, to the vast majority of devs, none of that shit matters. Whether that's a good thing or a bad thing is a different discussion.

      • antonvs 2 hours ago

        LLMs are one of the most general abstractions possible.

        LLMs are also quite deterministic if you want them to be - generally, their final token selection is deliberately randomized (the model “temperature”). But the word you’re looking for here is probably not actually determinism, it’s probably something closer to predictability.

        In any case, it’s perfectly possible to ensure that the output of LLMs is fully deterministic, debuggable, understandable, and testable.
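        A minimal sketch of what "temperature" means here (purely illustrative Python, not any particular model's API): sampling draws the next token from a softmax over the logits divided by the temperature, and greedy (temperature-zero) decoding collapses to a plain argmax, which is fully deterministic.

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature, then normalize.

    Lower temperature sharpens the distribution toward the top token;
    higher temperature flattens it.
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def greedy_pick(logits):
    """The temperature -> 0 limit: always pick the highest-logit token."""
    return max(range(len(logits)), key=lambda i: logits[i])

logits = [2.0, 1.0, 0.1]
print(softmax_with_temperature(logits, 1.0))   # probability spread across all tokens
print(softmax_with_temperature(logits, 0.1))   # nearly all mass on token 0
print(greedy_pick(logits))                     # 0, every single time
```

        Run decoding greedily (temperature zero in practice) with a fixed prompt and the output is reproducible; the "creative" variation people associate with LLMs comes from deliberately sampling away from the argmax.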

        > You cannot be serious.

        I don’t think you’re thinking about this clearly.

        • lionkor an hour ago

          With a sufficiently complex prompt and a sufficiently complex codebase, LLMs consistently fail and make mistakes, "forget" parts of the prompt, etc.

          There's no comparison to be made between this and, for example, a compiler. It's an incompetent comparison.

          > I don’t think you’re thinking about this clearly.

          My literal job is dealing with layers of abstraction. I'm thinking pretty clearly when I tell you that, not only are LLMs a super leaky, terrible abstraction, they are also not comparable to any other layers of abstraction. All other layers of abstraction we use are well understood, predictable (as you put it), and DEBUGGABLE.

          When claude deletes a fix it did two weeks ago, while trying to fix some unrelated error, do you never stop and think "this is not quite the same as what GCC does"?

  • keybored 5 hours ago

    This isn’t harsh at all. As I’ve commented before (though this time as well I don’t have the receipts/links), it’s been reported that highly paid programmers in the US also brought in a ton of profit; it was not at all the case that their employers had thin profit margins because the labor was expensive to them. We’re talking one million USD of profit for a 100K USD salary.

    They didn’t even earn anything close to what they were worth. According to Marx’s Labor Theory of Value, anyway.

    However the dice fall now, one of the possible outcomes is that the tech billionaires take that 100K USD for themselves. The very deserving individuals whose job is to sit their arses on automation assets.

    Meanwhile workers from other sectors can gloat about how they are now in the same boat as them. The boat of accepting your ever-meagre reality.

    • alper 5 hours ago

      > that highly paid programmers in the US also brought in a ton of profit

      In Germany for instance I've seen many a company that treated their programmers as a cost center and they actually were (probably a mutually reinforcing self-fulfilling prophecy).

      Too many instances of programmers being deployed in such a way that I couldn't possibly see how they would earn back even the meagre investment that was being made. Fully irrational dev teams doing useless busywork.

      Most German "startups" used to be replaceable with Zapier and Pipedrive. That has probably only gotten worse with the advent of LLMs.

    • titanomachy 5 hours ago

      Or the margins shrink significantly as the space becomes more mature and competitive, and that surplus mostly goes away.

ClawsOnPaws 13 hours ago

I'm in a similar position to the OP, unemployed for about 10 months, with tons and tons of applications sent both remote and local, and yeah not sure where this is gonna go or what I'm supposed to do. Also disabled, my eyes don't work so that automatically removes many, many non-software jobs I'd otherwise do from the equation.

Don't even really have anything else to say other than that, but maybe commenting it somewhere helps someone else realize they're not alone. I don't know how that helps you or me, but that's what I got. Maybe there's still something for us somewhere, but it is very difficult to stay motivated, and I don't have an answer.

  • MakeAJiraTicket 10 hours ago

    I'm not in your situation, but I've hit the bottom of the despair and found the inner "fuck it we ball" within me. I don't know what's an option for you, but I'm learning bartending, stocking shelves, and having irresponsible sex with the young women I work with in retail.

    I enjoy software development and hopefully one day I will return to it, but I am but one tiny kernel of corn in such a mighty ocean of shit, so I might as well ride the waves instead of fighting them. Maybe your calling is scamming Indians or scamming Americans or scamming Indian scammers. You aren't alone, but the attitude you have will never stop mattering. See if you want to go back to school, or start a tutoring program for kids. Motivation is for morons; do something.

    • andyjohnson0 4 hours ago

      It's only 11:37am where I am, but this is the sanest thing I've heard so far today.

    • myst 6 hours ago

      I had to burn out to obtain the insight :-/

    • fontain 8 hours ago

      I agree with your message but maybe don’t have sex with young co-workers.

      • alper 5 hours ago

        I'm guessing everybody in this interaction is an adult.

      • solumunus 8 hours ago

        Why?

        • fontain 6 hours ago

          Only violate one proverb at a time. If you shit where you eat, don't rob the cradle. If you rob the cradle, don't shit where you eat.

      • dragochat 8 hours ago

        ...puritans will be puritans

AdieuToLogic 10 hours ago

From the well-written article:

  I have spent months adjusting my resume, applying for all 
  jobs where my skill set may be of use, building 
  proof-of-concepts using Claude, and doing cold outreach to 
  anyone who may be interested in my potential products or my 
  services. The well has gone dry. 

A major quandary companies are finding themselves in is "resume fraud": being inundated with applicants only to find that 99%+ have used GenAI to produce a bogus work history tuned to satisfy the job posting. It has reached the point where many companies simply give up trying to identify "real" applicants via online submissions.

It is analogous to email spam in the 90's, before anti-spam technology was mature.

  • sayamqazi 3 hours ago

    They wanna filter the candidates who used GenAI and then force the GenAI on the existing employees. Makes total sense.

  • vatsachak 9 hours ago

    Yeah, it's pretty bad. You can't even browse projects on Reddit because a lot of them are just slop.

    • AdieuToLogic 9 hours ago

      Oddly enough, the solution lies in what was previously replaced: staffing firms.

      Staffing companies have recruiters which vet candidates to varying degrees of success. At minimum, they establish the candidate:

      - is a human

      - lives where they claim to live

      - has worked where they claim to have worked

      - has eligibility to work for one or more of their clients

      If nothing else, the above eliminates much of the "99% resume fraud" problem companies are dealing with now.

      • jaggederest 8 hours ago

        I've been thinking this for a while now, but I feel like especially with the rise of crazy salaries in AI research, it's time for software development to have its agency moment. Just like athletes and actors, I think the industry might be better off if there were reputable agents with a portfolio of people they represent, and something the equivalent of a casting director at companies instead of the current "cram leetcode" mode of evaluation.

        • psidium 7 hours ago

          I’ve seen a setup like this in software, for some specific high-demand SAP (ERP) consultancy roles. SAP migrations are per-project in nature (you wouldn’t want to migrate your company’s ERP all the time). The person had such a skill set that they had what is effectively an “agent” who would negotiate their next job assignment. The agent was even baked into the contract with the client as a party; I don’t recall how much of the hourly rate this agent would get, but they were invoicing the company separately.

          At least this is what I recall.

          Meta: this is probably the first time this year where I use the word agent to refer to a human. Feels odd even.

        • girvo 3 hours ago

          Already sort of exists in the high end contracting/consulting software dev business in Australia at least

        • AdieuToLogic 7 hours ago

          > Just like athletes and actors, I think the industry might be better off if there were reputable agents with a portfolio of people they represent ...

          This is what recruiters in quality staffing firms do. Granted, there are many staffing firms which are worthless body-shops. But those are not reputable. :-)

          > ... and something the equivalent of a casting director at companies instead of the current "cram leetcode" mode of evaluation.

          The equivalent has traditionally been hiring managers who work with approved staffing companies, both to ensure those companies provide value as well as to foster an understanding of the people/skills needed by the organizations.

          Wise organizations use multiple staffing firms and perform internal audits in order to minimize complacency/corruption.

      • leoc an hour ago

        All else being equal, the return of high-touch recruiting work is of course a reduction in industrial productivity and a negative contribution to economic growth. But it does generate more jobs! Put that in your predictions of AI’s economic impact and smoke it …

        • vatsachak an hour ago

          It's not a decrease of productivity?

          If someone who contributed to the Linux kernel has a resume on par with a spammer who lied about doing so, how do we know which one is genuine unless there is a verification system?

pjmlp 4 hours ago

I feel the pain, and if I become unemployed now in my 50's, most likely I will do something else outside computing.

Everyone who praises how much more productive they have become always forgets that it means big corp now needs fewer of us.

I work in enterprise consulting, and have watched how the change to managed cloud infrastructure, followed by low-code/no-code tooling, has had an impact on team sizes, meaning fewer devs for the same outcome.

AI driven development is reducing those team sizes even further.

In many European countries, getting a job at a later age is an almost impossible task; the easiest solutions end up being trying for early-retirement status, or going self-employed, which isn't without its own set of complications either.

noashavit 22 minutes ago

I feel for Gen Z. If it’s hard for those with years of experience, judgment, and taste, how can you even get into the game?

arkt8 15 hours ago

More than ever it is time to be stoic. Have things, but live as if having nothing. And, as the author says, it was all predictable too.

By now... I see in my country high prices for laptops with only 4 GB of RAM and Celerons.

Those machines could do wonderful things if, in the 2000s, people hadn't bought the argument that hardware is so cheap we might as well write inefficient code. The same hardware that could play a YouTube video in the 2000s today cannot even open the website. Electron sends hugs...

Now people are mad about AI; until when? Until the oceans dry up like in the movie Oblivion?

And professionals? The generation of specialists will pass... and people will blindly depend on AI soon if the course of things doesn't stop, or at least get corrected.

I think the author could have brighter days in the future (and can still find some in the present, in some hidden niches), as knowledge will always be precious.

The main lesson I take away is to buy less IT and fewer buzzy promises, and to find the place where knowledge and craft walk side by side.

  • 21asdffdsa12 9 hours ago

    But all that advice never worked out. Be stoic. Be supportive. To be... or not to be. Accept that being some eldritch god's lunch is your destiny. Stoic is what we expect the cattle to be as it goes up the ramp. Do not go quietly into the night; rage against the dying of the light.

    • tsunamifury 7 hours ago

      Correct. Stoicism was for two audiences: for those doing the killing, to be indifferent toward it, and for those being killed, to be indifferent about it.

      Marcus Aurelius, the historical figure, was a monster who killed a measurable portion of the humans alive at the time.

      • laszlojamf 4 hours ago

        I mean... so did a lot of other rulers. As far as emperors go, Aurelius wasn't that bad. You have to judge historical people by their peers, not by your own modern standards.

        • king_geedorah 3 hours ago

          Strictly speaking nobody has to do anything.

    • 9dev 8 hours ago

      The great meat grinder doesn't care either way. Stoic or screaming and kicking, you're going in.

      • keybored 5 hours ago

        So sayeth a Head of Engineering.

        • 9dev 4 hours ago

          Aye - I'm marching into it like everyone else, just with a different flavour of pain

    • keybored 6 hours ago

      Stoicism is for the lamb and the wolf alike; mindfulness is for the monk and the samurai killing on behalf of their lord alike.

      We are not so one-dimensional that good mind habits are the one and only thing we do and act on.

      • the_gipsy 6 hours ago

        But that was a stoic comment.

donatj 15 hours ago

I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies as more of a liability than a virtue.

More and more places just want Jira tickets done fast instead of someone that's going to push back or question if this is the best way to build some thing. They want the thing, they don't care if it works well. They don't care if it's efficient. They want it now.

We've been moving to React, replacing an internal framework we've been using for over a decade that has worked wonders for us. The biggest justification for the move is "hiring".

My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".

Everything is giant, overbuilt, and terrible because most people never bothered to learn even a single level up from where they do most of their work. The people that do become unhirable. Everything takes hundreds or thousands of times more cycles and electricity than it should because people can't be bothered to understand what they're doing.

  • lelanthran 7 hours ago

    Well, if more react devs knew how it worked under the hood they might choose something else[1] :-)

    Jokes aside, if you don't need two-way data binding, using react frameworks pulls in a lot of crap that you never need.

    The majority of web apps have no need for React.

    ---------

    [1] I always joke that the reason I am atheist is not because I don't know much about your religion, it's because I know too much about your religion.

    • Izkata an hour ago

      ...React doesn't use two-way data binding.

  • Jean-Papoulos 8 hours ago

    Hardware is cheap; human labor is not. Companies have figured out that the best way to extract money from customers is to give them something that barely works now, rather than something that works great later.

    • TeMPOraL 6 hours ago

      > Hardware is cheap ; human labor is not.

      Especially true when you're paying for neither hardware nor labor.

      Writing inefficient client-side software, whether it's desktop or webshit, makes the customers / users pay for the hardware, and pay with their time.

    • terseus 7 hours ago

      Is it, though? Can we really keep saying that "hardware will always be cheaper than human labour" when RAM prices are soaring, GPUs are becoming prohibitively expensive, and we're looking at a probable chip shortage?

      I think the era of "poor software for fantastic hardware" is coming to an end.

      • mixermachine 7 hours ago

        RAM and GPUs are getting more expensive, but mostly for applications that require a lot of them, like AI. The hardware cost for regular applications has not vastly increased (especially when factoring in inflation). Spending 2x the development time on a problem is often not worth it (or only worth it for large deployments).

        UI development is an even more special case here. The customer buys the machine which runs the code, not the company. So sadly "good enough" is the standard.

        One example for me here is the "switch product option" button on Amazon listings (e.g. switch green to blue color, smaller to larger model). On my phone this sometimes takes >5 seconds to properly load. Horribly optimised.

        • terseus 2 hours ago

          Oh of course, that's the current standard, but I doubt it will be considered acceptable for much longer.

      • antonvs 2 hours ago

        It’s nowhere near its end. Hardware would need to increase in cost by hundreds or even thousands of times to materially change that calculation.

        Just as an example, the cost of one week of engineering time corresponds to tens of thousands of vCPU-hours, which is many years of CPU time.

        As such, it only ever makes business sense to optimize code either when it has bottlenecks that can’t be fixed by throwing hardware at it, or when it’s so inefficient that it can be sped up by several orders of magnitude.
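        A back-of-envelope version of that comparison (the dollar figures below are my own rough assumptions for illustration, not numbers from the comment):

```python
# Back-of-envelope comparison of engineering time vs. compute cost.
# Both prices are rough assumptions for illustration only.
engineer_week_usd = 2_000   # assumed fully-loaded cost of one engineer-week
vcpu_hour_usd = 0.04        # assumed on-demand price of one cloud vCPU-hour

vcpu_hours = engineer_week_usd / vcpu_hour_usd
vcpu_years = vcpu_hours / (24 * 365)

print(f"{vcpu_hours:,.0f} vCPU-hours ~= {vcpu_years:.1f} vCPU-years")
# 50,000 vCPU-hours ~= 5.7 vCPU-years
```

        Even if those assumed prices are off by a factor of a few in either direction, one week of engineering still buys years of CPU time, which is the comparison being made above.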

    • piokoch 5 hours ago

      That's not true if you are in the cloud. Clumsily written software becomes really expensive to run.

  • ankurdhama 5 hours ago

    > They want the thing, they don't care if it works well. They don't care if it's efficient. They want it now.

    That's because they don't know what to build that will be a successful product, so they essentially try to brute-force the question of "what to build" by trying different ideas quickly and seeing which one sticks. And in this quick iteration loop people just throw a bunch of stuff together to make something, and once that something gains traction they keep piling on top of that shaky foundation.

  • rdevilla 15 hours ago

    > I've come to the conclusion in the last couple years that being the guy who understands how the abstraction works under the hood is treated by companies as more of a liability than a virtue.

    This is one of the most alienating things about the modern software engineering industry. Someone who grew up just fucking around with computers since they were 5 is supposedly now on even footing with someone who took a 16 week bootcamp and a Claude subscription and has never seen a terminal before.

    I was at a drum and bass show recently and talked to one of the other people there. It was obvious I didn't really listen to that much drum and bass as I couldn't name anybody except the most popular artists. You see peoples' reactions change slightly when they discover you are not really part of their music scene - you're an outsider, or a tourist, or even a poser. That's not even a problem, that's just the way subcultures are - you've either lived and breathed that way of life, or not.

    What LLMs are doing is they are automating the manufacture of posers and cultural appropriators at scale - you don't really understand the nooks and crannies of this territory, you never actually lived on IRC or in the bash terminal - but you can sure wave around these oversimplified maps of the territory with all the back alleys and laneways missing, and use your pocket book of translated phrases to pose as a native.

    > My general sense is that nobody understands how React works under the hood. The answer I get when I ask questions is generally just "don't worry about it".

    The problem in software is it seems that we are losing the ability to distinguish between appropriators of computer geek culture and those who do "speak" programming languages natively. The bar has fallen so low that I can't even expect people to understand the difference between runtime and compile time. Anybody who brings up such advanced and esoteric (read: high school level computing) topics is viewed with scorn, as if their ability to expose ignorance on foundational topics presents an existential (or career) threat.

    There's been a rise of anti-intellectualism in software from people with non-STEM backgrounds who actually disdain seeking out and possessing such knowledge. It's utterly useless to study - just like math. I find it harder and harder to locate hobbyists, especially here in Toronto, who bother to go below the abstractions not just because they want to, but because they are compelled to understand.

    • linguae 9 hours ago

      Your words resonate with me. Even before LLMs, I’ve been disappointed with the general direction the software industry took in the 2010s. Today’s software industry is not the industry of Licklider, Engelbart, Bob Taylor, Alan Kay, Woz, Stallman, Ritchie, Thompson, Pike, Joy, and many others whom I admire, who helped establish an ethos of computing that fostered a sense of freedom, creativity, and wonder.

      Instead, what we have today is a computing ecosystem dominated by powerful players who care about money and control. Speaking from the standpoint of a Bay Area resident, since roughly 2012, the field has been increasingly taken over by people who are in it for the money. Combine that with Alan Kay’s observation that computer science is a “pop culture” that often lives in the moment and has little regard for the past, and also combine that with the “move fast and break things” attitude that permeates modern software development, and this has created an environment that seems hostile to the types of nerdy pursuits that the industry once encouraged. The working environments of many major software companies and the products they release are a reflection of the values of the companies’ executives, managers, and shareholders.

      While I’m not anti-AI, I see agentic coding as another step in the direction that the software industry was already heading towards, where it can move even faster and break even more things.

      There is still wonder, joy, and freedom in computing, but I feel this is increasingly confined to the hobbyist world and certain niches in research environments.

    • xpct 11 hours ago

      I can confidently say that I know almost no one truly interested in understanding technology, except for strangers online.

    • Nevermark 8 hours ago

      > Anybody who brings up such advanced and esoteric (read: high school level computing) topics is viewed with scorn.

      Design time, code time, compile time, run time. Why all that potentially wasteful upfront work?

      The next step is shipped applications whose help menu is a chat interface that responds to every user question of the form "How do I ..." with a short pause to add a new hack to the existing pile, and then some upbeat instructions.

      In theory this should be nirvana. No more vibe coding! Everyone is a power user. Zero dependencies. But there will be much weeping.

      • rablackburn 7 hours ago

        > In theory this should be nirvana. No more vibe coding! Everyone is a power user. Zero dependencies. But there will be much weeping.

        If I had to sum up the zeitgeist of '90s techno-optimism, it would be this persistent, confident prediction that once people just learned _how_ to use computers, and everyone was a power user, everything would be fine! Despite the mounting evidence that actually, no, like everything else in reality, the distribution of skill is a bell curve, with the median sitting uncomfortably low for those who, to quote OP, "lived on IRC or in the bash terminal".

        Free universal education didn't fix this problem, and LLMs won't fix it either. Humanity's limiting scarcity is no longer the availability or accessibility of knowledge. The liberal ideal that all we must do is empower the individual turns out not to have been the solution to everything forever.

        But hey, being self-aware enough to make productive use of this new technology is probably _some_ kind of edge.

        May as many as possible survive.

    • globalnode 13 hours ago

      Sounds like you're working at the wrong place. Detailed computing knowledge and maths are essential in some industries and, like you said, scorned in others. I couldn't think of anything worse to do with my time than spend all day with MBAs or webdevs (lol, I'm sorry, that's unfair - web development is complex, with all the callbacks and sync issues).

      • Fr0styMatt88 7 hours ago

        Thank you, I was starting to wonder.

        I guess because I’m in game dev maybe, but in all my jobs knowing about the underlying stack has either been necessary knowledge or highly regarded.

        I can’t think of any time in my career where knowing about the internals of the stack was ever frowned upon or where it’s been anything other than an advantage (especially when hunting bugs). I must have been lucky.

    • slopinthebag 9 hours ago

      people will accuse you of "gatekeeping" because you shouldn't need to have any knowledge or skill to do stuff. those things are unimportant, even bad, because anything requiring those is inherently exclusionary. lmao.

  • jongjong 14 hours ago

    This has been obvious to me since I graduated with a BIT majoring in 'Software design.' I literally went to university with software design and software architecture being my core interests.

    When I graduated, I was shocked to learn that no company cared about any of the architectural concepts that I had learned. UML class diagrams, sequence diagrams, ER diagrams, etc. had been on the way out. At one point, as internet companies were scaling up, there was a brief resurgence of interest in sequence diagrams, especially as a communication method when explaining complex bugs or complex message-passing scenarios. But it didn't really last. Nowadays most software is riddled with race conditions and deep exploitable architectural flaws. Cryptocurrencies have been victims of many such attacks. Billions of dollars have been lost to race conditions, and that's just the ones which were discovered. They are notoriously difficult to find post-implementation.

    The programming primitives that we're using today aren't optimized to avoid race conditions or even to encourage good concurrency patterns; quite the opposite: they encourage convenient but disorganized parallelization, and they're optimized to put the focus on type safety, which is a far less pressing issue. A lot of people who were rightly alarmed by gaps in schema validation (which is critical at API boundaries) became overly obsessed with type safety (which is a broader concern). I have built some async primitives for Node.js, and nobody cared! NOBODY! Other developers have had the same experience with most other languages. I think only a few niche languages like Elixir actually treated it as important. But nobody even acknowledged that the problem could be remedied in existing languages. It's so bad that it seems as though some people wanted it to be that way.

    The term 'concurrency safety' doesn't even exist! Some have a vague idea about thread safety - OK, but that's very specific to one particular concurrency primitive. What about the concurrency of asynchronous logic (much more common nowadays)? I have felt thoroughly suppressed in that regard in my career.
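    To make the point about the concurrency safety of asynchronous logic concrete, here is a minimal, hypothetical Node.js sketch (names are invented, not from any library): a classic async check-then-act race, plus a tiny promise-based mutex that serializes the critical section:

```javascript
// Unsafe: a check-then-act race. Both concurrent calls see balance >= 100
// before either deducts, so the account goes negative.
let balance = 100;

async function withdraw(amount) {
  if (balance >= amount) {                     // check
    await new Promise(r => setTimeout(r, 10)); // simulated async I/O
    balance -= amount;                         // act: state may have changed
  }
}

// A minimal async mutex: each task is chained after the previous one.
function makeMutex() {
  let tail = Promise.resolve();
  return fn => (tail = tail.then(fn, fn));
}

let safeBalance = 100;
const lock = makeMutex();

function safeWithdraw(amount) {
  return lock(async () => {
    if (safeBalance >= amount) {
      await new Promise(r => setTimeout(r, 10));
      safeBalance -= amount;
    }
  });
}

async function main() {
  await Promise.all([withdraw(100), withdraw(100)]);
  console.log(balance);      // -100: both withdrawals passed the check
  await Promise.all([safeWithdraw(100), safeWithdraw(100)]);
  console.log(safeBalance);  // 0: the second withdrawal is rejected
}
main();
```

    Note that the unsafe version is perfectly well-typed; no type checker flags it, which is exactly the sense in which type safety addresses a different class of bugs than concurrency safety.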

    The only voice on the subject of architecture that got through to the 'mainstream' was Martin Fowler (one of the authors of the Agile Manifesto). After that, there was Dan Abramov of Redux fame. Some notable opinionated architecture books were published, but none really identified the underlying essential philosophy of good architecture.

    The best, most succinct quote I ever read on the subject was from Alan Kay (inventor of OOP) who said "I'm sorry that I long ago coined the term 'objects' for this topic because it gets many people to focus on the lesser idea. The big idea is messaging."

    I like that quote for many reasons; firstly because it shows wisdom, secondly, it tells you what the issue is, very simply and, thirdly, it hints at the importance of 'focus' in this discipline where we are saturated with thousands of complex overlapping and partially conflicting ideas.

    I think the FP trend was somewhat of a red herring. Same with type safety. Yes, they were useful to some extent - there are some really good ideas in there - but people got so caught up in them that the most fundamental area of improvement was ignored entirely. To me, the core value proposition of FP can be reduced to "pass by value is safer than pass by reference." Consider that in the context of Alan Kay's "The big idea is messaging": is an object reference a message? NO! A live instance is not a message! Precisely. His point supports pass-by-value; furthermore, it encourages succinct, minimal parameters.

    Good architecture is rooted in two core concepts: 1. loose coupling and 2. high cohesion, and you achieve those by separating logic and structure from messaging. The biggest mistake people make is passing around structure and logic as parameters to other logic. You should avoid moving logic and structure around at runtime; only messages should move between objects, and the simpler the messages, the better. Note that 'avoid' doesn't mean never, but it means you have to be extremely careful when you do violate this principle, and there should be a really good commercial reason to do so. I.e., you should exhaust other reasonable approaches first.
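    As a small, hypothetical JavaScript illustration of the principle (all names invented): the first formatter reaches into a live instance and can mutate shared state, while the second accepts only a flat message:

```javascript
// Tightly coupled: the formatter takes a live order object, depends on its
// internal structure, and can mutate it as a side effect ("spooky action
// at a distance" for any other holder of the same reference).
function receiptFromInstance(order) {
  order.discount = 5; // hidden mutation of shared state
  return `Total: ${order.items.reduce((s, i) => s + i.price, 0) - order.discount}`;
}

// Loosely coupled: the caller sends a small, flat message of plain values.
// The formatter cannot reach back into the order or change it.
function receiptFromMessage({ total, discount }) {
  return `Total: ${total - discount}`;
}

const order = {
  items: [{ name: "book", price: 30 }, { name: "pen", price: 5 }],
  discount: 5,
};

// The caller derives the message from its own state and passes values only.
const message = {
  total: order.items.reduce((s, i) => s + i.price, 0),
  discount: order.discount,
};
console.log(receiptFromMessage(message)); // "Total: 30"
```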

    • jeffreygoesto 8 hours ago

      My journey is quite similar. My mental model got a huge boost after I read and understood Leslie Lamport's early work and the work of Edward Lee on getting deterministic results in the presence of concurrency. I even found the earliest paper with a mathematical proof that writes and reads must be separated in time or space (the mathematical basis of the Rust borrow checker), but I can't find it anymore.

      - https://lamport.azurewebsites.net/pubs/time-clocks.pdf

      - https://en.wikipedia.org/wiki/Chandy%E2%80%93Lamport_algorit...

      - https://www2.eecs.berkeley.edu/Pubs/TechRpts/2006/EECS-2006-...

    • burakemir 12 hours ago

      Yeah, passing by value or "Value semantics" can prevent many programming errors. Passing references to immutable data can serve a similar purpose. In low-level languages where memory layout and calling convention map to target hardware, there are differences in performance to consider.

      Pass by value would indeed make a big difference to how programs are structured and make it easier to reason about programs.
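      For illustration, a tiny made-up JavaScript example of the difference (value semantics emulated with a copy, since JS objects are passed by reference):

```javascript
// Pass by reference: the callee mutates the caller's data in place.
function addFeeInPlace(invoice) {
  invoice.total += 10;
  return invoice;
}

// Value semantics, emulated with a shallow copy: the caller's data is untouched.
function addFee(invoice) {
  return { ...invoice, total: invoice.total + 10 };
}

const a = { total: 100 };
addFeeInPlace(a);
console.log(a.total); // 110: the caller's object changed underneath it

const b = { total: 100 };
const c = addFee(b);
console.log(b.total, c.total); // 100 110: the original is preserved
```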

      I just want to point out that "concurrency safety" is very much a term, although "thread safety" is more common. These are broadly part of memory safety, which is a topic mainly due to security concerns but also of academic study.

      The two perspectives are not perfectly congruent. Non-concurrency-safe languages like Go can still be considered broadly memory safe. The pragmatic rationale is that data races in GCed languages are much less exploitable. From an academic, principle-based view this is unsatisfying and unconvincing, as one would prefer safety to be a matter of semantics. See also https://www.ralfj.de/blog/2025/07/24/memory-safety.html

      Rust uses "fearless concurrency" as a slogan. Rust offers more options than passing by value (Copy) while still guaranteeing safety through static type checking.

      There is also research for GCed languages to establish non-interference eg Scala capture checking.

      Concurrency is recognized as difficult (at least by people who are knowledgeable), and programming language design usually involves pragmatic choices if you need concurrency. If the language does not provide the primitives or spec that enables safety, then you are left with patterns and architecture.

      The science is still evolving; it is certainly not the case that nobody cares. Rather, progress is slow, and moving ideas from research to industry is even slower. How much value we ascribe to correctness, safety and performance in industry depends very much on the context.

    • kelsier_hathsin 13 hours ago

      > only messages should move between objects

      Can you provide an example for this?

      • aryehof 10 hours ago

        The Alan Kay viewpoint (he is NOT the inventor of OOP [1]) is considered the least helpful viewpoint on OO design: the "magical" and unhelpful "it's all about messages" perspective, which helps you not at all unless one is talking about the internal implementation of a platform like Smalltalk. Consider the views of the real inventors, Nygaard and Dahl.

        [1] I don't think I invented "Object-oriented" but more or less "noticed" what was really powerful about just making everything from complete computers communicating with non-command messages. This was all chronicled in the HOPL II chapter I wrote "The Early History of Smalltalk". — Alan Kay

      • burakemir 12 hours ago

        Say you have a Car, Engine and Dashboard object.

        Let's not have dashboard access the temperature by doing `GetSurroundingCar().engine.temperature`

        If the dashboard needs to get the temperature from a sensor in the engine, it should be able to "talk" to the sensor without going through the car object.

        In ideal OOP, a "method call o.m(...)" is considered a message m being sent to o.

        In practice, field access, value and "data objects" etc are useful. OOP purism isn't necessarily helping if taken to the extreme.

        The pure OOP idea emphasizes that the structure of a program (how things are composed) should be based on interactions between "units of behavior".
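        A minimal sketch of that idea (all names invented for illustration): the dashboard subscribes to temperature messages and never navigates a car/engine object graph:

```javascript
// The sensor publishes readings; it knows nothing about dashboards or cars.
class TemperatureSensor {
  constructor() { this.listeners = []; }
  subscribe(fn) { this.listeners.push(fn); }
  report(celsius) { this.listeners.forEach(fn => fn(celsius)); }
}

// The dashboard holds no reference to the car or engine; it just receives
// temperature messages from whatever sensor it was wired to.
class Dashboard {
  constructor(sensor) {
    this.reading = null;
    sensor.subscribe(c => { this.reading = c; });
  }
}

const sensor = new TemperatureSensor();
const dashboard = new Dashboard(sensor);
sensor.report(90);
console.log(dashboard.reading); // 90
```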

        • bluGill an hour ago

          > Say you have a Car, Engine and Dashboard object.

          Then you should burn the whole thing down and start over!

          I work on tractors, not cars, but I'm sure the abstractions are similar. Our engine objects are things like [service]AutomationEngine. Our dashboard objects are for a collection of things on secondary displays - meanwhile, the thing you would point at with your finger as the actual dashboard doesn't have or need a dashboard object. There are objects for things like the RPM gauge or the check engine light - the latter is a generic fault indicator with an icon field; it is connected to different messages and placed in different positions depending on the fault.

          The point of the above isn't how tractors are designed; it is that the objects you need to design a real OO system rarely have anything to do with what looks like an object in the real world. Nobody ever writes dog and cat objects derived from animal; nobody ever writes triangle objects derived from shape.

        • skydhash 3 hours ago

          I had to learn OOP with common lisp (CLOS) and smalltalk to understand this. Now, I’m leaning towards C, because it’s easier to model a problem with struct and function and not have to deal with the flavor of OOP that some languages foster.

      • jongjong 12 hours ago

        1. Avoid passing live instances (by reference) to other instances as much as possible, because you don't want instance references to be scattered too widely throughout your codebase. This can cause 'spooky action at a distance', where the instance state is modified by interactions occurring in one part of the code and it unexpectedly breaks a different module which holds a reference to that same instance in a different part of the codebase. The more broadly scattered the reference is throughout the codebase, the harder it is to figure out which part of the code is responsible for the unexpected state change. These bugs are often very difficult to track down because stack traces tend to be misleading; they don't point you to the event which led to the unexpected state change that later caused the bug.

        2. Avoid overly complex function parameters and return values. Stick to passing simple primitives; strings, numbers, flat objects with as few fields as necessary (by value, if possible). Otherwise, it increases the coupling of your module with dependent logic and is often a sign of low-cohesion. The relationship between cohesion and coupling tends to be inversely proportional. If you spend a lot of time thinking about cohesion of your modules (I.e. give each module a distinct, well-defined, non-overlapping purpose), the loosely-coupled function interfaces will tend to come to you naturally.

        The metaphor I sometimes use to explain this is:

        If you want to catch a taxi to go from point A to point B, do you bring a steering wheel and a jerry-can of petrol with you to give to the taxi driver? No, you just give them a message; information about the pick up location and destination. This is an easy to understand example. The original scenario involves improper overlapping responsibilities between you and the taxi service which add friction. Usually it's not so simple, the problem is not so familiar, and you really need to think it through.

        We understand intuitively why it's a bad idea in this case because we understand very well the goal of the customer, the power dynamics (convenience of the customer has priority over that of the taxi driver), time constraints (customer may be in a hurry), the compatibility constraints (steering wheel and fuel will not suit all cars). When we don't understand a problem so well, an optimal solution can be difficult to come up with and we usually miss the optimal solution by a long shot.
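        In code, the taxi metaphor might look like this (a hypothetical sketch): the customer hands over a small message of plain values, not a steering wheel and fuel (logic and structure):

```javascript
// The customer only sends a flat message describing what they need.
function requestRide(dispatch, pickup, destination) {
  return dispatch({ type: "ride_request", pickup, destination });
}

// A toy dispatcher: it only ever sees plain data, never the customer's internals.
const rides = [];
const dispatch = (msg) => {
  rides.push(msg);
  return rides.length; // ride id
};

const rideId = requestRide(dispatch, "Union Station", "Airport");
console.log(rideId);          // 1
console.log(rides[0].pickup); // "Union Station"
```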

    • globalnode 12 hours ago

      Nice post. Lately I've been dealing with concurrency between threads and processes, trying to keep it cross-platform as well; it's a lot to learn. If you have large buffers and want to keep some semblance of performance, it's VERY interesting to understand all the transfer mechanisms and cache levels involved. I feel these are the sorts of things my education skipped; it was all very focused on the static structure of objects, not the dynamics of data transfer.

  • sublinear 6 hours ago

    > replacing an internal framework that's worked wonders for us we've been using for over a decade

    Can you share what this internal framework is?

  • skydhash 15 hours ago

    > More and more places just want Jira tickets done fast instead of someone that's going to push back or question if this is the best way to build some thing.

    That's one thing I never care to do unless I'm the one making the technical decisions. What I do is build the thing, but with defensive programming in place. I take care to make sure that my code is good, then harden any interface so that I can demonstrate that I'm not the cause of new bugs. People will be careless, so make sure that you have blast doors between your work and theirs.

    And I do take time to learn about the abstractions of the new shiny tools, even when it's overengineered. Going blind and making mistakes is not my cup of tea.

  • gib444 6 hours ago

    > The biggest part of the move is "hiring".

    By that they mean outsourcing.

oxag3n 14 hours ago

"Any problem in computer science can be solved with another layer of indirection, except of course for the problem of too many layers of indirection." Bjarne Stroustrup

That's why you see hundred-level call stacks, polymorphism with a single implementation, and still errors are swallowed or root causes hidden behind "exception caught".

alper 5 hours ago

Premature abstraction is the root of all evil.

hamasho 15 hours ago

  "Duplication is far cheaper than the wrong abstraction."
noborutakahashi an hour ago

Interesting point.

In the 1980s, we also relied on abstraction during development, but removed much of it as we moved closer to the hardware.

Abstraction wasn’t something permanent — it was something we used and then deliberately reduced.

myst 6 hours ago

> In the world of computing, we tend to abstract away complexity. Doing so seems liberating. It enables us to focus on the bigger picture. Unfortunately, in doing so, the fidelity of our understanding often decreases. We sometimes end up blinding ourselves.

Some “Java in the 90s” understanding of abstraction. Proper abstractions break complexity into composable elements. Hence, the fidelity of our understanding increases.

dragochat 8 hours ago

> I have spent months adjusting my resume

just share the damn thing, someone may have something for you ;)

...I've kind of rarely seen these ppl complaining about work actually sharing their resume or a condensed description of their skills, knowledge and experience

  • dragochat 8 hours ago

    ok, I googled it and found it: ~"entry-level/junior sysadmin and cyber"

    so, a path could be picked from what you know:

    1. devops/sre - really hard to get above entry-level without real experience and you _will_ be competing head on with AI ...ouch

    2. cyber - same with whitehat as with devops/sre ...basically go full red-team / blackhat / offensive for a while, then get certs and a portfolio, then a job in "real cyber" ...BUT ppl that do this tend to have a "very specially broken brain", so if you haven't done this already you're probably not one of them [probably for the best]

    ...but they're probably all bad, so better DO SOMETHING ELSE ENTIRELY:

    ...gtfo of software; you're likely not gonna become an "agent herder" with your skillset, mentality and experience - in the US, probably go all-in on agriculture [recent US protectionism and isolationism will give you decent leverage and a shield from globalized markets], learning some minimal hardware tinkering to automate drones and later manage android workers, software for planning farming automation etc... hire hands for physical labour and BUILD AND MANAGE A FARM or something like that (maybe farm + restaurant or something else from tourism / hospitality)

    • kajman 7 hours ago

      All of the three sectors you've mentioned are not in a good place right now. Probably much less stressful to be an unemployed programmer than trying to make a hobby-scale farm profitable with soaring fuel and fertilizer prices, along with a labor force that is fleeing.

      E: Farm automation probably has some juice though, regardless of how close the androids I keep seeing in demos actually are.

    • kinow 8 hours ago

      With some knowledge of devops and cyber, maybe moving to QA as a tester could work too. But the idea of moving towards agro is a good one as well!

soopypoos 10 hours ago

  I spoke a million words
  They didn't mean that much to me
  They rang around my head
  Like empty tuneless harmonies
  Love's great abstraction mine
AussieWog93 10 hours ago

Can't offer you any work unfortunately, but have an updoot. Hope this gets to the top and helps you provide for your son.

slopinthebag 9 hours ago

It's not just tech, other industries are experiencing the same hiring woes. I think the economy is deeply broken, it shouldn't work like this and it doesn't seem like there is any hope in fixing it - governments just continue to run up debt as if they can just keep kicking the can down the road indefinitely. eventually the can becomes a brick and you break your foot.

there will be a reset at some point, and software developers will be needed. especially when every piece of software stops working. idk if that will happen before or after an economic collapse tho.

i have no idea where things will go in the future, but i doubt it will be much fun

  • kajman 7 hours ago

    I'm confident the world will need more software developers than ever before, no matter where "AI" goes from here.

    I don't think most of those jobs will be in the West, though.

    • lionkor 3 hours ago

      Why not? Are there any software-related industries in the West where software engineers are not needed or won't be needed?

shadowgovt 15 hours ago

Oof. There are two pieces to this story. One is great and one is heartbreaking.

The fact that modern tech has disintermediated people with problems to solve from the need for a "priest class" to commune with the machine to solve the problem is a great thing. It's the goal. The more we do it the better we are making the world for humans.

... the fact that people need to work to eat or provide anything above a subsistence quality of life is not only tragic, it's increasingly abhorrent in a world where automation and simplification via machines has freed up this much raw resource and free time.

If we're pitting LLMs against people's ability to provide for their families, we have lost the thread on why we're doing any of this.

  • renticulous 10 hours ago

    > this much raw resource and free time.

    Those resources are being redirected to create entertainment areas for the rich like golf courses, 7 star luxury hotels and villas. This is the modern predicament.

  • arkt8 15 hours ago

    Not the automation, but the way... we have come far since the domestication of agriculture and energy... but profit as the main director is worse than suboptimal; it is tragic. Having learned about many accidents in complex systems, it is madness to see things reach this point in the most complex system of all: society.

    • hgyyy 14 hours ago

      Profit is what drives the survival of the firm, to be fair.

      However, there are tasteful ways of doing it, and Google and Meta in particular are certainly not among them.

SadErn 10 hours ago

I may be missing something, but this doesn't read to me like an abstraction or AI-related problem.

It sounds more like a packaging issue. I know he's attempted to edit his resume, but there's missing information here that OP may not even be aware of.

    For instance, I was recently one of the final two candidates interviewing for a great opportunity that I sadly lost. When I received feedback, it turned out the hiring committee had a completely different sense of one aspect of my work than I had attempted to convey. I'm glad I got the feedback, but it was frustrating to lose after so many interviews.

Then just recently, I interviewed a candidate at my current company who reminded me of OP. Laid off worker, very nice guy, but he had no idea how to portray himself as a dev at the level he was applying for.

I wanted to call him up and coach him, but it didn't seem appropriate, especially since he didn't ask for feedback.

If you are in this position, find a free coaching program that can help you revamp and resell what you have to offer.

It's not fair to have to do that just to get a chance to be paid a fair wage. But companies get thousands of resumes a month and do dozens of interviews.

We try to give candidates a chance to show us who they are, but if what they are showing us doesn’t line up with the role, or their strengths are buried, there’s only so much we can infer. It sucks, because the resume and interview are not the job. But they are the gate you have to get through before anyone sees the work.