Two big-tech FAANG jobs; the org is 95% H-1B engineers from China/India. Tons of resumes from American grads somehow never hit my desk, while we continue to interview random candidates from India with some low-quality US master's from Mississippi State. The candidate has spent the last year locked in a room memorizing algorithm interview questions.
Madmallard [3 hidden]5 mins ago
Why are we letting tech companies in the USA do this?
lateforwork [3 hidden]5 mins ago
Because if we didn't we wouldn't have a tech industry?
The main problem with immigrant talent in computer field is that legislators don't understand the difference between IT and Tech product development jobs. IT jobs don't need immigrant talent, so companies like Accenture, Infosys etc. should not be given H-1B visas. But tech companies like Google, Meta, Apple, OpenAI etc. absolutely need immigrant talent, or they will lose to Chinese competitors.
shimman [3 hidden]5 mins ago
We had a tech industry before H-1Bs arrived in the 90s. What we didn't have was Silicon Valley corporatism that values neither American labor nor American education. It's why SV is so gung-ho on charter schools and devaluing American labor.
Let's not act like we need to import 80k "high tech" workers that amount to writing react components and spring endpoints.
Hardly any of it is so hard that we couldn't force companies to train workers to do it, but they don't ever want to help people; they just want to suck up all the money in the room while decimating entire populations.
Also, as an American I don't really benefit if US corporations are doing "better." How does that help the person that can't pay for healthcare or afford to go to school, but they sure can get their serving of Zuckerberg slop? I'm supposed to care about these companies success? Really? I hope they go down in flames.
The problem is that the rich and elite have captured and dictated American tech policy for far too long.
Spooky23 [3 hidden]5 mins ago
It generates economic activity and taxes in the US and suppresses wages.
Most of the H-1B candidates are in shitty roles: well-defined low/moderate-skill jobs at giant companies. Hire people whom you can't actively exploit, and those are the kinds of jobs where unions can organize.
The alternative is offshoring the work, not hiring Americans.
The smart thing would be to just let people immigrate. Instead we have a weird tiered system with a small number of highly skilled specialists and an army of serfs facing deportation if they piss off the bosses.
To do what, attract talent from the rest of the world?
ang_cire [3 hidden]5 mins ago
Because we live in a techno-oligarchy now? Because the leaders of the top tech companies (by revenue) literally sat behind the President at his inauguration?
Because they can afford to buy the 'right' to do what they want, and you can't, and what they want is cheaper labor who they have more control over, and H1B workers will never rock the boat because the visa is a sword hanging over their heads.
Downvote all you want, it's the truth.
dfdsjsdklfjs [3 hidden]5 mins ago
The big open secret that explains everything is that all these big name companies (Google, Facebook, etc) were founded by US intelligence.
The insta-downvotes are their bots attempting to cover up this fact.
bigyabai [3 hidden]5 mins ago
Why would we stop them? Labor is a free market.
mikert89 [3 hidden]5 mins ago
honestly i have no idea, in some cases, they are working weekends/are hyper focused on extremely boring, somewhat manual work. some of the systems are complicated and break constantly, so they are almost just oncall fodder for manually fixing a constantly breaking high scale service
Ifkaluva [3 hidden]5 mins ago
Will these kinds of jobs survive Claude code? It sounds like exactly the job that would be easiest to automate.
cheschire [3 hidden]5 mins ago
Without an archive link I think 90% of these comments will be in response to the title only.
bookofjoe [3 hidden]5 mins ago
I'd happily provide one but I've had enough of being repeatedly trashed and denigrated here for posting too many archive links.
gerdesj [3 hidden]5 mins ago
You could look at your denigrators and decide: "fuck you". Internet points are strangely attractive but not vital. You can always post with another account with a bit of effort too.
I understand that you are pissed off (as am I), but debating with an army of bots, LLMs, wankers and Russians is unfortunately the daily status quo.
On the bright side there are lots of lovely folk hereabouts with a large thing between their ears.
alexchantavy [3 hidden]5 mins ago
> You could look at your denigrators and decide: "fuck you"
That escalated quickly and I enjoyed this comment a lot
gerdesj [3 hidden]5 mins ago
Happy to oblige.
layer8 [3 hidden]5 mins ago
Looking at your karma, I think you can afford it. ;)
Uhhrrr [3 hidden]5 mins ago
I have only ever gotten thanks and net positive points for posting archive links.
slg [3 hidden]5 mins ago
It is interesting that sites like HN have existed for decades and still don't have any real solution for this sort of problem. Is providing a way for us to all bypass paying for this content that cost money to produce actually the most desirable outcome?
Like imagine if there was some 90-minute tech documentary on Netflix that was worth discussing here. Could I just rip it and link to a copy on my Google Drive? How long would that link stay up? Not long, I imagine. I guess the conclusion, based on how these sites operate, is that piracy doesn't count when it's just words.
Terr_ [3 hidden]5 mins ago
> It is interesting that sites like HN have existed for decades and still don't have any real solution for this sort of problem.
Heck, I find it odd we don't even model the problem, let alone solve it.
There's no tickbox for "this submission requires additional access", no way to sort for/against them, not even an informal convention like putting [paywall] in the title.
That first link is not relevant to the point of my comment. I was not complaining about paywalls. The comment also doesn't address whether paywall bypasses would be acceptable for non-text links.
Regarding the second link, I'll happily engage with something specific dang said on this topic if you want to link to it, but a link to every time he said the word "paywalls" is not a productive contribution to this conversation.
neom [3 hidden]5 mins ago
If people are doing that, that is lame, especially given you're a valued member of this community.
> You’ve just missed out—free access to this article has expired. Register to view
dragonfax [3 hidden]5 mins ago
Given that it's paywalled, there's a good reason for that. (The gift link below no longer works.)
didgetmaster [3 hidden]5 mins ago
Posts with links to paywalled articles should be identified somehow. (Flashing red title??)
I would prefer to just skip them, but they are not easy to spot.
mmmpetrichor [3 hidden]5 mins ago
I work for a fairly big tech company. For the last 2 years it has basically stopped hiring people in the US, and is only hiring engineers in the India office.
0xAntonioo [3 hidden]5 mins ago
Feels early to blame AI for most of this. A lot of it still looks like a reset from 2021-era hiring assumptions rather than actual labor replacement
dvwcf [3 hidden]5 mins ago
Over-hiring is one thing, but that wouldn't be a problem if there were an endless stream of value-creating projects to take on.
So the issue is not necessarily the over-hiring; it's more that the large tech firms are running out of value-creating projects to take on. Which is not surprising: the labour market in its current state is absolutely not perfect at allocating labour.
It should be noted that fixing tech debt is not necessarily value-creating from a financial standpoint. What engineers think is value-creating has little to do with what a CFO, whose job is to maximize firm value, determines to be a value-creating project.
ikr678 [3 hidden]5 mins ago
Market saturation for tech products + new competition from vibe coded startups moving into mature enterprise spaces.
The rate of non-tech business growth has slowed, who is going to continue to buy all these cloud software services? Tricking consumers into subscribing to AI tools or extra storage only goes so far.
chii [3 hidden]5 mins ago
The "value" creation is tied to the interest rate.
When interest rates are low, even low value creation projects are viable.
When rates are high, those exact same projects are no longer viable.
Therefore, I would argue that the labour market is not perfectly allocating labour, but it is close enough for practical purposes.
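The rate argument above can be sketched with a minimal NPV calculation (all numbers are hypothetical, chosen only to show the sign flip):

```python
# Minimal NPV sketch: the same project flips from viable to non-viable
# as the discount rate rises. All cash flows are hypothetical.

def npv(rate, cashflows):
    """Net present value of cashflows[0], cashflows[1], ... at t = 0, 1, ..."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# A marginal project: invest 100 now, get back 22/year for 5 years.
project = [-100] + [22] * 5

print(f"NPV at 1%: {npv(0.01, project):.1f}")  # positive -> fund it
print(f"NPV at 6%: {npv(0.06, project):.1f}")  # negative -> shelve it
```

Same project, same cash flows; only the discount rate moved.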
dvwcf [3 hidden]5 mins ago
Value creation is more of a function of cash flow potential on a project than the risk-free rate, cost of equity or cost of debt.
"labour market is not perfectly allocating labour, but it is close enough for practical purposes."
No it's not, lmao. Do you even know what characteristics a 'perfect labour market' is comprised of? Go on, surprise me.
plemer [3 hidden]5 mins ago
Isn’t it both - ROI must exceed whatever cost benchmark?
dvwcf [3 hidden]5 mins ago
"..is more of a function of cash flow potential "
"more of".
If you wanna finesse the discount rate by a few percentage points, go ahead. Cash flows contribute more to the final value number.
in b4 some numpty writes about the Fed changing a market-set rate.
Aurornis [3 hidden]5 mins ago
I don't think it's that simple. The 2021 over-hiring was a weird time where hiring as many people as possible became a goal in itself.
There were companies growing organizations faster than they could be made productive. They acquired a lot of bureaucracy, excess structure, and inefficiency. The current backlash is trying to reverse the excess structure and middle management buildout so they can get back to more functional teams.
throwaway1492 [3 hidden]5 mins ago
Glad I was overemployed working three full-time jobs (remote W2) during that period. I stacked A LOT of cash during those years that has appreciated nicely.
kypro [3 hidden]5 mins ago
> Over-hiring is one thing, but that wouldn't be a problem if there were an endless stream of value-creating projects to take on.
I very much agree.
A lot of the tech job growth during the late 2010s and the pandemic period was frankly BS from an ROI perspective. The late 2010s were really the first time in tech that I started to feel like most of the stuff that needed to be built had been built, and increasingly I was working on BS projects offering less and less value every year.
Consider:
- In the 80s developers were needed to write fundamental business software for word processing and spreadsheets
- In the 90s computers became mainstream and there was a huge demand for consumer software
- In the 00s the internet took off and we needed people to build the web
- In the 10s the smartphone revolutionised computing and we needed people to build apps and rebuild websites to be mobile-first
But towards the late 10s, entrepreneurs and investors seemingly ran out of no-brainer tech investments, so they increasingly started trying mental stuff while still promising tech-like returns: blockchain, the metaverse, Web 3.0, [insert traditional industry here] but a tech company.
I'm not saying there's nothing to build or maintain anymore, but I also no longer see where people think the exponential need for new software and software developers could come from, and I suspect this would have become obvious earlier if it wasn't for ZIRP.
But it's not only a lack of productive things to build. We also have other trends hurting demand for new SWEs today. Consider how today completely non-technical people can start and scale an ecommerce company without any developers. Things that would have taken armies of developers just 10-15 years ago can now largely be done in an afternoon on platforms like Shopify. It's actually hard to believe that just 15 years ago, selling things online was very hard if you weren't technical.
Similarly, starting in the early 2010s, even being a developer got significantly easier because increasingly there were packages for everything. Things I might have spent weeks building before could now be built in days or less. Another thing that changed was sites like Stack Overflow and blogs that help you solve problems and learn new skills. I remember that trying to learn how to do things before the 2010s was hard, and before the 00s it was very hard.
And of course now we also have AI coding tools which don't just hurt the overall demand for developers, but effectively expands the supply of developers to anyone with an internet connection and computer.
So to summarise:
- There are far fewer good investments to be made in new software today.
- Where there are investments to be made, you need far fewer developers.
- When you need developers, there are far more people who can do the job.
Even if tech companies are doing well and the number of tech jobs is increasing, the above means the average person trying to find a job in tech today will find it much, much harder than in the past. People working in tech today should genuinely consider a career change if they're primarily in tech for the money.
suzzer99 [3 hidden]5 mins ago
Yeah it feels like a lot of this could still be shakeout from FAANG and FAANG wannabes hiring 10x more engineers than needed. I realize Square is a more complicated business than OnlyFans. But OnlyFans isn't a trivial app and it runs on 42 employees. Block had more than 10k employees pre-layoff.
lux-lux-lux [3 hidden]5 mins ago
42 employees and a legion of contractors.
agentultra [3 hidden]5 mins ago
I don’t think that sufficiently explains it either. We all want a nice little bow on a simple, easy target. But there isn’t one.
Block, for example, said that they had experienced record profits… in spite of their “hiring spree.”
This is just capitalism working as usual, only more of it, faster.
AI has a part in it. So do austerity policies and the rise of a certain political climate.
mcmcmc [3 hidden]5 mins ago
We still haven’t fully corrected from ZIRP either
jmuguy [3 hidden]5 mins ago
You'd think if the market were rational that Block would be punished for firing that many people, and obviously lying about why, but we haven't reached that point yet I guess.
throwaway13337 [3 hidden]5 mins ago
Software devs lost their pricing power due to LLMs but not exactly how most people think.
What's missed is that the real lock-in was 'how exactly does this functionality work for this specific case?' or 'can we implement this tiny one-off feature in some legacy code base?' Both are why you keep the guy who wrote it around. And you couldn't really replace him, because digging into what he wrote was hard.
Now, LLMs can do that stuff better than the guy that wrote it.
Software devs were non-fungible. Now they're commodities. When things become commodities, they lose their value.
I'm not sure why I haven't heard people talk about this aspect. It's the biggest effect on jobs.
sebmellen [3 hidden]5 mins ago
While this is true to an extent, oftentimes the important context is not in the code but in the head of the writer. The code is just the fence in the Chesterton’s Fence analogy. And that is still non-fungible and will (presumably) forever be.
> There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”
throwaway13337 [3 hidden]5 mins ago
You're correct that it doesn't answer the why.
But it answers the what, how, and allows one-off features.
So the guy who wrote it might (or might not) still have the edge on the why. But that's not the moat it used to be.
sebmellen [3 hidden]5 mins ago
In big companies, the why is 80% of the work. I could swear actual dev work is less than 20% of a “developer’s” job at a standard large (non-SV/FAANG/tech-first) company. The rest is holding a lot of really weird organization-specific context in your head to make the right decision.
qwertyuiop_ [3 hidden]5 mins ago
I have yet to see this pan out in the enterprise. Enterprises are full of mini-kingdoms built by VP+ leaders with the tools they prefer or were sold on. Many of these tools are inherently, and sometimes by design, cumbersome, expensive, and not onboarding-friendly. LLMs haven't breached this domain, and this domain is empirically 80% of enterprise software. I have yet to see direct examples of LLM agents replacing, say, 4 engineers out of an existing 8-person team.
mchusma [3 hidden]5 mins ago
Developer jobs are up: postings ~11% YoY, tech workforce projected +1.9% for 2026, BLS forecasting 15% long-term growth. A rebound from the 2024-25 lows.
Jevons paradox
MeetingsBrowser [3 hidden]5 mins ago
Job postings are not jobs. A company can lay off 10k, post jobs for 1k, and have job postings be up.
Projections are useless. The past does not predict the future.
BLS is useless for the same reason, amplified.
calepayson [3 hidden]5 mins ago
I think a big part is that the hiring market for juniors feels apocalyptic rn. Jobs may be up in aggregate but that doesn’t necessarily mean everyone’s feeling it equally.
JoRyGu [3 hidden]5 mins ago
Maybe a case of really high profile companies doing RIFs while medium and smaller companies keep hiring?
AlBugdy [3 hidden]5 mins ago
Can anyone provide historical data for "job busts" or other types of declines in tech employment, massive layoffs or hire freezes? I seem to read about something like this every few years. Would like some data to see if this trend is stronger than the previous ones or not.
jcranmer [3 hidden]5 mins ago
Here's some historical data for employment in category NAICS 5415: https://fred.stlouisfed.org/series/IPUMN5415W200000000. Which, NAICS 5415 definitely isn't a complete picture of tech employment, but it should give rough correlation I would hope.
mr_00ff00 [3 hidden]5 mins ago
Layoffs.fyi maybe? Not sure if that’s the data you are asking for
menzoic [3 hidden]5 mins ago
I just put your question into GPT-5 Pro
0xchamin [3 hidden]5 mins ago
One of my mentors at my company was made redundant exactly a year ago. He is still looking for his next role. I feel really bad for him.
ang_cire [3 hidden]5 mins ago
I think a lot of people don't see the historical trend of where investment moves around. The 90s dot-com era was driven by massive investment in e-commerce (remember that term?) businesses. It created a bubble, which then burst.
When bubbles burst, investors change tack, and when investors pull back, companies turtle and do layoffs.
After Y2K, investment moved on to real estate. In 2008, that bubble burst. It moved on to social media next.
I would argue that bubble burst somewhat silently during COVID with the quiet deaths of Xitter and Facebook (from a standpoint of cultural relevance), and investment transitioned towards and into its current AI bubble.
Investors are always going to move on once the lustre of a field has waned, and I'd hazard that we'll see investment move somewhere other than software tech next (if I had to prognosticate, I'd say it's moving into robotics/drones).
atduran [3 hidden]5 mins ago
Corps do that to make stocks go up; once the quarter ends, they start hiring again.
jmyeet [3 hidden]5 mins ago
So I want to talk about the Claude Mythos mega-model that had a real marketing push in the last week. Marcus Hutchins (of WannaCry fame) posted about the economics of this [1].
The upshot here is that this was a longstanding BSD bug, mostly because nobody is paying BSD bug bounties, and because a null-pointer dereference may induce a crash but rarely (if ever) leads to privilege escalation the way buffer overflows generally do, so it isn't as high-priority. The estimate of the cost is $20-50k of tokens.
$50k gets you a lot of developer time, upwards of 2 months. As has been stated in a bunch of threads, much smaller models could also find this bug. The defense is "they didn't" and "once you know it's there, it's easier to find" but again, nobody has been paying BSD bug bounties. Put another way: far fewer people have been looking.
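For what it's worth, the "$50k buys upwards of 2 months" claim checks out under a plausible assumption about fully-loaded cost (the $300k/year figure below is an assumption, not from the post):

```python
# Back-of-envelope: how much developer time does the token bill buy?
# The loaded-cost figure is an assumption, not from the original post.

token_spend = 50_000            # upper estimate of the token bill, USD
loaded_cost_per_year = 300_000  # assumed fully-loaded senior dev cost, USD

months = token_spend / (loaded_cost_per_year / 12)
print(f"{months:.1f} months of developer time")  # 2.0 months of developer time
```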
My point is that the economics of these models are still highly debatable. What they will do is allow you to use 1 engineer with AI tools instead of 2-4 engineers.
And we've arrived at the real goal here: to reduce head counts. Why? To suppress labor costs. To get people who didn't get laid off to do more free work.
In addition, AI capex is used as an excuse to cut costs ie reduce labor, for exactly the same reasons.
Lastly, the threat of this happening in the near to medium future is also used to scare labor and reduce costs.
Sufficiently large companies can't grow anymore. Their only path forward is to raise prices and reduce costs. For tech companies, labor is a significant cost. That's what's going on here.
The problem with these models is that they are not reliable. You never know when the provider will decide to nerf the model, and after a good week of making progress you find the model running in circles while you burn tokens like you're operating a steam locomotive that's standing still.
xdennis [3 hidden]5 mins ago
> $50k gets you a lot of developer time, upwards of 2 months.
God, I wish I was American sometimes...
yoyohello13 [3 hidden]5 mins ago
Right up until you lose that job and get sick.
jopsen [3 hidden]5 mins ago
> What they will do is allow you to use 1 engineer with AI tools instead of 2-4 engineers.
Stop there. The cost of today's state-of-the-art hype-ware will be lower tomorrow.
In a year, there will be fewer easy-to-find bugs, and models like this will be cheaper.
Doing the math based on hardware deployed today is a bit naive.
Assuming any of this is intentionally about reducing devs is just dumb.
ianm218 [3 hidden]5 mins ago
You're missing the cost curve of inference. What costs $20-50k now will cost $2-5k in a year or two, at which point the math is a no-brainer. It makes a lot of sense to build products that almost work now, or are almost economical, and ride the trend line.
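To make the claimed trend line concrete: dropping from $50k to $5k "in a year or two" implies a steep but specific annual decline. A rough sketch of the arithmetic, not a forecast:

```python
# Implied annual cost decline if $50k of inference falls to $5k in 2 years.
cost_now, cost_later, years = 50_000, 5_000, 2

annual_decline = 1 - (cost_later / cost_now) ** (1 / years)
print(f"implied annual cost decline: {annual_decline:.0%}")  # 68%
```

Whether inference prices actually fall that fast is the open question.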
ehnto [3 hidden]5 mins ago
Why would it drop tenfold? I think the performance of the chips gets a little better, but not that much better. The cost of the infra will probably go up if energy costs keep rising, and that presumes the US can keep getting cheap chips.
stego-tech [3 hidden]5 mins ago
Would really appreciate an archive link to read this, even though it appears to be another meta-analysis wrapped in an "everything will be fine" narrative, judging from the limited previews I've been able to glean and The Economist's general lean.
The problem with the tech layoffs is that it’s poisoning the well downstream. Smaller employers have repeatedly cited their layoffs as justification for abysmal, unlivable salaries, and demanding those of us looking for work suck it up and deal with it while they search for bottom-dollar unicorns.
AI isn’t replacing the IT crowd outside of the expected junior roles, and even that’s starting to rebound as executives realize Juniors were how they got “white glove service” for themselves - a Senior Engineer isn’t going to wipe their ass for them, job market or not, because said Engineer’s time is infinitely better spent on literally anything else.
One other thing I’ll note is that the layoffs also seem to be remnants of the Brogrammer hustle culture: tech folks were enjoying more time for themselves to grow or live life outside of work specifically thanks to a few good years of paying down technical debt with properly-staffed teams, but the grifters up top see anything less than a 9-9-6 as somehow stealing from the employer and slash accordingly. The remnants are expected to do more work for less pay, with AI tools somehow filling the gap (even though these same employers often don’t want to pay for proper tooling to maximize use of AI).
This is definitely an industry downturn as those who stand to gain maximize their immediate returns. From the perspective of the C-Suite and Boards, the safest (albeit unethical) move is to betray: if AI is a bust, they’ll have made their wealth and can fuck off; if AI eliminates jobs and work, they believe their wealth will protect them in the future dystopia of their own creation.
It’s in that context (“fuck you got mine”) that the broader narrative fits with the myriad of puzzle pieces out there (higher interest rates, stock pumps, circular financing, tariffs, aging population, AI, etc).
cute_boi [3 hidden]5 mins ago
AI + outsourcing is the major reason for the tech job bust. But I think politicians won't care, as they are there for the corporations, not for the people.
trgn [3 hidden]5 mins ago
> politicians won't care
They don't care because nobody has a real gut-feel affinity for computer programmers or their work, the sort of feeling required to animate somebody to action. It's never been a profession with any esteem, and the field never professionalized in the past 60 years, which is a shame, because we now see the outcome.
qwertyuiop_ [3 hidden]5 mins ago
Nothing happens in a vacuum. "Tech companies" act in a coordinated fashion despite claiming they don't. Remote work was great until it wasn't. This is a coordinated action to shed all the mass gathered during ZIRP. Market forces is one naive explanation. But as history has shown factors extraneous to market forces are at work. This period will be written about, analyzed and diced in thousand different ways by future "thought leaders".
downrightmike [3 hidden]5 mins ago
The only reason to blame AI is that it doesn't live up to its promises, so corps are betting that scale will solve the problem. So datacenter after datacenter gets proposed in areas that can't supply them with water/power.
And that money sink is causing the layoffs. If AI really helped like it was expected to, you would grab any dev you could so that you could have an army of 100x devs.
Humans are still cheaper for the real costs. Augmented humans are on average a few percentage points better.
mountainofdeath [3 hidden]5 mins ago
The same thing was said about the crazy build-out of fiber and telecommunications infrastructure. That infrastructure did prove useful, but it took about 10-20 years before that was the case. It took 4G becoming broadly available, and the ensuing increase in mobile devices, to use at least some of the overbuilt network capacity.
HerbManic [3 hidden]5 mins ago
While I get this train of thought, the major difference is that the internal hardware will age much faster than the core infrastructure. The real question is how much of these build-out costs is the center itself versus the hardware within.
That fiber build out will last for decades, a Blackwell GPU, not so much.
wmf [3 hidden]5 mins ago
I could be wrong but it seems like in the case of a crash no one will be buying new GPUs and thus the existing ones could hold their value longer. Of course that value will no longer be massively inflated by bubble FOMO.
fluoridation [3 hidden]5 mins ago
>in the case of a crash no one will be buying new GPUs and thus the existing ones could hold their value longer.
No, because no one has any use for those monstrous GPUs outside of ML and some research projects. They can't even be dropped onto the consumer market because a SOHO is not equipped to house devices like that. The best case scenario is that the boards get dismantled and the VRAM gets salvaged for refurbishing. They've built these machines so specialized that they're essentially disposable.
wmf [3 hidden]5 mins ago
There's a baseline paid demand for AI inference that can fully occupy today's GPUs (even after a crash) so there's no need to sell or scrap them.
fluoridation [3 hidden]5 mins ago
What are you basing that on? Some of the demand that currently exists, exists because of all the money sloshing around the AI ecosystem (i.e. people using AI to sell AI solutions to other people), so how are you so sure demand can fully utilize all existing compute even after a crash?
irishcoffee [3 hidden]5 mins ago
It isn’t about holding value, the cards are going to burn up. If they don’t, in 5 years one could run a rack of 4 cards at home at an affordable rate. Either the cards become affordable again and the datacenter is useless, or they don’t, and nobody can fucking afford to rent them.
wmf [3 hidden]5 mins ago
GPUs definitely have higher failure rates than CPUs but I'm not sure what the absolute rates will turn out to be. If 10% of GPUs die within 5 years that's very high but also probably economically fine. If 50% die that's a disaster.
irishcoffee [3 hidden]5 mins ago
Sorry, I meant that at some point the current cards in the data centers will be financially obsolete. They'll be sold on the secondary market. Buying 8 H200s at $150 a pop will either be a real thing, or they all burn up and capex explodes again, which would be a death knell.
Either way, the moat is about 7 inches wide.
nyrikki [3 hidden]5 mins ago
Chalk and cheese. Wavelength Division Multiplexing took out Global Crossing, Worldcom, etc.
But the secondary market that grew out of it existed because once fiber is in the ground it has a long lifespan and low upkeep costs; that is not the same situation as ultra-high-power-density data centers.
Cooling needs to be balanced with demand; they may not work for even cloud-scale loads without serious issues, etc.
Not that it matters, my hometown has an announced DC and it is looking more and more like it is a shill, as do several of the others in the area.
bitwize [3 hidden]5 mins ago
The difference is that fibre is infrastructure, LLMs are an application. Who knows, maybe they will pass and leave behind that server infra, and that's where our digital consciousnesses will live once our bodies die.
Esophagus4 [3 hidden]5 mins ago
> If AI really helped like it expected to, you would grab any dev you could so that you could have an army of 100x devs.
This seems maybe a bit reductionist.
AI will have diminishing returns because at a certain point coding is not the bottleneck; coordination is, or some other thing that hasn't been optimized yet. The exact bottleneck seems to depend on the organization.
My theory is that in general, augmented devs are much more productive, but 100% of that gain doesn’t translate into 100% more software delivered to customers, and there is a point where coding isn’t the longest pole.
But I don’t think most orgs are at that break even yet, and I think we can still get more out of engineering before we plateau.
varispeed [3 hidden]5 mins ago
Another reason is that wages don't keep up with the cost of living. For instance, in the UK it makes little sense to be in IT unless it's something boring where typing React boilerplate feels easier than stacking shelves or running deliveries.
cynicalsecurity [3 hidden]5 mins ago
Oh no, we are all going to die, again. How many times already?
Bad news sells better. Hysteria even better. Why not write a hysterical article to milk some ad money with doom and gloom? "Because we value your privacy, we use cookies and similar technologies. Some collect your data, like your IP address. Others collect anonymous data. Together with our 177 (holy fuck) trusted partners."
rvz [3 hidden]5 mins ago
When anyone asks the question: "What is AGI", it is actually this. An "abundance" of nothing else but this.
It's just that the tech workers are the canaries in the coal mines for the other white collar knowledge workers.
This is "AGI".
i_love_retros [3 hidden]5 mins ago
Meanwhile... I had to step in and hand code lots of css today because copilot couldn't do the thing. And I had to step in and manually fix tests yesterday because copilot couldn't do the thing.
Are you paid by OpenAI and/or Anthropic?
gbnwl [3 hidden]5 mins ago
Have you considered not using copilot and using Claude Code or Codex directly?
rvz [3 hidden]5 mins ago
> Are you paid by OpenAI and/or Anthropic?
No. The VCs and angel investors screaming “abundance” are the paid promoters.
The problem is their “utopia of abundance” is not for us. They know the opposite is the reality (layoffs, offshoring, wage suppression and AI backlash)
They built their own bunkers and moats for a reason. Because true “AGI” will bring an abundance of very angry people going after them.
That is not worth being paid for by any AI lab.
omgJustTest [3 hidden]5 mins ago
Reciprocal tariffs had put the non-tech and tech economy in stasis (except for AI hardware). They are also better than tax breaks and will supercharge bottom lines for large corporations once reclaimed, if prices remain high.
Also, if you want to test or force AI adoption, you have to apply pressure by firing some people.
Now wars will put us into further stasis or decline via increased inflation pressure.
h4kunamata [3 hidden]5 mins ago
It is the perfect storm: companies overhired during covid by 1000%.
Then covid went away, life went back to "normal", and there were tons of juniors everywhere who had moved into IT because of WFH and the salary.
Then comes AI. Within the IT field, I am seeing first hand companies firing by the thousands because of AI.
Only blue-collar workers were affected in the past, but now it's everybody: no matter your degree or experience, you either already had a hard time or you will.
If Google Quantum computing somehow finds a massive breakthrough, we are all fu :)
Most of the H1 candidates are in shitty roles: well-defined, low/moderate-skill jobs at giant companies. Hire people whom you can't actively exploit, and those are the kinds of jobs where unions can organize.
The alternative is offshoring the work, not hiring Americans.
The smart thing would be to just let people immigrate. Instead we have a weird tiered system with a small number of highly skilled specialists and an army of serfs facing deportation if they piss off the bosses.
Because they can afford to buy the 'right' to do what they want, and you can't, and what they want is cheaper labor who they have more control over, and H1B workers will never rock the boat because the visa is a sword hanging over their heads.
Downvote all you want, it's the truth.
The insta-downvotes are their bots attempting to cover up this fact.
I understand that you are pissed off (as am I) but debating with an army of bots, LLMs, wankers and Russians is unfortunately the status quo, quotidian.
On the bright side there are lots of lovely folk hereabouts with a large thing between their ears.
That escalated quickly and I enjoyed this comment a lot
Like imagine if there was some 90 minute tech documentary on Netflix that was worth discussing here. Could I just rip it and link to a copy on my Google Drive? How long would that link stay up? I can't imagine long. I guess the conclusion based off how these sites operate is that piracy doesn't count when it's just words.
Heck, I find it odd we don't even model the problem, let alone solve it.
There's no tickbox for "this submission requires additional access", no way to sort for/against them, not even an informal convention like putting [paywall] in the title.
https://hn.algolia.com/?dateRange=all&page=0&prefix=false&so...
Regarding the second link, I'll happily engage with something specific dang said on this topic if you want to link to it, but a link to every time he said the word "paywalls" is not a productive contribution to this conversation.
> You’ve just missed out—free access to this article has expired. Register to view
I would prefer to just skip them, but they are not easy to spot.
So the issue is not necessarily the over-hiring; it's more that the large tech firms are running out of value-creating projects to take on. Which is not surprising: the labour market in its current state is absolutely not perfect at allocating labour.
It should be noted that fixing tech debt is not necessarily value-creating from a financial standpoint. What engineers think is value-creating has nothing to do with what a CFO, whose job is to maximize firm value, determines to be a value-creating project.
The rate of non-tech business growth has slowed, who is going to continue to buy all these cloud software services? Tricking consumers into subscribing to AI tools or extra storage only goes so far.
When interest rates are low, even low value creation projects are viable.
When rates are high, those exact same projects are no longer viable.
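The rate effect can be sketched with a toy net-present-value calculation (a hypothetical project with made-up numbers, not anyone's real figures): the same cash flows clear the bar at a low discount rate and fail it at a high one.

```python
def npv(rate, cash_flows):
    """Net present value: discount each year's cash flow back to today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical project: $100 up front, then $22/year for five years.
flows = [-100, 22, 22, 22, 22, 22]

print(npv(0.01, flows))  # positive at a 1% rate: the project is viable
print(npv(0.08, flows))  # negative at an 8% rate: the same project is rejected
```

The flows and rates here are purely illustrative; the point is only that the sign flips as the discount rate rises.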
Therefore, I would argue that the labour market is not perfectly allocating labour, but it is close enough for practical purposes.
"labour market is not perfectly allocating labour, but it is close enough for practical purposes."
No it's not lmao. Do you even know what characteristics a 'perfect labour market' is comprised of? Go on, surprise me.
"more of".
If you wanna finesse the discount rate by a few percentage points go ahead. Cash flows contribute more toward the end value number.
in b4 some numpty writes about the fed changing a market set rate.
There were companies growing organizations faster than they could be made productive. They acquired a lot of bureaucracy, excess structure, and inefficiency. The current backlash is trying to reverse the excess structure and middle management buildout so they can get back to more functional teams.
I very much agree.
A lot of the tech job growth during the late 2010s and the pandemic period was frankly BS from an ROI perspective. The late 2010s were really the first time in tech that I started to feel like most of the stuff that needed to be built had been built, and increasingly I was working on BS projects offering less and less value every year.
Consider:
- In the 80s developers were needed to write fundamental business software for word processing and spreadsheets
- In the 90s computers became mainstream and there was a huge demand for consumer software
- In the 00s the internet took off and we needed people to build the web
- In the 10s the smartphone revolutionised computing and we needed people to build apps and rebuild websites to be mobile-first
But towards the late 10s entrepreneurs and investors seemingly ran out of no-brainer tech investments, so they increasingly started trying mental stuff still promising tech-like returns – blockchain, metaverse, Web 3.0, [insert traditional industry here] but a tech company.
I'm not saying there's nothing to build or maintain anymore, but I also no longer see where people think the exponential need for new software and software developers could come from, and I suspect this would have become obvious earlier if it wasn't for ZIRP.
But it's not just a lack of productive things to build. We also have other trends hurting demand for new SWEs today. Consider how today completely non-technical people can start and scale an ecommerce company without any developers. Things that would have taken armies of developers just 10-15 years ago can now be largely done in an afternoon on platforms like Shopify. It's actually hard to believe that just 15 years ago selling things online used to be very hard if you weren't technical.
Similarly, starting in the early 2010s even being a developer got significantly easier, because increasingly there were packages for everything. Things I might have spent weeks building before could now be built in days or less. Another thing that changed was sites like stackoverflow and blogs, which help you solve problems and learn new skills. I remember that trying to learn how to do things before the 2010s was hard, and before the 00s it was very hard.
And of course now we also have AI coding tools which don't just hurt the overall demand for developers, but effectively expands the supply of developers to anyone with an internet connection and computer.
So to summarise:
- There are far fewer good investments to be made in new software today.
- Where there are investments to be made, you need far fewer developers.
- When you need developers, there are far more people who can do the job.
Even if tech companies are doing well and the number of tech jobs is increasing, the above means the average person trying to find a job in tech today will find it much, much harder than they have in the past. People working in tech today genuinely should consider a career change if they're primarily in tech for the money.
Block, for example, said that they had experienced record profits… in spite of their "hiring spree."
This is just capitalism working as usual, only more of it, faster.
AI has a part in it. So do austerity policies and the rise of a certain political climate.
What's missed in understanding is 'how exactly does this functionality work for this specific case?' or 'can we implement this tiny one off feature in some legacy code base'. Both things are why you keep the guy that wrote it around. And you couldn't really replace him. Because digging into what he wrote was hard.
Now, LLMs can do that stuff better than the guy that wrote it.
Software devs were non-fungible. Now they're commodities. When things become commodities, they lose their value.
I'm not sure why I haven't heard people talk about this aspect. It's the biggest effect on jobs.
> There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”
But it answers the what, how, and allows one-off features.
So the guy that wrote it might (or might not) still have the edge on the why. But that's not the moat it used to be.
Jevons paradox
Projections are useless. The past does not predict the future.
BLS is useless for the same reason, amplified.
When bubbles burst, investors change tack, and when investors pull back, companies turtle and do layoffs.
After Y2K, investment moved on to real estate. In 2008, that bubble burst. It moved on to social media next.
I would argue that bubble burst somewhat silently during COVID with the quiet deaths of Xitter and Facebook (from a standpoint of cultural relevance), and investment transitioned towards and into its current AI bubble.
Investors are always going to move on once the lustre of a field has waned, and I'd hazard that we'll see investment move somewhere other than software tech next (if I had to prognosticate, I'd say it's moving into robotics/ drones).
The upshot here is that this was a longstanding BSD bug, but mostly because nobody is paying BSD bug bounties, and because a null-pointer dereference may induce a crash but rarely (if ever) leads to privilege escalation the way buffer overflows generally do, so it isn't as high-priority. The estimate on the cost is $20-50k of tokens.
$50k gets you a lot of developer time, upwards of 2 months. As has been stated in a bunch of threads, much smaller models could also find this bug. The defense is "they didn't" and "once you know it's there, it's easier to find", but again, nobody has been paying BSD bug bounties. Put another way: far fewer people have been looking.
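For readers outside security, here's a minimal sketch of why this bug class is usually rated a denial of service rather than a takeover. The `Conn`/`conn_fd` names are made up for illustration, not from the actual BSD code; the C equivalent of the unguarded path dereferences NULL and dies with SIGSEGV.

```python
class Conn:
    """Stand-in for a kernel/network object that may legitimately be absent."""
    def __init__(self, fd):
        self.fd = fd

def conn_fd(conn):
    # The fix is the guard below. Without it, conn_fd(None) blows up:
    # Python raises AttributeError; in C the same mistake dereferences
    # NULL and the process crashes. Either way the attacker gets a crash
    # (denial of service), not control of execution. A buffer overflow is
    # worse because it can overwrite adjacent memory (return addresses,
    # function pointers) and redirect control flow, hence higher severity.
    if conn is None:
        return -1
    return conn.fd

print(conn_fd(Conn(3)))   # prints 3
print(conn_fd(None))      # prints -1 instead of crashing
```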
My point is that the economics of these models are still highly debatable. What they will do is allow you to use 1 engineer with AI tools instead of 2-4 engineers.
And we've arrived at the real goal here: to reduce head counts. Why? To suppress labor costs. To get people who didn't get laid off to do more free work.
In addition, AI capex is used as an excuse to cut costs ie reduce labor, for exactly the same reasons.
Lastly, the threat of this happening in the near to medium future is also used to scare labor and reduce costs.
Sufficiently large companies can't grow anymore. Their only path forward is to raise prices and reduce costs. For tech companies, labor is a significant cost. That's what's going on here.
[1]: https://www.tiktok.com/@itsmarcushutchins/video/762774007353...
God, I wish I was American sometimes...
Stop there. The costs of today's state-of-the-art hype-ware will be lower tomorrow.
In a year, there will be fewer easy-to-find bugs, and models like this will be cheaper.
Doing the math based on hardware deployed today is a bit naive.
Assuming any of this is intentional about reducing devs is just dumb.
The problem with the tech layoffs is that it’s poisoning the well downstream. Smaller employers have repeatedly cited their layoffs as justification for abysmal, unlivable salaries, and demanding those of us looking for work suck it up and deal with it while they search for bottom-dollar unicorns.
AI isn’t replacing the IT crowd outside of the expected junior roles, and even that’s starting to rebound as executives realize Juniors were how they got “white glove service” for themselves - a Senior Engineer isn’t going to wipe their ass for them, job market or not, because said Engineer’s time is infinitely better spent on literally anything else.
One other thing I’ll note is that the layoffs also seem to be remnants of the Brogrammer hustle culture: tech folks were enjoying more time for themselves to grow or live life outside of work specifically thanks to a few good years of paying down technical debt with properly-staffed teams, but the grifters up top see anything less than a 9-9-6 as somehow stealing from the employer and slash accordingly. The remnants are expected to do more work for less pay, with AI tools somehow filling the gap (even though these same employers often don’t want to pay for proper tooling to maximize use of AI).
This is definitely an industry downturn as those who stand to gain maximize their immediate returns. From the perspective of the C-Suite and Boards, the safest (albeit unethical) move is to betray: if AI is a bust, they’ll have made their wealth and can fuck off; if AI eliminates jobs and work, they believe their wealth will protect them in the future dystopia of their own creation.
It’s in that context (“fuck you got mine”) that the broader narrative fits with the myriad of puzzle pieces out there (higher interest rates, stock pumps, circular financing, tariffs, aging population, AI, etc).
They don't care, because nobody has a real gut-feel affinity for computer programmers or the work, the sort of feeling that is required to animate somebody to action. It's never been a profession with any esteem, and the field never professionalized in the past 60 years, which is a shame, because we now see the outcome.
And that money sink is causing the layoffs. If AI really helped like it's expected to, you would grab any dev you could so that you could have an army of 100x devs.
Humans are still cheaper for the real costs. Augmented humans are on average a few percentage points better.
That fiber build out will last for decades, a Blackwell GPU, not so much.
No, because no one has any use for those monstrous GPUs outside of ML and some research projects. They can't even be dropped onto the consumer market because a SOHO is not equipped to house devices like that. The best case scenario is that the boards get dismantled and the VRAM gets salvaged for refurbishing. They've built these machines so specialized that they're essentially disposable.
Either way, the moat is about 7 inches wide.
But the secondary market that grew out of it was because once it is in the ground it has a long lifespan and low upkeep costs, this is not the same thing as ultra high power density data centers.
Cooling needs to be balanced with demand, they may not work for even cloud scale type loads without serious issues etc…
Not that it matters, my hometown has an announced DC and it is looking more and more like it is a shill, as do several of the others in the area.