Hacker News | johnvanommen's comments

> A lot of them were making low 6 figures 10-15 years ago, and now many of them have no hope of making that much in their careers again because companies have vastly reduced the number of those roles.

I moved to the Seattle area during the dotcom boom.

Within 18 months I was unemployed.

There was DEFINITELY a feeling, like the whole “internet” thing might have been a bubble. I helped a friend move to Pleasanton CA and there were so many empty office buildings, it looked like a zombie movie.

But it all came back, and more.


I'm not at all saying that tech is somehow "dead". What I am saying is that many of the non-engineering roles that were popular in tech over the past 15-20 years are either going away or will continue with vastly reduced numbers of people. If you're someone who did agile training for the past 15 years and are now in your 40s and were laid off, I sincerely doubt you'll be able to get a job as an agile coach before the end of your career that pays as much as in the 2015-2020ish timespan.

The number of tech employees worldwide in 2026 is probably 2x or more compared to 2001. Maybe even 3x. Things are not the same.

also add in that the single largest source of tech growth in the last 3 years has been building tools that will obliterate the need for tech employees.

replacing SRE-4s with AI is the point


> To me, this is the purpose of the creative journey. Knowing yourself better, and enjoying all of the steps involved in arriving at what is always a surprising destination.

That's EXACTLY how I used to feel about creativity. I was an art major who didn't make it, and I found that expressing myself via my hobbies was good for the soul.

Then I almost died and completely lost interest in making art!

Facing my own mortality, I realized that the time I invest into my wife, kids and family will have a larger positive contribution on the world, I think.

I know that sounds like a Hallmark Card.

At the same time, I've often wondered what my life would look like if I appreciated my family MORE and my hobbies LESS when I was younger.


I can relate here. I have a son, who is now 3.5 years old. I haven't had the time or energy to produce any "finished work" since he was born, and that marks a lull after 25 years of steady output. I don't feel sad or disappointed about this in the slightest. As my wife likes to say, "it's the season we're in." That said, I do really enjoy chaotic jam sessions with my son, as he's very interested in banging on his little drum set, so in some ways, it's just a new beginning. There's no better investment than time with our children.

I feel a part of this is that in any creative endeavor, you can never exactly capture what you want, so you have to leave something out. Those who try to get it perfect never finish.

Nothing wrong with prioritizing family over art, that's pretty rad! But occasionally you can still do art, just don't be too serious about it. All my paintings are objectively rubbish, but heck, I like them and didn't put a huge amount of time into them.


> Nothing wrong with prioritizing family over art, that's pretty rad! But occasionally you can still do art, just don't be too serious about it. All my paintings are objectively rubbish, but heck, I like them and didn't put a huge amount of time into them.

That's basically where I landed. The idea being that making art is something I should do if I'm just trying to relax. Once the hobby starts looking like a second job, I know it's too much.


Not a near-death experience, but similarish. I'm trans and from a conservative religious family, so I planned to cut them off and eventually did.

Throughout my teens and young adulthood I immersed myself really deep into drawing and writing. But as my own life has started to form around me, I got a partner who I might have kids with, and friends I care about. I've slowly come around to your POV too, and I'm wishing I had spent less time doing art in the past.


That is a heck of a graph


> The US media has completely fooled the public into thinking their town is a violent hellhole, and that a trip to the grocery store is endangering their lives.

I used to live in both Seattle and Portland.

I took my family to Portland last year and wanted to show them the Ground Kontrol Arcade.

Before I even parked the rental car, some vagrant on a BMX bike threatened to murder us.


Right? It’s the obvious question:

why are they using software to detect software?

I can detect AI written prose in less than five seconds; I would expect a trained teacher to be able to do that as well.


You know you can't just say "I detect AI written prose" and then do whatever you want about it, right? It's not difficult, sure, to detect it. It's difficult to prove that it's true and then punish the student for it.


It's very easy to convince yourself it's true and then hand out punishment like it was on sale.


I suppose the hard part then is sitting through 26 grade appeals.


> How many hospitals, roads, houses, machine shops, biomanufacturing facilities, parks, forests, laboratories, etc. could we build

“We?”

This isn’t “our” money.

If you buy shares, you get a voice.


> That implies combined hyperscaler cloud and AI revenue going from: $330B today to $1.2T within 3 years :-))

You’re ignoring the fact that gaming is going to the cloud.

That industry is bigger than Hollywood.

Desktop computers will invariably follow.

The RAM shortage will drive the transition.

For instance, my wife uses her personal laptop about four days a year.

People like that won’t be buying personal desktops or laptops, five years from now. The RAM shortage will drive a transition into thin clients.

I already see it with our kids. They use an iPhone, unless they need to type. Then they use an iPad with a BT keyboard.


The RAM shortage is extremely temporary. It’ll last as long as it takes for new capacity to come online. RAM shortages and price spikes have happened many times before.

Eventually China will catch up in EUV fabrication and flood the market with cheap silicon. When that happens a terabyte of RAM will cost what 128gb costs now.


Cloud gaming is crap and any actual gamer will tell you that. The niche of gamers casual enough to not care about playing over network latency but serious enough to pay real money for cloud gaming is microscopic.


>The niche of gamers casual enough to not care about playing over network latency

In the saddest way possible, the niche of gamers are people playing on desktops with ethernet connections.

The majority of gamers are buying booster packs on mobile games.


Yes, but that majority doesn't need cloud gaming precisely because those games run just fine on their phone - there's no benefit in putting them in the cloud, that was supposed to be for fancy stuff where you need a beefy GPU for the eye candy.


It's not 2023 anymore. Have you tried cloud gaming in 2026? I can barely tell it's connected to the cloud.


Yes, it's amazing because it's streaming directly from a computer in the room behind me. :)


And the increases in network speed are one of the last bastions of Moore's Law.


> And the increases in network speed are one of the last bastions of Moore's Law.

Throughput has increased but latency hasn’t changed much

Latency hasn’t decreased substantially since the late 90s when I remember getting sub 50 ms ping in Quake III from my dorm room in college


Speed of light doesn't adhere to Moore's Law :) and it's made worse by the fact that most everyone connects via WiFi these days, which alone adds a few ms more.


I'm not surprised; you need a lot more servers, and even so, there are a lot of places where getting low ping times is difficult. While there is a lot of room for latency to go down, 1 light-millisecond is ~300 km (~186 mi). This means that if a computer is 150 km away, 1 ms is the minimum round-trip ping allowed by physics, if I am talking directly to it.

By that yardstick, we've actually done very well in a lot of cases. :)
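The physics bound above is easy to sketch. A minimal illustration (the fiber slowdown factor of roughly 0.68x vacuum speed is an assumed typical value, not a figure from the thread):

```python
# Lower bound on round-trip latency imposed by the speed of light.
# FIBER_FACTOR is an assumed typical value (~2/3 c in glass fiber).

C_VACUUM_KM_PER_MS = 299.792  # km traveled per millisecond in vacuum
FIBER_FACTOR = 0.68           # approximate fraction of c in optical fiber

def min_rtt_ms(distance_km: float, in_fiber: bool = True) -> float:
    """Physics lower bound on round-trip time to a server distance_km away."""
    speed = C_VACUUM_KM_PER_MS * (FIBER_FACTOR if in_fiber else 1.0)
    return 2 * distance_km / speed  # there and back

for km in (150, 1500, 5000):
    print(f"{km:>5} km: >= {min_rtt_ms(km):.1f} ms RTT in fiber")
```

Real pings add queuing, routing, and last-mile delays on top of this floor, which is why the sub-50 ms dorm-room Quake pings of the late 90s were already close to what physics allows for nearby servers.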


Even if gaming goes to the cloud, how are they going to run the massive existing library of video games on the dedicated AI inference hardware that everyone is buying right now? Seems like that pivot would require even more spending.


And how are they going to get sub-5ms round trip latency into the average consumer’s home to avoid people continuing to see cloud gaming as a janky gimmick that feels bad to use?


What amount of the gaming industry do you think will go to AI providers and not game developers?

You think we'll replace gaming and desktop computers into the cloud in the timeline of the poster above (2-4 years?)

Just not realistic.


That may be true, but all of this can be done today without the massive capex and without “AI”.


Or we’re seeing a world where corporations dwarf countries.

Apple will be around in a hundred years.

Will the USA?


Tech companies never last. Apple will miss a disruptive innovation or make a key strategic error causing them to lose their dominant spot. Look at the top tech companies 50 years ago: how are they doing today?


It's like the transition from monarchies to nation-states.

By the 19th century, the rise of nation-states accelerated due to the spread of nationalism, the decline of feudal structures, and the unification of countries like Germany (1871) and Italy (1861). Centralized governments, uniform laws, national education systems, and a sense of collective identity became defining features. The French Revolution (1789) played a pivotal role by promoting citizenship, legal equality, and national sovereignty over dynastic rule.

Maybe in 2300 they'll say something similar about nationalism


Yes!

The world is changing fast.


There is exactly 0.00% chance Apple will be around in 50 years let alone 100.


exactly.


> the baseline of open models running on cheap third-party inference providers, or even on-prem. This is a bit of a challenge for the big proprietary firms.

It’s not a challenge at all.

To win, all you need is to starve your competitors of RAM.

RAM is the lifeblood of AI, without RAM, AI doesn’t work.


Assuming high-bandwidth flash (HBF) works out, RAM requirements should be drastically reduced, as you'd keep the weights in much higher-capacity flash.

> Sample HBF modules are expected in the second half of 2026, with the first AI inference hardware integrating the tech anticipated in early 2027.

https://www.tomshardware.com/tech-industry/sandisk-and-sk-hy...
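To see why weight storage dominates the RAM question, a back-of-envelope sketch helps. The model sizes and precisions below are hypothetical examples for illustration, not figures from the thread:

```python
# Rough memory footprint of model weights at common precisions.
# Model sizes here are illustrative, not from the linked article.

BYTES_PER_PARAM = {"fp16": 2, "int8": 1, "int4": 0.5}

def weights_gb(params_billions: float, dtype: str) -> float:
    """Approximate weight storage in GB for a model of the given size."""
    return params_billions * 1e9 * BYTES_PER_PARAM[dtype] / 1e9

for n in (7, 70, 405):
    sizes = ", ".join(f"{d}: {weights_gb(n, d):.0f} GB" for d in BYTES_PER_PARAM)
    print(f"{n}B params -> {sizes}")
```

A 70B-parameter model at fp16 needs on the order of 140 GB just for weights, which is why moving them out of scarce HBM/DRAM into high-capacity flash is attractive, if the bandwidth holds up.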


How does HBF compare to the discontinued 3D XPoint?


HBF is NAND and integrated in-package like HBM. 3D XPoint or Optane would be extremely valuable today as part of the overall system architecture, but they were power-intensive enough that this particular use probably wouldn't be feasible.

(Though maybe it ends up being better if you're doing lots of random tiny 4k reads. It's hard to tell because the technology is discontinued as GP said, whereas NAND has kept progressing.)

