> spot prices of DRAM, used in various applications, nearly tripled in September from a year earlier... improving profitability of non-HBM chips has helped fuel memory chipmakers' share price rally this year, with Samsung's stock up more than 80%, while SK Hynix and Micron shares have soared 170% and 140% respectively... industry is going through a classic shortage that usually lasts a year or two, and TechInsights is forecasting a chip industry downturn in 2027.
Micron has US memory semiconductor fab capacity coming online in 2027 through 2040s, based on $150B new construction.
Are some HBM chips idle due to lack of electrical power? https://www.datacenterdynamics.com/en/news/microsoft-has-ai-...
> Microsoft CEO Satya Nadella has said the company has AI GPUs sitting idle because it doesn’t have enough power to install them.
If the PC supply chain is going to be impacted by memory shortages until 2027, could Windows 10 security support be extended by 24 months to prolong the life of millions of business PCs that cannot run Windows 11?
> Micron has US memory semiconductor fab capacity coming online in 2027 through 2040s, based on $150B new construction.
Yay, the public is on the hook for $150B of loans to be paid by inflationary pricing.
I guess you offered the news hoping prices will fall... In terms of real economic analysis there's a lot to say here, but let me point at only one of the many entry points of the rabbit hole:
"Microsoft CEO says the company doesn't have enough electricity to install all the AI GPUs in its inventory - 'you may actually have a bunch of chips sitting in inventory that I can’t plug in'
Microsoft and all the other AI wannabes are hoarding GPUs, and thus RAM; they hope to sell them back to you for the price of a subscription, which doesn't change the fact of speculative hoarding and trust-like behavior against the public.
The hoarding-and-inflation economy we live in is a weapon against the public. At the moment there's no visible force that isn't laboring diligently on the enhancement of that weapon, so the timeline of change is likely to stretch somewhere between the far future and infinity... just hoping otherwise is futile.
If you pay attention, you won't fail to notice the propaganda push to convince the public to pay higher electric costs in order to pay the capex for new energy plants and transmission lines. In other words, you pay the price, they own the benefits. And if the propaganda fails, they can always use some more general inflation to do the same, as it's being done elsewhere in the economy.
As I said, this is just scratching the surface, there's a lot more which cannot fit in a single comment.
Edit: actually not. The parent comment was edited after mine, to include a link to MS inadvertently admitting to the hoarding of GPUs and RAM.
>Yay, the public is on the hook for $150B of loans to be paid by inflationary pricing.
Where does it say it was funded by $150B of public loans?
>which doesn't change the fact of speculative hoarding
All investment resembles "speculative hoarding". You're pouring money into a project now with the expectation that it'll pay off in decades.
> and trust-like behavior against the public.
???
>If you pay attention, you won't fail to notice the propaganda push to convince the public to pay higher electric costs in order to pay the capex for new energy plants and transmission lines. In other words, you pay the price, they own the benefits. And if the propaganda fails, they can always use some more general inflation to do the same, as it's being done elsewhere in the economy.
Datacenters are actually associated with lower electricity costs in the US.
https://www.economist.com/united-states/2025/10/30/the-data-...
> Where does it say it was funded by $150B of public loans?
let me repeat something you've already quoted
>> the public is on the hook for $150B of loans to be paid by inflationary pricing.
one more time "to be paid by inflationary pricing"
> Datacenters are actually associated with lower electricity costs in the US.
"Associated" means these areas are getting preferential pricing to shift more of the cost to the public. Proves my point.
The actual truth, with numbers, just for 2024 and Virginia alone:
"Mike Jacobs, a senior energy manager at the Union of Concerned Scientists, last month released an analysis estimating that data centers had added billions of dollars to Americans’ electric bills across seven different states in recent years. In Virginia alone, for instance, Jacobs found that household electric bills had subsidized data center transmission costs to the tune of $1.9 billion in 2024."
"Over the last five years, commercial users including data centers and industrial users began drinking more deeply from the grid, with annual growth rising 2.6% and 2.1%, respectively. Meanwhile, residential use only grew by 0.7% annually."
>>> the public is on the hook for $150B of loans to be paid by inflationary pricing.
That framing makes even less sense. Even if we grant that capital spending is inflationary, nobody thinks the public is "on the hook" or pays for it "by inflationary pricing". If I bought a box of eggs, it probably drives up the price of eggs by some minute amount in the aggregate, but nobody would characterize that as the public being "on the hook" for it, or that the public is paying for it "by inflationary pricing". Same if I bought anything else supply constrained, like an apartment or GPU. Seemingly the only difference between those and whatever Micron is doing is that you don't like Micron and/or the AI bubble, whereas you at least tolerate me buying eggs, apartments, or GPUs, so your whole spiel about "paid by inflationary pricing" is just a roundabout way of saying you don't like Micron/AI companies' spending. I also disagree with people dropping $30K on Hermès handbags, but I wouldn't characterize buying them as "the public is on the hook for $30K to be paid by inflationary pricing".
>The actual truth, with numbers, just for 2024 and Virginia alone:
"actual truth"? That settles it, then.
On a more substantive note, since you clearly haven't bothered to look into either article to examine their methodology, here are the relevant snippets for your convenience:
>Mike Jacobs, a senior energy manager at UCS, uncovered these additional costs by analyzing last year’s filings from utilities in seven PJM states and identifying 130 projects that will connect private data centers directly to the high-voltage transmission system. Over 95% of the projects identified passed all of their transmission connection costs onto local people’s electricity bills, totaling $4.3 billion in costs previously undistinguished from other, more typical expenses to upgrade and maintain the electricity grid.
and
>The Economist has adapted a model of state-level retail electricity prices from the Lawrence Berkeley National Laboratory to include data centres (see chart 2). We find no association between the increase in bills from 2019 to 2024 and data-centre additions. The state with the most new data centres, Virginia, saw bills rise by less than the model projected. The same went for Georgia. In fact, the model found that higher growth in electricity demand came alongside lower bills, reflecting the fact that a larger load lets a grid spread its fixed costs across more bill-payers.
>chart 2: https://www.economist.com/content-assets/images/20251101_USC...
Looking at the two different methodologies, The Economist's seems far more reasonable, because the UCS methodology is basically guaranteed to come up with a positive number. It just counts how much money was spent on connecting datacenters and assumes household users are paying the entire bill. It doesn't account for different rates/fees paid by retail/household users, or the possibility that datacenters could be paying more than their "fair share" of costs through other means (e.g. they might require disproportionately less infrastructure to service, but pay the same transmission rates as everyone else).
> If I bought a box of eggs, it probably drives up the price of eggs by some minute amount in the aggregate
"Eggs"? "Eggs" are the same as "apartment or GPU"? You display all of the comprehension abilities of an LLM... or the mainstream economist "teaching" it.
Semiconductor capex is huge compared to, eh, "eggs", and it must be paid off as part of pricing. Artificial jump in demand, as from hoarding, makes the capex artificially high and the pricing inflationary.
Also, hoarding semiconductors (private NPUs like TPUs, Trainiums, etc., stocking up on hard-to-obtain GPUs) reduces competition, and via renting, the respective services can extract the inflated capex plus high profits.
>"Eggs"? "Eggs" are the same as "apartment or GPU"? You display all of the comprehension abilities of an LLM... or the mainstream economist "teaching" it.
How did you miss the part about "anything else supply constrained", which was between "eggs" and "apartments"? Or are you too lazy to look up "egg shortage"? Maybe you should check your own reading comprehension before accusing others of poor reading comprehension.
>Semiconductor capex is huge compared to, eh, "eggs", and it must be paid off as part of pricing.
Yet the aggregate demand of all Americans drives the price of eggs, apartments, or GPUs. Either something drives inflation, or it doesn't. Otherwise you're just critiquing Micron's size rather than its behavior.
>Artificial jump in demand, as from hoarding, makes the capex artificially high and the pricing inflationary.
>Also, hoarding semiconductors (private NPUs like TPUs, Trainiums, etc., stocking up on hard-to-obtain GPUs) reduces competition, and via renting, the respective services can extract the inflated capex plus high profits.
1. There's no alternate universe where, absent "hoarding", everyone would be running LLMs on their own GPUs at home. The frontier models require multiple 128+GB GPUs to run, and most people only do a few minutes of inference a day. There's no way the economics of buying works out (a rough back-of-envelope is sketched after this list).
2. "AI GPUs sitting idle because it doesn’t have enough power to install them" doesn't imply "hoarding", any more than buying PC parts but not building them because the last part hasn't arrived yet isn't "hoarding". It could simply be that they expected that power be available, but due to supply chain issues they aren't. Moreover hoarding GPUs would be a terrible idea, because they're a rapidly depreciating asset. No cloud provider is trying to corner the market on CPUs by buying them up and renting it back to people, for instance, and CPU is a much easier market to corner because the rate of advancement (and thus rate of depreciation) is much lower.
> the propaganda push to convince the public to pay higher electric costs in order to pay the capex for new energy plants and transmission lines.
This makes me so angry.
The private companies told governments they want money and the governments replied "sure we'll just steal it from citizens and lie and sell it as a tax, no problem. We'll just go hard on the net zero excuse lol" ??
Hopefully this will put pressure on the market to produce much more efficient AI models. As opposed to bigger, then bigger, and then even BIGGER models (which is the current trend).
FYI: gpt-oss:120b is better at coding (in benchmarks and my own anecdotal testing) than gpt5-mini. More importantly, it's so much faster too. We need more of this kind of optimization. Note that gpt5-mini is estimated to be around 150 billion parameters.
For what it’s worth, even the Qwen 30B model has its use cases. And as far as some of the better open models go, by now the GLM 4.6 355B model is largely better than the Qwen3 Coder 480B variant, so it seems that the models are getting more efficient across the board.
Who is the "we" in this sentence? The ultra-rich that don't want to pay white collar workers to build software?
The advantages of LLMs are tiny for software engineers (you might be more productive, you don't get paid more) and the downsides are bad to life-altering (you get to review AI slop all day, you lose your job).
> The ultra-rich that don't want to pay white collar workers to build software?
This is already a fact and it's set in stone - making AI cheaper won't change anything in that regard. However, a cheaper AI will allow the laid-off software engineers to use models independently of those firing them, and even compete on an equal footing.
Compete for what exactly? Under the assumption that AI agents will make human software engineering obsolete there won't be a market for you to compete in. Everyone that wants a piece of software will ask their AI agent to create it.
My ability to create software is only really useful to me because other people pay me for it. If AI agents take that from me, it won't matter that I can now create awesome software in minutes instead of weeks. The software was never really the thing that was useful for me.
I hope this AI craze will crash soon enough. Maybe then various things normalize in price again. And consumers get cheaper products with less limitations.
Best we're getting is probably a stop to the price raises, but no price cuts. Kids will continue to grow up not knowing a $600 flagship GPU or a $1000 gaming PC.
Pet peeve: Contrary to a persistent popular belief, inflation != currency debasement.
(You can have inflation while your currency goes up relative to all the others on the FX market, like what happened to the USD in H1 2022, or you can have massive inflation differences between countries sharing the same currency, as happened in the euro area between 2022 and today.)
>Pet peeve: Contrary to a persistent popular belief, inflation != currency debasement.
Not to mention that "debasement" doesn't make sense anymore given that there basically aren't any currencies on the gold standard anymore. At best you could call a pegged currency that was devalued "debased" (with the base being the pegged currency), but that doesn't apply to USD. "Debasement" is therefore just a pejorative way of saying "inflation" or "monetary expansion".
https://en.wikipedia.org/wiki/Zimbabwean_ZiG
>A gold standard is a monetary system in which the standard economic unit of account is based on a fixed quantity of gold.
and
>The Zimbabwe Gold (ZiG; code: ZWG)[3] is the official currency of Zimbabwe since 8 April 2024,[2] backed by US$900 million worth of hard assets: foreign currencies, gold, and other precious metals.
>...
>Although the rate of devaluation of the ZiG may vary,[13] the ZiG has consistently lost value since its introduction, and its long-term prospects are dim so long as large grain imports continue and the government continues to overspend.
sounds like it's not "fixed" at all, and "backed by ... hard assets" just means it has central bank reserves, which most fiat currencies have.
I think it's fair to keep using debasement for the act of letting your currency go down against other currencies on the FX market.
> "inflation" or "monetary expansion".
This is my second pet peeve on the topic: inflation and growth of the money supply are independent phenomena. (They are only correlated in countries with high-inflation regimes and, hyperinflation aside, the causation is essentially reversed: the money supply grows because of the inflation, higher prices leading to an increase in loans.)
The Radeon 8500 from 2001 was even cheaper than that, roughly 300 USD... the voodoo 3 3500 from 1999 was roughly 200 USD... if you ask me, we don't need graphics chips as intense as we have now, but the way they crank them out every year and discontinue the older models ruins most of the value of buying a GPU these days.
It really is a damn shame, but before AI, it was cryptomining. Desktop GPU prices have been inflated to nonsense levels for gamers, to the point where console vs. PC isn't even really a question anymore.
And even with increased prices you often still get a paltry amount of RAM, all for market segmentation due to AI use cases. Which is bad, as requirements have crept up.
Really frustrating for a hobbyist 3D artist. Rendering eats gobs of RAM for complex scenes. I'd really love a mid-level GPU with lots of VRAM for under $500. As is, I'm stuck rendering on CPU at a tenth the speed or making it work with compositing.
3D rendering can use multiple GPUs, right? Maybe pick up a couple of MI50 32GB cards off Alibaba. A couple of months ago they were $100 each, but it looks like they're up to ~$160 now.
In some ways though, visual fidelity has improved only _marginally_ on a per-year basis since the PS4/Xbone era. My GPUs have had much, much longer useful lives than in the 90s/early 2000s.
If you stay off of the upgrade treadmill, you can game with a pretty dated card at this point. Sure, you cannot turn on all of the shines, but thanks to consoles, a playable build is quite attainable.
For $50, kids these days can buy a Raspberry Pi that would have run circles around the best PC money could buy when I was a kid.
Or, for $300, you can buy an RTX 5060 that is better than the best GPU from just 6 years ago. It's even faster than the top supercomputer in the world in 2003, one that cost $500 million to build.
I find it hard to pity kids who can't afford the absolute latest and greatest when stuff that would have absolutely blown my mind as a kid is available for cheap.
> Or, for $300, you can buy an RTX 5060 that is better than the best GPU from just 6 years ago. It's even faster than the top supercomputer in the world in 2003, one that cost $500 million to build.
RTX 5060 is slower than the RTX 2080 Ti, released September 2018. Digital Foundry found it to be 4% slower in 1080p, 13% slower in 1440p: https://www.youtube.com/watch?v=57Ob40dZ3JU
Depends whether or not there's a big bubble burst that involves bankruptcies and Big Tech massively downscaling their cloud computing divisions. Most likely they'll just end up repurposing the compute and lowering cloud rates to attract new enterprise customers, but if you see outright fire sales from bankruptcies and liquidations, people will be able to pick up computer hardware at fire sale prices.
That's exactly how it works. A whole generation is already unaware that you used to be able to buy PC games anonymously, offline, without a rent seeking middleman service.
I think there's always been a rent-seeking middleman service. In the 80s it was retail: you'd go to a physical computer store to buy a game for $50 (note: that's $150 inflation-adjusted, more expensive than most games today), and the retail store, the distributor, and the publisher would all take a cut. In the 2000s it was the developer's ISP, web developer, and credit card payment processor, which were non-trivial in the days before Wix and Stripe.
The shareware/unlock-code economy of the 90s was probably the closest you'd get to cutting out the middlemen, where you could download from some BBS or FTP server without the dev getting involved at all and then send them money to have them email you an unlock code, but it was a lot of manual work on the developer's part, and a lot of trust.
Stripe is way more expensive than regular payment processors. Convenient for sure, but definitely not cheap.
Is that really the cause of this price increase? I still don't understand if this price surge is specifically for the US (https://news.ycombinator.com/item?id=45812691) or if it's worldwide, I'm not sure I notice anything here in Southern Europe, so either that means it's lagging and I should load up RAM today, or this is indeed US-specific. But I don't know what's true.
> that means it's lagging and I should load up RAM today, or this is indeed US-specific. But I don't know what's true.
This is a global issue that is most severe in the US due to its share of hyperscalers and their, uh, scale. You may not feel the effects yet, but it is a matter of time until someone notices your market has a RAM glut while 30-55% of their orders aren't being fulfilled.
In all likelihood, the supply channels to your locality have a low turnover rate, and DRAM has a long shelf life. If prices stay high for long, it's going to impact prices when your retailers try to restock. If the price shock ends soon, your retailer may not even notice it. Whether you ought to buy or not depends on your outlook on how things will shake out.
In the US some of it could be tariffs. Micron is a US company with some US fabs, but most of its fabs are in other countries, and Samsung and SK Hynix are both South Korean.
U.S. tariffs inadvertently kept prices low, due to stockpiling of memory when prices were cheap, before tariffs took effect. As that inventory is depleted, new supply chain purchases are much more expensive and subject to tariffs.
I did (https://news.ycombinator.com/item?id=45812691); the RAM I bought in March 2024 currently costs about the same as when I bought it, so it seems the price stagnated rather than increased for that specific example.
Do you have some concrete examples of where I can look?
https://pcpartpicker.com/trends/price/memory/
Are those graphs specifically for the US? When I change the country in the top right, it doesn't seem like the graphs are changing, and considering they're in USD, I'm assuming it's US-only?
Is the same doubling happening world-wide or is this US-specific, I guess is my question?
Edit: one data point, I last bought 128GB of RAM in March 2024 for ~€536; similar ones right now cost ~€500, but maybe the time range is too long.
They are US-specific, yes. Thanks for asking that - I'll look into updating those graphs to show for the appropriate region/country depending on what country you've selected (on the top right of the page).
I'm not finding any way of figuring out if that's true or not. I live near the second-largest city in Spain, and I kind of feel like people probably buy as much RAM here as elsewhere in the country/continent, but maybe that's incorrect. I've tried searching for graphs/statistics for the last 1-2 years about it in Spain but am not having much success.
I can add Spain price trends to PCPartPicker. Quick question though - do you want the price trends to cover just Spanish retailers, or should it trend the prices across all of the EU?
That would be incredible! Personally, the stuff I can find inside the country, I buy inside the country. But then some stuff I have to order from Germany/France/Italy when it's only available outside our borders.
So I don't know the right approach here, I can see value for both price trends for multiple reasons, unfortunately :) Wish I could give a simpler answer!
In the UK I was looking at DDR4-3200 SODIMM last week for some mini-pcs... and decided to pass after looking at the price graphs. It's spiked in the last few weeks.
536 € seems expensive for March 2024, but either way, the price dropped a lot over the last one and a half years, only to surge in the last two months.
I was able to get a bundle deal from Microcenter here in SoCal with the Ryzen 9950x, motherboard and 32GB of RAM for $699. They have since removed the RAM from all the bundles.
While that's a sweet upgrade for people with an older desktop that can support a motherboard swap, it's worthwhile to point out the RAM is probably insufficient.
RAM usage for a lot of workloads scales with core/thread count, and my general rule of thumb is that 1G/thread is not enough, 2G/thread will mostly work, and 4G/thread is probably too much, but your disk cache will be happy. Also, the same applies to VMs, so if you're hosting a VM and give it 16 threads, you probably want at least 16G for the VM. The 4G/thread then starts to look pretty reasonable.
Just building a lot of open-source projects with `make -j32`, you're going to be swapping if you only have 1G/thread. This rule then becomes super noticeable when you're on a machine with 512G of RAM and 300+ threads, because your builds will OOM.
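A minimal sketch of that rule of thumb, assuming Linux (`/proc/meminfo`) and the ~2G-per-job figure above; the per-job number is a rule of thumb, not a measurement:

```python
# Pick a `make -j` value capped by available RAM, assuming ~2 GB per build job
# (the "mostly works" figure above). Linux-only: reads /proc/meminfo.
import os

KB_PER_GB = 1024 ** 2   # /proc/meminfo reports kB
RAM_PER_JOB_GB = 2      # assumed per-job memory budget

def available_ram_gb() -> float:
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1]) / KB_PER_GB
    raise RuntimeError("MemAvailable not found in /proc/meminfo")

def safe_job_count() -> int:
    threads = os.cpu_count() or 1
    by_ram = int(available_ram_gb() // RAM_PER_JOB_GB)
    return max(1, min(threads, by_ram))

if __name__ == "__main__":
    print(f"make -j{safe_job_count()}")
```

On a 32-thread box with ~32G free this caps the job count at roughly 16 rather than 32, which is about where the swapping described above would otherwise start.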
Even used memory has doubled in price. I was thinking of putting together a high-memory box for a side project, and reddit posts from a year ago all have memory at 1/2 to 1/3 of current ebay prices for the same part.
I've just built a gaming PC (after more than a decade without one), for curiosity's sake I just compared the prices I paid for DDR5 2 months ago to now, and at my location it already shows a 25-30% increase. Bonkers...
I think that's nearly exactly what I paid for 2x32GB at a retail store last week. I hadn't bought RAM in over a decade so I didn't think anything of it. Wish my emergency PC replacement had occurred a year earlier!
I've been selling unused DDR4 on eBay. It's not as profitable as one would think, tbh, even with elevated demand. I'm only making a profit on the ones I initially acquired second-hand.
I think people who bought a high-end Nvidia graphics card in the max memory config pre-AI hype would make a very decent deal on eBay. DRAM is yet to come.
Not just server memory, desktop memory has gone up for the same reason... it's all going to AI. Forget building a new gaming pc, or buying a laptop, or even an arm SBC, because the supply is just gone.
These price hikes do fun things to the whole market.
In one of the last GPU booms I sold some ancient video card (recovered from a PC they were literally going to put in the trash) for $50.
And it wasn’t because it was special for running vintage games. The people that usually went for 1st-rate GPUs went to 2nd-rate, pushing the 2nd-rate buyers to 3rd-rate, creating a market for my 4th-rate GPU.
https://www.tomshardware.com/pc-components/dram/openais-star...
> South Korean SK Hynix has exhausted all of its chip production for next year and plans to significantly increase investment, anticipating a prolonged "super cycle" of chips, spurred by the boom of artificial intelligence, it said on Wednesday after reporting a record quarterly profit.
https://en.ilsole24ore.com/art/korean-chip-race-sk-hynix-has...
> Adata chairman says AI datacenters are gobbling up hard drives, SSDs, and DRAM alike — insatiable upstream demand could soon lead to consumer shortages
https://www.tomshardware.com/tech-industry/big-tech/adata-ch...
Less RAM? Not really. Less hardware? Yes. That machine is already in the wild, so why not let it have a proper AI accelerator + increase the memory the machine has - also the data transfer problem is solved since it's on that machine.
In the end I just need more available PCIe lanes (so I can chuck more disks in there) and ideally PCIe Gen 5, otherwise I don't have much reason to upgrade.
Manufacturers learned a valuable lesson a few years ago: overproduction leads to lower prices. Samsung was the first to address this issue by scaling back, and other manufacturers soon followed suit (collusion, cough cough). The past couple of years have been extremely profitable for the entire industry, and they’re not about to increase production and risk hurting their profits.
I suspect they would rather face shortages than satisfy market demand.
lower prices are ok if they are selling more units, the question is whether the price point * units is Pareto optimal
overproduction means unsold units, which is very bad: you pay a cost for every unsold unit
underproduction means internal processes are strained and customers are angry, but a higher price per unit... can you increase the price by more than you are underproducing?
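As a toy illustration of that question, with entirely made-up numbers (a linear demand curve, a unit cost, and a write-off per unsold unit; not a model of the actual DRAM market):

```python
# Toy trade-off: under-produce at a higher price vs. produce to demand vs. over-produce.
# All numbers are invented for illustration only.

def units_demanded(price: float) -> int:
    # Made-up inverse demand curve: at price p the market buys 200 - 10*p units.
    return max(0, int(200 - 10 * price))

def profit(price: float, produced: int, unit_cost: float = 6.0,
           writeoff: float = 2.0) -> float:
    sold = min(produced, units_demanded(price))
    unsold = produced - sold
    return price * sold - unit_cost * produced - writeoff * unsold

for price, produced in [(14.0, 60), (12.0, 80), (10.0, 120)]:
    print(f"price={price:>5} produced={produced:>3} profit={profit(price, produced):7.1f}")
```

With this made-up, fairly inelastic demand curve, underproducing at a higher price does at least as well as producing to demand, while overproduction loses money to write-offs; a more elastic demand curve would flip that result.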
If we want to engage with game theory here, I would argue that overproduction is a much safer bet than underproduction from the perspective of Samsung, et. al. Underproduction brings additional caveats that manifest as existential risks. For example, encouraging your customers to move to entirely different technologies or paradigms that completely obviate the need for your product in the first place. If you leave a big, expensive constraint in place for long enough, people will eventually find paths around it.
I think the Nintendo ecosystem has been a pretty good example of where intentional underproduction can backfire. Another example might be that migration to SSD was likely accelerated by (forced) underproduction of spinning disks in 2011. We use SSDs for a lot of things that traditional magnetic media would be better at simply because the supply has been so overpowering for so long.
You can train your customers to stick with you by bathing them in product availability. Overproduction can be a good thing. Inventory can be a good thing. We've allowed a certain management class to terrorize us into believing this stuff is always bad.
> I suspect they would rather face shortages than satisfy market demand.
Doubtful. A shortage is normally a scary prospect for a vendor. It means that buyers want to pay more, but something is getting in the way of the seller accepting that higher price. Satisfying market demand is the only way to maximize profitability.
Why do you think companies would prefer to make less profit here?
Because if you make too much profit, you get regulated by government.
Either way, without competition expect it to increase further.
It's not the 1980s anymore. If you make too much profit nowadays you pull a John Deere and start crying to government that your customers aren't profitable enough (because you siphoned off all of their profit) and need a bailout so that they can pay even more for your product in the future.
It's intense market demand by people with lots of money against products that have a very long supply chain. Even with multiple sellers competing, this kind of demand is insane, and the buyers' pockets run deep.
The other way I look at this is that these companies have been collecting an insane amount of wealth and value over the last 2-3 decades, are finally in a situation where they feel threatened, and are willing to spend to survive. They have previously never felt this existential threat before. It's basically bidding wars on houses in San Francisco, but with all the wealthiest companies in the world.