TSMC sees chip shortage lasting into 2022 (reuters.com)
375 points by tosh on April 15, 2021 | hide | past | favorite | 234 comments


I found that the chips for a project I'm working on are no longer available, and I was quoted a lead time of 13 months, with no guarantee. It essentially means that I have to redesign the project around a different part, which may take me a few months, and I have no guarantee the other chip I choose will be available either. The strange thing, however, is that I can see those chips available in their thousands on AliExpress and similar sites for 10x the price. I also read on forums that Chinese entrepreneurs buy up all the chips and stockpile them. Is this some new kind of war?


Those AliExpress chips may not be genuine. Buying from unauthorized distributors has always been risky, and some companies (especially Chinese ones) have been known to stamp blank or generic parts with whatever part number you're looking for. The really tricky vendors will put a few (10-50) genuine parts at the beginning and end of a reel (to pass validation testing), with defective or fake ones in between. Non-genuine parts may also be lower-spec versions of what they are labelled as (like a slower microcontroller or a higher-offset op-amp).


For a fun set of reads on this ("label the chips as whatever the buyer wants, regardless of what they actually are") from a decade ago: Sparkfun got in some "fake" ATMega 328p chips that were, in fact, nothing remotely resembling an ATMega 328p.

In the words of someone after they'd puzzled out the puzzle, "...so it looks like that die in the picture is pre-release engineering material. Where the hell did you find that?"

https://www.sparkfun.com/news/350

https://www.sparkfun.com/news/364

https://www.sparkfun.com/news/384

https://www.sparkfun.com/news/395


I'm guessing it was a sample run, and the company realized there was a bug in the silicon and told the fab to destroy the rest of the samples. Instead, someone at the fab set them aside for a future day when they'd need some raw material to scam people with.

I find it strange that whoever did this kept the correct date code on the packaging. Presumably they didn't label them as ATMega 328s until the order came in; it would have been trivial to change the date code to something reasonable for that chip.


A proper scale should be able to detect weak effort fraud like this.


A proper scale can detect all sorts of weak effort fraud.

Years back, I talked to someone who had done some analysis on some of the "fake silver/gold" bars floating around places (I think they were common on Silk Road for a while?). Apparently some of the people faking metal bars didn't actually bother learning what a "Troy Ounce" was.

There was a "10 oz" bar that was 10 ounces - 283.5 grams. Except, in metals, "oz" means Troy Ounce - so it should have been 311 grams. You could literally feel that it was light if you were used to dealing with metals.
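The sanity check is simple arithmetic; as a sketch (the conversion factors below are the standard definitions, not figures from the thread):

```python
# Standard definitions: the everyday (avoirdupois) ounce vs. the troy ounce
# used for precious metals.
GRAMS_PER_AVOIRDUPOIS_OZ = 28.349523125
GRAMS_PER_TROY_OZ = 31.1034768

def expected_bar_weight(ounces, troy=True):
    """Expected weight in grams of a metal bar labeled in ounces."""
    factor = GRAMS_PER_TROY_OZ if troy else GRAMS_PER_AVOIRDUPOIS_OZ
    return ounces * factor

genuine = expected_bar_weight(10)             # ~311.0 g (troy)
fake = expected_bar_weight(10, troy=False)    # ~283.5 g (avoirdupois)
print(f"genuine: {genuine:.1f} g, fake: {fake:.1f} g, light by {genuine - fake:.1f} g")
```

A 27.5 g shortfall on a 10 oz bar is nearly 9%, which is why it's noticeable in the hand if you handle bullion regularly.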


I suspect it wasn’t really trying to defraud people who know what a Troy Ounce is. I’d find it believable if people thought fake bars at 311g were frauds for not weighing 10oz.


This was a great read, thanks!


That's super interesting, thank you!


It's a gray market. Sometimes - and only sometimes - these chips are real, and they usually come from a "leak" in the supply chain: unauthorized resales by small vendors, or "unwanted" surplus from a previous order or production run. Usually they exist only in small quantities and are stored and handled in unknown conditions, so it's not something you want to use in production. But it explains why the AliExpress vendors always have some components when major distributors are out of stock. (You can do it too - just order all the remaining stock of a rare chip from Digikey and put it on eBay...)

For example, I heard that a common practice in Shenzhen is selling reels of genuine components with the "lot number" (serial number) removed from the labels, so the manufacturers cannot trace these unauthorized resales back to the source. Another example: in a personal hobby project I needed a specialized controller that is not available on the public market, but an AliExpress vendor had them - likely surplus from somewhere.

> I can see those chips available in their thousands on aliexpress and similar sites for 10x the price

The quantity is not always real; sometimes they just put a random number there. And even when it's accurate, sales occur via various online and offline channels simultaneously.

---

Disclaimer: it's extremely risky to order components from an unknown source, and I do not endorse doing it. My hypothesis on their origins is pure speculation, so take it with a grain of salt.


Random story... I bought some resistor kits from AliExpress. I noticed the few I used were super accurate, so I tested a bunch. They were all accurate to - not sure exactly, but let's say - 0.1% or so. I then measured a bunch of 1% ones from Digikey, and they were around 0.5 to 1% off.

I guess they were tight-tolerance ones that made it into an ultra-cheap SMD kit.


It's what happens when you buy random components, occasionally you get lucky ;-)

There's a saying I've heard before: how many 1% resistors can you find in a bag of 5% resistors? None - the 1% resistors were already sorted out and sold as such. I'm not sure how true it is, but if it is, perhaps a bunch of higher-precision resistors made their way into a resistor kit not because of random chance, but because they were surplus...


Resistors are so dirt cheap nowadays it's not generally cost effective to make 1% resistors by selecting from a process which makes 5% ones. It's much cheaper to just tune the process so you always get 1% without needing to do the selection. And when you get to higher precisions you have a trimmable design where you adjust the resistance with a laser to get it in spec.


I like to think we've gotten so good at making something as basic as a resistor that established manufacturers are so consistent there's practically zero tolerance for error.


This is why buying chips off Amazon isn't always a great idea.


It's not war, just a good way to make a lot of money right now. You buy cheap and sell absurdly high (10-20x the price). People still buy because they have no other option.


In a way, having a 1-2 year manufacturing plan can be like pre-selling many still-discrete still-not-in-hand components in advance. In other words, short-selling.

Your BOM can be 1,000 vulnerabilities to a focused short squeeze.

I suppose this is a predictable and reportable business risk.


> I also read on forums that Chinese entrepreneurs buy all the chips and stockpile them. Is this some new kind of war?

I suspect it isn't just the Chinese speculating right now.

I do suspect that the Chinese are probably the only place in the world that had rolls of random semiconductors tucked away in various factories not being used. So, some of these are genuine--most are probably fake. Caveat emptor.

However, I am totally down with everybody who depends on Always Late Inventory(tm) getting a rude awakening. The problem is that the lesson won't stick.


It's democracy when it works in our favor, and war when it works against us.

People in Bolivia and Peru struggle to afford local food (e.g., Quinoa) because they're outbid (by money, but certainly not necessity).


The quinoa story isn't quite that simple. For a time, it was actually beneficial to the traditional growers: https://nacla.org/news/2018/03/12/quinoa-boom-goes-bust-ande...


That's the funny thing about democracy: It mostly works in favor of those with capital.


High tech version of "toilet paper hoarding"


This is multi-pronged. Chip manufacturing output isn't flexible, and in the last year we've had:

* COVID stay-at-home orders

* Every kid wanting to be a streamer

* E-sports becoming a real sport

* 2 new console releases

* Amazing new CPUs from AMD

* Amazing new graphics cards from AMD/Nvidia

* Another crypto boom/bubble (depending on who you ask)

All these factors have put pressure on chip production.


The difference here is that people are used to not being able to buy chips at Costco/Walmart, so you might have a chance at selling your stock.


China definitely is stockpiling more chips, I think. The Huawei situation really brought home to China how the US or others could basically cut them off, and they are doing a lot of things to reduce that potential impact (going all out on chip manufacturing, chip orders, etc.), so having stock may benefit them in that environment.


Not stockpiling - a lot of people just do actual hoarding and speculation.

Buying out all stock of some rare MCU - of which there are probably only a few million units on the market at any given time at $0.10 - just to resell for a few dollars later is quite real, given the unprecedented squeeze.

How much would a car company pay for a rare, out of stock chip which is the only thing missing in a car worth $100k?

There is murmur on Chinese BBSes frequented by industry insiders about distributors intentionally either inflating their listed stock counts or understating them to play the price.


So Martin Shkreli, but instead of pharma, chips.

And designing products around perceived availability that doesn’t actually exist.


Just-in-time manufacturing is predicated on the idea that whatever you need is available immediately for a fair price.

I think we'll see more of this in the future, where a company's only purpose may be to hedge bets on what will be unavailable in the near future for "a fair price".


A "market-maker", in the form of a financial firm that just skims pennies off, but for chip supply, instead of other financial instruments :)


I wouldn't be surprised if it ends up being like buying power "in bulk" where you pay a little more but there's a guarantee that when "the chips are down" (sorry, couldn't resist), the company will eat the difference.

I do think the market won't use these suppliers much - just like the US-made masks, which got millions of orders, but only after China stopped shipping.


After China restricted other countries' companies from entering its own market, the only fair thing to do is to restrict China's access to other markets.

I'm not a Trump supporter, but he did the right thing on this front.


It's a totally fair game.

> I'm not a trump supporter but he did the right thing on this front.

It's a shame you have to admit that. I'm a liberal and think we should be putting a hard squeeze on unilateral trade. This policy makes sense for the US and other countries regardless of your party.

You can see that the US is starting to ramp up hard. With international navies now sailing into the South China Sea and a deafening rise of anti-CCP news and (admittedly) some propaganda, player two has finally entered the game.

I live near an air force base and over the last few weeks have seen (and definitely heard!) an almost daily fighter jet exercise. This hasn't happened in years.

China is going to be in a very tough spot soon.


>It's a shame you have to admit that. I'm a liberal and think we should be putting a hard squeeze on unilateral trade. This policy makes sense for the US and other countries regardless of your party.

People like to join teams. So if I have an opinion that's not part of the "team" or part of the "other" team then people like to attack those opinions. That's why I make sure I say something along the lines of "Hey, I'm not on the other team, but I have a different opinion."


This "join teams" thing is a constant frustration.

If there is a problem, the best solution likely frustrates either team, at least a little.

And so time is wasted in bullshittery rather than getting on with the job.


> I live near an air force base and over the last few weeks have seen (and definitely heard!) an almost daily fighter jet exercise. This hasn't happened in years.

Funny you should mention that... I live in the UK, and a few weeks ago there were military manoeuvres near my house. I've been here 18 years and this is the first time that's happened.

Happened across the country too.


When the trading stops, the bloodshed starts and bodies flop.


I had the same thing: 18k available on Digikey. I even still had the browser window open, and the next day 0 available. I found Arrow had 26 and everyone else zero. I managed to get enough for testing, but will have to redesign now - but how do I redesign around a chip I know will stay available? I guess I need to buy a reel before designing. This sucks, tbh.


I wonder if OEMs are resorting to buying stock off digikey instead of from the manufacturer because they’re not manufacturing (enough).


Heh. Let me tell you about how I tried to buy a new Nvidia RTX 3090...


I really don't understand why Nvidia doesn't just give every buyer a test to see their knowledge of either ML or gaming skill before allowing purchase of a GPU.

Either

(a) submit a history of gaming high scores with dates

(b) a Github repo with Tensorflow or Pytorch code that has commits from at least a year or two ago

(c) play a hard in-browser game to show your skill


What is it that I do not get about this joke?


That wasn't intended to be a joke. It might sound comical, but it would solve the problem, and it would probably be the first solution an 8-year-old would think of.


I don't think mentioning that an 8-year-old would think of this idea actually helps your argument in this case.


Which points to the real answer to the question: Nvidia doesn't want it solved. Not just that they don't care - they're making more money now, so they don't want it solved.


Not really - they're selling out and making the same amount of money, because the crypto nuts buying and hoarding them are still paying list price. And they are not making use of 90% of the features available on the cards.


Are gamer or ML engineer dollars worth more than those of the general public?


What about resale? Would you forbid that too? How would they enforce it?

Buying a 3090 is not a basic need.


Or they could just, you know, increase the price until quantity demanded = quantity supplied.


(c) could be cheated away, and (a) and (b)... I don't know, I guess if NVDA does this my best option is to sell my ML toy project repos to gamers :P


That is just a horrible suggestion... Doesn't matter what rules or what level of enforcement you take to try to control prices/allocate resources... people will find a loophole to get around your authoritarian restrictions, and rightly so.


You mean like the Verified Actual Gamer program Linus Tech Tips is doing?


Old-new stock and "unofficial" chips and parts have been a thing forever. Now I hear of more "mainstream" projects having to tap into that market, but it has always been there.


My boss at an internship bought like 20 NE2000 NICs for cheap somewhere in the late 1990s. We were confused when we would plug a second one in and it all went to heck, no matter which 2 NICs we used.

We realized they all had the same MAC address. So we just overrode the MAC in the OS and got them to work.
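For the curious, here's a minimal sketch of generating a replacement MAC to override a cloned one in software (on Linux you'd then apply it with something like `ip link set dev eth0 address ...`; the helper below is hypothetical, not any driver's API):

```python
import random

def random_local_mac():
    """Generate a random locally administered, unicast MAC address.

    Setting bit 1 of the first octet marks the address as locally
    administered (the right category for software overrides);
    clearing bit 0 keeps it unicast.
    """
    octets = [random.randint(0, 255) for _ in range(6)]
    octets[0] = (octets[0] | 0x02) & 0xFE
    return ":".join(f"{o:02x}" for o in octets)

print(random_local_mac())  # e.g. "a6:3f:09:12:c4:7d"
```

As long as each clone on the LAN gets a distinct address, the duplicate burned-in MACs stop mattering.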

Someone cloned a card perfectly :)


This happens a lot with cheap clone Bluetooth dongles for some reason - they all have the same MAC address, or one of a handful of addresses with some of the newer ones.


> Old-new stock

Most commonly referred to as New-old stock (NOS).


According to one news outlet, the Chinese have a long history of double-ordering chips. Now the strategy finally pays off, as they can continue development. Most businesses don't double-order because of the additional cost, so now they're hurting more.


> I can see those chips available in their thousands on aliexpress and similar sites for 10x the price. I also read on forums that Chinese entrepreneurs buy all the chips and stockpile them. Is this some new kind of war?

No, this is just good old-fashioned capitalism.


What's the alternative? only people deemed "essential" are allocated a GPU? how are you going to enforce that? by invading China and seizing the means of production?


>entrepreneurs buy all the chips and stockpile them. Is this some new kind of war?

No, that's capitalism.


arbitrage is probably a better description


> Is this some new kind of war?

No, this is called capitalism! Everyone should be proud.


You show a distinct lack of understanding of what capitalism is.

I can very easily hoard goods in a controlled, centralised economy as well; the only major downside is that the government would likely throw me in some gulag if they caught me.

It's actually market forces, nothing whatsoever to do with capital.


One of the most common themes of propaganda in communist Poland was how speculators were responsible for all shortages.


That is just disingenuous. If capitalism made no difference, then governments would not have enacted things like rationing during times of shortage. It is a well-known problem in a capitalist system that when products are in extreme shortage, the system kind of fails and you fall back on planned economics, in the form of rationing.

Not saying planned economics is some sort of silver bullet, only that it has uses in extreme situations.


Again, utter nonsense. This is human psychology reacting to a market, and nothing to do with the control of capital.

In a centralised, capital-controlled economy such as communism, it is well documented that there was hoarding as well; communism never removed the human component, no matter how hard they tried.

Centralised planning fails utterly because the complexities of supply systems are way beyond our current ability to model. And fully planned economies have well-documented failures anyway, so your comment about this being a problem of capitalist systems is fully disingenuous - or maybe they were just not socialist enough, right?

What we had here for semiconductors was a confluence of factors that I doubt anyone would have planned for. Also, I feel much of it is sensationalized. We are talking super-high-end consumer electronics for the most part. The leading few production nodes are mostly Apple, for example, and the further back in the production nodes you go, the more players and capacity are available.


>Centralised planning fails utterly as the complexities of the supply systems are way beyond are current abilities to project.

The price signal being the major data input that informs production plans. In centrally planned economies, they basically just guess. They'll lie to you and say it's possible to "predict" demand. And when supply constraints happen, well I'll be damned it's the friends of all the Party members that get their share first.


Yeah. It is called the US-China tech war.


Sadly, I would not trust those Chinese chips. This is where 99% of counterfeit chips come from, and they are often either sub-par to the original specs or don't function at all.

We have a similar situation: 13-month lead times. In one case we were able to substitute a $1 part for a $150 military version. The other situation can't be fixed so easily: Xilinx FPGAs. Each unit of our product has 120 of them, and we are building dozens just for this year's orders.

We've already had our assembly house lose $10K of parts to employee theft as well. They still don't know "how it happened", but honestly only a handful of people would know the parts were valuable and know how to fence them - probably not minimum-wage employees, to be honest.

It probably is an angle to the war.

That, and shutting down low-tech supply chains for routine things like TP. Just so people know: the dominant supply chain for paper products is to take US/Canadian and Malay/Indonesian or Brazilian (tropical) wood chips, ship them to China to make raw paper rolls, then bring those back to the US to be cut into TP et al.

It's easy to disrupt things without even shutting down the supply chain: most JIT (just-in-time) supply chains (which are 99% of all supply chains today) are tuned to accept no more than 5%-10% volume variance, so you can shut them down by forcing the variance to 20%-30% without ever going to 100% off.

This is part of what happened to TP in 2020; the other part was the fact that commercial and residential TP supply chains cannot be mutually substituted - BY LAW.


Does anyone else also feel like we're in a factorio game and someone realized we're not producing enough green chips after all?


And the GPU shortage is because all our red circuit production is going to the miners.


Speaking of, here's a blog post on Factorio, from a functional programming perspective, that I really liked.

https://bartoszmilewski.com/2021/02/16/functorio/


We put all our green chips in a mega-factory or two, and used trains to move them to the major places that needed them.

Unfortunately it's a multiplayer server, and someone started rerouting the trains because they wanted to build more cars.


I know, I know! But I didn't make the bus wide enough for all the iron plates and copper plates I'd need to increase green circuit capacity. I could upgrade to blue transport belts, but that will take forever!!!

How could past me have ever been foolish enough to believe I'd never need more than 20 lanes on the main bus? I was so stupid stupid stupid!

Woah. Sorry. I blacked out for a minute there. What were we talking about?


I think it's more the case of getting to 7nm Blue Chips and realizing how much more costly they are in terms of prerequisite requirements.


If only the real world worked in Factorio timescales.


I'd like to carry thousands of chip factories in my pocket and create gigafactories using robots in a matter of days (in-game days).


Or M.U.L.E. for us old timers.


Lots of people still need to learn that the primary objective is for the colony to be successful and even if you personally get a high score you lose the game if the colony fails. Also, Wumpus hunting is underrated.


I hope that doesn't mean that I have to build a rail line to a new copper sector to the north-east...


No, but you might have to "liberate" some oil fields in the mid-east...


This article doesn't say what is in the title. This might be better:

https://www.theverge.com/2021/4/15/22385240/tsmc-chip-shorta...

To that point, this issue is already painful for anyone making electronics, and is going to hurt the small electronics manufacturers more (if you aren't in a position to buy a million parts, good luck getting any priority as things come back on line).


Wait until you talk to your plastics supplier...


Don't worry, metal and fiberglass too!

At least with materials if one source disappears, it's not a redesign to use another source (just eats into your margin). Not so lucky when it's critical ICs with whose manufacturers you had long-term deals (for continuing production) until their factories burned down and/or got destroyed in natural disasters.


We use a custom plastic with a corporate-approved color. Not as easy to second-source quickly....


That’s me! And yes, it sucks.


I feel like I've been playing "part-replacement whack-a-mole" for in-production designs for the last 5 months, and it's only getting worse.


So this seems to be about increased demand for high end components driven by increased IT equipment purchases during the lockdown as people depend on this stuff more. That's the implication from the article.

Meanwhile I'm also reading about chip shortages affecting car manufacturers, and possibly some other industries. The dynamic there seems to be that car makers (and possibly others) cut orders drastically early in the lockdown, which means component makers shut down manufacturing capacity and it's taking a long time to ramp it up and also clear the backlog of orders as demand came back earlier than expected.

So this seems to be two completely different effects going on. I know very little about these industries, but I wouldn't be surprised if there's a meeting of these effects in the middle. High end devices often have some lower end components (peripheral and glue logic on desktop motherboards for example) in addition to the potentially pricier main CPU.

Is that the story?


So, to oversimplify:

1. Car manufacturers canceled their orders (X) for 6 months.

2. Fabs sold that capacity to others.

3. Demand for all electronics increased because of WFH (and possibly bitcoin).

4. Demand for cars actually rose during COVID.

5. Now car manufacturers want X (their original order), another X for the next 6 months, plus 0.5X, where 0.5X is the increase in demand. So 2.5X total.

A 2.5X increase in orders to catch up with production, all while other electronics such as 5G iPhones, tablets, and PCs are selling in record numbers.
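As a sketch, the arithmetic above (X is an arbitrary baseline order volume, and the 50% demand bump is the commenter's assumption):

```python
X = 1.0                    # baseline: one normal 6-month chip order volume

canceled_backlog = X       # the original order, canceled but now wanted back
next_six_months = X        # the upcoming period's normal order
demand_increase = 0.5 * X  # assumed bump in car demand during COVID

total_demand = canceled_backlog + next_six_months + demand_increase
print(f"total demand: {total_demand:.1f}X of normal volume")  # 2.5X
```

The point is that a brief cancellation plus modest demand growth compounds into a multiple of normal order volume, which a fully booked fab cannot absorb.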

It also doesn't help that Samsung's yields are failing yet again.

Now let's take this even further. Apple is predicted to sell record numbers of iPhones, Macs, and iPads. TSMC will give preferential treatment to all orders relating to Apple, with Apple courting their suppliers. So that uses up those bigger nodes as well. You end up with the rest of the industry fighting for whatever is left.

And just like any product or commodity with limited quantity and high demand - the PS5, the Switch, graphics cards - you have people hoarding them for profit. That attracts traders looking to make money, which adds even more strain to the supply chain.

From a supplier's perspective, all of a sudden you are looking at a market with seemingly unlimited demand while you have limited supply. This scenario is similar to what happened with DRAM and NAND in 2015-2019, although the cause is different.


I think the focus on automotive manufacturers has been exaggerated. Automotive orders are definitely a contributing factor, but the popular narrative is putting too much emphasis on a single industry.

The reality is, like you said, demand for consumer devices across the board has spiked. Not just cars, but phones and video game consoles and everything else. Cars are a part of it, but I don't see how we would have avoided a chip shortage if auto makers hadn't cancelled some previous orders and then resubmitted them.


I think media latched onto it as the most compelling story. The cost of the chips is only a tiny fraction of the cost of the whole vehicle.

Cellphones, game consoles, smart gadgets? Ok, chip shortage, seems understandable. Whole car factories shutting down because they can't get a few grams of chips per car feels like a dramatic and new kind of supply chain failure.


> Automotive orders are definitely a contributing factor, but the popular narrative is putting too much emphasis on a single industry.

Yes. All because no one gave a damn before the car manufacturers became very vocal about it. Politicians picked it up because it affects jobs, and jobs affect votes. Then it became a news story in the mainstream media, then people started blaming TSMC, which also fit the narrative that relying on TSMC is far too dangerous for national security. Then people want digital sovereignty (whatever that means). Then companies ask for government incentives... and on and on and on...


Who were those others in point 2? That's the question.

Either they would have wanted the capacity anyway, or they were the kind of customers induced by temporarily reduced prices - i.e., price-sensitive.


Specific to the bigger (older) nodes: they have been in very tight supply for a fairly long time, possibly going back to 2018. Fabs have been built but still aren't catching up with demand. So when there is capacity to be filled, people grab it.


I am not really familiar with the industry - why would TSMC give preferential treatment to Apple's orders?


In a number of cases in the past, Apple has financially backed suppliers to increase manufacturing capacity for some product line Apple was interested in, because Apple wanted millions of that part, at tight tolerances, and the company could only produce perhaps hundreds of thousands. In exchange for paying for the buildout, they get a permanent Friends and Family discount on that manufacturing line. Lately it seems they've opted instead to pay for preferential scheduling on the backlog.


Because Apple has super deep pockets and TSMC is a for-profit corporation with shareholders.


To oversimplify again: your biggest customer always gets preferential treatment. It really is just business, and has nothing to do with the industry.


Apple buys capacity by paying TSMC. What it means is that they pay to get preferential treatment for certain chip runs.


A couple of other things that happened.

Intel has been struggling to shrink their dies, so they went to TSMC for some of their chip production.

Apple ditched Intel with their latest MacBooks (with the M1), which also requires more TSMC capacity.


Not to mention AMD moved from mostly using GlobalFoundries to using TSMC for most of their new CPUs/GPUs, including consoles. And AMD has had quite the resurgence over the last 5 years in terms of sales.

It's definitely a confluence of demand scenarios that would have been difficult to plan for at the best of times, and then there was the overall disruption to supply chains and long-term planning that Covid presented.


From what I understand, Intel's announcement that they would use TSMC was basically hot air. TSMC's capacity was already tapped, and they have little incentive to use it for a competitor when it's been planned for other customers for years.


Aren't they using them for their upcoming dGPU production?

https://www.techpowerup.com/277134/intel-xe-hpg-to-be-built-...


IIRC, Intel only planned on using TSMC to produce some of their low-cost Celeron chips.


> component makers shut down manufacturing capacity and it's taking a long time to ramp it up

Not really. The component manufacturers canceled their fab-time contracts with the fab companies, as they could not afford to hold onto them while their customers (the car manufacturers) weren't buying. And once they did that, the fabs just sold the capacity to the next buyer.

So now the component manufacturers' only options are to buy out the contract from someone else (really, really expensive given current available capacity) or to wait until the contracts expire and bid for them again. And since fab time is auctioned, if the shortage continues, fab time will be really expensive even then.


Component makers did not shut down manufacturing capacity, they just re-allocated the timeshares formerly set aside for car manufacturers to other customers.

Notice that there's no shortages of Intel CPUs, and few serious shortages of console gaming systems. That's where the manufacturing capacity went.


I thought this article about purchases of chip manufacturing equipment was very interesting:

https://www.scmp.com/tech/tech-trends/article/3129611/us-chi...

US-China tech war: China becomes world’s top semiconductor equipment market as Beijing pushes local chip industry

Mainland China topping the list for the first time ever. South Korea is investing a lot more than usual. Taiwan is stable. The US is down.

Will be very interesting how the chip situation will look on the other side of the current shortage. The shortage is all over, not just for high end processes. The high end processes will probably still be done by SK, Taiwan and the US but a lot of the lower end will go to China.


If anybody has been watching the second-hand semi equipment market, all kinds of opportunistic players from China have been scouring it clean for the last 5 years.

One of those opportunistic players hoping to become an n-th-tier fab player is, for example, Galanz - a kitchenware company: https://twitter.com/ogawa_tter/status/1310852850033946624

Though there is a notion that those are just manoeuvres to get giant tax subsidies from the state. Semiconductor companies in China pay almost no tax, even if they have 1-wafer-per-month fabs.


Well. Nokia started with rubber boots.


Their rubber venture is still alive and well and makes good studded winter tires for bicycles


And Samsung started with dried fish.


Nintendo playing cards!


I'd say that Nintendo is in the same line of business then, just at a different level of technology :)


I mean, there were no signs of anything ending anytime soon, so it's not surprising. The real tragedy is that the US has let the ability to produce such critical technology lapse, and we rely almost solely on TSMC for chip production.

I'm glad we seem to be ramping that back up but with at least a 5 year lead time I'll be crossing my fingers the entire time that nothing else happens.


Yeah, consolidate more [1]. What could possibly go wrong.

https://en.wikichip.org/wiki/technology_node#Leading_edge_tr...


As far as I know, these were not consolidation so much as trailing players dropping out of the race in a winner-take-all market.


Maybe one of the ultra-billionaires could fund a competitor, but the amount of expertise built up over time in those companies seems insurmountable. Possible, maybe, but quite a tall order; people would rather just build a space company, it's easier.


Apple and Google sit on large piles of cash. They both need chips. I wonder if they would invest in expanding the industry, at least to get preferential treatment.

This may mean more fabs, but not necessarily more players.


I thought Apple bankrolled TSMC to expand their foundries.


Yes, before the shortage. They can double down.


Interesting chart. For someone who doesn't follow the industry, does it show current capability, or capability at the time a given node was cutting edge? If the latter it really does show a tremendous winnowing. To what extent is that winnowing attributable to consolidation (M&A) vs dropping out?


I only hope in the next decade or so that fab tech becomes more commoditized so GlobalFoundries and co can get back onto the leading edge node.


Development costs rise exponentially with each generation. I would not be surprised if the number of players dropped even more around the 3-2nm nodes.


Real Men Have Fabs.

The above quip came from TJ Rodgers of AMD/Cypress. It was popularized by Jerry Sanders, CEO of AMD.[1]

There were many fabs back in the day. Now they're mostly EPA Superfund sites in Silicon Valley.

The IC industry has gone through many boom/bust cycles. This cycle could be one of the worst, because fabs are so expensive that everyone has chosen to simply buy their chips from the few remaining "real men" who still have fabs.

Not entirely unforeseen.

[1] https://semiwiki.com/john-east/273760-real-men-have-fabs-jer...


Bigger and bigger risk to build state-of-the-art fabs if TSMC/Samsung are just going to beat you anyway

https://en.wikipedia.org/wiki/Moore%27s_second_law


Bigger and bigger risk for the likes of Apple and AMD depending so totally on TSMC. I don't see how this dependency benefits them in the long run.


Apple seems strangely unaffected by the chip shortage, with a seemingly-plentiful supply of new iPhones, iPads, and M1 Macs.


It's affecting even Apple. But not to a great extent.

https://www.macrumors.com/2021/04/08/ipad-macbook-production...


Apple is less harmed than AMD. TSMC is basically Apple Taiwan: without Apple as a customer paying for R&D and providing custom 5nm designs to build, TSMC has no business. Apple is an ensured sale, with very long-term contracts in place (> 5 years). By the time any contract expires, Intel will have caught up.


I'm a bit confused; the linked article talks about TSMC's profits and makes no mention of the chip shortage or any predictions related to it. Am I missing something?



I think the article was updated


The strange thing is that it affects so many different chips. In a current project, we use an FPGA (XC7A35T-2CSG324I), a USB controller (CYUSB3014) and a GPS module (MAX M8C) - all of them are now unavailable until the end of the year or early next year. During the past few weeks, you could literally watch stocks drop to 0 at Farnell, Mouser, Digikey. I'm very concerned about the impact this will have in the coming months.


I've created a graphic showing how the FPGA stock slowly dropped to 0 over the past 3 weeks: https://blog.classycode.com/fpga-inventory-depletion-631e907...


Thanks for letting everyone know which chips to use ChipHFT(TM) to front-run you on.


That link points to an "article" (it's like 3 sentences long) with the title

> TSMC's Q1 profit up 19%, beats market estimates

And says nothing about a chip shortage


A TI DC-DC switcher IC in one of my designs cost about 0.7 EUR before this. For the current production batch we had to buy it for around 22 EUR (qty 100).


Damn, at this point I might just wait until zen 4 stuff comes out before replacing my current desktop (circa 2012) since it still seems fine for most stuff.


Ryzen 5 5600x and Ryzen 7 5800x prices / stock have about equalized lately, so they are pretty easy to find at MSRP. You can currently get a Ryzen 7 5800x from many major retailers at MSRP. The Ryzen 9 5900x and 5950x are harder to get.

It's GPUs that are nearly impossible to get right now. I've been trying for weeks to get one for a friend. I'm intently watching the stock notification Discords. There are frequent restocks, but they seem to sell out in seconds.


If you have a Micro Center near you, you can find out when they get shipments and show up the minute they open (like 6am around me) and get a card.


How much fab capacity is indirectly dedicated towards Crypto mining?

Could blocking the fiat inroads, and causing a price fall, be a solution to this chip shortage?


Miners buy at cost parity with coin prices.

Even at the current ridiculous crypto prices, they can't afford the latest nodes. And TSMC makes them pay cash, because they were previously burned by a few mining-chip makers going bust after their bet on coin prices didn't pay off.

But it only underlines how chipmaking is the next most lucrative thing after, effectively, printing money.
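That "cost parity" point can be made concrete with a toy sketch. All the numbers below (coin yield, coin price, power draw, electricity price, payback target) are purely illustrative assumptions, not figures from the thread:

```python
# Illustrative miner breakeven sketch: the most a rational miner can pay for
# hardware, given that operating margin must repay the hardware within a
# target payback period. All inputs are made-up assumptions.
def max_hardware_price(daily_coins, coin_price, watts, power_price_kwh, payback_days):
    daily_revenue = daily_coins * coin_price
    daily_power_cost = watts / 1000 * 24 * power_price_kwh  # kWh per day * price
    daily_margin = daily_revenue - daily_power_cost
    return max(daily_margin, 0) * payback_days

# A rig earning 0.0005 coin/day at $50,000/coin, drawing 1,400 W at
# $0.06/kWh, with a 180-day payback target:
price = max_hardware_price(0.0005, 50_000, 1_400, 0.06, 180)
```

If the coin price falls, the bid ceiling falls with it, which is why the parent says miners pay at parity with coin prices rather than whatever the fab asks.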


How much fab capacity is indirectly dedicated towards gaming?

Could blocking video game sellers, and throttling the internet, be a solution to this chip shortage?


It’s direct competition. Nvidia sells normal graphics card chips directly to miners.


That's an interesting question. Crypto mining wants to do more and more cycles, faster and faster, so of course they need chips from the newest processes.

But in reality I don't think "blocking" is possible now. That ship has sailed (feel free to use your own cliche here). There are now exchange listed companies, eg Coinbase, that deal in crypto. There are mutual funds that deal in crypto. Tesla owns a bunch of crypto :)

A lot of older ICs, eg automotive, don't need leading edge fabs, and those other fabs also are seeing record demand.


But what do, e.g., Germany, Korea, or Japan have to gain from crypto activity?

What does humanity have to gain generally, with an increasing share of our energy and chip manufacturing output going towards an essentially pointless activity?

Why not more individual action, like India has taken?

Much of the wealth seems to be funneled into the USA, where many of the earliest entrants and major Crypto traders are located.


It seems to be a net negative to society to me. One of those cases where a small group gets rich, while everyone else pays the externalities.


A lot of the mining infrastructure could quite easily be converted into distributed computing resources doing useful work.

It could be a nice alternative to the few monopolistic public clouds with their arbitrary rules, and one could even use the crypto stuff to pay for using the resources.


>What does humanity have to gain generally, with an increasing share of our energy and chip manufacturing output going towards an essentially pointless activity?

Let us ban all internet chat. People can use smoke signals and paper mail instead. (Aside, the latency differential between internet messaging and paper mail is extremely similar to the differential between legacy settlement technology and blockchain-based.)

Let us ban video games. Board games worked for thousands of years.

Let us ban all video streaming. Let people use local video stores instead of all this high energy information broadcasting.

You ask this question from ignorance. You assume the new technology is pointless, when in fact it represents a sea change in finance, contracts, and international order. There are use cases and financial instruments that are impossible with legacy technology. To call it pointless is simply wrong. Anyone from any country with the ability to broadcast and receive hash strings can participate, this is not an exclusive endeavor.


TSMC lumps crypto into their HPC segment which is rising. How much of it is crypto is unclear. Bitmain manufactures their mining equipment on the latest TSMC process.


This is the result of something called just-in-time manufacturing. The geniuses who created this method of manufacturing always assumed there would be no interruption of supply chains, and that we could forever produce only what we could consume at the moment.


Toyota was one of the creators of Just in Time manufacturing, and they planned for this particular supply chain interruption.

https://jalopnik.com/toyota-prepared-for-the-chip-shortage-y...


Kanban is not incompatible with queuing theory.


Logistics analysis probably has a lot of crossover with queuing theory.

How big a buffer should you maintain so a user can keep consuming a video stream through a packet-loss window of up to x seconds, versus how big an inventory of parts should you carry to keep a production line going through an interruption of x months? The difference is that physical parts are harder to substitute from other sources, so you optimize inventories/buffers on an individual per-part basis.
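The parallel above is the same arithmetic in both cases: buffer = consumption rate x outage window, with a safety factor for the harder-to-substitute physical case. A minimal sketch with illustrative numbers (the rates and windows are assumptions):

```python
# Buffer sizing for surviving a supply interruption: the same formula covers
# a video-stream buffer and a parts inventory.
def buffer_needed(consumption_rate, outage_window, safety_factor=1.0):
    """Minimum buffer to keep consuming through an interruption of supply."""
    return consumption_rate * outage_window * safety_factor

# Video: a 5 Mbit/s stream surviving a 10 s packet-loss window -> bits to pre-buffer.
video_bits = buffer_needed(5_000_000, 10)

# Factory: 2,000 boards/week surviving a 12-week chip-supply interruption,
# with a 1.5x safety factor because a physical part is harder to substitute
# than lost packets.
parts_on_hand = buffer_needed(2_000, 12, safety_factor=1.5)
```

The per-part tuning the comment mentions would just mean calling this with a different `outage_window` and `safety_factor` for each component, based on how sole-sourced it is.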


I feel like this comment is kinda like blaming the inventors of the knife for the people who were stabbed. JIT was an amazing invention and progress for mankind.


Reminds me of a joke - if economists designed humans, instead of having two kidneys we would share one between five people.


And I can make a baby in 1 month with 10 women.


Shocking to absolutely no one. When you are the only game in town and everyone wants the latest greatest (high demand), shortages are going to last into the foreseeable future[1].

[1] - Foreseeable future = multiple years on end


Samsung is shipping 5nm too; it's not as good as TSMC's 5nm, but it is close.


2 sources is not much better than 1 ... we have seen 3 and 4 sources collude in the past


On the bright side, this might encourage reuse of used/old parts and care about being more efficient with what we currently have, which is a form of innovation on its own


It is interesting to me that if someone were to say "labour shortage" there would be many people arguing over whether 'shortages' are real concepts, but when someone says "chip shortage", 9 hours later there are no comments of the sort on the topic.

It appears that the prime determiner of whether something counts as a shortage in the HN community is whether it negatively affects tech workers or not.


This comparison is not fleshed out enough to be compelling. Chips have very long and fragile supply chains. If you have a shortage of doctors (they take a long time to train), the ramifications would be obvious and hard to alleviate.

I personally have never heard anyone argue that shortage is not a real concept.


It looks interesting to me, but I am kinda skeptical about it. Most of the chips under shortage are small chips, definitely not made on the latest 5nm/7nm nodes. Most of these chips could easily be made at a 22nm node size, where TSMC has no monopoly.

I think it's just a PR stunt by TSMC to overprice and prebook their production capacity for increased margins.

PS: It is my personal opinion.


On that topic, has anyone seen an EPYC Milan CPU available anywhere? They were supposed to have been released last month but are nowhere to be seen.


lol I ordered a 5900x on launch day and it only arrived in late January...


Question: does the chip shortage only apply to the smallest process sizes? That is what I would assume, but with the talk about car manufacturers having supply issues, it would seem the shortage extends to larger process sizes as well? (Do they really need the newest Snapdragon processor in a modern car?)


It's across the board. Makers of small-scale microcontroller projects/kits (whose chips typically use 28nm and larger process nodes) are reporting problems getting parts. Apparently some of the parts the automakers need are >100nm.

I suspect this problem started with the Huawei sanctions causing lots of Chinese companies to start panic-buying inventory, not knowing if they're going to be the next ones hit with sanctions. Add to that the demand shifts caused by the pandemic. Then throw in other companies realizing how vulnerable they are (and therefore trying to add to their own inventories), plus some speculators, and it makes for a real mess.


No, it's actually much worse for legacy processes, discretes, and some components.

It's stuff that was usually made in non-300mm fabs, on 130nm+ nodes.


It isn't just integrated circuits, even components like resistors and capacitors are in short supply. Not to mention that raw materials like plastic resin are also going way up in lead time.


Other way around. Capacity on the small process sizes was booked long ago, so there were no changes in demand. For example, TSMC's 5nm process was bought out by Apple two years ago. It's the more JIT processes with shorter order lead times that are impacted.

Apple will be able to get things like the M1X with no problem, but they will have trouble getting things like the chips used for power management.


COVID induced demand for consumer electronics while constraining the supply of ICs. There are lots of low-end chips (not the CPU or GPU, but, say, power controllers, etc.) going into laptops and peripherals (mice, keyboards, etc.).

Hence the squeeze all around.

On top of that, some distributors are selling whatever they have left at whatever the market is willing to pay, which may mean 100x the usual price...


Everything is in short supply. Part of that is because analog chips generally use older processes anyway, and part of it is that old fabs have a very long tail of selling older-process chips. I think right now the best bang for your buck is 40nm.


Looking at the amount of driver assists and video analysis solutions packed onto modern cars, maybe?


The view associated with Joseph Tainter, regarding collapse, is that collapse necessitates/involves retrenching at lower levels of complexity.


I wonder if auto has been moving to smaller processes just for better efficiency.



RIP all gamers everywhere.


And the gaming industry that may be holding off their PS5 releases.


Why doesn't TSMC just raise prices 100% or so ? Surely everyone would just have to pay up?


Semi orders are made years in advance with pricing fixed in the contract.


But they seem to be able to cancel a price cut as per https://news.ycombinator.com/item?id=26657436


Imagine if we get to the point that critical medical or infrastructure devices cannot be produced because all the chips were taken up by Bitcoin miners wanting to speculate on a fake currency bubble. Our own greed will be our destruction.


Which critical medical infrastructure requires 5nm to function well?


What if all those chips were taken up by gamers, or ML researchers, or by cloud services? There will always be different people competing for resources.


The thing is that you could argue that games are art, ML researchers are advancing sciences, and cloud services enable businesses to create value. I’m not saying crypto is utterly useless, but many people certainly think so.


https://wccftech.com/tsmc-plant-hit-by-power-outage-millions...

This certainly won't help. Just like the weeks that Samsung's fabs were down in Texas after the storms in February...it doesn't take much to disrupt a facility for weeks.


When is it going to be financially practical for recycling centers to start removing and categorizing used chips and reselling them?


It probably makes sense for some, but these are high-end chips only a few months old. You wouldn't find a working 3080 core in a recycling pile, since it's worth over a thousand dollars.


5-6 year old low end 1060 cards are also going for crazy prices, more than they were worth new half a decade ago.


What would be a great opportunity to bring back Western talent who moved abroad to Asian plants, and to bring part of that production back to Europe and the USA, just won't be taken.

Producing the much-needed chips would be great for any economy.


Wasn't able to get a new dishwasher because of "no chips".


Hands and soap for me. I use the same bowl for most of my meals, so I don't have to wash it but once or twice a day.


Lucky you!


Yeah my low expectations for life consistently come in handy.


I consider myself lucky to have upgraded a few months ago. Now I might not have to worry for two or three years.


I strongly believe this is going to allow Intel to make a dramatic comeback. It will buy them enough time to catch up while everyone is stuck buying Intel anyway as a result of supply constraints.


That seems unlikely, for the same reason that TSMC isn't able to put Intel down permanently. We're reaching the end of silicon scaling. Transistor densities might continue to rise with 3D techniques, etc... but the actual logic is reaching physical limits. So transistors per price and per power are hitting walls.

TSMC pulled ahead, but they don't have a lot of runway to pull far ahead. Likewise Intel can catch up, but they can't retake the kind of lead they had a decade ago.

Semiconductors are turning into commodities, basically.


> We're reaching the end of silicon scaling.

For the next 2 shrinks at least TSMC disagrees.

According to TSMC's roadmap, they are rolling out 3nm risk production this year; they built a new research lab for 2nm last year and have already picked a site for the new 2nm fab (Hsinchu, Taiwan). Beyond that I don't think anyone has any real plans (yet).


I think the question is if anyone is going to refuse a 5nm product because 3nm is available.


Maybe not, but they might start refusing 14nm products.


Not car makers, I suppose.


Probably Apple, AMD, and NVIDIA would pay for 3nm instead of 5nm. Apple may be the first to use 3nm. If it brings big gains, there will be lots of followers.


A valuable and strategic commodity I'd add.


That’s an interesting take. I feel the chip space is driven by marketing a bit too much. At the data center level I don’t believe the performance difference is as dramatic in practical terms. For the most part, you’re not missing out that much by going Intel unless you are working on something time critical.


If we're talking AMD vs Intel, then the biggest difference I see is cost (when the CPUs are in stock, anyway). Intel's performance is fine, but the CPUs can cost multiple times that of the competition. With Xeon, a small municipality would never be able to say "You know what? Just go with the 32-core CPU, it's not that much more."


I would prefer a 2-core CPU, when Oracle Enterprise Databases run around $24k/core after discount, and SQL Server Enterprise runs around $15k/core.

From single-core performance and low core count, Intel appears to be the better choice.


Not a DB guy, so excuse my ignorance, but at that price per core wouldn't it be more cost-effective to hire a few more admins, buy more cores, and use something like Postgres or MariaDB?


Because to non-technical businesses hiring competent database architects is a lot more expensive than buying an Oracle support contract.


These are likely systems originally built when the database was the application. So everything in stored procedures and shared across hundreds of individual jobs/apps/reports.


My Oracle Enterprise databases were installed in the late 90s, and lots of systems that I support have grown around them.

I've raised the question of the Enterprise DB version of Postgres (because of the PL/SQL emulation, which would certainly be needed). Even this change would be a large engineering effort.


Intel seems to be multiple years behind at this point. They might make a bit of a comeback, but it would have to be a huge process size shrink to be competitive. Based on their previous years, I doubt that an extra year will help that much.


Behind on...what? Against the latest and greatest Apple or AMD? Maybe. I'd probably put it at a generation at most.

That aside, there's a -huge- market for chips that aren't the latest and greatest. Intel expanding into that, which if I read right is what they're doing, could be a huge moneymaker.


Intel is shipping 10nm superfin (I'm typing this on a Tiger Lake NUC). Density wise, this is comparable to TSMC 7nm (used by AMD).


While Intel 10 nm SuperFin might be comparable in density with TSMC 7 nm, it definitely is much worse for the yields of chips with large area.

Six years ago, in 2015, Intel had no problems to introduce the smaller Skylake U and the larger Skylake H at the same time.

Now, Intel needs more than half a year to increase the yields of 10 nm SuperFin enough to be able to introduce the larger Tiger Lake H after the smaller Tiger Lake U introduced last year.

This happens even if now the area difference between Tiger Lake H and Tiger Lake U is not so great as in the previous generations, because a good part of the twice larger CPU is compensated by the 3 times smaller GPU.

On the other hand, the previous variant of the Intel 10 nm process (non-SuperFin), which is still used for Ice Lake Server, has much worse electrical properties than the TSMC 7 nm, because at the same number of active cores and the same power consumption AMD Epyc 7xx3 can have a clock frequency up to 50% higher.

Moreover, the similar density of the Intel process means absolutely nothing when the competition using the TSMC process can deliver a more than 3 times greater L3 cache memory at the same price and in the same package size.

The density does not have any importance if you cannot deliver more transistors in a given package, because you cannot manufacture large enough chips and because you have been unable to predict that you will never be able to manufacture large enough chips, so that you should have designed from the beginning your product as multi-chip.


Why does everyone make this excuse for Intel? It really sounds like fanboyism. Not trying to offend.


Is it an excuse, or a fact?


anecdotally two friends of mine wanted AMD but bought Intel because of availability. You may be right.


That's just because no one wants Intel at the moment.


Intel still gets paid even if the people buying their CPUs don't want them.


I could get away with using my old 2013 Macbook Pro for 90% of my computing needs. I could run the rest in the cloud. Sure I have the latest AMD and I know the headline numbers but I wouldn’t be able to tell you that consumer chips improved all that much in the past few years purely from a practical pov.


Same here, but that last 10% of computing needs can complicate things.

Try running two 4k external monitors on a Haswell MBP.

They look about the same, but a Haswell MBP13 (Late 2013) is very much less capable than an Ice Lake MBP13 (2020). Not all of that difference is attributable to the CPU I suppose, but the graphics and thermal throttling under load certainly are.


Even 2019 i9 MBP can't do that without extreme overheating and throttling, measured with pmset -g thermlog.
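For anyone wanting to reproduce that measurement: throttling shows up in `pmset -g thermlog` output as `CPU_Speed_Limit` entries (100 means full speed, lower values mean thermal throttling). A small sketch that pulls the latest limit out of the log text; the sample below is an assumed, typical shape of that output, not captured from a real machine:

```python
import re

def cpu_speed_limit(thermlog_text):
    """Return the most recent CPU_Speed_Limit percentage, or None if absent."""
    hits = re.findall(r"CPU_Speed_Limit\s*=\s*(\d+)", thermlog_text)
    return int(hits[-1]) if hits else None

# Assumed sample shaped like `pmset -g thermlog` output; in practice you would
# feed in the output of `subprocess.run(["pmset", "-g", "thermlog"], ...)`.
sample = """\
2021-04-15 12:00:01 CPU Power notify
\tCPU_Scheduler_Limit \t= 100
\tCPU_Available_CPUs \t= 8
\tCPU_Speed_Limit \t= 60
"""
limit = cpu_speed_limit(sample)
```

Run in a loop while driving the external displays, any sustained value well under 100 is the "extreme throttling" the parent describes.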


Intel has been trying to get their 5nm fab process right for some time now. So far, they have not been able to make it work. This is not software. It's physics. It's chemistry. It's materials science. These are very hard problems to solve and there's currently only one company that can do it and their machines cost $200 million dollars and have a several year backlog of orders.


> there's currently only one company that can do it and their machines cost $200 million dollars and have a several year backlog of orders.

ASML has a market cap of $223 billion, and their stock value has doubled in a year. That's a lot of growth...


Not really; the limits you see from TSMC here are for the less important chip sales. Apple's orders, which will put a hurting on Intel in the market and in mindshare, are still good. So maybe Intel will sell more units in a small way, but the M1X will still hurt Intel's lead just by existing.

Did I mention that Intel Xe uses TSMC as well?


Didn't someone here explain recently that Intel has a huge war chest, such that their current sales have little to do with their future?


That might help with global warming...


What if chip shortage just became the new normal? Any alternatives?


That's a fantastic question! Here's my hot take:

- Price increases on all products that require chips.

- Right-to-repair laws make some headway, as the cost of repair becomes more competitive with the cost of replacement.

- Commercial software developers shift their focus a little more towards program efficiency, at the cost of slower feature development and/or higher code complexity.


I hope PC game developers start thinking about optimizing for existing and last-gen hardware instead of just assuming their games will run better on next year's hardware. So many potential new PC gamers will be turned off of PC gaming forever over a short-term spike in demand/profit from miners, who will abandon GPUs altogether as soon as they're no longer economically viable.



If you look at the numbers, 50% of TSMC's revenue is made up by 5nm and 7nm nodes as of 2021. And just one company, Apple, makes up 25% of TSMC's revenue. While the M1 is a great chip - it also just launched, and Apple has been one fourth of TSMC's revenue for years. This means that around 40 to 50% of the top-performance node capacity suitable for general-purpose use in data centers, workstations, etc. is being used for mobile devices -> mostly phones and tablets.

On one hand, I have to thank Apple for pushing semiconductor development and making ARM a real x86 replacement outside the DC. However, it seems really wasteful, and like a mis-allocation of resources, to put the top-performing silicon in devices that don't really need it. Even the M1 architecture is handicapped by Apple itself. Besides the most obnoxious use of M1 Mac minis in DCs basically for the sole purpose of iOS and macOS app development, the CPUs are mostly used in consumer devices. Their main purpose in consumer devices? To eliminate the need for a fan and to improve battery life. Both of these things are great, but also just not high on the list of problems humanity needs to solve.

What I am trying to say is that, for the last few decades, it at least seemed like 'real' work drove semiconductor development. From cloud giants to a local company's data center, performance and power efficiency were what drove and purchased the bleeding-edge CPUs/GPUs (I suppose in some ways GPUs were driven by consumer/entertainment reasons). Now it seems like luxury products are taking on that role.

I call these mobile iOS/Android products luxury products because, outside of novelty purposes, no one is really producing movies or anything of value beyond a word document on these devices. Traditional laptops and desktops "won", in a way, for the same reason the M1 processor exists: mobile operating systems never matured enough to replace them, so instead even Apple is bringing mobile software back to the traditional computer.

The most worrying aspect of this development, of course, is how locked down most of these devices are. iOS devices are of course in a walled garden, and even macOS has more & more restrictions. I can easily imagine a day where consumer-level electronics are completely locked down and the only way to get an open/free platform is to buy server hardware - if that is even possible. Server hardware is also moving to ARM, and while not locked down in a software or hardware way, it is locked down in the sense that only a few companies can buy it, or at least no one so far is interested in selling single-digit volumes of ARM servers. The days of open computing for the general person certainly seem numbered.


It's a lot easier to add another server to a data center than to increase the battery life and performance of a mobile device that is constrained by size and cooling.


> However, it seems really wasteful and like a mis-allocation of resources to put the top performing silicon in devices that don't really need it

Semiconductor research requires a lot of capital. If Apple is providing that upfront capital at a higher rate, that's great for TSMC: more capital for spearheading development. And with better and better yields at smaller node sizes, it'll trickle down. Two years later, 5nm will become commonplace and can be used to produce chips for servers. It's just a question of now versus a couple of years later.

> Both of these things are great but also just not high on the list of problems humanity needs to solve.

At the end of the day, it's the market that determines innovation and R&D. More money for a specific product results in more money spent on research and development. It's about what people want rather than what people need. Also, not having a fan and having improved battery life mean less e-waste, smaller batteries and less electricity use.

> What I am trying to say is that, for the last few decades, it at least seemed like 'real' work drove semiconductor development.

First it was military spending that funded CPU development. Later on, it was enterprise, because companies had money. Now, with cheaper and faster hardware, more and more people are able to get their hands on a PC. You can watch educational videos online. Talk to thousands of people or connect with relatives. You can do online webinars or shows. Heck, you can even learn languages on apps on your phone. In the end, mobile phones are a tool. They can be beneficial or detrimental depending on the use.

> mobile operating systems still have never matured to replace then so instead even Apple is bringing mobile software back to the traditional computer.

It's about centralizing software development between platforms. It's cheaper and easier to develop one software stack across platforms than to keep macOS and iOS separate. This might mean that later on your iPad can run VS Code.

> I can easily imagine a day where consumer level electronics are completely locked down and the only way to get a open/free platform is to buy server hardware - if that is even possible.

It's the opposite. The prices for all these microprocessors have been getting lower and lower while they become exceedingly more powerful. You can buy an integrated microcontroller with Bluetooth and Wi-Fi modules for <$10. In fact, open computing has expanded due to the cheaper cost of PCs and faster hardware. More people can afford to do software and hardware development. There are so many resources nowadays for open-source development, or even hobbyist tinkering with hardware. This is one of the best times to be in for open computing.



