I realize Intel has done some serious ball dropping over the past two decades, but you do realize the US has onshore cutting-edge fabs, right? It's only luxury consumer electronics and the highest-end corporate gear that use cutting-edge nodes to begin with.
Disruption of the cutting edge would certainly wreak havoc on the pricing and specs of high-end luxury electronics, but that would hardly be the end of the world. I still use a desktop with DDR3 on a daily basis (granted the GPU is much newer with GDDR6) and my laptop is from the early era of DDR4 ...
Intel's 18A CMOS process, which they launched a few weeks ago, is their first in almost a decade that is somewhat competitive with TSMC and Samsung.
Good for Intel: their new manufacturing process has demonstrated much better energy efficiency than the TSMC "3 nm" process that was used to make Intel Arrow Lake and Intel Lunar Lake.
Unknown: TSMC now has a "2 nm" process, and the first products using variants of it are being launched. How TSMC "2 nm" compares with Intel 18A is not yet known, but it is almost certain that TSMC's "2 nm" is better.
Bad for Intel: they had difficulty achieving high clock frequencies with Intel 18A compared with TSMC "3 nm", so most Panther Lake models have lower clock frequencies than their Arrow Lake counterparts. Moreover, it is also fairly certain that, for now, Intel 18A has much lower fabrication yields than even the latest TSMC "2 nm" process.
> I realize Intel has done some serious ball dropping over the past two decades but you do realize the US has on shore cutting edge fabs, right?
We could squabble about the finer details of Intel's fab capabilities. They have advanced nodes, but it's irrelevant. They simply do not have the capacity to support the entire demand that is currently supplied by TSMC.
It is not just "high end luxury electronics" that have modern CPUs. It's every bloody server in the cloud. (Have a look at who makes and distributes the mainboards. Same story, just with Supermicro instead of Intel.)
The economic impact on this field would be a disaster. Compute would become much more expensive, SaaS prices would follow, and with that would come a massive drop in demand.
Not to mention you can kiss the entire AI industry goodbye if GPU prices spike.
I don't think that's an accurate prediction. Currently, hardware less than 10 years old gets recycled for pennies on the dollar. That's effectively due to the combination of how cheap and how much better the cutting-edge hardware is. If the cutting edge suddenly became more expensive, it would just see slower adoption.
Case in point: this very comment section. The major suppliers have discontinued DDR4 production because it's "obsolete," while capacity for that exact same technology is coming online in China. What makes sense just depends on context.