Haxion:

Can you elaborate on #26? Almost all the major innovations in AI were invented in the US first, and while Chinese companies are doing very impressive work, on balance I'd say the US is still ahead. Are you referring to the need to construct tons of additional power generation, or something like that? I do agree the US political system is very sclerotic and dysfunctional, and that has real consequences for infrastructure.

Alberto Romero:

Exactly, I am referring to the underlying infrastructure. As you say, if China is behind in anything right now, it is in innovation. The US is still ahead. But China produces more, manufactures more, builds more, etc. Like a lot more now.

Haxion:

I guess I'd respectfully disagree here. The major innovation that you highlight as moving beyond generative AI, computational reasoning, is incredibly expensive in power cost. Looking at the results from DeepMind and OpenAI's o1 (I'm sure Anthropic isn't far behind and will have something similar out soon), it's pretty easy to infer that the LLMs are using chain of thought or whatever as a kind of tree search or CSP solver. And for formal reasoning, all those algorithms scale exponentially in problem size; see for example the o1 plot that shows a linear increase in accuracy with an exponential increase in runtime (on whatever benchmark, it's not clear).

No doubt the systems will get more efficient with more training and parameter tuning and so forth, but barring some truly insane breakthrough, close to proving P=NP, exponential scaling is going to remain. And in that regime walls get hit pretty quickly, so a 10x increase in available power doesn't necessarily translate to much larger problems that can be solved. Granted, you could solve more small problems at a time, but I'm not sure I would call that winning the AI race. To my mind, real victories will come from algorithmic efficiency breakthroughs, not infrastructure.
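[The exponential-scaling point can be made concrete with a toy model; this is my own sketch, not something from the comment, and the branching factor and budget are made-up numbers. If an exhaustive tree search visits roughly b**d nodes at branching factor b and depth d, a 10x compute budget only buys log_b(10) extra levels of depth:]

```python
import math

# Toy model (hypothetical numbers): an exhaustive tree search visits
# roughly b**d nodes for branching factor b and depth d.
def max_depth(budget_nodes: float, b: int) -> float:
    """Largest search depth affordable within a given node budget."""
    return math.log(budget_nodes, b)

b = 4                          # hypothetical branching factor
base = 1e12                    # hypothetical baseline compute budget (nodes)
d1 = max_depth(base, b)
d2 = max_depth(10 * base, b)   # 10x more power/compute
# Extra depth is log_b(10) ~ 1.66 levels, regardless of the baseline.
print(f"depth at 1x: {d1:.1f}, at 10x: {d2:.1f}, gain: {d2 - d1:.2f}")
```

[So under this toy model, an order of magnitude more power moves the solvable problem size by less than two search levels.]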

Alberto Romero:

Oh, but you're focusing too narrowly on algorithmic breakthroughs, and I think to win the AI race (admittedly a vague thing to say) you need much more than that: huge datacenters, top-notch GPUs and chip fabs, the best engineers, etc. The US wins over China in most of those *for now*. That's my point: everything points to China eventually surpassing the US on this front too. (I will also admit that this is possibly the most controversial of my statements in this post - good catch!)

Kaya Aykut:

*Supposedly*, market capitalism's biggest advantage over more centrally controlled systems is that it wins the resource-allocation game. Assuming the economics & ROI of AI work out for investors (not a trivial assumption, given that the capex comes now and scaled ROI comes way later), do you think China would still come out on top in the long term? #14 also suggests you have some additional thoughts about adoption & ROI curves.

Alberto Romero:

Yeah, but it's funny that the US, supposedly more capitalistic and pro-free-market than China, is more focused on culture wars while China dominates the world with sheer commercial prowess (no wars, no bullets). So, given the numbers we've been seeing recently about China, there might be some flaw in your reasoning (whose premise I accept in principle!).

Kaya Aykut:

Agreed, we're on the same page. Ultimately it comes down to a tradeoff: how much the culture wars detract from efficient allocation (= noise in the allocation problem) vs. how efficient a hybrid (*) allocation system is without that noise.

(*) It's hard to claim China allocates purely centrally.

Haxion:

Fair enough! I focus on power and algorithms because new data centers in the US come from private investment and are being built at pretty crazy rates. And coming back to power, physics gives us a hard constraint: most of the gains in performance per watt come from making transistors smaller, and we are running into quantum-mechanical limits there. We can keep making chips denser through more vertical layers or more die area (each GPU generation is physically larger than the last, after all), but that doesn't reduce power consumption. From conversations with industry veterans, I'd personally guess another factor of 2-4 is within reach in the remainder of the decade, but that's really it, barring something like moving away from silicon entirely. Gains are possible from more specialized circuitry, though even there it's not clear how much further we can go; an H100 already has transformer-specific acceleration built in, I think!
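[A quick back-of-envelope, combining the guesses in this thread; these are my own illustrative numbers, not measurements. Total compute scales roughly as power budget times performance per watt, so a 10x power build-out plus the 2-4x perf/W headroom above gives 20-40x raw compute, which buys only a couple of extra levels in an exhaustive search with hypothetical branching factor b:]

```python
import math

# Assumed multipliers from the thread's guesses: ~10x power build-out,
# 2-4x remaining performance-per-watt headroom this decade.
compute_gain_low = 10 * 2    # 20x total compute
compute_gain_high = 10 * 4   # 40x total compute

# For an exhaustive search visiting ~b**d nodes (b is a made-up
# branching factor), the extra affordable depth is log_b(gain).
b = 4
extra_low = math.log(compute_gain_low, b)
extra_high = math.log(compute_gain_high, b)
print(f"{compute_gain_low}-{compute_gain_high}x compute buys only "
      f"{extra_low:.1f}-{extra_high:.1f} extra search levels")
```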
