US chip major Nvidia is considering expanding manufacturing capacity for its H200 artificial intelligence chips as demand for the product from Chinese companies exceeds current output, Reuters reported, citing sources.
Responding to Reuters queries, an Nvidia spokesperson said, “We are managing our supply chain to ensure that licensed sales of the H200 to approved customers in China will have no impact on our ability to supply customers in the United States.”
TSMC and China’s Ministry of Industry and Information Technology (MIIT) did not immediately respond to queries, it added.
Strong demand from China pushes Nvidia to increase capacity
One source told the publication that demand from Chinese companies is so strong that Nvidia may add new capacity. Among the buyers are China’s e-commerce giant Alibaba and Douyin (TikTok) owner ByteDance. As per an earlier Reuters report, the companies contacted Nvidia this week to place large orders for the H200 chips.
This comes after US President Donald Trump on 9 December permitted CEO Jensen Huang’s company to export its second-fastest AI chips, the H200 processors, to China in return for a 25% fee on sales.
Eyes now on Chinese govt nod
However, uncertainties remain, as the Chinese government has yet to greenlight any purchase of the H200, the report added. Sources told the agency that Chinese officials held emergency meetings on 10 December to discuss the issue.
To date, only a limited number of H200 chips are in production, as the company has been focused on its Blackwell and Rubin chips. The sources said that Chinese clients have reached out to the company about supply concerns.
Sources also said that the company gave clients guidance on current supply levels without providing a specific number.
About Nvidia’s H200 chips
Launched into mass deployment in 2024, the H200 AI chip is manufactured by TSMC using its 4nm process technology.
Demand is strong as Nvidia’s H200 is six times more powerful than its H20, which was tailored for release in the Chinese market in 2023.
Nori Chiou, investment director at White Oak Capital Partners, told the agency, “Its (H200) compute performance is roughly 2-3 times that of the most advanced domestically produced accelerators. I am already observing many CSPs (cloud service providers) and enterprise customers aggressively placing large orders and lobbying the government to relax restrictions on a conditional basis.”
He added that Chinese AI demand exceeds the capacity of local production.
(With inputs from Reuters)