0833 GMT - CPU servers are gaining traction from inference-related workloads, Daiwa analysts say in a research note. Agentic AI applications such as OpenClaw are likely to drive higher-than-expected inference demand, they note. The adoption of AI agents could lead to higher token demand, as local agents need to interact with large language models more frequently to complete tasks and automate routine workflows, they say. The GPU and HBM combination is unlikely to be the most efficient solution for all inference use cases, they say, adding that ASIC- and CPU-based server solutions may play a more important role. Daiwa retains a positive view on Taiwan's data-center hardware sector, supported by continued hyperscaler capex growth. (sherry.qin@wsj.com)
(END) Dow Jones Newswires
March 24, 2026 04:33 ET (08:33 GMT)
Copyright (c) 2026 Dow Jones & Company, Inc.