#high-bandwidth-memory

Artificial intelligence
from 24/7 Wall St.
40 minutes ago

Is Micron Technology the Cheapest AI Stock?

Micron Technology is a heavily AI-exposed memory supplier with high-bandwidth memory capacity sold out through 2026, constrained supply, and a discounted valuation that implies significant upside.
#ai-infrastructure
Business
from Techzine Global
4 days ago

SK Hynix expands capacity with $12.9B investment in South Korea

SK Hynix will invest about $12.9 billion to build an advanced packaging factory in Cheongju to expand HBM capacity for AI demand.
Tech industry
from Engadget
1 week ago

Samsung says RAM costs will likely lead to price hikes soon

AI-driven demand for high-bandwidth memory is causing a global RAM shortage that may force Samsung and other manufacturers to raise product prices.
Artificial intelligence
from Techzine Global
2 weeks ago

Samsung sees HBM4 as key to leadership in AI

Samsung is regaining competitiveness in AI memory and semiconductor markets with positive HBM4 feedback, integrated offerings, and accelerating foundry and sensor customer momentum.
from London Business News | Londonlovesbusiness.com
1 month ago

Sunnov investment: Samsung reveals massive AI investment

Global investors are focusing on Samsung Electronics as the group sets out a five-year, $310 billion artificial intelligence investment programme, with analysis from Sunnov Investment Pte. Ltd. highlighting how the plan reshapes expectations for capital spending across the semiconductor industry. The commitment concentrates on AI-specific semiconductors and on domestic manufacturing capacity in South Korea.
Artificial intelligence
from TechCrunch
3 months ago

OpenAI ropes in Samsung, SK Hynix to source memory chips for Stargate

The companies signed the letters of intent following a meeting in Seoul between OpenAI CEO Sam Altman, South Korea's president Lee Jae-myung, Samsung Electronics' executive chairman Jay Y. Lee, and SK chairman Chey Tae-won. Under the deal, Samsung and SK Hynix plan to scale their manufacturing to produce up to 900,000 high-bandwidth memory (HBM) DRAM chips per month for use in Stargate and AI data centers.
Artificial intelligence
Gadgets
from The Register
3 months ago

Raspberry Pi prices hiked as AI gobbles all the memory

Raspberry Pi raised prices on higher-memory devices because memory costs surged about 120% year-over-year amid AI-driven HBM demand, affecting the 4GB/8GB Compute Modules, Pi 500, Development Kit, and Pi 3B+.
from Techzine Global
3 months ago

Huawei challenges Nvidia with new AI chip technology

HBM, or High-Bandwidth Memory, plays a crucial role in the operation of modern AI chips. By stacking DRAM layers vertically, signal paths become shorter and the chip's bandwidth increases significantly. This not only delivers higher performance, but also reduces energy consumption for data-intensive tasks such as training and applying large language models. Because the memory is placed directly next to the processor, unnecessary data movement is minimized.
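The bandwidth gain from HBM's wide, stacked interface can be sketched with simple arithmetic. The function and figures below are a hedged illustration using published JEDEC-generation HBM3 parameters (1024-bit bus, 6.4 Gb/s per pin), not numbers from the article:

```python
# Sketch: why stacking gives HBM its bandwidth.
# Each stack exposes a very wide bus; peak bandwidth is
# bus width (bits) x per-pin data rate (Gb/s) / 8 bits-per-byte.

def hbm_stack_bandwidth_gbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack, in GB/s."""
    return bus_width_bits * pin_rate_gbps / 8

# One HBM3 stack: 1024-bit interface at 6.4 Gb/s per pin -> 819.2 GB/s
per_stack = hbm_stack_bandwidth_gbps(1024, 6.4)

# An accelerator with six stacks approaches ~5 TB/s of aggregate bandwidth
total = 6 * per_stack
print(f"{per_stack:.1f} GB/s per stack, {total / 1000:.2f} TB/s across 6 stacks")
```

The wide bus only works because the stacks sit on the same package as the processor; routing 1024 signal traces per stack across a motherboard would be impractical, which is why ordinary DIMM-based DRAM uses far narrower, faster-clocked channels instead.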
Artificial intelligence
from The Register
3 months ago

Huawei lays out multi-year AI accelerator roadmap

First off the rank, in the first quarter of 2026, will be the Ascend 950PR which, according to slideware shown at the conference, will boast one petaflop performance with the 8-bit floating-point (FP8) computation units used for many AI inferencing workloads. The chip will also include 2 TB/s interconnect bandwidth and 128GB of 1.6 TB/s memory. In 2026's final quarter Huawei plans to deliver the 950DT, which will be capable of two petaflops of FP4 performance thanks to the inclusion of 144GB of 4 TB/s memory.
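The reported figures imply a memory-bandwidth-to-compute ratio for each part, a common way to compare accelerators for bandwidth-bound inference workloads. The calculation below uses only the spec numbers quoted above (1 PFLOP FP8 with 1.6 TB/s memory; 2 PFLOPS FP4 with 4 TB/s); the ratio arithmetic is ours, not Huawei's:

```python
# Sketch: bytes of memory bandwidth available per floating-point operation,
# computed from the slideware figures reported in the article.

def bytes_per_flop(mem_bw_tbs: float, pflops: float) -> float:
    """Memory bandwidth (TB/s) divided by peak compute (PFLOPS), in bytes/flop."""
    return (mem_bw_tbs * 1e12) / (pflops * 1e15)

# Ascend 950PR: 1 PFLOP FP8, 1.6 TB/s memory -> 0.0016 bytes/flop
print(bytes_per_flop(1.6, 1.0))

# Ascend 950DT: 2 PFLOPS FP4, 4 TB/s memory -> 0.002 bytes/flop
print(bytes_per_flop(4.0, 2.0))
```

On these numbers the 950DT offers somewhat more bandwidth per unit of compute than the 950PR, which matters because large-language-model inference is frequently limited by memory bandwidth rather than raw FLOPS.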
Artificial intelligence
Tech industry
from Techzine Global
6 months ago

Samsung's profits plummet, Nvidia-related woes continue

Samsung's profits have declined significantly due to delays in HBM chip certification and strong competition in the memory market.