#hbm

Marketing tech
from The Register
18 hours ago

SK hynix's latest range: Corn in banana chocolate

SK hynix and 7‑Eleven released honey‑banana HBM‑themed square corn chips with collectibles and a prize draw including a grand gold prize.
#dram
Artificial intelligence
from 24/7 Wall St.
4 days ago

Why the Selloff in Micron Technology Stock Never Made Sense

Micron's leadership in DRAM, NAND, and HBM positions it for lasting AI-driven demand despite near-term stock volatility and fears of market overcapacity.
from 24/7 Wall St.
1 month ago

Up 145% in 2025, This AI Infrastructure Stock Is Still Deeply Discounted

When it comes to artificial intelligence, a few names dominate the conversation: Nvidia (NASDAQ:NVDA), Taiwan Semiconductor Manufacturing, and, in recent months, Intel (NASDAQ:INTC). These players rightfully claim the spotlight and drive the AI narrative because they deliver tangible results: record revenues, market share gains, and innovations that fuel everything from chatbots to autonomous systems. Investors flock to them, bidding up shares on every earnings beat or product launch. Yet beneath the hype, AI's foundation relies on more than processing power and fabrication prowess. Data storage and high-speed memory are the unsung necessities that enable seamless data flow, preventing bottlenecks in the AI pipeline.
Artificial intelligence
Mobile UX
from GSMArena.com
1 month ago

Samsung Q3 earnings guidance reveals very solid performance

Samsung expects its largest quarterly profit since 2022 as AI-driven memory chip demand boosts revenue to KRW 86 trillion and profit to KRW 12.1 trillion.
from The Register
2 months ago

PC DRAM costs to climb as fabs favor servers and HBM

PC memory prices are set to rise as the major suppliers allocate manufacturing capacity to the more lucrative server DRAM and HBM segments amid reports of tightening supplies. Memory prices are set for an increase in Q4 2025, according to market watcher TrendForce, which points the finger at the three top DRAM makers: Samsung, SK hynix, and Micron Technology.
Gadgets
from The Register
2 months ago

Nvidia's context-optimized Rubin CPX GPUs were inevitable

Nvidia on Tuesday unveiled the Rubin CPX, a GPU designed specifically to accelerate extremely long-context AI workflows, like those seen in code assistants such as Microsoft's GitHub Copilot, while cutting back on pricey and power-hungry high-bandwidth memory (HBM). The first indication that Nvidia might be moving in this direction came when CEO Jensen Huang unveiled Dynamo during his GTC keynote in the spring. That framework brought mainstream attention to the idea of disaggregated inference.
Artificial intelligence
Tech industry
from Medium
3 months ago

SK Hynix Forecasts 30% Annual Growth in AI Memory Market Through 2030

SK Hynix forecasts the high-bandwidth memory market to grow about 30% annually through 2030, driven by robust AI demand and custom HBM development.