Blue Tech Wave Media
    IT Infrastructure

    Samsung’s 8-layer HBM3E chips pass Nvidia’s testing for adoption

By Rebecca Xu, August 8, 2024
    • A version of Samsung Electronics’ fifth-generation high bandwidth memory (HBM) chips, or HBM3E, has passed Nvidia’s tests for use in its artificial intelligence processors, three sources briefed on the results said.
    • This development marks a significant milestone in the collaboration between the two tech giants and signals a step forward in the advancement of AI technology.

    OUR TAKE
Nvidia is a pioneering company in the field of graphics processing units (GPUs) and AI processors, dedicated to advancing computing technologies for a wide range of applications. Samsung Electronics is a world-renowned manufacturer of consumer electronics and semiconductors, known for its innovative and cutting-edge products. The adoption of Samsung’s HBM3E chips in Nvidia’s AI processors promises faster speeds, better efficiency, and the ability to manage larger, more intricate AI models, propelling AI innovation forward.

    –Rebecca Xu, BTW reporter

    What happened

    According to three sources briefed on the matter, a variant of Samsung Electronics’ next-generation high bandwidth memory (HBM) chips, known as HBM3E, has successfully cleared Nvidia’s assessments for integration into its artificial intelligence processors.

    Though Samsung and Nvidia have not yet formalised a supply agreement for the tested 8-layer HBM3E chips, insiders anticipate that the deal will be sealed shortly, with deliveries projected to commence in the fourth quarter of 2024.

    HBM, a DRAM (dynamic random-access memory) standard introduced in 2013, stacks chips vertically to save space and reduce power usage. It is integral to GPUs for AI, aiding in processing large data volumes from intricate applications. HBM3E chips are likely to become the mainstream HBM product in the market this year with shipments concentrated in the second half, according to research firm TrendForce.

    Also read: Samsung forecasts AI-driven chip demand surge

    Also read: Nvidia’s AI chip delay is a supply chain test for AI technology

    Why it’s important

    The adoption of Samsung’s 8-layer HBM3E chips by Nvidia is set to significantly impact the future of AI technology, particularly in enhancing performance and efficiency.

Firstly, the HBM3E chips offer a substantial increase in memory bandwidth, which is crucial for AI processors handling complex workloads. The chips provide a per-pin data rate of 9.6Gb/s, up from the 6.4Gb/s offered by HBM3, yielding over 1,200GB/s of memory bandwidth per stack compared with 819GB/s for the previous generation. This leap in bandwidth allows for the processing of larger and more sophisticated AI models, thereby enhancing the performance of AI applications.
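As a back-of-the-envelope check, the quoted figures follow from the per-pin data rates, assuming the standard 1024-bit HBM interface per stack (a detail the article does not state explicitly):

```python
# Per-stack HBM bandwidth = per-pin data rate x interface width.
# Assumes the standard 1024-bit HBM interface per stack; the article
# gives per-pin rates of 9.6 Gb/s (HBM3E) and 6.4 Gb/s (HBM3).
def hbm_stack_bandwidth_gbs(pin_rate_gbps, interface_bits=1024):
    """Return per-stack bandwidth in GB/s (bits / 8 = bytes)."""
    return pin_rate_gbps * interface_bits / 8

print(hbm_stack_bandwidth_gbs(6.4))  # 819.2 GB/s -- HBM3, matches "819GB/s"
print(hbm_stack_bandwidth_gbs(9.6))  # 1228.8 GB/s -- HBM3E, "over 1200GB/s"
```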

Secondly, the HBM3E chips are designed with advanced technologies such as high-k metal gates (HKMG) that reduce electrical current leakage, optimising internal circuitry and improving power efficiency by 12% compared to the previous generation. This increase in efficiency is vital for AI processors that require high memory bandwidth at lower power consumption, ensuring that they can perform within optimal temperature ranges without compromising on performance.

Moreover, HBM3E memory is 50% faster than current HBM3, and Nvidia’s latest platform built around it delivers a total of 10TB/sec of combined bandwidth. This allows the new platform to run models 3.5x larger than the previous version, while improving performance with 3x faster memory bandwidth. The ability to connect multiple GPUs for exceptional performance and an easily scalable server design further amplifies the potential for AI processors to tackle more extensive generative AI workloads.
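The combined figure is consistent with the per-stack numbers above, assuming an eight-stack HBM3E configuration (the article does not name the stack count; eight is a hypothetical illustration):

```python
# Combined platform bandwidth = number of HBM3E stacks x per-stack bandwidth.
# Assumed configuration: 8 stacks at 9.6 Gb/s per pin over a 1024-bit interface.
stacks = 8
per_stack_tbs = 9.6 * 1024 / 8 / 1000   # 1.2288 TB/s per stack
combined = stacks * per_stack_tbs       # 9.8304 TB/s
print(round(combined, 2))               # ~9.83 TB/s, near the quoted 10TB/sec
```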


    Rebecca Xu is an intern reporter at Blue Tech Wave specialising in tech trends. She graduated from Changshu Institute of Technology. Send tips to r.xu@btw.media.

