
Samsung develops high-bandwidth memory with integrated AI processing


Samsung Electronics said on Wednesday that it has developed high-bandwidth memory (HBM) that integrates artificial intelligence processing capabilities. The new processing-in-memory (PIM) architecture adds an AI engine to Samsung's HBM2 Aquabolt, which first launched in 2018.

Samsung claims that the chip, called HBM-PIM, doubles the performance of AI systems and reduces power consumption by more than 70% compared to conventional HBM2.

The South Korean tech giant explained that this is possible because placing an AI engine inside each memory bank maximizes parallel processing while minimizing data movement.
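To make that idea concrete, here is a minimal, purely illustrative Python sketch of the general processing-in-memory concept: a toy reduction is computed either by shipping every value out of memory to a host processor, or by letting each bank reduce its own data locally so that only one partial result per bank has to move. The bank count, data sizes, and the toy operation are assumptions for illustration only and do not reflect Samsung's actual HBM-PIM design or programming model.

```python
# Conceptual sketch of processing-in-memory (PIM) vs. a conventional flow.
# All sizes and the "AI" operation here are hypothetical, for illustration only.

import random

NUM_BANKS = 16          # hypothetical number of memory banks
VALUES_PER_BANK = 1024  # hypothetical amount of data held in each bank
WEIGHT = 0.5            # toy operation: scale each value and sum

banks = [[random.random() for _ in range(VALUES_PER_BANK)]
         for _ in range(NUM_BANKS)]

def conventional_flow(banks):
    """Move every value to the host processor, then compute there."""
    moved = sum(len(b) for b in banks)               # all raw data crosses the memory bus
    result = sum(WEIGHT * v for b in banks for v in b)
    return result, moved

def pim_flow(banks):
    """Each bank's local engine reduces its own data; only partial results move."""
    partials = [sum(WEIGHT * v for v in bank) for bank in banks]  # done per bank, in parallel
    moved = len(partials)                            # one value per bank crosses the bus
    return sum(partials), moved

conv_result, conv_moved = conventional_flow(banks)
pim_result, pim_moved = pim_flow(banks)

print(f"conventional: moved {conv_moved} values, result {conv_result:.2f}")
print(f"PIM-style:    moved {pim_moved} values, result {pim_result:.2f}")
```

In this toy example both paths produce the same answer, but the PIM-style path moves far less data off the banks, which is the effect Samsung credits for the performance and power gains.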


Samsung said it expects the performance gains to accelerate large-scale processing in data centers, high-performance computing systems, and AI-enabled mobile applications.

Samsung added that HBM-PIM uses the same HBM interface as earlier generations, which means customers will be able to drop the chip into their existing systems without changing any hardware or software.

The company's paper on the chip will be presented at the virtual International Solid-State Circuits Conference next week.

The chip is currently being tested in customers' AI accelerators, with testing scheduled to be completed in the first half of this year. Samsung is also working with customers to build an ecosystem and standardize the platform.
