GDDR Does Math, Big-Screen Explosions

The “G” in GDDR stands for “graphics” (as in “graphics double data rate”), but today, the use cases for the high-performance memory boil down to math. It is the reason GDDR7, recently announced by Samsung Electronics, will find its way into high-performance computing (HPC), artificial intelligence (AI) and automotive applications.

Objective Analysis’s Jim Handy

Graphics applications rely a great deal on matrix algebra, Jim Handy, principal analyst at Objective Analysis, said in an interview with EE Times. GDDR and GPUs are well-suited for matrix algebra. And because matrix algebra is also required for many AI workloads, GDDR DRAM has found itself supporting more than just graphics, he said.
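To see why that workload maps so naturally to GPUs and the wide memory buses feeding them, consider that every output element of a matrix product can be computed independently of the others. The NumPy sketch below is purely illustrative; the shapes and variable names are our assumptions, not anything from the article:

import numpy as np

# A 4x4 transform applied to a large batch of vertices: the kind of
# matrix algebra underlying both 3D graphics and many AI workloads.
# Shapes here are illustrative assumptions.
transform = np.eye(4)                    # placeholder 4x4 transform matrix
vertices = np.random.rand(1_000_000, 4)  # 1M vertices in homogeneous coords

# Each output row depends only on its own input row, so all rows can be
# computed in parallel -- the parallelism that GPUs (and the fast, wide
# GDDR memory keeping them fed) are built to exploit.
transformed = vertices @ transform.T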

Supercomputing applications like airfoil design and weather modeling use GDDR and GPUs because of the underlying math, Handy said. And because GDDR is geared toward graphics applications, it is used a great deal in video production, including computer-generated imagery (CGI) in blockbuster movies like “Avatar,” which would have put “scads” of GPUs together, he said.

Because GDDR can quickly process large amounts of data in parallel, Samsung expects to see wider adoption in areas that require higher bandwidth and processing speed: AI and machine learning, HPC and automotive, where it supports increasingly visual dashboards and advanced driver-assistance systems (ADAS), alongside its traditional use in gaming PCs, laptops and consoles.

In an email interview, Samsung told EE Times that the maximum speed for its GDDR7 will reach up to 32 Gbps, a significant boost over existing server and PC DRAM.

GDDR has come to fill a gap between DRAM and HBM: It has a speed advantage over conventional DRAM while being more competitive than HBM in terms of total cost of ownership and technical requirements, Samsung said. As demand for high-bandwidth solutions increases, high-end solutions like HBM may not be the best option for certain applications, and GDDR can serve as an alternative for those customers.

While demand for regular DRAM is slow right now, Samsung is seeing “robust” customer engagement for its GDDR7 and expects to see wider adoption in applications that require higher performance.

Samsung appears to be first out of the gate with GDDR7: The company said it will first be installed in next-generation systems of key customers for verification. Notable improvements over previous iterations include a bandwidth of 1.5 terabytes per second (TBps), 1.4× that of GDDR6’s 1.1 TBps, and a boosted per-pin speed of up to 32 Gbps.
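For a sense of how the per-pin and system figures relate, the quoted 1.5 TBps falls out of simple arithmetic once a bus width is assumed. The sketch below uses a 384-bit memory interface, typical of high-end graphics cards; that width is our assumption, not a figure quoted here:

# Back-of-the-envelope check of the 1.5-TBps figure. The 384-bit bus
# width is an assumption (typical of high-end graphics cards), not a
# number quoted in the article.
per_pin_gbps = 32        # GDDR7 per-pin data rate, in Gbps
bus_width_bits = 384     # assumed memory-interface width

total_gbps = per_pin_gbps * bus_width_bits  # 12,288 Gbps
total_tbps = total_gbps / 8 / 1_000         # bits -> bytes, giga -> tera
print(f"{total_tbps:.2f} TBps")             # ~1.54 TBps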

Samsung said these enhancements are made possible by the three-level pulse-amplitude modulation (PAM3) signaling method adopted for the new memory standard instead of the non-return-to-zero (NRZ) signaling of previous generations: Because PAM3 uses three voltage levels rather than two, it allows 50% more data to be transmitted than NRZ within the same signaling cycle.
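The 50% figure follows directly from the encoding arithmetic: NRZ’s two levels carry one bit per signaling cycle, while PAM3’s three levels let practical encoders pack three bits into every two cycles, or 1.5 bits per cycle. Here is a minimal sketch of that arithmetic, with a toy bit-to-symbol mapping invented purely for illustration (the actual GDDR7 encoding table is defined by the standard):

import math

# NRZ: two voltage levels -> 1 bit per signaling cycle.
nrz_bits_per_cycle = math.log2(2)   # 1.0

# PAM3: three levels; practical encoders map 3 bits onto 2 symbols
# (9 symbol pairs cover the 8 bit patterns), i.e., 1.5 bits per cycle.
pam3_bits_per_cycle = 3 / 2

gain = pam3_bits_per_cycle / nrz_bits_per_cycle - 1
print(f"PAM3 carries {gain:.0%} more data per cycle than NRZ")  # 50%

# Toy 3-bit -> 2-symbol PAM3 mapping, invented for illustration only.
pam3_table = {bits: divmod(i, 3) for i, bits in
              enumerate(f"{n:03b}" for n in range(8))}
print(pam3_table["101"])  # e.g. symbols (1, 2), levels drawn from {0, 1, 2}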

The company said its GDDR7 design is also 20% more energy-efficient, with power-saving design technology optimized for high-speed operations. This makes it a low-operating-voltage option for applications that are especially mindful of power usage, such as laptops.

Samsung appears to be first out of the gate with GDDR7. Notable improvements include a bandwidth of 1.5 TBps, 1.4× that of GDDR6’s 1.1 TBps, and a per-pin speed of up to 32 Gbps. (Source: Samsung Electronics)

Micron Technology said this summer that it would introduce its first GDDR7 memory devices on its 1β node in the first half of next year, while Cadence announced technical details for its GDDR7 in the spring.

GDDR is an example of a tried-and-true technology finding new uses decades after its inception.

“It’s a very, very old concept,” Handy said. “Video RAM was invented in the late 1980s, and this is just an extension of that.”

Both GDDR and GPUs have their limitations.

“GPUs are architected specifically to do matrix algebra really well, but they do everything else really poorly,” Handy said. “If you were trying to control a robot arm with a GPU, it would be wretchedly expensive and it would probably perform more poorly than if you did it with a 16-bit CPU.”

Handy said any movie today that has a big fiery explosion is going to make use of GDDR and GPUs.

“Nowadays, they don’t use pyrotechnic guys to do it,” he said. “They just have people synthesize the big, nasty explosion.”
