What is a GPU? An expert explains the chips powering the AI boom, and why they’re worth trillions

  • Written by Conrad Sanderson, Research Scientist & Team Leader, CSIRO

As the world rushes to make use of the latest wave of AI technologies, one piece of high-tech hardware has become a surprisingly hot commodity: the graphics processing unit, or GPU.

A top-of-the-line GPU can sell for tens of thousands of dollars[1], and leading manufacturer NVIDIA has seen its market valuation soar past US$2 trillion[2] as demand for its products surges.

GPUs aren’t just high-end products for AI, either. There are less powerful GPUs in phones, laptops and gaming consoles, too.

By now you’re probably wondering: what is a GPU, really? And what makes them so special?

What is a GPU?

GPUs were originally designed primarily to quickly generate and display complex 3D scenes and objects, such as those involved in video games and computer-aided design[3] software. Modern GPUs also handle tasks such as decompressing[4] video streams.

The “brain” of most computers is a chip called a central processing unit (CPU). CPUs can be used to generate graphical scenes and decompress videos, but they are typically far slower and less efficient at these tasks than GPUs. CPUs are better suited to general computation tasks, such as word processing and browsing web pages.

How are GPUs different from CPUs?

A typical modern CPU is made up of between 8 and 16 “cores[5]”, each of which can process complex tasks in a sequential manner.

GPUs, on the other hand, have thousands of relatively small cores, which are designed to all work at the same time (“in parallel”) to achieve fast overall processing. This makes them well suited for tasks that require a large number of simple operations which can be done at the same time, rather than one after another.
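As a rough illustration only (using Python with the NumPy library, chosen here for readability; the article itself names no particular software), the sketch below contrasts stepping through values one at a time with expressing the same work as a single bulk operation of the kind a GPU can spread across its many cores:

```python
import numpy as np

values = np.random.rand(1_000_000)

# Sequential style: one value after another, the way a single CPU core
# steps through a task.
squared_loop = [v * v for v in values]

# Bulk style: the whole array expressed as one operation. Each element is
# independent of the others, so the work can in principle be spread across
# thousands of GPU cores at once (GPU libraries such as CuPy expose a
# NumPy-like interface for exactly this kind of operation).
squared_bulk = values * values
```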

Read more: Demand for computer chips fuelled by AI could reshape global politics and security[6]

Traditional GPUs come in two main flavours.

First, there are standalone chips, which often come in add-on cards for large desktop computers. Second are GPUs combined with a CPU in the same chip package, which are often found in laptops and game consoles such as the PlayStation 5. In both cases, the CPU controls what the GPU does.

Why are GPUs so useful for AI?

It turns out GPUs can be repurposed to do more than generate graphical scenes.

Many of the machine learning techniques behind artificial intelligence (AI), such as deep neural networks[7], rely heavily on various forms of “matrix multiplication”.

This is a mathematical operation where very large sets of numbers are multiplied and summed together. These operations are well suited to parallel processing, and hence can be performed very quickly by GPUs.
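As a tiny worked example (again in Python with NumPy, purely for illustration), multiplying two 2×2 matrices shows why this suits parallel hardware: every entry of the result is an independent multiply-and-sum.

```python
import numpy as np

# Two tiny matrices; the matrices inside real AI models have millions of entries.
a = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# Each entry of the result is its own multiply-and-sum, e.g. the top-left
# entry is 1*5 + 2*7 = 19. No entry depends on any other, so a GPU can
# compute huge numbers of them at the same time.
result = a @ b
print(result)
# [[19. 22.]
#  [43. 50.]]
```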

What’s next for GPUs?

The number-crunching prowess of GPUs is steadily increasing, thanks to growth in the number of cores and their operating speed. These gains are driven primarily by advances in chip manufacturing by companies such as TSMC[8] in Taiwan.

The size of individual transistors – the basic components of any computer chip – is decreasing, allowing more transistors to be placed in the same amount of physical space.

However, that is not the entire story. While traditional GPUs are useful for AI-related computation tasks, they are not optimal.

Just as GPUs were originally designed to accelerate computers by providing specialised processing for graphics, there are now accelerators designed to speed up machine learning tasks. These accelerators are often referred to as “data centre GPUs”.

Some of the most popular accelerators, made by companies such as AMD and NVIDIA, started out as traditional GPUs. Over time, their designs evolved to better handle various machine learning tasks, for example by supporting the more efficient “brain float[9]” number format.
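For a sense of what that format changes, here is a minimal sketch assuming the PyTorch library is available (an assumption, as the article does not name any particular software): converting a standard 32-bit number to brain float halves its size while keeping roughly the same range, at the cost of some precision.

```python
import torch  # assumption: PyTorch is installed

x = torch.tensor([3.14159265], dtype=torch.float32)  # 32 bits per number
y = x.to(torch.bfloat16)                              # 16 bits per number

print(x)  # tensor([3.1416])
print(y)  # tensor([3.1406], dtype=torch.bfloat16) -- slightly less precise,
          # but half the memory and half the data to move around
```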

NVIDIA’s latest GPUs have specialised functions to speed up the ‘transformer’ software used in many modern AI applications. NVIDIA[10]

Other accelerators, such as Google’s Tensor Processing Units[11] and Tenstorrent’s Tensix Cores[12], were designed from the ground up for speeding up deep neural networks.

Data centre GPUs and other AI accelerators typically come with significantly more memory than traditional GPU add-on cards, which is crucial for training large AI models. Generally speaking, the larger the AI model, the more capable and accurate it is.

To further speed up training and handle even larger AI models, such as ChatGPT, many data centre GPUs can be pooled together to form a supercomputer. This requires more complex software to properly harness the available number-crunching power. Another approach is to create a single very large accelerator, such as the “wafer-scale processor[13]” produced by Cerebras.

Are specialised chips the future?

CPUs have not been standing still either. Recent CPUs from AMD and Intel have built-in low-level instructions that speed up the number-crunching required by deep neural networks. This additional functionality mainly helps with “inference” tasks – that is, using AI models that have already been developed elsewhere.

To train the AI models in the first place, large GPU-like accelerators are still needed.

Read more: Clampdown on chip exports is the most consequential US move against China yet[14]

It is possible to create ever more specialised accelerators for specific machine learning algorithms. Recently, for example, a company called Groq has produced a “language processing unit[15]” (LPU) specifically designed for running large language models along the lines of ChatGPT.

However, creating these specialised processors takes considerable engineering resources. History shows the usage and popularity of any given machine learning algorithm tends to peak and then wane – so expensive specialised hardware may become quickly outdated.

For the average consumer, however, that’s unlikely to be a problem. The GPUs and other chips in the products you use are likely to keep quietly getting faster.

References

  1. ^ tens of thousands of dollars (www.tomshardware.com)
  2. ^ soar past US$2 trillion (www.reuters.com)
  3. ^ computer-aided design (en.wikipedia.org)
  4. ^ decompressing (en.wikipedia.org)
  5. ^ cores (en.wikipedia.org)
  6. ^ Demand for computer chips fuelled by AI could reshape global politics and security (theconversation.com)
  7. ^ deep neural networks (en.wikipedia.org)
  8. ^ TSMC (www.anandtech.com)
  9. ^ brain float (en.wikipedia.org)
  10. ^ NVIDIA (nvidianews.nvidia.com)
  11. ^ Tensor Processing Units (en.wikipedia.org)
  12. ^ Tensix Cores (tenstorrent.com)
  13. ^ wafer-scale processor (www.cerebras.net)
  14. ^ Clampdown on chip exports is the most consequential US move against China yet (theconversation.com)
  15. ^ language processing unit (wow.groq.com)

Read more: https://theconversation.com/what-is-a-gpu-an-expert-explains-the-chips-powering-the-ai-boom-and-why-theyre-worth-trillions-224637
