Are discrete Apple graphics cards inevitable?


Key findings

  • Apple's custom GPUs offer strong performance, but there is no way to expand beyond what is built into the SoC.
  • GPU performance scales by combining chips, which incurs some overhead and limits how far the approach can scale.
  • For users who require more performance, the development of dedicated Apple graphics cards may be necessary.



Apple Silicon has been a huge success for the technology company, with memory, CPU and GPU all custom-built and combined into a single system on a chip. By and large, Apple customers seem happy with the performance offered, but unlike Apple's Intel systems, there is no support for graphics cards beyond the performance that Apple sells as standard.

While Apple's best integrated GPUs are nothing to sneeze at, there are plenty of use cases for better GPU hardware for enthusiasts and professionals alike. If Apple doesn't include Intel, AMD, or Nvidia in its new ecosystem, does that mean the company will have to develop its own cards?


How Apple scales GPU performance

At the time of this writing, the performance of the Apple Silicon GPU, like everything else about the computer you buy, is fixed at the time of purchase. There are no later upgrades to RAM or primary storage. Each generation of Apple Silicon has a base chip, like M1, M2, M3, and M4 so far. That's the baseline, and Apple offers more performance by selling scaled-up versions of the base SoC, the Pro and Max chips. At the top of the stack, the “Ultra” chips “glue” two complete Max SoCs together using an extremely fast, wide die-to-die interconnect that Apple calls UltraFusion.

The Apple M1 chip family.


This has nothing to do with older attempts at multi-GPU setups like Nvidia's SLI. From a software perspective there is only one GPU, and the delegation of graphics work across the fused silicon is handled entirely by Apple's hardware and software, namely the GPU drivers and the Metal graphics API.
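
To see this from the software side, here is a minimal Swift sketch using Metal's standard device-enumeration API. On an Apple Silicon Mac it will typically list a single GPU even on an Ultra, whereas an Intel Mac with an eGPU attached reports several devices. The calls are ordinary Metal ones; the output formatting is my own.

    import Metal

    // Minimal sketch: enumerate every GPU that Metal exposes to software.
    // On Apple Silicon, even an Ultra chip shows up as one MTLDevice,
    // because the two fused GPU blocks are presented as a single GPU.
    let devices = MTLCopyAllDevices()
    print("Metal reports \(devices.count) GPU(s)")

    for device in devices {
        print("Name:", device.name)
        print("  Unified memory:", device.hasUnifiedMemory)
        print("  Removable (eGPU):", device.isRemovable)
        let gigabytes = Double(device.recommendedMaxWorkingSetSize) / 1_073_741_824
        print("  Recommended max working set: \(gigabytes) GB")
    }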

Scaling is a problem

While simply throwing more silicon at the problem is a viable strategy, it also has some disadvantages.

For one, the GPU performance on offer does not scale perfectly. Whenever two GPU blocks are fused together, some of the theoretical performance is lost to overhead, so an Ultra does not deliver exactly twice the graphics performance of a single Max. It also means that the most powerful GPU Apple can offer is bounded by how many complete system-on-a-chip modules it can combine.
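
To make the overhead point concrete, here is a purely illustrative Swift sketch. The per-chip throughput and the 90 percent scaling-efficiency figure are assumptions chosen for demonstration, not measurements of any real Apple chip.

    import Foundation

    // Illustrative only: the numbers below are assumptions, not benchmarks.
    // Model the combined GPU throughput of fused chips as
    // baseline * count * efficiency^(count - 1).
    func combinedThroughput(baselineTFLOPS: Double,
                            chipCount: Int,
                            scalingEfficiency: Double) -> Double {
        baselineTFLOPS * Double(chipCount) * pow(scalingEfficiency, Double(chipCount - 1))
    }

    let perChip = 14.0                                // assumed per-chip TFLOPS
    let ideal = perChip * 2                           // perfect 2x scaling
    let realistic = combinedThroughput(baselineTFLOPS: perChip,
                                       chipCount: 2,
                                       scalingEfficiency: 0.9)
    print("Ideal 2x: \(ideal) TFLOPS, with assumed overhead: \(realistic) TFLOPS")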


I foresee a solution where Apple could make a GPU block the size of a Pro chip, for example, and then have something the size of a Max chip that is at least 50% GPU, but that would take away some of the manufacturing advantages Apple currently enjoys over the competition.

It's not just about sheer silicon area and interconnect effort; there's also a limit to how much heat and power you can pack into a single chip package, so combining multiple CPUs and GPUs is only feasible to a certain extent. It's similar to the challenge faced by current gaming consoles like the PlayStation 5, where the combined CPU and GPU package has to operate within a heat and power envelope of around 200W.


An Apple graphics card makes sense

Frostpunk 2 running on a MacBook Pro.

The only current Mac that can physically accommodate a graphics card is the Mac Pro. However, even if you plug a card into one of its PCIe slots, nothing happens, because macOS on Apple Silicon has no driver support for third-party GPUs.

So you might think that this limits the customer base too much for an Apple graphics expansion card, but that doesn't take into account eGPUs, nor the fact that Apple may release more Macs with expansion slots in the future.


A dedicated graphics card from Apple makes sense when professional or enthusiast users need more GPU power than the best Apple can pack into an SoC, which today lands broadly in the territory of mid-range GPUs like the Nvidia RTX 3070 and RTX 3080.

It also makes sense to develop dedicated GPU silicon chips for high-end MacBooks, and of course there are now many AI-related use cases for GPUs, such as running large language models (LLMs) and AI image generation. Current Apple Silicon models are also capable of this, but not at the speed and scale that serious users would need.
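
As a rough illustration of the memory side of those AI workloads, the Swift sketch below asks Metal how much unified memory the GPU is recommended to keep resident and checks whether an example model would fit. The 70-billion-parameter, 4-bit-quantized model is an assumed example, not a reference to any specific product.

    import Foundation
    import Metal

    // Rough sketch: would a quantized LLM fit in the portion of unified
    // memory that Metal recommends the GPU keep resident? The model size
    // below is an assumed example for illustration.
    guard let device = MTLCreateSystemDefaultDevice() else {
        fatalError("No Metal device available")
    }

    let parameters = 70_000_000_000.0      // assumed 70B-parameter model
    let bytesPerParameter = 0.5            // assumed 4-bit quantization
    let modelBytes = parameters * bytesPerParameter
    let budgetBytes = Double(device.recommendedMaxWorkingSetSize)

    print(String(format: "Model needs %.1f GB, GPU working-set budget is %.1f GB",
                 modelBytes / 1e9, budgetBytes / 1e9))
    print(modelBytes < budgetBytes
          ? "Could fit, before counting KV cache and other overhead"
          : "Too large for this machine's GPU memory budget")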

Gaming is arguably the weakest case for such a card. Although gaming on the Mac has improved by leaps and bounds, it is far from Apple's main focus, and eGPUs in particular are not the best fit for gaming.


Thunderbolt could be the bottleneck

A MacBook with a Thunderbolt 4 cable connected.

As for eGPUs, one of the reasons for their poor gaming performance is the bandwidth limitation of Thunderbolt (or alternatives like OCuLink). Thankfully, that is set to change with Thunderbolt 5, which doubles bandwidth to 80Gbps (with a 120Gbps boost mode) and should make real-time 3D applications like video games much more viable over an external link.

However, for GPU tasks that operate mostly on data already resident in the card's memory, such as AI workloads or GPU-accelerated video rendering, there is no need to constantly shuttle data between the CPU and GPU, so link bandwidth matters far less. Most professional GPU workloads could run on an eGPU without much of a performance penalty.
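
The intuition can be shown with some rough arithmetic. The link speeds below are the published Thunderbolt figures, while the dataset size and compute time are assumptions: for a workload that copies its data to the card once and then works on it locally, the transfer cost is a small slice of the total.

    import Foundation

    // Rough arithmetic: one-time transfer over an eGPU link versus time
    // spent computing on data that stays in the card's memory.
    // Dataset size and compute time are assumed for illustration.
    let datasetGB = 24.0                   // assumed working set, copied once
    let computeSeconds = 600.0             // assumed ten minutes of on-card work

    let links = [("Thunderbolt 3/4", 40.0), ("Thunderbolt 5", 80.0)]   // Gbps
    for (name, gbps) in links {
        let transferSeconds = datasetGB * 8.0 / gbps
        let share = transferSeconds / (transferSeconds + computeSeconds) * 100.0
        print(String(format: "%@: %.1f s transfer, %.2f%% of total time",
                     name, transferSeconds, share))
    }

A game, by contrast, streams fresh geometry and textures every frame, so the link speed sits directly in the critical path, which is why Thunderbolt 5 matters far more for gaming than for these resident-memory workloads.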


Apple may now be targeting other customers

The entire premise of my argument is based on the idea that Apple cares about the segment of the market that needs more GPU power than it can muster with an Ultra chip configuration. If the company doesn't release a discrete GPU solution, that's a sign to me that it no longer cares about that segment of the market.


These are the people who have traditionally used Macs for professional 3D rendering work or high-demand creative tasks like big-budget video editing. Without graphics card support in computers like the Mac Pro or eGPU support for any of Apple's Thunderbolt-equipped systems, people will have no choice but to look elsewhere.
