For a long time, I couldn’t figure out why the 2016 MBPr15 had a discrete GPU. It’s not significantly faster than the integrated GPU, it’s no good for games, there aren’t many games on Mac anyway, and it doesn’t provide any significant improvement for creatives (Photoshop, Final Cut, etc).
It comes at a large cost. It takes a huge amount of space, reducing battery capacity and increasing weight. It requires extra cooling, consuming space, reducing battery capacity and increasing weight. The extra power consumption… you guessed it… costs battery capacity and increases weight. dGPUs on the previous 15” line have been a nuisance (massive, unexplained power consumption, lower customer satisfaction). And because it’s AMD, you can’t do any significant GPGPU workloads (NVIDIA’s CUDA is miles ahead of AMD’s stuff).
I think I’ve worked out why Apple had to put a dGPU on the MBP15.
Windows machines are going to standardise on 4K. It will be cheap, thanks to consumer TVs, and it’s well supported already by hardware.
Apple is pushing 5K.
1920px wide is too coarse on 27” and 3840px is too fine for 1:1 use; 5120px is exactly double the classic 2560×1440 layout, so everything stays the familiar size, just sharper. Windows still lags with HiDPI support (it’s basically there, but there are a lot of rough edges). With the sole exception of the MacBook Air 13”, Apple now has HiDPI on every single device.
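A quick pixel-density calculation makes the point. The figures below are back-of-the-envelope numbers for a 27” panel, not official specs:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a display with the given pixel dimensions and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 27" panel at three candidate resolutions:
print(round(ppi(1920, 1080, 27)))  # ~82 PPI: visibly coarse at desk distance
print(round(ppi(3840, 2160, 27)))  # ~163 PPI: too fine for 1:1, awkward fractional scaling
print(round(ppi(5120, 2880, 27)))  # ~218 PPI: a clean 2x of 2560x1440
```

5K lands right where a clean 2× scale factor works, which is why Apple picked it over the cheaper 4K panels.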
When Apple released the 5K iMac, they had to design and manufacture custom chips. No PC manufacturer wants to do that. They ship whatever they can buy off the shelf.
Current 5K tech requires two display streams: a single DisplayPort 1.2 link doesn’t have the bandwidth for 5120×2880 at 60Hz. The image isn’t transmitted over the wire as one super-high-resolution display; it’s split into two smaller halves and reconstructed on the display. From the PC’s point of view (but hidden by the OS), a single 5K display is two small displays arranged a certain way.
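The bandwidth arithmetic is easy to check. This is a rough sketch assuming 24-bit color and ignoring blanking overhead (which only makes the single-stream case worse):

```python
def payload_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Uncompressed pixel payload in Gbit/s (ignores blanking overhead)."""
    return width * height * hz * bpp / 1e9

# DisplayPort 1.2: 4 lanes x 5.4 Gbit/s raw, minus 8b/10b encoding overhead.
DP12_EFFECTIVE_GBPS = 21.6 * 0.8  # = 17.28

five_k = payload_gbps(5120, 2880, 60)   # ~21.2 Gbit/s
print(five_k > DP12_EFFECTIVE_GBPS)     # one DP 1.2 stream can't carry 5K@60
print(five_k / 2 < DP12_EFFECTIVE_GBPS) # split into two halves, it fits
```

Each half is comfortably under the per-stream limit, hence the tiled two-stream design.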
You can see where I’m going with this. Intel’s iGPUs have three display outputs. You need two to run a 5K display. So with the iGPU, you can run the internal display and one external 5K display.
That’s kinda shoddy for a top-of-the-line machine; people expect to run two external displays. So the only thing Apple can do is add more outputs. How do you add more outputs? You need to add a whole GPU!
AMD was probably chosen because NVIDIA’s parts usually have the same three-output limitation. (Possibly one could run the iGPU and dGPU together, but I imagine there would be technical headaches, and the performance gap would be difficult to explain.) AMD’s GPUs typically support six outputs. That’s exactly three 5K displays.
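The output budget works out like this (a sketch using the output counts quoted above; `external_5k_budget` is my own illustrative helper, not anything from a real driver API):

```python
STREAMS_PER_5K = 2  # current 5K panels are driven as two tiled halves

def external_5k_budget(outputs: int, internal_panel: bool = True) -> int:
    """How many external 5K displays fit in a GPU's display-output budget."""
    spare = outputs - (1 if internal_panel else 0)
    return spare // STREAMS_PER_5K

print(external_5k_budget(3))                        # Intel iGPU: 1 external 5K
print(external_5k_budget(6, internal_panel=False))  # AMD dGPU: 3 x 5K worth of streams
```

Three outputs minus one for the built-in panel leaves exactly one 5K’s worth of streams; six outputs covers three 5K displays.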
Also, you don’t want to add a big GPU to a laptop (unless it’s gaming specific). GPU peak power consumption is way higher than for CPUs. You need to design the machine to remove the consequent heat output. Some software runs the GPU hard unnecessarily, just as it does the CPU. This isn’t obvious to users, but they complain that random software (like Flash) kills their battery life and makes the machine hot. For these users (the vast majority!) the best thing you can do is limit power consumption by giving them underpowered hardware. They probably won’t notice and can’t do much damage.
Intel’s iGPUs will probably eventually support more outputs (and/or single-stream 5K). The dGPU will then be totally redundant and can be removed for weight reduction or battery life improvement. This won’t be a reality for a few more years, at least. In the meantime, Apple has a unique feature and competitive advantage.
Intel is putting more and more GPU power onto their regular CPUs all the time. There’s not really a middle ground any more. Current iGPUs are good enough for all non-gaming tasks. Gaming requires a high-power dGPU, and Apple’s not catering to that market.