With VRAM being king in the AI world, people started buying Apple computers because their memory is unified, shared by both the CPU and GPU. How else would you get 128 GB of RAM so “cheaply,” with a good portion of it usable by the GPU? This seemed like a more cost-effective way of getting a large pool of VRAM, and as a plus you could do it in a smaller form factor with less power draw and fewer technical hurdles (like those you face when trying to build a 4-GPU rig).
Nvidia surprised us by announcing an ARM-based computer with 128 GB of unified memory for $3,000, roughly $1,700 less than you would pay Apple for a comparable configuration.
This machine is reportedly able to run models up to 200 billion parameters in size, which is simply great.
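A quick back-of-envelope calculation shows why the 200-billion-parameter figure is plausible with 128 GB of unified memory. Assuming the model is quantized to around 4 bits per weight (my assumption, not a vendor spec), the weights alone come to about 100 GB:

```python
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate GB needed just for the weights.

    Ignores KV cache, activations, and runtime overhead, which also
    need to fit in the remaining memory.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB

# A 200B-parameter model at 4 bits per weight:
print(weight_memory_gb(200, 4))  # 100.0 -> ~100 GB of weights in a 128 GB pool
```

At 8 bits per weight the same model would need about 200 GB and wouldn't fit, so low-bit quantization is what makes the claim work.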
We hope this product won’t be in such high demand that it becomes hard to get hold of.