Lots of data here, but the key distinction, if you are interested, is that a “big system” means a fully developed computer (the winners are built around graphics cards), while the small AI computers are microcontrollers, the kind that embed into devices or machines.

They measure capability by how long it takes to “train” on a set of data to reach a target level of accuracy, in this case for image classification (identifying what an image depicts) and for natural language processing (speech interpretation). And a few other tests.
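The “time to train to a target quality” idea is simple to sketch. This is a hedged toy illustration, not the actual benchmark: a tiny perceptron on made-up data stands in for the real models, and the names (`time_to_train`, `make_data`) are mine.

```python
import random
import time

# Toy sketch of an MLPerf-style "time to train" measurement.
# The model, data, and accuracy target are stand-ins, not the real benchmark.

def make_data(n=200, seed=0):
    """Generate a linearly separable toy dataset of (x1, x2, label) tuples."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        x1, x2 = rng.uniform(-1, 1), rng.uniform(-1, 1)
        label = 1 if x1 + x2 > 0 else 0
        data.append((x1, x2, label))
    return data

def accuracy(w, b, data):
    """Fraction of points the current weights classify correctly."""
    correct = sum(1 for x1, x2, y in data
                  if (1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == y)
    return correct / len(data)

def time_to_train(data, target=0.95, lr=0.1, max_epochs=100):
    """Train a perceptron until it hits the target accuracy and report
    the wall-clock seconds it took -- the benchmark's figure of merit."""
    w, b = [0.0, 0.0], 0.0
    start = time.perf_counter()
    for _ in range(max_epochs):
        for x1, x2, y in data:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred          # perceptron update rule
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
        if accuracy(w, b, data) >= target:
            break
    return time.perf_counter() - start, accuracy(w, b, data)

elapsed, acc = time_to_train(make_data())
print(f"reached {acc:.0%} accuracy in {elapsed * 1000:.1f} ms")
```

The real benchmark does the same thing at scale: everyone trains the same model to the same quality bar, and the clock decides the winner.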

For the “biggest AI computer,” the Nvidia A100 wins. It costs about $15,000 for the basic model.

For the “smallest AI computer,” it looks like the GreenWaves Technologies RISC-V core is the winner, closely followed by the Syntiant Core 2. All other competitors were an order of magnitude slower on these tests.

This is good to know, since I intend to take over the world with AI. Stay tuned….

New Records for the Biggest and Smallest AI Computers – Nvidia H100 and Intel Sapphire Rapids Xeon debut on MLPerf training benchmarks