Is Google Coral worth buying? And is it better than Raspberry Pi 4 or the Jetson Nano? Which one is the best?
Both Google and NVIDIA released a development board targeted towards Edge AI to attract developers, tinkerers, and hobbyists. But which one is right for you?
In this post, we take a look at various development boards and the best options on the market right now. Let’s get started!
What Are Some of the Best Development Boards on the Market?
First up, the Google Coral Dev Board uses the best of Google’s machine learning tools to make AI more accessible. The board boasts a removable system-on-module (SoM) featuring the Edge TPU and looks a lot like a Raspberry Pi.
Next, we have the Jetson Nano:
Jetson Nano is a new development board from Nvidia that is targeted towards AI and machine learning. It comes with a GPU with 128 CUDA cores and a bunch of software and examples pre-installed to get you started.
Lastly, we have the Raspberry Pi 4, the latest product in the popular Raspberry Pi range of computers. It comes with up to 4GB of RAM (four times that of any previous Pi), a faster CPU and GPU, faster Ethernet, dual-band Wi-Fi, twice as many HDMI outputs, and two USB 3 ports.
So, Which One Do You Prefer?
We collected opinions from the community, so let’s see what they say.
Google Coral is limited to TensorFlow Lite, IIRC. But Jetson supports PyTorch as well. To me, that makes Jetson the preferred option of the two, as I’m more familiar with PyTorch. However, quantization and pruning support are way better on TensorFlow (for now).
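To see why quantization support matters here: post-training quantization maps float weights onto 8-bit integers via a scale and zero-point, which is what lets a model fit the Edge TPU at all. The sketch below is a toy, stdlib-only illustration of that affine mapping, not TensorFlow Lite’s actual implementation.

```python
# Toy illustration of affine (asymmetric) int8 quantization -- the core
# idea behind TF Lite post-training quantization. A simplified sketch,
# not the real TF Lite code path.

def quantize(values, num_bits=8):
    """Map floats to unsigned ints via a scale and zero-point."""
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(values), max(values)
    lo, hi = min(lo, 0.0), max(hi, 0.0)  # the range must include zero
    scale = (hi - lo) / (qmax - qmin) or 1.0
    zero_point = round(qmin - lo / scale)
    q = [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from quantized ints."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.5, -0.2, 0.0, 0.7, 2.3]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, round(max_err, 4))
```

The round-trip error stays below one quantization step (`scale`), which is why 8-bit inference loses so little accuracy for most CNNs.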
According to Mxbonn, the Raspberry Pi 4 is really not comparable with the other two, as it has no GPU or TPU suited to deep learning.
I would say it depends on the application. What DL model? Inference only or training too? How many inferences per second do you expect?
According to one dev, Yusuf Bengio, in terms of performance per watt: Google Coral > Jetson > Raspberry Pi.
And in terms of software ecosystem (i.e. framework support, additional hardware support, etc.), he would order them the other way: Google Coral < Jetson < Raspberry Pi.
So that’s what the community had to say. Here’s my take:
Best Flexibility: Jetson Nano
Upside: Good performance and runs anything you can run on your computer (that fits in 4GB RAM)
Downside: 4-year-old SoC, decent all-round performance but not the most efficient
Bonus: Great software/library support, comes with heatsink, etc.
Best Performance/Wattage: Coral Dev Board
Upside: newest chip, most efficient
Downside: Locked into TF Lite; you’re at the mercy of what Google supports; most expensive
Bonus: Probably the best choice (perf/watt) if you know what you’re doing, sometimes faster than Jetson Nano
Cheapest: Raspberry Pi 4
Upside: Cheap, good tutorials for most stuff, probably can run any models on CPU
Downside: CPU-only, my estimation (extrapolated from RPi 3) is you can get 3–5 FPS on normal TF + MobileNet. Definitely needs extra heatsink + fan for running sustained inference.
Bonus: “It’s a Raspberry Pi. I love these boards for some reason. You can pair it later with a USB accelerator if you wish. Also, you can overclock it for lulz.” — tlkh
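Rather than extrapolating FPS from an older board, it’s easy to benchmark directly on the hardware you have. Below is a minimal timing harness; `fake_model` is a stand-in for whatever you are actually measuring (e.g. a TF Lite interpreter’s `invoke()` call), not a real model.

```python
import time

def benchmark_fps(run_inference, warmup=3, iterations=20):
    """Measure sustained inferences per second for a callable.

    `run_inference` is a placeholder for the real model call.
    Warm-up runs are excluded so one-time setup cost (cache
    fills, lazy allocation) doesn't skew the average.
    """
    for _ in range(warmup):
        run_inference()
    start = time.perf_counter()
    for _ in range(iterations):
        run_inference()
    elapsed = time.perf_counter() - start
    return iterations / elapsed

# Stand-in workload: roughly 10 ms per "inference".
def fake_model():
    time.sleep(0.01)

fps = benchmark_fps(fake_model)
print(f"{fps:.1f} FPS")
```

Run this with the board under sustained load (and the heatsink/fan fitted) to see whether thermal throttling eats into the steady-state number.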
Lastly, I tried to answer the same question for my own use case: building a platform for RC cars.
If you want to run a model supported by Coral (a CNN; no RNNs), RPi 4 + Coral is the best option. It is super fast compared to the Jetson Nano.
But if you want to run an RNN model, you need the Jetson Nano, which is moderately fast.
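That CNN-vs-RNN split boils down to a compatibility check: before committing to Coral, scan the model’s op list for recurrent ops the Edge TPU toolchain won’t map. The sketch below is illustrative only; the op names and the unsupported set are assumptions, not the Edge TPU compiler’s actual whitelist.

```python
# Hypothetical pre-flight check: does a model's op list look Edge TPU
# friendly? The recurrent-op set here is illustrative -- consult the
# Edge TPU compiler docs for the real supported-ops list.

RECURRENT_OPS = {"LSTM", "GRU", "RNN", "UNIDIRECTIONAL_SEQUENCE_LSTM"}

def pick_board(model_ops):
    """Suggest a board per the discussion above:
    CNN-style models -> RPi 4 + Coral; recurrent models -> Jetson Nano."""
    if RECURRENT_OPS & set(model_ops):
        return "Jetson Nano"
    return "RPi 4 + Coral"

print(pick_board(["CONV_2D", "RELU", "SOFTMAX"]))  # CNN-style model
print(pick_board(["CONV_2D", "LSTM", "SOFTMAX"]))  # recurrent model
```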
Here are some additional thoughts and conclusions from the community:
“The best would be if Coral supported RNN models; that would be awesome. From my perspective, an autonomous RC car needs RNN models, so I decided to go with the Nano.” — melgor89
“Have you considered an RPi 4 + Coral USB Accelerator? It may be a good combination. But to give my own opinion: I wouldn’t suggest the Coral Dev Board. I would prefer an RPi 4 + USB Accelerator because of the software ecosystem. The biggest disadvantage is the support for TensorFlow Lite only.
RPi 4, by itself, is not powerful enough.
Jetson is a good compromise. It supports more libraries and is more powerful than the RPi 4 by itself.” — vladfedchenko
“My bet for on-edge inference is on TPU (Coral) or CPU-only (Raspberry) solutions.
“For Coral: TPU provides a very attractive performance per watt. For many lightweight inference tasks (e.g. face detection, segmentation, object detection), Coral would be the best solution. Google provides support for both modeling using tf-lite and inference using mediapipe. You can train, optimize, and deploy an entire system in a very short period of time and expect production quality.
“On the other end of the spectrum is CPU-only inference (the Raspberry), which I’m very excited about. For CPU, you need highly quantized, specialized models. Several startups (e.g. xnor.ai) are working on this, but they want to sell you the model, not the hardware. If the software toolkit becomes commoditized, then these solutions will become very popular.
“The biggest problem with Jetson is that it is not developed end-to-end by the same company. Facebook does not care about embedded systems, nor do they want to help Nvidia sell GPUs. So Jetson is always a second-class citizen in the PyTorch community. On top of that, using GPUs in embedded systems, unless you have a very specialized use case, is a bad choice.” — adelope
I hope this overview and product comparison was helpful. Let us know your thoughts on the Google Coral, Raspberry Pi 4, Jetson Nano, or all three, in the comments section.