Development kit for AI

Greetings!

I need to find a single-board computer capable of running Python. It should be powerful enough to run generative AI on CUDA cores. I need at least 4 GB of RAM and M.2 SSD support. It must be portable and able to run on batteries for at least 5 hours, and it should also have Bluetooth. Do you offer battery kits for these boards, and how can I purchase the batteries that go with them? What hardware would you recommend? Do you have something that fits this project?

Thanks

Sounds like an M3 (Pro) based MacBook to me. I've run generative image AIs and LLMs on an M1 Mac mini, and Apple Silicon is already capable of such things despite being two generations behind.

CUDA-enabled gaming laptops won't reach 5 h of battery life; 2-3 h is realistic under load. An external Li-Ion power bank could extend that time, e.g. supplying power over a USB-C connection.

I doubt you'll find similar convenience in a CUDA dev board. I also haven't seen off-the-shelf solutions with integrated batteries from NVIDIA.

Third-party vendors like SparkFun appear to offer battery kits for Jetson embedded devices.

What are your size restrictions, by the way?

I am looking for something easy to transport and use daily. I need the CUDA power for AI processing, but it has to be small, light, and portable enough to be carried on the body, and it should run on battery power for several hours.

The objective is for it to be carried on the body. Since this is only for prototyping, I want to prove the concept; later on, I will try to make the design as slim and practical as possible. For now, it just needs to be operational.

Thanks, Mr.

Could the application run on a high-end smartphone? That's the pinnacle of portability.
Yes, it's not CUDA, but AI toolkits (whether Python- or C++-based) can generally hide the underlying architecture and adapt to whatever the hardware offers.
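To illustrate that kind of backend fallback, here is a dependency-free sketch (the function name and boolean flags are made up for illustration; in PyTorch, the real checks would be `torch.cuda.is_available()` and `torch.backends.mps.is_available()`):

```python
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Pick the best compute backend the hardware offers.

    Mirrors the fallback order many Python AI toolkits use:
    NVIDIA CUDA first, then Apple's Metal backend (MPS), then plain CPU.
    """
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"
    return "cpu"

# On an NVIDIA machine:
print(pick_device(cuda_available=True, mps_available=False))   # cuda
# On Apple Silicon:
print(pick_device(cuda_available=False, mps_available=True))   # mps
```

Application code written against this pattern never names the hardware directly, which is why the same model script can run on a CUDA laptop, an Apple Silicon Mac, or a plain CPU.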


Where can I find a phone capable of running CUDA and Python at once?

It seems the NVIDIA Tegra K1 could do it, but that chip has been removed from the market.

If your trained ML model is stored in the binary formats of Python-based AI/ML toolkits (PyTorch, Keras, etc.), it can be imported into e.g. Core ML and run natively on an iPhone.


You are telling me that my Python-based program can be exported to Core ML, allowing the application to run on an iPhone, and that I can use Xcode on my MacBook Pro to test my project. Correct?

What about the CUDA part? Can the iPhone emulate CUDA? I need Python and CUDA workloads.