Xavier Teardown

Over the weekend I did a basic teardown of the Xavier. Attached are my pics.

Thanks,
Andrew Baker


So you couldn’t get the heatsink off?

Found another picture of the heatsink off, though. Plus die photos!

The devkit has an integrated thermal solution that does not have the TTP (thermal transfer plate); the OEM module will have the TTP. So it's not recommended to remove the heatsink from the devkit unless you want to redo the thermal pads and paste.

Negative. It is solidly attached. I even removed the fasteners. It is not coming out and I wasn’t about to pry it out either :)

I bet a hot air gun set below the solder melting point would do it!
Totally worth it.
DO IT! DO IT! :-)

Just Lol :D

If NVIDIA wants to provide me with a sacrificial Xavier, then I am game :)

I know this thread is dead, but we managed to do this:
https://github.com/ShreyasSkandanS/nvidia_xavier_utils/blob/master/decasing_xavier.md

Obviously it isn’t recommended but we did it anyway.

Fantastic stuff @singularity7. Nice work.

I’m disappointed…if you had posted on April 1 I could have quite seriously warned that a band saw runs a minor risk of voiding the warranty! :P

I am curious though…was that a working unit, and were you able to get it back together such that it still worked?

Haha, not an April fool’s joke I’m afraid.

Yes, it still works and sits on my desk. I hesitate to use it without some form of active cooling though.

Try conformal coating the parts that don't need cooling (the connectors would probably need sealing too, though that's more because the cable jacket wicks coolant than because of the contacts), and then submersion cooling; it could be as simple as mineral oil. Not that it needs special cooling, but it would make an awesome desk display and a great conversation piece (I'm not into the bling of LEDs all over... but a working submerged Xavier would make a nice exception).

It is cool to do this, but I don't know why NVIDIA would produce the kit like that. It doesn't seem reasonable.

It is totally reasonable! The module needs a certain amount of cooling, and NVIDIA has significant experience with cooling solutions from all their desktop graphics card development.
A solid thermal interface will always perform better than a loosely attached, oxidized aluminum surface rubbing against the module.

Also note that this is for the devkit. If you buy the module commercially, you can use your own thermal solution and get the module without the devkit cooler.

So how do you redo the thermal paste?