Electronically Assisted Astronomy with a Jetson Nano

Hello,

Merry Christmas to everyone!

A year and a half ago, I received the Jetson Nano Dustin sent to me. Quite some work has been done since!

Clear sky!

Alain

The last images from 2020.

Both were processed with the Jetson Xavier NX (the Nano can do the same processing, just a little more slowly).

Here is a comparison between the Clementine satellite and the Celestron C9.25.

Next year, I have to work on my motorized mount, but as I have no experience with that kind of work, it will take me some time to build a new mount that is more solid, more precise and more compact.

Alain

For the courageous, you can find my software here:

https://github.com/AlainPaillou/ProjectX

Two programs:

  • one for the Raspberry Pi camera V2 (colour version)
  • one for video processing (no camera acquisition)

These programs can be used with the Jetson Nano and Jetson Xavier NX. They probably still have bugs, so you will need some courage to test them.

You can use them freely for personal use. For commercial use, please contact me first.

Next year, I will release my full software with camera acquisition (for the ZWO ASI178MC camera).

Alain

Hello,

I had some free time today, so I was able to put my software on GitHub.

This time, it is the full software, able to control my ZWO ASI178MC camera.

Two versions of the software:

  • PC version (under Windows)
  • Jetson version (Nano 2GB & 4GB, and Xavier NX)

You can find the source code and the libraries here:

https://github.com/AlainPaillou/JetsonSky-SkyPC-distrib

Now you have everything I have.

It is free for personal use. For commercial use, contact me.

Good luck.

Alain

Hello

Happy New Year to everyone! I wish you the best for 2021.

Dustin, Chelsea, Rick: Happy New Year from France!

Alain

Happy new year Alain! Best wishes for 2021!

Hello,

the first video of 2021. The Jetson Xavier NX handles the camera setup, acquisition and real-time processing, using the JetsonSky software:

Clear sky!

Alain

Different kinds of real-time processing for deep-sky capture:

To get a realistic video (I mean a video with only the deep-sky signal), heavy processing is required:

  • to remove sensor noise (caused by the very high sensor gain)
  • to remove light pollution from the sky background
  • to correct lens problems such as vignetting, etc.
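
As an illustration of the last two points, here is a minimal NumPy sketch. These helper names are my own, not from JetsonSky, and a real calibration would use master dark and flat frames captured with the actual setup:

```python
import numpy as np

def flat_field_correct(frame, master_flat, master_dark):
    """Correct vignetting: subtract the dark offset, then divide by a
    normalized flat field so the darkened corners are brought back up."""
    signal = frame.astype(np.float32) - master_dark
    flat = master_flat.astype(np.float32) - master_dark
    flat /= flat.mean()                           # normalize around 1.0
    return np.clip(signal / np.maximum(flat, 1e-3), 0, 255)

def remove_light_pollution(frame, percentile=30):
    """Estimate a constant sky glow from the darker pixels and subtract it,
    keeping only the signal that rises above the background."""
    sky = np.percentile(frame, percentile)
    return np.clip(frame - sky, 0, 255)
```

This assumes the light pollution is a uniform glow; a gradient would need a smooth 2D background model instead of a single value.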

Alain

The difference is astonishing!

In fact, I have a lot of frustration with this system!

The computer and software parts of the system are fine. The Jetson Xavier NX is really great and has enough power to handle the video processing without any problem. The processing routines are fine, even if I can still make some improvements.

The mount works, even though it is a beta system (I am looking for help to build a better mechanical system in the future).

But whatever I do, I will still be stuck with the limitations of the camera sensor (the IMX178 is not bad, but it is simply not good enough) and of the lens.

My CCTV lens is only f/1.5 and my 50 mm lens is only f/1.4.

With a much better sensor such as the IMX294 (using BIN1 or BIN2) and a very fast lens such as a 50 mm f/0.95 or an 85 mm f/1.2, I could get very, very good results. But I have to be reasonable, because the price-to-use ratio is not good at all.

It would be very cool to have a complete system (good camera, good lens, good mount) with a VR headset (to view the video and control the mount) to make a space journey. But the price of that kind of system would be a bit high, and that kind of system is too complicated for me (for now).

Alain

I could very much imagine a 3D VR preview from binocular cameras, or even just 2D with a VR headset so long as the tracking is enabled to allow head movement to move the actual optics. Combine that with the Jetson software to improve the quality and a lot more people would become amateur astronomers just so they could look out at the sky. Money is of course always an issue with anything pushing the edge of quality while trying to remain within a budget.

Yes. Money, always money.

To get significant results, it is hard to build such a thing with low-cost equipment. The weakest element will cause serious issues. To get the wow effect, all parts of the system must be top class (or close to it).

A very good camera sensor and a good lens will remain expensive.

I keep all those things in mind. Maybe one day …

Alain

Hello,

I would like to say a few things about camera sensors, because sometimes I read posts that make me a bit nervous.

For daylight captures, any modern CMOS sensor can be fine, even sensors with small photosites (for example 1.15 or 1.45 µm) like the ones in your smartphone.

Be careful with the resolution of the sensor. Marketing says very high resolution is so cool! Yes, but if you want to apply processing to the video, you will probably run into problems, because high resolution means long processing times.

In my opinion, the Jetson Nano can manage full HD and the Jetson Xavier NX can manage 4K (if the processing is not too heavy).

Most “cheap” sensors are rolling shutter. That is fine for fairly static scenes, but if you want to handle moving objects, you will have to choose a global shutter sensor. Global shutter sensors are more expensive, and their noise is higher than that of rolling shutter sensors.

For industrial applications with moving objects, global shutter sensors are most of the time the right choice.

For low or very low light, you must keep in mind that small-photosite sensors are the wrong choice. You should choose photosites of at least 2.4-2.9 µm; 3.75 µm and above is suitable.
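
A quick back-of-the-envelope calculation shows why: the light collected per photosite scales roughly with its area, i.e. the pitch squared (ignoring quantum efficiency and microlens differences, which also matter in practice):

```python
def relative_light(pitch_um, ref_um=1.45):
    """Area ratio of a photosite versus a small smartphone-class photosite.
    Collected light scales roughly with area, i.e. the pitch squared."""
    return (pitch_um / ref_um) ** 2

for pitch in (1.45, 2.4, 2.9, 3.75):
    print(f"{pitch} um photosite collects ~{relative_light(pitch):.1f}x "
          f"the light of a 1.45 um photosite")
```

A 3.75 µm photosite therefore collects roughly 6.7 times the light of a 1.45 µm one, which is why small photosites fall apart as soon as the gain goes up.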

As you need to manage very low light conditions, you will have to set a high gain, which means heavy noise. A rolling shutter sensor can give lower noise, but it is not really suitable for moving objects.

Choosing a sensor is really complicated. You will need to match your application with the sensor specifications (photosite size, resolution, noise, rolling or global shutter, gain, sensitivity, etc.).

Very low cost solutions are often fine in daylight conditions with low gain. If you raise the gain, you will get terrible noise and very bad results. Managing noise in software is really complex and needs a very powerful CPU/GPU.

If someone tells you a very small photosite is really great for low light conditions, DO NOT TRUST THEM.

If you want to see the main specifications of common sensors (except very small-photosite sensors, because they are most of the time useless in difficult light conditions), you can take a look here (a French astrophotography forum):

https://www.webastro.net/forums/topic/155460-comparatif-des-derniers-capteurs-sony-cmos-imx385-imx294-et-imx183/

Unfortunately, it is in French. Sorry about that.

Clear sky

Alain

Hello,

an unexpected use of my adaptive absorber denoise filter (AADF): atmospheric turbulence management.

A video will explain the result better than words:

I think this is really interesting.
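
To give a rough idea of what a temporal denoise filter does, here is a plain exponential moving average over frames. This sketch is not the actual AADF algorithm, just the classic filter that illustrates why averaging over time reduces noise and, as a side effect, steadies turbulence:

```python
import numpy as np

class TemporalSmoother:
    """Illustration only: exponential moving average over video frames.
    Not the AADF algorithm."""

    def __init__(self, alpha=0.15):
        self.alpha = alpha      # lower alpha = stronger smoothing, more lag
        self.state = None

    def apply(self, frame):
        f = frame.astype(np.float32)
        if self.state is None:
            self.state = f      # the first frame initializes the average
        else:
            self.state = (1.0 - self.alpha) * self.state + self.alpha * f
        return self.state
```

The drawback of any purely temporal filter is lag: anything that really moves in the field gets smeared, which is why an adaptive filter is needed in practice.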

Alain

The same exercise with Jupiter:

Alain

A last example of noise and turbulence management with my denoise filter. Back to the Moon:

Well, that’s enough for today.

Clear sky.

Alain

I made a new test using an old capture of Jupiter (an uncompressed RAW video, 7000 frames, captured with my telescope and my colour camera).

To perform the processing, I used two programs:

  • Autostakkert 3 (AS3) to sort the frames (by quality) and stack the best ones
  • Registax 6 (R6) to apply wavelets in order to bring out the details

I applied my AADF to the RAW video to reduce the noise and manage the atmospheric turbulence.

I then applied the classical planetary treatment (AS3 + R6) to both videos.

For the RAW video, i kept the 2500 best frames.
For the AADF video, i kept the 1800 best frames.

Here are the results:

This comparison shows that my filter does not destroy the information in the RAW video, and that it allows harder processing with R6 than the RAW video does (applying wavelets to the RAW video brings out noise very quickly when you push the sliders too much!).

So I think AADF can bring some (small) improvements to a very good capture (we can't do anything with a bad capture).
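
The frame-selection step that AS3 performs can be sketched like this. A common approach is to rank frames by the variance of a Laplacian; the function names here are illustrative, not AutoStakkert internals:

```python
import numpy as np

def sharpness(frame):
    """Variance of a 4-neighbour Laplacian: turbulence-blurred frames
    have weaker edges, so they score lower."""
    f = frame.astype(np.float32)
    lap = (-4.0 * f[1:-1, 1:-1] + f[:-2, 1:-1] + f[2:, 1:-1]
           + f[1:-1, :-2] + f[1:-1, 2:])
    return float(lap.var())

def stack_best(frames, keep):
    """Sort the frames from sharpest to blurriest, keep the best ones
    and average them to beat down the noise."""
    ranked = sorted(frames, key=sharpness, reverse=True)
    return np.mean(ranked[:keep], axis=0)
```

Keeping fewer frames (1800 instead of 2500) trades a little extra noise in the stack for sharper average detail, which is the balance described above.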

Alain

Hello,

I am still testing my AADF on astronomy videos.

We saw that AADF can manage noise and atmospheric turbulence.

This time, I tested AADF on a video that had already received the AADF treatment. This means two-pass AADF.

Let's see the result:

If the field is really static (no movement), the result is really good.

Alain

Hey Alain,

I’ve been following your work very closely since I discovered it a few days ago. I am super, super happy to have found it and must say it is very encouraging. I have been pondering this same idea of a real-time video approach, parallel to you, for many years - ever since I looked up with binoculars and said to myself, a) “Why don’t I do this more often?!” and b) “How do I follow that satellite using goggles, or at least using a mount and machine vision?”. I’ve since tried many, many lenses and USB cameras just to be able to get close to what my eye can see… I am a tinkerer like you, and your work gives me hope that this is attainable with hard work, persistence and learning (and of course an incremental waste of money). I’ve got a couple of questions:
a) Have you tried controlling a Celestron mount via serial USB from your code, instead of building your own pan/tilt? Why build your own pan/tilt? (and how, in case it makes sense)
b) Have you looked into GoCV? I think it’s way faster than Python.
c) I’m using a cheap IMX477; do you think there is a way to use a speed booster to optically make up for the smaller pixels?
d) Finally, I am curious what your thoughts are about image intensifiers for astronomy.

Thanks in advance!

-Victor

Hello Victor,

some years ago, I used my Celestron NexStar 6SE mount, remotely controlled with the Stellarium software. It works fine.

I built my own mount because my main objective was a standalone system that always stays outside. My mount is a beta test; if I can find some free time, the next version will be simpler, and the system will stay outside so it can be used very quickly.

I did not know about GoCV. It looks interesting, but I must check whether it can be used on Jetson SBCs and whether I can use the GPU with GoCV.

Concerning the camera sensor, I think there is no real way to boost it. The camera sensor is the most important part of the system (provided the optical lens is good, of course).

Image intensifiers can be a solution, but I haven't tried that kind of equipment. I think it is very important to test one before buying.

Alain
