Electronically Assisted Astronomy with a Jetson Nano

Hello,

My project is moving forward.

My mount can now automatically find a target (a star, a Messier or NGC object), it can compensate for Earth's rotation, and it is starting to be able to find an artificial satellite and track it.

Of course, I need more tests to be sure everything is OK, but so far a lot of things look good.

Here is a new test video with the Jetson Nano controlling the camera while the Raspberry Pi controls the mount.

The sky was very bad, but I was able to track a plane and some satellites.

Once the mount control is fully reliable and I get a good sky (I mean a winter sky), I should be able to get interesting videos.

Alain

Hello,

No tests with my system because of the bad sky, but I can use my work to enhance some of my astrophotography.

This time, we go to the Moon.

Of course, my equipment is a bit different:

Real-time processing (using Python and PyCuda with an NVIDIA GPU) can be used on old captures.

I made a test on the Moon (Gassendi crater region).

You can see how filters can improve video quality here:

To make a great picture of the Moon, we have to record a video of the Moon (raw video, no compression), sort the frames by quality, stack the best ones (to reduce noise), and apply wavelet filtering to recover detail and sharpness.
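Roughly, the sort-and-stack part of that pipeline can be sketched like this (a minimal NumPy illustration; the Laplacian-variance sharpness metric, the function names and the `keep` fraction are my own choices, and the wavelet step is left out):

```python
import numpy as np

def sharpness(frame):
    # variance of a discrete Laplacian: sharper frames score higher
    lap = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
           + np.roll(frame, 1, 1) + np.roll(frame, -1, 1) - 4.0 * frame)
    return lap.var()

def stack_best(frames, keep=0.25):
    # rank frames by quality, keep the best fraction, average to cut noise
    ranked = sorted(frames, key=sharpness, reverse=True)
    n = max(1, int(len(ranked) * keep))
    return np.mean(ranked[:n], axis=0)
```

Averaging N frames reduces random noise by roughly the square root of N, which is why keeping only the sharpest frames (rather than all of them) trades a little noise reduction for much better detail.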

Here is an example of what we can get without and with preliminary filtering (using PyCuda):

External Media

Preliminary filtering can bring a very useful improvement in sensor rendering, and CUDA programming is a perfect way to handle such heavy processing.

Sensors are getting better and better, but they are very expensive (some cameras can cost $5000-10000), so image processing with a powerful GPU like NVIDIA's is a must-have.

There are a lot of things we can do with image processing, and even amateurs can do science.

I have worked for many years on the colours of the Moon. Yes, the Moon has colours, but they are very faint. We have to bring them out with a lot of computer processing.

Here is what the mineral Moon looks like (one of my best Moon pictures):

External Media

Blue comes from titanium oxides, red comes from iron oxides, green can come from olivine, etc. These are the true colours, but strongly enhanced. It took me years to get such results (the original picture is about 100 Mpx).
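For illustration, the simplest form of that kind of saturation enhancement can be sketched like this (a NumPy sketch of one common approach; the luminance-preserving formula and the factor of 8 are my assumptions, not Alain's actual pipeline, which involves many more steps):

```python
import numpy as np

def boost_saturation(rgb, factor=8.0):
    # push each pixel away from its own grey level to amplify faint colour,
    # while keeping the luminance roughly unchanged; rgb is float in [0, 1]
    gray = rgb.mean(axis=2, keepdims=True)
    out = gray + factor * (rgb - gray)
    return np.clip(out, 0.0, 1.0)
```

A pixel that is almost grey with a tiny green excess becomes visibly green, while truly neutral pixels stay neutral — which is exactly why heavy stacking is needed first: any residual noise gets amplified by the same factor.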

With appropriate filtering, maybe I will be able to make videos of the colours of the Moon. I will try to work on that a bit later.

Well, I hope this was interesting to you.

That’s all for now.

Alain

Does the ZWO ASI178MC have a shutter?

Hello linuxdev,

The camera does not have a mechanical shutter, only an electronic one. It is a rolling shutter, not a global one.

Most of the time, and at least for Sony sensors, rolling-shutter sensors are cheaper and have a higher QE.

Alain

If you were to improvise and cover the camera such that you are guaranteed absolutely no light enters, and then make an exposure for about two minutes or longer just prior to starting your photography, then the sensor noise could be subtracted out using that reference frame on each actual photograph (you might need to interpolate or extrapolate between exposure time and reference time for the black image). I have an old Canon 20Da (astrophotography variant no longer in production) which does this automatically. Subtracting out sensor thermal noise is a big help.

You are right. I could use a “dark frame” to remove noise. My software can build a “master dark frame” and subtract it from my video.

But to do that, the gain and exposure time must stay constant during the capture. If I change the gain and/or exposure time, I have to make a new master dark frame.

We often use a master dark when we make long-exposure captures for photography. We also subtract a “bias frame” and use a “flat frame” to clean the picture. But that is a very different philosophy from EAA, which is basically a simple video of the sky.
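The standard calibration arithmetic described above can be sketched as follows (function and variable names are mine; this assumes the master dark was taken at the same gain and exposure time as the light frame):

```python
import numpy as np

def calibrate(light, master_dark, master_flat, master_bias):
    # the dark frame removes thermal signal plus bias offset; the flat
    # frame removes vignetting and per-pixel sensitivity differences
    flat = master_flat - master_bias
    flat = flat / flat.mean()                  # normalise so the overall scale is kept
    return (light - master_dark) / np.clip(flat, 1e-6, None)
```

The same subtraction works per-frame on a video, which is what makes a GPU attractive: it is one multiply-add per pixel, but at 30+ frames per second.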

As it is difficult to apply standard photography methods to EAA videos, it is very useful to apply real-time processing to clean up the sensor response, and the Jetson Nano is powerful enough to perform such filtering.

Alain

In theory, if you make a dark frame which stops at the moment one pixel reaches full white, then you could interpolate based on exposure time. You'd need a new dark frame each time you start, since noise is dependent upon temperature (and perhaps humidity). That's something machine learning would be good at: real-time subtraction of an interpolated master dark frame when exposure times vary. Though I suppose you don't need this at all; you have mentioned it is a short exposure time (and noise won't matter on a good sensor over a short exposure).
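The interpolation idea above could be sketched like this (a simple linear thermal model; the separation into bias plus thermal signal and the `t_ref` parameter are my assumptions):

```python
import numpy as np

def scale_dark(master_dark, master_bias, t_ref, t_exp):
    # thermal signal grows roughly linearly with exposure time,
    # while the bias offset does not, so scale only the thermal part
    thermal = master_dark - master_bias
    return master_bias + thermal * (t_exp / t_ref)
```

This only holds while the sensor temperature stays close to what it was when the master dark was taken, which is why a fresh reference frame at the start of each session matters.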

Yes, short exposures avoid some problems, but as I need to catch faint light, the gain is very often very high, so I have significant noise.

In some cases, when I can fix the gain and exposure time for sure, I will use a master dark to clean the video as much as possible.

Traditional photography with dark, bias and offset frames is quite simple, but cleaning a video using real-time filters is a bit complex. That's why I decided to write my own capture software, to be able to test my own solutions, and I must say it is really interesting.

The biggest problem is not writing the software. The main problem is finding hardware that can apply the filters quickly. The Jetson Nano is quite good for this exercise, and as PyCuda is not too complicated, we have a good tool for some useful tests.

Maybe I will buy a new laptop with an NVIDIA GPU. As my software also works on a Windows 10 computer, I will be able (if I find the money; I have sold a refractor telescope to fund a new laptop, but I still need some extra money) to write new processing routines and test them on the laptop. ZWO is also supposed to send me a new camera, maybe with the Sony IMX294 sensor. This sensor can give me cleaner pictures, which means real-time filtering will give better results (it is very hard to get a very good result when the source picture is really poor; I guess the magical filter does not exist!).

My project is only at its very beginning. Time will bring good surprises.

Alain

Hello,

Still working (playing) with the Jetson Nano.

Maybe you have seen my colourful Moon (a few posts above this one). It is the result of many processing steps.

I have managed to do some (simpler) processing with the Jetson Nano to show those colours in a video of the Moon.

Here is the result:

I think the video is worth seeing. The Jetson Nano shows you what your eyes can't see.

Alain

Hi easybob,

I am also trying to use the Jetson Nano dev kit for astronomy.
However, it does not seem to be as capable as my laptop. I have OpenCV running on my laptop with all the other Python 3.7 software, Matplotlib, etc. However, the documentation and examples on the Nano are so bad that I cannot run anything like this on the Nano!
I thought it would be as easy as loading a Python dev environment. Wrong.
There are just no Python tools or software to launch Python code using the theoretical power of the Nano.
I cannot even find a demo anywhere on the Nano.
I am trying to speed up the measurement of all the stars in a star field every 20-40 seconds, looking for dips in the light curve.
At the moment I have to capture about 200 images and then process them overnight on my laptop to do the measurements.
Maybe I am missing something, but the Nano seems to be less capable than my Raspberry Pi?
Why are they hiding the Python tools? I can see some Python examples but cannot run them. I tried clicking on them and I just get the editor. There seems to be no way to run them, as if Python does not exist?

Very disappointing.
Let me know if you get the Nano working.
John (www.astro.me.uk)


Not sure if this is related or not, but you can install multiple versions of Python. On Ubuntu 18.04 the default is Python 3, so if you need the older Python 2, you might have to install it. Does this work?

sudo apt-get install python2.7 python3.6

Hello John,

I understand your frustration. Sometimes it is quite hard to get a programming environment working on a computer.

For example, I have just bought a new laptop to continue my work with PyCuda (and maybe C programming), and I spent a whole day getting everything set up so that my Python software works on the new computer. It's a bit tedious but necessary.

The Jetson Nano works great with Python, you can trust me, but you will need to spend some time getting a working environment. The Nano is much, much better than the Raspberry Pi.

From what I know, there is no need to use Python 2. Python 3 works great with standard libraries like OpenCV, Pillow, Tkinter, NumPy, PyCuda, etc.

First, try to make a clean install of Python 3 and make sure your PATH is OK. If very simple example programs can run (just Python examples with standard Python functions), then try to add the Python libraries.

You can take a look here about the OpenCV library:

https://devtalk.nvidia.com/default/topic/1049972/jetson-nano/opencv-cuda-python-with-jetson-nano/1

When I started using the Jetson Nano, I spent a lot of time trying to make it work properly. I had to search for a lot of information, so I guess you will have to do the same. But the results are worth the hard work.

Alain


Easybob,

I can run my own Python code, but I cannot seem to run the demo camera-viewer.py, because Python 3 cannot find the import jetson.utils.
I guess it must not be installed??
So I tried sudo apt-get install jetson.utils, but that package does not exist??

It seems really difficult to get any demo programs running.

My OpenCV code works with Python 3.

Not sure what the problem is.


John

I can only run OpenCV Python programs, and I cannot find any online help or a way to install the necessary software to run the demo examples.
I flashed the card as instructed, and it seems the demos don't work.

You would think that NVIDIA would ensure that at least the demos work without too much effort.
I have not even started my project to measure all the stars in a camera field against 5 reference stars, looking for exoplanets.
The demos just give errors:

========== RESTART: /home/john/Desktop/MyPython/detectnet-camera.py ==========
jetson.inference.__init__.py
Traceback (most recent call last):
  File "/home/john/Desktop/MyPython/detectnet-camera.py", line 24, in <module>
    import jetson.inference
  File "/usr/lib/python3.6/dist-packages/jetson/inference/__init__.py", line 4, in <module>
    from jetson_inference_python import *
ImportError: libjetson-utils.so: cannot open shared object file: No such file or directory

Alain,
OK, I have managed to get the demos working, and I really like the work you have done with image processing.
I think I will start with just simple image processing, dark and flat, as this is the real bottleneck right now. I will have to send the image to the Nano, and the Nano can send back a dark/flat-processed image. Then I will send that to Astroart 7 (running on the main computer) to be solved and measured. All in less than 20 seconds.

Some time in the future, maybe there will be a Nano plate-solve routine and even a photometry tool.

It would be nice to have a tool to measure all the stars in a field against, say, 5 reference stars.
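The core of such a tool is just differential aperture photometry, which could be sketched like this (a NumPy illustration; the circular aperture, function names and parameters are my assumptions, with no sky-background subtraction or centroiding):

```python
import numpy as np

def aperture_flux(image, x, y, r=5):
    # sum the pixel values inside a circular aperture centred on (x, y)
    yy, xx = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    mask = (xx - x) ** 2 + (yy - y) ** 2 <= r ** 2
    return image[mask].sum()

def differential_magnitude(image, target, refs, r=5):
    # compare the target against the mean flux of the reference stars;
    # dips in this quantity reveal transits without absolute calibration
    f_t = aperture_flux(image, *target, r)
    f_r = np.mean([aperture_flux(image, x, y, r) for x, y in refs])
    return -2.5 * np.log10(f_t / f_r)
```

Because the reference stars share the same airmass and clouds as the target, the ratio cancels most atmospheric variation, which is what makes millimagnitude transit dips detectable from the ground.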

I then have to convert JD (time) to BJD to be able to detect light-curve dips. I do that with Python running on the main computer, using Astropy and Matplotlib.

At the moment I can't do this in real time. I have to collect, say, 100 images and then run a software tool I wrote to manage the measurement and processing between Python and Astroart 7.

Was PyCuda hard to install and work with?

regards,
John

Hello John,

Nice to hear that you got the Python demos working.

PyCuda was not really hard to install. You will need to search for some information on the forum.

Using PyCuda is not too hard, and the PyCuda documentation is really useful. I thought PyCuda would be much harder to manage.

OpenCV also works well, and if you compile version 4.1, you will get interesting results.

I won't be able to help you for at least one week, because I have a lot of work to do and I am not at home, so I don't think I will come here very often.

Good luck with your work. Everything will be OK and you will get cool results. It's just a question of time.

Alain

Thanks for the pointer to PyCuda documentation. I will take a look.
John

Hello,

I am still continuing my work with the Jetson Nano, but I must say I can't run tests in real conditions, because using the Nano with a remote system is not really easy, and for now this is a big issue for me. I'm still working on it.

So for now, I mostly use my Jetson Nano as a post-processing system to test real-time filtering.

Here is a post-processing run I made with the Nano to show the colours of the Moon's soils.

The base video was made with my Celestron C9.25 telescope and a colour camera (Sony IMX178 sensor).

When I have interesting results, I will be back.

Have a nice day.

Alain

I’m reminded of a Maxfield Parrish painting!

As far as remote systems testing issues, is this due to needing a physical display on the Nano as a means of using CUDA apps? If so, consider a virtual desktop.

I will!

Alain