Electronically Assisted Astronomy with a Jetson Nano


Clouds are back. As I have some free time, I tried something new with my colour camera: turning it into a monochrome camera.

My sensor (IMX485) is a Bayer sensor (to get colour information).

Colour is extracted from groups of 4 pixels. In my case, the order is:
R G1
G2 B

So, 1 red pixel, 2 green pixels (G1 & G2) and 1 blue pixel.

Of course, each photosite doesn’t get the entire signal (only red, or green, or blue). That’s why a colour sensor is only about 1/3 as sensitive as a mono sensor (which gets the entire signal).

Most of the time, you can get a grey image from a colour sensor. The camera applies a small formula like:
Luminance = Red*0.2126 + Green*0.7152 + Blue*0.0722 (or something like that). In that case, the resulting intensity is about the same as what you get with the colour image.

Let’s try something different:
Luminance = R + (G1+G2)/2 + B

Quite simple. So, let’s see the result.
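For reference, here is a minimal sketch of the two grey transformations on a raw RGGB frame. The function name is mine (not from JetsonSky), and the clipping to 8 bits is an assumption about the capture format:

```python
import numpy as np

def bayer_to_grey(raw, mode="sum"):
    """Convert a raw RGGB Bayer frame (R G1 / G2 B, as on the IMX485)
    to a half-resolution grey image.
    mode="classic": standard weighted-average luminance.
    mode="sum":     the boosted transformation R + (G1+G2)/2 + B."""
    raw = raw.astype(np.float32)
    # Split the four photosites of each 2x2 RGGB cell.
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    if mode == "classic":
        # Weighted average: total weight is 1, so no signal boost.
        grey = 0.2126 * r + 0.7152 * (g1 + g2) / 2.0 + 0.0722 * b
    else:
        # Summing the photosites collects (almost) the whole cell's signal.
        grey = r + (g1 + g2) / 2.0 + b
    # Clip back to 8-bit range (the sum can exceed 255).
    return np.clip(grey, 0, 255).astype(np.uint8)
```

Note that the output is half the sensor resolution in each axis, since one grey pixel is built from each 2x2 Bayer cell.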

First, the classical colour image (gain 100, exposure 11 ms):

Now, the classical grey transformation. The luminance is the same:

Now, the new transformation, which is supposed to raise the luminance:

As you can see, it works quite well.

What camera settings would be needed to get the same luminance with the classical grey transformation?

To get the same result, the exposure time must be 43 ms (compared to the 11 ms exposure time with the new transformation). That is about a factor of 4, which is consistent with summing the four photosites instead of averaging them.

It is the same with the colour image:

So, with the new grey transformation, I can get much more signal than I normally get with the colour sensor. I must say I only use the sensor’s own information, so I don’t introduce false information.

With this method, I will be able to make deep-sky survey videos with (much) lower exposure times (for example, 100 ms instead of 300-400 ms) and keep my colour sensor instead of buying a new monochrome camera.

Of course, I will lose the colour. As deep-sky survey means fairly long exposure times, I guess I could colourize my monochrome capture with the RAW colour image. I just have to verify that.

An important thing to say: a monochrome sensor is better for resolution and detail (1 pixel per sample for the mono sensor versus 4 pixels for the colour sensor). I can’t make miracles every time.

Well, that’s all. Clear sky !



Quite a sad day today. 3 years ago, I completed my 2-axis motorized mount (see the first post of this topic).

It was an interesting project: several months of design, 3 weeks to build the mount, and several months to make it work and improve the control software.

Now, I must say the mount is a bit old and I have new issues each time I use it (mechanical and electrical problems). I had to face the truth: my mount needs retirement.

So, today, I decided to disassemble it. This is the end.

Some pictures to say goodbye to my mount:


And the last picture: my mount totally disassembled.

It took me 3 hours to completely disassemble it. So sad.

If I had time (I mean, no need to work), I would have made a new mount (simpler, more accurate, etc.). But as I don’t get much free time, I will probably buy a small alt-az mount (like the Skywatcher AZ-GTI) so I can spend time improving my processing software (JetsonSky) and doing sky surveys.

So long my mount. You were really cool.




Some time ago, I tried automatic satellite detection with JetsonSky. It worked quite well, but it was very sensitive to noise and background illumination.

I did not give up.

Looking around, I found a promising OpenCV function: SimpleBlobDetector.

So, I tried to use it in JetsonSky and I must say it works really well.

An example:

This function will bring me new opportunities. I am quite happy with it.

For sure, those opportunities will need more CPU and GPU power. AGX Orin looks more and more sexy.

Have a nice day.


Related to the previous post, another test.

First part: the great Hercules cluster (M13)
Second part: classical wide-field survey




Some new improvements to satellite tracking. This time, I can get the trajectories. Still some improvements to come.

This new result is really promising.


This is rather cool. I am curious though, there were a few which did not track, but went smoothly across most of the field of view. Do you have a guess as to why those were skipped, e.g., not bright enough?

Yes, some satellites are not detected. The main problem is the noise. If I catch low-brightness satellites, I will also catch many false satellites (mainly noise). I will try to improve this.



Some new tests.

The method is interesting, but I still get issues.

The problem is always the same: noise. To catch satellites, I need to set the camera gain to a high value. So, I get noise.

I need some time to make improvements, but the test results are quite cool.

The processing is really heavy and hard to do on the GPU. A big CPU is required.


Great job with the satellite detections.

I’m interested in plate solving to find the center coordinates of the video. Did you implement astrometry on real-time video?

Hello Carlos,

Astrometry is something I am interested in. I plan to use it, but I must say it is quite complicated.


I have received my new mount: a Skywatcher AZ-GTI.

Quite similar to my old homemade mount, but I must admit it looks … better.

I will be able to use a lot of interesting software. I guess this new mount will be cool. The first (very quick) test is OK. I just have to wait for clear sky now.

