Electronically Assisted Astronomy with a Jetson Nano

I suppose that in the end it is always a question of whether two images have a difference, so the maximum useful frame rate also depends on pixel resolution: two perfect images spaced closely together in time will show no difference if their resolution is not high enough. There is a relationship between maximum useful frame rate and image resolution whereby higher resolution allows faster frame rates for a given speed of movement. The “elephant in the room” is that faster frame rates may need more light, or may show more noise (image sensor noise is “false motion”). I’m just thinking aloud because it is an interesting topic.

Frame rates that are too high simply mean the measurement must span multiple frames to be precise, even if no two adjacent frames show any motion.

Hi linuxdev,

You are 100% right! Congratulations. I will recommend you for a teaching job at Stanford University!

A high frame rate needs:

  • very high gain, and therefore noise management
  • considering many frames in the past to be able to detect significant object movement, or getting a smaller angular resolution (longer focal length or smaller sensor photosites)

It is interesting to find the hardware limitations.

Another subject I must work on: CMOS sensors show banding effects when the pixel values are read out, and this banding is not always the same (we can see it clearly when I set 16-bit capture mode, keep only the first 10 bits and convert them to an 8-bit full scale). This banding can be compared to noise, but it cannot be solved with classical dark subtraction because it is a kind of random phenomenon. I will have to manage it. Amp glow can also be significant for some sensors and needs to be corrected.
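
Amp glow and other fixed-pattern artefacts are the classic use case for dark-frame subtraction. A minimal sketch of the technique, assuming NumPy uint16 frames (the function names are mine, not JetsonSky's):

```python
import numpy as np

def master_dark(dark_frames):
    # Median-combine several dark frames to suppress random noise,
    # keeping only the fixed pattern (amp glow, hot pixels)
    return np.median(np.stack(dark_frames), axis=0)

def subtract_dark(light, dark):
    # Subtract the master dark in a signed type, then clip so that
    # pixels darker than the dark frame don't wrap around in uint16
    diff = light.astype(np.int32) - dark.astype(np.int32)
    return np.clip(diff, 0, 65535).astype(np.uint16)
```

Since the banding changes from frame to frame, it would escape this correction, which is exactly why it has to be treated as noise instead.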

With significant processing capabilities, it is possible to get something interesting even from not-so-good sensors.

Alain

Hello,

A daylight test to compare 8-bit VS 16-bit captures:

Alain

Hello,

I made another test to compare 8-bit VS 16-bit captures. This time it is outdoors at night, but still no deep sky because of the weather.

I hope 16-bit capture will bring a lot with DSO targets.

Alain

Hello,

I am trying to improve 16-bit video support in JetsonSky.

Astronomy software often uses the SER file format, which is a kind of uncompressed AVI with a specific header. The big advantage of SER files is that they easily support the 16-bit data we can create with classical astronomy software like FireCapture and SharpCap (they can capture a 16-bit video from a camera and save it as a 16-bit SER file).

So, I decided to add SER file support to JetsonSky (read only; I will always save 8-bit videos). It works. JetsonSky can now read 16-bit videos (monochrome or color) and choose which bits will be used to work with.
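
For reference, the fixed-size SER header can be parsed in a few lines of Python. This is only a minimal sketch based on the published SER v3 layout (the field names are mine); JetsonSky itself relies on the Serfile library:

```python
import struct

def read_ser_header(f):
    # A SER v3 header is 178 bytes: a 14-byte file ID, seven little-endian
    # int32 fields, three 40-byte text fields and two int64 timestamps
    raw = f.read(178)
    file_id = raw[0:14].decode('ascii', errors='replace')
    (lu_id, color_id, little_endian,
     width, height, depth, frame_count) = struct.unpack('<7i', raw[14:42])
    return {
        'file_id': file_id,         # normally "LUCAM-RECORDER"
        'color_id': color_id,       # 0 = MONO, 8 = BAYER_RGGB, ...
        'width': width,
        'height': height,
        'pixel_depth': depth,       # bits per plane, e.g. 16
        'frame_count': frame_count,
    }
```

The raw frames follow the header back to back, each frame being width × height pixels at the declared depth.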

Being able to choose which bits I keep is really interesting for retrieving very small signals. I can also make an HDR capture with only one image, because from this unique image I can create 3 or 4 images:

  • image 1: I keep bits 1 to 10
  • image 2: I keep bits 1 to 12
  • image 3: I keep bits 1 to 14
  • image 4: I keep bits 1 to 16

If I combine those images, I get an HDR image. I tested this in daylight and it works. I still need to test it with a night capture. I can also set BIN 2 on the 16-bit images and apply HDR to the BIN 2 result. This is really cool.
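
Software BIN 2 on a 16-bit image can be sketched like this, assuming NumPy and even image dimensions (averaging each 2×2 block is one choice; summing instead would boost the signal further):

```python
import numpy as np

def bin2x2(img16):
    # Software 2x2 binning: average each 2x2 block of pixels.
    # Height and width are assumed even (crop beforehand otherwise).
    h, w = img16.shape
    return (img16.reshape(h // 2, 2, w // 2, 2)
                 .mean(axis=(1, 3))
                 .astype(np.uint16))
```

Binning before the HDR step quarters the pixel count, so the Mertens or Mean merge also runs faster.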

I have to write some code in JetsonSky to add all the features and test it in daylight; if it's OK, I will have to wait for a clear sky to make deep sky tests.

If everything is OK, 16-bit capture and post-processing are very promising. This is why I made JetsonSky: it will allow me to see photons which started their journey billions of years ago.

When JetsonSky is fully operational, I will have to find some solutions to get a better sensor and optics. As we say in French, “une chose après l’autre” (one thing after another).

Clear sky.

Alain

A new name for astronomers: Photonic Archaeologist! Ok, I’m just being silly.

Hi linuxdev,

Yes indeed. Just like Indiana Jones. Raiders of the Lost Ark was a masterpiece. Belloq was also an archaeologist but joined the dark side of the force. Belloq was French in the movie. Why choose a Frenchman for the bad guy? For a strike, OK, but for the bad guy?

Hello,

I have uploaded a new version of JetsonSky to my GitHub: V51_01RC

I have also uploaded a brand new, up-to-date JETSONSKY_V51_01RC_INFOS.pdf file to explain the changes in JetsonSky.

This version of JetsonSky now supports reading 16-bit video in the SER format (JetsonSky now needs the Serfile library, which is in my GitHub; thanks to Jean-Baptiste Butet who wrote this library; I made some small modifications for JetsonSky).

16-bit mode will allow recovering small signals and making an HDR image from only one image, even in post-processing (8-bit HDR mode needs several captures using different exposure times).

If you want to make some tests with a 16-bit SER file, you can find one here:

This is a RAW capture, so you will have to debayer it in JetsonSky with the RGGB pattern.

I did not test this version of JetsonSky with the AGX Orin. I will test it ASAP.

For the Windows version: I had some serious issues with CUDA SDK V12.5 and above. I had to go back to V12.1 to solve them (a problem with the nvcc compiler; impossible to compile CuPy kernels with V12.5 or above).

Alain

Hello,

A new version is on my GitHub: V51_03RC

Some bugs removed.

I tested it with the AGX Orin and it seems to work fine.

Alain

Merry Christmas everyone !

Alain


Hello,

I have uploaded a new version of JetsonSky on GitHub: V51_05RC

I improved the FPS for high frame rate capture (planetary capture) and made some small changes.

I am a bit confused with the Jetson version: if I use my ASI178MC, everything seems OK. If I use my ASI485MC camera, I get a problem with CUDA (illegal memory access). I don’t understand. I don’t have this problem with my Windows laptop. Quite strange.

Alain

I can’t say, but I suspect it is because memory might have a slightly different layout with the iGPU. I would bet that’s something @dusty_nv could answer.

Hello linuxdev,

I found the cause of this issue.

For a bad reason, I used a thread for the camera image capture function. Threads are great, except with Python.

My cameras (ASI178MC & ASI485MC) do not respond the same way with JetsonSky, especially over the USB link (quite bad with the Jetson and quite good with my laptop). I had a simple timing problem with my thread.

I removed the thread and used a simple function, and now everything is OK. I guess JetsonSky will be more stable with a simple capture function rather than with a thread.
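
The change can be sketched as follows. DummyCamera is only a stand-in for the real camera API (the actual code talks to the ZWO SDK); the point is that capture and processing run sequentially in the main loop, with no thread and no shared-state timing to get wrong:

```python
# Sketch of replacing a capture thread with a direct call in the main loop.
# DummyCamera is hypothetical; a real camera would block in get_frame()
# until the sensor delivers a frame.

class DummyCamera:
    def __init__(self):
        self.frame_index = 0

    def get_frame(self):
        self.frame_index += 1
        return self.frame_index

def acquisition_loop(camera, n_frames, process):
    # Capture then process, one frame at a time: frame ordering and
    # timing are deterministic, unlike with a producer thread
    for _ in range(n_frames):
        frame = camera.get_frame()
        process(frame)

frames = []
acquisition_loop(DummyCamera(), 5, frames.append)
```

The cost is that processing time adds to the capture period, but for a Python program under the GIL the threaded version was not truly concurrent anyway.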

I have just made some tests with the AGX Orin and my 2 cameras work fine with JetsonSky! Hurray!!!

I removed V51_05RC from GitHub and added the brand new V52_01RC version, which will be much better.

Alain

Hello,

I am working on JetsonSky to remove some bugs and improve HDR capture using 16-bit camera capture or 16-bit SER files.
Two methods for HDR:

  • Mertens (slow)
  • Mean (fast)

The new version is V52_03RC.

And now, a color picture of the Moon:


Alain

To illustrate the HDR method with a simple 16-bit capture in JetsonSky.


Alain

A video (daylight, unfortunately) about HDR capture with 3 methods (16-bit video capture):

  • Mertens
  • Median
  • Mean

I don’t know if this 16-bit HDR-based algorithm is new or not, but with the Median or Mean method (or an equivalent), we can get quite high frame rate HDR, which could be useful for industrial or automotive vision (complex light scenes).

Alain

I am just curious, is this a Bayer image? How many bits per channel? Is “first bits kept” the same as “least significant bits”? I imagine keeping the most significant bits up until more bits cannot be handled, and then throwing away the less significant bits. I don’t really work with camera software, this is just a curiosity.

Incidentally, that moon picture is awesome!

Hi linuxdev,

The 16-bit capture is RAW data from a Bayer sensor. I have to debayer it. 16 bits per channel.
The first bits are the least significant bits. You are right.

As I try to make astronomy videos, I am interested in low signal, so the interesting bits are the least significant ones.
If my threshold is bit 11, every pixel with a value greater than 2^11 is set to 2^11. Then, I convert the resulting image to 8-bit format.

I will post the conversion code ASAP to better explain the treatment.

And thanks regarding the Moon; I like it too.

I wish you a Happy New Year!

Alain

Hello,

here is my code to generate an HDR image from a single 16-bit RAW image (1 Bayer channel):

# HDR Test program
# Alain PAILLOU - 2025
# Create HDR image from a single frame by generating 4 frames from it

import cv2
import cupy as cp


def HDR_compute(image_16b, method, threshold_16b, type_bayer):

    # Spread 3 extra thresholds above the base one, at most 5 bits higher
    if (16 - threshold_16b) <= 5:
        delta_th = (16 - threshold_16b) / 3.0
    else:
        delta_th = 5.0 / 3.0

    # thresholds go from the widest range (base + 3 steps) down to the base
    thresholds = [2 ** (threshold_16b + delta_th * i) - 1 for i in (3, 2, 1, 0)]

    # For each threshold, clip the 16-bit image and rescale it to 8 bits
    img_list = []
    for thres in thresholds:
        image_brute_cam16 = image_16b.copy()
        image_brute_cam16[image_brute_cam16 > thres] = thres
        image_brute_cam8 = cp.asarray(image_brute_cam16 / thres * 255.0, dtype=cp.uint8)
        img_list.append(image_brute_cam8.get())

    if method == "Mertens":
        merge_mertens = cv2.createMergeMertens()
        res_mertens = merge_mertens.process(img_list)
        res_mertens_cp = cp.asarray(res_mertens, dtype=cp.float32)
        image_brute_cp = cp.clip(res_mertens_cp * 255, 0, 255).astype(cp.uint8)
    if method == "Median":
        img_stack = cp.asarray(img_list)
        image_brute_cp = cp.asarray(cp.median(img_stack, axis=0), dtype=cp.uint8)
    if method == "Mean":
        img_stack = cp.asarray(img_list)
        image_brute_cp = cp.asarray(cp.mean(img_stack, axis=0), dtype=cp.uint8)

    HDR_image = image_brute_cp.get()
    HDR_image = cv2.cvtColor(HDR_image, type_bayer)

    return HDR_image

# Choose method. 3 possible choices:
# - Mertens (quite slow)
# - Median (fast)
# - Mean (fast)

mode_HDR = "Median"

# image_camera_base = 16-bit RAW image (1 Bayer channel) - CuPy array

TH_16B = 12.0
# TH_16B is a value generally between 10 bits and 14 bits

type_debayer = cv2.COLOR_BayerRG2RGB  # depends on your Bayer pattern

# HDR image, 3 channels (R G B), NumPy-array-like
HDR_image_result = HDR_compute(image_camera_base, mode_HDR, TH_16B, type_debayer)

Alain

Happy new year to you too!

Also, a related story which might be of interest. Many years ago, all X-ray machines used regular photographic-style film. When digital “synthetic” X-ray arrived, which did not need the film development process, the CRT-type computer monitors (in combination with the computers) could not match the quality of the old photographic film’s color gamut and resolution. So they played a “trick”.

The viewing device had something that looked like an audio “volume” control. Turning it changed which bits were displayed. Moving it back and forth, you could see details of little things you wouldn’t have noticed on a lower-resolution device. They were essentially giving manual control over what your software does to produce HDR.