Electronically Assisted Astronomy with a Jetson Nano

I made some tests last night before the clouds arrived.

Of course, I found some bugs. Nothing really important, and I was able to solve the problems during the acquisition session.

An example of the tests:

It’s a post-processing run I made on the Windows platform. The RAW capture was made with the Jetson AGX Orin.

Post-processing on the Windows platform revealed a problem: CuPy does not like GStreamer in a Windows environment. The CuPy context is destroyed when GStreamer is used. For now, I have not found a solution for this issue.

I have released a new version of JetsonSky on GitHub (V40_07RC) that disables the use of GStreamer on the Windows platform.

Alain


Hello,

I have released a new version of JetsonSky: V40_09RC.

Some improvements, some bug fixes, and it seems mp4 encoding now works with GPU acceleration (GStreamer) on the Windows platform.

I had to preserve the CuPy context by putting the GPU work on an explicit CuPy CUDA stream (cupy.cuda.Stream()).
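As an illustration only (this is not the actual JetsonSky code), keeping the GPU work on an explicit CuPy stream looks roughly like this; the frame processing shown is just a placeholder:

```python
import cupy as cp

# Minimal sketch: queue all GPU work on a dedicated CuPy stream so that other
# components touching the CUDA context (e.g. GStreamer via OpenCV) do not
# interfere with the default stream state.
stream = cp.cuda.Stream(non_blocking=True)

def process_frame(frame_np):
    """Run a toy GPU operation inside the dedicated stream."""
    with stream:                                        # kernels below go to `stream`
        gpu = cp.asarray(frame_np)                      # host -> device copy
        gpu = cp.clip(gpu * 1.2, 0, 255).astype(cp.uint8)  # toy contrast boost
        result = cp.asnumpy(gpu)                        # device -> host copy
    stream.synchronize()                                # make sure the work is done
    return result
```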

It’s still available at the same place:

Alain


Hi,

I made some tests yesterday with JetsonSky V40_09RC. It seems to work quite well.

Some test videos:

JetsonSky is starting to work fine. I solved some small bugs. The latest version (V40_10RC) is here:

Alain

JetsonSky V40_10RC has been replaced by V40_11RC.

It seems the mp4 high-quality encoding profile sometimes fails on the AGX Orin, so I set a lower quality.
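For reference, and purely as an assumption about how such a recording pipeline might look (JetsonSky’s real pipeline may differ), an OpenCV VideoWriter fed by a Jetson GStreamer pipeline exposes the encoder settings that can be lowered, for example the bitrate and profile properties of nvv4l2h264enc:

```python
import cv2

# Hedged sketch of a Jetson-side H.264/mp4 recording pipeline through OpenCV.
# Lowering `bitrate` (and choosing a less demanding `profile`) is the kind of
# change meant by "set a lower quality". On Windows the encoder element differs.
width, height, fps = 1920, 1080, 30
pipeline = (
    "appsrc ! video/x-raw,format=BGR ! videoconvert ! video/x-raw,format=BGRx ! "
    "nvvidconv ! video/x-raw(memory:NVMM),format=NV12 ! "
    "nvv4l2h264enc bitrate=4000000 profile=2 ! h264parse ! qtmux ! "
    "filesink location=capture.mp4"
)
writer = cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, float(fps),
                         (width, height), True)
```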

It is not really easy to get JetsonSky working properly on both Jetson SBCs and a Windows computer.

Alain

I have a feeling that a lot of the traffic in the second video is from Starlink.

What I’m curious about though is the jitter in the heading indicator of the first video. Do you think that this is from the “digital” format? Restated, I’m wondering if the fact that the pixels are not truly “linear” is why there is some jitter, and that perhaps higher resolution would result in less jitter?

Hello linuxdev,

There must be many Starlink satellites, but there are also many other things up there. Many times it is not satellites but launcher stages which are still up there.

I used to get NORAD TLE files to identify those objects; the launcher stages are also referenced in the classical NORAD files.
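JetsonSky does not necessarily do it this way, but as a hedged sketch, cross-checking an observation time against NORAD TLE data can be done with the skyfield library (the TLE file name, observing site and date below are placeholders):

```python
# Minimal sketch: list catalogued objects above the horizon at a given time.
from skyfield.api import load, wgs84

ts = load.timescale()
satellites = load.tle_file('stations.tle')              # hypothetical local TLE file
observer = wgs84.latlon(48.85, 2.35, elevation_m=100)   # example site coordinates

t = ts.utc(2023, 5, 20, 22, 30, 0)                      # example observation time
for sat in satellites:
    alt, az, distance = (sat - observer).at(t).altaz()
    if alt.degrees > 20:                                # candidates above 20 degrees
        print(sat.name, f"alt={alt.degrees:.1f}  az={az.degrees:.1f}")
```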

You are right, the jitter comes from the fairly low resolution. With a higher resolution, the jitter would be smaller. I think I will remove this information because it is not very useful.

My camera has 4K resolution, but I need to set it to BIN 2 mode to get enough signal with fairly short exposure times, so it is unusable at full 4K resolution. When I test on very small, bright deep-sky objects like M13, I will be able to use BIN 1, but in that case a very short exposure time is not absolutely needed. In that case, though, I do not track satellites (the field of view is too small).

Alain

I wouldn’t remove that information, it is interesting! I suppose you could make it a checkbox option. You have no reason to do this with your program, but it would be interesting if the AI could “interpolate” and correct jitter by knowing the resolution and seeing changes over successive points. Actually, that reminds me of something else AI could do, but which I have not heard of: If one takes a moving object and considers it to be sufficiently far away, then perhaps the jitter could be compared to resolution and actually used to estimate size. The tinier the object is in comparison to resolution, the more the jitter would increase.

One day perhaps an optical version of a phased array antenna will exist. Then you’ll be able to have both high resolution and sensitivity to light (along with many other new abilities).

Hi linuxdev,

I still want to work on satellite trajectory detection, but I guess I will do this a bit later. Using jitter to estimate satellite size won't be possible: the resolution of the camera is quite low, and considering the lens focal length and the satellite size, I think I am far from the angular resolution needed to estimate the size of the satellite.

If I use my C9.25 at F/D 10, maybe something could be done, but targeting and following a satellite with a 2350 mm focal length is a bit hard. It can be done, but I would need a much better mount than the one I have.
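As a rough back-of-the-envelope check with assumed numbers (2.9 µm IMX485 pixels in BIN 2, a hypothetical 135 mm lens, a ~10 m satellite at ~550 km), the satellite stays well below one pixel with a short lens, while the C9.25 would spread it over several pixels:

```python
import math

# Rough sanity check with assumed numbers (not JetsonSky code): compare the
# pixel scale of the setup with the angular size of a hypothetical satellite.
pixel_um   = 2.9 * 2          # IMX485 pixel (~2.9 um), doubled by BIN 2 (assumption)
sat_size_m = 10.0             # hypothetical satellite size
sat_alt_m  = 550_000.0        # hypothetical altitude (Starlink-like orbit)

sat_arcsec = math.degrees(math.atan(sat_size_m / sat_alt_m)) * 3600   # ~3.8 arcsec

for name, focal_mm in [("short lens (assumed 135 mm)", 135.0),
                       ("Celestron C9.25 (2350 mm)", 2350.0)]:
    pixel_scale = 206.265 * pixel_um / focal_mm    # arcsec per pixel
    print(f"{name}: {pixel_scale:.2f} arcsec/px, "
          f"satellite spans ~{sat_arcsec / pixel_scale:.1f} px")
```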

I keep this in mind.

I worked on JetsonSky to get a single program for camera, videos and images.

It is the V40_12RC version. I made a few tests and it seems to work. I will improve it (I do not handle file loading as I should, and there is no error management; I know, it's bad).

Still at the same place:

Alain

Hi,

V40_12RC had a critical bug. I have released V40_14RC and removed V40_12RC.

Alain

Hello,

My Celestron C9.25 was out for the first time in 2.5 years.

The weather conditions were very bad (turbulence, wind, poor transparency). As it was still daylight, it was impossible to check the mirror collimation.




This time, I used my laptop. There are too many things to manage, and it is easier to use a laptop outside.

Alain

I also made a test on the M13 cluster. The result:


Alain

There was a very small bug in V40_14RC. It is corrected now.

The software needs to be downloaded again from GitHub; the version number is the same.

Alain

Hello,

I have uploaded two new versions of JetsonSky on GitHub, and V40_14RC has been removed.

  • The new V40_15RC is in fact V40_14RC with some bugs removed.
  • The new V40_16RC adds a trigger for satellite detection (a checkbox under the satellite checkbox on the left of the screen). This means that when you click Start Record, recording is only active while a satellite is detected; otherwise, recording waits for a satellite. It is a beta function and needs some tests in real conditions. It also works with video processing only (no camera). A minimal sketch of the gating idea is shown below.
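This is only a sketch of the gating idea, not the actual JetsonSky implementation; detect_satellite() is a hypothetical placeholder for the real detector:

```python
def record_when_detected(capture, writer, detect_satellite, hold_frames=30):
    """Write frames to `writer` only while a satellite is (recently) detected.

    `capture` / `writer` are assumed to be cv2.VideoCapture / cv2.VideoWriter;
    `detect_satellite(frame)` is a hypothetical callback returning True on a hit.
    """
    countdown = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if detect_satellite(frame):
            countdown = hold_frames        # keep recording briefly after the last hit
        if countdown > 0:
            writer.write(frame)            # frames without a recent detection are skipped
            countdown -= 1
```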

Has anyone tried to use JetsonSky with a Jetson Orin Nano or Jetson Orin NX? I would like to know if the software works fine with them.

Alain

Hello,

New test yesterday. I was supposed to test JetsonSky V40_16RC, and the target was the Moon. I also had to test the IMX485MC colour sensor because I had not really used it with my telescope.

The setup:
Celestron C9.25 F/D 10 (2350mm focal length)
Equatorial GOTO mount
Sony IMX485MC colour sensor
JetsonSky with my laptop

No collimation check (primary and secondary mirror alignment) because it was daylight (no stars in the sky).

The weather was horrible. High turbulence and strong wind.

I used JetsonSky with light live treatments (sharpness, contrast, etc.) to get the RAW video. I used AutoStakkert 3 and RegiStax 6 to process the RAW videos, and then JetsonSky again for a small post-processing pass on the image I got from AS3 and R6.
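As a hedged illustration of what such light live treatments can look like (not JetsonSky's actual filters), here is an unsharp mask plus a simple contrast stretch in OpenCV:

```python
import cv2

# Sketch of light "live treatments": unsharp mask for sharpness, then a
# gain/offset adjustment for contrast and brightness. Parameter values are
# arbitrary examples.
def light_treatment(frame, amount=0.6, gain=1.15, offset=-10):
    blur = cv2.GaussianBlur(frame, (0, 0), sigmaX=3)                 # low-pass copy
    sharp = cv2.addWeighted(frame, 1.0 + amount, blur, -amount, 0)   # unsharp mask
    return cv2.convertScaleAbs(sharp, alpha=gain, beta=offset)       # contrast/brightness
```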

What it looks like with JetsonSky on my screen:

First test with the Ptolemaeus crater region:


A comparison with an LRO probe image:


Despite the weather, the result is not so bad, especially considering my camera is a colour camera, which means the details are not as good. A monochrome camera with a red filter would have been more suitable.

New target: Eratosthenes and Mare Vaporum

New target: the Plato crater region (the shadows in Plato crater are interesting)


Considering the poor weather conditions and the camera (colour sensor), the result is not so bad.

I made those tests to see if i can use JetsonSky for planetary imaging. It seems the answer is yes.

I also made a test to see if the IMX485MC sensor was able to retrieve the colours of the Moon's soil. The video:

In the end, it was an interesting test session and everything seems to be OK.

Alain