Electronically Assisted Astronomy with a Jetson Nano

Hello linuxdev,

for now, with JetsonSky as it is, the “smearing” or “ghosting” comes from the noise removal functions. The adaptive absorber noise removal filter and the 3-frame noise removal filter work really well with static videos. If stars move, I get a ghosting effect. Exposure time is not responsible for the smearing.
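
To explain a bit: those filters mix the current frame with the previous ones, so anything that moves leaves a trail. A very simplified sketch of the idea of a 3-frame filter (not the real JetsonSky code, just to illustrate):

```python
import numpy as np

def three_frames_filter(last_frames):
    """Average the last 3 frames to reduce random noise.
    Static stars stay sharp, but a star that moves appears at
    3 slightly different positions, which gives the ghosting."""
    f1, f2, f3 = (f.astype(np.float32) for f in last_frames[-3:])
    return ((f1 + f2 + f3) / 3.0).astype(last_frames[-1].dtype)
```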

If I could use AI to manage noise, it would be great because I would not get this ghosting effect any more. The problem is: “I don’t understand a single word about AI.” Sure, I can run a tutorial, but that does not match my needs.

One day, if the light comes to my mind, maybe I will be able to try to use AI to clean my RAW videos, but for now, my brain is not AI compatible!

Alain

😀 I understand about not understanding!

The topic is still interesting. Consider that if you were to preprogram a small amount of movement on the camera mount, with a long enough exposure, then “real” objects would produce that ghosting/smearing. Any pattern that remains after that is noise from the sensor. Think of it as a poor man’s version of covering the lens with a cap and taking a dark exposure for a couple of minutes to see what the sensor produces. The difference is that this takes about half a second or a second of movement, and occurs at every movement, whereas the other has to run at the right temperature for significant time before starting (but is likely more accurate). The pixels which do not change at a given CCD cell are suspect: if movement in one direction changes the intensity, then it is due to actual light; pixels which do not take on their neighbor’s value during a movement must belong to the sensor.

It is just “food for thought”. There are likely ways to detect “unchanged” versus “changed” based on known camera movement, but it is more of a curiosity.
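
As a very rough sketch of the idea (untested, just to illustrate): take one frame before and one frame after a known small movement; real light follows the movement, so pixels that keep the same value are candidates for sensor noise.

```python
import numpy as np

def suspect_pixel_mask(frame_before, frame_after, tol=2):
    """Compare two frames taken before and after a known small
    camera movement. Real light follows the movement, so pixels
    whose value stays the same despite the movement are suspect
    (hot pixels / fixed-pattern noise from the sensor)."""
    diff = np.abs(frame_before.astype(np.int16) - frame_after.astype(np.int16))
    return diff <= tol  # True where the pixel ignored the movement
```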

Oh yes, I understand.

This could avoid classical sensor calibration. The AADF filter does something a bit similar, but not exactly what you describe.

I will think about that!

Alain

Hello,

I decided to try object detection using AI. After reading things on the internet, I decided to try YOLOv8 with PyTorch.
My first exercise is about automatic detection of Moon craters.
First, I had to find some good Moon photos. Luckily, I have some.
Then, I had to make a dataset of annotated photos in YOLOv8 format. After some web research, I used the Roboflow online tool to do this.
I also had to get some small Python programs to train the custom model and test it with images and videos.
I also had to experiment with the dataset and training parameters to get something usable.
Finally, I had to put all of this into JetsonSky.
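
To give an idea of the Python side (a rough sketch with hypothetical file names, not the exact JetsonSky code): with the Ultralytics package, running the trained model on a photo only takes a few lines.

```python
import cv2
from ultralytics import YOLO

# Load the custom crater model produced by the training step
# ("craters.pt" is just a hypothetical file name for this example).
model = YOLO("craters.pt")

# Run detection on a Moon photo; results contain boxes, classes and scores.
results = model.predict("moon_photo.jpg", conf=0.25)

# Draw the detections on the image and save it for a visual check.
annotated = results[0].plot()
cv2.imwrite("moon_detected.jpg", annotated)
```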

Here is my very first try:

For sure, it is not perfect, but it works (I mean the main idea works).

Now I need to get more photos in my dataset, and maybe I will need to create crater categories (small, large, complex) in order to get better detection.

Anyway, I am quite happy with the result. It only took me a few hours to understand the general idea of object detection and get my first results.

I will test this with the Jetson AGX Orin ASAP.

Alain

Hello,

I tried to improve my craters dataset to get better results.

I still use the pre-trained YOLOv8L model to train my crater model.

Training a model needs a big GPU. With my laptop RTX 4080, it takes 2 hours to train the model (only 300 images, batch size = 12 because of limited GPU RAM, epochs = 150).
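
For reference, the training itself is only a few lines with Ultralytics (a sketch using the parameters mentioned above; “craters.yaml” is a hypothetical dataset file exported from Roboflow):

```python
from ultralytics import YOLO

# Start from the pre-trained YOLOv8-Large weights.
model = YOLO("yolov8l.pt")

# Fine-tune on the custom crater dataset (YAML exported from Roboflow).
# batch=12 keeps GPU memory usage within the laptop RTX 4080 limit.
model.train(data="craters.yaml", epochs=150, batch=12)
```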

I now understand why NVIDIA created the H100 and A100 systems with huge amounts of memory.

Here is a more complete video about my new model:

Considering I trained the model with only 300 images, the results are not bad.

To get something more serious, I would need 10 times more images, but then training the model would take quite a long time.

I will test this new version of JetsonSky (with the crater model) as soon as I have solved a problem with keyboard management under Linux (it needs to be root, which is quite stupid).

JetsonSky now gets a little AI!

Alain

Things are moving along with the crater model prediction.

I have tested object tracking with YOLOv8 with the persist option and it works great, even if the computation time is longer.
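
For those who want to try it, the tracking call is basically the prediction call with track() and persist=True, so the IDs are kept from frame to frame (a rough sketch, not the exact JetsonSky code):

```python
import cv2
from ultralytics import YOLO

model = YOLO("craters.pt")  # hypothetical custom crater model
cap = cv2.VideoCapture("moon_video.mp4")

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # persist=True keeps the tracker state between frames,
    # so each crater keeps the same ID along the video.
    results = model.track(frame, persist=True)
    cv2.imshow("tracking", results[0].plot())
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```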

I will try to make a model for satellite detection (and shooting stars). I hope it will work.

Alain

Hello,

I made a YOLOv8 model to detect satellites, planes (not really good) and shooting stars.

I tested it (prediction & tracking) with JetsonSky and it works not so badly.

It also works (crater & satellite detection & tracking) on the AGX Orin. I just tested the models on the AGX Orin this afternoon.

I am very pleased with that. To celebrate, I will work on an up-to-date version of JetsonSky for the Jetson Orin and I will provide it to anyone who wants it for personal use (no commercial use). People who want JetsonSky will have to ask me for it.

But I need some time to get a clean and up-to-date JetsonSky for the Orin.

Alain

Hi,

here is a video about satellite detection.

I show the RAW video, the preprocessing and two detection methods:

  • my old OpenCV simple blob detection method
  • the YOLOv8n-based model to detect satellites

The result is interesting.
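
For those interested, the “old” method is based on OpenCV SimpleBlobDetector, something like this (a simplified sketch, not the exact JetsonSky code):

```python
import cv2

# Tune the detector for small bright spots on a dark background.
params = cv2.SimpleBlobDetector_Params()
params.filterByColor = True
params.blobColor = 255        # look for bright blobs
params.filterByArea = True
params.minArea = 2
params.maxArea = 50

detector = cv2.SimpleBlobDetector_create(params)

gray = cv2.imread("satellite_frame.png", cv2.IMREAD_GRAYSCALE)
keypoints = detector.detect(gray)

# Draw the detected blobs for a quick visual check.
out = cv2.drawKeypoints(gray, keypoints, None, (0, 0, 255),
                        cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS)
cv2.imwrite("blobs.png", out)
```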

Alain

Hi,

I need help.

Some functions in JetsonSky need to get keyboard information (a key pressed to trigger things).

On Windows, I use the keyboard Python library.

Under Linux, the keyboard library needs to run as root.

If I launch JetsonSky with this command: ‘sudo python JetsonSky.py’, I have:

  • no problem with keyboard
  • problems with the other libraries: Python does not find those libraries

When I installed keyboard, I used ‘sudo pip install keyboard’. I did not use ‘sudo pip install …’ for the other libraries, so they are probably not visible when Python runs as root.

Keyboard is starting to bore me.

Does anyone know another way to easily get a kind of key-pressed function (the function must not wait for a key to be pressed, and must return the key value) with Linux and Python?

Alain

I don’t work with Python, so a lot I can’t help with. It does sound like a permission issue though. If this were a traditional user space C/C++ program I’d find out what libraries were linked and then examine their permissions, but I don’t know how to do this under a scripted (interpreted) program. However, if you have some means of finding the libraries, then you could check their permissions to see if your regular user has access.

Hi linuxdev,

the problem comes from the permissions required by keyboard, and it seems there is no easy way to solve this problem.

Starting Python with sudo solves the problem with keyboard but brings problems with the other libraries.

I will search for another way to get key-pressed information with Python under Linux. It is crazy to have such a problem with such a common function.

Alain

Hello,

I made some small deep-sky capture tests with JetsonSky and an old Canon FD 135mm F/D 2.5 lens. The targets are M13, the Hercules Cluster, and M31, the Andromeda Galaxy.

The sky was bad because of the Moon, but I have to deal with the clouds here, so I work when I can.

With this bad sky, those targets are quite hard to capture, especially when you do video capture with live processing.

The result is not that bad for a video and we can recognize the targets.

Alain

Hello,

I have released on GitHub some small programs to allow you to test some of the basic functions I use in JetsonSky.

The link:

Alain

Hi,

well, nobody has tested my filters. Too bad.

Anyway, I am trying to get a working version of JetsonSky for the Jetson SBCs. It is quite hard to get something working with my latest version of JetsonSky. Jetson and PC architectures are quite different, so I get some issues with JetsonSky on the AGX Orin. Maybe I will have to provide an older version of JetsonSky. I need to make more tests.

The key-pressed function was solved with pynput, but I get other issues I don't really understand. Wait & see.
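
For the record, the pynput idea looks like this (a sketch of the principle, not the exact JetsonSky code): a background listener stores the last key pressed, and the main loop just reads that variable without blocking.

```python
from pynput import keyboard

last_key = None  # shared with the main processing loop

def on_press(key):
    # Store the last key pressed; the main loop is never blocked.
    global last_key
    try:
        last_key = key.char      # normal character keys
    except AttributeError:
        last_key = str(key)      # special keys (arrows, etc.)

# The listener runs in its own thread (no root needed under X11).
listener = keyboard.Listener(on_press=on_press)
listener.start()

# In the main loop: test last_key, act on it, then reset it to None.
```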

JetsonSky can also manage the background gradient in a video (or a photo). An example here:
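
The basic idea is to estimate the slowly varying background and subtract it from the frame. A very rough sketch of this kind of gradient removal (not the actual JetsonSky implementation):

```python
import cv2
import numpy as np

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)

# Estimate the background with a very large blur:
# the stars disappear and only the slow sky gradient remains.
background = cv2.GaussianBlur(frame, (0, 0), sigmaX=101)

# Subtract the gradient and restore the mean sky level.
flat = frame - background + background.mean()
flat = np.clip(flat, 0, 255).astype(np.uint8)
cv2.imwrite("frame_flat.png", flat)
```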

Alain

Hello,

JetsonSky V50_15RC is back on GitHub:

This version is supposed to run on Linux and Windows systems. I must say I made very few tests with the Linux version & the AGX Orin. It seems to work, but it is really slow compared to the Windows version. Keyboard management seems to work, but I must say I am not pleased with pynput. I don't have a better solution for now.

Alain

Hello,

I have uploaded a new version of JetsonSky on GitHub: V50_17RC.

Some bugs have been removed, and I have added the ability to manage AZERTY or QWERTY keyboards for the functions we access with the keyboard.

I recommend reading the up-to-date documentation I have also uploaded on GitHub:

Alain

Some tests yesterday. It was the only day without clouds for weeks. When I say cloud free, I should say almost cloud free.

Something different: this topic is quite long now, and it seems my new posts do not interest many people. Considering this, I ask myself whether I should end this long topic now or whether some of you are still interested in new posts.

JetsonSky has everything I need, and I think it won't get exciting new features now. It could be the right time to end this topic. Maybe.

Alain

I still look at all of the new posts. I personally don’t have any ability to work with any kind of telescope or astrophotography, and so about all I can do is watch.

Hello linuxdev,

I was thinking my posts were starting to be a bit boring. No huge things to show.

I have reached the limits of my hardware, and software can't make miracles under extreme conditions. I also need a beautiful sky, but it seems a beautiful sky is very hard to find.

I hope I will get some great skies this winter to be able to make great videos. I am also thinking of buying a small telescope (Skywatcher 72ED), but I am not convinced yet (the F/D is a bit high, even with a 0.5x focal reducer). Maybe a new camera with better specs. I have to think about that.

If you are the last one following my posts, I will still post for you.

I read all of your posts, and I watch all of the videos on YouTube. I’m not really in a situation where there is any ability to actually use any of your software, but I consider it to be a bit like art.