And using JetsonSky to improve a Moon capture:
Hi,
another test to improve Moon video capture.
This time, I can choose the part of the video I want to stabilize (hook placing). Then I manage turbulence before applying sharpening.
I think it's a huge step forward for getting more precise and detailed videos of the Moon (or planets). Of course, the RAW capture must be as good as possible, and low turbulence is a must-have.
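For the stabilization step, here is a minimal sketch of how a chosen region can be registered from frame to frame, using phase correlation in NumPy. The function name and the integer-pixel simplification are mine, not JetsonSky's actual code:

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the (dy, dx) drift of `img` relative to `ref` by phase
    correlation on the selected region. Integer-pixel accuracy only."""
    f = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # fold offsets into the signed range [-N/2, N/2)
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

# A bright feature drifts by (2, -3); the estimate recovers it, so the frame
# can be shifted back by (-2, 3) before sharpening.
ref = np.zeros((16, 16))
ref[5, 7] = 1.0
img = np.roll(ref, (2, -3), axis=(0, 1))
print(estimate_shift(ref, img))  # (2, -3)
```

Once the offset of each frame's region is known, shifting the frames back to a common reference gives the stabilized sequence that sharpening can then work on.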
Alain
Alain,
Is there a reason you are not capturing FITS files vs RAW format image data?
Stefan
Hi Stefan,
no particular reason, except that TIFF is good enough and can be read very easily. I don't really need the FITS format, as I don't need a specific header.
Alain
Hello,
I am still working on turbulence management for planetary and Moon video captures. I am getting significant improvements:
I am quite happy with this result.
Alain
Hello,
yesterday, the sky was clear, the turbulence was low and the Moon was quite high in the sky.
Some classical results with good telescope collimation, as this topic is about astronomy :
And a last picture. Moon mosaic with colors of the soils (JetsonSky treatment) :
Alain
The full Moon from my last capture session. The result is not really good because of the strong turbulence.
Alain
Hi,
I tried to install Astroberry on my Nvidia NX but failed.
I followed the advanced installation instructions here: Astroberry Wiki
I wonder which website link you used? Thx
nvidia@nvidia-desktop:~/Astroberry$ sudo apt install astroberry-server-full
Reading package lists... Done
Building dependency tree
Reading state information... Done
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:
The following packages have unmet dependencies:
astroberry-server-full : Depends: astroberry-server-wui but it is not going to be installed
                         Depends: indi-full but it is not installable
                         Depends: kstars-bleeding but it is not installable
                         Depends: gsc but it is not installable
                         Depends: phd2 but it is not installable
                         Depends: phdlogview but it is not installable
                         Depends: oacapture but it is not installable
                         Depends: firecapture but it is not installable
                         Depends: indi-astroberry-diy but it is not installable
                         Depends: indi-astroberry-piface but it is not installable
                         Depends: indi-weather-mqtt but it is not installable
                         Depends: indiwebmanagerapp but it is not installable
E: Unable to correct problems, you have held broken packages.
nvidia@nvidia-desktop:~/Astroberry$ sudo apt-get install indi-full kstars-bleeding
Reading package lists... Done
Building dependency tree
Reading state information... Done
Package indi-full is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source
Package kstars-bleeding is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source
E: Package 'indi-full' has no installation candidate
E: Package 'kstars-bleeding' has no installation candidate
Hello AK51,
I did not install KStars on the Xavier NX, so I don't know whether it can work or not.
You can take a look here :
I'm not sure Astroberry can be installed as-is on the Xavier NX.
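For what it's worth, my guess (an assumption, not a confirmed diagnosis) is that all those "not installable" errors come from an architecture mismatch: Astroberry's repository ships binaries built for Raspberry Pi OS (armhf), while the Xavier NX runs arm64 Ubuntu, where apt cannot find matching packages. A quick check:

```python
import platform

# Diagnostic sketch only (assumption: Astroberry's repo publishes armhf
# binaries, which arm64 Ubuntu on a Jetson cannot install).
arch = platform.machine()
print("machine architecture:", arch)
if arch == "aarch64":
    print("Astroberry's armhf packages would show as 'not installable' here;")
    print("building INDI/KStars from source is the usual fallback on Jetson.")
```

On a Jetson board this prints `aarch64`, while 32-bit Raspberry Pi OS reports `armv7l`, which would explain why the same apt commands work on one and fail on the other.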
Alain
Hello,
still working on JetsonSky development. Many improvements:
- I can now get, with a color sensor, the signal level I would get with a monochrome sensor (monochrome sensors are more sensitive than color sensors, which have a Bayer matrix). This allows me to use a lower gain or a shorter exposure time
- I can get the mount's coordinates (quite useful to know what I am looking at)
- I can do live RGB alignment (to manage atmospheric dispersion, i.e. the prism effect)
- many bugs fixed
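The live RGB alignment idea can be sketched very simply: atmospheric dispersion displaces the red and blue planes relative to the green one, so re-centering those two planes removes most of the color fringing. A minimal NumPy version (the function name and the integer-pixel simplification are mine, not JetsonSky's code):

```python
import numpy as np

def align_rgb(frame, r_shift=(0, 0), b_shift=(0, 0)):
    """Shift the red and blue planes of an RGB frame by integer (dy, dx)
    offsets, leaving green in place as the reference channel."""
    out = frame.copy()
    out[..., 0] = np.roll(frame[..., 0], r_shift, axis=(0, 1))  # red plane
    out[..., 2] = np.roll(frame[..., 2], b_shift, axis=(0, 1))  # blue plane
    return out

# Example: a red dot that arrived one row below the green centre (2, 2)
# is pulled back up onto it.
frame = np.zeros((5, 5, 3), dtype=np.uint8)
frame[3, 2, 0] = 255
fixed = align_rgb(frame, r_shift=(-1, 0))
print(fixed[2, 2, 0])  # 255: red is back on the green centre
```

A real implementation would also need sub-pixel shifts and a way to estimate the offsets (from the fringes themselves or from the target's altitude), but the per-channel shift is the core of it.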
Some new videos :
Alain
Hello,
I have decided to explain things about JetsonSky, so I started with my first spoken video: where does JetsonSky come from?
For now, it is spoken in French with English subtitles.
I will make at least 3 more videos :
- JetsonSky functions with a camera plugged
- examples of treatments with Moon and planetary videos
- examples of treatments with deep sky videos
If things go well and if people are interested in those videos, I will try to make some English-spoken versions. That will be quite funny and rock 'n' roll, I guess.
Alain
Hello,
2nd video about JetsonSky : the functions
It explains the main functions of the software. In French with English subtitles.
Still 2 more videos to come.
Alain
Hello,
3rd video about JetsonSky : exercises on Moon and Jupiter videos
In French with English subtitles.
Still 1 video to come later.
Alain
Hello,
4th and last video about JetsonSky : exercises with deep sky videos.
French spoken and English subtitles.
Quite long videos, but I wanted to explain how JetsonSky works. Very few people will watch them, but I hope those videos will inspire the few.
Alain
Hi Alain,
I was about to use your work, which I have been seeing on Jetson community projects and later on astronomy-related posts. I was planning to feed it video from my Dwarflab Dwarf2 smart telescope streams, but I think you no longer publish it on GitHub. Do we have a chance to access it? :/
Best,
Giray
Hello Giray,
for now, I have stopped sharing JetsonSky on GitHub. No time at all to manage it, and my priority is doing astronomy.
Maybe later I will consider sharing it again. Not sure.
Very sorry for that.
Alain
Hello,
a strange summer here: clouds, clouds, clouds.
So I keep working on JetsonSky, fixing small bugs and processing my few 2024 Moon images.
Here are the colors of the Moon (again), with JetsonSky treatment to manage brightness and colors:
Alain
Hello,
some tests yesterday. The sky was not bad.
I used the 7Artisans 35mm F0.95 lens with an IR-cut filter. Not bad, but I am a bit disappointed; it is not a really great lens. My old Canon FD 50mm F1.4 has better optics. That is the problem with optical systems: most of the time, not bad but not good.
The video :
Alain
I just realized something interesting from that video (at least it is interesting to me). When you move the drive motor to a new orientation, there is a slight "smearing" of objects that actually produce light, due to the length of the exposure. However, the noise content does not "smear". If the AI had access to knowledge about the drive motor's movement, it could use every movement to improve the removal of the noise that comes from the sensor itself. You wouldn't need to shoot something like dark frames with the lens cap on; it could just improve every time you move the camera. Not sure if it is worthwhile.
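The idea above can be sketched in a few lines: real objects move with the mount, but hot pixels and fixed-pattern noise stay at the same sensor coordinates, so combining frames taken at different pointings isolates the pattern. A toy version using a pixel-wise minimum (function name and the minimum-combine choice are mine, just to illustrate the principle):

```python
import numpy as np

def estimate_fixed_pattern(frames):
    """Estimate sensor fixed-pattern noise from frames taken at different
    pointings: anything that stays put in sensor coordinates survives the
    pixel-wise minimum, while moving objects are suppressed."""
    return np.min(np.stack(frames), axis=0)

# Example: one hot pixel at (1, 1) present in every frame, and a star
# that lands on a different sensor pixel after the mount moves.
f1 = np.zeros((4, 4), dtype=np.uint8)
f1[1, 1] = 200   # hot pixel
f1[0, 0] = 150   # star, first pointing
f2 = np.zeros((4, 4), dtype=np.uint8)
f2[1, 1] = 200   # same hot pixel, same sensor position
f2[2, 3] = 150   # star, second pointing
pattern = estimate_fixed_pattern([f1, f2])
print(pattern[1, 1], pattern[0, 0])  # 200 0: hot pixel survives, star does not
```

With more frames and a median instead of a minimum, this gets more robust against background noise; the estimated pattern can then be subtracted from every frame, which is exactly what a dark frame would have provided.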