Ok, fine. I do not need the filter wheel. The recording from last night shows a satellite rain or a Starlink train, call it what you want ;-). I have used the timestamp. Exposure time is 40ms and the result is great. Unfortunately, I downgraded the resolution to HD instead of 4K while post-processing for YouTube, so it does not look as good as it really is. Maybe I will fix that soon.
If you look at the timestamp, you can see that there is no difference in the speed of the video. It is always a timelapse, whether the exposure is 400ms or 40ms. The only difference, of course, is that the satellites are now nice dots instead of streaks. I don't want to argue with you, and maybe I am misunderstanding something, but my experience is that the video recording cannot be captured in real time. That is ok for me. If I slow it down in post-processing, it is no longer fluid, as you noticed before.
It is strange that whatever the exposure time is, the result is always the same, that is to say a timelapse video. If I understand well, the camera setting is 4K, that is to say BIN1. At full resolution (4K), the AGX Xavier needs some time to apply the treatments. Maybe this explains the timelapse effect. Try using BIN2: the treatment time will be lower and you should see a difference between 40ms and 400ms exposure time.
I looked at your video on YT. It’s amazing. Congrats.
Ok, I think I got it. The video is recorded at 25 fps, which means one frame every 40 ms. If my exposure is less than 40 ms, it is possible to capture and record 25 fps. If my exposure is longer than that AND the post-processing consumes several more ms per frame for enhancement, those 25 frames per second can never be reached. The Jetson then takes what it can get and assembles 25 frames into one second of video. Because capturing those 25 frames takes much longer than one second of real time, the video effectively pulls frames “from the future”. And this is why the video seems to be compressed into a timelapse. Well, it is one, but as a result of the long exposure and processing time. I hope I have made clear what I mean and that I have understood things correctly.
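The reasoning above can be sketched as a tiny model. The processing time per frame is a hypothetical figure just for illustration, not a measurement of JetsonSky's real pipeline:

```python
# Simple model of the timelapse effect: the output video plays at 25 fps,
# but frames can only be captured as fast as exposure + processing allow.
# All numbers below are illustrative assumptions, not JetsonSky internals.

def effective_fps(exposure_ms: float, processing_ms: float) -> float:
    """Frames per second the capture pipeline can actually deliver."""
    frame_time_ms = exposure_ms + processing_ms
    return 1000.0 / frame_time_ms

def playback_speedup(exposure_ms: float, processing_ms: float,
                     playback_fps: float = 25.0) -> float:
    """How much faster than real time the written video plays back."""
    return playback_fps / effective_fps(exposure_ms, processing_ms)

# 40 ms exposure with a hypothetical 160 ms of processing per frame:
fps = effective_fps(40.0, 160.0)       # only 5 fps can be captured
speed = playback_speedup(40.0, 160.0)  # so playback runs 5x real time
print(fps, speed)
```

With zero processing overhead the model gives 25 fps captured and a speedup of 1.0, i.e. real-time video, which matches the "exposure below 40 ms" case described above.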
Sure, you’re welcome. But I posted the G-code file which belongs to Tom’s adapter, and that depends on the printer you will use. Mine could not process STL files; I had to convert them to a G-code file, which I posted above on Sep 12.
If you like, I will make it for you and send it by mail.
The needed parameters:
Ultimaker 3 Extended
Don’t forget to enable support, with a layer height of 0.1mm or less!
Keep in mind that this thing is for Sony E-mount lenses!!!
My question is: what does the number at the very bottom signify? My time typically shows 43ms. If JetsonSky were recording at a 43 ms frame time, that would yield roughly a 23 fps video.
What I am seeing is 4.8 fps, which is about a 208 ms frame time. When watching the processed video in real time (like at night) I can see the jittering on the fast-moving satellites, which indicates something below 25 fps.
I am very happy with the timelapse effect. An 8-hour video playing in 45 minutes is a very good speed for observing…
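The two figures above are easy to cross-check with a bit of arithmetic. A minimal sketch of the conversions (frame time to fps, and session length to timelapse factor):

```python
# Back-of-the-envelope helpers for the numbers discussed above.

def fps_from_frame_time(frame_time_ms: float) -> float:
    """Convert a per-frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

def speedup_factor(recorded_hours: float, playback_minutes: float) -> float:
    """Real session duration divided by playback duration."""
    return (recorded_hours * 60.0) / playback_minutes

print(round(fps_from_frame_time(43.0), 1))   # 43 ms/frame is about 23.3 fps
print(round(fps_from_frame_time(208.0), 1))  # 208 ms/frame is about 4.8 fps
print(round(speedup_factor(8.0, 45.0), 1))   # 8 hours in 45 min is about 10.7x
```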
Oops, I have never seen a number at the bottom?? I think it is the processing time. A timelapse is ok, but that strong? I think you must be missing something. I watch my videos at 0.25x speed and I can detect some strange objects which I definitely would not see at higher speed (imho). You will probably get the big TicTac.
@ Thom The STL file is a normal 3D-print format. All STL files are then processed through a program called a slicer.
I use a slicer program called Simplify3D. Nowadays the program Cura is the popular slicer, and it is free. It is the slicer that calculates the G-code, and that G-code is specific to your particular 3D printer.
I got my first commercial print from Shapeways. The print was the Sony E-mount to M42 x 0.75 adapter. It is very nice, very precise, and much better than anything I can print. The cost of the adapter, delivered, was $25.
@ Alain Have you settled on which lens you are going to buy? I think you said one with a Fuji FX base…
Ahh, thanks, that makes sense. I’m not familiar with 3D printing, so I was wondering how this nice adapter came out when I gave my printer some instructions. I was afraid it could be some monster ;-) Then I will try to have it made by some professionals here. But my result seems to be ok, at first sight.
Thank you Alain for all your work on the JetsonSky software.
It is a very neat program that works great. I plan on using it often in the coming months. I did get both adapter rings back from the Shapeways printing shop. I must have made some kind of error in the Fuji adapter, as it was too big. The Sony E-mount was better, and the M42 threads are very accurate and fit nicely. I was able to make one good Fuji adapter out of all of it. I need a real CAD drawing of the Fuji camera mount to do it correctly.
Hello Alain… I loaded version 14_16 and it works fine. I just bought a used ASI385 camera for my telescope and it is working with JetsonSky. Pretty cool. I guess it autodetects the camera and just works…
Cool! I changed the camera setup in the V14_16 version. JetsonSky now gets the camera parameters from the camera in order to set the good values in JetsonSky. I tested it with my ASI178 and my ASI485 but made no tests with other cameras, so I am happy this works fine. I only set the different resolutions myself, because sensors have different form factors and resolutions, and we can’t choose any resolution we want (the sensor needs some specific resolutions). So, regarding the camera resolutions, I have to choose the good ones.
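As a rough illustration of why only certain resolutions are valid: ZWO ASI sensors typically require the capture region's width to be a multiple of 8 and its height a multiple of 2, capped by the sensor size. The helper below snaps a requested resolution to a valid one under that assumption; the alignment values and the 4K sensor cap are assumptions for illustration, not taken from JetsonSky's code:

```python
# Sketch of resolution validation for an ASI-style sensor.
# Assumed constraints (from the ZWO SDK conventions, not JetsonSky):
#   - width must be a multiple of 8
#   - height must be a multiple of 2
#   - both are capped by the sensor's maximum resolution

def snap_resolution(width: int, height: int,
                    max_w: int = 3840, max_h: int = 2160) -> tuple[int, int]:
    """Clamp to the sensor size, then round down to the required alignment."""
    w = min(width, max_w) // 8 * 8
    h = min(height, max_h) // 2 * 2
    return w, h

print(snap_resolution(1920, 1080))  # already valid, returned unchanged
print(snap_resolution(1925, 1081))  # rounded down to (1920, 1080)
print(snap_resolution(5000, 3000))  # clamped to the sensor maximum
```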
Still preparing for winter here, so I won’t make new things with JetsonSky for now. Holidays are also over, so I will have to get back to work. I’ll be back ASAP.