Hello,
We are designing an astronomy telescope mount controller combined with a video processing platform. Besides mount control, we want to be able to capture video and images and possibly process them.
The final list of features should be:
- EQ / AZ mount control
- Stepper motor Mount Control (RA/HA or ALT/AZ directions)
- Stepper motor Focuser Control (manual/automatic camera focusing)
- Joystick controlled movement
- GPS, Temperature / Humidity / Pressure sensors
- Assisted mount “north orientation” placement using Magnetometer
- Assisted mount “water leveling” with Gyroscope/Accelerometer
- Camera assisted polar alignment
- Finder camera image capture / processing (for manual sky object navigation)
- Main camera image capture / processing / storage (planetary and DSO objects)
- Deep space object imaging
- high resolution long exposure image capturing / storage
- dark/white frame capture support
- possibly realtime image processing and stacking
- dark frame subtraction, bad pixel elimination, noise reduction, contrast and brightness adjustment, color adjustment and enhancement, gradient removal, stacking
- Planetary imaging
- high FPS video capture / storage
- possibly realtime image processing and stacking
- dark frame subtraction, bad pixel elimination, noise reduction, contrast and brightness adjustment, color adjustment and enhancement, gradient removal, adaptive histogram equalization, stacking
- Satellite tracking and imaging
- External tracking (INDI)
- Internal tracking (object detection, trajectory calculation)
- Guiding camera image capture / processing (mount and control error correction)
- Automatic object navigation and tracking (Goto)
- DSO, planets, moon, sun, asteroids, satellites
- Autoguiding / tracking with necessary error corrections
- Externally (INDI) controlled tracking / guiding
- Internally controlled guiding (object detection and error correction)
- object detection, cyclic error calculation and predictive correction
- USB (ZWO) filter wheel control
- WiFi connectivity
- USB or NVME storage
- Web application with video live view
- remote camera and mount setup
- h264/h265 camera live video stream (pointer scope, main camera)
- raw image debayer, color space conversion, h264/h265 compression
- INDI Library integration
The final hardware will be built on:
- Jetson Nano or possibly Xavier NX
- TMC5160 motion controllers in motion controller mode
- BME280 temperature / pressure / humidity sensor
- MPU9255 magnetometer/accelerometer/gyroscope
- Microstack GPS module
- Status buzzer
- Water leveling indicator LEDs
Cameras to be used:
- 2x LOGITECH C270 USB camera (polar alignment, pointer scope)
- 1x ZWO ASI290MC (COLOR) (planetary imaging, guiding)
- 1x ZWO ASI1600MM PRO (MONO) (deep space imaging)
Most of the described functionality (except video capturing and processing) has already been tested on an SBC, at least on the bench. Most of the C++ software is in a pre-alpha state and ready to be integrated together. The goto/tracking parts have not been touched yet. Software for video capturing is currently in progress - just capturing, data storage and live view for now (the filters mentioned above have not even been tested yet).
The story behind…
A friend of mine got a SkyWatcher 8" and an unmotorized EQ5 mount as a present almost 3 years ago. The idea was that he would motorize the mount on his own. We thought building a goto mount would be a much easier task than it is. We had no experience with astronomy, mounts, telescopes, optics - nothing. We were not inspired by existing goto mounts or the open-source projects available on the internet; we found those much later, when we were looking into why something worked differently than we expected or did not work at all. Over the last almost 3 years we have learned that building such a gadget is a really tough task, not just from the electrical and software perspective, but also from the mechanical one. We learned a lot. We found that if the mount is not accurate and smooth enough, with small backlash, nothing can fix it. We found that if the motor drivers are a piece of crap, no accurate navigation and tracking can be achieved. We also found that wrongly soldered wires or shorts cost a lot of money, especially when an SBC gets fried (1 Pi, 1 Panda, many Arduinos, many motor drivers). In the end we found it would have been much cheaper and easier to buy a commercially available goto mount :-) But... it would not have been as much fun (believe us, there were also a lot of tears, especially when expensive hardware burned out :)). Once this project is completely done, everything will be released as open source, so everybody will be able to build their own gadget for astrophotography or use the software as an example for another project.
Mechanical part
In order to motorize the mount we had to find a way to fit motors on it and fix them in place. We used some universal iron NEMA17 holders, which were surprisingly easy to attach to the mount with a few screws. We used GT2 pulleys and belts to connect the motors to both EQ5 mount axis shafts. The GT2 pulley ratio is 1:1 - there is no space on the mount for a bigger ratio, although we expected a bigger ratio would be required for more torque and smoother steps. It was a relatively easy task despite the fact that the motor and mount axes have different diameters and that it was hard to find GT2 pulleys in Europe. Later, with some experience, it was confirmed that a 1:1 ratio is not enough to achieve the needed tracking accuracy (even with 256 microsteps). We found that stepper motors and drivers are not accurate enough, especially in microstepping mode and under high load. We concluded we need at least 0.5 arcsec (micro)step resolution to eliminate driver and stepper inaccuracies. We ended up with a 1:5 planetary gearbox, which together with the internal 1:144 mount transmission ratio gives us 36864000 (micro)steps per axis revolution. That is a 0.03 arcsec step at 256 microsteps, which is good enough. We also get 5 times more holding torque, which is definitely a good thing.
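For reference, the resolution numbers above come out of simple arithmetic; here is a minimal check in plain C++, assuming a standard 200 step/rev (1.8 degree) NEMA17 stepper:

```
#include <cstdio>

int main() {
    // Values from the text above; 200 full steps/rev is an assumption
    // (standard 1.8 degree NEMA17 stepper).
    const double full_steps = 200.0;   // motor full steps per revolution
    const double microsteps = 256.0;   // driver microstepping
    const double planetary  = 5.0;     // 1:5 planetary gearbox
    const double worm_gear  = 144.0;   // internal EQ5 worm gear ratio

    const double steps_per_axis_rev = full_steps * microsteps * planetary * worm_gear;
    const double arcsec_per_step    = 360.0 * 3600.0 / steps_per_axis_rev;

    printf("microsteps per axis revolution: %.0f\n", steps_per_axis_rev); // 36864000
    printf("arcsec per microstep: %.3f\n", arcsec_per_step);              // ~0.035
    return 0;
}
```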
The current version consists of two 3D printed motor holders which exactly fit the mount and allow better control of the belt tension. The rest is still the same - a GT2 belt on GT2 pulleys is still in use. An additional motor on a 3D printed holder is now mounted on the focuser. That part was also funny: a friend of mine bought a nice small stepper motor for which the manufacturer claimed 160 Ncm of holding torque. We measured it. It had 20, so it was useless. Another lesson we learned: don't trust no-name manufacturers.
Microstepping, a.k.a. stepper motor control
The first version we built was based on an RPI1 and DRV8825 stepper motor drivers. The whole thing was gamepad controlled. A single tracking speed on one axis was also possible. When we first navigated to Saturn and it held almost still in the view, we were really happy. Later we noticed the planet drifting forwards and backwards in the eyepiece. For a long time we could not find where the problem was. First we thought it was a mechanical problem. Then we blamed the Linux timer on the RPI, reasoning that it could not be accurate enough because Linux is not an RTOS and time slices are not predictable. Two years later, when we changed the stepper drivers for TMC ones, we realized that the DRV8825 is not a driver for applications where high accuracy is required. Both of them were losing microsteps from time to time.
So we found that accurate stepper motor control requires good drivers and an accurate pulse source on the CPU side. Once we realized the pulses were not accurate enough, we tried switching to an Arduino as the pulse source. But it is not powerful enough to drive two stepper motors accurately all the time, especially at higher speeds - the CPU frequency together with the relatively expensive interrupt handling is simply not enough to drive 2 motors plus possibly a 3rd motor for the focuser. Still, the current version works with an Arduino as the step/dir generator and it works relatively well. To get the most accurate and smooth movement we would need a more powerful CPU running an RTOS, an ASIC, or some smart stepper driver.
We decided on the last option, and the final version will be built on the TMC5160 motion controller, which lets us just send a target position and/or speed to the driver. It also lets us set acceleration and deceleration ramps. It is accurate, silent - simply a great driver.
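To give an idea of what "just send a target position" means in practice, here is a minimal sketch of a positioning move in motion-controller mode. The spiWrite40() helper is hypothetical (one 40-bit SPI datagram: register address plus 32-bit value) and the register addresses come from the TMC5160 datasheet; treat it as an illustration, not our final driver code:

```
#include <cstdint>

// Hypothetical helper: shifts one 40-bit datagram (8-bit register address +
// 32-bit data) out to the TMC5160 over SPI. Platform specific, stubbed here.
void spiWrite40(uint8_t reg, uint32_t data) { /* SPI transfer goes here */ }

// TMC5160 ramp generator registers (from the datasheet register map).
constexpr uint8_t RAMPMODE = 0x20;  // 0 = positioning mode
constexpr uint8_t A1       = 0x24;
constexpr uint8_t V1       = 0x25;
constexpr uint8_t AMAX     = 0x26;
constexpr uint8_t VMAX     = 0x27;
constexpr uint8_t DMAX     = 0x28;
constexpr uint8_t D1       = 0x2A;
constexpr uint8_t VSTOP    = 0x2B;
constexpr uint8_t XTARGET  = 0x2D;
constexpr uint8_t WRITE    = 0x80;  // MSB set selects write access

// Command an absolute move; the chip generates the whole velocity ramp itself.
void moveTo(int32_t targetMicrosteps, uint32_t vmax, uint32_t amax) {
    spiWrite40(RAMPMODE | WRITE, 0);        // positioning mode
    spiWrite40(A1       | WRITE, amax);     // acceleration below V1
    spiWrite40(V1       | WRITE, 0);        // disable the two-phase ramp
    spiWrite40(AMAX     | WRITE, amax);     // acceleration above V1
    spiWrite40(DMAX     | WRITE, amax);     // deceleration
    spiWrite40(D1       | WRITE, amax);     // must stay non-zero per datasheet
    spiWrite40(VMAX     | WRITE, vmax);     // cruise velocity
    spiWrite40(VSTOP    | WRITE, 10);       // stop velocity, keep non-zero
    spiWrite40(XTARGET  | WRITE, static_cast<uint32_t>(targetMicrosteps));
}
```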
SBC
At first we thought an RPI would be enough for everything. Later, after a few night astronomy sessions, we agreed that since it is necessary to take a lot of stuff to the observing site, it would be great if the number of items we have to carry could decrease. (Since then it has only been increasing, as we have bought more optical components and accessories to take with us, but that is a different story.) We mainly wanted to have most of the stuff in one compact piece and to plug in as few cables as possible. Ideally we wanted to at least leave the notebooks we use for image/video capture at home.
We wanted a single small computer that would control the mount but also capture photos and video and store the data. We tested whether an RPI4 is capable of high FPS video capture, but our performance tests showed that the RPI4 USB (or its internal buses) is not fast enough to transfer FullHD at a high frame rate, let alone together with another video stream for error correction. Later we decided to try a different platform - the LattePanda. It had an Arduino built in and promised fast USB 3.0. It worked well, but we were not able to test whether it could also store the data on another USB drive before... it burned.
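To put numbers on "not fast enough": with the capture formats we target (see the video section below), the raw data rate alone is close to 3 Gbit/s. A quick back-of-the-envelope check, assuming the 12-bit samples travel as 16-bit words (which is typical):

```
#include <cstdio>

int main() {
    const double w = 1920, h = 1080;

    const double rate8  = w * h * 1.0 * 170;  // FullHD @ 170 fps, 8-bit raw: 1 B/px
    const double rate12 = w * h * 2.0 * 85;   // FullHD @ 85 fps, 12-bit padded to 16 bit

    printf("8-bit  @ 170 fps: %.0f MB/s (%.2f Gbit/s)\n", rate8  / 1e6, rate8  * 8 / 1e9);
    printf("12-bit @  85 fps: %.0f MB/s (%.2f Gbit/s)\n", rate12 / 1e6, rate12 * 8 / 1e9);
    // Both come out around 350 MB/s (~2.8 Gbit/s) before adding the guiding
    // camera stream and writing the data to disk - already a big chunk of what
    // a real-world USB 3.0 link can sustain.
    return 0;
}
```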
In the end, as we also wanted remote control and suspected the LattePanda would not be enough, we decided to go for the Jetson Nano. We also wanted to try some real-time image processing.
So I ordered a devkit, and when it arrived I was really disappointed. The performance of the ZWO camera over USB 3.0 was really bad. It could not handle the video capture properly: the stream was lost from time to time, the framerate was unstable, CPU usage was really high - terrible. I suspected there was some bottleneck on the internal PCIe bus.
So I ordered the Xavier NX devkit, as it has USB 3.1 and I hoped it would be better. But the problem was the same: very high CPU usage and problems with the video stream.
I searched the NVIDIA forums to see whether this could be a hardware problem and found that the Nano really is capable of 5 Gbit/s USB 3.0 transfers and the Xavier of 10 Gbit/s USB 3.1 transfers. Fortunately, after another 2 days of reverse engineering, debugging, testing and learning the details of the USB bus and libusb, I discovered that the ZWO API probably uses libusb in a wrong and inefficient way.
After another 2 days of writing a libusb wrapper and avoiding the stream handling functions of the ZWO library, it was possible to achieve the expected frame rates with 0.7% CPU usage. Thanks, ZWO. You made us buy the Xavier NX. It's a nice piece of hardware. Pricey, but very nice and powerful. I wanted to buy it anyway, you just made me buy it now :). But now, as compensation, please make the camera API open source. Otherwise the bugs in the libusb and buffer handling will never get fixed.
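For the curious, the workaround is nothing exotic - just keeping several asynchronous libusb bulk transfers in flight instead of blocking synchronous reads. A simplified sketch; the endpoint address, transfer size and device lookup are placeholders, not the real ZWO values:

```
#include <libusb-1.0/libusb.h>
#include <vector>

// Placeholder values - the real endpoint and transfer size depend on the camera.
constexpr unsigned char BULK_IN_EP   = 0x81;
constexpr int           TRANSFER_LEN = 512 * 1024;
constexpr int           NUM_XFERS    = 8;   // keep several transfers in flight

static void LIBUSB_CALL onTransfer(libusb_transfer* xfer) {
    if (xfer->status == LIBUSB_TRANSFER_COMPLETED) {
        // xfer->buffer holds xfer->actual_length bytes of raw frame data;
        // hand it over to the frame assembler / CUDA pipeline here.
    }
    libusb_submit_transfer(xfer);  // immediately re-queue to keep the bus busy
}

int main() {
    libusb_context* ctx = nullptr;
    libusb_init(&ctx);

    // The camera would normally be located by VID/PID; omitted here.
    libusb_device_handle* dev = /* libusb_open_device_with_vid_pid(ctx, vid, pid) */ nullptr;
    if (!dev) return 1;
    libusb_claim_interface(dev, 0);

    // Queue several asynchronous bulk transfers so the host controller always
    // has a buffer ready - this keeps CPU usage low and the frame rate stable.
    std::vector<std::vector<unsigned char>> buffers(
        NUM_XFERS, std::vector<unsigned char>(TRANSFER_LEN));
    for (int i = 0; i < NUM_XFERS; ++i) {
        libusb_transfer* xfer = libusb_alloc_transfer(0);
        libusb_fill_bulk_transfer(xfer, dev, BULK_IN_EP, buffers[i].data(),
                                  TRANSFER_LEN, onTransfer, nullptr, 1000);
        libusb_submit_transfer(xfer);
    }

    while (true)
        libusb_handle_events(ctx);  // completion callbacks run from here
}
```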
Video and image capturing / processing
Currently we are working on video capturing. The idea is to capture FullHD@170/8bit or FullHD@85/12bit video, or long exposure images. Captured data will be stored on disk. The captured data will also be processed by CUDA - debayered, color depth converted, color format converted - and either displayed on screen or H264 encoded and streamed over RTSP (or HTTP) to the controlling device (such as a mobile phone).
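Just to make the "debayer" step concrete, here is a tiny CPU reference of what the CUDA kernel computes - a simple demosaic, assuming an RGGB Bayer layout and 8-bit data (the real kernel does the same work per 2x2 cell, just in parallel on the GPU and also for the 12-bit case):

```
#include <cstdint>
#include <vector>

// Simple demosaic of an 8-bit RGGB Bayer frame into interleaved RGB.
// Reference only: each 2x2 Bayer cell {R, G / G, B} collapses to one RGB value
// (the two greens averaged), copied into all four output pixels.
std::vector<uint8_t> debayerRGGB(const uint8_t* raw, int width, int height) {
    std::vector<uint8_t> rgb(static_cast<size_t>(width) * height * 3);
    for (int y = 0; y < height; y += 2) {
        for (int x = 0; x < width; x += 2) {
            const uint8_t r  = raw[y * width + x];
            const uint8_t g1 = raw[y * width + x + 1];
            const uint8_t g2 = raw[(y + 1) * width + x];
            const uint8_t b  = raw[(y + 1) * width + x + 1];
            const uint8_t g  = static_cast<uint8_t>((g1 + g2) / 2);
            for (int dy = 0; dy < 2; ++dy) {
                for (int dx = 0; dx < 2; ++dx) {
                    uint8_t* out = &rgb[((y + dy) * width + (x + dx)) * 3];
                    out[0] = r; out[1] = g; out[2] = b;
                }
            }
        }
    }
    return rgb;
}
```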
Video processing / image filtering
Later on, hopefully, some real-time filtering will be applied to the video / images being captured and stored.
Problems we are solving:
- See eglstream cuda opengl interop for details. Currently I am working on the EGL interop. I want to take the libusb-captured data, pass it to CUDA to debayer it, and display it on screen (a GL textured quad). Since I wanted to minimize memcpys, I was hoping that EGLStream would help me with this task, and later with passing the data to the video encoder. At least that is what I've read in the docs. So I wrote an OpenGL app using an X window to render the content, but I am not able to bind the consumer texture to the EGL stream (the call order we are trying is sketched below).
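For reference, this is the consumer/producer hookup order as we understand it from the EGLStream docs - just the relevant calls, with display/context creation, CUDA context setup and error checking left out. Treat it as our current understanding, not verified working code:

```
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <cudaEGL.h>

// Assumes an EGLDisplay and a GL context are already created and current.
void connectStream(EGLDisplay dpy, int width, int height) {
    // The stream extension entry points have to be loaded at runtime.
    auto createStream = (PFNEGLCREATESTREAMKHRPROC)
        eglGetProcAddress("eglCreateStreamKHR");
    auto consumerGLTexture = (PFNEGLSTREAMCONSUMERGLTEXTUREEXTERNALKHRPROC)
        eglGetProcAddress("eglStreamConsumerGLTextureExternalKHR");

    EGLint attribs[] = { EGL_NONE };
    EGLStreamKHR stream = createStream(dpy, attribs);

    // The consumer texture must be bound to GL_TEXTURE_EXTERNAL_OES (not
    // GL_TEXTURE_2D) in the current context when the consumer is connected.
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, tex);
    consumerGLTexture(dpy, stream);               // connect the GL consumer first

    // Only then connect the CUDA producer; frames presented with
    // cuEGLStreamProducerPresentFrame() should appear in 'tex' after each
    // eglStreamConsumerAcquireKHR() call in the render loop.
    CUeglStreamConnection conn;
    cuEGLStreamProducerConnect(&conn, stream, width, height);
}
```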
Problems solved (mainly Jetson related)
- ZWO libusb usage: bad performance
- code must be reworked, but it works somehow
Photos
Version 0 - the first one we saw at our friend's place, and the reason this project was born…
Version 1 - Initial one…
Version 2 - Current (RPI4 + Arduino Nano)
Version 2 - Motors, 3D printed case for connectors and box with mount controller
Version 3 - NextGen devkit with RPI4 and all required hardware