How do you take a photo like this of a space station 400 kilometers away?
As the largest man-made object in space, the International Space Station has long been a dream subject for countless astronomy enthusiasts.
The method most people use is to calculate the station's transit times for their latitude and longitude (each pass lasts from a few seconds to a few minutes), shoot with a phone or a mirrorless camera, and end up with a tiny spot of light like this:
A more advanced approach is the "solar/lunar transit" method: before the space station flies across the sun or the moon, use one of these two bodies as a background source to pin down a much smaller target area.
Then wait on the transit centerline, keep a large-aperture, long-focal-length telescope weighing anywhere from several kilograms to tens of kilograms tracking stably, and press the shutter in the under-one-second moment when the station crosses the disk.
Shots taken this way are clearer, but a fatal problem remains:
The space station circles the Earth once every 90 minutes on average. In the well-known sky-simulation tool Stellarium, Venus and Mars look like still life next to the space station (real-time simulation, no time acceleration):
In short, the space station is running too fast.
As a result, not only is the number of frames you can capture in that one second limited (so multi-frame stacking cannot be used to beat atmospheric turbulence), but you can only get silhouettes, and a small calculation error can mean missing the shot entirely.
If you want sharper, higher-precision photos, you need not only extraordinary patience but also deep, well-practiced manual control skills.
For example, in strong wind the result easily blurs into a fuzzy ball:
▲ Image credit: Wang Zhuoxiao, used with permission
So at this point, someone had a flash of inspiration: forget superhuman arm strength; why not write code to make the telescope move on its own?
No sooner said than done: this self-described "amateur programmer" threw himself into 17 days of intense work and produced an automatic tracking system.
With this automatic tracking system, the telescope is no longer limited to a few static frames within a few seconds; it can follow the space station continuously for 2 minutes.
Finally, the frames are stacked and post-processed into a sharp, high-precision GIF with an almost three-dimensional look:
(That’s the picture we started with)
It was exactly this image that had netizens exclaiming that "the era of tracked photography of artificial celestial bodies has officially begun."
So we found the developer himself, Liu Boyang, a Peking University astronomy alumnus with a doctorate in astrophysics, for a chat.
How difficult is it to shoot a high-precision photo of the space station?
First, you need a quick sense of the "timing" involved in shooting the space station.
Although the space station moves very fast, circling the Earth once every 90 minutes on average at an altitude of roughly 400 kilometers, and is visible to the naked eye, we cannot observe it just any time.
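Those two numbers already hint at the tracking difficulty. As a rough illustration (standard circular-orbit arithmetic; the figures are my approximations, not from the article), here is how fast the station sweeps across the sky when it passes directly overhead:

```python
import math

# Circular-orbit speed at ~400 km altitude (two-body approximation).
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # mean Earth radius, m
altitude = 400e3     # station altitude, m

r = R_EARTH + altitude
v = math.sqrt(GM / r)              # orbital speed, m/s (~7.7 km/s)
period = 2 * math.pi * r / v / 60  # orbital period, minutes (~92)

# Apparent angular rate for an overhead pass: the full orbital
# velocity seen across a 400 km line of sight.
omega = math.degrees(v / altitude)  # deg/s (~1.1)

print(f"orbital speed ≈ {v / 1000:.2f} km/s")
print(f"orbital period ≈ {period:.0f} min")
print(f"angular rate at zenith ≈ {omega:.2f} deg/s")
```

Roughly one degree per second at closest approach, which is why keeping it centered by hand is so punishing.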
There are two main constraints: field of view and observation time.
Field of view: the space station must be flying within our visible range, that is, during a "transit";
Observation time: the space station emits no light of its own. Only within about two hours after sunset or two hours before sunrise is the sunlight it reflects at its brightest, so those windows are the best for shooting.
Only when both conditions are met at the same time do we get a chance to observe and photograph the station from the ground, and even then the result is at the mercy of the weather (the picture shows a cloudy attempt):
▲ Image credit: Zhu Yijing & Xu Chengcheng, used with permission
However, the common astrophotography methods available today are actually ill-suited to taking higher-precision photos of the space station.
The first method is shooting directly with a hand-guided telescope, that is, pushing the telescope by hand to follow the target.
This method has a flaw: there is no way to capture the space station in really high definition. Because the tracking is manual, you cannot search with a long telephoto directly; it would be like chasing a fast-moving ant with a microscope, and one careless moment sends the station right out of the frame.
The second method is a stakeout: set up high-resolution lenses and equipment in advance, then wait in place for the space station to pass by.
This method does not require moving the camera at all. But it brings new problems: the station's pass can last only a few seconds, so it is easy to miss entirely, and even successfully captured footage may be ruined by a bad angle, so the result cannot be guaranteed.
So why not shoot using the telescope's built-in tracking function?
That function is normally designed only to compensate for Earth's rotation, following the slow rise and set of the sun, moon, planets, and stars. Those targets drift across the sky at essentially the sidereal rate, but for a fast mover like the space station, the mount simply cannot keep up.
So in the end, it takes program assistance to achieve high-precision tracked shooting of the space station.
The third method is tracking by orbital elements (that is, orbital parameters): use the satellite data published on astronomy websites (such as Heavens-Above) to steer the telescope along the predicted path, applying manual corrections along the way:
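The "orbital elements" these sites publish usually come as two-line element sets (TLEs). As a tiny illustration of what the data looks like (the TLE line below is a sample in the standard format, not current data), the mean-motion field in line 2 alone already gives you the orbital period:

```python
# In the standard NORAD two-line element format, the mean motion
# occupies columns 53-63 of line 2 (1-indexed). The sample line below
# is for catalog number 25544 (the ISS); values are illustrative only.
tle_line2 = "2 25544  51.6442 128.6985 0004885 100.6000  24.0000 15.49512828332804"

mean_motion = float(tle_line2[52:63])  # revolutions per day
period_min = 24 * 60 / mean_motion     # orbital period in minutes

print(f"mean motion: {mean_motion} rev/day")
print(f"period: {period_min:.1f} min")
```

Real tracking programs feed the full element set into an orbit propagator (SGP4) to predict altitude and azimuth second by second.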
At present most enthusiasts achieve tracking plus fine-tuning this way, and some fairly mature programs already exist online. For example, this is the result of tracking the station with a motorized alt-azimuth mount driven by orbital parameters:
▲ Image credit: Wang Zhuoxiao, used with permission
BUT, you never know whether these astronomy websites are up to date. Sometimes the station temporarily adjusts its orbit, the website lags behind, and your program simply stops working.
Using optical recognition to keep the error within 4 pixels
As a longtime astronomy enthusiast, Liu Boyang knows all of the above problems only too well.
His initial idea was to have software find the "spot of light" in the frame, identifying and tracking the target through optical recognition.
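The "find the spot of light" idea is, at its core, bright-blob centroiding. A minimal sketch of the concept with NumPy (my own illustration of the general technique, not Liu Boyang's actual code): threshold the frame, then take the brightness-weighted centroid of the surviving pixels:

```python
import numpy as np

def find_spot(frame, threshold):
    """Return the brightness-weighted (x, y) centroid of pixels above threshold."""
    mask = frame > threshold
    if not mask.any():
        raise ValueError("no bright spot found")
    ys, xs = np.nonzero(mask)
    weights = frame[ys, xs].astype(float)
    cx = float(np.average(xs, weights=weights))
    cy = float(np.average(ys, weights=weights))
    return cx, cy

# Synthetic 64x64 frame with a bright 3x3 blob centered at (x=40, y=20).
frame = np.zeros((64, 64))
frame[19:22, 39:42] = 200.0  # rows are y, columns are x
cx, cy = find_spot(frame, threshold=50.0)
print(f"spot at x ≈ {cx:.1f}, y ≈ {cy:.1f}")
```

The offset of this centroid from the desired position is exactly the error signal a tracking controller needs.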
However, when he went looking for suitable programs, he found they were either unmaintained (some so outdated that even the Windows builds no longer ran), updated too slowly and overly complicated, or simply closed-source and paid.
So Liu Boyang finally decided to do it himself: write an automatic optical-recognition tracking script, acquire the space station manually, and control the tracking with PID.
His plan had two steps.
Step one: write a program that lets the telescope automatically identify and track the space station. This took 5 days.
It is worth mentioning that optical recognition was not Liu Boyang's first choice.
He did consider tracking by parameters plus manual fine-tuning, for instance steplessly controlling the equatorial mount's slew speed with a joystick, or coarse tracking from the orbital elements combined with stepless fine-tuning on a gamepad, but the test shots were unsatisfactory (hands are just not steady enough for the fine adjustments).
So he wrote an optical tracking method based on the PID control principle. PID is a classic control algorithm whose letters stand for the proportional, integral, and derivative terms; it is what keeps a two-wheeled self-balancing robot upright, for example.
Liu Boyang had never formally studied this material, but to build a stable automatic control system he naturally introduced the proportional (P) and integral (I) terms to reduce the system's error.
Liu Boyang's telescope has two parts: a finderscope with a wide field of view and a primary mirror with a narrow one. The algorithm's basic job is to compute, from the station's current position in the finderscope, how far it deviates from the primary mirror's field of view, and then adjust the telescope's tracking speed to correct that deviation so the station lands in the primary mirror's field.
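The control loop described here can be sketched in a few lines (a generic PID sketch of the idea, not Liu Boyang's code; the gains `kp`, `ki`, `kd` and the toy "mount response" are placeholders you would tune on real hardware). The error is the spot's pixel offset from the primary mirror's field center, and the output is a correction to the mount's slew rate on that axis:

```python
class PID:
    """Minimal PID controller: one instance per mount axis."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """error: target offset in pixels; dt: seconds since last frame.
        Returns a slew-rate correction for this axis."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy closed loop: drive a simulated pixel offset toward zero.
pid_x = PID(kp=0.5, ki=0.1, kd=0.05)
offset = 40.0  # initial offset in pixels
for _ in range(400):
    correction = pid_x.update(offset, dt=0.1)
    offset -= correction * 0.1  # pretend the mount responds proportionally
print(f"offset after simulated loop: {offset:.3f} px")
```

In the real system there would be two such loops, one for each mount axis, fed by the spot position measured in the finderscope.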
With this program, the finderscope can rapidly follow the moving "light source" and keep the station centered in the field of view. To simulate the station's motion, Liu Boyang swept a laser pointer across his wall to make a bright spot moving at constant speed, and the result was quite good:
The program itself is built on the ASCOM platform, a software interface standard used very widely in astronomy. It brings the configuration of all the astronomical equipment, such as telescope focusing, filter-wheel rotation, and camera shutter control, together in a single piece of software:
On the hardware side, besides a laptop, the setup includes:
Celestron EdgeHD 11″ f/10 catadioptric telescope on a CGEM equatorial mount
Canon EOS R5 Camera
QHY5III462c camera, used as the guide camera
Thrustmaster T16000M joystick
Of these, the telescope cost about 40,000 yuan; the Canon EOS R5 was rented for two weeks at 2,200 yuan (market price about 25,000 yuan); the 462c camera cost under 1,000 yuan; and the stick came from a trade with a friend (market price about 500 yuan).
All told, the rig cost less than 45,000 yuan. According to Liu Boyang, if the accuracy requirements are relaxed, a complete setup can be put together for under 10,000 yuan.
Then came step two: shoot in the field and actually use the equipment to take high-precision photos of the space station.
But the real-world shooting turned out harder than imagined; Liu Boyang spent it "fixing bugs by trial and error" the whole way.
His initial target was the Chinese space station, but two bugs in a row cost him the two best observation windows.
On March 23, he failed to focus in time, so the automatic optical tracking never engaged; on March 27, the finderscope's field of view of only about 3° proved too small for the initial acquisition, and the automatic tracking failed again.
With the Chinese space station's next visible transit still a long way off, Liu Boyang fixed the operational problem (widening the finderscope's field of view to 15°) and decided to "practice" first on the International Space Station, which was about to make several excellent transits.
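The field-of-view change matters because of dwell time. A quick back-of-the-envelope illustration (assuming, as my own approximation, that the station crosses the sky at roughly 1°/s near closest approach and cuts straight through the field):

```python
def dwell_time(fov_deg, rate_deg_s=1.0):
    """Seconds a target crossing at rate_deg_s spends inside a field
    of view fov_deg degrees wide, if it passes through the middle."""
    return fov_deg / rate_deg_s

print(f"3 deg finder field: ~{dwell_time(3):.0f} s to acquire the target")
print(f"15 deg finder field: ~{dwell_time(15):.0f} s")
```

A few seconds versus roughly fifteen is the difference between a frantic grab and a comfortable acquisition.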
So, after changing the acquisition step in the tracking program to a manual trigger, Liu Boyang successfully caught the International Space Station on April 2.
Imperfections remained; for example, a software crash wiped out the position calibration data between the finderscope and the primary mirror, so Liu Boyang added a function to record the calibration data.
By this point, the code had grown from the original 400 lines to 600.
Finally, on the evening of April 3, after an emergency bug fix, Liu Boyang successfully tracked the International Space Station.
Specifically, the telescope acquires the station on two axes, x and y. After pressing "capture", the y-axis locks onto the target quickly and steadily, while the x-axis takes about 10 seconds longer.
By about the 30-second mark, both axes stayed within a stable error range (about four pixels). This high-precision tracking lasted 120 seconds in total and recorded the ISS's entire pass from approach to departure:
The raw frames showed the station at about 100 pixels across; after supersampling multiple frames, that was raised to more than 200 pixels.
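Multi-frame supersampling here means upscaling each frame onto a finer grid and averaging, so that detail and signal-to-noise beyond a single frame can be recovered. A toy NumPy sketch of the averaging step (my illustration of the general technique, not the actual pipeline; real pipelines also align each frame to sub-pixel accuracy first):

```python
import numpy as np

def supersample(frames, factor=2):
    """Upscale each frame by `factor` (nearest-neighbour) and average.

    Assumes the frames are already registered; real pipelines align
    them to sub-pixel accuracy before stacking."""
    upscaled = [np.kron(f, np.ones((factor, factor))) for f in frames]
    return np.mean(upscaled, axis=0)

# Three noisy 100x100 frames of the same scene.
rng = np.random.default_rng(0)
scene = rng.random((100, 100))
frames = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(3)]

result = supersample(frames, factor=2)
print(result.shape)  # doubled resolution: (200, 200)
```

Averaging N frames cuts the random noise by roughly the square root of N, which is why the stacked result looks so much cleaner than any single frame.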
After final processing, a series of 300 × 300 pixel images was output and combined into the GIF:
And that was day 17 of Liu Boyang's project.
Next up: launching a small rocket
Asked about the hardest stage of the whole project, what impressed Liu Boyang most was getting the telescope to be callable from his Python code:
For a poor programmer like me, development was initially a complete black box.
Liu Boyang studied astrophysics at Peking University as an undergraduate and at the University of Western Australia for his PhD.
The major requires basic programming skills, but Liu Boyang barely scraped through, or deferred, his computing courses, such as Introduction to Computing and Data Structures.
During his PhD, a great deal of data processing had to be done with scripts, and he began learning programming languages in earnest.
He chose to write his own telescope-control code this time partly because he could not find ready-made software, and partly because he wanted to keep exercising his programming ability.
So will the code be open-sourced? When we asked, Liu Boyang said:
At least not until I can debug it on my own and have optimized the code to a level I'm satisfied with; then I'll consider the next step.
One of his nearest goals is the Chinese space station's next transit, two weeks away.
After the successful ISS practice run, Liu Boyang is full of confidence, and is weighing whether to narrow the field of view somewhat for the next acquisition to improve shooting precision.
If shooting the Chinese space station goes well, it will wrap up before April 21. After that, he will rush straight to Qinghai to start a new project: launching a small rocket carrying its own camera.
Looking further ahead, Liu Boyang also mentioned the Shenzhou launches and experimental-module missions expected in the second half of the year; when the time comes, he will bring his station-tracking program along and chase the big rockets too.
Such a hardcore lineup of plans is exactly what you would expect from a devoted aerospace enthusiast.
Liu Boyang finally said:
I have been interested in astronomy since childhood, which is why I studied astrophysics through my bachelor's, master's, and doctorate.
But as domestic aerospace missions have multiplied, I have had more and more chances to take part in related activities, my interest in spaceflight has grown steadily, and it has now become a major hobby.
Reference link:
[1]https://weibo.com/1144755982/LmV8Cp72V
[2]https://zhuanlan.zhihu.com/p/493080686
[3]https://www.heavens-above.com/orbit.aspx?satid=25544
[4]https://mp.weixin.qq.com/s/gNueq8lDQz_86Ifuw8n6Pg#rd
The post I, shot the International Space Station 400 kilometers away with 600 lines of code in 17 days appeared first on Gamingsym.