How do you take a photo like this of the space station from 400 kilometers away?
As the largest man-made object in space, the International Space Station has long been a dream target for countless astronomy enthusiasts.
The approach most people take is to calculate the station's transit time for their latitude and longitude (anywhere from a few seconds to a few minutes), point a phone or mirrorless camera at the sky, and end up with a tiny speck of light like this:
A more advanced technique is the solar or lunar transit shot: before the station crosses the Sun or the Moon, use those two bodies as background sources to narrow the search down to a small area.
Then you wait on the transit centerline, keep a large-aperture, long-focal-length telescope weighing anywhere from several to dozens of kilograms tracking steadily for a long stretch, and press the shutter in the under-one-second window when the station crosses the disk.
Shots like this come out sharper, but the fatal problem remains:
The space station orbits the Earth once every 90 minutes on average. In the well-known sky-simulation tool Stellarium, Venus and Mars look like still lifes next to the space station (real-time simulation, no speed-up):
In short, the station simply moves too fast.
As a result, the number of frames you can capture in that one second is limited (too few to stack multiple frames and beat atmospheric turbulence), you only ever get silhouettes, and a small calculation error can cost you the shot entirely.
Getting higher-precision photos takes not only extraordinary patience but also deep, well-honed manual control skills.
In strong wind, for example, the result easily blurs into a fuzzy ball:
△Image credit: Wang Zhuoxiao, used with permission
At this point, someone had a flash of inspiration:
Lacking superhumanly steady arms, why not write some code and let the telescope move itself?
No sooner said than done: this self-described amateur programmer ground away for 17 days and came up with an automatic tracking system.
With this system, the telescope is no longer limited to a few static frames within one specific second; it follows the space station continuously for 2 minutes.
Finally, multiple frames are stacked and post-processed into a high-resolution animated GIF:
(That's the image this article opened with.)
It was exactly this image that had netizens sighing that "the era of tracked photography of artificial celestial bodies has officially begun."
So we found the developer himself, Liu Boyang, a Peking University astronomy alumnus with a doctorate in astrophysics, for a chat.
How hard is it to shoot a high-precision photo of the space station?
First, a quick note on the "timing" of shooting the space station.
The station moves fast, circling the Earth roughly once every 90 minutes at an average altitude of about 400 kilometers, and it is visible to the naked eye; even so, we cannot observe it at just any time.
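Those figures, 90-minute orbits at 400 km up, can be sanity-checked with basic orbital mechanics. A rough sketch, assuming a circular orbit (the numbers here are textbook constants, not from the article):

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0       # mean Earth radius, m
altitude = 400_000.0        # ISS-like altitude, m

r = R_EARTH + altitude
v = math.sqrt(MU_EARTH / r)          # circular orbital speed, m/s
period = 2 * math.pi * r / v / 60    # orbital period, minutes

# Worst case for a ground observer: the station passes directly
# overhead, 400 km away, moving at v across the sky.
angular_rate_deg = math.degrees(v / altitude)

print(f"orbital speed: {v / 1000:.1f} km/s")
print(f"period: {period:.0f} min")
print(f"max apparent rate: {angular_rate_deg:.1f} deg/s")
```

The roughly one-degree-per-second apparent rate is what makes tracked photography so unforgiving compared with "still life" planets.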
There are two main constraints: the field of view and the observation window.
The field of view means the station must fly into the visible part of our sky, i.e., the period of a "transit";
The observation window concerns when the station can be seen at all. It emits no light of its own; only within roughly two hours after sunset or two hours before sunrise is the sunlight it reflects at its brightest against a dark sky, making those windows the best for shooting.
Only when both conditions are met at once do we get a chance to observe and photograph the station from the ground, and even then the result depends on factors such as weather (the picture below was taken under cloud):
△Image credit: Zhu Yijing & Xu Chengcheng, used with permission
However, several common methods of photographing celestial objects are unsuited to taking higher-precision photos of the space station.
The first method is direct "hand-guided" shooting, i.e., pushing the telescope by hand to track the target.
The flaw: there is no way to capture the station in really high definition this way. Because the tracking is manual, you cannot search with a long telephoto lens directly; that would be like trying to follow a fast-moving ant under a microscope, and one careless moment sends the station out of the frame.
The second method is to "wait by the stump for the rabbit": set up high-resolution lenses and equipment in advance and let the space station "walk past" on its own.
This requires no camera movement, but it raises new problems. The station's "pass" is very short, sometimes only a few seconds, so it is easy to miss entirely; and even footage you do catch cannot be adjusted for angle and framing, so the result is never guaranteed.
So why not just shoot with the telescope's built-in tracking function?
That function is usually designed only to compensate for Earth's rotation while tracking the rising and setting of the Sun, the Moon, planets and stars. Those targets appear to move slowly, essentially in step with the rotating sky; a fast mover like the space station simply outruns the mount.
So in the end, high-precision tracking of the station has to lean on software assistance.
The third method is tracking by orbital elements (i.e., orbital parameters): use the element sets published on astronomy websites (such as Heavens-Above) to steer the telescope along a precomputed path, applying manual corrections along the way:
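Orbital elements are typically distributed as two-line element sets (TLEs), which sites like Heavens-Above build their predictions on. A minimal sketch of reading the fields a tracking program cares about; the TLE line below is illustrative (made-up values in the standard column layout), not a current element set:

```python
# Illustrative ISS-style TLE line 2; fields and checksum are made up
# for the example, not real current elements.
tle_line2 = "2 25544  51.6445 123.4567 0004567  35.0000 100.0000 15.50000000123456"

# TLE fields live at fixed column positions.
inclination = float(tle_line2[8:16])    # orbital inclination, degrees
mean_motion = float(tle_line2[52:63])   # revolutions per day
period_min = 1440.0 / mean_motion       # minutes per orbit

print(f"inclination: {inclination} deg, period: {period_min:.1f} min")
```

Real tracking programs feed the full element set into an SGP4 propagator to get the station's position at any instant, rather than parsing fields by hand.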
At present most astronomy enthusiasts use this tracking-plus-fine-tuning approach, and some fairly mature programs already exist online. For example, here is the result of tracking the station with a motorized alt-azimuth mount driven by orbital parameters:
△Image credit: Wang Zhuoxiao, used with permission
BUT you never know how current those websites are. If the station makes a temporary orbit adjustment and the site has not yet refreshed its elements, your program will track the wrong path.
Optical recognition keeps the tracking error within 4 pixels
As a longtime astronomy enthusiast, Liu Boyang knew all of these problems only too well.
His initial idea was to find existing software that could locate the "light spot" in the frame and identify and track the target through optical recognition.
But when he went looking for a suitable program, he found the candidates were all either unmaintained (some so old they no longer ran even on Windows), sluggishly updated and overly complicated, or simply closed-source and paid.
So Liu Boyang decided to do it himself: write an automatic optical-recognition tracking script, acquire the space station manually, and control the tracking with PID.
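The core of such optical recognition can be as simple as an intensity-weighted centroid of the bright pixels in each frame. A minimal sketch; the function name, threshold, and toy frame are illustrative, not Liu Boyang's actual code:

```python
def spot_centroid(frame, threshold):
    """Intensity-weighted centroid of pixels at or above threshold.

    frame: 2D list of brightness values. Returns (x, y) in pixel
    coordinates, or None if nothing is bright enough. A real tracker
    would run this on live camera frames, but the math is identical.
    """
    total = sx = sy = 0.0
    for y, row in enumerate(frame):
        for x, val in enumerate(row):
            if val >= threshold:
                total += val
                sx += x * val
                sy += y * val
    if total == 0:
        return None
    return sx / total, sy / total

# A 5x5 toy frame with a bright blob centred near (3, 1):
frame = [
    [0, 0, 0, 0, 0],
    [0, 0, 5, 9, 5],
    [0, 0, 0, 5, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
print(spot_centroid(frame, threshold=5))
```

The centroid's offset from the frame centre is exactly the error signal a control loop needs.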
His plan had two steps.
Step one: write a program that lets the telescope identify and track the space station automatically. This took five days.
It is worth mentioning that optical recognition was not Liu Boyang's first choice.
He did consider parameter-based tracking with manual fine-tuning, including stepless joystick control of the equatorial mount's speed and coarse tracking from orbital elements combined with stepless gamepad corrections, but the test shots were disappointing (hands are simply not steady enough for the fine adjustments).
So he wrote an optical tracking routine based on the PID control principle, a classic control algorithm whose letters stand for the proportional, integral and derivative terms. It is the same algorithm used, for example, to keep a two-wheeled robot balanced.
Liu Boyang had never formally studied this material, but to build a stable automatic control loop he naturally introduced a proportional term (P) and an integral term (I) to drive down the system's error.
His telescope has two parts: a finder scope with a wide field of view and a primary mirror with a narrow one. The algorithm's basic job is to compute, from the station's current position in the finder, how far it deviates from the primary mirror's field, and then adjust the telescope's tracking speed to cancel that deviation and bring the station into the primary mirror's field of view.
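The loop described above can be sketched as a PI controller per axis. This is an illustrative sketch, not Liu Boyang's actual code; the gains and the toy simulation are made up:

```python
class AxisPI:
    """PI speed correction for one mount axis.

    error: how far (in pixels) the light spot sits from the main
    mirror's field centre as seen in the finder; the return value is a
    tracking-rate adjustment. Gains here are illustrative only.
    """
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

# Toy simulation: the spot starts 20 px off-centre and drifts at a
# constant 2 px/s; the controller steers the rate to cancel both.
ctrl = AxisPI(kp=0.8, ki=0.5, dt=0.1)
offset, drift = 20.0, 2.0
for _ in range(200):                     # 20 seconds of simulated control
    correction = ctrl.update(offset)
    offset += (drift - correction) * ctrl.dt
print(f"residual offset after 20 s: {offset:.3f} px")
```

The integral term is what lets the loop cancel the station's steady apparent drift, which a proportional term alone would always lag behind.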
With this program, the finder scope can quickly lock onto a moving "light source" and keep it centered in the field. To test it, Liu Boyang used a laser pointer to sweep a bright dot across his wall at constant speed, standing in for the moving station, and the result was decent:
The program itself is built on a platform called ASCOM.
ASCOM is a software interface standard very widely used in astronomy: it gathers control of all the astronomical equipment (telescope focusing, filter-wheel rotation, opening and closing the camera shutter) into a single piece of software:
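In code, an ASCOM telescope driver is a COM object (on Windows, obtained via win32com.client.Dispatch with the driver's ProgID) that exposes methods such as MoveAxis(axis, rate) from the ITelescopeV3 interface. The sketch below hides the hardware dependency behind a fake driver so the control logic runs anywhere; the rate limit is an assumed value, not from the article:

```python
AXIS_PRIMARY, AXIS_SECONDARY = 0, 1   # ASCOM TelescopeAxes values
MAX_RATE = 4.0                        # deg/s, an assumed mount limit

def drive_axes(telescope, rate_x, rate_y):
    """Send clamped slew-rate corrections to both mount axes.

    `telescope` is any object exposing ASCOM's MoveAxis(axis, rate);
    on Windows it would come from, e.g.,
    win32com.client.Dispatch("ASCOM.Simulator.Telescope").
    """
    clamp = lambda r: max(-MAX_RATE, min(MAX_RATE, r))
    telescope.MoveAxis(AXIS_PRIMARY, clamp(rate_x))
    telescope.MoveAxis(AXIS_SECONDARY, clamp(rate_y))

class FakeScope:
    """Stand-in driver that just records the calls it receives."""
    def __init__(self):
        self.calls = []
    def MoveAxis(self, axis, rate):
        self.calls.append((axis, rate))

scope = FakeScope()
drive_axes(scope, rate_x=9.0, rate_y=-1.5)   # 9 deg/s gets clamped
print(scope.calls)
```

Because every ASCOM-compliant mount speaks the same interface, a script written this way is not tied to one brand of hardware.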
On the hardware side, besides a laptop, the kit includes:
Celestron EdgeHD 11-inch aperture, f/10 focal ratio catadioptric telescope on a CGEM equatorial mount
Canon EOS R5 camera
QHY5III462C camera, used as the guide camera
Thrustmaster T16000M joystick
Of these, the telescope cost about 40,000 yuan; the Canon EOS R5 was rented for two weeks at 2,200 yuan (market price about 25,000 yuan); the 462C camera was under 1,000 yuan; and the joystick was traded from a friend (market price over 500 yuan).
The whole rig came to under 45,000 yuan, and according to Liu Boyang, if the precision requirements are relaxed, a complete setup can be had for under 10,000 yuan.
Then came step two: shoot in the field and actually use the equipment to take high-precision photos of the station.
What he did not expect was that real-world shooting would be harder than imagined; throughout, Liu Boyang "kept fixing bugs by trial and error."
His original target was the Chinese space station, but two consecutive bugs cost him the best windows of two observations.
On March 23, he failed to focus in time and the automatic optical tracking never engaged; on March 27, the finder's field of view, only about 3°, proved too small, the initial acquisition failed, and the automatic tracking again never got going.
With a long wait before the Chinese space station's next visible transit, Liu Boyang fixed the operational problem (widening the finder's field of view to 15°) and decided to "warm up" on the International Space Station, which had several excellent transits coming up.
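The fix makes sense on a back-of-the-envelope check: near culmination of a 400 km overhead pass, the station's apparent rate is roughly 1°/s, so a target crosses a 3° field in about 3 seconds, while a 15° field allows roughly 15 seconds for acquisition. The rate figure is my rough assumption, not from the article:

```python
# How long a target at a given apparent angular rate stays inside a
# finder field of the given width (ignoring the pass geometry).
def crossing_time(fov_deg, rate_deg_per_s=1.0):
    """Seconds the target spends crossing the field of view."""
    return fov_deg / rate_deg_per_s

print(crossing_time(3.0))    # the original 3° finder field
print(crossing_time(15.0))   # after widening to 15°
```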
After changing the "capture" step in the automatic tracking program from automatic to manually triggered, Liu Boyang successfully verified the program's functions on April 2.
Imperfections remained; for example, a software crash wiped out the position calibration data linking the finder and the primary mirror, so Liu Boyang added a feature that records the calibration data.
By this point, the code had grown from the original 400 lines to 600.
Finally, on the evening of April 3, after an emergency bug fix, Liu Boyang successfully captured the International Space Station.
Specifically, the telescope acquires the station on two axes, x and y. After pressing "capture", the y-axis locked onto the target quickly and steadily, while the x-axis took about 10 seconds longer.
By roughly the 30-second mark, both axes were holding within a stable error band (about four pixels). This high-precision tracking lasted 120 seconds in total and recorded the ISS's entire pass from approach to departure:
In the raw frames, the station spans roughly 100 pixels across; multi-frame supersampling then pushed that past 200 pixels.
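That jump from 100 pixels to 200-plus is multi-frame super-resolution: many slightly shifted frames of the same target are accumulated onto a finer grid. A minimal sketch under the simplifying assumption that the half-pixel shifts are already known (real stacking software estimates sub-pixel alignment from the images themselves; the function name and data are illustrative):

```python
def stack_2x(frames_with_shifts, h, w):
    """Accumulate (frame, (dx, dy)) pairs onto a 2x-resolution grid.

    Each frame is an h*w grid of brightness values; dx and dy are the
    frame's known offsets in half-pixel units (0 or 1 here).
    """
    acc = [[0.0] * (2 * w) for _ in range(2 * h)]
    cnt = [[0] * (2 * w) for _ in range(2 * h)]
    for frame, (dx, dy) in frames_with_shifts:
        for y in range(h):
            for x in range(w):
                gy, gx = 2 * y + dy, 2 * x + dx
                acc[gy][gx] += frame[y][x]
                cnt[gy][gx] += 1
    return [[acc[y][x] / cnt[y][x] if cnt[y][x] else 0.0
             for x in range(2 * w)] for y in range(2 * h)]

# Four copies of a 2x2 frame, each shifted by half a pixel, fill every
# cell of the 4x4 output grid:
frames = [([[1, 2], [3, 4]], (dx, dy)) for dx in (0, 1) for dy in (0, 1)]
upscaled = stack_2x(frames, 2, 2)
```

Averaging many frames this way also suppresses the atmospheric jitter that single exposures suffer from.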
After final processing, the output was a series of 300 × 300 pixel images, composited into a GIF:
And that was day 17 of Liu Boyang's project.
Next up: launching a small rocket
Asked about the hardest stage of the whole project, what stuck with Liu Boyang most was getting the telescope to be callable from Python code:
For a lousy programmer like me, the development was a complete black box at first.
Liu Boyang did his undergraduate degree at Peking University and his PhD at the University of Western Australia, both in astrophysics.
The major requires basic programming skills, but Liu Boyang scraped through or deferred the computer-related courses of his undergraduate years, such as Introduction to Computing and Data Structures.
It was during his PhD, when large amounts of data processing had to be scripted, that he began learning programming languages in earnest.
He chose to write the telescope-control code himself this time partly because he could not find ready-made software, and partly because he wanted to keep exercising his programming ability.
Will the code be open-sourced?
When we asked, Liu Boyang said:
At least not until I have debugged it and polished the code to a level I am satisfied with; only then will I consider the next step.
One of his nearest goals is the Chinese space station's next transit, about two weeks away.
Having "warmed up" successfully on the International Space Station, Liu Boyang is full of confidence, and is weighing whether to narrow the field of view somewhat on the next acquisition to push the precision higher.
If the Chinese space station shoot goes well, it will wrap up before April 21. Right after that, he will rush to Qinghai to start a new project: launching a small rocket carrying his own camera.
Looking further ahead, Liu Boyang also mentioned the Shenzhou-series launches and the experimental-module missions possibly coming in the second half of the year; he plans to bring along his station-tracking program and train it on the big rockets.
Such a hardcore "prep" plan could only belong to a die-hard spaceflight enthusiast.
Liu Boyang finally said:
I have been interested in astronomy since childhood, which is why I studied astrophysics from undergrad through my PhD.
But as domestic space missions have multiplied, I have had more and more chances to take part in related activities, and my interest in spaceflight has gradually grown into a major hobby.
Reference link:
[1]https://weibo.com/1144755982/LmV8Cp72V
[2]https://zhuanlan.zhihu.com/p/493080686
[3]https://www.heavens-above.com/orbit.aspx?satid=25544
[4]https://mp.weixin.qq.com/s/gNueq8lDQz_86Ifuw8n6Pg#rd
The post 600 lines of code in 17 days to shoot the International Space Station 400 kilometers away appeared first on Gamingsym.