2022
DOI: 10.1371/journal.pone.0263882
SISPO: Space Imaging Simulator for Proximity Operations

Abstract: This paper describes the architecture and demonstrates the capabilities of a newly developed, physically-based imaging simulator environment called SISPO, developed for small solar system body fly-by and terrestrial planet surface mission simulations. The image simulator utilises the open-source 3-D visualisation system Blender and its Cycles rendering engine, which supports physically based rendering capabilities and procedural micropolygon displacement texture generation. The simulator concentrates on realis…
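The abstract describes the rendering pipeline only at a high level. As an illustration of what scripting such a physically based scene can look like, the following is a minimal, hypothetical sketch using Blender's Python API and the Cycles engine; the camera parameters, sun strength, file paths and object names are placeholder assumptions for illustration, not values taken from SISPO.

```python
# Hypothetical sketch (not SISPO's actual code): setting up a physically based
# fly-by frame with Blender's Python API and the Cycles path tracer.
import bpy
import math

scene = bpy.context.scene
scene.render.engine = 'CYCLES'          # physically based, path-traced renderer
scene.cycles.samples = 256              # illustrative sample count

# Sun lamp approximating solar illumination at the target body
sun_data = bpy.data.lights.new(name="Sun", type='SUN')
sun_data.energy = 1361.0                # assumption: ~solar irradiance at 1 au, W/m^2
sun = bpy.data.objects.new("Sun", sun_data)
scene.collection.objects.link(sun)
sun.rotation_euler = (math.radians(45.0), 0.0, 0.0)

# Camera with explicit intrinsic-like parameters (focal length, sensor size)
cam_data = bpy.data.cameras.new(name="NavCam")
cam_data.lens = 105.0                   # mm, hypothetical narrow-angle camera
cam_data.sensor_width = 18.0            # mm
cam = bpy.data.objects.new("NavCam", cam_data)
scene.collection.objects.link(cam)
cam.location = (0.0, -20.0, 0.0)        # spacecraft position in scene units
scene.camera = cam

# Target shape model as an OBJ mesh; the path is a placeholder
bpy.ops.wm.obj_import(filepath="/path/to/target_shape_model.obj")  # Blender 3.2+ importer

scene.render.resolution_x = 2048
scene.render.resolution_y = 2048
scene.render.filepath = "/tmp/flyby_frame_0001.png"
bpy.ops.render.render(write_still=True)
```

In a SISPO-like pipeline, a script of this kind would presumably be driven frame by frame from the simulated trajectory, lighting geometry and camera model rather than set up once by hand.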

Cited by 10 publications (5 citation statements)
References 44 publications
“…A spacecraft proximity operations guidance strategy that utilizes deep reinforcement learning, a subfield of artificial intelligence, is discussed in [14]. The simulation software SISPO, which has been developed specifically for proximity operations and serves as a tool for visualizing the space environment, is examined in [15]; it provides features for simulating the orbital path, lighting conditions, and the intrinsic camera parameters. The experiments conducted in February 2020 [16,17], in which two spacecraft carried out the first commercial satellite-servicing rendezvous and docking in geostationary Earth orbit (GEO), provided an exceptional opportunity to analyze the dynamics of the vehicles involved.…”
Section: State of the Art
confidence: 99%
“…Standard mission design tools, such as STK and GMAT, do not include photorealistic 3D capabilities, which are necessary to model and develop optical navigation and attitude estimation; such capabilities are available in Blender-based environments such as SISPO [53], FlyBy-Gen [54] and AIS [55]. Furthermore, open-loop control, which takes advantage of optical measurements, requires real-time 3D modelling.…”
Section: Attitude and Navigation System (ANS)
confidence: 99%
“…from 2019-09-13T21:00:00 to 2019-09-14T00:00:00 [12]. Given the dynamic nature of the event camera, the ejection scene depicted in Figure 2 is then animated in Blender, using photorealistic models of the asteroid and the particles. Different textures and reflectance properties are used to reproduce the rubble-pile appearance of Bennu [13] based on the position of the Sun during the Orbital C phase. The simulated ejecta consists of 14 particles with sizes ranging from 1 to 11 cm, according to the average diameters reported in [1].…”
Section: Figure 2 Bennu-fixed Particle Ejection Visualization Based O...
confidence: 99%
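The quoted setup above describes the scene construction only in prose. As a rough illustration of how such a particle field could be scripted, here is a hypothetical Blender Python sketch; the particle count and the 1 to 11 cm size range follow the quote, while the positions, velocities, scene scale and frame range are placeholder assumptions, not the cited authors' values.

```python
# Hypothetical sketch (assumed setup, not the cited authors' code): placing
# centimetre-scale ejecta particles near an asteroid-sized body in Blender
# and keyframing a simple constant-velocity drift for the animated scene.
import bpy
import random

random.seed(0)
NUM_PARTICLES = 14         # per the quoted ejection episode
SCENE_SCALE = 1.0          # assumption: 1 Blender unit = 1 metre

for i in range(NUM_PARTICLES):
    diameter_m = random.uniform(0.01, 0.11)          # 1-11 cm, per the quoted range
    bpy.ops.mesh.primitive_ico_sphere_add(
        radius=0.5 * diameter_m * SCENE_SCALE,
        # Placeholder start positions just above a ~250 m radius body
        location=(random.uniform(-50.0, 50.0), random.uniform(-50.0, 50.0), 260.0),
    )
    particle = bpy.context.active_object
    particle.name = f"ejecta_{i:02d}"

    # Keyframe a crude ballistic drift away from the surface (placeholder speeds)
    velocity = (random.uniform(-0.1, 0.1), random.uniform(-0.1, 0.1), random.uniform(0.05, 0.2))
    particle.keyframe_insert(data_path="location", frame=1)
    particle.location = tuple(c + v * 100.0 for c, v in zip(particle.location, velocity))
    particle.keyframe_insert(data_path="location", frame=101)
```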
“…To support the evaluation of event-based multi-particle tracking, we use the navigation and ancillary information of the OSIRIS-REx mission in simulation as well as reports of notable particle ejection episodes [12] to highlight how an event-based sensor can augment visual data capture and contribute to the mission's scientific objectives. The dynamic scene, composed of asteroid Bennu and several centimetre-size particles ejected from its surface, is first reconstructed with photorealistic computer graphics tools from the point of view of the visiting spacecraft [13]. Dynamic vision sensing is subsequently simulated by emulating the sensitivity and noise characteristics of a real event camera [14,15].…”
Section: Introduction
confidence: 99%
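The last excerpt describes emulating an event camera's sensitivity and noise characteristics from rendered frames. A minimal sketch of one common way to do this (assumed here, not necessarily the cited authors' implementation) is to threshold per-pixel log-brightness changes between successive frames; the contrast threshold and noise level below are illustrative placeholders.

```python
# Hypothetical sketch: converting a sequence of rendered intensity frames into
# dynamic-vision-sensor events by thresholding per-pixel log-brightness changes
# and adding simple Gaussian noise on the change signal.
import numpy as np

def frames_to_events(frames, timestamps, contrast_threshold=0.2, noise_std=0.02, eps=1e-6):
    """frames: (N, H, W) array of linear intensities; timestamps: (N,) seconds."""
    events = []                                   # (t, x, y, polarity) tuples
    log_ref = np.log(frames[0] + eps)             # per-pixel reference log intensity
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_frame = np.log(frame + eps)
        diff = log_frame - log_ref + np.random.normal(0.0, noise_std, frame.shape)
        fired = np.abs(diff) >= contrast_threshold
        ys, xs = np.nonzero(fired)
        for x, y in zip(xs, ys):
            events.append((t, x, y, 1 if diff[y, x] > 0 else -1))
        log_ref[fired] = log_frame[fired]         # reset reference where events fired
    return events
```

Dedicated event-camera simulators model per-pixel timing, refractory periods and noise in far more detail; this sketch only captures the basic contrast-threshold principle.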