In many science fiction films, spaceship computers are controlled by voice commands. In a few years, this could also be the case with real spacecraft and space stations. At least Lockheed Martin, Cisco and Amazon are working on it: as a test, an Amazon Echo speaker is scheduled to fly around the moon in March of this year.
Announcing the current weather, playing music, sending messages, searching the internet or ordering pizza: modern voice assistants can already do a lot very well. This is why they are used more and more as a matter of course, especially by younger people. And perhaps, in the years to come, by astronauts.
At least that’s what the defense and aerospace company Lockheed Martin, the network equipment supplier Cisco and Amazon are planning. As part of the Artemis 1 moon mission planned for March 2022, the three companies want to send a specially adapted Amazon Echo speaker, and with it the Alexa voice assistant, on board the Orion spacecraft on a tour around the moon.
The Artemis 1 mission is uncrewed, but the developers want to use it to check whether and how well voice assistance systems could support and relieve spacecraft crews in their work in the future. To do this, “virtual crew members” will transmit test commands to the assistance system via a radio link. The flight is the first step in a research and development program called Callisto, designed specifically to demonstrate the suitability and adaptability of Amazon’s Alexa and Cisco’s Webex video conferencing platform for such a deployment.
“We want to demonstrate that this technology can help astronauts deal with the unique interfaces [of spaceflight technology] and make their work easier, safer and more efficient,” says Rob Chambers from Lockheed Martin. In fact, classic controls such as switches and rotary dials, but also modern touchscreens, can pose a challenge for astronauts, since weightlessness impairs hand-eye coordination and, after a longer stay in space, also affects eyesight.
Active use of Alexa by astronauts is not yet planned
Callisto’s Amazon and Cisco technology is initially intended to allow astronauts to use voice commands to establish a connection with the ground team, control the lighting in the capsule, run diagnostics or retrieve data, for example. “One way to use Callisto would be to say, ‘Alexa, what’s the average temperature of all the batteries and what’s the peak temperature?’ Then she’ll do the data processing,” says Chambers. Almost like what science fiction fans know from TV series like Star Trek.
In the long term, the developers can certainly imagine handing over control of a spaceship’s propulsion and steering systems to voice assistants, at least in part. Astronauts could then speak reentry angles instead of typing them in. “Our engineers, along with the Lockheed Martin and Cisco teams, have done a lot of work to ensure Alexa works well in this very demanding environment,” Aaron Rubenson of the Amazon Alexa team told SpaceNews. For example, unlike with an Alexa at home, the speech analysis via artificial intelligence must be carried out on a local computer instead of on a server in the cloud.
Active use of Alexa by astronauts on a space mission is currently not planned, neither on an Artemis mission nor on any other NASA mission. Still, the developers can envision Alexa eventually being integrated into space stations such as the Lunar Orbital Platform-Gateway or into lunar vehicles. For now, it is only a matter of checking whether and how the current technology performs in the environment of a space capsule, and whether it works safely and comprehensibly there. Because, of course, astronauts should be able to trust the technology and be sure that scenes like those from 2001: A Space Odyssey will not occur.
Written by Michael Fortsch