Today, Amazon and Lockheed Martin announced that Alexa will join NASA’s upcoming Artemis I mission as part of Callisto, a technology demonstration payload embedded into the Orion spacecraft and built in collaboration with engineers from Amazon, Cisco and Lockheed Martin.
NASA’s Orion spacecraft was built by Lockheed Martin to meet America’s ambition of deep space exploration, from the Moon to Mars and beyond. Although Artemis I is an uncrewed mission, it is an important step toward future crewed missions.
The Alexa integration in Callisto combines custom, space-grade hardware from Lockheed Martin with acoustic and audio processing software from Amazon, helping us explore how Alexa can support future crewed missions to space. Alongside the mission, we’re adding new ways for customers on Earth to follow along with Alexa and learn more about the Artemis program, helping further NASA’s goal of engaging the American public as it enters an era of deep space exploration.
Here are five ways scientists, astronauts, students and the broader community are coming together to enable a world of ambient computing in space, and on Earth.
Operators in mission control can ask Alexa mission-specific questions.
In addition to interacting with Alexa from mission control, customers around the world will be able to ask Alexa about her time in space and get updates on the mission from their homes. On devices with screens, such as Echo Show and Fire TV, Alexa will display mission imagery and telemetry so customers can follow along. People can join the mission by saying, “Alexa, take me to the moon.”
During the mission, Alexa will connect to Mission Control using the Deep Space Network. The bandwidth available to Alexa via this network is equivalent to a dial-up connection, so the Alexa integration in Callisto leverages Local Voice Control, which allows Alexa to process voice commands locally, rather than sending information to the cloud.
Callisto features novel technology that allows Alexa to function without an internet connection, allowing future astronauts to access specific functions without the latency associated with cloud-based interactions. The experience is based on the same technology that powers offline Alexa experiences on Echo devices or Alexa-enabled vehicles.
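To make the local-versus-cloud tradeoff concrete, here is a minimal sketch of local-first command routing. This is an illustration of the general pattern only; the intent names and the `route_command` function are invented for the example and do not reflect Amazon’s actual Local Voice Control implementation.

```python
# Hypothetical sketch: handle known commands entirely onboard, and only
# fall back to the slow Deep Space Network link for everything else.
# Intent names and phrases are illustrative, not Amazon's actual API.

LOCAL_INTENTS = {
    "how fast is orion traveling": "telemetry.velocity",
    "what's the temperature in the cabin": "telemetry.cabin_temp",
    "turn on the cabin lights": "device.lights_on",
}

def route_command(utterance: str, cloud_available: bool) -> str:
    """Route a voice command to local processing or the cloud link."""
    key = utterance.lower().strip("?.! ")
    if key in LOCAL_INTENTS:
        # Processed onboard: no round trip over the Deep Space Network.
        return f"local:{LOCAL_INTENTS[key]}"
    if cloud_available:
        # Deferred over the (dial-up-speed) link back to Earth.
        return "cloud:forwarded"
    return "error:unavailable"
```

The key design point the passage describes is exactly this split: latency-sensitive commands never leave the spacecraft, while general queries can tolerate the slow link.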
The Callisto hardware measures roughly 1.5 feet by 1 foot, with a depth of approximately 5 inches, and engineers at Amazon and Lockheed Martin had to work within those physical constraints to get Alexa onboard. For example, equipment developed for the mission had to be resilient to extreme shocks and vibrations. It also had to be at least minimally resistant to radiation in space, and use highly specific, custom-built components such as power and data cables.
Alexa for Astronauts will offer live virtual tours from Johnson Space Center, and provide students a first-hand look at the virtual crew experience and other facilities around mission control. Alexa for Astronauts will also include STEM curriculum powered by MIT App Inventor.
MIT App Inventor is an intuitive, visual programming environment that allows participants to build fully functional apps for Android and iOS smartphones and tablets. Those new to MIT App Inventor can have a simple first app up and running in less than 30 minutes. The blocks-based tool facilitates the creation of complex, high-impact apps in significantly less time than traditional programming environments.
Alexa for Astronauts has been designed in collaboration with the National Science Teaching Association and Mobile CSP, allowing educators to dive deeper into computer science learning, skill development for Alexa, and the Artemis I mission within their classrooms.
Dave Limp, Amazon’s senior vice president for devices & services, has spoken about how Alexa was in part inspired by the Star Trek computer.
“The bright light, the shining light that’s still many years away, many decades away, is to recreate the Star Trek computer. That computer, you could be anywhere on the Starship Enterprise and you could say the word ‘computer’ and it would wake up and answer any question, and that’s our goal,” he said.
We envision a future in which astronauts could turn to an onboard AI for information, assistance and companionship, and we’ve added new capabilities to bring that experience to life. Offline features include the ability to answer questions about mission telemetry – things like “Alexa, how fast is Orion traveling?” or “Alexa, what’s the temperature in the cabin?” – and the ability to control connected devices onboard the spacecraft.
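A telemetry question like the ones above could, in principle, be answered from readings the spacecraft already holds onboard. The sketch below is purely illustrative: the sensor names and values are invented for the example, not actual Orion telemetry.

```python
# Illustrative sketch only: answering telemetry questions from cached
# onboard readings. Keys and values here are invented example data.

def answer_telemetry(question: str, telemetry: dict) -> str:
    """Match a question against onboard readings and phrase a reply."""
    q = question.lower()
    if "how fast" in q:
        return f"Orion is traveling at {telemetry['velocity_mph']:,} miles per hour."
    if "temperature" in q:
        return f"The cabin temperature is {telemetry['cabin_temp_f']} degrees Fahrenheit."
    return "I don't have that reading onboard."

readings = {"velocity_mph": 24500, "cabin_temp_f": 72}  # example values
print(answer_telemetry("Alexa, how fast is Orion traveling?", readings))
# prints: Orion is traveling at 24,500 miles per hour.
```

Because the lookup never touches the network, a response like this would be immediate even with no link back to Earth.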
Using the Deep Space Network, Alexa can also retrieve information from back on Earth, from news briefings to sports scores, helping astronauts stay connected during long missions to deep space. These features will help make life simpler and more efficient for those on board the spacecraft, especially when they’re buckled in or occupied with other tasks during the mission.