Running a data center on Earth has its challenges, as we all know: electricity, heat, clean-room requirements, air conditioning, and humidity all have to be managed.
How would these electronics fare in very different environments, like in space or on the Moon?
For the time being, PrimeArray has you covered here on planet Earth, but let’s explore the future…
If you haven’t followed the New Space Age closely, the next few decades may knock your socks off.
Anticipated are a permanent presence on the Moon, a commercial space station, private citizens traveling to orbit, space-based medical treatments, deep-space travel, and a flurry of activity involving mining, manufacturing, and exploiting resources in space. No one can be certain of the future, but if even a fraction of these projects come to pass, they will
rely on large amounts of data. Data will be stored, accessed, and
processed in space. And as data finds a home in space, we may also see more of Earth’s data moving into orbit.
How would a space data center work?
Man could never have broken orbit without data storage, although he didn’t need much compared to present computing standards. The guidance computer on board Apollo 11, for example, needed only 4 KB of RAM and 32 KB of fixed storage to land Neil Armstrong and Buzz Aldrin on the lunar surface. An Apple Watch Series 7, by comparison, has 1 GB of RAM and 32 GB of storage.
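To put that gap in perspective, a quick back-of-the-envelope calculation using the figures quoted above (binary units assumed, i.e. 1 GB = 1024³ bytes):

```python
# Memory and storage of the Apollo Guidance Computer vs. an Apple Watch
# Series 7, using the figures quoted above (binary units assumed).
apollo_ram = 4 * 1024            # 4 KB of RAM, in bytes
apollo_storage = 32 * 1024       # 32 KB of fixed storage, in bytes
watch_ram = 1 * 1024**3          # 1 GB of RAM, in bytes
watch_storage = 32 * 1024**3     # 32 GB of storage, in bytes

ram_ratio = watch_ram // apollo_ram              # how many times more RAM
storage_ratio = watch_storage // apollo_storage  # how many times more storage
print(ram_ratio, storage_ratio)
```

The watch on an astronaut’s wrist carries roughly a quarter-million times the RAM, and a million times the storage, that guided Apollo 11 to the surface.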
That’s not to say that space-bound computers haven’t caught up with their terrestrial cousins: the ISS’s supercomputer, for example, can operate in harsh environments and perform edge computing at teraflop speeds.
PrimeArray predicts that the future of space computing is less likely to focus on raw computing power than on distributed storage.
Developing such a computer takes years, as does testing. Missions are also planned years in advance. By the time such a computer gets to space, it's woefully out of date.
Space doesn’t like SSDs
HPE’s first Spaceborne Computer carried 20 solid-state disk drives, of which nine failed over the course of the mission. Among its Earth-based twins, only one drive failed.
This time, NASA also wanted a system that would last for at least three years – the length of time it would take to go to Mars and back. So HPE doubled the hardware; now there are four servers total, two in each locker.
Spaceborne Computer-2 includes the off-the-shelf HPE Edgeline Converged EL4000 Edge System, a rugged server designed to perform in harsher edge environments with higher shock, vibration, and temperature levels. It's paired with the industry standard HPE ProLiant DL360.
"The Edgeline 4000 includes a GPU, so we can do AI, machine learning, and image processing."
Data Centers in Space – Stage 3: The Ultimate Data Center in Space
The deployment of mega-constellations and smallsats, placing a sheer number of sensors in low Earth orbit (LEO), is driving the need for satellite data centers to unleash the power of these emerging platforms.
In a three-part series, we will cover how space systems are following in the footsteps of terrestrial systems and will evolve to enable AI at the edge of space in Space-IoT™ satellites:
Stage 1: Space data collection
With thousands of LEO satellites in orbit collecting data, a few space data centers in LEO or MEO will collect big data for AI model generation. The sheer number of satellites sending data acts as a multiplier, requiring the data center to accept data at high speed into a temporary buffer before it is committed to long-term storage. This data is then sent to Earth for AI model generation.
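The buffer-then-commit flow described above can be sketched in a few lines. This is a hypothetical illustration, not flight software; the class and field names are invented for the example:

```python
from collections import deque

class IngestBuffer:
    """Hypothetical sketch: accept bursts of satellite packets at link
    speed, then drain them to long-term storage in batches."""
    def __init__(self, capacity=1024):
        self.buffer = deque(maxlen=capacity)  # temporary buffer; drops oldest on overflow
        self.committed = []                   # stands in for long-term storage

    def receive(self, packet):
        # Fast path: append only, so no storage latency on the hot path.
        self.buffer.append(packet)

    def commit(self, batch_size=256):
        # Slow path: move buffered packets to long-term storage in batches.
        while self.buffer:
            n = min(batch_size, len(self.buffer))
            batch = [self.buffer.popleft() for _ in range(n)]
            self.committed.extend(batch)

buf = IngestBuffer()
for i in range(1000):                     # a burst from many satellites
    buf.receive({"sat_id": i % 50, "payload": bytes(8)})
buf.commit()                              # drain to long-term storage
print(len(buf.committed))
```

Decoupling the fast receive path from the slower commit path is what lets the data center absorb the multiplier effect of many satellites transmitting at once.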
Stage 2: Security and latency of data in space and in transit
The stored data may be in the public domain or may be a national asset. The data in these space data centers needs to be secured and then transmitted back to Earth (and, in the long run, to data centers in GEO) using either traditional RF communication systems or lasers.
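As a minimal sketch of the “secured before transmission” requirement, the fragment below authenticates a downlink frame with HMAC-SHA256 from Python’s standard library, so the receiver can detect tampering in transit. A real system would also encrypt the payload and manage keys properly; the key and function names here are invented for illustration:

```python
import hashlib
import hmac

# Assumed pre-shared key between the space data center and the ground
# station (key distribution is out of scope for this sketch).
SECRET_KEY = b"shared-ground-station-key"

def seal(frame: bytes) -> bytes:
    # Prepend a 32-byte HMAC-SHA256 tag to the frame before downlink.
    tag = hmac.new(SECRET_KEY, frame, hashlib.sha256).digest()
    return tag + frame

def verify(sealed: bytes) -> bytes:
    # Recompute the tag on receipt; reject the frame if it was altered.
    tag, frame = sealed[:32], sealed[32:]
    expected = hmac.new(SECRET_KEY, frame, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("frame failed authentication")
    return frame

sealed = seal(b"LEO sensor telemetry")
print(verify(sealed))
```

The same sealing step works whether the link back to Earth is RF or laser; the physical layer changes, but the integrity check does not.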
Stage 3: The ultimate data center in space
Data centers move into space to mitigate power consumption and pollution
The exponential increase in computing may mean that extraterrestrial processing and storage facilities will soon be needed.
For big problems, extraordinary solutions are needed. In light of the exponential increase in data volumes and computing, the European Commission thinks it has come up with just such a solution to reduce the energy expenditure and pollution produced by data centers. These facilities account for 10% of global energy consumption and “4% of the greenhouse gases produced by human activity, slightly higher than the global aerospace industry,” according to the University of Quebec’s College of Technology (Canada). The European Union has selected Thales Alenia Space, a joint venture between the French technology corporation Thales Group (67%) and the Italian defense conglomerate Leonardo (33%), to study the feasibility of the ASCEND (Advanced Space Cloud for European Net zero emissions and Data sovereignty) program.
According to PrimeArray “digital infrastructures as a whole account for a substantial part of global energy consumption, and have a significant carbon footprint. In Europe, the Middle East, and Africa alone, data centers consume more than 90 terawatt hours a year, and produce emissions equivalent to 5.9 million vehicles (27 million tons of CO₂).”
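A quick sanity check of the quoted vehicle equivalence:

```python
# 27 million tons of CO2 spread over the equivalent of 5.9 million
# vehicles, per the figures quoted above.
emissions_tons = 27_000_000
vehicles = 5_900_000
tons_per_vehicle = emissions_tons / vehicles
print(round(tons_per_vehicle, 2))  # 4.58
```

That works out to roughly 4.6 tons of CO₂ per vehicle per year, in line with typical annual passenger-car emissions, so the two figures are consistent with each other.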
Some companies are tackling this problem by turning to carbon-free energy sources. Google Cloud recently signed a deal to build a 149-megawatt solar plant located near the city of Toro in northwest Spain and expects that it will provide 90% of the energy needed to power its Madrid-area facilities and other offices in Spain within three years. Similar Google Cloud facilities are already operating in Finland, the United States, and Canada. Amazon’s logistics center in Seville (Spain) has 13,300 solar panels with the capacity to generate 5.26 megawatts, the company’s largest such facility in Europe.
Space Is the Final Frontier for Data Centers
Last year marked the first time humanity deployed a conventional data center in space. The HPE Spaceborne Computer-2 – a set of HPE Edgeline Converged EL4000 Edge and HPE ProLiant machines, each with an Nvidia T4 GPU to support AI workloads – was sent to the International Space Station in February 2021.
This is the first off-the-shelf server deployed in space to run actual production workloads.
Time for a hardware refresh
Elsewhere in space – on Mars landers, in satellites, in space station control systems – most of the computers are decades old.
The ISS itself runs on Intel 80386SX CPUs that date back to the late 1980s. There are also more than a hundred laptops on the ISS, as well as tablets and other devices. They are used as remote terminals to the command-and-control multiplexer/demultiplexer computers, as well as for email, Internet access, and recreation.
Key systems run on hardened hardware that is protected against radiation. That means they use either redundant circuits or insulating substrates (such as silicon-on-insulator) in place of the usual bulk semiconductor wafers.
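The “redundant circuits” approach is commonly implemented as triple modular redundancy (TMR): three copies of a computation run independently, and a voter takes the bitwise majority, masking a single radiation-induced upset. A minimal sketch of the voting step:

```python
# Sketch of triple modular redundancy (TMR): three redundant copies of a
# value are combined by a bitwise majority vote, so a single-event upset
# in one copy is masked.

def majority_vote(a: int, b: int, c: int) -> int:
    # Each output bit takes the value held by at least two of the inputs.
    return (a & b) | (a & c) | (b & c)

healthy = 0b1011_0010
upset = healthy ^ 0b0000_1000   # a radiation strike flips one bit in one copy
print(bin(majority_vote(healthy, healthy, upset)))
```

Because two of the three copies still agree, the voter reproduces the correct value; only a simultaneous fault in two copies of the same bit would slip through.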
The invention of selector devices delivers MRAM dense enough to enable true high-density data centers that can store data from LEO and MEO for AI model generation in space, without going back to Earth. Space becomes autonomous and independent of terrestrial support, eliminating the need for a link to Earth. This is essential if we are to deploy a similar model around the Moon and Mars, where backhauling to Earth will not be an option.