Why the Perseverance Rover computer is so "outdated"

Fabioo

In the wake of the Perseverance rover landing I was nerding out on some stuff and I stumbled upon the rover's computer specs. As you can see from the screenshot, they look pretty "outdated". (Source: https://mars.nasa.gov/mars2020/spacecraft/rover/brains/)
Don't get me wrong, I'm sure that hardware is state-of-the-art technology and space operations have different requirements than our average gaming PC, but I imagined that due to all its sensors, real-time maneuvering capabilities, machine learning and so on, the computer would be a beast. Something multicore with a little more processing power than 200 megahertz. I know it must be energy efficient, but I might be wrong; even my router is more powerful than that and consumes little electricity.

Not to mention the storage. I'm no space explorer and have no idea how it works, but since just the landing phase was expected to register more than 28,000 images (source: https://www.researchgate.net/publication/346537343_The_Mars_2020_Engineering_Cameras_and_Microphone_on_the_Perseverance_Rover_A_Next-Generation_Imaging_System_for_Mars_Exploration page 31), doesn't that storage seem a little bit odd?
 

So to sum it all up, I'm intrigued by all this, and I'm sure I'm not the only one. Every computer and space nerd out there might be intrigued as well.

I think we probably have someone in our community who has the knowledge to clear that up, so if anyone can shed some light on this topic, that would be greatly appreciated.


Maybe if we raise enough awareness about this topic the mighty LS can help us out on that too.

Cheers,

 

Screen Shot 2021-02-21 at 08.07.08.png


3 minutes ago, Fabioo said:

Don't get me wrong, I'm sure that hardware is state-of-the-art technology and space operations have different requirements than our average gaming PC, but I imagined that due to all its sensors, real-time maneuvering capabilities, machine learning and so on, the computer would be a beast.

The rover mostly just has to collect data. All of the heavy processing will happen on Earth.

 

And yeah, it does have different requirements:

  • Be as reliable as possible
  • Use as little power as possible
  • Be resistant to space, radiation etc.

Using bleeding edge technology is no good if you need something that is ultra reliable, because you haven't had decades to fix all the kinks. And smaller process nodes are much more susceptible to stuff like radiation and high energy particles. You don't have a magnetic shield on Mars like you do on Earth.
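One classic way spacecraft electronics cope with radiation-flipped bits is triple modular redundancy (TMR): run the same computation on three copies and majority-vote the results, so a single corrupted copy gets outvoted. A toy sketch of the voting step (illustrative only, not actual flight code):

```python
def majority(a, b, c):
    # Bitwise 2-of-3 vote: a bit is set in the result only if it is
    # set in at least two of the three redundant copies.
    return (a & b) | (a & c) | (b & c)

# Three redundant copies of the same sensor word; the third copy
# suffered a simulated single-event bit flip, and gets outvoted.
readings = [0b1011, 0b1011, 0b1111]
voted = majority(*readings)
```

The same idea scales up to whole redundant computers cross-checking each other, which is part of why "slow but triplicated" beats "fast but single".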

Remember to either quote or @mention others, so they are notified of your reply


It's not meant to run Doom.

6 minutes ago, Fabioo said:

Maybe if we raise enough awareness about this topic the mighty LS can help us out on that too.

What...?

 

It's purpose-built; it's not meant to compete with what you have at home.


1 minute ago, lewdicrous said:

It's not meant to run Doom.

 

It's purpose-built; it's not meant to compete with what you have at home.

No shit!!!! Wow, that is just mind boggling. I was sure engineers spent billions of dollars to play Doom on Mars...


4 minutes ago, Eigenvektor said:

The rover mostly just has to collect data. All of the heavy processing will happen on Earth.

 

I get your points, but I'm not sure if you're right on that one. The landing phases all had to be done autonomously, with no Earth input whatsoever. All the sensors onboard the landing craft used machine learning (on the rover and on the landing crane), so plenty of processing happened on the rover.

Quote

And smaller process nodes are much more susceptible to stuff like radiation and high energy particles. 

Doesn't shielding mitigate that?


12 minutes ago, Fabioo said:

In the wake of the Perseverance rover landing I was nerding out on some stuff and I stumbled upon the rover's computer specs. As you can see from the screenshot, they look pretty "outdated". (Source: https://mars.nasa.gov/mars2020/spacecraft/rover/brains/)

[…]

Well, seeing as the rover hasn't left low earth orbit and the entire thing is a scam, I'm sure it really doesn't matter too much what onboard systems it has.

😀🤞👍🤦‍♂️


4 minutes ago, Fabioo said:

Doesn't shielding mitigate that?

Shielding means extra weight. The less dead weight you have to carry into space, the more room you have for scientific instruments.

 

The source you quoted actually mentions this: "radiation-hardened CPU". So this is one of the reasons. The source also mentions that it is 10x faster and has 8x the storage of previous rovers. So I don't think this is bad hardware, considering what could be done with previous rovers already.

 

4 minutes ago, Fabioo said:

I get your points, but I'm not sure if you're right on that one. The landing phases all had to be done autonomously, with no Earth input whatsoever. All the sensors onboard the landing craft used machine learning (on the rover and on the landing crane), so plenty of processing happened on the rover.

And apparently the rover's systems were good enough for that.


I think one of the major issues is actually space radiation. Down here on Earth we are protected by our lovely atmosphere and the magnetic shielding that the Earth so nicely provides. In space (and on Mars as well, as far as I know) there is no such shielding and very little atmosphere to protect from the radiation. That means there is a lot of it floating around, actually messing with electronics. Thing is, the smaller you make stuff (which you need to do to make it more powerful while still being energy efficient), the more likely it is to be affected by interfering radiation. While you can of course add shielding to your electronics, that adds a lot of weight, which is not something you want on a mission to Mars. You also need the stuff to be super reliable so it doesn't crap out just as you are about to enter Mars' atmosphere and then crash-land because some bits were flipped at the wrong time, and now your multi-billion-dollar mission just became a useless scrap of metal hurtling down to Mars with no means of stopping in time.

I am no expert on any of this, but I do recall hearing the same reasoning for why the computers on the ISS and the Space Shuttle were also very 'low end' compared to what was actually available at the time.


Design considerations for spacecraft are significantly different from anything designed for Earth. That rover is pretty much an input and output device: it gets data, it sends data. No major data processing needs to be done on the rover itself.

mY sYsTeM iS Not pErfoRmInG aS gOOd As I sAW oN yOuTuBe. WhA t IS a GoOd FaN CuRVe??!!? wHat aRe tEh GoOd OvERclok SeTTinGS FoR My CaRd??  HoW CaN I foRcE my GpU to uSe 1o0%? BuT WiLL i HaVE Bo0tllEnEcKs? RyZEN dOeS NoT peRfORm BetTer wItH HiGhER sPEED RaM!!dId i WiN teH SiLiCON LotTerrYyOu ShoUlD dEsHrOuD uR GPUmy SYstEm iS UNDerPerforMiNg iN WarzONEcan mY Pc Run WiNdOwS 11 ?woUld BaKInG MY GRaPHics card fIX it? MultimETeR TeSTiNG!! aMd'S GpU DrIvErS aRe as goOD aS NviDia's YOU SHoUlD oVERCloCk yOUR ramS To 5000C18

 


31 minutes ago, Fabioo said:

All the sensors onboard the landing craft used machine learning (on the rover and on the landing crane), so plenty of processing happened on the rover.

Another point about that: Using a program made with machine learning doesn't require that much processing power.

 

What does require a lot of power is training the algorithm in the first place, but that is not something that happens on the rover in real time. Training an AI means you have to repeat a scenario over and over again using different parameters. Not something you can do in a once-in-a-lifetime landing that has to be perfect. The rover can't learn how to land as it goes along; it already has to know how to do it. So it doesn't do anything except apply a pre-made algorithm.

 

~edit: Take a look at Nvidia Jetson for example: https://developer.nvidia.com/buy-jetson

These are tiny boards made to run AI, some of which require as little as 5 watts. Here's the catch: these boards can't be used to train a neural network. They simply run an existing neural network. Training is where the beefy GPU comes in.
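To see why inference is the cheap part: running a trained network is just a fixed sequence of multiply-adds through frozen weights. A minimal pure-Python sketch, where the weights are invented placeholders standing in for a "pre-trained" model:

```python
# Inference = one fixed forward pass through frozen weights.
# No ML library needed; the numbers below are made up for illustration.
def relu(x):
    return x if x > 0 else 0.0

def forward(x, weights, biases):
    # Each layer: matrix-vector multiply, add bias, apply activation.
    for W, b in zip(weights, biases):
        x = [relu(sum(w * xi for w, xi in zip(row, x)) + bi)
             for row, bi in zip(W, b)]
    return x

# Hypothetical 2-input -> 3-hidden -> 1-output network
weights = [[[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]],
           [[1.0, 0.5, -0.5]]]
biases = [[0.0, 0.1, 0.0], [0.2]]
out = forward([1.0, 2.0], weights, biases)  # a handful of multiply-adds
```

Training would mean running this forward pass plus a backward pass millions of times over a dataset; the rover only ever needs single passes like the one above.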


39 minutes ago, Fabioo said:

So if anyone can shed some light on this topic that would be greatly appreciated. 

Older technology with larger nodes is more resistant to radiation. A lot of satellites still run on a 486.

 

And of course there's the power consumption concern; you don't want any more hardware than you need on something like this.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


42 minutes ago, Eigenvektor said:

So I don't think this is bad hardware, considering what could be done with previous rovers already.

 

Just to be clear, I never said the hardware is bad. I have no idea how space exploration works, so I'm not the one to judge that.
I'm just wondering why it is so different, since we have so much faster hardware these days.


As others have said, the fact that it needs to be super ultra mega reliable means tried and tested hardware. It also doesn't need an i9-10900K to do the relatively basic tasks that need to happen. Plus, there is possibly a lot of room for hardware optimization, as the things it needs to do are well defined, leading to effectively lower hardware requirements.

 

Another point that I have not yet seen mentioned is the time these development cycles take. It's not like we decide to go to Mars next year and start building a rover. Space missions are planned many, many years ahead. There are many design and validation phases before they even start building the thing, and the whole concept-to-mission process can easily take a decade or two. I had to look it up, but Perseverance started in 2012. So not only will you use reliable, "outdated" hardware, you are looking at using reliable and outdated hardware from the point of view of 2012.

 

 

53 minutes ago, lewdicrous said:

It's not meant to run Doom.

Yet it probably could 😛 Given DOOM's 66 MHz processor and couple-dozen-MB storage requirements. My god that would be an epic joke to me haha. Ultimate immersion by sending a rover to Mars, to play DOOM on the planet.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


24 minutes ago, Eigenvektor said:

Another point about that: Using a program made with machine learning doesn't require that much processing power.

....

Good point


2 minutes ago, tikker said:

My god that would be an epic joke to me haha.  Ultimate immersion by sending a rover to Mars, to play DOOM on the planet.

Doom with augmented reality!!


2 minutes ago, lewdicrous said:

Doom with augmented reality!!

Actual footage minutes after landing:

Spoiler

7ro5v1cwmbe11.jpg

 

 


1 hour ago, Maury Sells Wigs said:

Well, seeing as the rover hasn't left low earth orbit and the entire thing is a scam, I'm sure it really doesn't matter too much what onboard systems it has.

😀🤞👍🤦‍♂️

pardon me?


4 minutes ago, ki8aras said:

pardon me?

Don't feed the trolls m8. 

Gaming HTPC:

R5 5600X - Cryorig C7 - Asus ROG B350-i - EVGA RTX2060KO - 16gb G.Skill Ripjaws V 3333mhz - Corsair SF450 - 500gb 960 EVO - LianLi TU100B


Desktop PC:
R9 3900X - Peerless Assassin 120 SE - Asus Prime X570 Pro - Powercolor 7900XT - 32gb LPX 3200mhz - Corsair SF750 Platinum - 1TB WD SN850X - CoolerMaster NR200 White - Gigabyte M27Q-SA - Corsair K70 Rapidfire - Logitech MX518 Legendary - HyperXCloud Alpha wireless


Boss-NAS [Build Log]:
R5 2400G - Noctua NH-D14 - Asus Prime X370-Pro - 16gb G.Skill Aegis 3000mhz - Seasonic Focus Platinum 550W - Fractal Design R5 - 
250gb 970 Evo (OS) - 2x500gb 860 Evo (Raid0) - 6x4TB WD Red (RaidZ2)

Synology-NAS:
DS920+
2x4TB Ironwolf - 1x18TB Seagate Exos X20

 

Audio Gear:

Hifiman HE-400i - Kennerton Magister - Beyerdynamic DT880 250Ohm - AKG K7XX - Fostex TH-X00 - O2 Amp/DAC Combo - 
Klipsch RP280F - Klipsch RP160M - Klipsch RP440C - Yamaha RX-V479

 

Reviews and Stuff:

GTX 780 DCU2 // 8600GTS // Hifiman HE-400i // Kennerton Magister
Folding all the Proteins! // Boincerino

Useful Links:
Do you need an AMP/DAC? // Recommended Audio Gear // PSU Tier List 


1 hour ago, Fabioo said:

Doesn't shielding mitigate that?

Not always.

If some basic shielding were enough, there wouldn't be specialized fabs that can produce radiation-safe silicon.

If radiation is already a problem on Earth (ECC memory is used to counter the flipping of bits due to radiation from space), how big will it be in space?
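For a feel of how ECC-style correction catches those flips, here's a sketch of the classic Hamming(7,4) code: 4 data bits plus 3 parity bits, enough to locate and fix any single flipped bit. Real ECC memory uses wider SECDED codes in hardware; this is just the idea:

```python
# Hamming(7,4): any single flipped bit can be located (via the parity
# "syndrome") and flipped back.
def encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers codeword positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]   # 1-based positions 1..7

def correct(c):
    # Each check covers positions whose index has that bit set; the
    # failing checks sum to the 1-based position of the bad bit.
    syndrome = 0
    for p in (1, 2, 4):
        parity = 0
        for i in range(1, 8):
            if i & p:
                parity ^= c[i - 1]
        if parity:
            syndrome += p
    if syndrome:
        c[syndrome - 1] ^= 1   # flip the offending bit back
    return (c[2], c[4], c[5], c[6])       # recovered data bits

word = encode([1, 0, 1, 1])
word[4] ^= 1                   # simulate a radiation-induced bit flip
assert correct(word) == (1, 0, 1, 1)
```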

20 minutes ago, tikker said:

Another point that I have not yet seen mentioned is the time these development cycles take. It's not like we decide to go to Mars next year and start building a rover. Space missions are planned many, many years ahead. There are many design and validation phases before they even start building the thing, and the whole concept-to-mission process can easily take a decade or two. I had to look it up, but Perseverance started in 2012. So not only will you use reliable, "outdated" hardware, you are looking at using reliable and outdated hardware from the point of view of 2012.

That's another important point. The development of this stuff takes a lot of time.


17 minutes ago, Fabioo said:

Just to be clear, I never said the hardware is bad. I have no idea how space exploration works, so I'm not the one to judge that.
I'm just wondering why it is so different, since we have so much faster hardware these days.

Here's an article that mentions the key points: The technology aboard the Mars rover Perseverance: An inside look

Quote

The closer you pack your transistors, the more susceptible to radiation you get […] With space hardware, you need high reliability, and the RAD750 has had a couple of hundred missions in space

And, as @tikker said, the mission started over a decade ago. You're not going to use the latest stuff that just came out, because that would mean redoing all the tests and validation steps. If you discover a critical flaw in the CPU after launch, you can't just swap it out for a different one like you could here on Earth. So you're going to make absolutely sure the stuff you're working with has been tested and tested again to be as reliable as possible.

 

1 minute ago, Drama Lama said:

If some basic shielding were enough, there wouldn't be specialized fabs that can produce radiation-safe silicon.

This. Some high-energy particles you can't really shield against, plus you'd need a (literal) ton of shielding. All that extra weight means you lose out on other payload. Shielding doesn't help you do science, so it's not something you want to carry along unless you absolutely have to.


On a similar note, someone once asked why better cameras aren't used in space. The short answer was that the limit is the data path for getting that data back to Earth. They only have so much bandwidth, so generating much more data than that doesn't help. There's also a factor of: they have a camera and they know it works.
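A quick back-of-envelope shows why the data path dominates. Every figure below is hypothetical, purely to show the shape of the math, not an actual Mars-relay number:

```python
# All numbers invented for illustration.
pass_rate_bps = 2e6            # assumed relay pass data rate, bits/s
pass_seconds = 8 * 60          # assumed usable seconds per orbiter pass
budget_mb = pass_rate_bps * pass_seconds / 8 / 1e6   # MB per pass
image_mb = 20e6 * 2 / 1e6      # assumed 20 MP sensor at 2 bytes/pixel
# With these numbers: ~120 MB per pass and ~40 MB per raw image, so a
# sharper sensor mostly just makes the downlink queue longer.
```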

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 hour ago, Fabioo said:

I get your points, but I'm not sure if you're right on that one. The landing phases all had to be done autonomously, with no Earth input whatsoever. All the sensors onboard the landing craft used machine learning (on the rover and on the landing crane), so plenty of processing happened on the rover.

Nah, the EDL phase was pretty much a script. They did all the hard work on Earth before it ever made it into space. Once the landing site was finalised, they used the satellites they have orbiting Mars to take as many images as possible of the landing crater, then they terrain-mapped it all back on Earth (they used different colours to designate different terrains). Then it's a simple case of having the lander scan the ground as it descends and making sure it aims for the right colour on its map. The lander did have some "AI" built into it. For example, at high altitude it had to be able to pick the best area without seeing the ground properly, and it had the ability to override its own choice as it descended, up to a certain altitude. However, it was simply a script. A VERY complex one, but still a script.
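That "aim for the right colour on the map" step can be sketched as a simple lookup over a pre-computed hazard grid; all the expensive map-making happened on Earth beforehand. The grid and numbers here are invented for illustration:

```python
# Hypothetical toy hazard map of the landing area: lower = safer terrain.
hazard_map = [
    [9, 7, 3, 8],
    [6, 2, 5, 9],
    [8, 4, 1, 7],
]

def safest_cell(hazards, pos, reach):
    """Pick the lowest-hazard grid cell within `reach` cells of `pos`."""
    r0, c0 = pos
    best = None
    for r, row in enumerate(hazards):
        for c, h in enumerate(row):
            if abs(r - r0) <= reach and abs(c - c0) <= reach:
                if best is None or h < hazards[best[0]][best[1]]:
                    best = (r, c)
    return best
```

The point is that descent-time "decision making" reduces to cheap lookups and comparisons, which a modest CPU handles easily.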

 

Interestingly, NASA have spoken about how they only had 1 GB of storage available to hold everything, despite the rover having 2 GB on board. Because the rover was doing the processing for EDL, they had to include two OSes on the same drive: one for EDL and one for the rover. Right now they're in the process of testing out the rover OS to make sure it's doing what they expect it to. IIRC they said this would take anywhere from 4 days to 2 weeks, and for obvious reasons they really don't want to rush; the last thing they want is a $2.9B brick sitting on Mars. They also said they have a 1 GB backup of the EDL OS (which was tested before it left Earth, so they know it works), and worst case they can always go back to that, though that would mean having to send a 1 GB upload of the fixed rover OS to Mars over shortwave.

 

As for 200 MHz, it's not like the rover will ever be multitasking. Each task it performs will be the only thing it's doing at that time, and with optimised code 200 MHz is plenty to work with, especially since speed is not a concern; the rover can only send data back when there's a satellite overhead. For me, 256 MB of RAM is the bigger surprise.
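On the single-tasking point: flight software is often structured as a run-to-completion loop, where each well-defined task finishes before the next starts, keeping timing predictable on a modest CPU. A toy illustration (not how the rover's actual RTOS schedules; task names are made up):

```python
# Toy run-to-completion scheduler: each task runs fully before the next,
# so there is no multitasking overhead and timing stays predictable.
def read_sensors(state):
    state["telemetry"] = {"temp_c": -63}      # pretend sensor reading

def plan_drive(state):
    t = state["telemetry"]["temp_c"]
    state["next_move"] = "forward" if t > -80 else "wait"

def queue_downlink(state):
    # Data just waits here until a relay satellite passes overhead.
    state["downlink"] = [state["telemetry"], state["next_move"]]

TASKS = [read_sensors, plan_drive, queue_downlink]

state = {}
for task in TASKS:    # one pass of the loop; real systems repeat forever
    task(state)
```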

Main Rig:-

Ryzen 7 3800X | Asus ROG Strix X570-F Gaming | 16GB Team Group Dark Pro 3600Mhz | Corsair MP600 1TB PCIe Gen 4 | Sapphire 5700 XT Pulse | Corsair H115i Platinum | WD Black 1TB | WD Green 4TB | EVGA SuperNOVA G3 650W | Asus TUF GT501 | Samsung C27HG70 1440p 144hz HDR FreeSync 2 | Ubuntu 20.04.2 LTS |

 

Server:-

Intel NUC running Server 2019 + Synology DSM218+ with 2 x 4TB Toshiba NAS Ready HDDs (RAID0)


Absolutely wonderful video from a criminally underrated channel. I can almost guarantee basically the same thing applies to Perseverance.

 

Quote me to see my reply!

SPECS:

CPU: Ryzen 7 3700X Motherboard: MSI B450-A Pro Max RAM: 32GB I forget GPU: MSI Vega 56 Storage: 256GB NVMe boot, 512GB Samsung 850 Pro, 1TB WD Blue SSD, 1TB WD Blue HDD PSU: Inwin P85 850w Case: Fractal Design Define C Cooling: Stock for CPU, be quiet! case fans, Morpheus Vega w/ be quiet! Pure Wings 2 for GPU Monitor: 3x Thinkvision P24Q on a Steelcase Eyesite triple monitor stand Mouse: Logitech MX Master 3 Keyboard: Focus FK-9000 (heavily modded) Mousepad: Aliexpress cat special Headphones:  Sennheiser HD598SE and Sony Linkbuds

 

🏳️‍🌈


You might be overestimating the performance needed to do everything the rover does.

The entire ISS was controlled by a bunch of 20-year-old ThinkPads until recently.

The Space Shuttle program had computers designed in the '70s controlling all of them.

A lot of CubeSats run microcontrollers weaker than the average Raspberry Pi by an order of magnitude.

Simplicity is key to reliability. And when other factors come into play, like security, redundancy and freedom of control, the choices get far away from consumer technology.

If you tossed a Ryzen 5 3600X into that rover, yeah, I'm sure it would do the same thing. It would also have a significantly higher failure chance, consume a lot more power than what's available, and operating it would require a ton of extraneous software that's insecure and not needed for the mission.

Don't forget that for a brief while, humanity was throwing bombs at ships using birds. Simplicity is key.

