Hacker News

Since Apollo, NASA has been suffering from oversized projects. It's a spiral: once a project grows beyond a certain size, failure becomes politically catastrophic. So the project needs even more tests and simulations, the schedule and cost move to the right, and it becomes even more risk averse. A vicious cycle.

If instead missions were small enough, and failure were tolerated as an unfortunate part of the process, progress would be much faster and new technologies could actually be attempted.

This was the whole Faster, Better, Cheaper approach: small teams of highly competent people, quick iteration, little bureaucracy...

Pathfinder/Sojourner was an FBC mission. It had a freaking webcam there. :) Opportunity/Spirit were somewhat bloated, and the latest Curiosity rover was already a relatively traditional megaproject. Naturally there has been science-instrument maturation as well; it's not a one-sided story, but the pattern is important to understand.

Look at how many launchers SpaceX has developed and how much it has iterated while NASA has worked on SLS. Sure, SpaceX has crashed and burned plenty of rockets and test vehicles along the way, and it has changed direction dramatically. But taken as a whole, it has made enormous progress.

There is a spectrum here, of course: if you launch deep-space probes on 20-year missions, you test differently than you would a launch test vehicle. But if you had many of those probes as well, it wouldn't be so catastrophic if some of them failed.


