Can Software Be Durable?

39 points by maraoz a day ago

What would you change about software development if your apps must last 50 years? What must software developers do differently to build apps that will still work 50 years from now? Can software be durable?

JonChesterfield a day ago

There's fifty year old simulation software in use today. I ported some to semi-modern x64 and learned things I've gratefully forgotten about floating point in the process. Also that "VAX" and "SUN" Unix behaved differently, for whatever that's worth.

I would say that was viable because it had zero dependencies and could be built with the equivalent of gfortran *.f77, provided one changed the source in plausible looking ways first.

If your software relies on fetching things from the Internet it is probably doomed within a year or so and surely within a decade.

Wouldn't bet on today's up-and-coming language still existing either. C89? Will probably still build fine with some compiler, passing appropriate flags.

Hardcoding x64 or aarch64 assumptions likely bad for longevity too, as both are surely gone before 2075 ticks around, though equally I couldn't find a VAX and still got that code running. So that's more about minimising the cost of porting rather than anything fundamental.

  • phkahler a day ago

    >> I would say that was viable because it had zero dependencies and could be built with the equivalent of gfortran *.f77, provided one changed the source in plausible looking ways first.

    Came to say this. Minimize your dependencies. Software can last forever, but everything around it changes and can break or otherwise cause incompatibilities.

al_borland a day ago

Stop chasing trends and changing for the sake of change. At my job, management is constantly forcing us to reinvent the wheel. The systems we used 15 years ago would still work fine, and be very robust and mature by now. Instead, with each re-write new gaps and bugs come to the surface. For some reason we choose to live in Groundhog Day, instead of making the choice to prioritize cheap, boring stability. Each new solution feels more fragile than the last. When we know someone will tell us to throw it all out in 2 years, there is little incentive to prioritize durability.

  • pan69 10 hours ago

    > At my job, management is constantly forcing us to reinvent the wheel.

    It's called sustaining the gravy train. Why fix something once when you can "fix it" over and over again...? Grifters and posers.

robin_reala a day ago

We’re coming up on 50 years for Space Invaders. We can say that it’s endured because:

- it’s self-contained: it works without dependencies, and with the hardware it was designed for

- there’s an ongoing need: people want to continue playing Space Invaders

- it’s transposable: the environment it runs in can be recreated through additional layers (emulators here)

- it’s recreatable: the core concepts behind Space Invaders can be reimplemented in the prevailing contexts of the time

  • OhMeadhbh a day ago

    Modern software development started for me in 1985, when several of us student engineers disassembled the Choplifter arcade game they mistakenly put in the college dorm. Within 15 minutes of them dropping it off, we had the back door open and the EPROMs lifted from the motherboard. Within a couple of hours we had reverse engineered the code, modified it for a free-game hack, burned new EPROMs, and put it back in service.

    In honor of the 40th anniversary of this hack, I recently played the stock version in an emulator on a web page. Code lives on, I suppose.

    • supportengineer a day ago

      Talented people with access to resources! And no fear of punishment ( fines, expulsion, etc. )

  • continuational a day ago

    Yeah, Super Mario Bros is 40+ years old by now. Runs just fine in an emulator, even if you can't find the original hardware.

    Modern SaaS apps can't be run once the company shuts them down. You don't have the code, not even the executable.

Python3267 42 minutes ago

This is a principle that can be applied beyond this question, but it is applicable here. Many non-engineers view software as an asset; *it is not*. Every line of code is a liability, so minimizing the size of a codebase and limiting the complexity of distributed systems (avoid them if possible) is key for good, reliable software.

If the code needs to be fast and must be written in a systems language like C, C++, or Rust, write it in Rust. It limits how badly you can shoot yourself in the foot, both at the compilation stage and while writing the code (you have to think about what you're doing to avoid the compiler yelling at you). Otherwise, for the love of god, write your project in a simple garbage-collected language. Not having to handle memory directly removes a lot of what can go wrong.

pentaphobe 11 minutes ago

The most consistent non-durability I see these days is business logic intertwined with external (API / file system / env vars) resource access - all scattered throughout layers of a codebase.

(Basically a violation of Factor IV in "Twelve Factor App" [^1])

Certainly not the _only_ durability concern present, but it seems to crop up more extremely and in more places.

Dependency Injection can help mitigate this a little but seems to often be a crutch which covers for poor architecture and enables eventually atrocious readability (a sad admission as I'm fundamentally a fan)

TLDR I find a good question to ask is whether the core logic and journeys can be executed in total isolation from any real resources. If it can't, you'll likely have a rough time testing, maintaining, or scaling - or at least changes will be more troublesome than they could be

(NB: bunch of sibling comments saying similar - but felt worth highlighting the local-first / isolated / separable aspect)

[^1]: https://12factor.net/backing-services

jfk13 a day ago

An example worth considering is TeX, which is now 43 years old (considering only TeX82; the earlier TeX78 was a substantially different piece of software). There has been some maintenance over the years, it's true, including a few feature additions in 1990 (TeX 3.0), but I would suggest it has shown itself to be extremely durable.

  • WillAdams a day ago

    At the heart of this are two wildly different technologies:

    - Literate Programming which was developed so as to work around limitations of the Pascal development stack as it existed when the project was begun: http://literateprogramming.com/

    - web2c which allows converting .web source into a format which may be compiled by pretty much _any_ C compiler

    LP was described by Knuth as more important than TeX, but it suffers a bit from folks not understanding that it's not so much documentation (if it were, then _The TeXbook_ would be the typeset source of plain.tex) as code commentary, useful only to developers working to extend or make use of the application. There really does need to be some sort of system for manual documentation, but I suspect that "system" will continue to be a talented technical writer for the foreseeable future.

bullen a day ago

C and Java are the most durable in different ways.

- C gives you the least likely to be broken by evolution of OSes without added effort, because they need backwards compatibility.

- Java will be even more durable, as it can be implemented/fixed after you compile your software. I ran games I compiled 20 years ago on a RISC-V computer flawlessly: not a single glitch! That said, it requires the JVM to be maintained.

Oracle is heading in the wrong direction in every aspect (removing fundamental features like the sandbox without a clear path to a replacement), but the bytecode standard will prevail, ahead of clones like C# and WASM.

That said if you make a VM why not have it JiT/AoT on all 3 bytecodes?

Either way the JVM is the final emulator, as long as it's maintained.

aristofun a day ago

This is based on the fundamentally wrong assumption that software is something like a hammer or a bulldozer.

Software is more like plumbing. It a) wears out, b) requires maintenance, and c) the people maintaining it are an integral part of the whole system.

  • OhMeadhbh a day ago

    I think it depends on the project and the programmer. Code I wrote for a Nuclear Power plant is still there. Ditto for the code I wrote for the steel mill. And you can't find ATM machines w/ my code in them easily, but you see them from time to time. Code I wrote to go into the DoD's CAC card is still there, even though the hardware went through a new Rev a decade ago, so it surprises me to say this, but "yay, java?" Some of the code I put in Firefox to support TLS 1.1 & 1.2 is still there, but most of it got refactored out a while ago (and thank $DEITY Brian(?) replaced the old libpkix library that had been in Netscape navigator and then Firefox for about 20 years.) Much of the POTS network has been replaced since the 80s, but I'm told there are one or two switches I contributed to while at DSCCC are still around.

    But on the other hand, my 1993 one-character patch to the Linux kernel was replaced in around '96 or '97. I hope to whatever benevolent Supreme being exists that the crap pyth*n code I added to Second Life has been replaced. No one uses Palm Pilots or Handspring Treos anymore, so I doubt that code has much life in it. Virtually every web app I wrote is dead, but they were fun to write, so whatever. And the code I added to a couple of satellites is dead (though my ground station code is still alive.) I bet that some of the avionics code in the cockpit is hard to update as well.

    So... it depends... my nuke plant code still has another decade probably and my old room-mate's anti-lock braking code will probably outlast us all. Embedded systems are probably more long-lived than the Facebook front page. Some are just hard to update cause you can't easily get to the machine, others are hard due to regulatory or compliance reasons.

    • applied_heat 17 hours ago

      What kind of company do you work for that is doing so much embedded/controls type work? I program hydro power plants and governors with a 30 year mindset and no guarantee I ever get access to the plant again once I sign off as complete

      • OhMeadhbh 13 hours ago

        I worked for several companies: DEC and DSCCC for Telephony, IBM for Nuke Plants, DEC again for steel mills, Skybox, Planet Labs and Kubos for satellite, RSADSI for ATMs. Once people know you have embedded on your resume it's easier to get the next embedded job. But embedded work is thin and I've done a bit of web plumbing (reverse proxies, net controls, "converged" communication, etc.)

        I've been lucky to avoid the "web framework of the hour" grind... and for the most part work on heisenbugs in net plumbing. Fortunately that part of the stack will invest a little time to avoid serious problems. But their timelines are shrinking to "just long enough to sell the company and its IP", or about 5 years.

    • aristofun 19 hours ago

      It’s just that the lifecycle and half-life for nuclear reactor software are different :)

  • tertle950 a day ago

    Please explain how a set of instructions for a computer can "wear out" and "require maintenance". Word 2007 still writes documents fine enough for me.

    • aristofun 8 hours ago

      The sole purpose of a “set of instructions” is to serve end users, to fulfill some business function. Without that, it is a useless bunch of symbols.

      The set itself contains bugs that get revealed over time. But more importantly, end users and business functions evolve and change, and if people have no choice but to adapt to such a “hammer” - it’s a piece of crap, not software.

    • Espressosaurus a day ago

      Programs run on an operating system, the operating system runs on real hardware.

      The real hardware gets old, wears out, parts become difficult and perhaps even impossible to source.

      The operating system accumulates known vulnerabilities until it's no longer safe to connect to anything.

      You can work around the latter two problems with emulation, but it's never the same--display technology, if nothing else, is different and presents differently. Emulation is only as good as its fidelity, and it's much harder to make it exactly cycle-and-timing accurate, though in most cases (like Word 2007) that doesn't matter.

      The instructions might exist, but they are not runnable without other supporting infrastructure.

      This also ignores programs that are wholly reliant on third party compute and instructions you have no access to that can be shut down and no longer available, like your MMOs.

      • mikewarot 20 hours ago

        The IBM 1401 is still being emulated to this day. It will never die.

        I sometimes emulate a DEC VAX 11/780 running OpenVMS 7.3 on my phone.

    • thom a day ago
      • al_borland a day ago

        Security vulnerabilities only really matter if you're going to have the system online.

        George R.R. Martin still writes using WordStar 4.0 in DOS.

        I remember reading a news story years ago about a guy who brought a vintage Mac from the 80s into the Apple Store to see if it could be a little faster. He had been using it for his accounting (or something) for the past 30 years.

        Air gapped systems like these, with some degree of physical security, are pretty safe.

  • poisonborz a day ago

    Software is, and ideally must always be, a hammer. It is only in the last 15 years that monopolist platforms have tried to trick us into believing it is not.

    • NeutralForest a day ago

      To be fair, the surface area of software has gone through the roof. Unix utils like `grep` or `find` can be hammers, while a retailer's website, with varying promotions, inventory, and overall content, needs to be maintained, more like a car.

    • scarface_74 a day ago

      The last 15 years was 2010. I’ve been in the industry since 1996 and programming as a hobbyist since 1986. Computers change, operating systems change. It’s not like I was using the original AppleWorks 3.0 on my Apple //e in 2010, or ClarisWorks from 1996 on my LCII.

      While you can still buy Microsoft Office once and use it “forever”, I much prefer the $129 a year, 5 users deal with 1GB of online storage per user and each user can use office between their computers, online and on mobile regardless of operating system.

      A desktop only office suite would do me no good as I go back and forth between platforms.

      • poisonborz a day ago

        I reflect on this as a user: I've been a Windows user since around 1996, and most of my workflow is unchanged. I still use the ~same software (new versions of course, but those old versions would mostly still run). This could be even more true for Linux. Apple had a wilder ride, but you can always emulate. This isn't about user experience; it's more about a mindset that the newer generation seems to tragically forget. Once you buy a service, you give up control and ownership. Same way artists, works, and prices constantly change on streaming platforms. On the surface it seems easier, but...

        For your example, Microsoft can alter that price any time it sees fit, discontinue platforms you use, alter the interface drastically or cut a crucial feature in tomorrow's update... I wouldn't cope with such, and am sorry for those who must.

        • scarface_74 a day ago

          And when that happens, LibreOffice, GSuite and even Apple’s iWork suite can open the documents.

          Word for the Mac has been in continuous production since 1986. It made it through 16-bit MacOS (I’m yada yada yada’ing), to 32-bit “clean” System 7 on a 68040, to classic MacOS on PPC, to native OS X on PPC with 32-bit Carbon, to 64-bit x86 OS X APIs, to ARM.

          Would you really want to use Windows 3.1 Word in 2025? One of the major issues with Windows is that it carries so much cruft from 1992.

          There are, for instance, still a dozen ways to define a string in Windows programming, depending on the API, and you have to convert between them.

  • gblargg a day ago

    Software is a design. In the same conditions it never wears out. As conditions change it tends to work less and less well unless modified, as expected.

  • solannou a day ago

    First time I've read (c) stated. That's the real truth; thank you for putting it into words! I now understand way better the people and places where I've worked.

ARothfusz a day ago

You mean like the Voyager probes? https://en.m.wikipedia.org/wiki/Voyager_program

Almost 50 years old now, and still sending data.

What would I change about software development now to program like they did 50 years ago? I would program like they programmed 50 years ago: assume it has to work. Assume updates will be risky and expensive. Build in failsafes and watchdogs and redundancy. Be able to replicate the build every year for 50 years. Train people to know what the logs really mean, every year for 50 years. And launch it before the bike shedding can begin!

poisonborz a day ago

Use the least amount of dependencies, only old mature stable APIs, obviously offline only. This way you can be pretty sure the environment can be easily emulated.

queenkjuul 17 hours ago

I just saw somewhere else that NASA has a manual on exactly this -- apparently there's software that's been running on satellites for 30+ years without crashing.

Also those tales of C64s running HVAC systems for 30+ years. Aren't there German trains running Windows 3.1?

I guess i am making an assumption that if it's lasted 30 years, it'll just keep going.

codingdave a day ago

I've been in this industry now for 35 years, and while there is some tech from my starting point that is still around, works fine, and could still be used effectively today... few people know how to use it, so from a pragmatic perspective, it is useless.

So your starting point for this question needs to focus on that evolution of talent. Your app needs to be well-defined and compartmentalized so that each layer can be refactored when (not if) the talent pool dwindles to nothing. You need to develop a strong refactoring culture not only to handle the reality that the best thing you deliver today will be tech debt before the 50 years are up, but that whatever problem it is solving today will likewise evolve.

The durability you seek needs to be ironically focused on change management.

AA6YQ 15 hours ago

The suite of amateur radio applications I began developing in 1990 is still under active development, with an estimated tens of thousands of users around the world. I typically make several new public releases per month.

The current number of reported but uncorrected defects across this suite? Zero.

comrade1234 a day ago

I do have a project that's been running for about 25 years. But it's not static: new features are added, bugs are fixed, and it just keeps going. Some of the code and the overall architecture are original, but there's plenty of new.

That said though, the vast majority of projects I've been on are probably no longer in existence. This is why I take a more casual approach to most projects - I see it as somewhat temporary and it doesn't make sense to put so much effort into a clean project.

mamcx a day ago

You need to answer this from many directions:

* Hardware
* Network
* Storage
* Electricity

Before you get to software, because if the hardware breaks, then what?

After that, you need an OS that you can be certain doesn't depend on the cloud or ever phone home. Then you probably need a decent set of:

* A solid programming language, like Rust
* Or something very barebones, like C
* An RDBMS like SQLite

And a set of tooling: code editor, image tools, etc.

ciconia a day ago

Simplify as much as possible and reduce the surface area: the least amount of code, the least amount of dependencies, the least amount of infrastructure.

rcxdude a day ago

Given existing examples, be useful (or appealing) enough relative to the complexity of the ecosystem it depends on that people are willing to maintain the means to run it, whether by porting, compatibility layers, or full emulation. Being open-source helps a lot there. Being small and/or efficient enough to run acceptably even with these layers can also help.

rickcarlino a day ago

I've thought about this problem also, and I would love to see in-depth analysis. It would be important to pick hardware that has been widely deployed and can be easily sourced in the future. Documentation must be written in a timeless manner rather than a concise manner. Dependencies should be kept minimal, even OS level system call dependencies.

  • maraoz a day ago

    "Documentation must be written in a timeless manner rather than a concise manner." <- loved this!

trevorLane a day ago

use COBOL..

  • OhMeadhbh a day ago

    You laugh, but I was paid a decent hourly rate last year to update a COBOL program. About a decade ago they moved it into a Mercurial repo, so I couldn't see history before that. I can't imagine the original app being less than 40 years old (probably 50.)