
A lot of those "successful" examples have teams of battle-scarred engineers dealing with all the failures of those ideas. Control loops running away to infinity or max/min bounds, a cache that can't recover from a distributed failure, corrupted live-migrated state, bursts causing overload at inconvenient times, spurious anomaly-detection alerts informing you of all the world's holidays, etc.
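
To make the first failure mode concrete, here is a minimal sketch of a PI-style autoscaling loop; every name and constant is hypothetical, not taken from any of the systems above. Without the clamp and the anti-windup guard, a sustained error drives the integral term, and with it the requested capacity, straight to a bound:

    // Hypothetical PI-style autoscaler. A sustained error would wind up the
    // integral term forever without the guard, pinning the output at a rail.
    public class AutoscalerSketch {
        static final double KP = 0.5, KI = 0.1;
        static final double MIN_REPLICAS = 1, MAX_REPLICAS = 100;

        private double integral = 0;

        double step(double targetLoad, double observedLoad, double currentReplicas) {
            double error = observedLoad - targetLoad;
            integral += error;
            // Anti-windup: bound the accumulated error so a long outage doesn't
            // leave the controller stuck at a limit after recovery.
            integral = Math.max(-50, Math.min(50, integral));

            double desired = currentReplicas + KP * error + KI * integral;
            // Clamp the output instead of trusting the math.
            return Math.max(MIN_REPLICAS, Math.min(MAX_REPLICAS, desired));
        }

        public static void main(String[] args) {
            AutoscalerSketch loop = new AutoscalerSketch();
            double replicas = 10;
            for (int tick = 0; tick < 20; tick++) {
                replicas = loop.step(100, 180, replicas); // sustained overload
                System.out.printf("tick %d -> %.1f replicas%n", tick, replicas);
            }
        }
    }

And even then, the clamp only hides the next layer of problems: what happens when the bound itself is wrong for a holiday traffic burst, and so on.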

Underneath all of those ideas is a tangle of complexity that almost everyone underestimates.



I know I'm not alone in this, but after doing this for more than 20 years I can't shake the idea that we are doing it wrong -- meaning programming. Is it really this nit-picky, brittle, and hard?

The brittleness is what gets me. In physical mechanical and even analog electrical systems there are tolerances. Things can almost-work and still function to varying degrees. Software, on the other hand, is unbelievably brittle, to the point that after 50 years of software engineering we still really can't re-use or properly modularize code. We are still stuck in "throw it away and do it over" and constantly reinventing wheels because there is no way to make wheels fit. The nature of digital means there are no tolerances. The concept isn't even valid. Things fit 100% or 0%.
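
As a toy illustration of the 100%-or-0% point (the interfaces here are invented for the example, not real libraries): two "wheels" that do exactly the same job still don't fit each other without hand-written glue.

    // Two hypothetical logging interfaces that are semantically identical
    // but differ just enough that neither "almost fits" the other.
    interface VendorALogger {
        void log(String level, String message);
    }

    interface VendorBLogger {
        void write(int severity, String text);
    }

    // The only way to make the wheels fit is an adapter, written by hand,
    // for every pair of wheels.
    class BToAAdapter implements VendorALogger {
        private final VendorBLogger delegate;

        BToAAdapter(VendorBLogger delegate) {
            this.delegate = delegate;
        }

        @Override
        public void log(String level, String message) {
            int severity = "ERROR".equals(level) ? 3 : "WARN".equals(level) ? 2 : 1;
            delegate.write(severity, message);
        }
    }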

We keep inventing languages. They don't help. We keep inventing frameworks. They don't help. We keep trying "design patterns" and "methodologies." They don't help. If anything all the stuff we invent makes the problem worse. Now we have an ecosystem with 20 different languages, 50 different runtimes, and 30 variants of 5 OSes. Complexity goes up, costs go up, reusability never happens, etc.

I remember for a while seeing things like the JVM and CLR (the VM for C# and friends) as a way out -- get away from the brittle C API and fully compiled static native code and into a runtime environment that allowed really solid error handling and introspection. But that paradigm never caught on for whatever reason, probably because it wasn't free enough. WASM is maybe promising.
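
For what it's worth, the introspection part is real and cheap on those runtimes. This is plain standard-library reflection, nothing made up:

    import java.lang.reflect.Method;

    // Ask any object what it can do, at runtime, with no recompiling
    // and no header parsing.
    public class IntrospectionDemo {
        public static void main(String[] args) {
            Object target = "hello";
            for (Method m : target.getClass().getMethods()) {
                System.out.println(m.getName() + "/" + m.getParameterCount());
            }
        }
    }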


No, it's not terribly hard for a suitably compensated and skilled team with the appropriate tools and timeframe. Yes, we are doing it wrong.

Most of the 'inventions' you describe are aimed more at reducing barriers to entry: the promise that your team of expensive C wizards would now use Java at greater speed with fewer defects became "now we can just use cheap CS grads" at slightly worse but still acceptable levels.

Without any real consequences for poor software (see CrowdStrike's YTD stock performance despite its multi-billion-dollar farce in July), it's only logical that the standard will always be "bare minimum that can be shipped". Developer productivity is a misnomer, really: it just means company profits increase, thanks to a widening pool to hire from and even more crapware per dollar being squeezed from each worker.


Nah. CLR/JVM/WASM are the same pipe dream of a universal architecture/shared-compute utopia.

But I think you have the wrong take on “reusability”. Every non-software engineering project is an exercise in custom solutions as well; the reusable parts are the tools and materials. Likewise, in software engineering the languages, OSes, protocols, libraries, design patterns, and frameworks are the reusable bits. Code is how we describe how it all fits together, but a huge amount of what it takes to run a system is constantly reused, and it's far larger than the code we write to implement it.


I’m aware that many HN people view this as a pipe dream, but why? It works. The largest compute platform in the world, namely the web, is like this, and many of the largest businesses run on the JVM and the CLR. Loads of nasty problems go away when you are not directly mangling bits in memory and when you have a real runtime.
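
One concrete example of a nasty problem going away: an out-of-bounds access on a managed runtime is a well-defined, catchable error with a stack trace, not silent memory corruption. Plain Java, nothing exotic:

    // Out-of-bounds access on a managed runtime: a diagnosable exception,
    // not undefined behavior quietly corrupting unrelated memory.
    public class BoundsDemo {
        public static void main(String[] args) {
            int[] data = new int[4];
            try {
                System.out.println(data[10]); // out of bounds
            } catch (ArrayIndexOutOfBoundsException e) {
                System.err.println("Caught and diagnosed: " + e.getMessage());
                e.printStackTrace();
            }
        }
    }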

Of course modern safe languages like Rust give you some of those benefits in compiled code too.


>The brittleness is what gets me. In physical mechanical and even analog electrical systems there are tolerances. Things can almost-work and still function to varying degrees.

I think you are misrepresenting how flexible software is versus hardware. Mechanical and electrical systems have tolerances but if you go outside those tolerances, the whole system can be destroyed. Nothing like that is common in software. Worst-case outcomes might be like "the performance isn't as good as we want" or "this code is difficult to work with." Software components are very flexible compared to anything physical, even in the worst cases.

>We are still stuck in "throw it away and do it over" and constantly reinventing wheels because there is no way to make wheels fit. The nature of digital means there are no tolerances. The concept isn't even valid. Things fit 100% or 0%.

I don't know how one can look at the amazing array of libraries out there and conclude that we have no reuse. Sometimes people build their own solutions because they need something very simple and the libraries are too big to be worth importing and learning in those circumstances. That's not a flaw in the libraries. It's human nature.

>We keep inventing languages. They don't help. We keep inventing frameworks. They don't help. We keep trying "design patterns" and "methodologies." They don't help. If anything all the stuff we invent makes the problem worse. Now we have an ecosystem with 20 different languages, 50 different runtimes, and 30 variants of 5 OSes. Complexity goes up, costs go up, reusability never happens, etc.

All of this is too pessimistic. These tools do help. Exactly how many languages do you think we should have? Do you think exactly one group is going to develop for each use case and satisfy everyone?

>WASM has its uses but I can't escape the idea that it's like "let's build a VM and carry all the shortcomings of C into it."

I'm not a web guy, but this sounds silly. WASM isn't meant to be written directly. Complaining about its shortcomings is like complaining about the shortcomings of assembly language: these days, neither is intended for human consumption.


There's probably more tendency to be sloppy in software because "we can always fix it in post." But absolutely, with hardware, if you don't get it right (especially with heavy construction or modern electronics) you're going to have to rip a lot out and start over.


I've thought that we should be able to move so fast with software, and sometimes I do see that, but usually it seems like we leverage computers' superior performance poorly. To pick on frontend: it's trivial to create the next shiny JavaScript framework, but what of it? Or why can refactoring be so painful when the semantic change is small? I think computers are so flexible and fast that we programmers are usually incapable of using them effectively. It's like a 3D visualization of an optimization problem, searching for the highest peak, except the cursor moves far faster than it can survey the landscape: it zooms around aimlessly, easily reaching arbitrary places but without the capacity to make sense of them. When the train is in motion, switching tracks is hard, even if that would be the best move.


It's easy to assume that you can always fix things after the fact, and I was only half joking with the "fix it in post" comment. Modern film suffers from some of the same problem: no need to get it right on the first pass, since we can always apply corrections later.


I think that's fair: the success stories are very rarely "yay, we did it!" but much more often "this single change was the sole focus of a team (or multiple teams) for X months, and the launch/release/fix was the culmination of a significant investment of resources".



