
"Destroying software" is broader than the creation of new, working software artifacts in the moment. The phrase refers to changes in software engineering culture and its long-term effects, not the immediate successes.

Writing a new greenfield project using 10,000 npm dependencies for an Electron-based front end is shockingly easy. But how do you keep that running for the next 15 years? Did the project need npm? Or a web browser? How do all the layers between the language of choice and the bare metal actually behave, and can you reason about that aggregate accurately?

The field has come to a point where a lot of projects are set up with too many complexities that are expedient in the short term and liabilities in the long term.

The current generation of junior devs grows up in this environment. They learn these mistakes as "the right thing to do" when they are questionable and require constant self-reflection and reevaluation. We do not do enough to propagate a hacking culture that values efficiency and simplicity in a way that leads to simple, efficient, stable, reliable, and maintainable software. On a spectrum from high-quality craftsmanship to mass-produced single-use crap, software is trending too far toward the latter. It's always a spectrum, not a binary choice. But as a profession, we aren't keeping the right balance overall.



I've been a backend engineer for about 10 years, with my last job working on an AWS Lambda stack.

I started a job in manufacturing a few months ago, and having to think that this has to work for the next 20 years has been a completely different challenge. I don't even trust npm to survive that long, so web stuff has been an extra challenge. I landed on Lit web components, bringing them in via a local CDN.
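For illustration, the zero-npm setup described above can look something like this: a single HTML page importing a locally vendored ESM build of Lit (the `./vendor/lit.js` path and the `status-badge` component are hypothetical, just a sketch of the idea — no package manager or build step involved):

```html
<!doctype html>
<html>
<body>
  <status-badge></status-badge>
  <script type="module">
    // Assumes a vendored single-file ESM bundle of Lit copied to
    // ./vendor/lit.js on the local server; nothing is fetched from npm.
    import { LitElement, html, css } from "./vendor/lit.js";

    class StatusBadge extends LitElement {
      static properties = { state: { type: String } };
      static styles = css`span { font-family: monospace; }`;
      constructor() {
        super();
        this.state = "OK";
      }
      render() {
        return html`<span>line status: ${this.state}</span>`;
      }
    }
    customElements.define("status-badge", StatusBadge);
  </script>
</body>
</html>
```

The whole dependency surface is one file you control, which is about as close as web tech gets to "this still builds in 20 years".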


The world is full of abstractions at many different levels. Being at a lower level doesn't inherently mean superior; you can go in either direction on the spectrum. Do you know exactly how the atoms that computers are made of behave? There are plenty of people working on all sorts of abstractions; new abstractions appear, and demand for lower levels increases when it is needed.

You could say that as more abstractions are built on top of lower levels, the average abstraction level of the whole field drifts higher, but that is the natural way to evolve. Abstractions allow you to build faster, and they are only possible because of the lower-level elements beneath them. In the end, if you measure the average level of abstraction in the current industry, you can draw the line arbitrarily: include the people who use website builders and the average comes out even higher. We need people working at all different levels of abstraction. We could split the groups into two naming conventions, lower-level engineers and higher-level ones, and then technically you could calculate that the average is still where it used to be.

I definitely use npm (or rather pnpm) because I know it will allow me to build whatever I want much faster.


Abstractions are only part of the whole issue. Maybe I focused too much on that. But I'll argue that point once more.

How much complexity is actually required? What changed in software in the last 20 years such that the additional bloat and complexity is actually required? Hardware has become more powerful, which should make software less reliant on complicated optimizations and thus simpler. The opposite is happening. Why? What groundbreaking new features are we adding to software today that we weren't 20 years ago? User experience hasn't improved much on average; in fact, measurements show that systems respond more sluggishly than they used to.

Intrinsic complexity of the problems that software can solve hasn't really changed much as far as I can see. We add towers of accidental complexity on top that mostly aren't helpful. Those need to be questioned constantly. That isn't happening to the extent that it should. Web-based stuff is the poster child of that culture and it's hugely detrimental.


> What changed in software in the last 20 years

Backends handling tens or hundreds of thousands (or more) of concurrent users, rather than locally deployed software on a single machine or a small server with a few tens of users?

Mobile?

Integration with other software / ecosystems?

Real-time collaboration among users rather than single-user, document-based models?

Security?

Cryptography?

Constant upgrades over the web rather than shipping CDs once a year?

I'll pass on AI for the moment as it's probably a bit too recent.


Why is a single, scaled-up backend required in products that effectively only have multi-tenancy?

Software can be distributed onto client machines and kept up to date. That was first solved with Linux package managers more than 25 years ago.

Before mobile we had a wide range of desktop operating systems with their own warts.

TLS 1.0 was introduced in 1999, so cryptography was already a concern back then.

So what is really new?


Can't do it all on clients. All the 5M client machines connecting daily hit "my" one authorization server.


> Something being on a lower level doesn't inherently mean superior. You can go in any direction on the scale or spectrum. Do you know how exactly atoms behave that computers are made out of?

This is a false equivalence; no one is suggesting that it’s necessary or even useful to understand electron drift to write programs. It can, however, be extremely useful to understand memory allocation and retrieval, and how your language’s abstractions over them work.

Take UUID generation, for example. I needed to generate millions of them for a tool I’m building, and I found that it was massively slower on Linux than Mac, even taking raw CPU performance into account. I eventually tracked this down [0] to BSD using arc4random, while until a fairly recent glibc release, Linux was using ul_random_get_bytes. Most people are never going to need or notice this difference, because for a single UUID, it simply doesn’t matter. But it did for me, and the only reason I was able to figure it out and solve it was by having a passable understanding of implementations buried behind abstractions.

[0]: https://gist.github.com/stephanGarland/f6b7a13585c0caf9eb64b...


It's not a personal value judgment, it's a debugging issue.



