> The recent evolution of JS frameworks has been really nice. Performance is basically getting identical to desktop.
It's getting much, much better, but performance is only "identical to desktop" if you ignore its resource usage and the speed increases processors have seen over the past decades.
> For people not following JS these might all seem like constantly reinventing past lessons, but there is a logical evolution happening here towards desktop-style performance and interactivity on the web. Or pure server-side performance but with full JS interactivity.
For people following JS these are examples of constantly relearning past lessons.
I'm not sure how anyone could reasonably expect 100+ms round-trip times (on a good connection) to offer the same experience as something local. I suspect what it actually means is that the people writing JS-based software haven't used a native desktop app in years and have done mostly web-based things.
You could be forgiven for it, since HTML/JS as a user-interface design language appears to have taken over completely, to the point where even the most popular code editors are now browser-based.
Seriously though, go load up any natively compiled app on your OS of choice and compare the speed of it doing any given task to what you get out of web-based versions, electron versions, etc. There isn't a comparison.
My griping aside, I recognize JS as a language is here to stay and it's important to stay on top of its developments and improvements.
> Seriously though, go load up any natively compiled app on your OS of choice and compare the speed of it doing any given task to what you get out of web-based versions, electron versions, etc. There isn't a comparison.
My experience is a bit different...
Google Docs loads faster than Numbers
Figma loads faster than Illustrator
VS Code loads faster than Xcode (not a fair comparison)
Quicknote.io loads faster than SimpleNote (which is blazing fast!)
Google Meet loads faster than Zoom
VS Code creates a whole new Chromium process, but for websites like quicknote.io, do you count the browser's base resource usage and loading time, or just the incremental time to load it up in a new tab?
"Loads" is an interesting take to consider when looking at application performance.
Simply starting the application is largely irrelevant. What's the experience using it? What features do you have access to? Those are the things that matter.
If we wanted to take that point though, even if we assume the browser-based stuff gets the startup time for free (since most people have a browser open most of the time, I'd wager), what's the process of loading a local file on my system like? Docs or Sheets might "load" faster than LibreOffice, but what does the time-to-use look like? Do I have to sync files somewhere Docs/Sheets can access them? What's the "[double-]click to load local file" part of the equation here?
What about the actual use of the application? Is there the instant responsiveness to open color selection dialogs in the case of Illustrator? How useable is it from a plugin perspective? Are we actually comparing comparable applications? There's lots to ask there.
This is why I don't think startup time actually matters. What matters is the experience of using something like Figma vs Illustrator, Sheets vs Numbers, Xcode vs VS Code, etc.
The example I gave of two local apps is a relevant and more directly comparable one. Eclipse simply does everything faster than VS Code. VS Code is my typical "general editor of choice" for a variety of things, but its slowness reading a few large Java packages led me to grab a current version of Eclipse and give it a go. Seriously, try it out on any substantially large Java project: there's measurable lag in syntax highlighting upon opening a file, in switching between files, in basic "find references" type tasks, and so on. In Eclipse it's just fast.
Maybe everything will eventually be JS-based and it will become a meaningless comparison. I don't buy that we're there yet.
Since CS6, all Adobe apps have been getting worse and worse in terms of start speed, UI performance, etc.
Some apps like Acrobat Pro are an absolute disgrace in terms of performance. It's like it's trying to do PDF editing in an AngularJS app from 10 years ago.
VS Code loads slower than Sublime or Kate (though they're a bit lighter on features than Code; I'm not sure it's slower than Qt Creator). And Google Docs takes many seconds, worse on older hardware, to load a 100-150 page document (4 seconds for a warm load in Firefox on a Ryzen 5600X, near top of the line for single-threaded performance), compared to LibreOffice Writer (just over 1 second on warm startup, 0.6 seconds if Writer is already running).
The initial claim wasn't about load time, it was about "doing any given task," and I frequently encounter delays when actually using Google Docs, delays I never experience with compiled text editors or spreadsheets.
All that and TextEdit still starts up much more quickly than Google Docs for me.
> It's getting much much better but performance is only "identical to desktop" if you ignore anything about its resource usage or speed increases in processors over the past decades.
The high power use is what kills me. That and input lag. Fix those and I'd give way fewer shits that an Electron app eats 10x the memory that's remotely justifiable by what it's doing, and more like 20-100x what a well-made desktop program would for the same purpose.
[EDIT] Yeah, I know, high power use and input lag are in part because Webtech disrespects things like system memory, so in practice I'm not going to see one fixed without the other.
> Seriously though, go load up any natively compiled app on your OS of choice and compare the speed of it doing any given task to what you get out of web-based versions, electron versions, etc. There isn't a comparison.
I highly recommend trying out Linear.app, it's as fast as any desktop app I use.
But if you're comparing the current era of React/Electron apps (or even most Next.js apps), of course you're not going to see desktop-type speeds yet... these new developments are closing the gap, but they're only just starting to be adopted.
That demo literally shows a brief "Loading..." screen for the first open issue, which was pretty much my point.
I'm not sure what other desktop apps you use but for something as simple as viewing a single database record I've not seen loading screens since about 2000 on any native apps.
> That demo literally shows a brief "Loading..." screen for the first open issue, which was pretty much my point.
Yes, when you open a desktop app that has to load data from the internet, it has to load first. You can't get around that unless you have zero remote data, which defeats the purpose of a collaborative B2B app.
After that first load, strictly the first time you open the app, you never see another loading screen.
Saying desktop apps don't do the same thing is disingenuous.
And as I mentioned, that's a simple demo for a new framework in beta hosted on a free Heroku instance. Linear's production app is even faster. Especially when you download the desktop version.
> Yes when you open a desktop app that has to load the data from the internet it has to load first... You can't get around that fact unless you have zero data remote? Which defeats the purpose of a collaborative B2B app.
This could be true for data being loaded from the Internet, yes, though it assumes the premise that JS-based applications will still do this faster. However, collaborative B2B apps existed before the JS craze and worked perfectly well (and faster) without it. They also don't strictly need to query data from the broader 'net; they can be talking to a local database server, an asynchronously populated local cache, etc.
> After that first load, strictly the first time you open the app, you never see another loading screen.
That's actually only half true. The "Loading" screen indeed doesn't pop up; instead, you get a broken, empty UI until the record eventually loads. What I observed: scrolling to the end and selecting any of the final few records produced a substantial, noticeable second-plus delay before the record data loaded into the view. Steps: load the page -> scroll to the very end using the scrollbar -> pick the last record.
For fun, I profiled it. According to the profiler it takes 3.8s for it to successfully load and process the record, of which 2.1s is just in one promise.
It's also a demo site that could easily cheat this (though to their credit, it appears they are not quite doing so, at the expense of heavier page loads). It's not a compelling example of what you're talking about.
It seems like you’re missing the point of the demo. It is purposely not trying to hide the initial load. There’s more (a lot more) to application performance than page-load speed. This obsession with page-load speed is weird. How many times a day do you open Excel? How many times a day do you click on things in it?
> I've not seen loading screens since about 2000 on any native apps.
Have you used any real apps? Macromedia Dreamweaver, Adobe Photoshop, etc. had literal minute-long splash loading screens, with zooming text snippets telling you what they were doing, since at LEAST 2000....
My comment was clearly in the context of loading an individual database record within the application once running. Sure, splash screens exist. Are those part of using the application?
Arguably, I guess. Though if that were really the comparison to draw, should we then compare it to closing down the browser entirely and booting up the demo via shellopen (or equivalent) for the URL in question?
The web apps that take 100+ms round trips are the ones that have to download the entire page. Downloading a new desktop app takes even longer. If you use a local-first web app that is cached on your desktop, that's a fairer comparison.
This makes no sense. Downloading a web browser takes even longer. It's an insane comparison to draw in the context of the experience of actually using an application.
Nobody cares about startup time when it's a task you do once a day compared to using the application which might be constant throughout the day. How often are people booting up their IDEs?
It's equally unfair to compare the round-trip load of an application that people expect to download on-demand, versus an application that is already installed on the client
I actually just updated a bunch of old Java code in Eclipse. It is simply faster than VS Code at everything: syntax highlighting (noticeable delay in VS Code on a 1000+ line Java class), switching between files, loading files, etc. I only updated and used Eclipse because VS Code was being noticeably slow.
That was in a bare J2EE Eclipse instance, without any of the dozens of plugins I typically would have had back in my enterprise-y Java dev days. Visual Studio seems to have gone the "kitchen sink" route, and JetBrains' IDEs typically wind up crammed with plugins from what I've seen. I wonder how much that skews people's perceptions.
The things you have listed minimize the impact of network latency; they don't affect rendering performance, which is still a big deal. Apps that need to render large amounts of data still kind of suck, so you'll see many apps "virtualize" things: rather than having 10,000 elements, you have however many fit in your viewport plus N, and as you scroll they get reused. The tearing hurts my soul.
Compare the scrolling in Excel 97 on Windows NT to Google Sheets/Office 365. It's night and day. The web apps that render everything with WebGL do perform better, but then you have non-native widgets.
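To make the virtualization idea above concrete, here's a sketch of the windowing math such lists use: only the rows intersecting the viewport (plus an overscan buffer) are materialized, and nodes are reused as the window slides. The names and fixed-row-height assumption are illustrative, not from any particular library.

```typescript
interface Range { start: number; end: number } // end is exclusive

// Compute which rows should exist in the DOM for the current scroll position.
function visibleRange(
  scrollTop: number,      // current scroll offset in px
  viewportHeight: number, // height of the scroll container in px
  rowHeight: number,      // assumes fixed-height rows
  totalRows: number,
  overscan = 3,           // extra rows above/below to soften the tearing
): Range {
  const first = Math.floor(scrollTop / rowHeight);
  const visible = Math.ceil(viewportHeight / rowHeight);
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(totalRows, first + visible + overscan),
  };
}

// With 10,000 rows of 40px in a 400px viewport, only ~16 rows exist at once;
// the other ~9,984 are represented by empty scroll height.
const r = visibleRange(2000, 400, 40, 10000); // rows 47..62 rendered
```

The reuse trick is that as `start` advances, the DOM node that fell out of the window is repositioned and refilled with the newly visible row's data, which is exactly where the flash of stale content ("tearing") comes from.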
In my experience, time-slicing and deferring renders for large lists (via APIs like useDeferredValue and/or requestIdleCallback, etc), combined with memoization can be a great alternative to virtualization.
For a lot of use cases where people jump straight to virtualization, it's not actually the number of elements that exist in the DOM at once that's the problem (React and browsers these days can handle a lot more than what's intuitive to most people). The problem is usually the cost of rendering all those elements at once in the initial mount, which often can cause visible frame drops and even noticeable freezes of the page.
Deferred rendering and time slicing can amortize this cost over a longer period while providing essentially the same UX (by eagerly rendering the first X items in a list), while memoization keeps transitions to new states fast by reusing the same large list of nodes when it hasn't changed. All of these techniques combined are still orders of magnitude less complex and require fewer UX compromises (tearing, loss of browser search, etc.) than virtualization, which should be reserved for cases where there is no other workable solution, IMO.
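The time-slicing idea above can be sketched as a chunked render loop: instead of mounting every item in one frame, process a chunk, yield, and continue. In a browser the `schedule` hook would be something like `requestIdleCallback` or left to React's own scheduler via `useDeferredValue`; here it's injected so the sketch stays self-contained, and all names are made up for illustration.

```typescript
// Render `items` in chunks of `chunkSize`, yielding between chunks so the
// main thread can handle input/paint instead of freezing on a huge mount.
function renderInChunks<T>(
  items: T[],
  renderItem: (item: T) => void,        // e.g. appends one DOM node
  chunkSize: number,
  schedule: (work: () => void) => void, // e.g. cb => requestIdleCallback(cb)
): void {
  let index = 0;
  const step = () => {
    const end = Math.min(index + chunkSize, items.length);
    for (; index < end; index++) renderItem(items[index]);
    if (index < items.length) schedule(step); // yield, then keep going
  };
  step();
}

// Usage with a synchronous scheduler (a real browser would defer each chunk):
const out: number[] = [];
renderInChunks([1, 2, 3, 4, 5], (n) => out.push(n), 2, (work) => work());
// out now holds all five items, rendered two at a time
```

Note that unlike virtualization, every item still ends up in the DOM, so browser find-in-page and natural scrolling keep working; only the mount cost is spread out.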
> The recent evolution of JS frameworks has been really nice. Performance is basically getting identical to desktop.
Web browsers in general are not able to match desktop applications. Additionally, typical JS frameworks come with at least a 2x performance penalty compared to hand-optimized vanilla JS (which is not commonly written).
Being excited about getting reasonable performance with a great development environment is fine, but deluding yourself into thinking that it's great performance is not.
> but deluding yourself into thinking that it's great performance is not.
What's the actual issue though? Sure, on HN we care a lot about performance. But outside these walls, performance has to be really bad for someone to actively avoid a product. Even then, if the product has a strong hold on its user base, you have to really degrade performance for engagement to falter.
The problem is not that a particular library, framework, or app is slow. The problem is the mindset "This is the best performance possible". Having that mindset makes you accept extremely poor performance even when it is not economically advantageous to stop optimizing.
Even though I am personally frustrated at how willing users are to accept slow software, I do recognize that the velocity and comfort of development outweigh the performance considerations in many cases. However, a mindset that keeps you from ever checking whether the performance trade-offs are worth it to users WILL lead you to the wrong decisions.
> For people not following JS these might all seem like constantly reinventing past lessons, but there is a logical evolution happening here towards desktop-style performance
Funny, my desktop itself is already written in JS, and supports/integrates with apps written that way, too, (and also that aren't), and that's been the case for a while now. And the same has been true of apps from the Mozilla platform lineage for even longer; Firefox has been an Electron-style app for 100+% of its lifetime, for example. Talk to any self-styled JS programmer for any length of time, though, and it's like these things don't even exist—like the latter was actually invented by the Electron folks, and the only thing that made JS development viable generally as a serious applications programming language is the trio of NPM+Webpack+React/React-alikes.
It's overall not worth taking their opinions at face value. They tend to be the ones who are "not following JS". They're worshipping some weird toolchain cult and calling it JS. Indeed, judging by the compulsion to work around ostensible deficiencies in the language and deal in cargo-cult advice (especially concerning things like `==`/`===` and `this`, and insisting on NodeJS-isms like its non-standard `require`), it's evident that they actually hate JS, despite what they're likely to say to the contrary.
> Firefox has been an Electron-style app for 100+% of its lifetime
That's technically correct, which is the best kind of correct, but horribly misleading.
Firefox has always been mostly written in JavaScript, but not HTML [1]. A bunch of features that are being standardized now, like Web Components [2], are pretty similar to stuff that Firefox has used in non-standard form for decades.
1. Mozilla deprecated XUL a long time ago; it's been a long time since it began the transition to favoring HTML over XUL within Firefox.
2. Even if Firefox were still 100% XUL today, it wouldn't matter. The context here is the use of JS as a general applications programming language and a signpost addressing the uninitiated who haven't been "following JS". Whether it's touching DOM nodes that are in the XUL namespace or the (X)HTML one (or whether it involves DOM nodes at all) is orthogonal. Bringing this up is the kind of change of subject that constitutes misdirection.
3. I don't know what you think the role of Web Components being like XBL plays in this conversation, but it strengthens the underlying point; it doesn't weaken it...
Overall, this is a very odd response, to be generous. More accurately, it's horribly misleading to label my comment "technically correct[...] but horribly misleading".
The point is that "JavaScript Applications" aren't just written with JavaScript, and so-called "JS frameworks" really aren't doing much with JavaScript (except, I suppose, JSX).
Mostly, they're DOM enhancers, and Firefox has always been coded against its own internally-maintained, enhanced version of the DOM.
So, again, my comments were not misleading. And these remarks don't provide any information that contradicts the claims I made. (You are, though, calling attention to what is actually misleading here—and tacitly supporting what was my argument all along, without seeming to realize it.)
You did a good job summarizing the frontend framework evolution, but I'm curious where you think the evolution in backend frameworks is going. I was thinking LiveView, but I also think WASM could come into play as well.
The three recent developments I've noticed:
- "Islands" in Deno's Fresh (https://fresh.deno.dev/) and in Remix (https://remix.run/), where only small isolated parts get hydrated instead of the whole page
- Linear.app-style (http://linear.app) data flows a la Replicache (https://replicache.dev/), where JSON data is preloaded for all sub-links and stored offline-first like a desktop app, so clicking on anything loads immediately without waiting for network requests to finish
- Now 'resumability', where the server-side framework is built with client hydration in mind and delivers the bare minimum event/DOM data necessary to make the page interactive (instead of just being a glorified HTML cache load before the usual JS activates)
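The Replicache-style data flow in the second bullet can be sketched as a read-through local store: data for every sub-link is preloaded up front, so a click reads from the store synchronously instead of waiting on the network. Everything below is illustrative; real implementations (Replicache, service workers + IndexedDB) also handle sync, eviction, and conflict resolution.

```typescript
type Fetcher = (url: string) => Promise<unknown>;

class PreloadStore {
  private cache = new Map<string, unknown>();
  constructor(private fetcher: Fetcher) {}

  // Called ahead of time for every link visible on the page.
  async preload(url: string): Promise<void> {
    if (!this.cache.has(url)) this.cache.set(url, await this.fetcher(url));
  }

  // Called on click: returns instantly when the data was preloaded.
  get(url: string): unknown | undefined {
    return this.cache.get(url);
  }
}

// Usage: preload sub-links after first paint, then clicks are cache hits.
const store = new PreloadStore(async (url) => ({ url, title: "stub record" }));
store.preload("/issue/1").then(() => {
  store.get("/issue/1"); // immediate, no network round-trip on click
});
```

The key design point is that navigation never blocks on `fetcher`; the network runs in the background, which is what makes clicking feel desktop-instant.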
For people not following JS these might all seem like constantly reinventing past lessons, but there is a logical evolution happening here towards desktop-style performance and interactivity on the web. Or pure server-side performance but with full JS interactivity.
The next set of frameworks is going to be as big an evolution as Angular/Backbone -> React/Vue was ~8 years ago. But it's going to require new backend server frameworks, not just a new client framework. There's probably a big opportunity for the project that can combine this stuff properly.
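The 'resumability' bullet can be sketched similarly: the server ships plain HTML plus a small map of element -> lazily loadable handler, and the client attaches one global delegate instead of re-running component code to hydrate. Qwik popularized this approach; the names and shapes below are invented toys, not its actual API.

```typescript
type Handler = (event: string) => string;
type HandlerLoader = () => Promise<Handler>;

// Serialized alongside the server-rendered HTML: which element needs which
// handler module. Nothing executes until the user interacts.
const handlerMap = new Map<string, HandlerLoader>([
  // In a real app this loader would be a dynamic import of a tiny JS chunk.
  ["buy-button", async () => (event: string) => `handled ${event} on buy-button`],
]);

// One global listener resolves and runs the right handler on first interaction.
async function dispatch(elementId: string, event: string): Promise<string> {
  const load = handlerMap.get(elementId);
  if (!load) return "no handler"; // purely static element: zero JS cost
  const handler = await load();   // handler code fetched only on demand
  return handler(event);
}
```

The contrast with classic hydration is that no per-component code runs at page load; the "bare minimum event/DOM data" is just the serialized map.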