While I agree with your point, I wonder why that does not seem to be the case with programming languages.
For example, in iOS development (and, more in general, on Apple platforms), there has been a huge shift from Objective-C to Swift.
The same arguments should apply there. Swift is much better, but Objective-C got the work done, and many codebases were written in it, especially at Apple. And yet, the whole community switched pretty quickly.
One could argue that Swift was easier to pick up for newcomers. While that's true, I would then expect the argument to apply also to SQL alternatives.
I don't think this is a great example. The Objective-C-to-Swift transition went smoothly only because of Apple's almost total control of the ecosystem (I mean this positively).
Apple is automatically the loudest voice in the room for iOS development. If they embrace Swift, the writing is on the wall for Objective-C. It's not just sticks; I'm sure they also put a lot of effort into making the transition as easy as possible.
In SQL, there is no equivalent. I think a better example is x86.
It took a generationally new form of computing (mobile) to give ARM the momentum to seriously challenge x86. It's been almost 15 years since the original iPhone, and we're only now seeing ARM-based processors in computers.
SQL IMO is EVEN harder to displace than x86. x86 has a massive ecosystem but only two serious manufacturers. SQL has a similarly massive ecosystem AND is the de facto language of the vast majority of major databases. It's going to be very hard to unseat.
> It's been almost 15 years since the original iPhone, and we're only now seeing ARM-based processors in computers.
Are we just ignoring the Acorn Archimedes series of computers which gave rise to the ARM processors in the first place, fully 20 years before the first iPhone was launched?
No, we’re factoring in the market impact of the Acorn Archimedes on the wider desktop computing ecosystem compared to the impact we’re seeing now from ARM on the server and from M1-based desktop and laptop machines.
There was a strong push for "NoSQL" about a decade ago, but it got marred by document databases trying to usurp relational databases around the same time. When people realized they chose the wrong tool for the job (that is, the document database), they were happy to return to their relational databases using SQL. That completely killed any momentum that had been built to replace SQL with different languages.
I don't think that NoSQL ever was about better query languages for relational databases. It was about making databases faster and easier to use by using simpler data models with fewer guarantees.
The author seems to think that putting structured data in columns is a good idea. That is pretty clearly contrary to the basics of the relational model itself, never mind SQL. In fact it's quite close to how document databases work, so a very NoSQLish proposal overall.
>That is pretty clearly contrary to the basics of the relational model itself
The consensus among relational theorists appears to be that data types can be arbitrarily complex. For instance, C. J. Date and Hugh Darwen write the following ([1], page 56):
Third, we remind you that types are not limited to simple things like integers. Indeed, we saw in Chapter 1 that values and variables can be arbitrarily complex—and that is so precisely because the types of those values and variables can be arbitrarily complex. Thus, to paraphrase a remark from that chapter, a type might consist of geometric points, or polygons, or X rays, or XML documents, or fingerprints, or arrays, or stacks, or lists, or relations (and on and on).
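For what it's worth, most mainstream SQL engines already support this to some degree. Here is a minimal sketch in Python, assuming a SQLite build that includes the JSON1 functions (the table and field names are invented for illustration):

```python
import json
import sqlite3

# In-memory database; json_extract() is available in modern SQLite builds.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sensors (id INTEGER PRIMARY KEY, reading TEXT)")

# Store a structured value (a JSON document) in a single column.
conn.execute(
    "INSERT INTO sensors (id, reading) VALUES (?, ?)",
    (1, json.dumps({"temp_c": 21.5, "tags": ["indoor", "calibrated"]})),
)

# Query inside the structured value without decomposing it into extra tables.
(temp,) = conn.execute(
    "SELECT json_extract(reading, '$.temp_c') FROM sensors WHERE id = 1"
).fetchone()
print(temp)  # 21.5
```

PostgreSQL goes further, with native array, composite, and jsonb column types.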
Ironically, at least in the case of Objective-C -> Swift, the answer is almost certainly "because those languages compose with each other reasonably well". I say ironically because I believe most of this article boils down to how uncomposable SQL is.
I think the same can be said of Java -> Kotlin, C -> Python (I know, I know), and lots of other medium-to-large scale language migrations over the past several decades. When people move to a new language, it's because there's strong interoperability with what came before that everyone would like to quit using but can't because they have too much invested in it.
This suggests to me that anything that wants to beat SQL will in fact have to compose with it - probably partly by generating it, but also by having a fairly solid "drop down to SQL" story. In other words, a language that, at least on the read-side, can somehow take two separate SQL queries and automatically rewrite them as subparts of a different SQL query. It might not be fast, but it needs to work, because you're going to want to reuse that work on that one gnarly query you did that gets all the important business metrics, and you also are going to want your results to be free of consistency issues.
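As a toy illustration of that read-side composition idea, here is a sketch that splices two independently written queries into one statement via a CTE, using Python's sqlite3. The table, the query strings, and the naive string splicing are all assumptions for illustration; a real tool would parse and rewrite the SQL rather than concatenate it.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'EU', 10.0), (2, 'EU', 30.0), (3, 'US', 5.0);
""")

# Two independently written queries we'd like to reuse together.
eu_orders = "SELECT * FROM orders WHERE region = 'EU'"
totals = "SELECT SUM(amount) AS total FROM eu"  # 'eu' is a name we control

# Naive composition: wrap the first query in a CTE so the second builds on it.
composed = f"WITH eu AS ({eu_orders}) {totals}"
(total,) = conn.execute(composed).fetchone()
print(total)  # 40.0
```

The point of the sketch is that the reused query stays a black box: the composing layer never needs to understand its internals, only to give its result set a name.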
> ”And yet, the whole community switched pretty quickly.”
This is only true for hobbyists and indie/startup developers. The largest and most popular iOS apps remain Obj-C (with a lot of C++ and in-house frameworks in the mix). There’s no incentive to rewrite something like the Facebook app in Swift.
Maybe the Objective-C/Swift difference is great enough to justify such a change, while the query languages have not evolved enough to do so. Just wondering. Maybe ask the Datomic/Datalog folks.
I don’t know if that’s really true these days. Each SQL server has its own dialect with nuances you have to pick up sooner or later, and if you’re a backend engineer for more than a couple years you’ve likely dealt with multiple types of data stores.
> For example, in iOS development (and, more in general, on Apple platforms), there has been a huge shift from Objective-C to Swift.
> The same arguments should apply there. Swift is much better, but Objective-C got the work done, and many codebases were written in it, especially at Apple. And yet, the whole community switched pretty quickly.
> One could argue that Swift was easier to pick up for newcomers. While that's true, I would then expect the argument to apply also to SQL alternatives.
So, what is the difference here?