Hacker News

Counter-example: Soldier of Fortune is broken on modern Windows because of a misapplied compatibility hack. Rename the binary and it works with no problems.

This is an awful way to implement backwards compatibility: opaque and ad hoc. And they have used a similar toolset to break competitors' applications.

The way to choose which old version of Windows to emulate for a given program is typically to try them one by one.

Linux is no better with no stable ABI. Mac is a mixed bag of excellent Rosetta and breaking apps for no reason. Who did it better? FreeBSD? Some extinct “grown-up” OS like VMS?



> Mac is a mixed bag of excellent Rosetta and breaking apps for no reason.

They will probably retire Rosetta 2 in a few years, like they did with Rosetta.

Apple usually seems to care about getting the bulk of applications to transition over, and the rest is just collateral damage/the devs should’ve just updated their software.


> They will probably retire Rosetta 2 in a few years, like they did with Rosetta.

Counterpoint: The PPC-to-Intel version of Rosetta was licensed technology (QuickTransit); Apple was undoubtedly paying for it, possibly even per user, so there were financial reasons for them to get users off of it ASAP.

Rosetta 2 was developed in-house by Apple, so there isn't the same hard timeline to get users to stop using it. I wouldn't expect it to survive long beyond support for running the OS on x86, though.


It could last longer if gaming with their game porting toolkit gets big enough to drive more Mac sales. Money talks.


That's a long shot. I wouldn't hold my breath waiting for the day when gaming on the Mac is a real choice for big games.


Yes, just like they had valid technical reasons to kill 32-bit iOS apps. The point is that they don’t go above and beyond like Microsoft (although of course even MS has deprecated e.g. 16-bit apps).


On the bright side, the end result is that on Mac there are only apps that have been updated this decade.


Is the lack of an app a good thing?


It can be because it’s an incentive to create a new, up-to-date app.


This is basically the defense of bad deeds: bad deeds inspire others to do good deeds.


No, the absence of a bad thing creates room for a good thing.

Because an old, out of date application is not available, there is a viable market for a new, up-to-date application that serves the same purpose.


Gee, isn't it great when modern businesses don't have to compete with decades-old software and can sell you their crapware for whatever price they want because there is no alternative.


The general idea behind capitalism is that the market provides alternatives. Why would there only be evil giant corporations whose applications are so bad that they would be trumped by those great ancient, obsolete applications? There is also room for small vendors to provide innovative new alternatives, and for the open source community to try their hand.

The reality is that on the Apple platforms these ancient, obsolete applications are not available, and instead there are new, modern, better applications because there is a market for them. Meanwhile, on the Windows platform it’s a big, inconsistent, insecure mess because everyone is clinging to obsolete, unsupported software that is barely good enough.

By the way, keep pressing that button you think is the disagree button!


Is incentivizing churn a good thing?


Is it really fair to characterize replacing a 30-year-old app as churn?


Not if the developer isn’t around anymore.


In a world with only one developer, yes. In the real world, where other developers can create new apps that do the same thing better, more securely, more easily, or in a more modern way, no.


Also in the real world, where (for any number of reasons) users sometimes prefer older apps regardless, yes.


And sometimes users do not get what they prefer and they have to make do with what they get. In the Microsoft world and in the Apple world. Tough luck.

It’s not worth living in the past because some hypothetical users want to cling to it. It’s worth promoting innovation, because innovation means replacing old things with better new things.


> I wouldn't expect it to survive long beyond support for running the OS on x86

Even if support for running x86 Mac GUI apps goes away along with x86 macOS, they might still keep the technology around for Docker, Linux VMs, etc.


And this is why Apple will never be a serious gaming platform for non-exploitative/GaaS games. Personally I think it's good that I can run games that were last updated in the early 2010s on my computer.


Definitely. All Intel Mac apps will be abandoned. Even tiny apps like Spectacle will cause pain.


I've found Rectangle to be a good substitute / drop-in replacement.


Thanks, it actually makes sense to switch. And Spectacle is even open source, so the amount of pain is minimal.


> Linux is no better with no stable ABI.

I’m confused. Linus has repeatedly stated that the ABI should be stable: “we don't break user space”. There are exceptions, but any proposal that makes a breaking change to the kernel’s userspace-facing interface is very hard to push through.

I don’t remember anything breaking because of a new kernel version except device drivers, which are part of the kernel anyway and should be compiled for a specific kernel version. They are not applications, so they shouldn’t rely on assumptions about the ABI.

Most Linux distros offer mechanisms to compile a version-dependent shim that isolates a version-independent driver or program that interacts too closely with the kernel.
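One such mechanism is DKMS, which most distros ship: a small shell-syntax config file describes an out-of-tree module, and DKMS rebuilds it against every installed kernel. A sketch, where the module name and version are hypothetical:

```sh
# /usr/src/example-driver-1.0/dkms.conf -- "example-driver" is a made-up name
PACKAGE_NAME="example-driver"
PACKAGE_VERSION="1.0"
BUILT_MODULE_NAME[0]="example_driver"
DEST_MODULE_LOCATION[0]="/kernel/drivers/misc"
AUTOINSTALL="yes"   # rebuild automatically when a new kernel is installed
```

After `dkms add example-driver/1.0`, DKMS recompiles the module whenever the kernel changes, so the driver source stays version-independent while the compiled module is version-specific.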

> Some extinct “grown-up” OS like VMS?

I’d say the age of binary compatibility ended with most of those “grown-up” OSs becoming legacy. I usually test (compile and test) my C code on multiple platforms, ranging from Windows to Solaris on SPARC (emulated these days, sadly). I haven’t yet figured out a cost-effective way to test it under IBM z/OS’s USS (which makes z/OS a certified UNIX).


The kernel userspace APIs are pretty stable; the APIs provided by the rest of what constitutes a complete Linux "operating system" are not. I've ended up using a lot of hacks and tricks to get some early Linux games running on modern systems. Some applications designed for X still have broken features under Wayland, and they likely won't be fixed without new versions of said apps, because accommodating them would break Wayland's entire security model.

It's generally not a huge issue on Linux, because most of the software you use day to day is open source and probably maintained. The real problem children, across all operating systems, are proprietary video games. They're closed source, unmaintained, and their cultural value makes them highly desired targets for ongoing compatibility and preservation.


> The kernel userspace APIs are pretty stable, the APIs provided by the rest of what constitutes a complete Linux "operating system" are not.

There are plenty of userspace ABIs that are extremely stable, including whatever you need to run a game (like the C runtime). There are also libraries without stability guarantees (like the C++ standard runtime), and a lot of games that no longer work depend on some of those. There are also ABI bugs; no compatibility is perfect, but those usually do get fixed when found, unless doing so would break more programs.
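The C runtime's stability is easy to see in practice: a symbol like `getpid` has kept the same binary interface for decades. A minimal sketch (assuming a glibc-based Linux system) that calls the C library directly through `ctypes` and compares it with Python's own view:

```python
import ctypes
import os

# Load the process's C library (glibc on most Linux systems).
libc = ctypes.CDLL(None)

# getpid() has kept the same binary interface for decades; a program
# linked against it long ago still resolves the same stable symbol today.
libc.getpid.restype = ctypes.c_int
pid = libc.getpid()

assert pid == os.getpid()
print("libc getpid matches os.getpid():", pid)
```

The same experiment with a C++ standard-library symbol would be far more fragile, since libstdc++ ABI details have changed across major releases.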

> Some applications designed for X still have broken features in Wayland, and likely won't be fixed without new versions of said apps because making Wayland compatible would break the entire security model.

That's not a long-term compatibility problem; it's a problem of using a zero-trust mobile-phone security model on a desktop. That security model should be broken and moved to /dev/null where it belongs.

But really, at some point you are going to need compatibility layers anyway. We already have Wine with great support for old Windows apps; there is nothing preventing something similar for emulating legacy Linux on modern Linux, except a lack of interest, because there really aren't that many Linux-only legacy applications.


> The real problem children, across all operating systems, is proprietary video games.

They can be distributed packaged together with their requirements (Wayland vs. X would still be an issue), and things like snaps or Flatpaks solve that.


> Linus has repeatedly stated that the ABI should be stable, “we don't break user space”.

Linus said that the userspace API to the kernel should be stable, which it mostly is. But a GNU/Linux system contains a lot more APIs (in userspace).


Drivers being kernel-specific is really annoying.


All they have to do is upstream their drivers into the kernel instead of shipping proprietary blobs. Why is this so hard?


Because the Linux kernel has stringent guidelines for how a driver should be written and how it should work. Companies don't want to put in the work (i.e. pay someone with experience) to upstream their drivers. Whether this makes sense monetarily isn't really relevant to many decision makers. At least, that's how I explain why Nvidia and Broadcom don't upstream their drivers.


> Because the Linux kernel has stringent guidelines for how a driver should be written and how it should work

Preventing poorly written software from being added to the kernel is a Good Thing. The system is working as planned.


That's a major problem with Android: you're stuck with whatever version of Android you have, due to binary-blob drivers.


The major problem is Google not enforcing updates via their contracts.

Project Treble has made Android a pseudo-microkernel with a stable ABI for drivers.

However, Google has decided it is still up to OEMs whether drivers get shipped or not.

With no legal enforcement tied to access to Google services, OEMs would rather sell updated hardware.


The obvious problem seems to be proprietary device drivers, no? If they didn't shoot themselves in the foot with their licensing, the drivers would work with any kernel version.


> They have used a similar toolset to break competitors' applications

Source(s) ?


This, for example. And they got caught. How many times did they not get caught?

https://www.theregister.com/1999/11/05/how_ms_played_the_inc...


Your evidence is an article from 24 years ago about behavior that happened 32 years ago? And it's not even about them breaking competitors' applications; it's about them refusing to run on a competing OS (in a bit of a sleazy way).

Do you have more evidence of your claimed behavior?

I dislike MSFT, a lot, but that's a _very_ big claim and needs to be backed up with evidence.


My claim is that Microsoft's operating system silently detected competitors' software and changed behavior to break compatibility. That is proven. The war on WordPerfect was equally shady.

Did Microsoft clean up its act at some point and stop doing so? They force Edge at every opportunity, so even the behavior that almost got them forcibly broken up is back.

I don’t think we have caught them outright sabotaging e.g. Chrome, aside from the default-browser shenanigans, but who would bother to check unless it’s a repeatable crash? Aside from Chrome, what app do they even have a need to sabotage? Steam?


> My claim is that Microsoft's operating system silently detected competitors' software and changed behavior to break compatibility. That is proven

That's a false claim. That code was in a beta and never shipped. You're just spreading FUD.


The code was shipped, just disabled.


Plus, let's be real, the Register is basically the tech equivalent of the Daily Mail. Often amusing, occasionally accurate.


Sure, but the DR DOS case is well known. Is ZDNet better? It also lists other cases related to efforts to destroy Novell.

https://www.zdnet.com/article/caldera-unlocks-microsoft-evid...


Never expected someone would bring up the SOF issue on modern Windows. PCGamingWiki FTW.


Apple dumped 32-bit support, which sealed macOS's fate as a gaming platform.



