It's amazing to me that no one out there seems to do server-local handling of ads, either. If you put ads directly into your page instead of relying on burdensome external systems, blocking suddenly isn't a thing anymore. Nearly all of the functionality supposedly needed for analytics and an ad-driven business model can happen server side, without the page becoming sentient and loading a billion scripts and scattered resources; the one exception is filtering out headless browsers. If external systems need to be contacted, most of that can happen before or after the page load itself. On the majority of sites, advertising and analytics are implemented in the laziest, most user-hostile way possible.
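The "server-local" idea boils down to something like this: pick the ad and count the impression while rendering the page, with no third-party script involved. A minimal sketch, assuming a hypothetical in-memory ad store and impression counter (all names here are invented for illustration):

```javascript
// Hypothetical sketch of server-local ad handling: the ad is chosen and the
// impression logged server side, at render time. No third-party JS ships
// to the client, so there is nothing for an adblocker to block by domain.
const ads = [
  { id: "a1", html: '<a href="/out/a1"><img src="/img/a1.png" alt=""></a>' },
];
const impressions = new Map(); // adId -> count, kept entirely server side

function renderPage(body) {
  const ad = ads[Math.floor(Math.random() * ads.length)];
  // The impression is counted here, not by a client-side tracking pixel:
  impressions.set(ad.id, (impressions.get(ad.id) ?? 0) + 1);
  return `<main>${body}</main><aside data-ad="${ad.id}">${ad.html}</aside>`;
}

const page = renderPage("article text");
console.log(impressions.get("a1")); // 1 impression recorded for this render
```

Clicks would be handled the same way: `/out/a1` is a first-party redirect the server can log before forwarding. Headless-browser filtering is the one piece this sketch punts on, as the comment notes.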
I don't think it's very surprising. Advertisers won't let publishers serve ads directly because that requires trusting publishers not to misrepresent stats like impressions and real views. I don't know how you'd solve that trust problem when publishers are directly incentivized to cheat advertisers.
For advertisers to trust this proxy server, the NYT cannot be the one controlling it; otherwise its integrity is worthless. So now you're asking the NYT to base their business on an advertiser-controlled server?
What happens when the proxy goes down? What happens when there are bugs? Do you think publishers can really trust advertisers to be good stewards of the publisher's business? Think for a moment about publishers that are not as big as the NYT.
Okay, suppose they do trust an advertiser-controlled proxy server. Now both the tracking scripts and the NYT's own scripts are served from the same domain, which means the cross-origin isolation that normally keeps them from tampering with each other is gone. What's stopping the NYT from serving a script that tampers with an advertiser's tracking script?
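The tampering worry is concrete: scripts on the same page share one global scope, so a later script can silently monkey-patch an earlier one. A hedged sketch of the failure mode, with all names (`adTracker`, `reportImpression`) invented for illustration:

```javascript
// Two scripts served from the same origin share the same JS global scope.
// Nothing technical stops one from rewriting the other's functions.

// "Advertiser" tracking script defines its reporting hook:
globalThis.adTracker = {
  impressions: 0,
  reportImpression() { this.impressions += 1; },
};

// "Publisher" script, loaded on the same page, silently wraps it:
const original = globalThis.adTracker.reportImpression.bind(globalThis.adTracker);
globalThis.adTracker.reportImpression = function () {
  original(); // report the real view...
  original(); // ...and then double-count it
};

globalThis.adTracker.reportImpression();
console.log(globalThis.adTracker.impressions); // 2 impressions for 1 real view
```

Cross-origin separation doesn't actually prevent this either (any script on a page shares its globals), but serving everything first-party removes even the auditability of "this request went to the advertiser's domain".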
Those are issues, but not insurmountable, especially when the benefit is "obviate any adblocker".
They can use a trusted third party to run the proxy, with industry-standard SLAs for reliability and uptime. And they can still use different subdomains with no obvious pattern (web1.nytimes.com vs web2.nytimes.com -- which one is the ad server?) or audit the scripts sent through the proxy for malice.
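Stripped down, the first-party routing trick is just a path map on the publisher's own origin: opaque local prefixes forward to ad-network upstreams, so a domain-based blocklist sees only first-party URLs. A minimal sketch, with the prefixes and upstream hosts invented for illustration:

```javascript
// Hypothetical path map for a first-party ad proxy: requests to opaque
// local prefixes are forwarded to third-party upstreams server side.
// The browser only ever sees same-origin URLs.
const upstreams = {
  "/assets/a/": "https://ads.example-network.com/",    // hypothetical ad network
  "/assets/b/": "https://stats.example-metrics.com/",  // hypothetical analytics host
};

function toUpstream(path) {
  for (const [prefix, origin] of Object.entries(upstreams)) {
    if (path.startsWith(prefix)) return origin + path.slice(prefix.length);
  }
  return null; // no match: serve the resource locally
}

console.log(toUpstream("/assets/a/banner.js")); // forwarded to the ad network
console.log(toUpstream("/css/site.css"));       // null: ordinary first-party asset
```

The audit step the comment mentions would sit in the proxy itself: the operator can inspect or hash every script body that passes through before handing it to clients.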
- It externalizes resource usage - the waste happens on users' machines. Who cares that it adds up to meaningful electricity consumption anyway?
- It makes it easier for marketing people and developers to independently work on the site. Developers can optimize, marketers can waste all that effort by including yet another third-party JS.
- It supports ad auctions and other services in the whole adtech economy.
- You don't have to do much analytics yourself, as the third party provides cute graphs ideal for presenting to management.
There used to be an open-source, self-hosted (PHP) ad application called OpenX. It worked well for quite a while, but in its later years it suffered a number of high-profile security vulnerabilities, and the open-source version was poorly maintained: OpenX (the company) was focused on its hosted solution, which had probably migrated to a different codebase, or was at least a major version ahead of the open-source one.
The open-source version has been renamed "Revive Adserver", and it looks maintained, but I don't think it's used anywhere near as widely as the OpenX of old.
If you use Revive Adserver, or design a server-local ad system in-house, it won't be as sophisticated as the giant ad providers, who can do all sorts of segmentation and analysis (producing pretty reports that execs and stakeholders love, even when that knowledge adds no value to the business).
It's because they use systems that identify the client via JS in order to deliver the most "expensive" ad possible. It's complete garbage, of course. Google and Facebook should be held liable for what they advertise, not allowed to run massive automated systems full of fraud. If Google delivers malware, they shouldn't be able to throw their hands up and go "well, Section 230!".