Here's a thought experiment. It requires some long-term thinking: outside the box, and well past recent history and the status quo.
What if the majority of internet usage is non-interactive, coming from so-called "bots": what we may refer to as "automated use"? Google and Facebook, among others, rely on automation and "bots". The non-interactive clients ("bots") these companies operate are not asked to solve captchas. (In turn, after collecting data from public sources, these websites attempt to prohibit the use of automation by users wishing to access that data. What is interesting is that neither company provides any definition of "automated", nor any clearly stated limits on the speed at which a user may access resources or the quantity of resources a user may access in a stated time period. One is more apt to find such limits associated with an "API".)
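For illustration, the kind of explicit limit the parenthetical alludes to is sometimes communicated by APIs through response headers. A minimal sketch, assuming the common (but non-standard) "Retry-After" and "X-RateLimit-*" header conventions; real services vary, and neither company above documents such limits for ordinary web access:

```python
import time

# Sketch: a polite automated client interpreting conventional rate-limit
# headers. "X-RateLimit-Remaining", "X-RateLimit-Reset" and "Retry-After"
# are widespread conventions, not a universal standard.

def seconds_to_wait(status: int, headers: dict) -> float:
    """Return how long an automated client should pause before retrying."""
    if status == 429:  # HTTP "Too Many Requests"
        return float(headers.get("Retry-After", 60))
    remaining = int(headers.get("X-RateLimit-Remaining", 1))
    if remaining == 0:
        # In this convention, the reset time is a Unix timestamp.
        return max(0.0, float(headers["X-RateLimit-Reset"]) - time.time())
    return 0.0
```

The point of such headers is precisely that they give automation a clearly stated limit, which ordinary websites rarely do.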
In 2013, an Incapsula report suggested that the majority of internet traffic is in fact automated, and mostly not "malicious".^1 What if public information sources on the internet catered to automated use rather than trying to limit it with speed bumps^2 like captchas? What if servers treated all clients equally, instead of data being forcibly collected by a few large clients that receive preferential treatment, then siloed and protected from "automation"? What effects would this have on "centralisation" and on levelling the playing field?
"Do not ask for permission; ask for forgiveness." What does that really mean when applied to the internet? Perhaps it means there is an endemic lack of clarity about "the rules". Prohibiting "automation" is far too vague, and in many cases it makes no sense: the growth of computers and the internet is the growth of automation. Both servers and clients may have legitimate concerns about resource utilisation, yet websites do not ask for permission when they decide to use large amounts of a user's computer resources.
Consider that a company like Google could not exist without being "given permission" to use automation. Does the GoogleBot have to solve captchas? No automation would mean no such company could exist. How useful would the web be if no one could use automation to create an index? Based on the HN comments about web search I have read over the years, I would guess that for many commenters the usefulness of the web would be dramatically reduced.
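There is already one widely honoured mechanism by which sites declare their rules for automated clients: robots.txt, the convention that crawlers like GoogleBot follow. A minimal sketch using Python's standard library (the bot name and paths here are made up for illustration):

```python
# Sketch: checking a site's declared rules for automated clients via
# robots.txt, parsed with the standard library. "MyBot" and the paths
# are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Crawl-delay: 10
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("MyBot", "/public/page.html"))   # True
print(parser.can_fetch("MyBot", "/private/data.html"))  # False
print(parser.crawl_delay("MyBot"))                      # 10
```

Note what robots.txt can express that "no automation" policies cannot: a concrete speed limit (Crawl-delay) and concrete boundaries (Disallow), stated up front for any client to read.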
Imagine an automation-friendly internet. The truth is, I think (and the data suggests) we already have one; we are simply in denial that "the rules" actually allow it. An early metaphor for internet and web use was "surfing". It may be that those constantly fighting against automation are fighting the waves instead of riding them. Time will tell. It stands to reason, IMO, that every internet user, whether server or client, should be expected to use automation.
1. https://www.incapsula.com/blog/bot-traffic-report-2013.html
2. An early metaphor for the internet was a "superhighway". Speed bumps would seem out of place on a superhighway.