The rough heuristic is: the longer a technology has been around, the longer it will last into the future.
That is, it's NOT the case that every piece of technology lasts roughly the same amount of time, and then is replaced.
For example, a chair vs. an iPhone. Which one will be used further into the future? Almost certainly a chair.
----
Land lines have been around a lot longer than fax machines (both in the article), so they will likely outlive fax machines.
Will HTML or JavaScript last longer? Probably HTML, since it came first.
What about ASCII or HTML? Probably ASCII.
There's a "dependency stack" issue here, but the heuristic applies regardless. And I think that is part of the phenomenon -- low-level technologies that are widely adopted take longer to go away. Plumbing, electrical (AC vs DC), land lines, TCP/IP, BIOS, etc.
I can't find a link now, but there was a recent Kevin Kelly article about finding farming tools from a catalog from the 1800s still in use. I think he said that almost EVERY one was still in use, or still able to be purchased, which is a nice illustration of the point. It takes a long time for old tech to die, and arguably it never does.
They were sending drawings through telegram lines for newspapers during the American Civil War. Thus, the fax is older than the phone. But in terms of general population use, of course most people encountered a phone before they encountered a fax.
Telegrams were always expensive but the price now is outrageous. I followed the link and chose Norway as the destination country and was told that it would cost GBP 22 + 67p/word! Even worse for Mexico: GBP53 + 67p/word.
That doesn't sound like fax. Or rather, if you include this and fax together, you'd have to describe it as a generic technology that allows sending images remotely. Which may actually survive the telephone.
Backing up the original point of the parent post's spirit, I actually see fax lasting longer. It's special cased in a lot of regulatory structures as 'secure' and has quite a bit more use than the HN crowd might think.
Fax is the cheapest way to get a forecast if you sail offshore, through a service called "weather fax". The alternatives work through satellite with hefty monthly subscriptions. That service is the reason I bought yet another Raspberry Pi with a software-defined radio module.
It kind of makes you feel ridiculous about the amount of work we put into securing even systems that are in no way critical, when the world runs on unsecured mailboxes, homes that can be broken into by anyone willing to kick hard enough, Social Security cards with no security features whatsoever, funds transfers that require only your account and routing number to withdraw, and so on.
It's not that ridiculous. Securing stuff on computers is more important due to the scale at which attacks can occur. On the internet, you can gather information on millions of users in little time and without putting yourself in a place where you could easily be caught. And for a machine that isn't connected to the internet, it can still give you far more data than you could get by stealing paper that takes up the same amount of physical space.
I'd say most of the parent's comment is applicable at the same scale. People can abuse your social security number or bank account and routing numbers across the Internet. Phone numbers being relatively easy to spoof and/or hijack makes fax trivial to mess with internationally as well.
I think you are understating the extent to which mailboxes and homes are secured by men with guns. Sure; there’s very little preventing someone from kicking their way into someone’s house, but if someone did so, they would very likely end up in a jail cell or even, in some places, dead.
Unfortunately, you can’t physically harm people over the internet, so different security measures must be taken.
Your odds are very good stealing from my mailbox and not that bad burglarizing my house. Maybe you have much more vigilant police where you live. What’s more, unauthorized access to computer systems is also a crime.
The odds of getting caught robbing a single mailbox are not high, but if you do it on a large scale, it starts to get pretty risky.
The odds of getting caught attacking a Russian computer from the US or attacking a US computer from Russia are essentially zero for any scale small enough to not have major foreign policy implications.
Homes are massively protected by people's goodwill, not by the judicial system. In most of Europe, and more of the US than you'd suspect, if you call the police they come more than 30 minutes later (especially in France, where they have no right to use their guns and won't risk their own safety), and if you "handle the matter yourself", you are in just as much trouble as the thief, particularly in the US, where you still have to use a lawyer to prove your innocence. No, what really keeps houses from being broken into is mostly that people don't do it (hence the value of a society where people earn enough not to be willing to risk a physical fight).
Most forms of security rely on the goodwill of most people in society; the entire problem consists of handling the minority of people who lack goodwill. These people exist in every society.
Wow, given the answers to that thread, it seems that the HN crowd either has no notion of conditional probabilities or is letting its “future is better” mindset cloud its rational judgement.
The Lindy effect makes perfect sense as a heuristic for evaluating the remaining lifetime of technologies and species at a given point in time. It's trivial to prove that it works for anything whose survival curve is convex.
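To spell that out for the canonical convex case, a Pareto survival curve (my pick as an illustration, not something from the article): if a lifetime T satisfies

    S(t) = P(T > t) = (c/t)^{\alpha}, \qquad t \ge c,\ \alpha > 1

then the expected remaining lifetime, given survival to age t, is

    E[T - t \mid T > t] = \int_t^{\infty} \left(\frac{t}{x}\right)^{\alpha} dx = \frac{t}{\alpha - 1}

which grows linearly with the age t you've already observed -- exactly the Lindy effect.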
Despite its apparent love for rationality (which I enjoy), it seems like a large part of HN is just as subjective as anyone and unable to accept things that don’t fit their mental model.
This field also seems to attract (or foster) a particular mindset that we might call "I am smart and hyper-rational", which then adds an additional set of blinkers and an enhanced ability to discard and ignore evidence, and an enhanced faith in one's own powers of deduction.
All people have that skill, of course, but this field seems to have more than its fair share of such.
> Wow, given the answers to that thread, it seems that the HN crowd either has no notion of conditional probabilities or is letting its “future is better” mindset cloud its rational judgement.
I mean it's a tech entrepreneurial news aggregator on a tech incubator's website.
Their whole raison d'etre is that "future is better buy my technology so I can get $$$" -- that's why this site exists.
Is it that it doesn't fit the mental model, or is it failing to read the room?
> C programming language will be 50 years old next year.
This principle implies the C programming language will outlive Rust, Go, Python, etc. I'm not sure how I feel about that. There's a good chance it's right, but I'm uncomfortable imagining my grandchildren learning C's weird quirks.
There's definitely codebases written in C today which will outlive me, like the Linux kernel. And most of those codebases will probably never migrate to a different language.
I think C will outlive those languages, in the sense that it will exist. But it doesn't necessarily follow that your grandchildren will have to learn it. There will be some other language like JS or foobar 2050 that's just way more relevant and popular :) But they certainly COULD if they wanted to.
The question of whether a technology simply exists vs. whether it retains its popularity is an interesting one. Chairs and forks will both exist and be popular. iPhones will exist, but they won't be popular on some time scale. So it is interesting to think whether C is closer to one or the other :)
While C is not perfect, it is in a sense something that most other programming languages are not. It's simple and if you program in C you probably touch upon the entirety of the language on a regular basis. Because it is simple you can write a compiler for it with some reasonable effort if need be, even for completely new architectures.
In any case, I'd argue that most of the quirks are due to undefined behaviour as implemented in the compilers. And 50 years from now the language will probably have evolved some more anyway.
I'm more worried that my grandchildren will still be learning/using QWERTY than that they'll have to learn C's weird quirks :). My opinion is that it's easier to get a group of tech people together and gradually replace C (like Android Rust development) than to get people to replace QWERTY or get the US to move to the metric system.
Arguments could be made that it's inefficient or un-ergonomic, though I'm doubtful how much difference it makes on a modern keyboard. There's an oft-repeated story about the qwerty layout being designed to keep people from typing fast enough to jam typewriters. I can't speak to the legitimacy of the story, but I do collect typewriters and I can say for sure that there's a lot more pressure and key travel (and time!) required to slam a type bar into the ribbon than is needed to press a modern keyboard key.
I learned Dvorak a decade ago and loved it unless someone else had to use my computer or I had to use a shared computer. I'm typing this on a qwerty keyboard and I'm okay with that now.
> There's an oft-repeated story about the qwerty layout being designed to keep people from typing fast enough to jam typewriters.
Typewriter layouts are designed to avoid jamming, but not by reducing typing speed. They place letters that are pressed in direct succession apart, so that the levers don’t collide. Trained typists can write pretty damn fast on a typewriter.
I thought I remembered something about that, thanks. Dvorak does seem to cluster a lot of commonly used letters, which would definitely jam neighboring type bars pretty quickly unless they used a Blickensderfer-[1] or Selectric-style unified type element.
I've always wanted a Blick, their "scientific" DHIATENSOR layout intrigues me. They also made a qwerty version and I seem to remember a story that their salespeople would make you sign a waiver that you were choosing a less efficient keyboard if you bought one. That's certainly 90% marketing for their layout and maybe 10% fact, but I find it amusing (if true, but I can't seem to find a reference to it any more).
I'll bet typewriter-trained typists can really move. It's definitely a separate skill, at least for the manual typewriters in my collection which have about an inch of key travel and a heck of a lot more actuation force than the gateron red switches I use at work. :-)
Ultimately I find that the fastest layout is the one you know.
> They place letters that are pressed in direct succession apart,
... and that really reduces typing speed. When I switched to Dvorak I was amazed how my fingers just roll on the keyboard, forming sentences faster than I even form a conscious understanding of which order to move my fingers in. It just happens as if by itself! Just think of a word to type and it is already on the screen.
That may be the effect for you, but it's not a goal. It's maybe a tradeoff, but typing on a typewriter is a trainable craft. Usual results for trained typists are about 200-400 characters per minute measured over 10 minutes; championship results go up to 900 characters.
There are plenty of unofficial records and leaderboards for typing speed where QWERTY typists are just as capable as Dvorak typists of hitting 200+ wpm (more so, even, due to sheer availability). While that alone shouldn't be used to judge the quality of a particular layout, it shouldn't be ignored that there's no difference in attainable max speeds in short bursts. That says nothing about sustained typing, of course, but I strongly suspect that if you're already a competent and fast typist in QWERTY you'll be comparably fast in Dvorak, and vice versa.
Dvorak is actually pretty bad as a smartphone keyboard; its advantages with two-handed typing on a real keyboard become disadvantages if you type with your thumb.
Dvorak has specialised right- and left-handed layouts. Maybe it also needs specialised one finger layouts, too.
I had wondered about phone keyboards. Part of the reason I switched back was that I couldn't change the layout on my phone and got tired of switching back and forth. (I know there are probably some options now, but there weren't on iPhone 4, Kindle touch, or feature phones with physical keyboards.)
I can imagine you'd need to use both thumbs extensively or you'd be jumping back and forth. I definitely found that when I was laying on my side on the couch, propped up with one arm, qwerty was easier to use with one hand. Not that that's an ergonomic or sustainable way to type, but I was a lazy college student and I didn't care.
> I'm uncomfortable imagining my grandchildren learning C's weird quirks
What about them learning English's even weirder quirks? Does that make you uncomfortable as well?
I have come to the point that I sort of accept C's quirks, and, with the appropriate mindset, I can even love them. I certainly see myself as an old man teaching these arcane quirks to a young and innocent audience.
Sure, but I think programmers are the implicit assumption here when talking about which programming languages to learn. We don't need to start splitting hairs.
And COBOL is extremely niche, while C is still quite common and useful to know.
Isn't that the whole worse is better thing in action? People complained about C and Unix when they were new, but they picked up so much traction, and people invested enough effort into working around the flaws, that it just achieved critical mass. Same thing with JavaScript; it's never going anywhere.
Certainly higher-level languages have edged out C in many domains though.
> Laptops (since first Apple Powerbook) - 30 years.
That kind of Apple revisionism is not correct. The NEC UltraLite predated Apple clamshells by 3 years (1988), and luggables with a battery configuration were available as early as 1981 (the Osborne 1 with an aftermarket 1-hour battery add-on).
The Grid Compass was probably the first true laptop, in 1982. But it ran a proprietary operating system by default and cost about $8,000 in 1982 money. The Data General/One came out in 1984 and might be a better candidate. But neither of those was really mainstream.
“The TRS-80 Model 100 is a portable computer introduced in 1983. It is one of the first notebook-style computers, featuring a keyboard and liquid crystal display, in a battery-powered package roughly the size and shape of a notepad or large book.”
Nor was the PowerBook the first laptop, nor the IBM PC the first PC. All of them are, however, arguably responsible for popularizing the technologies in something resembling their current form.
How is that inconsistent with the parent statement? Even if the form factors were inevitable--e.g. someone would have decided that smartphones didn't need a physical keyboard--someone had to be first to popularize. (Not sure I agree on the laptop but the iPhone pretty clearly popularized the modern smartphone form factor.)
To say they are "responsible" for something can be read as "allowed to take credit for". Being the first to popularize something is a business achievement, which is quite meaningless compared to the technical achievement of being the first to build something.
Something that's a mainstream business success is far more interesting in general than progenitors that never really took off for whatever reason. They may still be important as technical achievements, but the history books notice those who took things mainstream.
James Watt didn't actually invent the steam engine. He just came up with an invention that made it a lot more efficient.
I can see this in the case of the PC and the laptop, but proto-smartphones were around for years before the iPhone and they were not developing in an iPhone-like direction. Every iPhone-like product that came after was the result of copying. I don't think it was inevitable at all.
One thing I do think is true is that Apple developed sufficient brand permission to be able to do things that were definitely outside of a lot of buyers' comfort zone and arguably needed some iteration to really nail.
I'm still inclined to think someone would have jumped to a smartphone without a keyboard. But it's also true that the iPhone had plenty of critics early on and arguably didn't fully hit its stride until the 3GS.
Blackberry, Palm, and Windows Mobile phones were more widespread than you're giving them credit for, as were laptops considerably preceding Apple's PowerBook series.
The mind-blowing part about these kinds of numbers, for me, is always the sheer number of smartphones out there.
I mean, imagine a parallel world where those smartphones weren't designed to shove ads down your throat, where they could be used as productively as a laptop, and where people could use them to automate their own lives.
Smartphones are a huge productivity tool, that's why they took off in the first place. Especially Blackberry, which offered the magic technology of accessing your email and calendar from anywhere. The ads are not an obstacle to this, especially not on iPhone.
They are far more than that. For a large number of people, the smartphone is their first and only computing device. Enabling internet access is like rocket fuel for advancing socioeconomic conditions for those in developing nations.
Entire generations have been lifted from poverty due to it.
I don't know about you, but a consumption-only "first and only computing device" sounds incredibly dystopian. I get that they have a positive impact, but it feels like we could be doing better.
Doing anything but consumption and lightweight content discovery on smartphones is basically a farce. They are not usable as general-purpose creation machines.
> Doing anything but consumption and lightweight content discovery on smartphones is basically a farce.
I disagree, even when talking about a smartphone as a human interface device and not, as was the actual context of the thread, a computing device.
Having a, say, DeX-enabled Android smartphone (or even one running fairly traditional Linux, if one doesn't just use a major maker's device with stock software) as a computing device doesn't preclude using standard desktop I/O devices to interact with it in a manner very similar to a standard desktop PC.
> They are not usable as general-purpose creation machines.
Again, as an exclusive computing device, they are just about as capable as any computing device.
Even as an exclusive HID, there’s a giant excluded middle between “consumption only” and “not usable as general purpose creation machines”.
What makes a smartphone a smartphone is the ability to use it with just the touchscreen. A smartphone with desktop peripherals is just that, a desktop. What I am arguing is that a bare smartphone (aka what most of those people in the developing world can afford) cannot meaningfully serve, due to being a form factor with a very imprecise input method, as a creation platform for any sort of precise content.
> What makes a smartphone a smartphone is the ability to use it with just the touchscreen
You seem to be conflating the ability to be used with a limited interface with a restriction to being used with only that interface. A desktop PC has the ability to be used without a graphics tablet; it would be a much more limited creation device if it were restricted to being used without one. The same goes for smartphones and everything that isn't their touchscreen, microphone, or other built-in human interface. A smartphone as an exclusive computing device does not imply the limitations of its built-in interfaces, since having external interfaces is also a standard feature of the class.
While I agree with your definitions of what a computation device is, I also don't see Android being developed on Android in the near future. So many UX conceptual problems won't allow this, so I think that as long as computers are required to build smartphones, they cannot be described as "general purpose" machines.
That world can't exist. A flaw of anti-capitalist alternatives is that they're non-natural.
Any alternative where trade doesn't follow the optimal path is due to regulation/force. And in most of them, the force required to steer humans away from their nature also kills innovation.
Ergo: you can't have a miracle chip in your pocket without someone using it to sell you potato chips.
1. Nobody mentioned capitalism or alternatives in the parent comment.
2. Capitalism is not "natural" either and only started developing in the post-renaissance world.
3. Natural selection demonstrates conclusively that relying on purely natural processes to drive "innovation" often leads to highly non-optimal and harmful trends.
4. Most of the major scientific and technological innovations of the 20th century were either the direct result of or funded by "force", which I take to mean government.
5. Advertising was not the only monetisation strategy that the internet and telecommunications industry could have taken. The internet could very easily have gone down the subscription route, and the only reason anybody thinks otherwise is the decades of marketing that have normalised getting everything for "free".
It really depends on how you define laptop. There was this sort of thing in the 80s, (https://en.wikipedia.org/wiki/Toshiba_T1100), of course. And there were a couple of laptop-shaped laptops from 1988 on, but they were either spectacularly compromised (the NEC Ultralite had a max of 2MB of storage, for instance) or spectacularly expensive or both. In terms of laptops that were shaped like laptops as we know them, and that were actually usable outside of very specialized applications, the Powerbook 100 and Thinkpad 700T could be reasonably claimed to be about the first.
Every time someone brings up the Lindy effect I can't help but roll my eyes. It should be replaced with "survivorship bias". Tautologically, every technology that humans used for a long time and then stopped using is no longer in use. The Lindy effect just seems to be a list of cherry-picked technologies that happened to survive.
Based on your eye rolls and subsequent "explanation", it's clear that you don't understand the Lindy Effect. It's not about listing examples of things that have been around for a while. It's about predicting the likelihood that something will continue to be around given how long it has already been around. This effect is well studied and just a cursory glance at the Wikipedia page will give you some solid sources for more rigorous understanding.
There are no "solid sources" there. Its a bunch of books and articles.
Well-studied? By whom? In what journals?
The Lindy Effect may be true, but based on that Wikipedia article's sources, you can't make a good scientific claim for that being the case. Even if you could, you still run into all the current problems such a nebulous branch of science must contend with, such as the peer review problem and the reproducibility problem.
I think you could mount some interesting objection to the Lindy effect, but this isn't it. I'm not really sure what you're trying to say.
It's not claiming to be a scientific law; it's a heuristic for making decisions. The rest of Taleb's books are also about making decisions, not "being right" (whatever that means).
A concrete example is if I'm writing a blog, and I want people to read my posts in 5 or 10 years. Do I go with the cloud platform that just launched or an older hosting provider? This is a decision people make every day. Of course there are many people who don't care if their blog is readable in 5 years; this isn't a judgement.
The Lindy effect is not about what's "better"; it's about what lasts longer. It's also not making statements about the present, which is what survivorship bias typically means.
Another helpful angle is to consider things that aren't Lindylike. People for example -- we expect older people to die sooner than younger ones, not later. And radioactive nuclei -- we expect their ages to be irrelevant to their expected future longevity.
Yeah there are definitely some subtleties, and they would be interesting to tease out.
Older people obviously will die sooner, but having survived does give you some information. For example, your life expectancy at age 1 is 75 years, but at age 40 it's closer to 79 years.
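Here's a quick Monte Carlo sketch of the three cases in this sub-thread (people, nuclei, Lindy-like things). The parameters are illustrative stand-ins of mine, not real mortality or decay data:

    import random

    # expected remaining lifetime, given survival to age t (Monte Carlo estimate)
    def mean_residual(draw, t, n=300_000):
        rem = [x - t for x in (draw() for _ in range(n)) if x > t]
        return sum(rem) / len(rem)

    human   = lambda: random.gauss(79, 10)            # bell-shaped: older -> less left
    nucleus = lambda: random.expovariate(1 / 30)      # exponential: age is irrelevant
    lindy   = lambda: 10 * random.paretovariate(2.5)  # power law: older -> MORE left

    for t in (20, 40, 60):
        print(t, [round(mean_residual(d, t)) for d in (human, nucleus, lindy)])

The first column shrinks with age, the second stays flat, and the third grows -- the three regimes described above.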
And if you are writing a blog, it should be written in a way that could have been read 10 years ago if you want it to be readable in 10 years, if we're following Taleb's thinking.
I can see that, I think the Lindy effect needs some refinement.
My personal take is that there's an apex for a particular generation of technology, and that is good forever. A 1930s Farmall tractor is an example of that... there are improved modern replacements, but the 1930s model still does the job near-optimally. I would guess that a non-trivial number of those tractors will be in use in 2130.
1980s/early 90s minicomputers are similar. Many of these devices are still in use today, and probably could be kept in use for decades to come.
Modern tech is a little harder because we’ve been in a rapid growth phase and the software services based world is more aligned with production than sustainment. I’d bet that trend will change in 20-30 years.
Yeah, I just watched some extremely relevant videos by a modern homesteader (and YouTuber! -- apparently he was on the TV show "Alone").
He says "one of the best pieces of advice I've ever gotten is: Don't trade a gun for a snow machine". This is exactly what you're saying, and it's backed up by a lot of experience living without power and water!
He also says "everyone one of us has to decide when to jump ship on a technology"
He says canoes peaked in the 1960s, and you can buy a used one for like $125 that's the same as what you'd buy today for thousands. Same with hand saws. He maintains old saws and chainsaws and uses them:
"when you look at any kind of manufactured goods, a lot of things have reached their peak and are either poorer quality than they used to be or they're just the same quality as their peak"
My favorite example of the quality issue is the “whirly pop”, a stovetop popcorn maker.
The old one my parents had was aluminum with a metal gear. The modern version has been MBAed to death: the gear is plastic, and the lid is so thin that you could probably replicate it with two plies of aluminum foil. It costs more and is measurably worse in every dimension.
I recently bought a leaf blower/vac mulcher. It took me way longer than it should have to figure out that the difference between the $50 model and the $100 one was that the latter had a metal mulching blade instead of a plastic one, and that the former would likely break as soon as I vacced up a stick that was a bit too big. Thanks, random forum poster!
Grummans are great and they're still being made (though not by the original company).
However, for recreational/tripping/whitewater use, Royalex-based canoes were better for a variety of reasons. Unfortunately the material is no longer being made because its intended use (go-karts) didn't take off to the degree planned. The company continued to make it on a more or less breakeven basis, but upon a change of ownership the new owner decided to scrap it. There have been one or two efforts to make something equivalent, but AFAIK they haven't panned out.
There are still plenty of well-made fiberglass/Kevlar boats being made but they're much more fragile.
The tractor thing is quite an interesting one (and not just this particular tractor): the older tractors ended up being so reliable that people often try to get an older one instead of splashing out on a brand-new John Deere, and this annoys the manufacturers to their bones.
Of course, you'll also see people arguing that you shouldn't be driving a 10 year old car for the same reason. There's some level of tradeoff where using an older product without the latest safety features makes sense.
It's exactly survivorship bias, but the contextual usage is different. Usually you invoke survivorship bias to discredit the relevance of an observation; the Lindy effect uses survivorship as a supporting heuristic for a prediction.
I think the idea is that if you randomly sample a range you have weak evidence as to the size of the range. For example, if you randomly sampled and got "2", it would be more likely the range had a span of 0 to 4 than 0 to 100,000, though either is possible. On average your random sample will be at the halfway point of the range.
The Lindy effect is the realization that your encounter of something is like a random sample. "How old are chairs when I exist?" "How old are iPhones?"
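You can check that intuition numerically. A sketch (the 100,000 cap and the observed value 2 are just the numbers from the comment above):

    import random

    # prior: any range size up to 100,000 is equally likely; the likelihood of
    # then drawing the specific value 2 from [0, span] is 1/span, so Bayes
    # strongly upweights small ranges
    spans = [random.uniform(0, 100_000) for _ in range(1_000_000)]
    weights = [1 / s if s > 2 else 0.0 for s in spans]

    small = sum(w for s, w in zip(spans, weights) if s <= 4)
    print(f"P(span <= 4 | saw 2) ~ {small / sum(weights):.1%}")  # ~6%, vs 0.004% a priori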
I think the difference is that survivorship bias applies when the difference between winners and losers is mostly due to chance. I don't think the fact that we use 4-legged chairs and not 5-legged is survivorship bias. I believe the Lindy effect's prediction that 4-legged chairs will be around a long time. Of course, whether it's survivorship or not is case-by-case.
There's a good reason why office chairs have five legs/wheels where regular chairs have only four: safety.
Office chairs have a reclining mechanism. If you ever leaned back too far in an old office chair with four wheels, you would find out the hard way that four isn't enough. It's very easy to lean back comfortably and not realize you've reached the tipping point, fall back and hit your head on the floor or have a close call.
My first jobs long ago had four-wheeled office chairs, so you can guess how I found this out.
The fifth wheel goes a long way to preventing this danger.
You're much less likely to be leaning back in a regular four-legged chair so the front legs come off the ground, and if you do you'll probably be more aware that you are doing something outside the chair's normal mode of use.
And even if offices go out of style, office chairs will likely be around long past then.
This might be obvious, but... the reason we have a special name for things that behave that way (Lindy effect) is because this is usually a terrible heuristic. Most things are "perishable", as the article puts it - past a certain age, the older a human is, the shorter their life expectancy. This is true of most things.
The Lindy effect talks about the rare cases where this isn't true.
It is just a natural case of the exponential distribution, which is widely used to model expected waiting time. I wouldn't call it 'rare'.
Absent other observations and/or priors, the best estimate of your expected waiting time is the amount of time you have already waited.
So if you've been waiting 10 minutes for something to arrive, your best estimate of how much longer you have to wait is 10 minutes.
If you've been waiting 10 years for a tech to become outdated, your best estimate for how much longer it will take is 10 years.
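One subtlety worth making explicit: a fixed-rate exponential is memoryless, so the expected extra wait never changes. The "waited 10 minutes, expect 10 more" behaviour appears once the rate itself is uncertain, because a long wait is evidence that you're stuck with a slow rate. A Monte Carlo sketch (the lognormal prior over rates is an arbitrary choice of mine):

    import random

    def mean_extra_wait(waited, n=500_000):
        total, count = 0.0, 0
        for _ in range(n):
            rate = random.lognormvariate(0, 1.5)  # uncertain rate (arbitrary prior)
            t = random.expovariate(rate)          # waiting time, exponential given the rate
            if t > waited:                        # condition on having waited this long
                total += t - waited
                count += 1
        return total / count

    for waited in (1, 5, 10, 20):
        print(waited, round(mean_extra_wait(waited), 1))  # grows with time already waited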
People usually use buses as an example, so it makes sense: if the bus is supposed to come every few minutes, and it hasn't come in the last 15 minutes, then odds are something is wrong, which increases the probability that the additional wait will also be 15 minutes.
Well yes, but not only. I think there are examples where it applies to living organisms too. E.g. a few hundred years ago, a human who lived past the age of 5 would be more likely to grow to be 40, afaik.
I was mostly pointing this out because a lot of people in the comments seem to be talking about this as if it's some kind of universal law, or a proposed universal law, when really this applies to only exceptional circumstances.
> Land lines have been around a lot longer than fax machines (both in the article), so they will likely outlive fax machines.
I'm less sure about this one, because of the relative difficulties of replacement. You can replace someone's POTS line with VOIP and they'll likely never notice (and this is underway). They'll notice if you take your fax machine.
>You can replace someone's POTS line with VOIP and they'll likely never notice (and this is underway). They'll notice if you take your fax machine.
The question is if fax on VoIP is still fax in that sense. At least the wikipedia article on fax mentions transmission through audio-frequency tones [1]. However if you count fax over VoIP as fax, I guess you should also count phone on VoIP as "landline".
Fax machines seem long extinct everywhere except in Japan and in some hotels. Landlines probably only survive in business environments; I hardly know anybody who still has and uses a landline at home. Also, both businesses and consumers mostly use VoIP-based "landlines" employing codecs which can't support fax.
Faxes have become a largely digital thing, as there are various digital fax services. Neither side needs to have a phone connection that can support it. The fax services just need to be able to communicate to each other.
I was in a radiology startup for a bit, and getting people to fill out forms on something like an iPad is still a problem. You need staff able to help them, and people damage or try to steal them. So then you end up with paperwork, and if that paperwork needs to move somewhere else, people fax it.
Fax machines are still widely used in healthcare, at least in the U.S. Can't speak for other countries, but I'd be surprised if that weren't also the case elsewhere.
My employer moved us to a new office right before the pandemic, so it's not seen much action, but... there's a fax machine. I doubt any of us would use it, but it comes with our corporate real estate package. Just in case you need to send a fax to Japan... or some hotel?
15 years ago, the office I worked in received daily menus via fax from local delis and restaurants with their specials, in case we wanted to get lunch. I have no idea if this is still a common practice but it seemed to be at the time.
This even had the slight benefit over sending email to a random address because a fax can just be posted on a common board in the office space, rather than someone having to take the step of printing the email first.
Perhaps. Whatever, it sounds comforting and fun. I've heard they have a colour (!) fax machine in almost every home in Japan. If I moved into an office which had one, I would actually contact somebody there and have fun sending hand-drawn pictures to each other :-)
Landlines are probably more common than you think. I only got rid of mine last year. I would have kept it for backup but it just cost more than I was willing to pay. Many of us don't get great cell reception and WiFi assist isn't always perfect.
Adding to my first reply: if Trinitron displays were gone (it appears they aren't), and some newer tech isn't, that would NOT contradict the Lindy effect.
If you already KNOW that Trinitron displays are gone, then there's no uncertainty. The Lindy effect is a heuristic for making decisions under uncertainty.
The relevant situation is if you have two things that still exist, and you want to guess which one will last longer. This doesn't override other facts about the domain -- it's SOME information in the absence of any other knowledge. It's for poker players, not scientists.
The example I gave was the new cloud blog startup vs. the old hosting platform. Which one would you put your blog on if you wanted people to read it 5 years from now? All other things being equal, I'd take the old hosting platform. But if you think the startup has a really good business model or you like the founders, maybe you choose that one. It's just common sense.
Another example might be cold-blooded crocodiles vs. a warm-blooded mouse. All things being equal, you could guess that the crocodiles will survive further into the future, since they were here first. But someone with a specific theory or expertise could also argue that the bigger animal is less likely to survive, etc.
Similarly, we already know that dinosaurs are gone. This isn't a situation where you need to act or make a prediction.
Someone who understands Bayesian statistics can probably explain it better than me, but you have to take into account existing knowledge, and update it with new knowledge. Picking out something that you know is obsolete isn't relevant.
It's a heuristic. Although I bet you can find those things in use somewhere.
This article is a little different -- "things my son would use" implies that they're still popular, not just extant. Both questions are interesting, and influenced by the same principles.
The Lindy effect is one reason I'm working on https://www.oilshell.org/, because shell is now more than 50 year old, much older than Python/JS/Ruby, etc.
i.e. When people want to explain a modern cloud platform, they use shell. Go would have been more obvious, but shell is clearer. Lindy prediction: shell will outlive Go :)
I think you can argue that it's not true in cases where item A and item B are members of a broader class of things, changing from A to B incurs no or trivial costs and little fundamental change in how the class fulfils its purpose, and there is no immediate need to stop using A.
For example, nobody expects the 2004 Toyota Corolla to last forever, but the gas-powered car will be far harder to kill.
> The rough heuristic is: the longer a technology has been around, the longer it will last into the future.
"The term Lindy refers to Lindy's delicatessen in New York, where comedians "foregather every night at Lindy's, where ... they conduct post-mortems on recent show business 'action'"."
And no more should be read into that. There are solutions to problems which are adequate, e.g. "chair", where further changes can be expected to be modest. And since the problem isn't going away (unless someday we're told that sitting kills us and that we need to stand or lay instead), the solution won't either.
Otoh, there are technologies which simply supersede and obsolete others. E.g. UTF-8 has ASCII as a subset, and hence I don't expect to see the latter around for long.
UTF-8 is backwards compatible with ASCII “as she is spake”, but not strictly speaking with ASCII, as any ASCII control characters will break UTF-8. It also breaks any 8-bit extensions/code pages. ASCII vs HTML is a bad example though, because HTML is used globally, and although ASCII is too, this is more a historical artefact. It's not hard to imagine ASCII dying out over the next few years while HTML continues to adapt to every encoding under the sun and pure ASCII becomes used less and less ...
Nope. If you read an ASCII file with control characters in Java you’ll get an exception. Also it won’t work with the 8-bit ASCII variants. Neither are “true Scotsmen” of course, but the point still stands that HTML could yet be more durable.
I’m almost certain the default encoding for reading/writing files in Java is UTF-8 and similarly for the source files. I don’t think I encounter wide char data much really at all day to day ...
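For what it's worth, where the compatibility boundary actually sits is easy to check directly. A quick sketch in Python (not Java; Java's decoders may substitute rather than throw, depending on how they're configured):

    # every 7-bit ASCII byte, control characters included, is valid UTF-8:
    print(repr(bytes([0x07, 0x41]).decode("utf-8")))  # '\x07A'

    # what breaks are 8-bit "extended ASCII" code pages, e.g. 0xE9 ('é' in
    # Latin-1) is not a complete UTF-8 sequence on its own:
    try:
        bytes([0xE9]).decode("utf-8")
    except UnicodeDecodeError as err:
        print(err)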
> Will HTML or JavaScript last longer? Probably HTML, since it came first.
Well, barely. I’d bet on JavaScript for this one; programming languages are almost immortal once they reach the popularity of JS, while HTML would be easier to replace.
Technologies never truly die due to outdatedness; they just become rare. Horse buggies are still used, though rarely, and steam engines are more of a hobby now but still around.
What truly causes a technology to die is when it's kept a secret and all those who know the secret pass away without passing it on (Greek fire, ancient Babylonian batteries, Roman stainless steel). It's the one good argument for having a patent system, to keep that knowledge from being lost.
Land lines are mostly gone where I live, in Sweden. I don't really know anyone who uses one; of course they do still exist, due to some alarm systems and so on.
As for ripping out the physical service entirely, I don't have any numbers but I saw the lists from 2020 and 2021 and they are huge. So it is clearly happening. And they have been doing this for quite a few years.
Where I live (the Netherlands) all landlines (copper) have been replaced by fiber, I still call them landlines. I guess OP does too?
If you mean landlines for phones: the fiber can handle that, and does, but people are indeed dropping their telephone-number-for-a-house subscription in favor of individual cell phones (and my phone is generally on WLAN calling when I'm at home, so it uses the fiber as well). My parents-in-law were the last ones I knew with a house-based phone number, and they dropped it this year. That said, I think you still get a house telephone number for free with many internet subscriptions, and the ISPs' modems have ports for phones on them, so the landlines as defined do still exist, but it's a matter of definition.
Maybe I should have specified "in my city"; indeed, where the concentration of people drops, people are still on copper or on satellite or other wireless alternatives. The point is, the country is investing in fiber: landlines. Although this may change when we blanket the country with 5G towers. Already I have 60/60 Mbit in my home via 4G, approaching my current 100/100 fiber subscription. Not sure which is going to scale better in the future.
I guess time averages out the satisfaction humans get from a thing. Fads come and go and attract us toward new sensations, but over time... that old thing might be the only one that has the right blend.
And the US Navy, after a hiatus, has started training officers and crew on how to use a sextant, since modern warfare will probably result in GPS being either jammed to oblivion or shot down.
It's the same thing: horse-drawn carriages became automobiles. Today's car is a more advanced version. In a sense, today's laptop is a spaceship compared to the first laptop.
His #1 cannot and will not ever happen. The radio spectrum is a shared resource. The total information capacity of the usable spectrum, say from 100 kHz to 100 GHz, is massive, but most of it has terrible propagation and all of it can only be used once at a time. Massive MIMO helps in dense city cores with lots of independent paths reflecting everywhere, but it's still just one spectrum in practice.
Whereas with physical transmission lines, be they cables, fiber optics, or whatever, each run can re-use the entire spectrum.
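To put rough numbers on it, a back-of-the-envelope Shannon-Hartley comparison (the channel widths and the 20 dB SNR here are illustrative figures of mine, not measurements):

    import math

    def capacity_bps(bandwidth_hz, snr_db):
        # Shannon-Hartley: C = B * log2(1 + SNR)
        return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

    # a generous 100 MHz radio channel, shared by everyone a tower serves:
    print(f"{capacity_bps(100e6, 20) / 1e6:.0f} Mbit/s")  # ~666

    # ~4 THz of usable C-band on a single fiber strand, reusable per strand:
    print(f"{capacity_bps(4e12, 20) / 1e12:.1f} Tbit/s")  # ~27

The radio figure is the ceiling for the whole cell; the fiber figure repeats for every strand you pull.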
I'm surprised his #1 is still TBD. To me, ethernet lines have become more important in the last 5 years as competitive gaming/esports has completely taken off. Latency is far more prevalent in gamers' minds today, I'd argue more so than bandwidth, and the first networking-related advice a gamer receives is to make sure you are on ethernet.
Exactly. I have a suspicion that a large part of people complaining about "Internet issues" are faced with WiFi issues. Apparently many struggle with the concept of separate links that make up a connection (LAN vs WAN).
But the reverse mix-up also happens: Hotels, restaurants and the similar businesses like to boast their "included free WiFi" when they really mean complimentary Internet access, provided with an AP.
I'd take that as a given for 99% of cases. As fibre takes over from piggybacking off copper, IME actual internet issues seem to be pretty rare: [UK, and] I haven't noticed that kind of service issue for a few years now, whereas it used to be relatively common.
Happy to keep the pretence up though -- what else is as neutral and widely understood as "internet issues"? Whoops, seems as if my connection has dropped out! Yes, I was very interested in that discussion over sales targets that's been going on for the past half an hour, please email me a summary! Bloody internet, always dropping out!
I always laugh when I see Comcast commercials for "fastest in-home Wifi". An 802.11ac router isn't going to help anything when you can only get 10/10 bandwidth on a good day.
I love cables while I'm using them. I hate cables while I'm cleaning. It seems almost impossible to have a clean cable setup that doesn't become a rat's nest behind the desk.
My cable management is based on velcro strips nailed to the desk's edges and undersides in order to keep cables off the ground. It keeps the cables off the ground (and also mostly off the desk surface) and looks reasonably clean from eye level: https://imgur.com/a/tRamJ65
I use a set of G.hn ethernet over powerline adapters in my (small, rented 2 bed) flat. I can get 200 Mbps with no latency, packet loss or drop outs from the modem in the living room to the office upstairs.
This is a distance of about 10 meters with 2 brick Victorian internal walls in-between. No matter how much you spend on WiFi equipment you can't get more than 20 Mbps with high packet loss here.
Even downstairs, with no walls to worry about, only 5 GHz is usable. 2.4 GHz is completely occupied by neighbours, and the lower 5 GHz channels are all crowded out because of compatibility too.
I hate high-rate data links over power lines. Power lines are not impedance controlled. At every bend, every approach to some metal in the building, they are going to radiate interference and pick up interference from their environment. Using powerline networking is irresponsible and rude. The fact that any are approved by the FCC at all is entirely due to contrived testing setups that are never replicated in real building wiring.
If you use a real transmission line like ethernet cable (~100 ohm twisted pairs) or coax, the impedance remains constant, and they might even have a bit of shielding. You'll get faster, more reliable speeds and pollute the radio spectrum significantly less.
- It's fiddly and ugly compared to hidden cables. I wouldn't do that in my own home, I'd run them in the walls.
- It's disruptive and time consuming. It would likely take me a whole day.
- I don't see why I should invest further in improving the property when I will likely move out in 1-2 years time and won't reap the long-term benefits. I've done that already in other areas.
- It's hassle. My landlord could protest that I've done a shitty job when I move out, or, even if I haven't, make me remove it.
- I need multiple WiFi adapters anyway to cover both upstairs and downstairs effectively.
- It's not as flexible (in terms of moving things around) as plugging an adapter in to any electrical outlet.
- I needed a quick solution when I moved in so I could work.
Your country almost certainly has rules about emissions in the HF frequency ranges too. Even milliwatts of radiated interference at these low frequencies will bounce around the world and interfere with everyone.
I have four adapters in my house, I have one TL-WPA7510 and three TL-PA7010. They are sold as pairs so when I wanted the version with an access point built-in, I had to buy another wired version. But, you can add them individually to an already existing network. The Homeplug protocol is a standard so as long as you buy the right versions, you get max performance. These things are great. I have one in the basement for the Verizon router to plug into, one on the second floor for my PS4, the AP version behind the 4K TV in the family room, and another at a desk in the guest room. The Verizon router has a lot of interference nearby so quality is poor in some spots in the house. Quality is good, it sometimes reaches gigabit speeds but it's way more robust than wifi extenders. The apartment I lived in previously was built like a brick shithouse so wifi dropped off pretty fast once you got to the other end.
Video calls too. I just got back from visiting my in-laws since everyone is now vaccinated, and one of my projects while I was there was running ethernet lines to their work areas. They're both on daily video calls now, and poor wifi performance was driving them crazy.
Beam forming, phased-array antennas, sophisticated coding (CDMA) and "time slots" (TDMA) will provide a lot more capacity than a naive look at available bandwidth would suggest.
I still agree wired/optical is best for most fixed installations both LAN and WAN, but people are getting more out of wireless than I would have predicted. And the "last mile" capacity of today's technology far exceeds what people seem to want even looking forward a decade...which paradoxically suggests that wireless might be adequate in the interim for some use cases.
None of these technologies allow you to exceed the available bandwidth. They just make use of the shared bandwidth more efficient. It's still a shared medium.
Where I live, a lot of people have 3G internet because mobile data is pretty cheap and the companies advertise it as an alternative to cable. And now they all have really crappy internet. In the evening when everyone watches youtube you get a fraction of the advertised bandwidth.
With fibre, every customer gets the full spectrum. And since the frequency of light is a lot higher than radio frequency, you also get a lot more bandwidth. At radio frequency we're already hitting the physical bandwidth limits; with optical transmission there's still a lot of bandwidth left.
Thinking that radio frequency transmissions are an alternative to cable / fibre is pretty short sighted thinking. Data usage is going to grow, more devices are going to use data, and wireless transmission is going to seriously limit us.
> None of these technologies allow you to exceed the available bandwidth. They just make use of the shared bandwidth more efficient. It's still a shared medium.
The beam forming and such slice bandwidth availability spatially (that's what the cell network does too) so more people can use the shared medium by...not sharing it! It's not like AM radio that gets sent in all directions regardless of whether there is someone in a given direction to listen.
> With fibre, every customer gets the full spectrum.
Well by definition that's true whether fibre or copper, but that fibre or copper is itself aggregated into connections to upstream provider, so you're just kicking the problem down the road. You don't really get the full bandwidth end to end.
As I said I think fixed wireless is an acceptable transitional technology but wired makes more sense longer term for most locations.
>The beam forming and such slice bandwidth availability spatially (that's what the cell network does too) so more people can use the shared medium by...not sharing it! It's not like AM radio that gets sent in all directions regardless of whether there is someone in a given direction to listen.
Nope. In theory yes, but unfortunately diffraction is going to get you every time if you're not transmitting in deep space.
Level 3/CenturyLink/Lumen CEO reassured panicked investors and employees scared that their company would be worthless with 5G and beyond by basically saying that the last mile will increasingly become the last tens or hundreds of meters and that fiber is still the infrastructure on which these increasingly dense base stations depend on. And that the “edge computing” will likely live on the cabinets owned by the fiber provider.
It makes sense to me. Just add more fiber and let people access them over whatever.
My biggest gripe is that we could choose to do away with licensing fees and spectrum auctions and open mmWave 5G to be something like Wi-Fi, but we are shortsighted as usual.
A bit of a nitpick, but most residential fiber deployments are PONs. With a PON, a single fiber gets passively split among many subscribers (power splitters plus time slots in GPON-style systems; separate wavelengths in WDM-PON variants). It's still tons more usable bandwidth than wireless.
>None of these technologies allow you to exceed the available bandwidth
Exceed available bandwidth of what? Per Spectrum? Shannon–Hartley theorem?
The whole point of 4G and 5G, mentioned in the GP as massive MIMO, was that we could work around those limits with more antennas. Everything we are doing today, and aiming to do in 3GPP Rel 17 in a few years' time, was literally impossible to even imagine in the early 2000s. When massive MIMO (or Very Large Antenna Arrays, as it was then known) was first published, people called the idea "crazy". And there is CoMP, whether marketing decides to call it 5.5G or 5.9G, along with the distributed antennas being worked on for 6G.
There is no fundamental technical reason why we can't have a fully wireless Internet, although there are many business and economic reasons why it may never happen.
Sorry, I think I meant 4G, not 3G. It's marketed as LTE here.
> Exceed available bandwidth of what? Per Spectrum? Shannon–Hartley theorem?
No, not Shannon-Hartley. That's just a mathematical model.
When EM waves propagate, they are subject to diffraction, which limits both the amount of information that can be transmitted per time interval, and also the spatial resolution of the transmission. Even with antenna arrays or distributed antennas you can't get past diffraction limits; you can just get closer to them.
To get around diffraction limits, you need to use higher frequencies / shorter wavelengths. (Which has the side effect that you lose the ability of signals to go around / through obstacles, so you need a lot more cell towers.)
We're at a point where new technologies just make different trade-offs (eg. shorter wavelengths for areas with lots of wireless clients vs. longer wavelengths for sparse areas).
With a fibre connection, you don't need to make these tradeoffs; you just need to dig up the ground and you can have as much bandwidth between two points as you want.
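To put numbers on the diffraction point above, the classic theta ~ 1.22 * lambda / D beamwidth estimate (the 0.5 m aperture is an illustrative assumption of mine):

    import math

    C = 3e8  # speed of light, m/s

    def beamwidth_deg(freq_hz, aperture_m):
        # diffraction-limited beamwidth of an aperture: theta ~ 1.22 * lambda / D
        return math.degrees(1.22 * (C / freq_hz) / aperture_m)

    print(f"{beamwidth_deg(3.5e9, 0.5):.1f} deg")  # mid-band 5G, 0.5 m array: ~12 deg
    print(f"{beamwidth_deg(28e9, 0.5):.1f} deg")   # mmWave, same aperture: ~1.5 deg

Same aperture, eight times the frequency, roughly eight times finer spatial slicing -- which is the tradeoff described above.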
The pixel count is 17x, not the bandwidth. Even compressed RAW sizes don't scale linearly with pixel count. I don't have any experience with 8K, but compressing/encoding 4K with HEVC or AV1 tends to be easier at a fixed VMAF score than at the comparatively low pixel count of 2K/1080p. I would imagine the same, if not better, for 8K. And that is discounting the use of much better video codecs like VVC, which brings another 40 to 50% reduction in bitrate.
Quadraphonic sound did not replace "good enough" stereo even though humans can move their heads. Of course an 8K TV is just as easy to set up as a 4K one, unlike a quadraphonic system. And the functionality of quadraphonic records became available as "surround sound", which some people do have.
It's possible there are enough people who will appreciate the difference for 8K to become established. Personally I doubt it, but it's certainly possible.
Another possibility is that your TV watches you and provides just a 4K or 4K-ish image unless you walk close to your TV at which point it displays a higher resolution image where you are looking. Possibly with some hint-driven mix of AI upsampling and more image data.
The opposite could be true of course with 8K VR rigs in every home. I also doubt it, but it's quite possible.
The 8K-production -> 4K result is well established and will remain even if my guesses above turn out to be correct. That needs a lot of bandwidth but not at the point of viewing.
Unless you're displaying on movie theatre sized screens, 8K seems like a waste of space/bandwidth. Even 4K is generally overkill for the typical living room.
I think we're hitting the point with video resolution that music CDs hit with audio, where improvements in fidelity are largely outside the range of human perception. It's one of the reasons music DVDs and SACDs never really caught on.
This is well within what someone with good vision can see at, for example, 6-8 feet.
For a personal reference, I could tell the difference in clarity at 8 ft between a 1080p 24" monitor and a 28" 4K monitor. That is 92 vs 157 ppi, on screens a fraction of the size of a TV.
I must imagine people making such claims have poor eyesight, or are using optimum-viewing charts as a proxy for the distances at which human vision is sufficiently acute instead of looking for themselves.
For reference, someone with good vision ought to be able to distinguish up to about 115 ppi at 8 ft, according to this source.
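Most of the disagreement in this sub-thread comes down to which acuity figure you assume. The standard small-angle arithmetic (my sketch; 1.0 arcmin is the textbook 20/20 figure, and ~0.3 arcmin is roughly what a ~115 ppi claim at 8 ft implies):

    import math

    def max_ppi(distance_in, acuity_arcmin):
        # smallest resolvable dot pitch at this distance and visual acuity
        pitch = distance_in * math.tan(math.radians(acuity_arcmin / 60))
        return 1 / pitch

    d = 8 * 12  # 8 feet, in inches
    print(f"{max_ppi(d, 1.0):.0f} ppi")  # ~36: the classic 20/20 (1 arcminute) figure
    print(f"{max_ppi(d, 0.3):.0f} ppi")  # ~119: the finer acuity the ~115 ppi claim implies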
> Especially since I and many others can verify this in 10 seconds.
Try it with a high resolution video you've transcoded locally (so you're not dealing with streaming quality differences) at various resolutions. The human visual system is a lot less precise with video than it is with still images. A paused DVD frame often looks like a blocky, terrible mess, but it's perfectly fine when displayed as video.
I won't argue that if you pick your content carefully, and know exactly what to look for, you can tell the difference. But, practically speaking, most people watching a decently encoded video won't see a huge difference from 1080p to 4k at typical screen sizes and viewing distances, and will see even less difference going to 8k.
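A minimal sketch of the test suggested above, assuming ffmpeg with libx264 on your PATH and any high-resolution local clip named input.mp4 (both assumptions of mine); the constant-quality setting keeps bitrate starvation from masquerading as a resolution difference:

    import subprocess

    for height in (720, 1080, 2160):
        subprocess.run(
            ["ffmpeg", "-y", "-i", "input.mp4",
             "-vf", f"scale=-2:{height}",      # keep aspect ratio, force even width
             "-c:v", "libx264", "-crf", "18",  # constant quality, visually high
             "-an", f"test_{height}p.mp4"],    # drop audio; one file per resolution
            check=True)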
I honestly can't tell you if I'm watching 720p or 1080p content on our TV without looking at the source file info.
You're confusing "What can be perceived" with "What actually matters." I'd argue 4k was a bit of a stretch to sell new TVs, and 8k is going to be even more so.
We've hit, as someone else said, the 44kHz/16bit point of video. Encoding quality and bitrate matter a bit, just like mastering does with audio, but we're past the point of diminishing returns.
If you can't tell the difference between 34 ppi and 65 ppi at typical viewing distances, your eyes just don't work very well, and you literally don't realize that most people see better than you, even if they need glasses to do so.
We aren't talking about being near the limits of human perception: we are capable of perceiving 115 ppi at a typical 8 ft viewing distance, and 8K on a 65" screen is about that limit. 4K is itself about 1/4 the resolution we can perceive, and 1080p is 1/16. You are saying that people can't, on average, distinguish between 1/4 and 1/16 of the resolution we can perceive.
The link I posted was the informed opinion of an electrical engineer who worked in display tech. If people really can't tell the difference between 4K and 1080p, shouldn't there be scientific research by this point showing it? You can't cite any, because there isn't any.
You are like a color blind person exclaiming there is no difference between green and red! There is.
... wait, what? 8 foot viewing distance with a 65" screen? Ok, we have very, very different definitions of "typical" here. You're using a 50% larger TV from half the distance.
I just measured our living room. We have a 43" (4k) TV at about 16' viewing distance. Just about every chart I can find on TV resolution vs viewing distance (assuming 20/20 vision) indicates that I'm in the "720p is fine" region, on the border of "You really won't notice a difference from 480p." Which is consistent with my experience. I can see the difference between 720/1080/4k if I'm right on top of the TV, but once I'm back where I actually watch from, anything I notice is more a factor of bitrate than resolution - I can certainly tell the difference between a low bitrate MPEG2 source and a high bitrate h.264 source, but that has nothing to do with the resolution.
I don't really have the motivation tonight to pull out my trig tables (I'm about to watch a movie instead, of what resolution I honestly don't know), but... at least for my case, yes, I'm saying I can't tell the difference. And it's entirely possible my glasses need updating (quite likely, honestly), but the "TV size vs distance" charts I've found say nothing of what you're claiming either. And they're by people trying to sell TVs!
The calculator suggests that your tiny TV ought to be optimally viewed from 4.8 feet away, and that at 16 feet away you really ought to have a TV greater than 100 inches.
Of course most people don't have 100+ inch TVs, but they also don't watch TV from 16 feet away unless they are watching a 45-60 FOOT theatre screen. Play with some values in the calculator if you want to see how other people use TVs.
People normally sit 8-10 feet from 60-75 inch TVs, and use 40 inches for smaller rooms like bedrooms where they are 6 feet away.
To be clear about the whole picture: you are a fellow with bad vision watching a small TV from nearly 3 times further back than the rest of planet earth.
At that distance your tv literally does already exceed average visual acuity and for your usage you are absolutely correct.
Meanwhile the rest of us will benefit from up to 8k in the future.
> To be clear about the whole picture: you are a fellow with bad vision watching a small TV from nearly 3 times further back than the rest of planet earth.
My vision is perfectly fine, thank you very much, as you seem to insist on going on about how horrid it must be. The difference is entirely explained by the fact that I watch a reasonable sized TV from a perfectly sane distance in a living room that isn't a home theater room and has space for plenty of other activities. No idea how you get a TV 6' away in the bedroom, though, unless it's literally at the foot of the bed. I've never had one in there and never intend to.
Your linked calculator suggests that the "optimum" distance for my TV is about 6' away, which I find quite absurd (having just tried it). I'm watching movies, not programming on it.
We appear to have rather divergent views (and social group priorities, if yours has 60-70+ inch TVs at your suggested 8-10 foot viewing distance) on the nature of television viewing, and, as such, there's not an awful lot more to discuss.
I don't think it's a matter of divergent priorities. The argument upthread was that we had reached the zenith of resolution based on visual acuity. What we established is that this is true or false almost entirely depending on a person's vision, the size of the screen, and the viewing distance. Based on how the majority of the public uses TVs, we have provably not reached such a point.
In point of fact, a medium-sized living room is 12x18 feet, with the TV on either a stand or the wall along the longer wall, and seating that is around 2 feet deep and off the wall by at least a foot. This means that in a medium-sized living room there are 8-9 feet between seat and screen. In a large living room (15x20 feet) there are still only 11-12 feet.
People actually are putting 60-75 inch screens in those rooms, so the figures given by the calculator do reflect how people actually use TVs, which supports the position that HD -> 4K -> 8K will still be perceptibly beneficial given the constraints of human eyeballs.
I'm interested in sources that establish "how the majority of the public uses TVs". I've personally never seen anyone use TVs in line with the recommendations from TV manufacturers screen size and viewing distance. Maybe this is a USA cultural thing (or, contrary, maybe my own observed viewing habits are a cultural AU/NZ thing).
What about a house size and falling TV prices thing? Half of the US couldn't sit 16 feet from their TV if they tried.
1/3 of the US lives in apartments, and 80% live in urban areas where housing is comparatively more expensive. Even in slightly larger houses it's common to have both a living room and a family room and to give up a lot of space to the individual bedrooms. It's super likely that the TV is in a room between 12x18 and 15x20 feet, with the TV on the long wall.
Also, for some reason, sticking your TV on a stand is still pretty popular vs. putting it on the wall. That puts the TV between 0' and 1' off the wall, with your rear end about 3' off the opposite wall. This leaves as little as 8' between viewer and TV in the 12x18 room, and as much as 12' in the larger room if they hang it on the wall.
How the US population is distributed is kind of interesting.
Regarding the question of average living space in the US: people in the US almost universally prefer single-unit detached housing, but as the song goes, you can't always get what you want. Many, for example, would prefer to be close to urban settings with more jobs. Others can only afford to rent.
Among renters - around 1/3 of the population - half live in 111 sq meters or less.
1/3 of owners live in domiciles of 167 sq meters or less, and 60% of owners live in domiciles of 222 sq meters or less.
Combining renters and owners: 77% of Americans live in 222 sq meters or less, 54% in 167 sq meters or less, and 25% in 111 sq meters or less (with some rounding and conversion between sq ft and meters).
The people paying lots for tiny apartments in the city think your 16' of open space in your living room is as unusual as you think they are with their relatively giant TVs in smaller spaces.
Interesting, but doesn't tell me much - I live in an apartment, well under 111sqm, and have a 55" TV around 12' from the couch - so still too small / too far away, according to recommendations. People with even smaller living space (at ~80sqm, my apartment is reasonably large for a two bedroom unit where I live) may simply get smaller TVs to compensate, still being "too small".
These are anecdotes. I'm curious about what people actually do, I don't know, and (contrary to your assertions about what most people do) it doesn't sound like you do either.
I showed with data how many people live in relatively small spaces, and asserted but didn't prove that Americans often devote much of their space budget to bedrooms, and that houses with both a living room and a family room have more budget to spare. This last shouldn't require a ton of proof.
My assertion is that, due to the above space constraints, the majority of people in America are 8-12' from their TV. I also assert that the same people are buying 60"+ TVs.
I linked a source that suggested people can distinguish up to 115 ppi at 8'. If I understand correctly, scaling with distance, that means they ought to be able to distinguish about 77 ppi at 12'.
If these assertions are correct, most people would benefit from 77-115 ppi, and based on THX recommendations would benefit from the increasingly popular and cheap large screens that people are provably buying despite living in not-so-huge spaces. In fact, TV size purchases are trending upward based on what people can afford.
A 60" 4K screen is only 73 ppi, meaning we aren't beyond human visual acuity yet, and at 75" 4K is down to 59 ppi.
The majority of consumers, with good visual acuity, will be able to distinguish between 4K and 8K at the room sizes and screen sizes where such products are relevant and applicable, and a median American would benefit.
I agree with Syonyk here. And since you bring up quality of vision, I'll note that I do have 20/20 vision because I'm wearing prescription glasses.
> You are like a color blind person exclaiming there is no difference between green and red! There is.
You're arguing against a complete strawman here. The key point that Syonyk was making is this:
> I won't argue that if you pick your content carefully, and know exactly what to look for, you can tell the difference.
If you're watching a nature documentary or anything else that puts an emphasis on the visuals, people are going to notice the difference by themselves. But most of the stuff we watch doesn't focus on visuals, it focuses on narrative. If I'm watching a legal drama, I don't really care if I can count the hairs in the lawyer's nostril or not. Again, I'll probably be able to tell apart the 4K version from the 1080p version from the 720p version of the lawyer's face's closeup, but even if I watch just the 720p version, it won't significantly impact my enjoyment of the narrative.
Although it's not one singular person arguing, I can't help but feel like the goalposts are pretty mobile here.
First goalpost: high-DPI video is already outside the range of human perception.
> I think we're hitting the point with video resolution that music CDs hit with audio, where improvements in fidelity are largely outside the range of human perception.
Second goalpost: OK, it's not, but it's really close.
> I won't argue that if you pick your content carefully, and know exactly what to look for, you can tell the difference. But, practically speaking, most people watching a decently encoded video won't see a huge difference from 1080p to 4k at typical screen sizes and viewing distances, and will see even less difference going to 8k.
Third goalpost: OK, it's actually clearly discernible, but I mostly care about the story anyway.
> But most of the stuff we watch doesn't focus on visuals, it focuses on narrative. If I'm watching a legal drama, I don't really care if I can count the hairs in the lawyer's nostril or not.
Believe it or not, some people said the same about HD vs SD in the beginning. It was complete bullshit then too. I too enjoy the story and the psychological drama, but it's not books on tape. Film and TV are a visual medium wherein being able to see the world more clearly adds to the experience.
Well, of course presenting three separate people's ideas, some of whom are refuting each other, as one continuous argument will come out looking ridiculous.
The first guy is a gentleman with bad eyesight watching a 43" TV from 16 feet away; if he watched it from any further away he wouldn't be able to tell if it's on. The second is inaccurate. The third is a complete cop-out.
Not sure about that. With large screens at home, 55" and over, I can see a difference between FHD and 4K - enough to know I am having network problems when the streaming service starts to downscale the resolution in a 4K video.
That could be the bitrate more than the resolution; streams are highly compressed. It would be interesting to see whether you prefer a 1080p Blu-ray over a 4K Netflix stream (I suspect yes).
> The difference in HD to 8k is 17x the bandwidth.
Not practically. Netflix will just compress it down to 3 Mbit anyway, like they do with 4K, but they'll be able to call it "8K content!" and (probably) charge a premium for it.
This comment betrays a lack of understanding of some of these multiplexing technologies. They minimize wasted bandwidth due to signal collisions and interference, but they don't increase the overall bandwidth available.
Let's take TDMA as an example. TDMA means that instead of using the available bandwidth continuously, each participant only gets to use that bandwidth for a fraction of the available time. Saying that TDMA helps us increase the available bandwidth is like saying queueing up at the restroom will "provide a lot more availability than purely looking at the number of stalls".
CDMA is more complicated but it's still a similar story. Look at this diagram of a CDMA signal:
Notice how the "data" actually being transmitted is only a couple of bits, but the CDMA signal includes many more transitions. CDMA is essentially using N chips of transmission bandwidth to send a single bit of data, the benefit being that if multiple signals interfere it's possible to extract one of them using some complicated math. It's like if 10 people were sharing a phone line, and instead of taking turns talking they all spoke at the same time, but repeated themselves 10 times so you could pick up enough snippets from one speaker to understand what they were saying if you concentrate hard enough.
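A toy direct-sequence example makes the trade concrete: two users share the channel at the same time, but each spends 4 chips of air time per data bit. The codes and bits below are made-up illustration values:

    import numpy as np

    # Two orthogonal Walsh spreading codes, one per user.
    codes = {
        "alice": np.array([+1, +1, +1, +1]),
        "bob":   np.array([+1, -1, +1, -1]),
    }

    def spread(bits, code):
        # Each data bit (+1/-1) is expanded into len(code) chips,
        # i.e. 4x the transmission bandwidth per bit of data.
        return np.concatenate([b * code for b in bits])

    # Both users transmit simultaneously; the signals simply add in the air.
    channel = (spread(np.array([+1, -1, +1]), codes["alice"])
               + spread(np.array([-1, -1, +1]), codes["bob"]))

    def despread(signal, code):
        # Correlating with the right code recovers that user's bits;
        # the orthogonal interferer averages out to zero.
        return np.sign(signal.reshape(-1, len(code)) @ code)

    print(despread(channel, codes["alice"]))  # [ 1. -1.  1.]
    print(despread(channel, codes["bob"]))    # [-1. -1.  1.]

Note that nothing here created bandwidth: each user's bit costs four chips, exactly the queueing-for-stalls trade described above.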
TDMA/CDMA don't bypass information theory, though. Shannon always wins in the end.
You can improve the situation with spatial beamforming, polarization isolation, mmWave bands, etc, but there eventually comes a point where the added complexity doesn't win over just hiring a few guys with a backhoe to run fibre down the street.
Current laser comm systems are getting multiple bits per photon - it seems that the communications field will keep pushing the limits until there is simply no business case, and then push some more. Really, the only limit is Shannon's limit, but that's only concerned with data rate. The modulation schemes that sit on top of the raw data are where all the capacity magic happens.
I assume you're referring to something like this: https://www.sciencedaily.com/releases/2017/02/170203102740.h... but this isn't a scalable transmission system since it's based on single photon detection, and single-photon detection is not a high bandwidth communications ability because you have to discriminate the photons. It's a far cry from "current laser communications".
> the only limit is Shannon's limit, but that's only concerned with data rate.
But that limit, the capacity, is also the maximum possible limit of information transfer on the channel. Modulation schemes don't increase it or change it, they just push the achievable information rate of the system closer to the capacity.
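For anyone who wants to play with the formula, Shannon-Hartley fits in a few lines; the 100 MHz / 20 dB numbers are illustrative assumptions, not measurements of any real link:

    import math

    def shannon_capacity_bps(bandwidth_hz, snr_db):
        # Shannon-Hartley: C = B * log2(1 + S/N). No modulation or
        # coding scheme can carry information faster than this.
        snr = 10 ** (snr_db / 10)
        return bandwidth_hz * math.log2(1 + snr)

    # e.g. a 100 MHz radio channel at 20 dB SNR:
    print(shannon_capacity_bps(100e6, 20) / 1e6)   # ~666 Mb/s, hard ceiling
    # Doubling the signal power (+3 dB) barely helps;
    # doubling the bandwidth doubles the capacity:
    print(shannon_capacity_bps(100e6, 23) / 1e6)   # ~765 Mb/s
    print(shannon_capacity_bps(200e6, 20) / 1e6)   # ~1332 Mb/s

Which is the whole point of the wired-vs-wireless argument: with fibre you can always add more B by laying more strands, while the shared radio spectrum is fixed.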
Modulation is definitely the wrong word in this context. I was thinking about channel coding and compression algorithms that increase the efficiency of the system without requiring additional capacity.
A lot of confusion in the replies below. Here is an illustration of this point: suppose you've saturated your communication bandwidth between two points, sending bits through the entire 500 THz spectrum. If you have a wired connection, you can lay another set of cables between these points and send more bits. With wireless, there is nothing more you can do. No amount of beamforming helps you.
Intel was really pushing WiMAX at one point. I really irritated someone there when I wrote something critiquing their efforts. We are seeing wireless technologies (including satellite) starting to handle some use cases where it's hard to run wires. But I do expect denser areas to remain mostly wired.
Yup. I’ve always questioned people who think wireless can replace a wire at scale. It just can’t.
That being said I can browse most sites and do 90% of my internet just fine either on my phone or tethered to my phone using my data plan. So maybe I’m wrong!
While wireless won't ever replace wired for every use case, I don't see why we can't get to a point where 90% of households are provided internet service via cellular technologies, either through a MiFi-like device or with cellular built into the computer. If my iPhone, iPad, and Apple Watch have cellular built in, why not my Mac too?
But why would you want to have that? Simply put, with wireless you are trading the convenience of having no cables for essentially everything: wired has better throughput, lower latency, and lower bit-error rates resulting in fewer retransmissions (better goodput).
I'll admit there is one aspect where wireless might be interesting: comparatively lower cost of initial set-up, especially when infrastructure is lacking. In other words, if you're trying to provide Internet access to a developing country (little existing cable/telephone infrastructure), wireless might be an interesting intermediate option. But if you're looking at an industrial nation where (rough guess) more than 90% of households already have some kind of cable/telephone/fibre infrastructure, wireless would be a downgrade.
This sounds like a very "faster horses" kind of comment.
Simply put, with SSDs you are trading the convenience of having faster boot times for essentially everything: HDDs have (a lot) more space, lower prices, longer lifespans.
Simply put, with electric cars you are trading the convenience of charging at home for essentially everything: Gas has higher range, maintenance is easier to find, filling your tank takes 2 minutes instead of hours and that tank's capacity doesn't diminish over time.
Simply put, with screens you are trading the convenience of not having to deal with paper for essentially everything: Printers have better resolution, paper is easier on your eyes, it's more portable.
Wireless is a worse product than wired, but you trade the worse product for the convenience of not having the wire. None of the examples you give match that pattern.
SSDs are a better product than HDDs: they trade capacity for speed, but if you need capacity it's likely you don't really need the extra speed, and you're likely running into other bottlenecks anyway (an HDD can still saturate 1G Ethernet).
Electric cars are a "worse" product, but for most use cases they're functionally equivalent. Most people don't drive farther than the range in a day, maintenance is at the same place as ICE, slow recharge times are easily negated by just plugging in every night and charging while you sleep.
Paper is a worse product than a screen because a screen can be updated. It's about as easy to read, the resolution is about the same, they're equally portable (see a cellphone), but you can put just about anything on a screen and update it many times a second. You can't do that with paper, which is why screens have mostly replaced paper.
You connect literally all those things to a wifi hub in your house most of the time, and the likelihood of big data consumption on all of them is quite low.
It's a huge difference to think you're going to service all, rather than just some, user workstations' data requirements with just today's cell towers - which isn't to say you couldn't make it feel similar, but at some point you're cramming a 5G tower into every apartment, and those still need fiber backhaul.
The radio is the most trivial part possible; in fact, computers with cellular radios have been a thing for a very long time.
The little micro cell tower alternative arrangement exists to support devices that expect to communicate with a cell tower where cell service is poor. You normally plug them into your wired router, so they are really for areas where relying on wireless would be the worst possible experience.
Fiber is able to provide Gbps to, for example, all homes in a square mile, where they each get Gbps and can indeed use it fairly heavily. Cellular internet isn't actually wireless end to end: you run fiber to the towers, and then everyone in that square mile shares the tower's capacity.
Urban areas where 80% of people live have a high density so for example in New York City that 1 sq mile contains 27000 people.
Quality of Service is definitely one of the metrics ISPs track; wired is a more reliable way to guarantee service than wireless in a fixed setting, up to the house.
Directional antennas completely break those bandwidth limitations. It's not currently practical for handheld devices to make significant use of them, but ultimately everything is point to point.
As someone working on 5G, I disagree strongly. 5G supports up to 1 million devices per square kilometer (the densest city in the world, Manila, has about 50k people per square kilometer). My understanding is that you can have about 100k devices per square kilometer before speeds drop below ideal (20 gigabit). So even if everyone in downtown Manila wanted to stream seven movies at once, they could all do it. As other commenters have said, that's because of clever techniques like beamforming and OFDMA.
But cities aren't really the area where 5G home internet provides an advantage. Sure, the protocol is more efficient than WiFi - the tower allocates bandwidth and time blocks to coordinate transmissions instead of having everyone's router blast away on the same hardcoded WiFi channels. But it still achieves those high speeds and great connectivity by having a bunch of small cell towers - hardly much different than wired internet.
The real advantage is suburbs, small towns, and rural areas. You solve the last mile problem for cables, eliminate outages, and you do it all with a couple large cell towers.
Yeah, people (or more likely businesses) with high throughput needs will still use wired connections. But it will be because the electricity costs of transmitting from a 5G tower are high enough that there’s always a base cost per GB, not because there isn’t enough bandwidth to go around.
Why do you expect 5G to eliminate outages when we can't eliminate them with existing LTE towers, with home wifi, or even with direct ethernet connections to modems?
As of now, my least reliable network is always my cellular network.
> you can have about 100k devices per square kilometer before speeds drop below ideal (20 gigabit)
When, though? We're far, far from this at the moment. What is going to fix it and when?
I think you're imagining some centralised MMDS-type architecture, but with cellular architecture this is a reality in many places. Even in a modern setting, if you have WiFi you need never even be aware that there is a wire feeding it. Indeed, to many of my younger contemporaries it can take a moment to explain the difference...
I understand there's a horizon for cells (VHF and up isn't going to be reflecting off the ionosphere), but a whole lot of people can exist within a single cell. My argument is about the informational capacity per cell.
Well cells are getting smaller all the time and currently there is enough capacity available in an optimally configured network to provide all the services one could possibly need. Of course you’ve got physical cable tying it all together but the “experience” is wireless.
Each cable node shares ~1GHz of bandwidth among all the separate runs to individual houses. Relative to that, 5G has similar local bandwidth, given the low propagation of mmwave.
It's not going to completely replace hardwired internet in suburbs/cities, but it might just kill last-mile fiber rollout short of new construction.
Can it exist and provide a usable service to some? Yes, probably.
Can it replace cabled internet, period? Can it replace terrestrial cellular? Unlikely:
Mobile internet is constrained by the capacity and coverage of the cell. "Capacity" being the practical sort of what the maximum achievable information rate is, given the available receivers and transmitters. Satellite links are a bit complicated, but essentially the Shannon-Hartley theorem holds and you either have to increase bandwidth or increase signal-to-noise ratio to provide better data rates within a cell. Bigger cell, more users, less capacity per user, worse data rate per user.
I don't know what the cell sizes Starlink will use are, but I've read numbers around 200 km^2. My hometown of 10000 homes is 21 km^2. If they were to provide 100 Mb/s just to my hometown, they would need a 1 Tb/s capacity in a much smaller cell. Of course, not everybody will use 100 Mb/s all the time, but it illustrates the problem.
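The back-of-envelope arithmetic, using the rough numbers above (not actual Starlink specs), if you want to vary the assumptions:

    homes = 10_000          # homes in the town
    rate_mbps = 100         # promised rate per home
    oversubscription = 1    # 1 = everyone maxed out at once;
                            # ISPs typically assume far more sharing

    needed_gbps = homes * rate_mbps / oversubscription / 1000
    print(f"{needed_gbps:,.0f} Gb/s of cell capacity")  # 1,000 Gb/s = 1 Tb/s

Even with generous oversubscription assumptions, a single 200 km^2 cell covering many such towns runs out of capacity quickly.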
This is why SpaceX is mainly focusing on low-density areas in places where the infrastructure is less developed or monopolized. The capacity/coverage problem works much better to their advantage.
The 3D TV phase was so odd. It really seemed like an inevitability: there was a technological breakthrough, it was genuinely much better than what came before (the red and blue stuff), it was relatively inexpensive, and on paper it seemed like a strict improvement on the status quo (flat TV).
But... somehow, I never actually found myself wanting it that badly at home. And I guess everyone else agreed because one day it just vanished.
So few technologies follow a trajectory like that. Usually you either get a success or a failure, where successes only end once they're replaced by something new and strictly better. But this was truly a fad; it was kind of successful for a minute, and then everybody lost interest.
It’s honestly just more trouble than it’s worth. I have a 3D TV (by relative accident) and I’ve never used the 3D functionality. Even going to the movie theater it was honestly never a big draw once I got over the novelty of it, which happened approximately 2/3 of the way through Avatar. That was almost the only content that was ever specifically made for the 3D format—a quickly forgotten, crappy science fiction movie that wasn’t even that much better in 3D compared to 2D.
VR might be another story, since VR content has a much more convincing 3D effect. But that entails pretty fundamental changes to cinematography, and requires the viewer to wear a massive headset that blocks out the real world; having some friends over for a movie night is virtually impossible.
I saw it in theaters because it was advertised to be the best use of the new and improved 3D film technology at the time, so I was part of that high box office gross. Despite the strong positive reception at the time, though, the movie didn’t really seem to achieve a lasting legacy.
It doesn't sound like they really started until 2017, but the movie appears to be in post-production. I think the delay from 2021 to 2022 was about finding a good release window.
VR movie watching is surprisingly good, and it works well with the few 3D movies that exist too, because it can just send a different frame to each eye. You can definitely lose yourself in the film in the same way you do at the cinema.
Check out Big Screen too: it lets you watch movies together with people in the same virtual space. It might not be great for an IRL movie night, but it's great for a remote one.
> Check out Big Screen too: it lets you watch movies together with people in the same virtual space. It might not be great for an IRL movie night, but it's great for a remote one.
I’ve had enough of that remote shit over the past year and I don’t think I’m the only one.
As someone who had had a surround sound setup, but then not even bothered to set it up after moving, it was pretty obvious to me even back then that 3D TV was in the same category of "technically better but practically pointless" for the vast majority of movies, which people would forget about as soon as the ads for it stopped.
Yeah, I was buying a TV at the time and got one that supported 3D. I have maybe a half-dozen 3D movies and I'll pull one out once a year or so. Gravity is probably the best example of using 3D.
I wonder if we just hit a technology wall. The glasses were a pain, the glasses-free versions didn't work so well, and nothing better could be found.
People were willing to put up with the inconvenient version for the novelty, but they quickly wanted something better and no one could provide it. 3D video may come back with mobile VR devices.
I don't think that people simply didn't like 3D video, only that we weren't able to give a good enough experience of it; if one day someone invents something that does it better, it will come back.
I don't think 3D video was really a different class of media. 3D would never replace animation, but IMO real 3D would replace 3D projected onto a 2D screen, just like 4K replaced FHD and FHD replaced HD.
I honestly think AR is poised to blow up, there has been a lot of sustained activity in the space for years - including lots of exploration of business/industrial uses - and we haven't even seen a true consumer device yet (Google Glass doesn't count). It's going to simplify/streamline a decent subset of the top uses for smartphones, while also providing novel experiences.
I do believe VR is here to stay, but I really think that's going to be more gamer-centric. The 'social' aspects of VR seem to be very, very niche; most of what users are talking about (and spending time/money on) is games.
All of the AR use cases seem to have been mostly replaced by the smartwatch. The only thing the smartwatch is missing is an always-on camera and the mixing of real and fake images, which seems to be not as useful as getting notifications and navigation info on your watch.
Football (real football, the one with actual kicking) was pretty good in 3D. Occasionally it got a bit weird when the ball would move in a certain arc relative to the camera but mostly it was an improvement, and the crowd scenes were kind of mindbending[2].
It's how I imagine football would look if the animators of the original Paddington Bear series[1] were rendering it.
I'm still salty about that. I love 3D movies, even with the fuzziness and the glasses (I wear glasses anyway so, I mean, what's the big deal?) but nobody else can be bothered, and now you can't buy them any more. I really hope my current TV doesn't die.
Do you mind sharing what you love about 3D movies? Even ignoring the glasses and the fuzziness (and the headaches and the blurry vision in my case) I can't find one positive thing about it.
I enjoy the extra sense of immersion and the added depth, and I feel like I get a better understanding of what's going on. I get a sense of 'being there' that I've never really had with 2D movies.
It's because of the lack of 3D content for TV. Occasional 3D movies, which you'd be better off watching in a cinema, are not a very big incentive. For games there are VR headsets.
It wasn't just the content. The glasses weren't included and were almost always super expensive. Or at least costly enough to make it something you'd have to really, really want.
My dad has a 3D TV, and I found the main issue was just the fundamental physics - unless the TV is absolutely massive, like a cinema screen, there are fundamental limits on how "3D" it can appear.
This meant that even the relatively small amount of 3D content didn't really feel that 3D, it just felt like more realistic 2D.
And if there's only a small amount of content you might as well just go to the far superior 3D cinema for the experience.
I'm convinced 3D TV/movies only had that resurgence because movie theaters were using "3D" (along with some other gimmicks) to raise the price of tickets.
I found 3D TV broke the experience. I'd get into a movie, then at some point into where the 3D effect would get shown off, and it would pull me out of the movie entirely. Perhaps that is more about poor usage of the technology than a problem fundamental to 3D TV.
For a while, it was really hard to go to the theater. Everything interesting was only in 3D. I didn't want to pay more for an experience I liked less than the previous status quo.
1. Mechanical hard drives. These have become a bit more niche, but if you deal with larger files like that movie library the author ought to rip from those fragile optical disks he still has lying around, spinning rust is the far more cost-effective option for storing files you don't use all the time.
2. Phone numbers. His original prediction is that people wouldn't use them, not that people wouldn't remember them and dial them manually. A number of popular messenger services including Signal use phone numbers as identifiers, and I wish they didn't.
3. The fax machine. This absolutely, 100% deserved to be dead a decade ago. Perhaps most faxes aren't actually sent using physical machines anymore, but a lot of businesses and some government institutions treat fax as more secure than purely digital file transfers. My vote in the 2020 US election involved a fax. I'm disappointed in the tech community for not producing a solution that achieved near-universal buy-in from more conservative institutions in the past two decades.
4. Optical disks. I haven't used one in years, and I suspect a lot of other people here are in the same position. There's not that much content I want to watch more than once, so the issue of streaming non-ownership isn't a big problem for me. The author has kids, and kids definitely do that, but there are both legal and less legal ways to obtain permanent copies of content by purely digital means. Note to content sellers: I'm happy to pay for content; don't make it so difficult for me to do so that I seek alternatives.
If we define hard drives and optical media as "memory based on periodic rotation", then the technology has been around for generations, with specific materials and mechanisms serving to update the paradigm: tape, drum memory, magnetic discs, and optical. Periodic rotation trades blows with periodic refresh (tubes, DRAM) and phase change (relays, core, flash) as the prevailing memory paradigm, with hybridization of features often occurring.
Von Neumann wouldn't be too surprised at the memory paradigms in use now, because they haven't changed, even though the implementation has become considerably more sophisticated.
But, of course, that can't dismiss the idea that a better implementation is a substantial technical advance either.
> 1. Mechanical hard drives. These have become a bit more niche
No, they're still pretty mainstream. When someone runs out of disk space (e.g. with media files), they tend to buy a large HDD (whether external or internal), not a second SSD.
So, your point about HDDs is even stronger, and the author's son will definitely get one.
They are basically a legacy mode now. The PlayStation offers an option with no disc drive, and I imagine the next gen will have no disc drive at all.
Console makers want it to be legacy because they can charge a higher price for a game when there is only one store in town compared to multiple stores when it comes to discs.
It's honestly really hard to find less well-known movies more than a few years old. Torrents really aren't what they used to be. There were a handful of movies I'd seen in my teens that I could not find anywhere. I ended up importing DVDs from Japan and manually aligning some awful fan subs I'd found.
Call me crazy but I actually recently switched back to DVD Netflix. The selection is way better than any individual streaming service and I have this timer ticking in my head that makes me actually watch what I rent instead of wasting my life watching reruns of comfort shows. It’s been fantastic.
Our household doesn't have any optical drives. We have an xbox360 and a blueray/dvd in storage that hasn't been used since before our last move 3 years ago. So, essentially it's non-existent.
Actually, I'm wrong: my wife's desktop does... and our car does... but I mean the frequency of use is like twice per decade...
"On the bright side, you can replace almost any remote with a smartphone app, depending on your TV, cable box or streaming box. You can also use voice assistants such as Alexa or Google Assistant to control your home theater. "
I fail to see how this is the bright side. Both of those ways are worse at interacting with literally anything, especially compared to a dedicated remote for a TV. I'd try a foot controlled pedal for my TV before I'd be ok with using voice controls.
I recently decided to install smart bulbs in a couple of lamps. I set them to "on" permanently so that I can exclusively use the app to turn them on/off and adjust their brightness and warmth.
After having them installed for a few weeks, their benefit is mostly a wash. I'd rather just flip them on/off, but unfortunately one of the bulbs doesn't retain its "memory" of the last setting this way, so to use the features, I have to use the app and keep them on. Additionally, the app sometimes takes a second or two to connect to their service, so I'm standing in front of the lamp waiting several seconds just to be able to turn it on/off. This is definitely a case where a manual switch is so much better. It. Just. Works.
For this, I can't recommend an ISY994 controller and Insteon light switches and outlets enough. You get physical control, automation via scripting and events, and app control, and I've never had a (noticed) problem with the devices remembering state.
Voice controls via Alexa or any other home assistant actually work pretty well for smart lights. No fumbling with a device, just tell the lamp you want it off. You can also group them together usually to turn off multiple with a single command.
> but unfortunately one of the bulbs doesn't retain its "memory" of the last setting this way
Yeah, my last smart light setup a few years ago had this problem too. I ended up running a script on Raspberry Pi that detected when a light bulb appeared on the network again and reconfigured it immediately.
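For the curious, that watchdog doesn't need to be fancy. A minimal sketch (the IP address is made up, and reconfigure() is a stand-in for whatever vendor-specific command your bulbs actually take):

    import subprocess
    import time

    BULB_IP = "192.168.1.50"   # hypothetical address of the bulb

    def bulb_is_up(ip):
        # One ICMP echo with a 1 s timeout; returncode 0 means it answered.
        return subprocess.run(
            ["ping", "-c", "1", "-W", "1", ip],
            stdout=subprocess.DEVNULL,
        ).returncode == 0

    def reconfigure(ip):
        # Stand-in: send whatever command restores your preferred
        # brightness/colour state after a power cycle.
        print(f"{ip} came back, restoring last state")

    was_up = bulb_is_up(BULB_IP)
    while True:
        up = bulb_is_up(BULB_IP)
        if up and not was_up:
            reconfigure(BULB_IP)   # fire once, on the offline->online edge
        was_up = up
        time.sleep(5)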
But my ultimate conclusion from that setup was, smart lights make no sense without smart switches. You want to be able to both actuate physical controls and switch the lights through software.
I recently bought a few smart bulbs for my desk, which are always connected to power. But I also bought a physical switch that controls them. It's instant and turns them all on or off simultaneously.
To be honest I almost always use the switch, including changing the color. Cycling through colors with a button is still more convenient than opening an app and picking one. I really only use the app when I need to control individual lights.
If you are on iOS and your bulbs work with Apple's Home app (which most seem to these days), you can set that up and just swipe up from the bottom of the screen and use the quick controls. Works quite well. I think a similar thing is possible on Android with the Google Home app, and more vendors seem to support that.
Also going down the smart device route. I recently tied 2 switches and a remote fan into a single action. I've set up a time filter too, and now my lights can be shut off and the fan turned on when I go charge my phone before going to bed.
I'd always wanted a "light alarm" to wake me up, now in the morning I set my lights to turn on if I'm not out of bed (phone still charging).
Are there better places you could use the lights? Where the flexibility is more useful?
I agree. I can’t even turn them on and off away from my home for some reason. That was the main reason I bought them. The other reason was for the color temperature and dim control.
I am not a user, but I believe the idea is to have the light settings be part of a larger macro.
For instance, you're upstairs and you wish to watch a movie in your home theater in the basement. You hit the button on your phone and while you travel to the basement the lights lining the hallways and staircases come on (if off); the projector turns on and switches to the movie input; the receiver turns on and switches to the movie input; the theater recliner adjusts to your preferred position; the bias lighting turns on; and two minutes later the lights lining the hallways and staircases revert to their previous state.
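Under the hood it's just a timed script. A sketch of the idea, with print-statement stand-ins for whatever API your hub actually exposes:

    import time

    def hall_lights_on():      print("hall + stair lights on")
    def projector_to_movie():  print("projector -> movie input")
    def receiver_to_movie():   print("receiver -> movie input")
    def hall_lights_restore(): print("hall lights back to previous state")

    # "Movie night" as (seconds-from-start, action) pairs.
    SCENE = [
        (0,   hall_lights_on),
        (0,   projector_to_movie),
        (0,   receiver_to_movie),
        (120, hall_lights_restore),  # two minutes to get to the basement
    ]

    def run_scene(scene):
        start = time.monotonic()
        for delay, action in sorted(scene, key=lambda step: step[0]):
            time.sleep(max(0.0, start + delay - time.monotonic()))
            action()

    run_scene(SCENE)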
Ok but this is still mostly for novelty though, right? It’s not really about saving 2-3 seconds of work I presume (probably takes an hour to program it just right), it’s more that it feels cool and futuristic I imagine.
Especially since this kind of thing has only been possible in really high end homes until now, so it feels exclusive. At least that’s my read.
With my new TV, I have a "modern" remote with few buttons because everything happens in the UI. When I watch TV I need sometimes to enable subtitles (only on non-French speaking channels). With a traditional bulky remote I had a subtitle button: simple and straightforward. With the new remote I must click on a menu, navigate through items, select subtitles, and close the overlay window. It's really annoying. I miss old big Sony remotes with a lot of options for subtitles, image ratio, sound, speed control...
While I agree with you in principle, I don't in practice.
The whole problem is that TV remotes keep changing. My TV now we have two remotes, one that came with the TV and one for Google TV.
I used the TV remote the other day out of necessity...and it took me about 20 seconds to find the mute key. On the GTV remote, it's one of like 5 buttons.
I'll always prefer physical buttons to menus, but not when the layout and meanings of said buttons change every year.
Our new* black TV remote includes a big, white with red text Netflix button. We don’t have Netflix and thus will never use this button. The rest of this remote is fine.
I remember when someone brought an Amazon Echo Dot to a party to play music.
It ended with a bunch of drunk guys desperately shouting what music to play next, with Alexa getting it right maybe half of the time.
I have two remotes: one for my TV, and one for my AV receiver. I realized I generally only need four buttons: the power button on each, and volume up and down on the AV receiver remote. Arguably that could be compressed to three; I very rarely want the AV receiver on and TV off, or vice versa (unfortunately neither remote supports programming it for the other device, so I can't use the "system on/off" functionality).
So for my regular use, I wouldn't mind voice control for this, though voice assistants have trouble hearing you when there's extraneous loud noise, so volume control (especially when a really loud scene comes up and I want to lower it) would be difficult.
I don't like the idea of a smartphone app, because there's the hassle of unlocking my phone and finding/switching to the right app.
The author also mentions using gestures, which also seems very error-prone.
So I guess a dedicated remote control is the way to go, though I wouldn't mind having a single bare-bones power+volume remote, so I could toss the full-featured remotes into a drawer and only pull them out when (rarely) necessary.
“Alexa, watch Daredevil” is much easier than turning the TV on, opening Netflix, and searching for the show.
The remote can then be for just pause/play or volume.
Search is the only time I use voice commands, and it's only because "typing" with the directional pad sucks. If my remote had a mini keyboard instead of buttons for rarely used features or specific channels, I'd never use voice.
The older I get the more I realize how incredibly hard it is to predict anything about the future, especially when it comes to technology.
Growing up in the 90s I thought virtual reality was just around the corner - only now 30 years later are we starting to see virtual reality.
5 years ago it seemed like 100% self driving cars were just around the corner. You can argue we are much closer than we were, but it still seems like we are pretty far away.
10 years ago, the web was dead and apps were the future... today hardly anyone believes apps are the future.
I imagine that travel to Mars, and a moon colony seemed like it was just around the corner in 1970 and yet here we are 50 years later and neither one of those came to fruition.
The reality is tech is incredibly fast moving, which makes it hard to predict, but still not as fast as we think it is.
I've got a long-standing bet with a friend that 'VR' will take off as soon as it becomes 'AR', ie transparent glasses that overlay information on the real world.
That, IMO, is the killer feature, and once it hits takeoff, the headset era of VR will be looked back on as a necessary stepping stone, ultimately replaced entirely by what ends up in everyday use.
VR by itself is probably a thing. There are times when you want an immersive experience such as gaming or virtual exploration. But I expect it's a niche. I'm not wearing VR to your virtual meeting.
AR, in the unobtrusive/genuinely useful sense, is harder but seems far more interesting. Yes, there are social factors to deal with as well, but I can certainly see worn information displays becoming a thing.
I don't disagree! Rather, it's just that I expect AR glasses to have a fully-blacked-out mode when necessary, and those full-immersion times will just be one (small, I'm willing to bet) mode of the overall headset.
As a side note, my friend and I first made this bet back in the DK2 days, and I was ~60% confident I was correct. What pushed me in to the 90%+ region was playing with an Oculus Quest. The guardian mode, freedom from wires, hand tracking, pass-through mode, etc... Everything that felt like a real step forward was also something that will ultimately apply to AR glasses. It really made me think I was on the right track.
I think the other thing that's happened with VR is just the quality/size of TVs generally. No, that doesn't cover a few specific aspects of VR like flight simulators and FPS. But having a high-res 75" or whatever screen in front of you basically handles "virtual reality" for anything that doesn't involve looking around.
Can we be so certain of this, especially given the events of the last year? If VR technology had been perfected at the time, it seems very likely to me that instead of a shift to Zoom at the outbreak of the pandemic, many companies, government agencies and (especially) schools would have made the move to VR. It will be interesting to see how VR is integrated into our every day lives (both voluntarily and otherwise) as it is perfected.
Because that's not how people attend meetings. Meetings are mostly not-full focus events. That's not to say that VR couldn't have a role in, say, an in-depth review of a hardware design. But, the typical meeting? People are turning their cameras on and off and are probably spending about 50% attention depending upon how relevant the current topic is to them. This of course happens in the physical world as well.
>Meetings are mostly not-full focus events. That's not to say that VR couldn't have a role in, say, an in-depth review of a hardware design. But, the typical meeting? People are turning their cameras on and off and are probably spending about 50% attention depending upon how relevant the current topic is to them.
This is true, at least in part, because Zoom meetings allow this, not necessarily because this is the behavior that an employer views as ideal. If VR technology were well developed, I think it's likely that many employers would hold VR meetings for the very reason that employees would be forced to give 100% of their attention. Certainly this is true when it comes to schools, which have done everything possible to ensure that students are paying 100% attention all the time, aren't using supplemental materials during tests (cheating), etc.
>When North Carolina A&T State University junior Arielle G. Brown took her International Marketing exam in September, a cheating-detection program analyzed her behavior through a computer webcam the entire time. After the test, her associate professor fired off a furious email ripping into her class for some “negative behavior” the software had flagged.
>“A STUDENT IN 6 MINUTES HAD 776 HEAD AND EYE MOVEMENTS,” she wrote, adding later, “I would hate to have to write you up.”
Everyone keeps recommending vacuum robots to me, but they don't solve the hard parts of vacuuming. A vacuum robot will not move furniture around to get the dust in the corners. A vacuum robot will not clean the dust on my desk, or in my keyboard.
It is very true that a robot vacuum will not solve ALL your vacuuming problems.
It really only solves maintenance cleaning, i.e. doing the main bits that get dirty most often, like common walkways, near the front door, etc.
For me, that is 90% of the work. I only live in a small place, and my wife and I appreciate it. In a large house, with 3 or more kids? I'd buy one for every floor if I had to.
For the other 10%, I have a dyson battery powered stick thing. Gets into the nooks, does the keyboard and the car too.
And there is something nice about being able to do the dishes, wash and dry clothes, vacuum the floor; all while playing games.
But there are already glasses like this? Google Glass, Epson Moverio, Magic Leap are the first few that enter my mind. And none of these seem to really be "taking off". Sure, there are niche applications that match the constraints of these, but it's not clear to me at what point your bet would be considered to have failed because of a lack of "taking off"...
The only thing that comes close is the Hololens, and if you ever get a chance to play with one (which I do recommend!) you'll see immediately why it Isn't There Yet™. The biggest killer is that the field of view is tiny- think a single A1/Letter sized sheet of paper held at arm's length. It feels less like AR and more like a view portal, and since it currently has no way to block light behind its projections, everything is washed out. Not to mention that it's closer to the headset side of things than regular glasses.
It's certainly a start, but there's a long way to go.
Nitpick: you mean A4. As an aside to those who don't get to use ISO A paper sizes, they are très cool. The ratio between the edge lengths is the square root of two. A0 is 1 square metre in area. Each step (A1, A2, A3, ...) just chops the sheet in half.
As an aside, there are also ISO B and C series paper, although I believe that the C standard was withdrawn somewhat recently. The area of B_n is the geometric mean of that of A_n and A_{n-1}, while the area of C_n is the geometric mean of that of A_n and B_n. Without context, C4 paper sounds quite scary, until you realize it's probably related to A4 paper in some way. As to why it isn't just called A3.75, I have no idea.
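The whole A series falls out of two constraints - the area halves at each step and the aspect ratio stays sqrt(2) - which is easy to verify:

    import math

    def a_size_mm(n):
        # ISO 216: A0 has area 1 m^2 and aspect ratio sqrt(2); each
        # step down halves the area (cut across the long edge).
        area = 2.0 ** (-n)                      # m^2
        short = math.sqrt(area / math.sqrt(2))  # m
        return round(short * 1000), round(short * math.sqrt(2) * 1000)

    for n in range(5):
        print(f"A{n}: {a_size_mm(n)[0]} x {a_size_mm(n)[1]} mm")
    # A0: 841 x 1189, A1: 594 x 841, ... A4: 210 x 297

The sqrt(2) ratio is what makes the halving work: cut a sheet in half across the long edge and the two halves have the same aspect ratio as the original.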
It's bizarre how inverse-correlated motion sickness susceptibility vs VR sickness appear to be. I don't get motion sick at all, but even a few minutes in a poorly-designed VR experience makes me want to puke. Some of my friends are complete opposites. Most people I know are lean one direction or another, usually pretty strongly.
I do wonder if this is a simple statistical fluke, or if it's pointing at some deeper aspect of our biology.
But motion sickness is when there's acceleration. That is, the physical acceleration is the cause of the sickness. It's not the visuals.
VR sickness is when there's somewhat equivalent visuals, but not acceleration. So I could see motion sickness and VR sickness being essentially opposites.
Armchair psychophysiologist: extra attuned to your eyes, and you get VR sickness because your body keeps refusing to follow what your eyes are telling your brain. Extra attuned to your body, and you get motion sickness because your eyes keep refusing to confirm what your body is telling you.
Anecdotal evidence: if you get motion sickness, you can help mitigate it by focusing on the horizon (ie, feed your eyes what your body is already telling you).
If you get VR sickness, maybe you can concentrate your focus on your body/breath to mitigate.
I worked on a relatively compact head mounted display that the company thought would be successful for 3D movies (this was 2005/6). They also imagined that users would want to watch this content while traveling on airplanes. We did some (cheaper) user testing by putting people in the back of a limo and driving on highway 280 in the bay area. One of the users had to pull over to vomit. He was an ex-fighter pilot.
I get very motion sick, but I do fine in VR. The only time I get sick with VR is when the motion is not correlated with my head movement.. for example, if you turn the camera with a controller.
I was "walking" around in some VR meeting thing called RecRoom and I wanted to throw up. If I teleported it was mostly okay - still nauseating, but holy cow, moving smoothly like that physically hurt.
Yeah, I really wish people qualified where they're having the problems.
Seated experiences where you use smooth locomotion are bad. Aircraft (or spaceship) cockpit type experiences are even worse because not only do you have translation that's out of sync with your inner ear and sense of movement, but you have rotation as well (which is much worse).
also, newer headsets are much, much better than in the 90s.
Right, but as I'm sure you realize, if that's the case, then mass-adoption of VR is a non-starter. We /love/ games and movies that transport us to places. There are very very few movies with a static camera, though I do happen to like them.
I am not sure. In the past year I have actually started to see quite a few non-techie friends buy the Oculus Quest, and they are avid users.
I do feel like we are at a turning point where VR will become the dominant non-mobile gaming device of the future. Whether it will be used much outside of gaming is the big question.
Agree with you. I think we've just entered a period of rapid adoption, and the driver is FB making better and better low-cost entry level devices, hopefully pulling along more and more competitors.
I think VR headsets are within a factor of 10 of the cost of a good monitor, and within a factor of 5 of the angular resolution of 20:20 vision. It seems very plausible to me as the resolution goes up and costs come down that e.g. a company would start pushing employees towards a headset instead of multiple monitors within the next decade.
DRAM would like to have a word with you. And I can't say laptops have improved much for a decade now (you'll still see an increase in quality per euro, but it's very marginal and no single component made leaps); I was rather disappointed when getting a new one after my 2011 model. Back in the good old days, three years would go by and I'd love the speed of a new one, or of SSDs, or the gigabit ethernet, or the 802.11n.
Smartphones too. 2011->2017 changed big time - in hardware at least. But now it doesn't really matter if you get a 250 or a 1000 euro phone. You'll get a better camera and some gimmicks like oled (objectively better, I went from oled to lcd and it bothered me for weeks, but honestly I'll use a phone for years, I'm used to it now, and don't miss it at all anymore) or usb-c. Can't do much more without reducing battery life or other aspects. And sometimes they do -- make a worse phone with better specs I mean. People don't mind because it's all the same anyway, the tech is stagnating there.
As a rule, you're right, but the exceptions and limitations are also everywhere.
Sure, things that were produced by teensy niche companies got cheap when they met the economy of scale. That's true for everything, not just tech. On the other hand, phones are way more expensive today than when the original iPhone came out for 'just' $499, while wages have held steady over that time.
This is true. Another thing that drove the prices down is the fact that the countries where we outsourced manufacturing have built up their own industry and know how. They are now selling the same products for half the price directly to the consumer.
And $20 can get you an amazingly fun kiddie drone. I got my son an HS drone and was amazed at what you can get for 20. Yes, the drop in prices is a big part of technology evolution.
I still use safari on an iphone6 to browse youtube, reddit, and whatever else.
I like that it saves my tabs for me. I have old ones sitting from nearly 5 or 6 years ago that I still go back to occasionally.
I also like that it purposefully makes browsing social media and YouTube clunky, so that I spend less time there and am less inclined to use it all day. I can even have a reddit account without ever getting any kind of push notifications. It's great!
That being said, I am probably going to get a linux phone once this one finally leaves me, so I do not represent a large segment of the consumer base.
I think the people who were marketing/hyping VR and self-driving cars for their own gain made it seem like those technologies were just around the corner.
And also just boring, mundane “futurists” - it’s not as hard to predict the direction of technology as the timing. Futurism is a kind of inspiration porn, I’d say.
> I imagine that travel to Mars, and a moon colony seemed like it was just around the corner in 1970 and yet here we are 50 years later and neither one of those came to fruition.
It was just around the corner. The grand arc of human life altered course, and it took the market 50 years to catch up to collective action.
Well, war and the resulting competition drove the space race. Once the Soviets and Americans decided that space wasn’t worth fighting over they spent their money elsewhere and we stopped advancing in that direction.
I usually give a lot of the credit for the recent push back into space to private companies like spaceX, but, now after making this comment part of me is wondering if I am just being naive and the real reason for space becoming a priority again is because China has started to make significant progress in their space program.
SpaceX is doing things, but I wonder if that is just a result of the excess capital we have in the markets... because nobody honest can really calculate a reasonable return on investment for something like the colonisation of Mars...
I don't think so, in this case. Their initial starting capital, though, was won in the tech-startup lottery...
As for the Mars thing, SpaceX is funding it with their profits from boring commercial launches, and I don't think that is driven by excess capital on the market.
This is exactly what I want to build and think it's one of those foundational building blocks that we can't believe we lived without. Files stored on IPFS with their hash being their universal ID. Metadata would be associated with that IPFS hash such as name, maintainer, description, etc. The storage would be a smart contract on Ethereum with the key being the IPFS hash and the value being the current metadata. It's basically a distributed web app store. Throw on a Sybil resistant voting mechanism and you have a discoverable shared and programmable dataset of applications and modules.
If a project like this exists I unfortunately haven't come across it.
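To make the shape of it concrete, here is a minimal Python sketch of the data model; the dict is just a stand-in for the on-chain mapping, and every name here is hypothetical rather than an existing project:

    from dataclasses import dataclass

    # In-memory stand-in for an Ethereum contract mapping keyed by
    # IPFS content hash (CID). Illustrative names only.

    @dataclass
    class AppMetadata:
        name: str
        maintainer: str
        description: str
        votes: int = 0   # placeholder for a Sybil-resistant voting tally

    registry: dict = {}

    def publish(cid: str, meta: AppMetadata) -> None:
        # Associate (or update) metadata for an IPFS hash,
        # which serves as the app's universal ID.
        registry[cid] = meta

    publish("QmExampleCid", AppMetadata("demo-app", "alice", "a toy entry"))
    print(registry["QmExampleCid"])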
I vividly remember in high school in the late 90s...the Prius was out. I thought the first car that I buy (with my own money) would be an electric vehicle. Though EVs have been around, they aren't mainstream.
One place I still see point-and-shoot stand alone cameras is in construction/contracting.
When I've had companies come and bid on things like roof repair, for example, the person who comes to assess the job and prepare an estimate will use one. During the job I see the workers using them to document what they are doing.
I know they have modern smart phones, because I see them use them.
My guess is that they use the stand alone camera because of the better interface. Smart phones can be a pain to operate while wearing heavy work gloves, or to operate one handed.
Just this afternoon I wanted to take some photos of the gutters on my front roof to see if they needed cleaning. That section of roof is only about 7.5 ft above the deck, so it is easy to just reach up to where the camera has a view of inside the gutters. Except on about half the shots, some part of my hand would hit some on-screen control and do something to mess up the shot, such as changing the mode from photo to video.
It would have been easier if I had used my old Sony DSC-P93 [1].
There's also the "rugged camera" aspect. Some of the ruggedized point and shoots are designed to be used in the wet (most are rated for limited underwater use), to withstand drops from "chest high or a bit higher" onto hard surfaces without breaking, and generally tolerate levels of abuse that a smartphone, even with a case, usually won't put up with. I'll let my toddler run around taking pictures with a rugged point and shoot without worrying about anything beyond "Where on earth will they hide it when they're done?" I usually get something resembling a low FPS video of the house, mostly blurry, but they enjoy it!
There's also the aspect that, for construction documentation, everything is on the camera card. You don't have to pay attention to who has which photos, but, oh, their phone won't run the uploader app right, or their home ISP is slow, or... whatever. You use the camera, you take pictures, they're there.
And I'm still inclined to use a camera for documenting a number of my projects around the property. I don't have to worry about it getting beat up, I can use it in gloves, etc. Plus, the lens is better.
Maybe those guys use them because they can, or because they haven't caught up with the trend.
In my country, everyone uses a smartphone (practically speaking). Delivery, repair, inspection people, all use their smartphones to take photos and upload to destination via an app.
Apart from being convenient, this has a side effect that the organization can push device cost onto individual contractors and lay people. I'm fully certain western countries will catch up with this trend just to save a buck.
The reasoning behind this, that everyone is going to have a place big enough for a great home cinema setup, is laughably privileged and off the mark. A huge number of people live in absolutely tiny places, with neighbours, meaning the audio and video could never approach a decent home cinema.
Hard disagree from me. Large TVs have gotten so cheap and movies are so expensive. We found one of our TVs on the curb for free, and it’s miles ahead of what we watched tv on when I was a kid. Who can afford to go to the movies on the regular but not afford a tv?
And I know, cheap TVs aren’t what you meant by “decent home cinema” but for many people a 60-75” tv with a midrange soundbar (or decent headphones) is plenty. Both those things can be had secondhand. Obviously below a certain income level it’s still not doable, but I can’t imagine being so poor I couldn’t afford a secondhand tv but still willing to drop $15 on admission then pay $8 for a popcorn.
I think this prediction is laughable. There are many people around the world who absolutely cannot afford a high quality picture and sound system at home, but can easily pay the occasional price of a movie ticket.
Example in India. The average ticket price in a high-end movie theater is ₹239 ($3.18), most tickets are priced way below that [1]. The cheapest 4k 55" TVs I can find on amazon.in cost around ₹35,000 ($466)[2]. And these units will have the absolute crappiest screens and even worse sound. Anything smaller and you won't have a screen big enough to replace the theater-going experience.
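The break-even arithmetic on those figures:

    tv_price_inr = 35_000   # cheapest 4K 55" TV cited above
    ticket_inr = 239        # high-end ticket price; most are cheaper

    print(round(tv_price_inr / ticket_inr))   # ~146 high-end tickets

That's roughly 146 top-tier tickets before the crappiest qualifying TV pays for itself, and far more against typical ticket prices.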
I think your perception is distorted by the extreme high price of movie tickets where you live ($15 would be insane here). However TV prices are comparable. So obviously the TV looks more attractive to you.
I have a 65” TV and a great 5.1 surround sound system and it’s still incomparable to the feeling of actually being in the cinema with my vision filled and body rumbling, it’s like comparing the experience of an electric bicycle to a Harley.
What’s more, the context switch alone is worth it. In the cinema my job is to watch the film. At home I’m thinking about chores and checking my phone. It’s not remotely the same.
Don't you have to pay extra for the movie anyways? I recall Disney recently charged a movie ticket price for the showing of their latest princess movie.
At that point, you are really only saving on snacks. I agree that that's not nothing, but it's not as if buying the TV grants you the cost of admission to the movie as well.
I don't own a TV and I still go to theaters for the occasional movie. I've always felt that I got my moneys worth when going out to the theater. Plus, having a large TV mounted somewhere is just another piece of furniture that I have to account for.
I think a home theater system makes sense for a diminishing segment of the population. A theater is cheaper if you are only interested in major releases.
I don't think going to the movies is mainly about the pictures and sound. At least it was never for me. I think it's about the atmosphere and the event with friends, family or partner. It's just different watching a movie in the theater vs watching it at home.
That's why I don't think it will go away anytime soon.
so for one thing, it really depends whether "great" means "loud" to you. I personally find the levels in most theaters to be uncomfortably loud, but maybe I'm just lame. unless you need to feel the floor shake, it's totally doable to have a great home theater experience in an apartment. $2000 can get you a pretty nice 55-60" tv and a capable audio system to go with it. $2000 is a lot of money for some people, but each movie outing for two is going to run $25+ these days. and this is for enthusiasts. a "just okay" HD setup is more like $500.
I kinda suspect it will be more like a CD-quality audio situation. is 1411 kbps PCM better than a 128 kbps AAC? yes, but for most people it doesn't outweigh the convenience of being able to stream on your phone. and people have shitty earbuds anyway because they don't care.
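(For anyone wondering, the 1411 kbps figure is just the CD format's sample rate times bit depth times channel count:

    sample_rate = 44_100   # Hz
    bit_depth = 16         # bits per sample
    channels = 2           # stereo

    print(sample_rate * bit_depth * channels / 1000)   # 1411.2 kbps

versus ~128 kbps for a typical AAC stream, about an 11x difference.)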
Our 55" TV with a 5.1 sound bar setup is more than enough. Unlimited pee breaks, snacks and comfy seating. No need to dress up (or at all).
Compare that to a 20-30 minute car trip one way + parking + walking to theatre, expensive snacks, loud-as-shit misbehaving kids interrupting the movie etc.
We've paid for every single Disney+ "premium release" and haven't regretted a second. I don't think we're going to a movie theatre as a family ever again - as long as there's a simultaneous digital release.
I'll go solo to see my AAA R-rated blockbusters if cinemas ever open again.
Idk, this and the original article are just clickbait. Most of the predictions on that list were outright silly. Anyone who thought 3DTV was anything more than a fad is delusional; a better prediction would have been that his son would never have to experience 3DTV. Wireless will never replace wires, and the same goes for desktop PCs: sure, their market share will shrink, but their demise is greatly exaggerated.
Most of his predictions were based in fantasy desire.
I would argue that while his son will probably never use an actual dedicated Fax Machine, he will probably have to figure out how to send a fax at least once. Lawyers just can't seem to get away from those damn things.
> Most of the predictions on that list were outright silly. Anyone who thought 3DTV was anything more than a fad is delusional, a better prediction would have been that his son would never have to experience 3DTV.
That seems a bit harsh, considering that the 3D versions of major blockbuster movies do well in theaters. It doesn't seem unreasonable to expect that quite a few people would want to be able to get the 3D versions of those movies for home viewing.
A 3D TV is not much different from a 2D TV. Doesn't it mainly just need a higher refresh rate, and for some 3D technologies a higher vertical resolution? The higher-end TVs from most manufacturers already often have twice the refresh rate of mid-range TVs, and higher resolution (e.g., there are 8K TVs out now, even though most programming is 4K or less).
I'd have expected TV makers to continue offering 3D, but only on their high end models. It wouldn't really cost them much, and it would get those people that watch 3D movies in theaters and want the same version at home to buy a higher end TV than they would otherwise need.
> the 3D versions of major blockbuster movies do well in theaters.
Did well, past tense. 3D movies peaked in 2010 and box office sales declined annually. By 2019 they were less than 50% of their 2010 peak. TV makers stopped making 3D TVs after 2016.
You're right that 3D TVs were just high refresh rate TVs but they have to sync with shutter glasses so you can't easily do 3D without TV support.
The latest HDMI spec supports variable refresh rate so in theory you could now make a player capable of handling the shutter sync.
It was a fad just like in the 50s and 80s. It lasted longer this time but yeah 3D is dead.
The first article was written in 2012, box office numbers peaked in 2010. Everyone was wowed by Avatar in 2009, and then less impressed when the best 3D movie of 2010 was Avatar: Special Edition.
Almost no one cared about 3D movies in 2012 and it was only industry momentum and sunk cost fallacy that kept things going for as long as it did.
I’m a lawyer in a government agency and never once in my 20-year career have I sent or received a fax. The only faxes that come out of the machine are junk faxes. People have been emailing scanned PDFs for at least 15 of those 20 years.
Clearly you haven't had to change your benefits in your 20 year career. I briefly worked for the state 5 years ago, and had to fax them my social security number to elect for benefits. No receipt at the other end other than benefits being successfully pulled the next paycheck. Fax works its way in to some roles where it is entrenched, namely HR.
Clearly I have changed my benefits many times in my 20-year career. Most of the systems to do this in my government are Web-based. For those that aren’t, I email a PDF form to the appropriate personnel.
Weird how different these systems are. Anything involving something like a social security number could not be emailed. HR would explicitly tell you not to send the filled-in PDFs back via email.
And yet it's ok to email PDFs containing all the bank verification info to transfer your retirement account. Ask me how I know...
We have software installed in our email gateway that detects key words like SSN or 16 digit numbers that look like credit cards and blocks them from being sent.
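Presumably something in the spirit of this toy sketch; this is illustrative only, not any particular gateway product:

    import re

    SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")    # SSN-shaped strings
    CARD_RE = re.compile(r"\b(?:\d[ -]?){16}\b")     # 16-digit runs

    def luhn_ok(digits: str) -> bool:
        # Standard Luhn checksum used by payment card numbers.
        nums = [int(d) for d in digits][::-1]
        total = sum(nums[0::2])
        total += sum(sum(divmod(2 * d, 10)) for d in nums[1::2])
        return total % 10 == 0

    def should_block(body: str) -> bool:
        if SSN_RE.search(body):
            return True
        return any(luhn_ok(re.sub(r"\D", "", m.group()))
                   for m in CARD_RE.finditer(body))

    print(should_block("card: 4111 1111 1111 1111"))   # True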
Weird, maybe it's just real estate and medical then but several times we've been unable to get around it.
I initially wrote a fax app using the Twilio API to send some medical forms that could only be faxed. Since then I've had to use it for the sale of two homes, and this past summer it was augmented to receive faxes for closing on a home. My partner asked me for the number just last week to request some medical records from out of state.
I agree. It's quite clever, though. You have the benefit of two "engaging" articles: one for the initial predictions and one for the results years later.
It's easy to predict things correctly when you control what your son will use. 'My son won't use a landline'? Well yes, you cancelled your home landline before your son was born. The original prediction was that most people and most businesses would stop using landlines; he concedes his son might still use a landline in an office some day, yet still considers the original prediction correct. The same is true for phone numbers, dedicated cameras, mechanical hard drives, and arguably prime-time TV, though I can't really blame him there. Theatres are considered TBD, but it took a pandemic that also shut down nearly everything else for much of the world, and many businesses are in for a rough shake-up.
The crazier predictions are the ones that didn't come to fruition, like no more floating window managers or mice, or people no longer building desktop PCs, or the ones the author still thinks are going to happen, like no more wired internet connections.
> The original prediction was that most people and most businesses would stop using landlines but he concedes his son might still use a landline in an office some day and yet still considers the original prediction to be correct.
For what it's worth I'm 15 years older than his son, and haven't used a landline in over a decade. I'm typing this from my office desk that has a landline phone on it, that as far as I'm aware has never rung.
It's not an unreasonable conclusion to mark that prediction as correct.
EDIT: actually upon closer inspection it appears that the phone on my desk might be VoIP, so I've not only not used a landline phone in a decade, I haven't even seen one.
I actually do have a landline. My internet company told me my bill would go up by $10 a month if I dropped the landline service from my internet+landline combo. Welcome to the free market where I have this or DSL. I don't have a phone hooked up to it, nor do I even know the number.
Among the ones he claims as having come true, a couple of them didn't. Physical media like HDDs have greater permanence than SSDs, so they aren't going away yet. But his son has also used them in gaming consoles, which (it may come as a surprise) are computers.
Also, you can't use WhatsApp without a phone number. Several apps and services require a phone number for sign-up or 2FA. So that one is bogus too.
I really hate this sort of breathless futurism (and futurists) that dismisses perfectly good tech because it seems outdated. I am glad the author is pretty much wrong on most counts!
He listed quite a bunch of tech that has no reason to die (like mice), and often has a good reason to live (like wired ethernet or windowed desktop environments).
Some of the technologies he lists are indeed on their way out of the home experience, like HDDs and landline phones, but that does not mean they do not have niches where they are going to linger for at least a decade.
This list seems to have been written by a journalist type extrapolating from very visible consumer trends and completely uninformed by actual engineering or physical realities. I’m no expert but from what little I know about how complicated wireless standards are and the ever increasing bandwidth demands I can predict continued wired communication. And the mouse and keyboard vs touch thing should be pretty obvious for anyone who has done office work.
I think a lot of these predictions depend on your use case.
Point and shoot... I have a DSLR. I didn't have one 15 years ago. It's not for everyday use, of course, but I use it frequently. I can do so much more with it, and the quality is so much better.
Mouse...my parents/in-laws haven't used one in years since their main computer is a tablet
Home phone? I installed one (VoIP) a few years ago. If I need to reach anyone at home I just have one number to call, and I don't need to worry about phones having no battery, being on silent, etc. Also my kids (oldest is 4) may need to reach me when they are older, and for sure they aren't getting their own cellphones for a long time.
I hope hope hope that the death of theaters never happens.
> In my original article, I said that a confluence of factors would kill movie theatres: the improving quality of home theaters, the eventual death of the 90-day theatrical window and the cost and hassle of the movie-going experience.
I mean, I appreciate being able to just have a nice quiet evening watching Netflix, but if anything after the pandemic I yearn to see a movie in the theater. This type of commentary rarely mentions the social aspect of going to the movie theater, watching with friends or a date, etc. It always goes with the "going to a movie theater is not efficient" take, which makes me think that people's brains just must be wired differently. I consider myself an introvert but I'm so excited about being able to have normal in-person interactions soon.
As a live theatre industry professional: movie theaters will go the way of theatre. Once it stopped being the main form of entertainment, it didn't die; it just became an occasional, expensive treat for lovers of specific genres.
Hollywood may die, but you'll go to a movie theatre just like you may go to see a Broadway tour or a regional Shakespeare once in a while.
Films might actually get better when the demand to crank out the most popular drek for box office bang fades away.
> Films might actually get better when the demand to crank out the most popular drek for box office bang fades away.
I think we have been at this step for decades already. Look at what is featured and promoted at the Sundance festival and what Annapurna produces and what others do: we already have such movies without the marketing and production budgets of Marvel or Disney or big budget producers.
At some point theaters and movie studios need to have variable pricing for movies. All first run movies being the same price is silly and leaves a ton of money on the table. May be one way it gets to be more like theater.
I'd be really curious how the margins on the in-theater experience break down. I wouldn't be surprised if big crowd-pleasers take a bigger cut than small-time films.
I expect it's not so simple. People aren't rational about pricing. I expect a lot of people would be outraged if you tried to charge more for a movie expected to have a popular open.
They already do a lot of price variation in the release process, by the time it gets to cable their marginal revenue is something like a few cents.
Folks already pay different prices for different movies on the streaming services. I don't think it is a big stretch to think that they would pay more for better movies.
> As a live theatre industry professional, movie theaters will go the way of theatre,
They've already gone that way, in part, particularly during the 60s, 70s and 80s when a large share of cinemas closed. Then it sort of stabilised, at least in terms of screens: while cinemas kept on closing, some others got bigger. But those bigger ones rely on the business of blockbusters.
Maybe it will start dropping again and turn the way you foresee, I don't know.
It's differently social, at least in my experience.
Getting the boys together to watch a film at home usually means playing MST3K all night while we pound beers and butcher a pizza.
Hitting the movie theatre means we keep our mouths shut, focus on the film and the experience of being in the theatre, and then compare notes over dinner/beers after the fact.
It's like comparing going to Easter Mass with watching a televangelist over public access on Easter morning. The two are vaguely analogous, but experientially incomparable because of the environments and framings in-which they take place.
Personally, both appeal to me.
Some beers with friends, relaxing on a couch and watching some easy going film sounds like a nice Saturday night.
On the other side, I love going to the cinema: the smell of overpriced popcorn, the big screen, and the same exciting feeling I get when I see a studio logo and know that the film is about to start. And I can always throw some popcorn at those who think having their phones on is a good idea :)
Going to a movie theatre is a really variable night out. The quality of the experience hinges on so many factors that it's practically a lottery whether it's good or not, especially with a cinema chain movie theatre and a mainstream film. It doesn't take much to tip the balance from a great night out to one that feels like a waste of money. A mediocre film, unbalanced sound mixing, noisy or phone-using people within a few rows, stale popcorn... Any number of things ruin it.
I much prefer arthouse cinemas (my favourite has a bar, and you can take a bottle of wine in to the theatre) these days. They cost a bit more but the experience is usually pretty good.
This! I only go to a theater if there is no risk it will turn out to be a dud movie. For me, I am mostly just waiting until Christopher Nolan's next movie.
I seriously miss movie theaters. The movie theater experience is such a great group of friends activity.
Plus, Marvel has gotten so good at turning it into a crowd participation activity. It’s almost like going to a sporting event from a crowd energy perspective.
During the pandemic I’ve watched those “Audience Reaction” videos on YouTube a lot just because I miss it so much.
I don't think movie theatres will ever go away; especially after the pandemic, they've surely got 10 extra years anyway. The social aspect is so important, and going also means doing something else (like eating at a restaurant); there will always be people who prefer them.
I am wondering if we might get different forms of theatres. Not everyone has a house, and not every teen can watch movies on a date at their parents' house. Unless there are much better things to do on a date in 20 years' time than going to see a movie together, I don't see theatres ever going away.
I do wonder if we'll get smaller, private theatre spaces in less prime locations. Basically, these rooms could be used to watch the latest movies, live sport, or other things where a group of people can stay together and socialise. You'd still get a 90-day theatrical window, with a much higher quality stream than you would get renting on Netflix 90 days later.
I’m optimistic that some theaters will survive. I think a good analogy is bookstores.
In the 90s there were all sorts of bookstore chains; now it's pretty much B&N. At the same time, at least here in Boston, there are several long-standing bookstores (Harvard, Booksmith, Trident, etc.) that still get a ton of traffic and are local institutions in their own right.
So when I think of local independent theaters that already have a dedicated following (thinking mostly of Coolidge corner theatre) I think they’ll find a way to thrive.
I think millennials will be the last major theater goers. We grew up through a time when the social pressure to ignore your phone was stronger than phone addiction, so theaters still serve as a place where we go and have an uninterrupted group experience. But phone usage during movies hurts that, and unless future generations can kick the addiction, then theaters will start to feel pointless as everyone is on their phone.
I rarely go to theaters but then I own a house with good movie-watching options. Personally I tend to do live theater rather than going to movies with folks. But it seems as if going to the movies will remain a fairly popular option for younger people at least.
I love movie theaters, but I think they’ll die because they are mostly anchored to malls, and our development patterns make standalone theaters difficult.
Online ordering and increasing poverty makes the mall a declining asset where the movies are one of the last big drivers of demand.
> because they are mostly anchored to malls, and our development patterns make standalone theaters difficult
Interesting—what part of the country do you live in? Where I am in the North-East, movie theaters attached to malls is more of a minority. Not non-existent, but not at all a majority.
> That being said, VoIP landline phones are still a part of many peoples’ lives. Cable and fiber optic providers continue to offer these services and many people, including my mom, still have them.
I work from home and I still use a landline phone. I bought the best quality speakerphone I could find and I don't regret it for a minute. Cell phones just don't have the same voice quality, especially on speakerphone, as a landline. And I avoid doing voice over the computer, because my computer is busy and I don't want to sound like a robot when it can't handle the workload.
I'm an industrial electrician. Changing well-embedded standards and technology is extremely difficult, except over the course of decades to century+. Locomotives use 64 volts for control voltage. Eight 8-volt batteries. (Drive voltage is 500-750, usually 600.) Why? Because they settled on it when locos were first electrified, long before 12 volts was a standard (or other, now common voltages). Many steam engines had 32 volts; 64 was an easy doubling to handle higher power start loads. They will continue to be 64 volt until the end of time.
This one being in the list surprised me. I was born more than 20 years earlier and I don't think I've ever seen a fax machine in action, let alone used one.
That's because dedicated fax machines mostly died out in the late 90s when multi-function machines and software modems arrived.
I was working at Office Max at the time and the transition happened fast. We went from having more than eight dedicated fax machine models down to one or two.
The first multi-function machines looked like fax machines and could function without a PC, but were so much better when connected to one.
Cheap software modems allowed people to send and receive faxes without owning a dedicated machine. I remember eMachines bundled software with their PCs to make them effectively behave like a fax machine. Later they pushed the eFax internet based fax software.
It's crazy that many business transactions still require Fax but the machines don't really exist. We needed to send/receive faxes to purchase a home last year as PDF/email was unacceptable. I was able to create a virtual Fax Server using Twilio in a couple hours and deal with these silly requirements.
Sadly Twilio is shutting down their Fax service later this year and the suggested alternative is vastly more expensive.
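If memory serves, sending through it was only a few lines with the Python helper library; the credentials, numbers, and document URL below are placeholders:

    from twilio.rest import Client

    client = Client("ACCOUNT_SID", "AUTH_TOKEN")     # placeholder credentials

    fax = client.fax.faxes.create(
        from_="+15550001111",                         # your Twilio number
        to="+15552223333",                            # recipient fax number
        media_url="https://example.com/contract.pdf"  # publicly fetchable PDF
    )
    print(fax.sid)

Receiving was the same idea in reverse: a webhook on the number that hands you the inbound document.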
I had the same thing with a refi. They said they needed to send me a fax, have me sign it and send it back. I asked them what millennium they thought it was. They looked at me for a second, then said "We can send it as an email attachment".
So I suspect that they don't require fax. Fax is their normal way of operating, but it's not a legal requirement or something. It's just their default, and they'd rather not have to deal with changing it for you.
But who's paying who? Oh, I'm paying them? Then they can forget making me jump through their obsolete technology hoops. No, they can figure out a way to send me the documents that I'm already set up to handle.
I was born 30 years earlier, and I have used a fax machine, but only a few times, and interestingly only within the last 15 years.
You're fairly likely to have to use one even today if you end up in certain places (Japan, Germany) or in certain businesses (medical, restaurant delivery). Then again, I bet many of these have the physical fax machine replaced by digital fax services. It's ironic, because a PDF would obviously be higher quality and faster to transmit, but some places still require fax but will accept you "faxing" a PDF via an online service, which then gets received by a fax-to-email service.
I sadly had to use one recently and it was to fax documents to the IRS. Their only two options were fax or mail the documents. No email. Thankfully I had a scanner (in a box somewhere) and found an online service that faxes PDFs on your behalf.
I've also had to fax documents to immigration (again, government). I guess my point is if you're dealing with the government, you'll probably have to fax something at some point.
If you have a landline, you can send a fax from your PC with just a $15 USB fax modem. Windows has built-in faxing software (Windows Fax and Scan) that is dead simple to use. Have you ever noticed that "Fax" is also one of the default printers? Yeah, me neither, until I saw a YouTube video on it.
Just plug a phone line into the modem, print to "Fax" from any program, enter the fax number, and hit Send.
Doctor's office or lab in the US. Although this is slowly changing. And I did have to fax something a few months ago, albeit using an online fax service.
I'm pretty sure optical media will be around for a long time, although not as mainstream as it was. Its cost is still unbeatable. You can get a 25 GB BD-R for less than a dollar, while a memory card with the same capacity is much more expensive. Shelf life is also better for optical discs; they are great for create-and-forget backups.
I digitized a bunch of home movies, and one of the formats I settled on was DVD. The latest Xbox and PlayStation still play them, so I expect to be able to find readers for the next 20 years.
It's a lousy format, though. For files intended to be played on a computer, I used webm/vp9/opus. It seemed like browser-supported, royalty-free formats that play on everything not Apple would last a while.
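For anyone doing a similar digitization pass, the transcode is essentially one command with standard ffmpeg flags for constant-quality VP9 plus Opus (filenames are placeholders), wrapped here in Python to keep it scriptable:

    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "home_movie_tape1.mp4",
        "-c:v", "libvpx-vp9", "-crf", "32", "-b:v", "0",  # constant-quality VP9
        "-c:a", "libopus", "-b:a", "128k",                # Opus audio
        "tape1.webm",
    ], check=True)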
When my first daughter was born 6 years ago, I thought "she will never need a driving license, even if we live in the countryside". Now, I'm not so sure.
I think it will be decades before self driving cars replace humans entirely. The transition from 0 to “some” is happening at snail’s pace, and even once the cities are conquered, there is the enormously long tail of edge cases like countryside driving that will not be ready for a very long time, if ever. Even a 2020 Roomba can’t vacuum your entire house.
Calling the camera prediction as right seems dubious. Smartphones have replaced low-end cameras, absolutely, but have not and will not replaced high-end cameras. At age nine, it's no surprise that his child has not used a high-end camera. It would be more of a surprise if he makes it through adulthood without ever using one once.
Mechanical storage will probably change the first time he builds a desktop to play games on. AAA games will require more and more space, and I doubt SSD capacity will catch up with the budget of kids/teenagers within the next 5 years.
Or when he starts using a computer at school, other family or friends.
And even though I have an SSD in my laptop, I have a bunch of USB HDDs in my office for things like backup and ripped video. I expect my SDD mix to increase over time but I expect to keep using HDDs for non-performance critical work for the foreseeable future.
I have circa 1.5 TB of SSD storage on my desktop, and I find it sufficient for my gaming needs. Fast internet helps, though.
Solid state isn't too expensive now, if waiting for downloads is okay. You can get 480 GB disks for around 50€. That is, what, less than the price of a new title? And you can likely afford at least one to three of them at a time.
> In 2012, I predicted that point-and-shoot cameras and camcorders were so dead that my son would never use them.
Point-and-shoot digital cameras are interestingly bad compared to cellphones nowadays. The camera will generally have better glass, a better sensor, a beefier battery, and more space to house the components, but will get beaten by a 7mm-thick phone in most circumstances. It's amazing how much can be done just by crunching numbers harder and faster.
On the contrary, DSLRs and mirrorless cameras have held their own against camera phones. The hardware advantage is too vast to be bridged by software right now. Or maybe the camera manufacturers are starting to use some software tricks themselves in their newest stuff.
The movie theatre prediction seems a bit bizarre, but I suppose if you don't go yourself, you'd have little reason to think it would continue. If anything, there will probably be a resurgence in demand after they open back up.
Most of these are debatable and even some of the bets would need arbitration to determine if this was a prediction market (never sending a fax - wrong - vs never using a fax machine - right), but nice time capsule and fun list
Well of course this isn't the case. I've banged on about this a lot, but we've given up on optimising our software and have just opted for making the hardware faster instead.
We have gone in the opposite direction, because now my TV takes a minute to boot. That used to be a 3-4 second process. For this reason we just leave the TV on throughout the day. It's a massive waste of electricity, so I'm trying not to fall into this habit.
Yeah, ours is the same. I can ask Siri to turn on our Apple TV, and a few minutes later we get to watch something. I mean, it's only a few minutes, but the question "WTH is this thing doing?" is very valid.
I find it interesting that the author is only considering optical media, and streaming media from a third party. If the only concern is about owning a copy of the content, then why not ditch the optical disks and go fully digital? If you set up your own cloud storage for them, you'll even get the benefit of being able to stream your media anywhere you have an internet connection. Plus with good data storage practices (e.g. protection from bitrot, off-site backups, etc.) your media will be safer when digitized than it could ever be when stored on optical disks.
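On the bitrot point, the minimum viable protection is a checksum manifest you re-verify periodically; a minimal sketch, with placeholder paths:

    import hashlib, json
    from pathlib import Path

    def sha256(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    library = Path("media_library")    # placeholder media folder
    manifest = Path("manifest.json")   # placeholder manifest location

    current = {str(p): sha256(p) for p in library.rglob("*") if p.is_file()}

    if manifest.exists():
        # Re-verify: any mismatch means silent corruption or a missing file.
        for name, digest in json.loads(manifest.read_text()).items():
            if current.get(name) != digest:
                print("CHANGED OR MISSING:", name)
    else:
        manifest.write_text(json.dumps(current, indent=2))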
My TVs, garage, and ceiling fans can be, and are, controlled via the Apple TV Remote app and the Home app. If the remote doesn't happen to be near us, we don't see a need to reach for it. My toddlers even know how to use it, so I can see them using apps most of the time instead of dedicated remotes.
I'm curious about the Raspberry Pi PC build kit for kids. I think that's a great way to introduce kids to technology and hardware. Do they make dedicated kits for young kids?
I had to use a fax machine a few years ago because the university I was attending (UC Merced) would only accept some documentation in person, or by fax (waiving the annoying-to-opt-out student medical insurance). Why you couldn't submit it through the student web portal is beyond me.
It was a bit of a hilarious experience plugging a VoIP box into a ~2010 Brother MFC printer/scanner.
It might be the only time I've ever sent a fax in my entire life (and I'm 28).
Focusing on boot times is weird though, it used to be important when you would boot up your computer in order to use it. Nowadays everything is always sleeping, and can be woken up in an instant to be used, so how long things take to cold boot aren't relevant anymore.
Booting an Android emulator is a benchmark that only an Android developer would care about, though. It isn't directly comparable to booting real hardware, anyway.
Maybe that’s the case in theory but in practice I have never had a fast booting android device and some of them have been pretty good. Pretty much any other OS is fine. I guess in a virtualised environment you don’t have the same challenges regarding integrating a disparate hardware stack ...
For the early versions on Archimedes hardware, yes. Later versions had increasing amounts on disk, which increased the boot time from 'nearly instant' to merely 'fairly quick'.
Although recent versions of RISC OS are now entirely softloaded, the increased speed of the Raspberry Pi and other modern hardware platforms has cheerfully reduced boot time again: by modern standards RISC OS is tiny!
It helped at the time, certainly. An SSD in a modern computer is much faster than ROM from back then, though, and instead of an 8 MHz CPU we have 2000+ MHz CPUs.
And I'd also argue Windows update being so bad is an outlier. I now have one Windows machine in my house and I'm astonished updates still take so long and also prevent you from doing anything else while they install. This is one area that Microsoft really lags behind the competition.
> This is one area that Microsoft really lags behind the competition.
What competition? If you need to run Windows you'll run Windows. Mac and Linux don't compete with them, these are alternative products but not competing ones.
2 seconds was probably an unrealistic goal. Probably more to the point is that I rarely need to reboot a system and coming back from suspend almost always works quickly and reliably. In fact, I'd probably argue that's the more relevant metric at this point.
Agreed. And even if the computer booted in 2 seconds, you would still have to start all the programs you had running again.
I am super happy that my computer unsleeps quicker than I can move my hand from the finger print sensor to the keyboard, but I also basically never reboot it.
Chromebooks were already that fast at the time he made the prediction, mostly because they were designed to use coreboot instead of the usual bloated uefi/bios.
The only reason we don't see coreboot everywhere is that motherboard manufacturers refuse to adopt or even allow it.
Yeah, ChromeOS downloads and applies updates unobtrusively and raises a notification to reboot, which takes a few seconds. macOS takes half an hour to update even if you have an M1 and their fancy SSD. Android takes about the same time and their handsets will get scalding hot during the process.
ChromeOS is also the fastest of the non-mobile operating systems to wake from sleep. It is up and running and on wifi before I can raise the lid to its normal position.
I'd argue that this is correct in spirit. Not because operating systems boot really fast, but because they have been designed to not need to boot at all. The only time I need to cold boot my computer or phone is after an update (which is maybe every 2-3 months?)
> He Won’t Go to the Movies
This is not TBD but straight up wrong. Pandemic notwithstanding, there is no end to movie theaters in sight.
I’m willing to be called Scrooge, Luddite, &c here, and here I go:
> 2. No Dedicated Cameras and Camcorders
I’m almost 40. I (finally) own a smart-phone. I also own a good digital camera. I tested smartphone cameras for “a leading tech company in Mountain View” for several years.
I’m fully aware of the saying, “The best camera you have is the one you have with you”.
One day I saw a coworker pointing to my digital camera and saying "That type of camera is obsolete". I didn't feel good about that, but I have no regrets owning it and still using it, with no issues of quality or availability. When I go out for pictures, I bring my camera; when I don't, I accept smartphone quality.
> 7. He Won’t Go to the Movies
I love movies (and television). I watch at home and out.
I go out to the movies for the experience and to get out of the living space. No regrets. I'm an introvert.
> 8. He Won’t Use a Mouse
I cannot stand using a trackpad or the “mouse button” [there are more crude words for it, think the red thing on ThinkPads].
It's a mouse for me for life, because it's actually usable.
> 10. He Won’t Use a Remote Control
Does a bluetooth mouse count as “remote control” for my computer dedicated for watching movies and television 6 feet away?
> 14. He’ll Never Use a Fax Machine
If only some businesses or government I need to deal with was the same.
I'm fairly sure theaters won't be going anywhere. You can't watch a 20-foot-by-20-foot projection of a movie at home, no matter how cheap the technology gets, unless you have a massive backyard and a projector, or a huge room in your house you weren't using anyway.
It isn't just about the giant screen for everyone. For many in my circle, the only reason they still go to the theatre is because you can see things when they release. If they had an option for same day streaming (as covid has provided in some cases), they will drop the theatre completely. The quality of TVs and sound systems at home can provide a better visual/audio experience than many theatres.
I don't agree about cameras. Take a look at your favourite youtubers - for the most part anyone who is even half way serious about making videos has moved past the limitations of phone cameras.
Well, as an avid movie watcher who absolutely loves the theater experience, I truly do hope they never go away, and that my hypothetical children get to experience them too.
I could not read for more than 10 seconds after clicking the link. I mean, for God's sake, how many ads on a single page? I suggest moving the content to a static site for better readability.
I always wondered what makes some businesses so inflexible that they still require faxes. It’s getting rarer these days but I’ll still hit it once a year.
The longevity of these technologies amuses me. I'd say some of the technologies listed are in demand right now, and I don't see signs of that changing soon. Slowly but surely, new technologies will replace them, and those will both spread faster and have even greater longevity.
Most - not all - of these feel obvious. Land lines? Fax machines? They were already niche in 2012. IDK, these weren't (bold) predictions, as much as already established market trends push out 10 years and then deciding how dead they'd be or not.
Side note: In the late 80's, I worked for AT&T in the consumer marketing dept. I remember there was a manager who repeatedly said, "Someday our phone numbers will follow us no matter where we live." He was _not_ predicting the mobile phone, only that if you moved you wouldn't have to change numbers. But even then, he was viewed as a madman. I wonder what he'd say today.
None of them feel obvious. I never would have made these predictions in 2010, 2015, or 2019, and I wouldn't feel comfortable making any of them out until 2030 at minimum (and only for fax machines and spinning drives on consumer PCs).
1 - Effectively, ubiquitous wifi - The obvious direction of technology + market. Something Comcast / Xfinity was offering and expanding.
2 - Mobile devices with better cameras - Both were handheld with similar and overlapping technologies. The obvious direction of technology + market
3 - Landlines - The obvious direction of technology + market. I haven't had one since early 00's, and that was strictly as a back up for business purposes.
4 - Instant on - Who wouldn't want that? TVs do it. The obvious direction of technology + market.
5 - Non-UI UX - The obvious direction of technology + market, ask Siri or Alexa.
6 - SSD / Cloud - HDs are the fax machine of storage. The obvious direction of technology + market
7 - Movie theaters - The obvious direction of technology + market.
8 - Mouse? - See above about voice, and of course there's touch. The obvious direction of technology + market
9 - 3D glasses - Been hyped for the longest. The obvious direction of technology + market.
10 - No remote - Remember the infomercials, "the clapper"? Also, back to voice, or mobile device (see cameras). The obvious direction of technology + market.
11 - No desktops - He was mistaken.
12 - No phone dialing - Most comms aren't via phone number, and if they are no one dials. Ever since mobile devices - even flip phones - fewer and fewer were dialing. Again, the obvious direction of technology + market.
13 - Won't watch prime time - I owned a TIVO (DVR) in the late 90's and the promise of 500 channels is as old as cable TV. The obvious direction of technology + market.
14 - No fax machine - Never really a consumer product. Why would a 9 y/o need one in 2021? The obvious direction of technology + market
15 - No optical media - See SSD / cloud. The obvious direction of technology + market.
Almost all of these were overly optimistic bandwagon riding on replacements for things that already work fine and most people don't care about replacing.
Fax and optical media are the big exceptions; good riddance. SSD/Cloud are being pushed by an oligopoly of mega-corporations and people don't have a choice. And even then, it's not like CDs are rare like Betamax tapes.
And almost all of these are "obvious" directions only with respect to adoption beyond enthusiast types who like cool high-tech shit (with the above exceptions). They are niche, not poised for a takeover so total that a child born in the 2010s would literally never experience them even once.
Don't confuse "hype in tech magazines" with "things people care about or want". And don't confuse "fun and/or useful technology that will gradually increase in popularity over time" with "overnight society-transforming".
No confusion. Smaller? Less expensive? More convenient? Many of them two, or all three.
Keep in mind, per the original article, some of these are biz / enterprise driven. An SSD might cost more up front but over its lifetime, it saves money. That increase in demand lowers prices, to the point that they are consumer friendly.
I think most are discounting how often ppl still bring their work tech into their home.
I was born in 1996 and have never used a fax machine. Unless you live in Japan, you are being extremely conservative with not being comfortable predicting that until 2030.
https://en.wikipedia.org/wiki/Lindy_effect