Netflix Now Streaming AV1 on Android (netflixtechblog.com)
295 points by discreditable on Feb 5, 2020 | 135 comments


For once I wish services like Netflix and YouTube would announce "we've improved streaming video quality at the same data usage" as opposed to "we've improved data usage at the same quality". YouTube's compression is _horrid_, and Netflix's isn't much better. The quality difference between a 1080p Blu-ray and Netflix's 4K stream is quite noticeable, especially in darker scenes.


I'm going to bet the overwhelming majority of users don't care about either. This is Netflix cutting costs by requiring less data to serve their content at a quality that's good enough to make most people happy.

Any time I talk to anyone outside our nerd bubble I'm reminded and humbled by what simply doesn't register as relevant to them.


Conscious and subconscious awareness are different things. The mythical "most person" can't describe the difference between a blocky artifacted video and a clean high-bitrate video, but they will prefer the clean over the blocky and enjoy it more.


Sure, but OP argued they didn't care. I don't think it's a question of conscious awareness, but rather opportunity cost.

The lessened quality is not something (most) people will cancel their subscription over, nor would they go out and buy the DVD just to get the better quality.


That's true now, but in the future, when people start realizing that "Pay this premium for amazing 16K content!" doesn't do shit when they sit on their couches with their normal-sized TVs, we are gonna start hearing "Pay this premium for our 4K option with our Premium Non-Blocky Patented Compression Algorithm SilkComper™"


I think that's probably true if they got to A/B test the options.


Further, they aren't going to cancel their subscription because it's blocky. What would they do instead, go out and buy The Office box set, and an optical drive to play it with?


These experience degradations push people gradually toward a cliff.

When another competing thing comes up in people's lives (it need not even be in the same category), the previous one gets dropped if there are unpleasant associations with it - conscious or subconscious.

Or, when people need to cut back (finances, digital wellbeing, etc.), the first thing to go is the one associated with less value, i.e., of lesser quality, not living up to expectations, and so on.


You would still buy the burger even if it has a few fewer sesame seeds, wouldn't you? ... until there are just three artfully placed seeds on each bun and nobody buys any?


No, but they might pirate (though I bet YIFY is even worse)


On the contrary, the pirated media I've seen has been by far the highest-quality. It doesn't have to be delivered live, so encoding can be more efficient, and the source is almost always a 4k blu-ray or whatever else is the best that can be bought. In many cases it's even better than playing it through a DVD player because your computer uses better codecs.


Netflix (and most of Youtube) is offline encoded.


I believe the limitation being referred to is that they're bandwidth-constrained -- encoding for streaming needs to strongly favor size over quality.

Netflix needs a movie to be 500MB whereas torrents can happily be 2-8GB


Or 50GB, for some 4K torrents.


I've heard from Netflix people that they re-encode their entire library every month. They are continually improving their encoding and storage tools so the streaming bandwidth/quality savings dwarfs the costs of re-encoding.


Is it really easy to find high quality pirated content?

I've got the impression that highly-compressed content is often so popular that bigger/higher quality files are hard to find.


As someone who occasionally downloads unlicensed content, I find that most movies (and occasionally high profile TV shows or those with a focus on visuals--like BBC's nature docs) are available in multiple qualities.

There are the basic 300-700mb copies for people who just want to watch something and don't have the time or bandwidth to bother with amazing quality (these look like your typical compressed cable or "tube" stream).

Then there are the 720/1080 versions which are still pretty compressed but at least encoded with decent settings and at a resolution that doesn't look bad on an average TV. Often these are "webrips" where someone capped the stream from Amazon or whatever online streaming rental service had the movie.

And last are the massive bluray rips that can go over 50GB in size and include 10bit HDR video, Dolby Atmos, etc. I never bother with these because honestly, if I'm pirating a movie/show it's because I'm not sure I'll enjoy it or it's just not available for me to watch yet. The movies I would want in maximum quality are the ones I would be buying anyway.

I'm sure plenty of people can't or won't buy the official blurays so these are the alternative, but for me, "bootleg" copies are the equivalent of "I'd watch it if it was on TV" but not a replacement for stuff I'm really invested in seeing at max quality because I love it.


Amazon's 1080p copies are usually very good quality, certainly comparable to the Blu-ray release. Pirates have been able to download and strip DRM from there and iTunes, etc. for a long time; no capturing + re-encoding necessary.


(Throwaway account)

Depends on the content, but if it's something remotely popular and was released on Blu-ray, the answer is yes. The most popular torrent at the moment is apparently a 1.92GB BluRay rip of Joker (2019), with 8k seeders.

There are 2 higher-quality versions to be found on the Internets, 6GB and 16GB. Both have 4K video encoded in 10-bit H.265 and 5.1 / 7.1 audio; the former has 1k seeders, the latter 100. These numbers are very approximate, and I haven't downloaded nor watched that movie. But usually, even 100 seeders is enough to saturate my 120 mbit/sec download link.

YMMV.


This is much harder than simply opening the netflix app built into your TV.

I'm bothered by the Netflix quality every time I use it, but the hassle of downloading the movie beforehand, connecting my laptop to the HDMI, needing to get up to control it, needing to get up to move the mouse to make it disappear, etc. makes it a non-option. Sure, there are solutions to those problems, but they're also work.


What makes you think it’s worse exactly?


I'm probably one of the few folks who was searching for a copy of "The Witcher Season 1" on Blu-ray, until I realized Netflix wouldn't want to compete against its own platform.

Would be a major point of contention for content producers seeking to become Netflix originals. Traditional studios still produce lossless masters which then see re-releases in whatever format is best decades after the original run has ended, e.g. The Sopranos getting a BR release, etc.


Many popular Netflix originals have been released on Blu-Ray.


I'm on the Netflix SD service. It's never bothered me at all and I don't see a particular reason to upgrade.

I do enjoy watching the BBC in HD for stuff like the Attenborough films, however.


I have watched Planet Earth in 4K (real 4K, not Netflix "4K") on my new home cinema setup, and it's the most beautiful digital content I have ever witnessed. Absolutely incredible.


I think that's the best use for 4K: there are lots of tiny details in nature that come to life in 4K. For other things, movies and such, it may even make them worse in some cases; too much detail isn't always better.


Recently switched from a 1080p plasma TV - what used to be high-end 10 years ago - to a 4K OLED with HDR, and I must say, the HDR is much more impressive than the 4K tbh... 1080p with HDR would be fine for me...


I personally don't care. I even usually opt for the cheaper Netflix without "Full HD".


Just recently I was watching a Blu-ray, though I mostly watch Netflix these days. There was a nice gray-to-dark gradient in the intro and I noticed I was expecting some posterization artifacts and... there were none. ;)

Kind of sad that I'm already used to them, it seems.


Compressed blacks / greys, they work until they don't.

Especially for drama and horror where dark scenes and night scenes are frequent!


Quality doesn't really matter for the general audience, beyond some reasonable threshold. https://mux.com/blog/youtube-viewers-dont-care-about-video-q...


Engagement seems a pretty blunt metric for this. Especially since Youtube has a near monopoly on user generated video content. A user would have to be mightily perturbed to abandon their favorite content altogether over poor video quality.

I bet most of those 90% of people who stuck with 480p don't even know they can manually set the resolution.


Yeah the image quality is poor, and it's especially frustrating for ad-based video streaming sites like Hulu or Tubi. The movie struggles to get 720p, buffering a few times, then the ad comes loud and bright at 1080p instantly.


+1. H.264 on YouTube is often comparable to files compressed at a CRF of ~30+, and while VP9 is more efficient, it also comes with weird encoding artifacts.


Does 4K Netflix target 50 Mbps? Blu-ray does. While lossy compression performance is a deep and still-developing field, more bits always help. That requires more storage on servers and more throughput. There needs to be a "good enough" point, and the hosts will glue themselves to it. If it moves up over time, then they will have to update their library where high-quality sources are available.


Netflix recommends a 25 Mbps download speed to watch UHD, but according to some redditor it actually streams at 15.6 Mbps[1].

[1] https://old.reddit.com/r/netflix/comments/5lc2nl/what_bitrat...
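
To put those figures in perspective, here's the back-of-envelope conversion from bitrate to data usage (a sketch assuming a constant bitrate, which real adaptive streams aren't):

    # Rough data usage per hour at a given video bitrate (assumes CBR).
    def gb_per_hour(mbps: float) -> float:
        return mbps * 1e6 * 3600 / 8 / 1e9  # bits/s -> bytes/hour -> GB

    print(gb_per_hour(15.6))  # ~7.0 GB/hour, the reported UHD rate
    print(gb_per_hour(50.0))  # ~22.5 GB/hour, roughly UHD Blu-ray territory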


Honestly, I find the quality decent. I watch so much content on YouTube; making the encoding artifacts imperceptible would mean I need more mobile data. Here in Canada, mobile data is more expensive than anywhere else in the world, so it can make a difference of about $500 CAD a year, the cost of a new mid-market smartphone.


The sad thing is, they could offer two levels of peak quality, one less than today’s and the other more than today’s—and their overall data usage rate would probably go down not up... especially if the lower bitrate was marketed as “faster starts, less buffering” etc.


Don't these services already adapt the quality based on your network connection? In which case, it might be the same thing.
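
They do. Here's a minimal sketch of the client-side adaptive-bitrate (ABR) idea - the ladder values and the 0.8 safety margin below are made up for illustration; real players also weigh buffer level, screen size, and so on:

    # Hypothetical ABR rung selection: take the highest ladder bitrate
    # that fits within a safety margin of the measured throughput.
    LADDER_KBPS = [235, 750, 1750, 3000, 5800]  # illustrative values

    def pick_rung(measured_kbps: float, safety: float = 0.8) -> int:
        usable = measured_kbps * safety
        fitting = [r for r in LADDER_KBPS if r <= usable]
        return fitting[-1] if fitting else LADDER_KBPS[0]

    print(pick_rung(4000.0))  # -> 3000

So if your connection can't sustain the top rung you never see peak quality anyway, which is partly why a lower-bitrate tier would cost the service little to offer.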


I agree that quality is not satisfactory, but I'm not convinced that it is the top problem to solve either. Let me explain: I read that video streaming was 50% of the Internet's bandwidth, which itself accounted for about 3% of worldwide CO2 emissions. That's quite significant, and I would love quite the opposite of your reasoning: lesser quality, to leave us engineers time to find a sustainable way of watching videos online.


Should we guess whether you've done any air travel in the past couple of years? (I have too, I'm a hypocrite as well.)

I have no idea what the source of your number is. I see half a percent here[1]. That is also possibly inaccurate, because at least one major internet company offsets all their carbon emissions with things like farm-animal feces digestors and methane-recapture covers for mines. Even if it is three percent, it's not clear that a given amount of bandwidth is responsible for the same fraction of emissions. Also, perhaps the more people watch movies, the less they drive around and do fun things outside the house. So it may be that for each 1% you increase internet emissions you decrease other emissions by 2%. Who knows?

[1]: https://www.theguardian.com/environment/2010/aug/12/carbon-f...


That's not really relevant, though. The point is not "Video produces significant emissions, therefore it is bad." The point is "Video produces significant emissions, therefore making it more efficient would be good." The fact that video is not as bad as something else has no bearing on whether video could be made even better. It's worth bringing up because in a decision between x% better image quality or x% lower emissions, the total amount of those emissions is the key piece of information.


Why don't you go beat up on useless technology which provably pisses away energy (by design!), like the blockchain, before claiming that consuming video content on the internet is killing the planet.


There's no need to get defensive over this. One can do both.


Useless to you does not mean useless to other people, and unless you somehow are the person who has an ultimate mandate over valuing things throughout the universe, it's just your opinion.

If you for some reason have that mandate, you might as well look at other "useless" industries such as Christmas lights, mechanical watch production (very demanding and "useless", right? since quartz watches are much cheaper and more accurate anyway?), vinyl records (think of all that plastic!!!), road lighting during the night (most roads don't have any visitors for hours!), jewelry, public webcams (all that energy and data transfer!). How about heating in the far north? How much energy is "wasted" there, "why don't all those people just move"? Or expensive motherboards which have thousands of components, of which only a small part will realistically be used by the owner... Or gaming consoles which can in fact be used for other computations, but are damn uselessly stuck there just producing some 3D simulations... Or gaming in general, how much does that "waste"?

Basically what I'm trying to say is that your line of reasoning is extremely selfish and short-sighted, and there are people who would very much like to have some blockchains working, and they will keep doing so, just like they will keep setting up and powering christmas lights and paying for their gaming rigs and buying expensive watches that take a fortune to manufacture.


It feels a bit irresponsible to throw around unsourced numbers like that. Makes responding responsibly difficult.


Sending data through the internet contributes 3% of worldwide CO2 emissions? Finding a 'sustainable way to watch videos online'? I think you have your numbers mixed up. Maybe you are conflating the electricity all computers use with the internet's CO2 emissions.


For anyone interested, here are some more novel ways of reducing video bandwidth with neural networks, using foveated rendering to 'recreate/generate' the high-res frames client-side:

https://www.youtube.com/watch?v=eTUmmW4ispA


Computers use electricity. Find a way to generate electricity without CO2. If people can't watch videos of foreign places then they will have to travel to foreign countries which will cause more CO2.


So many well-informed comments here, I have to ask... and please forgive the spam... If you're interested in this area and might be interested in working at YouTube, then please email me using the address in my profile. I work in this area at YouTube.


A 20% data saving but how much extra battery usage for devices which don’t have hardware decoding for AV1 (which last time I checked was all of them)?


Not all of them, hardware AV1 has been rolling out over the last year. LG include hardware AV1 decoding in two of their TV lines now (OLED ZX and NanoCell). MediaTek's new SOCs include it, and they're being used in Xiaomi, Nokia, and Motorola phones this year. Amlogic's TV and set-top box SOCs, used in things like the Amazon Fire TV as well as Android smart TVs and dedicated media players, include it. Realtek's TV SOC with hardware AV1 support ships "early 2020."

Hardware AV1 decoding is available on some Android devices right now, and will ship with a lot more over the course of the year. So it seems the perfect time to ship an AV1 option behind an opt-in setting, which is what they've done. If you don't have hardware support and it's not worth the battery drain then you don't have to select it and there's no way they're removing H.264 support anytime soon.


Hopefully the open/free nature means it'll be cheaper to integrate into lower-end ARM IP, so that it's not just the flagship Snapdragon and Exynos chips that offer AV1.

Getting it into the hands of the top 5 Samsung devices would probably cover a significant market share (not to mention getting it into all Apple phones/tablets).


>it'll be cheaper to integrate into

It is not like the hardware part of the die is free. AV1 has comparatively higher decoding requirements, even in hardware.


Indeed. I imagine some older codecs could be dropped though, and the cost saved on IP should offset it. I'm not sure what the comparative silicon size might be though so I can't speculate.


Which is why they launched it behind an opt-in checkbox.

New codecs are always chicken-and-egg -- you need to have content available before the hardware encoders/decoder silicon gets produced and then trickles down to the phones.


Do phones have HW support for VP9?

edit: looks like it, at least for recent snapdragon phones:

https://en.wikipedia.org/wiki/VP9#Hardware_implementations


Do 2+ year old phones count as recent?


From the link,

Snapdragon 820: launched early 2016

Snapdragon 660: announced on May 9, 2017

Not saying that all phones made after 2017 will have it, but there are mid range phones on sale that do.


I see a question mark next to the 660 on the Wikipedia page. Is there some indication of that date from elsewhere?


Exactly. I am wondering whether this is going to be some sort of play against VVC.

Or will Youtube and Netflix force AV1 upon everyone where if you want high quality HDR or 4K Content you will require AV1.


> Or will Youtube and Netflix force AV1 upon everyone where if you want high quality HDR or 4K Content you will require AV1.

I sure hope so!


I sure hope it's a war on VVC. I don't know why anyone would want a patent-encumbered, pay-to-use video codec over an open, free one unless you're a stakeholder.


> I don't know why anyone would want a patent-encumbered, pay-to-use video codec

Again, AV1 is not patent-free either.


But it is royalty-free. The patents are freely licensed to those implementing/using AV1.


All else being equal, that is...


On YouTube you already need a device that supports VP9.2 to get HDR. And there's a huge amount of HDR TVs/STBs (including Sony and the flagship Shield TV) which don't get HDR YouTube because they only support base VP9 and H.265.


Youtube probably will; I doubt Netflix will. Netflix has no horse in this race, so to speak.

It's also just easier for Netflix to keep the data in more formats, because they have far, far less data than Youtube.


The real question is how the rest of the market will respond to EVC, not necessarily VVC.


As far as I can tell, I don't see EVC being in the race. Its free profile is barely any better than H.264, while its pay profile is nowhere near VVC.


Edit: to clarify, this is not a rhetorical question - I would like to know so that I and other end users can make an informed choice about whether to opt in to this or not.


If you are on a battery-powered device, you either do it in hardware or you don't watch it. Even software H.264 decoding on most Android phones is quite taxing, as they are relatively weak in single-core performance. The energy saving from 20% less bitrate over a 4G transfer is going to be minimal, comparatively speaking.


When is the video industry going to step away from the 24fps legacy? Judder on panning camera scenes kills all the joy. That would be a much bigger improvement than switching to 10-bit color.


Unfortunately, the fact that lower-budget TV was recorded onto tape at 60fps in the old days of analog television has effectively ruined high frame rates for everyone.

Gemini Man just came out at 120fps in cinemas and 60fps on Blu-ray, and regardless of your feelings about the quality of the film, I would have thought anybody with eyes could appreciate how amazing the high frame rate looks.

But no, it’s all people moaning about how it looks like a soap opera and “so unnatural” (too lifelike).

It seems anyone who tries high frame rate is crucified for it, so directors aren’t exactly incentivised to try again.


High framerates DO look like soap operas. It makes it feel like you're watching a play, rather than a movie, and is extremely distracting.

When I go to people's houses and they have a newish TV, it will invariably have frame interpolation of some kind on, which effectively gives everything an approximated high frame rate. I immediately turn it off for them, and it is always met with either "oh, yeah, that was really bothering me" or "I can't tell the difference." I've never had someone want it turned back on.


You're mixing up frame interpolation and original 60FPS content. That's not the same thing at all.


I agree. The best frame interpolation implementations of today, that I’ve seen, suck. In principle I think the algorithms and implementations will improve when low persistence, high refresh rate displays are commonplace (ie the 1000 Hz OLED ideal). With proper frame interpolation, you could, in theory, mimic any persistence profile you like. Like the look of CRTs? There is a filter for that. Like the look of cinema shutters? There’s a filter for that. We’re a long ways away from that though. The added latency would only be as long as one frame time of the targeted interpolation source frame rate. So 16.6 ms at 60 Hz. If that is unacceptable you can always do sample and hold or BFI (the true holy grail for gaming).


No, I'm not. The frame interpolation is actually pretty good, and approximates high framerate relatively well. It is the high framerate itself which causes this effect, however. I've seen two movies in theater at high framerate that were shot as such (The Hobbit and another I'm forgetting), as well as the ability to watch high framerate content on YouTube, and it is absolutely the high framerate that is jarring in cinematic content.


IMO 48fps was great in the Hobbit trilogy. Maybe not so much in the first one. But I wish directors would opt for higher frame rates if they have fast moving pans, especially in 3D. 3D in 24fps oftentimes looks choppy af.

By all means, keep indie and art-house productions non-3D, 24fps and on celluloid. But for insane triple-A productions: higher frame rates, please.

Supposedly the Avatar sequels are gonna be in 48 or even 60. Maybe they'll act as a catalyst again, like the first one did for 3D.


My experience is different: I've seen all three Hobbit movies in HFR at 48fps and it was gorgeous, whereas interpolation is not even close to what I've seen in theaters. Cinema is not in the FPS count but in camera movements and frame composition. Cheap TV shows and movies look cheap regardless of FPS count.


I almost walked out of the theater, and my whole group was on the same page. It genuinely felt like we were watching a play.


Thanks for the clarification. I'm not convinced that TV upsampling is generally good, but you make a coherent point and I'm going to try it out myself :-)


I love those TV options when they work. I hate hate hate hate juddering motion on any screen. I will agree that sometimes the quality of an actor can degrade when you get to see more frames of them per second. (As if you are able to see more nuances which makes you know they are acting.)


I wonder how high frame rate will be received in the future when everyone who grew up watching 60i soap operas is gone and everyone instead grew up watching 24p content made for Netflix/Amazon Prime/AppleTV+


23.976 fps in most cases, actually. For anyone wondering: when color was added to NTSC, the field rate was nudged down by a factor of 1000/1001 (60 Hz to ~59.94 Hz) so the color subcarrier wouldn't interfere with the audio carrier of the old black-and-white signal, and film-sourced content slowed down with it. It's a meaningless technical artifact nowadays.
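
The arithmetic, for the curious - all the NTSC-derived rates are just the nominal rate scaled by 1000/1001:

    # NTSC-derived frame rates: nominal rate scaled by 1000/1001.
    for nominal in (24, 30, 60):
        print(nominal, "->", round(nominal * 1000 / 1001, 3))
    # 24 -> 23.976, 30 -> 29.97, 60 -> 59.94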


Agreed - especially in VR 360 stuff.

There's some astonishing VR footage to be found on YouTube -- the VR180 stereoscopic stuff you can watch with a headset is on a whole other level when it comes to experiencing the filmed medium.

But, for me at least, as a Quest owner (so maybe the mobile hardware can't process any higher-bitrate stuff, I don't know), the juddery, blocky, muddy, motion-smeared 24fps quality of even the highest-bitrate videos YouTube will serve in that format detracts a lot from the experience and reminds you that you're not a drone sweeping across the Saharan desert, you're some sweaty nerd sat with a fancy hat on.


How much slower are encodes vs H.264 or HEVC?

The way some of the comments here are talking about it, it's like it requires a 2-hour encode for 1 hour of video.

For Netflix this is an encode-once-then-just-CDN-the-files situation. I can't imagine this would be a significant cost.


SVT-AV1 is faster than x265. https://medium.com/@ewoutterhoeven/av1-is-ready-for-prime-ti...

"We have found that the real-world compression/speed trade-offs for AV1 are in fact excellent, and better than HEVC." https://blogs.cisco.com/collaboration/cisco-leap-frogs-h-264... Discussion https://news.ycombinator.com/item?id=20291344


No it's not. It compares it only to x265 veryslow. A fair comparison would be to test every x265 mode against every AV1 mode; x265 is still faster for the same quality in, for example, slow mode. Even then you can only talk about "faster" for a certain use case. E.g. the article starts with video conferencing, which is typically a special mode where you only do 1 pass and disable a lot of optimizations to meet latency requirements. So you can only compare similar modes for a specific use case.


> x265 is still faster for the same quality in, for example, slow mode

Do you have a recent benchmark showing this? Things are changing fast enough that it's hard for me to Google the current state of things.


It takes years after standardization for software encoders to become efficient; it was this way with x264 and x265, and it will be the same again with rav1e. Initially they're only going to provide a good quality/size tradeoff if you throw ungodly amounts of CPU resources at them. There's also Intel's SVT-* encoder family, but that's meant for many-core server-class machines, so again only good for batch encoding.

But for streaming services even those initially costly encoders are worth it since they pay the cost only once and then stream to millions of viewers.


There is a significant one-time cost to a new codec, because the entire library needs to be re-encoded, but that cost is more than made up for as long as one of two things is true:

1) It takes significantly fewer resources to stream to customers (bandwidth, CPU, disk, RAM, etc) or

2) It provides a significantly better experience for the user, which translates to more retention (either better picture quality or fewer video pauses or both).

If you can find a codec that does both, that's worth far more than any encoding cost.


I can list multiple streaming services off the top of my head that could really use better picture quality.


I was shocked by how poor the HBO show I just watched on Amazon Prime looked on my 2019 TV. Visible banding in backgrounds, like 16-bit color from the late 90s. For all the flak you hear about Netflix streaming compared to 4K BDs, I very rarely notice such visible encoding artifacts.


Hugely slower, and there is little (maybe zero?) hardware encoding support out there, which makes it too expensive for most use cases.

You are completely right about Netflix’s needs here. Spending thousands of dollars on re-encoding video to shave off 1% of its size in bytes can easily be profitable at the scale of a hit show like Stranger Things.


> there is little (maybe zero?) hardware encoding support out there

There are companies offering cloud AV1 FPGA encoding on beefy servers. I'd imagine the performance and quality is much better than what is available to mere mortals right now.


Yeah, I was going to comment to that effect as well. Another comment implied they went for SVT-AV1, but (provided the IP is available) encoding on a big FPGA makes sense, regardless of the codec. Plus, you buy the HW once, and can use it multiple times thereafter.


And the savings are much better than 1%, if they really do spend that much time encoding. The complexity of encoding is relatively similar for equivalent quality/bitrate (at least with some of the encoders).


30fps on a 9900KS at 1080p. Three to four times slower than x264. As you say, for ahead-of-time, write-once-read-many media like Netflix, that's an acceptable trade. It's not going to work for live media yet.

But 10-bit 4K streams need nearly 50 GB of RAM and vastly more time than 1080p. I've never had much luck.
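
For a sense of scale, here's what that throughput means for a full film (a rough sketch assuming the 30fps 1080p figure above holds for the entire encode):

    # Encode wall-clock time at a given encoder throughput.
    frames = 2 * 3600 * 24             # 2-hour film at 24 fps = 172,800 frames
    encode_fps = 30                    # the 1080p figure quoted above
    print(frames / encode_fps / 3600)  # ~1.6 hours of wall-clock time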


> It's not going to work for live media yet.

Cisco's AV1 encoder does 1080p30 encoding for video calls on a laptop. They found it's a matter of tuning the encoder for the live video use case:

https://blogs.cisco.com/collaboration/cisco-leap-frogs-h-264...

Xilinx is working on FPGA based encoding for AV1. They already offer it for VP9 (they bought NGCodec) and Twitch is using it:

https://www.xilinx.com/applications/data-center/video-imagin...

https://blog.twitch.tv/en/2018/12/19/how-v-p9-delivers-value...


You can't draw conclusions from this for live streaming. That's typically just a subset of the encoder being used in 1 pass. Same for H.264 etc.

The live-streaming part of AV1 is still fully in the works. RTP packet formats for SVC aren't even finished yet, let alone implemented in a codec.


Not drawing conclusions. That's why I said "yet".

Spec, hardware encoders and maybe even software encoder improvements will likely make it practical.


Which codec features do make it RAM hungry? Are they optional and can be turned off in the encoder?


Codecs like AV1 do a LOT of motion and transformation detection. That's generally where a lot of that RAM goes: they are storing a metric ton of matrix data from prior (and future? I'm not sure if AV1 does this) frames so that when calculating the current frame stream they can quickly look up that info.

Needless to say, figuring out which piece of data should be plucked out and transformed from prior frames is really computationally expensive.

Here's a paper on exactly what sort of tools are available for the codec: https://jmvalin.ca/papers/AV1_tools.pdf

edit: I should note that the data stream for H.264 is FAR simpler. It doesn't support nearly the number of transformations that AV1 does.
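
To give a flavor of where the compute and memory go, here's a toy full-search block matcher - nothing like a real AV1 encoder, which uses much smarter search and many more prediction tools, but the reference frames it keeps resident are real: a raw 4K 10-bit 4:2:0 frame stored at two bytes per sample is roughly 25 MB, and encoders hold several:

    import numpy as np

    # Toy motion estimation: find the offset in the reference frame that
    # best predicts one block of the current frame (minimum SAD).
    def motion_search(cur, ref, by, bx, bs=16, radius=16):
        block = cur[by:by + bs, bx:bx + bs].astype(np.int32)
        best_sad, best_mv = None, (0, 0)
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                y, x = by + dy, bx + dx
                if y < 0 or x < 0 or y + bs > ref.shape[0] or x + bs > ref.shape[1]:
                    continue
                cand = ref[y:y + bs, x:x + bs].astype(np.int32)
                sad = int(np.abs(block - cand).sum())  # sum of absolute differences
                if best_sad is None or sad < best_sad:
                    best_sad, best_mv = sad, (dy, dx)
        return best_mv, best_sad

Even this naive version does (2*16+1)^2 = 1,089 block comparisons for a single 16x16 block; real encoders search multiple reference frames, multiple block sizes, and sub-pixel offsets on top of that.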


Given that this is ARM, will this become available to the Raspberry Pi some time? I've set up a Raspberry Pi 4 as a media center but Netflix doesn't work because of codec issues.


Netflix only works on devices that support its DRM implementations. I don't think Widevine works on ARM/ARM64

On Android there's some specific API they call that only works on "certified" devices from Google. Random hobbyist builds won't work.


Widevine is supported on Android, and is utilized by the MediaDrm APIs.[1][2]

Netflix uses Google’s SafetyNet API to make sure the device is “certified”. When this API check returns false, some features (like downloading media for offline viewing) will be disabled for the device. In some cases, the app won’t even show up in Google Play. Devices with a custom ROM are likely to fail the test, but luckily the check can be circumvented with the hide feature of Magisk[3]. Netflix works fine on my LineageOS-powered device when using Magisk.

But even if your device passes the SafetyNet checks, you might not be able to play HD videos. This depends on the Widevine security level, which can be L1, L2 or L3; only devices with L1 are allowed to play HD videos. You can check your device’s Widevine security level with DRM Info[4], for example. Netflix also lists specific device models and chipsets for which they support HD streaming.[5]

[1] https://source.android.com/devices/drm

[2] https://developer.android.com/reference/android/media/MediaD...

[3] https://github.com/topjohnwu/Magisk

[4] https://play.google.com/store/apps/details?id=com.androidfun...

[5] https://help.netflix.com/en/node/23939


Is there a convenient way to check which instruction set your phone supports? Something akin to hwinfo for Windows.

https://www.guru3d.com/index.php?ct=files&action=file&id=263


How is the hardware decoding story nowadays? Have devices with hardware decoders already started shipping?


Broadcom seems to have a hardware decoder ready now: https://www.cnx-software.com/2019/09/25/broadcom-bcm7218x-st...


Newer Snapdragon SoCs, and that's about it right now.


Only MediaTek has a smartphone SoC with AV1 available.


No word on what their encoding toolset is. That's been the limiting factor in widespread adoption thus far; it just takes too much CPU and too much time to encode AV1 vs H.264, H.265, or VP9.


> No word on what their encoding toolset is.

They're using SVT-AV1[1], the encoder they've collaborated with Intel on.[2][3]

[1] https://github.com/OpenVisualCloud/SVT-AV1

[2] https://newsroom.intel.com/news/intel-netflix-deliver-av1-sc...

[3] https://netflixtechblog.com/introducing-svt-av1-a-scalable-o...


So SVT-AV1 now does 20% better than VP9?

Edit: They did mention the 20% is from AV1-libaom compression efficiency as measured against VP9-libvpx. So not SVT-AV1.


They cite AV1-libaom for the 20% efficiency improvement over VP9, but I didn't take that to mean that they're using the reference encoder in production. Do you happen to know for sure?


For Netflix this likely doesn't matter. While Youtube deals with 300+ hours of content uploaded every minute that it has to encode, Netflix only has to encode ~100 titles per month. Even if it takes a month to encode 1 of these, it's still easily doable for Netflix.


Perhaps they're only AV1-ing a blockbuster per week, say.


Yes and no. It's a cost-benefit deal. Just because they _can_ (and they surely can), doesn't mean it is cost-beneficial to do so now. It takes CPU time (and therefore money) for each encode they do. If one takes many more resources to do, it may not be beneficial to do so yet. I originally posted my comment wondering if there had been some encoder breakthrough that I had missed that had made AV1 a viable option finally. Looks like there was, but it's still not nearly as good as it needs to be for widespread adoption (looks like it's still 30x slower than a comparable HEVC or VP9 encode). So I guess it comes down to: are they saving enough on downstream bandwidth to balance out the CPU time for the encode? Releasing it would suggest yes, or that they see some benefit to pushing the standard into the mainstream now rather than in six months, regardless of the cost to them.


savings on bandwidth will make up for it

Almost like Netflix has data analysts and experts in the field who have already calculated the cost benefits based on internal metrics unavailable to us and chosen to implement it...


> doesn't mean it is cost beneficial to do so now

But they are using AV1 now, i.e. you should presume Netflix has judged that the benefits outweigh the costs (unless you have other relevant information to share).


I would argue that, since the CPU is a one-time cost but the bandwidth benefit is recurring (not just for the company but for the customers as well, especially on mobile), the calculation is rather easy.


That kind of CPU load is a drop in the bucket for Netflix.


Yup. Run it during off peak with extra hardware. EZ.


Yeah this seems like a good job for AWS spot or the like. If it costs $.01/minute of video, probably worth it even if only a few hundred people watch the title.


Does Netflix encode realtime? I would think they would encode all of the possible data rates exactly once, then stream off disk. I would think realtime encoding would add a significant cost per client.

* Disclaimer: I know nothing of AV1.


They encode once then stream off disk.


No, they don't have much time constraint on encoding. But the bill still matters; if AV1 costs $$$$$ to encode because you need so many servers it's a bad situation.


I remember an article on video encoding where Netflix had approached some people basically with the question "what if we didn't care about encoding times?" Netflix doesn't want to waste money, but encoding cost might not be something that they want to optimize against other things. For example, let's say that you have an episode of a TV show. Netflix is going to transfer that file 10M times. If that file is 2GB, that's 19PB by my math. I know that they aren't paying AWS's exorbitant transfer fees, but it can still add up to a lot of money: $200,000 at a cent per GB, $20,000 at a tenth of a cent per GB. On top of that, more CDN storage, more servers needed to transfer additional data, etc. There are certainly costs. How much would it cost to encode things? Let's say that AV1 can be encoded at 1/10th real-time on a 16-v-core box (which seems reasonable given https://medium.com/@ewoutterhoeven/av1-is-ready-for-prime-ti...). That's $7.60 on Google Cloud for an hour-long show (I know they use AWS, but this is just representative pricing).

Let's say that AV1 reduces file sizes by 40% (compared to H.264), saving them 0.8GB per stream. Let's say that only 10% of users have AV1 capability. And let's say that they only pay a tenth of a cent per GB for bandwidth. They're still saving 800,000 GB of bandwidth or $800. Of course, there are also going to be other savings like electricity since transferring smaller files means needing fewer servers (servers have network cards that get saturated whether in their data center or CDN).

At Google, encoding costs can be big since a lot of YouTube videos aren't watched a lot. With Netflix, the library is a lot smaller and more popular. Netflix is up to around 170M customers. Each of those customers might have a few household members watching something popular (and maybe re-watching those things). Even encoding something very slowly ends up not costing that much money.

But it's also about the customer experience. Mobile companies throttle video services. I believe Google started using VP8 so that T-Mobile customers could see 720p video without paying extra for HD video. When downloading files for offline usage on a plane, the size is going to matter both in terms of the time it takes to download and the storage used on your device which is limited.

Even if bandwidth only costs Netflix a hundredth of a cent per GB, the compute needed for AV1 just doesn't seem to matter given the way Netflix's business works.
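
Putting that reasoning in one place - all the constants below are the illustrative figures from this comment, not real Netflix numbers:

    # Toy encode-cost vs. bandwidth-savings model for a single title.
    encode_cost = 7.60     # $ for a slow AV1 encode of a 1-hour show
    streams = 10_000_000   # times the title is streamed
    av1_share = 0.10       # fraction of clients that can decode AV1
    gb_saved = 0.8         # 40% saved off a 2 GB H.264 file
    usd_per_gb = 0.001     # a tenth of a cent per GB of bandwidth

    savings = streams * av1_share * gb_saved * usd_per_gb
    print(savings, ">", encode_cost)  # $800.0 in savings vs ~$7.60 spent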


Agreed completely with this - there are second order effects to this as well on the customer side.

A smaller file for the same quality means that customers need less bandwidth to stream a show, improving their experience and decreasing buffer times, etc. This might not be as relevant in the USA as other places, but I'm sure there are a significant number of customers and potential customers that this will make a difference with.

With an encode-once stream-many setup, it's likely quite worth it for Netflix to pay whatever upfront transcoding cost if the resulting file is advantageous in any way.


Independent of network cards, TLS using AES-GCM will cost one cycle per byte.
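
A quick sanity check on what one cycle per byte means at streaming bitrates (using the 15.6 Mbps UHD figure mentioned elsewhere in the thread; the point is that it's small, but it scales linearly with bytes served):

    # AES-GCM at ~1 cycle/byte vs. one 3 GHz core, for a single UHD stream.
    stream_mbps = 15.6
    bytes_per_sec = stream_mbps * 1e6 / 8        # ~1.95 MB/s
    core_fraction = (bytes_per_sec * 1.0) / 3e9  # 1 cycle/byte over 3e9 cycles/s
    print(core_fraction)                         # ~0.00065, i.e. ~0.07% of a core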


>"what if we didn't care about encoding times?"

Yeah, I can already see it. AV5 with 3 months of encoding time on a 128 core EPYC for a 90 minute movie.


Theoretically yes, but the hardware cost of encoding their microscopic, in streaming terms, number of videos would have to be absurd to even come close to the savings they get from lowering their bandwidth by 20%.


I assume they are using AV1+Opus?

When are hardware decoders and encoders for AV1 coming to common GPUs and SoCs (Intel, AMD, Nvidia, Qualcomm, etc.)?


Does this include Fire TV devices?



