I would agree that the TV is the problem. Or, if my recent experience is anything to go by, the quality of the Wi-Fi signal reaching the TV. I would try to improve that first - my own Roku-enabled TV was having streaming issues with the Spectrum app (all other streaming apps were just fine) when the Roku in it showed signal strength as only "Fair". Things have improved dramatically since I was able to get the signal strength up to "Good", which admittedly took a lot of work AND the purchase of a more capable router.
The most believable numbers I've found are 1.5 Mbps for SD, 3 Mbps for DVD quality, 5 Mbps for HD quality, and 8+ Mbps for 1080p on the PS3. I don't see Netflix offering official numbers; I arrived at these by playing with Speedtest.net and adding bandwidth eaters like VPN connections until I saw the quality degrade.
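As a rough sanity check on figures like these, a bitrate converts directly into data consumed per hour. The sketch below uses the community-reported tier bitrates quoted above (not official Netflix numbers):

```python
# Rough data-per-hour estimate from a streaming bitrate.
# The tier bitrates are the community figures quoted above, not official numbers.

def gb_per_hour(mbps: float) -> float:
    """Convert a stream bitrate in Mbps to GB consumed per hour."""
    # Mbps -> MB/s (divide by 8) -> MB/hour (x 3600 s) -> GB (divide by 1000)
    return mbps / 8 * 3600 / 1000

tiers = {"SD": 1.5, "DVD quality": 3.0, "HD": 5.0, "1080p (PS3)": 8.0}
for name, rate in tiers.items():
    print(f"{name}: ~{gb_per_hour(rate):.2f} GB/hour")
```

So an 8 Mbps 1080p stream works out to roughly 3.6 GB per hour, which is why a capped or shared connection notices HD streaming quickly.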
While I couldn't find any hard numbers on Netflix's site, the consensus seems to be that as long as you have a decent DSL connection (1.5 Mbps), you should be able to stream successfully (though there might be a fair amount of buffering). Source
I do not personally have Netflix, but my aunt does (the standard-definition version through a Wii), and she has no problem watching movies with a connection speed that hovers between 700 Kbps and 900 Kbps (tested at Speakeasy.net). I was actually surprised that video playback didn't lag with speeds that low, but there it is.
We have a 1.5/10 Mb connection and see Netflix eat as much bandwidth as it can get. It's not uncommon for it to use 9 Mbps on HD programs over our Xbox 360. However, if any other machines are online, it will fall back to a lower rate and change the quality of the stream.
This is Neil Hunt, Chief Product Officer, to tell our members in Canada that starting today, watching movies and TV shows streaming from Netflix will use 2/3 less data on average, with minimal impact to video quality.
I ran the Activity Monitor app on my MacBook Pro while simultaneously streaming Netflix on 4 devices on my home wifi (my MacBook Pro, an iPhone 5, an iPhone 4 and a smart TV upstairs). The total data rate never got higher than 709 kbps (less than 1 Mbps) and on average it stayed around 200-300 kbps. All devices were streaming flawlessly. I even called Comcast on my VOIP phone and the bandwidth usage stayed the same.
To my knowledge, the Activity Monitor shows bandwidth used by the wifi network as a whole (which was what I was interested in), not the individual devices. You see, I am using Comcast "High-speed Internet." How fast is that? They don't say, but apparently I am also getting their Blast service, which gives me "up to" 50 Mbps. Wow, right? Except why do I need that kind of speed? Or more to the point: why do I need to PAY for that speed? Or even more to the point: do I ever actually get that speed? I'm not a gamer, though I wonder whether, if I were, I would still need that kind of bandwidth.
Enterprising users may be able to unlock the ability through software modification by making use of a hacked console; however, these methods have a variety of associated dangers such that I have never bothered to investigate them.
There is a way to manage your streaming video quality on Netflix through the website, though only at a coarse level. After logging in, click on Your Account & Help in the top right corner. Then click on Manage Video Quality on the right side, about halfway down the page.
Finally, select the video quality you'd like to use and click on Save.
I bought the most recent Apple TV 4K (2021), but to my disappointment, when I plugged it in, all my video content from apps like Amazon Prime and Netflix displays only in HD. Note that I also have the 2017 Apple TV 4K, and when I plug it into the same HDMI port, it displays in 4K on my TV as expected.
Literally nothing has changed in terms of TV hardware, software, cables, or tvOS video settings; the only change is the 2021 Apple TV hardware, which has a bug where it can play video only in HD even though it is advertised as 4K hardware. See the pictures attached below so you know exactly what I am talking about.
I have been dealing with this issue since Monday. I swapped in the new AppleTV 4K on my LG OLED for the previous gen on Friday, but I didn't notice the issue right away because I only watched AppleTV+, iTunes purchased content, and Disney+ until Monday. Monday evening I went to watch the last episode of "Jupiter's Legacy" on Netflix, and one scene had some pretty obscure dialogue. I went to turn on the subtitles via the pull-down menu so that I could also play with the new "circular scrubbing" by rewinding, and I saw that there was just an HD badge in the info pane instead of the Dolby Vision badge I was expecting.
Since then I have played with all of the settings on the AppleTV to get this "fixed", but with no positive results. I eventually started testing a theory that the AppleTV is actually playing the 4K stream and just the badge is wrong, because the stream still looked good to me and because the TV flashes up either the Dolby Vision or HDR label when a stream starts from Netflix or Prime, even though the info pull-down only shows HD with no mention of HDR or DV. Because the AppleTV upscales to 4K, my TV always reports 4K, even for a title that is only available in 1080p. So I switched the max resolution on the AppleTV to 1080p; when I did that, the TV of course reported only 1080p, but the Netflix app showed only an HD badge on the title page of a 4K title. When I switched the max resolution back to 4K, the 4K label (or the Dolby Vision label) returned on the title page. In both cases the info pull-down in the player displayed only the HD badge, even though the played stream looked much better with the max resolution set to 4K. That, of course, could be because the AppleTV was no longer upscaling, so I tried to think of some other test.
Then I had the idea of playing "The Expanse" and checking the TV info, because it is, oddly enough, delivered in HDR rather than Dolby Vision. My TV reports the HDR profile for HDR, HDR10, and HDR10+ content, and it reports that, even though the pull-down info in the Prime app only displays HD, the stream is playing HDR with the bt.2020 profile. This HDR profile is only applied to 4K or 8K streams, which makes me think the Prime stream is actually 4K. Additionally, YouTube reports via its completely different info function that it is playing a 4K stream.
I have a theory (a bit of a stretch, perhaps) that this is just a labeling problem in the info pull-down for apps that still use the "old" video API on the AppleTV. The video player API set was updated to support the new "circular scrubbing" gesture on the new AppleTV. Netflix and Prime can use this new gesture, but they still use the old info pull-down and so are somehow affected by this bug (partial use of the latest API set). The AppleTV player uses a new API set, with the info displayed at the bottom of the screen by pressing the back button, and this player (AppleTV+ or purchased content) does not seem to be affected by the bug. YouTube and Disney+ also appear unaffected, but both use completely different proprietary players that do not behave at all like the AppleTV+ player and do not yet support circular scrubbing.
Hopefully this is indeed just a software issue, whether it lies with Apple or with the various streamers that need to update their apps. Those affected will have to wait for more info from Apple (perhaps a software update) or an app update from the affected streamers.