
Wifi 6 vs Ethernet? Any real world comparisons?

Lightcycle

Rounder of bolts, Dropper of tools
Site Supporter
Our house is pre-wired for Ethernet, looks like Cat5e cabling. Thinking about dropping in a switch and LAN-connecting the TV so it has direct access to Internet streaming services and the media server. Also to provide a wired backhaul for another wifi access point on another floor, as the signal drops off rapidly from the single access point currently.

However, I've heard that 802.11ax can achieve faster speeds than Gigabit Ethernet. Anyone have first-hand real-world experience? What access point do you use? What is its speed and throughput effectiveness through walls and floors? What about the client network adapter? USB 3.0 vs 2.0 make a difference?
 
A 4K TV doesn't benefit from anything faster than Fast Ethernet; even 4K streams only run around 15-25 Mbps. I'm guessing they are 100 Mbit because that's the least expensive hardware these days.
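A quick back-of-the-envelope check (the stream bitrates below are rough ballpark assumptions, not measured numbers):

# Does a TV's 100 Mbit port cover typical video bitrates? (ballpark figures)
port_mbps = 100
streams_mbps = {"HD stream": 5, "4K stream": 20, "4K Blu-ray rip": 60}
for name, rate in streams_mbps.items():
    print(f"{name}: {rate} Mbps -> {rate / port_mbps:.0%} of the port")
# All of these fit through a 100 Mbit port, which is presumably why TV makers don't bother with GigE.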
 
And it gets better: my TV doesn't have a gigabit ethernet interface, just 100 Mbit.

Apparently no TV currently made supports gigabit ethernet.

Maybe I'll just do nothing.
I found that hard to believe, but apparently it's true! I am very surprised to learn this... the cost difference between a gigabit ethernet and a fast ethernet PHY has got to be pennies these days.
Our house is pre-wired for Ethernet, looks like Cat5e cabling. Thinking about dropping in a switch and LAN-connecting the TV so it has direct access to Internet streaming services and the media server. Also to provide a wired backhaul for another wifi access point on another floor, as the signal drops off rapidly from the single access point currently.

However, I've heard that 802.11ax can achieve faster speeds than Gigabit Ethernet. Anyone have first-hand real-world experience? What access point do you use? What is its speed and throughput effectiveness through walls and floors? What about the client network adapter? USB 3.0 vs 2.0 make a difference?
Most of these questions I can't answer, other than saying 802.11ax probably only pays off in a few specific optimal scenarios. However, with regard to USB 2.0 vs 3.0: in theory, a well-performing 802.11n adapter could completely saturate a USB 2.0 interface (802.11n goes as high as 600 Mbit/s on paper, while USB 2.0 only goes to 480 Mbit/s). In reality, that kind of throughput is extremely difficult to attain.

Ethernet is always a better option than wifi for performance. If you connect your TV via the ethernet cabling that already exists, that frees up wifi bandwidth for other devices, even if the data wouldn't be getting to your TV any faster. You are much better off spending $30 or so on a switch than spending hundreds on an 802.11ax router.
 
You would most likely need to buy an AX wifi adapter for any devices looking to connect to an AX wireless router. CAT5e cable is fine for gigabit networking, no real need for CAT6. If the cabling is in place, use it. It is the most stable way to build a network.

If you have dead wifi spots in the house and you have cable near that location, pop a cheap wifi router into the dead zone to deal with that specific issue; most can be configured as repeaters, or go whole-home mesh.

There are some good deals on used AC routers and switches on Kijiji. Pay attention to power consumption, since networking gear is always on. If you go down the switch road, pick one with more ports than you currently need. I put in a sixteen-port switch thinking it was enough... wrong.

Whatever you do keep it simple, since you will likely be the network admin.
 
802.11ax can achieve faster speeds; however, it's not as easy as installing a single Access Point.

My house is also wired with Cat 5e cable. I switched to Gb switches and a Gb router a few years ago, and all of the drops are Gb capable. I use a Ubiquiti AC WiFi AP that originally got 200-250 Mbps, but as more people in my neighborhood installed AC APs, congestion has dropped my speeds to ~150 Mbps.

A couple of years ago we did the power and cabling for a commercial video production co. (think Pepsi commercials rather than Game of Thrones). They installed AX APs in a surprisingly dense configuration to support laptops and phones. They also ran Cat6 to each workstation. I asked about just using the WiFi, since they had an AP for every 4 workstations, but they said the WiFi couldn't handle the throughput in the evenings when the workstations switched over to rendering.

There is also the issue of what feeds your Access Point. There are commercial units with dual Gb adapters, but for the most part anything you or I might use would have a single Gb connection. So, while the protocol might be faster than Gb, we won't see it.
 
Why would you care if your TV has GigE? It can't use data that fast anyway.

Our house came wired with Cat5e, but it was never terminated at the switch end. Wireless works for us, as I am rarely in a huge rush to fire large amounts of data around the network. Currently I have 3 wireless bridges converting wireless to wired in various places (either devices need a wire or there are a lot of devices in one spot, so that minimizes wireless connections and traffic). Two of those spots are near wired ethernet. When the bridges fail, I may convert those locations to wired.
 
And it gets better: my TV doesn't have a gigabit ethernet interface, just 100 Mbit.

Apparently no TV currently made supports gigabit ethernet.

Maybe I'll just do nothing.
You could put a streaming box in front of your TV (Apple TV, Android box, Shield, etc.) to hook into the gig ethernet. But otherwise, for streaming, 100BASE would be sufficient for it.
 
Our house is pre-wired for Ethernet, looks like Cat5e cabling. Thinking about dropping in a switch and LAN-connecting the TV so it has direct access to Internet streaming services and the media server. Also to provide a wired backhaul for another wifi access point on another floor, as the signal drops off rapidly from the single access point currently.

However, I've heard that 802.11ax can achieve faster speeds than Gigabit Ethernet. Anyone have first-hand real-world experience? What access point do you use? What is its speed and throughput effectiveness through walls and floors? What about the client network adapter? USB 3.0 vs 2.0 make a difference?

The real question is what problem are you trying to solve?

An optimized connectivity/wiring layout for cost?, A limitation in bandwidth, or perhaps something else?

From an industry insider, here is a dirty little secret: "4K TV + faster network = better TV experience" is a complete lie. Will it get there one day? Yes, but not today or anytime soon.

Most streaming content (which is effectively everything now) is so heavily compressed, 4K or otherwise, that it does not matter. As long as your internet bitrate is slightly greater than the underlying video bit rate, that is all you need. Faster will not make it better. It's like having a 10x bigger hose but the same amount of water. The source bit rate is typically fixed. Watching SD content on an HD TV is no better than HD content on a 4K TV, and that is what we are really doing. It's a placebo effect. Rarely do we watch true native 4K content for TV/movies/sports. It will come, but not yet.

TL;DR: for those that are technical, we can't even hit visually lossless bit rates for 1080p (HD) yet. That's roughly 50 Mbps for H.264/VP8, or 25 Mbps for the newer H.265/VP9/10/etc. (8-bit @ 60 fps), based on testing I've done with objective industry experts. Most TV content is 24/29 fps. Netflix HD streams at ~2-5 Mbps, although it may vary. That is 10-20% of the bit rate required for visually lossless HD. In other words, we have a long way to go. Whether you have a 100 Mbps internet connection, 250 Mbps, 1 Gbps (1000 Mbps), or 30 Mbps, it does not matter for most video content. Gaming is different. For the average household a 30-50 Mbps downlink is more than enough for a few simultaneous video streams. Don't pay for more internet pipe than you use. It just sits idle anyway.
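Quick math on those figures (just dividing the numbers above, nothing measured):

# How far is streaming HD from the ~25 Mbps "visually lossless" H.265 target above?
target_mbps = 25
for netflix_hd_mbps in (2, 5):
    print(f"{netflix_hd_mbps} Mbps is {netflix_hd_mbps / target_mbps:.0%} of the target")
# prints 8% and 20%, i.e. roughly the 10-20% range quoted above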

Source content is by far our biggest limitation, not your TV or network speed. It's simply too expensive a.t.m. to send 100-200+ Mbps to each house steadily; no one is paying 4x as much for Netflix to get 4x the data rate. We've had 4K projection in theaters for >10 years, using 12-bit colour and uncompressed native content. It looks good. To this day many blockbuster releases are still only released in 2K in theaters. Can you tell? And this is on a huge screen. 99% can't, even up close. This is why HDR (usually 10-bit vs. 8-bit) has been the push from studios on direct releases for the last 5 years, not 4K. Why? Movies are already in 12-bit (many are actually 16-bit raw). You should focus on higher frame rate (sports/racing!) and higher bit rate over higher resolution.

To answer the original question on USB 2.0 vs 3.0: USB 2.0 is limited to 480 Mbps (full bus). Is it too slow? If your content is Netflix at HD, say streaming at 5 Mbps, we would use maybe 1% of that a.t.m. I'm still a fan of wired connections, but bandwidth/latency is usually not your limitation for wireless.

Last, pro tip: avoid buying $80 "monster" gold-plated cables for digital connections (analog is a different story); they are a waste of money. For most of the industrial testing we did, we used the cheapest cables we could find, literally from the surplus store. Why? If it works with those, it works with anything. Digital signals are discrete: they either make it or they don't, there is no in-between.
 
The real question is what problem are you trying to solve?

That's a good question.

We were Zoom teleconferencing over Christmas and I was experiencing bad lag and throughput degradation from being too far from the single Google mesh AP one floor away. It's the older non-Nest version (AC1200).

Anyway, the obvious solution was to drop in another AP puck on that floor; that's the advertised selling feature of mesh networking. The newer Google Nest Wifi is compatible with the older pucks and will do AC2200, so I wouldn't have to toss the old puck by switching to an incompatible brand. Then I was also thinking, since the house was wired for Ethernet, why not create a wired backhaul between the pucks for speed instead of chewing up the existing 2.4 GHz and 5 GHz bands for wireless backhaul. Google does not have a dedicated band for wireless backhaul.

And then that led me to think, well, if I was wiring the house anyway, why not connect the TV to the switch as well and forgo wireless to the TV. I have a DLNA media server on the network, so instead of streaming content wirelessly, I could do it over the LAN. Currently, there is no problem with speed to the media server, but I just got YouTube Premium and I have noticed that the TV is loading YouTube videos noticeably slower than my ethernet-connected laptop. Thinking perhaps going wired would fix that.

I've run Speedtests on my current setup, ethernet vs wifi, and the numbers I am getting are 650 Mbps over ethernet (full utilization of the cable service, no surprise), but 400 Mbps over 5 GHz 802.11ac with the mesh AP sitting right next to the laptop. I haven't run the test closer to the TV, but from what I'm reading here, those speeds should still be fast enough for 4K video. Then I read that all TV ethernet interfaces cap out at 100 Mbps anyway, so the wifi interface is actually faster...
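(Side note: a speed test mostly measures the internet leg. To compare the wired vs. wireless legs of the LAN itself, a rough Python sketch like this between two machines works; the port number is arbitrary, and iperf3 does the same job properly if you'd rather not roll your own:)

# Minimal LAN throughput test -- run "server" on one box, "client <server-ip>" on the other.
import socket, sys, time

PORT = 5001                 # arbitrary port, pick anything free
CHUNK = 64 * 1024           # 64 KiB per send
TOTAL = 500 * 1024 * 1024   # push 500 MiB total

def server():
    with socket.create_server(("", PORT)) as srv:
        conn, addr = srv.accept()
        received, start = 0, time.time()
        while True:
            data = conn.recv(CHUNK)
            if not data:
                break
            received += len(data)
        secs = time.time() - start
        print(f"{received * 8 / secs / 1e6:.0f} Mbit/s from {addr[0]}")

def client(host):
    payload, sent = b"\0" * CHUNK, 0
    with socket.create_connection((host, PORT)) as conn:
        while sent < TOTAL:
            conn.sendall(payload)
            sent += len(payload)

if __name__ == "__main__":
    # usage: python3 lanperf.py server   OR   python3 lanperf.py client 192.168.1.10
    client(sys.argv[2]) if sys.argv[1] == "client" else server()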

Still, as @TwistedKestrel pointed out, wiring up the TV would save wifi bandwidth for the other wireless devices that need to be wireless. It would probably also solve any latency issues vs speed issues.

Anyway, to make a long story even longer, at the point when I was asking the original question, I was mulling over whether to actually buy that Google Nest Wifi puck or to futureproof by moving straight to 802.11ax. Then I started reading that Wifi 6 is theoretically faster than Gigabit Ethernet, so why even install a switch? But it's so new, are there any real world experiences with AX? Let's ask GTAM!

Having typed all the above out, I realize I've embarrassingly revealed my rat's nest of convoluted thought processes...


From an industry insider, here is a dirty little secret. It's a lie. 4K TV + Faster network = better TV experience is a complete lie. Will it get there one day? Yes, but not today or anytime soon.

Most streaming content (which is effectively everything now) is so heavily compressed, 4K or otherwise, that it does not matter.

That's a good point. But since I am mostly pre-downloading shows, the sources are mainly 20-40 GB 2160p x264 files for a 1.5-2 hour movie. I expect the bitrate and compression with those files will be better than streaming content.
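(Quick math on what those files actually average out to, using the sizes and run times above:)

# Average bitrate of a pre-downloaded movie (sizes/durations from the post above)
def avg_mbps(size_gb, hours):
    return size_gb * 8 * 1000 / (hours * 3600)   # GB -> megabits, divided by seconds

print(round(avg_mbps(20, 2.0)))   # ~22 Mbit/s
print(round(avg_mbps(40, 1.5)))   # ~59 Mbit/s
# Even the heaviest file averages ~60 Mbit/s: well above a typical streaming bitrate,
# but still comfortably inside the TV's 100 Mbit port.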
 
A couple of years ago we did the power and cabling for a commercial video production co. (think Pepsi commercials rather than Game of Thrones). They installed AX APs in a surprisingly dense configuration to support laptops and phones. They also ran Cat6 to each workstation. I asked about just using the WiFi, since they had an AP for every 4 workstations, but they said the WiFi couldn't handle the throughput in the evenings when the workstations switched over to rendering.

Interesting. Which brand AX APs?

Cat6 must mean 10GBASE-T... I can see how that would be much faster than AX.
 
Interesting. Which brand AX APs?

Cat6 must mean 10GBASE-T... I can see how that would be much faster than AX.
They were a Cisco shop: PoE APs that had to start up in sequence because the start-up power exceeded PoE Class 3.

All of the copper (Cat6) was Gb; the uplinks to the switches were bonded fibre pairs (I don't know the speed, probably in the 10 Gb range).

I had a fleeting thought of using fibre at home, but my bottleneck is already storage I/O.
 
Interesting. Which brand AX APs?

Cat6 must mean 10GBASE-T... I can see how that would be much faster than AX.
10GbE is another thing that's great in theory, but ends up being not very useful due to how expensive the NICs & switches are. E.g. Netgear XS505M is an unmanaged 4-port 10GbE switch that costs $600+

"Multi-gig" i.e. 2.5GbE/5GbE is something that is trickling into consumer devices as it is much, much cheaper currently. A few semi-normal consumer 802.11ax routers come with one or two multigig ports.
 
As others have said, running gigabit to your TV is entirely unnecessary. It probably wouldn't even have the processing power to handle that sort of traffic load even if it did, which is why it doesn't.

As for the wired vs wireless debate, there are many things that matter. I'm sitting within sight of my reasonably high-end Wireless AC router right now and am saturating my gigabit fibre connection over a 5 GHz 802.11ac Wifi connection as I type this, downloading a large DMG update file for my MacBook Pro.

In this situation a gigabit cable connection would offer basically no advantage.

However, if I moved to a different room, I'd probably lose half my throughput (5 GHz falls off fast with distance and obstructions), so a cable connection would be of benefit again.

Many questions to ask ultimately to decide which one is better.
 
Currently, there is no problem with speed to the media server, but I just got YouTube Premium and I have noticed that the TV is loading YouTube videos noticeably slower than my ethernet-connected laptop. Thinking perhaps going wired would fix that.
This may not be a networking issue but rather a processing/resource problem... You likely have way more processing power and faster buffering on your laptop than on your TV.
 
That's a good point. But since I am mostly pre-downloading shows, the sources are mainly 20-40 GB 2160p x264 files for a 1.5-2 hour movie. I expect the bitrate and compression with those files will be better than streaming content.

Since you are on the heavier end of usage, I would hardline where possible. There is so much interference over these bands that it is not hard to saturate them if you are talking these kinds of data volumes (which I would define as power-user-level usage, to which some of my previous comments do not apply). Offloading is best if link saturation is the issue. Keep in mind your neighbors/other mobile devices might be flooding the bands as well. Wireless is unpredictable at best. An additional AP may help boost signal, if that is the issue, but will not fix a noisy band. Keep in mind the Google Nest APs will run at half the data rate of the main hub.

Wifi 6 is faster in theory; it mostly squeezes more data into the same spectrum (denser modulation, OFDMA) among other tricks. The reality is that those peak rates only hold up at short range with a clean, strong signal, and you also need client devices that support it, which are still limited. It will work great... until everyone else gets it too. I would suggest hardline if you are doing heavy transfers, and make sure your hardware acceleration is enabled. Depending on what you are using, that's either a GPU or, for those unfamiliar, Intel's (very quick, actually) hardware encode/decode support right on the CPU. x265 hardware encoding has come a long way: same quality at roughly half the bit rate (at ~10x the encoding cost), which may be an option.
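(If you want to experiment with the x265 route, a minimal sketch along these lines works, assuming ffmpeg is installed; the filenames and CRF value are placeholders, not recommendations:)

# Re-encode an x264 file to x265/HEVC via ffmpeg -- filenames and settings are placeholders.
import subprocess

src, dst = "movie_2160p_x264.mkv", "movie_2160p_x265.mkv"   # hypothetical filenames
subprocess.run([
    "ffmpeg", "-i", src,
    "-c:v", "libx265",    # software HEVC encoder: slow, but available everywhere
    "-crf", "22",         # quality target, tune to taste
    "-preset", "medium",
    "-c:a", "copy",       # leave the audio untouched
    dst,
], check=True)
# Hardware encoders (hevc_qsv on Intel, hevc_nvenc on NVIDIA) are much faster,
# if your ffmpeg build includes them.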
 
