2.4GHz wifi is not suitable for two big reasons: interference and low bandwidth. In any kind of suburban or city environment, and sometimes even in rural areas, 2.4GHz wifi will be congested with other networks, microwaves, and other appliances, causing massive speed degradation or fluctuations. The range of 2.4GHz is just too large for all the equipment that uses it in today’s world. In my previous apartment complex, for example, my phone could see 35 distinct 2.4GHz wifi networks, while only 3 at most can operate without interfering with each other. In that same building I could only see 13 5GHz networks. Which brings me to the second issue: bandwidth.
2.4GHz, at least here in the US, only has three channels that will not interfere with each other: 1, 6, and 11. If anyone puts their network between these three channels it will knock out both the one below and the one above; channel 3, for example, would interfere with both channels 1 and 6. By going up to 5GHz you get many more free channels, fewer networks competing for those channels, and wider channels allowing for much higher throughput. 2.4GHz allows 40MHz-wide channels, which in isolation would offer ~400Mbps, but you will never see that in the real world.
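To make the overlap concrete, here’s a rough Python sketch of my own (assuming the classic ~22 MHz channel footprint and the standard 5 MHz spacing between channel centers) that prints which channels step on which:

```python
# Rough sketch of which 2.4 GHz channels step on each other, assuming the
# classic ~22 MHz channel footprint and 5 MHz spacing between channel
# centers (channel 1 = 2412 MHz, channel 6 = 2437 MHz, channel 11 = 2462 MHz).

def center_mhz(ch):
    return 2407 + 5 * ch  # channel 1 -> 2412 MHz

def overlaps(ch_a, ch_b, width_mhz=22):
    # Two channels interfere when their footprints overlap, i.e. their
    # centers are closer together than one channel width.
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

for ch in range(1, 12):
    hits = [other for other in range(1, 12) if other != ch and overlaps(ch, other)]
    print(f"channel {ch:2d} interferes with: {hits}")

# Channels 1, 6 and 11 are 25 MHz apart, so they clear each other;
# channel 3 lands inside both channel 1's and channel 6's footprint.
```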
Personally, I think OEMs should just stop including it, or have it disabled by default and only enable it in an “advanced settings” area.
Edit: I am actually really surprised at how unpopular this opinion appears to be.
It is always amazing how many people think their own specific situation should be used as the defining standard for the rest of the world.
5 GHz just doesn’t get through stucco, concrete, or even an inconveniently located furnace very well, nor does it reach nearly as far as a 2.4 GHz signal when only drywall and wooden studs are in the way. It would take 5 APs at 5 GHz to cover the same area as 2 at 2.4 GHz in my environment.
The great thing is that you can disable 2.4 GHz wifi on all your devices, and the rest of us can continue to do what works for us.
I did specifically mention either removing it entirely OR disabling it by default.
So the rest of the world should change for your convenience. Got it.
I think it’s a fair opinion, but a lot of “cheap” IoT devices only support 2.4GHz, so I do have both networks set up in my house for that reason…
IoT devices should support 5 GHz, and at least for me personally, if a device doesn’t support it, I don’t buy it. Which also means that I have no IoT devices. LOL. My alarm system only supports 2.4 GHz, but it also has a cellular radio, so it has never been connected to Wi-Fi in the time I’ve owned it.
Why would you refuse to buy IoT devices unless they’re more expensive, use more battery and have less range? Like why, what does it give you to not have a 2.4 GHz network? It’s not like it’ll interfere with the 5 GHz network.
Like sure, the 2.4 GHz spectrum is pretty crowded and much slower. But at this point that’s pretty much all that’s left on 2.4 GHz: low-bandwidth, battery-powered devices at random spots inside the house, on the exterior walls, and all the way across the yard.
It’s the ideal spectrum to put those devices on: it’s dirt cheap (they all seem to use ESP8266 or ESP32 chips, lots of Espressif devices on the IoT network), it uses less power, it goes through walls better, and all it needs to get through is that the button has been pressed. I’m not gonna install an extra AP or two when 2.4 reaches fine, just so that a button can make my phone ring and a bell go ding dong, or so a camera can stream at bitrates you could push over dialup internet.
Phones and laptops? Yeah, they’re definitely all on 5 GHz. If anything I prefer my IoT on 2.4, because then I can make my 5 GHz network WPA3 and 11ac/11ax only, so I don’t have random IoT devices running at 11n speeds slowing down my 5 GHz network.
But cameras on 5GHz could stream very high quality 4K video directly to your phone or whatever; 2.4GHz would be a lot more likely to buffer and skip doing that.
My best camera does 1080p at 150kbit/s H264. Most “4K” cameras have such shit encoding that they’re still nowhere near exceeding what 2.4 GHz can provide. And if I were to spend money on a nice 4K camera that actually streams real 4K, I would also invest in making it run over PoE, because that would chew through battery like there’s no tomorrow and it needs a power source anyway, and it would go to an NVR to store it all on a RAID array.
And if that had to happen, I’d just put it on a dedicated 5 GHz network, because I want to keep the good bandwidth for the devices that need it, like the TV, phones, and laptops. Devices on older WiFi standards slow down the network because they use more airtime to send data at lower rates, so fast devices get less airtime to send data at high rates.
Using the most fitting tech for the need is more important than trying to get them all on the latest and greatest. Devices need to be worthy of being granted access to my 5 GHz networks.
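To put rough numbers on the airtime point: a quick back-of-the-envelope sketch (my own, using published max PHY rates and ignoring preambles, ACKs, and contention, so purely illustrative) of how long the air stays busy moving the same 1 MB at different link rates:

```python
# Back-of-the-envelope airtime: how long the channel stays busy moving the
# same 1 MB payload at different max PHY rates. Ignores preambles, ACKs and
# contention, which make the gap even worse for slow senders.

PAYLOAD_BITS = 1_000_000 * 8  # 1 MB

phy_rates_mbps = {
    "802.11n, 1 stream, 20 MHz": 72,
    "802.11ac, 2 streams, 80 MHz": 867,
    "802.11ax, 2 streams, 80 MHz": 1201,
}

for name, rate in phy_rates_mbps.items():
    airtime_ms = PAYLOAD_BITS / (rate * 1_000_000) * 1000
    print(f"{name:30s} ~{airtime_ms:6.1f} ms of airtime per MB")

# The 11n client ties up the air for ~110 ms per MB, during which nobody
# else on that channel can transmit; the 11ax client needs ~7 ms.
```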
Channel slicing into resource units (OFDMA in 802.11ax) solves some of this, and when you go to higher frequencies like that you can fit more antennas in the same physical space, so you can have something like 16 transmit, 16 receive to combat those airtime issues.
You sound like a US citizen. There are many places in the world where walls are made of concrete. 5GHz doesn’t penetrate concrete.
In such cases, the only way to get 5GHz into every room is to run Cat5 cable through the walls and place an AP in each room.
Running a cable through concrete walls requires a conduit in the wall that was placed there when the house was built! But in many cases, the conduits that exist are too narrow for Cat5 and are already in use anyway.
So to fulfill your idea and still have WiFi, we would need to raze whole cities to the ground and rebuild them.
Unless you are footing the bill and taking care of the CO2 emissions, just learn to disable 2.4GHz on your own router.
The problem with 5GHz is that it doesn’t go through walls very well compared to 2.4GHz, resulting in APs having less range (or having to use several times more power).
The max power a 5 GHz access point puts out is 1 watt, whereas the max on 2.4 GHz is 0.3 watts. You are right, though: you do have to do a better job of centrally locating the access point in your home to get the best performance from it, because otherwise one side will have good WiFi and the other side will have nothing or very weak WiFi.
Edit: Another benefit of that is that if somebody wants to crack your Wi-Fi network, they have to be physically closer to your house to do so. On something like 60 GHz, where the signal doesn’t leave the room you’re in, it’s basically as secure as Ethernet, because an intruder would have to break into your house to even attempt it.
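For anyone who wants the underlying numbers: a minimal free-space path loss sketch (textbook formula only, no walls or antenna gain) shows 5 GHz arriving roughly 7 dB weaker than 2.4 GHz over the same distance. Taking the power figures above at face value, going from 0.3 W to 1 W only buys back about 5.2 dB (10*log10(1/0.3)), which doesn’t even cover that free-space gap, let alone wall losses.

```python
import math

# Textbook free-space path loss: 20*log10(d) + 20*log10(f) - 27.55,
# with distance in metres and frequency in MHz. No walls, no antenna
# gain, so this only shows the baseline frequency penalty.

def fspl_db(distance_m, freq_mhz):
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

for d in (5, 10, 20, 40):
    loss_24 = fspl_db(d, 2437)   # 2.4 GHz, channel 6
    loss_5 = fspl_db(d, 5500)    # middle of the 5 GHz band
    print(f"{d:3d} m: 2.4 GHz {loss_24:5.1f} dB, 5 GHz {loss_5:5.1f} dB, "
          f"gap {loss_5 - loss_24:4.1f} dB")

# The gap is ~7 dB at any distance, before counting wall losses, which
# hit 5 GHz harder still.
```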
Don’t forget it’s 2-way communication so your device also needs to output enough power to reach the AP.
That is a fair point, and mobile devices are going to be hardest hit by that, since they have such small batteries, but laptops and desktops and stuff would be just fine since they are constantly connected to a power source, and can use a card with higher transmit power.
Until I can get a decent 5GHz signal on the other side of a wall from the router, I can’t do without 2.4GHz.
What are your walls made of, lead? Because mine are about a foot of brick, and it still gets through.
Don’t know about OP, but my 1950s home is a half inch of plaster over chicken wire over wooden lattice. My options are 2.4GHz or Ethernet, and Ethernet for phones is problematic.
That chicken wire could have been intentionally designed to absorb 5 GHz signals; it’s death to them. Literally any other material would be fine up to 3 rooms away, depending on the noise floor in the space. 6 GHz *might* be able to punch through depending on the width of the space between the wires, though, and might be worth exploring in your case.
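For reference, the comparison that matters here is each band’s wavelength against the size of the mesh openings; a quick sketch (assuming a nominal ~25 mm chicken-wire opening, which is a typical figure rather than a measurement of that wall):

```python
# Wavelength of each WiFi band vs. an assumed ~25 mm chicken-wire opening
# (a typical figure, not a measurement of that particular wall). Rule of
# thumb: a conductive mesh reflects strongly when its openings are much
# smaller than the wavelength and leaks more as the opening becomes a
# sizeable fraction of it. Real walls also add plaster, wire thickness
# and moisture, so treat this as a rough way to reason about "the width
# of the space between the wires", nothing more.

C = 299_792_458        # speed of light, m/s
MESH_OPENING_MM = 25   # assumption: roughly 1-inch chicken wire

for band_ghz in (2.4, 5.5, 6.5):
    wavelength_mm = C / (band_ghz * 1e9) * 1000
    ratio = wavelength_mm / MESH_OPENING_MM
    print(f"{band_ghz:3.1f} GHz: wavelength ~{wavelength_mm:5.1f} mm, "
          f"~{ratio:3.1f}x the mesh opening")
```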
Chicken wire in walls is something that came WAY before wifi. It’s used for plaster in much the same way rebar is used for concrete.
No, I know, but my point was that if you were designing a wall material to block 5GHz, you would end up with plaster on wire mesh. Couldn’t have been better if it were on purpose.