I don't get what the point of the article is. Is the takeaway that I should lower the channel width in my home? How many WAPs would I need to be running for that to matter? I'd argue it's more important to get everyone to turn down TX power in cases where your neighbors in an apartment building are conflicting. And that's never going to happen, so just conform to the legal limit and your SNR should be fine. Anything that needs to be high performance shouldn't be on wifi anyway.
If you want to spend a really long time optimizing your wifi, this is the resource: https://www.wiisfi.com/
This sort of thing is definitely in the class of "are you experiencing problems? if not don't worry about it".
If you are experiencing problems, this might give you an angle to think about that you hadn't otherwise, if you just naively assume wifi is as good as a dedicated wire. Modern wifi has an awful lot of resources, though. I only notice degradation of any kind when I have one computer doing a full-speed transfer to another for quite a while, but that's a pretty exceptional case and not one I'm going to run more wires for, given it happens less than once a month.
The takeaway is that you'll probably experience more reliable wifi if you turn your 5GHz channel width down to 40MHz, and especially if you make sure your 2.4GHz width is 20MHz, not 40MHz. As noted, you can't do anything about the neighbors, but making these changes can improve your reliability. And I think the larger takeaway is that if manufacturers just defaulted to 40MHz width on 5GHz, like enterprise equipment does, wifi would be better for everyone. But if your wifi works great, then there's no need.
Also that's an amazing resource, thanks for linking.
2.4GHz wifi at 40MHz squats on literally half of the usable channels. Sure, you got a speed improvement (very likely you now get 100Mbps), but if you just disabled 2.4GHz and forced 5GHz you would get the exact same improvement without polluting half of the available frequencies.
Add another idiot sitting on channel 8 or 9 and the other half of the band is also polluted. Now even your mediocre IoT devices that cannot use 5GHz are going to struggle for signal, and instead of the theoretical 70/70Mbps you could get off a well-placed 20MHz channel, you're lucky to get 30.
Add another 4 people and you cannot make a FaceTime call without disabling wifi or forcing 5GHz.
I lose wifi signal consistently in my bedroom on my 80MHz-wide 5GHz wifi.
I just now reduced it to 20MHz, and though there is a slight perceptible increase in latency, those 5 extra dB of signal-to-noise ratio have given me wifi in the bedroom again.
*If the bandwidth of the Analog Front End (AFE) and Analog to Digital Converter (ADC) / Digital to Analog Converter (DAC) doubles as well. In the real world the AFE of any wifi radio has a fixed bandwidth, with the ADC sampling rate and accuracy being fixed as well. The end result is that doubling the channel width in a wireless network requires a received signal strength that is roughly 3 dB more in real world devices. This constraint is quite visible in data sheets for most wifi cards like here: https://compex.com.sg/wp-content/uploads/2024/01/wle7002e25-...
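The arithmetic behind that is worth spelling out: even in the ideal thermal-noise-only case (ignoring the fixed AFE/ADC limits described above), widening the channel raises the noise floor, so you need a proportionally stronger signal for the same SNR. A quick sketch of the numbers:

```python
import math

def noise_floor_dbm(bandwidth_hz: float) -> float:
    """Thermal noise floor at room temperature: -174 dBm/Hz + 10*log10(B)."""
    return -174 + 10 * math.log10(bandwidth_hz)

# Doubling the channel width (20 MHz -> 40 MHz) raises the noise floor by
# 10*log10(2) ~= 3 dB, so the received signal must be ~3 dB stronger to
# keep the same SNR -- before the real-world AFE/ADC penalties even apply.
delta_double = noise_floor_dbm(40e6) - noise_floor_dbm(20e6)
print(f"20 -> 40 MHz: +{delta_double:.2f} dB noise floor")

# Narrowing from 80 MHz down to 20 MHz (a 4x reduction) buys back
# 10*log10(4) ~= 6 dB of SNR in this idealized model.
delta_narrow = noise_floor_dbm(80e6) - noise_floor_dbm(20e6)
print(f"80 -> 20 MHz: -{delta_narrow:.2f} dB noise floor")
```

This lines up with the bedroom anecdote above: going from 80MHz to 20MHz gives roughly 6 dB back in theory, and about 5 dB observed.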
Wow! There are certain areas of my house where I get such a bad wifi signal that I often switch to cellular data since it's more reliable. I didn't even know you could change a setting like this to reduce speed but improve reliability. It worked like a charm, thanks!