> USB-IF is made up of manufacturers. The primary goal is lower BOM. The confusion is the point. Making everything optional is the point.
Yep. If you start looking at it from the point of view of who benefits and who makes the standards, it makes a lot more sense.
The physical connector is terrible and ends up being the first thing to break on a lot of electronics. The labelling is confusing and no one knows what to buy, so you always end up buying more than you need. Motherboard manufacturers were happy to re-label their marketing materials on old boards to say USB 3.1 so they could sell them to unsuspecting consumers that were tricked into thinking 3.1 > 3.0. It goes on and on.
Look at the big picture and it starts to feel like the goal is to create a bad design because it means the manufacturers can sell us more junk.
> The physical connector is terrible and ends up being the first thing to break on a lot of electronics.
The USB consortium already botched that badly with mini-A, and learned from that in the design of micro-A and then C. Type C was also influenced heavily by Apple's experience with Lightning (Apple is a core member of the USB consortium).
The physical part of Type C is quite good: the nice grounding shell (preferable IMHO to Lightning's), the high-frequency design, and most importantly (per your comment) the fact that failure is mostly directed to the cable, not the jack (one of mini-A's problems). A bad cable can be replaced, while a damaged jack these days means replacing the device.
I meant the jack. I know a lot of people that try to use their phones for 3+ years and I get my own phones used from a friend I know that owns a shop that does phone repairs. Maybe I don’t have a big enough sample, so it’s too subjective, but the jack fails plenty based on what I see in my social circle.
AFAIK the Essential phone was brutal for jack failures. My friend (repair shop owner) gave me an LG V60 this year and basically said it’ll be great until the USB-C port wears out.
My newest iPad with USB-C seems like the jack is already flaky while old iPads with lightning ports are all good even with the abuse they have to endure from kids they were handed down to.
I have so little faith in USB-C ports that I’ve started trying to buy magnetic adapters for all my stuff.
People break the stupid wafer trying to clean them.
One of my favorite little features of the M1 MacBook is that the charging cable is only connected by contact enforced by magnets so that there isn't even a jack that can be damaged. Pulling on the charging cable simply breaks the connection.
It’s come and gone though, hasn’t it? I seem to recall Apple switched to something else (USB-C?) and a few here on HN were annoyed because MagSafe was so good - I still have a ~2013 MacBook Air with MagSafe that I use in the kitchen. If it’s back, then that’s great news for Apple laptops - I haven’t paid attention to them for a while.
Apple has shown that they are not interested in bettering the cable situation and just want you to buy overpriced cables and lock you in. If they were, they would have licensed/opened MagSafe, Thunderbolt, etc.
They could licence the magsafe tech to other brands without allowing them to sell cables/chargers for Apple devices, maybe even alter the size/shape as a requirement.
Not so sure about that. I have a phone with USB-C and it's the first time I've ever had a jack fail (after about 2 years of nightly charging). Luckily it's a fairphone, so I can replace the bottom assembly cheaply.
Except I, like many people, gave up on USB because it’s a mess, so I don’t care what standard is on the motherboard whatsoever. I’m not buying USB devices that require higher standards, which means no selling me replacement cables, etc.
Wireless charging or dedicated cables are just less painful than power over USB. The idea of running video signal over USB is just cringe inducing.
Wirelessly charging a cellphone uses so little power that its ~70% efficiency is practically irrelevant. A single USB wall charger that wastes 0.5W when not in use consumes about 4x as much energy as is wasted by wirelessly charging your cellphone.
As to recharge speeds, it’s easily comparable with USB. Sometimes USB charging is noticeably faster, but use the wrong charger and USB will be far slower.
70% efficiency means drawing 43% more power. Your article says “47% more power than a cable” at the top, which is a little worse, but wait: “In general, the propped-up design helped align the coils without much fiddling, but it still used an average of 19.8 Wh, or 39% more power, to charge the phone than cables.” That’s better than 70% efficiency. So ~70% efficiency is spot on.
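The conversion between an efficiency figure and “X% more power than a cable” in the numbers above can be checked directly. A minimal sketch (the helper name is mine, and it assumes the wired baseline is treated as ~100% efficient, as the comment implicitly does):

```python
# Convert "X% more power than a cable" into a transfer-efficiency figure.
# Assumption: the wired baseline counts as 100% efficient, so total energy
# drawn is (1 + overhead) times the useful energy delivered.
def efficiency_from_overhead(pct_more_power):
    return 1 / (1 + pct_more_power / 100)

for pct in (39, 43, 47):
    print(f"{pct}% more power -> {efficiency_from_overhead(pct):.0%} efficient")
# 39% -> ~72%, 43% -> ~70%, 47% -> ~68%, bracketing the ~70% claim
```

So the article’s 39% and 47% overhead figures correspond to roughly 72% and 68% efficiency, which is why ~70% is a fair summary.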
Wireless charging can do 15+W depending on the phone.
As to the scare quote about 3.5 billion phones: the charger they mentioned consumes 0.25W when the phone isn’t attached, or 6Wh per day, which is about the same energy as it wastes charging a 12Wh battery each day. You can find many wired wall chargers that waste more than 0.5W when the phone isn’t charging, so using one would actually take more energy than using that wireless charger for a single phone.
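The back-of-envelope above can be sketched with the comment’s own figures (0.25W idle draw, a ~12Wh daily charge, and the ~70% efficiency assumed earlier in the thread):

```python
# Rough daily energy waste for a wireless pad, using the comment's numbers.
STANDBY_W = 0.25    # idle draw of the wireless charger, from the comment
BATTERY_WH = 12.0   # energy delivered to the battery per day (assumed)
EFFICIENCY = 0.70   # assumed wireless transfer efficiency

standby_wh_per_day = STANDBY_W * 24                      # idle waste: 6.0 Wh
charging_loss_wh = BATTERY_WH / EFFICIENCY - BATTERY_WH  # transfer loss: ~5.1 Wh
print(f"idle waste:     {standby_wh_per_day:.1f} Wh/day")
print(f"charging waste: {charging_loss_wh:.1f} Wh/day")
```

Idle waste (~6 Wh/day) and transfer loss (~5.1 Wh/day) come out roughly equal, which is the comparison the comment is making against a wired charger idling at 0.5W (~12 Wh/day).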
Wireless isn’t “close enough” for charging larger phone batteries.
I travel a lot for business and I will soon be doing the digital nomad thing flying across the United States. I use a portable monitor that is both powered via USB C and does video over USB C.
A $40 cable with 5 dongles is a great argument that USB is fine.
As to wireless, an iPhone 13 Pro Max can do 15W wireless vs up to 27W with the right USB charger, or 2W over USB if you happen to plug into the wrong charger without thinking about it. In practice I only charge overnight or while driving, so it’s hard to see a practical difference.
> The idea of running video signal over USB is just cringe inducing.
Although the cable can passively fall back all the way to USB 2.0, beyond that the only thing these cables have in common with USB is the name. It's a very high speed serial bus with insane tolerances and complex protocols (even the cables have CPUs in them). So no need to cringe, except at the absurd naming conventions of the USB consortium.
Of course this also means there aren't many chipsets, unfortunately.