> The press release says that the update is “specifically targeted to developers at this time,” with final branding and marketing guides (including things like logos) to come later.
I choose to believe that there is still time to fix all this.
Here is what they need to do:
* Specify a set of power tiers: 15W, 30W, 100W.
* Specify a set of bandwidth tiers: 10Gbps, 20Gbps, 40Gbps, 80Gbps.
* Specify a set of icons representing protocol capabilities (e.g. “DP” for DisplayPort).
* Mandate that every port and cable is labelled with those two numbers and any associated icons.
* Call it USB 5.
* Drop all the “2x2” and “version 2” and “Gen2” and “SuperSpeed+” terminology.
Apologies if I forgot some dimension of USB functionality.
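For concreteness, here is a minimal sketch of what machine-readable data behind that labelling mandate could look like. The tier values and the icon example are just the ones proposed above, nothing USB-IF has actually defined:

```python
from dataclasses import dataclass

# Hypothetical tiers from the proposal above -- not official USB-IF values.
POWER_TIERS_W = (15, 30, 100)
BANDWIDTH_TIERS_GBPS = (10, 20, 40, 80)

@dataclass(frozen=True)
class PortLabel:
    power_w: int
    bandwidth_gbps: int
    icons: tuple = ()          # e.g. ("DP",) for DisplayPort

    def __post_init__(self):
        if self.power_w not in POWER_TIERS_W:
            raise ValueError(f"power must be one of {POWER_TIERS_W}")
        if self.bandwidth_gbps not in BANDWIDTH_TIERS_GBPS:
            raise ValueError(f"bandwidth must be one of {BANDWIDTH_TIERS_GBPS}")

    def __str__(self):
        return " ".join([f"{self.power_w}W", f"{self.bandwidth_gbps}Gbps", *self.icons])

# A port that charges at 100W, carries 40Gbps, and supports DisplayPort:
print(PortLabel(100, 40, ("DP",)))   # -> 100W 40Gbps DP
```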
USB-IF is made up of manufacturers. The primary goal is lower BOM. The confusion is the point. Making everything optional is the point.
USB-IF is not going to adopt a standard that requires every manufacturer to retool or redesign their Widget in order to slap a USB 4 label on it. Why would they vote for that? The whole 2x2/Gen2/etc thing was so some of the members could do some minor revisions then keep shipping their existing chips and products without them being considered "old", with a bit of "design by committee" mixed in.
USB-IF doesn't give two shits what you or I or anyone else thinks about it, nor what our actual experiences with it are. They care about how much it costs. And by cost I mean a $0.01 difference per unit is considered a significant savings.
> USB-IF is made up of manufacturers. The primary goal is lower BOM. The confusion is the point. Making everything optional is the point.
Yep. If you start looking at it from the point of view of who benefits and who makes the standards, it makes a lot more sense.
The physical connector is terrible and ends up being the first thing to break on a lot of electronics. The labelling is confusing and no one knows what to buy, so you always end up buying more than you need. Motherboard manufacturers were happy to re-label their marketing materials on old boards to say USB 3.1 so they could sell them to unsuspecting consumers that were tricked into thinking 3.1 > 3.0. It goes on and on.
Look at the big picture and it starts to feel like the goal is to create a bad design because it means the manufacturers can sell us more junk.
> The physical connector is terrible and ends up being the first thing to break on a lot of electronics.
The USB consortium already botched that badly with mini-A, and learned from that in the design of micro-A and then C. C was also influenced heavily by Apple's experience with Lightning (Apple is a core member of the USB Consortium).
The physical part of type C is quite good: the nice grounding shell (preferable IMHO to Lightning), the high-frequency design, and most importantly (per your comment) failure is mostly directed to the cable, not the jack (one of mini-A's problems), since a bad cable can be replaced while a damaged jack these days means replacing the device.
I meant the jack. I know a lot of people that try to use their phones for 3+ years and I get my own phones used from a friend I know that owns a shop that does phone repairs. Maybe I don’t have a big enough sample, so it’s too subjective, but the jack fails plenty based on what I see in my social circle.
AFAIK the Essential phone was brutal for jack failures. My friend (repair shop owner) gave me an LG V60 this year and basically said it’ll be great until the USB-C port wears out.
My newest iPad with USB-C seems like the jack is already flaky while old iPads with lightning ports are all good even with the abuse they have to endure from kids they were handed down to.
I have so little faith in USB-C ports that I’ve started trying to buy magnetic adapters for all my stuff.
People break the stupid wafer trying to clean them.
One of my favorite little features of the M1 MacBook is that the charging cable attaches only by contacts held in place by magnets, so there isn't even a jack that can be damaged. Pulling on the charging cable simply breaks the connection.
It’s come and gone though hasn’t it? I seem to recall Apple switched to something else (USB-C?) and a few here on HN were annoyed because MagSafe was so good - I still have a ~2013 Macbook Air with magsafe that I use in the kitchen. If it’s back then that’s great news for Apple laptops - I haven’t paid attention to them for a while
Apple has shown that they are not interested in bettering the cable situation and just want you to buy overpriced cables and lock you in. If they were, they would have licensed/opened MagSafe, Thunderbolt, etc.
They could license the MagSafe tech to other brands without allowing them to sell cables/chargers for Apple devices, maybe even alter the size/shape as a requirement.
Not so sure about that. I have a phone with USB-C and it's the first time I've ever had a jack fail (after about 2 years of nightly charging). Luckily it's a fairphone, so I can replace the bottom assembly cheaply.
Except that I, like many people, gave up on USB because it's a mess, so I don't care what standard is on the motherboard whatsoever. I'm not buying USB devices that require the higher standards, which means no selling me replacement cables, etc.
Wireless charging or dedicated cables are just less painful than power over USB. The idea of running video signal over USB is just cringe inducing.
Wirelessly charging cellphones uses so little power that the ~70% efficiency is practically irrelevant. A single USB wall charger that wastes 0.5W when not in use consumes about 4x as much energy as is wasted by wirelessly charging your cellphone.
As to recharge speeds, it’s easily comparable with USB. Sometimes USB charging is noticeably faster, but use the wrong charger and USB will be far slower.
70% efficiency requires 43% more power. Your article says “47% more power than a cable” at the top, which is a little worse, but wait: “In general, the propped-up design helped align the coils without much fiddling, but it still used an average of 19.8 Wh, or 39% more power, to charge the phone than cables.” That’s better than 70% efficiency. So ~70% efficiency is spot on.
Wireless charging can do 15+W depending on the phone.
As to the scare quote about 3.5 billion phones: the charger they mentioned consumes 0.25W when the phone isn’t attached, or 6Wh per day. That’s about the same energy as it wastes charging a 12Wh battery once a day. You can find many wired wall chargers that waste more than 0.5W when the phone isn’t charging, so using one would actually take more energy than using that wireless charger for a single phone.
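Rough numbers behind that comparison, assuming the figures above (the 12Wh battery and the 0.25W idle draw are from this comment, the ~70% efficiency from the earlier one):

```python
battery_wh = 12      # roughly a phone-sized battery, per the comment above
efficiency = 0.70    # wireless charging efficiency estimate from upthread
idle_draw_w = 0.25   # the charger's draw with no phone attached, per the comment

wasted_per_charge = battery_wh / efficiency - battery_wh   # energy lost to the coils
wasted_idle_per_day = idle_draw_w * 24                     # energy lost doing nothing

print(f"wasted per wireless charge: {wasted_per_charge:.1f} Wh")   # ~5.1 Wh
print(f"wasted idling per day:      {wasted_idle_per_day:.1f} Wh") # 6.0 Wh
```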
Wireless isn’t “close enough” for charging larger phone batteries.
I travel a lot for business and I will soon be doing the digital nomad thing flying across the United States. I use a portable monitor that is both powered via USB C and does video over USB C.
A $40 cable with 5 dongles is a great argument that USB is fine.
As to wireless, an iPhone 13 pro max can do 15W wireless vs up to 27W with the right USB charger or 2W USB if you happen to plug into the wrong charger without thinking about it. In practice I only charge overnight or while driving so it’s hard to see a practical difference.
> The idea of running video signal over USB is just cringe inducing.
Although the cable can passively fall back all the way to USB 2.0, apart from that the only thing that these cables have in common with USB is the name. It's a very high speed serial bus with insane tolerances and complex protocols (even the cables have CPUs in them). So no need to cringe, except at the absurd naming conventions of the USB consortium.
Of course this also means there aren't many chipsets, unfortunately.
> USB-IF is made up of manufacturers. The primary goal is lower BOM. The confusion is the point. Making everything optional is the point.
That's an unfortunate truth and it happens to other standards as well, e.g. HDMI:
> "Products can no longer get certified for 2.0 only for 2.1, and also 2.1 features are optional to implement, so popular features like 4k120, ALLM, VRR are not required," Brad Bramy, VP of marketing and operations for the HDMI LA, confirmed to Ars. "Manufacturers could only implement eARC, for example, and claim to be a 2.1-enabled device."
> And by cost I mean an $0.01 difference per unit is considered a significant savings.
Could someone ELI5 this line of thinking? Exactly how is a $0.01 saving so consequential? Is a better-performing $9.99 cable too expensive for some people but a $9.98 cable within their budget somehow? Is each manufacturer manufacturing 100 million cables and saving $1M here? What is the logic?
The cost of these cables is sub-dollar. Like probably less than $0.50. Let's say $0.50 for argument's sake. Bringing the cost down a penny means 2% more sellable product for the same input materials. And when the margins are in the $10 range, that's a huge amount of leverage. They're not passing the $0.01 savings along to their customers. Or even if they are (and say, charge you $0.01 less), it's still a huge differential after you consider the 20x margin.
Quick math:
Let's say 1 million cables at $0.50. That costs you $500k. If you get the cost down to $0.49, you just enabled another 20k cables to be manufactured, and another $200k in profit.
That extra $200k you made by saving a cent literally bought you your next $4M profit if you reinvest it. Not a rounding error in any way when you're reinvesting at such a massive margin.
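Spelled out with the same illustrative numbers ($0.50 vs $0.49 cost, $10 sale price; none of these are real BOM figures):

```python
price = 10.00                       # illustrative sale price
old_cost, new_cost = 0.50, 0.49     # cost per cable before/after the 1-cent saving
budget = 1_000_000 * old_cost       # the $500k you were going to spend anyway

extra_units = budget / new_cost - budget / old_cost   # ~20,408 extra cables
extra_revenue = extra_units * price                   # ~$204k from the same spend

# Reinvest that extra revenue into yet more cables:
next_revenue = (extra_revenue / new_cost) * price     # ~$4.2M

print(f"extra cables: {extra_units:,.0f}")
print(f"extra revenue: ${extra_revenue:,.0f}, reinvested once: ${next_revenue:,.0f}")
```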
The $0.01 savings is for the maker, and not for the buyer.
For a typical cable, it's probably around $0.30 to $0.50 to make, and they tend to sell between $5 and $15, depending on the spec.
For simplicity's sake, we'll assume $0.50 cables that sell for $10. Now, if you save $0.01 in manufacturing, so the cost is now $0.49, and if you sell say 100k cables (feels like a believable amount) then you've saved a cool $1000 in making them. So your profits are $1000 larger now, and you can pat yourself on the back. Good job!
The kicker then is how those profits can be used to make MORE cables. And $1000 of extra profits means $1000 of extra cables. Just over 2000 more, in fact, which you can also sell for $10 each.
Those extra 2000 cables have now earned you a cool $20k of profit ALONE, from only $1000 saved dollars, which itself was from only a single saved cent! Amazing job!
If you keep repeating this, you can probably start to see how this begins to run away:
Even starting with only $1 to make $0.49 cables, you end up earning over $1M on your 19th batch, even if you were spending only 10% of your sales to make new batches of cables and pocketing the other 90%! You actually pocket $1.5M on that batch, and then only spend $167k on the next batch, which promises to earn you even more!
Meanwhile, your competitor making $0.50 cables, starting with a similar $1 and 10% to new batches scheme, would only get to pocket $1.1M on their 19th batch! That $400k is a pretty tidy bonus you've earned.
On your 20th batch, you're selling 80k cables, but your competitor can only afford to sell 60k. We've not even reached the 100k-a-batch mark, let alone a million cables, and you're already way ahead.
And that's compound interest, folks! From $1, saving only a single cent, we can earn $400k more than we would have otherwise. Outstanding.
EDIT: I took so long to write this that I see andrewstuart2 has summed up the first half of the story far more concisely than I. Nicely done! I feel that compound interest is important for seeing the bigger picture of profit over time, so I'll keep this comment as is.
This isn't a super unusual markup. There's a huge amount of other costs which bring the overall margin down (e.g. even if you just looked at the markup retailers use you'd think everyone should be in retail or trying to bypass them, until you look into the cost of warehousing, warranty returns, unsold product, 'shrinkage', etc).
Those numbers are marginal costs only, the manufacturers also have substantial fixed costs. Anyhow, there already are lots and lots of USB cable brands, although I suppose many of them are just rebranding bulk cable products.
That's a 95% gross margin, and in reality the manufacturer is not going to be selling at retail prices. A lot of the available margin is going to go to the retailer.
There are two main groups of players: cable manufacturers and device manufacturers. Cables are a commodity and most of the value is consumed by retail. Manufacturers compete against each other through tiny optimizations and huge scale. Unless there's someone on the retail side who can actively sell more expensive cables despite consumers having very little ability to distinguish them, the factories will cost-optimize to death and beyond.
On the device side, the ability to get those few percent extra profitability is the difference between a failed product and a successful one to higher ups. If you give a TPM the option to choose between $0.05 off the BOM and theoretical improvement to reliability, they'll choose the former almost every time.
I remember one product launch that turned into a disaster because it had been cost optimized so hard that the circuit would go slightly out of spec under physical strain and it turned into a yelling match over the few cents of additional plastic in the housing to mitigate the issue.
A penny here, a penny there, and soon you're talking something like $0.15/cable.
If you sell a million cables, that's $150k.
But there's also the fact that people usually buy stuff based on price. If you go on Amazon, as a consumer, you're likely to buy the $3.99 cable over the $4.99 cable.
And with the profit margins necessary that $0.15 cost savings can easily translate to a $1-$1.50 price difference for the end consumer.
Did you just ignore the entire point of the parent's comment? It's not a technical problem; it's that the companies/people involved don't want to solve it, because it doesn't benefit them.
They already put capabilities on the box. The problem is that it's not visible to the end user what is required. Simple color bands would help a lot.
This problem can never be solved. Where there are improvements over the same connector, you need some way to identify them: either with a simple solution such as colors or markings, or with on-screen / audio feedback.
What it needs is extensions that go up to 3 kW so that it can replace power outlets worldwide.
It's totally possible to deliver 3kW in the form factor of USB-C safely. Obviously there needs to be extra circuitry in both ends of the connection, but that extra circuitry should still be cheaper than today's thick dumb AC cables.
One mismanufactured cable away from a fire, I imagine. There’s a review on Amazon by a (at least at the time) Google engineer of a specific USB-C cable that would burn out the USB-C port on MacBooks because the cable wasn’t manufactured well. In fact, that guy has many, many reviews on Amazon of the various defects in various cables (he stopped once that cable burned out his port).
Did some napkin math for fun. There are 12 contacts spaced within 6mm inside a USB-C cable, on each side. Assuming each contact is 0.25mm wide with a 0.25mm gap between each, it would only take somewhere around 1000V to simply arc through the air between contacts. And anything over 5 amps is going to start causing resistive heating simply due to how small and thin the wires and contacts are. So maybe, if you built it and tuned it really well, you could push up to around 5kW through a USB-C, with it being on the brink of shorting itself out and/or melting itself.
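Back-of-the-envelope version of that estimate, using ~3 kV/mm as the usual rule-of-thumb dielectric strength of air (the 0.25 mm gap and the ~5 A heating limit are the comment's assumptions); it lands in the same few-kW ballpark:

```python
gap_mm = 0.25          # assumed spacing between adjacent USB-C contacts (see above)
air_kv_per_mm = 3.0    # rule-of-thumb breakdown field of dry air

arc_voltage = gap_mm * air_kv_per_mm * 1000   # ~750 V before it arcs over
max_current_a = 5      # roughly where resistive heating gets ugly, per the comment
ceiling_w = arc_voltage * max_current_a       # ~3.8 kW theoretical ceiling

print(f"arc-over around {arc_voltage:.0f} V; at {max_current_a} A "
      f"that's a ceiling of about {ceiling_w / 1000:.1f} kW")
```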
And 10,000 volts build up on your body when you rub a balloon on your head...
With high voltages, it's all down to how many joules you allow to be dissipated in a human, i.e. current × voltage × time. Specifically, anything over about 0.01 joules is dangerous. To use high voltages in a consumer product you need to limit either the current or the time, and you need to do it in a way that is very robust to being chewed on by a baby in a bath. And it needs to be robust to that even when built by a corner-cutting factory.
One approach to that is to have either end of the link 'test' the other end, i.e. simulate a human touching the wire. If power isn't turned off fast enough to be in spec, then the remote end or cable isn't working correctly and shouldn't be used.
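To get a feel for how fast that cut-off would have to be, here is the comment's own 0.01 J threshold combined with a commonly quoted ~1 kΩ body resistance (both are rough illustrative assumptions, not safety-standard values):

```python
energy_limit_j = 0.01       # the danger threshold claimed above
body_resistance_ohm = 1000  # rough wet-skin figure, purely illustrative

for volts in (48, 400, 1000):
    current_a = volts / body_resistance_ohm
    power_w = volts * current_a
    cutoff_s = energy_limit_j / power_w
    print(f"{volts:4d} V -> cut power within {cutoff_s * 1e6:7.1f} microseconds")
# 48 V -> ~4340 us, 400 V -> ~62.5 us, 1000 V -> 10 us
```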
Wattage is a product of voltage and amperage, so the question would be what the voltage supply is (either 100~120V or 200~240V depending on region) and how much amperage the cable and connections can handle.
To supply 3000W with a 110V supply would require around 28A, which will trip any 10A or 15A breaker commonly found on most house lines going to electrical outlets, but it's well within the realm of something that could be supplied by a circuit equipped with a 30A or higher breaker and qualified wiring.
Of course at 30A or higher, we're talking about things that connect via those bulky connectors you would find on dryers and other high load electronics. Bulky for your safety, mind you.
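For reference, the current draw is just I = P/V; the breaker ratings below are the commonly cited household ones, for illustration:

```python
target_w = 3000
for volts, typical_breaker_a in ((110, 15), (230, 16)):
    amps = target_w / volts
    print(f"{target_w} W at {volts} V -> {amps:.1f} A "
          f"(typical household breaker: {typical_breaker_a} A)")
# 3000 W at 110 V -> 27.3 A  -- trips a typical 15 A circuit
# 3000 W at 230 V -> 13.0 A  -- fits on a typical 16 A circuit
```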
> To supply 3000W with a 110V supply would require around 28A, which will trip any 10A or 15A breaker commonly found on most house lines going to electrical outlets
The utter, utter majority of the world runs on something around ~230V [1], except America and Japan.
Sort of. Outside of industrial or businesses, my understanding is that it is rare to see multi-phase power (400V) used in power applications. You may get 3 phases into your house, but the phases are used separately (each at 240V) for most things in residential.
No, it's not. USB-C doesn't have anywhere near the appropriate amount of creepage/clearance allowance for something high-voltage enough to handle 3kW. The only way to handle 3kW in a cable is to completely change the port, at which point it's no longer USB-C.
> Mandate that every port and cable is labelled with those two numbers and any associated icons.
That's the crux. Your first two points already exist, but no one is required to use them, so no one does.
The only way to mandate this is to make USB a standard that is not open as-is (ideally patent-protected). But that openness is the reason USB succeeded, and changing it isn't a good idea.
* SuperSpeed covers the bandwidth tiers - 5, 20, 40, and soon 80 Gbps. High Speed and Basic Speed are for older and/or simpler devices (defined by USB 2.0).
* They do specify power tiers: 15W is the minimum for chargers, 30W I believe is the minimum for power delivery cables, with cables being additionally certified (and marked) for 60W, 120W and 240W.
* A certified cable with USB-C ports on both ends should be a USB cable, full-stop. Negotiation of alt modes is up to the devices on both ends.
The problem is that there are a lot of vendors that use garbage terms like USB 3.2 2x2, which were never meant for the market at all. I suspect this is usually a consequence of those products not actually being certified, and thus avoiding using the official marks.
IMHO, that there was anything publicly talking about "USB 3.2 2x2" is a side effect of the USB-IF not having the teeth to enforce use of the "USB" mark. Vendors would start using the right terms and running basic conformance if, say, they found they were no longer able to get their products imported. But the Implementers Forum is made up of the same vendors who are shipping broken devices, cheap charging cables, and motherboards with spec-invalid ports.
When many people talk about USB 3.2, they assume there is some sort of semantic versioning of the ports or cables themselves. That's not how specs work - everything which was USB 3.0 compliant is USB 3.2 compliant. If you want to mandate new things or break old ones, you create a new version like USB4.
Many people have the same problem with HDMI 2.1a - there is no device certified with HDMI 2.0 that is incompatible with HDMI 2.1a. Thus, advertising 2.1a support means _nothing_ compared to advertising HDMI 2 support in general. It was a marketing failure that the HDMI licensing organization allowed vendors to use version numbers to imply _anything_, vs advertising that the devices you are connecting support features like 8K/60 or VRR.
There was a mandate for USB type-A cables to have a tactile USB logo on the "top" side, so that you could always feel when it was in the correct orientation.
In the current generation I've seen devices with "DP" used to mean "Power Delivery" instead of "DisplayPort"
Any mandate that is possible to ignore or misinterpret will just add noise.
most of the cables I used did have that... but the "top" side was only ever usable with laptops... lots of desktops had the ports upside down, and other USB hosts like monitors, TVs, chargers, cars frequently have the ports sideways, or top/bottom...
* Somehow get all manufacturers to only create quality cables complying with the standard whose capabilities match what they say they do.
That won’t happen. Some manufacturer will start making a cable that’s longer than what the spec says is possible, or will try to get away with a bit less shielding or a bit less conductor in the cables than what the spec requires, or will decrease quality control, so that their cables are fine, on average, but 1 in 1000 buyers will get a lemon that’s sub-spec.
If you buy a certified USB4 cable, it has the markings and should work as you expect it to work.
If you buy a cheap Chinese cable, it won't have any markings or it has wrong markings. It won't be possible to mandate that every cheap Chinese cable has correct markings. It would require Amazon or Walmart to do some kind of quality checking for all their sellers, which won't happen. They allow SSD scams, so they also allow USB scams. https://news.ycombinator.com/item?id=32628381
How much more would each cable cost if they all had to do the max feature set? 100W, 80Gbps, support for everything.
Is there anything that requires them to not all support the max feature set, apart from cost - some use case that would break, or be dangerous, or be mutually incompatible with another feature?
A minimum cable -- USB 2.0 and 3 amps only -- can be considerably thinner. You don't need the extra USB 3 wires and the power wires don't need to be nearly as thick.
It isn't just a question of cost -- if you want a long cable to charge a phone, you'd probably prefer it to be thinner and more flexible, and you don't need the extra power.
> Mandate that every port and cable is labelled with those two numbers and any associated icons
lol there are still HDMI cables without an HDMI logo on them. There are still plain old full speed USB cables without the USB logo on them. China alone will destroy this single requirement and laugh as they sell hundreds of millions of cables per year.
I really do wish it were possible to mandate this. Things would be easier.
Someone should just do this for them: maintain a database with Amazon etc. links and validated power and bandwidth tiers that applies this naming. If enough people only buy products that are tested on that site, manufacturers won't be able to ignore that demand without it affecting their bottom line.
If I design a standard and call it "USB 7.1 UltraSpeed++ 200Gbps Lightspeed Pro C, 4x4 64-QAM 100W PD with DisplayPort and PCI Express, Type R, brought to you by SalesForce" and the media refer to it as "USB 7" - is the media really at fault?
Since stuff keeps getting renamed or added I vote to have a requirement to have a display on each cable indicating to me it's capabilities. The manufacturer should then be required to update the firmware on this cable to match any changes. /s
I would make it dead simple: A whole new version number for any difference from the former; starting from USB3.
* USB3.1Gen1? USB3 (because it literally is).
* USB3.1Gen2? USB4.
* USB3.2Gen1? USB3 (because it literally is).
* USB3.2Gen2? USB4.
* USB3.2Gen1x2? USB5.
* USB3.2Gen2x2? USB6.
* USB4Gen2x1? USB7.
* USB4Gen2x2? USB8.
* USB4Gen3x1? USB9.
* USB4Gen3x2? USB10.
* USB4v2? USB11.
Power Delivery? Likewise one whole version number (eg: PD100W) based off watts carried, and append that version number to the USB version number.
So for example: USB9 PD100W, USB7 PD30W, etc.
Dead. Fucking. Simple.
The U in USB stands for Universal. The first step to being universal is to be so dead fucking simple even your technologically illiterate grandma can tell things apart.
I hate the Chrome Versioning System, but it is by far dead fucking simple, and USB could use some dead fucking simpleness with its version numbers.
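As a sketch, the proposed collapse is just a lookup table; the left-hand strings are the existing marketing names and the right-hand numbers are this comment's renumbering, nothing official:

```python
SIMPLE_NAME = {
    "USB 3.1 Gen 1":   "USB3",
    "USB 3.1 Gen 2":   "USB4",
    "USB 3.2 Gen 1":   "USB3",
    "USB 3.2 Gen 2":   "USB4",
    "USB 3.2 Gen 1x2": "USB5",
    "USB 3.2 Gen 2x2": "USB6",
    "USB4 Gen 2x1":    "USB7",
    "USB4 Gen 2x2":    "USB8",
    "USB4 Gen 3x1":    "USB9",
    "USB4 Gen 3x2":    "USB10",
    "USB4 v2":         "USB11",
}

def label(marketing_name: str, pd_watts: int = 0) -> str:
    """One version number, plus the PD wattage when the cable carries power."""
    simple = SIMPLE_NAME[marketing_name]
    return f"{simple} PD{pd_watts}W" if pd_watts else simple

print(label("USB4 Gen 3x1", 100))   # -> USB9 PD100W
print(label("USB4 Gen 2x1", 30))    # -> USB7 PD30W
```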
I wonder how long before some group of manufacturers comes out and says “Ah fuck it, just make a new connector with a new baseline level of performance!”
I hope they innovate a way to make the naming scheme even more confusing.
I humbly propose:
USB 4.0Gen 0a (USB 1.0 speeds)
USB 4.0Gen 0b (USB 2.0 speeds)
USB 4.0Gen 1 (also USB 2.0 speeds)
USB 4.0Gen 1z (USB 1.0 speeds again)
USB 4.0 (20 Gbps)
USB 4.0e (40 Gbps)
USB 4.0Gen 2a (also 40Gbps)
USB 4.0Gen 2e (80 Gbps)
USB 4.0Gen 2f (charging only)
This is actually better than the human-readable scheme because then you can go on AliExpress/Google and actually land on the exact product by matching the hash.
With human-readable names, the product pages start to have too much creative freedom like "USB 3.0 3.1 Gen 1 2" which becomes hell for browsing, as now the people buying get to be blamed if they make a wrong purchase
They could also just make all those compatibilities non-optional. Problem solved.
Want USB 2.0 but not USB 3.0 or 4.0? Just put a USB 2.0 port on the damn device.
Want USB 4.0 but faster? Name it USB 5.0. USB 1.0, 1.1, 2.0 all used the same plug and nobody is confused about that. Same applies to USB-C and USB 4: either you implement it (fully) or if you don't want to: implement legacy USB instead.
Maybe we should have thought of that ahead of time. USB-C with USB 3, 4 and Thunderbolt just pushed us back into RS232 DB-9 territory where the plug doesn't mean anything anymore, and while a lot of things might physically fit, you won't be able to predict what's going to happen, or if anything will even happen at all.
That cheap cable probably also has an even cheaper variant that simply sets itself on fire if you put more than 10W through it. And that variant might only support USB 2.0 and as such transfer speeds are so bad people now have to think in terms of 'which one was my slow cable'.
If all cables had to be compliant with all features we would see economies of scale that are never going to happen in the current market.
The high speed cable might never be as cheap as €7, but in a couple years it could be €9-11.
I'm sure at some point high speed cables will be a lot cheaper than they are now, but as long as they make up a small percentage of the market they will command a big premium.
In some respects this is simpler than the current scheme. You don't have the SuperSpeed+ Gen 2x1 and Gen 1x2 variations, which are both the same X Gbps nominal speed, except one uses an older less efficient encoding than the other and is measurably slightly slower in practice
I don’t care about the name. Can you please write the version on the cable? I have a dozen USB-C cables in my drawer and I have no idea what each one is capable of.
1. It's either a regular USB-C cable or a Thunderbolt (USB4) cable. If it's a Thunderbolt cable, it'll have a large, long, square strain-relief section behind the connector (there are active logic components inside this part.) Otherwise it won't.
2. The charging speed is fundamentally constrained by how thick the cable is. So thick cables are very likely fast-charge cables (they wouldn't spend the extra money on lower-gauge copper if they weren't going to need it for something), and thin cables definitely aren't.
Just a caution: I've cut apart several micro-USB cables (using the ends for power, and cables for sensors, etc., just because I have a bin full of them - mostly for esp32-based projects). The thing I've noticed is that cable thickness doesn't line up with conductor thickness. I've opened "large" cables to find tiny 28awg wire inside and a very thick jacket. Maybe it's just my cables and limited sample size, but it's like 50% of the time. A lot of cables in my bin are the random ones that get included in the box with something else.
I so far haven't chopped up any usb-c cables, but I don't see why to expect any different.
> I've opened "large" cables to find tiny 28awg wire inside and a very thick jacket.
In a way, this is actually fine too — the conductor itself can be higher-resistance, as long as there's enough shielding to prevent the heat the conductor puts out from setting anything on fire. (Basically, following the same rules for sheath thickness as in-wall cabling uses.)
The only real practical difference, for most people, will be that anything electrically connected to the "thick sheath, thin conductor" cable will get very hot. (And also, if you use such a cable to connect a device to a portable battery bank, you'll find the battery bank runs dry sooner than expected, since your cable is acting as an additional load.)
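For a feel of the numbers, assuming 28 AWG conductors (roughly 0.21 Ω per metre, the gauge the parent comment found inside its cables) and 3 A, a common USB-C charging current:

```python
ohm_per_m_28awg = 0.21   # approximate resistance of 28 AWG copper
cable_length_m = 1.0
current_a = 3.0          # common USB-C charging current at 5 V

# Current goes out on VBUS and returns on ground, so the loop is twice the length.
loop_resistance = 2 * cable_length_m * ohm_per_m_28awg   # ~0.42 ohm
voltage_drop = current_a * loop_resistance               # ~1.3 V never reaches the phone
heat_w = current_a ** 2 * loop_resistance                # ~3.8 W turned into cable heat

print(f"drop: {voltage_drop:.2f} V, heat: {heat_w:.1f} W")
```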
Small tidbit: for cables that are permanently installed in buildings, the insulation and jacket material is more important than the thickness. Typically these installations are inspected by the local jurisdiction and the electrical code is very specific about what gauge conductors have to be used in which circumstances, so the inspection will generally prevent anything egregious enough to cause spontaneous combustion due to heat from electrical resistance.
The bigger issue is that the insulation material will degrade over time if it's consistently exposed to temperatures that are too high, and at that point it becomes a safety hazard because the conductors are no longer properly protected. So typically cables with a higher temperature rating will have insulation and/or an outer jacket made with different materials.
Doesn't the shielding help prevent heat output only with short duration power use peaks? If we imagine the conductor is heating at the same rate 24/7, all of the heat must flow out of the cable at the same rate it's getting generated unless the conductor's temperature can increase indefinitely.
If the insulator material is truly a (thermal) insulator, and thick enough, then the heat will "prefer" to flow out the ends of the cable rather than making its way through the skin of the insulator. (Remember, electrical conductors are usually also great thermal conductors; an insulated copper cable is essentially a solid-state heat pipe.) Pick up one of these thick-insulation cables, even after hours of operation — the cable itself won't be hot. But the ends sure will be!
For mains leads, this usually means you're "grounding" heat into:
1. the device itself, usually at the power supply. The heat dead-ends between the coils of a transformer, this being why high-wattage PSUs have fans.
2. your wall — specifically, your wall junction-box power sockets, at the point that the device's power lead's male connector touches the faceplate of the wall socket. This is why you'll often see slight electrical singeing on such sockets when you're operating e.g. a portable air conditioner or microwave on a 120V 16A socket: despite most of the heat following the thermal conductive path of the cable into your house wiring, some of the heat ends up transferring into the air where the thicker insulator ends, through the bare contacts, and into the plastic of the wall-socket faceplate.
3. The rest of the house, through the thermal conductive path of the house wiring — the house wiring, being lower-gauge and so needing relatively thinner insulation, leaks heat more readily than the sort of "super-insulated" cables made for high-amperage appliances, and so the heat is gradually sloughed off all over your walls.
4. Your neighbourhood electrical distribution box. Despite heat leaking through your house, modern residential wiring's electrical insulation is still thick enough, that if you're using enough household load to just be barely below tripping your mains breaker, you're probably heating up your neighbourhood distribution box. Ever noticed one humming really loudly when you walk by? Just like your computer's PSU, these have active cooling that goes harder when houses are dumping heat at them.
Note also that this is among the reasons that industrial power routing (for e.g. factories, mills, etc) doesn't use insulated NM in-wall wiring (Romex), but instead routes individual [lightly-insulated, essentially just for electrical separation] conductors through metal conduit, with as little in-wall routing of that metal conduit as possible. The metal conduit is a heat sink for the conductors — such installations are trying to take on the responsibility of dumping electrical waste heat locally, so that they don't overheat their nearby electrical distribution station, or produce line sag on the poles connecting them to it!
I think all Thunderbolt cables are actually active, it’s just the chips at each end are pretty tiny so for a lot of cables you wouldn’t know it’s hiding inside the connector.
Searching around, I can find USB testers (some with 3-1 functionality, including USB-C + PD) in the range of $20 - $100 on Amazon and bunch of other websites, that doesn't seem "very very expensive". Would anything be wrong with such devices?
Many of those will just check continuity on pins (which might be enough for what you want) - a full on tester will run actual data at faster and faster speeds until it fails; report error rates, etc.
I've taken to using a label maker to version my cables cause I got sick of throwing away HDMI/DisplayPort cables when I couldn't determine what they actually supported.
I honestly don't get why they would not just call it USB 5. If every new non-backwards compatible USB feature would just be a new USB version number none of this insanity with naming would occur. Slap a unique color on the logo to identify the cable and you would be golden.
People here and in the media seem to be obsessed only with the naming scheme.
But I have an important question I haven't seen anyone address - does the 80 Gbps performance mean that eGPUs will be that much less constrained? Will this enable true gaming and AI work with a laptop connected to an eGPU?
I want my computer to warn me if I plug in a faulty cable. Or a cable that is preventing some functionality of the device - e.g. lowering speeds or charging slower than it could.
The numbering system is completely made up, and wildly inconsistent. Apparently it's intentionally confusing because the hardware manufacturers that control the USB group like it that way.
If bodies like the EU stepped in, we would still be “mandated” to use micro USB. When the EU first was thinking about “mandating” a standard, that’s what they were referring to.
This is completely untrue. The EU mandated micro USB while that was the latest and then switched to mandating USB-C when that became the latest standard.
How long would it have taken the EU to mandate USB-C? Why should we have to wait on slow bureaucrats to come up with a standard?
How would USB C ever become “a standard” if no one could come out with one first because they had to wait on the law to change?
BTW, USB-C is not “mandated” yet. If they did “mandate” it, are they going to “mandate” that the cords standardize on power delivery? Video out? A certain data speed? Protocols the vendors must support?
Will “the standard” be like the indecipherable 99-section, 11-chapter GDPR that made the web worse and festered it with cookie pop-ups?
The GDPR didn't fester the web with cookie popups. There are literally zero cookie popups that exist because of the GDPR. Businesses who wanted to inconvenience customers festered the web with cookie popups in the hope that the customers would complain and obligations would be revoked. Every time you see a cookie popup, it should be a reminder that businesses are out there to screw you, not to engage in a mutually beneficial exchange.
So, before the GDPR there were no cookie pop ups forced onto users to be compliant. But to be compliant now there are. But it’s still not the fault of the GDPR?
And guess what? None of the companies that exist solely to track users had to change anything because of the GDPR. But one private company - Apple - forced app developers to allow users to opt in to tracking and companies like Facebook announced that it affected their revenue by billions.
Maybe the government isn’t effective at regulating tech?
The cookie stuff is older than the GDPR. It was only required for sites that had cookies used for tracking rather than functionality. The idea was that it would cut down on tracking cookies by making them visible. But obviously this had little effect because every site just put up a banner.
So the GDPR changed things by making it so you can't just inform the user about tracking, you must also give the user the option to say no which is why the next gen of banners have opt out options.
What about RJ45? Why did it not require any bureaucratic regulation to become the standard connector in network devices? I am convinced the less regulation in trivial matters the better.
RJ45, part of the Registered Jack family of connectors designed by government-mandated telephone monopoly AT&T and codified in 47CFR-68 by the FCC, is free of bureaucratic regulation?
It was an existing standard, already mass produced that was unencumbered, convenient, and sufficient.
Wired networking standardization tends to be led by groups that want to use the technology and also sell it, rather than by groups that mostly want to sell it. There still were four competing 100M Ethernet standards, and now a fifth, 100Base-T1, has appeared (but targeting specific niches).
If all that changed between USB1.0 and USB 4 version 2 (or wtf it ends up being called) was twisting the pairs more (more or less), the cabling issues would be less too.
We were testing our software on smartphones around that time and were forever trying to find the specific charger for some phone (inevitably got left on a powerstrip when the phone was returned to the storage shelf) since even close sibling devices could have connectors that looked the same at a glance but differed by the addition of one pin, a small plastic lug, etc.
You really think forcing standardization of micro USB would have been good? At the time, iPhones were still using the iPod 30 pin connector. I would have hated to be using micro USB over Lightning.
This is incorrect. The common EPS memorandum was signed in mid 2009 as a result of negotiations between the dozen dominant phone manufacturers at the time and EU. The standardization did not begin until the phone makers realized the EU was serious about this.
Is there some official document describing USB naming convention or reasoning behind those seemingly randomly chosen combinations of numbers and letters?
Usually when products have confusing names you can at least see how deceiving the customer benefits whoever is making the product, but with USB cables??
We don't have actual standardizing bodies with teeth so enforcing anything is just not happening.
I'll never miss communism but the fact is that here in Eastern Europe we had an actual agency that vetted various cheese and milk and sausage products with class A, B, C etc.
You knew what you were getting for your money. I miss that.
At some point the world has to return to that -- constantly swindling consumers is not a sustainable business strategy.
That’s interesting. Eastern European perspectives always are. We do have “grade A” eggs and such (in the US). It requires a balance of free market and regulations. Tough to strike the right mix.
I’m a free market, no government intervention guy for non-necessities like phones. Less a fan of such an approach on basics. Still needs to be lighter touch. Food should be heavier handed since it leads to healthcare outcomes and productivity losses.
I wonder at what point internal components go external again.
Imagine having a slot for a USB stick in your notebook, which contains your hard drive. Laptop broken? Take your USB stick and a new laptop and off you go working again.
Actually, I just realized that's what I want.
Want a backup? Put the stick in the NAS and wait 30 minutes.
Want a new PC? Take your Win10 install and go.
Private surfing on employer hardware? Put the stick into the machine and remove the other one.
PS: I do realize that booting from USB has been a thing for a long time; this "vision" is more like an actual end-to-end concept and, you know, 80 Gbps ;)
The Framework laptop sorta has it -- it has 4 bays for 'modules' which are actually USB-C interfaces of some fast variety, and you can put a bootable 1TB SSD there.
I just bought a 2TB USB SSD with USB 3.2 Gen 2x2 (20 Gbps). Almost no computers today support 2x2. Will future USB4 devices support it, or is 2x2 a dead end in USB development?
Dear USB Committee, please bump the major version on major technical improvements, else shopping for the desired performance gadgets becomes a nightmare. Thank you!
I don’t want usb4 cables. USB-C was a cluster** when launched with varying levels of compatibility and interoperability that you couldn’t tell by looking at the cable. T3 and T4 also look identical to USB-C except for the thunderbolt icon; how you differentiate T3 from T4 without plugging it in is beyond me.
The USB folks have a good thing in USB-C at the current 3.X revision. Let it simmer for a while. Introduce Thunderbolt 5 if someone really needs that kind of bandwidth, but let USB4 mature before inflicting it on the world.
The cynic in me sometimes enjoys the irony of watching USB become the fragmented ecosystem of loosely (or not) compatible interconnects that it was designed to replace.
(ok, it's not quite /that/ bad, but you get the sentiment)
It's so much worse than what it replaced because physical incompatibility meant you had no confusion. A VGA cable was a VGA cable, a power port was a power port, a ps/2 mouse was a ps/2 mouse.
The only thing it improved over was the hellscape of charging ports.
The thing is that they never fail catastrophically; at worst you get USB 2.0 level interoperability.
But if you want those top Mbps or W, or want 60 Hz 4K video, you may find out that not everything works with everything else, and proper cables play a crucial role, while being visually not distinctive at all.
I seem to recall that back in the early days of USB, the organization would hold a conference where all the vendors brought their devices, and everyone would go around testing compatibility with everyone else's devices. Often by making the most convoluted device configuration tree possible (because that's what the users would do).
Myself, I ran into an incompatibility for the first time ever last week in over 24 years of using USB devices. A new Thinkpad with Windows 10 would lock up its USB bus if I plugged a USB speaker in using the hub on my Dell monitor. Hooking it up directly worked fine. These are all "legacy" USB 3.0 devices with Type A and Type B connectors, and have worked for years with other computers (PC + Mac). What a shitshow USB has become.
I also had a situation where a usb device was able to lock up the bus last week. Windows 10, dell dock, usb 3.0 hub.
Same setup worked fine 3 weeks ago.
I almost wonder if there was some Windows driver update causing it.
Hardly any of these cables are labeled properly. Even Apple has a Thunderbolt 3 and Thunderbolt 4 cable that it sells for $150 without any difference in the labeling.
Let's put resistor color codes on the wires, 2 sets of 3 bars (digit digit exponent), 1 for megabytes and 1 for milliamps. So red black orange space green black red is 20 GB and 5 amps.
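A toy encoder for that scheme; the digit-digit-exponent band layout is this comment's idea only, nothing any standards body uses:

```python
COLORS = ["black", "brown", "red", "orange", "yellow",
          "green", "blue", "violet", "grey", "white"]   # 0-9, as on resistors

def bands(value: int) -> list:
    """Encode a value as two significant digits plus a power-of-ten band."""
    exponent = 0
    while value >= 100:
        value //= 10
        exponent += 1
    return [COLORS[value // 10], COLORS[value % 10], COLORS[exponent]]

# 20 GB expressed in megabytes, 5 A expressed in milliamps, per the comment:
print(bands(20_000))   # -> ['red', 'black', 'orange']
print(bands(5_000))    # -> ['green', 'black', 'red']
```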
Great idea! But what to do if some jack manufacturer takes a beautiful cable and wastes it with an awful jack for economic reasons?
Ah, you need the scratch and sniff version of this card. My only complaint is how subtle the difference is between the smell of the 45W fast and regular charging cables.
Seriously .. the world is going up in flames .. and at the same time there's all these people walking around in the global west who're apparently hooked on gigapixels and ever more bandwidth/storage space/gigahertz and so on .. it's so f*** up.
An example of a slower and less dumb way to run our ecosphere into the ground:
I convert all podcasts I listen to and want to keep to Opus @ 20 kbit/s, which on average results in 1/5 of the original file size.
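The size arithmetic, roughly; the 20 kbit/s Opus rate is from this comment, while the 96 kbit/s source is just an assumed typical podcast MP3 bitrate:

```python
source_kbps = 96    # assumed typical podcast MP3 bitrate
target_kbps = 20    # the Opus rate used above
seconds = 3600      # a one-hour episode

def megabytes(kbps: float) -> float:
    return kbps * seconds / 8 / 1000   # kbit/s -> MB

print(f"{megabytes(source_kbps):.0f} MB -> {megabytes(target_kbps):.0f} MB "
      f"({target_kbps / source_kbps:.0%} of the original)")
# 43 MB -> 9 MB (21% of the original)
```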
Conversion also costs energy. Save more energy by not listening to podcasts.
But that's the cheap shot at your slightly ranting arguments. USB isn't going to run us into the ground. Transport uses up so incredibly much more energy, almost none of which comes from sustainable sources at the moment. Cooling and heating too, obviously. Don't worry about a better USB cable if you're really concerned with the environment.
> Transport uses up so incredibly much more energy, ...
True. I never claimed otherwise.
> ...almost none of which comes from sustainable sources at the moment.
I know. The world is roughly at 80 % fossil energy right now, as far as I remember.
Trust me on this, energy is one of the topics I spend time reading and thinking about several times a day.
> Don't worry about a better USB cable if you're really concerned with the environment.
Here's the thing. It's about this mentality of: this small thing here doesn't really matter. But people in this "ever more culture" reliably put one "small thing" after the other in their basket of consumption capacity .. until it's full.
These people ask "I can have more, why shouldn't I?" rather than "I can have more, do I really want/need more?"
Do you enjoy watching your movie at 8k 4 times more than at 4k because it comes with 4x the pixels? If you do, I pity you.
I recently watched some Linus Tech Tips video where he installs their 1 PB of storage and boasts about their new camera that can record video at a data rate of 300 MB/s ... 1 GB every ~3 seconds ... yeah! ... sorry (not sorry), despite all my fascination with tech, this attitude is just so heinously wasteful and damaging.
As if these hundreds of HDDs hadn't taken any resources and weren't consuming quite a bit of electricity.
Again ..
> Don't worry about a better USB cable if you're really concerned with the environment.
I don't worry about the cable. I worry about the f*** up culture/attitude.
> Conversion also costs energy. Save more energy by not listening to podcasts.
Yea, no .. people should have fun and do the things they want. But being needlessly wasteful while doing them is just asocial because in the end it hurts the whole ecosphere.
I want to store the podcasts I listen to so I do. But I do it in a way that doesn't need TBs of disk space.
Besides, the conversion is only necessary because the content is delivered at an utterly over-the-top bitrate.
And also, once I've converted a file to lower bitrate, from then on every time I do something with it (play, copy, ...) it will consume less energy than doing the same thing with the large original file.