The reason for detecting the connector's orientation is higher-speed communication. USB-C 20 Gbps uses both sets of high-speed pins on the connector to bond two USB 3.2 10 Gbps links into a single 20 Gbps link. That's why the technical spec name for 20 Gbps is "USB 3.2 Gen 2x2"; that's what the "x2" means.
Knowing that USB has this feature, it follows that USB-C needs to be self-orienting, since the two ends of the cable can be plugged in in different orientations.
You say Ethernet got this part right; well, it got it right by not having a reversible connector. Ethernet has 4 tx/rx pairs, and USB-C has 2 rx/tx pairs per USB 3 connection, 4 in total for 20 Gbps. The difference is reversibility. Is it worth the tradeoff?
That might work for Ethernet, but how would you do that for any unidirectional USB-C alternate mode without protocol-level feedback such as analog audio or DisplayPort video?
If you want to allow all of
- Reversible connectors
- Passive (except for marking), and therefore relatively cheap, adapters and cables
- Not wasting 50% of all pins on a completely symmetric design connected together in the cable or socket
there's no way around having an asymmetrical cable design that lets the host know which signal to output on which pins.
That’s basically how USB-C does it too (except that the chip isn’t strictly necessary; an identifying resistor does the job for legacy adapters and cables).
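To illustrate the idea, here's a hedged sketch of how a host port might infer plug orientation from the CC pins: the device side has a pull-down resistor (Rd) on only one of the two CC pins, so the host checks which CC line is pulled low. The function name and voltage threshold below are illustrative assumptions, not the exact values or logic mandated by the USB Type-C spec.

```python
# Sketch: inferring USB-C plug orientation from CC pin voltages.
# Assumption: the host drives pull-ups on CC1 and CC2; the attached
# device's Rd pull-down sits on exactly one CC pin, depending on how
# the plug is flipped. The 2.6 V threshold is illustrative only.

def detect_orientation(cc1_volts: float, cc2_volts: float,
                       threshold: float = 2.6) -> str:
    """Return which orientation (if any) the CC voltages imply."""
    cc1_low = cc1_volts < threshold  # Rd pulling CC1 toward ground?
    cc2_low = cc2_volts < threshold  # Rd pulling CC2 toward ground?
    if cc1_low and not cc2_low:
        return "normal"      # device's Rd seen on CC1
    if cc2_low and not cc1_low:
        return "flipped"     # device's Rd seen on CC2
    if cc1_low and cc2_low:
        return "accessory"   # both pins terminated (debug/audio modes)
    return "unattached"      # neither pin pulled down
```

Once the host knows the orientation, it can route the correct high-speed lanes (and, in an alternate mode, the correct unidirectional signals) to the pins actually in use, with no protocol-level feedback required.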