Fuck firewire. Glad it’s dead. USB C is the best thing to happen to peripherals since the mouse.
I would agree with you if there were a simple way to tell what the USB-C cable I have in my hand can be used for without knowing beforehand. Otherwise, for example, I don’t know whether the USB-C cable will charge my device or not. There should have been a simple way to label them for usage that was baked into the standard. As it is, the concept is terrific, but the execution can be extremely frustrating.
Buying a basic, no-frills USB-C cable from a reputable tech manufacturer all but guarantees that it’ll work for essentially any purpose. Of course the shoddy pack-in cables included with a cheap device purchase won’t work well.
I replaced every USB-C-to-C or -A-to-C cable and brick in my house and carry bag with a very low cost Anker cable (except the ones that came with my Google products, those are fine), and now anything charges on any cable.
You wouldn’t say that a razor sucked just because the cheap replacement blades you bought at the dollar store nicked your face, or that a pan was too confusing because the dog food you cooked in it didn’t taste good. So too it is not the fault of USB-C that poorly manufactured charging bricks and cables exist. The standard still works; in fact, it works so well that unethical companies are flooding the market with crap.
Hey that’s a fair point. Funny how often good ideas are kneecapped by crap executions.
I’m pretty sure the phrase “kneecapped by crap executions” is in the USB working group’s charter. It’s like one of their core guiding principles.
If anyone disagrees with this, the original USB spec was for a reversible connector and the only reason we didn’t get to have that the whole time was because they wanted to increase profit margins.
USB has always been reversible. In fact you have to reverse it at least 3 times before it’ll FUCKING PLUG IN.
That’s the reason Apple released the Lightning connector. They pushed for several features for USB around 2010, including a reversible connector, but the USB-IF refused. Apple wanted USB-C, but couldn’t wait for the USB-IF to come to an agreement so they could replace the dated 30-pin connector.
There is. The USB-IF provides an assortment of logos and guidelines for ports and cables to clearly mark data speed (like “10Gbps”), power output (like “100W” or “5A”), whether a port can be used for charging (battery icon), etc. But most manufacturers choose not to actually use them on ports.
Cables I’ve seen are usually a bit better about labeling. I have some from Anker and Ugreen that say “SS”, “10Gbps”, or “100W”. If they don’t label the power it’s probably 3A, and if they don’t label the data speed it’s usually USB 2.0, though I have seen a couple of cables that support 3.0 and don’t label it.
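The rule of thumb above can be sketched as a tiny lookup. The marking strings and capability tiers here are illustrative assumptions; a cable’s e-marker chip (when present) is the real authority on what it supports.

```python
# Heuristic from the comment above: unmarked USB-C cables are assumed to be
# 3 A / USB 2.0. Marking strings and tiers here are illustrative assumptions.

def guess_cable(markings):
    """Guess a cable's capabilities from its printed markings (a set of strings)."""
    spec = {"current": "3 A", "data": "USB 2.0 (480 Mb/s)"}  # unlabeled defaults
    if "100W" in markings or "5A" in markings:
        spec["current"] = "5 A (100 W at 20 V)"
    if "10Gbps" in markings:
        spec["data"] = "USB 3.x Gen 2 (10 Gb/s)"
    elif "SS" in markings:
        spec["data"] = "USB 3.x Gen 1 (5 Gb/s)"
    return spec

print(guess_cable(set()))               # plain cable: assume 3 A, USB 2.0
print(guess_cable({"100W", "10Gbps"}))  # fully marked cable
```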
Burn all the USB-C cables with fire except PD ones. The top PD cable does everything the lesser cables do.
IDK I’ve had PD cables that looked good for a while but turns out their data rate was basically USB2. It seems no matter what rule of thumb I try there are always weird caveats.
No, I’m not bitter, why would you ask that?
There are many PD cables that are bad for doing data.
Correct. The other commenter is giving bad advice.
Both power delivery and bandwidth are backwards compatible, but they are independent specifications on USB-C cables. You can even get PD capable USB-C cables that don’t transmit data at all.
Also, that’s not true for Thunderbolt cables. Each of the 5 versions has specific minimum and maximum data and power delivery specifications.
I don’t think this is right. The PD standard requires negotiating which side is the source and which is the sink, and the voltage/amperage, over the cable’s CC (configuration channel) wire. So the cable has to carry at least that bare minimum of communication for PD to work.
Technically, yes, data must transmit to negotiate, but it doesn’t require high throughput. So you’ll get USB 2.0 transfer speeds (480 Mb/s) with most “charging only” USB-C cables. That’s only really useful for a keyboard or mouse these days.
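For a sense of what being stuck at 480 Mb/s means in practice, here is a rough best-case arithmetic sketch; these are line rates only, and real throughput is lower due to encoding and protocol overhead.

```python
# Best-case transfer time for a 10 GB file at common USB line rates.
# Real-world throughput is lower (encoding and protocol overhead).

def transfer_seconds(size_gb, rate_mbps):
    bits = size_gb * 8 * 1000**3          # decimal gigabytes -> bits
    return bits / (rate_mbps * 1000**2)   # megabits/s -> bits/s

for name, rate_mbps in [("USB 2.0 (charging-only C cable)", 480),
                        ("USB 3.x Gen 1 ('SS')", 5_000),
                        ("USB 3.x Gen 2 ('10Gbps')", 10_000)]:
    print(f"{name}: {transfer_seconds(10, rate_mbps):.0f} s")
```

So a charging-only cable needs nearly three minutes for a transfer a “10Gbps” cable finishes in eight seconds, which is why it only really matters for keyboards and mice.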
This limitation comes up sometimes when people try to build out a zero-trust cable where they can get a charge but not necessarily transfer data to or from an untrusted device on the other side.
You forgot Thunderbolt and USB4 exist now
True but pretty much the only devices that need those are high-end SSDs and laptop docks and in both cases you just leave the cable with the device rather than pulling it out of your generic cables drawer.
You can buy a single cable that does 40 Gbps and USB4 and charges at 240 W.
https://caberqu.com/home/39-ble-caberqu-0611816327412.html
This would do it.
Damn, check out the price of the thing someone else linked to at AliExpress for a fraction of that price. But having to spend money on that should not be necessary.
That AliExpress device doesn’t tell you what wattage or data speed the cable will max out at, just what wattage it’s currently doing (and you’d need to make sure the device you’re testing with on the other side is capable and not having its own issues). It also can’t tell you if the cable is having intermittent problems. If all you care about is wattage, then fine. But I find myself caring more about the supported data speeds and the quality of the cable.
But yes, I agree that cables should just be marked with what they’re rated for… However, it’s possible that well-built cables exceed that spec and could do better than they claim, which just puts us in the same boat of not really knowing.
Edit: oh! And that AliExpress tester is only 4 lines (USB 2.0, basically)… USB 3.0 over Type-C uses 24 pins… You’re not testing jack shit with that AliExpress tester. The device I linked will actually map the pins in the cable and find breaks as well.
The cheaper AliExpress item you actually want is this one; it will read the e-marker and tell you the power/data rates it supports, whether it supports Thunderbolt, etc. https://www.aliexpress.com/item/1005007287415216.html
Some photos of it in action https://bitbang.social/@kalleboo/109391700632886806
Don’t all USB-C cables have the capability to do Power Delivery? I thought it was up to the port you plug it into to support it?
The really janky ones you get with USB gadgets like fans only have the 2 power lines hooked up, not the lines needed to communicate PD support. Those work exactly the same as the janky USB-A-to-micro-USB cables those gadgets used to come with, supplying 5V/2A. You throw those away the second you get them and replace them with the decent-quality cables you bought in bulk from AmazonBasics or something.
Nope. My daughter is notorious for mixing up cables when they come out of the brick. Some charge her tablet, some are for data transfer, some charge other devices but not her tablet. It’s super confusing. I had to start labeling them for her.
Come to think of it, all the USB C cables I have are from phone and device chargers so I just took it for granted. Good to know. Thanks for sharing some knowledge with me
USB-C cables can vary drastically. Power delivery alone ranges from less than 1 amp at 5 volts to over 5 amps at 20 volts. That’s roughly 5 watts of power on the low end to 100 watts on the high end, and sometimes more. When a cable meant to carry 5 watts has over 100 watts run through it, the wires get really hot and could catch fire. The charger typically needs to talk to a very small chip in high-power cables so the cable can say, yes, I can handle the power. Really cheap chargers might just push that power out regardless. So while the USB-C form factor is the one plug to rule them all, the actual execution is a fucking mess.
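The power range described above is just volts times amps. A quick sketch, using the common fixed PD voltage levels; the 48 V row is the newer EPR tier behind the “sometimes more”:

```python
# Watts = volts x amps, spanning the USB-C power range described above.

def watts(volts, amps):
    return volts * amps

examples = [
    (5, 0.5),   # legacy low end: 2.5 W
    (5, 3),     # typical unmarked C-to-C cable: 15 W
    (20, 5),    # 100 W, requires an e-marked 5 A cable
    (48, 5),    # 240 W EPR, the "sometimes more"
]
for v, a in examples:
    print(f"{v} V x {a} A = {watts(v, a):g} W")
```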
I agree with USB-C, but there are still a million USB-A devices I need to use, and I can’t be bothered to buy adapters for all of them. And a USB hub is annoying.
Plus, having 1-2 USB-C ports only is never gonna be enough. If they are serious about it, why not have 5?
I bought some adaptors in China for around $0.50 each. It really isn’t that big of a deal
It really is a big deal for me, they stick out too far and are making the whole setup flimsy.
Then just buy a Framework like I did and switch ports whenever you feel like it
That’s still only 3 simultaneously if I saw that right. My old Lenovo laptop had 3 USB-A 2.0 ports, 2 x USB-A 3.0, RJ45 and HDMI. That was gold. Everything that comes now is a bloody chore.
You can have 6 ports of any kind you like on the Framework 16
Oh nice, that’s something.
Yeah, I’d love at least one USB A type cause most of the peripherals I own use that.
You can’t buy a USB-C WiFi dongle, last time I checked. You have to buy a C-to-A adapter, then use a USB-A WiFi dongle. It’s nuts that those don’t exist.
Genuine question - what device do you have that has USB-C ports, no USB-A ports, doesn’t have WiFi, but supports the dongle?
The PineTab2 shipped with a wifi chip without any Linux drivers. The drivers eventually got made, but before that, you needed a USB dongle with Ethernet or an adapter.
I would also like a USB-C wifi dongle for tech support reasons. Sometimes, the wifi hardware fails and you need a quick replacement to figure out what happened.
Why do you need a wifi dongle when wifi is built into every single laptop sold?
Maybe the preferred Linux distro doesn’t work with them. I had to use another distro for a while because Debian didn’t immediately support the card, but there are apparently cases where the internal card just permanently wouldn’t work (like in fully free software distros). I would rather replace the card inside the laptop than use a dongle, but idk if this can always be the answer.
I hated when mice became the primary interface to computers, and I still do.
tell me you use i3 without telling me you use i3
You have passed the test. We can be friends.
Is this for real?
Even for like 20 years after mousing became the primary interface, you could still navigate much faster using keyboard shortcuts / accelerator keys. Application designers no longer consider that feature. Now you are obliged to constantly take your fingers off home position, find the mouse, move it 3cm, aim it carefully, click, and move your hand back to home position, an operation taking a couple of seconds or more, when the equivalent keyboard commands could have been issued in a couple hundred milliseconds.
I love how deeply nerdy Lemmy is. I’m a bit of a nerd but I’m not “mice were a mistake” nerd.
I don’t think mice were a mistake, but they’re worse for most of the tasks I do. I’m a software engineer and I suck at art, so I just need to write, compile, and test code.
There are some things a mouse is way better for:

drawing (well, a drawing tablet is better)
3d modeling
editing photos
first person shooters (KB works fine for OG Doom though)
bulk file operations (a decent KB interface could work though)

But for almost everything else, I prefer a keyboard.
And while we’re on a tangent, I hate WASD, why shift my fingers over from the normal home row position? It should be ESDF, which feels way more natural…
Thanks, I got you beat on ESDF though because I’m an RDFG man, ever since playing Counter-Strike 1.6. With WASD they usually put crouch or something on ctrl, but my pinky has a hard time stretching down there. On RDFG my pinky has easy access to QW AS ZX, and tab, caps, and shift with a little stretch. It’s come in handy when playing games with a lot of keybinds.
Pfff, minutes after trying to minimize your nerdiness, you post this confession.
What pisses me off even more is many games bind to the letter instead of physical key position (e.g. key code), so alternative layouts get a big middle finger. I use Dvorak, and I’ve quit fighting and just switch to QWERTY for games.
I don’t have a problem with hitting control (I guess I have big hands), but I totally agree that default key binds largely suck. I wish games came with a handful of popular ones, and bound to key codes so us Dvorak users (or international users) didn’t have to keep switching to QWERTY.
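The difference can be shown with a toy sketch: binding to the character breaks under Dvorak, while binding to the physical key position (scancode) does not. The scancode numbers and the two layout tables below are made up for illustration, not any real input API.

```python
# Map of physical-key-id -> character produced, per layout (values invented).
QWERTY = {26: "w", 4: "a", 22: "s", 7: "d"}
DVORAK = {26: ",", 4: "a", 22: "o", 7: "e"}  # same physical keys, new characters

def key_for_char(layout, char):
    """Char-binding: find which physical key currently produces the bound character."""
    return next((k for k, c in layout.items() if c == char), None)

# A game that binds "forward" to the character 'w':
print(key_for_char(QWERTY, "w"))  # physical key 26, comfortable home position
print(key_for_char(DVORAK, "w"))  # None: 'w' moved off these keys entirely

# A game that binds "forward" to scancode 26 hits the same physical key
# on both layouts, so no rebinding is needed.
```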
That feel when you switch languages to chat and the hotkeys don’t work
I always rebind to ESDF if the game doesn’t do stupid things preventing it from being practical. The addition of the 1QAZ strip being available to the pinky is a killer feature all on its own. I typically use that for weapon switching, instead of having to stretch up to 1234 and take my fingers off the movement keys.
Tablets are better than mice at drawing, modelling, and photo editing. Mice are good for first person shooters. Game controllers are better for most other games. You can mouse in dired-mode I guess, if you’re a casual.

The problem is they generally use E and F for something, which results in a cascade of rebinding.
And yeah, tablets are better, but they’re also more expensive and don’t do other mice things. For how rarely I do 3D modeling and whatnot (pretty rare), making sure my mouse has a middle button is plenty.
And yeah, I much prefer controller, even for FPS since I don’t play competitively (even then, I’ve seen awesome videos about gyro aiming).
E and F is certainly a problem, but developing your own custom key map is almost always part of a larger process of becoming more effective anyway. Typically I start by just moving all left-hand bindings right by one key.
I feel like the mouse is a good generalist, jack-of-all-trades input device, but outside of FPS, I feel that any task that “requires” a mouse is done better with a tablet. They are of equivalent price, honestly. Mice are not cheap, tablets are not expensive.
Right now I am using voice dictation because it is better than typing on a phone, but oh my God it sucks so bad.
That functionality (first necessary, then required by guidelines, then expected, and then still usual) disciplined UI designers to make things doable in a clear sequence of actions.
Now they think any ape can make a UI if it knows the new shiny buzzwords like “material design” or “air” or whatever. And they do! Except humans can’t use those UIs.
BTW, about their “air”. One can look at ancient UI paradigms, specifically SunView, OpenLook, and Motif (I’m currently excited about Sun history again), Windows 3.*, and also Win9x (with WinXP being more or less inside the same paradigm). Of these, only Motif had anything resembling their “air”. And Motif is generally considered clunky and less usable than the rest of those mentioned (I personally consider OpenLook the best), but compared to modern UIs even Motif does that “air” part in a way that seems to make some sense, and feels less clunky, which makes me wonder how that’s even possible.
FFS, modern UI designers don’t even think it’s necessary to clearly and consistently separate buttons and links from text.
And also - freedom in Web and UI design has proven to be a mistake. UIs should be native. Web browsers should display pages adaptively (we have such and such blocks of text and such and such links), their appearance should be decided on the client and be native too, except pictures. Gemini is the right way to go for the Web.
So I see you clearly haven’t heard of i3, sway, or Hyprland…
Sounds like I’m glad “home row” style typing fell out of favour. It may be the theoretically fastest way to type eventually, but it seems to lead to pretty rigid behaviour. Adapting to new things as they come along and changing your flow to move with them instead of against them is just a much more comfortable way to live. Even if I only type 80% as fast.
I have no idea what you mean by “fell out of favour”. Does your keyboard not have pips on F and J? People still touch type. Dunno what to tell you.
You’re getting hung up on “home row”. You still have to move your hand from the keyboard to the mouse and back. It’s the same problem, whether or not you know how to type well and stare at your hands, except now you have to add steps for “look at the screen” and “look back at your hands”.
Fell out of favour in that it isn’t taught as “the correct way to type” any more. Largely because most devices you type on now wouldn’t even have physical keys. So learning home row typing for the occasional time the thing you are typing on is a physical full sized keyboard just disrupts the flow of everything else.
Being perfectly optimal isn’t as productive as it feels, especially when it leads to resistance to change and adapt.
Home row is absolutely still taught as the “correct” way to type. Source: kids are in elementary school
To an extent. Early 90’s I could navigate WordPerfect in DOS faster than I’ve ever been able to work in MS Word, because it was all keyboard even before I learned proper home key 10 finger typing in high school. Technically my first word processor was Wordstar on one of those Osborne “portable” computers with the 5-inch screen when I was a young kid, but Wordperfect was what I did my first real ‘word processing’ on when I started using it for school projects. So I might just be older in that ‘how do you do fellow kids’ in this sort of discussion.
To this day, I still prefer mc (Midnight Commander, a Linux-flavored recreation of Norton Commander; it does have a Windows port, YMMV) to navigate filesystems for non-automated file management.
I’ve been thoroughly conditioned for mouse use since the mid-late 90s (I call it my Warcraft-Quake era, we still used keyboard only for Doom 1/2 back in the early days), and I feel like it’s a crutch when I’m trying to do productive work instead of gaming. When I spend a few days working using remote shells, I definitely notice a speed increase. Then a few days later I lose it all again when I’m back on that mouse cursor flow brain.
Early ’90s*
You got it right the second time though, champ!
Nah, USB-A was the best since it replaced serial ports (esp. PS/2, which was much harder to plug in) and outlived/outclassed FireWire. USB-C is the best thing since HDMI (screw you, VGA and DVI), which was the best since USB-A.