Article makes exactly zero arguments why switching to Linux would be beneficial to any developer. Instead, it argues that if something breaks -- then it's your own fault, makes vague claims such as "Most of the Internet runs on Linux", and says that Linux is harder to "learn" than Windows.
If I were trying to make sense of this as someone who doesn't use Linux daily, I'd be heavily confused and discouraged by such writing.
This is completely my personal opinion, but I find Linux much easier to navigate and set things up in than Windows. I barely used macOS, so I can’t judge that. In Linux, pretty much everything can be done in the terminal, and it's easy to find workarounds. If you “screw something up” badly, with no snapshots or backups, you can reinstall the whole system without touching the home partition, assuming it’s on a separate one. And we are not even talking about NixOS and the like.
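A minimal sketch of what "home on a separate partition" looks like in /etc/fstab (the UUIDs and filesystem choices here are made up; check your own with lsblk -f):

    # /etc/fstab (fragment), hypothetical example
    UUID=1111-root   /       ext4  defaults  0 1
    UUID=2222-home   /home   ext4  defaults  0 2

During a reinstall you format and reuse the root partition but only mount the existing /home, leaving its data intact.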
I think the main argument the author is making about why you should learn Linux is to better understand how your software is hosted in the cloud on Linux servers.
This is also why the Mobile Web Developer program at my college teaches Linux and Linux Web Hosting (including Apache and Wordpress) early on. We find this also helps students feel comfortable working within the macOS Terminal during all later courses, as well as with git commands.
> better understand how your software is hosted in the cloud on Linux servers
Since the days of Apache/PHP, I don't think people (need to) host their dev work on proper servers on their local computers. So this argument is kind of invalid unless you want to set up your dev machine as some kind of dummy server.
> Since the days of Apache/PHP, I don't think people (need to) host their dev work on proper servers on their local computers. So this argument is kind of invalid unless you want to set up your dev machine as some kind of dummy server.
True, and false
I do host the cloud software I work on on a local machine (running Debian 12) because it is easy. I do not need to; I want to, and it makes me more productive.
I also spend a lot of time in a terminal on a remote machine. I have to be careful to not get the two confused (I use different background colours for terminals on different machines.)
I have done the "remote terminal on cloud software" on Windows - a difficult and painful experience. Tools I take for granted, and get gratis, on Linux are not there on the Windows servers.
Whatever flaws the article has, it is not out of date.
This has been my observation. I think people will get caught up in the details, but most dev workflows I see now are either containerised (Docker etc.) or serve themselves (Rails, Node, even PHP can serve itself with no webserver).
I see fewer developers setting up local servers and managing them for development purposes. Containers are not the same thing, for the purposes of this commentary on how people work now.
> You must understand why you damaged your system and why the fix you applied fixed it.
Only GNU/Linux (in fact any free software) gives you the possibility to fully understand your system. It allows you to drill down to any level required. Striving for such understanding is a trait of hackers, but having the freedom to do so is beneficial for any software developer, at whom the article was targeted.
I would argue that nowadays most dev workloads don’t break the operating system.
Most of the time I would rather have back-end devs spend their time learning their database engine. For front-end devs, I would rather they spend time understanding how web browsers work.
While I like to spend time getting to know OS details and running Windows or Linux servers, I do, for instance, have a grasp of CSS and can use the dev tools in a browser, but I see how a front-end developer fixes CSS issues in 5 minutes that take me 30.
We cannot expect everyone to know everything, but I think we can expect some basic knowledge of the different parts of the stack.
This is becoming less and less the case, unless you use a more DIY-oriented distribution such as Void.
Most systemd distributions have a lot of technical merit (I like Fedora Silverblue), but they are increasingly becoming monolithic, so hackability isn't one of their strengths.
I remember a time when Linux was pure anarchy, with loosely coupled independent subsystems that could easily be experimented with and replaced. Now, most distributions are much closer to macOS: they "just work", but you're not supposed to mess with them. So much for hackability.
I (literally) remember a time from some years before Linux existed, because I was alive, and working in software, then.
Source code of packages, including major ones, was available to download ... and source code for programs was available probably long before the technique of downloading was even a thing. People used to ship programs around in source code format via (magnetic) tapes or Usenet or sneakernet, and later floppies, etc. See the Jargon File for much more computer lore than you would ever want to know about, including some of this stuff, although plenty of it is interesting, imo.
>So much for hackability.
pardon me.
are Linux distributions not downloadable as source anymore?
I guess, to many devs, including me, that's the ultimate meaning of hackability.
because then you can hack anything from the top to the bottom of the stack.
> Most systemd distributions have a lot of technical ... but they are increasingly becoming monolithic,
I am not a fan of Systemd for irrelevant reasons. But the "monolithic" nature of systems has to do with the interdependence of the parts not the glue.
> I remember a time when Linux was pure anarchy, with loosely coupled independent subsystems
Sounds like my Linux box I have for fun. We are still making systems like that, but for business purposes we were always making solid and staid systems.
Very little has changed, only the words we use to say it (and the raw power of the underlying hardware of course!)
In many relevant ways, Windows is more easily hackable than Linux. That is in no small part due to Autohotkey, but also many other tools made by independent programmers.
Autohotkey is vastly superior to anything available on Linux, where one must string together several arcane tools to achieve similar scripting effects. Autohotkey serves for everything; it is extremely reliable and predictable. I know someone will shout "there's Autokey on Linux! And pyautogui!". Rest assured, I know these tools. They're not nearly as reliable and comprehensive as Autohotkey is on Windows.
I would most certainly not compare Autohotkey to Emacs in that way, as Emacs, powerful as it is, is not really meant to be the same as Autohotkey. However, Autohotkey can do pretty much anything you can think of in terms of GUI OS level local automation (for cli, I'd probably use WSL).
On Linux, I may sometimes have to glue xmodmap + xcape + xbindkeys + xdotool + wmctrl + whatever-else in a bash script that will probably require reading multiple man pages and multiple iterations to get right. Autohotkey would only require accessing a single source of solid documentation, as it can all be done in a single Autohotkey script that doesn't rely on any other tool.
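For a concrete sense of that glue, here is a minimal sketch of the Linux side (the window title and key sequence are hypothetical; the hotkey binding itself would live in xbindkeys or your desktop's shortcut settings):

    #!/bin/bash
    # focus a browser window, then synthesize a paste followed by Enter
    wmctrl -a "Firefox"                          # raise/focus the first window whose title matches
    sleep 0.2                                    # give the window manager a moment to switch focus
    xdotool key --clearmodifiers ctrl+v Return   # send Ctrl+V, then Enter, to the focused window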
More often than not GPT will give you the entire code that you need on the first or second response. But it only knows Autohotkey version 1.
Or for a more "hackable" experience on Windows, you can do what (I expect) AutoHotKey does, and send window messages to apps to trigger "key press" and "mouse down at X coords".
By analogy, it's as if there's a single stable interface to the GUI layer in apps, mediated by the OS, instead of poking at X APIs.
> A vastly overrated reason unless you are regularly in conflict with corporate entities/the government.
Software freedom is a very important business property.
Closed source software can be OK while it is supported and the licencing agreements are not too onerous. But when those conditions no longer apply, very expensive retooling ensues. I have seen this often.
Free software does not suffer those fates.
This is a very important business reason for Free Software
i encourage a lot of people to learn linux but not for programming specifically. some things are easier on windows, others linux i guess. if u need to crack open WSL i'd say learn linux. the cmdline is nice also to learn automations and scripting, but powershell is also cool to learn. i'd say it really depends on what you want to do. bsd might be even a sane choice. mac i havent the moneys for :D
>i encourage a lot of people to learn linux but not for programming specifically
Outside of native phone and desktop applications, 90%+ of the time your code is going to run on some form of linux. Maybe you can argue a developer who only does windows desktop applications would not benefit from linux knowledge, however I'd argue they would benefit most of all lol.
This is a fair point, but even with SaaS that completely abstracts away linux infrastructure (which I support, there's no reason to manage infra if it does not bring you value), there's still value in having some understanding of how the underlying infrastructure works when developing your application.
As a comparison I don't have the skills or need to develop firmware or other software that directly interacts with hardware, but having some high level knowledge of how CPU, RAM and I/O physically operate has been very useful in designing good programs. I feel comfortable arguing that investing a small amount of time in basic linux knowledge is going to be a net benefit for virtually any professional developer.
> Why should you develop on Linux in a world where the environment for developing on Windows is constantly improving?
Is it? Is it though? I've used both extensively, and I have a Mac M1... and by far Linux is the best for development, Mac is second, and windows is dragging farrrrr behind both.
WSL is fine as a user and can be really helpful... but one thing that I've noticed is that a lot of Windows devs use that as a shortcut and will target applications at WSL, which usually don't work on actual Linux whatsoever. They only work on WSL. A part of that is their fault, a part of that is WSL, but it sucks either way.
Most AI projects are like this because Python is especially horrific to work with package-wise. I've also noticed this phenomenon with Windows Docker. Idk what it is exactly, but the Dockerfiles these guys write work on Windows Docker, but fail miserably on Linux Docker. It's like "how is this even possible?"
So it's nice for Windows people, but not nice for literally everyone else.
It's slow, volume support is worth calling out separately as extremely slow and the whole thing is... not exactly unstable, but unexpectedly brittle. I had to resort to really obscure commands and clean up files in weird locations when the desktop app broke on multiple occasions.
Now the really truly bad part is that most of it won't ever be fixed because of how docker needs to run on a VM (I'll happily and vocally admit I was wrong if Apple releases a mac subsystem for linux.) I don't care about the desktop app, I'm perfectly fine with running docker from the command line, hence why WSL2 makes so much more sense for me.
Depends what you do! I've never found anything quite as good as Visual Studio for C/C++ coding and debugging. I can (and have...) got by with Xcode on macOS and CLion on Linux, but they're just not quite as good. (I did a stint doing Go with Visual Studio Code - I suspect the same would apply to using Visual Studio Code for C++, too.)
The Visual Studio text editor isn't amazing, but it's good enough that I can put up with it (and I can always load a file into Emacs when I need to do something specific), and the code browsing and code completion typically works well out of the box. The main draw is the debugger, which is decent, and, being integrated, the debugging UI gets to reuse the reasonable text editing UI for looking at and navigating source code. The debugging panels are all independent, can be docked anywhere in the window (or left floating), and arranged into tab groups as the situation demands - a big improvement over most other debuggers I've used, which have a bad habit of providing a single debug state panel and/or not letting you lay things out as you see fit.
(Once you leave Visual Studio, the Windows experience isn't always ideal, but you have options, according to personal taste. I have Emacs, Python, git, and the usual GNU tools (git comes with a good set) - and for the stuff I do, that's enough to make the experience pretty consistent whichever OS I'm using.)
Interesting how tastes can vary, I’ve not used visual studio as much but always dread it when I’m forced to. CLion’s debugger has been fine for my uses, although the best debugging experience I’ve ever had was emacs and gdb many windows mode; Worth noting those were for some fairly simple C programs. Radare is also really powerful once you are used to the keybindings. Colleagues tell me cutter is decent as well!
The reason why you should learn Linux, or more broadly Unix, is that if you're going to take software development seriously as a profession for 40+ years of your life, you should have an actual well-grounded understanding of how computing works. It's like asking whether you should learn how to read sheet music if you want to become a musician, or whether you should learn Hanzi if you want to learn Chinese.
Technically you can get away with not learning these things if you wanted to do the bare minimum but if you're gonna be even remotely serious about doing something as your craft just learn the things that underpin what you do. Learning how Unix systems work isn't just learning about a product, you're going to learn about file system paradigms, input/output, kernels, networking, address spaces and all the basics of CS that are everywhere.
If I were hiring, I'd give special preference to Mac and Linux users.
Not because you can't learn Linux on Windows but because using Linux is usually a sign that the person is not afraid of a little discomfort, not afraid to learn more about the computer they use and willing to accept change and even willing to try new things.
Linux represents a different mindset than Windows and for me, it's very easy to tell who loves what he does and who doesn't, who loves a challenge and who doesn't, who can get stuff done in any situation and who can't. All because I used to be a Windows user and I can easily compare the mindset that went into justifying sticking to a difficult OS.
Thank goodness you aren't hiring, since you apparently believe that you can judge a person's abilities based on OS preference. God help your org if they decide to grant you more responsibility.
I switched from windows 7 to Ubuntu and never looked back. Bizarrely I find desktop Linux to be closer to the (very good) windows 7 experience than modern windows is. I detest the nagginess and ads in windows now. Not to mention trying to force Microsoft accounts to use your own damn computer. Meanwhile Apple is slightly better but wants to wall the garden on Mac, scaring people away from using software that apple didn’t approve.
Gnome. It’s fine. It’s not fancy. I’m happy enough with it.
It’s less true now but for a long time installing stock Ubuntu was the lowest hassle way to get a Linux desktop that pretty much worked with everything. They flirted with Amazon ads for a while which I didn’t like. Not crazy about snaps either. But it’s fine and I get my stuff done.
I also just buy boring hardware known to work. First an xps 13 (remember project Apollo?) then a framework.
I have been using Linux on "bare metal" for quite some time, as my primary operating system on the machines I've owned. When I've needed Windows, I've run it in a virtual machine. This has been my MO on my personal machines for over 10 years.
On my most recent machine however, I've opted to do this the other way around. I'm running Windows 11 Pro, running Linux on Hyper-V.
The experience has been.... Fine! I may actually prefer this setup (time will tell). Everything hardware related "just works". As per another thread on HN, Linux does seem to run very well virtualised compared to Windows. People will get riled up about needing an MS account. But I suppose that hasn't bothered me too much yet (who knows I may change my stance on this).
The reality is, messing around with drivers, the Linux wireless stack, display resolutions, firmware updates (ie the nuances of running Linux on a laptop) offers zero value to cloud workloads.
So I don't think running Linux directly on the hardware is an absolute necessity.
You can achieve similar levels of productivity and knowledge uplift if you: reserve Windows use for only things that require Windows (eg Ms office, and zoom meetings with a Bluetooth stack that won't drive you insane), and do EVERYTHING ELSE in your VMs.
I realize that this is by far not the norm, but I spend nearly all of my time on a computer between a browser and a terminal. In the past, I’ve had such a hard time trying to figure out how to use windows that I’ve nearly quit jobs over it.
I hated the wsl experience and had some network issues with it that I couldn’t resolve, making it nearly unusable. I ended up finding m2 in conda, and that was for me, the best compromise. If you’re looking to dabble in Linux command line apps, that’d be my recommendation.
Why? I don't see any signs of that happening. The only real major success stories for Linux in the consumer space, Android and Chrome OS, are effectively (though of course not technically) proprietary(ish).
The Steam Deck's sole purpose is to emulate Windows APIs to run Windows games. It's not a win for Linux, it's an admission that Linux's only value is as a way to avoid paying the Windows tax.
We've heard grandiose claims that proprietary operating systems are going to go away for 25 years now. We will be hearing the same grandiose claims that proprietary operating systems are going to go away someday for the next 25 years as well.
Huh? It brings the sort of functionality that many computer users still use their PCs for over to Linux. If you bring over the ability to have fun and spend leisure time on Linux, more people may make the jump in the future. This in turn puts pressure on software and hardware makers to support Linux.
The Steam Deck user experience, once you leave Steam and the software run from there, is not really that great. It is a different Linux experience, and not actually that great, especially if you are trying to do something like developing stuff.
Sure but most users are not interacting directly with the "Linux" part (and probably they don't have much interest in that anyway).
IMHO Linux is mostly an implementation detail for the Steam Deck/OS. So sure we can put it in more or less the same category as Chrome OS and Android. It's just there to run a proprietary layer/interface.
I don't think anyone is questioning whether that's technically possible, just whether it's an appealing use case for the majority of its target users.
I’ve read here twice recently that Windows Server is “dead.” Will leave that argument to them.
Certainly noticeable that there’s less investment in Windows and less ability to charge for a proprietary OS. That leads to enshittification, which is having a long term significant negative effect.
Perhaps. But these are entirely different segments. OSX server has been dead for many years now (it was hardly even a thing to begin with) yet the OS is doing just fine.
They won't disappear, but for sure the corporations behind them need to sustain huge costs that the FOSS community will never see, and we're starting to see some signs: Linux is free, shows no ads, and doesn't collect user data; Windows costs money, yet it shows ads and collects user data. One would expect the opposite, and I'm glad that it's not the case.
> Can you see my point here? If you do not learn Linux, you will keep hitting a wall on your skill development because Linux is everywhere.
No, I don't. Bash is not limited to Linux.
> But if you use the right approach for when things are broken, it will pay off, In a relatively short time, you will get better, your developing speed will grow and you will have better usage of many tools you use daily as a software developer.
You know how you get better as a developer? By actually developing stuff instead of cosplaying as a Unix graybeard.
It depends what you want to do. If you want to develop native desktop applications or video games you're probably better off doing that on Windows as the addressable market is so much larger.
Web development on Linux makes a lot of sense because it will likely be hosted on Linux, as the article says. But it's not really that big of a deal typically what OS your Python/Node/PHP/Ruby/etc web application is running on as you typically aren't doing much OS specific things. In fact, I would encourage people to keep their web applications as OS agnostic as possible.
It's easier said than done (keep apps OS-agnostic).
I had to explain how AWS Lambdas really run in a Linux VM when a software engineer was perplexed about how /tmp files were sometimes persisted and sometimes weren't (and had to teach them about TMPDIR, the tempfile module in Python, etc.).
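A quick local way to show the TMPDIR part (assuming python3 is on the PATH; the /dev/shm path is just an example):

    python3 -c 'import tempfile; print(tempfile.gettempdir())'                 # typically prints /tmp
    TMPDIR=/dev/shm python3 -c 'import tempfile; print(tempfile.gettempdir())' # now prints /dev/shm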
Cold-starts were also easier to explain to anyone familiar with Linux, though that's more general.
Similarly, Docker permission issues during deployments strike entirely differently if you are used to developing on a Mac (where all file ownership gets translated to UID 0).
"But what you must know is that WSL has a lot of problems when compared to a full Linux OS. Because WSL is basically a virtual machine of Linux running on Windows, it will be a lot slower and memory-consuming"
I think that would be a great foray, yes. If you understand how to poke around the filesystem (cd, ls), manipulate the filesystem (cp, mv, rm, mkdir), how to search (find, grep), and how to edit and save a file (vi/vim {yes there are a plethora of linux text editors, vi is ubiquitous, which is the only reason I learned it}) you’re in really great shape.
The best part is, you can learn this subset of tools in a day, a week at worst.
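If it helps anyone getting started, that whole subset fits on a few lines (the file and directory names below are made up):

    cd ~/projects && ls -la                # move around and list what's there
    mkdir backup && cp app.conf backup/    # make a directory, copy a file into it
    mv old.log archive/ && rm tmp.out      # move a file, delete another
    find . -name '*.conf'                  # locate files by name
    grep -rn 'listen' .                    # search file contents recursively, with line numbers
    vi app.conf                            # edit; press i to type, Esc then :wq to save and quit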
There is nothing Linux-specific in this list; you describe learning UNIX and could do it on any *BSD, including macOS. I would agree that knowing your way around Unix is very helpful; having some specific Linux knowledge is worth very little if your job does not consist of managing Linux machines. And even then you will require a lot of distro-specific knowledge.
Who wants to learn Linux? I’d be happy to teach it if there’s a need. I’m happy to answer any random questions about it. My email address is in my profile.
I have some experience with Linux, but not as much as I want. I want to install Linux on an old laptop, so is there a way to quickly restore the system to some previous point in case of issues?
One (popular) option is using a filesystem that supports snapshots (e.g. Btrfs, or maybe XFS). You can set up your system to take pre- and post-update snapshots, take them manually, or take them on a regular schedule.
If you use GRUB as your bootloader, you can boot into a stored snapshot if something gets borked. You can also recover deleted files from previous snapshots.
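A minimal manual sketch with Btrfs (tools like snapper or Timeshift automate this; the subvolume layout and paths below are hypothetical, and assume / is a Btrfs subvolume):

    # take a read-only snapshot of the root subvolume before an update
    sudo btrfs subvolume snapshot -r / /.snapshots/root-$(date +%F)
    # list existing subvolumes and snapshots
    sudo btrfs subvolume list /
    # pull a single deleted file back out of a snapshot
    sudo cp /.snapshots/root-2024-01-01/etc/fstab /etc/fstab

Booting into a snapshot from GRUB usually goes through a helper such as grub-btrfs rather than stock GRUB alone.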
I might like punishment, but I don’t mind a completely fresh install every once in a while.
The only way to not go insane is to keep notes about what you like to change, why, and how. That way you can get up and running with a fresh system quickly. No need for snapshots and the like.
Some people say that I like to do things the hard way, however.
Immutable distros like Fedora Silverblue, Kinoite, etc don't let you modify the base system at all, and always let you roll back in case an update broke something. They're probably the most reliable Linux systems available today.
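On Silverblue and friends the rollback is a single command, since rpm-ostree keeps the previous deployment around (sketch):

    rpm-ostree status          # shows the current and previous deployments
    sudo rpm-ostree rollback   # make the previous deployment the default for the next boot
    systemctl reboot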
We'd better learn the Linux system interface for most back-end, infra, or even front-end troubleshooting. But you don't need to learn complex shell scripting or the internals. Also, knowing how Linux abstracts communication with the machine is beneficial.
There are people who have no idea how to use sed and will insist you need spark or hadoop for a few gigabytes of data.
'Complex shell scripting' as in 'putting together a pipeline with sed and awk and maybe jq' will save you time and money on both Windows and Mac... if you let it.
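For instance, a hedged sketch of the kind of throwaway pipeline meant here (the file name and column layout are made up):

    # per-user byte totals from a CSV, emitted as JSON objects
    sed 1d usage.csv \
      | awk -F, '{sum[$1]+=$3} END {for (u in sum) print u "," sum[u]}' \
      | jq -R 'split(",") | {user: .[0], bytes: (.[1] | tonumber)}'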
if you are anything besides maybe a strictly frontend web-dev, you should learn Linux (and likely already have)
These skills translate to embedded systems, the inner workings of BSD based systems (playstation, nintendo, Mac) and obviously anything Linux based (linux and android), which is pretty much everything.
There is probably a handful of jobs where you could avoid it, but for the majority, it is a useful skill.
I disagree. First of all, probably 90% of developer jobs fall outside of the categories you mention (embedded and frontend development). Most of those roles don't involve knowing the difference between /sbin and /bin lol
My point was that these skills translate to the BSDs, Mac, the cloud, Android, desktop Linux, etc., not to say that only two categories exist. I think even if you do frontend you would benefit from learning Linux, but it was one of the larger examples of people who could eke by while skipping it. It's also not that hard to do and brings a lot of benefits.
I just bought an Asus Chromebook and it has Debian linux kind of built in. I can apt-get install GUI apps. The employee logs in with their company account (we are a GSuite shop). Vendor-support means there are no driver concerns.
As a company, I would rather employees work in the same environment as the destination of the software.
I can understand Mac workstations and Linux deployment environments, but I can't really understand having Windows workstations with Linux deployment environments. Is it because Windows has better enterprise device management tools?
> Yes, Linux breaks, but only if you do not know what you are doing
Depends on the distro. I used to break Arch all the time. It's fun and teaches a lot, but at some point it gets in the way of productivity. It's a choice.
If you're on Windows and want to try Linux, start with Ubuntu. You can do your daily business for years without breaking anything.
I had my time using Arch, even Gentoo before that. It was fun but I'm done with that kind of "fun".
Run your usual package updates and suddenly networking no longer works... You need to go to the forums to see which package broke and how to work around it, and you're forced to do so before you can do the important job you need to do... Yeah, I prefer to get my fun in different ways now.
I've been using Ubuntu for years and recently switched to Fedora.
If you want to learn linux take an old computer and repurpose it as a headless home server that you depend on (Plex, torrents, etc). You'll be under extreme family pressure to keep it going! Extra points if you use something like Fedora Server since you'll do a system upgrade every ~6 months or so. More opportunities to break it and fix it!
>But what you must know is that WSL has a lot of problems when compared to a full Linux OS. Some packages will not be up to date and sometimes will not be as functional as a Linux distro.
AFAIK it's just a regular linux distro (in a VM), so I'm not sure why it would have different packages?
Every time I’ve tried Linux it’s always been a little clunky. I think it’s come a long way, but recently I loaded Ubuntu on to my old 2013 MacBook Pro and it works pretty well.
What’s driving me crazy is I have all these shortcuts for Mac and I’m trying to figure out the equivalent in Linux. One stupid thing is paste doesn’t work in the terminal… seriously? Why can’t I ctrl-v after ctrl-c from web browser?
The little cursor/word navigation shortcuts are driving me crazy as well like jump word to the right or left, highlight entire line, jump to end of line, etc.
I should probably try the Framework laptop or some other hardware Linux ready though.
I've tried macOS on recent Macs and, boy, it sucks. Inconsistent shortcuts, buggy keyboard switchers (randomly it shows an unlock screen without my QWERTZ layout and I mess up my password), and window management issues all around.
Audio devices randomly hit buffer overflows or something in Google Meet, it randomly mutes and unmutes devices, etc.
Linux ain't perfect either (and really, it was much better ~10 years ago), but it's still so much better than either Windows or MacOS.
Or maybe, just maybe, I am used to it having used it since 90s, and they all have their kludges?
Can it be more inconsistent than copy-paste working differently in the terminal and all other apps, though? Of course it's trivial to change and Linux is more or less fully configurable (e.g. NumLock being broken on KDE for a few years)
Right, so if a new Linux user who uses Ctrl+C/V in all GUI apps finds that they need to paste something into a terminal, that should be perfectly obvious to them?
Or should that person feel confused and annoyed because they don't know how to do a basic operation, just so that "power users" can feel better about themselves? (I can't think of any other reason why someone would think purposefully inconsistent UX is a good thing...)
Also, "Insert" is not a thing on most laptop keyboards. So it would be Ctrl + Fn + Delete (or something like that), which is also perfectly obvious.
"Most" laptops actually do have an Insert key. Perhaps you've been using the wrong laptops? :)
Anyway, my comment was in jest. None of them are perfect, and to a non-Mac user, its imperfections make it "clunky", whereas they are used to imperfections of their go-to systems.
The bugginess one sees when switching to a Mac for the first time is really the same feeling for me as Linux is for you.
I don't agree. This isn't about different behaviours in Linux vs Mac/Windows but rather about Linux apps having different shortcuts for common basic operations in different apps. That's just poor UX.
I think Ctrl-C behaving consistently in a Linux terminal, be it an actual terminal or a GUI-based one, is good behaviour. Ctrl-C for "copy" actually came a long time after Ctrl-C already had a meaning in a text terminal.
For that matter, how does Ctrl-C behave in Windows Command Prompt (if that's still what they call it)? In what way is it inconsistent: with the GUI or with the originating DOS terminal behaviour?
Mac went with Command and Control as separate keys on a keyboard, which avoids this particular issue, but it gets confusing quickly: what's a "command" and what's a "control"? Which key is to switch workspaces, and why is that "Control" and not "Command"? With every single shortcut, I wonder which was the one I need to use, and it's so hard to remember.
I've been using macOS for years (all my employers tend to give me a MBP) and the one thing I never get used to is the lack of magnetic borders for windows.
You can never arrange them perfectly, they're always a bit disorganized.
Or how some windows don't receive a resize signal when you disconnect from a USB-C display and they get resized without being redrawn (Firefox and Chromium, for instance), and you have to resize them manually, which is never trivial if they were full screen or taking half the screen (not all borders are draggable, or something of the sort).
Keeping that as a default is silly though. Intentionally having poor UX that's unnecessarily confusing to new users because of "reasons" is why Linux can never become a credible consumer OS.
The poor UX comes from the browser using Windows shortcuts instead of something that's consistent with the rest of the environment.
Compare this to macOS, where copy and paste are consistently Cmd+C and Cmd+V. More generally, keyboard shortcuts tend to use Cmd. That leaves key combinations with Ctrl to their traditional uses in the terminal. Which is pretty convenient, as the combinations won't randomly change when you ssh to a remote Linux system.
Keeping ctrl-C as the binding for copy is silly though. Intentionally having poor UX that's unnecessarily confusing to power users because of "reasons" is why Windows can never become a credible developer OS.
You mean "Super/Meta + C" is the default for copy on Linux making the whole argument redundant? I don't think that's a default shortcut on Gnome or KDE....
Of course if the developers of Linux/KDE/Gnome/etc. want to make their software unnecessarily confusing to new/less experienced users which would make inconsistent UX an intentional feature then you're right.
> (Ctrl-C is especially common, being "interrupt" or "quit".)
That is same for terminal in macOS. Mac uses cmd-C/V for copy/paste in gui, so it is not in conflict with ctrl in terminal. Which I find nice, but ctrl+shift is sufficient as well.
> Why can’t I ctrl-v after ctrl-c from web browser?
Same reasons numlock being off is still the default and a bunch of other silly defaults (that are of course trivial to change but are still annoying): masochism combined with the need to prove something.
> One stupid thing is paste doesn’t work in the terminal… seriously? Why can’t I ctrl-v after ctrl-c from web browser?
It's because copying from a terminal is ctrl+shift+C because ctrl+C sends an interrupt to the program running in the terminal. So paste is ctrl+shift+V to be consistent with that.
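You can see that binding directly in most Linux (and macOS) terminals:

    stty -a | grep -o 'intr = [^;]*'   # typically prints: intr = ^C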
> One stupid thing is paste doesn’t work in the terminal… seriously? Why can’t I ctrl-v after ctrl-c from web browser?
Going from Mac to Linux, I felt that the shortcuts in Mac were worse.
On Linux, you don't have to reach for a totally different key to copy/paste from the terminal. You just add Shift to the Ctrl +V and Ctrl+ C. On Mac, it's a totally different button to exit CLI programs and do other things.
All other shortcuts are also a bit better on Linux. Ctrl + J for downloads in Chromium. Ctrl+ H for history. Ctrl + Tab to move through tabs. Ctrl + T for new tab. On Mac, they're very inconsistent.
> Going from Mac to Linux, I felt that the shortcuts in Mac were worse.
They're so much better on Mac that it's not even a comparison. Mac shortcuts feel like they're designed by someone who actually types on a keyboard. Using COMMAND, which is naturally reached with a thumb, is so much better than reaching for CTRL. And don't even get me started on idiotic Windows machines having CTRL as the leftmost button, which obliterates your pinky and instantly requires remapping some other key, like CAPSLOCK.
> All other shortcuts are also a bit better on Linux. Ctrl + J for downloads in Chromium. Ctrl+ H for history. Ctrl + Tab to move through tabs. Ctrl + T for new tab. On Mac, they're very inconsistent.
Mac shortcuts are literally consistent across all applications, unlike Linux.
I just gave you real examples of how they're so inconsistent and difficult on Mac.
Linux uses Ctrl for most things, you don't even need to lift the finger up away from Ctrl for most shortcuts. Super key is only used for OS shortcuts. Linux is way more consistent in its application.
On Mac, the distinction between Ctrl, Option and Cmd is arbitrary.
Command is the key on Mac, option adds hidden capabilities and Control was added for whoever wanted to use it. For OS shortcuts I don’t even know if I ever use Control, only Vim.
And it is used for OS shortcuts, which also include the most-used functions like switching windows and copy-pasting, unlike on Windows and Linux.
And just because they’re more “consistent” in your head, even though they’re not, it doesn’t make them any less awful. Using Command key for every OS shortcut is far superior than using mix of Control and Option.
Just one most obvious example: copy and paste between windows. CMD C, CMD TAB, CMD V. Who’s more consistent now?
I'm not sure why you're agitated. You and I are simply stating opinions and preferences.
> I can use the same set of shortcuts across all applications on Mac, such as copy paste as was pointed out earlier, or exit, or preferences, etc.
Literally the same with Linux. There's no difference.
> And just because they’re more “consistent” in your head, even though they’re not, it doesn’t make them any less awful. Using Command key for every OS shortcut is far superior than using mix of Control and Option.
Okay, let's see:
- New Tab: Cmd + T vs Ctrl + T
- Switch Tabs: Ctrl + Tab (Can't use Cmd + Tab) vs Ctrl + Tab
- History in browser: Cmd + Y vs Ctrl + H (H is for History)
- Downloads Page: Cmd + Shift + J vs Ctrl + J
- Skip through words: Opt + Left/Right vs Ctrl + Left/Right
You see how consistent Linux is with Ctrl? You don't need to use other keys most of the time. Everything's fixed. Terminal is literally the only app where the difference comes up, because Ctrl + C is used for SIGINT; that's literally it!
Inconsistency is not an issue for those who have learned the shortcuts. Everybody has trouble adapting to new keyboard shortcuts so that's not the main issue. Mac objectively uses way more keys in shortcuts compared to Linux which is what makes it worse.
> Just one most obvious example: copy and paste between windows.
Always the same, Ctrl + C/V. Except for the terminal, because Ctrl + C is used for SIGINT, even on Mac. So instead of having to invent a whole new key for copy/paste in the terminal, Linux uses an extra Shift, which imo is much better than changing the whole key.
Let's not talk about Cmd + Tab; Alt + Tab on Linux is much better than Cmd + Tab plus Cmd + `. I use the AltTab app on Mac too, it's way better than the default, unless of course one likes to use more keys to switch between windows for no reason.
I use both Mac and Linux for work. It's not me who has issues. I'm able to work on both just fine. I just mentioned why I think Linux is better, because the keys are consistent within the Linux + Windows world and they're much easier to learn. Of course you're gonna have to adjust to the difference when switching to Linux, is that even something to be angry about?
Honestly, it depends on what kind of developer you are. If you are doing mostly web development, I don’t think you will need Linux specifically; Windows or macOS will be just fine. However, if you are doing embedded systems, robotics, etc., then learning it is a must. Sometimes when new engineers join, I tell them to learn Linux. Some start learning it while the majority just ignore it, only to find out a few months later that the embedded/SBC is running Linux and now they have to!
> If you are doing mostly web development, I don’t think you will need Linux specifically
That leads to the nightmare that is Windows servers running custom code.
Nope, if you are doing web development, you should learn Linux too. You are excused if you are doing Windows desktop software, or working on one of those proprietary platforms that only run on Windows. But even on the second case, you should learn Linux so you can look for alternatives, as they are a very degrading experience.
> you will break your Linux many times.... and I had no idea how to fix it.
Yeah, that's exactly what killed it for me. ALSA produced weird sounds... After a long time of trying various things, the solution was "emerge -e @world" (recompile everything).
I will gladly endorse the "learn the root cause why it doesn't work" motto from the article. That will serve everyone well, though be judicious about when to apply it. Some things are not worth it.
> Those fields are C, Unix, and Computer Networking.
Learning C is a dead end, soon to be a nearly illegal dead end (memory-unsafe languages are not conducive to secure software, and laws on software security are being passed).
I took a lot from my time with Linux; bash is great (to me preferable to PowerShell, though the idea of passing objects around is quite appealing), but I now use it mostly in the form of WSL.
I have encountered a lot of weird bugs and I am at the point in my life when I don't want to spend a long time studying an obscure configuration file with a multitude of options (at least they often have explanations and examples... the sound of progress).
I will take my happy abstracted layer (linux docker mostly ¯\_(ツ)_/¯).
Try NixOS. It's the most important advance for Linux in many years. It hides a lot of complexity.
Most systems can be configured with a file that is just a sequence of key-value pairs. For example, enabling PulseAudio is simply: hardware.pulseaudio.enable = true;
Things are rock solid. You can try new things and it's trivial to rollback to previous configurations.
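A hedged sketch of that loop (the file path is the standard NixOS one; the exact option name may differ between releases):

    # /etc/nixos/configuration.nix (fragment)
    #   hardware.pulseaudio.enable = true;
    sudo nixos-rebuild switch              # build and activate the new generation
    sudo nixos-rebuild switch --rollback   # if it misbehaves, activate the previous generation again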
I've had the opposite experience, about "it breaks and you have no idea how to fix it".
On Linux when something breaks, you go to a forum and can understand exactly what's wrong and what's the right way to fix it. If you encounter the problem again, you'll know right away the fix.
On Windows, when something breaks, all the info you get are random incantations, that may or may not work, may or may not create new problems along the way. Sometimes you'll fix it without really understanding why, and sometimes you're stuck with the nuclear option of reinstalling Windows.
I would love to. For me the main issue was that things are unnecessarily cryptic.
Take vim for example, what a hostile tool for a newbie. Reboot machine much?
No offense towards vim, like any tool it’s great when you know it.
Is this the whole sentiment for Linux, it’s great when you know it? Does it help you to get to know it? No!
Yeah, yeah, so much documentation etc. etc.
But if you don’t have a freaking folder under / called "programs" then you failed. Different in every distro? You failed hard.
So if anyone makes this stuff more logical, beginner friendly, I will be among the first to jump ship. Until then I stay with windows and see linux as a means to run containers.
Recently I needed to do some ESP32 development. The project was built on Windows and I wanted to maintain fidelity to the build environment.
So I installed WSL along with all of the rest of what I’m used to like NeoVim and Fish. Made my changes from there, then I opened a Windows command prompt to actually build, which ironically depended on MinGW.
The watch windows are vastly inferior in the kinds of formatted information they can very quickly show, as the program is stepped through. Take a look at "Twenty Minutes of Reasons to Use the RemedyBG Debugger": https://www.youtube.com/watch?v=r9eQth4Q5jg
Haven't tried the Qt or KDE or Jetbrains environments in about 3 years, but last I did, they were much like Visual Studio -- slow, odd latency in displaying formatted values as you step through, lack of easy ways to format data in a human readable way in the watch window.
To be clear, when I say "slow", I mean the GUI is slow, not the underlying debugger, which I presume is plenty fast.
I was doing a fair bit of Windows debugging then, but not a lot since, and it looks like a great option now, having watched some vids in the last 10 minutes.
The issue is that the Visual Studio Debugger used to be good on a few of these features, like fast watch window updating. One of the main benefits of RemedyBG is building a VSD-style experience from a decade or two ago. So I don't fully agree with the implication that Linux is just not fully caught up with relatively new Windows debuggers -- it has basically never been up to date with debuggers from a decade or two ago.
I should say that I never used to use debuggers until I found ones as fast as VSD was a decade or two ago, and I used to primarily dev in Linux since the mid 2000s.
Edit: a decade or two ago is probably too short. I keep thinking the 90s were 10 years ago :)