I installed Arch on my desktop in 2009 and it has just been cloned from one disk to another through a multitude of PCs. Sure, there were occasional troubles, like the upgrade from SysV init to systemd, when KDE Plasma 4 was released, or the time I had to run a custom kernel and Mesa to support the AMD Vega 56 card about a month after its release.
But nowadays I haven't had a single breakage for several years, my RX 6800 GPU was well supported 3 months after release, and most things just work... BTW, I also run Arch on my home server; in 6 years it has had literally zero issues.
Ok wow! This is really impressive. I couldn't even keep Windows or Debian or something like that running for 15 years, yet you managed to do it with Arch. May I ask what the main reason was for keeping this Arch installation for so long? Were you just too lazy to reinstall, or are there other factors?
And reinstalling the packages, moving over all the configs, setting up the partitions, and moving the data over? (Not in that order, of course)
Cloning a drive just requires you to plug both the old and new drives into the same machine, boot up (preferably from a live image to avoid issues), run a command, and wait until it finishes. Then maybe fix up the fstab and reinstall the bootloader, but those are things you'd need to do to install the system anyway.
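A rough sketch of that clone-and-fix flow. The device names (sdX old, sdY new) and the UUIDs are made up, and the destructive dd/grub commands are left commented so nobody pastes them blindly; the sed demo runs on a throwaway file, not the real /etc/fstab:

```shell
# From a live image, with old disk /dev/sdX and new disk /dev/sdY (hypothetical names):
#   dd if=/dev/sdX of=/dev/sdY bs=4M status=progress conv=fsync   # byte-for-byte clone
# A raw clone preserves filesystem UUIDs, so fstab usually keeps working. If you made
# fresh filesystems instead, swap the UUID in /etc/fstab -- demo on a throwaway copy:
OLD_UUID="1111-OLD"; NEW_UUID="2222-NEW"    # stand-ins for real blkid output
printf 'UUID=%s / ext4 rw,relatime 0 1\n' "$OLD_UUID" > /tmp/fstab.demo
sed -i "s/UUID=${OLD_UUID}/UUID=${NEW_UUID}/" /tmp/fstab.demo
cat /tmp/fstab.demo   # the root entry now points at the new filesystem
# Finally, reinstall the bootloader from a chroot into the new root, e.g. for GRUB:
#   arch-chroot /mnt grub-install /dev/sdY
#   arch-chroot /mnt grub-mkconfig -o /boot/grub/grub.cfg
```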
I think the reason you'd want to reinstall is to save time, or to get a clean slate without any past config mistakes you've already forgotten about. I've done it for that very reason, especially since it was still my first, less experienced, install.
Well, not really; cloning is much easier than reinstalling and then configuring everything again...
I've had LVM set up from the start, so usually I just copy the /boot partition to the new disk; the rest is in an LVM volume group, so I just pvmove from the old disk to the new one, fix the bootloader and fstab UUIDs, and I'm ready to reboot from the new disk, without ever leaving my running system; no live USB needed or anything. (Of course I messed it up the first few times, so I had to fix it from a live OS.)
But once you know all the quirks, I can be up and ready on a new drive within 20 minutes (depending mainly on the pvmove), with all my stuff preserved and set up.
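The pvmove dance described above, as a command sketch. The volume group name (vg0), partition layout, and device names are all hypothetical; don't paste this verbatim on a real system:

```shell
# Copy the standalone /boot partition to the new disk (names are made up):
#   dd if=/dev/sdX1 of=/dev/sdY1 bs=4M conv=fsync
# Bring the new disk into the existing volume group, then drain the old PV.
# All of this works online, on the running system:
#   pvcreate /dev/sdY2            # initialize the new disk as a physical volume
#   vgextend vg0 /dev/sdY2        # add it to the volume group
#   pvmove /dev/sdX2 /dev/sdY2    # migrate all extents off the old disk (the slow part)
#   vgreduce vg0 /dev/sdX2        # drop the old disk from the group
#   pvremove /dev/sdX2            # wipe the LVM label
# LV UUIDs are preserved by pvmove, so fstab entries pointing at LVs keep working;
# only /boot and the bootloader need their UUIDs fixed up before rebooting.
```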
There was no real reason to reinstall; it works fine. Occasionally I've had to purge or edit some config files in home for some apps after major version changes, but most have worked for years. I mean, my mplayer config is from 2009 and was last edited 4 years ago...
The thing I hate about the "value your time" argument is that windows is shit.
Let's be generous for a minute and assume that Windows and Linux have the same number of problems. Someone who has been on Windows for the past 30 years has 30 years of acquired knowledge and will probably know how to solve a problem quickly on Windows, but not on Linux. Someone who has been on Linux for the past 30 years has 30 years of acquired knowledge and will probably know how to solve it quickly on Linux, but not on Windows.
So the entire argument is just "I have muscle memory tied to Windows, and I already know how to solve those problems, but I don't know how to solve the Linux ones, so they take me a lot of research and time to solve, therefore all Linux problems always take a lot more time to solve".
On Windows, I have to spend time fighting BSODs, figuring out where to download software that isn't bloated with viruses, and running registry hacks to get rid of Start menu ads and to stop Microsoft from phoning home. I don't have to do any of that on Linux.
On Linux, my biggest issue today was figuring out how to change the keybinding for taking a screenshot... And that was an easy issue, but it's also not even possible on Windows.
So I guess different types of problems. My "wasted" time is customizing my OS/environment so it works the way I want it to, not trying to fight back any ounce of control.
I really don't get these memes. In about 9 years of daily use on multiple systems, nothing ever broke beyond a number of failed pacman updates, all of which could be fixed within minutes, and, in the early years, having to restart the system every couple of months because it stopped recognizing USB devices (after many rounds of updates, mind you). I've had more frequent trouble with Windows. How did Arch get this bad rep?
The X server has to be the biggest program I've ever seen that doesn't do anything for you.
Ken Thompson
I see Wayland's flaws, but X is such a bloated piece of barely maintainable spaghetti code that it's sadly beyond saving, with no real prospects for significant improvement.
The real question is whether you use -git variants. Which is another way of making Arch (and Gentoo) certainly not free as in free beer, especially if you live in Europe and have to deal with those outrageous energy prices. BTW, IMO one should be suspicious of projects with a long tagged-release cadence, since it's usually a sign of technical debt and a cue to look for alternatives.
Same here. 10 years on my laptop and it broke only once: I accidentally closed the terminal where the initramfs was being installed. So, my mistake. I could fix it using an Arch install image on a USB stick and my knowledge of how to install the system, since I had done it myself, by hand.
Well, in this particular case it's the initramfs tooling's fault for not being designed for all-or-nothing atomicity (an operation either completes fully or not at all). Which you can work around with a terminal multiplexer, BTW, since a session can be re-attached later in cases like that.
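A minimal sketch of that workaround. The session name (mkinit) is made up; the tmux lines are the real-world fix and are left commented since they need an interactive setup, while the setsid line shows the same new-session mechanism with a stock util-linux tool:

```shell
# Run the long, must-not-be-interrupted job inside tmux:
#   tmux new-session -d -s mkinit 'mkinitcpio -P'   # job lives in its own session
#   tmux attach -t mkinit                           # watch it; detach with Ctrl-b d
# If the terminal window or SSH connection dies, the session survives and can be
# re-attached later -- closing the terminal no longer kills the initramfs rebuild.
# Under the hood it's just a separate session, the same thing setsid creates:
setsid sh -c 'echo "job finished untouched" > /tmp/detach.demo'
cat /tmp/detach.demo
```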
That's because Arch is very old, and back in the day it was prone to breakage. Ironically, it is now much more stable and easy to maintain than an Ubuntu derivative, but people will still recommend Mint to beginners for some reason.
Well, in my experience it was openSUSE Tumbleweed or Manjaro that were significantly less stable, but perhaps my perception is a little skewed since I use Artix, and it's not too rarely the bloated, tightly coupled nature of shitstemd that causes some of Arch's issues.
IDK, I've found Gnome unusable for a long time. I tried to make up for it with extensions for a while, but every release would unapologetically break something I found essential, and the extension devs would give up trying to keep them going.
I understand that eventually they got better about dropping breaking changes without warning, because extension devs were leaving in droves, but by that point KDE had gotten good again with Plasma, and I've never looked back.
Gnome's vision is to be a completely hands-off, dumbed-down, unbreakable DE for the lowest common denominator. Judged in that light, I guess it's a success. It's the default in a lot of distros because it's low-maintenance for packaging and support. Frankly, I think it's a major reason for the slow uptake of the Linux desktop, but what do I know.