areyouevenreal

@areyouevenreal@lemm.ee

This profile is from a federated server and may be incomplete. View on the original instance

areyouevenreal ,

Fragmentation is only an issue if you run an HDD.

areyouevenreal , (Edited)

It's called memory training. Disabling it will hurt stability, performance, or both, so I really wouldn't bother. Just use sleep mode if time is of the essence. I also wouldn't unplug your machine from the wall: if it stays powered, a lot of systems will skip the training.

areyouevenreal ,

Are you an experienced Linux user?

areyouevenreal ,

Windows boot times aren't nearly that bad actually.

areyouevenreal ,

You could also just press the power button at the GRUB screen, assuming you have one, obviously.

areyouevenreal ,

I thought the point of LTS kernels is that they still get patches despite being old.

Other than that, though, you're right on the money. I don't think they know what the characteristics of a microkernel are. They seem to mean that a microkernel can't have all the features of a monolithic kernel; what they fail to realise is that this might actually be a good thing.

areyouevenreal ,

I don't think a microkernel will help with zombies.

areyouevenreal ,

But generally microkernels are not solution to problems most people claim they would solve, especially in post-meltdown era.

Can you elaborate? I am not an OS design expert, and I thought microkernels had some advantages.

areyouevenreal ,

They are planning to port Cosmic DE, and have already ported several applications from Cosmic, including the file manager and text editor, if I remember correctly.

areyouevenreal ,

They have already ported apps from Cosmic though.

areyouevenreal ,

How do you even manage that much storage? Btrfs or snapraid or something?

Who needs Skynet (English)

A meme in the "IQ bell curve" format. On the left, stupid wojak says "If we don't stop AI, it will destroy humanity", while thinking about rogue robots from Terminator. On the right, sage wojak also says "If we don't stop AI, it will destroy humanity", but he's thinking about massive energy requirements and carbon emissions associated with AI. In the middle, average intelligence wojak is in favour of AI: "Noooo AI will make our lives easier, we can automate so many tasks. Only a few more years and we'll achieve AGI, just wait and see. Surely this time a couple exajoules of energy spent on training will do the trick."
areyouevenreal ,

Technology is a product of science. The facts science seeks to uncover are fundamental universal truths that aren't subject to human folly. Only how we use that knowledge is subject to human folly. I don't think open source or open-weights models are a bad use of that knowledge. Some of the things corporations do are bad or exploitative uses of that knowledge.

areyouevenreal ,

That's not at all what I am doing, or what scientists and engineers do. We are all trained to think about ethics and to seek ethical approval, because even if knowledge itself is morally neutral, the methods used to obtain that knowledge can be truly unhinged.

Scientific facts are not a cultural facet. A device built using scientific knowledge is also a product of the culture that built it. Technology stands between objective science and subjective needs and culture. Technology generally serves some form of purpose.

Here is an example: heavier-than-air flight is a possibility because of the laws of physics. A Boeing 737 is a specific product of both those laws of physics and of US culture. Its purpose is to get people and things to places, and to make Boeing the company money.

LLMs can be used for good and ill. People have argued they use too much energy for what they do. I would say that depends on where you get your energy from. Ultimately though it doesn't use as much as people driving cars or mining bitcoin or eating meat. You should be going after those first if you want to persecute people for using energy.

areyouevenreal ,

What exactly is there to gain with AI anyways? What’s the great benefit to us as a species? So far its just been used to trivialize multiple artistic disciplines, basic service industries, and programming.

The whole point is that much like industrial automation it reduces the number of hours people need to work. If this leads to people starving then that's a problem with the economic system, not with AI technology. You're blaming the wrong field here. In fact everyone here blaming AI/ML and not the capitalists is being a Luddite.

It's also entirely possible it will start replacing managers and capitalists as well. It's been theorized by some anti-capitalists and economic reformists that ML/AI and computer algorithms could one day replace current economic systems and institutions.

Things have a cost, many people are doing the cost-benefit analysis and seeing there is none for them. Seems most of the incentive to develop this software is if you would like to stop paying people who do the jobs listed above.

This sadly is probably true of large companies producing big, inefficient ML models as they can afford the server capacity to do so. It's not true of people tweaking smaller ML models at home, or professors in universities using them for data analysis or to aid their teaching. Much like some programmers are getting fired because of ML, others are using it to increase their productivity or to help them learn more about programming. I've seen scientists who otherwise would struggle with data analysis related programming use ChatGPT to help them write code to analyse data.

What do we get out of burning the planet to the ground? And even if you find an AI thats barely burning it, what’s the point in the first place?

As the other guy said, there are lots of other things using way more energy and fossil fuels than ML. Machine learning is used in the sciences to analyse things like the impacts of climate change. It's useful enough in data science alone to outweigh the negative impacts. You would know about this if you had ever taken a modern data science module. Furthermore, since data centres primarily run on electricity, it's relatively easy to move them to green sources of energy compared to, say, farming or transport. In fact some data centres already run primarily on green energy. Data centres will always exist regardless of AI and ML anyway; it's just a matter of scale.

areyouevenreal ,

Think, genuinely and critically, about what it means when someone tells you that you shouldn’t judge the ethics and values of their pursuits, because they are simply discovering “universal truths”.

No scientist or engineer has ever said that, as far as I can recall. I was explaining that even for scientific facts, which are morally neutral, how you get there is important, and that scientists and engineers acknowledge this. What you are asking me to do is based on a false premise and a bad understanding of how science works.

And then, really make sure you ponder what it means when people say the purpose of a system is what it does.

It both is and isn't. Things often have consequences alongside their intended function, like how a machine gets warm when in use. It getting warm isn't a deliberate feature; it's a consequence of the laws of thermodynamics. We actually try to minimise this because it wastes energy. Even things like fossil fuels aren't intended to ruin the planet; that's a side effect of how they work.

areyouevenreal ,

The purpose of a system is, absolutely, what it does. It doesn’t matter how well intentioned your design and ethics were, once the system is doing things, those things are its purpose. Your waste heat example, yes, it was the design intent to eliminate that, but now that’s what it does, and the engineers damn well understand that its purpose is to generate waste heat in order to do whatever work it’s doing.

Huh? Then why is so much money spent on computers to minimize energy usage and heat production? This is perhaps the biggest load of bullshit I think I have heard in a long time. Maybe there is some concept similar to this, but if so you clearly haven't articulated it well.

Anyway I think I am done talking about this with you. You are here to fear-monger over technology you probably don't even use or understand, and I am sick of lemmings doing it.

areyouevenreal ,

In most countries, primaries don't exist at all. Getting to choose who represents a given party is a luxury.

areyouevenreal ,

HDR is awesome if you have the right hardware. I've never seen a movie look so good. Someone needs to get HDR working.

areyouevenreal ,

Other systems like ChromeOS and Silverblue do atomic updates in the background and then switch on next restart. No waiting at screens like this. Heck even the conventional Linux update system, while far from foolproof, doesn't require waiting like this.

areyouevenreal ,

Fairly often, if it wasn't for the whole fast startup thing, which isn't present in Linux land. I would say at least every couple of weeks, which is good enough for updates.

areyouevenreal ,

No, Windows doesn't do atomic updates in the background; that's why there is the whole "installing updates" screen on reboot or shutdown.

areyouevenreal ,

You vastly misunderstand both what I am talking about, and how updates work on both Windows and Linux.

On Linux you don't press shut down and then get a blue updating screen that stops you from doing anything. Go and update a Linux system and you will see what I am talking about. You run it just like a normal command or program.

Also yes they update the files on the drive while the system is running.

areyouevenreal ,

People argue that systemd is too much like Windows NT. I argue that Windows NT has at least a few good ideas in it. And if one of those ideas solves a problem that Linux has, Linux should use that idea.

It's actually closer to how the macOS init system launchd works than to anything in Windows. macOS is arguably closer to true Unix than Linux is, so I don't think the Unix argument is a good one to use anyway.

areyouevenreal ,

This isn't actually true. They offer both glibc and musl these days. Glibc is the normal one most Linux distros use. Musl doesn't work with some things, but is still desirable to some people for various reasons. Flatpak could be used to work around this, as it should pull in whatever libc the program needs. Distrobox would also work. Though again, this only applies if you're using the musl libc version.

Another potential sore point is not using systemd init. There are some things dependent on systemd, though generally there are packages which act as a replacement for whatever systemd functionality is needed.

I still have no idea what's wrong with Void's fonts though. You are on your own there!

areyouevenreal ,

Cinnamon isn't that lightweight. You will probably find KDE uses less resources.

areyouevenreal ,

Have you missed the top comment? This isn't actually Copilot; it's from 5 years ago. In other words, fake news.

areyouevenreal ,

See the top comment. This is from 5 years ago, not actually Copilot.

areyouevenreal ,

It's a repost from Reddit. Doesn't matter what your filter is set to.

areyouevenreal ,

A hormone in the human body that's typically present in higher concentrations in females than males. Often given as HRT to trans women and post-menopausal women.

areyouevenreal ,

They all support two monitors (one internal and one external for MacBooks, and two external for desktops). It's not an artificial restriction. Each additional monitor needs a framebuffer, and that's an actual circuit that needs to be present in the chip.

areyouevenreal ,

Not necessarily. The base machines aren't that expensive, and this chip is also used in iPads. They support high-resolution HDR output. The higher the number of monitors, resolution, bit depth, and refresh rate, the more bandwidth is required for display output and the more complex and expensive the framebuffers are. Another system might support 3 or 4 monitors, but not support 5K output like the MacBooks do. I've seen Intel systems that struggled to even do a single 4K 60 FPS output until I added another RAM stick to make it dual channel. Apple does 5K output. Like sure, they might technically support more monitors in theory, but in practice you will run into limitations if those monitors require too much bandwidth.

Oh yeah, and these systems also need to share bandwidth between the framebuffers, CPU, and GPU. It's no wonder they didn't put 3 or more very high resolution framebuffers into the lower-end chips, which have less bandwidth than the higher-end ones. Even if it did work, the performance impact probably isn't worth it for a small number of users.

areyouevenreal ,

Not really. There is a compromise between output resolution, refresh rate, bit depth (think HDR), number of displays, and overall system performance. Another computer might technically have more monitor outputs, but it probably sacrificed something to get there, like resolution, HDR, power consumption or cost. Apple is doing 5K output with HDR on their lowest-end chips. Think about that for a minute.

A lot of people like to blame AMD for high idle power usage when they are running multi-monitor setups with different refresh rates and resolutions. Likewise I have seen Intel systems struggle to run a single 4K monitor because they were in single-channel mode. Apple probably wanted to avoid those issues on their lower-end chips, which have much less bandwidth to play with.

areyouevenreal ,

Well yeah, no shit, Sherlock. They could have done that in the first generation. It takes four 1080p monitors to equal the resolution of one 4K monitor. Apple, though, doesn't have a good enough reason to support many low-res monitors. That's not their typical consumer base, who mostly use Retina displays or other high-res displays. Apple only sells high-res displays, and the display in the actual laptops is way above 1080p. In other words, they chose quality over quantity as a design decision.

areyouevenreal ,

Yeah people don't get that they are trading output quantity for output quality. You can't have both at the same time on lower end hardware. Maybe you could support both separately, but that's going to be more complex. Higher end hardware? Sure do whatever.

areyouevenreal ,

Sigh. It's not just a fricking driver. It's an entire framebuffer you plug into a USB or Thunderbolt port. That's why they are more expensive, and why they even need a driver.

A 1080p monitor has one quarter of the pixels of a 4K monitor, and the necessary bandwidth scales with the number of pixels. Apple chooses instead to use the bandwidth they have to support two 5K or 6K monitors, rather than, say, 8 or 10 1080p monitors. That's a design decision that they probably thought made sense for the product they wanted to produce. Honestly I agree with them for the most part. Most people don't run 8 monitors, very few have even 3, and those that do can just buy the higher-end model or get an adapter like you did. If you are the kind of person to use 3 monitors, you probably also want the extra performance.
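To put rough numbers on the bandwidth point, here is a minimal back-of-the-envelope sketch in Python (my own illustration, not anything Apple publishes): the raw pixel data rate is roughly width × height × refresh rate × bits per pixel, ignoring blanking intervals and link compression like DSC.

```python
# Rough, uncompressed pixel data rate: width * height * refresh * bits per pixel.
# Real display links add blanking and protocol overhead, so treat these as lower bounds.

def raw_bandwidth_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int = 30) -> float:
    """Raw pixel bandwidth in Gbit/s (default 30 bpp = 10-bit colour, i.e. HDR)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160), "5K": (5120, 2880)}.items():
    print(f"{name}: ~{raw_bandwidth_gbps(w, h, 60):.1f} Gbit/s at 60 Hz, 10-bit colour")
```

That works out to roughly 3.7 Gbit/s for 1080p, 15 Gbit/s for 4K, and 26.5 Gbit/s for 5K, so a single 5K 60 Hz stream costs about as much raw bandwidth as seven 1080p streams.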

areyouevenreal ,

It's not just about Retina displays. High res and HDR aren't uncommon anymore. Pretty much all new TVs anybody would want to buy will be 4K. It has to support the Apple 5K display anyway, because that's one of their products.

As we've discussed, two external displays are supported on the new MacBook base models. It was a bit of an oversight on the original, sure, but that's been fixed now.

Also, the same SoCs are used in iPads. It's not Mac-only. I can't imagine wanting three displays on an iPad.

areyouevenreal ,

You can't just fly directly from point to point in a commercial aircraft. The airspace has routes and waypoints you have to follow. Smaller planes don't always have to, but big planes almost always do. Altitude is one of the determining factors.

areyouevenreal ,

I rarely have WiFi issues on Linux. At least not with internal WiFi cards. USB ones can sometimes be a problem, but not often.

areyouevenreal ,

???

Look up the definition of immutable, and then look up what an immutable Linux distro is.

We were specifically talking about immutable Linux OSes/distros.

areyouevenreal ,

I never said the objective was to learn Linux. The objective is to use it.

You can get terminal access on a smartphone. I can give you a screenshot or two if you like. You can also run Linux apps on ChromeOS now. From my understanding it leverages containers, a technology baked into the modern Linux kernel.

No, it's not pedantic; you are using a No True Scotsman fallacy. That being said, I don't think using Linux Mint is any harder than using and maintaining Windows, especially if you don't have existing knowledge of either. macOS is easier to use than either Linux Mint or Windows.

areyouevenreal ,

You can create controlled Linux environments fairly easily. Heck, just disabling root access gets you halfway there.

areyouevenreal ,

Is your bar seriously advertising? The mental gymnastics going on here is crazy. Yes, it's a No True Scotsman fallacy. It doesn't matter whether someone born in Scotland calls themselves Scottish or not; legally speaking they are still Scottish. That's how that works.

A child might not bother with the terminal on Linux Mint either. Does that mean Linux Mint isn't a real Linux now?

You also severely underestimate what children are capable of. I installed my first Linux distro and taught myself HTML and CSS while still in primary school. I don't think you have any idea what children can do. I could outclass most adults with technology by the time I was 16 or so.

areyouevenreal ,

Then what do you mean?
