GhostFence ,

Eh, autism jokes are not funny. Ableism isn't funny.

the_seven_sins ,

Ever wondered why ${insert_proprietary_software_here} takes so long to boot?

CosmicCleric ,

The problem I have with this meme post is that it gives a false sense of security, when it should not.

Open or closed source, human beings have to be very diligent and truly spend the time reviewing others' code, even when their project leads are pressuring them to work faster and cut corners.

This situation was a textbook example of how that does not always happen. Granted, duplicity was involved, but still.

GamingChairModel ,

100%.

In many ways, distributed open source software presents more social attack surface, because the system itself is designed to be distributed, with a lot of people each handling a different responsibility. Almost every open source license includes an explicit disclaimer of warranty, with language that says something like this:

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.

Well, bring together enough dependencies, and you'll see that certain widely distributed software packages depend on the trust of dozens, if not hundreds, of independent maintainers.
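To put a number on "dozens, if not hundreds": count the distinct packages in any modern lockfile. A rough sketch for a Rust project (Cargo.lock is just one example; an npm package-lock.json tells the same story), where every entry is a crate, and behind it a maintainer or team, that you're implicitly trusting:

```rust
// Rough measure of how many upstream parties a single Rust project
// trusts: each [[package]] entry in Cargo.lock is a crate in the build
// graph (the count also includes your own workspace members).
use std::fs;

fn main() -> std::io::Result<()> {
    let lock = fs::read_to_string("Cargo.lock")?; // run from the project root
    let packages = lock.matches("[[package]]").count();
    println!("packages in the build graph: {packages}");
    Ok(())
}
```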

This particular xz backdoor reached sshd through systemd (distro-patched sshd links libsystemd, which in turn links liblzma), via a social engineering attack on a weak point in the dependency chain. And this particular type of social engineering (maintainer burnout, looking for a volunteer to take over) seems to fit more directly into open source culture than into closed source/corporate development culture.

In the closed source world, there might be fewer places to probe for a weak link (socially or technically), which makes certain types of attacks more difficult. In other words, it might truly be the case that closed source software is less vulnerable to certain types of attacks, even if detection/audit/mitigation of those types of attacks is harder for closed source.

It's a tradeoff, not a free lunch. I still generally trust open source stuff more, but let's not pretend it's literally better in every way.

5C5C5C ,

There are two big problems with the point that you're trying to make:

  1. There are many open source projects run by organizations with as much (and often stronger) governance over commit access as a private corporation has over its closed source code base. The most widely used projects tend to fall into this category: Linux, React, Angular, Go, JavaScript, and innumerable others. A project's governance model is a very reasonable thing to consider when deciding whether to take it on as a dependency for your application or library. There's a fair argument that the governance model of this xz project should have been flagged sooner, and hopefully this incident will raise broader awareness of that. But unlike with a closed source code base, you can actually know the governance and commit-access model of open source software. With closed source software you don't know anything about the company's hiring practices, background checks, what access it might grant to outsourced agents from other countries who may be compromised, etc.

  2. You're assuming that 100% of the source code in a closed source project was developed by that company, according to the company's governance model, which you assume is a good one. In reality, BSD/MIT-licensed (and illegally included GPL-licensed) open source software gets shoved into closed source code bases all the time. The difference with closed source software is that you have no way of knowing when this is the case. For all you know, some intern already shoved a compromised xz into some closed source software you're using, and since that intern is gone now, it will be years before anyone in the company notices that their software has a well-known backdoor sitting in it.

Draegur ,

I feel like the mental gymnastics should end with a rake step.

veganpizza69 ,

It's about the complex rationalizations used to create excuses (pretexts).

The original is this:

https://lemmy.world/pictrs/image/dbbc96fa-c180-472b-ab49-9ae079dc479c.webp

johannesvanderwhales ,

Alright, I won't argue with that specific version's point, but this is basically a template for constructing a strawman argument.

CodexArcanum ,

I've gotten back into tinkering on a little Rust game project; it has about a dozen dependencies on various math and gamedev libraries. When I go to build (just like with npm in my JavaScript projects), cargo needs to download and build just over 200 crates. Three of them build and run "install scripts", which are themselves just Rust programs. I know this because my anti-virus flagged each of them and I had to allow them through so my little roguelike would build.
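For anyone unfamiliar: those "install scripts" are Cargo build scripts, a build.rs at the crate root that Cargo compiles and runs on your machine, with your user's privileges, before building the crate. A minimal sketch of what one looks like (the generated constant is just a made-up example):

```rust
// build.rs — Cargo compiles and runs this on the host at build time.
// Nothing restricts it to "configuring the build"; it is an ordinary
// Rust program running with the same privileges as `cargo build`.
use std::env;
use std::fs;
use std::path::PathBuf;

fn main() {
    // Legitimate use: generate a source file into Cargo's OUT_DIR.
    let out_dir = PathBuf::from(env::var("OUT_DIR").expect("OUT_DIR is set by cargo"));
    fs::write(out_dir.join("generated.rs"), "pub const ANSWER: u32 = 42;\n")
        .expect("failed to write generated code");

    // Nothing technical stops the same script from reading ~/.ssh,
    // phoning home, or patching the sources it is about to compile.
    println!("cargo:rerun-if-changed=build.rs");
}
```

Which is presumably why the anti-virus flags them: from the OS's point of view they are unknown, freshly built executables doing arbitrary work.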

Like, what are we even suppose to tell "normal people" about security? "Yeah, don't download files from people you don't trust and never run executables from the web. How do I install this programming utility? Blindly run code from over 300 people and hope none of them wanted to sneak something malicious in there."

I don't want to go back to the days of hand-chiseling every routine into bare silicon, but I feel like there must be a better system we just haven't devised yet.

Killing_Spark ,

Debian actually started to collect and maintain packages of the most important Rust crates. You can use those as a source for cargo.
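If I remember the setup correctly, the packaged crate sources get installed (via the librust-*-dev packages) under /usr/share/cargo/registry, and you point cargo at them with source replacement in .cargo/config.toml. The exact path and source name below are assumptions, so check your distro's docs:

```toml
# .cargo/config.toml — replace crates.io with the locally packaged sources.
[source.crates-io]
replace-with = "debian-packaged"

# A directory source: the crates Debian has packaged, unpacked on disk.
[source.debian-packaged]
directory = "/usr/share/cargo/registry"
```

The tradeoff is that you only get the crates (and versions) Debian has reviewed and packaged, which is rather the point.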

JustEnoughDucks ,

Researchers have found a malicious backdoor in a compression tool that made its way into widely used Linux distributions, including those from Red Hat and Debian.

https://arstechnica.com/security/2024/03/backdoor-found-in-widely-used-linux-utility-breaks-encrypted-ssh-connections/

Killing_Spark ,

Yeah, they messed up once. It's still miles better than not having anyone look at the included stuff at all.

GhostFence ,

You'd think this would be common sense...

corsicanguppy ,

those from Red Hat

Not the enterprise stuff; just the beta mayflies.

wolf ,

THIS.

I do not get why people don't learn from Node/npm: if your language has no exhaustive standard library, the community ends up reinventing the wheel, and every real-world program ends up with hundreds (or thousands) of dependencies.

Instead of throwing new features at Rust, the maintainers should focus on growing a trusted standard library and improving the tooling, but that is less fun, I assume.

areyouevenreal ,

I thought they already had decent tooling and standard libraries?

areyouevenreal ,

Can you give some examples of things missing from the Rust standard library?

wolf ,

Easily, just look at the standard libraries of Java/Python and Golang! :-P

To get one thing out of the way: every standard library has dark corners with bad APIs and outdated modules. IMHO it is a tradeoff, and in my experience even a bad standard library works better than everyone reinventing their own small modules. If you want to compare it to human languages: having no standard library is like agreeing on English grammar while everyone mostly makes up their own words, which makes communication challenging.

My examples of items missing from the Rust standard library (correct me if I am wrong; I'm not a Rust user, for many reasons):

  • Cross-platform GUI library (see Swing/Tk)
  • Enough building blocks to create a server
  • Full set of data structures and algorithms
  • Parsers/serializers for the common formats (XML/JSON/YAML/CSV/INI)
  • Production-ready HTTP(S) server with support for Let's Encrypt etc.

Things I don't know whether the Rust standard library provides:

  • Go-like communication channels
  • High-level parallelism/async constructs (like Tokio etc.)

My point is to provide good-enough defaults in a standard library that everybody knows and that is well documented and taught. If someone has special needs, they can always write a library. Further, if something in the standard library becomes obsolete, it can easily be deprecated.
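For reference, Rust's std does already cover a couple of the question marks above: std::sync::mpsc gives you Go-style channels, and raw TCP lives in std::net. Everything HTTP-level and above (TLS, async runtimes, serialization) currently comes from crates like tokio, hyper, or serde, which is exactly the line being argued about. A minimal std-only sketch (the port number is arbitrary):

```rust
// A tiny TCP echo server using nothing outside the standard library.
// This is roughly where std stops: HTTP parsing, TLS, and an async
// executor would all have to come from third-party crates.
use std::io::{Read, Write};
use std::net::TcpListener;

fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("127.0.0.1:7878")?;
    for stream in listener.incoming() {
        let mut stream = stream?;
        let mut buf = [0u8; 1024];
        let n = stream.read(&mut buf)?; // read one chunk from the client
        stream.write_all(&buf[..n])?;   // echo it straight back
    }
    Ok(())
}
```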

corsicanguppy ,

Like, what are we even suppose [supposed] to tell “normal people” about security? “Yeah, don’t download files from people you don’t trust and never run executables from the web. How do I install this programming utility? Blindly run code from over 300 people and hope none of them wanted to sneak something malicious in there.”

You're starting to come to an interesting realization about the state of 'modern' programming and the risks we saw coming 20 years ago.

I don’t want to go back to the days [...]

You don't have to trade convenience for safety, but having worked in OS Security, I would recommend it.

Pulling in random stuff you haven't validated should feel really uncomfortable as a professional.

CameronDev ,

To be fair, we only know about this one. There may well be other open source backdoors floating around undetected. Was Heartbleed really an accident?

squaresinger ,

The only real downside on the open source side is that the fix is also public, and with it the recipe for exploiting the backdoor.

If there's a massive CVE on a closed source system, you get a super high-level description of the issue and that's it.

If there's one on an open source system, you get ready-made proofs of concept on GitHub that any script kiddie can run.

And since not every piece of software can be updated instantly, you are left with millions of vulnerable servers and PCs and a lot of happy script kiddies.

See, for example, Log4Shell.

DemSpud ,

bUt gUyS WhAt aBoUt sEcUrItY ThRoUgH ObScUrItY??

squaresinger ,

hEy, yOu lEaRnEd A bUzZwOrD aNd rEcEnTlY dIsCoVeReD tHe sHiFt KeY!!!! cOnGrAtS!?!
