
fasterthanlime

@fasterthanlime@hachyderm.io

hi, I'm amos! 🦀 I make articles & videos about how computers work 🐻‍❄ cool bear's less cool counterpart ✨ be kind

This profile is from a federated server and may be incomplete. View on the original instance

fasterthanlime, to Random (English)

I messed something up in my asset pipeline and ended up with this beautiful and eerie bear chain

(can you guess what I got wrong?)

fasterthanlime (Edited), to Random (English)

Reported another Zed papercut because it's so good that even small bugs are really noticeable

edit: Already fixed!! Thanks Piotr :)

https://github.com/zed-industries/zed/issues/13419

fasterthanlime OP

@mikkelens it builds for non-macOS platforms already :)

fasterthanlime OP

@misty Yes!! Not only are they generally very responsive, a few folks from Zed Industries also follow me closely because, well, I have a big mouth 😌

fasterthanlime OP

@mikkelens ? They have employees working on it? And a Discord channel for it?

fasterthanlime, to Random (English)

Can someone hardcode in rust-analyzer that Future always comes from std::future::Future and not, like, futures_util::Future or something

fasterthanlime OP

@jplatte

> The Rust 2024 Edition is currently slated for release in Rust 1.82.0

can't wait!

fasterthanlime, to Random (English)

Since when did rustfmt start formatting "calls on tuples" like this:

{
    (
        StatusCode::OK,
        format!("serve transcoded for {hapa} / {bp}"),
    )
    .into_http()
}

And how do I convince it to simply do

{
    (
        StatusCode::OK,
        format!("serve transcoded for {hapa} / {bp}"),
    ).into_http()
}

Instead?
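For what it's worth, one escape hatch is `#[rustfmt::skip]`, which tells rustfmt to leave a function's hand formatting entirely alone (it doesn't answer how to get that chain style globally). A minimal sketch — the `StatusCode` stub and `serve` function below are hypothetical stand-ins for the real types in the post:

```rust
// Hypothetical stub standing in for the real HTTP status type from the post.
struct StatusCode;

impl StatusCode {
    const OK: u16 = 200;
}

// `#[rustfmt::skip]` makes rustfmt leave this function's body exactly
// as written, whatever layout you chose for the tuple and the chain.
#[rustfmt::skip]
fn serve(hapa: &str, bp: &str) -> (u16, String) {
    (
        StatusCode::OK,
        format!("serve transcoded for {hapa} / {bp}"),
    )
}

fn main() {
    let (code, body) = serve("abc", "def");
    println!("{code}: {body}");
}
```

The obvious downside is that the function is then excluded from formatting entirely, so it's a targeted workaround rather than a style setting.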

fasterthanlime OP

@dysfun load-bearing simply

fasterthanlime, to Random (English)

One thing that is cool about LLMs right now is how there's actual competition, the likes of which I haven't seen in other fields lately.

I’ve been enjoying GPT-4o, now Claude Sonnet 3.5 is out and I can just switch to it with one line of config. Until next time.

fasterthanlime OP

I remember reading about how AI companies were currently undercharging for models and would pull the rug later — experimenting with local models, which are much smaller but still useful, has changed my mind about that.

fasterthanlime OP (Edited)

I guess we’ll see over time, but we are carrying in our pockets an amount of computing power that was completely unfathomable at some point in the past, so… we’ll see.

fasterthanlime OP

One thing that hasn’t changed is how polarized people are about LLMs. Everyone is entitled to their opinions, but it is changing things even if you don’t personally use them, so I would recommend getting some first-hand experience to learn more about what they can and cannot do.

fasterthanlime OP

I feel like a lot of people keep seeing pathological cases with poorly written prompts or tasks really not well suited to LLMs, and end up dismissing them as “completely useless”. That’s not true!

fasterthanlime OP

I feel like getting what you want out of an LLM is a good exercise for software engineering types in general.

Coming up with a prompt that is unambiguous and contains all the relevant context is an incredibly valuable skill that obviously translates to human collaboration.

fasterthanlime OP

I’ve gotten good results carefully designing prompts in the Notes app before submitting them to the model.

If you know where you’re going, use technology with safeguards (e.g Rust), and the task fits in the context window, then you get to operate one level of abstraction higher.

fasterthanlime OP

It’s gotten me excited about working on some systems again. It used to be complete drudgery but large language models are excellent at taking care of boilerplate for you.

They end up plugging the developer experience holes in a lot of technologies.

fasterthanlime OP

I think LLMs make “loose” languages less appealing than before. Nowadays, I would rather instruct a model on which Rust code to write for me, and end up with a very fast solution, than hack some Python/JS myself and pay the performance tax (+ maintenance burden) forever.

fasterthanlime OP

Even if models don’t get any bigger or any “smarter”, a big difference compared to even a year ago is how up-to-date the recent models are?

I’ve gotten GPT-4o to port code between two Rust crates just by mentioning them by name.

fasterthanlime OP

Even hallucination in the context of software development is not necessarily a bad thing? When an LLM tries to use an API that doesn’t exist, it’s often a sign that it should exist!

Think “they didn’t know it was impossible so they just did”, junior dev perspective

fasterthanlime OP

I’m just tired of reading that they are either completely useless or going to replace us all.

They’ve become a really useful tool, in my opinion there’s never been a better time for small teams to compete with larger companies. And it’ll drive the rebirth of bespoke software!

fasterthanlime OP

I think we’re still not collectively over the “well if we don’t need to go pump water out of the well ourselves anymore, then what does it even mean to be human??” moment, but we’ll get there.

fasterthanlime OP

Oh btw here's a prompt I used with GPT-4o a few days ago (I was working on my CDN): https://gist.github.com/fasterthanlime/35782ceffb16ba61ac7296114ec52013

(I of course reviewed the integration test but it saved me like 20 minutes of back and forth with the compiler)

fasterthanlime OP

@scherzog yes but no — you're well aware there's different levels of abstractions.

Some folks snicker at C compilers because it's really hard to get them to output the assembly you want 🤷

fasterthanlime OP

@whitequark Yup! I updated the gist with the generated code.

I probably adjusted it a bit after generating, but 95% is still as-is.

fasterthanlime OP

@yerke In one go, I knew it was a big one and wanted to get it right.

fasterthanlime OP

@Eunakria That's interesting. Do you have any further reading on the topic?

fasterthanlime OP

@dluz Yeah, I agree the next few months will be crucial to the space and I'll be keeping a close eye on this.

fasterthanlime OP (Edited)

@__head__ Like another answer mentioned, it's delicate when you lack the requisite skills to validate whether a solution makes sense or not, but I would recommend using chat interfaces to even know what to search for.

fasterthanlime, to Random (English)

💪 I finally took the time to report my current-most-annoying @zeddotdev issue: pasting changes indentation

(feel free to leave a 👍 if you've noticed this too!)

https://github.com/zed-industries/zed/issues/13338

thejpster, to Random (English)

Watching Buffy the Vampire Slayer (1992)

fasterthanlime

@thejpster Nice!

fasterthanlime, to Random (English)

this is probably just me / my fault in some way but it is, to me, really funny

fasterthanlime OP

oh wow no, they're returning 520 (TIL about this status code) and also using astro dot build???

fasterthanlime OP (Edited)

OH MY FUCKING GOD, Cloudflare docs were missing CSS because they were in the middle of a deploy and the Cloudflare* folks still haven't figured out atomic deploys????

CSS cachebust was Dx13URD1 (which returned 520 for MINUTES, and now 404). New one is DxT3URD1 (which returns 200)

fasterthanlime OP

Also why are those two so close??

  • Dx13URD1
  • DxT3URD1

I had to do a double take, thinking I messed up the copy-paste.

No way a truncated "classic" hash (think: SHA1/SHA256) would do that — what are they doing for cache-busting? So many questions.
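To illustrate the intuition: any well-mixed content hash has the avalanche property, so a one-byte edit flips roughly half the output bits and two successive deploys essentially never produce near-identical tokens. A minimal sketch using std's DefaultHasher — not cryptographic, and certainly not what Cloudflare actually does, just enough to show the behavior:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Derive a short cache-busting token from file contents.
// A well-mixed hash changes wildly on a one-byte edit (avalanche effect),
// unlike the two near-identical tokens in the post above.
fn cachebust(contents: &[u8]) -> String {
    let mut h = DefaultHasher::new();
    contents.hash(&mut h);
    // Truncate to 32 bits and render as 8 hex characters.
    format!("{:08x}", h.finish() as u32)
}

fn main() {
    println!("{}", cachebust(b"body { color: red }"));
    println!("{}", cachebust(b"body { color: blue }"));
}
```

(Real asset pipelines typically truncate SHA-256 of the file instead; the point is only that neighboring versions shouldn't share 7 of 8 characters.)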

fasterthanlime OP

@cwalkatron Yes, very entertaining to find Cloudflare in the top results for "HTTP 520"

fasterthanlime OP

On second thought, don't tell me. I don't want to know.

Everyone says not to make my own blog engine, but at least I don't have to deal with... whatever is going on over there.

fasterthanlime OP

@onelson @misty y'all don't deserve CSS and you're GROUNDED

fasterthanlime OP

@erika Good to know!! Edited my post.

fasterthanlime, to Random (English)

With my apologies to the Parisians of the TL (or not)
https://hachyderm.io/@fasterthanlime/112645162644857554

mcc, to Random (English)

Me, stepping out of the Paris Metro: Aha, now that I am outdoors and the risk of COVID is reduced, I can remove my N95 mask and appear Socially Normative to the people around me.

Me, 3 seconds later, after removing my mask: Holy crap has the smog in Paris always been this bad

fasterthanlime

@mcc "Your city has the smell of a handful of regurgitated cloves, but curiously lacks their virtue"

fasterthanlime, to Random (English)

Who needs a bundler when you've got ✨ global scope ✨

fasterthanlime OP

@raymondwjang @jonny You’re both grounded

fasterthanlime, to Random (English)

I'm looking at parcel's "engines" field — what's a reasonable browser target nowadays? Still Chrome 80? Aren't we at Chrome 243 or smth? https://parceljs.org/features/targets/

fasterthanlime (Edited), to Random (English)

When you use the phrase “let’s be honest”, do you mean “let’s be honest with each other” or do you mean “let’s be honest with ourselves”?

fasterthanlime OP

@malwareminigun
each other = I will be candid with you
ourselves = I will admit hard truths about myself (to myself)

fasterthanlime OP

@malwareminigun yes! our two own selves, rather than the collective selves

fasterthanlime OP

@malwareminigun
> the UML designer
whatever it is, it's macabre

(but I suppose, both?)
