
I mean, being a bit glib here, but a lot of programming is dealing with someone else's type system.

Moreover, for those of us who write C fairly often, the mnemonics here are familiar.

Actually, as custom type systems go, this one is pretty elegant. Reminds me of Rust.



Speaking of type systems, I read glib as g-lib a few times and tried to understand how you were talking about the GNU lib in that sentence.


I did too. Humans use context to resolve ambiguities in language, and in this case the context was very much statistically favouring the library; if you're using it, glib is literally "someone else's type system".


Wouldn't that have to be `glibc`?


It is unfortunate that the two have such similar names because there's a lot of room for confusion. It doesn't help that they have somewhat adjacent functionality.



I see. Thanks for that link!


Yeah, but not for basic types. Also, most code mingles sooner or later with other code. Then this is just ugly.


Rust made the correct choice: things used most often should be assigned the shortest names. This "Huffman encoding" style is what natural languages have evolved toward as well. In 2023, if I were to write C, and didn't have existing guidelines to adhere to, I'd most probably introduce the same typedefs as the author here has done.
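The article's exact typedefs aren't reproduced in this thread; a sketch of the scheme being discussed, using the common `u8`/`i32`-style names (which are a widespread convention, not necessarily the author's), might look like:

```c
/* Short fixed-width aliases in the "Huffman encoding" style discussed
   above: the most frequently used types get the shortest names.
   They are plain aliases for the <stdint.h> types. */
#include <stdint.h>

typedef uint8_t  u8;
typedef uint16_t u16;
typedef uint32_t u32;
typedef uint64_t u64;
typedef int8_t   i8;
typedef int16_t  i16;
typedef int32_t  i32;
typedef int64_t  i64;
typedef float    f32;
typedef double   f64;

/* Example use: a trivial byte-sum checksum written with the aliases. */
u32 checksum(const u8 *data, u64 len) {
    u32 sum = 0;
    for (u64 i = 0; i < len; i++)
        sum += data[i];
    return sum;
}
```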


It’s better than dealing with needlessly long type names like uint32_t though


uint32_t isn't that long, and is quite clear what type it means.


No. It would be better if the standardization groups had done that, but not when every developer has a different scheme.


It’s not great, but they’re just aliases, so they’re interchangeable. That means you can keep everything consistent within a project and it won’t cause any problems when interacting with outside code.
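Because a typedef creates an alias rather than a distinct type, the short names and the standard ones mix freely at API boundaries. A minimal sketch (the `u32` alias and `add_std` function are illustrative, not from the article):

```c
#include <stdint.h>

/* Project-local alias: the same type as uint32_t, not a new one. */
typedef uint32_t u32;

/* An "outside" API written in terms of uint32_t. */
uint32_t add_std(uint32_t a, uint32_t b) { return a + b; }

/* Project code using the short alias; values pass to the uint32_t
   API and back with no casts, because the types are identical. */
u32 add_two(u32 x) { return add_std(x, 2); }
```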


Until you include a header written by someone with the same opinion, and now you get compile errors because they both defined 'u8'.

I gotta be honest, all of those style suggestions look good until you try them in a non-solo and non-isolated project, and then you see what a mess you created.

We've all been there, as C programmers, and we've all done that in the past, which is why we don't do it anymore.


Unless they were defined to completely different types, that shouldn’t be an error


> Unless they were defined to completely different types, that shouldn’t be an error

In this case it almost certainly will be - after all, the blog post's `byte` is defined as char, which could be signed or unsigned. A correct typedef for `byte` is `uint8_t`, so it's almost guaranteed that this will conflict.

Which is why I said it's best not to redefine the primitive types - you're almost certain to conflict with someone else who defined it differently.
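To make the disagreement in this sub-thread concrete: since C11, redeclaring a typedef with the *same* type is legal, but redeclaring it with a different type is a compile error. A small sketch (the `u8` definitions are hypothetical stand-ins for two headers' versions):

```c
#include <stdint.h>

typedef uint8_t u8;   /* one header's definition */
typedef uint8_t u8;   /* identical redeclaration: allowed since C11 */

/* A header that instead did `typedef char u8;` would fail to compile
   here, because plain char is always a distinct type from unsigned
   char, even on platforms where char is unsigned:

typedef char u8;      // error: conflicting types for 'u8'
*/
```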


Long type name?? uint32_t is literally 8 characters long. There's not much to cut here.


How about... u32?


*gasp* blasphemy! C programmers literally can't have nice things


Unsigned 32 bit IEEE 754 floating point, right?


Sure, that's totally readable and meaningful.


It’s longer than it needs to be when you’re typing it out so many times


What are we trying to optimize, the number of characters to type or clarity/readability?


Both I guess? I don’t think the extra length helps with readability at all


if you like typing _, go ahead, but it's not one key press and shouldn't be counted as a single keystroke like the letter a...


On my keyboard layout it's one keypress. And since code is read about 100 times as much as it's written, I don't particularly care about reaching 250 WPM while writing code. The difficulty of writing code is thinking about it, not actually physically writing it.


Integer types are very common so it does still get a bit tedious. I guess it causes clutter when reading too



