Hacker News

Hopefully I never need to review code written with these definitions. It's an awful idea to do this.

The first section shows why some languages have no 'typedef': introducing another layer of aliases is just a bad idea. It's confusing, and it makes the code look like basically a new language. Just use the standard names like everyone else instead of redefining your language. This style is almost as bad as '#define begin {'.
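For anyone who hasn't seen it, '#define begin {' is the classic Pascal-in-C obfuscation. A minimal sketch (the macro names are the traditional ones, not from the article under discussion):

```c
/* Pascal-style brace aliases -- the preprocessor happily accepts this,
   which is exactly why it's considered an anti-pattern: the code no
   longer looks like C to anyone else reading it. */
#define begin {
#define end   }

static int add(int a, int b)
begin
    return a + b;
end
```

It compiles fine; the problem is purely for the human reader, which is the commenter's point.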

Many of the other defines are obfuscations or language changes -- this coerces C into something else. I wouldn't want to read code written with this; it heavily violates the principle of least astonishment (POLA).

(As a side note, I don't understand the #define for sizeof. The sizeof operator already yields size_t -- that's practically size_t's definition -- so what is this for?)



> As a side note, I don't understand the #define for sizeof.

Their `size` type is signed. It's `ptrdiff_t`, not `size_t`.
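A sketch of what the trick presumably looks like, assuming the names from this thread (a signed `size` aliased to `ptrdiff_t`); the inner `sizeof` survives because a macro never re-expands its own name:

```c
#include <stddef.h>

typedef ptrdiff_t size;               /* signed "size" type, per the comment above */
#define sizeof(x) ((size)sizeof(x))   /* inner sizeof is the real keyword; the
                                         macro does not recurse into itself */

/* With plain size_t this comparison is false, because -1 converts to
   SIZE_MAX; with the signed cast it is true. Same-looking code, different
   result -- the "subtly different" part. */
static int looks_positive(void)
{
    return sizeof(int) > -1;          /* expands to ((size)sizeof(int)) > -1 */
}

static int looks_positive_unsigned(void)
{
    return (size_t)sizeof(int) > -1;  /* cast back to unsigned: false again */
}
```

Note the macro only fires on the function-like form `sizeof(x)`; a bare `sizeof x` still yields size_t, which makes the rewrite even more uneven.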


Ah, thanks! So code will look normal, but be subtly different. This is even worse than I thought!




