Hacker News | HendrikHensen's comments

Can you explain it for those out of the loop?

If this is to be a real, (relatively) widely-used language, I would make some tough choices on where to innovate, and where to just leave things the same.

One thing I noticed in the example is `num target`, especially because the focus is on "clarity". When I read the example, I was sure that `num` would be something like the JavaScript `Number` type. But to my surprise, it's just a 64-bit integer.

For an extremely long time, languages have had "int", "integer", "int64", and similar. If you aim for clarity, I would strongly advise keeping those names and not inventing new words for them just because. Both because of familiarity (most programmers coming to your language will already be familiar with other languages that have "int(eger)"), and because of clarity ("int(eger)" is unambiguous, a well-defined term meaning a whole number; "num" is ambiguous, and "number" can mean any type of number, e.g. integer, decimal, imaginary, complex, etc).

The clearest option is fully explicit data types, e.g. `int64` (signed), `uint64` (unsigned), `int32`, etc.


[author here] That’s a good point. I can see why int might be clearer than num, especially given the long history of that naming. I’ll think about it.

Definitely int for signed numbers. But I would call it "int64".

Clarity means saying what you mean. The typename int64 could not be clearer that you are getting 64 bits.

This is consistent with renaming your num32 to "int32".

And it would remain consistent if you later add smaller or larger integers.

This also fits your philosophy of letting the developer decide and getting out of their way. I.e. don't use naming to somehow shoehorn in the "standard" int size, even if you would often be right. Let the developer make a conscious decision.

Later, "int" could be a big integer, with no bit limit. Or the name will be available for someone else to create that.

I do like your approach.

(For unsigned, I would call them "nat32", "nat64", if you ever go there. I.e. unsigned int is actually an oxymoron. A sign is what defines an integer. Natural numbers are the unsigned ones. This would be a case of using the standard math term for its standard meaning, instead of the odd historical accident found in C. Math is more universal, has more lasting and careful terminology - befitting universal clarity. I am not a fan of new names for things for specialized contexts. It just adds confusion or distance between branches of knowledge for no reason. Just a thought.)
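The signed-vs-natural distinction drawn above is visible in Rust today, whose unsigned types already behave like naturals. A quick illustrative sketch (mine, not from the thread):

    fn main() {
        // Unsigned types model natural numbers: they cannot go below zero,
        // so a subtraction that would underflow simply has no result.
        let n: u32 = 3;
        assert_eq!(n.checked_sub(5), None);

        // Signed integers, by contrast, happily cross zero.
        let i: i32 = 3;
        assert_eq!(i.checked_sub(5), Some(-2));

        println!("ok");
    }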


Thank you and others for articulating this naming (and semantic) problem so well.

I've just shipped int64, float64, and removed num32, float, etc.


I would recommend outright copying Rust.

Among other things, it's a systems programming language and hence its naming scheme is largely (if not entirely) compatible with modern C++ types.

I.e.:

    +----------------+-------------------------+------------------------------+
    | Rust           | Modern C++              | Notes                        |
    +----------------+-------------------------+------------------------------+
    | i8             | std::int8_t             | exact 8-bit signed           |
    | u8             | std::uint8_t            | exact 8-bit unsigned         |
    | i16            | std::int16_t            | exact 16-bit signed          |
    | u16            | std::uint16_t           | exact 16-bit unsigned        |
    | i32            | std::int32_t            | exact 32-bit signed          |
    | u32            | std::uint32_t           | exact 32-bit unsigned        |
    | i64            | std::int64_t            | exact 64-bit signed          |
    | u64            | std::uint64_t           | exact 64-bit unsigned        |
    | i128           | (no standard type)      | GCC/Clang: __int128          |
    | u128           | (no standard type)      | GCC/Clang: unsigned __int128 |
    | isize          | std::intptr_t           | pointer-sized signed         |
    | usize          | std::uintptr_t          | pointer-sized unsigned       |
    | f32            | float                   | IEEE-754 single precision    |
    | f64            | double                  | IEEE-754 double precision    |
    | bool           | bool                    | same semantics               |
    | char           | char32_t                | Unicode scalar value         |
    +----------------+-------------------------+------------------------------+
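As a sanity check of the widths in the table, the bit count really is right in the Rust type name (illustrative snippet, not from the comment):

    use std::mem::size_of;

    fn main() {
        // iN / uN are exactly N bits wide.
        assert_eq!(size_of::<i8>(), 1);
        assert_eq!(size_of::<u16>(), 2);
        assert_eq!(size_of::<i32>(), 4);
        assert_eq!(size_of::<u64>(), 8);
        assert_eq!(size_of::<i128>(), 16);

        // isize/usize track the target's pointer width.
        assert_eq!(size_of::<usize>(), size_of::<*const u8>());

        // char is a 32-bit Unicode scalar value, like C++ char32_t.
        assert_eq!(size_of::<char>(), 4);

        println!("widths check out");
    }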

Noticed the part where the exact instructions from the Readme were followed and it didn't work?


So we're down from implying the compiler didn't work to a missing or unclear description of a dependency in a README (note that following the instructions worked for others).


Good for you. But there are already so, so many posts and threads celebrating all of this. Everyone is different. Some of us enjoy the activity of programming by hand. This thread is for those of us, to mourn.


You're still allowed to program by hand. Even in assembly language if you like.


People are allowed to mourn the music styles of previous decades even though the same genres are still being created, just not popular the way they used to be.


Imo, it's theoretically allowed, as a hobby, but not really as a professional practice. This is what the blog is about.


There are literally still programmers who make their living writing assembly code by hand for embedded systems.


> You're still allowed to program by hand.

In fact, you'll probably be more productive in the long term.


I have an LLM riding shotgun and I still very much program by hand. It's not one extreme or the other. Whatever I copy from the LLM has to be redone line by line anyway. I understand all of my code because I touch every line of it.


Thanks, ChatGPT.


The GPT found this and thought it was relevant: "an introduction of library operating system for Linux" - https://lwn.net/Articles/637658/


All I can think about is how much power this takes, how many un-renewable resources have been consumed to make this happen. Sure, we all need a funny thing here or there in our lives. But is this stuff really worth it?


Luckily we live in a society where it's OK to use power for personal pleasure, such as running an A/C in the summer, which accounts for much more electricity use than LLM inference.

https://www.eia.gov/tools/faqs/faq.php?id=1174&t=1


[flagged]


> U.S. data centers consumed 183 terawatt-hours (TWh) of electricity in 2024, according to IEA estimates. That works out to more than 4% of the country’s total electricity consumption last year – and is roughly equivalent to the annual electricity demand of the entire nation of Pakistan. By 2030, this figure is projected to grow by 133% to 426 TWh.

https://www.pewresearch.org/short-reads/2025/10/24/what-we-k...

There are ~10M cows nationally. The average energy consumption is ~1000 kWh/cow annually. Summing up, the entire dairy industry consumes ~10TWh. That is less than 10% of the national data center energy burn. [edit: was off by a factor of 10]
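Sketching that arithmetic (all figures are the comment's estimates, not independently verified):

    fn main() {
        let cows = 10_000_000.0_f64;  // ~10M dairy cows nationally
        let kwh_per_cow = 1_000.0;    // ~1000 kWh per cow per year
        let dairy_twh = cows * kwh_per_cow / 1e9; // 1 TWh = 1e9 kWh
        assert!((dairy_twh - 10.0).abs() < 1e-9);

        let datacenter_twh = 183.0;   // IEA estimate for US data centers, 2024
        let share = dairy_twh / datacenter_twh;
        assert!(share < 0.10);        // under 10% of the data center burn
        println!("dairy: {:.0} TWh, {:.1}% of data centers", dairy_twh, share * 100.0);
    }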


Not to mention dairy cows store chemical energy for human consumption, so we got some of the energy invested back.


> One dairy operation uses more resources than all the datacenters in the united states

citation for this claim?

https://www.pewresearch.org/short-reads/2025/10/24/what-we-k...

> U.S. data centers consumed 183 terawatt-hours (TWh) of electricity in 2024, according to IEA estimates. That works out to more than 4% of the country’s total electricity consumption last year – and is roughly equivalent to the annual electricity demand of the entire nation of Pakistan. By 2030, this figure is projected to grow by 133% to 426 TWh.


lol what? Can you please cite some sources for this claim?


The actual energy usage is probably not a big deal comparatively. But the attention / economic energy is absolutely a big deal and an increasingly farcical one.

I think the market is just waiting for the next Big Thing to come around (crypto, VR, etc.) and the attention obsession will move on.


Trivial in the grand scheme of things. There are much larger problems to attend to - if worrying about the cost and impact of AI tokens was a problem, we'd be living in a utopia.

Literally pick any of the top 100 most important problems you could have any impact on, none of them are going to be AI cost/impact related. Some might be "what do we do when jobs are gone" AI related. But this is trivial- you could run the site itself on a raspberry pi.


I think this is a strange, and honestly worrying, stance.

Just because there are worse problems, doesn't mean we shouldn't care about less-worse problems (this is a logical fallacy, I think it's called relative privation).

Further, there is an extremely limited number of problems that I, personally, can have any impact on. That doesn't mean that problems that I don't have any impact on, are not problems, and I couldn't worry about.

My country is being filled up with data centers. Since the rise of LLMs, the pace at which they are being built has increased tremendously. Everywhere I go, there are these huge, ugly, energy and water devouring behemoths of buildings. If we were using technology only (or primarily) for useful things, we would need maybe 1/10th of the data centers, and my immediate living environment would benefit from it.

Finally, the site could perhaps be run on a Raspberry Pi. But the site itself is not the interesting part, it's the LLMs using it.


I don't think it's odd at all- having taken a deep look at the potential impact and problems surrounding AI, including training and datacenters, I've come to the conclusion that they're about as trivial and low ranking a problem as deciding what color seatbelts should be in order to optimize driving safety. There are so many more important things to attend to - by all means, do the calculus yourself, and be honest about consumed resources and environmental impacts, but also include benefits and honest economics, and assess the cost/benefit ratio for yourself. Then look at the potential negatives, and even in a worst case scenario, these aren't problems that overwhelm nearly any other important thing to spend your time worrying about, or even better, attempting to fix.


It’s odd that people seem to be so against the AI slop in particular, because energy and water and whatnot. I’m fairly sure video games eat a lot more power than AI slop and are just as useless. So is traveling - do people truly need to fly 3000 miles just to see some mountains? Why do people demand food they like when you’d survive just fine off of legumes and water?

> Everywhere I go, there are these huge, ugly, energy and water devouring behemoths of buildings.

Everywhere you go? Really?

The water consumption is minor, btw. Electricity is more impactful but you’d achieve infinitely more advocating for renewables rather than preaching at people about how they’re supposed to live in mudhuts.


I land here: it's probably not the best, most useful thing to spend electricity and compute on, but in order to compel people to spend it on what I consider to be optimal, you'd have to make me dictator, and there are a million other people who have equally strong and well reasoned opinions about where those resources should be spent, and if you're going to be fair about resource allocation, you inevitably end up with something that looks and works like a marketplace. None of them can ever be perfect, so you aim for reasonable and fair, and push for incremental improvements to the fairness over time. You gotta be realistic about least and lesser evils, and have gratitude and appreciation for the genuine good, and be extremely pragmatic about the measure and rate of progress. Things are pretty damn good - not utopian or optimal, but pretty damn good. And getting better, 3 steps forward, 2 steps back, consistently, decade over decade.


> I’m fairly sure video games eat a lot more power than AI slop and are just as useless

What makes you so sure? I'm fairly sure they eat a fraction of what AI slop does and are much more useful.


I'm under the impression LLMs don't generally work that well on an RPI, and I'm guessing that's what the GP is referring to.


Evidence of European or misinformation


You are consuming non-renewable resources by reading this on your device and posting a comment for your entertainment.

At least with Moltbook, it is an interesting study for inter-agent communications. Perhaps an internal Moltbook is what will pave the path towards curing cancer or other bleeding-edge research.

With your comment, you are just wasting non-renewable resources just for your brain to feel good.


Why though? On Mac, I have tons of unsaved work: many TextEdit windows which keep their state for many months, even through reboots. And it has been working like that for at least 10 years. It's such a simple, little quality-of-life thing. And Microsoft just doesn't care.

This is what a computer should be doing: helping the user to get their work done, without the user having to worry about insignificant details about saving files. E.g. does Google Docs ever ask where to save a file before closing the browser or shutting down the computer? No you just get an untitled document that is automatically saved. If I want to rename it or save it in a different location, I am free to do so. But as long as I don't, it doesn't get in the way and just persists stuff automatically.


I don't disagree, but you have to know which applications reliably keep their state across restarts. You can't blindly rely on it on any desktop system. The Microsoft Office applications have actually auto-saved documents for a couple of years now, even though the recovery UX can be a bit awkward.

What Microsoft doesn't care about is that you may have applications running that don't do that, when Windows reboots for updates.


On macOS the feature is baked into the OS's APIs, the app developer just opts into using them. If they don't, quitting with unsaved work will prompt the user modally, and block the restart to the point where the OS will timeout the reboot process and give up. The only way to purposefully lose unsaved work in almost every app I've ever used on macOS is to yank the power cable or hold the power button down.

Window locations and app state are written to plist files, again using OS libraries and APIs for app resume. Sometimes I can reboot my Mac and not even realize it happened; it all comes back the way it was.


The blocking happens on Windows as well, except that the timeout logic is the reverse: it force-quits the applications then, because presumably the potential security update is more important.


Yep. On Mac (and Linux, actually) I know of some applications that do that. I also know that on Windows most applications don't. I would also never leave unsaved work open on Windows.

I was replying to: "The fact that you leave unsaved work overnight is the actual crazy part". As long as you know which apps auto-save and know you can somewhat rely on them, it's not so crazy.


> Why though?

> Microsoft just doesn't care.

So you know why. Also, Macs have other apps besides textedit, do all of them preserve unsaved docs across restarts?

> what a computer should be doing

Ok, but the discussion is about reality


> Macs have other apps besides textedit, do all of them preserve unsaved docs across restarts?

Every Mac app I’ve ever used does.

I don’t really care though, I reboot at most once every six months


Most stuff on my mac seems ok. The clunkiest is the Microsoft software - Word and Excel but even that sort of works.


[flagged]


Of course, everyone has their own workflow. I won't tell anyone to adjust their workflow. But the exact point I was trying to make is that it's not random apps. It's specific apps that one knows about and how they behave. And once you know those apps (like TextEdit, Google Docs, etc) you can pretty much rely on it to survive reboots and power outages.


More and more desktop apps are just becoming websites. More and more desktop apps are using Electron rather than some native app. Windows is slowly becoming a dumpster fire in terms of usability and issues. Most games these days Just Work on Linux without any tinkering.

I hardly think this year will be "the year of the Linux desktop" or whatever, but if these trends keep going, I really foresee Linux market share growing, slowly, each year, until it's not so microscopic anymore.


Honestly, it feels like straight up plagiarism. When I saw the title, I thought I knew which website was posted because I had seen it before. When I clicked, I saw an unfamiliar website and was surprised that it was posted 3 days ago rather than a couple months ago.

The contents are so similar, that it cannot be coincidence. It really seems like the author of this blog simply plagiarized the strangestloop post without referring to it at all...


Same thoughts here. I gave it the benefit of the doubt, thought it might be an adaptation for a specific field, or an extension of thought, or maybe a fun twist or something.

This is a tasteless copy.

