But also because companies that produce web content wanted it to be seen by humans who would look at ads, not consumed by bots and synthesized with other info into a product owned by some other firm.
And yet today most websites are being scraped by LLM bots, which don't look at ads and which synthesize the content with other info into a product owned by some other firm.
Optimistically, the semantic web is going to happen after all. But instead of the original plan, where website owners willingly make their data machine-readable, it'll be LLMs turning non-machine-readable data into machine-readable data (which user agents can then process), even if the website owner would prefer you looked at the ads instead.
Because web content is produced by humans, not engineers.