Hacker News | knallfrosch's comments

They A/B test titles. You can see it in the URL, where the losing title often lives on. They may also use different titles for print and digital.

Google/Apple already know where you and your mistress live. If you pay for any service, they've got your identity too. Ever had a single shipping confirmation land in your email? They know who you are.

The hardware providers already have the information. You only need to make them reveal it to third parties.


We should be banning groups from collecting age-related information, not requiring it. And we definitely should not be forcing companies to share that information with third parties.

All adults prove their identity multiple times per month: every time they access digital health records, or when they use any electronic payment.

Just make Google/Apple reveal part of that data (age > x years) to websites and apps.

Boom, done. Privacy guarded. Easy.
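A minimal sketch of what such an age attestation could look like. Everything here is hypothetical: the key, the token format, and the HMAC shared-secret scheme are stand-ins for whatever real signature scheme Google/Apple would use (in practice an asymmetric signature, so sites can verify without holding any secret). The point is only that the site learns a single boolean, not a birth date or identity.

```python
import hashlib
import hmac
import json
import time

# Hypothetical platform signing key; a real deployment would use an
# asymmetric keypair so relying sites never hold a secret.
SECRET = b"platform-signing-key"

def issue_attestation(age: int, threshold: int) -> dict:
    """Platform side: attest only 'age > threshold', nothing else."""
    claim = {"over": age > threshold, "threshold": threshold,
             "iat": int(time.time())}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def verify_attestation(token: dict) -> bool:
    """Relying site: check the signature; learn only the boolean."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])

token = issue_attestation(age=34, threshold=18)
print(token["claim"]["over"], verify_attestation(token))  # → True True
```

Note the website never sees the raw age, only a signed `over`/`threshold` pair it can verify.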


>Every time they access digital health records, or when they use any electronic payment.

This is the internet.

Among those who were very familiar with it, the smartest money never started doing things like that.


The problem is obvious: People spend much more attention on cat videos from strangers than on their own friends' posts. Ads turn this attention into money.

It's about attention. You can check the schedule without thinking about messages, likes, or the news.

My solution is based on the 12.48-inch Magic Ink Calendar:

https://github.com/speedyg0nz/MagInkCal

A 12.48-inch Waveshare e-ink display costs $175. Sadly, I haven't gotten it to work with the Raspberry Pi Zero, so I can't run it battery-powered. I've got an ugly cord right now. Running power to the right place through the walls is definitely dedication!
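The core of a setup like MagInkCal is just rendering a month grid and pushing it to the display. A minimal sketch of the rendering step, using only the standard library (plain text standing in for the e-ink bitmap MagInkCal actually draws):

```python
import calendar
import datetime

def month_grid(year: int, month: int) -> str:
    # TextCalendar lays out the same Monday-first month grid that a
    # wall-calendar display would show, just as plain text.
    cal = calendar.TextCalendar(firstweekday=calendar.MONDAY)
    return cal.formatmonth(year, month)

today = datetime.date.today()
print(month_grid(today.year, today.month))
```

From there, a real build would draw this grid (plus fetched events) onto an image and send it to the panel via the vendor's e-ink driver.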


From the viewpoint of a security clearance, the employee is the enemy.

That's the point though. Without the conviction that they're doing something righteous, the testers wouldn't actually abuse their victims. Or they would, accidentally or intentionally, spill the secrets.

But if you make even the instruction material lie, then there is nothing that could be leaked and "expose" the system.


I always thought the workings of polygraphs were common knowledge.

It's fiction. Analysts get scared and preemptively avoid doing anything wrong. Analysts admit to things they'd never confess otherwise. The agency gets to show who's in charge. It creates a legal fiction that allows you to abuse your employees. It creates a fiction that the abusers themselves can believe in.

Why should belief in a non-working polygraph be any weaker than belief in a nonexistent god?


You could start by not buying an always-on AI device. Just saying.

(The article is an AI ad.)

