New protocol proposal for indie web: human.json
For the past week or so, indie web circles have been talking about a new protocol proposal by Beto Dealmeida for a human.json file.
Beto describes the background as:
One of the problems with the internet today is that a lot of the content is AI generated. There's no way to know for sure if a site is maintained by a real human or if it's just slop. The only way to know for sure is by getting to know the authors, which usually takes time and requires developing a relationship with them through other channels, like email or social media. But what if we could expand that trust by building a web of vouches between sites?
I really like these kinds of ways of verifying identity and building trust between people: not through corporations destroying our privacy, nor through governmental proof of identity, which has problems of its own.
For me, it’s not crucial that someone is who they claim to be in some absolute sense. Rather — for example in the case of Mastodon verification — if I know you through your website, it’s enough verification for me to know that the person who runs that website is the same person who runs the Fediverse account.
The same kind of thinking applies to the human.json idea. The way it works is that you put a human.json file on your website and then add a link to it in your HTML:
<link rel="human-json" href="/human.json" />
Inside that file, you state which version of the protocol it’s written for and what domain/subdomain/subpath it covers. Here’s mine:
{
  "version": "0.1.1",
  "url": "https://hamatti.org"
}
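To make the coverage idea concrete, here's a small Python sketch that parses such a file and checks whether a given page falls under the declared url. The prefix-matching rule is my assumption for illustration; the actual protocol spec may define coverage differently.

```python
import json
from urllib.parse import urlparse

def covers(declared_url: str, page_url: str) -> bool:
    """Check whether a human.json's declared url covers a given page.

    Assumption: same host, and the declared path is treated as a
    prefix (so a declared subpath covers everything beneath it).
    """
    declared = urlparse(declared_url)
    page = urlparse(page_url)
    if declared.netloc != page.netloc:
        return False
    return page.path.startswith(declared.path)

doc = json.loads('{"version": "0.1.1", "url": "https://hamatti.org"}')
print(covers(doc["url"], "https://hamatti.org/blog/"))  # True: same host
print(covers(doc["url"], "https://example.com/"))       # False: different host
```
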
Additionally, you can vouch for other websites. For example, if you have a file in your site and you want to vouch for me, you can add:
{
  "vouches": [
    {
      "url": "https://hamatti.org",
      "vouched_at": "2026-03-18"
    }
  ]
}
to your human.json.
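Since vouches link sites to each other, a trust network is just the set of sites reachable from one you already trust. As a sketch of that idea, here's a breadth-first walk over a collection of already-fetched human.json documents. This is an illustration under my own assumptions, not the algorithm the official tooling uses, and the example sites are hypothetical.

```python
from collections import deque

def trust_network(seed: str, documents: dict) -> set:
    """Collect every site reachable from `seed` via vouches.

    `documents` maps a site URL to its parsed human.json dict
    (fetched separately; no network access here).
    """
    trusted = {seed}
    queue = deque([seed])
    while queue:
        site = queue.popleft()
        doc = documents.get(site, {})
        for vouch in doc.get("vouches", []):
            url = vouch["url"]
            if url not in trusted:
                trusted.add(url)
                queue.append(url)
    return trusted

# Hypothetical data: A vouches for B, and B vouches for C.
docs = {
    "https://a.example": {"vouches": [{"url": "https://b.example", "vouched_at": "2026-03-18"}]},
    "https://b.example": {"vouches": [{"url": "https://c.example", "vouched_at": "2026-03-19"}]},
}
print(sorted(trust_network("https://a.example", docs)))
# ['https://a.example', 'https://b.example', 'https://c.example']
```

Because the walk is transitive, trusting one site extends (indirect) trust to everyone it vouches for, which is the "web of vouches" Beto describes.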
I started by adding the base version and I’ll be adding vouches as I go forward.
Other people have been writing about this too. Evan Hahn shared it on Saturday, followed the same day by Seth Larsen, who also wrote a Python script that checks which sites in his RSS feeds have added one. Gina Häußge wrote about it too. As did Dave Slusher.
Beto Dealmeida has also written a Firefox extension that helps you see if sites you visit are within a trust network of people you trust.
What about humans.txt?
A similar but slightly different approach is humans.txt, which has been around for quite a while and is a human counterpart to robots.txt. It’s described as:
It's an initiative for knowing the people behind a website. It's a TXT file that contains information about the different people who have contributed to building the website.
If something above resonated with you, let's start a discussion about it! Email me at juhamattisantala at gmail dot com and share your thoughts. This year, I want to have deeper discussions with people from around the world and I'd love for you to be part of that.