You’ve heard of robots.txt – how about humans.txt? We look at the growing trend for websites to offer a non-intrusive way of showcasing the real people behind the development of the site – and also to shout out to the various technologies and languages underpinning the site’s functionality.
Websites usually include a robots.txt file at the site root to provide information to web-crawling agents (i.e. robots) – typically telling the bots not to crawl certain areas of the site, or to avoid the site altogether (see an example at https://github.com/robots.txt).
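For the unfamiliar, a robots.txt file is just plain text following the Robots Exclusion Protocol. A minimal sketch (the paths and the `BadBot` crawler name here are purely illustrative):

```
# Served from the site root, e.g. https://example.com/robots.txt
User-agent: *        # rules for all crawlers
Disallow: /admin/    # ask crawlers to skip this path

User-agent: BadBot   # hypothetical crawler name
Disallow: /          # ask this crawler to avoid the whole site
```

Note that compliance is voluntary – well-behaved crawlers honour these rules, but nothing enforces them.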
Abel Cabans (@abelcabans), a Barcelona-based web developer, thought it odd that we make allowance for agents automatically traversing our sites, while scarcely mentioning the humans who put the site together. So he put the following file up on his site: http://www.pixelbinario.com/humans.txt – laying the foundation for the Humans.txt standard.
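A humans.txt file is free-form plain text, but the humanstxt.org template suggests commented section headers for the team, the site and any thanks. A sketch along those lines (all names and details below are hypothetical placeholders):

```
/* TEAM */
    Developer: Jane Doe            <- hypothetical name
    Contact: jane [at] example.com
    From: Barcelona, Spain

/* THANKS */
    Everyone who filed a bug report

/* SITE */
    Standards: HTML5, CSS3
    Components: jQuery
    Software: Apache, PHP
```

Because it is plain text rather than markup, there is nothing for a browser to render or a crawler to misinterpret – it simply sits alongside robots.txt at the site root.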
Jason Tipton over at frontlinedev.com neatly sums up why adding a humans.txt file is a good idea:
“Often times business owners don’t want a development company’s info listed on their site in the footer or perhaps an about page. Yet developers deserve the right to leave a small non-intrusive signature claiming their work.”
The key benefit of humans.txt is that it is completely non-intrusive – the site doesn’t even have to link to the file, yet interested users would (in theory) know to look for it at the conventional location: /humans.txt in the site root.
So, the next time you’re browsing your favourite site, check to see if they have a humans.txt file – it’s a great opportunity to see if this trend will carry on, and to find out more about the real people behind the development of the site.
The humans over at humanstxt.org have a neat compendium of sites that have used a humans.txt file – submit yours!