1
Dec 2025
NLJ News and Notes Tech Guides
Back in August, I noticed that a web crawler had requested a “carbon.txt” file (see my recent article on a less friendly web crawler). I had never heard of a carbon.txt file, but I assumed that it would be conceptually similar to the humans.txt file I wrote about in November 2022 (and similar to the ubiquitous robots.txt file found on many websites, including ours). From the name of the file, I assumed it would contain information about a website’s “greenness.” From the top, I note that, while The New Leaf Journal is the perennially virid online writing magazine and I make ample use of viridian in our theme colors, achieving low carbon benchmarks has never been on my priority list. However, I do make a point of keeping our site lightweight, so going into this project I figured we were already more than green enough for all normal purposes.
I do not remember the specific user agent string which requested our then-nonexistent carbon.txt file, but the request was enough to lead me to the maintainer of the carbon.txt initiative: the Green Web Foundation (hereafter “GWF”). GWF has a stand-alone page explaining the purpose of carbon.txt:
carbon.txt is a single, discoverable location on any domain – /carbon.txt – for public, machine‑readable sustainability data. Organizations publish their data and let others know where to find it through their carbon.txt file. Meanwhile data consumers know where to look for these disclosures. Carbon.txt files become a vital link making it much easier for everyone.
In short, carbon.txt is a plain text file with specific formatting which provides sustainability data about a specific website. Its format is meant to be machine-readable for web crawlers while remaining intelligible to human visitors. A different page of the GWF website explains how to implement carbon.txt in three steps.
First, we must create a carbon.txt file. GWF offers a carbon.txt builder, but it also explains the syntax for people who prefer a “DIY” approach. Because I am adding a carbon.txt file in an effort to collect all of the .txt files, I went with the builder.
The carbon.txt builder provides text boxes to add organizational disclosures and additional information (with the example being website hosting). For disclosures, we can pick from the following document types: Web Page, Annual Report, Sustainability Page, Certificate, CSRD Report, and Other. I was initially at a loss here, but I managed to hunt down a few web tools which (A) check a website or web page for its green qualities and (B) produce the report at a URL that can be viewed by others. I added links to three such reports with the “Web Page” document type. The GWF also has an optional field for a “Valid Until” date, but that is not applicable to my web pages (the carbon.txt file contains a “generated on” date, so people can see that I generated our current, as-of-this-writing carbon.txt on December 1, 2025).
Next, GWF has an optional field for adding upstream services, with a hosting provider given as the example. Our hosting provider is Hetzner (VPS), so I added Hetzner with the service type vps-hosting-provider.
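For readers curious what the builder actually produces, a carbon.txt file is a short TOML document with one table for organizational disclosures and another for upstream services. The sketch below is only illustrative: the table and field names are my assumptions based on the builder's output, and the URLs are placeholders, so consult GWF's documentation for the authoritative syntax.

```toml
# Illustrative sketch of a carbon.txt file.
# Table and field names are assumptions, not the official specification.

[org]
disclosures = [
  { doc_type = "web-page", url = "https://example.com/green-report" },
]

[upstream]
providers = [
  { domain = "hetzner.com", service_type = "vps-hosting-provider" },
]
```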
I downloaded my carbon.txt file and proceeded to step two. Step two requires us to upload our carbon.txt file to our website. GWF “strongly recommend[s] that you upload the file to the root of your domain,” but the GWF FAQ also offers alternative options. I had already uploaded several .txt files (notably robots.txt and humans.txt) to our domain root, so I did the same with our carbon.txt file, which is now available at https://thenewleafjournal.com/carbon.txt.
Finally, GWF asks us to validate our carbon.txt file using their carbon.txt validator. This step confirms that GWF (and, by extension, anyone else) can access the file on the website in question. As you can see below, the validation was successful.
That about covers everything. Unfortunately, my Hetzner hosting is not certified “green hosting” according to GWF, but eco-conscious readers who peruse the links in our carbon.txt file will be happy to see that our homepage and site grade out well for being lightweight and having proper caching. I will try to remember to generate new reports if I make any significant changes to the site, or add reports if I come across new green checkers, but readers will always be able to see when the “current” carbon.txt was generated.
For anyone interested in adding carbon.txt to their website, I recommend using GWF’s carbon.txt builder and its accompanying documentation for the most up-to-date version of the format. GWF also allows people to check whether a given website uses green hosting. It is worth noting that the carbon.txt specification is open source (see the source code on GitHub).