
JournalList.net is a non-profit membership organization that originated and currently maintains the trust.txt reference document. Trust.txt is a simple, machine-readable text file intended for worldwide use as a way to distinguish legitimate news organizations from non-legitimate ones.
The idea of a universal code that achieves mass adoption across digital platforms is not a new one. As early as 1994, before the internet reached widespread use, a company called Nexor initiated the robots.txt reference document to help early search engines, such as WebCrawler, Lycos and AltaVista, correctly crawl and index the contents of any website. Today it is estimated that over 90% of global sites place this file in their root directory, where anyone can view it (https://www.nytimes.com/robots.txt, https://www.cnn.com/robots.txt).
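For example, a minimal robots.txt uses a handful of standard directives; the paths and sitemap address below are placeholders:

User-agent: *
Disallow: /admin/
Allow: /
Sitemap: https://www.example.com/sitemap.xml

A compliant crawler fetches this file from the site root before requesting anything else and skips the disallowed paths.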
Just a few years ago, the ads.txt file was created by the Interactive Advertising Bureau (IAB) to let online ad buyers verify that the sellers they buy from are authorized by the publisher, as a safeguard against internet ad fraud.
Now comes Scott Yates, a lifelong journalist and entrepreneur. Over the past three years, he has devoted most of his time and energy to establishing JournalList.net, with a mission to introduce and promote the adoption of a new reference document, trust.txt. This is a file that news publishers add to their websites to signal their affiliations with other trusted news organizations.
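As a rough sketch of how that signaling works, a publisher's trust.txt might contain entries along the lines of the following (the directive names are drawn from the trust.txt specification; the organizations and URLs are placeholders, not real listings):

# trust.txt for a hypothetical local newsroom
belongto=https://www.examplepressassociation.org/
social=https://twitter.com/examplenewsroom
contact=https://www.examplenewsroom.com/contact

An association can, in turn, publish its own trust.txt with member= entries pointing back to its publishers, so an affiliation can be checked from both directions.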
In this 102nd episode of “E&P Reports,” Publisher Mike Blinder speaks with JournalList.net founder and Executive Director Scott Yates about the organization's mission to have the trust.txt reference document adopted and utilized by journalistic sites worldwide as a means for those sites to establish their authority as trusted, credible, legitimate news outlets.