NOTENO (feed the bots)

The foundation of Google's success is its ability to evaluate and sort the overwhelming amount of information accessible on the internet. For this purpose, autonomous programs running on enormous server farms gather information about every single website. These so-called bots, spiders or crawlers are designed to read through and classify all the data a website provides. Bots also memorize and follow every link on a website to gather information about the linked pages, and thus crawl from one site to the next across the whole internet.
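A minimal sketch of this link-following behaviour, assuming pages are ordinary HTML reachable over HTTP; real crawlers add robots.txt handling, politeness rules and massive parallelism, none of which are shown here.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    """Breadth-first crawl: read a page, remember it, queue every link found."""
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        html = urlopen(url).read().decode("utf-8", errors="replace")
        parser = LinkExtractor()
        parser.feed(html)
        # Turn relative links into absolute URLs and queue them for later visits.
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen
```

Without the max_pages limit, such a crawler would never terminate on a site whose pages always link to further pages.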

The NOTENO website is itself a tiny program that creates an unlimited number of pages filled with random information. How do bots behave when they are confronted with an unlimited amount of information? Normal websites offer a limited number of subpages. On the NOTENO website, every single page is followed by another page, so the total number of subpages is endless. The NOTENO program is as small as 40 KB but can easily serve several GB of random information to users and bots every day. In 2016, Google read an average of 170,000 NOTENO pages per day, with an average data volume of 2.5 GB.
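A minimal sketch of such an endless-page generator, assuming one page of random words with a single link to a never-before-served path; the actual NOTENO program, its wording and its page layout are not reproduced here.

```python
import random
import string
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS_PER_PAGE = 200


def random_word(length=8):
    return "".join(random.choices(string.ascii_lowercase, k=length))


class EndlessPage(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every request gets a fresh page of random text ...
        text = " ".join(random_word() for _ in range(WORDS_PER_PAGE))
        # ... plus a link to a path that has never been served before,
        # so a crawler that follows it never runs out of pages.
        next_path = "/" + random_word(12)
        body = (
            f"<html><body><p>{text}</p>"
            f'<a href="{next_path}">next</a></body></html>'
        ).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("", 8000), EndlessPage).serve_forever()
```

Because each page is generated on request rather than stored, a program of a few kilobytes can serve an effectively infinite site.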

noteno.ralph-schulz.com

Collection of the names of known bots

NOTENO live text stream: output while the site is being visited by bots and users