If you're reading this, chances are you've been looking through your server logs and noticed our robot visiting your site. When we crawl the web to populate our index, we identify ourselves with the "User-agent" string "GeonaBot".
Our bot follows the robots.txt exclusion standard, which is described at http://www.robotstxt.org/wc/exclusion.html#robotstxt.
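If you want to check how exclusion rules like the ones below will be interpreted before deploying them, a quick way is Python's standard-library urllib.robotparser. The rules string and URLs here are illustrative examples, not rules we ship:

```python
# Sketch: check whether a crawler may fetch a URL under a given robots.txt,
# using Python's standard-library urllib.robotparser.
# The rules, user-agent name, and URLs below are illustrative.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: GeonaBot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A path under /private/ is blocked; everything else is allowed.
print(parser.can_fetch("GeonaBot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("GeonaBot", "https://example.com/index.html"))         # True
```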
Remove your website from the Geona index
If you wish to exclude your entire website, or a specific section (directory) of your server, from Geona's index, you can place a file called robots.txt at the document root of your server.
To ban all bots from your site, place the following in your robots.txt file:
User-agent: *
Disallow: /

To ban GeonaBot from your site, place the following in your robots.txt file:

User-agent: GeonaBot
Disallow: /

Remove individual pages
If you want to prevent all robots from indexing or caching individual pages on your site, you can place the following meta tags in the page's HTML:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

or

<META NAME="ROBOTS" CONTENT="NOARCHIVE">

Geona updates its entire index on a regular basis: as we crawl the web, we find new pages, discard dead links, and update existing links automatically.
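Crawlers discover tags like the ones above by parsing each page's HTML before deciding whether to index or cache it. As a sketch of that check, here is a small detector built on Python's standard-library html.parser; the class name and sample HTML are ours, not part of any crawler's actual code:

```python
# Sketch: detect robots meta directives in a page's HTML using Python's
# standard-library html.parser. The class name and sample HTML are
# illustrative, not any crawler's real implementation.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag and attribute names for us.
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").upper() == "ROBOTS":
            for directive in attrs.get("content", "").split(","):
                self.directives.add(directive.strip().upper())

page = '<html><head><META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW"></head></html>'
parser = RobotsMetaParser()
parser.feed(page)
print(parser.directives)  # {'NOINDEX', 'NOFOLLOW'}
```

A crawler seeing NOINDEX in that set would skip the page when building its index, and NOARCHIVE would be handled the same way for cached copies.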