access

From GeneWeb

Revision as of 23:26, 30 October 2015

Language: English • français

In addition to the wizard and friend password mechanisms, which control modification rights and the visibility of private data, several other mechanisms restrict global access to a base or to the gwd service. Some features limit access by robots, and black lists block all access from specified Internet hosts.

These additional features apply only in server mode, not in CGI mode. In CGI mode, you should use the access restrictions provided by your HTTP server.
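For example, with the Apache HTTP server, basic authentication can protect the directory serving GeneWeb; the directives below are a minimal sketch, and the password file path is illustrative:

```
AuthType Basic
AuthName "GeneWeb"
AuthUserFile /path/to/htpasswd
Require valid-user
```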

Global access restriction to a base

If you want to restrict global access to a base to a given set of people, create a global-access.auth file and set the auth_file variable of the base's configuration file to this filename. The global-access.auth file should be located in the bases folder.

This file has the same structure as the friend and wizard password files, namely one username:password pair per line:


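For instance, a bases/global-access.auth file granting access to two users (the names and passwords below are purely illustrative) could contain:

```
alice:secret1
bob:secret2
```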
When accessing the site, visitors will be prompted for a username and a password, which must match one of the pairs stored in bases/global-access.auth.

Global restrictions to gwd service

The previous access restriction applies to a single base. You can implement a similar restriction for all the bases managed by your site by providing gwd with a global authorization file of the same structure as above. This filename is supplied to gwd at launch time through the option -auth filename.auth. If both a global access restriction to the gwd service and a global access restriction to a base exist, the latter takes priority over the former.
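A launch command along these lines (the authorization file name is illustrative) would then protect every base served by this gwd instance:

```shell
# Start gwd with a service-wide authorization file;
# a base's own auth_file setting still takes priority.
gwd -auth global.auth
```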

Robots management

Some robots regularly visit most web sites, exploring their content to build up their search indexes. To do so, they download a page, follow every valid HTTP link within it, and repeat the process. With bases such as those managed by GeneWeb, this is a nearly endless process with little value and possibly bad consequences:

  • it slows down your server,
  • it impacts other visitors,
  • it distorts your site's access statistics,
  • it leaves a bad taste of spying or stealing.

While GeneWeb's HTTP server and the generated HTML pages explicitly adhere to the robots exclusion standard, followed by most "big name" search engines, some robots choose to ignore it and keep exploring your site in spite of this "soft" interdiction. To alleviate this problem, GeneWeb watches the frequency at which a remote host visits GeneWeb, and puts it on a black list if that frequency exceeds a threshold. The threshold is provided to gwd at launch time with the -robot_xcl xx,yy option, which means "put a remote host on the black list if it performs more than xx requests in less than yy seconds". Further requests from that host will receive a warning message stating that it has been put on the site's black list.
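For instance, a launch command of this shape (the thresholds are illustrative) would black-list any host making more than 30 requests within 10 seconds:

```shell
# Black-list hosts exceeding 30 requests in 10 seconds.
gwd -robot_xcl 30,10
```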

The black list is a file called robot stored in the bases/cnt folder.

To restore access, remove that file (as explained in the gwd log file).
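Restoring access thus amounts to deleting the black list file; the path below assumes gwd's default bases directory layout:

```shell
# Remove the robot black list so banned hosts regain access.
# -f avoids an error if the file does not exist.
rm -f bases/cnt/robot
```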

Black list

Another black list lets you refuse access to a selected list of individual or grouped Internet hosts. A file named gw/gwd.xcl contains the list of excluded hosts, one per line, with * standing for any sequence of characters.


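For example, a gw/gwd.xcl file with the following contents (one literal host name and one wildcard pattern):

```
grand-mechant.loup.bois
fournisseur-*.d.acces
```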
prevents access for "grand-mechant.loup.bois", "fournisseur-22.d.acces", "fournisseur-xx.d.acces", etc. Putting a single * on a line blocks all access, including from your own address!