NotRobotUA — specify user-agents that will NOT be classified as crawler bots (search engines)
The NotRobotUA directive defines a list of user-agent strings that will never be classified as crawler robots (search engines) visiting the site. This directive takes priority over RobotUA: if the user agent matches NotRobotUA, the RobotUA check is skipped and the client is not treated as an unattended robot.
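A hedged sketch of how this might look in catalog.cfg. The directive takes a comma-separated wildcard list; the specific pattern values below are illustrative examples, not taken from the Interchange distribution:

```
# Treat these user agents as ordinary browsers, never as robots,
# even if they would otherwise match RobotUA.
NotRobotUA   MyMonitoringAgent*, *FriendlyFetcher*

# NotRobotUA wins over RobotUA when both match.
RobotUA      Googlebot*, *spider*, *crawler*
```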
For more details regarding web spiders/bots and Interchange, see the robot glossary entry.
For more details regarding user sessions, see the session glossary entry.
Interchange 5.9.0:
Source: lib/Vend/Config.pm
Line 3853 (context shows lines 3853-3857)

    sub parse_list_wildcard {
        my $value = get_wildcard_list(@_, 0);
        return '' unless length($value);
        return qr/$value/i;
    }
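The parser above delegates to get_wildcard_list to turn the comma-separated wildcard list into a regex body, then compiles it case-insensitively. The following is a minimal Python sketch of that general technique, assuming (this is an assumption, not the actual Vend::Config logic) that each `*` wildcard becomes `.*` and the patterns are joined as regex alternatives:

```python
import re

def parse_list_wildcard(value):
    """Sketch of Interchange's parse_list_wildcard behavior (assumed, not
    the real Vend::Config implementation): convert a comma-separated list
    of shell-style wildcard patterns into one case-insensitive regex."""
    patterns = [p.strip() for p in value.split(',') if p.strip()]
    if not patterns:
        # Mirrors the Perl "return '' unless length($value)" guard.
        return None
    # Escape regex metacharacters, then restore '*' as the regex '.*'.
    parts = [re.escape(p).replace(r'\*', '.*') for p in patterns]
    return re.compile('|'.join(parts), re.IGNORECASE)

ua_re = parse_list_wildcard('MyMonitoringAgent*, *FriendlyFetcher*')
```

With this sketch, a user agent such as "Mozilla/5.0 (FriendlyFetcher/2.0)" would match the compiled pattern, so the client would be exempted from the RobotUA check.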