I have the following in my .htaccess file on website2, but when my other website on website1 makes requests to website2 it is blocked because the server user-agent is blank.
Is there a way to allow a single IP address in the code below, or some other method to block empty user-agent bots while still allowing website1 to make requests to files on website2?
SetEnvIfNoCase User-Agent ^$ bad_bot #leave this for blank user-agents
SetEnvIfNoCase User-Agent "^more-bad-bots" bad_bot
<Limit GET POST HEAD>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</Limit>
Solution:
… because the server user-agent is blank.
Why not set a user-agent in your script? You shouldn't be making HTTP requests with a blank user-agent. (If you are using PHP, this may be as simple as calling ini_set('user_agent', '...') before the request, which sets the user-agent used by the URL-aware file functions such as file_get_contents().)
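The same principle applies whatever language the requesting script is written in. As an illustration, here is a minimal Python sketch (the URL and agent string are made-up placeholders) showing a request that explicitly carries a User-Agent header:

```python
import urllib.request

# Hypothetical endpoint on website2; replace with the real URL.
url = "https://website2.example/data.json"

# Build the request with an explicit User-Agent so website2's
# blank-agent rule no longer matches.
req = urllib.request.Request(
    url,
    headers={"User-Agent": "Website1Fetcher/1.0"},
)

# urllib normalizes stored header names to "Xxxx-yyy" capitalization.
print(req.get_header("User-agent"))  # prints Website1Fetcher/1.0
```

The request object is built but not sent here; pass it to urllib.request.urlopen() to actually fetch the file.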
Is there a way to allow a single IP address
You can unset the bad_bot environment variable if the requesting IP address is known. For example:

# Leave this for blank user-agents
SetEnvIfNoCase User-Agent ^$ bad_bot
SetEnvIfNoCase User-Agent "^more-bad-bots" bad_bot

# Unset bad_bot if the request is from 203.0.113.111
SetEnvIf Remote_Addr ^203\.0\.113\.111$ !bad_bot
I’ve also removed your line-end comment, because Apache does not support them; Apache only supports full-line comments. That “line-end comment” was in fact setting 5 additional environment variables, one for each word: #leave, this, for, blank and user-agents.
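Putting both changes together, the complete ruleset might look like this (203.0.113.111 is a placeholder; substitute website1's real server IP):

```apache
# Flag blank user-agents
SetEnvIfNoCase User-Agent ^$ bad_bot
# Flag known bad bots
SetEnvIfNoCase User-Agent "^more-bad-bots" bad_bot

# Unset bad_bot for website1's server IP
SetEnvIf Remote_Addr ^203\.0\.113\.111$ !bad_bot

<Limit GET POST HEAD>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
</Limit>
```

Note that SetEnvIf directives are processed in order, so the Remote_Addr line must come after the lines that set bad_bot for the unset to take effect.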