Currently, I have a script that curls a list of websites for their HTTP status codes (200, 404, etc.) and checks whether a page returns 404. If it does, the script calls another script to send out an email. The flaw is that the script runs constantly, 24/7, with each cycle taking about 15 minutes, so while a website is still down (still returning 404) I receive another email every 15 minutes. That is not what I want: I only want to receive one email, at the moment the page switches from returning 200 to returning 404. Is there a way I can enhance the script to cut down on these repeated 404 alerts?
Due to confidentiality issues I cannot disclose my actual script, but here is a short example of the check I use to retrieve the HTTP code:

    # this is the line I use to retrieve the http_code
    my $HTTPCode = `curl -s -w "%{http_code}" -o /dev/null https://$THIS_URL 2>&1`;
    if ($HTTPCode == 404) {
        # Send email
    }

What I want to achieve is this: instead of simply checking for a 404, detect when the page changes from a 200 state to a 404 state and send an email only at that point; if the page is already in a 404 state, do not send another email. I know this is a vague question because I cannot share my script with you, but any suggestion in theory is welcome too. Thank you in advance.
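To make the idea concrete, here is a rough, untested sketch of what I imagine: remembering the last observed status in a small state file between 15-minute cycles and alerting only on the transition into 404. The URL, the state-file path under /tmp, and the send_email.sh script name are all placeholders I made up for illustration.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $THIS_URL = 'www.example.com';                 # placeholder URL

    # State file remembers the status code seen on the previous cycle
    (my $safe_name = $THIS_URL) =~ s{[^\w.-]}{_}g;
    my $state_file = "/tmp/http_state_$safe_name";

    # Current status code, same curl call as above
    my $HTTPCode = `curl -s -w "%{http_code}" -o /dev/null https://$THIS_URL 2>&1`;
    chomp $HTTPCode;

    # Previous status code; assume 200 on the very first run
    my $previous = 200;
    if (open my $in, '<', $state_file) {
        $previous = <$in> // 200;
        chomp $previous;
        close $in;
    }

    # Alert only on the transition into 404, not while it stays down
    if ($HTTPCode == 404 && $previous != 404) {
        # system('/path/to/send_email.sh', $THIS_URL);   # placeholder for the existing email script
    }

    # Save the current code for the next 15-minute cycle
    if (open my $out, '>', $state_file) {
        print {$out} "$HTTPCode\n";
        close $out;
    }

With a layout like this, the email would fire once when the page first goes from 200 to 404 and stay quiet on later cycles while the page remains down. Is this a reasonable direction, or is there a better way?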