DEBUG: .../IO/Socket/SSL.pm:202: CA file certs/my-ca.pem not found, using CA path instead.
DEBUG: .../IO/Socket/SSL.pm:1382: new ctx 153784112
DEBUG: .../IO/Socket/SSL.pm:421: no socket yet
DEBUG: .../IO/Socket/SSL.pm:423: accept created normal socket HTTP::Daemon::ClientConn::SSL=GLOB(0x9ddf480)
DEBUG: .../IO/Socket/SSL.pm:439: starting sslifying
DEBUG: .../IO/Socket/SSL.pm:479: Net::SSLeay::accept -> -1
DEBUG: .../IO/Socket/SSL.pm:1131: SSL accept attempt failed with unknown errorerror:1407609C:SSL routines:SSL23_GET_CLIENT_HELLO:http request
DEBUG: .../IO/Socket/SSL.pm:1417: free ctx 153784112 open=153784112
DEBUG: .../IO/Socket/SSL.pm:1420: OK free ctx 153784112
I need help interpreting this output, but it looks like something or someone tried to submit a request to the SOAP server and the SSL handshake failed. Is that what I'm seeing here? If so, how can I keep it from stopping the process when this happens?
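For what it's worth, the `SSL23_GET_CLIENT_HELLO:http request` error means a client spoke plain HTTP to the SSL port, so the handshake failed before any request was read. A common way to keep the daemon alive is to treat a failed accept as a per-connection error rather than a fatal one. Below is a minimal sketch along those lines; the cert paths, port, and dispatch stub are assumptions, not taken from the original post, and should be adapted to the real server setup.

```perl
use strict;
use warnings;
use HTTP::Daemon::SSL;
use IO::Socket::SSL;

# Assumed setup: replace paths/port with your real configuration.
my $daemon = HTTP::Daemon::SSL->new(
    LocalPort     => 4434,
    SSL_cert_file => 'certs/server-cert.pem',   # hypothetical path
    SSL_key_file  => 'certs/server-key.pem',    # hypothetical path
) or die "Cannot start daemon: $!";

while (1) {
    # accept() returns undef when the SSL handshake fails (e.g. a
    # crawler sending plain HTTP). Log it and continue instead of dying.
    my $client = $daemon->accept;
    unless ($client) {
        warn "SSL accept failed: ", ($IO::Socket::SSL::SSL_ERROR || $!), "\n";
        next;
    }
    # Guard request handling too, so one bad connection cannot
    # take the whole process down.
    eval {
        while (my $request = $client->get_request) {
            # ... dispatch $request to the SOAP handler here ...
            $client->send_error(501);   # placeholder response
        }
    };
    warn "Error handling request: $@" if $@;
    $client->close;
}
```

The important part is the `unless ($client) { warn ...; next; }` check: without it, code that assumes `accept` always succeeds (or that calls `die` on failure) will exit the loop the first time a non-SSL client connects.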
I think I may have figured out what the problem was. Since the debug output above looked like something was trying to access the SOAP server process, I guessed that a web crawler might be hitting it. So I put a robots.txt file in my web root disallowing all agents from the SOAP server directory. It has now been running for over 24 hours without stopping. It used to stop every evening around 7pm, but now it seems okay. I hope this helps someone else who may be having a similar problem.
Unfortunately, the robots.txt change described in the preceding paragraph did not fix the problem. The server is still stopping intermittently!