- Write some tests to verify that things work as you think they should.
- Identify potential security threats.
- Write a test for each one.
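The steps above can be sketched as a tiny example. This is a hedged illustration, not your script: `validate_select` and the `ALLOWED` set are hypothetical names standing in for however your page handles its 'select' parameter, and the tests show one "does it work?" check plus one check for a specific threat (a tainted value arriving via the URL, exactly the kind of thing a crawler or an attacker can send).

```python
# Minimal sketch: identify a threat (arbitrary query-string input),
# then write one test for normal behavior and one for the threat.
# All names here are hypothetical, not from the original post.

ALLOWED = {"red", "green", "blue"}  # the valid popup-menu items

def validate_select(value):
    """Return the value only if it is a known menu item."""
    if value not in ALLOWED:
        raise ValueError("unexpected 'select' value: %r" % value)
    return value

def test_accepts_valid_item():
    # "Does it work?" -- a legitimate menu choice passes through.
    assert validate_select("red") == "red"

def test_rejects_injected_value():
    # The threat: a bot or attacker passes an arbitrary value in the URL.
    try:
        validate_select("'; DROP TABLE users; --")
    except ValueError:
        pass  # rejected, as it should be
    else:
        raise AssertionError("tainted input was accepted")

test_accepts_valid_item()
test_rejects_injected_value()
```

The point is the shape, not the details: each threat you identify gets its own small, named test, so when someone later changes the validation code, a failing test tells them exactly which guarantee they broke.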
The process isn't very difficult once you get started. I suspect, though, that given your history, you're not going to do much research on your own; instead, you're going to ask a bunch of questions here without really working hard at getting to the meat of the answers. Honestly, if I were in your shoes, I would hire someone to do the work and look over their shoulder. That way it gets done properly, and you have a reference implementation to look back at when you go to your next project.
My criteria for good software:
- Does it work?
- Can someone else come in, make a change, and be reasonably certain no bugs were introduced?
| [reply] |
I have no idea how to use the module you suggested to me.
Apart from that, I am really curious about what Google's crawl bots are actually doing on our web pages.
The one that visited my page (crawl-66-249-67-210.googlebot.com) actually tried to pass a value into my script's variable 'select' via the URL?!? Why on earth would it do that? And if it wanted to select something from my popup menu, why not select a valid item?!
Does that make any sense to you?!
| [reply] |
Please, can somebody answer my question about the Google bots? They visit my web page every 2 days or so...
Is it really automated? Or can someone intentionally manipulate them to attack pages?
| [reply] |