I just moved to a new web server (powweb — if anyone knows anything bad about them, let me know so I can pull out before the 30-day trial is up!) and I have a few questions about setting up my new page.
1) I want to move the content from the old server, but I don't have access to it, and the person who does may not be available for a few days or more. I know there are modules like WWW::Robot that will crawl all links from a page recursively, but I've never used anything like this before, and reading some of the documentation made my head spin! I'm not sure it will do what I want, either. I basically want to prevent any broken links, so I want a list of all the URLs (including images) that the page(s) link to (if the module provided a way for me to download them too, that would be a bonus, but not a requirement). I thought maybe a hook in WWW::Robot could output the URL of any successfully visited page, and also read through the page and pull out the image locations, but is there a better solution?
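For what it's worth, the link-extraction half of this doesn't need WWW::Robot at all — HTML::LinkExtor (part of the HTML-Parser distribution) pulls every link-carrying attribute (`href` on `<a>`, `src` on `<img>`, etc.) out of a page, and resolves them against a base URL. A minimal sketch, using a hypothetical inline page (in practice you'd feed it the result of `LWP::UserAgent->new->get($url)->content`):

```perl
use strict;
use warnings;
use HTML::LinkExtor;

# Hypothetical base URL and page content, standing in for a real fetch.
my $base = 'http://www.example.com/';
my $html = <<'HTML';
<html><body>
<a href="about.html">About</a>
<img src="pics/logo.gif">
</body></html>
HTML

my @urls;
my $extor = HTML::LinkExtor->new(
    sub {
        my ($tag, %attrs) = @_;
        push @urls, values %attrs;   # href from <a>, src from <img>, ...
    },
    $base,   # base URL: relative links come back as absolute URI objects
);
$extor->parse($html);

print "$_\n" for @urls;
# prints http://www.example.com/about.html
#        http://www.example.com/pics/logo.gif
```

From there a recursive crawl is just a loop with a `%seen` hash: fetch a URL, extract its links, push the unseen same-host ones onto a queue. That's essentially what WWW::Robot's hooks do for you, but for a one-off migration the hand-rolled version may be easier to reason about.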
2) The web server unfortunately doesn't have in-place editing (one of the two negatives I found; the other is no mailing lists!). There's nothing like a file manager that lets you view your directory structure and edit files in a browser window. I've already thought about some ideas for coding this myself, but, as I've seen others say before, I don't want to re-invent the wheel. Any modules or scripts people know of off the tops of your heads? I say "off the tops of your heads" because I don't want anyone to do my research for me, but I did a Super Search and a search on CPAN and didn't find anything I liked. I may just not have been searching for the right thing.
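If nothing turns up and you do end up rolling your own, the directory-browsing half is only a few lines. A sketch of a hypothetical helper (the CGI wrapper around it would HTML-escape each name, emit edit links, and — critically — validate the requested path against your document root before touching the filesystem):

```perl
use strict;
use warnings;

# Hypothetical helper for a home-grown web file manager: returns the
# non-hidden names in $path, with directories marked by a trailing '/'.
sub list_dir {
    my ($path) = @_;
    opendir my $dh, $path or die "Can't open $path: $!";
    my @entries = sort grep { !/^\./ } readdir $dh;
    closedir $dh;
    return map { -d "$path/$_" ? "$_/" : $_ } @entries;
}

print "$_\n" for list_dir('.');
```

The editing half is the same idea in reverse: a textarea populated from the file, and a save handler that writes it back — the hard part is access control, not the file handling.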
Thanks!
--Wink
In reply to Moving hosts: questions about web content by wink