PerlMonks
If the indices you wish to search are all available as web pages, then use LWP::UserAgent to build a user agent and download all the pages in your list. Then use something like HTML::TokeParser to grab all the <a href> elements (and any other interesting bits of the pages), and build a hash of all the things you want to search on (perhaps using lists of those things as the values).
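A minimal sketch of that crawl-and-index step, assuming a hypothetical list of index pages (the example.com URLs are placeholders, and keying the hash on link text is just one choice of "things to search on"):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::TokeParser;

# Hypothetical list of index pages to download
my @pages = (
    'http://example.com/index1.html',
    'http://example.com/index2.html',
);

my $ua = LWP::UserAgent->new( timeout => 10 );
my %index;    # link text => list of URLs that text pointed at

for my $page (@pages) {
    my $res = $ua->get($page);
    next unless $res->is_success;

    # Walk every <a ...> tag and record its href and link text
    my $p = HTML::TokeParser->new( \$res->decoded_content );
    while ( my $tag = $p->get_tag('a') ) {
        my $href = $tag->[1]{href} or next;
        my $text = $p->get_trimmed_text('/a');
        push @{ $index{ lc $text } }, $href if length $text;
    }
}
```

Using the link text (lowercased) as the hash key and an array of URLs as the value gives you the "lists of those things as the values" layout described above.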
If the source pages are fairly constant, run the crawl separately from your search function and use Data::Dumper to save the hash to disk; otherwise, proceed straight to the next step. To search the elements of the hash, use grep over its keys to quickly isolate a list of matches. Then, using CGI, present the results as a web page replete with hyperlinks and additional info by iterating over that list. Have fun, and think of other ways to do it. In reply to Re: Miniature search engine
by ichimunki
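The persist-and-search side can be sketched as follows, assuming the crawler saved the hash with Data::Dumper to a hypothetical file named index.dump (so the file's last statement evaluates to the hashref, which is what `do` returns):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# Reload the index the crawler dumped to disk with Data::Dumper.
# 'index.dump' is a hypothetical filename; do() returns the value of
# the file's last statement, i.e. the hashref Data::Dumper wrote.
my $index = do 'index.dump'
    or die "cannot load index: $@$!";

my $q    = CGI->new;
my $term = lc( $q->param('q') // '' );

# grep over the hash keys quickly isolates the matching entries
my @hits = grep { index( $_, $term ) >= 0 } sort keys %$index;

# Iterate over the matches and emit a page of hyperlinks
print $q->header('text/html'),
      $q->start_html('Search results');
for my $key (@hits) {
    print $q->p(
        map { $q->a( { href => $_ }, $q->escapeHTML("$key: $_") ) }
            @{ $index->{$key} }
    );
}
print $q->end_html;
```

Substring matching with index() keeps the search simple; swapping in a regex inside the grep block is the obvious next step if you want fancier queries.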