in reply to Perl solutions for large web sites?

I'm a bit confused. You need static pages, because you can't handle the load, but you want perl to power it all?

Your low load option is to just use static pages. Very little processor load, because serving static pages is the easiest method of web serving.

Your medium load option is to use a mod_perl backend to write your own SSI scheme (a la Doug MacEachern and Lincoln Stein in "Writing Apache Modules with Perl and C"). This will give you added functionality (custom SSI), and tight integration with a real web server (Apache). The custom SSI can do as much as you can code.
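To make the custom-SSI idea concrete, here's a minimal sketch of the expansion step (the directive name and helpers are mine, not from the book). Inside mod_perl you'd run this over the file in `$r->filename` from a PerlHandler; here it's just a plain function so you can see the mechanics:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Expand a hypothetical <!--#my-include file="..."--> directive by
# substituting in the named fragment. $parts maps fragment names to
# their contents (in a real handler you'd slurp them from disk).
sub expand_ssi {
    my ($html, $parts) = @_;
    $html =~ s{<!--#my-include\s+file="([^"]+)"\s*-->}
              {exists $parts->{$1} ? $parts->{$1} : ''}ge;
    return $html;
}

my %parts = ('menu.html' => '<ul><li>Home</li></ul>');
print expand_ssi('<body><!--#my-include file="menu.html"--></body>', \%parts);
# prints <body><ul><li>Home</li></ul></body>
```

Because the directive is yours, it can take any attributes you care to parse — that's the "as much as you can code" part.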

Your high load option is to use mod_perl to add headers, footers and menus to your pages on the fly, and let maintainers worry only about content. The most common layout is to make each page an enormous table: one row as a header, one td of the second row as a menu and a second td as the main body of the page, and another row as a footer.
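A sketch of that wrapping step (the function name is made up): a mod_perl handler would slurp the maintainer's content file and pass it through something like this before sending it out:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Wrap raw page content in the table layout described above:
# header row, menu cell beside the body cell, footer row.
sub wrap_page {
    my ($header, $menu, $body, $footer) = @_;
    return <<"HTML";
<table>
<tr><td colspan="2">$header</td></tr>
<tr><td>$menu</td><td>$body</td></tr>
<tr><td colspan="2">$footer</td></tr>
</table>
HTML
}

print wrap_page('<h1>Site</h1>', '<a href="/">Home</a>',
                '<p>Content here</p>', '<small>(c) 2000</small>');
```

The handler stays tiny, and maintainers never touch the layout.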

Spiders can index results from CGIs, but can't generate the parameters passed to CGIs that you may want.

If you have more questions, /msg me.

JJ

J. J. Horner

Linux, Perl, Apache, Stronghold, Unix

jhorner@knoxlug.org http://www.knoxlug.org


Replies are listed 'Best First'.
RE: Re: Perl solutions for large web sites?
by Anonymous Monk on May 24, 2000 at 14:59 UTC
    I'm a bit confused. You need static pages, because you can't handle the load, but you want perl to power it all?
    Yes, as in Perl will generate those static pages once in a while, from templates that I create. Therefore, Perl will be powering it.

    Your low load option is to just use static pages. Very little processor load, because serving static pages it the easiest method of web serving.
    Exactly, except that I'm doing it with a twist: the static pages will be generated by Perl scripts.
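    That approach can be as simple as a script run from cron that fills a template's placeholders and writes the result, so Apache only ever serves static files. A minimal sketch (the `[% ... %]` placeholder syntax and file names are illustrative, not the poster's actual setup):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Substitute [% name %] placeholders in a template string
# with values from a hashref; unknown names become empty.
sub render {
    my ($template, $vars) = @_;
    $template =~ s/\[%\s*(\w+)\s*%\]/$vars->{$1} || ''/ge;
    return $template;
}

my $page = render('<h1>[% title %]</h1>', { title => 'News' });
open my $out, '>', 'news.html' or die "can't write news.html: $!";
print $out $page;
close $out;
```

Rerun the script whenever the content changes; the web server's per-request cost stays at static-file levels.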

    Spiders can index results from CGIs, but can't generate the parameters passed to CGIs that you may want.
    Either I'm misunderstanding you, or you're simply incorrect. Search engines won't spider:
    http://www.foo.com/foo.cgi?whatever=300 but they would spider:
    http://www.foo.com/foo.cgi