perleager has asked for the wisdom of the Perl Monks concerning the following question:

Hey,

I'm going to be making a website for this business, and they need the website's content to be updated quite frequently.

They asked me to make it so there's a way to edit the content of the website web-based. I was thinking about using Server Side Includes to do this: for each page, say main.html, have a main.txt file that holds the content, and put the server-side include directive in main.html. Then the web-based program could simply edit that main.txt file.
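For reference, the setup described above would look something like this (a minimal sketch; the filenames are just examples, and it assumes a server such as Apache with mod_include enabled and .shtml parsing turned on):

```html
<!-- main.shtml: the page shell, parsed by the server for SSI directives -->
<html>
<body>
<h1>Main Page</h1>
<!-- the editable content lives in a separate file -->
<!--#include virtual="main.txt" -->
</body>
</html>
```

The web-based editor would then only ever rewrite main.txt, leaving the page shell untouched.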

Since this is a pretty important website: are there any real security risks in using Server Side Includes, or any other security risks I don't know about, if I choose to do it this way? Are there any other, safer ways to build this web-based content editor?

Thanks,
Anthony

Re: Server Side Includes
by perrin (Chancellor) on Jan 26, 2004 at 04:07 UTC
    There are no security issues with server-side includes. There are many security issues with editing web content through a browser. That's where you should focus your security efforts.
Re: Server Side Includes
by atcroft (Abbot) on Jan 26, 2004 at 01:17 UTC

    It might be easier if you put the content in a database and pulled it from there. That would also let you build a page they could easily use to update the content, give you a way to track or display previous entries, and so on, plus it avoids the problem of someone retrieving the page while you are rewriting the file that holds the content....

    Just a thought. Hope that idea helps...
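    To make the database idea concrete, here is a minimal sketch using DBI. The table and column names are made up for illustration, and it uses an in-memory SQLite database (requires the DBD::SQLite driver from CPAN) purely so the example is self-contained; any DBD would do:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Illustrative only: the "pages" table and its columns are assumptions.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1, PrintError => 0 });

$dbh->do('CREATE TABLE pages (name TEXT PRIMARY KEY, body TEXT)');
$dbh->do('INSERT INTO pages (name, body) VALUES (?, ?)',
         undef, 'main', 'Welcome to the site.');

# The page script would fetch its content by name at request time.
my ($body) = $dbh->selectrow_array(
    'SELECT body FROM pages WHERE name = ?', undef, 'main');
print "$body\n";
```

    A web-based edit form then becomes a simple UPDATE on the same table, with no file locking to worry about.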

      atcroft's idea is probably best, but you could have a look at using a wiki if the data is less static. Wikis are usually open for anyone to use, but most of them have authentication options. You would need to alter the templates to make sure it doesn't look like a wiki to outsiders.

      I'm using TWiki at the moment, which has both authentication and easy-to-edit templates; however, I'm thinking of moving to TikiWiki for various off-topic reasons.

      atcroft has the right idea. I had a similar project and used a database for the following reasons:

      1. Most databases have built-in security, such as allowing only certain IP addresses to connect as certain usernames, and allowing only certain usernames to access and/or change certain portions of the site. That's a lot of functionality that would take forever to write in Perl but is already there for the taking.
      2. Databases are fast and support multiple concurrent connections. Files are slow, and you can have race conditions if someone tries to modify a file while it is being read. (With a database, you also don't need to worry about things like file locks.)
      3. Not all users are sys admins. Giving FTP access to a directory your scripts live in and telling them to upload files is asking for trouble.
      4. Following on from the last point, it's pretty easy to create a Tk- or GTK-based client that connects to the database and does everything transparently.
      5. Also, if you use CSS on your site, you can even allow your users to change things at will. You can have sections formatted using class="" attributes: e.g. the user inputs 5 paragraphs, Perl splits them on m/\n\n/ (remember to s/\r//sg first), your CGI script outputs each as <p class="sectionFoo">content</p>, and it will look right because the CSS tells the browser how to render the sectionFoo class.
      6. It's much easier to tell what's what when your directories aren't crowded with random text files.
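      Point 5 above can be sketched in a few lines of Perl; the sectionFoo class name is just the example from that point:

```perl
use strict;
use warnings;

# Turn raw user text into classed <p> tags, as described in point 5.
# The class name "sectionFoo" is only an example.
sub paragraphs_to_html {
    my ($text) = @_;
    $text =~ s/\r//sg;                 # normalize CRLF line endings first
    my @paras = split /\n\n+/, $text;  # blank lines separate paragraphs
    return join "\n", map { qq{<p class="sectionFoo">$_</p>} } @paras;
}

my $input = "First paragraph.\r\n\r\nSecond paragraph.";
print paragraphs_to_html($input), "\n";
```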

      I'd agree with using a database if at all possible. Most of the sites I've built in my time have been DB-driven, and in my experience it makes life a lot easier.

      Probably the most important thing is not to let the user touch anything directly (you don't want them in the actual DB interface, or writing directly to a file, in case they put in something your output system doesn't like). If you write a set of HTML forms to enter the content, and validate it before it gets anywhere near storage, you should be fine.

      In short - if you have access to a DB, then I'd always use it in preference to static include files.
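      A minimal sketch of the validation point above: escape HTML metacharacters in user-submitted content before it is stored or echoed back. (In real code, a CPAN module such as HTML::Entities does this more thoroughly; this hand-rolled version is only illustrative.)

```perl
use strict;
use warnings;

# Escape the characters that matter in HTML output so user content
# can't inject markup. Illustrative only; prefer HTML::Entities.
sub escape_html {
    my ($text) = @_;
    my %esc = ('&' => '&amp;', '<' => '&lt;', '>' => '&gt;', '"' => '&quot;');
    $text =~ s/([&<>"])/$esc{$1}/g;
    return $text;
}

print escape_html('<script>alert("x")</script>'), "\n";
```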

Re: Server Side Includes
by dws (Chancellor) on Jan 26, 2004 at 03:52 UTC
    They asked me to make it so there's a way to edit the content of the website web-based. I was thinking about using Server Side Includes to do this.

    That's one approach. Another is to treat this as a problem that's already been very well solved by tools such as Movable Type. MT is written in Perl, and can make for enlightening bedtime reading.

Re: Server Side Includes
by iburrell (Chaplain) on Jan 26, 2004 at 22:49 UTC
    I have to maintain a web site that uses server-side includes for content management, and it sounds similar to what you want to do. It has .shtml files for the main pages, .txt files for the content included in each page, and include files for headers and sidebars. It is a pain to manage because it requires multiple files for each content page. It has the management headaches of a more complicated static web site, and the performance of a dynamic web site.

    I would invert the problem and generate the HTML files from the content. The HTML files become an on-disk cache. Add the standard elements with a template system. For a small website, storing the content on disk would work fine; for a large web site, storing it in a database makes more sense. Either way, the CGI script would regenerate the HTML files whenever the content changes.
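    A rough sketch of that inversion, in plain Perl: regenerate the static .html file from the stored content whenever it changes. The template and filenames here are illustrative only; a real site would use a proper template system.

```perl
use strict;
use warnings;

# Wrap stored content in the standard page shell. A real template
# system would add headers, sidebars, etc. here.
sub render_page {
    my ($title, $body) = @_;
    return <<"HTML";
<html>
<head><title>$title</title></head>
<body>
<!-- standard header/sidebar would be added here by the template -->
$body
</body>
</html>
HTML
}

# Write the rendered page out as a static file: the on-disk cache.
sub publish {
    my ($file, $title, $body) = @_;
    open my $fh, '>', $file or die "can't write $file: $!";
    print {$fh} render_page($title, $body);
    close $fh;
}

publish('main.html', 'Main Page', '<p>Current content.</p>');
```

    The edit script calls publish() after each change; the web server then serves main.html as a plain static file.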

    I think there hasn't been enough exploration of dynamically managed but statically served web sites. Many web sites don't update the content frequently enough to require every page to be dynamically generated. However, a web interface to modify content and add the standard elements is very useful.