in reply to Fast processing of XML files for CGI

Looking at it from a different perspective (and not immediately helpful): did you think of putting your data in a database instead of in a lot of small XML-files?

CountZero

"If you have four groups working on a compiler, you'll get a 4-pass compiler." - Conway's Law

  • Comment on Re: Fast processing of XML files for CGI

Replies are listed 'Best First'.
Re: Re: Fast processing of XML files for CGI
by AcidHawk (Vicar) on Dec 09, 2003 at 05:54 UTC

    Typically what happens is that, when an event occurs in our organisation, an action is triggered which creates one file for the event that occurred. Other processes monitor these dirs for the XML files and handle them as soon as they appear, i.e. log/update/close a call in the helpdesk. Once these background processes have handled the event XML, the file is deleted.

    The problem is that if the helpdesk is down, we stop the processes that handle these files, so every time there is an event the files get created but not removed. We would like to keep it like this, because when the helpdesk comes back up we can simply restart these processes and all the event files will be processed (in sequence).

    What we need, while the call-logging processes are down, is visibility for the operators. This is what I am trying to accomplish with a view of each dir in a web page: some way of viewing the contents of the files. I would also like to extend this slightly: if an event is 'Critical' and a later corresponding event changes the status to 'Repaired', I don't want to see the Critical event in the web page.
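    The filtering described above can be sketched roughly as follows. This is a minimal illustration in Python rather than Perl, and it assumes a hypothetical event-file format where each file holds a single element like `<event id="..." status="..."/>`; the real files and status values will differ.

    ```python
    import os
    import xml.etree.ElementTree as ET

    def visible_events(event_dir):
        """Parse every event XML file in event_dir and return the events an
        operator should still see: a 'Critical' event is suppressed once a
        later event for the same id reports status 'Repaired'."""
        events = []
        for name in sorted(os.listdir(event_dir)):  # sorted names ~ event order
            if not name.endswith('.xml'):
                continue
            root = ET.parse(os.path.join(event_dir, name)).getroot()
            events.append({'id': root.get('id'), 'status': root.get('status')})
        repaired = {e['id'] for e in events if e['status'] == 'Repaired'}
        # Hide Critical events that were later repaired, and the Repaired
        # notifications themselves; everything else stays visible.
        return [e for e in events
                if not (e['id'] in repaired
                        and e['status'] in ('Critical', 'Repaired'))]
    ```

    The same pairing logic would be straightforward in Perl with XML::Simple or XML::Twig; the point is only that the web view can be computed per request from the files still sitting in the dir.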

    -----
    Of all the things I've lost in my life, it's my mind I miss the most.