brandon8696 has asked for the wisdom of the Perl Monks concerning the following question:

Hey Monks,

My current company has their entire web application written in Perl using the now-defunct CGI module. I have been tasked with building a two-factor authentication system for it, and this defunct CGI module is the bane of my existence. I am doing a simple AJAX call, passing JSON data back and forth. The first AJAX call is fine, but the second AJAX call breaks the CGI header and spits out the raw file path of my Perl login function.

I have been debugging, and the header type is set properly to application/json, yet it is not being interpreted properly. If I recall correctly, a Stack Overflow post similar to mine seemed to indicate it is a core CGI header caching issue, but I'm genuinely stumped as to what that means conceptually and what the working solution is. I feel like I can't be the only Perl user stuck with defunct modules trying to use AJAX functionality.

My temporary working solution has been to call a CGI redirect instead of returning the second AJAX response, but that is a horrific band-aid I don't want to keep.

I suppose my questions are: 1) Conceptually, what the heck is going on? 2) How can I manually tell this CGI module not to short-circuit, and get it to respond with what I actually want?

Replies are listed 'Best First'.
Re: CGI Header Breaks on Second AJAX Call
by Corion (Patriarch) on Apr 10, 2025 at 15:24 UTC

    As an additional thing, please realize that "AJAX" has very little impact here. You can replicate the HTTP requests that Javascript makes from within your browser, and in the end it does not matter where an HTTP request originates.

    Consider reproducing the first and second HTTP request to the CGI script externally. For example, you can see the requests your browser makes in the developer console and copy them as curl commands. Then you can look at the difference between the working and the failing request. This should give you a good hint as to where the problem lies.
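    If curl feels awkward, you can also replay the request from Perl itself and diff the two raw responses. A minimal sketch, assuming HTTP::Tiny from core; the URL, cookie and payload are placeholders for whatever the developer console shows for the failing call:

    use strict;
    use warnings;
    use HTTP::Tiny;

    my $http = HTTP::Tiny->new;

    # Replay the second (failing) request exactly as the browser sent it.
    my $res = $http->post(
        'http://localhost/testapp/api/userLogin',   # hypothetical URL
        {
            headers => {
                'Content-Type' => 'application/json',
                # 'Cookie'     => 'session cookie copied from the browser',
            },
            content => '{"code":"123456"}',          # hypothetical 2FA payload
        },
    );

    # Dump status, headers and body so you can diff this against the working call.
    print "$res->{status} $res->{reason}\n";
    for my $name (sort keys %{ $res->{headers} }) {
        my $v = $res->{headers}{$name};
        print "$name: $_\n" for ref $v ? @$v : $v;
    }
    print "\n$res->{content}\n";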

      I did not want to leave the thread dead, and I wanted to make sure I got back to everyone here.

      I did heed your advice and used cURL to break this down as much as I could. I also tried to cut down the process to a minimal reproducible call. The results were even more perplexing to me.

      my $json = JSON->new->utf8->encode($data);
      my $q    = $self->query;
      print $q->header(
          -type    => 'application/json',
          -charset => 'utf-8',
      );
      print $json;
      $self->session->flush if $self->can('session') && $self->session_loaded;
      # Absolutely murder the rest of the call stack
      exit;    # OR die if mod_perl yells at exit()

      The code above intercepts my return, spits out exactly what I want, and nukes Perl immediately to prevent any unwanted data manipulation.

      The browser then received that JSON as a return object (I checked the network call), and then Chrome, or perhaps the JS, wrapped the whole darn thing in HTML as some default behavior. Also note the URL then redirects/points to my raw Api.pm file path, not my original login URL.

      <html>
      <head><meta name="color-scheme" content="light dark"><meta charset="utf-8"></head>
      <body>
      <pre>Content-Type: application/json
      {"abbr":"Btest","companyid":7,"email":"bjp@example.com","_resume":"null","name":"Brandon Test","success":1,"userid":1234}{
         "error" : "API Error. ModPerl::Util::exit: (120000) exit was called at /var/www/html/testapp/TestTracker/Api.pm line 1005",
         "event_threadid" : 21169609
      }
      </pre>
      <div class="json-formatter-container"></div></body></html>

      This is kind of why I'm here and why it has been so hard to ask for help. I don't even know what TO ask or how to break this behavior down more. CGI is a beast I have yet to tame, and something is not happy with this second AJAX call on the same page.

      Oh, and for sanity's sake, here's my current AJAX call as it stands after debugging it a bit. Note the receiving logic in Perl has no issues interpreting this call and using the JSON data sent.

      $('#twofactor_form').on('submit', function(event) {
          var formData = $('#twofactor_form').serialize();
          event.preventDefault();
          $.ajax({
              url: 'api/userLogin',
              method: 'POST',
              data: JSON.stringify(formData),
              dataType: 'json',
              beforeSend: null,
              error: function(event) {
                  alert(event.responseJSON.error);
                  //debugger;
              },
              success: function(data) {
                  //Note this never gets tripped ever.
                  //It never reaches the success or failure state, it just spits
                  //out that raw html and my url points to the api.pm path.
                  if ( data._resume ) {
                      //console.log("2FA _resume hit");
                      var resume = JSON.parse(data._resume);
                      window.location.replace(resume.url);
                  } else {
                      //console.log("2FA no-resume hit");
                      window.location.replace(BASE_URL);
                  }
              }
          });
          return false;
      });

        I think you get the error message because you called exit() in your reduced example. Don't do that.
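        If the code runs under CGI::Application (the $self->query and $self->session calls suggest so, but that is an assumption on my part), the cleaner way to end the request is to let the run mode return the body and set the header through the framework, rather than printing and exiting yourself. A minimal sketch with made-up names:

        use strict;
        use warnings;
        use JSON;

        # Hypothetical run mode; assumes CGI::Application plus the Session plugin.
        sub user_login {
            my $self = shift;

            my $data = { success => 1 };    # whatever the login handler computed
            my $json = JSON->new->utf8->encode($data);

            # Let the framework emit the header instead of printing it directly.
            $self->header_props(
                -type    => 'application/json',
                -charset => 'utf-8',
            );

            $self->session->flush
                if $self->can('session') && $self->session_loaded;

            # Returning the body ends the run mode normally -- no exit() needed,
            # so mod_perl has nothing to complain about.
            return $json;
        }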

        If you then still get the HTML with the HTTP headers in the HTML, then maybe try outputting the HTTP status code first:

        print "200 OK\r\n";
        print $q->header(-status => '200 OK', ...);
        ...

        Apache looks for HTTP headers in the output of your script, and if it doesn't find them, it assumes that your script outputs raw HTML (which your browser then turns into more HTML). So if the above works, then maybe something else is outputting a string via print first, or you need to reconfigure Apache to (not?) want the HTTP status code first from your script.
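        One way to find such a stray print is to tie STDOUT early on and log a stack trace the first time anything is written to it. A rough sketch (assumption: a plain CGI process; under mod_perl STDOUT may already be tied, so adapt accordingly):

        package FirstPrintTrace;
        use strict;
        use warnings;
        use Carp ();

        sub TIEHANDLE { my ($class, $fh) = @_; return bless { fh => $fh, seen => 0 }, $class }

        sub PRINT {
            my ($self, @args) = @_;
            # Stack trace (to STDERR / the error log) for the very first print only.
            Carp::cluck("first print to STDOUT: @args") unless $self->{seen}++;
            print { $self->{fh} } @args;
        }

        sub PRINTF {
            my ($self, $fmt, @args) = @_;
            $self->PRINT(sprintf $fmt, @args);
        }

        package main;
        # Install this as early as possible, before anything else can print.
        open my $real_stdout, '>&', \*STDOUT or die "cannot dup STDOUT: $!";
        tie *STDOUT, 'FirstPrintTrace', $real_stdout;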

Re: CGI Header Breaks on Second AJAX Call
by talexb (Chancellor) on Apr 10, 2025 at 15:16 UTC

    If you had a small, self-contained example of your problem, this would help us diagnose the situation.

    Without some idea of how the CGI module is 'short-circuiting' the process, it's hard to say exactly what's going on.

    Alex / talexb / Toronto

    For a long time, I had a link in my .sig going to Groklaw. I heard that as of December 2024, this link is dead. Still, thanks to PJ for all your work, we owe you so much. RIP Groklaw -- 2003 to 2013.

Re: CGI Header Breaks on Second AJAX Call
by karlgoethebier (Abbot) on Apr 10, 2025 at 16:22 UTC

      Yeah, I remember seeing a whole article about a utility written using PSGI and Perl where the author lamented having to write additional scripts to ensure the services were running. How often was the script run? Once or twice a month, as I remember -- a use case for which CGI should have been the obvious choice.

      The problem with CGI is that it runs a separate process per invocation, but in cases where availability is paramount and usage is low, CGI just makes sense. I am one who believes it never should have been removed from the core Perl distribution.

      Celebrate Intellectual Diversity

        a utility written using PSGI and Perl where the author lamented having to write additional scripts to ensure the services were running.

        Same problem as with almost any setup where the webserver and the application run in different processes. PSGI is Perl-specific, FastCGI is not. FastCGI keeps your application running in an isolated process, so crashing the webserver does not kill the application, and crashing the application does not kill the webserver. Getting the application process running is the remaining problem.

        You could start it manually. Every time your application crashes.

        You could construct something crazy, involving hand-started scripts, cron jobs, and I don't know what else to start the application process.

        Or you could make starting your application a problem for the operating system. After all, it is able to start the webserver without manual intervention, so it can do the same for your application. (With daemontools and runit, you simply need a shell script to invoke your application. With systemd, you write an INI-style configuration file to do the same job in a much more complicated way. With a classic init system, you set up a daemon via an init script, or invoke it directly from init. On Windows, you set up a service.)

        Apache also offers mod_fcgid to manage your FastCGI application process(es) for you, instead of using operating system services. Other webservers might offer similar modules. This has the advantage that you don't need to keep the application process(es) running if there are no requests for them. The disadvantage is that they are not as isolated from the webserver as a process run as an independent daemon started by init, daemontools, runit, or systemd.
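        Whichever way the process gets started, the application side is just a persistent accept loop. A minimal sketch, assuming the FCGI module from CPAN (the socket path and response are only illustrative; under mod_fcgid the module hands the script its socket, so a plain FCGI::Request() with no arguments is enough):

        #!/usr/bin/perl
        use strict;
        use warnings;
        use FCGI;

        # Listen on a Unix socket for the webserver to connect to.
        my $socket  = FCGI::OpenSocket('/var/run/myapp.sock', 5);
        my $request = FCGI::Request(\*STDIN, \*STDOUT, \*STDERR, \%ENV, $socket);

        # The process stays alive between requests; only this loop body runs per hit.
        while ($request->Accept() >= 0) {
            print "Content-Type: text/plain\r\n\r\n";
            print "Handled by PID $$\n";
        }

        FCGI::CloseSocket($socket);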

        Alexander

        --
        Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
Re: CGI Header Breaks on Second AJAX Call
by Discipulus (Canon) on Apr 16, 2025 at 11:52 UTC
    Hello brandon8696 and welcome to the monastery!

    If I'm permitted to shoot in the dark, I'd say nph (no-parsed-headers).
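    If that guess is right, CGI.pm can generate the no-parsed-headers style of output itself via the -nph option to header(). A minimal sketch (the rest of the request handling is left out):

    use strict;
    use warnings;
    use CGI;

    my $q = CGI->new;

    # -nph makes header() print a complete "HTTP/1.0 200 OK" status line ahead of
    # the usual headers, for setups that expect no-parsed-headers output.
    print $q->header(
        -nph     => 1,
        -type    => 'application/json',
        -charset => 'utf-8',
    );
    print '{"success":1}';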

    L*

    There are no rules, there are no thumbs..
    Reinvent the wheel, then learn The Wheel; may be one day you reinvent one of THE WHEELS.