in reply to Re: Are there any issues with JSON
in thread Are there any issues with JSON

If I never saw a twirly... I'd be a happy camper, too! :)

Thanks for taking the time to suggest RPC::Any::Server. I see it also speaks JSON -- RPC::Any::Server::JSONRPC and RPC::Any::Server::JSONRPC::HTTP. In your experience, does it provide anything close to what I'm attempting to accomplish? I can see where it provides for pretty quick responses. But I fear that, now that I've better articulated my needs, this might not quite hit the mark.

Sorry for any sort of misunderstanding I may have caused.

--Chris

¡λɐp ʇɑəɹ⅁ ɐ əʌɐɥ puɐ ʻꜱdləɥ ꜱᴉɥʇ ədoH

Re^3: Are there any issues with JSON
by sundialsvc4 (Abbot) on Jun 04, 2014 at 04:24 UTC

    Yes, Chris, I think that it does ... if (smile! wink!) you have a clear idea of what you’re attempting to accomplish and how it must be done. But I’m not quite sure that there is clarity on that point yet.

    As I see things, on the client side you are going to have to pick your poison as to (yeah, big phat) JavaScript libraries. There is simply no other way to get the things that you want on that side, such as nice editors. You are going to build out your application substantially as a JavaScript program.

    At this juncture, though, may I please send you to peek at http://www.haxe.org, and ask you to look beyond the look of that site? (Site sucks; tool does not.) But I digress.

    This leaves us with the Perl side, where you are going to be using the JS to issue JSON requests (or XML -- again, pick your poison) to the server side, and where you are going to need to provide support for every single one of them. I like RPC::Any::Server because it is quite logical and extensible, and I find that it scales up nicely. As I said, it will be very important that you “do your homework well” here before you embark. Stay in port with the sails furled until you have planned and researched the entire voyage, for there be dragons.
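    For what it’s worth, the RPC::Any documentation suggests the server side can be set up in only a few lines. The sketch below follows the shape of its synopsis; the dispatch names (`Blog`, `My::CMS::Blog`) are hypothetical placeholders, and this is a sketch rather than a tested deployment:

    ```perl
    use strict;
    use warnings;
    use RPC::Any::Server::JSONRPC::HTTP;

    # Map a JSON-RPC method prefix to a Perl package: a call to
    # "Blog.render" would dispatch to My::CMS::Blog->render(...).
    # Both names here are invented for illustration.
    my $server = RPC::Any::Server::JSONRPC::HTTP->new(
        dispatch => { 'Blog' => 'My::CMS::Blog' },
    );

    # Under CGI, this reads the HTTP request (headers plus JSON body)
    # from STDIN and prints a complete HTTP response.
    print $server->handle_input() if $ENV{GATEWAY_INTERFACE};
    ```

    The same server class hierarchy can sit behind plain CGI or a persistent environment, which is part of why it scales up cleanly.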

    Once you do “pick your JavaScript poison,” you ought to be able to find corresponding support for it in the form of other CPAN modules. Take full advantage of these, because it means that someone else has banged his head mightily against something that, maybe, you at least won’t have to bang your own head against quite as much. (Roll-eyes ....)
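    To make the wire format concrete: whatever client library gets picked, all that travels to the Perl side is a small JSON envelope. Here is a minimal round-trip sketch using core JSON::PP; the method name and parameters are invented for illustration:

    ```perl
    use strict;
    use warnings;
    use JSON::PP;   # in core since Perl 5.14

    my $json = JSON::PP->new->utf8->canonical;

    # A JSON-RPC 2.0 request, as the browser-side JS would send it.
    my $request = $json->encode({
        jsonrpc => '2.0',
        method  => 'Page.fetch',        # hypothetical method name
        params  => { slug => 'home' },  # hypothetical parameter
        id      => 1,
    });

    # On the server, decode and inspect it before dispatching.
    my $call = $json->decode($request);
    printf "method=%s slug=%s\n", $call->{method}, $call->{params}{slug};
    ```

    A framework such as RPC::Any::Server handles this envelope (and the matching response/error objects) for you, which is exactly the boilerplate you don’t want to hand-roll per method.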

      Greetings, sundialsvc4, and thank you for such a thoughtful and concise reply.

      Honestly, where JavaScript is concerned, I want to keep it simple. As noted previously, and elsewhere, I'm not keen on the addition of JS anyway.

      My main points of contention are that JS is fragile, in the sense that it's easily disabled (within the browser), which leads to [potential] failure of any aspect [depending] on JavaScript. The other point: I don't at all care for those "BloatWare" libraries floating around -- I'm uncomfortable going to sites that use them, mostly because I think they're easily abus(ed|ive). I have no idea what's in them, and they could easily be a big fat security risk. Further, as my CPU meter indicates, even with 3 cores running at 3.2 GHz and 128 GB of RAM, the meter skyrockets when subjected to these libraries. Can you say inefficient, poorly designed, spaghetti code? Sure, I knew you could.

      So, for me, I think those are reasons enough for anyone to feel compelled to reject such option(s). Sure, you could always examine the JS library and make any changes/corrections you felt necessary. But in the end, wouldn't that just be adding yet another big JS library to the pile?

      In the end, and as you so wisely note, forethought is required in order to make the correct choice the first time, lest one squander precious time needlessly. :)

      As I'm already using XML (XML::Parser), I'm feeling inclined to travel down that road. In fact, I'm also looking at possibly replacing XML::Parser with one, or perhaps both, of the modules created by Jenda -- XML::Rules and XML::DTDParser. By sticking to the XML path, I think I keep the current path of the CMS both narrower and more direct. I can then stay more easily focused, not having to keep changing hats while working on it.
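      In case it helps the evaluation: XML::Rules lets you declare, per tag, how each element folds into its parent, instead of hand-writing XML::Parser start/end/char handlers. A rough, untested sketch -- the document shape (`site`/`page`/`title`/`body`) is invented, and the exact rule strings are worth checking against the module's docs:

      ```perl
      use strict;
      use warnings;
      use XML::Rules;

      # Per-tag rules: leaf tags collapse to their text content;
      # container tags keep a hash of what their children produced.
      my $parser = XML::Rules->new(
          rules => {
              'title,body' => 'content',
              'page'       => 'no content',
              'site'       => 'no content',
          },
      );

      my $data = $parser->parse(<<'XML');
      <site><page><title>Home</title><body>Hello</body></page></site>
      XML

      # $data is a nested hashref mirroring the rules above.
      print $data->{site}{page}{title}, "\n";
      ```

      The appeal over raw XML::Parser is that the rules table reads as a small spec of the document, which fits the "narrower, more direct" path you describe.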

      I'd like to close by thanking you again, sundialsvc4, for both your time and, perhaps more importantly, your patience, given my boneheaded initial conception where JSON is concerned.

      --Chris

      ¡λɐp ʇɑəɹ⅁ ɐ əʌɐɥ puɐ ʻꜱdləɥ ꜱᴉɥʇ ədoH

        Search for “graceful degradation” and “progressive enhancement” to learn more about considerations and strategies. If you’re committed to a perfectly atomic app you’ll have to look in other places though. The magic of the web is its deep, if sloppy and choppy, interoperability. Web pages from the early 90s load fine on Chrome or a PSP. Many, though obviously not all, web pages from this year would load “ok” on Mosaic or Navigator.

        What you gain from committing to iOS or something, you lose by committing to iOS. :P A unified app has to be specially built/compiled for every platform, sometimes in proprietary languages, and sometimes recompiled for every platform upgrade, firmware change, etc. A Perl/JS app can be run with just a little forethought on nearly limitless combinations of platforms, agents, and their manifold versions and is somewhat future proof by default just because of the nature of the web.

        P.S.

        Just had a look at haxe. Seems [at least] a bit like Clang && LLVM.

        How well would it work on my Timex Sinclair? ;)

        I think [where my CMS is concerned] it's a bit more than my needs require. But I could easily see where it'd fit in if some [compiled] form of JavaScript were concerned.

        --Chris

        ¡λɐp ʇɑəɹ⅁ ɐ əʌɐɥ puɐ ʻꜱdləɥ ꜱᴉɥʇ ədoH