It seems to me that the general strategy is to find a way to do non-blocking I/O, so that one of your HTML divs doesn't stall the whole page.
Of course, after playing around with a few Perl GUI libs, like Tk and Gtk2, you see the value of not having a direct socket-connection loop as your main loop. Instead, you run event-loop systems on both ends, client and server. The event loops make and service the socket connections, so the sockets can be read from or written to in a non-blocking manner.
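For what it's worth, here is a minimal sketch of that idea in Node.js, using the built-in net module (the port number and the echo behaviour are just placeholders): the event loop owns the sockets, and no read ever blocks the process.

<code>
// Event-loop driven sockets: the loop services every connection,
// no per-socket blocking read loop anywhere.
const net = require('net');

const server = net.createServer((socket) => {
  // 'data' fires whenever bytes arrive on this particular connection;
  // other connections keep being serviced in the meantime.
  socket.on('data', (chunk) => {
    socket.write(`echo: ${chunk}`);   // writes are queued, also non-blocking
  });
  socket.on('error', (err) => console.error(err.message));
});

server.listen(8080, () => console.log('event loop servicing sockets on 8080'));
</code>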
So the whole idea of Web 2.0 is to break your screen into little divs (divisions), and each little div can run its own JavaScript routine, automatically opening connections and transferring information. With event-loop based non-blocking I/O, many connections can be open simultaneously from your browser window, each updating independently. Instead of refreshing the entire page, you only refresh the div which originated the request.
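A browser-side sketch of one such div (the element id 'status' and the '/status' endpoint are made up for illustration): it polls on its own timer and only rewrites itself, never the whole page.

<code>
// One div refreshing itself without a page reload.
async function refreshStatusDiv() {
  const response = await fetch('/status');             // non-blocking request
  const text = await response.text();
  document.getElementById('status').innerHTML = text;  // only this div changes
}

// Poll every few seconds; other divs can run their own, independent timers.
setInterval(refreshStatusDiv, 5000);
</code>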
The much heralded node.js is, at its core, an event-loop runtime for JavaScript, which makes it easy to handle JSON (JavaScript Object Notation) packets of data. Furthermore, the browser itself is driven by an event loop, with JavaScript hooked into it. If you turn off JavaScript, you only prevent scripts from being run; the basic browser event loop continues to run.
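A rough sketch of what a node.js endpoint speaking JSON looks like (port 3000 and the response shape are arbitrary); the event loop dispatches each request, and JSON.parse/JSON.stringify do the packaging.

<code>
const http = require('http');

http.createServer((req, res) => {
  let body = '';
  req.on('data', (chunk) => { body += chunk; });   // body arrives in chunks
  req.on('end', () => {
    let payload = {};
    try { payload = body ? JSON.parse(body) : {}; } catch (e) { /* ignore bad JSON */ }
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ received: payload, time: Date.now() }));
  });
}).listen(3000);
</code>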
So..... if I were designing a general-purpose strategy, it would be event-loop based on both the server and the client, AND the client should be able to be integrated into a web browser, which should be easy enough with JavaScript. It needs to handle things like I/O Promises, which are the event loop's way of saying "the information is being readied for delivery".
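A small sketch of a Promise in action, assuming a hypothetical '/data' endpoint that returns JSON: the chain only runs once the event loop says the data has actually been delivered, and nothing blocks while waiting.

<code>
fetch('/data')
  .then((response) => response.json())                // resolves when the body is parsed
  .then((data) => console.log('delivered:', data))    // runs only after delivery
  .catch((err) => console.error('delivery failed:', err));
</code>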
Those are my current thoughts. :-)