Re: Web based password management (or how *not* to blame tye)

by derby (Abbot)
on Mar 24, 2002 at 20:52 UTC


in reply to Web based password management (or how *not* to blame tye)

maverick,

++. Just one question.

the IP from which they connected

What happens if the user is sitting behind a pool of proxies? Do you run the risk of a "false-positive" hijacked session?
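
For context, a minimal sketch of the kind of whole-IP check being questioned here; the session store, cookie name, and values are hypothetical, not the original node's code:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    # Hypothetical session store: session id => IP recorded at login.
    my %session_ip = ( 'hypothetical-sid-123' => '10.0.0.5' );

    my $q   = CGI->new;
    my $sid = $q->cookie('session_id') || '';
    my $ip  = $ENV{REMOTE_ADDR}        || '';

    # Naive check: the current request IP must exactly match the IP stored
    # at login.  A user whose requests arrive through a pool of proxies can
    # fail this even though the session was never hijacked.
    if ( !exists $session_ip{$sid} || $session_ip{$sid} ne $ip ) {
        print $q->header( -status => '403 Forbidden' ),
              "Possible session hijack -- access denied\n";
        exit;
    }

    print $q->header, "Session OK\n";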

-derby


Replies are listed 'Best First'.
Re: Re: Web based password management (or how *not* to blame tye)
by mattriff (Chaplain) on Mar 24, 2002 at 21:24 UTC
    In my experience, yes you do. :)

    I worked on a web application that initially compared the whole IP address on each access, and we started getting quite a few reports of problems from people behind proxy pools.

    Backing up a bit and only checking whether the IP is in the same /16 or /24 (that is, comparing only the first two or three octets) helps, although it doesn't eliminate the problem entirely (and it really weakens the effectiveness of the test).
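
    A minimal sketch of that relaxed comparison, assuming plain dotted-quad IPv4 addresses (an illustration, not the application's actual code):

        use strict;
        use warnings;

        # Compare only the first $octets octets of two dotted-quad IPv4
        # addresses: $octets == 3 is roughly a /24 check, $octets == 2 a /16.
        sub same_prefix {
            my ( $ip_a, $ip_b, $octets ) = @_;
            my @a = split /\./, $ip_a;
            my @b = split /\./, $ip_b;
            for my $i ( 0 .. $octets - 1 ) {
                return 0
                    unless defined $a[$i] && defined $b[$i] && $a[$i] == $b[$i];
            }
            return 1;
        }

        print same_prefix( '192.168.10.7', '192.168.10.200', 3 ) ? "same /24\n" : "different\n";
        print same_prefix( '192.168.10.7', '192.168.99.200', 3 ) ? "same /24\n" : "different\n";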

    Checking IPs can be useful in some situations, but for large-scale applications where the "general public" will be connecting to your interface, I wouldn't recommend it.

    - Matt Riffle

      I worked on a web application that initially compared the whole IP address on each access, and we started getting quite a few reports of problems from people behind proxy pools.

      Really? I figured this sort of thing would be a very fringe condition: that most people weren't behind proxies at all, that of the proxied people most had only one, and that of those with multiple proxies the auto-proxy configuration script would pick one from the pool at random, or round-robin. Sending different requests from the same browser through different proxies seems counterproductive to efficient caching...

      /\/\averick
      perl -l -e "eval pack('h*','072796e6470272f2c5f2c5166756279636b672');"

        I've been doing something similar (though I also included the browser identification and several other things sent by the browser) and found out that ALL users of the AOL browser have this problem. A friend tried to connect using the AOL browser and was kicked out on the very first page, so he switched to the normal MSIE (over the same modem connection!) and everything was fine.
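
        For illustration only -- not Jenda's actual code -- a fingerprint along those lines might hash a few of the headers the browser sends, for example:

            use strict;
            use warnings;
            use Digest::MD5 qw(md5_hex);

            # Hypothetical fingerprint built from headers the browser sends.
            # Including REMOTE_ADDR is what breaks for AOL users, whose
            # requests come out of a rotating proxy pool.
            my $fingerprint = md5_hex( join '|',
                $ENV{HTTP_USER_AGENT}      || '',
                $ENV{HTTP_ACCEPT_LANGUAGE} || '',
                $ENV{REMOTE_ADDR}          || '',
            );
            print "$fingerprint\n";
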
        == Jenda@Krynicky.cz == http://Jenda.Krynicky.cz ==
        Always code as if the guy who ends up maintaining your code
        will be a violent psychopath who knows where you live.
              -- Rick Osborne, osborne@gateway.grumman.com

        The issue isn't always caching, though ... I've run into this problem more than once by now. For example, my previous school has 3 DSL lines and 3 distinct ISDN lines. Each of those ends up in one computer, which creates a PPP connection on each of the lines. HTTP (and a few other TCP protocols) is then load-balanced transparently -- there are no automatic proxy selectors. The clients get one IP address to use as a proxy, and this proxy sends distinct requests out over the connections in weighted round-robin fashion.

        Since the school only has about 40 clients, this makes sense (that way the lines are utilized about equally). I have seen similar setups elsewhere in schools and even at LAN parties ...

        Scripts that check the originating IP (as opposed to a session ID) are always a PITA with such a setup -- and not really all that much more secure (IP spoofing then becomes the concern). And if an attacker can easily hijack a session, one usually has different, bigger problems ;)
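
        A minimal sketch of the session-ID style of check mentioned above; the token, the in-memory store, and the expiry policy here are hypothetical:

            use strict;
            use warnings;

            # Validate a session purely by a hard-to-guess token plus an
            # expiry time -- deliberately not by REMOTE_ADDR, so transparent
            # load balancing or rotating proxies don't raise false alarms.
            my %sessions = (
                'hypothetical-token-123' => { user => 'alice', expires => time() + 1800 },
            );

            sub session_valid {
                my ( $store, $sid ) = @_;
                my $s = $store->{$sid} or return 0;
                return 0 if $s->{expires} < time();
                return 1;
            }

            print session_valid( \%sessions, 'hypothetical-token-123' ) ? "ok\n" : "rejected\n";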

        Even if those setups were built for caching content (as opposed to just proxying), there is nothing to say the cache datastore isn't shared by all the interfaces, at least when they are on the same computer, so the arrangement isn't necessarily counterproductive to efficient caching ...

        You have just described AOL. If my memory serves me correctly, I have seen sub-second IP address changes in my web server logs; 1 user, 1 burst of 3 image requests, 3 different IP addresses.
