in reply to Lost characters using Device::Modem

Whilst you can approach timing problems by adding delays, I think that leads to a fragile solution and also a sub-optimal one (a lower transfer speed than is possible).

The link between the modem and the computer should implement flow control, where the receiver tells the sender when to stop and start sending. This is normally done either in-band (with XON/XOFF signalling - those ctrl-S and ctrl-Q ASCII characters that you can find locking up your xterms :-) or on separate lines of the serial connection (RTS/CTS - request-to-send, clear-to-send), which is a little better since it makes your connection 8-bit clean and doesn't take up any comms bandwidth.
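
For what it's worth, here's roughly how you'd switch on hardware flow control from the Perl side with Device::SerialPort (which Device::Modem sits on top of on Unix). The port name and baud rate are just placeholders:

    use strict;
    use warnings;
    use Device::SerialPort;

    # '/dev/ttyS0' and 9600 are placeholders - use your own port and speed
    my $port = Device::SerialPort->new('/dev/ttyS0')
        or die "Can't open serial port: $!";

    $port->baudrate(9600);
    $port->databits(8);
    $port->parity('none');
    $port->stopbits(1);

    # 'rts' selects hardware (RTS/CTS) flow control;
    # 'xoff' would give you in-band XON/XOFF instead
    $port->handshake('rts');

    $port->write_settings or die "Couldn't apply port settings";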

See the &Kn section of the link above.
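
On the modem side, &K3 selects RTS/CTS and &K4 selects XON/XOFF on most Hayes-compatible command sets (do check your modem's manual - the values vary between makes). Something like this with Device::Modem should do it:

    use strict;
    use warnings;
    use Device::Modem;

    my $modem = Device::Modem->new( port => '/dev/ttyS0' );  # placeholder port
    $modem->connect( baudrate => 9600 )
        or die "Can't connect to modem";

    # &K3 = RTS/CTS on most Hayes-compatible modems (&K4 = XON/XOFF, &K0 = none)
    $modem->atsend( 'AT&K3' . Device::Modem::CR );
    print $modem->answer(), "\n";    # should print 'OK'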

Disclaimer: I haven't done this sort of thing for a long time, but fixing timing problems with delays isn't a good idea. 0.1s between chars? You've just limited yourself to 10 chars/second, approx 80 baud :-(

Replies are listed 'Best First'.
Re^2: Lost characters using Device::Modem
by roboticus (Chancellor) on Nov 24, 2006 at 22:46 UTC
    jbert:

    I totally agree with you--you should use flow control whenever possible. Unfortunately, it's not always possible. In order to implement flow control, all devices in the chain must support it. It's not uncommon in the embedded world to connect with things that have no flow control. You may just have minimum/maximum timings on a specification sheet.

    For example, let's suppose that the modem on the SMS system, once it gets the proper "doit" string from the originator, triggers a relay that connects the modem to a transmitter. That relay could take a few milliseconds to engage, and you could easily lose the first character or two while the relay is actuating.

    --roboticus

      Oh, agreed. If you're talking to a device which won't play ball with flow control, you don't have much choice. But, as you say, hopefully you then have a spec sheet and can put in the "theoretically correct" delays and timings, or something close.
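
      Something along these lines, say - all the numbers here are made up, you'd take the real minimum timings from the spec sheet:

          use strict;
          use warnings;
          use Time::HiRes qw(usleep);
          use Device::Modem;

          # Hypothetical figures - take the real ones from the spec sheet
          my $relay_settle_us = 50_000;    # allow 50ms for the relay to engage

          my $modem = Device::Modem->new( port => '/dev/ttyS0' );
          $modem->connect( baudrate => 9600 ) or die "Can't connect";

          $modem->atsend( 'doit' . Device::Modem::CR );   # triggers the relay
          usleep($relay_settle_us);                       # wait out the actuation

          my $message = 'the real message';
          $modem->atsend( $message . Device::Modem::CR ); # first chars now survive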