
- is there any easy way to determine if something else is listening on the serial port?

There's no portable way, but on some operating systems you might try fuser.
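
For example, on Linux, something like this sketch would list the processes holding the port open (the device path /dev/ttyS0 is an assumption; adjust to yours):

    use strict;
    use warnings;

    # Shell out to fuser(1) to list processes that have the port open.
    my $dev = '/dev/ttyS0';
    system('fuser', '-v', $dev);   # exit status 0 means someone has it open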

- the possibility that the end character might be swallowed is mentioned. I'd like to clear this up in my head: I have a start sequence of bytes, and then info about the length, so I receive only the declared number of bytes. Is there any way that the end byte of the previous message could be swallowed if it is part of the start sequence?

So you get a declared length and then read that many bytes. If any byte in the body of the message is lost in transmission, you will end up reading one byte past the end of the message (which is probably the first byte of the next message).
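
Here is a minimal sketch of that kind of length-prefixed framing with Device::SerialPort, assuming a frame layout of 0x55 0xAA followed by one length byte, a device path of /dev/ttyS0, 9600 8N1, and a read_exactly helper of my own; none of that comes from your actual protocol:

    use strict;
    use warnings;
    use Device::SerialPort;

    my $port = Device::SerialPort->new('/dev/ttyS0')
        or die "can't open port";
    $port->baudrate(9600);
    $port->databits(8);
    $port->parity('none');
    $port->stopbits(1);
    $port->write_settings or die "can't apply settings";
    $port->read_const_time(1000);          # give up after 1 s of silence

    # Keep calling read() until we have exactly $want bytes or time out.
    sub read_exactly {
        my ($port, $want) = @_;
        my $buf = '';
        while (length($buf) < $want) {
            my ($n, $chunk) = $port->read($want - length($buf));
            return undef unless $n;        # timeout: caller must resync
            $buf .= $chunk;
        }
        return $buf;
    }

    my $header = read_exactly($port, 3) // die "no frame header";
    my ($s1, $s2, $len) = unpack 'C3', $header;
    die "lost sync" unless $s1 == 0x55 && $s2 == 0xAA;
    my $body = read_exactly($port, $len) // die "short frame";
    # If one body byte was lost on the wire, the last byte read here is
    # really the first byte of the *next* frame: exactly the failure mode
    # described above.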

A really robust protocol, like the asynchronous version of HDLC used in PPP, defines a sequence of bytes (a sequence of bits in the true synchronous version of HDLC), the FLAG, that can never appear anywhere inside a frame: it always marks the start of a new frame. So you read until you find a frame header, then start reading the contents of the frame. If, while you are doing this, you find a copy of the FLAG sequence prematurely, you know that something has gone wrong and the frame you are currently reading is corrupted. You have to discard it and treat the present position (i.e. the position where you just found the FLAG sequence) as the start of a new frame. You lost a frame, but at least you are synchronized to the start of the next one (which will hopefully not be corrupted).
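
To make that concrete, here is a sketch of FLAG-delimited de-framing in the spirit of asynchronous HDLC/PPP. FLAG = 0x7E, ESC = 0x7D and the XOR-0x20 unstuffing are the real PPP (RFC 1662) conventions, but the push_byte interface is my own and CRC checking of the finished frame is left out:

    use strict;
    use warnings;

    my $FLAG = 0x7E;   # frame boundary; can never occur inside a frame
    my $ESC  = 0x7D;   # escape introducing a stuffed byte

    my @frame;         # bytes of the frame being collected
    my $escaped = 0;

    # Feed one received byte; returns a complete de-stuffed frame when a
    # FLAG is seen, undef otherwise.
    sub push_byte {
        my ($byte) = @_;
        if ($byte == $FLAG) {
            # A FLAG always marks a frame boundary. Hand back whatever we
            # collected (the caller should CRC-check it and drop it if it
            # is a corrupted fragment) and resynchronize from here.
            my $done = @frame ? pack('C*', @frame) : undef;
            @frame   = ();
            $escaped = 0;
            return $done;
        }
        if ($byte == $ESC) { $escaped = 1; return undef; }
        $byte ^= 0x20 if $escaped;
        $escaped = 0;
        push @frame, $byte;
        return undef;
    }

Because the FLAG can never be data, a receiver that starts mid-stream simply discards everything up to the first FLAG and is synchronized from then on.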

This robustness requires a protocol that guarantees such a sequence can never occur except at the very start of a frame. 0x55 0xaa might serve that purpose in your protocol, but I suspect that the protocol you are dealing with is a little dumber than that and that 0x55 0xaa can appear inside the data if you are unlucky. My guess is that this is some kind of embedded device and it doesn't use a very wonderful protocol. In that case your ability to resynchronize to the start of a frame in the face of a corrupted frame is quite limited, and you should complain to the protocol designer. By the way, the ability to synchronize also matters when your application starts: if the first thing it reads is a gibberish fragment of a message sent before your application started, you need some way to know when the gibberish is over and the real fun starts.

Synchronization over data transmission lines is not a concept reserved for the telecommunications field. An example of a data encoding quite different from HDLC that also provides excellent resynchronization is UTF-8: every continuation byte matches the bit pattern 10xxxxxx, so a decoder that loses its place only has to skip forward to the next lead byte to be back on a character boundary.
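
A sketch of that resynchronization step (next_char_boundary is an illustrative name of mine, not a standard function):

    use strict;
    use warnings;

    # Return the index of the next UTF-8 character boundary at or after
    # $pos: skip continuation bytes, which all look like 10xxxxxx.
    sub next_char_boundary {
        my ($bytes, $pos) = @_;
        my @b = unpack 'C*', $bytes;
        $pos++ while $pos < @b && ($b[$pos] & 0xC0) == 0x80;
        return $pos;   # next lead byte, or end of buffer
    }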

In answer to your last question, you could probably use timing heuristics to work around a bad protocol. Expect some trial and error.
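
For instance, Device::SerialPort's read_char_time/read_const_time timeouts can turn an inter-byte gap into a frame boundary, assuming the device actually pauses between messages. The timeout values below are guesses you would have to tune, and process_frame is a placeholder of my own:

    use strict;
    use warnings;
    use Device::SerialPort;

    my $port = Device::SerialPort->new('/dev/ttyS0')
        or die "can't open port";
    $port->read_char_time(50);     # at most 50 ms of silence between bytes
    $port->read_const_time(500);   # and 500 ms overall per read() call

    sub process_frame {
        my ($f) = @_;
        print "frame: ", unpack('H*', $f), "\n";   # hex dump placeholder
    }

    my $frame = '';
    while (1) {
        my ($n, $chunk) = $port->read(255);
        if ($n) {
            $frame .= $chunk;          # still inside a burst of bytes
        }
        elsif (length $frame) {
            process_frame($frame);     # gap seen: assume frame complete
            $frame = '';
        }
    }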