in reply to Serial Port timing and write blocking?

A couple of thoughts:


TGI says moo

Re^2: Serial Port timing and write blocking?
by worldsight (Initiate) on Aug 04, 2008 at 19:57 UTC
    TGI,

    You were correct on the first part; I had that in my code but watered it down a bit for the forum. I've corrected that math error here for clarity. Thanks.

    As for write_bg, that's something I'll definitely try; I've never used it before.

    As for the clock on the device I'm writing to: it's not really important, because the receiving device loops and waits for 2 bytes of input. As soon as the receiving device gets its 2 bytes, it takes action based on the data code it received. The sending device (as you can guess from the code) is a Windows PC.
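    Since the receiver blocks until it has exactly 2 bytes and then acts on them, each write needs to emit exactly two bytes, no more and no less. The thread's code is Perl, but the framing idea is language-agnostic; here is a minimal sketch in Python (the command code value is made up for illustration):

```python
import struct

def encode_command(code):
    """Pack a command code into exactly 2 bytes, big-endian.

    The receiver loops until it has read 2 bytes and then acts on
    them, so every frame must be exactly this size.
    """
    if not 0 <= code <= 0xFFFF:
        raise ValueError("command code must fit in 16 bits")
    return struct.pack(">H", code)

def decode_command(frame):
    """Reverse of encode_command: 2 bytes back to an integer code."""
    (code,) = struct.unpack(">H", frame)
    return code

frame = encode_command(0x01A5)   # 0x01A5 is an arbitrary example code
assert len(frame) == 2
assert decode_command(frame) == 0x01A5
```

    In Perl, pack('n', $code) produces the same big-endian 2-byte frame.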

      How much out of sync are you getting: 1 ms, 100 ms, or 10 s? How much drift is acceptable?

      What exactly should the timing look like? Consider the timing diagram below:

         A. Start sending command  C. Wait period ends
         /                           /
      ---+------+--------------------+-----------+----
                 \                                \
                  B. Command finish transmission   D. Send next command.

      If I understand correctly, you want to have the time between A and C be X milliseconds as determined by input from the file. The time between C and D should be as close to zero as possible.
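      If the goal is exactly X milliseconds from one A to the next, note that sleeping a relative X ms after each send will drift, because transmit time and loop overhead pile on top of every sleep. Scheduling against absolute deadlines avoids the accumulation. A sketch of the idea in Python (the names and the stand-in send callback are mine, not from the thread's code):

```python
import time

def pace(intervals_ms, send):
    """Send one command per interval, scheduling against absolute
    deadlines so per-iteration overhead does not accumulate.

    A relative time.sleep(interval) after each send drifts, because
    the time spent transmitting and looping is added on top of every
    sleep. Anchoring every deadline to one start time avoids that.
    """
    deadline = time.monotonic()
    for i, interval_ms in enumerate(intervals_ms):
        send(i)                            # A..B: transmit command i
        deadline += interval_ms / 1000.0   # C: absolute, not relative
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)          # C..D gap is only jitter

# Demo with a stand-in for the serial write:
stamps = []
t0 = time.monotonic()
pace([20, 20, 20], lambda i: stamps.append(time.monotonic() - t0))
```

      The same pattern works in Perl with Time::HiRes's time() and sleep().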

      Are you reading in all of the commands in the command set at once, and operating from memory? Or, are you reading each command from a file? The second approach could add time between C and D.
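      Slurping the whole command list into memory before the loop starts removes disk reads from the C-to-D gap entirely. A hedged sketch, assuming an invented one-command-per-line `delay_ms,code` file format:

```python
import csv
import io

def load_commands(fh):
    """Read the whole command file before the timing loop starts, so
    no disk I/O happens between one send and the next."""
    commands = []
    for delay_ms, code in csv.reader(fh):
        commands.append((int(delay_ms), int(code, 0)))
    return commands

# In-memory stand-in for a real file; a script would use open(path).
sample = io.StringIO("250,0x01A5\n100,0x0002\n")
commands = load_commands(sample)
assert commands == [(250, 0x01A5), (100, 2)]
```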

      My next step in troubleshooting this would be to try to figure out which part of the process we are in when the time is being lost or gained.
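      One way to pin that down is to timestamp every stage of each iteration with a monotonic clock and see which delta wanders. A generic sketch (the stage names are placeholders, not from the thread's code):

```python
import time

class StageTimer:
    """Accumulate per-stage durations to locate where time is lost."""

    def __init__(self):
        self.totals = {}
        self._last = time.perf_counter()

    def mark(self, stage):
        """Charge the time elapsed since the previous mark to `stage`."""
        now = time.perf_counter()
        self.totals[stage] = self.totals.get(stage, 0.0) + (now - self._last)
        self._last = now

timer = StageTimer()
time.sleep(0.01)      # stand-in for "write command to the port"
timer.mark("write")
time.sleep(0.02)      # stand-in for "wait out the interval"
timer.mark("wait")
# timer.totals now shows roughly where the loop spent its time
```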


      TGI says moo