PerlMonks
Re: making a loop script with a remote URL call faster by stevieb (Canon)
on Jan 15, 2022 at 17:35 UTC [id://11140474]
I have a microcontroller on my garage wall that displays information about my Tesla, with a visible charge level and an audible alarm if the charge is below a certain point so I don't forget to plug the car in. The device turns on only when there is motion in the garage, and once per second it reaches back to a computer in my house to fetch updated data about the car.

The computer in the house serves that device the data, but it also fetches that data from Tesla's API. While there is motion (i.e. the microcontroller is asking for updated data), the computer returns whatever data it has available, while repeatedly fetching fresh data from Tesla in case it changes. The Tesla fetches run back-to-back: as soon as one pull finishes, another starts immediately. This fetching happens in a separate process from the process that returns existing data to the microcontroller. When the Tesla data updates, the shared variable is updated, and the next response to the microcontroller contains the new data. Thanks to the separate process there is no lag, and the call from the controller to the server is always extremely consistent, with no delay.

Here is an extremely (!) simplified version that you might be able to use as an example. It uses my IPC::Shareable distribution to create the shared-memory-backed variable used between the two processes, and my Async::Event::Interval for the external async process that fetches the data from the website. Feel free to ask any questions. I put this together rather hastily, so I may not be explaining things very well.
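A minimal sketch of the two-process layout described above: a hash tied with IPC::Shareable is visible to both processes, and Async::Event::Interval runs the fetcher in a forked child. The fetcher here is a hypothetical stand-in (random data instead of a real Tesla API call), and the key and option values are illustrative; check each module's documentation for the full set of options.

```perl
use warnings;
use strict;

use Async::Event::Interval;
use IPC::Shareable;

# Shared hash, backed by shared memory, visible to both the parent
# process and the async child process. Key/options are illustrative.
tie my %car_data, 'IPC::Shareable', {
    key     => 'TSLA',
    create  => 1,
    destroy => 1,
};

# Hypothetical fetcher; in the real setup this would call Tesla's API.
sub fetch_from_tesla {
    $car_data{charge}  = int(rand(100));   # placeholder data
    $car_data{updated} = time;
}

# Run the fetcher in a separate process every second. (The real script
# loops back-to-back; a one-second interval keeps this sketch simple.)
my $event = Async::Event::Interval->new(1, \&fetch_from_tesla);
$event->start;

# Meanwhile, the main process is free to serve whatever is currently
# in %car_data to the microcontroller, with no delay, e.g.:
#
#     while (my $request = next_request()) {   # hypothetical server loop
#         respond($request, { %car_data });
#     }

sleep 3;
$event->stop;

printf "charge: %s, updated at: %s\n",
    $car_data{charge} // 'n/a',
    $car_data{updated} // 'n/a';
```

The key design point is that the slow remote fetch never sits in the request path: the microcontroller's one-second poll only ever reads the shared hash, so its response time stays consistent regardless of how long Tesla's API takes.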
Output: the first block is from when my car was asleep; the second is from after I sent it a wakeup call.
In Section: Seekers of Perl Wisdom