There are three ways of handling this that I can think of.
1) Use a single common repository on a library machine. Set up common directories on the target machines and ssh trust relationships using the appropriate public keys, keep your scripts in those common directories, and push updates out regularly from the library machine (a sketch of the push follows the list).
2) Use a single common repository on a library machine, and export a single directory of usable code as an NFS mount (an example export follows the list).
3) You could do what Mindspring used to do: package all of your code with the apt package manager and set up an internal repository. Synchronization then becomes:
    apt-get update
    apt-get upgrade
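
For option 1, the push from the library machine can be as simple as an rsync-over-ssh loop run from cron. This is only a sketch; the hostnames and the /usr/local/lib/scripts path are placeholders for whatever your setup actually uses:

    #!/bin/sh
    # Push the library copy of the scripts to every target machine.
    # Assumes each target already trusts the library machine's public key.
    SCRIPT_DIR=/usr/local/lib/scripts
    TARGETS="web01 web02 db01"

    for host in $TARGETS; do
        # --delete keeps the target an exact mirror of the library copy
        rsync -az --delete "$SCRIPT_DIR/" "$host:$SCRIPT_DIR/"
    done

Run it from cron on the library machine and the targets never drift for long.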
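For option 2, the export is one line on the library machine plus one mount entry on each target; the hostname and subnet below are just examples:

    # /etc/exports on the library machine; run `exportfs -ra` after editing.
    /usr/local/lib/scripts  192.168.1.0/24(ro,root_squash)

    # /etc/fstab on each target, mounting the code at the same path.
    library:/usr/local/lib/scripts  /usr/local/lib/scripts  nfs  ro,hard,intr  0 0

Exporting read-only means only the copy on the library machine is ever edited; the targets just see the result.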
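For option 3, each target only needs a sources entry pointing at the internal repository and a cron job to run the two commands above; the repository URL and schedule are placeholders:

    # /etc/apt/sources.list.d/internal.list on each target
    deb http://apt.internal.example.com/debian stable main

    # /etc/cron.d/code-sync: pull new packages nightly at 03:00
    0 3 * * * root apt-get -qq update && apt-get -qq -y upgrade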
Update: if your code distributions include large flat-file database dumps, then I cannot recommend CVS as a distribution mechanism. I once saw CVS used as a backup mechanism for flat-file database dumps; by the time a dump reached a few hundred megabytes, it exceeded the memory on the target CVS machine and the backups started failing. Even plain scp would have been more robust with respect to size.