My first taste of real IT was a one-week high school “work experience” placement at Strathclyde University’s IT department back in the ’90s (it was probably called data processing or something back then). The systems were all Unix, and I was used to an Acorn Electron and a BBC Master, both running versions of the BASIC programming language. It was a steep learning curve, especially when I was given a manual and a pad of paper and asked to write a script. I can’t remember what the script was supposed to do, but I remember being lost in that manual, trying to figure out a strange programming language that made no sense to me.
The university IT guys obviously didn’t understand my situation, as their experience was all with Unix shell scripting and possibly some C programming. To them I was a waste of space, as ‘all computers worked the same’ and this should have been simple for a kid with a home computer.
As I grew up I learnt Pascal, C, COMAL and COBOL at college, and tried to teach myself assembly language and C++ (both without much luck) at home.
Over the last year (or so) I have written more programs in AutoIt and Batch (DOS) scripting than I have in a long time.
As with all programming languages, there are sometimes ‘quirks’ you have to get your head around, but it’s amazing what you can do with a bit of creativity.
One of my most recent Batch scripting triumphs was to resolve an issue we have with Altiris.
We have a wide range of sites in one of our sections (13 large sites with servers and 51 smaller sites, each with a local NAS), and each site needs support via Altiris. The central Altiris server does not have the spare capacity to serve all the installs, and while the network links are sufficient for standard use, pulling multiple installs over them can bring a site to a crawl.
My aim was to create a script that could determine the site, check the local data store (NAS or server) for the install, and, if it wasn’t present, copy the files from the central Altiris server to the local data store, then run the install. This sounds quite simple, but you need some safeguards: something to prevent multiple PCs from copying the files at once, some way to check that the files copied successfully last time, and a check so that if a copy was taking too long another computer would take over (assuming the original had failed to copy the files, i.e. been powered off or disconnected from the network).
So what appeared, at first look, to be a simple little script, soon became a challenge.
I’m happy to announce that the script is currently working. So what does the script do?
First it determines the site. Originally this was done using the computer name, as each site names its PCs with a site-specific prefix. After a bit of planning, though, we realised that some support staff take computers to one of the larger sites when the machines are rebuilt, or if there is a large number to build. To account for this, we realised that the real unique identifier for each site was the IP address – or in our case the gateway address – and using a FOR loop it was easy to strip the gateway address out of the IPCONFIG output.
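The post doesn’t include the script itself, so here is a minimal sketch of how a FOR /F loop can pull the default gateway out of IPCONFIG. It assumes an English-language Windows install (the “Default Gateway” label is localised) and a single active adapter with an IPv4 gateway; multiple adapters or IPv6 addresses would need extra handling.

```bat
@echo off
setlocal
rem Sketch only: grab the line containing "Default Gateway" and take the
rem text after the colon. The label assumes an English Windows install.
for /f "tokens=2 delims=:" %%G in ('ipconfig ^| findstr /c:"Default Gateway"') do (
    set "GATEWAY=%%G"
)
rem Remove the space left over from the colon split
set "GATEWAY=%GATEWAY: =%"
echo Gateway is %GATEWAY%
endlocal
```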
Another FOR loop then stripped the IP down to the data we needed and compared it to a CSV file to get the server/NAS details.
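Again, this is not the author’s actual code, but the two steps above might look something like the following sketch. The file name `sites.csv` and its `gateway,datastore` layout are assumptions for illustration.

```bat
@echo off
setlocal enabledelayedexpansion
rem Sketch only: split a gateway address into octets, then look up the
rem matching data store in a hypothetical CSV of the form gateway,datastore.
set "GATEWAY=10.20.30.1"

rem FOR /F with delims=. splits the address into its four octets
for /f "tokens=1-4 delims=." %%a in ("%GATEWAY%") do (
    set "OCT1=%%a" & set "OCT2=%%b" & set "OCT3=%%c" & set "OCT4=%%d"
)

rem Walk the CSV and keep the data store for the matching gateway
for /f "tokens=1,2 delims=," %%a in (sites.csv) do (
    if "%%a"=="%GATEWAY%" set "DATASTORE=%%b"
)
echo Data store is !DATASTORE!
endlocal
```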
So that was the easy part done: we know the site and the local server/NAS (data store), but what about the rest?
Checking for the files is easy using IF EXIST, but what if they’re not there? It would be easy to just copy the files, but what if 20 machines all try to copy at the same time? Simple: create a lock file. Each machine running the job checks for the file; if it exists, the machine loops until the file is deleted. If the lock file doesn’t exist, it is created with the name of the creating PC (for manual error checking) before Robocopy copies the files and directories over (other tools are available, and better, but Robocopy comes with Windows 7 and a version exists for Windows XP, so the easy option was taken). Once Robocopy completes, the lock file is deleted and all the PCs continue with the install from the local data store.
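As a rough sketch of the lock-file idea (the UNC paths and file names here are made up, not the author’s), the wait-loop-and-copy logic could look like this:

```bat
@echo off
setlocal
rem Sketch only: hypothetical paths for the central server and local store.
set "SOURCE=\\altiris-server\installs\app"
set "STORE=\\site-nas\installs\app"
set "LOCK=%STORE%\copy.lock"

:waitloop
if exist "%LOCK%" (
    rem Another PC is copying - wait 30 seconds and check again
    timeout /t 30 /nobreak >nul
    goto waitloop
)

rem Record which PC took the lock, for manual error checking
echo %COMPUTERNAME%> "%LOCK%"
robocopy "%SOURCE%" "%STORE%" /e /r:2 /w:5
del "%LOCK%"
endlocal
```

Worth noting: the IF EXIST check and the file creation aren’t atomic, so two PCs starting at exactly the same instant could both pass the test. For this use case that just means an occasional duplicate copy rather than dozens, which is probably an acceptable trade-off for a batch script.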
So what if the files exist but there have been changes? This was a bit trickier. I used the DIR command to check the number of files in the directory (and its subfolders), and also to report the total size of the directory tree. This is a simple (but not completely foolproof) way to check for changes and act accordingly.
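The file-count-and-size check can be sketched with DIR /S, whose final summary lines give the grand totals. This assumes an English Windows install (the “File(s)” text is localised), and note the byte count includes thousands separators:

```bat
@echo off
setlocal
rem Sketch only: hypothetical local store path.
set "STORE=C:\installs\app"

rem DIR /S /A-D prints a "n File(s)  m bytes" line per folder, ending with
rem the grand total; the last match wins, so COUNT/BYTES hold the totals.
for /f "tokens=1,3" %%a in ('dir "%STORE%" /s /a-d ^| findstr /c:"File(s)"') do (
    set "COUNT=%%a"
    set "BYTES=%%b"
)
echo %COUNT% files, %BYTES% bytes
endlocal
```

Comparing these two numbers against the central server’s copy catches most changes, though as the post says it isn’t foolproof: a file edit that leaves both the count and total size unchanged would slip through.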
So what if a PC is powered off while the lock file exists? Well, there are two checks. The first runs when the script starts: it checks the date of the lock file, and if it doesn’t match the current date the file is deleted (keeping it simple). The second is that if the batch script gives up waiting in its loop and the lock file is still there, it sends an error back.
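The stale-lock date check can be sketched with the `%%~t` modifier, which expands to a file’s modified timestamp. This sketch assumes `%DATE%` expands to a bare date with no day-name prefix and in the same format as `%%~t`, which depends on the machine’s regional settings:

```bat
@echo off
setlocal
rem Sketch only: hypothetical lock file path.
set "LOCK=\\site-nas\installs\app\copy.lock"

if exist "%LOCK%" (
    for %%F in ("%LOCK%") do (
        rem %%~tF is "dd/mm/yyyy hh:mm"; token 1 is the date part
        for /f "tokens=1" %%d in ("%%~tF") do (
            if not "%%d"=="%DATE%" del "%LOCK%"
        )
    )
)
endlocal
```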
There are loads of other error checks in the script, and it makes heavy use of CALL, IF and FOR commands to do what it does, but it does everything I had planned and more than the original concept.
So, after a lot of testing and a few changes, I’m now off to look at my next challenge.
If this post is of interest please let me know by liking or posting a comment and I may add some more of the same.