FreakoutITGeek's Blog

Random IT postings from Freakz

Monthly Archives: March 2013

Remote support of Windows without Altiris

I’ve recently posted some information about the work I do with Altiris, but I realise that there are loads of organisations out there that don’t have Altiris or similar deployment tools. So what can they do to save time and effort supporting their desktop (and server) users?

Well, in my opinion, nearly everything you can do in Altiris you can do without it (if you’re creative enough).

Remote tools

Regedit: Microsoft’s humble registry editor, which support people have used time and again to resolve issues by tweaking registry keys in Windows. Many don’t realise that you can remotely connect to another PC’s registry from your own (administrator rights required). Now it’s true that you don’t get the full set of registry keys, but in my experience most issues can be fixed as long as you have access to HKEY_LOCAL_MACHINE (which remote regedit gives you).
Using File / Connect Network Registry… and then providing the name or IP of the remote PC, you can add, delete and change keys and modify their permissions without having to visit the PC (or annoy the user by kicking them off it).

Reg: this command line utility allows you to query, add, delete, amend, copy, save, load, unload and compare registry keys, among other handy features. Using some simple batch scripting techniques you could easily create an inventory of a PC, report registry entries and change them if wrong, and all manner of cool possibilities. Whilst some of the commands are designed to run on the local machine (see the tip below about AT), most support the \\PCName\HKLM\… convention to run the command against a remote PC.
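
As a rough illustration, here is a minimal batch sketch of checking and correcting a value on a remote PC. The PC name, key and value are made-up examples; remote access only covers HKLM and HKU, and admin rights are required:

    rem Read the current value from the remote PC's registry
    reg query "\\PC0123\HKLM\SOFTWARE\MyCompany\Config" /v LogLevel

    rem Force it to the required setting if the query shows it is wrong
    reg add "\\PC0123\HKLM\SOFTWARE\MyCompany\Config" /v LogLevel /t REG_DWORD /d 1 /f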

AT: another long-standing command line Windows utility that goes under the radar. AT is a scheduling utility which can (admin rights required) create a scheduled task to run on a local or remote PC. The task runs according to a schedule set up as part of the command and can be interactive (ie users see what’s happening) or not. The command can even be set to repeat if this is required.
This is a handy replacement for Altiris scheduled jobs and allows commands that can only be run on the PC (ie there’s no remote option) to be run on a remote PC of your choosing. When used with some batch, shell, Windows Scripting Host or other scripting, some logging and admin can be built in, possibly saving logs to a central network location. Please note that for security reasons (I’m thinking of the Blaster virus) the task scheduler may be disabled on a remote PC (possibly via Group Policy).
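
A minimal sketch of the idea, with a made-up PC name, time and script path:

    rem Schedule an install script to run on the remote PC at 18:30 today
    at \\PC0123 18:30 /interactive "cmd /c \\fileserver\support$\install_app.cmd"

    rem Run AT against the PC with no parameters to list its scheduled jobs
    at \\PC0123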

Windows shares: Windows (by default) has two hidden shares, C$ (the C drive of the PC) and admin$ (%SYSTEMROOT%, ie c:\windows or equivalent). These are commonly known as administrative shares, and the $ at the end of the share name indicates that the share is hidden when browsing to the PC. If these shares are not disabled by Group Policy (or another security method), installers and required files can be copied to the PC so that you can perform whatever task you need (possibly with other tools such as AT).
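
Combining this with AT gives a poor man’s software deployment. A minimal sketch, with made-up names and paths:

    rem Make sure the target folder exists, then copy the installer over
    md "\\PC0123\C$\Temp" 2>nul
    xcopy /y "\\fileserver\support$\setup.exe" "\\PC0123\C$\Temp\"

    rem ...then schedule it to run locally on that PC
    at \\PC0123 18:30 "cmd /c C:\Temp\setup.exe /quiet"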

Group Policy: most desktop support staff don’t have access to Group Policy, as in my experience it is seen as part of a server administrator/support role; however, if you do, then it’s a godsend. Microsoft allows you to change registry settings and install programs via Microsoft Installer (MSI), and many vendors provide administrative add-ons for Group Policy that you may be able to use to support end users. Group Policy is very powerful, so it’s best to know what you’re doing, but it’s handy for things like locking PCs down to prevent users from installing unwanted applications, forcing settings (Windows or other software vendors’) etc.

Microsoft server toolkits: (such as the Windows Server 2003 Resource Kit) these toolkits contain a wealth of tools and utilities that can be used either remotely or on the local PC (see AT) to perform actions such as configuring printers (eg PrintUI allows installation of printer drivers, printer ports and printer queues), investigating Group Policy issues and much more.
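
For example, PrintUI (built into Windows itself) can add a per-machine printer connection on a remote PC. A minimal sketch, with made-up machine and queue names:

    rem Add a per-machine printer connection on the remote PC PC0123
    rundll32 printui.dll,PrintUIEntry /ga /c\\PC0123 /n\\printserver\Floor2-Laser

    rem The connection appears for all users once the remote print spooler restarts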

SysInternals: SysInternals were a group of IT people who created little hacks to do things that Windows didn’t. Microsoft eventually bought them and allowed them to continue developing the tools. Most of these are diagnostic tools, but many are extremely handy, such as Process Explorer, which lets you investigate all the running processes (and their associations), not just what Task Manager shows [handy for virus and malicious software removal]. AutoRuns is another handy diagnostic tool to identify what applications run at startup [handy for investigating viruses or slow startup of Windows PCs]. The list of tools is massive and grows as Windows develops and changes.

AutoIt: AutoIt is a programming tool that uses a Visual Basic Script-like language but is capable of things Visual Basic Script can’t do. My most common use for this tool is reading the prompts that appear during an install that can’t be run silently, allowing some form of automation and allowing installs to be done in a fraction of the time they would take manually. AutoIt can read the title and content of a prompt (and sometimes the buttons) and can be scripted to send key presses or mouse clicks to buttons, dialogue boxes etc, freeing the IT support person from repeatedly entering details and preventing manual errors during installs. I have found it especially useful for installations to suites of 20+ computers, when the vendor uses a non-standard installer (usually created in Visual Basic or Borland Delphi) or one created without the proper silent install functions.
AutoIt can also be used for simple actions like checking for network drives, checking permission to a folder/file or creating/deleting/modifying registry entries, which can be handy if you want to check users’ permissions or force a registry change or similar before starting a program (rename the original .exe, replace it with an AutoIt script compiled as an .exe, then have that call the renamed original .exe).
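
For brevity, here is that exe-swap trick sketched as a batch wrapper rather than a compiled AutoIt script; the share name, registry key and app_orig.exe are all made-up examples:

    @echo off
    rem Check the user can reach the application's network share first
    if not exist "\\fileserver\appdata$\nul" (
        echo Cannot reach the application share - contact IT support.
        exit /b 1
    )

    rem Force a registry setting before the program starts
    reg add "HKCU\Software\MyApp" /v ServerName /t REG_SZ /d fileserver /f >nul

    rem Finally, launch the renamed original executable from the same folder
    start "" "%~dp0app_orig.exe"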

Scripting: scripting languages such as Visual Basic Script, JavaScript (using Windows Scripting Host) and simple batch commands can be used to perform complex tasks that other tools just can’t. With a bit of practice and research these tools can add/delete printers, find uninstall commands, script Windows installs and so much more. Some of these commands can be run remotely, but if not, some creative thinking and tools such as the AT command (see above) are all you need.
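
As one example, a single reg command can find an application’s uninstall command by searching the Uninstall key (the product name “SomeApp” is a made-up example):

    rem Search the Uninstall key recursively for "SomeApp" in value data
    reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall" /s /f "SomeApp" /d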

[to be continued]


Altiris (intro)

Note: despite Altiris Deployment Server version 6 being quite dated, I feel that the information provided in this section of my blog is useful, even for IT support staff using other deployment methods (most of the Altiris jobs I have created could be adapted to use batch scripting and work equally well – something I have had to do in the past when our Altiris server was unavailable for several days).

Quite a few years ago the company I presently work for purchased Altiris Deployment Solution and Altiris Notification Server.

At that point Altiris Inc. were still running the show and we were introduced to version 6. Since then Symantec has purchased Altiris and brought a new version to market with new features, additions and improvements, merging some of the functions of Notification Server and Deployment Server into a new version 7 product.

Unfortunately I have not been able to use version 7, so I can’t advise on its merits or otherwise; these postings will focus on version 6.

There are rumours that version 7 is currently running on one of our servers in some testing and inventory control capacity; unfortunately this is outwith my current role and remit, so the information and access is ‘cloak and dagger’. The only reason I’m aware of its existence is my interest in the product and colleagues’ appreciation of my ongoing investigation and development of skills in relation to version 6.

So what is this blog posting about?

Well, I thought I’d share some insights from the past few years using Altiris, with a view to helping others develop their skills (and possibly helping myself via feedback on these postings).

Looking back at the early Altiris deployment jobs I created, most appear to be a case of push out the installer and hope for the best (something it appears my colleagues still do!!?). So some of my future posts will cover the things I believe support staff have to consider when creating an Altiris deployment job.

We currently run, as most organisations do, on a mixed platform of OSX, Windows XP and Windows 7, and as such I’ve learnt how to utilise Altiris to standardise the computer estate and make supporting it easier within the sites that I cover.

At present the organisation does not support Apple OSX, much to my personal dismay. However, there are Altiris v6 and v7 clients available, and with a bit of knowledge of Unix and the OSX command line [see Mac OS X Server Command-Line Administration on Apple.com] and access to a NetBoot facility [OSX Server, or shareware DHCP/NetBoot utilities for Windows/OSX], Altiris is capable of doing as much with Apple OSX as you can with Windows (or Unix/Linux for that matter). (If there is interest in this topic I may dig out my old documentation/notes and create a post.)

So I hope you don’t find the intro too boring, and I hope that the follow-up posts are of benefit.

Altiris: Part 1

I thought I’d give my insights into what I believe to be best practice when it comes to creating deployment jobs.

Looking back at my early days creating Altiris deployment jobs, most of them appear to be a case of push out the installer and hope for the best (something it appears my colleagues still do!!?). So what do I suggest has to be considered before creating an Altiris deployment job?

Here are my top recommendations.
(more details to follow in further posts; a rough batch sketch of a few of these checks follows the list)

  • What operating system is the software compatible with? Are there different installers for different OSes?
  • What software does the installer rely on (ie prerequisites)? Is the required software already installed? Can you check? Can you install it if not?
  • Is there sufficient space on the client to install the software? (Some jobs may need to copy the install files locally before they are run.)
  • Are other installs (ie Windows Update, previous Altiris jobs) running which could cause the install to fail?
  • Which type of installer does the application use? Does it support silent/quiet/unattended/admin install? How can you tell?
  • Has the installer really worked, and not just returned successfully? How do you check?
  • Have you cleaned up any temporary files and/or folders created during the install?
  • Have you updated the inventory?
  • Has the installer completed before the ‘get inventory’ task runs?
  • Are there any permissions/access changes required (registry keys/folders)?
  • Are changes needed to the program, config files or shortcuts? Or is additional software required to check for access (ie to a network share)?
  • Have you tested your install? Is there a standard build/department build you can test it with?
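
To make a few of these checks concrete, here is a minimal batch sketch of a wrapper around an installer – the names and paths (“SomeApp”, setup.exe, the log share) are made-up examples, and the silent switches will vary per installer:

    @echo off
    rem Already installed? Skip the job if the application's uninstall key exists.
    reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\SomeApp" >nul 2>&1
    if %errorlevel%==0 exit /b 0

    rem Run the installer silently and check it really returned success.
    setup.exe /quiet /norestart
    if not %errorlevel%==0 (
        echo %computername%: install failed with code %errorlevel% >> \\fileserver\logs$\SomeApp.log
        exit /b 1
    )

    rem Clean up temporary files created during the install.
    if exist "%temp%\SomeApp" rd /s /q "%temp%\SomeApp"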

More to follow…

Head in the cloud(s)

Recently I was approached by a cloud computing organisation about a role they were looking to fill within their current structure.

Now I’ve been an IT support guy for many years and, until that approach, my view of the cloud was either Apple’s iCloud or the other offerings from computer companies which, to me, appeared to have just taken their most popular application, recreated it on the internet and called it something new. They were repackaging something because it was the latest thing to do and they wanted to reach all the people with tablet computers and smartphones, but it was really the same old internet and the same software, just with a fancy front end.

But how wrong I was…

I have now had a look at the services offered by Amazon Web Services (AWS) and all I can say is “Oh my!!”… What is presented on the aws.amazon.com site is enough to make an IT geek drool and believe they have found nirvana.

Now, as much as I like Apple products and have bought into Apple in a major way (because they appear to be developing ideas and concepts that fit into my life and the way I envision technology developing), I have to admit that I have had my eyes opened to the cloud and its possibilities; I have found a new vision. Not that I have turned my back on Apple – I don’t think that will happen any time in the near future – but now that I have opened Pandora’s box I believe that the world of IT has another bright future outwith Apple’s ecosystem.

To give you an idea of what I’m talking about (assuming you’re not already aware of Amazon’s offerings): Amazon offer the ability to ‘hire’ servers connected to their internet-connected infrastructure, with each server (Windows or Unix) hosted virtually on high-powered, highly available and resilient physical servers in one of their worldwide locations. Now this is nothing new, but it is just the start: you can configure the virtual server the way you want and add software to your server ‘image’ to do whatever you need. You can add more servers for resilience in the form of load balancing (including automatically starting a new instance of your image in the event a server fails), automatically increase or decrease the number of virtual servers as demand requires, have your data replicated across servers and much, much more.

So some of you can probably see the possibilities, whilst others need more details… so here goes…

With all this (and there is more) you can create possibilities that have normally only been available to companies with bucketloads of money, office and server space, and IT support.

Traditionally, if you wanted to set up an internet startup (for example) you would have to buy a web domain, servers (minimum two), data switches, high speed internet connections from an ISP, cooling systems to keep the servers working effectively, power management systems (a UPS at minimum, probably backup generators, etc), tape backup (in case it’s required), redundant data and disks (in case of data loss) and IT support people to keep the servers running properly… and that’s not all. All this is pretty costly and takes up a lot of physical space and time.

So, assuming that everything goes fine and you’re getting loads of users and the idea is profitable, you soon find that you need to order, fit and support more servers and upgrade power, security and other essentials… this all takes time and yet more money, and you need to future-proof, so you have to guess how big a new server (or servers) you will need to deal with not just current demand but future demand. You need to hire extra staff or do more of the work yourself (which takes you away from building the customer base and improving the product you’re promoting/selling).

Now this scenario is for an internet startup, but the same is true for other IT areas. For example, your standard IT department has been doing fine, but the business needs more; you have tried to save costs by virtualising the servers and putting more onto the physical servers the IT department currently has, but this takes time and money and takes you away from the standard day-to-day IT issues. When the business needs more, the same upscaling is required (servers, staff, power, security etc), and if there’s a new office opening somewhere you have to go through costings for either data links to the new location or, if required, building and developing a new hub server setup. But what if the new office is in another country? You will probably have to buy new servers, have onsite IT staff and have some way to share data, system updates, etc between the sites.

The traditional way to do IT is costly, and upgrading is a risk to the business. With the cloud you can have the servers set up on a high availability network, even available around the world if required.

For the internet startup, they can create their early prototype on a tight budget: servers instantly available from day one, data secure and available to users in just one country or from around the world almost instantly. When they get bigger, new servers can be added instantly – no waiting until they are delivered, no costly upgrades, no support staff to employ and train. They just do what they need to do… they can add test servers to develop new ideas and have ‘trusted’ beta users trial new features; if these don’t work out, they just get rid of the virtual server, and if they do work then an upgrade can be performed on a few servers at first and rolled out to the rest if no issues are detected.

For the IT department, they can move from their internal servers to cloud servers, gaining instant availability and instant upgrade possibilities, and cutting costs by setting the virtual servers to shut down automatically when not required during quiet periods and bringing them up (or adding new servers) automatically when high availability is required (ie financial year end, rollout of a new product, etc). By moving to the cloud, data is replicated between servers, and resilience ensures that if a server becomes unavailable it is shut down and a new one started (no more urgent IT callouts, expensive out-of-hours work and downtime). Company systems can be made available to executives at home or in their holiday villa, on smartphone, laptop or tablet, without having to pay for expensive data lines or specialist software or equipment. Opening a new office, even an international one, is as easy as getting an internet link (possibly even basic broadband) installed and setting up the office; company servers are then available as soon as they are required. VPN and other secure access setups are available to ensure that your company data is safe and secure and only those that should have access do.

So what’s next? Well, there are already companies like Netflix, NASA and others using the more advanced services to provide things like streaming media and encoding services (an Amazon service available on AWS) and using the power of parallel processing (where several servers work as one to process large amounts of data quickly and efficiently) to perform tasks in a fraction of the time and cost they would traditionally have taken. These sorts of services are limited only by people’s imagination and creativity, so there is a lot to look forward to.

So regular readers of my blog will be wondering, “well, where’s the spin or Apple slant to this?” – that’s what I’m trying to figure out. Amazon have taken advantage of their existing international server farms, which are used for their existing businesses (ie selling things on Amazon, developing web applications and probably designing their next smartphone OS, e-reader or tablet device), and are probably minimising their costs by offering the services to others.

So what about Apple? They have iCloud, which is a storage area for iOS backups, users’ files and application data, e-mails and photo sharing between devices, but not much more. iWeb disappeared and was never replaced, but then again Apple are very secretive. They have a large developer base that could easily take their knowledge of iOS and OSX development and utilise it on a similar cloud offering from Apple. I assume those developers would value such a service from Apple?

So would Apple go down the same route? Could they compete?

I think so. Apple presently have some massive server farms/data centres which deal with iCloud, iTunes, iOS upgrades and other services needed to support their existing product ranges, not to forget their stores, with their high speed internet connections and back office systems (I have made an assumption here – are the back office systems really iCloud systems?).

So would Apple be able to compete with Amazon’s server offering? I think so. OSX has been around for some time and, in my opinion, is the poor cousin (ie not many people outside of the IT industry know or care about it) of other offerings from Apple, but it could pack quite a punch if it was used in a large scale deployment. OSX Server already has parallel processing capabilities built in and publicly available, it has streaming capabilities in the form of QuickTime, and there are many other features that I have yet to discover (unfortunately the last OSX server I used was OSX 10.3 on a G4 Power Mac, which was linked to a Windows 2003 domain controller and delegated nearly everything to it). I believe that OSX Server is capable of much more than just the account management and application control that most companies probably use it for. I believe that OSX Server could be press-ganged into service to offer a similar, possibly even better, service to what Amazon presently offer (even down to Windows Server running on Parallels or a similar product).

Apple have recently withdrawn their Mac Pro tower system in Europe because of concerns over the cooling fans not meeting EU regulations, but rumours are that a new design is coming soon. Apple also previously discontinued the Xserve, which enterprise companies used for data security and storage. So do Apple have a full cloud option up their sleeves? As usual, only time will tell…

Apple or Amazon: which one would you choose?