
Corporate SOE



Hi all,

I've been a member of this forum for a while now but haven't really made any posts, so here we go.

I've been given the task of developing an SOE for the organisation I work for. I've had a look at their old method and it is a mess: it used a DOS floppy and a LOT of batch scripts to install Win2k and all the apps. Not nice.

Has anyone seen Business Desktop Deployment from Microsoft?

It is very interesting. It uses an HTA to automate a lot of the process.

I'm just wondering how other organisations have approached the SOE in their environments. How did you build a method of coping with a large number of PC models? Did you use SMS? What about WinPE?

I've done many Windows XP unattends but this isn't really suitable as we have quite a few different hardware platforms and software requirements.

What does everyone think?



The SOEs I have built for a number of enterprises I have worked for over the years have used many methods, including the scripts-and-floppy approach you mention.

Basically, the best solution for an organisation will vary based on the infrastructure available and the business requirements it has.

1. Floppy/script approaches are often used in a Novell server environment: you have an unattended source available, boot to the network from a floppy, and then automatically run the unattended installation. This method can be time consuming to deploy to a large organisation, but it works!

2. CD unattended installations are great and faster, as you can have multiple copies of the CD and a number of machines building at once without any real impact on the network. However, it can again be time consuming and people-intensive in big organisations.

3. Build a master machine image using an unattended installation and then "ghost" (or equivalent) the machine image. This is very handy, as you can then multicast the image to a number of machines at once. The downside is the network traffic impact, and if you don't have a fleet of like machine configurations (they don't need to be exactly the same, but similar) you can have some issues with the single common image. Microsoft also has a similar in-house process and this is essentially what they do themselves.

4. Part-built images: this is when you load the text-mode component of the installation onto a machine and then image it. This can be very good if the organisation has many different machine configurations: you deploy a common "base" image and then let the machines build themselves. The downside is that it can be slower than method 3, but it is the best all-round approach. This is also the idea behind Longhorn installations in the future, to speed up the installation process.
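For what it's worth, the boot floppy in method 1 usually just maps the install point and launches the DOS-mode setup; a rough sketch (server and share names invented):

```bat
rem Hypothetical boot-floppy step, after the network drivers load:
rem map the install point and start the unattended text-mode setup.
rem /s: points winnt.exe at the i386 source, /u: at the answer file.
net use z: \\DEPLOY01\WINSRC
z:
cd \i386
winnt /s:z:\i386 /u:z:\unattend.txt
```

From there, setup runs through the answer file without anyone touching the machine.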

I have used all these methods and a few variations on them, but as I said, you need to base the solution on the infrastructure and requirements of the business.

Hope this information helps. If you want to speak about specific issues, feel free to PM me.


What we currently do is build the entire OS and apps via a long series of batch scripts and then Ghost the machines for deployment.

We have probably a dozen or so models and this is where the problem comes in.

We can quite easily develop an unattended install for a specific model but we don't want to have to do it all over again every time we get a new model of PC.

So we need an unattended management system. Most would argue that SMS is the way to go for this, and we agree; however, SMS still needs a base operating system. At the very least we need to install the OS and apply any tweaks before installing an SMS client and kicking the app installs off.

There must be a method of organising Unattend.ini files, drivers and model-specific things in such a way that each time a new build comes along you just add the needed drivers and model-specific files to a directory structure and then kick off a common installer.
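Something along those lines can be sketched in a few lines; this is a hypothetical illustration (the names `models/<model>/drivers`, `@PNPPATH@`, and both functions are invented), assuming a per-model driver tree and one shared answer-file template:

```python
# Hypothetical sketch: derive the OemPnPDriversPath value for a model
# from a models/<model>/drivers tree, then drop it into a shared
# winnt.sif template via an @PNPPATH@ placeholder.
import os

def build_pnp_path(model_dir):
    """Collect a model's driver subdirectories into one setup path value."""
    drivers_root = os.path.join(model_dir, "drivers")
    subdirs = sorted(
        d for d in os.listdir(drivers_root)
        if os.path.isdir(os.path.join(drivers_root, d))
    )
    # Setup wants semicolon-separated paths relative to the system drive
    return ";".join("drivers\\" + d for d in subdirs)

def render_sif(template_text, model_dir):
    """Fill the placeholder in the common answer-file template."""
    return template_text.replace("@PNPPATH@", build_pnp_path(model_dir))
```

Adding support for a new model would then just mean dropping its drivers into the tree and re-running the build, rather than hand-editing a whole new unattend.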

If you have a look at the Business Desktop Deployment article I linked to, you will get an idea of what I mean. It doesn't exactly meet our requirements, however. I'm just wondering exactly what others do with a varying hardware fleet.


I have been to M$ training for the Business Desktop Deployment stuff, and while it is very interesting and includes a lot of slick stuff, it isn't feasible at my company. We don't have AD yet. No SMS 2003 server. No WinPE :)

What we have been using is Altiris, for about 3 years now. The Altiris Deployment Solution allows you to categorize all the PCs into designated groups in your deployment server console. In my organization our PCs are grouped by cities. Once you have laid out your groups you can manage one PC, an entire city or the entire organization. For example, I can create an event to set the time zone to Eastern Standard Time and drag and drop that event onto a PC, a city, multiple cities, etc.

Here is a breakdown of the features we use.

Altiris Client. Installed on every PC in the organization. Sends machine inventory every time the PC is logged in. Allows for remote control and package deployment.

Altiris eXpress Deployment Console. This is the piece our enterprise admins use for managing the organization. Via the deployment console we have access to everything with the click of a mouse. Imagine wanting to install Acrobat Reader on every PC without interfering with users as they work: fire up the RapidInstall tool and create a baseline on a clean image, install Acrobat, then run RapidInstall again for the post-installation snapshot. After a few moments you have an executable that can be used in a silent and hidden installation. In your deployment console, make an event to execute your Acrobat package silently and without user interaction, then drag and drop the newly created event onto the PCs or cities you want to get the package.

For imaging we create an image much like you would with Ghost: install, update, tweak, clean, and image. With Altiris it can be an .IMG or you can make it an executable. Having an executable image is great because you can put it on CD and run it without the Altiris console. Since we have such a powerful tool for software deployment we NEVER put software in our images: software gets updated too often, and it also increases the size of the image well over CDR size :)

As far as the issue with varying hardware, what I currently do is create an image using the Altiris RapidDeploy application, and my images are chipset specific. I support the Intel 810, 815, 845 and 865 chipsets in my organization. I am changing that process, however, because we are expected to get new PCs again soon and they will have the Intel 925 chipset, heh. Right now we have 4 images total. What I plan to do is, when I have completed the tweak/clean phase and am ready to create an image, add sysprep to my installation and set it up to run mini-setup when the image is deployed. This way it is not hardware specific.

So to answer your question, hehe... if you cannot go the AD/RIS/SMS/WinPE route then I would suggest something like Altiris. Create your images, create your packages, and deploy and support your clients all from one software suite instead of 10 :thumbup


Here at my office, we are using Novell ZENworks 6.5 for imaging.

This gives us either an imaging CD (using scriptable Linux) or PXE.

The server side allows rules-based imaging, so when the system boots it is checked against certain hardware rules (including chipset, processor, video, NIC, RAM, HD size, etc.), which determine the particular image it is supposed to use. Using PXE we can also force a reimage remotely.
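Not the ZENworks rule engine itself, but the idea of rules-based image selection can be sketched like this (the rules and image names here are invented): walk an ordered rule list and return the first image whose conditions all match the detected hardware.

```python
# Illustrative only: ordered rules mapping hardware conditions to images.
# A condition value is either a literal to compare against, or a
# predicate taking the detected value (e.g. a minimum-RAM check).
RULES = [
    ({"chipset": "i865", "ram_mb": lambda r: r >= 512}, "xp_i865.img"),
    ({"chipset": "i815"}, "xp_i815.img"),
]
DEFAULT_IMAGE = "xp_base.img"

def select_image(hw, rules=RULES):
    """Return the first image whose conditions all match, else a default."""
    for conditions, image in rules:
        if all(v(hw.get(k)) if callable(v) else hw.get(k) == v
               for k, v in conditions.items()):
            return image
    return DEFAULT_IMAGE
```

The nice property is the one the poster describes: the boot environment only needs to report its hardware, and the server decides which image applies.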

The image and all the apps are separate, but both are handled by the script.

The apps are created as ZENworks NAL objects, which can be deployed (similar to Group Policy), or, in this case, a separate image file is created for each app.

The script then says which apps go with which system, and they are pulled down together with the OS image. In our office we are using sysprep'd images, but it will work for a standard load as well.

For reimaging, ZENworks allows personality migration, which saves user data, settings, profiles, etc. before reimaging and restores them afterwards.

Nice features for a NetWare environment.


I have used Zen and it kicks a**. Unfortunately we don't have a NetWare environment.

Your post made me realize I missed a few things, hehe.

Altiris has PXE support, plus PC Transplant, which wraps profiles, application settings and files into an executable that can be run on the new machine to migrate all the user's settings and data :)


Unfortunately we are not allowed to pursue PXE-based solutions. We have about 1,800 PCs, so Ghost seems to be the best way to address that number of machines.

I think we will make our own. We have been playing with the idea of writing a small .NET app to automate a lot of the manual tasks.


Manual Tasks:

-Model specific settings in the Unattend.ini/winnt.sif

-Drivers for each model

-Post-installation tasks

-OS customization, i.e. is it Win2k or XP

That is all I can think of at the moment.

I've almost convinced work to let me do RIS. How does that actually work?


RIS is beautiful. If you have a Windows server you already have RIS. You simply need to make sure you have more than one logical drive in the system, install the RIS component and then run RISetup.exe. It will create a base image that can be used to install on any PC; I used to use one RIS image to install on a good 30-40 different models of PCs and laptops. It does use PXE to connect: you boot the client, tell it to boot to the network (F12 usually, but dependent on your motherboard) and you get a Client Installation Wizard which you can completely customize. You spend about a minute selecting some things and then walk away. It will install the OS for you and any drivers you need. Then you can use any of the post-installation methods listed on this site, but I always used batch scripts. You can check this out for more info: http://forum.osnn.net/showthread.php?t=36648 (I really need to move that post over to this site.)

Good luck. Let me know if you have any more questions about RIS


2 weeks later...

I sort of left this for a while.

What I am trying to get around is the process of manually building an SOE each time some id*** buys a new computer in our organisation. Application installs are not an issue, as we will either use scripting or SMS to install all the apps.

We just need a base OS with the SMS client, and then we can let SMS do it all. This is where the problem comes in: each time a new model comes in we currently have to make a whole new unattended setup. These are the things that change:

-winnt.sif (for PnP Drivers)

-Drivers in $OEM$

-Post-installation of drivers (OemPnPDriversPath doesn't always work)

-Model specific post install tasks
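In practice the model-specific part of the answer file often comes down to a handful of keys; a sketch of that fragment (the paths are examples, not your actual layout):

```ini
; Model-specific fragment of winnt.sif (example paths only)
[Unattended]
OemPreinstall=Yes
; semicolon-separated driver paths, relative to the system drive,
; populated from the $OEM$\$1 tree copied down during setup
OemPnPDriversPath=drivers\nic;drivers\video;drivers\audio
DriverSigningPolicy=Ignore
```

Everything else in the file can usually stay common across models, which is what makes the "common installer plus per-model directory" idea workable.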

I've heard some people say that as long as the hardware is close to the same you can use disk imaging and run sysprep... is this true? I know you have all suggested very valid methods for building SOE/MOE environments, but none that really tackle the problem of varying hardware. I'm seriously considering making a small .NET app that will automatically create the necessary directories in the $OEM$ folder and configure everything that we need. What do you guys think about this?


I use disk imaging...

We install the base image, install all our software (MS Office, Citrix client, SMS client, etc.), then change the IDE controller to Standard Dual-Channel IDE Controller, then run sysprep -mini (we use a sysprep.inf file for automating sysprep), then image the system. It works on all our different PC models (about 15 at the moment, ranging from Dells to HPs). It works great for us, and we never have a problem.
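For reference, a minimal sysprep.inf along those lines might look like this (all values are placeholders; an actual file would carry the site's own settings):

```ini
; Minimal sysprep.inf sketch for an automated Mini-Setup (placeholder values)
[Unattended]
OemSkipEula=Yes
InstallFilesPath=C:\sysprep\i386

[GuiUnattended]
AdminPassword=*
OEMSkipRegional=1
OemSkipWelcome=1

[Identification]
JoinWorkgroup=WORKGROUP
```

With a file like this next to sysprep.exe, Mini-Setup runs through hardware re-detection on the target machine without prompting.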

The coolest part is it only takes 4 minutes to re-image a dead system...


Interesting... so you just install XP on a computer and then image it, and you can use that image on different PCs with different hardware as long as you run sysprep -mini?


Exactly, as long as they use an IDE controller. The key was changing the IDE controller to "Standard Dual Channel IDE Controller"; then it gets re-detected during the mini-setup, and life is good :)


Don't use Ghosted images - period.

Especially in a diverse hardware environment like yours, you are just asking for weird problems down the line with ghosted images. Trust me, I know what I'm talking about: I've been rolling out Windows to my Fortune 500 corp since Windows 3.0, and over the years I've tried all the deployment methods.

When you ask questions like this, in forums like these, you'll get a lot of people who insist an imaged drive is the best approach. It isn't; it sucks. It's not just changing the IDE controller, it's the imaged system's HAL type (single CPU? hyper-threading P4? dual processor?), its PnP setup, etc. I could go on and on.

The best way, and the most professional IMHO, is to use the unattend.txt method, with the source pulled from a network installation point (not a CD-ROM). This is pretty much exactly the setup you have now, and it is the best.

You may not like the way it is laid out, or think it's "messy", but it isn't. It's the best solution. You don't like updating the driver directory? Tough, it's part of the job to qualify new hardware. You're not supporting Macs or Suns; you knew PCs were dangerous when you took the job :)

With a scripted network installation it takes only minutes to update the baseline build to incorporate new drivers, hotfixes, applications, etc., whereas it can take hours to update images.

