Universal Windows 7 image assistance/advice needed.


Posted (edited)

Hi

I have built a Windows 7 universal image that is working great in all six of my labs; drivers were found for all the different hardware that resides in each of the labs. I have included on the image all the common applications that are present in ALL the labs. I am now in the process of scripting the installations of the unique applications that are specific to each lab.

What would be the best way to do this, i.e. scripting installs that sit on a network share? Would that not take a lot of bandwidth? Should I be thinking about setting up a packaging server?
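To give a concrete idea, what I have in mind is something along these lines (the server, share, and script names below are just placeholders):

    @echo off
    rem Map the software share (server/share names are placeholders)
    net use S: \\server\labapps /persistent:no

    rem Kick off the per-lab install script
    call S:\scripts\install_lab1.cmd

    rem Drop the mapping when done
    net use S: /delete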

What is the best setup for what I want to do?

Thanks for any ideas/solutions you can offer.

Edited by clivebuckwheat

Posted

Well, these questions require a bit of scoping for sure. How many applications are we talking about, and will you be deploying them post-image, or are you looking to stage them with an image, etc.? Would having a factory staging image be a possibility, or do you have access to System Center ConfigMgr 2007 perhaps? MDT is capable of doing a lot of this, although you may want to move it to Active Directory or SCCM depending on how "final" your images are.

I guess you'll probably need to think more about what types of apps they are, and what kind of automation you can achieve before thinking specifically about how to deploy. Personally I prefer images that sysprep to factory, and I use SCCM or MDT to install applications at that point, but I've worked with clients that use SCCM and/or Group Policy to do deployment post-image. I've also worked with a few that have multi-staging - MDT in their "build lab" building a "clean image", and then that image is copied to SCCM or MDT at each custom site to be put into another MDT or an SCCM task sequence that automates only the things they need there (sort of a "master image with children" setup).
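As a very rough sketch of that per-site idea in MDT, CustomSettings.ini can key application installs off something unique to each lab, such as the default gateway. The gateway addresses and application GUIDs below are made up for illustration; the real GUIDs come from the applications you define in your deployment share:

    [Settings]
    Priority=DefaultGateway, Default

    [DefaultGateway]
    192.168.10.1=Lab1
    192.168.20.1=Lab2

    [Lab1]
    MandatoryApplications001={11111111-1111-1111-1111-111111111111}

    [Lab2]
    MandatoryApplications001={22222222-2222-2222-2222-222222222222}

    [Default]
    OSInstall=Y

The task sequence then installs only what that particular lab needs on top of the common image.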

Posted

Thank you, Cluberti. I'll answer in detail tomorrow; maybe you can help me sort out my thoughts, tell me if what I have planned is doable, and figure out the best solution for my environment.

You always seem to point me onto the right path.

Posted (edited)

Cluberti, and whoever else wants to chime in

To explain the situation in our environment: we recently began our migration to Windows 7. The person in charge of building the reference masters that are to be deployed is close to retirement, and all of this is a lot to adjust to. As of right now we have several masters, one for each different make and model of PC we have on site. I was asked to look into making this better and hopefully streamlining the deployment procedures. Making a universal image was a first step in the right direction, and I am proud to say it's working on almost every different make and model we have.

On my base universal image I have included the applications that I know are on every machine in house, such as Office 2010, Firefox, Java, Flash, etc. For the applications that are specific to each lab, I have decided to script the installs, using the installers' command line switches where available, or repackaging them using snapshot technology. The programs that will be installed via scripting after the base image has been laid down sit on a shared volume on a Windows 2003 server.
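The per-lab scripts themselves are just lists of silent invocations. The exact switches vary per installer, but the common cases look like this (application names below are placeholders):

    rem MSI packages: silent, no reboot, verbose log for troubleshooting
    msiexec /i "S:\Lab1\AppOne.msi" /qn /norestart /l*v "%TEMP%\AppOne.log"

    rem NSIS-built setup.exe installers commonly accept /S
    "S:\Lab1\AppTwo\setup.exe" /S

    rem Inno Setup installers commonly accept /VERYSILENT /NORESTART
    "S:\Lab1\AppThree\setup.exe" /VERYSILENT /NORESTART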

I then use net use to map the shared drive and kick off the installation scripts. We are talking about no more than 15 applications that will have to be installed this way per lab, because they are unique to that lab. I do not want to make a separate image for each lab because of the different software needs; that is how we do it now, and the number of images we have is mind-blowing and a headache to keep up to date and manage. My concern with installing applications from a network share, using net use and then kicking off an install script: does this / will this use an exorbitant amount of network resources/bandwidth?
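On the bandwidth worry, one pattern I have read about is to copy the lab's payload to the local disk first and install from there, so the share is only tied up for the length of the copy (paths below are placeholders):

    rem Stage the payload locally; /Z makes the copy restartable if the network hiccups
    robocopy \\server\labapps\Lab1 C:\Staging\Lab1 /E /Z /R:2 /W:5

    rem Install from the local copy, then clean up
    call C:\Staging\Lab1\install_lab1.cmd
    rd /s /q C:\Staging\Lab1

Would that be overkill for 15 applications, or is it worth doing?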

I have never used MDT, but I have read a lot about it. I don't know if we can use it, since we use Ghost to deploy our images. As for SCCM, we do not have access to it. Let me ask you: we are a large organization, so would setting up SCCM be wise, considering this is the direction we would like to go in (have one base universal image, then script the installs or push the unique applications needed in each lab)?

Lastly, we are not in an Active Directory environment but on a Novell network, though we will most likely be moving to AD in a couple of years.

Thanks if you can point me in the right direction or shed some light on some of the concerns I have raised.

Edited by clivebuckwheat
