Sean Kaiser (dot) com

A home for things (work-related & not) that I feel inclined to share with others.

AutoPkg Change Notifications


Background

We use AutoPkg to automatically download (and process into munki) updates to our commonly installed applications and internet plugins. One common practice is to run AutoPkg via JenkinsCI, but I have not taken the time to install and configure Jenkins, so I just run AutoPkg from a script triggered on a schedule by a launchd daemon.

Because I didn’t want to have to check munki periodically to see if AutoPkg had downloaded anything, I wrote a wrapper script that emailed me the output from each AutoPkg run, which in our environment happens at the top of each hour. After a weekend of getting hourly emails (each of which I had to browse to check for any updates), I decided there had to be a better way to be notified of AutoPkg’s work.
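To give a flavor of the approach, here’s a minimal sketch of such a wrapper (not my actual script): it runs AutoPkg against a recipe list and sends mail only when the run output mentions new downloads. The recipe list path, SMTP host, recipient address, and the marker string it looks for are all assumptions for illustration.

```python
#!/usr/bin/env python3
"""Minimal sketch of an AutoPkg wrapper that only emails on changes.
Paths, SMTP details, and the output marker are assumptions."""

import smtplib
import subprocess
from email.message import EmailMessage

RECIPE_LIST = "/Library/AutoPkg/recipe_list.txt"  # hypothetical path
SMTP_HOST = "smtp.example.com"                    # hypothetical server
NOTIFY = "admin@example.com"                      # hypothetical recipient

# Run AutoPkg against the recipe list and capture its output.
result = subprocess.run(
    ["/usr/local/bin/autopkg", "run", "--recipe-list", RECIPE_LIST],
    capture_output=True, text=True)

# Only send mail when the run report mentions new downloads; the exact
# marker string is an assumption about AutoPkg's summary output.
if "The following new items were downloaded" in result.stdout:
    msg = EmailMessage()
    msg["Subject"] = "AutoPkg downloaded new items"
    msg["From"] = NOTIFY
    msg["To"] = NOTIFY
    msg.set_content(result.stdout)
    with smtplib.SMTP(SMTP_HOST) as server:
        server.send_message(msg)
```

A launchd job (or cron entry) can fire a script like this hourly, and the inbox stays quiet unless something actually changed.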

Slides From 1400+ Computers in 3 Weeks? Are You Nuts?!?


Slides from my presentation at PSU Mac Admins Conference 2013

If you were in my presentation, or even if you weren’t, and would like a copy of my slides, I’ve posted them below.

As a reminder, I’ll have a blog post in the next couple of days with links to the various sources of information that I used to build our deployment system, as well as some expanded notes on why I designed things the way I did.

I’m offering the slides in two versions: PDF and Keynote file (mainly in case you want the multi-site reposado demo video… or in case you really liked the transitions.)

More information on multi-site reposado is also available.

If you attended my session, thank you. I hope that I offered something that will help you build your own modular deployment system. If you have any questions, you can reach me via Twitter at @seankaiser. Or, if you would like to contact me via email, my email is my first name -at- seankaiser.com.

Multi-site Reposado


A conundrum

Let’s say you work in an environment where you’re running reposado. Let’s also say that your environment consists of several locations with relatively slow WAN links between them. Additionally, let’s say that some of your users roam between locations, and before they move, they just put their MacBooks (or Airs or Pros) to sleep instead of shutting down (because who shuts their machine down every time they’re done using it?)

In an ideal world, you want to point each machine at a reposado server without having it download updates over the slow WAN link. You could run a reposado server at each location, but if you configure a machine to look at its onsite reposado server, the machine will likely move to another location before softwareupdate next checks for updates.

You’re running munki and have it set to install Apple software updates? Awesome. You could set the appropriate CatalogURL in your preflight script (a sketch of that step follows), but that means you have to maintain catalog files on several reposado servers, and who wants to do that? (Ok, you could just clone the master reposado server, including the catalog files, to get around that last part.)
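As a rough illustration (hypothetical URL, not my actual preflight), pointing softwareupdate at a particular reposado catalog boils down to something like this:

```python
#!/usr/bin/env python3
"""Hypothetical preflight-style snippet: point softwareupdate at a
specific reposado catalog. The URL is made up for illustration."""

import subprocess

CATALOG_URL = ("http://reposado.example.com/content/catalogs/"
               "index.sucatalog")  # hypothetical onsite server

# Write the CatalogURL that softwareupdate will use; this needs root,
# which a munki preflight script runs as.
subprocess.check_call([
    "/usr/bin/defaults", "write",
    "/Library/Preferences/com.apple.SoftwareUpdate",
    "CatalogURL", CATALOG_URL,
])
```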

But what happens if the user has the ability to install Apple software updates via Software Update from the Apple menu (or by running softwareupdate itself)? Their machine might have their previous location’s CatalogURL set…

What do you do?

Since /Library/Preferences/com.apple.SoftwareUpdate.plist doesn’t allow you to configure a PkgURL the way munki does, every request goes to the server hosting the catalog file defined by CatalogURL. And that’s the problem.

The workaround? You set up redirects on the master reposado server based on the client’s IP address. It seems simple, but I haven’t found any references to anyone else doing this. Interested? Great. Let’s set it up.
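To make the idea concrete before digging into the setup, here’s a rough, hypothetical sketch of the redirection logic. The subnets and mirror hostnames are made up, and the real setup lives in the master reposado server’s web server configuration rather than a standalone daemon like this.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of IP-based redirection logic. Subnets and
mirror hostnames are made up; a real deployment would configure the
master reposado server's web server instead."""

import ipaddress
from http.server import BaseHTTPRequestHandler, HTTPServer

# Map each site's subnet to its onsite reposado mirror (hypothetical).
SITE_MIRRORS = {
    ipaddress.ip_network("10.1.0.0/16"): "http://reposado-site1.example.com",
    ipaddress.ip_network("10.2.0.0/16"): "http://reposado-site2.example.com",
}

class Redirector(BaseHTTPRequestHandler):
    def do_GET(self):
        client = ipaddress.ip_address(self.client_address[0])
        for subnet, mirror in SITE_MIRRORS.items():
            if client in subnet:
                # Send the client to the same path on its onsite mirror.
                self.send_response(302)
                self.send_header("Location", mirror + self.path)
                self.end_headers()
                return
        # Unknown subnet: a real master would serve the content itself;
        # this sketch just declines the request.
        self.send_error(404)

HTTPServer(("", 8088), Redirector).serve_forever()
```

The effect: every client can keep a single CatalogURL pointing at the master, while actual downloads get bounced to whichever server is local to the client’s current subnet.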

The Big Project: Preparing the Machines for Deployment With DeployStudio


Previously on “The Big Project”

In part 1 of this series, I provided an overview of what I now call “The Big Project.” Part 2 talked about the importance of inventorying. This article is the first of a series of more detailed technical articles describing various aspects and the tools we used to pull off this project. First up…

DeployStudio

We’ve been using DeployStudio for several years to help us image machines. In the (relatively distant) past, we restored full monolithic images to machines, new or old. Over the past almost two years, we’ve switched to the thin imaging model: rather than wiping the drive, thin imaging preserves the existing contents of a machine’s hard drive and deploys only site-specific applications, or the deployment tools that install those applications and other files, minimizing the time needed to get a machine ready for a user. In our case, we deploy our deployment tools (puppet and munki) and some basic settings. This cuts down on the deployment cost, in both time and bits traveling across the network.

In situations where we need to reimage a machine, we have a workflow that lays down an InstaDMG created vanilla image and then runs the normal thin imaging workflow.

The Big Project: Setting the Stage by Inventorying


A small detour

I was going to save this article for near the end of the series, but that might have implied that the prep work wasn’t important. It’s very important; in our case, just as critical as the deployment tools themselves. I’d guess this is (or should be) the case everywhere.

Why is inventorying so important?

Any time you deploy a machine, whether it’s one machine or (in our case) part of a 1300+ machine deployment, you need to add the machine to your inventory system. In our case, our inventory system is our help desk (we run Web Help Desk.) As I’ll describe in later articles, many of the processes in our deployment workflow refer to the help desk and custom asset fields so we can have a dynamic configuration without having to edit files on individual machines.

The Big Project: We’re Doing What, and on What Timeline?!?


The project

Deploy 1170 MacBook Pros (later amended to 1250), 60 iMacs, 214 Apple TVs, 106 (full sized) new iPads, and 30 iPad minis. Relocate 100 or so “newer” (one or two year old) machines into different locations, either in different classrooms in the same building or in a different building. Bring nearly 1700 old machines (iBooks, eMacs, iMacs, Mac minis, MacBooks) back to the warehouse for disposal. And do all of this in roughly 3 weeks (once the computers started arriving.)

The “problems”

Under any normal circumstances, this would be a huge summer project that we’d start planning around this time of the year. But this project was on the fast track. For various reasons, we were going to do this during the school year. The vast majority of devices would be ordered in two waves, the first in mid-November and the second a week or two later (with the additional 80 MacBook Pros ordered in early December), and we’d start deploying machines ASAP because of limited warehouse space. And, we’d try to have everything deployed before winter break was over. Although I was involved with the later stages of deciding what computers we were buying and in what quantities, most of the rest of our team hadn’t been involved, and didn’t even know about the project until around the time that the first order went in. (We didn’t want too many people knowing about the project until it had official board approval.) Our team (our supervisor, 3 technicians, our administrative assistant and myself) had our work cut out for us.

Cross-domain Contact Sharing in Google Apps


A bit of background

For our Google Apps implementation, we are using two different domains: one for staff and one for students. This was recommended to us by others, and we already owned the domains, so it made sense. The problem with this approach is that the user directory, which lets users within a domain simply type a name and select the address from a list, doesn’t work across domains: users in one domain can’t look up users in the other. Sure, each user could create the contacts in their personal contacts list, but for a teacher to create a new contact for each of their students would take considerable time. They’d also need access to the user list to know what the student email addresses are.

Tools

Google provides APIs to allow 3rd party scripts and solutions to interact with the domains. As we were setting up the domains, I remembered seeing something about a Shared Contacts API. Yesterday I started looking into what this API could do to help us solve the cross-domain contact issue. I found a Google Code project called Google Shared Contacts Client (gscc for short from here on.) This python script lets you interact with the domain’s shared contacts.

To get started, you’ll need to follow the installation instructions. They’re simple. Be sure to install the GData Python client library, or nothing will work.
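For a taste of what the underlying Shared Contacts API looks like, here’s a hypothetical sketch using the GData Python library directly (not gscc itself); the domain, credentials, and contact details are all made up:

```python
#!/usr/bin/env python
"""Hypothetical sketch: add one shared contact to a Google Apps domain
using the GData Python library. Domain, login, and contact are made up."""

import gdata.contacts.client
import gdata.contacts.data
import gdata.data

DOMAIN = 'students.example.com'  # hypothetical student domain

client = gdata.contacts.client.ContactsClient(domain=DOMAIN)
client.ClientLogin('admin@' + DOMAIN, 'password', 'shared-contacts-sketch')

# Build a contact entry for a user from the other domain.
contact = gdata.contacts.data.ContactEntry()
contact.name = gdata.data.Name(
    full_name=gdata.data.FullName(text='Jane Teacher'))
contact.email.append(gdata.data.Email(
    address='jteacher@staff.example.com', primary='true',
    rel=gdata.data.WORK_REL))

# Domain-level shared contacts live in a feed named after the domain,
# not in any individual user's personal contact list.
feed_uri = client.GetFeedUri(contact_list=DOMAIN)
client.CreateContact(contact, insert_uri=feed_uri)
```

The key detail is that shared contacts belong to the domain itself, which is what makes them show up in every user’s directory lookups.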

Getting Started


This is my first time doing any sort of blogging. I’m going to try to provide technical writeups on major projects that I work on. We’ll see how it goes. I’m not the best writer or communicator. I ramble. A lot. If you can get through that, hopefully you’ll be able to find a small nugget of information that helps you in your job or life.

My current focus is moving our district from FirstClass email to Google Apps for Education. In FirstClass, we provided email accounts for most staff, but no students. Each staff member had a limited amount of disk space available for their email and a simple website. Now, staff will have tons of space for email, virtually unlimited space for documents, and the ability to have multiple websites. On top of all of that, we’re providing accounts for our students to do the same thing. All of a sudden, instead of supporting around 650 or so email users, we’re going to be supporting nearly 7000 email, docs, etc. users. You know what the best part is? Our department has learned a ton, has come together like never before, and we’re pumped.

With that focus, most of my initial posts will be related to this migration. Over time, posts will shift from one technology to another, from desktop management to network infrastructure to who knows what, and back and forth. If my work inspires even one person, or even just helps save someone some time or their job, I’ll be happy. If not, I’ll still be happy. My life is good.