Choose your iPod Retailer Wisely

Tim Kleemann, Managing Director (no less) of NextByte, responds to my post on “iShonky”

Tim’s reply goes to show that you must buy from a retailer who cares.

So, armed with the facts from the horse’s mouth, one must ask: what data has Choice used to determine this “shonkiness”?

The last thing we need in Australia is a magazine out for column-centimeters rather than the truth about products.

Choice Magazine calls iPod Shonky

The CHOICE iSHONK for Dual-level Shonkyness is awarded to the Apple iPod, mainly relating to the repair “procedure”.

Choice Magazine has long been the respected voice of Australian consumers; and with strong consumer protection laws in Australia, retailers must comply.

This comes on the back of the RavMonE virus shipping on 1% of video iPods sold after 12th September. Now, that’s doubly shonky. Who was spot checking?

Repairs to technology where the margins are slim and the volumes are large can wipe out profit in an instant. The key is to make the product correctly in the first place. Quality systems, W. Edwards Deming.

Someone at Apple PR should be getting cranky about this – there are competitors on the horizon; and customers expect more than Aussie Post style repairs.

Parallels 1884 Vista Quick Notes (and update)

Download the 21Mb update to Parallels (to build 1884)

Boot Windows XP to ensure all is OK before I install Vista. Windows XP “seems” to boot a little faster. Unable to quantify exactly how much.

Back up the existing 15Gb Windows XP .hdd, just in case. Create a new 15Gb image to install Vista into.

Parallels settings:

Parallels settings

Install into the fresh 15Gb image, with 1024Mb of RAM allocated to the image. Vista is marked as “(experimental)” for the OS type. Installing onto a MacBook Pro with 2Gb of RAM and MacOS X 10.4.7.

  • Beta 2 Build 5384 DVD (thanks, Frank Arrigo at Microsoft Australia)
  • Started install at 11:05am
  • Vista install auto-restarted at 11:35am
  • Vista install auto-restarted at 11:43am
  • Questions (location, time, username) at 11:46am
  • Vista install auto-restarted at 11:47am
  • Into Vista Beta 2 at 11:50am
  • Install Parallels Tools from the Parallels VM menu. Note that these don’t seem to be signed drivers, so ignore all the warnings and install away
  • Manual Vista Restart
  • On restart, if the “Welcome Center” doesn’t appear, choose it from the Start menu. Click on Add Hardware.
  • Vista found the network card, and automatically configured the network. Also note that Vista finds a “PCI Bridge Device”, which I asked Vista to ignore
  • Restart. Note that the Network Adaptor setting for the Parallels VM set to “Bridged” worked OK

In short, it works. Note that I haven’t stress tested this; and the Parallels guys say it’s experimental. A beta OS on an experimental hypervisor. Your mileage may actually turn into inchage quickly.

Vista login

Vista Desktop first questions

RC1 Note from 8:20pm

You cannot install Vista RC1 on Parallels. Bugger. ISO, burnt DVD, or an upgrade from Beta 2 to RC1: none of these paths work.

***STOP: 0x000000A5 (0x0001000B, 0x50434146, etc)

“The ACPI Bios in this system is not fully compliant to the specification. Please read the Readme.txt for possible workarounds, or contact your system vendor for an updated bios.”

Uptime: 22 days. And I run Windows XP SP2.

I am not a Mac fan-boy. Been there, done that. And to be truthful, I think I am a little too old for zealotry. The innocent dogmatism of youth has been replaced with the pragmatism-to-the-point-of-pessimism of middle age.

My 15″ MacBook Pro runs MacOS X 10.4.7. The last time I rebooted was the installation of the MacOS X 10.4.7 update. That restart was so long ago, I honestly cannot remember rebooting.

uptime

Pop over to a Terminal window, uptime: up 22 days.
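For the record, the check is a single command; the output format differs slightly between MacOS X and Linux, and your numbers will obviously vary:

```shell
# Prints current time, time since last boot, user count and load averages.
uptime
# e.g.  9:15  up 22 days, 4:32, 2 users, load averages: 0.41 0.52 0.48
```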

Up until May this year I had been a Windows person. Dell this, Windows that. A clean shutdown or restart at least once per week would keep the Dell going. After constantly sleeping/hibernating, things just didn’t feel stable anymore under Windows XP. Maybe it was all the weird VPN networking stuff that I had to run. Or memory not being freed up.

This MacBook Pro gets an equal amount of digital thrashing. It’s turned on and in use at least 14 hours per day. During the day there are multiple shut-the-laptop-lid hibernations while running multiple applications. Installing and launching Mac apps; de-installing (drag to install, drag to trash to de-install). Mad-as-a-hatter cats pulling out the MagSafe power connector; Dashboard widgets added, removed and refreshed. PowerPC (Rosetta) applications launching; force-quitting SheepShaver. Wireless network router reconfiguration. The screen is brilliant for spreadsheets, and the performance on both the Mac side and Windows under virtualization is excellent.

During these 22 days I’ve booted Windows XP at least 15 times using Parallels. Most recently to run a TRS-80 emulator, and to take a look at a personal email in an archive .pst file. Even backing up the PC is easy: drag-copy the disk image onto our family Debian file server.
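That “drag-copy” is nothing more exotic than copying one big file: the VM’s disk image is the whole PC. A minimal sketch of the same backup from the shell; all paths here are illustrative stand-ins (on a real LAN you would rsync or scp the .hdd file to the Debian box):

```shell
# Stand-in for the real multi-gigabyte Parallels .hdd file.
SRC_DIR="$HOME/vm-backup-demo"
# Pretend network mount; in real life: the Debian server's shared directory.
DEST_DIR="$HOME/vm-backup-demo/debian-share"
mkdir -p "$SRC_DIR" "$DEST_DIR"
echo "winxp disk image" > "$SRC_DIR/winxp.hdd"

# One copy backs up the whole virtual PC, timestamps preserved.
cp -p "$SRC_DIR/winxp.hdd" "$DEST_DIR/"
ls -l "$DEST_DIR"
```

Because the entire Windows environment lives in a single file, there is no registry to export and no per-application backup tool to wrangle.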

Under Parallels, everything I’ve installed has worked first time. Office 2003, Office 2007 Beta. Adobe Flex 2.0, Adobe Premiere Elements 2.0. Microsoft XML Notepad.

In a smartly organized corporate environment, with configuration created by some smart infrastructure cookies, a single standard Windows XP image could be kept on a server and pulled down when people come into work as their standard “office” environment. Separating the environments for executives could be a mechanism for saving costs.

Without the apple-coloured glasses on, there are some deficiencies: the MacBook Pro has an integrated video camera in the lid, but there are no device drivers for it under Parallels; and ACPI is yet to be supported under Parallels, so no Vista RC1 yet. Thankfully I’m not a big gamer, as games performance/Direct3D sucks.

It’s still not a real Windows XP machine. There is no little laser-etched blue OEM badge (the Windows XP Professional installed here is a boxed copy). So, 22 days uptime or not, there is something that just doesn’t feel right: running Windows on a Mac is like listening to Country and Western in a Ferrari. You feel, well, dirty.

Still, this MacBook Pro has been the most stable Windows laptop I’ve had the pleasure of using. So, by definition, is the safest way to run Windows XP under virtualization on MacOS X?

A/UX 3.0: Apple’s first Unix OS

History revisionists state that Apple had to buy NeXT as they could not write their own pre-emptive/protected memory OS.

Apple A/UX 3.0 integrated the best of System 7 and Unix. Maybe not the latest Unix available at that time, nor on the fastest hardware; nor with the best driver support. But it rocked for its time.

From memory, Apple needed to create A/UX to permit their hardware to be sold as “POSIX” compliant to the US DoD. With more of this internal, less “not-invented-here” thinking, the need for a bogus OS (Copland) and the ultimate reverse takeover by NeXT could have been avoided.

One Mac Head, Two Minds

An excellent article from the New York Times: Weighing a Switch to a Mac. Interesting, as it goes through the two options: BootCamp or Parallels.

You don’t need to leave your Windows-mind behind when switching. Now that I am disconnected from the Adobe-mind, I rarely use Windows applications. But then again, I’ve not really done much in the last two weeks apart from fill this blog up with stuff!

State of Mac Virtualization

MacWorld reports from the WWDC and an interview with Ben Rudolph of Parallels:

“What’s more, Parallels Desktop for Mac will see ‘fast 3D graphics support,’ presumably to help cater to gamers who want to run Windows games without having to reboot their machine.”

I’ve just updated to the latest Parallels beta; it was smooth, and you can notice the graphics improvement. Being able to tweak the virtual environment/MacOS X is cool. No ACPI BIOS yet, so no Vista install. Yet.

Now that Microsoft has left the MacOS X sphere, Parallels seems to be positioning itself at the consumer end of the market: games and ease of use. And increasing its distribution was a smart and calculated move.

This leaves VMware to the high end. As predicted here, two of the three predictions have come true; and according to a Macintouch interview with Dave Schroeder of VMware, the third is going to need customers to voice their needs to Apple. So it is not off the table; however, we have Apple’s mantra/dogma of “MacOS X will never run on non-Apple hardware” to surmount.

It is within the realms of possibility that Apple could create a version of MacOS X Server with a distinct, non-desktop personality (desktop APIs removed) that checked for either Apple or VMware “virtual hardware”, creating a stable, enterprise-level Unix. This leaves customers to choose either Xserve hardware with MacOS X Server, or VMware virtual hardware with MacOS X Server. The result is a live market test of the ROI of being in the highly competitive and fast-moving blade server marketplace.

Leave the desktop MacOS X to run on Apple hardware only.

There must be a gaggle of Product Managers and Finance-types deep inside Cupertino running their pivot tables in Excel to argue both sides of the equation. The sales of these new Xserves in the next 2-3 quarters will predict the future of MacOS X Server on a virtualization platform.

Virtualization, MacOS X Server

Silicon Valley Sleuth writes a short article on the appearance of VMware at WWDC. It’s about more than just the desktop OS.

Here is another pie-in-the-sky, non-desktop scenario:

  • Apple releases new versions of both the Xserve and MacOS X Server.
  • Xserve becomes a tested and supported platform for VMware Server and, more importantly, VMware’s ESX Server. This will permit new Intel-based Xserves to be installed into datacenters with their heads held high. VMware endorsement is the cred Apple needs to go to the next revenue level with their servers.
  • An implementation of Leopardized MacOS X Server will run on non-Apple hardware on VMware. This is a counter-punch to the recent Xen/Microsoft/VMware wrangling. Now MacOS X Server can run on a stable and supported platform (VMware ESX) rather than the multitude of hardware configurations found in the Intel world.

So, what’s the net-net of this? Apple gets VMware supported as an application on the MacOS X desktop; endorsement of their blade server environment; and more sales of MacOS X Server without the support hassles.

VMware gets a unique and in-demand server OS with excellent corporate support. Rather than the Linux/Intel “build it yourself” approach IS managers have had to adopt, a supported platform is important.

It is not so much about the desktop, but the server.

The next few days will be very interesting!

Our Virtual Future: There are Cycles to Burn

Just over the digital horizon, your Apple MacBook will boot multiple MacOS X 10.x, Windows Vista and Linux/Ubuntu operating systems at launch. You don’t see all of their friendly faces, but they are there, ready to go. Get-info on Firefox, and you inform the primary OS which of these operating system environments you would like the application to launch into. Need to run Outlook 2007 for your large organisation? No stress: it’s there, behind the scenes, as you run Photoshop on the MacOS side. If your Windows instance crashes as you are testing a new application, it is shut down and relaunched automatically. Far fetched? With virtualization and multiple-core CPUs, no. On the desktop, usable virtualization is relatively new; what can we do with modern processors and software? There are cycles to burn.

Virtualization is nothing new in the IT world. Mainframes, the Crocodilian survivors of last century, have long used virtualization as a method of performance management and isolation. If you log in as a user on a mainframe, your session is a virtualized instance of the whole operating system. This isolates you from other users, and protects the whole system from strange things you may do: like runaway queries. SELECT * FROM INVOICES;

Micro-processors since the Intel 4004 have survived with a single Central Processing Unit (CPU) that stepped through commands in strict military time as sounded by the clock. We started with Hertz, then x1000 to Megahertz, and now x1000 again to Gigahertz. A big leap has been multiple “cores”, or separate CPUs, added to the Intel processor line along with the VT-x virtualization extensions. With multi-core processors being the current “thing”, and with Intel talking about processors with up to 32 cores in 2010, performance is going horizontal (more CPU cores) as well as vertical (GHz clock speeds). The drummer in that military band must be getting sore arms beating that fast!

PowerPC and other RISC-like processors started to split their processing commands to internal co-processors. This is like delegating the “difficult” jobs to the specialized underlings. For instance, Maths to Floating Point Units (FPUs). To gain performance during the latter days of the PowerPC, Apple started adding multiple processors to split the workload. Smart software could take advantage of these to speed up heavy processing tasks.

Multiple cores in the same processor are bringing this same philosophy to everyone, including low-end MacBook laptops. Few applications scale well across multiple cores: one that does is virtualization.

How does this relate to virtualization? Now that we have multiple cores, one potential use is to run separate operating systems on them, each operating system running at full speed. Veterans in the Mac world will remember the first versions of non-Microsoft VirtualPC. This software emulated (and dynamically recompiled!) Intel processor commands on PowerPC, like a human language translator. Translating takes time, and therefore the operating system just didn’t feel snappy. Now that we have Intel multiple cores, the OS world is a user’s oyster, as the translation is no longer required.

Now that virtualization support is in the hardware, software that has been unique can be commoditised. This has drawn VMware and Microsoft into a war for the “Virtual Server Platform”, with the first battle on the desktop to grab hearts-and-minds. The dollar returns for these organizations lie with “ownership” of the server platform.

Microsoft has released VirtualPC 2004 as a free product, with VirtualPC 2007 also to follow for free. This is a parry to the VMware thrust of VMware Player. VirtualPC comes from a product started by Connectix. At a recent Macworld it was announced that Virtual PC for Mac would support Intel-based Macs in the future. The Channel9 team at Microsoft have an interview with the engineers of VirtualPC; it provides an excellent backgrounder on the history and technology of virtualization.

EMC, and specifically their VMware division, makes VMware Server and VMware Player available for free. Conveniently, VMware has links to pre-created VMware images, or “appliances” as they name them.

Server Appliances: one of the VMware images available is a Firefox Browser appliance. A self-contained operating system with a user interface (Ubuntu) and Firefox, pre-installed and ready to go. We are all waiting for VMware for the Intel-based Macs.

The recent squabble over the Xen virtualization environment between Microsoft and VMware, and recent comments by Novell, are an indicator of the importance of server virtualization. Many large server hardware companies must be concerned about the impact on their business as server hardware is consolidated.

Large systems and IT shops with many physical servers, and with the need to support legacy server applications, benefit from virtualization today. How about desktop environments?

One of the first applications I installed on my MacBook Pro was Parallels Desktop for the Mac. There are a few Windows-only applications I need to use, and within 30 minutes I was sold (and yes, I purchased a copy). I use Windows XP daily; Parallels has also been useful for seeing what this commotion about Ubuntu has been about, and for a test install of Debian prior to its production install as our home server.

10 Ways Virtualization Will Change Our Digital Lives:

  1. Critical Application Isolation: In our daily computer-lives, some applications are more mission-critical than others. Imagine you have a soft-phone running via VoIP. You really want that application running, rain, hail or crash. With virtualization, you could put that application in an environment that is isolated from your World-of-Warcraft environment. In fact, you could have separate “play” and “work” environments to separate your digital life.
  2. Holistic Backups: Back up your VM, and therefore your whole environment. I remember the first time I lost a hard drive (an amazing 100Mb external SCSI drive on a Mac II) and lost all the data. Bourbon and Coke soothed the loss; now I have a laptop with 100Gb. Learning from that original loss of data, there are a myriad of backups lying around: burnt CDs, DVDs, a Debian server with 450Gb, some data inside GMail. However, the investment in the setup of my machine is worth 4-5 working days to get back “online” if I lost the hard drive. With an image of the whole environment backed up, that setup investment is protected, let alone the data.
  3. Pre-setup Server Platforms: Take a project like Ruby on Rails. To install this beastie, you have to install Ruby, Apache, MySQL (or Postgres), the Ruby Gem remote installation/dependency system, Rails, plus a few Ruby hooks to connect all this together. For the faint of heart, or the non-OS tweaker, it’s a major and potentially impossible chore. Deploying this in a development/test/production cycle increases the complexity by an order of magnitude. Servers, and server software, are just too darn complex for the average developer.

    Instead, if there were a lightweight OS with these pre-configured, installed and security-checked, a developer could simply download the VirtualPC/VMware image and start.

    It would be cool if organizations that sold/developed server tools provided a “pre-installed” image rather than a myriad of inter-linked installers and required dependencies.

  4. Lightweight Server Operating Systems: or a base “framework” on which “virtualized” systems can be created and easily deployed. Guess what? These environments exist for Linux, and potentially for Windows.

    Rather than building a large, complex OS with multiple software sub-installs, commercial server applications should move to a VMware/VirtualPC image for double-click launch-and-go.

  5. Deep Debugging: imagine an instrumented, or as Microsoft calls it, an “enlightened” OS. Run the VM, record all the commands until a crash. Rewind, replay.
  6. Virtualizable OS for Test/Development, or Quick OS Undo: launch a VM, test an application, and keep it separate from your working operating system. If something “breaks”, throw away the VM image and restart. Keeping clean images of various operating systems in “cold storage” permits quick resetting back to a baseline.

    From a recent Ruby on Rails podcast interview: “It’s about breadth, not depth.” As processors start to go multiple-core, applications need to scale horizontally, not expect more processor speed to magically do the work. This has an impact on dynamic/scripted languages being deployed as web applications. Also, SOA is about these applications connecting to each other between different servers.

  7. Hosted Servers from Virtualized Images: grab a “copy” of a virtual server from your new hosting provider; locally test/install, and then remotely deploy and start. After writing this, I noticed that TextDrive is offering Solaris Container style serving.
  8. Clone + Clone Back: Take an image of your current VM OS configuration. Make some change (install an application or an OS upgrade), and binary re-merge the now-modified clone back onto the original. This is probably more difficult to do than to think through, but a well-architected OS could permit smaller, diff-style changes.
  9. Why Re-invent the Wheel? Large applications have large problems, such as saving files. With I/O and memory management (IO/MM) and low-level access to the hardware provided by a well-tested OS foundation, could a large application run in its own operating system? Separating the IO/MM out of the large framework into a well-tested OS results in re-factored frameworks, and higher-level applications really don’t know they are in a virtual environment.
  10. Network Virtual Desktops: A twist on the Citrix environment, but the CPU in use is on the desktop, with the file image on the server. VMware is part of a group working on desktop VDI. What we need are small, easily virtualizable (and pre-installed) MacOS X and Windows XP/Vista images.
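
Item 6 in the list above is practical today with nothing more than file copies, because a VM’s whole disk is a single file. A sketch of the “cold storage” workflow, using small stand-in files rather than real multi-gigabyte images (all paths here are illustrative):

```shell
# Keep a pristine baseline image in cold storage; only ever run copies of it.
STORE="$HOME/vm-coldstore-demo"
mkdir -p "$STORE/baseline" "$STORE/scratch"
echo "clean Windows XP install" > "$STORE/baseline/winxp-clean.hdd"

# Spin up a disposable working copy and break it with an experiment...
cp "$STORE/baseline/winxp-clean.hdd" "$STORE/scratch/winxp-test.hdd"
echo "dodgy beta application" >> "$STORE/scratch/winxp-test.hdd"

# ...then throw the broken copy away and reset to the baseline in seconds.
rm "$STORE/scratch/winxp-test.hdd"
cp "$STORE/baseline/winxp-clean.hdd" "$STORE/scratch/winxp-test.hdd"
```

The baseline never gets booted directly, so it can never be polluted: every experiment starts from a known-clean state.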

Virtualization, as the processors go madly horizontal with multiple CPU cores, is going to be a large part of our future: not only on the server, but also on the desktop.

Download Parallels (if you’re on an Intel Mac) or VMware Player / Microsoft VirtualPC, give it a whirl, and experience the beginning of our virtual future. Burn those cycles!

Bedtime Reading