Being the Forest, Forgetting the Trees

Microsoft is on the cusp of shipping a whole forest of new products: Vista, .Net 3.0, Office 2007 and more *.live.com stuff than you can poke a branch (or stick) at. All of which presents Microsoft with some tall challenges. How does a single tree get noticed? How does the world find the saplings that are going to be the next Sequoiadendron giganteum? Does the forest work together as a cohesive ecosystem?

Today, thanks to Microsoft Australia’s Frank Arrigo, I attended the Blogger’s Brunch. Great of Microsoft to reach out to a section of the local technology blogging community. None of the attendees (except Angus Kidman and Nic) are famous in the blogosphere, but on the internets, no one knows you are an Australian.

Whilst I have been a Microsoft customer since 1984 (Microsoft Basic 1.0 on a Macintosh 128K – and the box is in storage somewhere), I am a relative noob to “marketectural” Microsoft. The speak is strangely familiar to my ears.

The following are some random thoughts and unexpressed questions from this morning’s session:

  • To the Microsoft PR people: sorry I paralleled the Microsoft-Groove/Ray Ozzie history with the Apple-NeXT/Steve Jobs one. To Frank Arrigo: sorry I stated that the *.live.com people are having fun being compatible with all the versions of Internet Explorer rather than implementing Firefox support. Both of these were intended as jokes, not memes.
  • Today’s Australian Financial Review’s IT section has quotes from various large Australian financial organisations stating that they are taking a wait-and-see approach to Windows Vista. Some are only now installing Windows XP. These organisations state they will install Vista in 2-3 years. I find this quite interesting, as it has taken them 4-5 years to install Windows XP. Personally, I am concerned if a large financial organisation is not running a recent, up-to-date, tested and secure OS on all their desktop computers. I’d love to know what features in upcoming products are direct feedback from Australian customers. This would show that the software development process is a two-way street.
  • Sharepoint should evolve into a *.live.com server for the Enterprise. If Vista has all the hooks, and the connected/disconnected world and new applications are going to be mashed (lashed?) together with live stuff, this seems like a logical move. However, large organizations will be reluctant to put all their data into the world’s cloud for all to stumble upon. I am no expert on Sharepoint and all the positioning stuff, but it seems there might be a little “tension” (not a bad thing, mind you) between these two environments. *.live.com is garnering the mindshare as it is new-ish; many of the APIs and licensing models are still to be determined. Come to think of it, these are probably the two reasons why they are still separate: revenue and developer penetration.
  • After hearing the IT professionals fawn over the coolness of Vista infrastructure deployment … I left the session (both mentally and physically) asking “what are Microsoft’s customers going to do with all these fine trees?” Customers doing meaningful stuff with Microsoft’s software so that they can impress their customers is where it is at. Marketing people might call it “unlocking the value of the platform”.
  • Virtualization on the desktop has been one of my “things” for a while, so it’s interesting to hear that VirtualPC is to be included in the Enterprise version of Vista. Whilst listening to the intricacies of Vista vs XP deployment, my mind was racing, thinking about the future of operating systems. So here goes: why is the Enterprise desktop so fat? Why not have a Singularity-based OS with the .Net 3.0 Framework as the API? Win32 and other legacy apps could be virtualized to the desktop. As the world and work become more connected, the smart client at the edge of the network will have a different face.

In summary, I grokked that Microsoft groks (sorry, Heinlein) the world as it exists today. Ensuring that no trees are felled in the rush to market is going to be an interesting challenge.

One Mac Head, Two Minds

An excellent article from the New York Times: Weighing a Switch to a Mac. Interesting, as it goes through the two options: Boot Camp or Parallels.

You don’t need to leave your Windows-mind behind when switching. Now that I am disconnected from the Adobe-mind, I rarely use Windows applications. But then again, I’ve not really done much in the last two weeks apart from filling this blog up with stuff!

State of Mac Virtualization

MacWorld reports from WWDC, including an interview with Ben Rudolph of Parallels:

…“What’s more, Parallels Desktop for Mac will see ‘fast 3D graphics support,’ presumably to help cater to gamers who want to run Windows games without having to reboot their machine”…

I’ve just updated to the latest Parallels beta; it was smooth and you can notice the graphics improvement. Being able to tweak the virtual environment/MacOS X is cool. No ACPI BIOS yet, so no Vista install. Yet.

Now that Microsoft has left the MacOS X sphere, Parallels seems to be positioning itself at the consumer end of the market: games and ease of use. And increasing its distribution was a smart and calculated move.

This leaves VMware to the high end. As predicted here, two of the three predictions have come true; and according to a Macintouch interview with Dave Schroeder of VMware, the third is going to need customers to voice their needs to Apple. So it is not off the table; however, we have Apple’s mantra/dogma of “MacOS X will never run on non-Apple hardware” to surmount.

It is within the realms of possibility that Apple could create a version of MacOS X Server that had a distinct, non-desktop personality (desktop APIs removed) and checked for either Apple or VMware “virtual hardware”, creating a stable, enterprise-level Unix. This leaves customers to choose either Xserve hardware with MacOS X Server, or VMware virtual hardware with MacOS X Server. The result is a live market test of the ROI of being in the highly competitive and fast-moving blade server marketplace.

Leave the desktop MacOS X to run on Apple hardware only.

There must be a gaggle of Product Managers and Finance-types deep inside Cupertino running their pivot tables in Excel to argue both sides of the equation. The sales of these new Xserves in the next 2-3 quarters will predict the future of MacOS X Server on a virtualization platform.

VMware Player for MacOS X

Pre-register your interest for the VMware Player for MacOS X 10.4. No comments on requirements, etc – but at least we are heading in the correct direction. Now Mac users can run those self-contained appliances easily.

Update: 11:25am 8th August: “Working in the labs…” Srinivas Krishnamurti of VMware talks about the forthcoming MacOS X version of their virtualization software. Of note are the quotes “This product will allow you to create and run virtual machines on OS X” and “virtual machines created with this product are fully compatible with the latest release of other VMware products”.

Virtualization, MacOS X Server

Silicon Valley Sleuth writes a short article on the appearance of VMware at WWDC. It’s about more than just the desktop OS.

Here is another pie-in-the-sky, non-desktop scenario:

  • Apple releases new versions of both their Xserve and MacOS X Server.
  • Xserve becomes a tested and supported platform for VMware Server and, more importantly, VMware’s ESX Server. This will permit new Intel-based Xserves to be installed into datacenters with their heads held high. VMware endorsement is the cred Apple needs to go to the next revenue level with their servers.
  • A “Leopardized” MacOS X Server implementation will run on non-Apple hardware under VMware. This is a counter-punch to the recent Xen/Microsoft/VMware wrangling. Now MacOS X Server can run on a stable and supported platform (VMware ESX) rather than the multitude of hardware configurations found in the Intel world.

So, what’s the net-net of this? Apple gets VMware supported as an application on the MacOS X desktop, endorsement of its blade server environment, and more sales of MacOS X Server without the support hassles.

VMware gets a unique and in-demand server OS with excellent corporate support. Rather than adopting the Linux/Intel “build it yourself” approach, IS managers get a supported platform, and a supported platform is important.

It is not so much about the desktop, but the server.

The next few days will be very interesting!

Our Virtual Future: There are Cycles to Burn

Just over the digital horizon, your Apple MacBook will boot multiple MacOS X 10.x, Windows Vista and Linux/Ubuntu operating systems at launch. You don’t see all of their friendly faces, but they are there, ready to go. Get-info on Firefox, and you inform the primary OS which of these operating system environments you would like the application to launch into. Need to run Outlook 2007 for your large organisation? No stress: it’s there, behind the scenes, while you run Photoshop on the MacOS side. If your Windows instance crashes while you are testing a new application, it is shut down and relaunched automatically. Far-fetched? With virtualization and multiple-core CPUs, no. On the desktop, usable virtualization is relatively new; what can we do with modern processors and software? There are cycles to burn.

Virtualization is nothing new in the IT world. Mainframes, the crocodilian survivors of the last century, have long used virtualization as a method of performance management and isolation. If you log in as a user on a mainframe, your session is a virtualized instance of the whole operating system. This isolates you from other users and protects the whole system from strange things you may do: like runaway queries. SELECT * FROM INVOICES;

Micro-processors since the Intel 4004 have survived with a single Central Processing Unit (CPU) that stepped through commands in strict military time as sounded by the clock. We started with Hertz, then x1000 to Megahertz, and now x1000 again to Gigahertz. A big leap has been multiple “cores”, or separate CPUs, added to the Intel processor line, along with VT-x hardware support for virtualization. With multi-core processors being the current “thing”, and with Intel talking about their processors having up to 32 cores in 2010, performance is going horizontal (more CPU cores) as well as vertical (GHz clock speeds). The drummer in that military band must be getting sore arms beating that fast!

PowerPC and other RISC-like processors started to split their processing commands off to internal co-processors. This is like delegating the “difficult” jobs to specialized underlings: for instance, maths to Floating Point Units (FPUs). To gain performance during the latter days of the PowerPC, Apple started adding multiple processors to split the workload. Smart software could take advantage of these to speed up heavy processing tasks.

Multiple cores in the same processor bring this same philosophy to everyone – including low-end MacBook laptops. Few applications scale well across multiple cores: one that does is virtualization.
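
As an aside, here is a minimal sketch of what “scaling horizontally” looks like from an application’s point of view, using Python’s multiprocessing module. The worker function and workload are purely illustrative assumptions, not any particular product’s code.

    # Minimal sketch (illustrative only): spread a CPU-bound task across
    # every available core using Python's multiprocessing module.
    from multiprocessing import Pool, cpu_count

    def sum_of_squares(n):
        # Stand-in for real work: rendering, encoding, compiling, or a guest OS.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        cores = cpu_count()
        jobs = [2_000_000] * cores      # one chunk of work per core
        with Pool(processes=cores) as pool:
            results = pool.map(sum_of_squares, jobs)
        print("Ran", len(jobs), "jobs across", cores, "cores")

Each worker runs on its own core; add more cores and you finish more chunks in the same wall-clock time, which is exactly the property a hypervisor exploits when it parks a whole guest OS on each core.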

How does this relate to virtualization? Now that we have multiple cores, one potential use is to run a separate operating system on each of them, with each operating system running at full speed. Veterans in the Mac world will remember the first versions of non-Microsoft VirtualPC. That software emulated (and dynamically recompiled!) Intel processor commands on the PowerPC, like a human language translator. Translating takes time, and therefore the operating system just didn’t feel snappy. Now that we have multiple Intel cores, the OS world is the user’s oyster, as the translation is no longer required.

As virtualization moves into the hardware, software that has been unique can be commoditised. This has drawn VMware and Microsoft into a war for the “Virtual Server Platform”, with the first battle on the desktop to grab hearts and minds. The dollar returns for these organizations lie with the “ownership” of the server platform.

Microsoft has released VirtualPC 2004 as a free product, with VirtualPC 2007 also to follow for free. This is a parry to VMware’s thrust with VMware Player. VirtualPC comes from a product started by Connectix. An announcement was made at a recent MacWorld that Virtual PC for Mac would support Intel-based Macs in the future. The Channel9 team at Microsoft have an interview with the engineers of VirtualPC. This provides an excellent backgrounder on the history and technology of virtualization.

EMC, and specifically their VMware division, has made VMware Server and VMware Player available for free. Conveniently, VMware have links to pre-created VMware images, or “appliances” as they call them.

Server Appliances: one of the VMware images available is a Firefox Browser appliance. A self-contained operating system with user interface (Ubuntu) and Firefox – pre-installed, ready to go. We are all waiting for VMware for the Intel-based Macs.

The recent squabble over the Xen environment for virtualization between Microsoft and VMware, and recent comments by Novell, are an indicator of the importance of server virtualization. Many large server hardware companies must be concerned about the impact on their business as server hardware is consolidated.

Servers, large systems, and IT shops with many physical servers and a need to support legacy server applications benefit from virtualization today. How about desktop environments?

One of the first applications I installed on my MacBook Pro was Parallels Desktop for the Mac. There are a few Windows-only applications I need to use, and within 30 minutes I was sold (and yes, I purchased a copy). I use Windows XP daily; Parallels has also been useful for seeing what this commotion about Ubuntu has been about, and for doing a test install of Debian prior to a production install as our home server.

10 Ways Virtualization Will Change Our Digital Lives:

  1. Critical Application isolation: In our daily computer-lives, some applications are more mission critical than others. Imagine you have a soft-phone running via VoIP. You really want that application running: rain, hail or crash. With virtualization, you could put that application in an environment that is isolated from your World-of-Warcraft environment. In fact, you could have separate “play” and “work” environments to separate your digital life.
  2. Holistic Backups: Back up your VM, and therefore your whole environment (a minimal backup sketch follows this list). I remember the first time I lost a hard drive (an amazing 100Mb external SCSI drive on a Mac II) and lost all the data. Bourbon and Coke soothed the loss; now I have a laptop with 100Gb. Learning from the original loss of data, there are a myriad of backups lying around: burnt CDs, DVDs, a Debian server with 450Gb, some data inside GMail. However, the setup of my machine represents 4-5 working days of effort to get back “online” if I lost the hard drive. With an image of the whole machine, I am backing up that investment in the setup, let alone the data.
  3. Pre-setup Server Platforms: take a project like Ruby on Rails. To install this beastie, you have to install Ruby, Apache, MySQL (or Postgres), the Ruby Gem remote installation/dependency system, Rails, plus a few Ruby hooks to connect all this together. For the faint of heart, or the non-OS tweaker, it’s a major and potentially impossible chore. Deploying this in a development/test/production cycle increases the complexity by an order of magnitude. Servers, and server software, are just too darn complex for the average developer.

    Instead, if there were a lightweight OS with all of these pre-configured, installed and security-checked, a developer could simply download the VirtualPC/VMware image and start.

    It would be cool if organizations that sold/developed server tools provided a “pre-installed” image rather than a myriad of inter-linked installers and required dependencies.

  4. Lightweight Server operating systems, or a base “framework” on which “virtualized” systems can be created and easily deployed. Guess what? These environments already exist for Linux, and potentially for Windows.

    Rather than building a large, complex OS with multiple software sub-installs, commercial server applications should move to a VMWare/VirtualPC image for double-click launch-and-go.

  5. Deep debugging: imagine an instrumented, or as Microsoft calls it an “enlightened”, OS. Run the VM, record all the commands until a crash. Rewind, replay.
  6. Virtualizable OS for Test/Development, or Quick OS Undo: launch a VM, test an application, and keep it separate from your working operating system. If something “breaks”, throw away the VM image and restart. Keeping clean images of various operating systems in “cold storage” permits quick resetting back to a baseline.

    From a recent Ruby on Rails podcast interview: “It’s about breadth, not depth.” As processors start to go multi-core, applications need to scale horizontally, not expect more processor speed to magically do the work. This has an impact on dynamic/scripted languages being deployed as web applications. SOA is also about these applications connecting to each other across different servers.

  7. Hosted Servers from Virtualized Images: grab a “copy” of a virtual server from your new hosting provider; locally test/install and then remotely deploy and start. After writing this, I noticed that TextDrive is offering Solaris Container style serving.
  8. Clone + Clone back: Take an image of your current VM OS configuration. Make some change (install an application or OS upgrade), and binary re-merge the now modified clone back onto the original. This is probably more difficult to do than to think through, but a well-architected OS could permit smaller, diff-style changes (see the sketch after this list).
  9. Why re-invent the wheel? Large applications have large problems, like saving files, and need a solid foundation: with IO/MM and low-level access to the hardware provided underneath, could a large application run in its own operating system? Separating the IO/MM out of the large framework into a well-tested OS results in re-factored frameworks. Higher-level applications really don’t know they are in a virtual environment.
  10. Network Virtual desktops: A twist on the Citrix environment, but the CPU in use is on the desktop and the file image is on the server. VMware is part of a group working on desktop VDI. What we need are small, easily virtualizable (and pre-installed) images of MacOS X and Windows XP/Vista.
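
Point 2 above is easy to sketch: if the virtual machine lives in a single directory (as Parallels and VMware images typically do), backing up the whole environment is just copying that directory to a dated snapshot. The paths below are hypothetical examples, and the VM should be shut down before copying.

    # Minimal sketch: back up a whole VM environment by copying its directory
    # to a timestamped snapshot. Paths are hypothetical examples.
    import shutil
    from datetime import datetime
    from pathlib import Path

    VM_DIR = Path.home() / "Virtual Machines" / "WindowsXP"   # hypothetical VM bundle
    BACKUP_ROOT = Path("/Volumes/Backup/vm-snapshots")        # hypothetical backup disk

    def backup_vm(vm_dir: Path, backup_root: Path) -> Path:
        """Copy the VM directory to a dated snapshot directory and return its path."""
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        target = backup_root / (vm_dir.name + "-" + stamp)
        shutil.copytree(vm_dir, target)   # shut the VM down first, or the copy is inconsistent
        return target

    if __name__ == "__main__":
        print("Snapshot written to", backup_vm(VM_DIR, BACKUP_ROOT))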
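
To make point 8 a little more concrete, here is a rough sketch of the “smaller, diff-style changes” idea: compare a baseline disk image with its modified clone block by block, keep only the blocks that changed, and merge them back onto the baseline. The file names and block size are hypothetical, and real VM disk formats are more involved than the flat files assumed here.

    # Rough sketch of a block-level diff/merge between a baseline VM image and a
    # modified clone. File names and the block size are illustrative only.
    BLOCK_SIZE = 64 * 1024  # 64 KB blocks, an arbitrary choice for the sketch

    def block_diff(baseline_path, clone_path):
        """Return (offset, data) pairs for blocks that differ between the two images."""
        changed = []
        with open(baseline_path, "rb") as base, open(clone_path, "rb") as clone:
            offset = 0
            while True:
                a = base.read(BLOCK_SIZE)
                b = clone.read(BLOCK_SIZE)
                if not a and not b:
                    break
                if a != b:
                    changed.append((offset, b))
                offset += BLOCK_SIZE
        return changed

    def merge_back(baseline_path, changed_blocks):
        """Re-merge the changed blocks onto the baseline image, in place."""
        with open(baseline_path, "r+b") as base:
            for offset, data in changed_blocks:
                base.seek(offset)
                base.write(data)

    if __name__ == "__main__":
        diff = block_diff("baseline.img", "modified-clone.img")
        print(len(diff), "blocks changed")
        merge_back("baseline.img", diff)

If only a few applications were installed, the changed-block list stays small, which is what makes shipping or archiving the “diff” attractive compared with copying the whole clone.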

Virtualization, as the processors go madly horizontal with multiple CPU cores, is going to be a large part of our future: not only on the server, but also on the desktop.

Download Parallels (if you are on an Intel Mac) or VMware Player / Microsoft VirtualPC and give it a whirl to experience the beginning of our virtual future. Burn those cycles!

Bedtime Reading