AUReMIX07 Silverlight Video

[Image: Frank Arrigo]

Watch the video of Frank Arrigo and Monique Eagles here. Yes, you will need to install Silverlight.

This is my first experiment with Silverlight and the Microsoft Expression set of tools. Using the inbuilt players in Media Encoder saved many days/hours of hand coding; yet I am sure there is more in there that will trickle out over the coming weeks.

NOTE: Silverlight 1.1 is an alpha release!

Workflow (all on Vista Ultimate):

  • Edited footage in Adobe Premiere Pro 2.0
  • Exported the sequence from Premiere Pro using Adobe Media Encoder: 960×720 WMV9/WMA9, very light compression.
  • Imported into Microsoft Expression Media Encoder (May preview)
  • Exported footage as VC-1 Web Server High Speed (for serving from a normal web server). This setting is 640×480. Obviously, I could compress this more.
  • Edited Default.html to correctly reference EmePlayer.js (note: this got me for an hour. Linux web servers are case-sensitive, and Default.html points to emeplayer.js. 404! Bug reported; fix sketched after this list)
  • FTPed the files to a directory on nickhodge.com (I could have used Expression Web, but I was debugging the upper/lower-case file naming problem above)
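
If you hit the same 404, the fix is a one-liner (a sketch only: it assumes the generated Default.html sits in the current directory and references the lowercase name, as it did for me):

    # make the HTML reference match the case of the actual file on disk
    sed -i 's/emeplayer\.js/EmePlayer.js/' Default.html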

Thoughts? Comments? I only have Silverlight 1.1 alpha installed. I have tested in IE/Firefox on Windows and Safari/Firefox on MacOS X 10.4. The Mac's audio might be out of sync. Again, this is reported.


Climb every Mountain

Spent the last week and this week on a personal Ruby on Rails project. This involves Subversion (as a version management system), Mongrel, Capistrano, FTP, PostgreSQL, some smarts with DNS, Exim, and a two-day complete re-install of Debian. That re-install was not expected.

Unix has this wonderful and powerful concept: the root user knows what they are doing at all times. 99.9% of the time this is a safe assumption. 0.1% of the time you type “yes” instead of “no” – removing the kernel in this fashion is highly inadvisable.

How do you fix a broken Linux install?

Stage one involved making what is known as a LiveCD, or bootable Linux. I decided to download and boot from a Knoppix LiveCD. A quick restart from the CD, and I could see that the data was intact.

Stage two was installing a new 350GB HD for the data, bringing the server up to 0.5TB of storage. The old faithful Unix standbys of dd and fsck copied everything from the old 200GB drive to the new 350GB, and then the difficult work started.
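
The copy itself looked roughly like this (a sketch only: the device names /dev/sda and /dev/sdb are assumptions, so check yours before running anything like it):

    # clone the old 200GB disk onto the new 350GB, block for block
    dd if=/dev/sda of=/dev/sdb bs=1M conv=noerror,sync
    # verify the copied filesystem before trusting it with real work
    fsck -f /dev/sdb1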

Stage three was a full Debian reinstall onto the old 200GB drive, with a 0.5GB download of the most current packages, this time with apt pulling a 686 rather than a 386 kernel. This didn't take too long. Re-configuring all the servers and services (DNS, DHCP, CUPS, Samba, Apache, Subversion, Rails + gems, Python) took most of the weekend.
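
The kernel swap is pleasantly small (a sketch with etch-era package names, which may not match your Debian exactly):

    # fetch current package lists, then pull the 686-optimised kernel
    apt-get update
    apt-get install linux-image-2.6-686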

Stage four: backup scripts. 0.5TB is too much of a mountain of data to lose.
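
Nothing fancy is needed; a minimal sketch (the paths here are hypothetical placeholders, not my real layout):

    #!/bin/sh
    # mirror the data drive onto the backup drive nightly,
    # removing files on the backup that were deleted from the source
    rsync -a --delete /srv/data/ /mnt/backup/data/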

Our Virtual Future: There are Cycles to Burn

Just over the digital horizon, your Apple MacBook will boot multiple operating systems at launch: MacOS X 10.x, Windows Vista and Linux/Ubuntu. You don't see all of their friendly faces, but they are there, ready to go. Get-info on Firefox, and you inform the primary OS which of these operating system environments you would like the application to launch into. Need to run Outlook 2007 for your large organisation? No stress: it's there, behind the scenes, as you run Photoshop on the MacOS side. If your Windows instance crashes as you are testing a new application, it is shut down and relaunched automatically. Far-fetched? With virtualization and multiple-core CPUs, no. On the desktop, usable virtualization is relatively new; what can we do with modern processors and software? There are cycles to burn.

Virtualization is nothing new in the IT world. Mainframes, the crocodilian survivors of last century, have long used virtualization as a method of performance management and isolation. If you log in as a user on a mainframe, your session is a virtualized instance of the whole operating system. This isolates you from other users and protects the whole system from strange things you may do: like runaway queries. SELECT * FROM INVOICES;

Micro-processors since the Intel 4004 have survived with a single Central Processing Unit (CPU) that stepped through commands in strict military time as sounded by the clock. We started with Hertz, then x1000 to Megahertz, and now x1000 again to Gigahertz. A big leap has been multiple “cores”, or separate CPUs, added to the Intel processor line, along with the VT-x hardware virtualization extensions. With multi-core processors being the current “thing”, and Intel talking about their processors having up to 32 cores in 2010, performance is going horizontal (more CPU cores) and vertical (GHz clock speeds). The drummer in that military band must be getting sore arms beating that fast!

PowerPC and other RISC-like processors started to split their processing commands across internal co-processors. This is like delegating the “difficult” jobs to specialized underlings: for instance, maths to Floating Point Units (FPUs). To gain performance during the latter days of the PowerPC, Apple started adding multiple processors to split the workload. Smart software could take advantage of these to speed up heavy processing tasks.

Multiple cores in the same processor are bringing this same philosophy to everyone – including low-end MacBook laptops. Few applications scale well across multiple cores: one that does is virtualization.

How does this relate to virtualization? Now that we have multiple cores, one potential use is to run separate operating systems on them, each operating system running at full speed. Veterans in the Mac world will remember the first versions of non-Microsoft VirtualPC. This software emulated (and dynamically recompiled!) Intel processor commands on PowerPC, like a human language translator. Translating takes time, and therefore the operating system just didn't feel snappy. Now that we have multiple Intel cores, the OS world is the user's oyster, as the translation is no longer required.

As virtualization moves into the hardware, software that has been unique can be commoditised. This has drawn VMware and Microsoft into a war for the “Virtual Server Platform”, with the first battle fought on the desktop to grab hearts-and-minds. The dollar returns for these organizations lie with “ownership” of the server platform.

Microsoft has released VirtualPC 2004 as a now-free product, with VirtualPC 2007 also to follow for free. This is a parry to the thrust of VMware Player. VirtualPC comes from a product started by Connectix. An announcement was made at a recent MacWorld that Virtual PC for Mac would support Intel-based Macs in the future. The Channel9 team at Microsoft have an interview with the engineers of VirtualPC, which provides an excellent backgrounder on the history and technology of virtualization.

EMC, and specifically their VMware division, has made VMware Server and VMware Player available for free. Conveniently, VMware links to pre-created VMware images, or “appliances” as they name them.

Server appliances: one of the VMware images available is a Firefox browser appliance. A self-contained operating system with user interface (Ubuntu) and Firefox, pre-installed and ready to go. We are all waiting for VMware for the Intel-based Macs.

The recent squabble between Microsoft and VMware over the Xen virtualization environment, and recent comments by Novell, are an indicator of the importance of server virtualization. Many large server hardware companies must be concerned about the impact on their business as server hardware is consolidated.

Servers, large systems, and IT shops with many physical servers and a need to support legacy server applications benefit from virtualization today. How about desktop environments?

One of the first applications I installed on my MacBook Pro was Parallels Desktop for the Mac. There are a few Windows-only applications I need to use, and within 30 minutes I was sold (and yes, I purchased a copy). I use Windows XP daily; Parallels has also been useful for seeing what this commotion about Ubuntu has been about, and for doing a test install of Debian prior to the production install on our home server.

10 Ways Virtualization Will Change Our Digital Lives:

  1. Critical Application Isolation: in our daily computer-lives, some applications are more mission-critical than others. Imagine you have a soft-phone running via VoIP. You really want that application running, rain, hail or crash. With virtualization, you could put that application in an environment that is isolated from your World-of-Warcraft environment. In fact, you could have separate “play” and “work” environments to divide your digital life.
  2. Holistic Backups: back up your VM, and therefore your whole environment. I remember the first time I lost a hard drive (an amazing 100MB external SCSI drive on a Mac II) and lost all the data. Bourbon and Coke soothed the loss; now I have a laptop with 100GB. Learning from that original loss of data, there are a myriad of backups lying around: burnt CDs, DVDs, a Debian server with 450GB, some data inside GMail. However, the investment in the setup of my machine is worth 4-5 working days to get back “online” if I lost the hard drive. With an image of the whole machine, that investment in the setup is protected, let alone the data.
  3. Pre-setup Server Platforms: take a project like Ruby on Rails. To install this beastie, you have to install Ruby, Apache, MySQL (or Postgres), the RubyGems remote installation/dependency system, Rails, plus a few Ruby hooks to connect all this together (see the sketch after this list). For the faint of heart, or the non-OS-tweaker, it's a major and potentially impossible chore. Deploying this in a development/test/production cycle increases the complexity by an order of magnitude. Servers, and server software, are just too darn complex for the average developer.

    Instead, if there was a lightweight OS with these pre-configured, installed, security checked – a developer could simply download the VirtualPC/VMWare image and start.

    It would be cool if organizations that sold/developed server tools provided a “pre-installed” image rather than a myriad of inter-linked installers and required dependencies.

  4. Lightweight Server Operating Systems: a base “framework” on which “virtualized” systems can be created and easily deployed. Guess what? There are already such environments for Linux, and potentially for Windows.

    Rather than building a large, complex OS with multiple software sub-installs, commercial server applications should move to a VMware/VirtualPC image for double-click launch-and-go.

  5. Deep Debugging: imagine an instrumented, or as Microsoft calls it, an “enlightened” OS. Run the VM, record all the commands until a crash. Rewind, replay.
  6. Virtualizable OS for Test/Development, or Quick OS Undo: launch a VM to test an application, keeping it separate from your working operating system. If something “breaks”, throw away the VM image and restart. Keeping clean images of various operating systems in “cold storage” permits quick resetting back to a baseline.

    From a recent Ruby on Rails podcast interview: “It's about breadth, not depth”. As processors go multiple-core, applications need to scale horizontally, not expect more processor speed to magically do the work. This has an impact on dynamic/scripted languages being deployed as web applications. SOA, too, is about applications connecting to each other between different servers.

  7. Hosted Servers from Virtualized Images: grab a “copy” of a virtual server from your new hosting provider; locally test/install, and then remotely deploy and start. After writing this, I noticed that TextDrive is offering Solaris Container-style serving.
  8. Clone + Clone Back: take an image of your current VM OS configuration. Make some change (install an application or an OS upgrade), and binary re-merge the now-modified clone back onto the original. This is probably more difficult to do than to think through, but a well-architected OS could permit smaller diff-style changes.
  9. Why Re-invent the Wheel? Large applications have large problems: saving files, I/O, memory management. With a foundation providing IO/MM and low-level access to the hardware, could a large application run in its own operating system? Separating the IO/MM out of the large framework into a well-tested OS results in re-factored frameworks, and higher-level applications really don't know they are in a virtual environment.
  10. Network Virtual Desktops: a twist on the Citrix environment, but the CPU in use is on the desktop, with the file image on the server. VMware is part of a group working on desktop VDI. What we need is a small, easily virtualizable (and pre-installed) image of MacOS X and Windows XP/Vista.
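
To make the chore in item 3 concrete, the Rails install chain looks roughly like this (a sketch with assumed Debian package names; on a 2007-era system RubyGems itself may need a source install):

    # base pieces: language, web server, database
    apt-get install ruby apache2 postgresql
    # with RubyGems on the box, pull Rails and its dependencies
    gem install rails --include-dependencies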

Virtualization, as the processors go madly horizontal with multiple CPU cores, is going to be a large part of our future: not only on the server, but also on the desktop.

Download Parallels (if you are on an Intel Mac) or VMware Player / Microsoft VirtualPC, give it a whirl, and experience the beginning of our virtual future. Burn those Cycles!

Bedtime Reading