“Stallman: Linux used to track Londoners”
Finally, Stallman suggested keeping Oyster cards in aluminium foil when they aren’t actually being scanned for travel, to prevent them being scanned secretly.
Watch the video of Frank Arrigo and Monique Eagles here. Yes, you will need to install Silverlight.
This is my first experiment with Silverlight and the Microsoft Expression set of tools. Using the inbuilt players in Media Encoder saved many days/hours of hand coding; yet I am sure there is more in there that will trickle out over the coming weeks.
NOTE: Silverlight 1.1 is alpha-release!
Workflow (all on Vista Ultimate):
Thoughts? Comments? I only have Silverlight 1.1 alpha installed. I’ve tested in Windows IE/Firefox and MacOS X 10.4 Safari/Firefox. The Mac’s audio might be out of sync. Again, this has been reported.
Spent the last week and this week on a personal Ruby on Rails project. This involves Subversion (as a version management system), Mongrel, Capistrano, FTP, PostgreSQL, some smarts with DNS, Exim and a two-day complete re-install of Debian. That re-install was not expected.
Unix has this wonderful and powerful concept: the root user knows what they are doing at all times. 99.9% of the time this is a safe assumption. 0.1% of the time you type “yes” instead of “no” – removing the kernel in this fashion is not recommended.
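One possible safeguard against that 0.1% (a sketch, not from the original mishap): wrap destructive commands in a tiny function that shows exactly what is about to run and demands the literal word “yes” first.

```shell
# confirm: echo the command, ask for the literal word "yes", then run it.
confirm() {
  printf 'About to run: %s\nType yes to continue: ' "$*"
  read answer
  if [ "$answer" = "yes" ]; then
    "$@"
  else
    echo "aborted"
  fi
}

# Dry-run style demo: the guarded "command" here only echoes, it deletes nothing.
echo yes | confirm echo rm -rf /boot
```

A one-line pause like this is cheap insurance when the alternative is a weekend of re-installing Debian.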
How do you fix a broken Linux install?
Stage one involved making what is known as a LiveCD, or bootable Linux. I decided to download and boot from a Knoppix LiveCD. A quick restart from the CD, and I could see that the data was intact.
Stage two was installing a new 350GB HD for the data, bringing the server up to 0.5TB of storage. Then the old faithful Unix standbys, dd and fsck, to copy from the old 200GB to the new 350GB, and the difficult work could start.
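The copy step can be sketched like this, demonstrated on small image files so it is safe to run as-is. For real disks you would substitute device nodes (e.g. if=/dev/sdb for the old 200GB, of=/dev/sdc for the new 350GB – those names are assumptions, check fdisk -l first) and run fsck on the target partition afterwards.

```shell
# Create a stand-in "old disk" image (8MB of random data)
dd if=/dev/urandom of=old.img bs=1M count=8 2>/dev/null

# Block-for-block copy; conv=noerror,sync pads read errors with zeros
# instead of aborting -- handy when the old drive is getting flaky.
dd if=old.img of=new.img bs=64K conv=noerror,sync 2>/dev/null

# Verify the copy is identical before trusting it
cmp old.img new.img && echo "copy verified"
```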
Stage three is a full Debian reinstall onto the old 200GB, with a 0.5GB download of the most current packages, and switching via apt to a 686 rather than 386 kernel. This didn’t take too long. Re-configuring all the servers and services – DNS, DHCP, CUPS, Samba, Apache, Subversion, Rails+gems, Python – took most of the weekend.
Stage four: backup scripts. 0.5TB is too much of a mountain of data to lose.
Just over the digital horizon, your Apple MacBook will boot multiple MacOS X 10.x, Windows Vista and Linux/Ubuntu operating systems at launch. You don’t see all of their friendly faces, but they are there, ready to go. Get-info on Firefox, and you inform the primary OS which of these operating system environments you would like the application to launch into. Need to run Outlook 2007 for your large organisation? No stress: it’s there, behind the scenes, as you run Photoshop on the MacOS side. If your Windows instance crashes as you are testing a new application, it is shut down and relaunched automatically. Far fetched? With virtualization and multi-core CPUs, no. On the desktop, usable virtualization is relatively new; what can we do with modern processors and software? There are cycles to burn.
Virtualization is nothing new in the IT world. Mainframes, the Crocodilian survivors of last century, have long used virtualization as a method of performance management and isolation. If you log in as a user on a mainframe, your session is a virtualized instance of the whole operating system. This isolates you from other users and protects the whole system from strange things you may do: like runaway queries. SELECT * FROM INVOICES;
Micro-processors since the Intel 4004 have survived with a single Central Processing Unit (CPU) that stepped through commands in strict military time as sounded by the clock. We started with Hertz, then x1000 to Megahertz, and now x1000 again to Gigahertz. A big leap has been multiple “cores”, or separate CPUs, added to the Intel processor line, along with the VT-x virtualization extensions. With multi-core processors being the current “thing”, and with Intel talking about their processors having up to 32 cores in 2010, performance is going horizontal (more CPU cores) as well as vertical (GHz clock speeds). The drummer in that military band must be getting sore arms beating that fast!
PowerPC and other RISC-like processors started to split their processing commands to internal co-processors. This is like delegating the “difficult” jobs to specialized underlings: for instance, maths to Floating Point Units (FPUs). To gain performance during the latter days of the PowerPC, Apple started adding multiple processors to split the workload. Smart software could take advantage of these to speed up heavy processing tasks.
Multiple cores in the same processor bring this same philosophy to everyone – including low-end MacBook laptops. Few applications scale well across multiple cores: one that does is virtualization.
How does this relate to virtualization? Now that we have multiple cores, one potential use is to run separate operating systems on them, each running at full speed. Veterans in the Mac world will remember the first versions of non-Microsoft VirtualPC. This software emulated (and dynamically recompiled!) Intel processor instructions on the PowerPC, like a human language translator. Translating takes time, and therefore the operating system just didn’t feel snappy. Now that we have Intel multiple cores, the OS world is the user’s oyster, as translation is no longer required.
As virtualization support is in the hardware, software that has been unique can be commoditised. This has drawn VMware and Microsoft into a war for the “Virtual Server Platform”, with the first battle fought on the desktop to grab hearts-and-minds. The dollar returns for these organizations lie with “ownership” of the server platform.
Microsoft has released VirtualPC 2004 as a free product, with VirtualPC 2007 also to follow for free. This is a parry to VMware’s thrust of VMware Player. VirtualPC comes from a product started by Connectix. An announcement was made at a recent MacWorld that Virtual PC for Mac would support Intel-based Macs in the future. The Channel9 team at Microsoft have an interview with the engineers of VirtualPC, which provides an excellent backgrounder on the history and technology of virtualization.
EMC, and specifically their VMware division, has VMware Server and VMware Player available for free. Conveniently, VMware has links to pre-created VMware images, or “appliances” as they name them.
Server appliances: one of the VMware images available is a Firefox Browser appliance. A self-contained operating system with user interface (Ubuntu) and Firefox – pre-installed, ready to go. We are all waiting for VMware for the Intel-based Macs.
The recent squabble over the Xen virtualization environment between Microsoft and VMware, and recent comments by Novell, are an indicator of the importance of server virtualization. Many large server hardware companies must be concerned about the impact on their business as server hardware is consolidated.
Large systems and IT shops with many physical servers and a need to support legacy server applications benefit from virtualization today. How about desktop environments?
One of the first applications I installed on my MacBook Pro was Parallels Desktop for the Mac. There are a few Windows-only applications I need to use, and within 30 minutes I was sold (and yes, I purchased a copy). I use Windows XP daily; Parallels has also been useful in seeing what this commotion about Ubuntu has been about, and in doing a test install of Debian prior to its production install as our home server.
Instead, if there were a lightweight OS with everything pre-configured, installed and security-checked, a developer could simply download the VirtualPC/VMware image and start.
It would be cool if organizations that sell and develop server tools provided a “pre-installed” image rather than a myriad of inter-linked installers and required dependencies.
Rather than building a large, complex OS with multiple software sub-installs, commercial server applications should move to a VMWare/VirtualPC image for double-click launch-and-go.
In a recent Ruby on Rails podcast interview: “It’s about breadth, not depth”. As processors go multi-core, applications need to scale horizontally, not expect more processor speed to magically do the work. This has an impact on dynamic/scripted languages deployed as web applications. SOA, too, is about these applications connecting to each other across different servers.
Virtualization, as the processors go madly horizontal with multiple CPU cores, is going to be a large part of our future: not only on the server, but also on the desktop.
Download Parallels (if you are on an Intel Mac) or VMware Player / Microsoft VirtualPC, give it a whirl, and experience the beginning of our virtual future. Burn those cycles!
Bedtime Reading