Just a small post to inform you that Linux Mint 8 (aka Helena) is here. Linux Mint is based on Ubuntu 9.10 (aka Karmic Koala), but includes DVD, Java and Flash support in the default installation, meaning that most users won't have to install any extra packages after the initial setup. For those who want extras, a graphical software manager is available. The update system has been revamped: not only does it give you a rating for the impact of most updates, but you can now configure the system to ignore updates completely, as well as configure what information appears in the update manager. It is also easier to perform OEM installs, a good thing if you are installing Mint for someone else. As usual the Linux Mint theme is very polished and looks very elegant. New users can rely on an updated user guide in PDF format to help get them started. On the technical side, you get kernel 2.6.31 and GNOME 2.28, which include quite a few improvements of their own. You can download Linux Mint 8 from here!
I had previously pointed out that the lack of supported platforms was a serious problem for Silverlight, especially when compared to Flash. The root of the problem was that Moonlight, the Linux version of Silverlight, is usually at least one release behind the Windows and Mac versions of Silverlight. This caused confusion for developers, as it was not clear which features would work on Linux. Rather than working to fix the problem, it seems that Microsoft is making it worse by introducing Windows-only features in Silverlight 4.
The need for COM
One of the most widely used APIs on Windows is COM. With it you can access almost anything on a Windows machine, which makes it a very powerful tool for developers. The problem is that it is a technology that only exists on Windows and cannot easily be retrofitted onto OS X or Linux. This left Microsoft with a choice: either give Silverlight developers access to COM, which would greatly increase the usefulness of Silverlight on Windows but fragment the Silverlight market even more, or withhold it and keep a unified supported base to compete with Flash. They chose the first option.
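To make this concrete, here is a minimal C# sketch of what COM access looks like in Silverlight 4, based on the AutomationFactory API from the Silverlight 4 beta. Note that this only runs in trusted (elevated) out-of-browser Silverlight applications on Windows, which is exactly why it fragments the platform; the Excel example is just an illustration.

```csharp
using System.Runtime.InteropServices.Automation;

// COM automation is only exposed to trusted (elevated)
// out-of-browser Silverlight 4 applications on Windows.
if (AutomationFactory.IsAvailable)
{
    // Late-bound COM call: start Excel and make it visible.
    dynamic excel = AutomationFactory.CreateObject("Excel.Application");
    excel.Visible = true;
}
else
{
    // On Moonlight, or in the browser, or on any non-Windows
    // platform, this branch is taken and COM is simply unavailable.
}
```

The `IsAvailable` check is the only portable part: code that skips it will throw on every platform Moonlight supports.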
Silverlight gives the multi-platform market to Flash
What I take from this decision is that Microsoft's objectives with Silverlight have changed. It looks like competing with Flash in the wider, multi-platform market is taking a back seat to the introduction of new functionality. What Microsoft is pushing is Silverlight as the default web-based development platform for Windows, with some limited compatibility with non-Windows platforms. This goes in the opposite direction to Adobe Flash, which seems to favor a consistent set of functionality and compatibility across all platforms. Flash is not only available on Windows, Mac and Linux, but also on the Wii, and an ARM version for smartbooks should be released soon. And that does not even cover Gnash, the open source version of Flash that is more or less to Flash what Moonlight is to Silverlight. In short, Microsoft is giving up the multi-platform market to Adobe.
The impact for the developers
With many Linux-based web devices running ChromeOS in the works for next year and OS X market share on the rise, choosing Silverlight as a web development platform needs to be carefully considered. The developer needs to be fully aware that some Silverlight 4 functionality will not be available if cross-platform support is required (and on the web it almost always is). If Linux support must be assured the situation is even worse, as targeting anything above the Silverlight 2 feature level could break compatibility with Moonlight until late next year. This makes Adobe Flash the safer choice for web development.
Is Silverlight COM support useless?
There is however a case where the COM support in Silverlight 4 will be useful: enterprise development. If your company is a Windows shop, you can use Silverlight 4 to develop very powerful web applications that run straight from the company intranet. You need of course to be sure that the application will not have to be made available to external customers who may use other clients. In these 'intranet' scenarios the addition of COM to Silverlight 4 is clearly a benefit, and it is indicative of Microsoft's intent to reposition Silverlight as an "enterprise" technology as well as a "web" technology.
Even if my Linux netbook is my main machine nowadays, there is one thing for which it can't replace my trusty Windows XP desktop: gaming. I have long been a fan of playing on the PC rather than on consoles because, in my opinion, a mouse is required for the types of games that I like (FPS, RTS, RPG), and there are many more quality games available on the PC in those genres, some of which are even free. The problem is that my recent PC gaming experience has been less than stellar, making me consider consoles more and more as an alternative. Here are some of the main issues with modern PC gaming:
It takes forever before I can play
I hadn't played Battlefield 2 for a few months, but 3 weeks ago I wanted to have a quick game online, only to find that no servers were available. After investigating, it turned out that a 1.5 GB patch had been released, and that I had to download it before I could play online. Granted, the patch adds many maps so it is not all negative, but there was no hope of a quick game: the soonest I would be able to play was the next day. I had a similar issue with Runes of Magic, a free MMORPG inspired by World of Warcraft. I had not played for some time, and when I wanted to have a look there again, it started with several hours of patch downloads and installation (and I am on a 6 Mbps ADSL connection). Although I do like that game publishers add new free content to their games, the update mechanism clearly needs improvement. Either there needs to be a way for that patching to be brought to my attention before I want to play the game (because when I want to play the game, I want to play the game, not patch it!), or the publisher needs to ensure that the game can start while the client is being updated, for example by keeping some servers compatible with older versions of the client.
Selling half-finished software
Recently I started playing League of Legends, a very good free online RTS / RPG crossover that plays a bit like Demigod. The game officially came out of beta earlier this month. The game is great except for a few things: there are only 2 maps to play on (and one of those is still in beta); the rest of the maps have not been released yet. The in-game store that allows you to get "Runes" will only open on Monday 23 November, so that part of the game doesn't work yet. If this were a free game still in beta, that would not be much of a problem, but not only has the game been officially released for a few weeks, the League of Legends Collector Pack, which costs about $30, is already selling on Amazon. It is clear that people are being asked to pay for a game that is only half finished, and that is simply not acceptable.
I need an internet connection to play a single player game
I recently purchased BioShock, which is an excellent single player game. I was surprised, however, to notice that the game requires an internet connection to play. It is not a problem for me, but there are still people without a broadband connection in many areas of the world. Why doesn't the publisher give the option to use either the internet or the DVD to prove that you actually own the game? It is not as if these "protections" will prevent pirates from copying the game anyway, so why cause a problem for those of your customers who don't have easy access to the internet?
It's not all bad
There are still a lot of advantages to PC gaming. As mentioned, a lot of games are free or cheap, extra content can easily be added, the mouse and keyboard interface is a must for some games, and online play is often free. But publishers active in the PC gaming market should not fall asleep at the wheel: if the issues mentioned above are not fixed, PC gaming will cease to be the platform of choice for a lot of gamers.
There seem to be quite a few concerns and complaints about recent Ubuntu releases. Are there really that many regressions and instabilities in the latest releases of Ubuntu? Probably! Should we accept that in a production OS? No, but there is something that many people tend to forget: the primary objective of these interim releases is not stability. I think that a lot of people tend to misunderstand the Ubuntu release cycle, and for a good reason: that cycle is not a perfect solution. Let's look at the problem in detail:
Ubuntu 9.04 and 9.10 regressions
No one can deny that the two most recent releases of Ubuntu have been full of problems. The 9.04 release brought a lot of regressions and instabilities with the Intel video driver, which unfortunately is the most common graphics adapter in use. The 9.10 version seems to have its own share of problems, with a lot of people reporting trouble after upgrading in the Ubuntu forums. This certainly discredits Ubuntu as a consumer-ready OS, but the thing is that Ubuntu 9.04 and 9.10 do not aim to be consumer ready; they are merely a rehearsal for the next LTS version of Ubuntu!
Ubuntu's misunderstood release cycle
Let's look in more detail at the Ubuntu release cycle. Every two years we get a Long Term Support (or LTS) release. That release is supposed to be stable, consumer ready and widely used. Currently the LTS release is version 8.04, and there are very few issues with it as long as you install it on supported hardware. In addition, every 6 months you get an interim Ubuntu release. Those releases are not intended for mainstream users but rather for people who want (or need) the bleeding edge in Linux packages, drivers and kernel. They are not meant to be used for extended periods of time, so they have a short support cycle of only 18 months. An LTS release, on the other hand, is supported for a much more comfortable 3 years, and you can upgrade from LTS to LTS without ever having to touch an interim release.
LTS releases are the true consumer Ubuntu
If you take the time to think about it, the message is clear: if you just want to use your Ubuntu computer without having to muck around too much with the OS, just install the LTS release and skip the interims! My MSI Wind running Ubuntu 8.04 is still working fine, but Ubuntu 9.10 would not work on it. Interim releases don't focus on stability and reliability, that's the job of the LTS release; they focus on new features. You are probably wondering why so many people install interim releases and then complain about stability? Well, it is partly ignorance, but also partly a far more sinister issue with LTS.
The problem of Long Term Support releases
There is a major problem with LTS though: if you just bought a brand new computer, chances are that some of the hardware won't work with Ubuntu 8.04. After all, the OS was released more than 18 months ago; in the meantime new hardware has appeared, and it was not possible at the time to include drivers for devices that did not even exist. As an example, I recently purchased an Acer Aspire One for my wife and wanted to replace Linpus Linux with a newer version of Ubuntu or Linux Mint. In the end I used Linux Mint 7 (based on Ubuntu 9.04) because there were too many driver issues with Ubuntu 8.04. It was easier to fix the problems with the Intel display driver in 9.04 than to sort out all the other issues with 8.04. Note that I won't upgrade the machine's OS anytime soon; maybe I will reinstall when the next LTS release is available, if it solves the few remaining issues.
The problem with interim releases
Interim releases have the opposite problem: they include bleeding-edge software and drivers, but these have not been tested by a large number of users, and as a result regressions and breakages are fairly common. Canonical started working on Ubuntu 9.10 six months ago, while Ubuntu 8.04 probably has 2 years' worth of troubleshooting and patching behind it. It is not difficult to guess which release will be the better one as far as stability is concerned. In 3 to 6 months Ubuntu 9.10 will be a lot better, as the biggest issues are fixed by patches, but when that happens people will only talk about Ubuntu 10.04, and most of them will say that it is not as stable as 9.10. In my opinion it takes 3 to 6 months after its initial release for an Ubuntu version to be ready for mainstream users. The problem is that by that date most users consider it outdated.
So is Ubuntu broken?
I don't think so, at least not more than most other Linux distributions. The problem is that we have two kinds of Linux desktops, each with their own problems. On one side you have the sedate LTS releases that are stable and ready for the average user, but may be incompatible with newer hardware and software. On the other side you have the bleeding-edge interim releases with all their problems and breakneck 6 month release cycles. Most problems arise when someone wanting a long term solution (an LTS) is forced to use an interim release instead because of hardware compatibility. Is there a solution to this? Ubuntu could make LTS releases every year, reducing the problem. They could invest more in backporting drivers and applications to the current LTS (although this can be problematic since drivers are part of the kernel). Better driver support from hardware manufacturers could probably help too. In the end there is probably no perfect solution.
Many Linux enthusiasts are despairing at the low uptake of desktop Linux and its poor availability in high street shops. This is especially frustrating because most of the people using desktop Linux consider it to be a superior solution to the Windows-based machines on offer (and it probably is). I think I have fingered one of the causes of this problem though: desktop Linux needs salesmen!
To illustrate this principle I'll use the following anecdote from Rich Dad, Poor Dad:
One day a journalist was interviewing the author of that best-seller. The journalist, being a writer herself, asked the successful author what she should do to produce a best-seller like he did. Much to her surprise he told her: 'You should follow some sales training!' The journalist was shocked and said: 'I want to be a writer, not a saleswoman. Why would I lower myself by studying sales techniques?' The successful author took his book, turned it around and said: 'Here it says that I am a best-selling author, not a best-writing author!'
Now let's transpose that to the world of operating systems: there are many talented developers and programmers working on desktop Linux, but there are very few talented salesmen working on selling desktop Linux. The result: desktop Linux doesn't sell! Of course, it sells to some people, the people "in the know", but it doesn't sell well to the mass market. It doesn't sell in high street shops because no one is selling desktop Linux to the big electronics retail chains. There is no advertising of desktop Linux, so there is no overwhelming demand for it, so the retailers won't stock Linux machines.
Let's try to see this from the point of view of the retailer. What he wants to do is sell as many computers as possible. He can do this in two ways: either he sells a product that many people want, or he convinces people to buy what he has. Now predicting what people want is easy for heavily marketed items like iPods and iPhones, but it is much trickier for computers. When it comes to computer operating systems, a retailer is much more likely to stock something fairly generic and to convince his customers to purchase what he has, even if that is not the best product for that customer, or not exactly what that customer wants.
If we follow the reasoning above what desktop Linux needs is either:
- Salesmen who go "sell" desktop Linux to OEMs first, then to retailers and, to a lesser extent, consumers. This is the "top to bottom", sell-what-you-have approach. The problem is that you need a very efficient selling structure and organization to do that. Ubuntu had some success selling desktop Linux to Dell, and Google seems to be gaining some traction with ChromeOS, but beyond that there is currently not much progress being made.
- A lot of very visible advertising to consumers to generate a lot of consumer demand for desktop Linux. This is the "bottom to top", sell-what-the-customer-wants approach. The main problem is that this requires not only a good marketing organization but also a large advertising budget, things that desktop Linux lacks right now.
The fact is that there are many projects and organizations devoted to maintaining and improving Linux, and a few devoted to the promotion of desktop Linux, but there are almost no organizations devoted to the sales and advertising of desktop Linux. I think that one of the reasons why the Firefox browser is much more successful than desktop Linux is that the Mozilla Foundation invested much more time and energy in advertising and promoting Firefox as a product than most Linux distributions have. As long as Linux distributions focused on the desktop do not put much more effort into their sales and advertising, desktop Linux will remain a "best writing" operating system rather than the "best selling" OS it deserves to be.