The Ubuntu Way
May 16th, 2010 under OSS, rengolin, Software, Unix/Linux. [ Comments: 2 ]

It’s been five years since I switched from Debian to Ubuntu, primarily for the more up-to-date software and the radical changes in the user interface, and a few things have been constant all this time. On Debian, I always ran the unstable branch. It was the obvious choice for the non-mission-critical desktop environment I needed. But even the unstable branch lacked a certain appetite for risk, which sometimes left me compiling (or downloading binaries for) applications myself, working around the package management system.

With Ubuntu, it’s the exact opposite. The ongoing lack of support for nVidia and ATI boards, PulseAudio and the new Plymouth splash screen are good examples of major failures in deploying technology that is still too young to be in a distribution, especially a Long-Term-Support one. The recent rumours about replacing Firefox with Chrome point to an even more critical change, since the whole ecosystem around Firefox (add-ons, plug-ins, bookmarklets, etc.) cannot easily be migrated to Chrome or any other major browser. But this is all part of the Ubuntu Way.

Identity

Ubuntu, like many other Linux distributions (especially Debian), has built its identity around the OS that most of its users share. It’s organic, and grows with time and user feedback, combined with the directions the “board” takes on what goes in and what goes out. The original Linux community (back in the mid-90s) was fairly homogeneous in that respect, with most distributions being yet another collection of packages, be it RPM, DEB, tarballs or anything else. With time, strong opinions drove some distributions apart and specialized others. Debian, for instance, became overly preoccupied with licensing (nothing but open source was allowed), while RedHat became more enterprise-focused, flooded with third-party libraries, commercial products and a licensing scheme that looked more like Microsoft’s than anything else.

Still, within the Debian community, some people (like me) thought that the release cycle was too long and the licensing policy too narrow to produce a genuinely useful desktop replacement for commercial systems like MacOS. Indeed, after a few releases, Ubuntu has shown that it can replace them for most uses and most users. I, as a Linux user of so many years, welcomed the ease of use of MacOS without the lock-downs and the lame packaging system.

But they went further, and decided to be very (very) much like Apple. Initially, the Linux way was to offer everything available for everything. There were dozens of instant messengers, browsers, picture viewers, consoles, etc., all installed by default (or to be picked from a selection of thousands of packages during installation), which was a major pain. More recently, Ubuntu has offered an installation process easier than Windows’ or MacOS’, with a single default option for every application type. That, in essence, is what has become the Ubuntu identity.

Taking Risks

To keep that identity, and still progress as fast as they (and I) would like, one has to take risks. I have to say that, for the most part, they were spot on. Some failures (as mentioned) are to be expected; taking risks means living with the consequences. For a company with such a tight budget (and such high expectations), there is little they can do differently. With a bigger budget, they could spend more time adapting the proprietary graphics drivers and the update system (which never works on fine-tuned machines), but they don’t have one. And based on how upgrades work on Windows and MacOS (i.e. they don’t), I’m not surprised by Canonical’s failures.

I like Firefox, ALSA and Pidgin, but if the overall experience is more stable (and complete) with Empathy, Chrome and PulseAudio, so be it. We’re past the point of complaining about personal preferences in favour of a wider viewpoint. I’m too old to rant about how pitiful the new splash screen is when using the ATI proprietary drivers; I just want to install and run. As long as my VIM is working and there is a browser and an IM to use, I’m happy. I don’t care that Gimp is not included by default; I do dislike that GCC is not, but I understand the reasons and always install it first thing on a new system.

That’s the Ubuntu identity, and those are the risks Canonical takes to move the desktop experience forward. As Debian unstable users used to say, that’s the risk of living on the edge…

Upgrades never work

So, I stated that upgrades never work on fine-tuned machines, and that has been my experience to this day. In the beginning, I thought it was because Ubuntu was still immature, but today I had to roll back the Lucid installation I did yesterday, because of major incompatibility issues, mainly with the ATI proprietary graphics driver (the splash screen and resuming from sleep).

So far, the only way I can upgrade Ubuntu is to install a complete new copy every time, and apply the backed-up configuration changes manually after all is done. It may seem like a lot of work, but every time I try to upgrade, I end up installing from scratch and applying the few manual tweaks later anyway. Now that I know exactly what to change and where (after years of doing it), it takes me roughly 15 minutes to customize a fresh install.

My setup requires zero maintenance and little backup disk space, and makes the installation process easy. The magic is simple.

Preparation

This is something I recommend for any system, be it Linux, Windows or MacOS: split the disk into at least two partitions. One, around 50-80 GB and preferably the first (a primary partition), for your system. The other(s), taking up the rest of the hard drive, for your data/home directories. If using Linux, of course, reserve some space at the end for swap (4 GB is more than enough, even if you have that much RAM or more). Swap is a safety measure, not something to be used under normal circumstances.
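As a sketch, the layout above expressed as an /etc/fstab excerpt would look something like this (device names and exact sizes are illustrative; yours will differ):

    # /etc/fstab (excerpt) -- illustrative device names
    /dev/sda1  /      ext4  errors=remount-ro  0  1   # 50-80 GB system partition
    /dev/sda2  /home  ext4  defaults           0  2   # rest of the disk for data
    /dev/sda3  none   swap  sw                 0  0   # ~4 GB swap at the end of the disk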

Daily Usage

Back up your home directory often, including personal configuration files, IM history, panel shortcuts, everything. Apart from your data, the rest might cause some complications when upgrading the user environment (Gnome, KDE), but that’s minor and easily overcome. It will help you if things go awry during your update/replace process. A cron job, or a script you invoke manually, is recommended for this.
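A minimal sketch of such a script, assuming a backup disk mounted at /mnt/backup (all paths here are illustrative):

    #!/bin/sh
    # backup-home.sh -- mirror the home directory to a backup disk.
    # -a preserves permissions and timestamps; --delete removes files
    # from the mirror that no longer exist at the source.
    rsync -a --delete /home/youruser/ /mnt/backup/home/

To run it nightly at 02:00, add a line like this via crontab -e:

    0 2 * * * /home/youruser/bin/backup-home.sh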

Also, remember to back up (manually, by copying) every system configuration file you change. Since most configuration on Linux lives in text files, that part is very easy. It should be done manually because you shouldn’t be changing that many configuration files in the first place, and copying them by hand lets you compare, in detail, what’s there against what you want to replace or add. This will be important for your post-update process.
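For example (the files below are just the usual suspects; back up whatever you actually changed):

    # keep a copy of each system file you touch, mirroring its path
    mkdir -p ~/etc-backup/X11
    cp /etc/fstab ~/etc-backup/
    cp /etc/X11/xorg.conf ~/etc-backup/X11/

    # after a fresh install, compare before re-applying anything
    diff -u /etc/fstab ~/etc-backup/fstab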

Additionally, any non-essential data can be moved to a shared disk (with appropriate backups), accessible over the network. This way, not only do you avoid backing up all your data (photos, videos, documents) on every install (which could take days), but the files also remain available from other computers while you upgrade your machine, so you can keep working on them as soon as it is ready.
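If the share is exported over NFS, for instance, a single fstab line on each client is enough (server name and paths are assumptions; the nfs-common package must be installed on Ubuntu clients):

    # /etc/fstab (excerpt) -- mount the file server's shared export
    fileserver:/export/media  /mnt/media  nfs  defaults  0  0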

Upgrade

Upgrades never work, especially if you have changed the configuration. Some systems evolve and can’t read their old configurations properly, new systems won’t read other systems’ configurations, and migration scripts never work properly on modified files. What’s worse, as Ubuntu has its own identity, the new systems will work better (or only work) with other new systems, so the integration between the new systems and your old, modified ones will most likely fail silently. PulseAudio is the best example of that conflict.

To upgrade, simply re-install the new version from CD (USB, or whatever) into the OS partition. So far, they have managed to make the move to the new systems pretty easy, provided you discard your old ones. Empathy imports Pidgin accounts (and history), and all the basic systems are properly configured on a fresh install. As wireless network passwords, panels, personal shortcuts and other configurations are stored in your home directory, you just have to log in to see your old desktop, just the way it was.

The few things that aren’t installed by default (like GCC, VIM, gstreamer plugins) are easy to add if you keep a list of the things you always install in a file in your home directory, such as the build-essential and ubuntu-restricted-extras bundles. VPN, printer and share configurations can easily be copied over from your backup as soon as the install is done, and an apt-get upgrade will fetch everything released since the CD was pressed.
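A sketch of that post-install step, assuming the list lives in ~/packages.txt with one package name per line:

    # ~/packages.txt contains, for example:
    #   build-essential
    #   vim
    #   ubuntu-restricted-extras
    xargs sudo apt-get install -y < ~/packages.txt

    # then fetch everything released since the CD was pressed
    sudo apt-get update && sudo apt-get upgrade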

Roll back

What’s best about this strategy is that rolling back is extremely easy. You can’t roll back a dist-upgrade using apt, but you can safely re-install the previous CD if the new release breaks things so badly the system becomes unusable. The new Ubuntu is a case in point: it’s still bad with the proprietary graphics drivers, and the open source ones are not nearly as good. So I just rolled back and will wait until it stabilises.

Instabilities occur most often in Long-Term-Support releases (like the current one). It might seem odd, but the reason is simple: they commit to three to five years of support, so they must ship new software that will last that long. The lifetime of open source projects is not great (still, longer than many commercial products), and a five-year commitment to software that is already five years old is a big risk. The Ext3 and Ext4 filesystems are a good example of this.

So, instead of shipping the stable components, they radically change the interface and subsystems and wait for them to stabilise, hoping that having the release in production will push developers to speed up the fixes. While not optimal for users, it’s more or less the only way to go without breaking the promise of support if an application dies. This is why enterprise Linux is so expensive: companies require stability as well as support, and ultimately the distribution companies end up maintaining some dead applications for years, if not decades.

Not only is rolling back easy, but so is changing distribution entirely. As your data is distribution-agnostic (Linux-centric, not package-system-centric), you can re-install virtually any other Linux distribution, as many times as you want, and keep the same look and feel.

Conclusion

In summary, it might look more complicated to use and maintain, but it’s not. Once your setup is done (partitions, backup scripts), the rest is pretty easy and quick. So far, I have stubbornly attempted the upgrade on every release (since 7.04) to check whether it’s still harder than re-installing, and it has been the case every time.

Also, if you have nVidia or ATI graphics boards, never upgrade within a month of the release. I recommend upgrading at least two or three months later (mid-cycle), by which time most vendors will have caught up with the new Ubuntu Way.

Lastly, as I normally fine-tune my computers, I have never had a successful migration of any operating system to this day. I always try the upgrade, if available, and end up re-installing everything. That has been true with DOS, Linux and Windows since 1990, and I doubt it will change any time soon. It would take an intelligent installation process (which our computers cannot yet run) to do that.

In the far future, it lies, then.


Humble Bundle
May 10th, 2010 under Digital Rights, Fun, Games, rengolin, Software, Unix/Linux. [ Comments: none ]

I’m not one to normally do reviews or ads, but this one is well worth doing. The Humble Bundle is an initiative hosted by Wolfire studio, in which five other studios (2D Boy, Bit Blot, Cryptic Sea, Frictional Games and the recently joined Amanita Design) bundled their award-winning indie games together with two charities (EFF and Child’s Play), and you pay whatever you want, to be shared amongst them.

All games work on Linux and Mac (as well as Windows), are of excellent quality (I loved them) and, bought separately, would cost around 80 bucks. The average purchase price for the bundle is around $8.50, but some people have already paid $1,000. Interestingly, they’re now breaking the average down per platform, and Linux users pay $14 on average while Windows users pay $7, with Mac in between. A clear message to the professional game studios out there, isn’t it?

As for the games, they’re the type that’s always fun to play and doesn’t try to be more than it should. There are no state-of-the-art 3D graphics, blood, bullets and zillions of details, but they’re solid, consistent and plain fun. I already had World of Goo (from 2D Boy) and loved it. All the rest I discovered with the bundle, and I have to say I wasn’t expecting them to be this good. The only bad news is that you have just one more day to buy it, so hurry and get your bundle while it’s still available.

The games

World of Goo: Maybe the most famous of them all; it’s even available for the Wii. It’s addictive and family-friendly, with many tricks and very clever levels to play. The concept is very simple: balls stick to other balls, and you have to reach the pipe to save them. But what they’ve done with that simple concept is a powerful and very clever combination of physical properties that give the game an extra challenge. What most impressed me was the way physics is embedded in the game. Things have weight and momentum, sticks break if the momentum is too great, some balls weigh less than air and float, while others burn on contact with fire. A masterpiece.

Aquaria: I thought this would be the least interesting of all, but I was wrong. Very wrong. The graphics and music are very nice and the physics is well built, but the way the game builds up is the best part. It’s a mix of Ecco and Loom, where you’re a sea creature (a mermaid?) who has to sing songs to gain powers or interact with the world. The more you play, the more you discover and the more powerful you become. Really clever, and a bit more addictive than I was expecting… ;)

Gish: You are a tar ball (not the Unix tar, though) and have to go through dangerous tunnels to find your tar girl (?). The story is silly, but the game is fun. You can become slippery or sticky to interact with the maze and with elements that have simple physics, which adds to the fun. There are also some enemies to make it more difficult. Sometimes it’s a bit annoying, when it depends more on luck (getting the timing of many things right in a row) than on actual logic or skill. The save system is also not the best: I was on the fourth level and asked for a reset (meaning to restart the fourth level), but it reset the whole game and sent me back to the first level, which I’m not playing again. The music is great, though.

Lugaru HD: A 3D Lara Croft-style bloody kung-fu bunny. The background story exists more out of necessity than relevance. The idea is to go around skirmishing, cutting jugulars, sneaking up on and knocking down characters as you go along. The 3D graphics are not particularly impressive and the camera is not innovative, but the game has some charm for those who like a fight for the sake of fighting. Fun.

Penumbra: If you like being scared, this is your game. It’s rated 16+ and you can see very little while playing. But you can hear things growling and your own heart beating, and the best part is when you see something that scares the hell out of you, and you panic and give away your hideout. The graphics are good, simple but well cared for. The effects (blurs, fades, night vision, fear) are very well done and in sync with the game and story. The interface is pretty simple and impressively easy, making the game much more fun than the traditional FPSs I’ve played so far. The best part is that you don’t fight; you hide and run. It reminds me of Thief, where fighting is the last thing you want to do, with the difference that in Thief you could fight; in this one, you’re a puss. If you fight, you’ll most likely die.

Samorost 2: It’s a Flash game; that’s all I know. Flash is not particularly stable on any platform, and on Linux it’s especially unstable, so I couldn’t make it run on the first attempt. For me, and most gamers I know, a game has to just work. This is why it’s so hard to play early open source games: you’re looking for a few minutes of fun, not for an excuse to fiddle with your system. I have spent more time writing this paragraph than trying to play Samorost, and I will only try it again if I upgrade my Linux (hoping the Flash problem goes away by itself). A pity.

Well, that’s it. Go get your Humble Bundle; it’s well worth it, plus you help some other people in the process. Helping indie studios is very important to me. First, it levels the playing field and helps them grow. Second, they tend to be much more platform-independent, and decent games for Linux are scarce. Last, they tend to have the best ideas. Most game studios license one or two game engines and create dozens of similar games with them, hoping to get more value for their money, and they tend to stick with the ideas that currently sell instead of innovating.

By buying the bundle you are, at the very least, helping to have better games in the future.


 

