The Death of the Desktop (a video panel discussion)

In the last couple of years, I find myself returning time and again to discussions about cloud computing: what it means, where it's going, and what the implications are for businesses and end users alike. Meanwhile, many of you reading this have happily upgraded your Ubuntu Linux systems to release 10.04 (aka the Lucid Lynx), complete with what some argue is the best Linux desktop ever. I tend to run Kubuntu, which essentially means that I'm using the KDE desktop instead of GNOME (though I do use that as well). On Tuesday, KDE released an update to its 4.4 software compilation (4.4.4), and I happily updated my own system from the Kubuntu repositories. It's a beautiful thing, and with each update (and that includes GNOME, by the way), I grow more and more attached to my personal desktop environment. And yet, there are those who claim it's pretty much over for the desktop as we know it.

All this flows very nicely into the discussion you are about to watch.

At this year's COSSFEST in Calgary, I took part in a lively panel discussion titled "The Death of the Desktop", essentially a free-for-all on whether 'the cloud' will kill off our beloved desktop. My fellow panelists included Aaron Seigo, Brad "Renderman" Haines, Adam McDaniel, and Craig McLean. The whole thing was taped and later edited (wonderfully, I might add) to create the video you see below. Given that this was the last panel of the two-day event, and that they gave us beer to drink during the discussion, I should probably offer up a little disclaimer before you proceed. If you are sensitive to such things, it's entirely likely that you'll hear the occasional 'bad word'. I'm just saying. You've been warned.

Enjoy the show!

So, what do you think? We've had our say; now it's your turn. Enter your comments below.

Until next time . . .

Comments

Death of the desktop? Don't be absurd!

The very idea that cloud 'computing' will spell the death of the desktop is as absurd as the idea, bandied around in the 1970s, that the 'computer age' would herald a 'paperless society'. The fact is that the desktop remains an important interface for many, if not most, computer users and programmers. Another point is that a stand-alone machine is still relevant and valuable to many people, often far more valuable than an internet-dependent system.

I'm sure that there are many who are so uninformed or so naive as to believe that their personal information will be safe in a cloud; they already exist aplenty: just look at Facebook, Myspace, MSN, etc., to witness the idiocy of people who are fooled into believing that every moment of their lives is of interest to the rest of the world. These are the people who won't think twice about sticking their 'souls' in a cloud. It should be quite obvious by now that I don't subscribe to any of this: I have real friends, made of flesh and blood, and I don't do online banking.

I totally mistrust the cloud, and I would no more entrust any of my valuable personal information or documents to it than I would ask a total stranger to look after a large sum of cash for a few days. Apart from the possibility of information being held for ransom, the cloud is surely a paradise for pirates, virus writers, hackers and cyberpunks. While I have no doubt that there's an immense amount of security programming going on, it may take only one badly written programme, acting in effect like a 'virus', to bring an entire system down; it doesn't even have to be deliberately written malware. The recent incident with McAfee's 'security update' is an obvious case in point.

The big ethical issue here, perhaps, is whether 'corporations' or governments will attempt to impose the cloud under some cleverly worded pretext. It could be done quite simply, by restricting the hardware available for purchase.

Cloud? No thanks! Which reminds me: I must remove Ubuntu One from my installation; it's wasting my resources!


No, the "cloud" is ...

just a clever marketing term for a publicly accessible Internet server.

The "cloud" gives cover to a corporate plan dreamed up to cut development costs and to lock consumers into a corporate subscription revenue stream, along with additional monthly income stream for storing user data on an Interrnet server, both costs gradually but continually increasing. Eventually the cloud will convert to a gradually escalating pay-per-use cost, while their pay-per-mb/month data is kept hostage via proprietary formats to ensure that the consumer remains chained to the application, which guarantees the revenue stream.

The standard development model is to create applications which are sold, shrink-wrapped, unit by unit through the retail channels. It is an expensive process. Fixing bugs or patching security holes requires that the consumer download and install those fixes. Many do not, resulting in consumer horror stories which reflect poorly on the quality of the products and on the corporation. "Patch Tuesday" was created to work around that consumer negligence, but it is a cost center, not a profit center. Switching to subscription access to applications residing on an Internet server will change that.

The corporate software house, instead of trying to get millions of application users to upgrade their applications, only has to upgrade the executable residing on the Internet server. That's millions of updates versus hundreds, and it's all done from their end. It also makes it easy for them to regularly modify the file formats and convert the data, making it essentially impossible for FOSS developers to reverse engineer those formats and help consumers export their data to FOSS file formats.

All in all, the "cloud" is a WIN-WIN for corporations and LOSE-LOSE for the consumer.

Equating FOSS repositories with "clouds" is disingenuous, to say the least. Users do not have to pay for access to the repositories, and the repositories are not the only sources for the applications or for their source code. Further, the repositories are not used to hold consumer data. The comparison is also disingenuous because the application runs on the consumer's local hardware, NOT as a client-server application on a "cloud".

The "cloud" fails at another level, too. Most Federal, state and local governments cannot store citizens financial data, especially from the IRS, on servers which grant public access. I'm retired now, but while working for a state department of revenue agency I wrote software to compare state income tax filings with federal filings in order to identify discrepancies greater than some arbitrary value, usually $1,000. After the IRS caught some of its OWN employees examining movie star incomes without a valid reason they pass regulations on EVERYONE ELSE that required I keep a log and record what was on my screen at the moment anyone walked by my cubical and what their badge ID was. The IRS data had to be stored on servers under physical lock-and-key control of the department.

Finally, the "cloud" will fail for corporate users too. Every technology follows the development curve down to the commodity level. That's why most produced sold in America are made in China, under sweat-shop, or worse, conditions. The Internet servers that corporations would use for their "clouds" will follow that same curve to the lowest cost basement. Those servers will end up in countries where the IT staff works for peanuts and their lives are controlled by dictators. When an agent for a dictator walks into a building housing some corporate cloud and demands that they be giving a mirror image of the drive do you think the any of the staff will say no? Corporations will soon find themselves competing against other corporations who obviously obtained access to all of their corporate data. In other words, eventually the "cloud", regardless of its short term profit advantages, will spell corporate suicide.

Cloud vs local performance

One important thing that was not mentioned in the cloud vs. local application performance debate is latency.
No matter how fast the network is, it doesn't change the speed of light.
You can pump gigabytes per second, but if the server is 3,000 km away, the speed-of-light round trip alone adds roughly 20-30 ms to every request/response turnaround, and real-world routing adds more on top of that.
In these terms the Avatar movie is bullshit, because it doesn't take this into account. The only way to get a real experience is to copy your brain's contents and act locally.
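To put numbers on that, here's a back-of-the-envelope sketch; the 3,000 km distance is the example above, and the ~2/3 c figure for light in optical fibre is the usual rule of thumb:

    # Round-trip latency floor imposed by physics, ignoring routing,
    # queuing and protocol overhead.

    C_VACUUM = 299_792            # km/s, speed of light in vacuum
    C_FIBRE = C_VACUUM * 2 / 3    # light in optical fibre travels at ~2/3 c

    distance_km = 3_000           # one-way distance to the server

    rtt_vacuum = 2 * distance_km / C_VACUUM
    rtt_fibre = 2 * distance_km / C_FIBRE

    print(f"vacuum round trip: {rtt_vacuum * 1000:.0f} ms")  # ~20 ms
    print(f"fibre round trip:  {rtt_fibre * 1000:.0f} ms")   # ~30 ms

Twenty to thirty milliseconds may sound small, but an interactive application that needs several round trips per user action feels it immediately.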

