Archive for the ‘Technology’ Category

It’s Windows, Stupid!

Tuesday, October 23rd, 2007

Well, it’s no surprise to those of us who have recently switched to the Mac that Apple blew past its forecasts yesterday. Sales of the Mac have nearly doubled in the last year. Pundits appear mystified by this phenomenon and are coming up with all sorts of convoluted explanations, like the ones in this post on CNET.

To me, it’s plain and simple – frustration with Windows has reached the proverbial tipping point and the best alternative on the market is the Mac …

Goodbye Windows, hello Mac!

Tuesday, October 16th, 2007

About three months ago, after going through yet another uphill battle with Windows on my laptop, I finally decided to call it quits and switch to a Mac with OS X. Since then, I have been surprised to learn I am not alone. Several people have told me they switched to the Mac recently because of the increasing flakiness of Windows.

It seems like the very reasons that made Windows and the PC platform such a runaway success are now haunting them. The vast ecosystem that exists around them has clearly led to a lot of innovative stuff being developed, but it also creates a huge combinatorial explosion: pretty much every machine ends up with its own unique configuration of software and hardware, much like an individual’s fingerprint or DNA. The trouble is that most of these configurations have never been tested and are therefore highly fragile and failure-prone.

It seems like Linux on the desktop will likely suffer the same fate.

At least in the short term, the Mac being a somewhat closed system seems to help. In the three months I’ve owned mine, I’ve had to do one update and subsequent reboot, which is pretty amazing given that I was averaging about two reboots a day with Windows!

I hope for my own sake that the Mac does not become too successful – shhhh, don’t tell anyone you saw this :-)

Microsoft’s economic drag

Saturday, June 2nd, 2007

If you’re like me and most other technology professionals, you’ve been using some version of Microsoft Windows for some time now. And it is also likely you’ve experienced countless blue screens, freezes and reboots courtesy of Microsoft through this period.

Last night, while waiting for a reboot to complete after my screen froze for the millionth time, I decided to do some calculations on the economic loss being caused by Microsoft. Here is what I came up with:

  • 4 crashes/week @ 15 minutes lost/crash (reboot, recovery of context, lost data) = 1 hour lost/week
  • 12 years of using Windows results in 624 hours lost (I was fortunately in grad school using Unix before then)
  • Using an average wage of $60 per hour results in $37,440 lost

I’m purposely being very conservative with all the numbers above. Besides the system crashes, there are the umpteen update-related reboots and instabilities: the security patch du jour, the network on the blink, a program that suddenly stops working, etc. Furthermore, I have suffered through 3-4 major system crashes (corrupted registry and the like) over this period. Let’s say all of this contributed another 125 hours, or $7,500 lost. This gives us a grand total of about $45,000 lost over 12 years, or close to $4,000 per year. Wow!
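For anyone who wants to plug in their own numbers, the arithmetic above is trivially scriptable. Here is a quick Python sketch; all the constants are just the post's own estimates, so change them to match your own experience:

```python
# Back-of-the-envelope cost of Windows instability, using the
# (conservative) figures from the post above.

CRASHES_PER_WEEK = 4
MINUTES_LOST_PER_CRASH = 15   # reboot + recovering context + lost data
WEEKS_PER_YEAR = 52
YEARS_ON_WINDOWS = 12
HOURLY_WAGE = 60              # assumed average professional wage in USD

hours_per_week = CRASHES_PER_WEEK * MINUTES_LOST_PER_CRASH / 60   # 1 hour
routine_hours = hours_per_week * WEEKS_PER_YEAR * YEARS_ON_WINDOWS  # 624
routine_cost = routine_hours * HOURLY_WAGE                          # $37,440

major_crash_hours = 125       # the 3-4 major meltdowns (registry corruption, etc.)
major_cost = major_crash_hours * HOURLY_WAGE                        # $7,500

total = routine_cost + major_cost
print(f"{routine_hours:.0f} routine hours lost -> ${routine_cost:,.0f}")
print(f"Grand total: ${total:,.0f} over {YEARS_ON_WINDOWS} years "
      f"(~${total / YEARS_ON_WINDOWS:,.0f}/year)")
```

Scaling the per-person total by headcount gives the corporate numbers discussed below.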

Of course, some people will argue that Windows has enabled a plethora of technologies over the years that have greatly enhanced productivity. I don’t buy this argument for even one second: these technologies would have been created anyway, and if anything, Microsoft has probably slowed the pace of innovation with its monopolistic and predatory practices.

I wonder if corporations have done this type of math – a large corporation with tens of thousands of employees is looking at tens or even hundreds of millions of dollars in lost productivity. Summed across global corporations, it probably adds up to tens or even hundreds of billions of dollars – a sizable chunk of the global economy.

Amazing what an effect a stable alternative to Windows could have on the global economy. Desktop Linux – where art thou?

Google Developer Day Report

Thursday, May 31st, 2007

Spent the morning and part of the afternoon at Google Developer Day.

Google sure knows how to throw a party! Plenty of free food and drinks, a pool table, bean bags – you get the picture :-)

On a more serious note, they made several new product announcements – Google Gears, Google Mashup Editor and a new version of Google Web Toolkit.

Gears is an open source project (BSD license) done in partnership with Adobe’s Apollo. It has three components:

  • a database (SQLite) for offline data storage/access
  • a local server for storing and serving pages/scripts while offline
  • a utility to create worker thread pools, for any heavy computation on the client (not sure a lot of people are going to need this unless they are doing serious analytics – I’m sure Google is planning some use for it)

Google Reader is the first Google app to be Gears-enabled. The guy demoing the app wanted to show how it works when disconnected and had a lot of trouble getting off the network (he unplugged his network cable and his wireless came on)! I think this is really a sign of things to come. I personally believe that putting a lot of effort into supporting disconnected access is a temporary thing and a likely waste of resources; thinking about the interesting possibilities of a continuously connected world is more worthwhile …
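Since the Gears database component is just SQLite, the offline-storage idea it enables is easy to illustrate. Below is a rough Python/sqlite3 analogue, not the actual Gears API (which is JavaScript in the browser); the `feed_items` table and both function names are made up for the example:

```python
import sqlite3

# Rough analogue of the Gears offline-storage idea: mirror server data into a
# local SQLite database while online, and serve reads from that local cache
# when the network goes away. (Hypothetical schema; Gears itself exposes this
# through a JavaScript API in the browser.)

db = sqlite3.connect(":memory:")  # Gears persists to disk; in-memory for demo
db.execute("CREATE TABLE IF NOT EXISTS feed_items (id INTEGER PRIMARY KEY, title TEXT)")

def sync_from_server(items):
    """While online: mirror the latest server data into the local cache."""
    db.executemany("INSERT OR REPLACE INTO feed_items VALUES (?, ?)", items)
    db.commit()

def read_items():
    """While offline: serve reads straight from the local cache."""
    return [row[0] for row in db.execute("SELECT title FROM feed_items ORDER BY id")]

sync_from_server([(1, "First post"), (2, "Second post")])
# ...network cable unplugged...
print(read_items())  # -> ['First post', 'Second post']
```

The local-server component plays the same trick for pages and scripts that this plays for data.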

Google Mashup Editor was not so impressive – at least in the short demo I saw. To be fair, I should check it out before spouting, but hey, that hasn’t stopped me before :-)  It seemed like a bunch of HTML/XML hacking is required, with lots of sleight of hand in the demo to make it look easy – the guy clicked on links titled step 1/2/3 and voila! more HTML/XML appeared in the editor pane …

I’ve never been a real fan of Google Web Toolkit. I tried it sometime last year and it sucked! There was a lot of stupidity with carefully naming interfaces/classes/methods a certain way to make sure everything lined up. The programming model mimicked the classic GUI development model, which is pretty outdated – I like using markup for describing UIs; to me it’s the single best thing about XML. Apparently a million people checked it out over the past year – no mention of actual uses …

In the afternoon, there was a very interesting talk about Google’s compute infrastructure. Google apparently uses the cheapest hardware it can lay its hands on, and lots and lots of it. Several layers of abstraction make it easy for their programmers to build applications over this massive hardware infrastructure. They seem to have optimized for a class of computing problems: very large-scale query/analysis on entities that are essentially bags of attributes. GFS is their lowest-level distributed storage engine. BigTable provides a structured, table-like abstraction over GFS. MapReduce lets programmers collect attributes of interest from a collection of entities in a first phase (Map) and then analyze the values of those attributes in a second phase (Reduce).
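The Map/Reduce pattern from the talk is easy to sketch. Here is a toy, single-machine Python version (the entity fields and function names are invented for illustration; Google's real implementation distributes the same two phases across thousands of machines):

```python
from collections import defaultdict

# Toy sketch of the MapReduce pattern: the Map phase emits (key, value) pairs
# for the attributes of interest in each entity; the Reduce phase analyzes all
# the values collected for each key. Example analysis: count pages per language.

def map_fn(entity):
    # Each entity is a "bag of attributes"; emit the one we care about.
    yield (entity["lang"], 1)

def reduce_fn(key, values):
    return key, sum(values)

def mapreduce(entities, map_fn, reduce_fn):
    groups = defaultdict(list)
    for entity in entities:              # Map phase (parallel in real life)
        for key, value in map_fn(entity):
            groups[key].append(value)
    return dict(reduce_fn(k, vs) for k, vs in groups.items())  # Reduce phase

pages = [{"lang": "en"}, {"lang": "en"}, {"lang": "fr"}]
print(mapreduce(pages, map_fn, reduce_fn))  # -> {'en': 2, 'fr': 1}
```

Swapping in different map/reduce functions gets you most of the large-scale analyses they described, which is presumably the point of the abstraction.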

Attendance was quite heavy – 1,500 at the San Jose Convention Center venue and 5,000 worldwide. I bailed early; the event was scheduled to go on until late in the evening, ending with a dinner at Google’s Mountain View headquarters. I’m sure the event will be bigger next year, with invited talks as well (this year was Google-only). Looking forward to it …

AJAX/RIA : No Clear Choice

Monday, May 21st, 2007

It seems pretty clear that AJAX/RIA as a concept is here to stay.

It represents the next natural step in the evolution of application architecture. We started with a thin client talking to a single server in the Mainframe era. We then moved to a thick client talking to a single server in the Client-Server era. With the advent of the web, we had a universal thin client (the browser) that could talk to multiple servers. Now, with AJAX/RIA, we are moving towards having thick clients hosted within a universal framework that can talk to multiple servers.

Now, if only we could all agree on what the universal framework is going to be :-)

There are really three major camps I can see:

  • Just the modern-day browser – thick clients all use JavaScript/XHTML/CSS
  • Browser augmented with Adobe’s Apollo runtime – thick clients can use JavaScript/XHTML/CSS or MXML/ActionScript/CSS
  • Browser augmented with Microsoft’s Silverlight runtime – thick clients can use JavaScript/XHTML/CSS or XAML/CLR/CSS

At first glance, JavaScript/XHTML/CSS seems like the logical choice for anyone implementing a thick client because it is the lowest common denominator. However, there are strong motivations to use Apollo or Silverlight – the current browser is not a great environment for hosting thick clients, and security is probably the biggest hole. This is where the confusion sets in: one essentially has to bet on either Adobe or Microsoft. It would be nice if there were some consensus on how the browser as a platform needs to evolve to support running thick clients, instead of having everyone pick sides …

SOA based on WS-* is fundamentally flawed

Thursday, May 17th, 2007

Until now, I thought SOA based on WS-* (see W3C and OASIS) would likely not scale because of the complexity involved in understanding and using the WS-* specs. Now I’m firmly convinced that SOA based on WS-* will never scale, because the entire philosophy driving it is fundamentally flawed. Let me explain.

The idea behind SOA as a concept is loose coupling: a large body of applications that co-operate with each other to produce more powerful super-applications. But if you look at the thinking driving SOA based on WS-*, it is all about tight coupling: a large body of applications that are extremely well co-ordinated to produce more powerful super-applications. Co-ordination instead of co-operation – therein lies the flawed thinking.

Co-operation is a notion that can be made to work on a large scale with a few guiding principles/standards (the web is a wonderful example; others are the movie industry’s standardization on DVDs, the printer industry’s on letter-size paper, etc.). It is not easy to achieve, but it is feasible. Co-operation allows for individuality while enabling a shared goal to be achieved.

On the other hand, co-ordination on a large scale is practically impossible to achieve (the US Army and the New York Philharmonic Orchestra are examples of co-ordination). Co-ordination demands extreme discipline and control; individuals do not matter, only the common goal does.

Everything about SOA based on WS-* tries to achieve co-ordination (the specs themselves, and all the grand talk about ESBs, governance, lifecycle management, metadata repositories, etc.).

Co-ordination is extremely appealing because it promises ultimate efficiency, which is probably why enterprises are taken in by the WS-* SOA sales pitch. In time, when it becomes apparent that they’re chasing a fallacy, people will come to their senses and do something much simpler to achieve co-operation – something that may be messy and not so efficient, but that sure as hell will work.

Patents serve middlemen, not inventors

Thursday, May 10th, 2007

I ran into this article today arguing that patents are a wonderful thing because they reward inventors in a big way and spur more innovation.

I could not disagree more with the author.

Patents reward a whole industry of middlemen way more than the inventors who create them.

Only in rare cases do inventors make big gains. This happens when they have/acquire the business skills needed to translate their invention into commercial success. Someone like Larry Page (quoted in the article) would fall in this category.

The common case is that the inventor receives a small reward, if any, for his/her invention, and the maximum benefit goes to corporate managers, investors and/or lawyers (assuming the invention is worth anything in the first place). If the invention is a commercial success, investors and managers reap the rewards. If the invention is used to sue, lawyers reap the rewards.

Inventors are typically driven by passion – the passion to create. Patents, on the other hand, are driven by greed, nothing but greed.

Shadowy world of cookies

Saturday, May 5th, 2007

For a while now, I’ve been using options in Firefox and IE to prompt me before accepting cookies from web sites. Boy, am I surprised to see what goes on …

I would highly recommend that everyone try this and see for themselves. In Firefox 2, go to Tools -> Options -> Privacy, check the “Accept cookies from sites” box and select “Ask me every time” for the “Keep Until” option. In IE 7, go to Tools -> Internet Options -> Privacy, click the Advanced button, check the “Override automatic cookie handling” box, select the Prompt option for both first-party and third-party cookies, and check the “Always allow session cookies” box. Both Firefox and IE have buttons to clear all cookies; use them to delete all existing cookies. Now sit back and watch the fun!

While I generally do not have a problem with first-party cookies that serve a functional purpose (like holding your shopping cart at an e-commerce site), I find third-party cookies that track your every move extremely offensive. I’ve come to detest companies like Hitbox that make huge sums of money off data collected in a very shadowy fashion. What is really scary is that when I heard a guy from Hitbox speak at a recent conference (see previous post), he talked about how they knew the demographics of the people accessing various sites – how many kids they had, their family income and so on – loads of personal data. They are combining this personal data with people’s surfing habits to sell all sorts of analysis. Talk about Big Brother watching …

If you want to do something about this personally, use the browser settings described above to avoid unwanted cookies (IE has a handy setting to automatically reject third-party cookies, but some of them now circumvent this and act like first-party cookies) and clear all cookies periodically (Firefox has a handy option to do this automatically every time you close the browser).

Microhoo and Adoogle

Friday, May 4th, 2007

Looks like Microsoft and Yahoo are getting pretty serious about cozying up. Pretty much everyone expected these guys to join forces, especially after Google purchased DoubleClick. Another marriage I personally see happening fairly quickly is Google and Adobe – Google has to get a significant foothold on the desktop and a strong presence in the enterprise to compete with Microhoo. Adobe seems the logical choice with its large Flash and Reader footprints, as well as a reasonable enterprise footprint (with its forms business).

It will be interesting to see how this story plays out over the next few weeks …

Silverlight – Not Impressed

Wednesday, May 2nd, 2007

Spent some time today downloading and trying out Microsoft’s Silverlight.

While the download and install were very impressive (2 MB), it seems like this thing has a very long way to go.

I was expecting to see more developer-focused demos but could not find any – just a lot of chartware and some videos. Maybe a sign that Microsoft is trying to be a media company more than a software company … The videos were of pretty poor quality, and I had major issues getting anything to play without lots of long pauses. The list of required tools seems pretty daunting, and almost everything has a fee associated with it.

I think I like Adobe’s Apollo platform much better at this point. The experience is much slicker, the development story is very clean and the desktop integration is powerful.

Ok, I dug around a bit more and I think I just found the developer site for Silverlight – there was an obscure little link from the main site. What is interesting is that the site renders broken in Firefox – oops!