Don Dodge, Google and Developers Evangelism

I was just reading over at TechCrunch about Google quickly hiring Don Dodge after he was let go from Microsoft. It seems Don will be doing what he used to do at Microsoft – Developer Evangelism (good for him, and Google!).

I’m very happy to see that Google is putting their stock options and cash where their mouth is to evangelize their APIs, platforms (Android, App Engine) and tools to developers.

A while back I wrote about Google’s lack of outreach to the Israeli developer community, and it is still very visible: judging by job listings, as well as various events and conventions, Microsoft technology still dominates the Israeli high-tech software scene.

I do hope that hiring Don Dodge, and continuing to release tools, SDKs, platforms and even languages such as the new Go programming language, will create the necessary diversification that every monopolized field needs.

I just hope that Google will start doing more than the very simple and shallow Dev Days in Israel and will start reaching out to the community, specifically in Israel. I would like to see a Google I/O event in Israel, and maybe a couple of smaller events that dig down into code and details in a more intimate setting with fewer people.

In general, I would expect Google to start evangelizing in other countries and to have evangelists in every country where they have an office. I would suggest Google learn a bit from MSDN as well as from the Microsoft Most Valuable Professional (MVP) program – these are among the best examples of building a community around core leaders who can drive both the community and Google forward.

Google is still light years away from the well-oiled, well-organized Microsoft evangelism machine, and I hope Don and others will be able to make big leaps to close that gap.

Israeli Shortage in High End Laptops

At Yedda (my day job) we recently ordered 3 new laptops.

Our spec was very specific (that’s how we are ;-) ):

  • Core 2 Duo running at at least 2GHz
  • 2GB of RAM
  • 100GB or larger hard drive
  • WXGA+ screen (1440×900 resolution)
  • 14.1″ screen
  • Dedicated (non-shared-memory) video card
  • DVD burner

The reason we want 14.1″ screens is size and weight (some of us, not me, ride bikes and/or motorcycles to get to YeddaHQ). We also wanted as high a resolution as possible, and WXGA+ seemed very good.

Up until now we mostly used Thinkpads so we obviously checked out the new T61.

Aside from the fact that it was a bit costly (which we were willing to accept), there were no T61 machines in Israel with a WXGA+ screen and a Core 2 Duo running at at least 2GHz; the machines that were available had an integrated shared-memory video card (which is a big no-no!).

We checked out the Dell D630, which matched the same configuration, got good reviews and was, surprisingly, ~$500 cheaper. The only problem was that it had to be specially ordered for us, since Dell Israel doesn’t work the way Dell does in the USA. Dell Israel brings a certain set of models to Israel, and Israelis don’t get the pleasure of having a specific Dell machine built just for them.

Luckily we ordered 3 machines and our supplier was willing to place a special order at Dell UK for us.

The original estimated delivery time was 3 weeks (work weeks, not calendar weeks), which ended today.

As you can figure out from the title, the machines will not arrive today. It seems there is a shortage in Israel not just of Dell high-end laptops but of other brands as well, and I’ve heard of people getting delivery dates in December.

The official explanation for the delay in shipment of our Dell machines was a shortage of LCD screens in the UK; the machines are sitting there, screen-less, waiting for us.

The current expected delivery date out of the UK is the 25th of September. I guess we will just have to wait.

Google Apps for your Domain, DNS, CNAME and Security

I’ve recently started using Google Apps for Your Domain to host my private email on the sandler.co.il domain.

Google Apps for Your Domain is quite cool and was very easy to configure. I moved to it mainly due to the unbelievable amounts of spam; I didn’t have the energy or time to configure SpamAssassin in a way that would actually work.

When I moved, one of the things I did was change the “default” URL that I and other members of my family use to access the domain’s web mail. Google Apps for Your Domain lets you do just that: you set it in the configuration screen and add a CNAME record that points to ghs.google.com.
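For reference, the record itself is a one-liner; sketched here in BIND zone-file style (the “mail” host name is just an example, use whichever alias you configured in the settings screen):

```
; hypothetical alias for the web mail URL
mail.sandler.co.il.    IN    CNAME    ghs.google.com.
```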

After configuring everything I tested it out and noticed something disturbing.

It seems that the custom CNAME address is served over HTTP only, not HTTPS (ghs.google.com does not present an SSL certificate for your domain, so the alias cannot be used securely). The CNAME alias I configured resolves to mail.google.com/a/YourDomain.XXX (replace YourDomain.XXX with your domain ;-) ). If you are not authenticated you’ll be redirected to authenticate on an SSL-protected address (https), but upon successful authentication you will be directed back to http://mail.google.com/a/YourDomain.XXX (not https, not SSL).

This means that when you read or write emails they are not protected. If you are sitting on an open (passwordless) WiFi network, people can easily sniff out your emails and correspondence (I know that not using WPA also makes you prone to man-in-the-middle attacks, but that’s not the issue here). This is just one of the scenarios in which you are vulnerable (there are a few more).

It’s not that accessing https://mail.google.com/a/YourDOMAIN.XXX will not work. On the contrary, it works fine and all the communication is secured using SSL (https).

It seems Google is encouraging recklessness with the current configuration: instead of redirecting authenticated users to the secured (https/SSL) version of their web mail, they send them back to plain http, apparently because of the CNAME setup’s limitations.

It is a simple fix on Google’s behalf which will increase the security dramatically.
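The fix I have in mind is just a server-side rule. Here is a minimal sketch of the idea (the function name and the logic are mine, not Google’s actual code): if an authenticated mail request arrives over plain HTTP, bounce it to the HTTPS equivalent.

```python
def upgrade_to_https(url):
    """Return the HTTPS redirect target if a mail URL arrived over plain HTTP."""
    prefix = "http://mail.google.com/a/"
    if url.startswith(prefix):
        # Redirect the authenticated user to the SSL-protected address
        return "https://" + url[len("http://"):]
    return None  # already secure (or not a mail URL) - nothing to do

print(upgrade_to_https("http://mail.google.com/a/sandler.co.il"))
# https://mail.google.com/a/sandler.co.il
```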

Forgive me Outlook for I have sinned (not)

Forgive me Outlook for I have sinned.

I have been using you as one of my primary communication tools from your very first days. I have stayed within the 2GB PST file limit, but when I was told that Outlook 2003 could hold up to 20GB I rejoiced and thanked you for your kindness.

I still dreaded the old 2GB limitation but decided to look forward to a better future. I therefore installed Outlook 2007 blindfolded, as I have known that each version of Outlook brings its own bliss and helpfulness to the world.

But lo and behold, my mail downloads grew long. Is this thy punishment for my ever-increasing PST? Am I guilty of not enabling “Auto Archive” and of not splitting my PST files?

Perhaps I was still good this year, for you have sent me a savior in the form of this fix.

Seriously now… I discovered this fix via this post on Download Squad, and the response from Outlook’s PM was very annoying:

“Outlook wasn’t designed to be a file dump, it was meant to be a communications tool…There is that fine line, but we don’t necessarily want to optimize the software for people that store their e-mail in the same .PST file for ten years.”

While it may be true that it wasn’t designed to hold ten years of mail, this is certainly not the first or second version of Outlook. If you add up the accumulated usage hours of Outlook across all the people on this planet, it amounts to thousands (if not many more) of man-years.

Do you want to tell me that no MS employees have PST files larger than 1GB? Do you want to tell me that after 5 years of developing Office 2007 and thousands of hours of dogfooding Outlook 2007 within Microsoft, you didn’t check an average user’s PST file to see that it’s well beyond 500MB? I don’t keep 10 years of email in my main PST file (most of that is from the years when the PST was limited to 2GB), but I do have 2 years’ worth and it’s more than 500MB.

It’s you who decided to add RSS feeds into the PST file, which means MORE information is placed inside the PST file, not less. You should have seen that coming. Really.

Perhaps now is the time to chip in and help the new versions of Thunderbird (the Release Candidate for version 2 looks really good), which, combined with Lightning (the project to add calendaring abilities to Thunderbird), could become a decent and usable replacement for Outlook!

Outlook 2007, you have failed me and robbed me of my productive time while I waited for my mail to download. I’m afraid it’s time to pick a fight and make sure that the best PIM software really wins.

Stop the PIM tyranny and join forces to beat the beast. Competition will make it better and we will all rejoice in reclaiming our mailbox as well as our lost Email download time.

Google Apps for Your Domain and Gmail Mail Applet for Nokia phones

I own a Nokia E61 cell phone. A nice phone all in all (aside from the backup problems my wife encountered).

Gmail has this cool little applet that lets me access my Gmail account in a nicer (and better cached) way from my cell phone. It’s a really nice program and I use it quite often.

It has one problem though. If you host your own domain through Google Apps for Your Domain to get the Gmail-like interface for your email, you cannot use this program.

Technically (as far as I could see) the interface is much the same; the only difference should be the user name and password. But there is a restriction on the user name in the mail applet that forces you to enter an email address ending in @gmail.com. It will not accept anything other than a @gmail.com user name.

Google Apps for Your Domain does, however, have an application for BlackBerrys. Not that there is anything wrong with that, but I would really like to have the current nice mail applet working with my hosted Google Gmail application.

I want the normal Gmail applet to work with my custom domain and Google Apps for Your Domain; otherwise I’m forced to use the not-so-nice cell phone browser web mail, which is far less usable than the applet.

Is it too much to ask? I don’t think so, considering that it seems there shouldn’t be any problem supporting it technically (it’s the same backend). If any of you Google Apps for your Domains Googlers are reading this and there is a bigger issue/problem with forcing the mail applet to support Google Apps for Your Domain, I would love to know why (you can even ping me privately through my contact page).

Universal Binaries

Is it just me, or are Universal Binaries for the Mac a world-domination scheme to increase the world’s bandwidth usage?

I know the Apple folks didn’t want people to have to figure out “Do I have an Intel processor or a PowerPC one?” – after all, most people don’t really know what’s inside their machines. But in 99% of cases, when downloading from the web, most sites that provide the software could quite easily tell whether the browser is running on an Intel Mac or a PowerPC Mac by looking at the “User-Agent” string the browser sends.
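To illustrate, a download page could branch on the User-Agent with something as simple as this sketch (the file names are made up for the example; Intel Macs report “Intel Mac OS X” in the User-Agent while PowerPC Macs report “PPC Mac OS X”):

```python
def pick_mac_download(user_agent):
    """Choose the Intel or PowerPC build based on the browser's User-Agent."""
    if "Intel Mac OS X" in user_agent:
        return "MyApp-intel.dmg"
    if "PPC Mac OS X" in user_agent:
        return "MyApp-ppc.dmg"
    # Unknown platform - fall back to the fat Universal Binary
    return "MyApp-universal.dmg"

ua = "Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en) AppleWebKit/419"
print(pick_mac_download(ua))  # MyApp-intel.dmg
```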

I also have another solution: add a patch to older OS versions (and include it in new ones) so that they can look inside the .app file (executable or whatever they call it) and see whether it has the bits for Intel or PowerPC. If it’s the wrong version, the file itself should include a link to the correct version.

This adds a bit of a burden to the creators of the software (they need to provide a link to the Intel version in the PPC version and vice versa, use a specific compiler and compile two sets of the application), but it makes the whole thing a lot more pleasant.

Combine these two methods and you get decreased bandwidth costs for everyone. Only in the worst case, where the web application fails to detect the correct Mac version and the person downloads the version that doesn’t fit his/her Mac, would they end up downloading both.

Luckily, Apple’s worldwide PC market share is still a single-digit percentage, so the bandwidth issue is still small, though it’s probably rising around Silicon Valley ;-)

Amazon Recommendations, Big Giant Collection Books, Reprints and New Editions

I really like Amazon. I really like Amazon’s recommendations, and ever since I entered most of my books into Amazon I get really good recommendations.

There is one thing that bothers me, though.

I recently made a big order from Amazon and included two books I was long overdue in owning and reading: “The Long Dark Tea-Time of the Soul” and “Dirk Gently’s Holistic Detective Agency”, both by Douglas Adams.

After the purchase, Amazon’s recommendations started offering me other Douglas Adams books such as “Mostly Harmless”, “So Long, and Thanks for All the Fish” and “The Restaurant at the End of the Universe”.

I had previously told Amazon that I already own “The Ultimate Hitchhiker’s Guide to the Galaxy”, which is one large book containing all five of the Hitchhiker’s Guide novels (including the three books mentioned above).

Since I own a book that includes those books, I would have figured that Amazon would know that and handle it similarly to how they handle a book that is reprinted or gets a newer edition (usually with minor changes or none at all). The recommendation engine doesn’t, probably because it doesn’t take into account that this one book is a collection of other books.

In addition, due to the Hitchhiker’s Guide to the Galaxy movie the series was reprinted, so there are newer editions out there, which is probably another reason I see these books again.

It’s not that uncommon to have a book that contains multiple titles that were previously part of a series. For example, I also own “The Great Book of Amber: The Complete Amber Chronicles”, which is one big book containing the 10 books of the Amber series by Roger Zelazny (luckily I haven’t told Amazon about that, so I’m not getting recommendations to buy the same books again).

Perhaps Amazon should look into such collection books, as well as handle reprints and newer editions differently.

For example, for leisure reading (as opposed to technical books, whose newer editions often change and add things), I would expect by default not to see new reprints and the like unless I specifically opted in via my settings.

For technical/reference books I would like, by default, to see newer editions, because these (usually) add and update information, and in most cases it’s important to stay up to date or at least know that a newer edition exists.

For paperback vs. hardcover editions, Amazon seems to handle it well: it understands that if I have the paperback edition I don’t need the hardcover recommended, and vice versa. I can only assume they implemented this by saving some kind of reference between those books, so perhaps they should add a new type of reference/link for books that are collections of other books, plus similar links to handle the rest of the cases I’ve mentioned above.
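The kind of reference I have in mind could be as simple as a mapping from a collection to the titles it contains. A hypothetical sketch (the data structure and function names are mine, obviously not Amazon’s real engine) of how the recommendation filter might use it:

```python
# Hypothetical data: which collection titles contain which individual titles
COLLECTIONS = {
    "The Ultimate Hitchhiker's Guide to the Galaxy": [
        "The Hitchhiker's Guide to the Galaxy",
        "The Restaurant at the End of the Universe",
        "Life, the Universe and Everything",
        "So Long, and Thanks for All the Fish",
        "Mostly Harmless",
    ],
}

def expand_owned(owned):
    """Owning a collection implies owning every title inside it."""
    expanded = set(owned)
    for title in owned:
        expanded.update(COLLECTIONS.get(title, []))
    return expanded

def filter_recommendations(recommended, owned):
    """Drop recommendations the customer effectively already owns."""
    effectively_owned = expand_owned(owned)
    return [title for title in recommended if title not in effectively_owned]

owned = {"The Ultimate Hitchhiker's Guide to the Galaxy"}
print(filter_recommendations(
    ["Mostly Harmless", "Dirk Gently's Holistic Detective Agency"], owned))
# ["Dirk Gently's Holistic Detective Agency"]
```

The same link type could carry a flag for reprints and new editions, so the default behavior could differ per book category as described above.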

What do you say? Am I the only book maniac/Amazon maniac/Recommendation maniac out there that thinks about this? :-)

Mac Software Updates – I expected more from Apple

We recently got a Mac Mini for the office so that we can test Yedda better with Safari and, in general, see how Yedda looks, feels and works in the various browsers on the Mac (mainly Safari, Firefox, Camino and Opera).

It’s a cute little machine. I can easily understand why people fall in love with Mac and Apple products in general.

After setting it up and powering it on, I ran Software Update so that I would have the latest, greatest and safest Mac software.

After running it and updating various things I ended up with 3 items that needed an update:

  • Java for Mac OS X 10.4, Release 5 (Version 5.0)
  • AirPort Extreme Update 2007-002 (Version 1.0)
  • iPhoto Update (Version 6.0.6)

When I tried to update them, it downloaded them, but when it tried to install the updates I got an annoying error (I don’t have the error in front of me now, so this is paraphrasing):

“An unexpected error has occurred”

I tried twice and it didn’t work, so I went to the knowledge base articles of these updates and manually downloaded and installed them.

To my “surprise” manually doing it worked like a charm.

Now I know this is a bit of a petty rant, but as a user who never used a Mac full time (the only two Apple products I used full time were an Apple IIc and an iPod), the expectations set by Apple’s marketing machine and others were quite high.

The expectations were high, and my disappointment was about the same height.

I’m not a normal/novice user, so I did know what to do, but I think Apple should have the decency to tell me why the update/installation failed, or at least provide a button or a link explaining what happened (a link or a button would be good, so that it won’t alarm regular users but will give the necessary information to those who know what to do with it).

It’s as simple as that. Really.

Nokia PC Suite Content Copier .nfb / .nbu Fiasco

This is going to be a long rant about the new Nokia PC Suite Content Copier backup file format, and about how the software is NOT compatible with previous versions, with no mention of this anywhere from Nokia (other than a note in the help stating that the backup file format changed).

My wife recently upgraded from her Nokia 6230 to a brand new Nokia E61. She really liked the personal information management (PIM) features and the full QWERTY keyboard.

Before switching to the new phone, she backed up the old phone’s content using the Nokia PC Suite Content Copier and it created a .nfb file which seems to contain all of the information.

After she got the new E61, she wanted to restore the data (mainly the Contacts with their phones and all) to the new phone, so she fired up the PC Suite only to find out that she needed to upgrade to a newer version (v6.82.22).

We upgraded, ran Content Copier again, wanted to restore the files, and then all hell broke loose…

Content Copier pointed to another folder, not the one her previous version used to store the backups (which was the My Documents\My Backups folder). No biggie, so we searched the machine for the term “nokia” and found the place where it kept the backup file (the one with the .nfb extension).

I pointed the Content Copier to that folder (that’s the only thing you can do) and it didn’t recognize it.

A little Googling and a little RTFM, and apparently in the latest version of PC Suite Nokia switched the backup file format to .nbu. Previous versions used two files, one with a .nfb extension and one with a .nfc extension. The new versions use a single file with a .nbu extension which, according to the help (the one provided with Content Copier), contains the data of both the .nfb and .nfc files.

They did not provide any help or any way of figuring out how to restore an old backup, and nothing turned up even after a lot of searching around the web.

There are a few pointers on the web, in the Nokia forums, from others with the same problem (you can search with other terms and find lots more).

Luckily I stumbled upon this program, which can export all of the data from .nfb files into plain text (and also extract the images and videos backed up in the .nfb file). There are also a Perl module and a Python library that can read and write .nfb files.

The best way to overcome the problem was to import the contacts which I now had in plain text into Outlook and sync Outlook back to the phone.

You can import CSV and tab-delimited files into Outlook, but since my wife used the phone’s memory, we now had multiple numbers (such as a cell phone number and a home number) assigned to the same contact (at least for some of them), and Outlook (and Excel, for that matter) had problems figuring out how to map things.

So, I’ve cooked up this little Python script which converted the PHONEBOOK file (the one that contains the contact entries) into a CSV file. I probably messed a few of the fields there (and I’ll post some info on the PHONEBOOK file format later) but it worked.
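Until I post those details, here is a rough illustration of the general shape of the conversion (the tab-separated line format below is made up for the example, not Nokia’s real PHONEBOOK layout, and the CSV column names are just placeholders): the key point is grouping several numbers under one contact before writing the CSV row.

```python
import csv
import io
from collections import OrderedDict

def phonebook_to_csv(lines):
    """Group 'name<TAB>type<TAB>number' lines into one CSV row per contact.

    The input line format here is hypothetical - the real PHONEBOOK
    file layout is different and will be described in a later post.
    """
    contacts = OrderedDict()
    for line in lines:
        name, kind, number = line.strip().split("\t")
        # Several lines with the same name collapse into one contact
        contacts.setdefault(name, {})[kind] = number

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["Name", "Mobile", "Home"])
    writer.writeheader()
    for name, numbers in contacts.items():
        writer.writerow({"Name": name,
                         "Mobile": numbers.get("CELL", ""),
                         "Home": numbers.get("HOME", "")})
    return out.getvalue()

sample = ["Alice\tCELL\t054-1234567", "Alice\tHOME\t03-7654321"]
print(phonebook_to_csv(sample))
```

With one row per contact and one column per number type, Outlook’s CSV import has no trouble mapping the fields.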

We edited the entries in Outlook, synced the phone and finally had most of the data. Why “most”, you ask? Because even though my wife specifically asked to also back up the SIM card, it did not do that, and we lost a couple of contacts. Luckily the majority was in our hands.

I’m a VERY long time Nokia user and I think (most of) their phones are REALLY great, but this is simply negligence. There is no other word I can think of at this moment to describe this situation.

So you’ve upgraded your software, great. But there is NO reason for you NOT to read your old backup files. By all means generate the new format, but people, have you heard about backwards compatibility? Nokia is usually well known for its backward compatibility in user interface and other areas, but apparently someone screwed up big time with the newer version of PC Suite.

Why not read the old backup files? Why not say that I need to do this and that to convert them? Why not supply a program to convert them? This is not how things are done. Even Microsoft allows you to open Office documents that were written in previous versions of Office. Come on, that’s one of the most basic things you do!

What would other, less technically savvy people do? Start regenerating their contact lists?

Even if Nokia does have a way of doing it, not making it VERY clear to NORMAL users (I’m not included as a normal user, of course) how to find such a program (or guidelines) and use it is really, really bad.

These are the sorts of things that make people switch to another cellular phone vendor.

I do hope someone from Nokia is reading this and will take care of these issues in the next version of PC Suite. It would also be nice to make sure that whenever someone is sold a brand new Nokia cell phone, the customer is made aware of what needs to be done to restore everything to the new phone.

Amazon Checkout Interface – Group to as few shipments as possible

I recently ordered a couple of books from Amazon.

When I reached the checkout screen I, obviously, selected to group my shipments into as few as possible. I then looked and saw that the order was grouped into two shipments: one book would ship the next day, and the other 4 would ship only on the 20th of March, almost two months later!

This was a bit strange considering the fact that Amazon showed that all books were in stock.

I figured there was probably a book or two delaying the whole shipment, so I switched to the “ship as soon as the books are available” option and saw that one book (one book alone) was causing the delay.

I removed it (with great sorrow; it will wait for the next batch of Amazon books from my wish list), set “group into as few shipments as possible” again, and everything was in one big happy shipment.

I wonder what other customers who are a bit less proficient in computers would have done. I’m guessing one of 3 options:

  1. Order and not notice that it will take two months for the shipment to come
  2. Select the option to send things as soon as they are available and pay a bit more
  3. Cancel the shipment and go elsewhere

Why doesn’t Amazon add a check so that if a shipment will take more time than it should, the user is alerted and told which item is causing the delay? It shouldn’t be that hard to check something along the lines of:

if (scheduledShipmentDate > DateTime.Now.AddMonths(1)) {
    AlertUser();
}

Sometimes it’s the little things that tick me off. I’m a great fan of Amazon and it’s really the only place I can get almost any book I can think of, but sometimes a man’s got to post on his blog when a man’s got to post on his blog.