Archive for April, 2009

Brand Awareness and Perception

Awareness and perception are the two key metrics any company uses to measure its brand strength.

Awareness, in simple terms, is how many people know your brand. Usually, awareness is measured through surveys that ask participants a series of questions like “What brand comes to mind when you want to buy shoes?” In general, companies measure unaided awareness: the percentage of survey participants who mention the brand without any kind of hint. For top brands like Coke and McDonald’s, awareness will be close to 100%.

Perception is the set of values consumers attach to a brand. For example, the perception of Volvo is safety. To measure the perception of a car company, a survey will ask questions like “How do you rank car brands in terms of safety?” along with questions on quality, performance, or environmental friendliness. The outcome of the survey reveals the brand perception. The list of questions depends on the goal for the brand and how you want customers to think of it.

Studies have shown that awareness and perception play a big role in the market share of a product. Which is more important, awareness or perception? For a fast-growing company like Twitter, awareness is the prime driver of product usage: as more people become aware of microblogging and Twitter, more people will use Twitter. Hence the awareness level determines Twitter’s growth and penetration among the online population. On the other hand, in a mature business segment like cars, awareness may not be the major driver of market share. Close to 100% of people might be aware of Pontiac, yet many would not buy Pontiac cars; hence perception matters most for Pontiac. Beyond the stage of the company and industry, other factors such as the type of product, level of competition, switching cost, and risk of switching also determine the relative importance of awareness vs. perception.

Generally, it takes a frequency of at least 5-6 ads across various mass media to gain a 1% increase in awareness among the population, and the cost of such a campaign can be close to $100M. Since consumers are constantly bombarded with a huge number of ads, new products, and brands, the cost of gaining awareness keeps going up. Given the high cost of conventional mass media, small startups leverage unconventional, inexpensive media like Twitter, Facebook, blogs, and YouTube to build brand awareness and perception. Most importantly, companies leverage their current user base to generate awareness and perception. Facebook and Twitter effectively used their user bases to increase awareness. Services like Facebook are useful to consumers only when more people use them, so users inherently have an incentive to popularize the service. For any company to be successful in the future, its product or service should give users an incentive to recommend it to others.

A couple of decades ago, building a brand like Twitter or Facebook would have required multibillion dollar investments. These brands, however, were built at a fraction of that cost by aligning user incentives correctly and leveraging the users themselves.

Having Your Cake and Eating It Too


Web apps are desperately trying to leverage the full power of the desktop. We have JS, Flash, and Silverlight, all technologies that run locally on the client while still keeping your data on the server.

Here’s one more from Google, called Google Native Client (NaCl). This is a browser plugin that allows websites to run full-blown native code on your PC through your browser. Apps are written in C++ but have several restrictions on what they can and cannot do.

It does sound interesting and very cool (especially the C++ part), but here’s the real question. Web apps already use your bandwidth and now want to use your computing power too, so why won’t they give up control over the data? There seems to be an unhealthy interest in centralizing the data while pushing the computing out to the clients.

For all intents and purposes, the direction of some of these technologies seems to indicate that we soon will have apps that will be downloaded Just-In-Time and run on your computer when you open the website in a browser.

But wait a sec, I thought these were called Java Applets. :-)

Announcing Tonido Plug – small, low power, low cost Home Server

CodeLathe is proud to announce the launch of Tonido Plug, a small, low power, low cost home server for the public. Tonido Plug will be available for pre-order shortly at the Tonido Plug website.

What is Tonido Plug?

Tonido Plug is a small form factor computer (the size of a power adapter) based on a powerful 1.2 GHz Sheeva processor. Tonido Plug consumes less than 5W on average, so you can run it 24/7. Tonido Plug comes with Gigabit Ethernet and a USB 2.0 connector. Connect your Tonido Plug to an external USB hard drive and to your home router, and you instantly get a low cost, low power home server for less than $100. Tonido Plug comes pre-installed with all your favorite Tonido applications (Photos, JukeBox, Webshare, Workspace, and Thots), all running on an embedded Ubuntu Jaunty Linux OS. Tonido Plug allows you to access your Tonido apps, files, music, and media from anywhere, on the intranet or the Internet. For more info on Tonido Plug, please check out http://www.tonidoplug.com.

Technical Specifications

  • 1.2 GHz  Sheeva Processor, ARM compliant
  • 512MB Flash
  • 512MB DDR2
  • USB 2.0 and Gigabit  Ethernet
  • Power input: 100-240 VAC / 50-60 Hz (19W); DC consumption: 5V/3.0A

What is Tonido?

Tonido is an extensible and open platform (available under GPL and commercial licenses) that allows you to run your own personal web applications on your desktop and form your own private Tonido network. Applications and data are always local. Since Tonido is extensible, you choose the applications that you want to install and run. Current applications include a browser-based personal information manager (PIM), a browser-based media player, a direct Tonido-to-Tonido photo sharing app, a personal blog and note keeping app, and an application to share any desktop folder directly to the web. For more info on Tonido, please check out http://www.tonido.com.

How does Tonido Plug work?

Step 1: Connect Tonido Plug to your home router and to any external USB hard drive.

Step 2: Connect Tonido Plug to power socket.

Step 3: Connect to Tonido Plug from any device with a web browser using Tonido Plug’s local IP address and create a Tonido Profile.

Step 4: Access your Tonido Apps, files, music and media from anywhere using your Tonido URL (http://mytonidoplug.tonidoid.com:10001/)

That is it. You are done.

Who can use Tonido Plug?

Tonido Plug is ideal for homes and small businesses looking for an inexpensive home server or network attached storage. With Tonido Plug you get your own reliable storage at a fraction of the cost of online backup service providers, with 24/7 access to your files from anywhere: intranet and internet. Your data never flows through a third-party server or storage, which gives you complete privacy and absolute control over your private data.

When will Tonido Plug be available?

Tonido Plug will be available for pre-order shortly. Once you pre-order, you will receive your device within approximately 4-6 weeks. For the initial orders, we are offering Tonido Plug at a special introductory price of $89.99.

Where can I pre-order Tonido Plug?

You can pre-order Tonido Plug at http://www.tonidoplug.com

Tonido: Q and A

I thought this would be a good place to do a mini FAQ on some of the general comments and issues raised by various people around the net.

1) Is Tonido about Cloud Computing on your Desktop?

No. I don’t think we ever told anyone, in any message, that Tonido is about cloud computing. For some reason, blogs chose that as their headline (cloud being the hot topic of the day). We would rather say that Tonido is an alternative to web apps: another way to access, communicate, and collaborate directly, without data flowing through a third party.

The whole cloud notion is nebulous at best and confusing at worst. It is used interchangeably to mean different things, so we don’t want people to think Tonido is somehow related to Cloud computing.

2) What is the motivation to build Tonido as a platform?

If you read my previous blog post on the future of software apps, you will agree that the future is clearly browser based. We wanted to build an alternative to web apps hosted elsewhere (where you end up losing control of your data). Doing that requires thinking a little further than just building a bunch of apps. So we took more time, put more thought into it, and built it as a platform where apps can reuse common components and don’t have to reinvent the wheel.

3) How do I trust the Tonido platform?

The short answer is that the Tonido platform SDK is becoming open source; I really can’t see a better way of gaining trust. We also plan on allowing users to run their own Tonido DS server. When that happens, your Tonido instance is truly private.

To add to that, Tonido’s code was audited by a security audit company for security issues before release.

The longer answer is that trust usually comes from experience with a particular product or company over time. So you might have to look at the track record of the people behind Tonido and the products they worked on before. If you can trust that, then you can trust Tonido.

4) Why is there no download for my favorite Linux distro? Mac OSX on PowerPC?

In deciding to go with the 3 main OSes, we had to cut some corners. We picked the best option for all three and went with them. We figured the important thing is to release first and then slowly get it right. Besides, how many products do you know that release on all 3 OSes on day one? We are actually proud that we pulled it off. You should see the number of machines we needed to get it going (a topic for another post).

5) Why is there no feature X yet or feature Y yet?

The simple reason is that we wanted Tonido to go out and evolve based on user feedback and user usage patterns. We wanted feature X and feature Y to be driven by strong feedback from our users. So by just asking us why something is not there, you are actually helping us make a decision on prioritizing and implementing those features. So be vocal, complain and let us know.

6) When you say Tonido Platform is open source, does it mean the apps that are being shipped with it?

No, currently only the Platform is open, which means that developers can build new apps on top of it easily. The individual apps shipped currently are not open source.

If third party developers want to build great Tonido apps and get paid for their efforts, they should be able to. If they want to release their apps as open source, they can do that too. We didn’t want to limit Tonido apps to free ones only.

7) Why does the Tonido Platform require apps written in C/C++/Lua and not in Perl, Python, Ruby, PHP?

The overriding consideration for Tonido was that it should be lightweight, and we could only achieve that with C++. Five apps currently weigh less than a 10 MB download and less than 20 MB in memory. Using any other language would have required a lot more baggage and setup. Besides, I am not sure how easy it is to write a P2P library in some of those languages (like PHP). Finally, C++ is the language I love and know best, and it allows me to be 10x more productive than anything else (even if I have to type 10x more lines than Perl to get the same thing done). So even though C++ may not be the best language, it certainly allowed us to build Tonido in half the time it would have taken otherwise.

Besides, there are a lot of C/C++ programmers out there who may not have transitioned to web application development; we hope many of them will find the Tonido Platform an easy place to build apps.

Finally, just because C/C++/Lua is what’s available today doesn’t mean it is the only environment that will ever be possible. We are looking at embedding Python or PHP inside Tonido, so look out for that. Having C/C++ at the core of Tonido enables all kinds of language bindings; so while it is a little limiting today, it won’t be tomorrow.

Have more questions? Please post in comments and we will answer in Q and A part 2.

So much to do, so little time…

Our blog has been pretty quiet the last few days, and the reason is a good one: we have been swamped with the great response to the Tonido beta.

Since being covered on several big blogs (MakeUseOf, Lifehacker), a huge number of people have been downloading, creating profiles, and getting in touch with us.

A big thanks to all who sent in comments that made our effort in the last 1.5 years worthwhile.

Frankly, it is all a little overwhelming. For the first time in the last 1.5 years, programming came to a complete standstill so that we could digest and chew on everything that came pouring in. And it has been pouring as never before: Twitter, Facebook, blogs, comments, forum postings, direct emails. Just a lot of feedback.

We are trying to pull everything together and organize it in one big list. We are also trying to figure out the next most important thing to do. And believe it or not, we have plenty: up to 3 new plugins are in the works, with 2 more in the pipeline.

We are also amazed by the interest in the Tonido platform, so that is one of our priorities. We are planning to get at least the Windows SDK version out quickly and get the other OS versions later.

We also have a significant product announcement coming within this week. So stay tuned.

Now let me get back to work.

The Tonido platform is going open source

It has been almost a month since we released the public Tonido beta. Since then we have received great response and feedback from everyone who has tried Tonido. We even recently crossed a milestone in the number of Tonido profiles created as well as the number of people online.

At CodeLathe, we believe in the power of the open platform as well as in the open source movement.

We think that there is great potential in the Tonido platform and really like the power and flexibility that Tonido offers to end users. We also believe that there are many more applications waiting out there that are yet to be developed by creative developers for Tonido. We want those applications to come alive on the Tonido platform.

As a first step towards that goal, we are happy to announce that the Tonido platform will be available under an open source license (most likely the GPL v3). This means you can download the Tonido Platform SDK and develop applications based on it completely free of charge.

Not only do we want to get more users to try the Tonido platform, we also want developers worldwide to develop on the Tonido platform.

If an open source license doesn’t fit your needs, the Tonido Platform SDK will also be available through other licenses.

It is going to take a short while to get the SDK ready for download and all the licensing legal stuff worked out.

If you would like to sign up on our developer mailing list, please do so. We will send out information once the SDK is ready.

Betting on Cloud Services? Think again.

“Microsoft confirmed on Monday that Azure users suffered an overnight outage over the weekend during which their applications weren’t available. “
ComputerWorld Mar 19, 2009

“Business and personal users of Gmail suffered an outage starting about 1:30 a.m. PST Tuesday”
CNet News Feb, 24 2009

“Amazon storage ‘cloud’ service goes dark, ruffles Web 2.0 feathers”
CNet News Feb 15, 2009

“Salesforce.com was down for under an hour on Tuesday, leaving many users in the dark. “
PC World Jan 6, 2009

“According to reports on Friday, cloud storage provider FlexiScale (www.flexiscale.com), a unit of UK-based web hosting provider Xcalibre (www.xcalibre.co.uk), has been hit with its second outage in two months, leaving some customers without access to their servers for more than 18 hours.”
Web Host Industry Review Oct 31, 2008

“Recent unreliable commercial e-mail service from Google has underscored the need for enterprises to develop contingency plans for software-as-a-service applications.”
Gartner Sep 3, 2008

“Outage Forces Cloud Computing Users to Rethink Tactics – IT Departments scramble to devise backup plans following service disruptions at Amazon, Citrix and Google.”
Information Week Aug 16, 2008

“Microsoft Windows Live Services Suffer Global Outage.”
Channel Web Feb 26, 2008

When the system is down and the business sits idle for more than 30 minutes, the damage to its reputation is incalculable.

Despite the potential privacy issues, businesses and individuals are lining up to get a spot in the cloud. A number of advantages are cited for cloud services: reduced cost, increased storage, more mobility, and so on. Well, the goliaths and proponents of cloud services have yet to prove that they can deliver their services reliably.

Did you lose your tweets recently?

We lost all the tweets we had written between March 17 and April 4, 2009 from our Twitter account. We then lost more tweets after April 4th.

Did this happen to any of your tweets, or is it just a random occurrence?

A one billion dollar valuation is all fine and dandy. But first, Twitter needs to make sure that its messaging platform is scalable (you must have seen the Twitter “over capacity” message!) and stable.

If it happens again, we may need to think about releasing our own peer-to-peer Twitter clone on top of our Tonido platform to talk with our followers. Let us know whether we are alone in this Twitter episode.

Network file transfer with on-the-fly compression

We often transfer large numbers of files, and large files, over the network from one computer to another. FTP is the default choice for transferring a few files, and SCP is the typical choice for transferring a large number of files.

If you happen to transfer files from one computer to another over a slow network (such as copying files from your home computer to the office or vice versa), then the following tip might be helpful. The technique works as follows:
1) Compress the files on the fly at the source computer.
2) Transfer the compressed stream over the network.
3) Decompress the files on the fly at the target computer.
This technique uses just the ssh and tar commands without creating any temporary files.

Example
Let us assume the source computer is HostA and the target computer is HostB. We need to transfer a directory (/data/files/) containing a large number of files from HostA to HostB.
1) Command without on-the-fly compression
Run this command on HostB
# scp -r HostA:/data/files /tmp/
This command recursively copies the /data/files directory from HostA to HostB.

2) Command with on-the-fly compression
Run this command on HostB
# ssh HostA "cd /data/; tar zcf - files" | tar zxf -
This command recursively copies /data/files from HostA to HostB, and it is a lot faster on a slow network.

Let us take a look at each part of this command in detail:
1) ssh HostA "cd /data/; tar zcf - files" : From HostB, connect to HostA via SSH and run the quoted command there.
2) cd /data/ : On HostA, switch to the directory /data/.
3) tar zcf - files : Tar the ‘files’ directory with compression and send the output to STDOUT.
4) | : Pipe STDOUT from HostA to STDIN on HostB.
5) tar zxf - : On HostB, decompress and untar the data coming in through STDIN.
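Because the heavy lifting in this pipeline is done by tar and gzip rather than by ssh itself, you can try the same compress-pipe-decompress idea locally, standing in for the two hosts with subshells. A minimal sketch (the /tmp paths are purely illustrative):

```shell
# Set up a sample source tree and an empty destination (illustrative paths)
mkdir -p /tmp/xfer_demo/src/files /tmp/xfer_demo/dst
echo "hello" > /tmp/xfer_demo/src/files/a.txt

# Same pipeline minus the ssh hop:
# the left subshell plays HostA, the right one plays HostB
(cd /tmp/xfer_demo/src; tar zcf - files) | (cd /tmp/xfer_demo/dst; tar zxf -)

cat /tmp/xfer_demo/dst/files/a.txt   # prints "hello"
```

Swap the left subshell for the ssh command and you are back to the networked version.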

To show how useful this technique is, we transferred 45 MB worth of files from HostA to HostB over a DSL connection. Here are the results:
1) Without compression: 12 min 59 sec
2) With on-the-fly compression: 2 min 33 sec (roughly a 5x speedup)

This method is most effective with large uncompressed files or directories containing a mix of file types. If the transferred files are already compressed, this method won’t help.
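One quick way to estimate whether a set of files will benefit is to gzip a sample and compare sizes; if the compressed size is close to the original, the tar z pipeline will mostly add CPU overhead. A small sketch (the sample file name is hypothetical):

```shell
# Create a highly compressible sample file (illustrative only)
head -c 100000 /dev/zero > /tmp/sample.bin

orig=$(wc -c < /tmp/sample.bin)
comp=$(gzip -c /tmp/sample.bin | wc -c)
echo "original: $orig bytes, gzipped: $comp bytes"
# A large reduction suggests on-the-fly compression will pay off;
# near-identical sizes (e.g. for zip/jpg/mp4 files) suggest it won't.
```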

5 sure-fire ways to become better at programming

My previous semi-humorous blog post on becoming a bad programmer generated a lot of reactions, so I decided to write one more, this time eschewing the humor (some readers appeared not to *get* it) and jumping straight to the list.

1) Be an apprentice first

Becoming a programmer is like becoming a mason. In medieval times, a mason had to first become an apprentice and work hard for several years before becoming an independent mason and joining the guild. Unfortunately, no such process exists for programmers. It is my opinion that programmers need the same mentoring before they develop good programming habits. If you have never met or worked with someone who is a better programmer than you, you are unfortunate: without the chance to see first hand the habits and processes of great programmers, it is hard to become one. My own ability to handle complex problems increased dramatically when I worked with great programmers.

2) Continually adjust your complexity mental models

Programming is purely a mental activity and has no relationship to any physical activity including typing. To become a better programmer, you will need to exercise and build up the part of the brain that deals with managing complexity and dealing with the relationships between countless objects.

So how do you actually get better at this? By continuously learning from mistakes and tweaking your understanding and the process by which you manage complexity. As you continuously refine your complexity models, you get better at managing complexity efficiently. There is no end to this process: as you work on more complex projects, you will add more tools to the arsenal you use to manage complexity inside your head. The important thing is to realize that this mental model exists and to act consciously to improve it.

3) Be curious about new trends in computing

Programming, unlike bridge building, changes its basic tools and processes every 5 years, and it is hard work to keep abreast of the changes. But keep up you must, if you don’t want to be left behind. From punch cards, waterfall, assembly, Windows, MFC, Java, J2EE, .NET, PHP, Ruby on Rails, REST, agile programming, and design patterns to AJAX, you have to keep up with the rapidly changing landscape. And I say this not so you keep jumping on the latest fad, but so you understand the latest technologies and their benefits (and drawbacks). Programming is partly about building things efficiently and choosing the right tool for the job; to get that piece right, you *have* to know what exists out there. Otherwise your program will be obsolete by the time you are ready to release.

4) Understand the major pieces of the software stack

Joel Spolsky talks about leaky abstractions: when you work with any abstraction, it always leaks a little, allowing the ugly underlying complexity to seep through. When that happens, if you don’t understand the layer below, you will be stuck. Say you are a web programmer: you need to know a little about HTTP. If you are a .NET or MFC programmer, you need to know something about how Windows messages work.

Modern programming environments are a little like fishbowls. You can live happily within this world for eternity, but if you want to do more than what is provided within the confines of the fishbowl, you had better learn more about the *outside*.

Learn a little about every piece of the software stack, from registers in a CPU to low level memory management, process management, networking, and so on. Then you will never be surprised or stopped dead by the glass walls of your fishbowl.

5) Be passionate

The one sure-fire way to become better at programming is to be passionate about it. You need to be genuinely interested in working, thinking, and living in code. No amount of knowledge or experience will help otherwise.

Agree, Disagree? Let me know.