
Archive for the ‘Startups’ Category

Bad Customer Experience

Tuesday, May 13th, 2008

This is the first time I am doing something like this, but I just had the absolute worst customer experience I have ever had, and it was from two startups. The culprits are Bookeen and BooksonBoard. I decided to try an eBook to see how things work. I settled on a Bookeen Cybook (the Kindle wasn't available, and since I wanted to use it mostly to read PDFs, I decided on the Cybook). Well, it was broken out of the box (though it took me over a month to try it out; I just didn't have the time to get around to it). The first time I turned it on I was impressed with the resolution and was dying to try one of my books - but I couldn't navigate; all the buttons seemed to be dead.

At first BooksonBoard was pleasant enough; we spent about a week trying to diagnose the problem - recharge it, reinstall the firmware. Finally I just asked for a refund. That is when they got nasty and told me that since it had been over a week, they just don't give refunds, period - even though the device was dead on arrival. After some nasty emails back and forth, it seemed the only way they were going to do anything was if I sent it back to Bookeen in France at my own expense.

So I sent it back, packed as they suggested. Today I finally got an email from Bookeen telling me that the screen was broken "either by a drop or pressure" and therefore wasn't covered by the warranty. Yeah, right. It seemed to me that the only thing that was working was the screen. Now I can either pay another $130 and maybe it will work, or just write it off. Those were the choices they gave me for a few-hundred-dollar device that never worked.

So what have they accomplished? They have one irate customer that will NEVER recommend them to anyone, and of course this negative blog post. I am not sure how they (Bookeen and BooksonBoard) think they will succeed with this level of customer service. They broke cardinal rule number 1 - the customer is always right.

This got me thinking about how difficult it is to create a hardware-based consumer electronics startup. The level of service they are expected to give just doesn't jibe with the level a startup can provide, unless you team up with an experienced retail partner.

Increasing Your Chances of Success as an Entrepreneur

Tuesday, November 20th, 2007

I am revisiting the topic I mentioned in my previous entry on "Overconfidence and Entrepreneurial Behavior" because of an interesting post I found on the subject by Bob Warfield. He points at a study from Harvard Business School that essentially comes to the same conclusions about experience as the Israeli study in my previous post, but has some interesting insights that are useful to a first-time entrepreneur.

First their definition of success is different than that of the Israeli study - success is effectively defined as an exit (closer to the hearts of most VCs than the definition used by the Israeli study). Here is my summary of their findings:

1. Entrepreneurs who succeeded in a prior venture (i.e., started a company that went public) have a 30% chance of succeeding in their next venture. By contrast, first-time entrepreneurs have only an 18% chance of succeeding and entrepreneurs who previously failed have a 20% chance of succeeding.

To be honest, the relative ordering seems about right, though the actual numbers seem a bit high - but maybe they aren't, if you take into account that an exit doesn't necessarily mean that money was made.

2. Companies that are funded by more experienced (top-tier) venture capital firms are more likely to succeed. This performance increase exists only when venture capital firms invest in companies started by first-time entrepreneurs or those who previously failed. Taken together, these findings also support the view that suppliers of capital are not just efficient risk-bearers, but rather help to put capital in the right hands and ensure that it is used effectively.

This is really important news for entrepreneurs (since you can't simply increase your own experience) - if you don't have the experience yourself, you should "buy" it. Choose your investors not just on the basis of money, but on the basis of their experience, and how much of that experience they are willing to put into a company (one day a month just isn't enough). Or, if you can, hire the experience, even though it is expensive. If you look at the numbers in the study, getting the appropriate experience on board can increase your chances of success by 7%-12%.

3. The average investment multiple (exit valuation divided by pre-money valuation) is higher for companies of previously successful serial entrepreneurs.

In other words - by getting the appropriate experience on board, you can not only increase your chances of success, but also leave more money in your pocket when you are successful.

So for me the study proves that the model we are using at eXeedtechnology, while expensive, is the right model for entrepreneurial success (early, heavy involvement of experienced industry veterans as an integral part of the startup). While it doesn't increase your chances of a home run (which seems to be heavily predicated on luck), it significantly increases your chances of a successful exit.

Web Credibility

Tuesday, October 30th, 2007

I was looking around at some sites and was reminded of some older, but still relevant, work done by the Stanford Persuasive Technology Lab on web credibility. I would have liked to see the research updated to include some guidelines for UGC (User Generated Content) sites - but even so it is still very relevant. There is also a nice set of charts describing captology here, and web credibility here.

Another reason I was reminded of the persuasive computing work is that I keep hearing from Israeli VCs the notion that "ease of use" is a key ingredient in the Web 2.0 world, and that you need to make sure Web 2.0 entrepreneurs understand that. IMHO that is a mistake - ease of use is the minimal bar; without it you don't get to play… The real need is to make sure that developers and designers understand that the real goal is "Joy of Use" - sure, it has to be easy and intuitive to use, but users also need to have fun using the technology - otherwise you won't succeed.

The Long Road towards Integration – Part 4

Sunday, October 21st, 2007

I am sort of surprised that I am back on this subject again, but when I read that Microsoft's Ballmer plans to buy 20 smaller companies next year (Ballmer: We Can Compete with Google), it drives home for me the importance of being able to integrate well in the aftermath of M&A. My best guess is that those 20 companies will include 1-2 large companies, the rest being small and midsize companies - companies that are "innovating in the marketplace" (a term we used to use at IBM Research). So Microsoft is effectively outsourcing a good portion of its innovation, and placing a big bet on being able to integrate these acquisitions into the fabric of Microsoft.

These types of smaller acquisitions seem to be in the cards for IBM and Google as well - and I think more and more technology companies will be outsourcing their innovation this way, augmenting internal "organic" growth with external "inorganic" growth. Oracle seems to have gotten this down to an art (though they tend to swallow whales rather than minnows), and even SAP has jumped on the bandwagon. One issue that will clearly come with these acquisitions is how the acquiring company avoids killing the spark of innovation that exists in these smaller companies (assuming, of course, that they want to keep the innovation alive, and aren't just buying a specific technology or existing product).

I had the opportunity the other day to speak with someone who was on the corporate side of an acquisition, and we discussed what the thought process was at the time of the acquisition, and how that differed from how things turned out afterwards. One thing that struck me was that both sides were fooled, since they were (paraphrasing Bernard Shaw) "two companies separated by the same language". The company being acquired thought they were communicating important information about the acquisition, but it turns out they were using internal shorthand to describe people and situations, which was interpreted completely differently by the other side. This was probably exacerbated by the fact that one side was Israeli and the other American - but it could have happened with any two companies, especially when there is a high impedance mismatch between the two (or in English - the companies are of very different sizes). For example, when one company said a manager "kept the trains running on time", they meant a clerk who could keep to a schedule - while the other side thought they meant someone who could manage a complex system with all its nuances and make sure it keeps working. Understandably, these kinds of miscommunications caused a lot of faulty decisions to be made during, and right after, the acquisition.

In my experience it takes about 9 to 18 months until the sides really start to understand each other - how the other side works, and how they need to work together. That is assuming that everything goes smoothly. If you try to speed it up too much, you will end up killing the innovation, and you may end up killing any possibility of a successful acquisition.

So what is the bottom line? Assume that you will need to keep the current structure of the acquisition intact for about a year before you can make any drastic structural or strategic changes. See the rest of my recommendations in previous posts - and perhaps hire a consultant who has been there and can help smooth the transition.

The Death of Enterprise Software Startups?

Tuesday, October 2nd, 2007

In Israel, it has become close to impossible to get an investment for an enterprise software startup - even harder than in the US. One of the main reasons is that enterprise software sales are hard and expensive (a lot of high-cost manpower, and long sales cycles) - which is true. Everyone is looking at models to get around those issues (e.g. open source, SaaS), but fundamentally it remains an issue.
Not that there aren't problems or opportunities in enterprise software (see The Trouble With Enterprise Software for a nice overview of some of the issues) - there are huge issues with enterprise software, and SOA (Service Oriented Architectures) is no panacea. So opportunities for technical innovation abound; it is just that most VCs don't believe it is a good investment of time or capital. Since VCs are awfully busy, and have more on their plate than they can handle, once this becomes a "rule of thumb" it is hard to get their ear.
I think this will have grave implications for enterprise IT shops (and vendors). In the last few years most large IT vendors have gotten into the habit of "outsourcing" their technical innovation - they buy companies rather than develop the technology in-house. If the VCs stop investing, then in a few years innovation in the enterprise software market will dry up. Given the current state of enterprise software, that can't be a good thing…
I think that things will change - since there is still a lot of money in enterprise software and large vendors need technology, someone will have to provide it to them. Enterprise software companies will probably have a smaller chance at an IPO - but given the relative lack of competition they should have a better shot at M&A. The trick is to have unique, innovative technology that solves a problem for enterprise IT departments - or, even better, for the business. I also think the pendulum has swung too far and will swing back in a couple of years, making any investment made now much more valuable in the future.

Age and the Israeli Entrepreneur

Sunday, September 23rd, 2007

I read with interest a whole set of blog posts about the age of successful entrepreneurs in the US (one of the better ones was by Marc Andreessen, you can find it here: Age and the entrepreneur, part 1: Some data). In my opinion it was a debate over whether youth and enthusiasm trump age and experience in the high-tech startup world. One thing that immediately jumps out at you is that most of the high-tech entrepreneurial superstars were young (e.g. Bill Gates, Larry Page, Sergey Brin).

I was wondering whether anyone had done any real studies on how things work in Israel. Even though the Israeli VC and start-up model is based on the US model, the culture, environment and people are different than in the US. Things work differently here (and I think the Israeli VCs will need to change in order to adapt - but I'll write more on that in a separate post). For example, most Israeli entrepreneurs go through mandatory army service - for three years or more (and many Israeli high-tech companies are based on teams that worked together during their army service). I guess that is why Israelis work better in teams than Americans - and the list of differences goes on and on (I'll probably write a post on that too).

That leads me to an article I read yesterday in the Marker (an Israeli business daily) that quotes a study by Dr. Eli Gimon (sp?). I would have put up a link, but I couldn't find the article on the web - and both the article and the summary I found were only in Hebrew…

I thought it was telling to see what he actually measured - whether a company that started in a high-tech incubator was still around after at least seven years. That was his definition of success. I am not sure any VC would agree with that definition - but it does make sense in an Israeli context. While most US VCs (and Israeli ones too) are looking for the elusive "home run", Israel produces very few of those. It mostly produces companies with innovative, solid technology - which is why so many Israeli companies are snapped up by overseas companies: they provide technology innovation, depth and skills. These companies get acquired for anywhere between $10M-$200M - where over $100M is rare and high-end. Very different than the US model…

Bottom line - what Dr. Gimon (sp?) found was that the most important ingredient to success for an Israeli start-up is management skill and experience - not age, sex, schooling or national origin of the founder. Also whether they built the company based on their own technology made a difference.

I imagine these findings are probably very different than in the US…

Patents and Israeli Startups

Friday, July 20th, 2007

Patents aren't cheap, but they are important. Besides the time and effort, it will cost you somewhere between $5K-$15K per patent. As a startup you'll need to worry about a patent portfolio that provides you with real value beyond the obvious one - responding to a VC's query about the IP protection you have, barriers to entry, etc. So how do you go about creating a patent portfolio? Here are some of the considerations you should take into account when deciding what to patent:

  • Freedom of action - making sure that you can build the products you need to be successful, without anyone being able to stop you.

  • Leverage for partnering - allows you to provide unique partnership value that (hopefully) people are willing to pay for. And it is cool to say "patent pending technology".

  • Block competition - keep others from doing the same thing. But don't really count on this, since it is usually relatively hard. Given that there is usually more than one way to do things, how do you tell if a competitor is actually using your IP without a costly trial?

  • Due diligence and M&A - worst comes to worst, you can sell your IP portfolio. However, this is really a last resort since patents without skills are usually not considered all that valuable as an acquisition. However some key patents can  increase your value in an acquisition.
  • Generate revenue (and especially profit) - this is actually a possible, but very difficult, business model to implement (Qualcomm, for example). Be honest with yourself - what are the chances that someone will pay big bucks for access to your patent portfolio?

The basic steps in creating your patent are:

  • Invention - Discovering something that is unique and valuable and then deciding which parts are worthy of the time and effort of a patent.

  • Competitive Analysis - Should be done by the inventor, rather than attorney, since the inventors understand the domain better than anyone. You can find helpful resources at http://www.uspto.gov/ and http://www.google.com/patents.

  • Provisional Patent - doesn't really provide protection, but it does allow you to set a date of invention. For the few hundred bucks it costs, it is usually worth it. In your provisional patent you should document as much as you can about the invention. Don't forget you only have a year to submit the actual patent - don't wait until the last minute.

  • Write patent  - Expect to spend significant time writing, explaining and reviewing.

  • Submit and wait - and decide where you would like to submit.

  • Modifications - the patent office will probably come back with questions and issues (though not quickly; it can take a couple of years for a patent to be reviewed).

 

Structured, Semi-Structured and Unstructured Data in Business Applications

Monday, July 16th, 2007

I was discussing these issues again today - so I thought this old paper must still be relevant…
 
There is a growing consensus that semi-structured and unstructured data sources contain information critical to the business [1, 3] and must be made accessible both for business intelligence and operational needs. It is also clear that the amount of relevant unstructured business data is growing, and will continue to grow in the foreseeable future. That trend is converging with the "opening" of business data through standardized XML formats and industry-specific XML data standards (e.g. ACORD in insurance, HL7 in healthcare). These two trends are expanding the types of data that need to be handled by BI and integration tools, and are straining their transformation capabilities. This mismatch between existing transformation capabilities and these emerging needs is opening the door for a new type of "universal" data transformation product that will allow transformations to be defined for all classes of data (e.g., structured, semi-structured, unstructured), without writing code, and deployed to any software application or platform architecture.

The Problem with Unstructured Data
 The terms semi-structured data and unstructured data can mean different things in different contexts. In this article I will stick to a simple definition for both. First, when I use the terms unstructured or semi-structured data I mean text-based information, not video or sound, which has no explicit metadata associated with it, but does have implicit metadata that can be understood by a human (e.g. a purchase order sent by fax has no explicit metadata, but a human can extract the relevant data items from the document). The difference between semi-structured and unstructured is whether portions of the data have associated metadata, or there is no metadata at all. From now on I will use the term unstructured data to designate both semi-structured and unstructured data.

The problem is that neither unstructured data nor XML is naturally handled by the current generation of BI and integration tools - especially Extract, Transform, Load (ETL) technologies. ETL grew out of the need to create data warehouses from production databases, which means it is geared towards handling large amounts of relational data and very simple data hierarchies. However, in a world that is moving towards XML, instead of being able to assume well-structured data with little or no hierarchy in both the source and the target, the source and target will be very deeply hierarchical and will probably have very different hierarchies. It is clear that the next generation of integration tools will need to do a much better job of inherently supporting both unstructured and XML data.

XML as a Common Denominator
 By first extracting the information from unstructured data sources into XML format, it is possible to treat integration of unstructured data similarly to integration with XML. Also, structured data has a “natural” XML structure that can be used to describe it (i.e. a simple reflection of the source structure) so using XML as the common denominator for describing unstructured data and structured data makes integration simpler to manage.

Using XML as the syntax for the different data types allows a simple logical flow for combining structured XML and unstructured data (see Figure 1):
1. extract data from structured sources into a “natural” XML stream,
2. extract data from unstructured sources into an XML stream,
3. transform the two streams as needed (cleansing, lookup etc.)
4. map the XMLs into the target XML.

This flow is becoming more and more pervasive in large integration projects, hand-in-hand with the expansion of XML and unstructured data use cases. These use cases fall outside the sweet spot of current ETL and Enterprise Application Integration (EAI) integration architectures - the two standard integration platforms in use today. The reason is that both ETL and EAI have difficulty with steps 1 and 4. Step 1 is problematic since there are very few tools on the market that can easily "parse" unstructured data into XML and allow it to be combined with structured data. Step 4 is also problematic since current integration tools have underpowered mapping tools that fall apart when hierarchy changes, or other complex mappings, are needed. All of today's ETL and EAI tools require hand coding to meet these challenges.
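
To make the four-step flow above concrete, here is a minimal sketch in Python using only the standard library. The language choice, file format, field names and regular expression are all my own illustrative assumptions, not part of any particular product:

```python
# A minimal, hypothetical sketch of the four-step flow, using only the Python
# standard library; the file format, field names and regex are illustrative.
import csv
import re
import xml.etree.ElementTree as ET

def extract_structured(csv_path):
    """Step 1: reflect structured rows into a 'natural' XML stream."""
    root = ET.Element("customers")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            customer = ET.SubElement(root, "customer")
            for column, value in row.items():
                # assumes the column names are valid XML tag names
                ET.SubElement(customer, column).text = value
    return root

def extract_unstructured(text):
    """Step 2: pull implicit fields out of free text into an XML stream."""
    root = ET.Element("orders")
    for m in re.finditer(r"order\s+(?P<id>\d+)\s+for\s+(?P<item>[\w ]+)", text, re.I):
        order = ET.SubElement(root, "order", id=m.group("id"))
        ET.SubElement(order, "item").text = m.group("item").strip()
    return root

def cleanse(stream):
    """Step 3: a trivial transformation - trim whitespace in every element."""
    for element in stream.iter():
        if element.text:
            element.text = element.text.strip()
    return stream

def map_to_target(customers, orders):
    """Step 4: map both streams into the target document's hierarchy."""
    target = ET.Element("customer-report")
    target.append(customers)
    target.append(orders)
    return ET.tostring(target, encoding="unicode")
```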

Figure 1: A standard flow for combining structured, unstructured and XML information

The Importance of Parsing
 Of course, when working with unstructured data, it is intuitive that parsing the data to extract the relevant information is a basic requirement. Hand-coding a parser is difficult, error-prone and tedious work, which is why it needs to be a basic part of any integration tool (ETL or EAI). Given its importance it is surprising that integration tool vendors have only started to address this requirement.
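
As a small illustration of why this matters, here is a hedged sketch of what even a trivial hand-coded parser looks like - pulling a few fields out of a fax-style purchase order into XML. The document layout and field names are invented for the example:

```python
# Hypothetical example: hand-parsing a fax-style purchase order into XML.
# The document layout and field names are invented for illustration.
import re
import xml.etree.ElementTree as ET

FAX_TEXT = """
Purchase Order: 10452
Customer: Acme Industries
Item: 25 x widget, part no. 7731
Deliver by: 2007-08-01
"""

def parse_purchase_order(text):
    """Extract the implicit fields of the order and emit them as XML."""
    fields = {
        "number":   r"Purchase Order:\s*(\S+)",
        "customer": r"Customer:\s*(.+)",
        "item":     r"Item:\s*(.+)",
        "due":      r"Deliver by:\s*(\S+)",
    }
    order = ET.Element("purchase-order")
    for tag, pattern in fields.items():
        match = re.search(pattern, text)
        if match:  # real documents are messy, so fields may simply be missing
            ET.SubElement(order, tag).text = match.group(1).strip()
    return ET.tostring(order, encoding="unicode")

print(parse_purchase_order(FAX_TEXT))
```

Every new document layout means another set of patterns to write, test and maintain - which is exactly why parsing support belongs inside the integration tool rather than in hand-written code.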

 The Importance of Mapping
 The importance of powerful mapping capabilities is less intuitively obvious. However, in an XML world, mapping capability is critical. As XML is becoming more pervasive, XML schemas are looking less like structured schemas and are becoming more complex, hierarchically deep and differentiated.

This means that the ability to manipulate and change the structure of data by complex mapping of XML to XML is becoming more and more critical for integration tools. They will need to provide visual, codeless design environments to allow developers and business analysts to address complex mapping, and a runtime that naturally supports it.
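
To show the kind of hierarchy change such a mapping has to handle, here is a small sketch (with invented element names) that regroups a flat list of customers into a per-country hierarchy - the sort of restructuring a mapping tool should let you express visually rather than in code:

```python
# Hypothetical sketch: restructuring a flat XML list into a deeper hierarchy.
# Element names are invented; a mapping tool would let you express this visually.
import xml.etree.ElementTree as ET

FLAT = """
<customers>
  <customer><name>Acme</name><country>US</country></customer>
  <customer><name>Globex</name><country>US</country></customer>
  <customer><name>Initech</name><country>IL</country></customer>
</customers>
"""

def regroup_by_country(flat_xml):
    """Map <customers>/<customer> into <report>/<country>/<customer>."""
    source = ET.fromstring(flat_xml)
    report = ET.Element("report")
    buckets = {}
    for customer in source.findall("customer"):
        code = customer.findtext("country")
        if code not in buckets:
            buckets[code] = ET.SubElement(report, "country", code=code)
        entry = ET.SubElement(buckets[code], "customer")
        ET.SubElement(entry, "name").text = customer.findtext("name")
    return ET.tostring(report, encoding="unicode")

print(regroup_by_country(FLAT))
```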

Because unstructured data is needed by both BI and application integration, and the transformations needed to get the information out of the unstructured source data can be complex, these use cases will push towards the requirement of "transformation reusability" - the ability to define a transformation once (from unstructured to XML, or from XML to XML) and reuse it in various integration platforms and scenarios. This will cause a further blurring of the lines between the ETL and EAI use cases.

Customer data is a simple example use case. The goal is to take customer information from various sources, merge it, and then feed the result into an XML application that uses the data. In this case structured customer data is extracted from a database (e.g. a central CRM system) and merged with additional data from unstructured sources (e.g. branch information about that customer stored in a spreadsheet), which is then mapped to create a target XML representation. The resulting XML can be used as input to a customer application, to migrate data to a different customer DB, or to create a file to be shipped to a business partner.
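
Here is a hedged sketch of that customer example, with all data faked in-memory for illustration - a CRM record merged with branch spreadsheet rows, joined on a customer id, and emitted as the target XML:

```python
# Hypothetical sketch of the customer example above: merge central CRM data
# with branch spreadsheet data and emit a target XML record. All data is faked.
import xml.etree.ElementTree as ET

# stand-in for a row fetched from the central CRM database
crm_record = {"id": "C-1001", "name": "Acme Industries", "segment": "Manufacturing"}

# stand-in for rows exported from a branch's spreadsheet
branch_rows = [
    {"customer_id": "C-1001", "branch": "Tel Aviv", "contact": "D. Cohen"},
    {"customer_id": "C-2002", "branch": "Haifa", "contact": "R. Levi"},
]

def build_target(crm, rows):
    """Create the target XML: CRM fields enriched with matching branch data."""
    customer = ET.Element("customer", id=crm["id"])
    ET.SubElement(customer, "name").text = crm["name"]
    ET.SubElement(customer, "segment").text = crm["segment"]
    for row in rows:
        if row["customer_id"] == crm["id"]:  # join on the customer id
            branch = ET.SubElement(customer, "branch", name=row["branch"])
            ET.SubElement(branch, "contact").text = row["contact"]
    return ET.tostring(customer, encoding="unicode")

print(build_target(crm_record, branch_rows))
```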

Looking Ahead
 Given the trends outlined above there are some pretty safe bets about where integration tools and platforms will be going in the next 12-24 months:
1. Better support for parsing of unstructured data.
2. Enhanced mapping support, with support for business analyst end-users
3. Enhanced support for XML use cases.
4. A blurring of the line separating ETL integration products from EAI integration products (especially around XML and unstructured use cases)
5. Introduction of a new class of integration products that focus on the XML and unstructured use case. These “universal” data transformation products will allow transformations to be defined for all classes of data (e.g., structured, semi-structured, unstructured), without writing code, and deployed to any software application or platform architecture.

References
[1] Knightsbridge Solutions LLP – Top 10 Trends in Business Intelligence for 2006
[2] ACM Queue, Vol. 3 No. 8 - October 2005, Dealing with Semi-Structured Data (the whole issue)
[3] DM review - The Problem with Unstructured Data by Robert Blumberg and Shaku Atre, February 2003 Issue

Open Source and Freeware

Friday, July 13th, 2007

Selling IT to corporations is hard (well, selling to anybody is hard) and requires a lot of resources (especially around presales - POCs, bake-offs, etc.). So a lot of VCs are looking to the open source model for salvation - not Open Source in its purest form (as described in The Cathedral and the Bazaar), but as a way to lower the cost and friction of selling to the enterprise.

The logic behind it is that the techies (especially in larger organizations) will download the software, play with it, and start using it in a limited way. This can be either as part of a project to solve a specific problem (e.g. we need a new document management system), or just something that interests them as part of their job (why pay for an FTP client and server if you can just use FileZilla, or pay for a database if you can use MySQL). So the thinking is that this solves the issues of penetration (the users find the stuff themselves), expensive POCs (the users will create the POC themselves), and the length of the sales cycle.

The second part of the open source equation is that users will become an active and viable community - both recommending and improving the product directly. Linux is usually given as the prototypical example, with a vibrant user community and a large number of developers/contributors. The allure of this idea, and the feeling that you have more control (you can modify the code yourself, there is no vendor lock-in, and there is a community of developers/contributors), is what differentiates Open Source from plain freeware.

So how does a company make money off an open source product?

1. Sell services - any large organization that uses a product wants support, and will pay for it.

2. Sell add-ons, upgrades, premium versions - once they get used to the product, they will be willing to pay for added functionality

What doesn't seem to work is providing a dumbed-down or partial-functionality product to get people "hooked" and then selling them the full version, or leaving out important features.

So should you turn your enterprise software product open source? Before you do, here are a few things to consider:

1. How will the techies find your product? Is it a well-known category (so that when they need to find, say, a CRM system and search for vendors, your product will show up - e.g. SugarCRM)?

2. Do you really have a technological breakthrough - or are you trying to sell an enhanced version of a well-established product category? If you do have a real, viable technical breakthrough, remember that your code is open, and you can be sure that the first people to download your product will be competitors looking for the "secret sauce".

3. There are a LOT of Open Source projects out there - take a look at SourceForge; there are at least 100K projects listed. You'll need to put in effort (probably at least 1 or 2 people) to make sure that you stand out from the crowd and start growing a user community.

4. The open source download-to-sale conversion rate is low - somewhere between 1 in 1,000 and 1 in 10,000 - so you have to make sure that you get enough downloads to be viable (to land 100 paying customers, you may need anywhere from 100,000 to 1,000,000 downloads).

5. It is a one-way street: you can make your code open source, but it is practically impossible to take that decision back once the code is out in the wild.

6. Choosing a license - the GPL gives you the most control, but many organizations don't like its restrictions. The Apache license seems to be universally acceptable, but gives you almost no control.

7. You need to decide what you will do with user submissions - and make sure you get the copyright for everything that is submitted.

Mashups and Situational Apps

Saturday, July 7th, 2007

Mashups are aimed both at prosumers (a new term I first heard from Clare Hart at the "Buying & Selling eContent" conference) - high-end consumers and creators of content - and at scripters (my own term, since I am not sure what exactly to call these high-end users - for example, the departmental Excel gurus who create and manage departmental Excel scripts and templates).

The search for tools that empower these domain experts to create applications without programming has been around since at least the 80s (e.g. 4th-generation programming languages), and has led to various new forms of application creation - but the only one that has really evolved into a "general use" corporate tool for non-programmers is Excel (though it isn't really a 4GL). The reasoning behind those tools was that by putting the power to create applications into the hands of the domain expert, you get better applications, faster. One new evolution of these types of tools is Domain Specific Languages (DSLs), which make programming easier by focusing on a specific domain and building languages tailored to that domain.

So much for the history lesson - but what does that have to do with Mashups and Situational Apps? Well, they both focus on pulling together different data sources and combining them in new ways in order to discover new insights. "Mashup" seems to be the preferred web term; "Situational App" is the term coined by IBM for the same type of application in a corporate setting.

These types of applications (and application builders) have a lot in common:

1. They all start from a data feed of some sort, either RSS or XML.

2. They focus on ease of use over robustness.

3. They allow users to create applications easily to solve short-term problems (see the sketch after this list).
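
As a rough sketch of the pattern in the list above (the feed URLs are placeholders, and a real mashup would filter and join the items rather than simply concatenating them):

```python
# A rough sketch of a feed mashup: pull items from two RSS feeds and republish
# them as one combined feed. The URLs are placeholders, not real endpoints.
import urllib.request
import xml.etree.ElementTree as ET

FEEDS = [
    "http://example.com/news.rss",    # hypothetical feed URLs
    "http://example.com/prices.rss",
]

def fetch_items(url):
    """Download an RSS feed and return its <item> elements."""
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    return tree.findall(".//item")

def mashup(feed_urls):
    """Combine the items from several feeds into one new 'mashed-up' feed."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Combined feed"
    for url in feed_urls:
        for item in fetch_items(url):
            channel.append(item)  # naive merge; a real mashup would filter or join
    return ET.tostring(rss, encoding="unicode")

if __name__ == "__main__":
    print(mashup(FEEDS))
```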

Many of these tools are experimental and in the alpha or beta stage, or are research projects of one type or another (QEDWiki, Microsoft Popfly, Yahoo Pipes, Intel MashMaker, Google Mashup Editor). As these tools start maturing, I think we will see a layered architecture emerging, especially for the corporate versions of these tools. Here is how I see the corporate architecture layers evolving:

Figure: Mashup Layers

I think the layers are pretty self-explanatory, except for the top-most Universal Feed Layer, which is simply an easy way to use the new "mashup" data in other ways (e.g. in other mashups, or on mobile).

If you look at the stack there are players in all layers (though most of the mashup tools I mentioned above are in the presentation and mashup layers), and the stack as a whole competes very nicely with a lot of current corporate portal tools - but with a much nicer user experience - one that users are already familiar with from the web.

One important issue that is sometimes overlooked is that mashups require feeds - and even though the number of web feeds is growing, there is still a huge lack of appropriate feeds. Since most mashup makers rely on existing feeds, they have a problem when a required feed is not available. Even if the number of available feeds explodes exponentially, there is no way for a site provider to know how people would like to use its feeds - so for mashups to take off, the creation of appropriate filtered feeds is going to take on new importance, and the creation of these feeds is going to be a huge niche. Currently "Dapper" is the only tool that fills all the needs of the "universal feed layer" - site independence, web based, and an easy-to-use, intuitive interface for prosumers and scripters.