
Archive for November, 2007

Adam Smith, Tom Sawyer and Web Semantics

Friday, November 30th, 2007

Adding semantic information to the web has been on the agenda for a number of years (at least since 2001), and is high on the hype cycle. The value of web semantics, once they exist, is clear: real automated digital assistants, and search engines that can find what we meant – not just what we asked for. Practically magic.

So why aren’t web semantics evolving as fast as the web itself (even though it seems a new search engine claiming semantic capability is born almost daily)? One key reason is that the field is still the domain of techies – all but meaningless to 95% of regular web users. To really take off, it needs to harness that 95% of the web, making it useful and profitable for regular web users to generate useful web semantics for their own benefit (or as Adam Smith put it: “It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own interest.”).

OK, how do we spark this self-interest and get the broader web community providing the semantics? The key is to take advantage of the currency of the web: advertising. Everybody is trying to make money off their sites through advertising. What if you could double or triple advertising revenue by describing what your site is about (i.e., simple semantics), via a procedure no harder than what is needed to get regular advertisements onto your site in the first place? Even better, what if someone else could do it for you, and you share the additional revenue (Tom Sawyer would be proud)?

So it really is simple – make it so easy to add semantics that anyone can do it, and make it worthwhile ($$$) so that everyone will want to do it. Then sit back and watch web semantics start taking off. It won’t be perfect, but at least the semantic web will start to flesh out and evolve at the same velocity as the web itself.

IsraelNetSphere Panel and Advice to New Entrepreneurs

Friday, November 30th, 2007

I was on an IsraelNetSphere panel last week about Web startups and angels. The moderator (Yaron Orenstein) asked an interesting question: if we could give one piece of advice to new entrepreneurs, what would it be?

Not surprisingly, the first was to come prepared to meetings – don’t just assume that you can win over an investor off-the-cuff. Of course this is hard for someone who has never done it before (since you really have no idea how to prepare), but if you want to succeed – prepare. Present your ideas to friends, family, classmates – refine and hone your pitch as much as possible. It’s like the old joke about “how do you get to Carnegie Hall?” – practice, practice, practice. I think that young entrepreneurs see people making these kinds of pitches and assume that it is simple – it isn’t. In my experience, the people who make something seem really simple are the ones who spent the most time preparing.

That doesn’t mean be long-winded – be short and to the point. Some investors like presentations, others like discussions. You probably won’t know until you get into the meeting, so prepare for both.

The second piece of advice was to listen – and that two of you should go to every meeting. Whenever one speaks, the other should listen. Intently. Think of every meeting as a way to prepare for the next one (and 99% of your meetings will be a prelude to another meeting).

Finally – persevere. You will hear a lot of “No’s” before you hear a “Yes”. You are going into meetings where people aren’t giving you the benefit of the doubt, and are essentially testing you and your ideas.

Now if you think the preparation is hard – wait until you actually have the money and need to deliver…..

Decision Support, BI, BPM and Human Process Management

Monday, November 26th, 2007

I have been thinking lately about decision support in business settings. Executives and managers make many, many decisions a day about the business – most of them involving other people who need to either be part of the decision-making process or act on the decisions. Essentially, as an executive, you gather some data, meet with some people, make some decisions and then fire off some emails (or phone calls) – repeat. From my experience, most processes in an organization are of this ad hoc flavor – and really have no tools (except email) for supporting the end-to-end process (from an ad-hoc set of decisions, through execution, and finally to results).

There are various tools that help with the individual steps – for example, I remember that in the late eighties/early nineties decision support systems (DSS) used to be all the rage. The problem was that executives were unwilling to use the systems, and they morphed into the Business Intelligence (BI) tools that are all the rage today (at least judging by the number of acquisitions going on in that space). But both DSS and BI tools address only part of the decision process – gathering and analyzing the data so that an intelligent decision can be arrived at. So those tools help with the “gather some data” part.

Another set of tools are collaboration tools, which can help somewhat with both the “meet some people” and “make some decisions” parts. Other tools like Excel, Word, PowerPoint and email also play an important part in these steps. Most executives I know don’t use the various collaboration tools that are available – they use meetings, secretaries and productivity applications. Maybe they’ll start using Wikis too, but as another productivity tool – not an end-to-end decision support system.

Now, if you believe the Business Process Management vendors, the final step should be to create a process using their easy-to-use BPM design tool, and then have the process execute on their BPM (hey, maybe even BPEL) engine. Yeah, right. BPM tools are heavy-duty tools for the IT department, used to string together various IT assets. They don’t support the ad-hoc nature of most business processes, or the heavy (perhaps exclusive) human interactions needed. Even the emerging area of Human-Centric Business Process Management (as coined by Forrester) doesn’t fit the bill – those tools really don’t support the ad-hoc nature of most processes in an organization.

So where does that leave us? Essentially with meetings, email (sometimes phone calls and faxes) and productivity tools (à la Excel, PowerPoint, Word). That is how most business, and most business processes, get done. I think this is the main cause of email overload in organizations – and until some more natural mechanism for managing these ad-hoc business processes comes along, the overload will only get worse…

An interesting article on email overload from First Monday.

Increasing Your Chances of Success as an Entrepreneur

Tuesday, November 20th, 2007

I am revisiting the topic I mentioned in my previous entry on “Overconfidence and Entrepreneurial Behavior” because of an interesting post I found on the subject by Bob Warfield. He points at a study from Harvard Business School that essentially comes to the same conclusions about experience as the Israeli study in my previous post, but has some interesting insights that are useful to a first-time entrepreneur.

First, their definition of success is different from that of the Israeli study – success is effectively defined as an exit (closer to the hearts of most VCs than the definition used by the Israeli study). Here is my summary of their findings:

1. Entrepreneurs who succeeded in a prior venture (i.e., started a company that went public) have a 30% chance of succeeding in their next venture. By contrast, first-time entrepreneurs have only an 18% chance of succeeding and entrepreneurs who previously failed have a 20% chance of succeeding.

To be honest, the relations seem about right – though the actual numbers seem a bit high. Then again, maybe they aren’t if you take into account that an exit doesn’t necessarily mean that money was made.

2. Companies that are funded by more experienced (top-tier) venture capital firms are more likely to succeed. This performance increase exists only when venture capital firms invest in companies started by first-time entrepreneurs or those who previously failed. Taken together, these findings also support the view that suppliers of capital are not just efficient risk-bearers, but rather help to put capital in the right hands and ensure that it is used effectively.

This is really important news for entrepreneurs (since you can’t just conjure up experience of your own) – if you don’t have the experience yourself, you should “buy” it. Choose your investors not just on the basis of money, but on the basis of their experience, and how much of that experience they are willing to put into the company (one day a month just isn’t enough). Or, if you can, hire the experience, even though it is expensive. If you look at the numbers in the study, getting the appropriate experience on board can increase your chances of success by 7%-12%.

3. The average investment multiple (exit valuation divided by pre-money valuation) is higher for companies of previously successful serial entrepreneurs.

In other words – by getting the appropriate experience on board, you can not only increase your chances of success, but also leave more money in your pocket when you are successful.

So for me the study proves the model that we are using at eXeedtechnology, while expensive, is the right model for entrepreneurial success (early heavy involvement of experienced industry veterans as an integral part of the startup). While it doesn’t increase your chances of a home run (which seems to be heavily predicated on luck) – it significantly increases your chance of a successful exit.

Software as a Service and Hardware Virtualization

Thursday, November 15th, 2007

I have been musing lately about the connection between software delivered as a service and hardware virtualization. For me they are two sides of the same coin (I guess we could have just as easily called it Hardware-as-a-Service and Software Virtualization). The simplest way to implement a SaaS’ified version of an existing application is via Virtualization – just run as many instances of the application (or application components) as needed, each in their own virtual machine.

The downside is that this may not be very cost effective. You need to be able to easily deploy and manage new instances of the application within your virtual environment (hence VMware’s acquisitions of Akimbi and Dunes), have an appropriate pricing model for the various component technologies that make up the application, and have the ability to easily monitor the virtual vs. real machine resources needed by the application.
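To make the “one virtual instance per tenant” idea concrete, here is a minimal sketch of a per-tenant provisioner. It is purely illustrative – the `VirtualHost` and `SaaSProvisioner` names are hypothetical stand-ins, not any real virtualization API:

```python
class VirtualHost:
    """Hypothetical stand-in for a VM running one application instance."""
    def __init__(self, tenant, image):
        self.tenant = tenant
        self.image = image      # the pre-built "virtual version" of the app
        self.running = False

    def start(self):
        self.running = True


class SaaSProvisioner:
    """Naive SaaS model: deploy one VM per tenant, on demand."""
    def __init__(self, image):
        self.image = image
        self.instances = {}     # tenant id -> VirtualHost

    def provision(self, tenant):
        # Idempotent: a tenant gets exactly one instance.
        if tenant not in self.instances:
            vm = VirtualHost(tenant, self.image)
            vm.start()
            self.instances[tenant] = vm
        return self.instances[tenant]


provisioner = SaaSProvisioner("crm-app-image-v1")
vm = provisioner.provision("acme")
```

The cost problem described above is visible even in this toy: every call to `provision` for a new tenant implies a whole new machine image (and its software licenses) rather than a shared deployment.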

It is not always easy to reconcile software pricing models with virtualization. Many traditional software vendors charge per instance of their application deployed on a server, so if you want to deploy a DBMS for each instance of the application, the price can be quite prohibitive. It probably depends heavily on the number of users per instance, but for many SaaS applications there are only a few users per instance. You could rewrite the application to use a shared DBMS, with each application instance using a different database in that DBMS – but rewriting an application is very costly.
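The shared-DBMS alternative amounts to a small routing layer: one database engine, one database per tenant. A minimal sketch, using in-memory SQLite as a stand-in for a real shared DBMS (the `TenantRouter` class is my own illustrative name, not part of any product):

```python
import sqlite3

class TenantRouter:
    """Route each tenant to its own database inside one shared DBMS
    engine, instead of paying for a full DBMS per application instance."""

    def __init__(self):
        self._conns = {}  # tenant id -> connection

    def connect(self, tenant_id):
        if tenant_id not in self._conns:
            # Each in-memory SQLite DB stands in for a per-tenant
            # database (or schema) inside a real shared DBMS.
            conn = sqlite3.connect(":memory:")
            conn.execute(
                "CREATE TABLE orders (id INTEGER PRIMARY KEY, item TEXT)"
            )
            self._conns[tenant_id] = conn
        return self._conns[tenant_id]


router = TenantRouter()
router.connect("acme").execute("INSERT INTO orders (item) VALUES ('widget')")
```

Tenants stay isolated (a row inserted for “acme” is invisible to “globex”), which is exactly the property the rewrite has to guarantee – and why retrofitting it into an application that assumes a dedicated DBMS is so costly.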

Monitoring all those instances isn’t easy either. You somehow need to correlate all the virtual instances with the physical resources on the machine. One of the key reasons to virtualize is to use machine resources (especially the CPU) more effectively – which means you want to load as many instances as possible before having to buy a new machine – very different from what today’s monitoring tools offer. A good overview of these issues by Bernd Harzog can be found here.

So what’s my point? I think that we’ll see SaaS take off when it is really easy to take an existing app and create a SaaS’ified version of it – and that will happen when it is as easy as taking a “virtual version” of the application and deploying it per “tenant” as needed. We are still missing some pieces of the puzzle, but my guess is that we will see it happen in the next couple of years.

Some Thoughts on Blogging

Wednesday, November 14th, 2007

I have been blogging for a while now, and like everyone else I used to look at metrics every day; now I look at them every once in a while. What struck me most about traffic (and hopefully readership – since I can only know that users looked at the site, not whether they read it) is that the more you write about current events, the more traffic you get.

The blips that I saw in traffic were always around my posts on topics that had just been discussed by other sites, or events that had just happened – rather than posts on general topics (e.g., the post on Mashup Camp got a lot more traffic than my posts on integration and M&A). The traffic blip is of course even more pronounced if you comment on, or link back to, the main sites that discussed the event themselves.

This probably isn’t earth-shattering news to most bloggers – but how heavily traffic is biased toward current events surprised me.

Data Integration and Mashups

Saturday, November 10th, 2007

I am attending Mashup Camp and Mashup University here in Dublin (the weather reminds me of a poem a friend of mine wrote about Boston in February – gray, gray, gray, Gray!). IBM was here in force at Mashup University, giving three good presentations (along with live demos) on their mashup stack. They said that products based on this stack should be coming out early next year (we’ll see – from my experience it can be very difficult to get a new product out the door in an emerging area at IBM, since you can’t prove that the space/product is valuable enough). They have decided to pull together a whole stack for the enterprise mashup space (the content management layer, the mashup layer and the presentation layer – see my previous post on mashup layers).

One thing that struck me, especially when listening to the IBM QEDWiki and Mashup Hub presentations, is how much this upcoming set of tools for enterprise mashup creation is starting to resemble “traditional” enterprise data integration tools (e.g. Informatica and IBM/Ascential). These new tools allow easy extraction from various data sources (including legacy data like CICS, web data and DBs), and easy wiring of data flows between operator nodes (sort of a bus concept). The end result isn’t a DB load as with ETL, but rather a web page to display. There is no real cleansing capability yet, but my guess is that it will come as just another web service that can be called as a node in the flow. So mashups are like the lightweight cousin of ETL – for display rather than bulk-load purposes. It will be interesting to follow how ETL tooling and mashup tooling come together at IBM, especially since both the ETL and mashup tools are part of the Data Integration group there.
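The “operator nodes wired into a data flow” idea can be sketched in a few lines. This is my own toy illustration of the pattern, not any vendor’s API: each node takes a list of records and produces another, and the terminal node renders HTML for display rather than loading a database, which is exactly the ETL-vs-mashup distinction above.

```python
def extract(source):
    # Stand-in for pulling records from a feed, DB or legacy system.
    return list(source)

def filter_node(records, predicate):
    # Keep only records matching the predicate.
    return [r for r in records if predicate(r)]

def join_node(left, right, key):
    # Merge two record streams on a shared key (a lightweight ETL join).
    index = {r[key]: r for r in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

def render(records):
    # Terminal node: an HTML fragment for display, not a bulk DB load.
    rows = "".join(f"<li>{r['name']}: {r['city']}</li>" for r in records)
    return f"<ul>{rows}</ul>"


people = extract([{"id": 1, "name": "Ada"}, {"id": 2, "name": "Bob"}])
cities = extract([{"id": 1, "city": "Dublin"}])
page = render(join_node(people, cities, "id"))
```

A cleansing step would slot in naturally as just another node between `extract` and `join_node` – which is why I expect it to show up as a callable web service in these tools.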

Microsoft seems to be taking another route – a more lightweight, desktop-like approach focused on the presentation layer. Popfly is a tool that also lets you wire together data extraction nodes (only web data as far as I could tell, though it could be extended to other data types) and manipulation nodes – as you link the nodes, the output of one node becomes the input of the next, and so on. It seemed very presentation-oriented, and I didn’t see any Yahoo! Pipes-like functionality or legacy extraction capability.

Serena is presenting tomorrow; it will be interesting to see what direction they have taken.