Edinburgh Trams: Perception vs Performance

Tram at Ingliston Park and Ride.

Edinburgh’s Trams may yet provide the ultimate test of perception over performance. I walked the entire route (by nearest footpath) on launch day, then rode the tram back. I preferred the walk. Read More

Scottish Tram Financing

Transforming Travel… or not. Edinburgh Tram’s optimistic route plan.

Some Edinburgh City councillors already privately refer to the city’s tram project as the problem that “cannot be named”. Much as actors refer to Shakespeare’s tragedy as “the Scottish play”, superstitions of bad luck now bedevil the production. It is a dramatic shift from the optimism that initially characterised the development of the Edinburgh tram, towards pessimism.

That which cannot be named is no longer just the failure of a flagship local transport policy. The issue has engulfed the City of Edinburgh Council, and now risks destroying local politics completely: Not only the existing administration, but public trust in local government decision-making.

Political heavy-weights, who normally shy away from the minutiae of local governance, are now offering parental guidance in public: Alistair Darling (local Member of Parliament, and former United Kingdom Chancellor and Secretary of State for Transport) described the option to borrow £231 million ($370 million) to complete the city centre section of the tram line as “absolute madness” – the local population would be saddled with vast debts. Days later, Graham Birse (chief executive of the influential Edinburgh Chamber of Commerce) called the decision to not complete the city centre section, “bonkers” – far fewer passengers would use a tram that did not serve the city centre adequately. Even Alex Salmond (Scotland’s First Minister) has become directly embroiled, struggling to contain calls for an immediate public inquiry to identify who is responsible.

Burn the witches! This Scottish tragedy is rapidly descending into farce. That would be unfortunate, because this particular local difficulty goes to the heart of the Scottish nationalist agenda: A desire for greater devolution of public funds to local level. More localised independent entities have fewer financial resources, so are less able to manage expensive, risky projects. Consequently policy ambitions also need to be scaled back. Such scale isn’t necessarily a problem – small can be beautiful. The problem lies in pretending to be big, when not.

This article introduces the concept of risk in tram (and similarly large public transportation and infrastructure) projects, chronicles the decisions that lead a relatively small local authority to need to find hundreds of millions of pounds to support a single project, and explores the implications for future policy-making, especially in the context of a more devolved Scotland. Read More

Behind a Royal Wedding

Zara Phillips enters Cannongate Kirk.

The marriage of the Queen’s granddaughter, Zara Phillips, to rugby player Mike Tindall has been widely reported, especially by the celebrity press. It has been referred to as “the other” royal wedding, for its stark contrast with the marriage of William and Kate (the Duke and Duchess of Cambridge) a few months before.

That contrast isn’t just in the status of those getting married – Zara being 13th in line to the British throne, William 2nd. William and Kate’s wedding was a public spectacle, with all the pomp and ceremony of state, while Mike and Zara’s was a “quiet” family affair. Unfortunately the later wedding still generated significant public interest, and the result was a bizarre clash of family and celebrity, privacy and publicity. Read More

Simon Kirby: The Language Organism

Language is a method of sharing thoughts. It is uniquely human: Many species communicate using pre-specified techniques, such as markings on a flower to direct bees, or gestures between mammals – but only humans have the flexibility of language. Language is, perhaps, the key evolutionary advantage the human race has over everything else on planet earth.

So how have we come to develop this trait?

That’s the question Simon Kirby has spent the last 21 years trying to answer, now assisted by one of the world’s leading research groups on the topic. Their research suggests that Darwin’s model of natural selection is not a terribly good explanation. Indeed our culture actually shields us from natural selection, making our genes progressively less important to language as we develop. Simon goes on to speculate that domestication (being buffered from purely survival instincts) is a key condition of the emergence of language.

Kirby’s evidence is especially interesting because, unlike Chomsky, he does not propose an innate underlying structure for the development of language. Such a dominance of unbounded cultural transmission would be both liberating and terrifying: Liberating because it suggests unrealised flexibility in language, especially forms enabled by future technology. Terrifying because (certainly from a relativist perspective, but arguably more widely) shared thought through language is what defines our very being.

This article is based on Simon’s well-attended inaugural lecture to the University of Edinburgh, presented on 22 March 2011. Read More

Turning the Health World Upside Down

There’s a growing acceptance of the links between health, wealth and wider society. Not just the impact of wealth inequalities on measures like life expectancy. But the importance of fixing the underlying social causes of medical problems, rather than just administering the medicine and wondering why the patient doesn’t get better.

It’s convenient to frame this as a Third World problem. And while it is, it’s also a problem within and between developed countries. For example, people from one area of Glasgow (in Scotland) live a decade longer than people residing in another area of the same city, in spite of (theoretically) having access to precisely the same medical expertise.

A most basic analysis of Great Britain (and much of the developed world) reveals an organizational chasm, which most people are not prepared to cross: For example, medical services and social care provision are completely different activities – separate funding, differing structures, responsibilities, professional bodies. Even though individual “patients” shift seamlessly between them. It’s an organisational situation made worse by the difficulty both groups seem to have integrating with anything – in my experience (largely failing to integrate public transport into health and social services), a combination of:

  • The intrinsic (internal) complexity of the service itself, which leaves little mental capacity for also dealing with “external” factors.
  • The tendency to be staffed by those with people-orientated skills, who are often less able to think strategically or in abstract.
  • The dominance of the government, with a natural tendency towards bureaucracy and politicized (irrational) decision making.

Complexity is the biggest problem, because it keeps getting worse: More (medical) conditions and treatments to know about, higher public expectations, greater interdependence between different cultures and areas of the world. Inability to manage growing complexity ultimately threatens modern civilization – it will probably be one of the defining problems of the current age. So adding even further complexity in the form of understanding about “fringe issues” is far from straightforward.

Beyond these practicalities lurk difficult moral debates – literally, buying life. Public policy doesn’t come much harder than this.

Into this arena steps Nigel Crisp. Former holder of various senior positions within health administration, now a member of the UK’s House of Lords. Lord Crisp’s ideas try to “kill 2 birds with one stone”: For the developed world to adopt some of the simple, but more holistic approaches to health/society found in the less developed world, rather than merely exporting the less-than-perfect approach developed in countries like Britain.

To understand Crisp’s argument requires several sacred cows to be sacrificed: That institutions like the National Health Service (which in Britain is increasingly synonymous with nationhood, and so beyond criticism) are not perfect. That places like Africa aren’t solely populated by people that “need aid” (the unfortunate, but popular image that emerged from the famines of the 1980s). That the highest level of training and attainment isn’t necessarily the optimum solution (counter to most capitalist cultures). If you’ve managed to get that far, the political and organisational changes implied are still genuinely revolutionary: To paraphrase one commenter, “government simply doesn’t turn itself upside down”.

While it is very easy to decry Nigel Crisp’s approach as idealistic, even naively impractical, he is addressing a serious contemporary problem. And his broad thinking exposes a lot of unpleasant truths. This article is based on a lecture Crisp gave to a (mostly) medical audience at the University of Edinburgh. And the response of his audience. The lecture was based on his book, Turning the World Upside Down: the search for global health in the 21st Century (which I have not read). Read More

Alex van Someren’s Lucky Acorns

Alex van Someren. Alex van Someren is one of those rare people, without whom our modern world would probably be a little bit different. From writing the first book about programming ARM architecture, the computer processor which now sits at the core of almost every mobile phone on the planet. To providing the technology that made Secure Socket Layer (SSL) more commercially viable, and helped enable the ecommerce internet revolution of the late 1990s.

Yet his story is fascinating because it is a definitive study in luck: Not just pure chance. But the type of luck that comes from a combination of unusual personal interests, social circumstance, and the active pursuit of something different.

It’s a reality that few “successful” entrepreneurial people acknowledge, because it’s an uncomfortable reality: It doesn’t fit neatly into a 5-point plan for instant fame and fortune [also see box below]. And it leaves a nagging doubt that the outcome could easily have been unsuccessful. And while I suspect that Alex isn’t comfortable with pure chance, he provides ample examples of how other elements of luck can be biased. How the odds can be improved. The dice loaded more favourably.

Those examples make Alex van Someren worth understanding. This article is based on a talk he gave to the Edinburgh Informatics Forum. Read More

Michael Gazzaniga on the Science of Mind Constraining Matter

Michael Gazzaniga. Can neuroscience explain it? You know – consciousness, being, the number 42. And if everything you thought you were transpired to be nothing more than an easily deceived heap of neurons, would that trouble “you”?

During October 2009, Michael Gazzaniga gave a fascinating series of Gifford lectures exploring how our brains process the information that gives us our sense of “I”. Gazzaniga drew extensively from neuropsychological studies of people with “split brains” (explained later) to develop the notion of a single “interpreter” within the brain – a part of the brain that analyses all the data available for meaning.

Michael Gazzaniga then attempted to rationalise the interpreter, concluding that our focus should be on the interactions of people, not the brain itself. This logic was then expanded to wider society – social structure, interaction, and law. Those later thoughts raised many more questions than were answered.

This article attempts to summarise the key themes in a non-technical manner, with a few naive attempts to interrogate the theories developed. This is my interpretation of 6 hours of lectures. Interpretation, because I tend to recreate Gazzaniga’s conclusions by re-analysing the information presented. With a complex topic such as this, it is likely that some of my interpretations will differ from his. Sections titled “Interlude” are entirely my analysis. Read More


As I write, the United Kingdom is in the midst of a national election campaign. A month during which politicians vie to confuse the electorate with big numbers. Politics is suddenly ravaged by intangibility, because the national economy is unable to sustain the usual tangible proxies for a better life – “more schools and hospitals” – and because the tangible results of fixing that economy tend to be unattractive – “less schools and hospitals”. So the best political strategy is not explaining the consequences of choices in a language ordinary people can understand.

Do you like the sound of £100 million ($150 million)? Can I tempt you with £160 billion? Expressing these figures per person in the population can be useful. The first figure is one bar of luxury chocolate for everyone. Doesn’t sound so big now, does it? The second figure is like everyone having a £2,500 bank overdraft (loan). Strange that, because indirectly, we do.
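The per-person framing above is simple arithmetic, sketched below. The population figure of 62 million is my assumption (roughly the UK population at the time); the article’s “£2,500 overdraft” is a round-number approximation of the same division.

```python
# Sketch of expressing headline public sums per person.
# The population figure is an assumption, not from the article.
POPULATION = 62_000_000  # roughly the UK population at the time

def per_person(total_pounds):
    """Spread a headline sum across every person in the population."""
    return total_pounds / POPULATION

print(f"£{per_person(100_000_000):.2f}")       # £100 million: about one luxury chocolate bar each
print(f"£{per_person(160_000_000_000):,.0f}")  # £160 billion: roughly a £2,500 overdraft each
```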

Unfortunately, applying the economics of household groceries to major items of government expenditure introduces a false certainty. The idea that one can visit a store where luxury chocolate bars are sold for precisely £1.70. Yet many large elements of government expenditure are akin to ordering a chocolate bar years before it can be eaten, for a price that transpires to be somewhere between £1 and £5.

Larger businesses will be familiar with this concept. It’s called risk. Such businesses are often far more interested in what “it might cost” (£5) than what “it will cost” (£1.70), because what it might cost might lead the business to bankruptcy.

The national economy is chaotic in its complexity, but overall, things should average out. So long as all the assumptions are broadly reasonable: Ultimately some will earn/cost more, some less. Short-term imbalance can be solved by (basically) printing more money, and then downgrading future assumptions until everything is back in balance.

However, this breeds a form of arrogance. A sense that government doesn’t need to consider the possibilities. That we can deliver a radical new policy – that has never been done before – and, in spite of it never having been done before, we know precisely how much it is going to cost. Just like a bar of chocolate.

Unfortunately, assumptions tend towards optimism. On average, projected costs are less than actual costs. This isn’t just a problem for accountants. It means that decisions are taken which do not reflect reality. Potentially leading to a Disneyland scenario, where everything is affordable until after the decision is taken, when suddenly everything has become too expensive. It ultimately challenges the validity of decisions, and in doing so, the moral authority of those that take them.

This article uses the Edinburgh Tram project to demonstrate the inherent uncertainty of large government infrastructure projects. It discusses the role of optimism in planning, and the methods used to reconcile planned optimism with subsequent reality. The article describes how the involvement of the private sector in public projects has evolved over the last 20 years, and highlights the different time-scales applied to private investment and public choices. It concludes that optimism is not only unavoidable, but necessary. Rather, the true problem lies in the tendency of people to demand certainty from the public sector, while accepting uncertainty in the private sector. Read More

Ian McCaig’s History of Lastminute.com

Lastminute.com. Ian McCaig, lastminute.com’s Chief Executive Officer, told the history of this online travel and lifestyle retailer to the Edinburgh Entrepreneurship Club.

From a stereotypical “dot com” baby in 1998, to the rapidly maturing teenager of 2010. Ian charted the way in which the business’s strategy, structure and ownership had evolved as it matured from something with the turnover of a small local pub, to a multi-billion enterprise. Covering the problems of merging acquired companies, the need to scale costs, and the change from a public (stock market) ownership to private equity.

This article is based on Ian’s talk. It concludes with some personal analysis of the future, with particular reference to my favourite topic, public transportation information… Read More

Financing Hyper-Virality in the Clouds

This article probes the implications of cloud computing for financing very rapidly distributed internet-based services and products. It contains rough, inadequately researched thoughts, sparked from discussions at the recent CloudCamp Scotland. Read More


WeeMee. WeeWorld is a teen-orientated social network, best known for their customized avatars, “WeeMees”. WeeWorld has evolved into an eclectic mix of community, casual games, and virtual goods. Steve Young, creative director, spoke to a small group in Edinburgh. Steve discussed the motivations and behaviour of WeeWorld’s users, and explored the challenges of working with 2D WeeMees, particularly as they move into WeeWorld’s new virtual (synchronous) world.


WeeWorld’s core market is teenagers, mostly in North America. Average age 16 (minimum 13, although younger users may simply lie about their age). 60% are female. The dominant market segment was characterised as “spoilt rich kids” – typically those with their own computers. Of the 23 million registered users, about a million visit the WeeWorld site each month, and 80,000 login each day.

Usage differs from other teen social networks, such as Gaia Online: Only 6% of logged-in users visit the site’s forums, while 80% alter their WeeMee. Teen worlds are evidently not generic.

WeeMees (from the Glaswegian, “little me”) can be placed within personalised 2D rooms (in the style of “cardboard theatre”), used as characters within casual games, or rendered as avatars in a new virtual world called, simply enough, “World”. WeeMees are also used on third party websites and services, including messenger services, such as AIM or Live. Initial ideas for WeeMees had resulted in a lot of avatars simply being copied. APIs now provide some control over how WeeMees are reused.

Users’ main aim is “to gather as many friends as possible”. And to chat in a variant of the English language that even JeffK would find almost unintelligible: $iNG-UL?

Virtual Goods

WeeMees can be customized for free: Body, clothes and accessories. However users can also buy “Points”, which can be spent on specific items.

Points can be purchased via PayPal transactions or pre-paid cards, which are sold in US stores. Kids tend to regard these mechanisms like free credit cards: They are not seen as real money.

People pay for “uniqueness”. However, items need not be complex: The most popular item sold is a simple Alice band.

The most fascinating revelation was that the introduction of the new synchronous (virtual) world doubled the sales of virtual goods. This “World” is not even out of beta testing yet. “World” places WeeMees in the same interactive space as one another. This contrasts to the other areas of the site, where WeeMees are not competing for space. I think that implies the more an avatar needs to stand out from the crowd, the more virtual “Bling” is worth to that avatar’s owner.

WeeWorld is keen to avoid its Points being traded as a virtual currency. Money can only be converted into Points, not back again.


The key to WeeWorld’s success is “immersion”. The key to its revenue is “engagement”. These concepts guide development.

Although WeeMees are cartoon-like (in the style associated with South Park), customizations still need to reflect what people would wear in “real life”. For example, T-shirt branding needs to be subtle – a small logo on part of the garment.

The goal for user-generated content (customizations of WeeMees and rooms) is to make it hard for the user to create something that looks bad. For example, MySpace customisations can (and in my opinion, sadly often do) look terrible.

WeeWorld has adjusted to match conservative US culture. The cannabis plants created in early experiments are long gone. There are no alcoholic drinks. Negotiations with Walmart even forced WeeWorld to disable the customization of breast size.

The development of “World” posed an interesting problem: How should WeeMees move? All the artwork and customizations had been designed for static display, without movement animations. The World uses embedded Flash objects to display information to users, so the amount of data transferred about other users’ movements needs to be minimal.

The solution was to make WeeMees hop. Users can also select a trajectory and fire their WeeMees in a particular direction. Navigating World’s 2D platform-game environment is quite surreal, but strangely fun!


Social networks are becoming more like virtual worlds, while virtual worlds are becoming more like social networks. WeeWorld is trying to steer a path down the middle. Like all the businesses involved, they are still “feeling their way”, finding out what works.

Development time-scales for WeeWorld (and similar products) are very short. Steve was somewhat frustrated that development of the “World” had taken a whole quarter (3 months). The contrast to video-game style virtual worlds is stark: Those typically take 3 years to construct.

WeeWorld use a Scrum/agile development process (which suits the constantly evolving product). Casual games (a commonly requested feature) are often out-sourced to other developers.

The ability to develop content quickly makes it very easy for good ideas to be copied by competitors. For example, Zwinky might seem remarkably similar…

Bill Joos on Pitching

Bill Joos (or William Wallace Joos, as he prefers to be called in Scotland) spoke at an Edinburgh Entrepreneurship Club/Edinburgh-Stanford Link event on 11 March 2008. Bill experienced plenty of pitches while with Garage Technology Ventures, and shared the top ten mistakes for early stage/startup company business plans and pitches. While his focus was on pitching to venture capitalists, much of what he said is applicable to any business planning process. This article summarises his talk. Read More

David Law on Design as a Competitive Advantage

David Law has successfully launched and run a number of influential design businesses, including Speck Design. His work ranges from iPod skins to “camera armor” to video conferencing environments. David spoke to a small group at the University of Edinburgh on 26th March 2008. He proposed that design should be at the core of a modern business, as a competitive advantage to differentiate a business from others. This article summarises David’s argument, describing why there is a need to differentiate, his approach to design, targeting of niches, and how to stay ahead.

Design to Differentiate

Things are getting easier to make. There has never been a more informed consumer. Markets for consumer products are highly competitive, with little barrier to entry. All this means that popular designs are likely to be emulated, eroding prices downward. The aim of most manufacturing is simply to reduce cost to remain competitive.

The solution? Differentiation. A small company cannot differentiate products through marketing, but it can differentiate through good design.

Approach to Design

David sees design as “supercharged problem solving”. The aim is to satisfy a user need.

How is that need found? Observe users. Don’t ask them, watch them. Find where they get mad, and design a product that takes away their pain.

Then create lots of prototypes quickly. For real. CAD is too slow and lacks realism. Better to create a paper mock-up, which can be seen and handled. Keep on iterating until the design is right.

Development Triangle

The development process behind a new product can be weighted between three objectives:

  • Speed
  • Innovation
  • Cost

For example, a project might be orientated towards speed, with a new product developed in a few weeks. Other projects might be highly cost sensitive. David believes that most companies never consider the balance of objectives, and so tend to end up “somewhere in the middle”.

The Niche

The mantra “always start in a niche” goes against the instinct of many entrepreneurs, who tend to gravitate towards the biggest problem or market, since the rewards from success are greater there.

However, niches have a number of distinct advantages:

  • Higher margins
  • Lower competition
  • Easier to “get in to” and find needs within
  • Appreciative audience.

David used the example of Camera Armor: Products that protect SLR equipment while in use. SLRs are a niche within a larger camera market. From this niche it was possible to develop into the larger market for smaller digital cameras – creating innovative cases and a rather dinky little tripod that snaps out from the bottom of the camera when needed.

Staying Ahead

David Law’s team consists of a small number of designers. All their products are manufactured elsewhere (in China). The manufacturing process is simple – the real value of what they do is in design.

Could China produce good design? David argued that design needs proximity to the market. However, he did cite the example of Samsung: Historically a manufacturer competing on price alone, they have successfully developed a design-orientated approach, and are increasingly producing genuinely good designs in the vein of companies such as Sony or Apple. [It is possible that eventually Chinese manufacturers will follow this path, and become more design-orientated themselves.]

But if it is easy to copy products, how can value be maintained in good design? It depends on the product:

  • Where a key part of the design can be patented, a successful design will pay a long term dividend.
  • Where a design cannot be patented (most common), the method is simple: Keep on innovating, and always keep a step ahead.

BarCamp: Living on Virtual Fish

For those that missed my BarCamp Scotland presentation, “Living on Virtual Fish”, you can view it on SlideShare.

The following articles loosely correlate to each of the talk’s sections, and provide more depth and explanation:

  1. Learn2Play, the new Real Money Trading?
  2. Adventures in Online Advertising
  3. Thoughts on a Socio-Economic Environment based on Nothing

Dave McClure on Social Networking and Web 2.0

Dave McClure addressed an Edinburgh Entrepreneurship Club/Edinburgh-Stanford Link event on 29 January 2008. He outlined some of the advantages of “Web 2.0”, talked extensively on the use of real-time metrics to evolve web services, developed a history of social networking websites, and highlighted the interesting aspects of Facebook. This article summarises Dave’s talk, with some additional commentary from myself.

Advantages of Web 2.0

Web 2.0 is characterised by the:

  • low cost of acquiring large numbers of users,
  • ability to generate revenue through advertising/e-commerce,
  • use of online metrics as feedback loops in product development,
  • sustainable long term profitability (at least for some).

Dave McClure did not actually try to define the term, which was probably wise. Generally the term is applied to websites and services where users collaborate or share content.

Web 2.0 has a number of advantages (although it could be argued that some of these apply to earlier iterations of the internet too):

  • APIs – the ability to act as a web-based service, rather than just a “website”.
  • PC-like interface, albeit still 5 years behind contemporary PC interfaces.
  • RSS feeds (for data sharing) and widgets (user interfaces embedded elsewhere).
  • Use of email mailing lists for retaining traffic. While email certainly isn’t a “web 2.0” technology, his argument is that email is increasingly overlooked as a means of retaining website visitors.
  • Groups of people acting as a trusted filter for information over the internet.
  • Tags (to give information structure) and ratings (to make better content stand out).
  • Real-time measurement systems rapidly giving feedback. Key is the immediacy of the information, and the ability to evolve the web service to reflect that.
  • Ability to make money from advertising, leads and e-commerce. While true since about 1995, the web user-base is now far larger, so the potential to leverage revenue also greater.

Metrics for Startups

I believe the ability to very accurately analyse website usage, implement changes, and then analyse the results, is a key advantage of web-based services. It is an advantage often overlooked by information technology professionals and programmers. I’m not sure why – possibly because web service developers:

  • don’t appreciate how hard/expensive gathering equivalent information is in other sectors of the economy, or
  • are scared to make changes in case they lose business, and/or believe their initial perception of what “works” to be optimum, or
  • just lack the prerequisite analytical curiosity to investigate?

Or perhaps Web 2.0 just isn’t mature enough yet for developers to have to worry too much about optimisation: A new concept for a site will probably either fail horribly or generate super-normal profits. The sector isn’t yet competing on very tight margins, where subtle optimisation can make or break profitability. Of course, optimisation of websites can deliver substantial changes in user behaviour. For example, I have found that a relatively subtle change to the position of an advert can alter the revenue generated by over 20%.

Dave McClure developed the AARRR model. AARRR segments the five stages of building a profitable user-base for a website:

  1. Acquisition – gaining new users from channels such as search or advertising.
  2. Activation – users’ first experience of the site: do they progress beyond the “landing page” they first see?
  3. Retention – do users come back?
  4. Referral – do users invite their friends to visit?
  5. Revenue – do all those users create a revenue stream?

For each stage, the site operator should analyse at least one metric. The table below gives some possible metrics for each stage, with a sample target conversion ratio (the proportion that reach that stage).

Category      User Status (Test)                                                            Conversion Target %
Acquisition   Visits site (or landing page, or external widget)                             100%
              Doesn’t abandon: views 2+ pages, stays 10+ seconds, 2+ clicks                 70%
Activation    Happy 1st visit: views x pages, stays y seconds, z clicks                     30%
              Email/blog/RSS/widget signup (anything that could lead to a repeat visit)     5%
              Account signup (includes profile data)                                        2%
Retention     Email or RSS leading to clickthrough                                          3%
              Repeat visitor: 3+ visits in first 30 days                                    2%
Referral      Refers 1+ users who visit the site                                            2%
              Refers 1+ users who activate                                                  1%
Revenue       User generates minimum revenue                                                2%
              User generates break-even revenue                                             1%
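Each conversion target is expressed as a share of the original acquired audience, not of the previous stage. A minimal sketch of that funnel arithmetic, using hypothetical visitor counts of my own invention (not figures from the talk):

```python
# Hypothetical raw counts for a simplified AARRR funnel.
# Stage names and numbers are illustrative only.
funnel = [
    ("Acquisition: visits site", 10_000),
    ("Activation: happy 1st visit", 3_000),
    ("Retention: 3+ visits in first 30 days", 200),
    ("Referral: refers 1+ users who visit", 200),
    ("Revenue: generates minimum revenue", 200),
]

def conversion_targets(stages):
    """Express each stage as a percentage of the acquired base (first stage)."""
    base = stages[0][1]
    return [(name, round(100.0 * count / base, 1)) for name, count in stages]

for name, pct in conversion_targets(funnel):
    print(f"{name}: {pct}%")
```

A drop in any one of these percentages points at the stage to work on – the iterate-and-measure loop described below.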

These metrics become critical to the design of the product. Poor activation conversion ratio? Work on the landing page(s): Guess at an improvement, test it out on the site, analyse the feedback, and iterate improvements. Gradually you’ll optimise performance of the site.

I find this attempt to structure analysis and relate it back to core business performance very interesting. However, the sample metrics can be improved on a lot, depending on the nature of the site. For example, to track virality (referral), I might watch the monthly number of del.icio.us adds, or monitor the number of new links posted on forums (Google’s Webmaster tools allow that). Tracking users all the way through the tree from arrival to revenue generation needs to be done pragmatically where revenue is generated from very infrequent “big-ticket” sales: With minimal day-to-day data, it can take a long time to determine whether a change genuinely has improved long-term revenue, or whether natural fluctuations in day-to-day earnings just contrived to make it a “good day/week/month”.

Now I know this approach works, but why it works is less clear. We might like to think that we are genuinely improving the user experience, and maybe we are. However, it could be argued that merely the act of change is perceived by users as an improvement – a variation of the Hawthorne effect. The counter argument to the Hawthorne effect can be seen on sites with low proportions of repeat visitors: The majority of those experiencing the improvement will not know what was implemented before.

History of Social Networking

Dave McClure’s interpretation of the timeline of the development of social networking sites is as interesting for what it includes as for what it omits: no Geocities, no Usenet, no forums, no MUDs… The following timeline shows key services in chronological order, except without dates – all the services shown were created within the last ten years:

  • Email lists (Yahoo Groups)
  • 1.0 Social Networks (Friendster) – these early networks established the importance of up-time (service reliability) and the ability of users to manipulate pages.
  • Blogs – links between weblogs acting as networks.
  • Photos and video (Flickr, YouTube) – created a sense of community, and allowed tagging/grouping of content.
  • 2.0 Social Networks (LinkedIn)
  • Feeds and shared social information (Upcoming.com event planner)
  • Applications and widgets – the ability to embed data about a user’s friends in applications is probably “the most powerful change on the internet in the last ten years”.
  • Hosted platforms (OpenSocial, Facebook) – most services are likely to allow 3rd-party developers to provide applications on their platforms.
  • Vertical communities (Ning) – ultimately this may develop such that a service like Facebook acts as a repository for a user’s online identity, while specific groups of people gather on other networks.
  • Availability of information – a single sign-on, with automatic data transfer between services.

The future may be “Social Prediction Networks”. This is a variation on the theme of using trusted networks to filter content: Instead of Blogging meets Search, I characterise Social Prediction Networks as Digg meets Facebook. Shrewd observers will note Facebook has already implemented Digg-like features, while simultaneously topic-specific, community-orientated Digg-clones are being launched. People gather into interest groups around a topic, and then through use of tagging and rating, the community filters content. The system effectively predicts what other people in the group will find useful. This may be an optimum approach for groups above the Dunbar number (or an equivalent number representing the maximum number of people a person can form stable relationships with).
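A toy sketch of that filtering idea follows. Everything here is an invented assumption on my part – the data structure, the 0.5 threshold, and the function name belong to no real service – it only illustrates predicting one member's interest from the rest of the group's ratings.

```python
from statistics import mean

def predict_useful(item, user, group_ratings, threshold=0.5):
    """Predict whether `user` will find `item` useful from the mean
    rating (0-1) given by the other members of their interest group.
    `group_ratings` maps member -> {item: rating}; names are invented."""
    others = [ratings[item]
              for member, ratings in group_ratings.items()
              if member != user and item in ratings]
    return bool(others) and mean(others) >= threshold

# The group's ratings stand in for the tagging/rating behaviour above.
group = {"ann": {"talk-video": 0.9}, "bob": {"talk-video": 0.6}, "cat": {}}
print(predict_useful("talk-video", "cat", group))  # group mean 0.75 -> True
```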

Interesting Aspects of Facebook

Three were discussed:

  1. Social graph (friend list) – email and SMS (mobile phone) service providers have rich data on the frequency of communication between people, yet aren’t using this information to form social networks. Dave noted that two major email service providers, Yahoo and AOL, are currently struggling to thrive – this could be an avenue for their future development.
  2. Shared social activity streams – knowledge of what your friends think is important. Friends are more likely to influence you than people you do not know.
  3. API/Platform – dynamic behaviour and links across your social network.

Further Observations

Will growth in social networks continue? Yes – the friend list adds value to the content.

Will others compete? Probably, as a “long-tail” of networks, likely topic-specific.

Can social networks be better monetised? Currently social networking services generate far less revenue than search services. The challenge for social networking sites is to move towards the wealthy territory of search services. At the same time, search services are moving towards becoming more like social networking sites.

How can traditional companies engage with social networking sites? Social networking sites work best for sales where a product has a strong aspect of peer pressure in the decision to buy. The most important advice is not to create a copy of a website: Instead provide less complex content that uses social networks to draw users to a website.

Applications for social networks tend to be over-complicated, normally because programmers attempt to implement functions found in software they have previously written for other platforms or websites. Generally the successful applications are very simple. Some developers have opted to break complex applications into a series of smaller applications, and use the virality of social networking sites to build traffic for one application from another.

Social network applications are exceptionally viral. They can gain users very rapidly, yet also lose users just as fast. Much of this virality comes from feeds, which typically alert friends when a user installs an application. Within a few years the feed is likely to be based on actual usage of an application.

Facebook now allows applications to be added to “fan pages” (or product pages) – so individual users are no longer forced to install an application in order to use it.

Those using email lists for retention should focus on the title of the email, not the content. Merely make it easy to find a URL in the content. The key decision for the reader is whether to open the email. What the email says is almost irrelevant – they’ve already decided to visit the site based on the title.

Mike Masnick on Techdirt, Information and Consultancy

These are notes from a talk given by Mike Masnick, CEO of Techdirt, a “technology information company”. Mike addressed a small Edinburgh Entrepreneurship Club/Edinburgh-Stanford Link gathering on 22 January 2008. He outlined the company’s history and philosophy – “use what’s abundant to solve what’s scarce” – and outlined an interesting approach to the delivery of expert/consultancy business services. Read More

Bill Urschel on Internet Advertising Innovation

Bill Urschel is the CEO of the internet advertising exchange, AdECN. Bill spoke to an Edinburgh Entrepreneurship Club/Edinburgh-Stanford Link gathering on 14 November 2007, about the development of AdECN, its role as an exchange market for internet advertising space, and the future of internet advertising. This article is based on Bill’s talk, which he gave in a personal capacity.

Development of AdECN

William Urschel first realised the market potential for computer/internet ventures while writing computer books. He has started a number of software/internet businesses since, and looks for three things in a new venture:

  1. Market: A market of manageable size to address, with an overall growth trend (“the rising tide lifts all boats”).
  2. People: 1-5 people with either technology or business backgrounds, and the correct attitude and work ethic.
  3. Product: Address a need… and it is nice if it works.

Historically, advertisers would pay an advertising network, who would then display adverts using the advertising inventory on publishers’ websites. It was common for the network the advertiser dealt with to run adverts across multiple networks. Often business flowed from network to network to network, before an advert actually appeared on a publisher’s site. This resulted in reduced revenue for the publisher, as each network “middleman” took their share: Perhaps for every $1 of advertiser’s money spent, just $0.18 would reach publishers. Waste still existed in the market: Half of the display advertising market was either going unsold or “under-sold” (sold for a significantly lower value than it could attain, simply to fill the space).

How AdECN Works

AdECN was launched in 2002, but didn’t “get moving” until 2004. Its role is to act as a stock exchange for network-to-network advertising deals. The ECN part of the name, meaning Electronic Communication Network, is derived from financial stock markets.

Networks continue to deal directly with their own advertisers and their own publishers. The process first tries to match an advertiser’s demand to a publisher’s inventory within the same network. When advertising demand and publisher inventory within the first network are mismatched, AdECN steps in to broker a deal between different networks. The result is that advertisers get their adverts published, and publishers fill their inventory with paying adverts. The whole auction process takes place in 6-7 ms, at the time the publisher’s page is viewed.
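The two-step matching described above might be sketched as follows. The data shapes, names, and “highest bid wins” rule are my assumptions; the talk did not specify the auction mechanics.

```python
def fill_impression(publisher_network, bids):
    """bids: list of (network, advertiser, bid_dollars) tuples.
    Try demand from the publisher's own network first; otherwise
    auction the impression across all networks on the exchange."""
    own = [b for b in bids if b[0] == publisher_network]
    pool = own if own else bids
    if not pool:
        return None                       # inventory goes unfilled
    return max(pool, key=lambda b: b[2])  # highest bid wins

bids = [("netA", "ad1", 0.80), ("netB", "ad2", 1.20)]
print(fill_impression("netA", bids))  # in-network demand matched first
print(fill_impression("netC", bids))  # cross-network deal brokered
```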

AdECN has been careful to make itself an ally of the networks, not a competitor to them:

  • It does not deal directly with advertisers or publishers – it has a distinct role in providing the infrastructure for the exchange.
  • Networks split the commission on the deals between them, just like stock brokers.
  • AdECN levies a flat fee, so is neutral as to whoever wins or loses the auction.

The neutrality of AdECN is seen as their main competitive advantage over Yahoo and Google: AdECN isn’t an advertising network in its own right. [Although as described later, AdECN may simply be becoming the new breed of advertising network, in a marketplace where advertisers will increasingly deal directly with publishers. I did not get the chance to query this apparent contradiction.]

Contextual and Behavioral Data

Adverts can be targeted contextually or behaviorally:

  • Context considers simple variables such as time of day or location (typically the country viewer is resident in).
  • Behaviour (or “profile”) considers variables such as the age of the viewer and their search patterns.

Currently 95% of all targeting is contextual because it has historically been difficult to match behavioral information in a fast and ethical manner. In the next “3-5 years”, behavioral advertising will move to dominate 80% of online [display?] advertising.

AdECN captures a lot of data, which is increasingly the added value it can offer networks. By design it does not store data: Data is used only in the (near-instant) auction process. Individual networks/advertisers can bolt on their own “black boxes” to AdECN – bespoke software they design to utilise auction data so that their advertising spend is optimised. The most common use of black boxes is to split Cost per Click (CPC – the advertiser pays when someone clicks the advert) and Cost per Action (CPA – the advertiser pays when an action is completed, such as an enquiry form submitted, or a product sold).
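One standard way such a black box can compare CPC and CPA demand within a single auction – an assumption on my part, since the talk did not describe the mechanics – is to normalise every bid to an expected value per thousand impressions (eCPM):

```python
def ecpm_from_cpc(cpc, click_rate):
    """Expected revenue per 1,000 impressions of a cost-per-click bid."""
    return cpc * click_rate * 1000

def ecpm_from_cpa(cpa, click_rate, action_rate):
    """Expected revenue per 1,000 impressions of a cost-per-action bid."""
    return cpa * click_rate * action_rate * 1000

# A $0.50 CPC bid at a 2% click rate, and a $30 CPA bid at a 2% click
# rate with a 2% action rate, can now be ranked on a common scale.
print(ecpm_from_cpc(0.50, 0.02), ecpm_from_cpa(30.00, 0.02, 0.02))
```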

Privacy remains a key issue. Self-regulation is seen as the way forward. This is based on not keeping personal data, and instead focusing on core questions like “what is the consumer going to buy?” The history of Gator (spyware which monitored browsing habits) shows that consumer pressure will eventually win over advertising networks which don’t stick to reputable privacy practices.

In Hindsight

For the first two years of the venture, AdECN did not perform well. For an internet startup, two years is a long time. In the early years, AdECN’s team were “too abstract and too technical”. The software was eventually rewritten. Fortunately the venture’s backers were able to see the long-term potential. The lack of barriers to entry into the exchange did allow many networks to trial it, which allowed business to slowly build.

By 2004 they were “in the right place, at the right time”, and the venture was eventually bought by Microsoft. Bill Urschel couldn’t reveal specifics, but stated that there was “no b” (no billions) in the price paid. His final round of investors received a 9.7x return over four months, so nobody was complaining. They sold “too early”, but in practice they had to sell: Similar competitors (although Bill claims not actually exchanges) Rightmedia and Doubleclick sold to Yahoo and Google respectively. It became inevitable that Microsoft had to buy an exchange.

The Future

The underlying market is expanding, and forecast to continue to grow. Critically:

  • Online advertising accounts for only 7% of total advertising spend, yet occupies more than 7% of consumers’ time: Advertisers are behind the trend, and will logically seek to catch up.
  • Display advertising (on publishers’ sites) is growing faster than search advertising (on sites such as Google search results).
  • With exchanges such as AdECN, display advertising now has the same data/targeting advantages search had 6-7 years ago. Real-time auctions and targeting have taken much longer.

The industry itself will likely change, particularly what is meant by the term “ad network”: Advertising agencies can now deal with publishers directly, and use the exchange to handle excess supply or demand – there is no need for the old middlemen, the advertising networks.

The average CPM (Cost per Mille, where a mille is a thousand advert impressions) rates are likely to remain the same where already high (for example, rates around $25 will see little change). However, targeting will allow undersold inventory to be utilised much more effectively, so space sold closer to $0.25 will increase in value. As noted earlier, behavioral/profile targeting is likely to develop such that it dominates within 3-5 years.
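The CPM arithmetic above is simple but worth making concrete; the impression counts here are invented for illustration.

```python
def cpm_revenue(cpm_dollars, impressions):
    """Revenue from selling `impressions` at a given CPM rate
    (dollars per thousand advert impressions)."""
    return cpm_dollars * impressions / 1000

# 40,000 impressions of under-sold space at $0.25 CPM earn only $10;
# if targeting lifts that inventory's rate, revenue scales linearly.
print(cpm_revenue(0.25, 40_000))
print(cpm_revenue(25.00, 40_000))
```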

Could exchanges move into the television and print advertising arena? Current systems could be improved, but the exchange really needs real-time auctions to flourish.

Scottish Innovation: Designs without markets?

Why is Scotland creating a fifth of the UK’s patents, but only gaining a tenth of UK venture capital? David Farquhar, CEO of 2in10, argues that in the technology sector at least, we don’t build the right things: We are not focused on marketing and selling. These are rough notes from David Farquhar’s talk to an Edinburgh Entrepreneurship Club/Edinburgh-Stanford Link event.

What’s wrong with technology innovation in Scotland?

First there is a tendency to focus on Intellectual Property (IP). Then focus shifts to the customer, but by offering services – a different product for each customer. That creates a lot of small businesses that struggle to grow.

What’s keeping CEOs awake at night? Lack of revenue from sales. And their investors? Lack of plans for making sales.

80% of firms are targeting the US as their main market, yet most lack basic knowledge about how to sell to the US market. If you don’t know how much you’re going to need to pay a US sales-person, how robust really is your business plan?

Market focus

Two thirds of the most highly valued technology comes from the US, so why not adopt their core philosophy? Build around a market problem, and sell the way customers want to buy.

To misquote Ben Holmes (Index Ventures):

“For every £1 invested in building, spend £5 on marketing and selling.”

Practices and structure

There is not a lack of sales talent in Scotland, nor a crisis of confidence.

There is a need for more best practice to be adopted, specifically:

  • Build the right thing.
  • Talk in the right language.
  • Understand how people like to buy things.
  • Drive revenue.

Most startup firms are structured poorly. Typical startups contain a CEO (who can talk) and a CTO (the brainy one). A structure then develops with engineering and sales/marketing separate.

Instead, sales and marketing should be separate functions, with a “healthy” tension between the two. Product development should reside within the marketing function, not with engineering. This often marginalises the original brains behind the operation (CTO) within the structure, but is necessary to keep market focus.


Wolfson Microelectronics is one of the best known Scottish-based technology firms. Its audio technology is used in products such as the iPod and Xbox. It was started in 1984, but by 2000 had revenue of only £6 million per year. To prepare for flotation (IPO) it strengthened its board, including people who had worked in the US. They introduced concepts such as product managers, which fundamentally changed the way the business operated. By 2007 revenue had risen to £180 million.

Still marketing

Failing to understand the buying cycle is a key criticism of selling: For example, a new product might only be purchased as part of an existing product – selling the new product separately to consumers might not work.

A market can be defined as “a group of customers with the same pain and money”. Money, or else they cannot buy. Pain, because they have to have a reason to buy. And a group, because they have to talk to one another (markets follow a few lead individuals).

It is important not to make assumptions about what the market requires. Chances are the market isn’t how the startup team envisaged it, or has different priorities.

Lumigent was highlighted as a good example of how one technology could be pitched to several different audiences.


David showed how Thomas Siebel‘s Customer Relationship Management software was developed.

It starts with a given idea, in this case based on exposure to the problems of potential customers. The IP stems from that given idea. IP is important for product differentiation. A market segment is identified (again from the given idea) that both has pain and money. From the pain and IP, develop a product. If there is competition, it is necessary to address a specific category. Finally, from the product and category emerges a position – the claim on which the product will be sold. And from that, revenue is generated.

This pattern is not entirely restricted to Scotland. It seems a common complaint that the UK and Europe are much better at creating things than commercialising them, in contrast to the US. In subsequent discussion it was noted that in the US engineers are often taught within universities how to commercialise ideas, which rarely happens in the UK.

Bart Balocki on Practices of Venture Capitalists

Or, how “big bets” are made on “incomplete data”. Bart Balocki, an alumnus of Stanford University, researched the practices of Silicon Valley (US) venture capitalists. These are rough highlights from his talk to the Edinburgh Entrepreneurship Club/Edinburgh-Stanford Link gathering on 17 October 2007. Bart covered the history of US venture capital, outlined how venture capitalists operate, and developed a model of how perceptions of value change over time.

Brief History of US Venture Capital

A simplified time-line:

  • Prior to 1946, high-risk investing was done informally by wealthy families.
  • Post-war, government led the commercialisation of new technology.
  • In 1957 DEC (Digital Equipment Corporation) was successfully developed using a venture capital model. This was probably the first evidence that venture capitalists could earn a return. Other successes followed during the 1960s, notably Intel.
  • In the 1970s Federal regulations were eased to allow large investment funds (such as pensions) to support venture capital. Successes such as Apple encouraged greater venture capital investment, and led to a growth in scale. In this period many venture capital firms moved away from the traditional hub of banking/finance, to the Sand Hill Road area of Silicon Valley. Sand Hill Road is close to the university, the area from which most ventures were emerging.
  • 1973 marked the first tangible peak in venture capital investment: The first of several “bubbles” – short term booms in venture capital activity (measured both as number and value of investments). The 1980s saw a bubble around personal computing (121 Initial Public Offerings (IPOs) in 1983, compared to 22 the year before), finally crashing with the stock market in 1987.
  • The best known boom was for “dot coms” around 2000, however bubbles continue to emerge, notably “Web 2.0” (user-generated content websites) currently.

Although venture capital activity continually cycles through boom and bust, the overall trend over time is increasing real terms investment through venture capital.

Venture Capitalist Practices

Venture capital firms are typically small – about 20 people, primarily consisting of partners (who make deals and control the money), associates (who make deals), analysts and support staff. Firms may also have an “entrepreneur in residence” (a budding entrepreneur within the firm – termed “micro co-location”), and venture partners (sector specialists).

Firms run several funds concurrently. Funds typically run for a 10 year period. They raise money from investors (such as pension funds), and invest the equity in many different entrepreneurs’ ventures. Most ventures will fail, but a few will succeed. The successes should offset the failures within the fund. Successful ventures are sold – either as a merger/acquisition or (rarely) as an IPO. The performance of funds is commonly referred to by year “vintage”, indicating the value of returns from funds started in that year.

Venture capitalists typically take 2% of the fund as a setup fee, and 20% of the value of the final sale – colloquially a “2 and 20” structure. Within the firm, partners take most of the 20% as a bonus to their salary.
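A worked example of the “2 and 20” structure as stated above. The fund and sale values are invented; note too that in practice carried interest is often charged on profits rather than on the full sale value, but the sketch follows the talk’s description.

```python
def two_and_twenty(fund_size, sale_value):
    """Fee split as described in the talk: 2% of the fund as a setup
    fee, plus 20% of the value of the final sale."""
    setup_fee = 0.02 * fund_size
    carry = 0.20 * sale_value
    return setup_fee, carry

# A hypothetical $100m fund whose ventures sell for $500m in total:
setup_fee, carry = two_and_twenty(100_000_000, 500_000_000)
print(f"setup fee ${setup_fee:,.0f}, carry ${carry:,.0f}")
```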

Finding deals tends to be quite labour intensive: “doing lunch”, maintaining networks of local contacts, hosting events. Evaluating deals involves a combination of pattern recognition, due diligence, hiring in expertise, or even just following the crowd. Historically firms would invest in any sector of the economy, however there is now growing specialism, making venture capitalists “faster and smarter”. Typically each associate or partner will complete just 2 deals per year.

Venture capital firms can be characterised as passive or active in their involvement with the ventures they have backed. Active investments might involve a partner or associate as board members. Passive investments might simply stage the financing (not provide all the money up front).

Perceived Value over Time

A particularly interesting aspect of Bart’s research was the graph below (a simplified version, redrawn from my notes). It shows how “perceived values” (y-axis) change over time (x-axis).

Perceived value graph.

Each line indicates the threshold which must be reached for a deal to be considered valuable by each stakeholder.

Line D (green) shows how a new associate venture capitalist (the “deal champion”) perceives value: At first they are still learning, so the line rises in the first few years.

Line P (dotted red) shows how the value of the deal champion’s work is perceived by partners: At first they are not trusted, so it will be almost impossible for the deal champion to convince her partners to commit to a deal.

Line H (dashed blue) shows the first deal the deal champion is able to convince the partners to invest in – the first to exceed line P. The “home run”, this venture becomes the focus of the deal champion. It is highly successful, and eventually its value exceeds the public market perceived value (pink line M): The venture is sold as an IPO at point 1.

Now the deal champion appears able to “do no wrong” – she just earned back the entire value of the fund in one deal! The partners’ perceived value threshold drops below that of the deal champion (at point 2) – the partners are prepared to back ventures with far lower perceived value in the belief that the deal champion knows more than they do. Simultaneously, the inflating dot-com bubble encourages the deal champion to invest with far less care than before (line D has a far lower perceived value threshold).

Unfortunately, endlessly repeating success in such a high-risk environment is difficult. The deal champion makes many rash investments, none of which repeats the success of her first “home run”. By 2000 the deal champion, partners, and public markets have all lost much of their confidence, and lines D, P and M rise again.

John Clare on Electronics Retail Margins, Scale and E-Commerce

The recently retired chief executive of DSG International, John Clare, spoke to a small group in Edinburgh on 5 October 2007. DSG is a leading retailer of electrical goods, primarily in the United Kingdom through stores such as Dixons, Currys and PC World. This article summarises the low-margin nature of the business, the drivers for globalisation and growth in scale, and makes some fascinating observations on the role of physical premises for developing a successful e-commerce (internet retail) model.

Price and Margins

Electronics retailing is characterised by infrequent purchases, with competition primarily on price: Consumers tend to decide to buy a specific product, and have little loyalty to specific retailers. Factors such as availability (“can I take it home from the store now?”) and after-sales support (“what happens when it breaks?”) are still important, but often secondary considerations to price.

Competition on price means low margins: 3-4% margin is typical on goods sold in stores (15-20% gross margin). On some goods margins are lower. A computer might retail at a price that offers a gross margin as low as 6% – not enough to cover the full cost of the sale. Creative sales techniques are required: For example, offer a free printer with the computer, but don’t include the connecting cable, the ink or the paper. Those additional items attract surprisingly generous margins – enough to offset the losses from the original transaction.
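The loss-leader arithmetic can be sketched as a blended margin across a basket. All prices and margins below are invented; only the shape of the calculation follows the talk.

```python
def blended_margin(items):
    """items: list of (price, gross_margin_fraction). Returns the
    overall gross margin across the whole basket."""
    revenue = sum(price for price, _ in items)
    profit = sum(price * margin for price, margin in items)
    return profit / revenue

basket = [(500, 0.06),  # computer at a thin 6% gross margin
          (30, 0.60),   # ink cartridges at a generous margin
          (15, 0.55),   # connecting cable
          (10, 0.50)]   # paper
print(f"{blended_margin(basket):.1%}")  # accessories lift the overall margin
```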

The United States market is even more price-centric: Consumers might drive huge distances to save a few cents on a purchase – without apparently considering that the cost and time of the driving may exceed the saving on the purchase price. (DSG attempted to enter the US market in the 1980s, but failed, in part due to the enthusiasm of Wall Street investors for encouraging a competitor to build a monopoly by continually losing money, selling goods below profitable margins. In the long term the strategy fails, because once a monopoly has been created, raising prices to profitable levels simply causes new competitors to enter the market again.)


Scale and Globalisation

Until 10-15 years ago, the advantages of scale in electrical retail were modest: Being part of a branded network of stores does not alter the property costs of owning local stores.

Two factors have made scale increasingly important in electronics retailing, and driven the current trend towards the globalisation of electronics retailing:

  • Limited scope for growth in domestic markets: Additional growth beyond a threshold (for example, DSG’s 20% UK market share) in a domestic market becomes increasingly hard to attain – it is simply easier to focus on “foreign” markets.
  • “Systems” (supply chain, ordering) have become critical to reducing costs and so creating a competitive advantage. Remember, this is a sector where tiny cost differences can determine business success or failure. These systems are increasingly expensive. Walmart was cited as an example: Their systems are valued at over $1 billion. Eventually Walmart had to expand beyond the US to justify such high levels of investment in its systems.

Large established retailers in mature markets (such as France and Germany) were difficult to compete against. Instead DSG (and other large established retailers) have been focusing on “immature” markets – those still dominated by many small retailers, such as southern and eastern Europe, and China. India is also an attractive market, but lacks developed infrastructure and willingness of government to allow foreign investment in the sector.

E-Commerce and Internet Retail

Internet retailing has grown from around 1% of DSG’s sales in 2002/3, to about 10% in 2007. Competition tends to be smaller or unbranded businesses, competing on price. Consumer trust (in an established brand) and commitment to support give established large retailers an advantage online.

DSG has two distinct internet-based retail operations:

  • PIXmania, a “pure play” (internet-only) retailer selling to most of Europe. The decision to acquire this business may be characterised as “hedging one’s bets”: The only acknowledged internet-only success in the EU has been Amazon. It still is not clear whether the Amazon model will transfer to other sectors.
  • Existing physical retail brands, given online presences.

The second type of operation was initially similar to the first, until the introduction of a facility that allowed customers placing online orders to collect the goods from their local store:

The ability to order online and collect goods from your local store tripled online conversion rates, from 1-2% to 4%.

Customers ordering online typically collect goods outside of working hours (before 09:00 and during the evening). These are evidently people that wish to shop online rather than in a store, but want their goods delivered so fast they are prepared to travel to the store to collect them.

Conversion rates may still appear low compared to those at stores (around 30%). But the proportion of customers who are simply researching products, without the intention of buying immediately, is not known.

Improved conversion rates aren’t the only advantage for the retailer. The gross [I assume] margin on internet sales is only about 6%, with a tendency for orders to be for single items (for example, one low-margin computer, with no chance to sell money-making printer ink or paper). However, when customers arrive at the store to collect their order, they are successfully sold half as much again (by value) in other items. That raises the overall margin on “internet with collect-from-store” sales to 13-14%.
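As a back-of-envelope check on those figures (assuming the blended margin is taken over the combined sale value, which the talk did not specify), the implied margin on the add-on items can be solved for directly:

```python
def implied_addon_margin(base_margin, addon_share, blended):
    """Solve blended = (base + share*addon) / (1 + share) for addon:
    the margin the add-on items must carry to hit the blended figure."""
    return (blended * (1 + addon_share) - base_margin) / addon_share

# 6% margin on the internet order, half as much again sold in store,
# blended margins of 13% and 14% as quoted above:
low = implied_addon_margin(0.06, 0.5, 0.13)
high = implied_addon_margin(0.06, 0.5, 0.14)
print(f"implied add-on margin: {low:.0%} to {high:.0%}")
```

Under that assumption, the quoted 13-14% blended figure implies the in-store add-on items carry margins of roughly 27-30% – consistent with the generous accessory margins described earlier.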

The e-commerce model is still being developed. Cost reductions are likely as software becomes more standardised, although retailers are simultaneously moving to higher quality, more complex systems, so cost trends are mixed. Currently internet retail operations are about 6% cheaper than physical retail operations.

That means that the “internet with collect-from-store” model has very similar overall margins to the store-only model – and both offer significantly higher margins than the internet-only model.

It is easy to overlook the constraints of slow physical delivery networks when discussing selling goods over the internet. DSG’s “internet with collect-from-store” model gives some rather compelling evidence for just how important rapid delivery is.