Scottish Tram Financing

Transforming Travel… or not. Edinburgh Tram’s optimistic route plan.

Some Edinburgh City councillors already privately refer to the city’s tram project as the problem that “cannot be named”. Much as actors refer to Shakespeare’s tragedy as “the Scottish play”, superstitions of bad luck now bedevil the production. A dramatic shift from the optimism that initially characterised the development of the Edinburgh tram, towards pessimism.

That which cannot be named is no longer just the failure of a flagship local transport policy. The issue has engulfed the City of Edinburgh Council, and now risks destroying local politics completely: Not only the existing administration, but public trust in local government decision-making.

Political heavyweights, who normally shy away from the minutiae of local governance, are now offering parental guidance in public: Alistair Darling (local Member of Parliament, and former United Kingdom Chancellor and Secretary of State for Transport) described the option to borrow £231 million ($370 million) to complete the city centre section of the tram line as “absolute madness” – the local population would be saddled with vast debts. Days later, Graham Birse (chief executive of the influential Edinburgh Chamber of Commerce) called the decision to not complete the city centre section, “bonkers” – far fewer passengers would use a tram that did not serve the city centre adequately. Even Alex Salmond (Scotland’s First Minister) has become directly embroiled, struggling to contain calls for an immediate public inquiry to identify who is responsible.

Burn the witches! This Scottish tragedy is rapidly descending into farce. That would be unfortunate, because this particular local difficulty goes to the heart of the Scottish nationalist agenda: A desire for greater devolution of public funds to local level. More localised independent entities have fewer financial resources, so are less able to manage expensive, risky projects. Consequently policy ambitions also need to be scaled back. Such scale isn’t necessarily a problem – small can be beautiful. The problem lies in pretending to be big, when not.

This article introduces the concept of risk in tram (and similarly large public transportation and infrastructure) projects, chronicles the decisions that led a relatively small local authority to need to find hundreds of millions of pounds to support a single project, and explores the implications for future policy-making, especially in the context of a more devolved Scotland. Read More

Systems of Curse and ZAM

The World of Warcraft ecosystem saw the final “big fansite” acquisition this week, with MMO-Champion bought by Curse Inc. Big meaning something that attracts millions of users each month. Curse have been using some of their $11 million of venture capital to buy up a variety of gaming fansites, including many popular WoW sites. But MMO-Champion is significant for 3 other reasons:

  • Corporate deal, not the “founder buy-out” traditionally commonplace among gaming fansites. MMO-Champion was previously owned by Major League Gaming, already a multi-million dollar enterprise (with, by comparison, $46 million of funding).
  • Completes a duopoly (2 dominant businesses) in the core World of Warcraft “fansite” market – Curse and ZAM. While there are other large businesses and specialist niches on the fringe, none of those appear to be growing into the core WoW market.
  • Exposes an intriguing driver of this market structure: Systems costs – the underlying technology and support costs. Intriguing because these were crucial in determining the market structure of far more traditional sectors of the economy, like groceries.

This article analyses the latest acquisitions and discusses the unseen importance of systems costs. Read More

Optimism

As I write, the United Kingdom is in the midst of a national election campaign. A month during which politicians vie to confuse the electorate with big numbers. Politics is suddenly ravaged by intangibility, because the national economy is unable to sustain the usual tangible proxies for a better life – “more schools and hospitals” – and because the tangible results of fixing that economy tend to be unattractive – “less schools and hospitals”. So the best political strategy is not explaining the consequence of choices in a language ordinary people can understand.

Do you like the sound of £100 million ($150 million)? Can I tempt you with £160 billion? Expressing these figures per person in the population can be useful. The first figure is one bar of luxury chocolate for everyone. Doesn’t sound so big now, does it? The second figure is like everyone having a £2,500 bank overdraft (loan). Strange that, because indirectly, we do.
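As a rough sketch of that per-person arithmetic (the population figure below is my assumption – roughly the UK population at the time – not a number from the article):

```python
# Per-person framing of headline spending figures.
# Assumes a UK population of roughly 62 million (circa 2010) - my assumption.
uk_population = 62_000_000

headline_figures = {"£100 million": 100e6, "£160 billion": 160e9}
for label, amount in headline_figures.items():
    print(f"{label} is about £{amount / uk_population:,.2f} per person")
# £100 million is about £1.61 per person   (one luxury chocolate bar)
# £160 billion is about £2,580.65 per person (roughly that £2,500 overdraft)
```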

Unfortunately, applying the economics of household groceries to major items of government expenditure introduces a false sense of certainty: The idea that one can visit a store where luxury chocolate bars are sold for precisely £1.70. Yet many large elements of government expenditure are akin to ordering a chocolate bar years before it can be eaten, for a price that transpires to be somewhere between £1 and £5.

Larger businesses will be familiar with this concept. It’s called risk. Such businesses are often far more interested in what “it might cost” (£5) than what “it will cost” (£1.70), because what it might cost might lead the business to bankruptcy.

The national economy is chaotic in its complexity, but overall, things should average out. So long as all the assumptions are broadly reasonable: Ultimately some will earn/cost more, some less. Short-term imbalances can be solved by (basically) printing more money, and then downgrading future assumptions until everything is back in balance.

However, this breeds a form of arrogance. A sense that government doesn’t need to consider the possibilities. That we can deliver a radical new policy – that has never been done before – and, in spite of it never having been done before, we know precisely how much it is going to cost. Just like a bar of chocolate.

Unfortunately, assumptions tend towards optimism. On average, projected costs are less than actual costs. This isn’t just a problem for accountants. It means that decisions are taken which do not reflect reality. Potentially leading to a Disneyland scenario, where everything is affordable until after the decision is taken, when suddenly everything has become too expensive. It ultimately challenges the validity of decisions, and in doing so, the moral authority of those that take them.

This article uses the Edinburgh Tram project to demonstrate the inherent uncertainty of large government infrastructure projects. It discusses the role of optimism in planning, and the methods used to reconcile planned optimism with subsequent reality. The article describes how the involvement of the private sector in public projects has evolved over the last 20 years, and highlights the different time-scales applied to private investment and public choices. It concludes that optimism is not only unavoidable, but necessary: The true problem lies in the tendency of people to demand certainty from the public sector, while accepting uncertainty in the private sector. Read More

Railways for Prosperity

Recreating the Island of Sodor in Kidderminster.

In the dying years of Margaret Thatcher’s premiership, the United Kingdom government launched a policy document called “Roads for Prosperity”. £23 billion ($35 billion) would fund a network of highway improvements. Schemes that eased capacity constraints on the strategic (primary routes) road network. It was a response to rising car use, and the belief that not providing sufficient highway capacity would damage the UK economy – national prosperity.

It didn’t happen. Neither the threat to prosperity, nor the policy:

  • Environmentalists rallied against the few early projects (famously turning the Newbury Bypass and Twyford Down into civil battlegrounds) – road-building became politically negative, rather than positive.
  • There was never really enough money in national budget to fund the policy – increasingly obvious as the UK economy dipped into the recession of the early 1990s.
  • Even with the policy, roads would still be built slower than road traffic was growing – it was not possible to “build your way out” of the problem. It’s worse than it first seems, because new roads generate additional traffic growth, requiring more road capacity, generating more traffic…

The legacy was apparent in Tony Blair’s first Labour administration (or more accurately, John Prescott’s, the minister who led the transport and environmental agendas in the late 1990s): Much greater emphasis on sustainability, local projects, and use of forgotten modes, like buses and shoes.

Now, step forward 20 years to 2010.

The Secretary of State for railways and other transport, Lord Adonis, announces plans for a new high-speed rail line between London and Birmingham. At least £15 billion ($23 billion) for the first phase, rising to £30 billion with extensions further north. (Read those figures with caution – the costs of the previous West Coast Mainline upgrade project increased so much that nobody could remember how low the initial estimate was.) Inflation means that the cost of this latest rail project is only about half the (real terms) cost of Roads for Prosperity. But Roads for Prosperity proposed thousands of miles of highway, across many different locations, compared to a few hundred miles of railway track between a few large cities. And “Railways for Prosperity”, as I’ve corrupted the latest proposal, doesn’t have the pretence of strategy.

Politically it’s a work of genius – the benefits flow to the political class (who tend to use trains), especially those living in increasingly marginal electoral territories in the West Midlands and North-West of England. Meanwhile, the People’s Republic of Great Missenden (and soon likely every other community near the route) is up in arms because the totalitarian regime they likely never voted for, has decided to build a railway – without the local station necessary for them to commute to London. I exaggerate, but only slightly.

Forget the “high-speed” aspect of the title. Operationally, the need is to increase capacity (see the box below). Make space for more trains on one of the busiest railway lines in Britain. More capacity creates more redundancy in the system, which makes it easier to recover from operational problems, and so makes trains more reliable. From bitter personal experience as a passenger, I suspect reliability is worth more than speed here. Of course, “better reliability” sounds a lot vaguer than “30 minutes faster”.

Read beyond the concrete, and the talk is all about “economic growth”, and “jobs”, and…

It’s at times like this that I want to pick up a shotgun and blow my brains out. 20 years later we’re back where we started. And nobody seems to have noticed.

This article uses historic examples to question the strength of the relationship between transport and the economy. It highlights the political biases towards railways, and their funding. The article explains why grand transport projects remain popular, when their overall impact on problems is often minimal. Rough analysis is presented that demonstrates the futility of building new railways – the 21st century reality, that we simply cannot afford to continue enlarging our transport networks in response to increased passenger demand. Finally, a stark comparison is made between communications and “transport” policy, which questions the validity of spending 15 times more on a new railway, than on a core element of “digital” inclusion. Along the way, the article clarifies a few popular misconceptions, from the influence of Unionism, to the impact of “integration”. Read More

Nation of Adoration

World of Warcraft’s seasonal holiday events temporarily reduce player interest in fishing. It’s always been the case, but the decline in fishing seems to be becoming more extreme over time:

Decline in Fishing Activity due to Holiday Events

The graph’s y-axis is the percentage decline in page views at El’s Extreme Anglin’ from the 7 days before each event, to the first 7 days of the event. Pageviews are a good proxy for overall angler interest. El generates hundreds of thousands of page views each week, so even small changes are significant. The x-axis orders events by date, from January 2008. The axis isn’t scaled correctly to show time, but holidays are fairly evenly distributed throughout the year. Events are shown by green dots, with a shortened date (month and year) and the name of the event.

The data is expressed as a percentage of the previous week, because while interest in fishing “waxes and wanes” from year-to-year, changes week-to-week are normally minor.

All the events included last at least 7 days. Where one holiday runs concurrently with another event (for example, the “Lunar Festival” and “Love is in the Air” often clash), only the first event in the sequence is included. Interest in fishing also changes dramatically in the month new content is added, so events that clash with major fishing patches have been excluded (Noblegarden 2008 with patch 2.4, Hallow’s End 2008 with patch 3.0.2, and Noblegarden/Children’s Week 2009 with patch 3.1). Winter Veil is also excluded: The period leading to Christmas is particularly unusual – first students stop studying and have a lot of time to play, and then many players stop playing to spend time with family. This causes large changes in activity from week-to-week, which makes it hard to isolate Winter Veil in the data.

Only 12 separate sets of data can be compared. There is one outlier – Midsummer 2008 – perhaps the early stages of Wrath of the Lich King testing caused a small traffic spike in the week before? The pattern shown on the graph is not certain. But I’m growing confident that events are increasingly impacting on fishing activity.
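For anyone curious how each point on the graph is derived, a minimal sketch follows. It mirrors the before/during comparison described above, but the dates and page view counts are purely illustrative:

```python
# Percentage decline in page views from the 7 days before an event to its first 7 days.
from datetime import date, timedelta

def event_decline(daily_views, event_start):
    """Positive result = fishing interest fell once the event started."""
    before = sum(daily_views[event_start - timedelta(days=d)] for d in range(1, 8))
    during = sum(daily_views[event_start + timedelta(days=d)] for d in range(0, 7))
    return 100.0 * (before - during) / before

# Made-up numbers: a flat 30,000 views/day before the event, 24,000/day during it.
start = date(2009, 6, 21)  # hypothetical event start date
views = {start + timedelta(days=d): (30000 if d < 0 else 24000) for d in range(-7, 7)}
print(event_decline(views, start))  # -> 20.0 (a 20% decline)
```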

But why? Read More

Iterative Video Development

The internet allows products and services to be rapidly improved based on user feedback. So rapid, that iterative design should become the primary method of designing internet-based services. Not just as an Agile-like method of working, but as a method of specifying the product itself.

Partly it isn’t yet, because creators haven’t adjusted their methods to match the new technology – we’re still wedded to a single start-to-finish process, with one outcome at the end. Partly it isn’t, because feedback can be hard to gather and digest, and even harder to act upon.

An iterative method has become one of the defining characteristics of how I like to write, organise, and present text on the internet. At least, beyond this domain. But until now, I’ve struggled to apply it to internet-based video.

This article introduces internet-based iterative design, and uses YouTube’s “Hot Spot” analysis to show how we can start to apply an iterative approach to video and movie-making. Read More

Do You Fish in Real Life?

This article analyses the transfer of fishing activity between the physical and virtual worlds.

Do You Fish IRL? In Real Life. I dislike the phrase, because it implies that everything else is unreal. Yet many virtual environments trigger the same human emotions as the physical world. Very real indeed.

Google US search for “fishing guide”.

If you search US Google for the term “fishing guide”, the first result may surprise you. It doesn’t help to catch any of the 30,000 species of fish found on planet earth. And its author has bright pink hair.

This isn’t just a neat party trick. Nor an indication that I should write a real fishing guide. Nor a failing of Google’s search index: Google is directing such a generic search to a game-specific website because the search engine thinks that the majority of people searching for a “fishing guide” are looking for a World of Warcraft fishing guide. (The box below provides evidence.)

Perhaps, within the online sphere, virtual fishing is as important as conventional fishing? The caveat, “within the online sphere”, is crucial: Physical world anglers generally aren’t sat in front of a computer screen, while World of Warcraft anglers are. However, the internet is still widely used to find information about offline pursuits: The US Angler Survey found that 42% of those surveyed primarily learn about fishing from websites – more popular than print media. (The survey is presumably biased, because anglers that use the internet are more likely to complete an online survey – but still indicates the internet is a fairly important source of information for physical world anglers.) Of course far more people search for generic terms like “fishing” than anything WoW or guide-related. So game-related search does not dominate as much as it may first seem.

Searches for “fishing guide” are not the only way online anglin’ is merging with offline.

As the remainder of this article demonstrates, World of Warcraft anglers are up to 3 times more likely to fish in the physical world than the wider population: If you enjoy fishing “for real”, you are more likely to fish virtually than other players. This implies that the fishing activity transfers directly between the physical and virtual worlds. Read More

De-Analysing Blizzard’s Starcraft 2 Marketplace

Rob Pardo.

Earlier in 2009, Blizzard announced a non-commercial World of Warcraft add-on policy, which caused much discussion. Today at BlizzCon, Rob Pardo (illustrated) introduced the Starcraft 2 Marketplace: A future (after the game’s launch) system that would allow independent development teams to create custom “premium maps” for the game, and make money from them. That’s precisely what World of Warcraft add-on developers cannot do. So what’s changed?

Why Create a Starcraft 2 Marketplace?

Pardo stated:

“If you create a really cool map, with all original content, that’s awesome, you can put it up onto the service [Battle.net], and actually make money on your map.”

Blizzard is prepared to share a “portion” of the revenue if you create your own Intellectual Property, and don’t simply re-use their property. Seems reasonable.

The SC2 Marketplace is intended to allow parts of the mod community to evolve from amateurs to professionals. “Fan made” maps were acknowledged as an important way to keep Starcraft alive – over time, players shifted from Blizzard-made maps to fan-made maps. But maps (Pardo used Warcraft 3 as an example) still tend to use Blizzard’s game assets (such as art textures), because creating original content takes a lot of effort. And passion alone does not pay the bills. By allowing map authors to earn money from popular maps, those people would be able to fund the creation of their own, original game assets.

There’s a real sense that Blizzard lost the chance to nurture and (commercially) gain from innovations within “their game engine”. Rob Pardo again:

“The Tower Defense maps came out of the Warcraft 3 community. And now you see Tower Defense in the PlayStation store…”

Earlier in the day Stompalina tweeted about the similarity between Battle.net (Blizzard’s community platform) and Steam (Valve’s community platform). And she’s not wrong.

Both companies are unusual. They have both escaped from the traditional publisher-funded business model that underpins most major (non-casual/Flash) game development and distribution. Valve’s Steam originally gained popularity from games like Half Life, but has now become a method of distributing games written by others – everyone from small college/”garage” projects, to mainstream titles, like Total War.

Valve is already ahead of Blizzard in constructing a social-gaming platform, even though Blizzard was there first, and should understand the media better (from developing World of Warcraft). So perhaps opening up Starcraft as a semi-commercial platform for third parties is a new strategy in that race?

Why Not Create a Marketplace in Other Games?

SC2 Marketplace Illustration.

Competition with the wider gaming industry does not explain why Blizzard are so unwilling to adopt a similar approach within their other games. Some of us (and I include myself) would like to do this within World of Warcraft. I have previously demonstrated that WoW has a huge pool of talent among its players, and that pool of talent is increasingly reluctant to work within WoW because it has become afraid to make money. Something which we now all seem to agree is required to support major (time-consuming) projects.

It is possible to create original IP within WoW. Technically this would be more difficult within an MMOG, because players that don’t buy your content, still need to interact with those that do. But there are creative methods of working round those limitations.

One possibility is that Starcraft 2 is a new product, which is politically (within Blizzard’s decision-making process) and technically (programmed to be supported from the outset) far easier to impose a new strategy on. And we might eventually see a more relaxed approach in Azeroth.

My fear is that World of Warcraft is being treated differently because its brand is too valuable at this stage in its life-cycle.

Shrewd observers will note that Blizzard have started “doing the Star Wars thing” with the WoW brand: The revenue directly from the game gradually becomes less important than all the merchandise and franchise opportunities. Soft drinks and Trading Card Games were just the beginning…

The problem for “fan-based” projects is:

  1. Franchise and license opportunities are not available to “the little guy”. They’re not the large businesses Blizzard look for.
  2. If you sell a license it has to be worth something. So a “fan project” cannot co-exist with a franchised project that it (often inadvertently) conflicts with.

There have been several examples over the last year where conflict has arisen. Unfortunately, I’m not able to publicly discuss all of them. Suffice to say the legal threats are very real: Suddenly one finds oneself liable for lost earnings of the franchisee and Blizzard. That’s almost certainly more money than you have – few people are prepared to risk bankruptcy.

On the Road to Damascus

If Blizzard have had a change of heart, will anyone trust them? Sadly the answer is yes. Not least because individuals tend to confuse the company with its products. And the corpses of all those fallen add-on developers decay fast.

A marketplace doesn’t fit Blizzard’s culture – somewhat secretive, protective, and controlling of its work. But Blizzard seem very similar to Apple. And Apple have managed to sustain a very successful iPhone store, full of applications created by independent developers. If both parties benefit, these uncomfortable partnerships can thrive.

Perhaps there is hope after all?

Postscript

The following day, in an interview with DirecTV, Rob Pardo was asked this question directly: why are Blizzard endorsing commercial SC2 mods, when they have just outlawed commercial WoW mods? His reply was:

“We’re not making money from the people that are doing third party things for WoW. It’s not really allowed to go out and make stuff around WoW without licensing it from us. It’s really us just protecting our Intellectual Property.”

Favorite Fishing Places

This article analyses the favourite fishing locations of World of Warcraft anglers. Both where and why.

The most popular single zone is the Grizzly Hills, with Azshara’s Bay of Storms and Wintergrasp in joint second place. Reasons are split into artistic (music, scenery), emotional (relaxation, memories), practical (fish caught, convenience), and social (companions, player interaction) themes. Overall, each theme has similar importance. The article discusses the apparent contradiction between desires for solitude, and to be surrounded by life.

This is the second of several topics that explore the reasons people fish in a virtual world, ultimately drawing parallels with fishing in the physical world. Read More

Where We Fish

This article analyses where players fish in the game World of Warcraft. It reveals the role of daily quests in shaping our fishing habits, demonstrates just how popular city-fishing is, and starts to reveal why we fish. This is (hopefully) the first in a series of articles that collectively examine why people fish in this massively multiplayer online game.

Daily successful casts by area.

The map shows the number of successful fishing casts (diameter of each circle), by area. Numbers are daily totals for all United States and European realms combined, based on a sample in July 2009. Click the map for a larger view.

A successful cast is one that does not catch a junk item, which might occur if the angler’s skill is too low. There are 14 million successful casts each day, catching 16 million fish: Some casts catch more than 1 fish. In addition, there are 4.5 million unsuccessful casts (that catch a junk item). Unsuccessful casts are not shown on the map.

“Old Azeroth” refers to the continents of Kalimdor and the Eastern Kingdoms (the pre-expansion game). Within Northrend (the main area shown on the map), casts into coastal waters are shown separately from “inland” casts in other zones.

In each area, the total number of casts is divided into 3 parts:

  1. Open Water (dark blue) – Casts into bodies of open water.
  2. Daily-Related (gold) – Casts while trying to complete a daily fishing quest. This includes all casts while trying to complete the quest, not just those that catch a quest fish.
  3. Pools (light blue) – Casts into schools of fish.

Northrend is the continent hosting the current game expansion, Wrath of the Lich King. The continent is home to higher-level (more veteran) players. Expect to find most fishing activity here – and we do: There are 9.3 million daily casts in Northrend – two thirds of all successful casts.

A sixth of all casts are related to the daily quests, in spite of the fact that there is just one such quest available each day (the area varies between realms, randomly each day). The Northrend fishing quests are the most popular quests in the game – completed by over 300,000 characters each day. No, really – at least before patch 3.2 was launched, which made Heroic dungeons popular again. Anglers might be motivated by the additional reward. Or this might suggest a far greater need to guide players. Either way, it raises some questions, such as, why is there just one fishing quest per day in the current game expansion?

Ignoring daily quest-related fishing, the most popular single location is Dalaran’s Eventide Fountain, with 1.4 million casts per day – equivalent to 1 person on each realm fishing there for 12 hours each day. The irony is that Dalaran’s Eventide Fountain is also one of the smallest bodies of water in the entire game. Cities account for a third of all casts – Dalaran is not the only popular city. At least half of the “Old Azeroth (Inland)” casts are casts in the waters of major cities (such as Stormwind or Orgrimmar).
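That “1 person per realm for 12 hours” equivalence is easy to sanity-check. The sketch below does so with two assumed figures – roughly 700 combined US and European realms, and about 22 seconds per cast – neither of which comes from the article:

```python
# Back-of-envelope check of the "1 person per realm fishing for 12 hours" claim.
eventide_casts_per_day = 1_400_000
realms = 700                 # assumed combined US + EU realm count in 2009
seconds_per_cast = 22        # assumed average time per fishing cast

casts_per_realm = eventide_casts_per_day / realms
hours_per_realm = casts_per_realm * seconds_per_cast / 3600
print(f"{casts_per_realm:.0f} casts per realm per day, "
      f"about {hours_per_realm:.1f} hours of continuous fishing")
# -> 2000 casts per realm per day, about 12.2 hours of continuous fishing
```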

So, half of all fishing activity is either directed by quests, or occurs in cities. Training (cooking and/or fishing skills) is also an important reason to fish, although it is harder to estimate how important.

Pool fishing is normally the fastest way to catch “valuable” fish. Yet only 17% of casts are from pools. Even if we look at areas with no quests and desirable “Northrend” fish, like the Grizzly Hills, half of all casts are still in open water. This isn’t the only example that suggests that anglers really are quite lazy, and don’t want too much hassle when fishing.

The remainder of this article explores some of these issues in more detail, using information about where we fish to start to explain why we fish. It also describes the method behind the numbers, with a technical appendix containing data. Read More

De-Analysing Blizzard’s Add-On Policy

Blizzard Entertainment’s new add-on policy has been discussed by everyone from Lum to Slashdot. The number of developers directly affected by the change is small, since only a few add-ons are popular enough to be considered commercial ventures. The policy is more significant because it changes a lot of established conventions, and goes to the heart of how Blizzard embraces (or increasingly, shuns) the talent within its player community. This article is an attempt to analyse the real motivations behind the policy, and highlight the apparent contradiction in policy between in-game add-ons and web-based services. Read More

Social Reconstruction of Public Transportation Information

The UK‘s local public transport data is effectively a closed dataset. The situation in the US seems similar: In spite of the benefits only a handful of agencies have released raw data freely (such as BART and TriMet on the west coast of America).

That hasn’t stopped “screen-scraping” of data or simply typing in paper timetables (from Urban Mapping to many listed here). Unfortunately, the legal basis for scraping is complex, which creates significant risks for anyone building a business. For example, earlier this year, airline Ryanair requested the removal of all their data from Skyscanner, a flight price comparison site that gathers data by scraping airlines’ websites. How many airlines would need to object to their data being scraped before a “price comparison” service becomes unusable?

User-generated mapping content is evolving, often to circumvent restrictive distribution of national mapping. Services include OpenStreetMap and the recently announced Google Map Maker.

Micro-blogging, primarily through Twitter, has started to show the potential of individual travellers to report information about their journeys: Ron Whitman‘s Commuter Feed is a good example. Tom Morris has also experimented with London Twitter feeds.

This article outlines why the “social web”/tech-entrepreneur sector may wish to stop trying to use official sources of data, and instead apply the technology it understands best: People. Read More

Paul Saffo on The Revolution After Electronics

Paul Saffo spoke to Stanford’s Media X conference on the art of predicting the future. Specifically predicting which technology will come to dominate the next decade. Paul’s talk may at first seem somewhat contradictory in nature: Demonstrating how to do it, while simultaneously showing it can’t be done. This article summarises the talk.

30 Year Cycle

Every 30-50 years a new science turns into a technology. With approximate dates:

  • 1900: Chemistry
  • 1930: Physics
  • 1960: Electronics
  • 2000: Biology

We are now on the cusp of a revolution from electronics to biology. The precise inflection point, the point of change, may not yet be clear.

Paul noted that Thomas Watson’s famous misquote, “I think there is a world market for maybe 5 computers”, was made in 1953, right on the cusp of the electronics revolution: Aside from the fact that he was talking about a specific machine, and not all computers, the quote is a good example of how it is difficult to predict the future at such points of radical change.

Forecasting the Future

The goal is not to be right, but “to be wrong and rich”: It is easy to take the view that one cannot forecast. If you do attempt to forecast you will still mostly be wrong, but the very act of trying will increase your chance of success over those that do not try.

The further into the future you predict, the greater the level of uncertainty. The difficulty in forecasting is finding a balance between being too narrow and too broad. Forecasting might use wildcards. The “hard part” is to be wild enough.

Typically forecasts for a new product or technology’s introduction are linear: The magnitude of the amount of use of the technology is forecast to grow steadily with time.

Reality tends to be represented as an S-shaped curve: In the early stages the magnitude of use is below the expectation generated by the linear forecast. Usage then rapidly grows, such that the actual usage rises above the prediction in the later stages. The result is that in the first part, forecasters tend to over-estimate performance, while latterly they under-estimate performance. Venture capitalists tend to have linear expectations, and so are disappointed in the early stages, while failing to see the later potential.
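A toy illustration of that point, with entirely invented numbers: a straight-line forecast against a logistic (S-shaped) “actual” adoption curve, over-estimating early and under-estimating late.

```python
# Linear forecast versus S-curve reality: invented figures, purely illustrative.
import math

def linear_forecast(t, rate=10.0):
    return rate * t

def s_curve_actual(t, ceiling=300.0, midpoint=10.0, steepness=0.6):
    return ceiling / (1.0 + math.exp(-steepness * (t - midpoint)))

for year in (2, 6, 10, 14, 18):
    f, a = linear_forecast(year), s_curve_actual(year)
    print(f"year {year:2d}: forecast {f:6.1f}, actual {a:6.1f}, "
          f"{'over-estimate' if f > a else 'under-estimate'}")
# Early years print "over-estimate"; later years print "under-estimate".
```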

Robots and Inflection Points

Stanley, winner of the 2005 DARPA Grand Challenge.

Paul Saffo used the example of DARPA’s annual competition for robot-driven cars. In the first year only a handful of competing robot drivers made it out of the starting gate. No car completed the challenge. The next year 22 out of 25 robots got further than the leader in the first race.

The example gives a quantifiable measure of how the technology is developing, year to year.

Spotting the inflection point, the place at which real, dramatic change starts to occur, can still be hard. Sometimes it can be spotted using data which has been ignored or hidden. Sometimes it is a case of looking for what does not fit. The anonymous quote, “history doesn’t repeat itself, but sometimes it rhymes”, is apt. Look back in time as far as you look forward.

The good news is that if you miss an indicator, you still have lots of time to spot another.

Sensors

Paul contended that the last three decades had been characterised by a dramatic cheapening of a component technology, which in turn had led to the widespread use of a product:

  • 1980s: Cheap processors led to the processing age. The result, widespread use of the PC.
  • 1990s: Cheap communications lasers led to the access age. The result was the network infrastructure to support the World Wide Web.
  • 2000s: Cheap sensors are leading to the interaction age. Applications are currently missing, but widespread use of robots appears to be the future.

Biology and Electronics

Electronics is building biology, and Paul expects that eventually biology will rebuild electronics: These technologies are far from isolated.

An example of developments in electronics progressing biology can clearly be seen from work on the human genome. A well funded government-backed project was beaten by a far smaller project. The smaller project was able to successfully deploy robots, with the result that the cost of the work dropped by a factor of 10 each year. The government project had been funded based on the cost of technology at the outset, and initially failed to respond fully to the changing cost structure.

The creation of the first artificial genome in January 2008 may yet prove to be the inflection point.

Trust Instincts at Your Peril

“Assume you are wrong**” (** and forecast often)

Paul used the example of the sinking of a US naval fleet at Honda Point, on the west coast of the United States, on 8 September 1923. The fleet had been navigating using a forecasting technique called “dead reckoning”. The coastline had a (then) new technology available to assist navigation – radio direction finding. This allowed a bearing to be given between a land station and the fleet.

The radio direction finding gave an unexpected result that did not match the forecast position. The lead boat in the fleet concluded that their position was more favourable than anticipated (closer to their destination), and turned sharply… straight into the rocks they had been trying to avoid. The 11th boat in the fleet did not trust the judgement of the lead boat, and when the fleet turned, it hedged its bets, slowing and waiting to see what happened. It was one of only 5 ships from the fleet not to run aground.

The moral of the tale: Hedge your bets, but embrace uncertainty. Or as written once on a tipping jar:

“If you fear change, leave it in here.”

Divergence of the Species

The question was asked, will biotech lead to a further aggregation of wealth? Yes. The electronics revolution had itself deepened inequality. Biotech raises a particularly ugly spectre which extends beyond wealth, to life itself. The wealthy would be likely to use their wealth to extend their lives. The ultimate outcome – species divergence. Currently the rich tend to benefit from better health care, and so extend life. But biotech is likely to create a lot more options.

Dave McClure on Social Networking and Web 2.0

Dave McClure addressed an Edinburgh Entrepreneurship Club/Edinburgh-Stanford Link event on 29 January 2008. He outlined some of the advantages of “Web 2.0”, talked extensively on the use of real-time metrics to evolve web services, developed a history of social networking websites, and highlighted the interesting aspects of Facebook. This article summarises Dave’s talk, with some additional commentary from myself.

Advantages of Web 2.0

Web 2.0 is characterised by the:

  • low cost of acquiring large numbers of users,
  • ability to generate revenue through advertising/e-commerce,
  • use of online metrics as feedback loops in product development,
  • sustainable long term profitability (at least for some).

Dave McClure did not actually try and define the term, which was probably wise. Generally the term is applied to websites and services where users collaborate or share content.

Web 2.0 has a number of advantages (although it could be argued that some of these apply to earlier iterations of the internet too):

  • APIs – the ability to act as a web-based service, rather than just a “website”.
  • PC-like interface, albeit still 5 years behind contemporary PC interfaces.
  • RSS feeds (for data sharing) and widgets (user interfaces embedded elsewhere).
  • Use of email mailing lists for retaining traffic. While email certainly isn’t a “web 2.0” technology, his argument is that email is increasingly overlooked as a means of retaining website visitors.
  • Groups of people acting as a trusted filter for information over the internet.
  • Tags (to give information structure) and ratings (to make better content stand out).
  • Real-time measurement systems rapidly giving feedback. Key is the immediacy of the information, and the ability to evolve the web service to reflect that.
  • Ability to make money from advertising, leads and e-commerce. While true since about 1995, the web user-base is now far larger, so the potential to leverage revenue also greater.

Metrics for Startups

I believe the ability to very accurately analyse website usage, implement changes, and then analyse the results, is a key advantage of web-based services. It is an advantage often overlooked by information technology professionals and programmers. I’m not sure why – possibly because web service developers:

  • don’t appreciate how hard/expensive gathering equivalent information is in other sectors of the economy, or
  • are scared to make changes in case they lose business, and/or believe their initial perception of what “works” to be optimum, or
  • just lack the prerequisite analytical curiosity to investigate?

Or perhaps Web 2.0 just isn’t mature enough yet for developers to have to worry too much about optimisation: A new concept for a site will probably either fail horribly or generate super-normal profits. The sector isn’t yet competing on very tight margins, where subtle optimisation can make or break profitability. Of course, optimisation of websites can deliver substantial changes in user behaviour. For example, I have found that a relatively subtle change to the position of an advert can alter the revenue generated by over 20%.

Dave McClure developed the AARRR model. AARRR segments the five stages of building a profitable user-base for a website:

  1. Acquisition – gaining new users from channels such as search or advertising.
  2. Activation – users’ first experience of the site: do they progress beyond the “landing page” they first see?
  3. Retention – do users come back?
  4. Referral – do users invite their friends to visit?
  5. Revenue – do all those users create a revenue stream?

For each stage, the site operator should analyse at least one metric. The table below gives some possible metrics for each stage, with a sample target conversion ratio (the proportion that reach that stage).

Acquisition
  • Visit Site – or landing page or external widget (target: 100%)
  • Doesn’t Abandon: views 2+ pages, stays 10+ seconds, 2+ clicks (target: 70%)
Activation
  • Happy 1st Visit: views x pages, stays y seconds, z clicks (target: 30%)
  • Email/Blog/RSS/Widget Signup – anything that could lead to a repeat visit (target: 5%)
  • Account Signup – includes profile data (target: 2%)
Retention
  • Email or RSS leading to clickthrough (target: 3%)
  • Repeat Visitor: 3+ visits in first 30 days (target: 2%)
Referral
  • Refer 1+ users who visit the site (target: 2%)
  • Refer 1+ users who activate (target: 1%)
Revenue
  • User generates minimum revenue (target: 2%)
  • User generates break-even revenue (target: 1%)

These metrics become critical to the design of the product. Poor activation conversion ratio? Work on the landing page(s): Guess at an improvement, test it out on the site, analyse the feedback, and iterate improvements. Gradually you’ll optimise performance of the site.
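A minimal sketch of that funnel analysis, using hypothetical stage names and visitor counts (none of the figures come from the talk): each stage is reported as the proportion of acquired users who reach it, which is the number to compare against the sample targets above.

```python
# AARRR-style funnel report from hypothetical stage counts.
funnel = [
    ("Acquisition: visited site",      10_000),
    ("Activation: happy first visit",   2_800),
    ("Retention: repeat visitor",         220),
    ("Referral: referred a friend",       190),
    ("Revenue: generated revenue",        150),
]

top = funnel[0][1]  # everything is expressed relative to acquired users
for stage, users in funnel:
    print(f"{stage:32s} {users:6d}  {100.0 * users / top:5.1f}% of acquired users")
```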

I find this attempt to structure analysis, and relate it back to core business performance, very interesting. However, the sample metrics can be improved on a lot, depending on the nature of the site. For example, to track virality (referral), I might watch the monthly number of del.icio.us adds, or monitor the number of new links posted on forums (Google’s Webmaster tools allow that). Tracking users all the way through the tree from arrival to revenue generation needs to be done pragmatically where revenue is generated from very infrequent “big-ticket” sales: With minimal day-to-day data, it can take a long time to determine whether a change genuinely has improved long-term revenue, or whether natural fluctuations in day-to-day earnings just contrived to make it a “good day/week/month”.

Now I know this approach works, but why it works is less clear. We might like to think that we are genuinely improving the user experience, and maybe we are. However, it could be argued that merely the act of change is perceived by users as an improvement – a variation of the Hawthorne effect. The counter argument to the Hawthorne effect can be seen on sites with low proportions of repeat visitors: The majority of those experiencing the improvement will not know what was implemented before.

History of Social Networking

Dave McClure’s interpretation of the timeline of the development of social networking sites is as interesting for what it includes, as for what it omits: No Geocities; no usenet; no forums; no MUDs… The following timeline shows key services in chronological order, except without dates – all the services shown were created within the last ten years:

  • Email lists (Yahoo Groups)
  • 1.0 Social Networks (Friendster) – these early networks established the importance of up-time (service reliability) and the ability of users to manipulate pages.
  • Blogs – links between weblogs acting as networks.
  • Photos and video (Flickr, YouTube) – created a sense of community, and allowed tagging/grouping of content.
  • 2.0 Social Networks (LinkedIn)
  • Feeds and shared social information (Upcoming.com event planner)
  • Applications and widgets – the ability to embed data about a user’s friends in applications is probably “the most powerful change on the internet in the last ten years”.
  • Hosted platforms (OpenSocial, Facebook) – most services are likely to allow 3rd-party developers to provide applications on their platforms.
  • Vertical communities (Ning) – ultimately this may develop such that a service like Facebook acts as a repository for a user’s online identity, while specific groups of people gather on other networks.
  • Availability of information – a single sign-on, with automatic data transfer between services.

The future may be “Social Prediction Networks”. This is a variation on the theme of using trusted networks to filter content: Instead of Blogging meets Search, I characterise Social Prediction Networks as Digg meets Facebook. Shrewd observers will note Facebook has already implemented Digg-like features, while simultaneously topic-specific, community-orientated Digg-clones are being launched. People gather into interest groups around a topic, and then through use of tagging and rating, the community filters content. The system effectively predicts what other people in the group will find useful. This may be an optimum approach for groups above the Dunbar number (or an equivalent number representing the maximum number of people a person can form stable relationships with).
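As a toy sketch of that filtering idea (invented names and ratings, not anything Dave described): rank content for an interest group using only the ratings of the group’s own members, ignoring everyone else on the site.

```python
# "Digg meets Facebook" in miniature: the group's own ratings filter the content.
from collections import defaultdict

group_members = {"alice", "bob", "cara"}
ratings = [                                   # (user, item, rating) from the whole site
    ("alice", "article-about-fishing", 5),
    ("bob",   "article-about-fishing", 4),
    ("dave",  "celebrity-gossip",      5),    # dave is outside the group, so ignored
    ("cara",  "tram-financing-post",   3),
]

scores = defaultdict(list)
for user, item, rating in ratings:
    if user in group_members:                 # only trust the group's own ratings
        scores[item].append(rating)

ranked = sorted(scores.items(), key=lambda kv: -sum(kv[1]) / len(kv[1]))
for item, rs in ranked:
    print(item, sum(rs) / len(rs))            # predicted usefulness for the group
```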

Interesting Aspects of Facebook

Three were discussed:

  1. Social graph (friend list) – email and SMS (mobile phone) service providers have rich data on the frequency of communication between people, yet aren’t using this information to form social networks. Dave noted that two major email service providers, Yahoo and AOL, are currently struggling to thrive – this could be an avenue for their future development.
  2. Shared social activity streams – knowledge of what your friends think is important. Friends are more likely to influence you than people you do not know.
  3. API/Platform – dynamic behaviour and links across your social network.

Further Observations

Will growth in social networks continue? Yes – the friend list adds value to the content.

Will others compete? Probably, as a “long-tail” of networks, likely topic-specific.

Can social networks be monetised better? Currently social networking services generate far less revenue than search services. The challenge for social networking sites is to move towards the wealthy territory of search services. At the same time, search services are moving towards becoming more like social networking sites.

How can traditional companies engage with social networking sites? Social networking sites work best for sales where a product has a strong aspect of peer pressure in the decision to buy. The most important advice is not to create a copy of a website: Instead provide less complex content that uses social networks to draw users to a website.

Applications for social networks tend to be over-complicated, normally because programmers attempt to implement functions found in software they have previously written for other platforms or websites. Generally the successful applications are very simple. Some developers have opted to break complex applications into a series of smaller applications, and use the virality of social networking sites to build traffic for one application from another.

Social network applications are exceptionally viral. They can gain users very rapidly, yet also lose users just as fast. Much of this virality comes from feeds, which typically alert friends when a user installs an application. Within a few years the feed is likely to be based on actual usage of an application.

Facebook now allows applications to be added to “fan pages” (or product pages) – so individual users need not now be forced to install an application to use it.

Those using email lists for retention are best to focus on the title of the email, and not the content. Merely make it easy to find a URL in the content. The key decision for the reader is whether to open the email. What the email says is almost irrelevant – they’ve already decided to visit the site based on the title.

Mike Masnick on Techdirt, Information and Consultancy

These are notes from a talk given by Mike Masnick, CEO of Techdirt, a “technology information company”. Mike addressed a small Edinburgh Entrepreneurship Club/Edinburgh-Stanford Link gathering on 22 January 2008. He outlined the company’s history and philosophy – “use what’s abundant to solve what’s scarce” – and outlined an interesting approach to the delivery of expert/consultancy business services. Read More

El’s Extreme Anglin’ – 2007 Retrospective – Part II

This article continues my observations on running El’s Extreme Anglin’, a World of Warcraft (WoW) fishing guide, with a look at some of the trends in usage during 2007. You may also be interested in part I of the 2007 retrospective, which contained some observations on aspects such as thought leadership, quality and links.

Read More

Learn2Play, the new Real Money Trading?

Extract from advert for Luke's Gold Making Guide.

Real Money Trade (RMT) is the buying and selling of virtual property or currency for real-world money. Many virtual worlds now embrace this trade in virtual currency and goods, often as a source of income for the world’s operator. Blizzard, the developer of World of Warcraft (WoW), does not:

“RMT is a TOS [Terms of Service] violation. The fanbase is pretty committed to being against it, and we’ve got a group of guys that are committed to stopping TOS violations. The game was never designed for that in mind – everyone starts off even. In the real world that’s not true, but in WoW everyone starts even, and the RMT stuff messes with that.”

Not just rhetoric. They have sued a leading supplier to prevent them advertising in-game. And they regularly ban large numbers of accounts used to “farm” gold.

That environment seems to have expanded another quite logical commercial market: Teaching players to play. “Learn2Play” in the vernacular, or “L2P” in shorthand.

Rather than buying gold (in-game currency), players buy the knowledge of how to make gold themselves. The market isn’t restricted to gold. Guides to power-leveling (advancing a character through the first part of the game as fast as possible) are also popular: Rather than pay someone else to level a player’s character, players can buy a guide containing instructions optimised for rapid leveling.

This article explains Learn2Play, and explores some of the history and trends in this “market”. It focuses specifically on World of Warcraft, in English, which is sufficiently popular to create a tangible commercial Learn2Play market. It draws on my own experience from selling these guides.

Superficial analysis suggests the World of Warcraft Learn2Play market is valued at over $3 million revenue per year. In spite of WoW being an online experience, revenue from physical book sales may still exceed revenue from the virtual equivalent. The market is far smaller than RMT. But the notion that people are willingly investing US dollars in knowledge and skills that are useful solely within one virtual environment perhaps deserves as much attention as other real-virtual money transactions.

Read More