Tim O’Reilly visits Nature


Last Friday Tim O’Reilly dropped in on his way through London and gave a web seminar at Nature. We have been running these web seminars at Nature for about two years. They kicked off back then with talks from Jimmy Wales and David Weinberger, and it was great to have Tim come in at almost the two-year mark to hear his take on Web 2.0 and beyond.

Instead of giving a presentation Tim just opened the floor to questions, and I’ve cobbled together an account here.

Everyone agrees that the internet is a transformative technology, and as the cost of finding information drops, businesses that rely on information must change in response. It’s also pretty clear that there are many competing business models out there. Aside from being a great technologist and diviner of trends in the geek space, the core of Tim’s business is publishing, principally book publishing, so the anticipation was that his insights might be particularly relevant to NPG.

I’ll try to cover the guts of the talk below, but in deference to a promise that I recently made to my flat-mate I am going to avoid using the term ‘paradigm shift’ (exercise for the reader: identify the paradigm shifts described in this blog post; bonus points: predict upcoming paradigm shifts and let me know). For clarity, when referring to things that Tim O’Reilly said or did personally I’ll refer to him as Tim, and likewise to his company as O’Reilly.

Timo opened the questions by saying that the theme of the recent Web 2.0 conference was ‘the edge of the network’, yet web 2.0, it seems, is becoming corporate and mainstream. He asked Tim whether he would like to talk about what interesting things are happening on the edge.

Before answering the question directly, Tim decided to give some background to Web 2.0. It’s true that Web 2.0 is becoming more mainstream (Steve Ballmer was at this year’s conference, for instance), but Tim pointed out that he has been talking about these ideas for about 10 years. Back then he gave a talk on hardware, software, and what he called at the time ‘infoware’. The PC represented a commoditisation of hardware, and when that happened the value in computing, which previously resided with the makers of mainframes, didn’t disappear; it migrated from hardware to software. Famously, IBM missed this shift and Microsoft capitalized. In the same way that value moved from hardware to the operating system with the advent of the PC, value moved to networked applications in the 90s with the advent of the internet. The web represented the new layer on top of which new applications were built. This really hit home for Tim when someone told him that they had just gone out and bought a computer in order to use Amazon.

Excitement about the internet drove a bubble that burst in about 2000, but by about 2003 there was a noticeable resurgence of interest, with some common features among the businesses that had survived the bubble. There seemed to be a renaissance of the web, and O’Reilly decided to host a conference highlighting this phenomenon. Dale Dougherty of O’Reilly coined the term Web 2.0 as the name of the conference. You can see a bit of the history here.

It seemed that one of the key features of these sites was that they gained their value from network effects: the more people that used a site, the more valuable the site became. They were using the network as a platform. The canonical examples of this are eBay and Google. With eBay, the users are the value. What Google realised is that the behavior of users on the internet could tell you about the value of sites. PageRank uses the behavior of people on the internet to tell you things about the structure of the internet: many people linking to a site indicates that the site might be more interesting than a site without so many links. Google’s ad auction is another example of using collective intelligence. Google doesn’t just sell an ad slot to the highest bidder; it maximizes revenue based on click rates and costs. If a lower-priced ad is going to get more clicks, the total return is going to be higher. This requires real-time analysis of what users are likely to do.
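To make the two examples concrete, here is a toy sketch (not Google’s actual implementation; the link graph, bids, and click rates below are invented purely for illustration): a minimal PageRank by power iteration, and ad ranking by expected revenue rather than raw bid.

```python
# Toy illustration only, not Google's real code.

# 1. Minimal PageRank: pages linked to by many others accumulate rank.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outgoing in links.items():
            if outgoing:
                share = damping * rank[p] / len(outgoing)
                for q in outgoing:
                    new[q] += share
            else:  # dangling page: spread its rank uniformly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Both "a" and "c" link to "b", so "b" ends up with the highest rank.
ranks = pagerank({"a": ["b"], "b": ["c"], "c": ["b"]})

# 2. Ad auction intuition: rank by bid x predicted click rate,
# not by bid alone. A cheaper ad that attracts more clicks can win.
ads = [("A", 2.00, 0.01),   # high bid, low click rate: 0.020 per impression
       ("B", 0.50, 0.05)]   # low bid, high click rate: 0.025 per impression
best = max(ads, key=lambda ad: ad[1] * ad[2])
```

Both examples turn aggregate user behavior (links created, clicks made) into a ranking signal, which is the point Tim was making.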

At the back end of these sites are massive data centres with large amounts of information about user behavior. This is not new. Credit card companies and telephone operators were doing this a long time before the internet was mainstream, but they have failed to convert these assets into user-facing services. At the moment there is no smart phone address book, and phone bills are almost useless lists of calls with no intelligent filtering of behavior. Credit card companies and phone companies have the data but don’t seem to have a clue about how to extract extra value from it.

If you are running a business, a question to ask is: what assets do you have that are generated by users, and how do you turn those assets into real-time services for users? This is the heart of collective intelligence.

If we look at Nature, what is the data inventory that we have? We should know as much about who the good scientists are as anyone, and this is very valuable.

Addressing the question of what is on the edge, i.e. what’s the next value-creating platform that will result in changes to the current web ecosystem, Tim pointed strongly at applications that will integrate with sensors in the real world and not just be created by people typing on keyboards. Microsoft’s Photosynth does this by harnessing the huge number of cameras that are out in the wild. Norwich Union has a GPS-enabled pay-as-you-drive insurance product. Mobile phones could be a great platform on which to build, but at the moment mobile phone companies, like credit card companies, don’t seem to have a clue.

Wesabe and Mint are two start-ups in the States that are mining the spending behavior of people who use their services, pointing the way towards connecting real-world behavior with collective intelligence.

IMMI, with Nielsen, are introducing a new way of generating TV ratings. They have a special mobile phone that takes a sound sample of the ambient noise in its vicinity every 30 seconds or so and matches what it hears against cues that it knows from TV adverts.

The growing availability of genetic data is another area where collective intelligence will feature. (This made the cover of Nature last week.)

Timo asked about publishing. The open access movement is seen as part of Web 2.0. At the point of use most web apps are free, but how does this fit with publishers’ business models? How does O’Reilly walk the tightrope between being a publishing house and its vision of trying to change the world?

Tim said that the market will figure out a way to pay for the things that are truly valuable. You start with the faith that these things will get paid for, try things, watch what other people are doing, and try to catch the wave.

The O’Reilly core book business remains challenged; it dropped by half between 2000 and 2003. Tim takes the view that whether we (O’Reilly and Nature) survive doesn’t matter, as long as what we are doing survives. When the crunch came, O’Reilly retrenched on their core business. They asked what it was that they were really trying to do; their mission statement is “changing the world by spreading the knowledge of innovators”. They moved into conferences, which have been very successful for them, and into new business models such as Safari Books Online, a subscription service.

Tim says that you have to look and ask what the core function is that you perform; for Nature it’s curation: a seal of approval, a conferring of status.

O’Reilly is seriously looking at how they can do that for their authors, asking questions like: what are the benefits of being an O’Reilly author? What can we do for our authors so that it is better for them to publish with us?

Q from James: how do you add value for readers?

It’s still an untold story. O’Reilly tends to listen to readers; a lot of what O’Reilly does is watching the earliest adopters and looking for things that are about to be adopted by a lot of people. In some sense the service is to bring things to people that they would not otherwise have found. The service is a kind of storytelling. Take Make magazine, for example: one of the patterns they saw was a new engagement with hardware, so they tried some books on hardware hacking and then launched the magazine. They also launched the Maker Faire, a community gathering for people who like to make things.

It’s a very broad tent. The Maker Faire was like a county fair: a normal fair has pigs, this one has robots. There are many people in their back yards making things, like the golden age of mechanics but now with robots and sensors. They are engaged with a new craft. In one pavilion there is the Swap-O-Rama-Rama, with a bunch of people with sewing machines and silkscreening; in the next tent is a bunch of PC geeks cobbling together a supercomputer. These are all people who don’t want to buy things but to build things. Computing is infusing the physical world.

Part of being a curator is storytelling, bringing things together. A great book to read is Cory Doctorow’s Down and Out in the Magic Kingdom. The core idea is: how do you create a vision that other people will want to follow?

Timo asks Tim to talk about Foo Camp.

Tim gives the background to Foo Camp. The original name of the get-together was going to be ‘Foo Bar’, which is a geek joke. (I guess I got my geek props for getting that joke; most of the people attending the talk were left a little blank. Then Tim mentioned another geek joke that went totally over my head, so I guess I’ve got a bit further to go.)

Tim says that in 1998 they organized the Open Source Summit, which was about people meeting people.

He realised that at that time many people in the open source community were working towards very much the same goals, but that they had never met each other. Convening is part of telling a story, and the spirit behind Foo Camp was just to get a whole bunch of really great and interesting people together and see what happened. The initial contact between Tim and Timo happened when O’Reilly was doing a bioinformatics conference, and Sci Foo later resulted from this contact. The idea was to recreate the Foo Camp experience with scientists.

Chris asked about sensor-driven software and the ethos of the web. When ‘the man’ gets involved, there is always tension. On the web, where companies get involved in community building, what are the responsibilities of those companies, specifically with regard to the data that they are collecting?

Tim said that if you look at the history of this, it goes in cycles; every industry goes through this. There is a creative anarchy, then some groups start to dominate and move to capturing more value than they create. The computer industry was a very exciting place, and then it became boring, because it became consolidated.

Bill Gates was a visionary, one of the greatest visionaries of the century. His vision was the idea of a personal computer, a computer in every home. Then Microsoft started eating their children. They were saved because a bunch of people in the wild were doing the internet; then this got stale in its first incarnation, and then there was a new crop of people with a fresh approach.

There is going to be a lot of consolidation; the man will take over (he might be idealistic, like Google), and it is going to get a lot more boring. The interesting question is what will happen when Google’s growth slows down.

But you have to have a belief in people’s ability to find new things. There are going to be a lot of new areas coming out of science, and one thing we can do is to help birth the future.

Timo asks: should we worry about Google and the privacy of information?

Tim does worry, but sees that people are adapting; an example is the Facebook news feed, where there was initial horror, but then people adapted. There will be a trade-off: you can’t rely on security by obscurity any more, as demonstrated by the Spock people search. He sees this in the UK, where there are more surveillance cameras than almost anywhere else.

He watched one blog storm about a council dealing with people putting up cancelled stamps on posters. People on the blog were saying that the council should use the surveillance cameras to find out who was doing it.

In another example, a woman found her house had been covered in toilet paper. She went around to the local stores and demanded that they show her their CCTV footage until she found one that showed a bunch of kids buying loads of toilet paper …

The folks at Google gather a lot of data, but so do credit card companies, phone companies, and governments. There will need to be some amount of regulation and adaptation, but more adaptation than regulation. Society adapts.

Timo: will publishers need to adapt to Google indexing print as well as web content?

Tim supports what Google is doing with Book Search, but right now it looks like Google has not figured out an economic model that works for publishers. Book search should work like web search; see, for example, the Open Content Alliance.

It’s a good thing for books to be searchable; as in all markets, changes happen and we can and will adapt. Tim doesn’t think of himself as a publisher, but more as someone who helps make interesting futures happen.

Q: on data storage versus bandwidth: science is very data intensive, so is the new fundamental limit now bandwidth rather than data storage?

Tim says that he is not a network operator: ‘I do think peer-to-peer architectures are underutilized; there is a lot of unused bandwidth out there. We may hit storage limits. A few years ago I was at IBM, and they said that massive storage is coming.’ Google already does this with a data centre in a box; there is an analogy between TCP/IP and container shipping.

He is sure that the problem will get solved, but it will probably get solved badly. If bandwidth becomes more scarce it will become more valuable, and more resources will go into fixing the problem. He thinks markets do eventually work.

From Dominic: is there any danger that a book with good information will not sell because people turn to free information that is not as good, leading to a drive towards mediocrity?

Tim says there is a lot of good information out there, and it’s not necessarily mediocre (except searching for hotels on Google, which might be the first indication that they are turning evil; joke, I think?). Tim doesn’t buy that published stuff is better than online stuff. There is great stuff online and great stuff in print. He does see that things that were powerful and fashionable change; for example, of the top 100 blogs, how many are publishing companies? This is a new way of publishing.

Timo: you have done some innovation around online publishing, e.g. Safari, and Rough Cuts is another example. Can you talk about some of your experiments with online publishing?

Tim says that O’Reilly have been working on this since the mid-80s, when they published a version of Unix in a Nutshell for HyperCard, and then DocBook. They didn’t want to maintain a lot of documents and wanted a free reader; this led them to the web, and they have been thinking about online books for many years. The open web is a better model than a restricted-access model; they didn’t like most of the online models because they didn’t bring any of the benefits of a print book, or of the web.

Search can begin to make things interesting, and this led to the decision to build Safari as a channel; it is now their third-largest channel. Now they are looking at what the impact of Google Book Search is going to be. The challenge is to think: what is it about online that is really better?

Publishing on the internet is fast, cheap, and out of control.

Blogs are fast, cheap, and out of control. O’Reilly use their blog to drive traffic to some high-price, short-print-run books, e.g. $500 for the Facebook report. You have to ask what the synergies are in moving people from one space to another. You need to experiment with a lot of models and with the synergies between them. This has allowed them to experiment with different types of books, such as short-print-run, timely objects.

… and that was it, we ran out of time. (p.s. captions for the picture are welcome)

