lines in the sand …

I had the good fortune of receiving an advance copy of Ken Auletta’s forthcoming book “Googled, The End of the World as We Know It“. It’s a fascinating read, one that raises a whole set of interesting dichotomies related to Google and their business practices. Contrast the fact that the Google business drives open and free access to data and intellectual property, so that the world becomes part of their corpus of data – yet they tightly guard their own IP in regards to how to navigate that data. Contrast that the users and publishers who gave Google the insights to filter and search data are the ones who are then taxed to access that data set. Contrast Google’s move into layers beyond web sites (e.g., operating systems, web browsers) with their apparent belief that they won’t have issues stemming from walled gardens and tying. In Google we have a company that believes “Don’t be evil” is a sufficient promise for their users to trust their intentions, yet it is a company that has never articulated what they think is evil and what is not.

There is a lot to think about in Auletta’s book – it’s a great read. When I began reading, I hoped for a prescriptive approach, a message about what Google should do, but instead Auletta provides the corporate history and identifies the challenging issues, then leaves it to the reader to form a position on where they lead. In my case, the issue it got me thinking about most was antitrust.

My bet is that in the coming few years Google is going to get hauled into an antitrust episode similar to what Microsoft went through a decade ago. Google’s business has grown to dominate navigation of the Internet. Matched with their incredibly powerful and distributed monetization engine, this power over navigation is going to run headlong into a regulator. I don’t know where (US or elsewhere) or when, but my bet is that it will happen sooner rather than later. And once it does happen, the antitrust process will again raise the thorny issue of whether regulation of some form is an effective tool in the fast-moving technology sector.


I was a witness against Microsoft in the remedy phase of its antitrust trial, and I still think a lot about whether technology regulation works. I now believe the core position I advocated in the Microsoft trial was wrong. I don’t think government has a role in participating in technology design, and I believe the past ten years have adequately illustrated that the pace of innovation and change will outrun any one company’s ability to monopolize a market. There’s no question in my mind that Microsoft still has a de facto monopoly on the market for operating systems. There’s also no question that the US and EU regulatory environments have constrained the company’s actions, mostly for the better. But the primary challenges for Microsoft have been from Google and, to a lesser extent, from Apple. Microsoft feels the heat today, but it is coming from Silicon Valley, not Brussels or Washington, and it would be feeling this heat no matter what had happened in the regulatory sphere. The EU’s decisions to unbundle parts of Windows did little good for RealNetworks or Netscape (which had been harmed by the bundling in the first place), and my guess is that Adobe’s Flash/AIR and Mozilla’s Firefox would be thriving even if the EU had taken no action at all.

But if government isn’t effective at forward-looking technology regulation, what alternatives do we have? We can restrict regulation to instances where there is discernible harm (approach: compensate for past wrongs, don’t design for future ones) or stay out and let the market evolve (approach: accept the voracious appetite of these platforms because they’re temporary). But is there another path? What about a corporate statement of intent like Google’s “Don’t be evil”?

“Don’t be evil” resonated with me because it suggested that Google as a company would respect its users first and foremost and that its management would set boundaries on the naturally voracious appetite of its successful businesses.

In the famous cover letter in Google’s registration statement with the SEC before its IPO, its founders said: “Our goal is to develop services that significantly improve the lives of as many people as possible. In pursuing this goal, we may do things that we believe have a positive impact on the world, even if the near term financial returns are not obvious.” The statement suggests that there are a set of things that Google would not do. Yet as Auletta outlines, “don’t be evil” lacks forward-looking intent, and most importantly, it doesn’t outline what good might mean.

Nudge please …

Is there a third way — an alternative that places the company builders in a more active position? After almost two decades of development, many of the properties of the Internet have been documented and discussed, so why not distill these and use them as guideposts? I love reading and rereading works like the Stupid Network, the Cluetrain Manifesto, the Cathedral and the Bazaar, or (something seasonal!) the Halloween Memos. In these works, and others, there is a mindset, an ethos or culture that is philosophically consistent with the medium. When I first heard “Don’t be evil” my assumption was that it, and by definition good, referred to that very ethos. What if we unpack these principles, so that builders of the things that make up these internets can make explicit their intent and begin to establish a compact, versus a loose general statement of “goodness” that is subject to the constraint that “good” can be relative to the appetite of the platform? Regulation in a world of connected data, where the network effect of one platform helps form another, has much broader potential for unintended consequences. How we address these questions is going to affect the pace and direction of technology-based innovation in our society. If forward-looking regulation isn’t the answer, can companies themselves draw some lines in the sand, unpack what “don’t be evil” suggested, and nudge the market towards an architecture in which users, companies, and other participants in the open internet signal the terms and expectations they have?

Below is a draft list of principles. It is incomplete, I’m sure — I’m hoping others will help complete it — but after reading Auletta’s book and after thinking about this for a while I thought it would be worth laying out some thoughts in advance of another regulatory mess.

1. Think users 

When you start to build something online the first thing you think about is users. You may well think about yourself — user #1 — and use your own workflow to intuit what others might find useful, but you start with users and I think you should end with users. This is less of a principle and more of a rule of thumb, and a foundation for the other principles. It’s something I try to remind myself of constantly. In my experience with big and small companies this rule of thumb seems to hold constant. If the person who is running the shop you are working for doesn’t think about end users and/or doesn’t use your product, it’s time to move on. As Eric Raymond says, you should treat your users as co-developers. Google is a highly user-centric company for one of its scale; they stated this in the preamble to the IPO registration statement, and they have managed to stay relatively user-centric, with few exceptions (the Book deal maybe being the most obvious). Other companies — e.g., Apple, Facebook — are less user-centric. Working on the Internet is like social anthropology: you learn by participant observation — the practice of doing and building is how you learn. In making decisions about services like Google Voice, Beacon, etc., users’ interests need to be where we start and where we end.

2. Respect the layers

In 2004 Richard Whitt, then at MCI, framed the argument for using the layer model to define communication policy. I find this very useful: it is consistent with the architecture of the internet, it articulates a clear separation of content from conduit, and it has the added benefit of being a useful visual representation of something that can be fairly abstract. Whitt’s key principle is that companies should respect the distinction between these layers. Whitt captures in a simple framework what is wrong with the cable companies or the cell carriers wanting to mediate or differentially price bits. It also helps to frame the potential problems that Sidewiki, the iPhone, Google Voice, or Chrome present (I’m struck by the irony that “respecting the layers” in the case of a browser translates into no features from the browser provider being embedded into the chrome of the browser; calling the browser Chrome is suggestive of exactly what I don’t want, i.e., Google-specific chrome!). All these products have the potential to violate the integrity of the layers by blending the content and the application layers. It would be convenient and simple to move on at this point, but it’s not that easy.

There are real user benefits to tight coupling (and the blurring of layers), in particular during the early stages of a product’s development. There were many standalone MP3 players on the market before the iPod. Yet it was the coupling of the iPod to iTunes, and the set of business agreements that Apple embedded into iTunes, that made that market take off (note that this occurred eighteen months after the launch of the iPod). Same for the Kindle — coupling the device to Amazon’s store and to the wireless “Whispernet” service is what distinguishes it from countless other (mostly inferior) ebooks. But roll the movie forward: it’s now six and a half years after the launch of the coupled iTunes/iPod system. The device has evolved into a connected device that is coupled both to iTunes and AT&T, and the store has evolved way beyond music. Somewhere in that evolution Apple started to trip over the layers. The lines between the layers became blurred, and so did the lines between vendors, agents and users. Maybe it started with the DRM issue in iTunes, or maybe the network coupling, which in turn resulted in the Google Voice issue. I’m not sure when it happened, but it has happened, and unless something changes it’s going to be more of a problem, not less. Users, developers and companies need to demand clarity around the layers, and transparency into the business terms that bound the layers. As iTunes scales — to become what it is in essence, a media browser — I believe the pressure to clarify these layers will increase. An example of where the layers have blurred without the feature creep/conflict is the search box in, say, the Firefox browser. Google is the default, there is a transparent economic agreement that places them there, and users can adjust and pick another default if they wish.
One of the unique attributes of the internet is that the platform on which we build things is the very same as the one we use to “consume” those things (remember the thrill of “view source” in the browser). Given this recursive aspect of the medium, it is especially important to respect the layers. Things built on the Internet can themselves redefine the layers.

3. Transparency of business terms

When a platform like Google, iTunes, Facebook, or Twitter gets to scale, it rapidly forms a basis on which third parties can build businesses. Clarity around the business terms for inclusion in the platform, and around what drives promotion and monetization within the platform, is vital to the long-term sustainability of the underlying platform. It also reduces the cost of inclusion by standardizing the business interface into the platform. AdSense is a remarkable platform for monetization. The Google team did a masterful job of scaling a self-service (read: standardized) interface into their monetization system. The benefits of this have been written about at length, yet aspects of the platform like “smart pricing” aren’t transparent. See this blog post from Google about smart pricing and some of the comments in the thread. They include: “My eCPM has tanked over the last few weeks and my earnings have dropped by more then half, yet my traffic is still steady. I’m lead to believe that I have been smart priced but with no information to tell me where or when”

Back in 2007 I ran a company called Fotolog. The majority of the monetization at Fotolog was via Google. One day our Google revenues fell by half. Our traffic hadn’t fallen, and up to that point our Google revenue had been pretty stable. Something was definitely wrong, but we couldn’t figure out what. We contacted our account rep at Google, who told us that there was a mistake on our revenue dashboard. After four days of revenues running at the same depressed level we were told we had been “smart priced”. Google would not offer us visibility into how this was measured or what the competitive cluster was against which we were being tested. That opacity made it very hard for Fotolog to know what to do. If you get smart priced you can end up having to reorganize your entire base of inventory, all while groping to understand what is happening in the black box of Google. Google points out that they don’t directly benefit from many of these changes in pricing (the advertisers do pay less per click), but Google does benefit from the increased liquidity in the market. As with Windows, there is little transparency in regards to the pricing and economics within the platform. This in turn leaves a meaningful constituent on the sidelines, unsatisfied or unclear about the terms of their business relationship with the platform. I would argue that smart pricing, and the lack of transparency into how their monetization platform can be applied to social media, is driving advertisers to services like Facebook’s new advertising platform.

Back to Apple. iTunes is, as I outlined above, a media browser — we think about it as an application because we can only access Apple stuff through it, a simple yet profound design decision. Apple created this amazing experience that arguably worked because it was tightly coupled end to end, i.e., the experience stretched from the media through the software to the device. Then when the device became a phone, the coupling extended to the network (here in the US, AT&T). I remember two years ago I almost bricked my iPhone — Apple reset my iPhone to its birth state — because I had enabled installing applications that weren’t “blessed” by Apple. My first thought was, “Isn’t this my phone? What right does Apple have to control what I do with it? Didn’t I buy it?” A couple of months ago, Apple blocked Google Voice’s iPhone application; two weeks ago Apple rejected someecards’ application from the app store while permitting access to a porn application (both were designated 17+; one was satire, the other wasn’t). The issue here isn’t monopoly control, per se — Apple certainly does not have a monopoly on cell phones, nor AT&T on cell phone networks. The trouble is that there is little to no transparency into *why* these applications weren’t admitted into the app store. (someecards’ application did eventually make it over the bar; you can find it here.) Will Google Voice get accepted? Will Spotify? Rdio? someecards? As with the Microsoft of yesteryear (which, among other ills, forbade disclosure of its relationships with PC makers), there is an opaqueness to the business principles that underlie the iTunes app store. This is a design decision that Apple has made and one that, so far anyway, users and developers have accepted. And, in my opinion, it is flawed. Ditto for Facebook. This past week, the terms for application developers were modified once again.
A lot of creativity, effort, and money has been invested in Facebook applications — the platform needs a degree of stability and transparency for developers and users.

4. Data in, data out?

APIs are a cornerstone of the emerging mesh of services that sit on top of and around platforms. The data flows from service providers should, where possible, be two-way. Services that consume an API should publish one of their own. The data ownership issues among these services are going to become increasingly complex. I believe that users have the primary rights to their data, and the applications that users select have a proxy right, as do other users who annotate and comment on the data set. If you accept that as a reasonable proposition, then it follows that service providers should have an obligation to let users export that data and also let other service providers “plug into” that data stream. The compact I outline above is meaningfully different from what some platforms offer today. Facebook asserts ownership rights over the data you place in its domain; in most cases the data is not exportable by the user or another service provider (e.g., I cannot export my Facebook pictures to Flickr, nor wire up my feed of pictures from Facebook to Twitter). Furthermore, if I leave Facebook they still assert rights to my images. I know this is technically the easiest answer. Having to delete pictures that are now embedded in other people’s feeds is a complex user experience, but I think that’s what we should expect of these platforms. The problem is far simpler if you just link to things and then promote standards for interconnections. These standards exist today in the form of RSS or Activity Streams — pick your flavor, let users move data from site to site, and let users store and save their data.
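To make “data out” concrete, here is a minimal sketch of what a user-controlled export could look like: a service emitting a user’s photo stream as an RSS 2.0 feed that any other service can consume. The record schema, usernames, and URLs are invented for illustration; this is not any particular platform’s API.

```python
import xml.etree.ElementTree as ET

def export_photos_rss(username, photos):
    """Build a minimal RSS 2.0 feed so another service can consume
    this user's photo stream. `photos` is a list of dicts with
    'title', 'url', and 'posted' keys (a hypothetical schema)."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"{username}'s photos"
    ET.SubElement(channel, "link").text = f"http://example.com/{username}"
    ET.SubElement(channel, "description").text = "User-exported photo stream"
    for p in photos:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = p["title"]
        ET.SubElement(item, "link").text = p["url"]
        ET.SubElement(item, "pubDate").text = p["posted"]
    return ET.tostring(rss, encoding="unicode")

feed = export_photos_rss("alice", [
    {"title": "Sunset",
     "url": "http://example.com/alice/1",
     "posted": "Mon, 02 Nov 2009 10:00:00 GMT"},
])
print(feed)
```

The point is less the format than the compact: because the output is a standard, the user (or any service they authorize) can move the stream elsewhere without the platform’s permission.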

5. Do what you do best, link to the rest

Jeff Jarvis’s motto for newsrooms applies to service providers as well. I believe the next stage of the web is going to be characterized by a set of loosely coupled services — services that share data — offering end users the ability to either opt for an end-to-end solution or roll their own in a specific domain where they have depth of interest, knowledge, or data. The first step in this process is that real identity is becoming public and separable from the underlying platform (vs. private in, say, The Facebook, or alias-based in most earlier social networks). In the case of services like Facebook Connect and Twitter OAuth this not only simplifies the user experience; identity also pre-populates a social graph into the service in question. OAuth flows identity into a user’s web experience, vs. the disjointed efforts of the past. This is the starting point. We are now moving beyond identity into a whole set of services stitched together by users. Companies of yesteryear, as they grew in scale, started to co-opt vertical services of the web into their domain (remember when AOL put a browser inside of its client, with the intention of “super-setting” the web). That was an extreme case — but it is not all that different from Facebook’s “integration” of email: a messaging system with no IMAP access, one that sends me an email to my IMAP account to tell me to check that I have a Facebook “email”. This approach won’t scale for users. Kevin Marks, Marc Canter, and Jerry Michalski are some of the people who have been talking for years about an open stack. In the latter half of this presentation Kevin outlines the emerging stack. I believe users will opt — over time — for best-in-class services vs. the walled-garden, roll-it-once approach.


6. Widen my experience – don't narrow it

Google search increasingly serves to narrow my experience on the web rather than expand it. This is driven by a combination of the pressure inherent in their business model to push page views within their domain vs. outside (think Yahoo Finance, Google OneBox, etc.) and the evolution of an increasingly personalized search experience, which tends to feed back to me and amplify my existing biases — serving to narrow my perspective vs. broaden it. Auletta talks about this at the end of his book. He quotes Nick Carr: “They (Google) impose homogeneity on the Internet’s wild heterogeneity. As the tools and algorithms become more sophisticated and our online profiles more refined, the Internet will act increasingly as an incredibly sensitive feedback loop, constantly playing back to us, in amplified form, our existing preferences.” Features like social search will only exacerbate this problem. This point is the more subtle side of the point above. I wrote a post a year or two ago about thinking of centres vs. wholes and networks vs. destinations. As the web of pages becomes a web of flows and streams, the experience of the web is going to widen again. You can see this in the data — the charts in the distribution now post illustrate the shift that is taking place. As the visible — user-facing — part of a web site becomes less important than the APIs and the myriad ways that users access the underlying data, the web, and our experience of it, will widen again.
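The feedback loop Carr describes can be sketched as a deliberately crude toy model (all topics and numbers invented): if a recommender always serves whatever the user has clicked most, and each serving earns another click, a small initial lead snowballs into near-total dominance of the user’s attention.

```python
# Toy model of a personalization feedback loop. The "recommender"
# always shows the topic with the highest click count, and each
# showing earns another click -- reinforcing the existing lead.
clicks = {"politics": 5, "sports": 4, "music": 3, "travel": 3}

def recommend(counts):
    # A caricature of engagement-maximizing personalization:
    # always serve the current favorite.
    return max(counts, key=counts.get)

for _ in range(100):
    topic = recommend(clicks)
    clicks[topic] += 1  # the user clicks what they are shown

share = clicks[recommend(clicks)] / sum(clicks.values())
print(f"favorite topic now gets {share:.0%} of all clicks")
# "politics" starts with a 1-click lead and ends with 105 of 115 clicks
```

A real ranking system is far more nuanced, but the structural point survives: when output is fed back as input, preferences are amplified rather than merely reflected.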


I have outlined six broad principles that I believe can be applied as a design methodology for companies building services online today. They are inspired by others; the list would be very long, and I’m not going to attempt to document it, since I would surely miss someone. Building companies on today’s internet is by definition an exercise in standing on the shoulders of giants. Internet standards from TCP/IP onward are the strong foundation of an architecture of participation. As users pick and choose which services they want to stitch together into their cloud, can companies build services based on these shared data sets in a manner that is consistent with the expectations we hold for the medium? The web has a grain to it, and after 15 years of innovation we can begin to observe the outlines of that grain. We may not always be able to describe exactly what it is that makes something “web consistent”, but we know it when we see it.

The Microsoft antitrust trial is a case study in regulators acting as design architects. It didn’t work. Google’s “don’t be evil” mantra represents an alternative approach, one that is admirable in principle but lacking in specificity. I outline a third way here, one in which we as company creators coalesce around a set of principles saying what we aspire to do and not do, principles that will be visible in our words and our deeds. We can then nudge our own markets forward instead of relying on the “helping hand” of government.


  • debs

    John – nicely outlined. These principles all work for design, development, community principles as well and integrate nicely as business principles. After 15 years of innovating on the web we are at last at a point where we understand that we must work with the teenager..understand what makes her sing and deliver accordingly – to users, partners, businesses. I just cant wait till we finally drop the “webpage” metaphor ey..;)


  • terrycojones

    Hi JB

    Nice post – thanks for taking the time to write it up. I find it all highly relevant to FluidDB, and have thoughts along some of the above lines too, as you might imagine. I'm glad you see at least part of the future in terms of shared data with users in control, that's something we have in common.

    BTW, I followed the MS trial obsessively back in 96/7 or so. A part of your argument sounds like saying “there's no point punishing a murderer because the victims are beyond help”. That's probably not all of what you meant, but it sounds a bit that way. There are no easy answers as you know, but I do think you're pointing in a good direction. Layers and appropriateness of behavior change over time as well. In fact you could argue that the Sherman Act was a first attempt to draw a coarse-grained division between acceptable and unacceptable behavior based on crossing a line (into monopoly). That's hopelessly outdated in some ways (and I agree that government intervention is clumsy, ill-guided, an influence/money/political game) as we're now seeing layers in places where they didn't exist before (like computational platforms, the data in those, the apps that run on them, etc). The issue of eating your own user base (think about the various approaches to MS, Facebook, Twitter (see Chris Dixon's(?) post claiming Twitter will inevitably begin to eat its own) is very interesting. Final thought, the internet being a platform on which we build & consume is not unique to the 'net: computers have been that way for a very long time. There is something fundamentally weird and powerful about having a tool (the computer) that you (as a programmer) use to build other tools, that you then operate on/in the context of the original tool. Perhaps this is an important component of what gives rise to these emergent layers. If so, there may be a small number of well recognized patterns that could be documented, whose emergence could be predicted in advance (presupposing success of a new tool), and upon which companies could be asked to make a statement about. That's a long way of saying that there might be a small practical taxonomy of these things that could make your suggested direction concretely actionable. Wishful thinking, perhaps.

    OK, sorry for the rambling response. Thanks for the food for thought. I have a fairly strong opinion about FluidDB, its data and its apps BTW. I think it's a good opinion, and it's one that's pretty clear to me, a long way ahead of the time at which we actually become relevant to the world.


    • Johnborthwick

Terry, I don't mean to say “there's no point punishing a murderer because the victim is dead, regardless”. Even if forward-looking remedies aren't effective — regulators should punish harm. There was real damage in the MSFT case — my team and I spent almost a year talking with regulators here and in Europe about it. The question I wanted to focus on is (a) what you do about it beforehand and (b) whether there is an alternative path to waiting for the govt to act.

Thanks for your comments re: the build and consumption element — agree it's a weird and fascinating element to working on computers, period.

      • terrycojones

        Just for the sake of continuing the discussion (it's nearly midnight here)…

        Yes, the harm should be punished, agreed. It's common in cases concerning physical people that a guilty verdict comes with a requirement to pay damages AND to go to jail. The damages compensate for the harm, the jail sentence is partly preventative (“you're not going to do this again, because we're going to lock you up”). Of course you can argue that the jail term may not be effective (and some crimes can be re-committed in or from jail), but I don't think anyone would want to, for example, impose damages on a rapist and not also lock them up for some time. The locking up likely has the older roots, it clearly serves a societal purpose. So if corporations are individuals in the eye of the law, shouldn't we also impose damages and in some sense lock them up?


    • Mark Essel

Great response Terry, in particular your hope that an actionable/measurable set of guidelines could emerge around some
      of the broad topics John discussed. Extracting essential semantic meaning is a slippery slope. If we aren't careful we lose the essence of the guideline we were initially hoping to nourish.


  • Steven Kane

    Hi, very stimulating, thanks.

    As a very basic, very conceptual matter, though, I’d argue that the only thing worse than having forward-looking technology regulation is… NOT having it.

    We either believe government has a role in such marketplaces, or we don’t.

    I do.

    Despite the total breakdown of the system, and the massive harm caused to vast swaths of the public, the financial services industries are now, yet again, mounting a massive offensive to yank the teeth out of new regulation before it is written.

    And, with respect, that industry argues for deregulation with the same basic arguments you use here – that the pace of change makes government regulation ineffective at best and harmful at worst.

And I completely concede – that can be true. And I agree that government red tape can be a nightmare. And that there are many many cases of regulation having the opposite effect of its intended design, and doing harm etc etc

    But I believe the alternative is worse.

    “Those of us who have looked to the self-interest of … institutions to protect [the public interest] are in a state of shocked disbelief.” — Alan Greenspan

  • whabib

    Excellent piece of work here, John. I read half of this post while waiting for a film to start and the first thing that raced to mind was our experiences at Fotolog, which you summarized so well in point 3. For me, that experience soured me almost completely on the “don't be evil” mantra. Not only were the workings of the auction model obscure to us, but all information about the advertisers was withheld from us and we were not even allowed to do any tracking of the ads or their performance on our own. The purpose was to block us from establishing any relationship to the advertisers directly which, I suppose, would have looked like evil to Google but not so evil to us. Shortly thereafter, I had the pleasure of being on the buy side of the equation, buying words for a small, independent publisher. That proved to be equally frustrating as I found the system frequently insisted on raising bids without offering any proof that the inventory was actually selling out at lower prices.

    I agree that government regulation which attempts to be prescriptive about technology is a waste of time (the EU's “no media player but same price” version of Windows being the most remarkable example), but I think there is a strong and valuable role for government regulation in terms of enforcing openness and disclosure in business dealings, and this applies to technology companies as well as banks. Obviously I'm not a lawyer, but I could see Google being open to charges of price manipulation in services where they actually manage to charge money and, perhaps more controversially, unfair bundling of some goods and services.

  • fredwilson

great post John. i've thought long and hard about these issues and i concur with your six principles. if every web company lived to them, it would be a much better web than we have today.

  • infoarbitrage

    John, a truly seminal piece of work. Kudos for its depth and clarity of thought. The most interesting thing to me, however, is its relevance for all businesses, anywhere. If nothing else this highlights the convergence of the web and non-web businesses, where innovation moves increasingly rapidly, transparency is essential and the users help define the product roadmap. For all its fluidity Google is just another business, and will increasingly be treated as such.

  • Mike Boyd

    Amazing Read! Thank you.

  • lazerow

    John, another amazing post. You're on fire. Thanks for making us all think. See you soon.


  • choubb

thanks for putting up these great principles.

I do think if someone can stick to these rules in the near future (not forever, who knows what the future will be), he would be successful in some way.

While you do not lay out the goal of why we should stick to these rules — someone would say success — the next question is how success is measured: money, long life, respect, looks…

    sorry for so much questions and rudeness, my native language is chinese, treat me as alien speaking English.

  • Greg Satell


    Excellent post! Tim Berners-Lee raises some of these points as well in his book.

    One point that I would add is more transparency with respect to consumer information. Consumers should be able to find out if their information is shared or used for advertising purposes (i.e. profile, clicktrails, communication with other users, etc.). As the web becomes more semantic, this issue will only become more important.

    Again, thanks for this.

    – Greg

  • David Semeria

    In essence, you are exploring the space between the combined transport and presentation layers (TCP/IP, HTML, etc) – which are very open and transparent – and the data management layer – which is still very closed. You rightly highlight FB, but even the seemingly more open Twitter ultimately controls access to the UGC on its servers.

    Over time, I firmly believe a “meta net” will emerge in which people will be able to host their information wherever they choose, and allow selective access to users and applications.

    For me, the major issue is who will pay for it. Freedom, as we all know, frequently comes at a price.

  • reecepacheco

    John, great post.

    As a user I always have been and likely will remain a big fan of Google products. Conversely, though I have an affinity for the company itself, their ever-increasing dominance of the web warrants the fear many of us have when it comes to monopolies.

    And that psychology – that fear – I think is at the core of the problem. The government (and the people) need to overcome their fear of any one company's potential dominance and understand that the technology market won't allow itself to be dominated, and more importantly – it's up to us to continue building our ideas and providing competition.

  • Mark Essel

    John this is a great work of thought and execution. How can we as a community add richness to the broad categories of design you suggest? In particular I’m focused on your last topic, widening the Internet experience. This relates heavily to the value of serendipitous discovery. A new social ad tool we’re developing at first appears to be narrowing the experience of web search (by focusing on semantic tags derived from social streams). Our goal is to provide intelligent search assistants to help uncover content you may not consciously browse to, but find incredibly relevant. The goal of course is full user control of data generated by the service.

    I think it fits nicely within the range of categories you have shared for web progress. Thanks for writing one helluva post!

  • Ryan Graves

    My fear is that Google enters into markets “just because”, killing off smaller competitors. Then Google doesn't innovate at all (Feedburner), but the company that would have innovated like crazy is now dead.

    In that situation Google is killing innovation, contrary to what most people believe to be true.
    Only time will tell. Great post.

    • Chris (Efficient Guide)

      Agree with you. Google is doing what any successful and huge company typically does: throw its weight around. Google does innovate, but mostly now their behavior shows a consistent pattern toward indirectly “outsourcing” true innovation to startups (like GetGlue, for example, in Social Search) and then using their huge war chest and teams of brilliant people to a) Acquire the innovator and their technology; or b) Co-opt the innovator's approach and improve on it somehow, and then c) Use their dominance in the market to immediately capitalize fully on that innovation. Reminds me of MSFT in the 90s(!). Don't be evil, indeed!

      • Ryan Graves

        Yep. I really like Google's “don't be evil” but I don't think that many of
        their plays are good for innovation, and innovation is good.

  • Chris (Efficient Guide)

    Interesting post. I generally agree with most of the points raised. However, the author's assertion that he now believes the antitrust actions taken against MSFT had no pragmatic effect on their subsequent actions, etc., struck me as naive. The government's actions may not have had a significant impact on subsequent competitive conditions, as the author suggests; however, those actions DID serve to draw a “line in the sand” on anti-competitive practices at MSFT and in the market and almost certainly DID affect MSFT's subsequent decisions and actions. It may have appeared to the author that it did not, but I find that very unlikely. Interventions like this do tremendous damage to the affected company's brand image and for that reason alone MSFT almost certainly made strategy adjustments after the ruling.

    • Johnborthwick

      Chris — no — I think the effect of the antitrust remedies was significant — to MSFT and to the market. In particular the unbundling of the media player and other aspects of Windows in Europe. But how did it affect the market? Take the media player, since that was the focus of the European case. Real Networks was the dominant player in this space and Microsoft Media Player + Windows was the threat. Today the use case of the players of yesterday is basically gone – due to embedded media, Flash, etc. – and iTunes is the dominant media platform, a service that was orthogonal to the old players.

  • Dan Rua

    I've only read up to #3 and that's enough to digest over the weekend. Whenever I stumble over here, typically from a colleague's link, I kick myself for not reading you regularly. Monday, I return…

  • marshallkirkpatrick

    John, I enjoyed reading this for sure and admire you for tackling such an incredibly complex issue: how to be huge on the internet and spell out a plan for not being evil. a few points of contention that may or may not be substantive.

    1. it is unclear to me how Google is more user-centric than Facebook. i think Google's launched and failed services over the last few years far outnumber Facebook's – most of FB's stick, I think. you have to wonder how user-centric the planning was for things like search wiki or knol.

    2. users should demand clarity around the layers? what would they demand? “please respect the layers as soon as it becomes a problem?” competition seems the best solution to me, and i don't say that to every problem at all. but as you've illustrated, blurring layers can sometimes be wildly useful. i don't feel capable of determining ahead of time when that line will be crossed and the blurring will become a problem. maybe other people could do that.

    3. re opaqueness to the biz principles that underlie relationships: access to the twitter firehose and the recent ycombinator deal are two examples I'd add to the list. people have told me “i won't describe my relationship with twitter and no one else who has one will do so either.” in fact, when it comes to opaque business relationships between that very important vendor, twitter, and anyone else – betaworks is generally the first party that comes up when people talk about mysterious connections. presumably when twitter really does open for business, then some clear rules will be spelled out? i don't have a lot of faith in that, to be honest. fwiw, I've been super critical of both twitter and facebook but facebook has been a lot more forthcoming with communication with me. i think twitter is upset that the handful of times they've responded to my email inquiries, I end up writing negatively about their responses. Facebook, on the other hand, has been far more open with everyone – down to publishing what they call roadmaps for the next 6 months of the business. (conveniently leaving out the world domination step at month 7! 😉)

    your write-up here articulated well why this transparency is important but I think we're a long way from it.

    4. re facebook data: it's not just about deleting photos from someone's feed, it's about pulling them back from other sites they were syndicated to if your privacy settings change. thus the prohibition of caching in FB connect. I'm super skeptical of FB and an adamant advocate of opening up their data, but I buy the argument that the prohibition of caching is done at least in large part for the benefit of their users.

    keep up the blogging, sure is nice to get to read your thoughts. I liked the one about the Notificator a whole bunch especially

    ps. the links in your posts are making me sad – I'd like to be able to hover over the links, see where they lead and then decide whether to leave this page or not.

    • Johnborthwick

      Thanks Marshall for the comments, and thinking. Thoughts / responses:

      #1. The question of whether Google or Facebook is more user centric is subjective but let me outline my reasoning. Google tends to launch new services as standalone products, not integrated parts of the core experience. There might be an odd link to the service — usually in labs, maybe in an “other menu” — but they don't make the assumption that every feature should be an integral part of the core. Conversely Facebook is designed as a single product — and every feature is an addition to the whole.

      In my mind Google's approach is more aligned with users than the integrated approach. It doesn't assume that the users want everything to come from one service. I have no idea who does more user testing — that was the intent of the statement. I was more focussed on integration vs. aggregation.

      #2. Great question — points to an issue that I never fully resolved in the post — ie: who is the post aimed at: users, companies, policy makers? My early drafts of the post were user focussed. But I hit upon issues like the one you highlight — how can users dictate layers? That was a draft from a year ago (yep, I'm slow to post sometimes). The post in the form published is focussed on both users and companies. I was influenced by a book I read this summer — Nudge: Improving Decisions About Health, Wealth, and Happiness — it lays out an alternative structure to design what they call decision making architecture. From a user perspective it's a question of expectations. I do think there are cases — like w/ Chrome — where you can see a train wreck coming re: layer disruption. As I'm thinking about your comment it's raising a question for me about intent — are the companies in question designing product for their users or to perpetuate a pre-established / incumbent business model? Maybe that's the right litmus test here.

      #3. In terms of transparency re: Twitter, Facebook. Let me speak to the Twitter side as I'm more familiar with it. I think what we as an industry have seen over the past 18 months is a young company growing into a pair of shoes that are bigger than its feet. Let me tell you what I know about the firehose. Twitter offered a handful of companies access to the firehose. I think that Twitter could have offered more transparency around the terms of service of the firehose, and I would have liked betaworks to have been one of those companies who received the firehose. We weren't — my understanding is that Twitter gave it to a handful of companies to beta test how it would scale *and* what would happen (ie what would the companies do with it). Ironically one of the companies who got it was Friendfeed — an early Twitter competitor. And despite a competitive posture — they didn't produce anything that was worth keeping from it.

      In my mind there is nothing mysterious about this — it's the early stages of development of something new and the parameters (usage, business rules etc.) are slowly getting defined. I'm glad you asked because most people don't — maybe the perception of mystery is more interesting than the mundane facts! I do think that the search deals around the firehose that were inked a couple of weeks ago are an important step towards simple, transparent business arrangements. I'm hoping there will be a click-thru EULA to the firehose. On the point of relative transparency of Facebook vs. Twitter: I think you are comparing two companies who are in different stages of their life cycles. Facebook in the early days ran its service exclusively in the college community — providing a platform to scale and test. Twitter has been doing all of the above in the public sphere. To wrap on this point — I do have more faith that the business rules are getting spelled out, albeit slowly and fitfully.

      4. Agree with the caching point.

      On blogging, I'm a slow blogger, post about once a quarter, will keep up that cadence. And about those links — that's my workflow laid bare!! I use the side bar to grab a link off a page out of habit. I get your point about hovering. I need to install a bit of js on wordpress that will enable the hovering and show the clicks.
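      (Editor's note: a minimal sketch of the link-hover idea mentioned above, assuming a standard browser DOM; the function names are illustrative, not from any actual WordPress plugin.)

```javascript
// Illustrative sketch only: surface each link's destination on hover by
// filling in the title attribute, which browsers render as a tooltip.
function linkPreviewText(href) {
  // Show just the host so long URLs stay readable; fall back to the
  // raw string if it isn't a parseable URL.
  try {
    return "Leads to: " + new URL(href).host;
  } catch (e) {
    return "Leads to: " + href;
  }
}

function enableLinkPreviews(root) {
  // Tag every link under `root` that doesn't already have a tooltip.
  root.querySelectorAll("a[href]").forEach(function (a) {
    if (!a.title) a.title = linkPreviewText(a.href);
  });
}
```

      A tooltip is the simplest approach; showing click counts, as the comment suggests, would additionally require a stats endpoint to query per link.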

  • Barry O'Gorman

    Fascinating article – tackling the issues of interdependence in the web world. Have spent much of my career working with ERP solutions – where vendors try to pack more and more functionality into their own solution suite. Am a convert to web based apps, the cloud and what goes with it. There is a need for responsible development and management – in a world where we can all benefit from each other's efforts. Seems to me that your 6 principles are as good a basis as any for moving forward. No real scope for much disagreement. Well done.

  • David O'Gorman


    Great article; looking for a third way is very interesting. Neither the bureaucracy of government (at its worst) nor the greed of business (at its worst) will produce the best results for end-users. If you think of end-users as digital citizens you start to see the relationship with the IT world in a very different way. Citizens have rights; end-users are only customers with limited commercial rights. I can understand why many IT business people shy away from thinking citizen rather than end-user, but we may be approaching a time when we look at ICT tools and services as not much more than utilities. Citizens (generally) expect a postal service, electricity and water wherever they live. Customers take what they can get. If ICT tools and services are a must-have to function in the modern world, they have crossed a line which will most likely result in more regulation.

    Your principles – as above – may provide a way to avoid endless, wasteful big-government-versus-big-business wars. Can I suggest though that, to work, these principles will need to be more end-user/citizen focused: “Think Users” is short and clear, but “Respect the layers” or “Data in, data out” are very technical. Admittedly my focus here is more on end-users/citizens rather than on companies, but my point is that efforts by companies based on a view of the public as consumers rather than citizens tend to encourage self-interested, short-term commercial thinking. It seems that you are trying to get companies to think longer term and about bigger interests. In current circumstances, the first company to do this may lose out to rivals. Either government regulation or popular pressure/expectation can encourage/facilitate/force longer-term thinking. Popular pressure from 'citizens' is likely to be more radical than from 'customers'. Companies do not like thinking of their customers as citizens, but anything less may not provide a sufficiently strong 'framework' to help avoid government regulation.

    Short version – can we get to where you want to get without thinking about digital citizenship?


  • Thomas G Hale Sr.

    This is an interesting re-read. Regarding Google and a possible antitrust suit similar to Microsoft's: contrast the buzz about social media vs. search engines, and ask now, a year later, whether the case could be made that it's not the highway, but a network of highways. Have your thoughts changed?
