I did a post over on Svbtle about Tapestry, a new way to write; it includes a collection of new tap stories. More after the jump …
Last week was a rough one here in New York City. People lost their homes, and some lost a lot worse. We were lucky: I live downtown, and all that my family and I had to deal with was no electricity for most of the week, no cell access and a lot of water in our kitchen. No one got hurt and the damage is all fixable. But the lack of basic technology services got me thinking about what could have worked differently. Given all the technology that has been placed in the hands of users since the first personal computer, I found it remarkable how little was usable. Let me start with a personal overview of our situation and then lay out some ideas about what could be done differently.
When Sandy hit, my family and I found ourselves with:
- no power
- no cell phone access (AT&T, our provider, was down)
- no reliable connectivity
For four days our only access to the internet at home was via one Verizon-enabled iPad. As luck would have it, this turned out to be the one Apple device you want. The iPad has the best battery life of any of the options available (iPhone, Android phone, laptop), and the Verizon network proved to be far superior to AT&T's.
Yet many of the web sites we needed barely functioned. Con Edison was the worst. After 45 minutes of dropped connections, the ConEd website told us that they weren't aware of an outage in our area. Power in all of lower Manhattan was out, and somehow the site couldn't tell us that. The map they had of outages had a few flags on it — those were the brave souls who persisted and reported an outage with pitch black all around them. And from what I gather ConEd was way better than Connecticut Light and Power — their home page was still saying “Prepare for Sandy” days after the storm hit. News sites were too general; what I wanted was hyper-local news. Twitter wasn't useful: it is hard to filter, and the content stream moved too quickly to use effectively given intermittent connectivity. Facebook wasn't useful either: I wasn't interested in pushing information out, I wanted to get information.
So what could have been done differently? Here are five ideas:
1. Data and accessibility:
The data is there; it just needs to be made accessible. ConEd and other utility service providers could design their websites for constrained circles of accessibility. Think of an innermost circle with no web access, just SMS. One ring outwards represents low-bandwidth or intermittent access, mostly email, some web — and the furthermost ring represents high-bandwidth access. Ideally, utility websites should be adaptive across all these rings; at a minimum they should offer users the ability to navigate the site at different levels, depending on the situation. When an emergency hits, users shouldn't be faced with a site that is optimized for high-bandwidth access, with videos explaining things. What I wanted from the ConEd site was a simple status update on power restoration in our neighborhood. All the rest of the media and information on the website was of no use; in fact, it detracted from what should have been a simple experience.
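To make the rings concrete, here is a minimal sketch in Python (assuming Flask, with an invented block-level outage lookup) of one status endpoint serving both the inner and outer rings from the same data: a few bytes of plain text for constrained clients, full HTML only for browsers that ask for it.

```python
from flask import Flask, request

app = Flask(__name__)

# Hypothetical block-level outage data; in practice this would come
# from the utility's operations systems.
OUTAGES = {
    "10002": "Power out. Estimated restoration: Nov 2, 6pm.",
    "10014": "Service normal.",
}

@app.route("/status/<zipcode>")
def status(zipcode):
    msg = OUTAGES.get(zipcode, "No data for this area.")
    # Inner rings: any client that isn't explicitly asking for HTML gets
    # a few bytes of plain text -- cheap to load over a bad connection.
    if "text/html" not in request.headers.get("Accept", ""):
        return msg + "\n", 200, {"Content-Type": "text/plain"}
    # Outer ring: the full page, for clients with real bandwidth.
    return f"<html><body><h1>Status for {zipcode}</h1><p>{msg}</p></body></html>"

if __name__ == "__main__":
    app.run()
```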
Going one step further, if ConEd and other utility service providers made their basic service data accessible via APIs, then it could easily be reformatted and delivered to people using the channel best suited to the situation. In the case of Sandy that would have been a simple web site, optimized for low bandwidth and intermittent connectivity, with neighborhood navigation. Someone would have built that site if ConEd and other utility providers made block-level service status data available.
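Once the data is behind an API, the reformatting step is trivial. A sketch of a client that pulls block-level status and compresses it for the innermost ring (SMS) follows; the endpoint URL and JSON shape here are invented for illustration.

```python
import json
import urllib.request

def status_as_sms(zipcode):
    # Hypothetical service-status endpoint; the response shape is invented.
    url = f"https://api.example-utility.com/v1/status?zip={zipcode}"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # e.g. {"zip": "10002", "power": "out", "eta": "Nov 2 6pm"}
    text = f"{data['zip']}: power {data['power']}, est. restore {data['eta']}"
    return text[:160]  # hard SMS length limit
```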
[Update: people found a few endpoints to ConEd data that generated some results; for a good example see this. Thanks to @cmenscher, who sent it to me, and @ckundo, who created the visualization.]
And if service providers had basic APIs, they could share them with each other. As @Auerbach reminded me, ConEd may not actually know if power is down in your building, but Time Warner Cable knows it. And the street lights have connectivity back to a central station. A little bit of data sharing could go a long way here.
2. Customer service is the new marketing:
Utility web sites seem to have been designed primarily as marketing tools. This is backwards. The sites shouldn't be managed by the marketing department, particularly in the case of a utility where customer churn is basically nonexistent. Take a look at the ConEd Twitter feed: https://twitter.com/conedison. The number of messages with media, essentially promotional, is high. But ConEd and CLP were at least active on Twitter this past weekend. In contrast, AT&T's marketing department seems to have gone home for the weekend.
Social media is opening up channels for people to talk to companies and for companies to talk to customers. The departmental lines between marketing and customer service are a fabrication of an era that has passed. Customer service is becoming marketing, and it should have primacy in situations like this. Companies and brands are starting to think they need to produce media in order to talk with customers. This makes sense in a marketing context, but in a situation like Sandy it doesn't. I'm not interested in a utility's video channel. What I want is usable information.
3. Simplify, simplify, think /status:
As technology advances, systems become complex. During emergency situations that complexity needs to be unwound so that basic services remain available and accessible. The first and most basic step is an awareness within an organization that critical systems need to scale up and down this curve of complexity. If that awareness can become part of how we design technology, then rolling back will actually be possible as new, more complex functionality is added to a product.
Another approach to simplifying or unwinding a complex system is for system providers to agree to basic standards. Consider really simple things — e.g. what if service providers adopted a standard so that users knew that if they went to www.coned.com/status or www.ATT.com/status or Twitter.com/status they would get a network status update? In emergencies, simplicity of navigation goes a long, long way. There are simple solutions, and now, while this disaster is fresh in our minds, is a good time to consider a few.
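As a sketch of how much the convention would buy users and tool-builders alike: if every provider honored /status, checking everything you depend on becomes one loop. The /status pages below are hypothetical, which is exactly the point.

```python
import urllib.request

# Hypothetical /status pages -- the convention, not real endpoints today.
PROVIDERS = [
    "https://www.coned.com/status",
    "https://www.att.com/status",
    "https://twitter.com/status",
]

for url in PROVIDERS:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, "->", resp.status)
    except OSError as err:  # covers DNS failures, timeouts, refused connections
        print(url, "-> unreachable:", err)
```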
4. City and local government data hubs:
Government and city government’s first job is to keep citizens safe to that end government could play an important role as a hyper local data aggregator. If the service providers made service/status data accessible via API’s then cities could easily aggregate that down to a neighborhood level. What I really needed was a single page with aggregate information for power, cell access, flood levels for our neighborhood or even block. This is a prototypical public good that local governments could offer citizens. Match that page with a simple notification system to alert me of changes and we would have a very simple, usable, local status page. Note the data I’m talking about is not account level data, its simple service level availability data. This isn’t a radical shift in the role of government, or governments access to data — at some level data becomes a public good and government are the most natural and benign aggregator of that data.
5. Towards a machine-readable city:
By the end of this year there will be approximately 2.3bn people connected to the network. That's a big number, but we are on the cusp of an explosion in it. Sensors that communicate with purpose-built devices are going to be everywhere (think FuelBands and Nests for consumers; for the enterprise, think of all the industrial hardware that will be wired up with sensors to monitor use and state of wear). Additionally, I believe, cities will become machine readable. Imagine if a city simply added simple QR codes to its street signs. Not only would this give added information to citizens, but that information could be programmatically updated in the case of an emergency like Sandy. Over the coming decade billions of sensors will get wired into the network, many of them in our cities. Most of these sensors' primary purpose will be commercial, yet there will be some level of aggregate data that the city government should have access to. Weather Underground had some useful maps of the tide levels on Monday night as Sandy approached, but the detail needed on a local level to make informed decisions was missing.
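The mechanics of the street-sign idea are almost trivially simple. The sketch below (using the third-party qrcode package; the city URL scheme is invented) encodes a stable URL, so the sign is printed once and the page behind it stays live and updatable during an emergency.

```python
# Requires the third-party packages: qrcode and pillow.
import qrcode

# The sign encodes a stable, per-location URL (scheme invented here).
sign_url = "https://nyc.example.gov/sign/broadway-and-houston"

img = qrcode.make(sign_url)
img.save("sign.png")  # printed once; the page behind the URL is what changes
```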
What happened here in NYC was nothing compared to the earthquake in Japan and the nuclear fallout that followed. Yet a lot of our technology failed us. Technology needs to be designed to be flexible and adaptable to the context it exists in. Over the coming decade we will see contextual computing upend many of the services that today we take for granted. Building and designing technology with events like Sandy as a consideration is a first step down the path of making computing, and the machines we depend on, function regardless of, and in regard to, the context they exist in.
I did a live interview last Friday on Bloomberg TV. It was an interesting conversation about the early-stage technology environment, the accelerating cycles of change, new things at betaworks and the Facebook IPO.
If the subject of the Facebook IPO and the accelerating rate of technology change interests you, there are two other posts on the subject that I saw this weekend.
Back to Bloomberg and the live interview
Live TV is always interesting. I don't enjoy it, but I love the fact that it's live: it's your words, with no editing possible. That aside, the way that Bloomberg does these segments is fascinating. The host is wired up, standing in the atrium of the Bloomberg building, a producer jammed into her ear. She has two screens in front of her, both Bloomberg terminals running Windows. First, check out that keyboard; Bloomberg terminals and airport check-in counters are the only places you see things like that. Back to the screens. From what I could gather, the one on the right was email and a chat window. Email was moving fast, a stream of a message or so every few minutes: Twitter sending new follows, notifications, @mentions, etc. On the left was an application that let the host compose, in real time, a feed into her teleprompter.
The segment began with a discussion of Facebook and Google. Partway through it the producer (in her ear) tells her there is a breaking story about JP Morgan. As soon as there is a pause she says “we are going to jump to a breaking story after the advertising break”. During the ads she composes the introduction to the breaking story on the screen on the left.
It's fascinating to watch the process; the only thing missing is a Chartbeat terminal with a live feed of user metrics (i.e. who is watching what). The way media is made is changing as the real-time stream becomes an integral part of the creation/production process.
Today at betaworks we are launching findings, a platform for sharing and discovering what people are reading. You can see my bookshelf of what I'm reading here – it includes books as well as web pages from which I have clipped highlights that interest me. You can see the quotes I have highlighted here, or the same collection as an XML feed. All these quotes are then placed into a social framework where you can explore whom I follow on findings and who follows me. Users of findings get to choose whether to make their collections public or private. The default is public, because at betaworks we believe that making data open and sharable adds value to the data in its entirety. I'm not sure if it's a squared relationship, but I do believe that it's more than linear.
Building findings was a slow brew, or a “slow hunch”. Back in 2005, Steven Johnson wrote a great blog post about his use of the DevonThink software and how he was using it as his personal Memex. The piece resonated with me. At the time I had a flat-file system with years' worth of collected quotes and clips; it was searchable, but the happenstance of the discovery tool that Devon offered opened up a whole new dimension to my collection. Steven and I started to develop the first version of findings about four years ago, with the goal of creating a platform to help people collect, share and discover things they were reading.
Along the journey Steven found this wonderful quote from Robert Darnton about the commonplace book:
“Unlike modern readers, who follow the flow of a narrative from beginning to end, early modern Englishmen read in fits and starts and jumped from book to book. They broke texts into fragments and assembled them into new patterns by transcribing them in different sections of their notebooks. Then they reread the copies and rearranged the patterns while adding more excerpts. Reading and writing were therefore inseparable activities. They belonged to a continuous effort to make sense of things, for the world was full of signs: you could read your way through it; and by keeping an account of your readings, you made a book of your own, one stamped with your personality.” (You can see the origin of this clip and others by Darnton on this page.)
This quote exemplifies how I read, and write, today. Despite this, the tools and the language of sharing quotes and marginalia are still only loosely formed. With findings.com, we take a step forward (or back!) to this future.
Back in 2007, there were no ebook readers, no Kindles, no iPads – not even a Nook. The iPhone was barely six months old and had no apps – unless you decided to jailbreak. In short, it was too early for findings, so we bought the domain and shelved the development. As a side note, we originally started with the domain findin.gs, but that was a mouthful so we moved over to findings.com. A bit easier to pronounce. The project sat on the shelf for about 18 months.
About two years ago, Steven Johnson and I again started talking about the need for a common platform where quotes and marginalia could be shared, re-organized and re-combined. With devices that enabled “networked” long-form reading on the market, the potential behind the findings idea seemed both timely and unbounded. About a year ago Corey jumped on board and the three of us got to work; Corey built another beta. Once again, we didn't ship this second version; rather we tested, trialed and kicked it around at betaworks. We kept asking ourselves how to make this useful, how to retain the context of the book yet give the atomic unit (aka the quote) a place to exist independently…and with social context. Today we are launching findings. The experience is simple, yet the metadata that is processed in the background is complex. I hope you will give it a try. Sign up for an account and clip and sync your highlights – let's see what we can build around digital marginalia.
Since we started working on findings, people like James Bridle have helped construct a roadmap. As James has written, marginalia is a vital and vibrant part of the reading experience. It's both personal and social: “digital technologies do not just disseminate, they recombine, and in this reunification of our reading experiences is the future of the book”. Thanks to James and others for their insight; we are collectively just starting to understand what is possible and what reading will be in the future.
Thanks to the current team of Steven, Corey, Jason, Jeffery, Neil and Alex: it's great to see findings live. And thank you to the original development team of Nate, Neil and Trevor. Sometimes ideas need time to develop, simmer and brew.
And a final note: please be patient with us; the site has had a lot more traffic than we expected today.
Steven’s post on the launch is here.
tumblr is on a tear. The growth numbers are insane, and they have just announced a big, big funding round. Back in May of this year TechCrunch ran a post noting that tumblr was doing the same number of pageviews a day as it did in a month back in 2009: 250m pageviews a day. If you look at the same metric today, the Quantcast pageview (impression) count is now close to 400m. The NYT reports that the service is now doing “13 billion page views per month from 2 billion page views per month. Since the site was first introduced, 30 million blogs have been created using the tool. Those 30 million blogs now generate more than 40 million posts each day.”
This is stunning growth, and it is a testament to great work by David and the team over the past five years. It's also an indicator of how fast new social platforms can get to scale. We are living in an age of multiple social platforms. The next five years are going to be fascinating as the established platforms (i.e. Facebook, Twitter) relate to the new platforms (i.e. tumblr).
I remember meeting David before we started betaworks; I was still running Fotolog and David was working with the Next New Networks team. It was April of 2007, and my old friends Emil and Fred had recruited David to work as a contractor to help them build out the Next New product. tumblr was a side project that David had created because he believed web publishing could be different. He believed publishing could be a simple and beautiful experience; holistic design of the publishing experience, from the post dashboard to the layout of every pixel, could be something simple and bold. I remember talking with David about the early forms of blogging and how tumblelogging was emerging as a short variant. We talked about dashboards and how they should be integrated into the published experience (vs. a toolset that sits outside), and we talked about re-blogging and different tools and forms to amplify and syndicate posts. We also talked about reposting from other networks and how he wanted tumblr to retain the layout of posts vs. linking out.
The thing I remember the most from the conversation was David himself. He is one of the best and most dedicated product entrepreneurs I have ever met – he thinks carefully and deeply about every interaction that he and his team create, and always has. Every pixel has been considered with care. It's wonderful to see and work with someone who cares so much about the actual product experience. David is different and special. The rest of the story is history. David left Next New Networks and started focusing on tumblr full time. I started betaworks and made tumblr one of our first investments. It's been a pleasure to work with David over the years and to be part of what is becoming a great “banner” New York company in the social web. Congratulations to David, John and the team. Here is a video of David speaking at last year's betaday event.
Note, as backdrop, that this talk took place the day after tumblr had a large outage, so I think David had been pretty much up all night. It's a wonderful example of his dedication and commitment as a person and a builder.
Over the past year, News.me has been incubated within bitly. Today, we’re pleased to announce that News.me has officially spun out of bitly into an independent company under betaworks. As I wrote earlier this year, with News.me we are seeking to rethink and reinvent the way that people discover news; I’m very excited that News.me is now set up and running as a standalone company with the resources it needs to fully pursue that vision.
Michael Young, who has been with News.me since its inception at the New York Times R&D lab, will continue to lead the development efforts as Chief Technology Officer. He’ll be joined by Rob Haining (of Epicurious, GQ, and Idea Flight app fame) who is leading iOS development, and Justin Van Slembrouck (from Adobe, where he designed the Times Reader application) overseeing User Experience and Design. Jake Levine, formerly Entrepreneur in Residence at betaworks, is joining News.me as General Manager. We’re looking for a few more developers to round out the team, so if you’re passionate about news and the social web and are eager to explore the boundaries of emergent devices, drop us a line.
Finally, in anticipation of a series of releases over the next few months, we’re excited to share that the News.me iPad app is now free. You can read more about this change, along with our plans for the next generation of the product, over at the News.me blog.
An interview with the Wall Street Journal's @alansmurray discussing the impact of social media.
This weekend (May 14th) 7on7 runs for the second time in NYC. The event brings together artists and technologists, who conceive and often build a project over the course of a single day. Some people have referred to it as a YCombinator for the art world; sort of, but last year it was a little more unconventional and irreverent than a YC event. Slamming an artist together with a technologist can have unexpected consequences.
Last year Matt Mullenweg and Evan Roth hacked WordPress to add a feature that would create random and unexpected experiences at points in the software that they described as lonely or threatening. Marc Andre Robinson & Hilary Mason created an umbrella with a homing beacon so that you could see patterns of use and rain across a region. Joshua Schachter & Monica Narula devised a concept for a guilt exchange. You can see a video of these three presentations here. The other four presentations were wonderful – the whole event from 2010 is posted here.
Why 7 on 7?
A handful of reasons: this event, and the process it represents, is something I have been fascinated by for a long time. The first site I created on the web was äda’web, back in 1994. It was a platform for artists and technologists to collaborate and create projects for the web – ones that were medium-specific, i.e. it wasn't about putting paintings on the web, rather it was about using the web to create. The site is still up and running courtesy of the Walker Art Center, to whom we (and AOL) donated äda’web in 1998. For more about what “äda’web is”, see this interview with my co-founder, Benjamin Weil, and/or read this piece he wrote about äda’web as a digital foundry.
Back in the late nineties it struck me that the processes an artist and a technologist apply to their crafts are similar. There is much to write on this subject; rather than diving in here, see the thread we started yesterday on Quora, titled “Do Artists and Technologists create things the same way” – it spells out similarities between creating art and creating technology.
7on7 slams technology together with art. As such it is a great platform for pranksters. Pranksters have a vital role in any society — from jesters forward, they help us gain perspective and see and say things that might otherwise be socially unacceptable. I met a group earlier this year who set up a system to randomly wardial phone boxes in London — art or hack? I'm not sure; either way, fierce fun.
Last thought. Art and technology are two communities that are well represented here in New York, and yet they don't intersect that frequently. This event was designed to become a bridge between these communities. As technology becomes more deeply ingrained in our lives and society, it will become part of what we consider to be art, and vice versa. See you on Saturday; I can promise something will surprise.
7on7 is this Saturday, May 14th; details here.
note: I’m a board member at Rhizome and member of the motley crew who came up with this idea – others are: Lauren Cornell, Peter Rojas and Fred Benenson.
News.me launched this morning as an iPad app and as an email service. Here is some background on why and how we built News.me:
Why News.me? For a while now at bitly and betaworks, we have been thinking about and working on applications that blend socially curated streams with great immersive reading interfaces.
Specifically we have been exploring and testing ways that the bitly data stack can be used to filter and curate social streams. The launch of the iPad last April changed everything. Finally there was a device that was both intimate and public — a device that could immerse you into a reading experience that wasn’t bound by the user experience constraints naturally embedded in 30 years of personal computing legacy. So we built News.me.
News.me is a personalized social news reading application for the Apple iPad. It's an app that lets you browse, discover and read articles that other people are seeing in their Twitter streams. These streams are filtered and ranked using algorithms developed by the bitly team to extract a measure of social relevance from the billions of clicks and shares in the bitly data set. This is fundamentally a different kind of social news experience. I haven't seen or used anything quite like it before. Rather than me reading what you tweet, I read the stream that you have selected to read — your inbound stream. It's almost as if I'm leaning over your shoulder — reading what you read, or looking at your book shelves: it allows me to understand how the people I follow construct their world.
As with many innovations, we stumbled upon this idea. We started developing News.me last August after we acquired the prototype from The New York Times Company. For the first version we wanted to simply take your Twitter stream, filter it using a bitly-based algorithm (bit-rank) and present it as an iPad app. The goal was to make an easy to browse, beautiful reading experience. Within weeks we had a first version working. As we sat around the table reviewing it, we started passing our iPads around saying “let me look at your stream.” And that’s how it really started. We stumbled into a new way of reading Twitter and consuming news — the reverse follow graph wherein I get to read not only what you share, but what you read as well. I get to read looking over other people’s shoulders.
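The bit-rank algorithm itself isn't public, so the toy scorer below is only a sketch of the shape of the idea described above: weight a shared link by the attention (clicks) it earns relative to how widely it was shared, and let older items decay. All field names, weights and numbers here are invented for illustration.

```python
import math
import time

def toy_social_score(clicks, shares, posted_at, half_life_hrs=3.0):
    """A made-up social-relevance score, not bit-rank itself."""
    age_hrs = (time.time() - posted_at) / 3600
    decay = 0.5 ** (age_hrs / half_life_hrs)   # older stories fade
    engagement = clicks / max(shares, 1)       # clicks earned per share
    return engagement * math.log1p(shares) * decay

# Rank a hypothetical inbound stream, most relevant first.
stream = [
    {"url": "a", "clicks": 900, "shares": 40, "posted_at": time.time() - 3600},
    {"url": "b", "clicks": 120, "shares": 90, "posted_at": time.time() - 7200},
]
stream.sort(
    key=lambda s: toy_social_score(s["clicks"], s["shares"], s["posted_at"]),
    reverse=True,
)
```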
Streamline Your Reading
The second thing we strove to accomplish was to make News.me into a beautiful and beautifully simple reading experience. Whether you are browsing the stream, snacking on an item (you can pinch open an item in the stream to see a bit more) or you have clicked to read a full article, News.me seeks to offer the best possible reading experience. All content that is one click from the stream is presented within the News.me application. You can read, browse and “save for later” all within the app. At any given moment, you can click the browser button to see a particular page on the web. To offer this reading experience, News.me has a simple business model.
Today we are launching the iPad News.me application and a companion email product. The email service offers a daily, personalized digest of relevant content powered by the bit-rank algorithm, and is delivered to your inbox at 6 a.m. EST each morning. The app costs $0.99 per week, and we in turn pay publishers for the pages you read. The email product is free.
How was News.me developed? News.me grew out of an innovative relationship between The New York Times Company and bitly. The Times Company was the first in its industry to create a Research & Development group. As part of its mission, the group develops interesting and innovative prototypes based on trends in consumer media. Last May, Martin Nisenholtz and Michael Zimbalist reached out to me about a product in the Times Company’s R&D lab that they wanted to show us at betaworks. A few weeks later they showed us the following video, accompanied by an iPad-based prototype. The video was created in January 2010, a few months prior to the launch of the iPad, and it anticipated many of the device’s gestures and uses, in form and function. Here are some screenshots of the prototype.
On the R&D site there are more screenshots and background. The Times Company decided it would be best to move this product into bitly and betaworks where it could grow and thrive. We purchased the prototype from the Times Company in exchange for equity in bitly and, as part of the deal, a team of developers from R&D worked at bitly to help bring the product to market.
With Thanks … The first thank you goes to the team. I remember the first few product discussions, and the dislocation the Times Company's team felt having been airlifted overnight from The New York Times Building to our offices in the heart of the Meatpacking District. Throughout the transition they remained focused on one thing: building a great product. Michael, Justin, Ted, Alexis — the original four — thank you. And thank you to Tracy, who jumped in midstream to join the team. And thank you to the bitly team, without whom the data, the filtering, the bits, the ranking of stories would never be possible. As the web becomes a connected data platform, bitly and its API are becoming an increasingly important part of that platform. The scale at which bitly is operating today is astounding for what is still a small company: 8bn clicks last month and counting.
I would also like to thank our new partners. We are launching today with over 600 publishers participating, some of whom you can see listed here; most are not. Thank you to all of them; we are excited about building a business with you.
Lastly, I would like to thank The New York Times Company for coming to betaworks and bitly in the first place, and for having the audacity to do what most big companies don't do. I ran a new product development group within a large company, and I would like to dispel the simplistic myth that big companies don't innovate. There is innovation occurring at many big companies. The thing that big companies really struggle to do is to ship: how to launch a new product within the context of an existing brand, an existing economic structure, an existing organizational structure; how not to impute a strategy tax on a new product; and so on. These are the challenges that usually cause the breakdown, and where big-company innovation, in my experience, so often comes apart. The Times Company did something different here. New models are required to break this pattern; maybe News.me will help lay the foundation of one. I hope it does, and I hope we exceed their confidence in us.
And for more information about the product see http://www.news.me/faq
This is a different kind of post. I started thinking about “networked media” last August. This began in the same way my longer posts usually do: a slow process of thinking, writing, and editing that spans a few months. But the process took a left turn in October when I decided to speak about networked media at betaday. My work on the blog post ceased and I focused my attention on betaday. What I’m posting here is a compilation of the introduction that I wrote back in August, a video of the betaday talk, and my general notes.
The impact of the “socialization of the web” (i.e. the social components of the web that now pulse through every web page) is a fascinating subject that I think we are only just beginning to understand. Though “socialization” is a politically loaded word, my intent here is not political. Rather, my use of the word “socialization” is threefold: I seek 1) to show how media is changing as it becomes integrated with social experiences, 2) to note that the economics of media production are changing, and 3) to emphasize that this shift is a process, not a product.
Over the past few years I have written a fair amount about how the social web will change the way people discover and distribute information online. This started with a post in the spring of 2008 on the Future of News. Then in early ’09 I outlined how “social” would change the discovery process and disrupt traditional search. And then I wrote a long piece about what this shift in discovery means for the user experience on sites. These ideas, and subsequent posts, have informed a lot of what we have built and invested in at betaworks. New modes of navigation and discovery are being developed – from Summize to Tumblr to TweetDeck, and more recently from GroupMe to Ditto. It is now generally accepted that the impact of “social” on discovery and navigation is under way, but I believe the impact goes beyond discovery.
Undoubtedly, search has changed, and continues to change, the way we write, create pages, lay out pages, tag and relate to content. It has also encouraged the creation of sites with limited or distracting content that exist solely to optimize for search. But search has not driven a change in the content and user experience once a user is on a page that they value. By contrast, the “social web” is changing the web itself – “social” is altering the nature of what we find. Social experiences are becoming the backbone of many sites. A web page that is part of the “social web” transforms content into a liquid experience, giving rise to a new kind of media: networked media. In the video from betaday, I walk through this shift and show data we have at betaworks that illustrates this change.
General Notes re: Networked Media from my September draft:
Starting about four years ago it became clear that the social, real-time web could change the way search and discovery happened online. Fast forward to today and that has certainly happened. The impact of this shift in distribution economics isn't over, but the trend tipped to scale during 2010. Last year we saw site after site announce that the percentage of traffic it gets from the social web now exceeds, or is second only to, search. In my post on how social will disrupt search two years back, I used the example of YouTube and showed the speed at which it had become the second largest search destination on the web. Twitter, Facebook, tumblr and other vertical social networks are driving meaningful traffic to sites around the web. Take the collection of sites in the chart below, from news to commerce, from TV-based media to sports: for many of them social is now the largest driver of traffic. Nick Denton said last month that referrals to Gawker properties from Facebook had increased sixfold since the start of the year. And this is different traffic from search traffic. It's socially referred, it's of higher quality, and embedded in it is the multiplier effect that the social publishing platforms drive.
The socialization of the page
The question I would like to turn to now is how web pages and applications are being changed by the social, real-time web. Search changed the way we discovered the web. Web sites optimized their pages for search bots, but in most cases they didn't actually change the content or substance of the page that was presented to the end user. Put another way, search brought little tangible benefit to the end user beyond discovery. Search certainly created new forms of sites: domain parking, content farms, link bait. Search spawned thousands of sites that managed to game the discovery tool to gain attention, clicks and visits from users who find themselves on a site that has the metadata they were looking for but often little of the content.
But unlike search, the dynamic of a web page becoming part of the social web is transforming the experience and content of that page into something liquid, giving rise to a new kind of media. Humor sites were the one exception I found to search leaving content untouched: Fred Seibert told me last summer how humor sites changed the content of their pages, placing the punch line up front — because that is what people searched for.
(for the interested, a short primer is here on what we do at betaworks)
Three steps re: how a page becomes networked
#1. An activity window opens. Somewhere between one and three hours after a story is posted, a window of social activity opens. An example, albeit a slightly unusual one: a product page on Amazon for a set of speaker wires that cost almost $7,000. This past weekend the page all of a sudden took flight on Twitter and some of the social blogs. The page was actually posted to reddit a month ago; yet for whatever reason, the insanity of a $7,000 cable didn't mesh with the zeitgeist until November 27th. On the 27th the page was tweeted by @PaulandStorm, and off it went. Screenshot of the page here. In the video above you see this process happen in detail. I use Chartbeat to understand the progression and dispersion that occur in this initial activity window. Take a look at the dispersion patterns of typical stories on Fred's AVC blog and you can clearly see the window of engagement happen — just look at this as Fred puts up a new post one morning. Look at the uptake starting about one hour after the post hits; usually the peak occurs at the 100-minute mark. Chartbeat data from thousands of large sites around the web suggests that for a blog the peak is usually around 60 minutes after posting, and for a news site it's 130 minutes. It's great how open Fred is with this data — lots to learn. These are windows of meaningful, concurrent activity. Concurrent users is the key metric to track at this point. Amplification in the social web is what drives the metric, and amplification happens because of relative influence within your and other social groups. Link and discuss: It's Betweenness That Matters, Not Your Eigenvalue: The Dark Matter Of Influence: http://sto.ly/ii40vr
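As a minimal sketch of what tracking that window looks like in code: given a per-minute series of concurrent readers (the kind of series Chartbeat exposes), find when the window opens and where it peaks. The threshold and the numbers below are illustrative, not real Chartbeat data.

```python
def activity_window(concurrents, open_threshold=50):
    """concurrents: list of (minutes_after_post, concurrent_users)."""
    # The window "opens" the first time concurrents cross the threshold.
    opened = next((t for t, c in concurrents if c >= open_threshold), None)
    # The peak is simply the maximum of the series.
    peak_t, peak_c = max(concurrents, key=lambda tc: tc[1])
    return opened, peak_t, peak_c

# Invented series shaped like a typical blog post's uptake curve.
series = [(0, 2), (30, 10), (60, 55), (90, 140), (100, 180), (130, 120)]
opened, peak_t, peak_c = activity_window(series)
print(f"window opened at {opened} min; peaked at {peak_t} min "
      f"({peak_c} concurrent readers)")
```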
#2. Social clustering occurs. With the engagement window open and concurrent users on the page peaking, clustering starts to happen. What separates this from just an open engagement window is the level of engagement: users arrive on the site, start posting comments, and the conversation begins. “Each comment someone takes the time to leave serves as a proxy for 100 or so folks who properly echo that sentiment” (Battelle). Examples… The time of day that you publish into the social web matters: timing relative to what your social group is talking about now is what triggers clustering. This is why SocialFlow works — it knows the right time to send the message that lights up the social web. Below is an image from some analysis that the NY Times did using bit.ly data. It shows the dispersion of a particular story — in this case a Kristof piece about the Pill – across the social web. In the image you can see the clustering occurring, this burst over time of influencers and social engagement.
#3. The page becomes networked. Snap: a synchronous experience occurs. Reach a critical mass of users on one page at the same time and something magical happens. Think of it as a page becoming a live event or a live site. Similar to a concert, there is a residue of the social experience when you go back — even if it's way after the event. If you watch the opening of this live concert you will get a visceral sense of what this looks like and what happens when media becomes connected with its audience. It's Springsteen's “Hungry Heart”, and as he plays the opening of the song he turns it right over to the audience to pick it up and sing. Forking of content.
- Rise of agile publishing: what is it? Lean editorial teams, instrumentation of sites, getting the data feedback, adaptive CMSs, the importance of posting at the right time, the importance of tracking social engagement, how every page is becoming a front page
- Serendipity. Some of this is science, some of it isn't. An “old” page can become networked out of nowhere — point back to the Amazon example. You don't know when it's going to happen; you need tools to track and alert you when it's happening
- We are moving into an age of networked media. danah boyd's analysis of the shift from broadcast to networked media
- Closing of comments after the activity window
- Proximity references / boyd article; a couple of old ones are in close proximity to this one
- Structured data types to allow for debate topics
Example: Gawker. Gawker is experimenting with a new design that is both more dynamic (real time) and more immersive, without the restrictions of reverse-chronological order. Users are no longer navigating from page to page across isolated sites. Rather, they are experiencing a subset of sites as a liquid experience, where there is a consistent flow from site to site and the consistent aspect is social. Users flow — an ambient experience of media.
Example: Dribbble and the iTunes icon; this became a networked media event.
Example: Yahoo bloggers adapt content to the referrers and links that spike
Example: “the quality of the dynamics of the conversation shift from one where parlor tricks can sustain themselves beyond the quality of the content to one where we can get sort of immediate tactile connection with people” (source: 4.18.09 Gillmor Gang 1.01 min).
Example: Red State: Twitter's 140 characters; wish they could aggregate topics; need standardized metrics re: social engagement
Points of tension to discuss and think about further:
- Advertising as the primary mode of monetization: pulling people in vs. pulling them away
- Tension between platform owners who monetize with advertising on their site, trying to integrate web sites into their monetization flow
- The monolithic assumption that one social platform will rule all. How vertical use cases of social (from tumblr to Foursquare to GroupMe to Instagram) illustrate how social is fragmenting into specific workflows and uses. Do “digital network architectures naturally incubate monopolies” (Lanier)?
- How the economics of social media are affecting networked media. Ownership of data, ownership of content: if users are creating the content, what rights do they have over it?
- Importance of the link structure of the web: it's the most fluid form; resist the temptation to vertically integrate and “consume” sites
- Dimensionality reduction: too much data
- Heisenberg principle of social media: the act of a page becoming social changes it
My reading collection on networked media: http://bit.ly/bundles/johnb/u
Access to fast, affordable and open broadband, for users and developers alike, is, I believe, the single most important driver of innovation in our business. The FCC will likely vote next week on a framework for net neutrality; we got aspects of this wrong ten years ago, and we can't afford to be wrong again. For the reasons I outline below, we are at an important juncture in the evolution of how we connect to the Internet and how services are delivered on top of the platform. The lack of basic “rules of the road” for what network providers and others can and can't do is starting to hamper innovation and growth. The proposals aren't perfect, but now is the time for the FCC to act.
Brad Burnham stopped by our office earlier this week to talk about his proposal for the future of net neutrality. The FCC has circulated a draft of a set of rules about neutrality that the Commission will likely vote on this week. Though the rules are not public, Chairman Genachowski outlined their substance last week. Through a combination of the Chairman's talk, the Waxman Proposal, and the Google/Verizon proposal, one can derive the substance of the issue and understand its opportunities and risks. I strongly support much of what the Chairman has proposed, and I support the clarifications that Burnham outlines. Before discussing this further, I have to ask: why does this matter now? Over the past few years there has been a lot of discussion, a lot of promises, and some proposals with regard to net neutrality.
Three reasons why this matters now: READ ON OVER AT TECHCRUNCH…