Thursday, 13 November 2008
OpenSocial’s birthday today
Then, applications had to be embedded in sites as gadgets, which made the social context clear for users, but meant developers had to write some JavaScript, and could only run code when the user was looking at the site.
With OpenSocial 0.8 rolling out, the REST APIs mean that developers can integrate with social sites using server-side code directly, potentially delegating user registration, profiles and friend relationships to an already-trusted social site, and feeding activity updates back into them.
To do this, we are building an Open Stack, based on OpenID, XRDS-Simple, OAuth, PortableContacts and OpenSocial. By composing open standards in this way, we can make each one more valuable. The advantages of OpenID over email login in itself are not that obvious to users, but if the OpenID can be used to bring in your profile and contacts data - with your permission via OAuth - suddenly the added value is clear to users and developers alike. This connection was one of the exciting discussions at the Internet Identity Workshop this week - here's a video of myself, Steve Gillmor, David Recordon and Cliff Gerrish talking about it.
Saturday, 8 November 2008
Missing the point of OpenID
However, he then uses his unmemorable Facebook URL http://www.facebook.com/p/Dare_Obasanjo/500050028 as an example, rather than any of the memorable ones he actually uses and people refer to, such as http://www.25hoursaday.com/weblog/ or http://carnage4life.spaces.live.com/ or http://twitter.com/Carnage4Life
DeWitt Clinton did an excellent job of clearing up some of Dare's other inaccuracies, but he then rhetorically exaggerated thus:
URLs make fantastic identifiers — for the 0.1% of the web population that understands that they “are” a URL. Fortunately, the other 99.9% of the world (our parents, for example) already understand that they have an email address.
This is missing the huge population of the online world (our children, for example) who consider email a messy noisy way to talk to old people, or to sign up to services when forced to, but are happy using their MySpace or Bebo or Hi5 or LiveJournal or Blogger or Twitter URLs to refer to themselves.
As I said in URLs are People Too:
The underlying thing that is wrong with an email address is that its affordance is backwards - it enables people who have it to send things to you, but there's no reliable way to know that a message is from you. Conversely, URLs have the opposite default affordance - people can go look at them and see what you have said about yourself, and computers can go and visit them and discover other ways to interact with what you have published, or ask you permission for more.
Where I see OpenID providing a key advantage is in its coupling with URL-based endpoints that provide more information and save the user time. The OpenID to PortableContacts connection as demonstrated by JanRain can add your friends (with permission) from an OpenID login directly via OAuth.
This makes the OpenID login instantly more useful than an email one, and by connecting to an OpenSocial endpoint too, you can couple activities you take on the wider web with the site you trust to be a custodian of your profile and friends data, so your friends can discover what you are doing elsewhere, and come and join you.
I'm looking forward to talking through these issues at the Internet Identity Workshop next week in Mountain View.
Friday, 7 November 2008
Blogging's not dead, it's becoming like air
One thing I learned at Technorati is that one sure-fire way to get linked to by bloggers is to write an article about blogging. Sure enough, The Economist and Nick Carr have, with their 'death of the blogosphere' articles, garnered a fair bit of linkage.
Their curious obsession with the Technorati Top 100 is missing what is really happening. As JP points out, the old blogging crew are still around, they're just blogging less than those paid to do so a dozen times a day. Not because they are less interested or engaged, but because there are now many new ways to do what we used blogs for back then.
In 2001, if we wanted to share brief thoughts, we used a blog; to link to others’ posts, we used a blog. If we wanted a group discussion, we made a group blog.
With Technorati, and trackback and pingback, we built tools to follow cross-blog conversations, and learned that we are each others’ media. As I wrote in 2004:
The great thing about weblogs is when you discover someone. Someone who makes sense to you, or someone who surprises you with a viewpoint you hadn't thought of. Once you have found them you can subscribe to their feeds and see how they can keep inspiring or surprising you.
You can even start a blog, link to them, and join the conversation
A year later I reiterated:
By tracking people linking to me or mentioning my name, Technorati helps me in this distributed asynchronous conversation (that's how I found Mike and Dave's comments, after all). However, as I've said before, "I can read your thoughts, as long as you write them down first". In order to be in the conversation, you need to be writing and linking. Perforce, this means that those who write and link more, and are written about and linked to more, are those who most see the utility of it.
What has happened since is that the practices of blogging have become reified into mainstream usage. Through social networks and Twitter and Reader shared items and Flickr and HuffDuffer and all the other nicely-focused gesture spreading tools we have, the practice of blogging, of mediating the world for each other, has become part of the fabric of the net.
This may be the first blogpost I've written since August, but the many digital publics I'm part of have been flowing media and friendly gestures to and from me all the time.
Monday, 4 August 2008
Social Disease, or making magic?
Monica thanked me for the explanation, saying that she was glad I had elaborated as she had thought, and I hope she forgives me for paraphrasing, that 'social software was something awful, like social workers'. That really made me think, and I haven't quite got to the end of where that throwaway comment has led me.
Is 'social' the problem with social software? Certainly in the UK, 'social' has some rather negative connotations: Social workers are often despised and derided as interfering, and often incompetent, busybodies. Social housing is where you put people at the bottom of the socioeconomic heap. Social sciences are the humanities trying to sound important by putting on sciency airs. Social climbers are people who know how to suck their way up the ladder. Social engineering is getting your way deviously, by using people's weaknesses against them. Social security is money you give people who can't be bothered to work for themselves. Socialism is an inherently flawed system that is prone to corruption. Social disease is venereal.
This reminds me of early in the Social Software story:
The SSA meeting was fairly chaotic - perhaps reflecting the diverse meanings of 'Social'. Clay Shirky did not show up (or if he did, did not speak up); Dave Winer later poured scorn on the efforts, implying it was all about social climbing. Friedrich Hayek famously said that the word 'social' empties the noun it is applied to of its meaning. Hayek goes on:
...it has in fact become the most harmful instance of what, after Shakespeare's 'I can suck melancholy out of a song, as a weasel sucks eggs' ( As You Like It , II, 5), some Americans call a 'weasel word'. As a weasel is alleged to be able to empty an egg without leaving a visible sign, so can these words deprive of content any term to which they are prefixed while seemingly leaving them untouched. A weasel word is used to draw the teeth from a concept one is obliged to employ, but from which one wishes to eliminate all implications that challenge one's ideological premises.
Perhaps the problem is that the social realm is the realm of trust, so saying things are social is asserting "trust me". As Adam Gopnik writes on magic in the New Yorker:
But the Too Perfect theory has larger meanings, too. It reminds us that, whatever the context, the empathetic interchange between minds is satisfying only when it is “dynamic,” unfinished, unresolved. Friendships, flirtations, even love affairs depend, like magic tricks, on a constant exchange of incomplete but tantalizing information. We are always reducing the claim or raising the proof. The magician teaches us that romance lies in an unstable contest of minds that leaves us knowing it’s a trick but not which one it is, and being impressed by the other person’s ability to let the trickery go on.[...]
I saw, too, that David Blaine is absolutely sincere in his belief that the way forward for a young magician lies not in mastering the tricks but in mastering the mind of the modern age, with its relentless appetite for speed and for the sensational-dressed-as-the-real. And I thought I sensed in Swiss the urge to say what all of us would like to say—that traditions are not just encumbrances, that a novel is not news, that an essay is a different thing from an Internet rant, that techniques are the probity and ethic of magic, the real work. The crafts that we have mastered are, in part, the tricks that we have learned, and though we know how much knowledge the tricks enfold, still, tricks is what they are.
Thursday, 31 July 2008
Open Source and Social Cloud Computing
Tim O'Reilly has written an excellent review post on Open Source and Cloud Computing which says, among other things:
The interoperable internet should be the platform, not any one vendor's private preserve.So here's my first piece of advice: if you care about open source for the cloud, build on services that are designed to be federated rather than centralized. Architecture trumps licensing any time.
But peer-to-peer architectures aren't as important as open standards and protocols. If services are required to interoperate, competition is preserved. Despite all Microsoft and Netscape's efforts to "own" the web during the browser wars, they failed because Apache held the line on open standards. This is why the Open Web Foundation, announced last week at OScon, is putting an important stake in the ground. It's not just open source software for the web that we need, but open standards that will ensure that dominant players still have to play nice.
The "internet operating system" that I'm hoping to see evolve over the next few years will require developers to move away from thinking of their applications as endpoints, and more as re-usable components. For example, why does every application have to try to recreate its own social network? Shouldn't social networking be a system service?
This isn't just a "moral" appeal, but strategic advice.[...]
A key test of whether an API is open is whether it is used to enable services that are not hosted by the API provider, and are distributed across the web.
I think this API openness test is not strong enough. As I wrote in An API is a bespoke suit, a standard is a t-shirt, for me the key test is that implementations can interoperate without knowing of each others' existence, let alone having to have a business relationship. That's when you have an open spec.
The other thing I resist in the idea of an internet operating system is that the net is composable, not monolithic. You can swap in and out implementations of different pieces, and combine different specs that solve one piece of the problem without having to be connected to everything else.
The original point of the cloud was that it stood for a solved piece of the problem, so you don't have to worry about the internal implementation.
Thus, the answer to "shouldn't social networking be a system service?" is yes, it should be a Social Cloud. That's exactly what we are working on in OpenSocial.
Monday, 28 July 2008
Here Comes Everybody - Tummlers, Geishas, Animateurs and Chief Conversation Officers help us listen
Bob Garfield's de haut en bas attack on web commenters upset two very skilled conversational catalysts, Ira Glass, and Derek Powazek. The false dichotomy of 'we choose who you get to hear' and 'total anarchic mob noise' was dismissed by Jack Lail too. At the same time, Ben Laurie explained how the IETF's open-to-all mailing lists can be hijacked by time-rich fools, talking about the Open Web Foundation.
At Supernova last month, listening to Clay Shirky talk about the problems of collective action reminded me of a small nit I have with his excellent book Here Comes Everybody (which you should all read). He talks about the deep changes that ridiculously easy group forming online has wrought, but he also explains that most of these groups fail, in various ways.
The key to this is finding people who play the role of conversational catalyst within a group, to welcome newcomers, rein in old hands and set the tone of the conversation so that it can become a community. Clay referred to Teresa Nielsen-Hayden, who is a great example of this, and I have had the privilege to discuss this with Teresa, Amy Muller, Christy Canida and others at the Troll Whispering session at Web2Open, and heard very similar stories from Gina Trapani, Annalee Newitz, Jessamyn West and Jeska Dzwigalski at The Users Are Revolting at SXSW.
The communities that fail, whether dying out from apathy or being overwhelmed by noise, are the ones that don't have someone there cherishing the conversation, setting the tone, creating a space to speak, and rapidly segregating those intent on damage. The big problem we have is that we don't have an English name for this role; they get called 'Moderators' (as Tom Coates thoroughly described) or 'Community Managers', and because when they're doing it right you see everyone's conversation, not their carefully crafted atmosphere, their role is often ignored.
In other languages there are words closer to this role - Suw and I thought of geisha a while back, whereas Teresa suggested the Yiddish Tummler - both Deb Schultz and Heather Gold liked that one. In French animateur has the broader connotations of discussion, leadership and guidance needed, but in English we are stuck with enervated latinate words like facilitator. Even an eloquent and charismatic presidential candidate had a difficult time explaining what a 'Community Organizer' does, around the same time that Bartlett was resorting to card tricks.
Which brings me back to Clay's book - in it he gives an account of the #joiito chatroom that completely misses the rĂ´le that JeannieCool played there, making her sound like a n00b. The software tool, jibot, that has helped keep that conversation going for 5 years, was built to support Jeannie's role as conversational catalyst. I do hope he gets a chance to correct this in the next edition.
The broader issue is one that we are still working on - building rules for who gets to speak where and when, re-imagining the historic model of a single hegemonic public record that print Journalism still aspires to, from its roots in the coffeeshops of London into the many parallel publics we see on the web, and how legal precedents designed for a monopoly of speech make no sense here.
In the meantime, if your newspaper, social media initiative or website isn't working right, you need to find your tummler, geisha, animateur or conversational catalyst, but you should consider giving them a big name title like 'Chief Conversation Officer'.
Tuesday, 8 July 2008
Shortening URLs, or getting inbetween?
With the rise of short message systems like Twitter, there is a growth in URL shorteners (as each one's namespace gets full, others get shorter). Today bit.ly launched to big fanfare in the blogosphere.
I took a closer look. What I noticed is that the older generation of these - tinyurl.com and xrl.us - use a 301 Moved Permanently redirect, whereas bit.ly and is.gd use a 302 Found redirect, which means 'don't cache the redirected URL, keep checking the original'.
In other words, these services are saying in their HTTP responses that they may change what the short URLs point to in future, putting browsers, indexers and caches on notice that this may happen.
I also noticed that bit.ly, like tinyurl.com, allows you to pick a custom label from their namespace, but if you do it returns two 302 redirects in sequence (once to a more cryptic bit.ly url, then to the external one you chose). I pointed bit.ly/k at this blog, so you can check it yourself with curl:
$ curl --head http://bit.ly/k
HTTP/1.1 302 Found
Location: http://bit.ly/fwNKA
$ curl --head http://bit.ly/fwNKA
HTTP/1.1 302 Found
Location: http://epeus.blogspot.com
Apart from the extra delay this introduces, this is also telling your browser and web crawlers not to cache this, as they may change it in future. Compare tinyurl.com:
$ curl --head http://tinyurl.com/kevinm
HTTP/1.1 301 Moved Permanently
Location: http://epeus.blogspot.com
Google's advice for webmasters is to use 301 for redirects, as this signals the preferred URL.
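To make the distinction concrete, here's a minimal Python sketch of the check the curl transcripts above perform - it parses a raw response head and classifies the redirect as cacheable (permanent) or not. It works on hardcoded response text rather than live requests, so it's an offline illustration, not a crawler.

```python
# Classify shortener redirects: 301/308 are permanent (cacheable),
# 302/307 tell caches and crawlers to keep re-checking the short URL.

def parse_head(raw):
    """Parse the status code and Location header from a raw HTTP response head."""
    lines = raw.strip().splitlines()
    status = int(lines[0].split()[1])  # "HTTP/1.1 302 Found" -> 302
    headers = dict(line.split(": ", 1) for line in lines[1:] if ": " in line)
    return status, headers.get("Location")

def is_permanent_redirect(status):
    """True when the target can be cached and indexed under the final URL."""
    return status in (301, 308)

bitly = "HTTP/1.1 302 Found\nLocation: http://epeus.blogspot.com"
tinyurl = "HTTP/1.1 301 Moved Permanently\nLocation: http://epeus.blogspot.com"

for raw in (bitly, tinyurl):
    status, location = parse_head(raw)
    verdict = "cacheable" if is_permanent_redirect(status) else "re-check"
    print(status, location, verdict)
```

Run against the transcripts above, tinyurl.com's 301 comes out cacheable while bit.ly's 302 does not - which is exactly why Google's webmaster advice favours 301.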
Monday, 30 June 2008
Google as a restaurant? Watch Gordon Ramsay
Jeff Jarvis says he's writing a metaphorical application of Google principles to running a restaurant. Over the last few weekends, while sorting out stuff at home, I've been watching Gordon Ramsay's Kitchen Nightmares which BBC America seems to be playing continuously at weekends. If you haven't seen it, do watch some - each episode, Ramsay spends a week at a failing restaurant in the UK and tries to help them turn it around.
After seeing a few, there are recurrent themes that Ramsay comes up with: simple menus, built on good ingredients that local people understand, served promptly. Which fits well with Google's ten things - simple frontend, low latency results, user-focused. The ways he tries these out involve analogues for user testing, A/B experiments, and profiling under high load.
Of course, Google does run restaurants - so Jeff can read how they get built and tested directly.
Saturday, 14 June 2008
I'm with the stupid network
Sunday, 8 June 2008
How not to be viral
Graphing Social Patterns East is on tomorrow, and I'm sorry not to be there, though m'colleague Patrick Chanezon will be. However, reading the schedule I notice the word 'viral' is still much in evidence.
If you behave like a disease, people develop an immune system
At the Facebook developer Garage last week, I heard a developer say: "when I hear 'viral' applied to software I replace it with 'cancerous' to clarify". A few months back I wrote that social Apps should be Organic, not Viral, and at Google I/O last week I expanded on this with m'colleagues Vivian Li and Chris Schalk. Here's an overview of the alternative reproductive strategies to being a virus that we came up with:
r-Strategy - scatter lots of seeds
Some plants and animals, like dandelions and frogs, rely on having huge numbers of offspring, with the hope that a few of them will survive - this is known as an r strategy. In application terms this is like wildly sending out invitations, or forcing users to invite their friends before showing them useful information. It may help you spread your seed, but most of them will die off rapidly.
K-Strategy - nurture your young
Mammals take the opposite strategy; they have a few young, and nurture them carefully, expecting most of them to grow up to adulthood and reproduce themselves. This is known as a K strategy. This translates into software by following Kathy Sierra's principles to create passionate users who will share your application through word of mouth. Another way to nurture your users is to encourage them to use your application before they have to install it, as Jonathan Terleski describes.
Fruiting - delicious with a seed in
Many plants encourage their seeds to be spread more widely by wrapping them in fruit, so that animals or birds will carry them further, eating the fruit and helping the seed to propagate. The analogy here is in making sure your invitations aren't just bald come-ons for your application "a friend said something - click here to find out what" - with a forced install on the way, but instead are clearly bearing gifts to the receiving user, so they will want to click on the link after seeing what is in store. This is one of Jyri Engeström's principles for Web 2.0 success with Social Objects.
Rhizomatic - grow from the roots up
Another reproductive strategy that many plants, including strawberries and ginger use is to send out runners or shoots from the roots, so that they spread out sideways, from the bottom up, known as rhizomes or stolons. The analogy here is for social applications that spread through appearing in users activity streams and via entries in application directories, growing outwards through the 'grass roots' runners that they send out as part of their normal usage.
Being dumb gets low CPMs
A lot of the debate around viral applications reminds me of a David Foster Wallace quotation:
TV is not vulgar and prurient and dumb because the people who compose the audience are vulgar and dumb. Television is the way it is simply because people tend to be extremely similar in their vulgar and prurient and dumb interests and wildly different in their refined and aesthetic and noble interests.
Social networks aren't like TV - everyone sees something different in them. If you want to gather engaged, inspired, interested and indeed valued users, write an application that speaks to their refined and aesthetic and noble interests, and see how they will spread it through their social networks to find the others who share their interests.
It was interesting to see Slide redirecting away from virality today. GSP West was on at the same time and place as eTech, and I heard some eTechies refer to it as 'Grasping Social Parasites'; I hope that the growing realisation that a disease is not a good model to base your business on means that tomorrow's conference will spread a better reputation for GSP East.
Tuesday, 27 May 2008
Miasma theory - wrong in the 1840s, wrong now
My generation draws the Internet as a cloud that connects everyone; the younger generation experiences it as oxygen that supports their digital lives. The old generation sees this as a poisonous gas that has leaked out of their pipes, and they want to seal it up again.
Bill Thompson and Nick Carr are worried about governments interfering too:
In the real world national borders, commercial rivalries and political imperatives all come into play, turning the cloud into a miasma as heavy with menace as the fog over the Grimpen Mire that concealed the Hound of the Baskervilles in Arthur Conan Doyle's story.
Except, if you have read or listened to Steven Johnson's excellent The Ghost Map, you'll know that the miasma theory of disease was a fatal error for urban England in the 1840s - the real problem was not the bad smells in the air, but the diseases in the water. The fault, dear governments, lies not in our clouds but in your pipes.
Monday, 26 May 2008
An API is a bespoke suit, a standard is a t-shirt
When a site designs an API, what they usually do is take their internal data model and expose every nook and cranny in it in great detail. Obviously, this fits their view of the world, or they wouldn't have built it that way, so they want to share this with everyone. In one way this is like the form-fitting lycra that weekend cyclists are so enamoured of, but working with such APIs is like being a bespoke tailor - you have to measure them carefully, and cut your code exactly right to fit in with their shapes, and the effort is the same for every site you have to deal with (you get more skilled at it over time, but it is a craft nonetheless).
Conversely, when a site adopts a standard format for expressing their data, or how to interact with it, you can put your code together once, try it out on some conformance tests, and be sure it will work across a wide range of different sites - it's like designing a t-shirt for threadless instead.
Putting together such standards, like HTML5, OpenID, OAuth or OpenSocial or, for Dave's example of reviews, hReview, takes more thought and reflection than just replicating your own internal data structures, but the payoff is that implementations can interoperate without knowing of each others' existence, let alone having to have a business relationship.
I had this experience at work recently, when the developers of the Korean Social network idtail visited. I was expecting to talk to them about implementing OpenSocial on their site, but they said they had already implemented an OpenSocial container and apps using OpenID login, and built their own developer site for Korean OpenSocial developers from reading the specification docs.
I'm looking forward to more 'aha' moments like that this week at I/O.
Wednesday, 7 May 2008
Talking about OpenSocial all over the place
- Cloud computing with Joyent at Web 2.0(video)
- Chris Vallance of BBC Pods and Blogs (audio)
- Jemima Kiss of The Guardian (audio)
- Data Portability podcast (audio)
- Kimberley Dykeman of web2.0 TV (video)
- Christina Warren of Download Squad (video)
- Caroline McCarthy of CNET (text)
For more in-depth details on OpenSocial, come along to Google I/O on May 28th-29th in San Francisco
Tuesday, 6 May 2008
Portable Apps, not data?
Brad Templeton says:
Your data host’s job is to perform actions on your data. Rather than giving copies of your data out to a thousand companies (the Facebook and Data Portability approach) you host the data and perform actions on it, programmed by those companies who are developing useful social applications.
Which is exactly what an OpenSocial container does - mediate access to personal and friend data for 3rd party applications.
This environment has complete access to the data, and can do anything with it that you want to authorize. The developers provide little applets which run on your data host and provide the functionality. Inside the virtual machine is a Capability-based security environment which precisely controls what the applets can see and do with it.
This maps exactly on to Caja, the capability-based Javascript security model that is being used in OpenSocial.
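To show the shape of capability-based security without any of Caja's machinery, here's a toy Python sketch - an applet never receives the profile object itself, only a narrow accessor for the fields the user authorized. This is an invented illustration, not how Caja or any OpenSocial container is implemented.

```python
# Capability-style mediation in miniature: the applet can only read
# fields for which it was explicitly handed a capability.

def make_capability(profile, allowed_fields):
    """Return a read-only accessor limited to the authorized fields."""
    snapshot = {k: profile[k] for k in allowed_fields}
    def read(field):
        if field not in snapshot:
            raise PermissionError(f"no capability for {field!r}")
        return snapshot[field]
    return read

profile = {"name": "Alice", "phone": "+1 555 0100", "friends": ["Bob"]}
cap = make_capability(profile, ["name", "friends"])  # user grants two fields

print(cap("name"))       # allowed
try:
    cap("phone")         # never granted, so it cannot be read
except PermissionError as e:
    print("denied:", e)
```

The point is that access control is enforced by what references the applet holds, not by checks scattered through the data model - which is the property Caja brings to JavaScript.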
Your database would store your own personal data, and the data your connections have decided to reveal to you. In addition, you would subscribe to a feed of changes from all friends on their data. This allows applications that just run on your immediate social network to run entirely in the data hosting server.
Again, a good match for OpenSocial's Activity Streams (and don't forget persistent app data on the server).
Currently, everybody is copying your data, just as a matter of course. That’s the default. They would have to work very hard not to keep a copy. In the data hosting model, they would have to work extra hard, and maliciously, and in violation of contract, to make a copy of your data. Changing it from implicit to overt act can make all the difference.
The situation is worse than that; asking people for their logins to other sites is widespread and dangerous. I'd hope Brad would support OAuth as a step along the way to his more secure model - especially combined with the REST APIs that are part of OpenSocial 0.8.
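To make the contrast with password-sharing concrete, here's a sketch of the OAuth 1.0 HMAC-SHA1 signing step - the consumer proves it holds a delegated secret without ever seeing the user's password. The endpoint URL and parameter values are made up for illustration; a real request also needs oauth_token and the resulting oauth_signature parameter attached.

```python
# OAuth 1.0 request signing: build the signature base string from the
# method, URL and sorted parameters, then HMAC-SHA1 it with the secrets.
import base64
import hashlib
import hmac
import urllib.parse

def sign_request(method, url, params, consumer_secret, token_secret=""):
    """Return the base64 HMAC-SHA1 signature for an OAuth 1.0 request."""
    enc = lambda s: urllib.parse.quote(s, safe="")
    # Parameters are percent-encoded, sorted, and joined into one string.
    normalized = "&".join(f"{enc(k)}={enc(v)}" for k, v in sorted(params.items()))
    base_string = "&".join([method.upper(), enc(url), enc(normalized)])
    # The signing key concatenates both secrets; no password changes hands.
    key = f"{enc(consumer_secret)}&{enc(token_secret)}"
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

signature = sign_request(
    "GET",
    "http://example.com/people/@me/@friends",  # hypothetical REST endpoint
    {"oauth_consumer_key": "key", "oauth_nonce": "abc",
     "oauth_timestamp": "1210000000", "oauth_signature_method": "HMAC-SHA1"},
    consumer_secret="secret",
)
print(signature)
```

Because the user revokes the token rather than changing a shared password, the delegation can be withdrawn without breaking anything else.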
If you're interested in these aspects of OpenSocial, do join in the linked mailing lists, and come along to the OpenSocial Summit on May 14th (just down the road from IIW).
Monday, 5 May 2008
Mixing degrees of publicness in HTTP
The session was tersely summarized here, but let me recap the problem.
When you are browsing the web, you often encounter pages that show different things depending on who you are, such as blogs, wikis, webmail or even banking sites. They do this by getting you to log in, and then using a client-side cookie to save you the bother of doing that every time. When you want to give a site access to another one's data (for example when letting Flickr check your Google Contacts for friends), you need to give it a URL to look things up at.
The easy case is public data - then the site can just fetch it, or use a service that caches public data from several places, like the Social Graph API. This is like a normal webpage, which is the same for everyone, returning a HTTP 200 response with the data.
The other common case is where the data is private. OAuth is a great way for you to delegate access to a web service for someone else, which is done by returning an HTTP 401 response with a WWW-Authenticate: OAuth header showing that authentication is needed. If the fetching site sends a valid Authorization header, it can have access to the data.
The tricky case is where there is useful data that can be returned to anyone with a 200, but additional information could be supplied to a caller with authentication (think of this like the social network case, where friends get to see your home phone number and address, but strangers just get your hometown). In this case, returning a 401 would be incorrect, as there is useful data there.
What struck me was that in this case, the server could return a 200, but include a WWW-Authenticate: OAuth header to indicate that more information is available if you authenticate correctly. This seems the minimal change that could support this duality, and much easier than requiring and signalling separate authenticated and unauthenticated endpoints through an HTML-level discovery model, or, worse, adding a new response to HTTP. What I'd like to know from people with deeper HTTP experience than me is whether this is viable, and whether it is likely to be benign for existing clients - will they choke on a 200 with a WWW-Authenticate header?
HTTP does have a 203 response meaning Non-Authoritative Information, but I suspect returning that is more likely to have side effects.
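Here's a minimal sketch of the proposed behaviour, as a plain Python handler rather than a real server: everyone gets a 200 with the public data, but unauthenticated callers also get a WWW-Authenticate hint that more is available. The profile fields and the token check are invented for illustration; a real implementation would verify the OAuth signature properly.

```python
# Proposed mixed-publicness response: 200 for all, with a
# WWW-Authenticate header advertising that OAuth unlocks more fields.

PUBLIC = {"name": "Kevin", "hometown": "San Francisco"}
PRIVATE = {"phone": "+1 555 0100", "address": "123 Example St"}

def has_valid_oauth(headers):
    # Stand-in for real OAuth signature verification.
    return headers.get("Authorization", "").startswith("OAuth ")

def handle(request_headers):
    data = dict(PUBLIC)
    response_headers = {}
    if has_valid_oauth(request_headers):
        data.update(PRIVATE)  # authenticated friends see the extra fields
    else:
        # Still a 200 with useful data, plus the hint to authenticate.
        response_headers["WWW-Authenticate"] = "OAuth realm=\"http://example.com/\""
    return 200, response_headers, data

print(handle({}))  # stranger: public data plus the hint
print(handle({"Authorization": "OAuth oauth_consumer_key=k"}))  # friend
```

The open question from the post still applies: this only works if existing clients ignore, rather than choke on, a WWW-Authenticate header accompanying a 200.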