This is one of those posts that started out as a response to someone else’s post but got so long it ended up here. It’s my thoughts on a long chain of people’s thinking, most effectively summarised by Amber.
I’m not going to rehash the conversations – the people who have gone before me have done it so much better – but I wanted to look at this purely from an identity management perspective. These are the thoughts that I thought:
- Much of this is, of course, all about identity and how your identity is big business to the services around you. David Kernohan mentioned the ‘user data bubble’ and this is exactly the sort of scenario that IDM efforts such as UMA (User-Managed Access) are trying to tackle with their approach to the personal data ecosystem (still makes me shudder as a phrase). I’ve always been impressed with UMA as a technology but sceptical about user take-up and the amount of ‘friction’ involved in having to manage your own personal data to get effective sharing and information filtering the way you want it.
- If we want frictionless sharing, we are probably compromising somewhere on personal data security and the release of what we call PII (personally identifiable information). This is a fact that is difficult to escape.
- I think company behaviour and patterns are interesting in this case. Even though Google and Facebook (and Hotmail and everyone else) are doing the same thing, the approach each has taken to ‘personalising’ or ‘filtering’ or ‘advertising’ information to us has been different, and that changes perception. Facebook started out, on paper, as a walled garden, an authenticated environment, and we kind of expect the tailored advertising within that space – especially when it’s free. Google, on the other hand, is perceived by many as an open environment, even though people are often not aware that they are permanently signed in to Google… so when it starts pushing Google+ links or showing too much awareness of our behaviour, it causes concern.
- I wonder what effect, if any, the changes to cookie regulations will have on the way information is filtered through to us without our awareness. It is exactly the sort of monitoring behaviour the law is designed to prevent, but also exactly the sort of behaviour the law is badly placed to stop.
- A lot of the filtering does actually hit the mark – for example, Amber really did want to know about Scottish castles – and even though it can be annoying, it’s not something we want gone, perhaps just more under our control. The space where accurate filtering of web content is not working out is the more traditional academic one – the Google Scholar approach is just not taking off. This is something I talked about at the FAM11 event.
I often talk about the phrase ‘if you’re not the customer, you’re the product’, and it’s boring to keep trotting out a hackneyed phrase, but it’s that attitude that things like UMA are trying to address. UMA says: I may be using your service for free, but you are not buying me, you are not buying my data, and I know what it is worth to you.
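To make that attitude concrete, here is a minimal toy sketch of the UMA idea in Python: the resource owner sets a sharing policy at an authorization server, and a requesting party must obtain a token against that policy before the resource server releases anything. All the class and resource names here are hypothetical illustrations of the pattern, not the real UMA protocol or its API.

```python
# Toy sketch of the UMA pattern: the owner's policy lives at an
# authorization server, separate from the service holding the data.
# Names are illustrative only, not the actual UMA specification.
import secrets


class AuthorizationServer:
    """Holds the owner's sharing policies and mints access tokens."""

    def __init__(self):
        self.policies = {}  # resource_id -> set of allowed requesting parties
        self.tokens = {}    # token -> (resource_id, requesting_party)

    def set_policy(self, resource_id, allowed_parties):
        # The resource OWNER decides who may see what, up front.
        self.policies[resource_id] = set(allowed_parties)

    def request_token(self, resource_id, requesting_party):
        # Mint a token only if the owner's policy permits this party.
        if requesting_party in self.policies.get(resource_id, set()):
            token = secrets.token_hex(8)
            self.tokens[token] = (resource_id, requesting_party)
            return token
        return None


class ResourceServer:
    """Holds the data, but defers all access decisions to the AS."""

    def __init__(self, auth_server, resources):
        self.auth_server = auth_server
        self.resources = resources

    def get(self, resource_id, token):
        # Release the resource only for a token minted for it.
        granted = self.auth_server.tokens.get(token)
        if granted and granted[0] == resource_id:
            return self.resources[resource_id]
        return None


# The owner shares their interests feed with one chosen app only.
az = AuthorizationServer()
rs = ResourceServer(az, {"interests": ["Scottish castles"]})
az.set_policy("interests", {"castle-alerts-app"})

ok = az.request_token("interests", "castle-alerts-app")
denied = az.request_token("interests", "ad-network")
print(rs.get("interests", ok))  # the permitted party gets the data
print(denied)                   # the advertiser gets no token at all
```

The point of the shape is that the policy sits with the user’s chosen authorization server, not with the service holding the data – which is exactly the ‘you are not buying my data’ stance, with the friction of policy management as the cost.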
So where does this leave us? I’m not sure, but as Amber’s post suggests maybe there is a group of people, the twittering classes, who might be willing and able to embrace the personal data ecosystem and use it to make their filtered, frictionless world a place where they are more comfortable? We’ll just have to wait and see.