#futureWCM – some thoughts from China – part 3

Microsoft and Google featured strongly in my last post because, in thinking about how web content management may develop over the next decade, the big battle for hearts and minds these organisations are engaged in will continue to shape WCM; they touch so many aspects of the content process. A personal view I’ve held for a number of years now is that Microsoft’s understandable efforts to protect the desktop worldview from which it earns the bulk of its revenue have been doing WCM a disservice, and products such as SharePoint continue to distract us from smarter ways of doing things. Conversely, Google appears to be accelerating its pace of development in exciting and innovative ways. Its services are superb for small to medium businesses and I welcome its ongoing efforts to usurp Microsoft’s desktop dominance in larger organisations.

Many WCM and ECM developments of the last 15 years have been skewed towards Microsoft’s desktop PC view of the world and, for the sake of ubiquity, we have complied with this worldview. We have happily made WCM products that look like Microsoft’s desktop apps (because that’s what most people are familiar with using), connected to them or integrated with them, all while being at Microsoft’s mercy: sometimes flaky support for broader standards such as WebDAV, or having to get to grips with its particular way of doing things, such as the extensive use of CAML in SharePoint.

Google has been making steady progress in pushing the humble web browser forward to accomplish ever more sophisticated application tasks, and recent developments have highlighted that progress. Google Wave and a new range of Google Apps site templates illustrate the company’s play for Microsoft’s stronghold of business collaboration. Reading Don Dodge’s recent blog post following his departure from Microsoft to Google, and looking beyond the slightly acrimonious tone of some of it, the words he chose to describe the development of Google Docs hit at the heart of Microsoft’s perceived weaknesses and Google’s strengths.

Personally I welcome these developments and have felt frustrated at the time it’s taking to shift from an information management approach that has clearly had its day to a web-first one that makes so much sense in an always-on, always-connected world.

For a number of years now I’ve been emphasising the importance of context in the content management process and, while in a Product Management and Strategy role a few years ago, I was focused for a time on visualising just the kind of connected and collaborative process now being illustrated by Google Wave. At the time we were looking at how web-based application developers such as Zoho were innovating word processing in a browser-based environment, recognising that new thinking should be applied rather than replicating old thinking on a new delivery platform.

#futureWCM – some thoughts from China – part 2

1st generation web content management was driven by the US and the desire of the dominant global organisations of the ’90s to embrace the commercial opportunities offered by the web

2nd generation web content management was driven to a large extent by Europe and Scandinavia, who have needed to deal with many more language and cultural challenges across all types and tiers of organisations

3rd generation web content management is being driven by web users themselves who have discovered the power of open source community development, online content creation and socially driven communications

4th generation web content management will be driven by the East – simply because the West doesn’t understand the East well enough. An excellent recent TED presentation here by Devdutt Pattanaik emphasises some aspects of this lack of understanding

I’m not making this observation because I am currently writing this blog in China. My experiences in recent years of working for European brands with a strong Asian presence have given me an insight into how business is done in the East at a grass-roots level, how and where this is influencing information management requirements, and how this is likely to impact web content management.

For product manufacturers, particularly those with some heritage, the web can be a double-edged sword. On one hand it has helped them create effective global sales operations. On the other, it erodes margins and polarises markets – with mass market low-cost products at one end and premium products at the other. The middle ground is not a comfortable place to be in today’s wired economy.

Sitting in meetings here in Hong Kong I have been struck by the contrast between the marketing folks’ presentations. The European contingent’s slides are often peppered with the phrase ‘no internet’, referring to efforts to prevent high-end, premium products being subjected to a price-led web war. So the unease in the room was apparent when the Chinese marketing folks presented. In contrast, their presentations were almost entirely about the web, and it’s hard to forget that almost every single product being discussed, including competitive ones, is manufactured in China.

The more I listened, the more I got a sense of déjà vu. There was a lot of comment about sites like Taobao and Team Buy. Although terms like social networking were being used liberally, the concepts they were talking about, such as ‘team buying’, sounded awfully similar to the web seminars I attended back in the late 90s, where start-ups like letsbuyit.com were regular presenters. During the peak of the dotcom boom, their concept of people coming together to push down the price of an item made regular appearances on TV in the form of their ‘ant’ logo.

Letsbuyit.com was a high-profile victim of the dotcom bust but it is making a comeback, this time as a membership-orientated price comparison site.

Given that the great firewall of China is blocking access to some of the familiar names of the ‘social media’ world, it appears there is no shortage of online ‘conversations’ happening amongst the country’s many, many millions of web users. It looks like China is continuing to go through its own dotcom boom within its firewall, with the type of irrational exuberance that continues to be a feature of the western world’s web usage fuelling a boom in online communications and shopping. Regardless of whether this bubble bursts any time soon, I think these developments are significant to the future of the web and web content management in the coming years.

At present, open-source software is not big in China, mainly because extensive piracy means that proprietary software is, in effect, free too. I imagine that Microsoft, in particular, is quite happy about this as it has helped indoctrinate the world’s largest population into the belief that the only way to operate a computer, deal with content and communicate online is via its software.

From what I’ve heard over here, China has big ambitions in software. Perhaps the recent resignation of Kai-Fu Lee (who originally headed Microsoft’s Chinese research operation) from Google China indicates things are gathering pace, as one imagines he would have the background knowledge and insight to jump into the Chinese tech venture capital space at the right time. If China is to make an impact beyond its firewall, then it needs to look beyond what Google is doing to usurp Microsoft’s desktop computing dominance. The netbook market, driven to a large extent by Taiwan’s ASUS innovations, has often been described as a threat to Microsoft’s dominance because it has demonstrated that there is an alternative. ASUS and its fellow Taiwanese manufacturer Acer’s enthusiasm for netbooks is clear, and I understand it shook Microsoft that these innovations were more popular in the western world than it believed they would be.

So, with Chinese companies innovating in hardware, it follows that they’ll be innovating in software, in the first instance to deal with the obvious differences in language and culture close to home and secondly to help create a new world order.

#futureWCM – some thoughts from China – part 1…

…Not much specific Chinese WCM input yet but I’m expecting some over the next few days. The following are just a few initial thoughts from the flight over and from being wide awake when I need to be asleep 😦

As another decade comes to a close and we begin heading towards 2020, there is increasing commentary about the future of Web Content Management and what the next year, 5 years and 10 years might bring.

Quite often, looking back and learning lessons from history helps in the process of anticipating what the future might hold. While we can be sure there will be unpredictable developments, we can also be sure that history will repeat itself in one form or another. After all, the ‘noughties’ have been a decade full of ‘history repeating itself’: from the dotcom (read South Sea) bubble, to long-drawn-out ‘religious’ wars in ancient lands, to another great economic disaster. This last decade would seem to have shown, more than many, that the more technologically sophisticated we get and the more electronically connected we become, the faster and more frequently we repeat our historical mistakes.

So, a good place to look back, before looking forward, is to the birth of the ‘Web’ part of Web Content Management and the great works of Tim Berners-Lee.

Aside from Sir Tim Berners-Lee’s association with Dorset, UK (where the very best of ‘noughties’ WCM came from of course 😉 ) and his more recent time at Southampton University working on the Semantic Web (I’m trying to put my home town on the map for something other than the Titanic 😉 ), I believe the simple essence of his original idea hits at the heart of what Web Content Management, to date, has been all about.

The essence of the WWW is ‘content plus pointers’ and you can pretty much distil the majority of our WCM efforts over the last 10-15 years down to that simple description: content (documents, text, data, images, video) plus pointers (taxonomy, site navigation, search engines, blogs (chronological pointing), wikis (pointing simplified)).

Through his work on the semantic web, Berners-Lee has described the WWW transitioning into the GGG (Giant Global Graph). Spookily enough, when you read more about graph theory, one word stands out – the ‘Matrix’.

However, prophetic film making aside, the simple description of the GGG is ‘content plus pointers plus relationships plus descriptions’.

Twitter, for example, distils down nicely into this description. 140 characters can provide surprisingly valuable content; it is the best content ‘pointing’ tool yet devised because relationships between content and pointers are visible and can be analysed, and because the content, its context and relevance are often described well, by humans rather than metadata.
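To make that description a little more tangible, here is a rough sketch of a tweet modelled along those GGG lines. The structure, names and example URL are entirely my own illustration (nothing official from Berners-Lee or Twitter), but it shows how the ‘relationships’ and ‘descriptions’ layers sit on top of plain old content plus pointers:

```python
from dataclasses import dataclass, field

@dataclass
class ContentNode:
    """One node in the 'giant global graph' sense: the content itself,
    pointers out to other content, typed relationships and human descriptions."""
    content: str                                        # the content itself (here, up to 140 characters)
    pointers: list = field(default_factory=list)        # links out to other content (the classic WWW part)
    relationships: dict = field(default_factory=dict)   # who retweeted it, what it replies to, who follows whom
    descriptions: list = field(default_factory=list)    # human descriptions: hashtags, replies, commentary

# Plain old 'content plus pointers', WWW-style...
tweet = ContentNode(
    content="A thought-provoking read on where WCM is heading",
    pointers=["http://example.com/futurewcm"],           # hypothetical URL, for illustration only
)

# ...plus the extra layers that nudge it towards the GGG idea.
tweet.relationships["retweeted_by"] = ["@some_follower"]
tweet.relationships["in_reply_to"] = "@original_author"
tweet.descriptions.append("#futureWCM")                  # a human-applied description, not machine metadata

print(tweet)
```

The interesting bit, for me, is that the last two layers are supplied by people rather than by a metadata schema.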

But, like any first mover in the technology space, Twitter is gaining critical mass but also generating considerable hype. I think that looking beyond this hype and the mechanics that Twitter is illustrating is key to the future of WCM. In part two of this post, I’ll give the reasons for this thought…

Twitter influence and the new SEO…

If there are two things modern comms technologies have taught us, they are the power and danger of herd behaviour, and that if something can be manipulated, it will be manipulated.

Despite the enormity of the dotcom boom and bust, and the even greater enormity of the credit crunch, both of which were driven by technology-fuelled herd behaviour, we continue to rush headlong into the ‘next big thing’ – until it comes crashing down around our ears again.

Likewise, the massive growth of Google over the last decade has led to the equally massive growth of an industry whose sole purpose is to manipulate Google results.

Around social media, the alarm bells are starting to ring for me and I’m getting the same sense of ‘unrealness’ that I got working in the midst of the web and telecoms industry at the turn of the century, and watching the ‘financial services’ bubbles growing ever larger in more recent years.

Two experiences in the last week have added to that sense of unrealness. The first is playing about with Klout and the second is looking deeper at emerging areas of ‘sentiment’ monitoring. Twitter influence and social media sentiment are on my radar in terms of web strategy but, as always, I’m keen to get beyond the assumptions and hype often associated with such things and not waste either my time or my employer’s time and money unnecessarily.

So, if automated tools such as Klout are to be believed, I managed to raise my ‘influence’ score by 8 points in less than a week. How? Well, essentially by using the same triggers that have always been used in these types of environments, since the early newsgroup days.

1. Controversy

2. Audaciousness (nice word – thanks Simon 😉 )

3. Flaming

Maybe it was a little more subtle than the average troll, maybe not, but in essence it was these 3 approaches that gained the ‘@’ and RT responses that Klout is clearly using as a primary measure of influence in its algorithms. Is Twitter a more mature environment, impervious to techniques employed for years in other online social environments? I think not. Do I genuinely feel I’ve gained any influence during the last week? Errr, no – but Klout thinks so.
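I obviously don’t know what Klout’s actual algorithm looks like, but a deliberately naive sketch of the kind of @ and RT counting I suspect sits behind such a score might look like this. The weightings and the function name are my own guesses, purely for illustration:

```python
def naive_influence_score(tweets):
    """Score a week of tweets purely on the @ replies and retweets they attract.

    `tweets` is a list of dicts like {"replies": 3, "retweets": 1}.
    The weights are arbitrary guesses; the point is that any score built
    solely on reply/RT counts rewards whatever provokes a reaction.
    """
    REPLY_WEIGHT = 1.0
    RETWEET_WEIGHT = 2.0  # a retweet spreads further than a reply, so weight it higher

    score = 0.0
    for tweet in tweets:
        score += REPLY_WEIGHT * tweet.get("replies", 0)
        score += RETWEET_WEIGHT * tweet.get("retweets", 0)
    return score

# A quiet week versus a week of controversy, audaciousness and flaming.
quiet_week = [{"replies": 1, "retweets": 0}, {"replies": 0, "retweets": 1}]
noisy_week = [{"replies": 6, "retweets": 3}, {"replies": 4, "retweets": 5}]

print(naive_influence_score(quiet_week))  # 3.0
print(naive_influence_score(noisy_week))  # 26.0 - same author, suddenly more 'influential'
```

If that is even roughly how the sums are done, then the three triggers above will always move the needle, regardless of whether anything genuinely influential was said.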

If, however, the consensus is that something like Klout is an accurate representation of influence on Twitter, then this raises another key question: how representative is Twitter of offline influence, and is there a danger of getting fixated on something which, in the bigger, wider world, currently has very little influence?

Beyond Twitter, I think I’m seeing the rise of a new SEO – Sentiment Engineering & Optimisation – that, like bees to a honeypot, is already attracting the naive evangelists, ‘snake oil’ salespeople, confusion marketers and cowboy operators.

Anyone who has spent any real time in web analytics and has had responsibility for reporting results knows how easy it is to misinterpret data, and how dangerous taking numbers at face value can be without trying to understand the deeper background to that data.

If it’s easy to misinterpret and misrepresent structured data, I’m thinking it’s even easier to misinterpret and misrepresent ‘unstructured’ information.

There was a tweet from one attendee at the recent Alterian customer day that I felt was fundamental to this: “It’s hard for a computer to comprehend sarcasm and irony.” Like it or loathe it, our brave new social media environment is loaded with sarcasm and irony and, as this article illustrates, it is a complex form of communication. Has ‘natural language processing’ advanced to a point where it can correctly interpret a comment that actually has the opposite meaning? If it has, then I suggest we are much closer to achieving artificial intelligence than perhaps I thought. Add into this the ‘ugly side’ of social media, very well described by Econsultancy here, and there is a whole potent mix of manipulated, distorted and amplified information that potentially has to be dealt with.
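To illustrate why that tweet rang so true for me, here is a deliberately crude sentiment scorer of my own devising (a simple positive/negative word count, nowhere near a real natural language processing system) happily misreading a sarcastic complaint as praise:

```python
# A toy lexicon; real tools use far larger word lists and weighting.
POSITIVE = {"great", "brilliant", "love", "fantastic"}
NEGATIVE = {"terrible", "awful", "hate", "broken"}

def naive_sentiment(text):
    """Count positive and negative words; anything above zero reads as a happy customer."""
    words = [word.strip(".,!?") for word in text.lower().split()]
    score = 0
    for word in words:
        if word in POSITIVE:
            score += 1
        elif word in NEGATIVE:
            score -= 1
    return score

# A sarcastic complaint reads as glowing praise to a word counter.
comment = "Oh great, the checkout is broken again. Brilliant. I just love waiting."
print(naive_sentiment(comment))  # +2: 'great', 'brilliant' and 'love' outweigh 'broken'
```

A human reads that comment as one very unhappy customer; the word counter reads it as a glowing review.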

With its immense programming capability, Google has a continual struggle to prevent its search results being manipulated (I’m guessing a lot more than 200 signals now) and, judging by the number of individuals and organisations jumping onto the current SEO bandwagon, there is still clearly money to be made in trying to do it. So, realistically, how close are we to even beginning to monitor and understand the infinitely more complex realm of social media sentiment in a useful and accurate way? And do solutions such as Klout indicate that a whole new Sentiment Engineering & Optimisation (SEO) industry is about to explode? Sadly, I think they do 😦

#fixwcm – some thoughts from the front line…

Sadly, travel and work commitments have clashed with this year’s J Boye conference in Aarhus but I was pleased to see plenty of engaging Twitter and blog coverage from the first day’s presentations and, in particular, the opportunity to contribute via the #fixwcm hashtag on Twitter.

No doubt the Twitter, blog and presentation file coverage of this debate represents a fraction of the overall debate so I risk going over old ground with this post but then, as a deafened person, I’m used to not hearing the full story and having to fill in the blanks sometimes – so nothing new there then 😉

The biggest blank for me in the debate, from following it remotely, was a lack of comment from the CMS buyers and users themselves. Perhaps they were in the audience listening intently as the analysts, commentators and vendors tweeted and blogged around them, or were too busy with their day jobs to enter the debate online.

Anyway, for what it’s worth, I thought I’d add some further commentary from that CMS buyer and user perspective. In particular I’m looking at this from my experience over the last couple of years of working for product manufacturers with a global presence.

Well over 60% of people start their product research online via the manufacturer’s website. There is an expectation that the manufacturer’s site has the latest, most in-depth and up-to-date content about the product in which they are interested and so it is vitally important that manufacturers manage online content effectively and efficiently. The web itself has both facilitated and accelerated a global marketplace and many manufacturers need to operate in multiple countries today to survive and prosper. This makes website globalisation and localisation increasingly important for a broader range of organisations.

So, given that context, the premise of the debate was that Web Content Management is broken and needs fixing. For me, it’s important to start with the definition, and I am naturally drawn to the CMS Watch introduction and definition of Web Content Management. It defines WCM as “a system that lets you apply management principles to content” and goes on to list 10 key areas that constitute a WCM capability, from authoring to multi-channel deployment.

Personally I concur with comments I saw yesterday about WCM going through a painful evolutionary phase rather than being broken and needing fixing. What I’ve been experiencing over the last couple of years is growing frustration with the processes and tools we are using to manage content and I think this stems from these main areas…

1. Microsoft’s ubiquitous desktop computing presence

Long-held wisdom suggests that we have naturally accepted Microsoft’s ubiquitous presence in our lives because it is only through such ubiquity that modern computing has revolutionised how we work and live. In many respects, though, this ubiquity has defined how we do things, whether it is starting to write content in a Word document or producing endless ‘death by PowerPoint’. These habits and rituals, developed over many years, are now hard to break. I find it quite staggering, for instance, that the collective wisdom of a very established global manufacturer is contained in folder after folder after folder of files on shared drives with seemingly very little actual captured knowledge, context or lifecycle management. Organisations and the people within them tend to stick with what they know and what has become second nature to them, even if at a fundamental level there are much better ways in which they could be doing things. Logic suggests, for instance, that generating content in a web-first, ‘Wave-like’ collaborative way would ultimately be far more effective than in disconnected Word documents, but the habits and rituals of the long-standing, largely disconnected desktop environment will persist for many years.

2. Web application capabilities moving faster in home life than at work

A key aspect of Web 2.0, as defined originally by Tim O’Reilly, is the network effect. It has worked with staggering success in recent years with the applications we’ve adopted outside of the workplace. And that’s the key point here: in many organisations these applications are stopped at the firewall. I’m sitting here in an open-plan office of 200-plus people and the only person who can access any ‘web 2.0’ type application is me. So while community-driven, virally dispersed applications have become part of daily life at home, many organisations actively block their usage in the workplace. Right now, for instance, I am dealing with some digital asset management issues within the system we are currently using and I long for the fast, dynamic, intuitive and highly productive interface that Flickr provides.

In my previous role and in my current one, a lot of the web editors I work with on a daily basis fall into the ‘millennial’ generation. It’s not surprising, therefore, that I receive regular comments about how they wish the internal web publishing system worked more like Google or Facebook or YouTube or WordPress. However, having spent time in a software product management role, I know just how complicated it will be for WCM system developers to get such rich interface functionality into their products to meet the expectations raised by the ‘web 2.0’ giants, and they constantly have to weigh the time and effort required to fix long-standing pain points against that required to create business-winning new features. I sense here that open source philosophies and approaches have raised expectations about how users themselves can influence a product’s development path, and presented a real challenge to proprietary vendors to improve how they listen and respond to user needs.

3. Challenges to long-held management principles

Having spent over 20 years in the workplace, I have never felt so strongly that technology is currently defining two very distinct worlds: the corporate one and the social one. A point I made yesterday in a #fixwcm tweet is that “many organisations’ culture is counter-intuitive to social web so proprietary WCM innovation will continue to lag OS/social software”. The mature, proprietary WCM industry has been defined by what organisations are prepared to pay for. The social web holds many risks, some genuine, some perceived, so once again there is only so far the majority of organisations are prepared to shift from their established ways of doing things. It’s frequently commented that the more transparent, collaborative and crowd-sourced mechanisms exhibited by ‘web 2.0’ approaches challenge the ‘top-down’, ‘command and control’ mechanisms used for many years in many organisations. I’ve certainly seen examples where this is true, but also examples where fad, fashion and the irrational exuberance the web world is prone to outweigh simple common sense.

To be honest though, I find I’m regularly reminding myself and those around me that, for all the discussions about brand engagement in an online world increasingly defined by the likes of Facebook and Twitter, every online survey I’ve conducted in the last few years has reinforced that in terms of what web users want from manufacturers’ websites, ‘product information’, ‘where to buy’ and ‘how to use’ remain the core content management requirements that I simply must not lose sight of. And how this can be managed efficiently, effectively and engagingly in over 20 languages across 50 countries is core to the future growth goals of the organisation.

4. Continued disconnect between business need and technical implementation

The phrase “if you’re not part of the solution, you’re part of the problem” springs to mind here. Business people continue to struggle to articulate what it is they want the technology to achieve in their organisations, and technical people continue to struggle to understand how business goals could and should influence implementation. I’ve spent time developing user stories and trying to drive developments from business user personas and perspectives in an agile environment – and it’s not easy! There probably aren’t enough people out there in typical organisations ready and willing to make the extra effort required to work both sides of the equation, so disconnects between what the business wants and what the technicians implement will doubtless continue. Generational shifts, with those who grew up with the web now entering the workplace, could make a difference here in raising the overall level of technical literacy and understanding, but clearly this needs to be tempered with the wisdom and common sense of business people who have spent many years understanding what makes their organisations tick and how to compete in demanding marketplaces. Generational shifts are also influencing the outlook and function of IT departments, and this could help further in breaking down what can sometimes be a major disconnect between business need and implementation.

Right now though, the reality remains that, with all requirements considered, it is a mature, proprietary WCM and the collective knowledge and skill of the implementers we use that is still the best fit for our core objectives and, although there are frustrations, the potential time and cost required to change the current ways of doing things far outweigh the system and process improvements likely to be gained by such a change. In terms of ‘fixing’ what we have, I think a service, MOT and tune-up are well overdue, but we are a few years away from considering a full engine rebuild or, indeed, a change of car.