Real Story Group Blog

Updated ECM Evaluations include HP WorkSite, Alfresco, OpenText, Ever Team, and M-Files #EntArch #ecm Thu, 23 Oct 2014 07:31:00 +0000

We have just released an update to our ECM & Cloud File Sharing vendor evaluations.

The new release includes evaluations of three new vendors, one in the "ECM Platforms" category and two in the "Document Management Products" category.

ECM Platforms

We now include HP in this category along with IBM, Alfresco, EMC, Microsoft, Oracle, and others. HP WorkSite has seen many ownership changes over the years but remains a credible (if flawed) offering for professional services and legal-oriented scenarios.

We also updated our Alfresco and OpenText reviews based on customer feedback.

Document Management Products

In this category, we've added two new vendors:

  1. M-Files Corporation: Finnish M-Files provides a simple, Windows-based document management system.
  2. Ever Team: France-based Ever Team is notable for offering both Microsoft- and Java-based environments for its EverSuite offering.

Besides adding new vendors, you'll find updates for several existing evaluations based on customer and expert feedback.

As always, if you are a subscriber, you can log in and download your copy. If you are not, you can download a complimentary sample.

The biggest lesson for your enterprise from the HealthCare.gov debacle #EntArch #pmot Fri, 25 Oct 2013 13:05:00 +0000

By now you've probably read quite a bit about the manifold problems behind the US Government's largely failed launch of HealthCare.gov. If you haven't, here's a quick summary: delayed funding; delayed procurement; rushed development by multiple contractors; poor program coordination; insufficient and late QA; severe problems not bubbling up to decision-makers; lack of accountability; and so on.

Whatever your opinions about Obamacare, no one can deny that this is a disaster, most of all for uninsured people trying to participate.

Reactions among tech people have varied. Some developers have been sympathetic, while others are engaging in I-told-you-so, but I'll guess that many are thinking along the lines of, "Now maybe people will come to understand how difficult it is to do this stuff..."

And perhaps there's some silver-lining truth to that: the whole episode has enlightened at a minimum the chattering classes, and probably the person-on-the-street, about how huge tech projects can go wrong. Some long-overdue federal technology procurement reform might result as well.

In any case, there is one clear lesson for your enterprise.

The Lesson

Execution aside, let's acknowledge that the specifications for HealthCare.gov were extraordinarily complex.

In fact, they hit the web application trifecta, in having to...

  1. Distill a highly complicated customer journey on the front end into a usable experience
  2. Apply a diverse set of business rules to back-end transactions, involving myriad external partners
  3. Support huge volumes of traffic, including intense spikes

Any one of these requirements demands quite specialized expertise. Any two will put a web application project at high risk. With all three, you have to be very, very good (and possibly lucky) to pull it off. Of course, others have done it. Like, say, Amazon. Can your organization match that heft?

I believe the key lesson here for web and IT leaders is: be forthright with your colleagues. If something is hard, tell them early and often, and educate them as much as possible rather than just putting them off. And for business leaders: remember that the most magical experiences are the most difficult to create.

On the plus side, the term "healthcare.gov" could usefully enter the common vernacular, maybe even as a verb. So the next time someone hands you an impossible project, you can reply, "This has all the makings of a healthcare.gov," or even, "We need two more weeks of QA or this is going to healthcare.gov on us." (I was thinking something along those lines this morning, when a security patch caused some cron jobs to go all healthcare.gov on our webserver CPUs, taking down the RSG website for ten minutes.)

So, greater understanding could indeed give this whole episode a silver lining -- at least for the tech world -- after all...

P.S. If you're looking to align your business and tech teams around a significant digital workplace or digital marketing technology effort, and you want some outside advice, drop us a line.

IT can become a positive force in Enterprise 2.0 success #e20 #CoIT Thu, 30 Aug 2012 10:49:00 +0000

A common refrain among industry pundits is that IT is part of the problem holding back innovation in the enterprise in general, and stunting social networking in particular.

For the most part, we don't see that. In fact, quite the opposite. SearchCIO recently gave me the opportunity to riff a little on the topic. Here's an excerpt from that interview.

    One constant -- and here is where I differ from some gurus in this field -- that I have seen as a key to E2.0 success is a very supportive and interested IT operation.

    They are helping lead people through this thicket of buzzwords and confusion. They help sort this out into concrete projects, which IT is generally pretty good at. They see across the organization. They see commonalities. They often are already trying to rationalize these tools; they know they don't want 12 different Wiki platforms.

    The other interesting thing about IT teams is that very often they have early, successful adopters of these social networking and collaboration tools.

    Now the downside is that IT may overestimate their business colleagues' ability to pick up some of these interfaces and play with them and do useful things with them. But I have seen things turn out really well where IT was playing a leadership role, in particular where their approach was not to just stifle interesting things that were happening, but to allow innovation.

    These are the IT teams that are willing to work with existing projects that may even have started as shadow IT, and to say, "Look, we're not coming in here to impose a solution; we are coming in here to bring specific applications that are going to add value." And that is a whole different conversation.

To quote from an older post, "If you want your IT group to be more creative, then you also need to give them the freedom to experiment, too...."

DAM Project Managers: are you effectively using vendor and integrator resources? #DAM #MediaAssetManagement Fri, 06 Jul 2012 08:39:00 +0000

Let's face it: many Digital Asset Management projects (or, for that matter, most technology projects) do not have the luxury of unlimited budgets. In addition to overcoming myriad business, technology, and process-related challenges, DAM champions often have to deal with resource constraints as well.

Here is a recommendation for DAM project managers: in tandem with selecting your technology, plan how you will run the engagement with the right mix of internal, vendor, and implementation partner resources. I am not suggesting that you use vendor services teams for the entire time the system is in use at your organization. But specialist vendor resources are often part of the implementation project team, particularly at the beginning of a project, because they alone often understand the nitty-gritty of the DAM product you've selected. The extent of your reliance on such consulting resources should be determined by the proficiency (and availability) of your in-house team and, of course, your project budget.

Your goal as a project manager should be to derive the maximum mileage from the involvement of any such specialist (and, need I say, expensive) external resources. While it may sound like Project Management 101, you'd be surprised at how often we encounter customers who are squandering precious consultant time and money. It almost always boils down to this: the customer has not done the preparatory work prior to engaging the consultants. So the consultants idle away (while their billing meter keeps ticking and the project budget starts to strain), and the customer is only then trying to figure out what information the consultant needs in order to help.

With a bit of planning, you can help the vendor or partner integrator consultants hit the ground running: organize necessary workshops and stakeholder meetings, and gather existing documentation and requirements. The exact specifics depend on your project, but you get the general drift.

I'll add that vendor and integrator consultants should be briefed up front about your specific requirements, context, and constraints, so that they provide the solution that best matches your needs rather than a cookie-cutter one. Consultants are happy to supply either -- but remember, the onus is on you to explain clearly, so you get what you want.

It takes two to tango, and what many customers do not realize is that it's not just the knowledge of the consulting resources that matters, but also your own preparedness to engage effectively with them. That means doing your homework before consultants come on board, not after. Do that and you've increased your odds of staying on course and within budget.

Our Digital Asset Management stream outlines more best practices for DAM projects. Sometimes, we can forget the basics as we try to keep on top of whatever we’re supposed to keep on top of.

As always, we love to hear from you - what’s been your DAM project management mantra? 

Yammer is driving CIOs crazy -- and what they can do about it... #e20 #socbiz Mon, 05 Mar 2012 14:02:00 +0000

Yammer -- perhaps the most well-known of enterprise microblogging tools -- is driving CIOs crazy.

I'm not referring here to Yammer's now famous "freemium" approach of getting your colleagues hooked on a free version and then upselling a more enterprisey edition. The real problem is this: employees signing up by virtue of having an enterprise email address believe that the free version of Yammer is an enterprise-sanctioned and perhaps even enterprise-managed solution.

This is of course a complete delusion, albeit a totally understandable one.

As subscribers to our Enterprise Collaboration and Social Software research know, Yammer's free version is a legal conundrum for the typical large enterprise. In discussions with senior leaders from among our enterprise subscribers, a recurrent theme has emerged about the constant efforts they must take to educate colleagues about the platform.

Savvier enterprises whose employees have taken to the free version emphasize that their public social media policies apply to Yammer, rather than their internal collaboration policies. That means, for example, that you shouldn't share sensitive data or documents via that channel. Busy employees may not always ingest that message. And at a time when many industry gurus don't recognize the difference between external social media and internal social networking, you can understand why some of your employees may not grasp the nuances either.

At the same time, we don't counsel our subscribers to kill off Yammer eruptions either. Enterprise social networking is hard enough to nurture without suffocating it in the crib. (On the other hand many Yammer experiments end up suffocating on their own for some very specific reasons, but that's another story...)

You need to understand the implications of the free version, and do what you can to mitigate risks until you come up with a sanctioned, supported alternative. That alternative might well include licensing the paid version of Yammer, though in that event you'll want to review your agreement very closely: like those of many cloud providers, Yammer's enterprise edition SLA isn't so hot either.

Other collaboration and social computing providers are watching the Yammer model closely. The more free services get targeted at enterprise employees, the more you need to educate about acceptable use. Better yet, devise a roadmap to get out in front of those needs. Let us know if we can help.

Six Sigma for WCM #bpm #cms Wed, 14 Dec 2011 12:52:00 +0000

Because I completed my university degree and started working full-time at the age of 20 (I wanted nothing more than to finish school and be out in the "real world"), I am not particularly impressed by copious degrees, professional certifications, or letters after a person's name.

I know, however, that I am in the minority -- in previous jobs I have argued with fellow managers about the educational qualifications of the people we hired. ("I don't care if she's a certified PMP, has a PhD, and five other professional certificates. Has she ever actually managed a large, successful project? Can she inspire a team? Is she an effective communicator?") Education only gets you so far: what matters in my book is real experience and results. (Note: I have three PhDs in my immediate family, so I may now be uninvited to the holiday events.)

Recently I had an advisory session about web content quality control with a large, global Real Story Group research customer who adopted a new web content management system a year ago. They wanted to establish a culture of quality among their web content managers, and champion those who not only understood the system designed to manage dozens of global websites, but also possessed a certain level of experience and inherent attitude towards quality.

I suggested adopting a six-sigma-like / karate-oriented "belt" system that, after a base level of system training, was focused on levels of experience-based knowledge and, equally important, positive attitude and focus on content quality. 

Here's a short summary of a longer deliverable:

WCMS yellow belt:

  • Has taken 3 days of initial WCMS training, is willing to ask for advice, and shows proactive curiosity about WCM and how it works

WCMS blue belt:

  • Minimum of 3 months using the system, showing inherent interest in the "health" of the system, proactive constructive attitude about content quality
  • Understands the ramifications of content type changes, tagging, and other content editing tasks
  • Takes responsibility at a local level and derives improvement ideas for both the site(s) and the WCMS from experience

WCMS black belt:

  • At least 6 months using the system, daily exposure to the WCMS, experienced as an admin
  • Has an almost omniscient view of the "big picture" of how the WCMS affects sites; understands why it's important to do things correctly; encourages others to maintain quality
  • Leads, delegates, and maintains content quality at a global level
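As a thought experiment, the objective parts of these criteria could even be encoded in a simple rubric. Here is a hypothetical Python sketch: the numeric thresholds (3 days of training, 3 and 6 months of use, admin experience) come from the belt descriptions above, but the structure and function names are my own invention, and the attitude and quality criteria would of course still require human judgment.

```python
# Hypothetical encoding of the objective belt criteria described above.
BELTS = {
    "yellow": {"training_days": 3, "months_using": 0, "admin": False},
    "blue": {"training_days": 3, "months_using": 3, "admin": False},
    "black": {"training_days": 3, "months_using": 6, "admin": True},
}

def highest_belt(training_days, months_using, is_admin):
    """Return the highest belt whose objective criteria are met, or None.

    Iterates yellow -> blue -> black (dict preserves insertion order),
    keeping the last level whose thresholds the user satisfies.
    """
    earned = None
    for belt, req in BELTS.items():
        meets = (
            training_days >= req["training_days"]
            and months_using >= req["months_using"]
            and (is_admin or not req["admin"])
        )
        if meets:
            earned = belt
    return earned

print(highest_belt(training_days=3, months_using=4, is_admin=False))  # blue
```

A rubric like this would only gate the mechanical prerequisites; the "proactive constructive attitude" parts of each belt stay a human call.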

One key question came up during the planning: "should we hire or promote someone into the black belt role?" Promote, promote, promote. Again, since this is about real-world experience in a particular environment and a particular company, it's like learning a language. Even if you learn to speak Spanish in Madrid, that doesn't cut it in Mexico City or the Andes. Black belts vary from dojo to dojo; the same should be true for WCMS black belts.

As such, there were many more criteria specific to the organization that defined these different levels, which I won't share here. Creating an operational framework that rewards system users based on their real-world experience, dedication to web content quality, and proactive attitudes towards system and content improvement is one way to make WCM a core part of how your team is evaluated. This approach can be adopted regardless of what platform you use, and also adapted for other types of technology.

As always, let us know if you need help.

SharePoint exposes lack of information management commitment #sharepoint #KMers Fri, 11 Nov 2011 13:30:00 +0000

Last week we ran the SharePoint Symposium in Washington DC and kicked off the two-day event with a question to the audience: what single word best describes "SharePoint" to you? The preponderance of negative answers thrown out surprised us:

  • Complex
  • Viral
  • Collaboration
  • Clunky
  • Misrepresented
  • Social
  • Bottomless Pit

This is typically a pretty pro-SharePoint audience, and it has been a reliable source of peer information over the years for our buyer-focused ECM and SharePoint research subscription services.

But I have no doubt that if we had asked the same question in previous years, we would have seen a more positive list of answers. Further discussion revealed that SharePoint buyers and users in the room had been caught between bottom-up and top-down approaches to deployment. Bottom-up in that IT had thrown the problem of SharePoint over the wall and left users to self-provision, or business users had simply gone off and acquired SharePoint on their own. Top-down in that IT had provided very elementary and unusable SharePoint environments with insufficient education and training. What was missing in both approaches was:

  • Business Analysis
  • Process Analysis
  • Change Management
  • Information Management

Neither the business groups that had SharePoint unleashed on them (or unleashed it themselves), nor the IT department that technically owned SharePoint, offer those kinds of skills anymore. Yet each somehow assumes that the other will figure it all out. It's not so much that SharePoint is at fault; rather, growing SharePoint installations reveal the dearth of supporting resources -- and their criticality.

We can call this a lack of skills, but that's not the real issue: it's a lack of enterprise commitment to the "soft" resources required for success here. Remember that SharePoint is like any other enterprise platform: you can't just install it and forget it. Adherents of SharePoint in the cloud would do well to note this too...

Helping you avoid the mistakes of others #pmot #EntArch Mon, 22 Aug 2011 14:25:00 +0000

Having worked with a broad spectrum of technology customers over the years, I've come to see common patterns in the mistakes that most large enterprises make -- be it during the technology selection process, the RFI/RFP process, or project execution itself.

To help our enterprise subscribers learn from others, I'm preparing a couple of Advisory Papers on these topics. The first paper in the series, "How to Avoid Common Technology Selection Mistakes," is now available.

This video offers a sneak preview of some of the themes and lessons explored in the paper.

I welcome your comments, below.

Who wants to apply Retention Policies to Tweets? #e20 #compliance Wed, 13 Jul 2011 11:42:00 +0000

Some enterprises do indeed want to apply retention policies to employee tweets.

EntropySoft, a content integration and migration vendor, has released a new connector for Twitter. EntropySoft already has a set of connectors that get OEM'ed into various packaged content and document management systems, many of which we cover in our evaluation reports.

This new connector enables enterprises to access content (tweets and other information) stored in Twitter. EntropySoft says the primary objective here is to archive those tweets in a corporate archiving system. Once the Twitter content is in one of your own repositories, you can do pretty much anything that the target system allows you to do -- such as declaring tweets as records and applying retention policies on them.
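To make the "declare as record, apply retention" idea concrete, here is a minimal Python sketch of applying a retention policy to tweets already archived in an internal store. Everything here is illustrative: the field names, the seven-year period, and the in-memory "archive" are assumptions for the sketch, not EntropySoft's connector API or any particular archiving product.

```python
from datetime import datetime, timedelta, timezone

# Illustrative policy: keep declared records for roughly seven years.
RETENTION = timedelta(days=7 * 365)

def expired(record, now):
    """True once a record has passed its retention period."""
    declared = datetime.fromisoformat(record["declared_at"])
    return now - declared > RETENTION

def apply_retention(records, now):
    """Keep only the records still within their retention window."""
    return [r for r in records if not expired(r, now)]

now = datetime(2011, 7, 13, tzinfo=timezone.utc)
archive = [
    {"id": "1", "text": "old announcement", "declared_at": "2003-01-01T00:00:00+00:00"},
    {"id": "2", "text": "recent update", "declared_at": "2011-06-01T00:00:00+00:00"},
]
print([r["id"] for r in apply_retention(archive, now)])  # ['2']
```

The point of the sketch: once tweets land in your own repository as records with a declaration date, retention becomes an ordinary date comparison, whatever system actually enforces it.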

Many organizations have been experimenting with various alternatives for managing their employees' tweets. Some enterprises, for example, employ their Web Content Management system or a Portal-type application as a tweeting interface, so that they can subsequently manage those tweets as content in their larger application. By using a connector-based approach, in contrast, employees can tweet using any of their favorite tools but those tweets can still be brought back into an internal system for subsequent management.

There are some challenges that you'd need to address with the latter model, though. Besides the practicality of considering tweets and more generally social content as records, there are also issues of ownership and legality related to extraction of tweets from Twitter and archiving these in your own systems.

In any case, the ability to access social content via such tools has many uses. Many of the products we cover in our various reports -- such as Document Management and ECM tools like Alfresco and EMC Documentum, as well as search engines like Endeca and Exalead -- turn to EntropySoft to supply connectors within their products. We don't know yet if they plan to use this new Twitter connector too, but if and when they do, it could become marginally easier to search, index, and archive social content from within your existing systems.

ECM in Healthcare Today Tue, 01 Feb 2011 17:25:00 +0000

There is no more challenging an environment for true ECM (Enterprise Content Management) than that of healthcare.

Over the past few years a number of leading US hospitals have subscribed to our research and advisory services as they attempt to better leverage and co-ordinate broad content management needs. The panoply of use cases -- EMR/EHR, diagnostic tools, rules-driven scenarios, patient management, self-management, and education -- all demand (in theory at least) a seamless operating and integration environment. Inconsistency and potential contradictions in information can be critical in clinical settings.

Of course, none of this is new.  However, a co-ordinated and updated approach to managing information and content is rising on many hospital agendas -- due in part to increased US government funding, a general drive for greater efficiency, and a highly litigious and competitive commercial environment. 

Alas, the fact remains that few suppliers of content management technology have any substantial presence in this sector. Fewer still have the domain expertise and proper grasp of these highly complex and industry-specific requirements to be able to deliver much real value.

That said, some vendors are trying to build up their presence and offer better solutions. Hyland has long had a presence, and now Xerox is trying to get a foothold for its DocuShare Suite via its acquisition of WaterWare (a small firm that had built an EHR application on top of DocuShare). But the major players you would expect to see regularly in this space -- such as EMC, Oracle, IBM, and Microsoft -- have patchy reputations and a history of trying to fit generic solutions into specialized environments.

In fairness, the healthcare sector is itself disconnected and difficult to navigate, and suppliers are severely challenged to develop high-quality, co-ordinated solutions. Clinicians usually rule the roost, and their expectations and requirements are often out of line with IT reality. Similarly, major hospitals consist of a multitude of near-autonomous departments, making co-ordination and alignment between them near impossible at the best of times.

Nevertheless, healthcare is an area where ECM can deliver profound benefits. Healthcare (in both its private and public incarnations) also represents an industry sector in real need of serious improvement when it comes to managing information resources.

Current investment by document management and broader ECM vendors in developing better BPM, analytics, integration, and case management capabilities could ultimately deliver some valuable new technology solutions.  But in today's healthcare world, ECM is poorly represented in the broader technology portfolio.

Agile web development - how do you get there? #EntArch #cms Thu, 13 Jan 2011 13:02:00 +0000

Some of our subscriber inquiries have to do with accelerating their website and CMS deployment times. Having been burnt by waterfall-oriented development methodologies in the past, enterprises are now looking to "agile" development for answers. But already, many are becoming equally disillusioned with agile. Is it really just as problematic as waterfall?

I'd say it's perfectly feasible to be more agile in your web development. The process itself shouldn't be a problem. The cross-organizational change required to put the process in place is much more of an issue. Yet 9 out of 10 companies I've talked to over the past few years seem to get stuck in first gear.

So get past the scrum and back into play: start with getting the basics right.  Here are some ideas.

Remember you're developing websites, not software
Often, the process works like this.  A concept of a site will get specified. The specification will then be handed over to developers. They will work to turn it into a functioning CMS implementation and front-end website, using agile methods. There's a watershed between marketing/communications departments and IT -- and it's reinforced by agile methods themselves. (See for instance the Chicken and the Pig analogy.)

This is doomed to fail, because agile is not meant for completing 100% of a set specification by a specific deadline. And even if the developers were to succeed, the goals of the project would have shifted and moved on by the time they're done. You're aiming at a moving target -- that's the nature of the web.

Instead, think about this as web development. That may sound obvious, but it really means that every stage -- from concept to UX and IA, from content strategy to graphic design, to software customization and templating -- should be part of the agile process. Not just the software development bit. You can't just drop an idea and move on: everyone needs to be actively involved in the ongoing agile process, most notably the business team. Remember: watersheds tend to lead to waterfalls.

Agile is not an excuse for not knowing what you need
Problems get exacerbated if the original concept is too vague. That'll lead to dissatisfaction if you're using waterfall methods -- but agile will only serve to amplify the effect. If you keep changing your mind, then asking developers to change their plans to suit you, you're just leading them on a wild goose chase. The project will never be finished.

Yes, your website should be in constant evolution to keep up with a rapidly changing reality. But you still need a clear plan. Conceptualize and develop what you can bite off in the short term -- and let that be part of an iterative agile process. But also keep a sense of direction for the longer term. You don't have to know where you'll be in five years, but you do have to know whether you want to turn left or right at the next crossing. Otherwise, you'll just wander aimlessly, chasing an elusive mirage.

There's a time and a place for waterfall and agile
You can't just throw a switch and start being agile. You need the technical and organizational infrastructure first. Take the time to do this thoroughly. Don't jump in the water right away. You might drown before you figure out how to be agile enough to swim.

If you're starting an overhaul of your organization, the CMS, and the website, get a thorough foundation in place. A friend of mine claims the Tower of Pisa was built using agile methods from day one -- which is why the building itself looks nice enough, but it's rather noticeably lopsided. Foundations need thorough specification and execution, because no matter how agile your fixes afterward, they'll always be crooked.

Finally, and I've said this a couple of times before: a website isn't like a book that's finished and sent off to the printers. You're never done with it. Agile methods suit this constant evolution very well. But getting into a constant flow is far from easy; don't underestimate the expertise and time you'll need. We can help you with expertise -- but don't expect miraculous quick fixes.

In praise of TIMAF #ecm #EntArch Wed, 29 Dec 2010 21:16:00 +0000

There's a great new book that begins with two questions:

  1. "What does an information manager do?"
  2. "How does she do it?"

And then proceeds to answer them both.

The book is called TIMAF Information Management Best Practices Vol. 1. I recommend it.

The longer I'm in this business, the more I see the common problems we all face as -- at their core -- information management challenges. The problem with the term "Information Management," though, is that it feels very abstract and doesn't seem immediately relevant to the workaday struggles of the typical website or data warehouse manager. Consequently it's a term that gets bandied about in academia, but not so much within the modern enterprise.

I think this book -- edited by Bob Boiko and Erik Hartman -- will help change that. It's a preliminary collection of best practices that's more specific and detailed than you'd see in the trade press, but more actionable and case-oriented than you'd see in an academic treatise (the authors are all practitioners and consultants). It's also mercifully vendor-free. [Disclosure: two of my colleagues (Alan and Apoorv) contributed chapters.]

If I had any criticism it would be that the book perhaps over-emphasizes content (versus data) in general, and structured content in particular -- a faint shortcoming that's more than redeemed by the step-by-step approach taken in most of the chapters.

As you prepare for a new year, consider going back to the basics, and check out this book.

Assessing Buyer Risk via Redesigned Cross-Checks #EnSW #EntArch Tue, 21 Dec 2010 14:05:00 +0000

Every product and technology solution provider inherently brings some amount of risk. When it comes to technology, change is inevitable and often a good thing, but it is also potentially a strong indicator of risk for new buyers. For example:

  • Vendors - of all sizes - can be bought and sold, merged, or shut down completely
  • Employees get hired and fired - bringing new ideas and taking their ideas elsewhere
  • Open source projects merge, fork, or stop innovating
  • Products can be updated rapidly, occasionally, or not at all
  • Updates can be minor, major, or complete overhauls

When purchasing new technology, you should always assess the risk of the product or project and vendor partner from which you are purchasing. We've redesigned our Cross-Check chart to make it even easier for you to make this risk assessment.

Ultimately you need to find a vendor that is a good fit for your needs in terms of usability, technical capability, and price. Beyond that, you need to assess the vendor itself, and consider where it's going. You also need to know about pending changes to any specific tool, which can have an enormous impact on your implementation.

For example, if a vendor is due to release a major upgrade to its product (and the product itself is a relatively small portion of the vendor's business), and at the same time the vendor is in acquisition negotiations, this would represent a high risk to you, the buyer. However, that's only a presumption; deeper investigation could reveal that the upgrade is relatively smooth and productive, and that the pending acquisition may in fact enhance both focus and R&D investment in this particular product.
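To illustrate how such signals might combine, here is a hypothetical weighted scoring sketch in Python. The factor names, weights, and band thresholds are invented for illustration only; they are not Real Story Group's actual Cross-Check methodology, which weighs these judgments qualitatively.

```python
# Illustrative risk signals and weights (invented for this sketch).
RISK_WEIGHTS = {
    "pending_acquisition": 3,    # vendor in acquisition negotiations
    "major_upgrade_due": 2,      # big version change on the horizon
    "product_is_sideline": 2,    # product is a small share of vendor revenue
    "key_staff_churn": 1,        # employees taking their ideas elsewhere
    "slow_release_cadence": 1,   # product rarely updated
}

def risk_score(signals):
    """Sum the weights of whichever risk signals are present."""
    return sum(w for name, w in RISK_WEIGHTS.items() if name in signals)

def risk_band(score):
    """Map a raw score to a coarse risk band (illustrative cutoffs)."""
    if score >= 5:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# The worked example from the text: acquisition talks plus a major
# upgrade due on a product that is a sideline for the vendor.
s = risk_score({"pending_acquisition", "major_upgrade_due", "product_is_sideline"})
print(s, risk_band(s))  # 7 high
```

As the text notes, a high raw score is only a starting presumption; deeper investigation can move a vendor out of the band a mechanical tally would assign.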

Starting this week we'll be releasing new market analyses, complete with the new Cross-Checks in all of our technology areas. In each of these releases we'll explain what we're seeing in the market and why we place each vendor in a particular risk area. As always, there's no "magic" or "leading" segment.  You'll want to find the right fit for your needs. Our evaluations can help.

Subscribers, you will get the new Market Analyses included as part of your subscription.

How did our 2010 predictions fare? #cio #ecm Tue, 23 Nov 2010 13:40:00 +0000

As you may know, we make twelve predictions every year, and every year we go back to assess our accuracy. So, let's see how we did with our 2010 predictions from last December.

1) Enterprise Content Management and Document Management will go their separate ways
This has largely happened, primarily because customers have persuaded vendors to (mostly) give up the ghost on "enterprise" content management and return to practical applications.

2) Faceted search will pervade enterprise applications
This has definitely happened, though what constitutes best practice is evolving (cf. the difference between SharePoint Search and FAST Search results).

3) Digital Asset Management vendors will focus on SharePoint integration over geographic expansion
Definitely. The North America / EMEA divide in this marketplace remains stark, while vendors push SharePoint "connectors" -- albeit at varying stages of maturity.

4) Mobile will come of age for Document Management and Enterprise Search
Yep, though that was an easy one; questions about usability, persistence, security, etc. still remain.

5) WCM vendors will give more love to Intranets
No, didn't happen. Wishful thinking.

6) Enterprises will lead thick client backlash
Yes, I think we are seeing the back side of Flex-based clients for enterprise applications.

7) Cloud alternatives will become pervasive
Sure, but that was another easy one.

8) Document Services will become an integrated part of ECM
Sort of. ECM vendors are starting to promote document composition services more, but integration with other document management systems remains thin.

9) Gadgets and Widgets will sweep the Portal world
Definitely true, and more interestingly, they are making inroads into the non-portal world.

10) Records Managers face renewed resistance
We argued that, "the movement for simple retention rather than detailed RM practices will continue to gain ground." And by that measure, I'd say yes.

11) Internal and external social and collaboration technologies will diverge
Yes, though to be honest, this has been an organic trend in the marketplace for a couple years, mitigated only by an emerging trend to extend some internal collaboration services to limited sets of external partners.

12) Multi-lingual requirements will rise to the fore
Sort of. This is obviously a long-term trend, yet some smaller vendors still suffer in delivering multi-lingual capabilities.

So, on the whole, we were 10 for 12. That's slightly better than 2009's 9 out of 12, though one might argue that the future is getting clearer rather than us becoming more prescient.

Meantime, stay on the look-out for our 2011 predictions, which promise to be a thought-provoking collection...

Vamosa to stop going #cms #migration Mon, 20 Sep 2010 18:42:00 +0000 Vamosa means "let's go" in Spanish, and sadly the UK-based content migration technology firm Vamosa is indeed going...away.

After a week of rumor we confirmed directly with the firm today that they are in "administration." What this means in practical terms is that Vamosa is now under temporary new management (an insolvency practitioner is appointed), tasked with putting together a rescue package and/or outside buyer. 

From what we can gather this move came as a shock to all concerned, but clearly it will take time for all the details to surface. I think it likely that Vamosa will be sold as a going concern, that there will be a buyer, and that in some form or other the Vamosa technology and possibly even the brand will live on. There's a place in the enterprise for migration and compliance tools, but cash is always king even when sales are going well, and small vendors have fewer options at financial crunch time.

At this stage there is not a lot I can add, other than the very obvious advice that any dealings with Vamosa need to go on hold at least until this is all resolved.

Cri de Coeur for Records Management #compliance #sharepoint Wed, 04 Aug 2010 10:11:00 +0000 If you were to trust in the marketing swill that comes out of the vendor and analyst community these days, you would believe that large organizations are not just embracing ERM (Electronic Records Management) but that they are positively hugging and kissing it too. You might believe that organizations driven by urgent compliance needs are enthusiastically managing and archiving large SharePoint installations, and are having meaningful discussions about how to deal with Web 2.0 content, having already taken control of the e-mail mountain.

This is of course complete and utter rubbish. SharePoint sites continue to grow unabated,  and nobody has even started to deal properly with e-mail as a record, let alone the plethora of technologies that Web 2.0 encompasses. You can certainly find examples of brave souls who have made progress if you look hard enough, but they are the equivalent of a few grains of sand on a beach.

Legal has zero clue what IT actually does (beyond providing a poor-quality helpdesk). Records Managers have nobody's ear but the RM community. Business thinks it knows best and listens to no one. Ironically the vendor community is for once the voice of reason here. Strip away the marketing hype, and vendors have made huge progress (as our research details) to deliver solutions that can actually provide excellent ERM capabilities in today's highly fragmented and ever-growing enterprises. Although product offerings vary substantially among individual vendors, the problem does not lie with the technology.

What matters more, though, is who's going to fix the situation? More specifically, where does one start to fix something that is so terribly broken?

In my personal opinion the place to start is at the beginning, and to question ERM's "raison d'etre." Once upon a time office workers made use of filing clerks, who in turn made use of cabinets, folders, and file plans. Information was managed, and no one needed to know the magic behind it; it just worked. When you needed to get hold of a piece of information the filing clerk would get it for you, and when you needed to dispose of information the filing clerk would likewise oblige. When you moved on to better things or fell under the proverbial bus, you did so safe in the knowledge that the next person could pick up (information-wise at least) where you left off. Not so in today's office: information gets lost, information gets hoarded unnecessarily, and when you transition upward or onward, you often leave the equivalent of an information black hole behind you.

The role of managing information through its lifecycle to destruction is arguably more relevant, vital, and important today than it has ever been, but whose responsibility is it? Records Managers are considered impractical and out of touch with modern reality, while IT is clueless but sounds clever ("don't worry, it's all backed up....."). Business listens to no one; instead they believe they have the task in hand via their zip drives and desktop search.

I am not sure if my need to rant today is connected in some way to the recent discovery online of a photograph of myself from my days in the Army, and my subsequent visit to the Military Museum in Winchester -- possibly stirring up a deep-seated need to rally the troops, or more likely in my case to start an armed insurrection. But whatever it is, I do passionately believe that the time is long overdue to have a battle royale over what ERM should be, as opposed to what it currently is or is not. A clean slate is required, and fresh ideas based on the reality of overwhelming volumes of information are essential to the debate.

Subsites - up front vs. ongoing costs #cms Mon, 02 Aug 2010 16:23:00 +0000 Do you run or plan a site with hundreds or thousands of subsites?  If so, you have many unique issues to deal with, including complex permissions, templating, taxonomy, and UI requirements that are completely irrelevant for smaller sites.  And don't forget another important area -- cost -- both in money and time.  Here are some thoughts on trade-offs and how you might address them.

Let's start with the assumption that if you have a large number of subsites, chances are you have complicated politics. Your reaction may be to just do whatever it takes to get business units into the system, including adding complicated functionality specific to a unit. I've personally been a party to this approach, but you can easily end up with Frankenstein systems that cannot be maintained or innovated upon. Why? If you implement a disjointed system, it's harder to regression test and to add features, since you don't know all of the impacts your changes will have.

Let's consider the steps of a subsite launch:

  1. Request.  The process for requesting a subsite.
  2. Approve.  The process to approve or reject a site creation.
  3. Negotiate.  Negotiating the details, especially the functionality.
  4. Train.  Training the team that will be managing the site.
  5. Create.  The technical creation of the subsite.
  6. Embody.  The site owners adding their content.
  7. Review.  Quality review before launch.
  8. Launch.  Launching the subsite.
  9. Maintain and Innovate.  Ongoing maintenance and innovation across all the subsites.

When deciding on your architecture and processes, consider the cost of each of these steps, and not just some.

There are several approaches / philosophies to subsites, but one extreme would be the completely lenient approach where everyone can do whatever they want on whatever platform they want. In that case, steps 1 through 8 are easy (or at least can be dealt with on a group-by-group basis). But watch out for number 9: what if you wanted to add a new and innovative feature across all the sites? Share content across sites automatically? Standardize look and feel for your site visitors? Next to impossible.

Probably the most common approach is a centralized platform where the core web team is lenient about what sites are created and the functionality details. This can happen when there is weak governance, with the basic idea of "let's just get them into a common system first." In this case, the request and approval process is easy. But watch out: the Negotiate stage can take a long time. Now you've got to not only work out site- or group-specific functionality, but also argue about which features should be added. The training and creation processes aren't standardized either, so they take more time. Similarly, the review process can drag on, since there may have been misunderstandings about what is and isn't possible. But the real problem is maintenance and innovation. The system becomes so complicated that it's difficult to maintain, and if everyone did their own thing, it's also difficult to roll out innovative features across all subsites.

Another approach is to "package" your subsites so that creating a new site is almost trivial. This could mean an extremely simple site-creation form offering only selections (no free-form written requirements) for the available options. In general, negotiation would be simple, since most folks would just fill out the form and get a site. If an option wasn't available for a particular group, that would prompt a discussion of whether the functionality (or a parameter on an existing piece of functionality) was worth adding to the core system, preferably so that it becomes available to all sites. Obviously negotiating changes to the core system takes time, but by carefully guarding the core functionality and biasing toward packaging items for everyone, you wind up with a more stable system. In addition, your system is more maintainable, features can be rolled out more easily across all subsites, and training should be easier as well. To pull this off, you will need strong web operations management, and in particular strong product management.
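To make the packaging idea concrete, here is a minimal sketch of how a selections-only request form might be validated before automated site creation. The option names and values are invented purely for illustration; any real platform would have its own provisioning API.

```python
# Hypothetical sketch of a "packaged" subsite request: the requester
# chooses from fixed options only, so there is nothing to negotiate
# before the site can be provisioned automatically.
ALLOWED_OPTIONS = {
    "template": {"news", "team", "project"},
    "language": {"en", "fr", "de"},
    "comments_enabled": {True, False},
}

def validate_request(request: dict) -> list[str]:
    """Return a list of problems; an empty list means the site can be
    created automatically from the packaged options."""
    problems = []
    for field, allowed in ALLOWED_OPTIONS.items():
        if field not in request:
            problems.append(f"missing option: {field}")
        elif request[field] not in allowed:
            problems.append(f"unsupported value for {field}: {request[field]!r}")
    return problems
```

A request that validates cleanly can go straight to automated creation; one that doesn't becomes the trigger for a product-management discussion about whether the core option set should grow.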

Yes, there are other possible approaches.  But one thing is for sure: consider all the steps of your subsite and platform lifecycle when deciding on your subsite creation approach.

How to get the right vendors to respond to your RFP #cms #ecm Mon, 12 Jul 2010 13:00:00 +0000 Many of our enterprise research customers are reporting an interesting trend: fewer responses to their RFPs (a.k.a., "tenders"). Even vendors they had at the top of their lists may decline to bid. This puzzles them. Aren't we in an economic recession? Isn't it a buyer's market for technology? Well, yes and no.

There are at least five reasons for this phenomenon, and specific steps you can take to mitigate potential problems.

1) RFPs are getting better
In the technologies we cover at least, we're seeing fewer check-list RFPs and more narrative-based requirements. We also see larger enterprises profiting from preliminary RFIs. (Hopefully our advice over the years has played a modest role here -- but many others are offering good counsel as well.)

Good RFPs tend to create more work for the vendor, and if a potential bidder thinks your project is a stretch, they will more readily beg off.

But those who think they offer a good fit -- and it's all about "fit" -- will usually put in the extra effort.

Indeed, the most important by-product of better RFPs is better vendor responses, even if fewer in number. Responses become more targeted and relevant. This makes it easier to discriminate among them. So there's another trend we've noticed: less contentious down-select meetings. When you're clearer about what you want and bidders have to be more specific about their real capabilities and tendencies, decisions come more easily. (Ditto for subsequent down-select decisions, including after the all-important test phases.)

Of course, not all vendor responses will be as targeted as you'd like. And to be sure, vendors still don't have to look far to find horrible RFPs. But this leads me to point #2...

2) Vendors have become more discriminating
The very economic downturn that has you thinking you're in the driver's seat has vendors watching their cost-of-sales more carefully than ever. In short, they're less likely to chase. That's A Good Thing for all concerned, but also should give you some pause. If you're looking for the right fit, you want to appear at your best when soliciting proposals.

So, avoid nebulous, unprioritized requirements, and never use canned RFPs. Above all, figure out a plausible set of candidate suppliers. When vendors see wacky short lists, they typecast you as an amateurish or potentially indecisive prospect, and are more likely to beg off. Do your homework first.

Then inform the targeted bidders that you chose them through a substantive review process. If they haven't met you, they may balk at responding to an RFP over the transom, even though you've spent a lot of effort figuring out that they could offer a good fit. Try to talk to them on the phone before you send an RFP, and invite a preliminary webinar where useful. Which leads me to point #3...

3) Vendors remain a little paranoid
Whether you like it or not, you may need to assure prospective bidders that the deal isn't "wired" for a competitor. Vendors have always been suspicious that they're getting led around by the nose when a decision has already been made -- where the customer just goes through the motions, perhaps using a fake selection process to secure a better price from the chosen supplier.

In my experience, vendors are overly paranoid, and should give more benefit of the doubt. But they've been burned enough that you can understand their fears, especially if they don't have some sort of relationship with you that establishes initial trust.

[As an aside, at The Real Story Group we work only for you the technology buyer, but on principle we won't advise on nominally competitive procurements where the outcome is pre-ordained and bidders are being exploited. We try to explain to any customer issuing a wired solicitation that it's a smaller industry than they think, and in the long run, reputations matter. Vendors need to adhere to the same ethical standards as well.]

4) Systems Integrators are always discriminating
In maturing marketplaces, RFPs increasingly target systems integrators (SIs) or other services firms, rather than software vendors. (We've recommended this on several occasions recently.) Also, when deeply vetting open source platforms, you will often need to turn to a consultancy to bid. In any case, the SI must bring a particular solution to the party, but has to justify both the tool and their organizational fit for your needs. This type of combo procurement can become tricky for you, but also potentially very effective.

Although SIs receive the lion's share of global technology spend, they have narrower profit margins and are therefore careful about what they go after. Also, the best ones are busy. Often very busy. That has them looking for engagements that represent a good fit. A carefully-crafted RFP will let them know if they fall in your sweet spot. A loopy RFP will tempt them to perceive you as a "squirrelly" client -- the kind they lose money on -- so they decline to bid.

5) The vendor screwed up
Sometimes vendors fail to bid simply because of bad timing or other mistakes. There may have been a hiccup in their sales process. Maybe the reps and proposal writers in that region were over-committed that month. Maybe they were in the midst of replacing personnel. Maybe they didn't fully grasp the import of your solicitation.

That's all their fault, not yours. So it's tempting to dismiss them as a poor partner as a result. But you shouldn't over-weight a vendor's sales acumen -- or lack of it -- in your decision, especially if you think they might make a good fit against more enduring criteria like functionality, architecture, and TCO.

Here's how to address this. Confirm whether each vendor is going to participate early in the proposal process so you can give them a chance to overcome issues of poor timing, insufficient understanding, or human mistakes. Don't worry that a candid conversation will make you seem weak-kneed or supplicating; it will likely raise the vendor's interest in participating and you can still remain discriminating in your reviews and negotiations going forward.

By the same token, you shouldn't coddle a potential bidder, either. Sometimes non-responsiveness really does indicate deeper incompetence or disinterest. Just take the extra step to find out what's really going on.

At the end of the day...

Fewer bidders isn't necessarily a problem. Among other benefits it means less reading! And better reading, since, done right, the responses should feel more relevant to your concrete needs.

The problem comes when you don't get a sufficiently discriminating set of responses, and/or a particular supplier that you really wanted to participate begged off instead. To avoid this, follow the advice from #2 and #5 above about communicating early with potential bidders.

Also remember that an RFP and attendant proposals -- while critical -- represent only part of the selection process. Our evaluation research identifies empirical, test-based steps (specific to each type of technology) that you should take before making a final decision.

In the end, like so many things, quality is more important than quantity. Vendors who try to be all things for all customers will respond to vague RFPs. The best fit for you will more likely emerge from a stronger, more focused solicitation. If we can help, please let us know.

Does SharePoint Cause Information Management Problems? Thu, 01 Jul 2010 16:45:00 +0000 In a recent article in Computing regarding this subject, the author suggests that SharePoint is the cause of recent information management challenges within organizations adopting the platform.  Like much of the criticism lodged against SharePoint, the article focused on the sometimes-unbridled provisioning of sites and information constructs like lists.  However, is this condition entirely SharePoint's making?

As subscribers to our SharePoint Watch research know, the platform presents several important challenges to licensees. SharePoint is somewhat weak in centralized administrative controls (fine-grained controls aren't universally available for things like security), and management capabilities essential for larger organizations often require third-party add-ons. Of course, these shortcomings will get worse if SharePoint spreads virally within your enterprise.

That said, firms must also take responsibility for implementing proper governance policies that normal humans can understand and apply; too often organizations create draconian and inaccessible governance plans, or simply leave users to their own devices. Both situations create more harm than SharePoint's own provisioning management issues.

SharePoint is weak in some areas and strong in others.  However, this is one place where I tend to side with Martin White, of Intranet Focus. He once remarked that SharePoint, more often than not, points out an information management issue that most firms didn't know they had because they weren't managing their information prior to implementing SharePoint. 

Automating manual processes - is it always worth the effort? #ecm Thu, 17 Jun 2010 10:31:00 +0000 For IT professionals it is taken as a matter of faith that automation improves efficiency, that making paper based documents and processes electronic is, in and of itself, a default benefit.  But experience in the real world paints a more mixed and problematic picture.

Take for example a recent study by University College London into the UK's groundbreaking transition to electronic patient care records for the National Health Service. This initiative has come at a cost so far of $360 million, and (to quote) "will require a high cost and an enormous effort to fulfill its potential." That's a nice way of saying that the effort to date has largely been a waste of time.

The problems that have dogged this particular initiative are the same problems that arise in almost all similar projects, regardless of the industry sector or budget size. 

  • Undertaking a full content audit, and ensuring that all the original data is not only accurate but also consistent, is difficult, time consuming, and costly
  • Migrating content from (often disparate) systems is always a major challenge
  • If you have not proven the value of the new system, people will stick with the old
  • Validating and continuously improving the new process without full buy-in from the old users is near impossible

Though I do have Luddite tendencies, I am not advocating that you should avoid moving manual and paper-based processes to electronic methods. But I am advocating that you do so with your eyes open to the inherent complexity and difficulties.  Build a business case first, and ensure you have a full grasp on the work involved. Electronic processes are not by default any more efficient or cheaper than manual systems. 

And as for the rather cynical attempt of the ECM industry to push the benefits of Green IT, digital data is often no more "green" than paper. Sometimes it's worse. 

One final consideration is that of confidentiality. If you have highly confidential information, remember that it is far easier to inadvertently expose that data to a multitude when it's in electronic form than on paper.

Again, to be clear: great cost and efficiency gains can be had from moving manual processes to automated ones. But the move is often far bumpier, and its benefits less clear-cut, than many think.

I spent this past week at the AIIM UK Roadshow talking to buyers and end users of ECM technology. My mantra was that we often do not recommend to our research subscribers that they invest in more technology; rather, we suggest they better understand the tools and platforms they already own, and make best use of them. It's an unusual proposition from an industry analyst firm, but as we have 100% independence, it's one we're happy to make. Likewise we have no problem advising our customers to think twice before diving headfirst into projects based on the belief that automation in itself equals improved business.

Autonomy buys CA Information Governance - A First Take #compliance #ecm Wed, 09 Jun 2010 14:46:00 +0000 Autonomy today announced that it has acquired CA's (formerly Computer Associates) Information Governance assets.  These consist primarily of two main products, CA Records Manager and CA Message Manager.

It's a surprising acquisition from two angles, first that CA should have seen so little value in this division to dump it, and secondly that Autonomy, already something of a holding firm with overlapping products, should feel the need to buy still more overlapping technologies.

That being said, CA had two decent products that together provided a two-level approach to compliance. (We evaluate them both in our ECM research.) Message Manager offered a general and very practical, bucket-style approach to retention, with Records Manager available on top for those needing more granularity. Both, however, were themselves acquisitions: Records Manager came from the MDY acquisition in 2005 and had previously been called "FileSurf," while Message Manager came from a firm called iLumin, also acquired in 2005 and at the time a market leader in e-mail management.

Neither acquisition ever really settled in or became part of CA -- itself a huge and at times difficult organization. At first glance it seems CA did not really appreciate what it had, or the strategic value it could provide to its overall data management practice.

Autonomy, though, already has records management systems (from Meridio and Interwoven) along with archiving from the Zantaz acquisition.  Why they should need more I don't know.  The only thing that is clear from today's announcement is that Autonomy has further confirmed the general impression that they are a holding company. Undoubtedly Autonomy will soon announce that the CA products have been integrated via "The IDOL Platform", but we would strongly advise buyers to take time to see how this really works itself out -- as well as the long-term costs.

No magic in eDiscovery #search Wed, 09 Jun 2010 12:48:00 +0000 In the American corporate world, eDiscovery is the "pain du jour," as requests to find and turn over documents grow exponentially.

Yet many organizations seem to be under the impression that if they buy eDiscovery software something magical will happen: That regardless of the complete and utter chaos that constitutes information management within their organizations, at the push of a button or two the eDiscovery software will find the required information, and then present it in a neat and secure format, thereby meeting the eDiscovery request.  Even in the world of Harry Potter, there is no magic that powerful.

A good example of just how far the hype is from reality was nicely illustrated by Goldman Sachs this week. In responding to an FCIC (Financial Crisis Inquiry Commission) request for information, Goldman provided 5 Terabytes of data, the equivalent of 2.5 billion pages of documents. To quote the FCIC Chairman, "We did not ask them to pull up a dump truck to our offices and dump a load of rubbish."  Now I have no idea whether this was simply a deliberate attempt to obfuscate and hinder the crisis probe -- as the FCIC seems to believe -- but I doubt that it's the whole of the story even if it is a part of it.

eDiscovery requests fall into two main categories: the easy and the impossible. The easy ones relate to a specific individual and a specific transaction: by freezing an individual's email account and network folders, you can find almost everything you have been asked to look for. The impossible category consists of almost everything else. In these cases lawyers sort things out in back rooms and usually come to a compromise; when they can't, they either fight the request through the courts or simply settle. When a government probe like this comes along, though, you have no real option other than to comply, or at least to attempt to as best you can.
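The "easy" category above can be sketched in a few lines. This is purely illustrative: the message fields (`sender`, `recipients`, `sent`) are hypothetical, and a real collection would come from a frozen mail archive, not an in-memory list.

```python
# Illustrative sketch of the "easy" eDiscovery case: restrict a message
# collection to one custodian and one transaction's date window.
from datetime import date

def legal_hold(messages, custodian, start, end):
    """Return messages the custodian sent or received within [start, end]."""
    return [
        m for m in messages
        if (m["sender"] == custodian or custodian in m["recipients"])
        and start <= m["sent"] <= end
    ]

# Example: one message falls inside the transaction's window, one outside.
inbox = [
    {"sender": "trader", "recipients": ["broker"], "sent": date(2008, 3, 1)},
    {"sender": "trader", "recipients": ["broker"], "sent": date(2009, 6, 1)},
]
in_scope = legal_hold(inbox, "trader", date(2008, 1, 1), date(2008, 12, 31))
```

The "impossible" category resists exactly this kind of filter, because neither the custodian set nor the date window is well defined up front.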

To meet eDiscovery requests with ease, you need solid information governance in place across the whole organization, and a lot more than a search engine on steroids. When firms have billions of documents, or even just millions, knowing exactly which documents are needed, and how they relate to each other in the chain of events behind an eDiscovery request, will require a lot of manual trawling, lawyers at high hourly rates, and pragmatism for the foreseeable future.

The only good news is that many firms are now recognizing that no magical software will ever be able to fix this situation. You need good information management practices, and a plethora of content technologies (ECM, Search, Archiving, etc.) all working in tandem.

Google opt-out -- another blow to web analysts? #analytics Mon, 07 Jun 2010 12:05:00 +0000 A few months ago, Google promised to make it easier for website visitors to opt out of tracking by websites that employ their free Google Analytics service. Their announced solution last week strikes me as a bit of grandstanding on one hand, and potentially damaging to the future value of web analytics on the other.

In the context of the debate around a new U.S. Federal government OMB policy on the use of persistent cookies, as well as other national governments' interest in enabling site visitors to opt out of web analytics tracking, Google's stated goal is to make opting-out "easier."  The company has responded by developing a downloadable browser plug-in that disables Google Analytics data collection from all sites using Google Analytics.

Why is downloading a plug-in considered any easier than disabling cookies from within your browser options? Or adding websites to your exclusion lists? 

I can understand why some people may want to opt out -- especially for particular websites -- but it's important to understand that Google has selected the "nuclear" option here in lieu of using its market authority to promote a more modest, site-specific approach. (Competitor Omniture, now part of Adobe, provides a site-specific opt-out service, but doesn't promote it heavily.)  Adopted widely, the nuclear option will add to the myriad difficulties that already compromise web analytics accuracy. The impact will be magnified if Google's approach gets endorsed through the new Federal cookie policy and other government policies.

To be clear: I understand reasonable privacy concerns.  My alternative would be to provide site visitors with the option to use their browser to opt out of tracking from the specific website they are visiting, or opt out of all tracking from the particular web analytics solution, but not automatically default to the latter, as Google has done.

Why would Google promote the all-encompassing opt out? After all, it's the biggest web analytics service provider in the world by a large margin.  Global opt-outs hurt its customers.  Except that its customers don't pay anything for the service, whereas government goodwill is critical to Google's success in many other lucrative areas.  For Google Analytics customers, it's just another reminder: there's no free lunch.  

Multiple document silos - where to start? #ecm Mon, 31 May 2010 13:02:00 +0000 Virtually every customer of ours manages multiple document repositories. The document volumes stored in these repositories can get enormous; even smaller organizations can produce millions of documents, while the largest run to billions. 

What is common across them all is the desire to consolidate, and to gain more value from the huge volume of information sitting in documents across their units. Sometimes this is no more than a desire to find things quickly, while in other cases the goal is to merge and integrate business process tasks with relevant and current information. In still other instances, the enterprise simply needs to reduce costs and do more with less.

Multiple repositories can come in many different forms, be they hundreds of SharePoint sites, a handful of massive ECM systems, or a combination of shared drives and Outlook folders. But they all represent the same basic problem. "I have the information I need, it's somewhere, but I can't access it or find it easily, let alone leverage its full value."

When trying to improve a multiple-repository situation, people usually consider the following approaches:

  • Migrate everything into an Über Repository
  • Federate the management of the repositories in place
  • Start an information governance project
  • Work on Information Architecture/Taxonomy/Metadata
  • Take a federated search approach to the situation
  • Use APIs and/or EAI to integrate at the back end
  • Build a portal/mashup to integrate at the front end
  • Use BPM to integrate in the middle
  • Go the SOA route to deliver shared ECM services

In fact there are still more approaches you could take, and the options above are not mutually exclusive, but they are the most common.

What is not so common is taking an approach that addresses the bad practices that created these situations in the first place. Somebody once said, "It's OK to make mistakes, but not OK to make the same mistakes repeatedly." Yet that is exactly what so many organizations do when it comes to managing information.

Rather than jumping straight into one of the options enumerated above, I would advise you to take a step back and consider starting every information management project with a major cleaning exercise.

For example, if documents have not been accessed in X period of time, let's be honest: they are unlikely ever to be accessed again, and in most cases there is no legal or regulatory requirement for you to hoard dead information. So take the chance to identify deadwood data and get rid of it, preferably through a formal disposition process; if you must keep it, move it to cheap offline storage and "archive" it. Whatever you do, work toward a situation where only active and relevant information sits in your silos. Move or destroy everything else.
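As a rough illustration of that first audit step, here is a minimal Python sketch that flags files not accessed within a given window. It assumes last-access timestamps are a reasonable staleness proxy (note that many filesystems are mounted with `noatime` or `relatime`, which makes access times unreliable); the `max_age_days` threshold and function name are my own illustrative choices, not part of any formal disposition process, which should still make the final call.

```python
import os
import time

def find_stale_files(root, max_age_days=365):
    """Return paths under `root` whose last-access time is older than
    `max_age_days` -- candidates for disposition or offline archiving."""
    cutoff = time.time() - max_age_days * 86400
    stale = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_atime < cutoff:
                    stale.append(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
    return stale
```

In practice you would feed the resulting list into a review-and-disposition workflow rather than deleting anything automatically, and you might combine access time with modification time or repository audit logs for a more trustworthy picture.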

For many organizations such a clean-out delivers more value than the rest of the project activities put together. In some cases content volumes shrink by 80% or more, and as a result, when you browse, search, or mine content, you access only current, relevant information. It also makes the job of consolidating, migrating, or integrating information silos much more worthwhile.

This should always be your starting point: clean data. Otherwise you'll spend most of your time knitting together an awful lot of useless files.