Five signs that vendors influenced an analyst report

  • 3-Jul-2017

There's a lot of skepticism today about the influence that vendors exert on traditional analyst reports, including software evaluation reports. That skepticism is well founded. But the problem is not that vendors bribe analyst firms outright. The game is more subtle than that. Vendors use their leverage over analyst firms to skew reports in favor of suppliers over customers.

You end up with charts like this.

[Image: Mystical Quadrant]

How do you know when a report or white paper was written more for vendors' interests than for you the customer? Here are five tell-tale signs.

1) When the report employs euphemisms for product shortcomings

Let's start with a dead giveaway: when a report labels real weaknesses as "challenges."

This implies that dealing with buggy code, siloed SaaS offerings, or missing functionality is not actually career-deadening drudgery, but somehow a muscle-boosting activity, like training for a marathon or learning a new language. When you dig deeper, you'll often find that the analyst actually meant challenges for the vendor to overcome. It's revealing that traditional analyst grammar puts the vendor as subject and the customer as object. Analyst reports are often written for the benefit of other vendors and investors, rather than you the customer.

A similar and only slightly better variant employs the word "caution." When we point out -- as we did recently for a DAM vendor -- that "the company is perceived by customers as arrogant and unhelpful," that's not a caution. It's a potential project snuffer for you the implementer.

2) When the report takes vendor promises uncritically

When confronted with a real weakness, vendors are no different from other companies: they make promises. Most commonly, they'll say, "we're going to fix this in the next version." Sometimes they actually do fix the problem; often they don't. Either way, the promise gives them something to smooth over a written critique in the meantime.

I've learned through hard experience that roadmap certainty is an oxymoron, for commercial products and open source projects alike. Anyone who lived through years of Autonomy/Interwoven TeamSite 6.x-8.y can attest to eternal promises of "it's forthcoming" in a mystical (but ultimately mythical) subsequent release. OpenText, Drupal, and Office 365 customers can describe similar experiences.

So when you read "a fix is on the way," there's a good chance another kind of fix was already in...

3) When the report employs fake criticism

Analysts often avoid criticizing vendors by listing "challenges" that don't really matter to customers. Reports will decry things like a lack of visibility in markets that the vendor doesn't target in the first place, or a perceived lack of marketing acumen. How much do you care whether your tech supplier is good at marketing?

Analysts also love to poke at a vendor's "positioning." Positioning is a bullshit term with no relevance for you the customer. In vendor-speak, positioning translates to "watch what we say, not what we do."

Sometimes there's another game here. Citing a positioning or marketing problem represents a not-so-subtle critique of the vendor's product management — a critique that might generate a sales call from that same analyst firm offering to advise the vendor for a very lucrative fee.

4) When the report doesn't grade on a curve

Typically a vendor or technology needs to be writhing on its deathbed to receive a less-than-average rating in a traditional analyst report. I won't ascribe all technology problems to poor vendor selection, but in a world where more than half of IT projects fail to meet their objectives, surely your choices are not just between "good" and "better."

Journalistic and crowd-sourced reviews sometimes fall victim to this too: all the vendors rate between an 8 and a 10 on some fanciful numeric scale, leaving you the customer to divine real meaning from fractional differences.

Remember, the key criterion here is not a generic rating, but a supplier's fit for your particular circumstances.

5) When the report avoids taboo topics

In vendor-influenced reports, certain topics become taboo, and therefore go missing. How often do you see criticisms of things like scalability, cloudability, security, customer support, and ease-of-use?

The perceived stakes here are very high. If enterprise customers notice a taboo problem in a major analyst evaluation, they may consider it a deal-killer, so the terms become toxic, and analyst reports tend to avoid them. It's possible that industry analysts — some of whom are former vendor product managers themselves — simply get cautious about appearing to have dropped the big hammer on their brethren.

That's bad. In our experience, these weaknesses are among those most commonly raised by customers. To be fair, problems like usability frequently represent implementation failings rather than inherent software shortcomings. Then again, it's the role of an analyst to distinguish between those two things. (That's why we tend to hire former implementers who can effectively sniff out what's what.)

You the customer should not make hasty judgments here without testing the solution first. Because the real story is this: scalability, security, usability, and support problems plague all of the nearly 150 tools we cover. Many of those tools are also struggling to adapt to cloud and mobile. The degree and nature of these problems will vary, so you need to understand each weakness in context. Above all, as an industry we need to talk about them openly. That's what we try to do in RSG's product evaluations.

[Disclosure: RSG offers vendors the opportunity to fact-check our evaluations of their products, but not influence our interpretations. We do not consult with, work for, advise, or shill for vendors.]

How analysts (sometimes) respond...

If you sit down with traditional analysts over a couple of drinks, they'll explain that there are many different types of reports, from vendor-funded white papers to multi-vendor evaluations and surveys. Each variant invites different types of vendor pressure, and most analysts fight their corner hard.

After the third drink they may confide all sorts of things they find wrong with Vendors X, Y, and Z, but would never put in writing.

In such an environment, one way traditional analysts maintain objectivity is to remain charitable to all the vendors reviewed; hence the escape into euphemism.

What you should do

Remember that software vendors have key allies here. Independent industry evangelists, channel partners, and other cheerleaders dominate much of the public discourse about technology. Sure, more customer conversations are happening in public forums -- and that's a good thing -- but there's still a place for dispassionate, expert analysis (or we wouldn't be in this business!).

I encourage you to see through the fake gravitas of euphemism that characterizes most analyst (and some journalist) output today. Don't accept that you have to read between the lines to know what a reviewer really thinks. You're the one spending money on the tools, so:

  • Insist on candor, if only to save your valuable time
  • Ask hard questions, because real experts will have intelligent answers
  • Look for tough critiques, the true sign of objectivity in a world where everyone claims "independence"

We'll try our best to meet you there.
