Five signs that vendors influenced an analyst report

There's a lot of skepticism today about the influence that vendors exert on traditional industry analyst reports, including all-important technology evaluation reports and quadrants.

That skepticism is well founded. But the problem is not that vendors bribe analyst firms outright. The game is more subtle than that. Vendors use their leverage over analyst firms to skew reports in favor of suppliers over customers.

You end up with charts like this:

[Image: Mystical Quadrant]

How do you know when a report or white paper was written more in vendors' interests than in yours? Here are five tell-tale signs.

1) When the report employs euphemisms for product shortcomings

Let's start with a dead give-away: when a report labels real weaknesses as "challenges."

This implies that dealing with undertested code, siloed datasets, messy architectures, or latent technical debt is not actually career-deadening drudgery, but somehow a muscle-boosting activity, like training for a marathon or learning a new language.

When you dig deeper you'll often find that the analyst actually meant challenges for the vendor to overcome. It's revealing that traditional analyst grammar puts the vendor as subject and the customer as object. Analyst reports are often for the benefit of other vendors and investors.

A similar and only slightly better variant employs the word "caution." When we point out -- as we did recently for a DAM vendor -- that "the company is perceived by customers as arrogant and unhelpful," that's not a caution. It's a potential project snuffer for you the implementer.

2) When the report takes vendor promises uncritically

When confronted with a real weakness, vendors are no different from other companies: they make promises. Most commonly, they'll say, "we're going to fix this in the next version." Sometimes they actually do fix the problem, and often they don't, but either way the promise gives them something to smooth over a written critique in the meantime.

I've learned through hard experience that roadmap certainty is an oxymoron, for commercial products and open source projects alike. Anyone who lived through years of HP/Autonomy/Interwoven TeamSite 6.x - 8.y can attest to eternal promises of "it's forthcoming" in a mystical (but ultimately mythical) subsequent release. Adobe, Drupal, and Office 365 customers can describe similar experiences.

So when you read "a fix is on the way," there's a good chance another kind of fix was already in...

3) When the report employs fake criticism

Analysts often avoid criticizing vendors by listing "challenges" that don't really matter to customers. Reports will decry things like a lack of visibility in markets that the vendor doesn't target in the first place, or a perceived lack of marketing acumen.  How much do you care whether your tech supplier is good at marketing?

Analysts also love to poke at a vendor's "positioning." Positioning is a bullshit term with no relevance for you the customer. In vendor-speak, positioning translates to "watch what we say, not what we do."

Sometimes there's another game here. Citing a positioning or marketing problem represents a not-so-subtle critique of the vendor's product management — a critique that might generate a sales call from that same analyst firm offering to advise the vendor for a very lucrative fee.

4) When the report doesn't grade on a curve

Typically a vendor or technology needs to be writhing on its deathbed to receive a less-than-average rating in a traditional analyst report. I won't ascribe all technology problems to poor vendor selection, but in a world where more than half of MarTech projects fail to meet their objectives, surely your choices are not just between "good" and "better."

Crowd-sourced review sites sometimes fall victim to this too: all the vendors rate between an 8 and a 10 on some fanciful numeric scale, leaving you the customer to divine real meaning from fractional differences.

Remember, the key criterion here is not a generic rating, but a supplier's fit for your particular circumstances. That's why RSG developed RealQuadrant, so you can build a custom vendor shortlist tailored to your own situation.

5) When the report avoids taboo topics

In vendor-influenced reports, certain topics become taboo, and therefore go missing. How often do you see criticisms of things like scalability, cloudability, security, customer support, and ease-of-use?

The stakes here become quite high. If enterprise customers notice a taboo problem in a major analyst evaluation, they may consider it a deal-killer. So the topics become toxic, and analyst reports tend to avoid them. It's possible that industry analysts — many of whom are former vendor product managers themselves — simply get cautious about appearing to have dropped the big hammer on their brethren.

That's bad for you the customer.

We find those same toxic weaknesses raised quite commonly by customers, and therefore very much worth discussion. To be fair, problems like usability frequently represent implementation failings rather than inherent software shortcomings. Then again, it's the role of an analyst to distinguish between those two things. (That's why RSG tends to hire former implementers who can effectively sniff out what's what.)

You the customer should not make hasty judgements here without testing the solution first. Because the real story is this: scalability, performance, usability, and support problems plague all of the nearly 130 tools we cover. The degree and nature of their toxicity will vary, so you need to understand these weaknesses in context. Above all, as an industry we need to talk about them openly. That's what we try to do in RSG's product evaluations.

[Disclosure: RSG offers vendors the opportunity to fact-check our evaluations of their products, but not influence our interpretations. We do not consult with, work for, advise, or shill for vendors.]

How analysts (sometimes) respond...

If you sit down with traditional analysts over a couple of drinks, they'll explain that there are many different types of reports, from vendor-funded white papers to multi-vendor evaluations and surveys. Each variant incurs different types of vendor pressure, and most analysts fight their corner hard.

After the third drink they may confide all sorts of things they find wrong with Vendors X, Y, and Z, but would never put in writing.

In such an environment, one way traditional analysts maintain objectivity is to remain charitable to all the vendors reviewed. Hence the escape into euphemism.

What you should do

Remember that software vendors have many key allies here. Beyond friendly industry analysts, vendors can rely on industry evangelists, channel partners, and other cheerleaders to dominate much of the public discourse about their platforms. Sure, more customer conversations are happening in public forums — and that's a good thing — but there's still a place for dispassionate, expert analysis (or we wouldn't be in this business!).

I encourage you to see through the fake gravitas of euphemism that characterizes most analyst (and some journalist) output today. Don't accept that you have to read between the lines to know what a reviewer really thinks.

You're the one spending money on the tools, so:

  • Compare and contrast findings from multiple analyst firms, but...
  • Insist on candor throughout, if only to save your valuable time
  • Ask hard questions of your analyst-advisors, because real experts will have answers based on hard experience
  • Look for genuinely tough critiques of vendor offerings, the true sign of objectivity in a world where everyone claims "independence"

We'll try our best to meet you there.

 

