Why Automated Curation Never Works

This is a big topic around here. I’ve seen many automated content curation tools pop up over the years. Most are targeted at the internet marketing crowd and promise big things. Others, like Paper.li, claim to offer content curation but are really just veiled attempts at aggregation.

Our philosophy is different from that of many of these curation tools. We like to build tools that sit one step below automation.

This philosophy drives not only the features we add to the base Curation Suite™ platform but also the types of features we add to the Listening Engine.

It’s when you add this content discovery (or listening) platform to the mix that things get a little murky and some content curation tools turn to the dark side.

Why some platforms turn to this automated curation model is simple: the argument is usually some form of “well, we found the content for you, so now let’s just automate the curating, because that’s what you really want”…

It’s seldom executed correctly, and I don’t see that getting any better given the current state of computer logic. One of the main reasons for this is…

Machines Can’t Create Effective Commentary

Here’s a hidden gem that we’ll start talking about more. Commentary is the real value of content curation. While the content you share is the appetizer, the commentary is the main course, the drinks, and the dessert. Done correctly, commentary will reinforce your perspective and become something your audience actively seeks out.

There isn’t a machine yet that can produce effective commentary that is insightful, snarky, or humorous. Sure, a computer may recently have passed the Turing test, but acting like a confused teenager from Russia is a far cry from providing a few lines of text that make you smile, wink, grimace, laugh out loud, or bang uncontrollably on your keyboard.

No Machine Can Decipher Quality

The current state of automated curation looks at social shares and content engagement. While these are good indicators of what might be popular, they don’t tell you what is valuable to your audience. They also can’t tell you whether you can provide effective commentary, as mentioned above.

That’s why services like Klout currently fail. Klout does an okay job of indicating whether someone has some form of digital influence (and it’s getting better), but it doesn’t do a good job of measuring real-world influence. Maybe if Klout had the computing power and reach of the NSA, it could provide some real-world influence data. I’m not knocking Klout; they’re doing the best they can with the data available. But right now the Klout score doesn’t correlate with real-world influence.

Machines Lack Empathy

This point ties together both the commentary and quality aspects. Since machines can’t think or feel, they lack a critical element that most people have: empathy. Empathy is a critical skill when curating.

Empathy means you can feel what your audience feels; you know their pains, their aspirations, and their wants and needs. No computer or curation tool can come even close to knowing this.

Good content curation evokes an emotion in your audience. A computer can’t watch a 30-second video and feel the same range of emotions a person can. Sure, a machine can tell you that a ton of people have watched a certain video. But that machine won’t be able to tell you “this video is perfect for your audience” or “don’t share this video; it’s entertaining, but your audience wouldn’t appreciate it coming from you.”

If It Could Be Done You Wouldn’t Know

This is another personal perspective. If we designed a way to automate content curation that worked highly effectively and that we could monetize, I wouldn’t share it with you. And if I did, it would most likely be cost prohibitive.

The reason is that doing this well is still a hard task. Look at the effort Upworthy and BuzzFeed put into curating content. Although one of their main sources is Reddit, it still takes a fair amount of work to put out content that gets people to click, share, and consume.

Final Point

I could probably list five or six more reasons why automated curation isn’t effective, but in the end, good content curation still requires a human element. Until machines can replicate the nuances of the human mind, it’s hard to see how automation can reliably produce an effective result.

This is the main reason why we’ve chosen to build a tool that sits firmly one step below automation.

See you in the next post!