Lolcat Kills Hello Kitty. Live Video NOW!

Most PR people are familiar with journalistic conventions, like AP style. We know a story starts with the lede (not lead), and we know the inverted pyramid approach to news writing. We know what a byline is, and we're constantly on the hunt for the people behind them, to pitch stories to.

We also know that it's a noisy world, and there's much competition for the headline. PR folks often don't know the reasoning behind what makes the cut; the choices can seem arbitrary and unfair.

But would we be better off if it were some machine, and not a person, deciding what news is important?

The fact is, what we call news is changing – driven by tech and not journalistic principles. More of us are getting our news from Facebook, other networks and news apps these days. And it is the tech companies – and their programmers and algorithms – that determine what appears in our news feeds.

They shape news not only by filtering it, but also by deciding what is share-worthy and what's crap, and through their deals with news organizations, all while trying to make money and keep advertisers happy.

Big tech and their solutions are increasingly the lenses through which we see the news. They insist, generally, that they’re not the media – but make no mistake. Their influence is real and significant, at almost every step of the editorial supply chain.

I thought it might be interesting to compare traditional news values (see this helpful Taylor and Francis guide) with how a social network prioritizes the same.

As you can see from the lists below, there's quite a difference (OK, the "News Now" list is meant to be light-hearted and not 100% accurate, but you get the idea). Another challenge is that the algorithms are constantly changing, hence all the strikethroughs.

This is a big deal, in my opinion. What do you think? What does it mean for your communications strategies and news promotion?

Stay tuned to this blog for further updates on the topic.




Nailing the Facebook Image: Handy Cheat Sheet

To properly target and engage your audience on Facebook, you need impressive visual assets. Luckily, Facebook offers the freedom to be creative and use eye-catching images in your profile, company page, ads and event invites.

However, there are image dimension and sizing guidelines you must follow; otherwise your images won't appear as you'd like, and may not be approved at all.

Fortunately, TechWyse created this Facebook image sizes and dimensions cheat sheet to lend you a helping hand when crafting your next social media campaign.

Even the savviest social media professionals may not be aware of Facebook's image specs. For instance, shared images and shared links require different sizing when it comes to uploading.

Facebook recommends 1200 x 630 px for a shared image. On the other hand, Facebook recommends that shared links should be 1200 x 627 px.

Besides image sizing and dimensions, Facebook also limits how much of a promoted image can be text. Posts whose images are more than 20% text will be disapproved or given less reach. This means you need to make sure the text in your post images stays under this limit if you want your post approved.
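For years Facebook enforced this rule with a grid-overlay tool: the image was divided into a 5x5 grid and flagged if text occupied more than 20% of the cells. The sketch below assumes that grid method and a pre-computed set of text-occupied cells (e.g., from an OCR pass); it's an illustration of the rule, not Facebook's actual implementation.

```python
def exceeds_text_limit(text_cells, grid_size=5):
    """Return True if the fraction of grid cells containing text exceeds 20%.

    text_cells: set of (row, col) cells the text occupies -- a hypothetical
    input that would come from an OCR pass in a real checker.
    """
    total_cells = grid_size * grid_size
    return len(text_cells) / total_cells > 0.20

# A headline spanning the top row of a 5x5 grid fills 5 of 25 cells,
# exactly 20% -- right at the limit, but not over it.
top_row = {(0, col) for col in range(5)}
print(exceeds_text_limit(top_row))              # False: exactly 20%
print(exceeds_text_limit(top_row | {(1, 0)}))   # True: 6 cells is 24%
```

The takeaway for designers: even one extra grid cell of text can tip an otherwise compliant image over the threshold.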

Bookmark, download or print the cheat sheet and share it with your team of social media content creators, digital marketers and graphic designers. Hang it on your desk, on your office wall or anywhere you can easily reference it when working out the specifics of the visual assets to accompany your Facebook posts.

Hope it helps!  Good luck.




Internet Society Drills Down on Fake News


I attended the Internet Society's "Content Rules?!" session the other week. The panel drilled down on what we now call the Fake News problem (I couch it like this because, as you'll see, it's not a new one), defining it and exploring causes and solutions.

There's been a lot written about fake news already. It's turned into a real meme and hot button, but there's been lots of noise and confusion. That's not surprising, because it is a complex topic, one that only recently hit our radar in the wake of the election.

Giving it a name gave it legs, a thing to blame (in some cases just because someone doesn’t like an article), and evoked lots of teeth gnashing. The session gave me the opportunity to hear from some very smart people from different sides, better understand the issues and crystallize my thoughts about how we might address the problem.

Not a New Problem

Journalist and American University Professor Chuck Lewis started by explaining that fake news has been around for years in various forms, e.g. government disinformation and propaganda. Toni Muzi Falcone’s DigiDig wrap (in Italian) also discussed this.

"The irony of us feeling victimized by fake news is pretty thick," Lewis said. "We've gone from truth to truthiness to a post-truth society, and now it's fake news, but it's been going on for centuries."

He blamed the growing number of people "spinning information" rather than reporting it (the ratio of PR people to journalists has grown to 5:1), and called it a crisis for journalism. The big questions are: who decides what is true, and how do you set standards for 200+ countries? We've traditionally relied on the press to be content mediation experts.

“We are at a critical, disturbing crossroad,” Lewis said, as “No one wants the government to be the mediators.”

A Systemic Problem

Compounding the problem are the changing ways we get info, and the growing influence of social networks. Gilad Lotan, head of data science at Buzzfeed, discussed this.

He's studied political polarization in Israel. Gilad showed some fancy social graphs that tracked the spread of stories in the wake of the IDF's bombing of a Palestinian school. Two different story lines emerged. Neither was "fake," Gilad explained: "They just chose to leave certain pieces of info out in order to drive home points of a narrative."

Gilad further discussed how your network position defines the stories you see; this leads to polarization and homophily (a fancy way of saying echo chamber). He also explained the role of algorithmic ranking systems. “You’re much more likely to see content which aligns with your viewpoints,” he said. This spawns “personalized propaganda spaces.”
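The dynamic Gilad describes can be illustrated with a toy ranking function. Everything here (the overlap score, the topic labels) is invented to show the mechanism, not how any real network actually scores content.

```python
def alignment_score(story_topics, user_interests):
    """Fraction of a story's topics that overlap the user's interest profile."""
    return len(set(story_topics) & set(user_interests)) / len(story_topics)

def rank_feed(stories, user_interests):
    """Order stories so the most viewpoint-aligned appear first."""
    return sorted(stories,
                  key=lambda s: alignment_score(s["topics"], user_interests),
                  reverse=True)

user = ["politics_a", "sports"]
stories = [
    {"id": 1, "topics": ["politics_b", "economy"]},  # opposing viewpoint
    {"id": 2, "topics": ["politics_a", "sports"]},   # fully aligned
]
print([s["id"] for s in rank_feed(stories, user)])  # [2, 1]
```

Feed the ranker a user's own history and the aligned story always wins the top slot, which is exactly how "personalized propaganda spaces" emerge.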

It gives bad actors a way to game the system. Gilad illustrated this via what had been the elephant in the room – the 2016 US presidential election. He shared images that showed phantom groups manipulating the spread of information.

“The awareness of how algorithms work gave them a huge advantage. To this day, if you search for ‘Hillary’s Health’ on YouTube or Google, you see conspiracy theories at the top.”

Moderator Aram Sinnreich, associate professor at American University added: “My impression as a media scholar and critic… is that there’s been a lot of finger-pointing… everyone feels that there’s been a hollowing out of the Democratic process… undermining of the traditional role that the media has played as the gatekeeper of the shared narrative and shared truths; people want to hold the platforms accountable.”

Flavors of Fake News

Andrew Bridges, a lawyer who represents tech platforms, said it is important to define the problem before considering solutions. The knee-jerk reaction has been to try to turn social networks into enforcement agencies, but that would be a mistake, according to Bridges. That's because there are seven different things we call fake news, each of which could have a different solution (I list them with the examples he cited):

  1. Research and reporting with a pretense of being objective (e.g., major newspapers)
  2. Research and reporting in service of a cause (National Review, The Nation, The New Republic)
  3. Pretend journalism – claims to be a news source but is actually a curator (Daily Kos)
  4. Lies – the ones that PolitiFact and others give Pinocchio noses or "pants on fire" awards
  5. Propaganda – the systematic pattern of lying for political gain
  6. Make-believe news – e.g., the Macedonian sites that make up news from whole cloth
  7. Counterfeit sites – they make you think you are at ABCNews.com, for example

Then, he dramatically challenged the panel and audience to label certain big ticket topics as fake news or not: Evolution, global warming, the importance of low-fat diets, the importance of low carb diets.

Bridges said that there’s not necessarily a quick fix or tech solution to the problem. “These things have been out there in society, in front of our eyes for years.”  He likened the problem to gerrymandering, gated communities and questions about Hillary’s health.

Some have proposed algorithmic transparency (not surprisingly, Bridges thinks it is an awful idea; “Opening them up just makes it easier to game the system”).

What could work, according to the lawyer? “I think we should look to algorithmic opacity, and brand values of the organizations applying the algorithm.” What about content moderation? He said “Do we turn it over to a third party, like a Politifact? Who moderates the moderator? We know what moderation is – it’s censorship.”

In Bridges' view, education is important. We should teach the importance of research and fact-checking, and keep each other honest: "Friends don't let friends spread fake news."

Other Challenges

Jessa Lingel, Ph.D., an assistant professor at the Annenberg School for Communication, seemed to be the youngest on the panel and spoke up for millennials:

“You can’t promise a generation of Internet-loving people a sense of control and agency over content and not expect this breakdown in trust.” She talked about the growth of citizen-driven journalism and the shift from content generation to interpretation. Jessa bemoaned the digital natives’ loss of innocence:

"We were promised a democratic web and got a populist one; a web that connects us to different people, and instead we got silos. Geography wasn't supposed to matter… anyone with an Internet connection is the same… instead, geography matters a lot."

Jessa said that algorithmic transparency is important, but not enough. "Opacity? I do want to pay attention to the man behind the curtain. We need more than that tiny little button that explains 'why did I get this ad?'"

Up Next: More on Solutions

As you have hopefully seen from my post, there are many opinions on the situation, and it’s a complex topic.

What do you think? In my next post, I’ll share my thoughts on fake news problems and solutions.




Fake News, AI and Bias, Oh My!

First published on DigiDig

There have been quite a few relevant articles recently, so I thought I would write this post, which includes links and brief summaries.

Real Interest in Fake News

Fake news continues to be a hot story, showing what can go wrong when algorithms choose what we read. I covered the topic on this blog when it first broke a few weeks ago.

Ryan Holmes of Hootsuite wrote for the Observer in The Problem Isn't Fake News – It's Bad Algorithms: "As algorithms mature, growing more complex and pulling from a deeper graph of our past behavior, we increasingly see only what we want to see… More dangerous than fake news, however, is all the real news that we don't see. For many people, Facebook, Twitter and other channels are the primary… place they get their news. By design, network algorithms ensure you receive more and more stories and posts that confirm your existing impression of the world and fewer that challenge it. Over time, we end up in a 'filter bubble.'"

Writing for Digiday, Lucia Moses explained Why Top Publishers Are Still Stuck Distributing Fake News. It's not just about news feed algorithms; it also involves the "intelligence" behind automatically served programmatic ads (native ads can look like real articles). She shares an example in which the NY Times displayed a fake news ad next to its real story on fake news (got that?).

Algorithms and Bias

One of the primary concerns about algorithms relates to bias. How do biases infect data-driven computations? And in what ways do programs discriminate?

Kristian Hammond answers the first question in the TechCrunch story 5 Unexpected Sources of Bias in AI. He ponders whether bias is a bug or feature, and says: “Not only are very few intelligent systems genuinely unbiased, but there are multiple sources for bias… the data we use to train systems, our interactions with them in the ‘wild,’ emergent bias, similarity bias and the bias of conflicting goals. Most of these sources go unnoticed. But as we build and deploy intelligent systems, it is vital to understand them so we can design with awareness and hopefully avoid potential problems.”

Alvin Chang writes in Vox about How the Internet Keeps Poor People in Poor Neighborhoods. He shares an example of a Facebook ad that violates the Fair Housing Act by excluding certain users from seeing it. This is blatant, but algorithmic discrimination can be a lot more subtle, and thus harder to root out, he explains.

Artificial Intelligence for Dummies

If you are new to algorithms and AI, you might want to read this Digital Trends story, which breaks down the differences between machine learning, AI, neural networks, etc.

 




Latest PR Gambit: Publishing on Platforms

Back in the day ("the day" being about 10 years ago), we had a simple message for PR clients who wanted to get in on the social media and blogging action.

It was: “Go forth and blog too. Master the channels that are accessible to all.” Those who took the time to produce quality content, nurture social communities and post consistently saw their online influence grow.

Now, the open web is being challenged by the growth of social networking platforms. They’re places we go to connect, and get entertained and informed. Their news clout is growing, as the networks are increasingly publishers and aggregators of content. The social networks reach vast audiences with precise targeting – compelling attributes for marketers.

In short, if you are in the news business or want to promote your own, you are missing out if you are not on Facebook, LinkedIn, Twitter, etc.

But there are a number of challenges along the way. It takes PR out of our media-centric comfort zones. It’s not obvious how to use social networking channels to accomplish your goals, which generally include coverage KPIs.

Sure, many in PR have jumped on the social media and content marketing bandwagons. We can handle Tweeting and blogging quite well. But getting your news seen and covered or appreciated by the right audiences, especially if your profile does not already have umpteen million friends/followers, is another matter.

Success generally requires a combination of paid and organic promotion as well as an understanding of the algorithms, those wonky programs that determine what appears in our news feeds. But they are black boxes and constantly changing. Plus, ad options may be unfamiliar, and they’re also moving targets.

How does one figure this all out? Listen, read, and more important, experiment. Dip your toes in. Test, validate, then repeat.

Reading this blog is a good start, as it offers commentary, articles about best practices and links to the right resources. The networks can be opaque when it comes to specifics about their algorithms, but they do announce changes and make recommendations.

In short, there are no pat answers, although one could invoke advice similar to the words at the beginning of the article: go forth and publish on Facebook (for example). Learn about the secrets of shareable content and how to get into the news feed.

I'll close with an example from the world of politics, which seems fitting since the election has been front and center. It's an article that ran a while back in the NY Times Sunday magazine.

What do you think? Could a similar approach work beyond the field of politics? What ideas does this give you for PR? See the link and excerpts below, and please share your comments.

Inside Facebook’s… Political Media Machine
[Facebook’s] algorithms have their pick of text, photos and video produced and posted by established media organizations… But there’s also a new and distinctive sort of operation that has become hard to miss: political news and advocacy pages made specifically for Facebook, uniquely positioned and cleverly engineered to reach audiences exclusively in the context of the news feed…

These are news sources that essentially do not exist outside of Facebook… cumulatively, their audience is gigantic: tens of millions of people. On Facebook, they rival the reach of their better-funded counterparts in the political media…

But they are, perhaps, the purest expression of Facebook’s design and of the incentives coded into its algorithm — a system that has already reshaped the web…
Truly Facebook-native political pages have begun to create and refine a new approach to political news…. The point is to get [users] to share the post that’s right in front of them. Everything else is secondary.




Breaking into Facebook’s News Feed: 3 Stories, 9 Tips

Facebook has made quite a few changes to its algorithm and news feed in recent months, as has been chronicled on this blog. Digiday said that some publishers are responding by focusing more efforts on SEO.

But where does this leave brands and marketers who want to target Facebook users with news and content?

You need to change with the times. These days, your content should be informative, relevant and entertaining; it helps if the topics resonate with your friends and family.

There were quite a few good posts recommending strategies in light of the updates. Here are three that stood out, with three tips from each.

In A Publisher’s Guide to Facebook’s News Feed Updates, the Newswhip blog shared these tips:

  1. Focus on organic reach and stories shared by actual users vs. brand pages
  2. Use engagement metrics to inform strategy and content creation
  3. Stay attuned to what interests your readers and work hard to serve a niche audience

The same blog followed up with more good advice: How to Adapt to Facebook's "Personally Informative" News Feed. It offered a helpful pointer to how Facebook defines "personally informative": staying in tune with audience interests. Newswhip recommends:

  1. Creating an RSS aggregator featuring the news sources favored by your desired audience
  2. Building your personal brand – the changes favor peer-to-peer sharing
  3. Being genuine, avoiding clickbait and deception

To the last point, Facebook's more recent changes target and penalize clickbait. The Hootsuite blog featured a story on How to Get Clicks Without Resorting to Clickbait. It recommends:

  1. Be accurate: the headline shouldn't promise more than the content delivers
  2. Create an emotional connection
  3. Take the time and care to craft an effective headline

 




Steal this News Feed (How to get Into Facebook Trending)

I don't typically write about reverse engineering news feeds. This blog is about hacking the feed in a figurative sense; i.e., boosting the odds that your news gets featured in the social networks that dominate our attention these days. It's less about black hat hacking, more about smart marketing and communications.

But I thought I'd share a story about the actual hacking of algorithms. In Quartz, David Gershgorn wrote that stealing an AI algorithm and its data is a "high school-level exercise." He wrote:

Researchers have shown that given access to only an API, a way to remotely use software without having it on your computer, it’s possible to reverse-engineer machine learning algorithms with up to 99% accuracy. Google, Amazon, and Microsoft allow developers to either upload their algorithms to their cloud or use the cloud company’s proprietary AI algorithms, both of which are accessed through APIs.

The article explained how you can crack the algorithm’s logic by sending queries, and evaluating the answers:

Think about making a call to a machine learning API as texting a friend for fashion advice. Now imagine you were to send your friend thousands of messages…  After driving your friend insane, you would get a pretty clear idea of their fashion sense, and how they would pick clothes given your wardrobe. That’s the basis of the attack.
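The texting analogy can be shown in miniature. In the sketch below the "API" is just a local stand-in function with hidden weights; a real extraction attack would query a cloud prediction endpoint thousands of times and fit a surrogate model to the answers. The hidden model and query strategy here are invented for illustration.

```python
def black_box_api(x):
    """Stand-in for a remote prediction API; the attacker sees only outputs.

    Hidden model (unknown to the attacker): y = 3*x1 - 2*x2 + 1
    """
    return 3 * x[0] - 2 * x[1] + 1

def extract_linear_model(api):
    """Recover the weights of a 2-feature linear model from queries alone."""
    bias = api((0.0, 0.0))       # query the origin to expose the bias term
    w1 = api((1.0, 0.0)) - bias  # unit query along feature 1 isolates w1
    w2 = api((0.0, 1.0)) - bias  # unit query along feature 2 isolates w2
    return w1, w2, bias

print(extract_linear_model(black_box_api))  # (3.0, -2.0, 1.0)
```

Three queries suffice here only because the hidden model is linear and noiseless; the research the article describes needed far more queries against real machine learning services, but the principle is the same.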

However, anyone who wants to hack Facebook’s news feed would not benefit from this approach, which relies on the availability of an API that’s accessible to developers in the cloud.

So, what about figurative hacking? As this recent NiemanLab piece relates (it also references Gershgorn), Almost No One Knows How Facebook's Trending Algorithm Works (But Here's an Idea). Joseph Lichterman wrote:

Trending now… features broad topics surfaced by the algorithm. According to Facebook’s guidelines, the engineers overseeing Trending are “responsible for accepting all algorithmically detected topics that reflect real-world events.”

Based on some sniffing around, he determined that these things can help you get into Facebook Trending:

  • Make sure your content includes keywords or hashtags that are trending
  • Don't spam (Facebook penalizes frequent posting)
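The first tip is easy to automate as a pre-publish check. The sketch below uses an invented list of trending terms; in practice you'd pull them from a monitoring tool.

```python
# Hypothetical pre-publish check: which currently trending terms does a
# draft post mention? (The term list is invented for illustration.)
trending_terms = {"#election", "#ai", "fake news"}

def matches_trending(post_text, terms=trending_terms):
    """Return the trending terms that appear in the post text."""
    text = post_text.lower()
    return {t for t in terms if t in text}

print(sorted(matches_trending("Our take on AI and fake news #AI")))
# ['#ai', 'fake news']
```

A draft that matches no trending terms might be worth reworking, or at least re-tagging, before it goes out.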

What do you think? I’ll be sharing many more tips about how to optimize your news for the Facebook news feed in an upcoming post.




Huffington Post taps Data Science to go Viral

My recent posts have explored how publishers are working with social platforms to expand audience and adapt storytelling formats (see Publishers & Platforms In a Relationship, and Platforms as Publishers: 6 Key Takeaways for Brands). They reported the experiences of social teams and editors at some of the largest broadcast, print daily and native web outlets.

Those featured, however, didn’t go into detail on the role of advertising to boost reach.

At last week's NY Data Science Meetup (at Metis NYC), we learned how the Huffington Post, the largest social publisher, is using data science to better understand which articles can benefit from a promotional push. Their efforts have propelled merely popular stories into through-the-roof viral successes.

The meetup was about Data Science in the Newsroom. Geetu Ambwani, Principal Data Scientist at Huffington Post, recalled the days when their editors monitored searches trending on Google to inform content creation and curation. Since then, it's become a new game, as more people discover and consume news through social media.

In an age of distributed news, HuffPo needed a new approach.

Data across the Content Life Cycle

Geetu discussed the role of data in the content life cycle spanning creation, distribution and consumption. For creation, there are tools to discover trends, enhance and optimize content, and flag sensitive topics. Their RobinHood platform improves image usage and the all-important headline.

Geetu’s favorite part, she said, was exploring the “content gap” between what they write and what people want to read. It’s a tension that must be carefully considered – otherwise writers might be tempted to focus on fluff pieces vs. important news stories.

When it comes to consumption, data can be used to improve the user experience – e.g. via recommendations and personalization.

Project Fortune Teller: Data Predict Viral Success

Geetu and her team turned to data science to help with distribution. “The social networks are the new home page – we need to be where the audience is,” she said.

Only a small percentage of their stories get significant page views on the web. Performance on social often varies by platform. The team honed the content mix for each to improve engagement. Part of this was determining which articles out of the 1000 daily stories should get an extra boost.

Geetu wondered if they could mine data to spot the ones that have “legs” beyond early popularity. With this info in hand, they could promote these with high value ads, and populate Trending Now and Recommendation widgets to further boost sharing and reach.

And thus, Project Fortune Teller was born. The team looked for winners according to a range of data, such as web traffic growth and social consumption and sharing. But it was no easy task; there are many variables to consider. They needed to determine the optimal time window, as some articles take a bit longer to start trending. Finally, they intentionally excluded hot news stories, instead focusing on evergreen content that was resonating.

Geetu and her team mined historical data, using time series analysis to build a model (for more details, see this SlideShare presentation). They notified the content promotion staff when there was a likely winner. The resulting quick action turned popular articles into viral successes.
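The actual model is described in the SlideShare deck; as a rough illustration only, here is the simplest possible "fortune teller": learn from history how final shares relate to first-hour shares, then flag stories projected past a promotion threshold. All numbers and the growth rule are invented.

```python
def fit_growth_multiplier(history):
    """Learn how final shares relate to first-hour shares, averaged over
    past stories: final ≈ multiplier * early. A stand-in for the real
    time series model."""
    ratios = [final / early for early, final in history if early > 0]
    return sum(ratios) / len(ratios)

def flag_for_promotion(early_shares, multiplier, threshold=10_000):
    """Project final reach and flag stories worth an ad push."""
    return early_shares * multiplier >= threshold

# Invented history: (first-hour shares, final shares) for past stories.
history = [(100, 1_500), (200, 2_600), (50, 900)]
m = fit_growth_multiplier(history)   # roughly 15.3x growth on average

print(flag_for_promotion(800, m))    # True: projected well past 10k shares
print(flag_for_promotion(40, m))     # False: unlikely to clear the bar
```

The real system had to weigh far more signals and pick the right observation window, but the core idea is the same: act on early evidence before a story peaks.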

The conclusion? Machine learning is a key driver of success in predicting which content will win.




Facebook Calls the News Shots, Upending Media and Marketing

I generally don't chase breaking news stories – my posts come once or twice a week at most. This may seem a disadvantage in the fast-moving world of social media. But the slower pace affords some perspective: I try to look beyond the quick headline, see the bigger picture and connect the dots for readers.

And experience has shown that if I miss one news cycle, there will be another right around the corner.

For example, in just a few short weeks, Facebook drew fire for apparent bias in its Trending Now feature. Research came out confirming that it is the number one social network for news – and the chief way many of us get our news. The company changed its algorithm, decreasing the organic reach of publishers. And just this week they're again catching flak – this time for not seeming to think through the implications of Facebook Live, as citizen journalists broadcast raw footage faster than Facebook can filter the streams (see Farhad Manjoo's NY Times piece).

On the one hand, you have to admire their continued innovation. Facebook never stands still, and always seems ready to shake things up to keep users engaged and coming back. On the other, you wonder how much they've thought through all the implications. It's a little like the proverbial dog chasing a car. Facebook has caught the news "car"; now what does it do?

They seem to be playing all sides, trying to make everyone happy while increasing their influence. There have been the predictable media responses about impact on journalism, echo chambers and trivializing of news.

The reality is, news is in the eye of the beholder – and in a content- and algorithm-driven world, Facebook, increasingly the arbiter, says News with a capital N needs to get in line.

Meanwhile, media should adapt their strategies, as it is clearly a mistake to focus on Facebook and platforms at the expense of cultivating other sources of traffic and attention.

Marketers go where media and users do – so they need to take a fresh look and revise their playbooks.

As to the impact on users, and society at large? There, I am not so concerned. We continue to have endless choices of info, news, opinion and analysis.

If people want to rely on Facebook to stay informed, that is their prerogative.  If they want to ignore news and spend their time with baby pictures, that is fine too.  These are likely the same people who looked no farther than the bridge of their nose for other views before Facebook.




The One thing that Could Settle the Facebook News Controversy

There's one thing that could put the whole Facebook Trending News bias controversy to rest – but I haven't seen it yet.

In case you are not familiar with the story, Gizmodo ran this piece last week: Former Facebook Workers: We Routinely Suppressed Conservative Views. The story hit a nerve, given election season timing and concerns about the growing influence of Facebook and other social networks.

To quell the controversy, Facebook made a number of statements and released details of their process for how the trending news “sausage” is made (which was a real page turner for news feed geeks like me).

These moves did not settle the matter. The major news organizations covered the details – but predictably and ironically from their left and right leaning perspectives.

The New York Times story by Mike Isaac saw no evil; he covered the checks and balances, implying that these filter out biases:

While algorithms determine the exact mix of topics displayed to each person… a team is largely responsible for the overall mix of which topics should — and more important, should not — be shown in Trending Topics. For instance, after algorithms detect early signs of popular stories on the network, editors are asked to cross-reference potential trending topics with a list of 10 major news publications, including CNN, Fox News, The Guardian and The New York Times.

But the Wall Street Journal’s Deepa Seetharaman saw something more insidious:

Facebook researchers last year ranked 500 news sites based on how popular they were with the social network’s users who identified their political alignment as conservative or liberal. According to those rankings, eight of the 10 national news outlets that play an outsize role in determining trending topics are more popular with liberals.

I'll leave the topic of Facebook's response to a crisis for another post (they broke the first rule: don't release information piecemeal; that just prolongs the issue).

It seems to me that there must be some unstructured text mining tech that could settle the question. Isn't it possible to analyze Facebook Trending News stories to find and tabulate the ones with a slant – and compare the numbers for left- and right-leaning stories?

It is no simple task, given the complexities of how this all works. Facebook’s document reveals the processes, which blend human and machine effort. Even after the algorithms and editors pick their shots, the feeds get further tailored based on user preferences and actions.
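Once each trending story carries a slant label, the tabulation itself is trivial; the hard (and contestable) part is the text-mining model that assigns the labels. A toy sketch with invented labels:

```python
from collections import Counter

# Hypothetical sample of trending items, each tagged with a slant label
# that a text-mining model might assign (all data invented).
trending = [
    {"headline": "Story A", "slant": "left"},
    {"headline": "Story B", "slant": "right"},
    {"headline": "Story C", "slant": "left"},
    {"headline": "Story D", "slant": "neutral"},
]

def tabulate_slant(items):
    """Count trending stories by their assigned slant."""
    return Counter(item["slant"] for item in items)

counts = tabulate_slant(trending)
print(counts["left"], counts["right"])  # 2 1
```

Run over a large, representative sample, a comparison of the left and right counts would at least put numbers behind the debate.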

But the bigger question: Is it really even possible to edit or curate without bias?


