Hiding From The Right?

You've probably heard about the "Facebook algorithm change," and the big meeting with representatives of the Conservative social and political position, held because of the perception that Facebook has been actively blocking what's "trending" from that end of the pool. The charge, as tech website Gizmodo contends, is that rather than simply letting a numerical system count which words or strings of words are appearing most frequently, Facebook has had an editorial team actively suppressing content.
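For the sake of illustration, here is what a purely numerical trending system might look like at its most naive: count how often each word appears across a batch of posts and surface the most frequent ones. This is a hypothetical sketch in Python with made-up data, not Facebook's actual pipeline, which involves far more signals and, per Gizmodo's reporting, a human editorial layer on top.

```python
from collections import Counter
import re

def trending_terms(posts, top_n=10, stopwords=None):
    """Rank words by raw frequency across a batch of posts.

    A toy stand-in for a 'purely numerical' trending system; the real
    thing involves far more than word counts.
    """
    stopwords = stopwords or {"the", "a", "an", "and", "of", "to", "was", "my"}
    counts = Counter()
    for post in posts:
        words = re.findall(r"[a-z']+", post.lower())
        counts.update(w for w in words if w not in stopwords)
    return counts.most_common(top_n)

# Hypothetical posts:
posts = [
    "Big news out of the convention tonight",
    "Convention speech was the big story",
    "My dog learned a new trick",
]
print(trending_terms(posts, top_n=3))
# [('big', 2), ('convention', 2), ('news', 1)]
```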

They probably have. Whether it's provable is another story.

I've got to hand it to him: Mark Zuckerberg has been canny from the moment Facebook was released. Google used many of the same tactics in making the brand, the platform, desirable because you couldn't have it. When FB first hit, it was only for Harvard students, then students and grads. Then students at Ivies. Then sub-Ivies (that's about where I was "allowed" to join). Then any college graduate. Then it opened up to more age groups, and finally, the world.

But by withholding membership, it became something you had to have. Google's email system, Gmail, recall, was "by invitation only" when it first launched. Back then, having an @gmail.com address was a sign that you were somebody. You were cool.

Both behemoth platforms have become so ubiquitous they more or less leapt over "meh, everybody's got that," and went straight from "in crowd" to default.

But therein lies the issue that the political/social Right is having with Facebook in particular. It's become too big, too important, too much of a magnifying glass in our day-to-day lives. We believe it. So if it were lying to us, it could be serious. And it could be too powerful.

Unless you've been napping under the proverbial rock, you know that all results when searching online are served to you based on this thing called the "algorithm." It's the set of rules by which relevance is determined. In search engines like Google, it has been many things at many different times, but what brought Google to its pinnacle (from which it has never been toppled and probably won't be, barring a major hack or server farm disaster) was that when you searched for something, it gave you back what you wanted. I used to giggle - all puns intended - with glee when doing the call-in show, Point 'n' Click, when, as we would be discussing a caller's tech question, I'd type a search word or string into Google and get back information so on-topic it was almost eerie. Needless to say, I, and many others, fell in love with the search tool and abandoned most others.
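To make "a set of rules by which relevance is determined" concrete, here is a deliberately bare-bones sketch: score each document by how many of the query's terms it contains and return the best match. The function and data here are hypothetical; real search ranking weighs hundreds of signals, from link structure to your own history.

```python
def relevance(query, document):
    """Score a document by how many query terms it contains.

    A bare-bones illustration of 'a set of rules that determines
    relevance'; real search engines use far richer scoring.
    """
    q_terms = set(query.lower().split())
    d_terms = document.lower().split()
    return sum(1 for term in d_terms if term in q_terms)

# Hypothetical documents and query:
docs = [
    "How to fix a slow laptop",
    "Point n Click call-in show archive",
    "Slow cooker recipes for busy evenings",
]
query = "slow laptop fix"
ranked = sorted(docs, key=lambda d: relevance(query, d), reverse=True)
print(ranked[0])  # "How to fix a slow laptop"
```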

In fact, while Yahoo, for example, has taken the path of becoming a portal as much as or more than a search tool, Google has become a utility of which search is just one piece (think mail, Hangouts, a Facebook alternative called Google+, a blogging platform, a contact manager, and even a cloud drive).

However, much as we love our Google, if we really began to feel that search results were being withheld from us, or that our individual, as opposed to aggregate, data was being used against us (this is a bit of a conundrum, though, because the more a search engine "knows" about you, the better it can predict what you'll most want when you search for something), we would rebel and the door would be open to an alternative. So Google has been careful to "not be evil," but still show us what we're looking for, creepy though it may be.

Facebook, however, has played a more manipulative game. From that fateful day in 2006 (can you believe that?) when it first launched its News Feed, Facebook has tried to shape your online world.

I remember the day it first dawned on me that I wasn't seeing everything the people I was friends with were putting up. Well, duh. It only stands to reason there's just so much you can possibly take in, so not every cookie your friend from high school 30 years ago shows off on Facebook was going to end up in your news feed. But do you remember the very old days when, if you wanted to know what was new with a friend, you had to deliberately seek them out and scroll through their posts and pictures? Now you could just sit back and let the world come to you.

But let that sink in a bit. Let the world come to you. Before that, the world you got was the world you went looking for: "Oh, I'll check and see what my sister has done recently." Now it was a constantly moving series of announcements - but not all of them - from friends and family, and odd people you didn't really remember ever having met, but hey, what the heck.

Then in 2009 FB added the "Like" button, and the game changed again. Now you could react - sometimes strangely, given that someone might announce "my dog just died" and you'd hit the "Like" button for commiseration.

From that point on, FB became - and allowed you to become - more and more selective in what you saw. You could engage filters to see or be seen, and FB could analyze your "Like" pattern and choose more of the same for you to view.
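That "more of the same" idea can be sketched in a few lines: if the feed keeps a tally of the topics you've Liked, it can simply sort candidate posts so the topics with the most prior Likes float to the top. This is a toy model with made-up names and data, not Facebook's actual ranking, which draws on vastly more signals than a single tally.

```python
from collections import Counter

def rank_feed(candidate_posts, liked_topics):
    """Order candidate posts so topics you've 'Liked' before float up.

    Toy model of engagement-based ranking: each post is a (topic, text)
    pair, and its score is simply how often you've Liked that topic.
    """
    like_counts = Counter(liked_topics)
    return sorted(candidate_posts,
                  key=lambda post: like_counts[post[0]],
                  reverse=True)

# Hypothetical history: topics of posts this user has Liked before.
liked_topics = ["politics", "politics", "pets", "politics"]

feed = [
    ("recipes", "Grandma's pie crust"),
    ("politics", "Another convention roundup"),
    ("pets", "Dog learns to skateboard"),
]
for topic, text in rank_feed(feed, liked_topics):
    print(topic, "-", text)
# politics first (3 prior Likes), then pets (1), then recipes (0)
```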

The big question now is whether FB has been, recently, playing "fair," or whether it even has an obligation to play fair as a private business that you can opt out of if you wish. (More and more Millennials are moving over to Instagram, a social app that sputtered along for a while as mostly a means of sharing images you could artfully doctor, but has now replaced FB for this cohort. Incidentally, Instagram is owned by Facebook, so there you go.) Does FB have a responsibility to show you the broad spectrum of political opinion? Does FB have to be honest about what's trending - even if what's trending isn't something its editors agree with?

Don't be too fast to make a choice here: knowing that something you don't like is trending could be both a danger and a warning. Not knowing that a pernicious idea was percolating to the top could put you in the unhappy position of having no alert. On the other hand, having people repeat, again and again, a silly, wrong, or nasty idea - and therefore having it trend heavily - could also lull you into agreement. I can't tell you how many times I, and I'm sure many others, have read one of those - ok, I'll call it as I see it - rotten, stupid memes, taken a little time to research it, and then refuted it (wasting my precious fingertips in the effort, no doubt). Yet on and on they go, and where they stop - and how much damage they do along the way - nobody knows.

So. Facebook: is it quashing Conservative thought and opinion? Or for that matter, any other thought and opinion? If it is, then what? How much influence, really, does Facebook have over our thoughts, ideas, moods, and opinions? (Some studies suggest a lot, most of it negative.)

Well, here's my answer for now: Google "is Facebook suppressing Conservative ideas," and what you'll get is what you know you'll get: NPR says "Not really," Hotair says "Definitely."

My algorithm predicted that!
