There’s a well-known Internet phenomenon known as the Streisand Effect. The quick summary is that trying to hide or remove information on the Internet will only serve to bring increased attention to the material.
What I find more interesting than the effect itself are the causes behind it and how they create problems in the social media world. Those two causes are Not Understanding the Platform and The Illusion of Control.
Not Understanding the Platform. In the case that gave rise to the name, Barbra Streisand took offense to an aerial photograph of her house that was published online. The problem with her complaint is that until she brought it up, nobody knew it was her house. The photograph was merely numbered and had been part of a government survey of coastal erosion. It was one of over 12,000 photos published online and had been downloaded six times (two of those downloads by her own attorneys). By not understanding the built-in anonymity of the platform, Streisand shone a spotlight of interest on the image, and it ended up downloaded over 400,000 times. Had she taken the time to understand the platform itself, she could have avoided the issue altogether.
This is a significant problem when groups don’t understand entire generations of technology (like being so unfamiliar with current technology as to think taking down a Web page will remove all copies of it, cached versions, etc.). But it can also be a problem when finer aspects of a platform aren’t understood (photograph tags, reshared content, etc.).
It’s understandable if a person or organization wants to keep information protected. If that information appears somewhere, the initial response may be to try to pull it down completely. But successfully pulling posted content offline is surgery, not traumatic amputation. It must be a refined, focused activity that is only taken when necessary (and must be undertaken with an understanding that it may fail). Understanding the platform will lead you to understand how to conduct that surgery–or whether any action is required at all.
The Illusion of Control. So many times I hear people say that Facebook’s terms give them ownership of your content. I’ll be quick to correct them (section 2 of the Facebook Terms says otherwise), but then I’ll usually go on to ask why they care about ownership. If they are posting their own content to a platform whose primary purpose is to share with others, then ownership should be the last thing on their mind. (If the concern is posting someone else’s work, that’s a different issue.)
Truth is, while platforms keep trying to reinvent better privacy controls, privacy and social media are not friends. They’re barely frenemies. More likely, they’ve formally declared war on each other and it’s going to take a summit in Malta to settle everything. Central to that notion is that privacy is about control–the ability for you to keep information secret. To that end, Benjamin Franklin was the greatest privacy expert ever when he famously wrote:
Three can keep a secret, if two of them are dead. – Benjamin “Don’t Friend Me” Franklin
Information is like mud on a white carpet–it wants to go everywhere. And ultimately you cannot control it once the information is loosed. Even if the information belongs to you in some way, once it hits that happy puppy of a social media platform, he’s going to run his muddy paws all over the room, jump up on the couch, and shake himself in the middle of your annual all-white cocktail party. You can take steps to mitigate the damage, but prevention is your best bet. Once the information is out, you’ll need a different strategy to handle containment–you cannot just say “This is mine and I don’t want it here anymore.” It will have the same impact as screaming “Bad muddy dog!” Maybe less.
Social media is redefining what mine and yours mean when it comes to content. If you want any chance of containing the spread of information, you have to embrace those changing definitions. Once you’ve embraced that change and abandoned the illusion of control, then you can figure out how to work within the system/platform/community to address your concerns.
Ultimately, those two Streisand Effect causes have the effect of unintentionally curating content. There is more content posted online than anyone can read–what gets found is largely a function of how interesting it is. Someone trying to keep you from seeing content instantly makes it more interesting. Perhaps my headline did that here for you (what a coincidence!).
And once you recognize these two root causes of the Streisand Effect, you’ll see it come out in a variety of ways. The latest example, sent to me by a colleague, is the case of a woman who was sued by a church after she posted negative comments about it online. The story shows how both Streisand causes can manifest in multiple forms. First, the woman wrote a few reviews of the church on Google that were taken down, probably at the request of the church (mistake). She escalated by starting her own blog. If you go back and read the blog, it’s fairly innocent. There are some posts that seem critical of a certain church, but it’s apparent she wasn’t happy and moved on. She even mentions how maybe a dozen people read a certain post. The church escalated back by suing her for $500,000, and the media storm ensued. Now hundreds of articles have been written about the church’s lawsuit (I haven’t found one favorable to the church) and the church itself–a wave of negative attention far greater than a small blog discussing a local issue. All because the church didn’t understand the platforms (Google reviews, then a small blog) and had the illusion of control (believing that the threat of a big lawsuit would make the blog go away).