Monthly Archives: May 2013

Creating Accountability for Opinions on the Internet

Check out Part 1 of this piece, Shut Up Everybody.

I’d like to build a website where I can enter a prediction and who authored it, then later come back and mark whether the prediction was good or bad. You get a point for a good prediction and lose a point for a bad one. With enough data you’d have a pretty accurate read on whether someone knows what they’re talking about. You could even track predictions on a site-by-site basis and get the same readouts for your favorite blogs.

Like Wikipedia, the whole thing runs on user-submitted, verifiable data. When you enter someone’s prediction you link to the source, and when you have proof that their prediction panned out (or didn’t) you link to that source as well.
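The scoring and per-author/per-site rollups described above are simple to model. Here’s a minimal sketch in Python; the author names, site names, and record layout are invented for illustration (a real version would also store links to the prediction and to the evidence that resolved it):

```python
from collections import defaultdict

# Hypothetical sample data: each record is one prediction and, once
# known, whether it panned out. None means "not yet resolved".
predictions = [
    {"author": "alice", "site": "techblog.example", "correct": True},
    {"author": "alice", "site": "techblog.example", "correct": False},
    {"author": "bob",   "site": "techblog.example", "correct": True},
    {"author": "bob",   "site": "gadgets.example",  "correct": None},
]

def scores(predictions, key):
    """+1 per good prediction, -1 per bad one, grouped by author or site."""
    totals = defaultdict(int)
    for p in predictions:
        if p["correct"] is None:  # skip predictions that haven't resolved yet
            continue
        totals[p[key]] += 1 if p["correct"] else -1
    return dict(totals)

def accuracy(predictions, key):
    """Fraction of resolved predictions that were correct, per author or site."""
    right, total = defaultdict(int), defaultdict(int)
    for p in predictions:
        if p["correct"] is None:
            continue
        total[p[key]] += 1
        right[p[key]] += int(p["correct"])
    return {k: right[k] / total[k] for k in total}

print(scores(predictions, "author"))   # {'alice': 0, 'bob': 1}
print(accuracy(predictions, "site"))
```

Unresolved predictions are simply excluded, so an author’s score only ever reflects claims that can be checked against a source.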

Here are a couple of examples. As I’m writing this post, a lot of bloggers feel they need to weigh in on whether the Xbox One will be a success. I say let’s write it all down and see who’s wrong! Maybe we’re giving readership to blogs that have no actual credibility or expertise in the field they report on. TechCrunch famously pooh-poohed Twitter when it launched in 2006, but I’m sure they’ve been right on a bunch of stuff too. What’s the exact percentage? Who are the most accurate predictors on staff?

The data is already out there and aggregating it would be a great service to the internet.

Changing the Internet’s Motivations

One of the best ways to drive behavioral change is to start measuring the thing you want to change. If you’ve already read Shut Up Everybody, you know a little about the seedier side of journalism. It’s a business whose success hinges on pageviews and ad impressions, which incentivizes opinionated journalism and sensationalism. Many blogs want to bring out the worst in us, because that’s the easiest way to get our clicks.

What if the service I talked about above existed? We can already look at M. Night Shyamalan’s track record on Rotten Tomatoes and learn that his movies in aggregate have only received a rating of 44/100. Metacritic provides a similar service for entertainment. Why aren’t we measuring more things? I want to measure integrity. I want to know when the website I’m visiting has notoriously low integrity. I want their readership to flounder as a result, and I want a competitor with more integrity to take that readership.

If you’re passionate about this subject and you have website building skills you should get in touch with me by sending an email to my gmail address: yayitsandrew.