Are Google and the Semantic Web "NetNeutral"?
There's recent news that Google's new SearchWiki will let people tailor their search results so they're restricted to a few specific sites. That way, users can cut out the clutter of sites they don't want to visit and focus only on the ones they do. If they prefer articles from the New York Times over the Washington Post, say, they can now influence those results directly.
The idea isn't completely new. Most search engines already allow site-specific searches if you include an operator like url:website.com or site:website.com. SearchWiki, however, lets you search from a specific set of sites all the time. While those won't be your only results, the results will be weighted in their favor. And over time, SearchWiki could influence the results others see.
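To make the distinction concrete: a site: operator is binary (you only see that site), while SearchWiki-style personalization is softer. Here's a minimal Python sketch of what "weighted in their favor" could look like. To be clear, this is my own illustration, not anything Google has published - the boost factor, the data shapes, and the function names are all assumptions.

```python
# Illustrative sketch only: a simple multiplicative boost for preferred
# sites. This is NOT Google's actual ranking algorithm - the boost value
# and data structures are hypothetical.

from dataclasses import dataclass


@dataclass
class Result:
    url: str
    site: str
    score: float  # hypothetical base relevance score


def rerank(results: list[Result], preferred_sites: set[str],
           boost: float = 2.0) -> list[Result]:
    """Reorder results, multiplying preferred sites' scores by `boost`.

    Non-preferred sites still appear, just lower down - they are
    weighted against, not filtered out entirely.
    """
    def weighted(r: Result) -> float:
        return r.score * boost if r.site in preferred_sites else r.score

    return sorted(results, key=weighted, reverse=True)


if __name__ == "__main__":
    results = [
        Result("https://washingtonpost.com/story", "washingtonpost.com", 0.9),
        Result("https://nytimes.com/article", "nytimes.com", 0.6),
        Result("https://example-blog.com/post", "example-blog.com", 0.5),
    ]
    # A user who prefers the New York Times sees it float to the top,
    # even though its base relevance score is lower.
    for r in rerank(results, preferred_sites={"nytimes.com"}):
        print(f"{r.url}  (base score: {r.score})")
```

The point of the sketch is just that a soft weighting like this still reshuffles everyone's view of the web without ever hiding a result outright - which is exactly why it's hard to say whether it crosses a neutrality line.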
While I'm all for optimized searching, I'm a stronger advocate of Net Neutrality. I'm still not entirely decided whether this breaches Net Neutrality, but it certainly raises some concern in my mind. The loophole is that the weighting is specified by the user, not by Google - and in that sense, Google remains neutral. Regardless, I wonder whether it's likely to kill a (somewhat) level playing field.
Take Wikipedia right now. There are many other attempts at a wiki-style encyclopedia, though Wikipedia is unarguably the largest. While I personally dislike some of Wikipedia's rules, I end up using it more than any of the alternatives. But I would never rate it higher than the more traditionally reputable sources. Beyond that, what about the many other websites trying to compete with Wikipedia? They may be smaller, and since wiki sites rely on user participation they may be less reliable - but if they're swept under the rug, then what? Is Wikipedia the only option?
The danger, as I see it, is that Google could be creating an environment where people stop being neutral about websites - and then it becomes increasingly difficult to reach users in an already competitive world. It's like being able to customize your annual YellowPages so that you only get the stores you want. Sure, the YellowPages would become a far more useful tool - but only for getting you what you already want. It doesn't help you discover what's new, and discovery is about finding what you don't yet know about. What good is a book that lists only Domino's Pizza and Papa John's, even if they're the two you order from 90% of the time?
Maybe the mentality is that if a pizzeria were to pop up with great pizza, it would get noticed simply on the strength of its product. Perhaps... but perhaps not. Great ideas are born every day on the internet, and they die out just as rapidly because they can't reach the audience they need - most often because an already established website adopts a similar product.
The Semantic Web is, perhaps, at risk of creating a non-neutral environment. I guess it all comes down to the users at that point.
So the question remains - is user control a good thing or a bad one?