The FCC’s Notice of Proposed Rulemaking (NPRM) on net neutrality, announced in October 2009, closed its public comment period this past Friday. This was the period in which anyone could address the FCC’s proposed rules and provide positive or negative comments, solutions, and advice. If the proposed rules move forward in their current form, or become even more restrictive, they will most likely squash all legitimate and necessary forms of network management. When the rules fall into place, time-sensitive services and applications like voice-over-IP (VoIP) will suffer and operate in a degraded form. The result will be stagnation in Internet innovation and advancement, further damage to our economy, and higher prices for Internet access in areas where Internet service providers (ISPs) depend on network management techniques to deliver quality service to their customers.
Years ago, in classrooms around the country, the scholarly discussion of open access became the ideal. Open access was an idea rooted in the scholarly pursuits surrounding common carriage, which dealt with forcing those who controlled the telegraph wires in the early 1900s to open those wires to new companies. Essentially, this opened access for multiple companies to send messages across the same wires, with each message treated equally.
The concept of open access on the Internet was adopted into the phrase “network neutrality,” coined by Columbia Law professor Tim Wu, and originally bore the same principles as those of telegraph communication. All communication on the Internet would be treated the same, i.e. neutrally. Net neutrality is invaluable to the Internet, but the present-day spin has turned a scholarly debate into a chase for unmitigated regulation.
Today, when the phrase net neutrality is used, it is generally synonymous with a variety of scare tactics being bandied about, from ISPs blocking access to Web sites and applications to those same companies preventing freedom of speech. The truth is that these things have little to do with the real issue. The scare-tactic examples are consequences that can occur, but do not necessarily occur, when the proprietary ideals followed by every self-regulated ISP are trodden upon. The real issue is simply network management.
When data is sent across the Web in any form (e-mail, browsing a Web site, watching a streaming video, or playing a video game online), it is broken down into pieces called “packets.” When the information arrives at the opposite end, the packets are reassembled and the data is received by us, the end users. The concept of net neutrality is, at its root, about the order in which these packets are sent and received.
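The split-and-rebuild idea above can be sketched in a few lines of Python. This is a deliberately simplified illustration, not how any real protocol stack is implemented: actual Internet packets carry headers with addresses, sequence numbers, and checksums, and the names and sizes here are assumptions chosen for clarity.

```python
PACKET_SIZE = 8  # illustrative size; real packets are roughly 1,500 bytes

def packetize(message: bytes, size: int = PACKET_SIZE) -> list[bytes]:
    """Break a message into fixed-size chunks ("packets")."""
    return [message[i:i + size] for i in range(0, len(message), size)]

def reassemble(packets: list[bytes]) -> bytes:
    """Rebuild the original message from its packets."""
    return b"".join(packets)

msg = b"Hello from across the Internet!"
pkts = packetize(msg)
assert reassemble(pkts) == msg  # the end user sees the whole message again
```

The point is only that the network handles many small packets, not whole messages, which is why the *order* in which packets are processed matters at all.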
Those favoring net neutrality regulation believe that packets should be sent and received on a first-come, first-served basis. This is most easily visualized by thinking of a two-lane road with traffic headed in the same direction that merges into one lane. When the lanes of traffic merge on the Internet, net neutrality principles dictate that the packets of information should merge and be processed in a first-come, first-served manner. In other words, data I requested before my neighbor should be processed and delivered to me before the fulfillment of my neighbor’s request.
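The first-come, first-served rule is exactly a FIFO (first-in, first-out) queue. A minimal sketch, with made-up packet labels, shows that under this rule a time-sensitive VoIP packet waits behind whatever e-mail traffic happened to arrive first:

```python
from collections import deque

# Packets join the queue in arrival order; labels are illustrative.
queue = deque()
for packet in ["email-1", "email-2", "voip-1", "email-3"]:
    queue.append(packet)

# Packets are processed strictly in the order they arrived,
# regardless of what kind of traffic they carry.
processed = [queue.popleft() for _ in range(len(queue))]
print(processed)  # the VoIP packet waits behind two e-mail packets
```

Nothing in this scheme distinguishes a phone call from a bulk download, which is precisely the limitation the next paragraphs address.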
Those opposing regulation believe that network engineers know best how to send and deliver network traffic. This is known as network management. However, it is important to understand that those opposed to regulation are not necessarily opposed to the concept of net neutrality.
Because of the complexity of the networks we are dealing with, and because of new time-sensitive applications being introduced on the Web today, like VoIP, the first-come, first-served method of sorting Internet traffic has in many ways hurt burgeoning technologies. With certain techniques, like quality of service (QoS), network engineers have the capability to prioritize certain packets. This means that transmission of time-sensitive data, like VoIP phone calls, does not have to wait in line behind non-time-sensitive transfers, like e-mail. Promoting the use of network management techniques is not an endorsement of throwing out the concept of net neutrality.
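The QoS idea of prioritizing time-sensitive packets can be sketched with a priority queue. This is a toy model, not any router vendor’s actual scheduler: the priority values, traffic classes, and labels are assumptions chosen to illustrate the contrast with the first-come, first-served example.

```python
import heapq

# Lower number = served sooner. Values are illustrative.
PRIORITY = {"voip": 0, "email": 2}

def schedule(arrivals):
    """Return packets in the order a QoS-style scheduler would send them.

    `arrivals` is a list of (traffic_class, packet_label) tuples in
    arrival order.
    """
    heap = []
    for seq, (kind, packet) in enumerate(arrivals):
        # seq breaks ties so equal-priority packets keep arrival order
        heapq.heappush(heap, (PRIORITY[kind], seq, packet))
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

arrivals = [("email", "email-1"), ("email", "email-2"), ("voip", "voip-1")]
print(schedule(arrivals))  # the VoIP packet jumps ahead of the earlier e-mails
```

Note that the e-mail packets still arrive, and still in their original relative order; the phone call simply does not sit behind them, which is the whole argument for this kind of management.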
The principle of preventing the blocking of Web sites and applications still remains. In fact, the argument can be made that allowing QoS and other network management techniques makes for a more neutral Internet, because the end user’s Internet experience is maintained and presented without flaw no matter what service or application they are using. Furthermore, the recent clamoring that regulation provides First Amendment protection is a misunderstanding of our First Amendment. The protections for free speech guaranteed by the First Amendment are a shield for the people against government intrusion, not a sword for the government to decide what free speech looks like.
Even with the best of intentions, ideas don’t always translate into good working regulation. Net neutrality is a good thing, but it is also a philosophy and an idea, and it should be a self-regulated policy with simple FCC oversight. In 2005, then-FCC Chairman Kevin Martin, commenting on net neutrality, said, “I remain confident that the marketplace will continue to ensure that these principles are maintained. I also am confident, therefore, that regulation is not, nor will be, required.” Martin was correct then, and remains so today. The free market is a sharp, double-edged sword that will maintain net neutrality without the meddling hands of government and the detriment it would bring.
Nick Brown is a technology policy analyst. He has spent time with The Heritage Foundation and the Competitive Enterprise Institute, and has been a Google Fellow. He currently works with Digital Society.