The Good, The Bad, The Internet

The broadband communications industry is in a state of constant and rapid change.
With the rise of video streaming services such as YouTube, Netflix, and Hulu, the demand for high-speed, high-bandwidth service is at an all-time high. Additionally,
the internet has become an integral part of nearly every industry.
Unfortunately, as demand has increased, internet service providers in the
United States have not taken necessary steps to improve their infrastructure to
be able to provide the service that consumers currently want. The situation has
devolved into arguments and court settlements over who should be paying whom. Consumers are left with subpar service and no real options for change, since the major ISPs have effectively formed a shared monopoly in which no provider is meaningfully better than another and new competitors are kept out. The Federal
Communications Commission has failed to change or create regulations that would
improve the situation. While there are some bright spots of new companies and
municipalities working together to improve the infrastructure in their areas, prohibitive
costs are ensuring a continued lack of competition. Internet service providers
in the United States are not meeting consumer demand and are actively
preventing progress in the industry because of their refusal to upgrade their
infrastructure, anti-competitive actions, and employment of anti-consumer
policies.
The current state of affairs in the internet service industry has many contributing
factors, but has been largely shaped by the Telecommunications Act of 1996 and multiple
lawsuits against the Federal Communications Commission. The Telecommunications
Act of 1996 was the first major change to telecommunications law in the United
States since the Communications Act of 1934. The act was necessary to create
laws in response to the spread of the internet among the general public. While a primary purpose of the act was to ensure competition in the industry, reduced regulation of media company mergers led to a wave of acquisitions by a small number of large companies. This had the unintended effect of actually reducing competition in the market. The consolidation of smaller companies that began in the mid-1990s has continued and escalated to this day.
Recently, Comcast has announced plans for the acquisition of Time Warner Cable,
the second largest internet service provider behind Comcast, which would
effectively remove a massive competitor. Additionally, the deregulation was
intended to ease the entry of new companies into the industry. However, the few
companies that initially bought or merged with the smaller companies became so
large so quickly that no new company could possibly compete without billions of
dollars. The lack of regulation and vague language from the Federal
Communications Commission led internet service providers to push the boundaries even further, culminating in Comcast v. FCC in 2010. The case came about when the FCC attempted
to stop Comcast from deliberately blocking certain types of internet traffic
from Comcast customers. The FCC based its argument on provisions of the Telecommunications Act of 1996, but the D.C. Circuit decided that the FCC had
no legal jurisdiction to regulate Comcast’s network management. In response to
the ruling, the FCC attempted to gain jurisdiction with the Open Internet
Rules. Alexander Reicher gives an in-depth analysis of the case’s impact on how internet service providers are regulated and the failure of the Open Internet Rules. Reicher asserts that the FCC has failed to “adequately elaborate criteria for reasonable network management in the Open Internet Rules,” and in this failing “the FCC left the concept wide open to interpretation by future
litigants” (Reicher 750). Reicher concludes that “Reasonable network
management, in turn, should be defined first by whether or not the
discriminatory practice is technically necessary” (Reicher 762). Unfortunately,
the lack of strong action from the FCC has allowed internet service providers to continue restricting traffic in order to maximize profit without regard
for the actual quality of service they are being paid to provide.
Questionable business practices by the major internet service providers in the United
States have continued to this day. Since streaming services like Netflix and
YouTube have gained massive popularity, accusations of ISPs deliberately
limiting the speed and quality of streaming video have arisen again. These accusations surface when a customer notices that the quality or speed of their
service becomes inconsistent depending on what they are doing. Some have argued
that ISPs are limiting their service in a manner similar to Comcast’s actions
that led to the case against the FCC mentioned earlier. ISPs have consistently denied this; however, recent events cast doubt on their claims. In February of
2014, Netflix agreed to pay Comcast an undisclosed amount to ensure quality
streaming to Comcast users. This arrangement came about after Comcast refused
an offer from Netflix to install Netflix servers in their network in order to
ensure a consistent experience for users. Upon completion of the agreement,
performance for Comcast customers improved immediately, making it unlikely that the problems behind the complaints were anything other than artificial limitations put in place by Comcast. As Comcast has become the largest internet
service provider in the U.S., anything they do essentially sets a precedent
for the other large ISPs to follow. Last week, Verizon came to a similar
agreement with Netflix. The negative implications of these agreements are
immense. First, requiring content providers to pay ISPs for assurance that
their content can reach customers severely limits the possibility of new
content providers being able to afford to create competition for the
established services. Second, because of the deregulation brought about by the Telecommunications Act of 1996, it is possible for ISPs to give an artificial advantage to services owned by the same parent company, such as highly profitable subscription-based television content. Continuing to limit the usability of
online video streaming services serves to push users back to television, which
is often bundled by ISPs with internet service. This also alleviates pressure to make costly infrastructure improvements to meet demand. Some ISPs have responded to the increase in video streaming and downloading by consumers with the implementation of data caps and usage-based pricing. A data cap
essentially means that while a customer is paying for an advertised speed, she can only transfer a set amount of data each month before incurring additional charges. This type of pricing was common on mobile phone plans for years but has been met with heavy resistance
when companies have attempted to implement it in home internet connections. Jacob
Minne questions the necessity of this pricing method by discussing both the reasoning stated by ISPs and the clear counters to their arguments. Minne summarizes the two primary reasons given for implementing data caps as being “it’s unfair that the 99% should be subsidizing [the top 1% of data users’] usage” and “to prevent network congestion” (Minne 13). Minne argues that the first
point runs counter to the usage and profit figures released by the ISPs
themselves, noting Time Warner Cable’s 2008 statement that “TWC’s costs for data access dropped 12% while the number of subscribers climbed 10%” (Minne 13). This shows that as the cost of providing data access falls, the growing subscriber base increases profit without the need for
data caps. The second argument of preventing network congestion is more
complex. While it is true that data usage has risen to an extent that the
current infrastructure is strained during peak usage hours, there is no
evidence that data caps are beneficial in dealing with the congestion. Arguing against this point, Minne cites Comcast Senior Vice President Joe Waz, who said “that bandwidth caps do little to change that,” and a Comcast white paper which stated that the “cap does not address the issue of network congestion, which results from traffic levels that vary from minute to minute” (Minne 14). Minne goes on to
argue that the true reason behind data caps is monetary gain, a point also argued by Daniel Lyons, who elaborates on the anticompetitive effects of usage-based pricing. Lyons points out that Comcast “estimates that the amount of data required to replace its cable service with an Internet-based competitor would be 288 gigabytes each month, a figure suspiciously close to the 300-gigabyte monthly cap that the company is test-marketing” (Lyons 36). Arguments
made by ISPs based on bandwidth limitations serve only to distract from the
issue that demand is outgrowing the currently installed infrastructure.
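The arithmetic behind those cap figures helps show how directly the limits press on video use. The short sketch below (in Python) is only an illustration: the 300-gigabyte cap and the 288-gigabyte cord-cutting estimate are the figures cited from Lyons above, while the per-hour streaming rate and the overage charge are assumed example values, not rates published by any ISP.

# Rough arithmetic behind the 300 GB cap argument (illustrative).
# Assumptions: about 3 GB per hour of HD streaming and a $10 charge per
# extra 50 GB -- example values only. The 300 GB cap and the 288 GB
# cord-cutting estimate are the figures cited from Lyons above.
import math

CAP_GB = 300
GB_PER_HOUR_HD = 3         # assumed average HD streaming rate
OVERAGE_PER_50_GB = 10.00  # assumed overage charge

hours_in_cap = CAP_GB / GB_PER_HOUR_HD
print(f"HD streaming hours within a {CAP_GB} GB cap: {hours_in_cap:.0f}")
print(f"Per day over a 30-day month: {hours_in_cap / 30:.1f} hours")

# A household that replaces cable TV entirely (~288 GB) plus a modest 60 GB
# of other traffic would exceed the cap and owe an overage charge:
total_gb = 288 + 60
overage_gb = total_gb - CAP_GB
charge = math.ceil(overage_gb / 50) * OVERAGE_PER_50_GB
print(f"Estimated charge for {overage_gb} GB over the cap: ${charge:.2f}")

Under these assumed numbers, a household streaming a little over three hours of HD video per day reaches the cap before any other use, which reinforces the point that the limit targets cord-cutting rather than minute-to-minute congestion.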
In 2011, Google launched a new service called Google Fiber. Google Fiber provides
fiber optic broadband internet and television to users. The current widespread
copper wire standard in use across the U.S. is absolutely dwarfed in terms of
bandwidth and speed by fiber optic networks. Casimer and Carolyn DeCusatis
state it in simple terms: “the reason we are interested in optical fibers for
communication is because their available bandwidth far exceeds that of copper
wire” (DeCusatis 9). However, the cost involved in upgrading a copper network
to fiber optic is immense. This leads to an argument over who should be
handling the investment in infrastructure upgrades. Pierre Coucheney, Patrick
Maillé, and Bruno Tuffin examine this dispute, which has been driven by the increased demand for improved infrastructure. They describe the core argument of ISPs as being
“The underlying concern is that investment is made by ISPs but content
providers get an important part of the dividends” (Coucheney 1). The ISPs do
not feel that they should be solely responsible for the cost of upgrades when
the upgrades will directly contribute to an increase in profit for content
providers. Comcast has used this as justification for requiring payment from
Netflix to ensure quality service. On the other hand, Google has taken a
different approach by working with customers and individual cities to reduce
the cost of fiber optic installation. In cities where Google Fiber has been
introduced, the cost of the infrastructure upgrade has been incorporated into
the subscription price. Additionally, Google has worked with city governments
to speed the process of obtaining construction permits and to reduce the associated fees.
While this still leaves Google with a massive installation bill, they seem to
have a longer-term goal in mind than other ISPs in the U.S. While Comcast and Verizon attempt to make short-term gains by charging content providers, Google is working to improve the end product for the consumer. All of these companies are looking to maximize profit, but Google’s approach is to offer the best product rather than to see how many different charges can be attached to a product that does not meet demand. Compared to countries like South
Korea, Japan, and Sweden, the United States is lagging behind in terms of
internet connectivity directly because of the lack of widespread access to
fiber optic connections. The response to Google Fiber from the established ISPs
shows how the lack of competition in the market has allowed infrastructure upgrades to be put off indefinitely. While multiple ISPs immediately announced plans for their own fiber optic networks in response to Google Fiber, none have actually installed any new infrastructure. At best,
ISPs are offering fiber service to areas that have been built with fiber
connectivity already. Comcast demonstrated a one gigabit per second connection,
approximately 100 times faster than the average connection in the United
States, in 2011. Today, the fastest connection speed offered by Comcast is 105 Mbps, about one tenth of what they demonstrated three years ago. Similar capabilities
have been demonstrated by other ISPs but without real competition being able to
thrive in the industry, there is no pressure on ISPs to actually improve their
service. I recently spoke to a small business owner, Chad Uthe, who described the problems with the internet connection at his company’s offices. Uthe is the president of a small landscaping company and began experiencing problems with the connection on the business office computers. After contacting
his ISP, AT&T, he was informed that the physical copper connection was not
capable of reliably supporting the connection speed he was paying for.
AT&T’s technician recommended reducing the connection speed by 50% to
improve the quality. When asked how this situation was affecting the day-to-day
operations of the business, Uthe responded “obviously it is a problem. All of
the time we spend dealing with this is time we aren’t spending on customers”
(Uthe). I also asked Mr. Uthe if
switching to another internet service provider was an option to which he
responded that no other company had any lines to the building. However, he did
say “Comcast offered, but we would have to pay about $6,000 for a new line to
be installed” (Uthe). This illustrates not only how the lack of competition allows ISPs to continue offering subpar service, but also that the problem affects businesses as well as residential subscribers.
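To put the speed figures discussed above in perspective, the sketch below compares rough download times at those connection speeds. This is a minimal, illustrative calculation: the 1 Gbps and 105 Mbps figures come from the Comcast comparison earlier in this section, the roughly 10 Mbps average is implied by the “approximately 100 times faster” statement, and the 5 GB file size is an arbitrary example rather than a figure from any source.

# Rough download-time comparison at the speeds discussed above (illustrative).
# 1 Gbps: the 2011 Comcast demonstration; 105 Mbps: the fastest tier currently
# offered; ~10 Mbps: the approximate U.S. average implied by the "100 times
# faster" comparison. The 5 GB file size is an arbitrary example.

FILE_SIZE_GB = 5
file_size_megabits = FILE_SIZE_GB * 8 * 1000  # 1 gigabyte is roughly 8,000 megabits

speeds_mbps = {
    "Demonstrated fiber (1 Gbps)": 1000,
    "Fastest offered tier (105 Mbps)": 105,
    "Approximate U.S. average (~10 Mbps)": 10,
}

for label, mbps in speeds_mbps.items():
    minutes = file_size_megabits / mbps / 60
    print(f"{label}: about {minutes:.1f} minutes for a {FILE_SIZE_GB} GB download")

Under these assumptions, the same download that finishes in under a minute on the demonstrated fiber connection takes over an hour at the approximate national average, which is the gap the established providers have shown little urgency to close.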
Internet service providers argue that there is no incentive to change their practices. As described by Coucheney, Maillé, and Tuffin, ISPs are unwilling to invest in
the necessary changes if it will lead to profits for other companies. While
this argument stands on its own, it must be weighed against the fact that ISPs already charge subscription fees designed to cover infrastructure maintenance and improvements. In defense of their
questionable network traffic management practices, every argument offered by
ISPs has been refuted by their own later statements. Unfortunately, the most
powerful argument the internet service providers have is that there is no
reason to change. There is no new competition coming in and the established
providers have all acted with the same anti-competitive, anti-consumer
behaviors.
Beginning with actions immediately following the Telecommunications Act of 1996, internet
service providers in the United States have demonstrated consistently
anti-competitive and anti-consumer behavior that is damaging to the industry as
a whole. By taking advantage of deregulation that was meant to improve
competition in the market, a few ISPs have created a near monopoly on the
industry. Instead of making improvements to the quality of their service in
order to meet consumer demand, ISPs have turned to ridiculous price hikes and
unreasonable network management practices. Not only are internet service
providers failing to meet general consumer demand, they are also failing their
business subscribers. Their refusal to upgrade the existing inadequate copper
infrastructure has brought about a state of stagnation in the industry. New
technologies and capabilities are being pushed aside and blocked out in order
to maintain a lack of competition. Weak action from the Federal Communications
Commission has led to a near monopoly with little oversight. As long as
internet service providers in the United States continue their shortsighted policies, without oversight and without competition, consumers will be stuck with subpar service.
Works Cited
Agrawal, Govind P. Fiber-Optic Communication Systems. Wiley, 2010. Print. 15 April 2014.
Coucheney, Pierre, Patrick Maillé, and Bruno Tuffin. “Net Neutrality Debate: Impact of Competition among ISPs.” IEEE Transactions on Network and Service Management 10.4 (2013): 425-33. Print. 15 April 2014.
DeCusatis, Casimer, and Carolyn J. Sher DeCusatis. Fiber Optic Essentials. Burlington: Academic Press, 2006. Print. 15 April 2014.
Lyons, Daniel A. “Internet Policy’s Next Frontier: Usage-Based Broadband Pricing.” Federal Communications Law Journal 66.1 (2013): 1. Print. 15 April 2014.
Minne, Jacob. “Data Caps: How ISPs are Stunting the Growth of Online Video Distributors and What Regulators Can Do About It.” Federal Communications Law Journal 65.2 (2013): 233. Print. 15 April 2014.
Perren, Alisa. “Rethinking Distribution for the Future of Media Industry Studies.” Cinema Journal 52.3 (2013): 165-71. Print. 15 April 2014.
Reicher, Alexander. “Redefining Net Neutrality After Comcast v. FCC.” Berkeley Technology Law Journal 26.1 (2011): 733-63. Print. 15 April 2014.
Shaffer, Gwen, and Scott Jordan. “Classic Conditioning: The FCC’s Use of Merger Conditions to Advance Policy Goals.” Media, Culture & Society 35.3 (2013): 392-403. Print. 15 April 2014.
Uthe, Chad. Personal interview. 25 April 2014.
Reflection
I chose this topic because it is so important to technological development and daily life, yet the problems that are occurring seem to be neither well-known nor well-understood. I am very pleased with the topic. My goals were to bring attention to the negative consequences of current ISPs' actions and to address their arguments. I think that I met those goals; however, the large amount of background information needed and the overall complexity of the issue made the paper feel somewhat spread thin. I considered focusing entirely on the Telecommunications Act of 1996, as there is certainly enough there to write about. I decided against that because I wanted to show the current situation and bring more clarity to the issues. The most interesting aspect of this topic for me was the Telecommunications Act of 1996. It was an excellent experience to see how widespread an effect the act had on the industry despite looking good on paper. It shows how little understanding of the industry there was at the time, and the lack of change since then shows that poor understanding has continued. I collected a very large amount of information; however, it was difficult to find reliable sources for the immediate situation. In the short period between when I began research and when I was working on the final draft, there were at least three major industry events relevant to the topic. I learned a great deal about parsing sources to check for reliability and cross-referencing other sources to verify claims.