Net Neutrality and the Digital Humanities

    Geoffrey Sauer
    January 15, 2014 

    There's been a great deal of discussion in the past 24 hours about yesterday's decision from the U.S. Court of Appeals for the District of Columbia, which will prevent the FCC from enforcing "net neutrality" policies that have been in place since 2010. I've appreciated the coverage in stories like NPR's "Feds Can't Enforce Net Neutrality: What This Means For You", and found substantive articles from regular technology websites like CNET's "Why you should care about Net neutrality" and Ars Technica's "Net neutrality is half-dead: Court strikes down FCC's anti-blocking rules". The Los Angeles Times was more direct with its story: "Net neutrality is dead. Bow to Comcast and Verizon, your overlords". I've been quite pleased at how prominently coverage of these matters has appeared in my social media feeds, which suggests that these issues may be more visible today than in the past (though it may just be, à la Eli Pariser, that my "filter bubbles" make me see more of these stories than others do).

    But while the coverage has been accurate in its analysis of the significance of the policies, I haven't read a good explanation, yesterday or today, of why the telecom companies wished to repeal net neutrality, or any discussion from academics about what impact this ruling will have on multimodal digital humanities scholarship. I am concerned that thinking in Manichean "evil corporations" versus "wise government" models may not be very useful in resolving this situation; those models don't tend to persuade people to change their convictions, I've found, and Peter Sloterdijk helped me to realize the ineffectiveness of that sort of critique when I read his work in the early 1990s.

    Public policy issues surrounding Internet publishing are my area of interest, so I thought I'd spend a bit of time representing my concerns here, to help EServer users understand what I see as the motivations for, cultural issues surrounding, and likely results of this ruling, especially for those of us in the digital humanities.

    In the early 1990s, Internet connections were "symmetrical." This meant that if you had a 10 megabit per second connection, it could be used for uploading, downloading, or any ratio of uploading to downloading. A server would often spend more than 90% of its bandwidth uploading (sending content to remote computers online); a web browsing computer would spend most of its bandwidth downloading (receiving content from remote servers). But any machine could send or receive content, based on the speed of the connection. If you needed a faster connection, you paid for it. This meant that serving multimodal content, such as streaming video, was limited by the costs of the bandwidth. There wasn't much revenue in streaming video until 2005, so there were a limited number of sites which wanted to stream video to Internet users. (I might point out that the EServer's Lectures on Demand site, founded in 1996, was one of the early sites to offer this.)
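    To make that cost asymmetry concrete, here's a rough back-of-envelope sketch. The page size and video bitrate are illustrative assumptions on my part, not figures from this post, but they show why serving streaming video dwarfs serving the text-heavy pages of the early web:

```python
# Rough, illustrative comparison (assumed figures): the bandwidth needed
# to serve one hour of streaming video vs. small static HTML pages.

HTML_PAGE_KB = 50            # a small, mid-1990s-style HTML page (assumption)
VIDEO_BITRATE_MBPS = 2       # a modest standard-definition stream (assumption)

def video_megabytes(hours, bitrate_mbps=VIDEO_BITRATE_MBPS):
    """Megabytes transferred for `hours` of streaming at a constant bitrate."""
    return bitrate_mbps / 8 * 3600 * hours   # Mbps -> MB/s, times seconds

one_hour_mb = video_megabytes(1)
pages_equivalent = one_hour_mb * 1024 / HTML_PAGE_KB

print(f"One hour of video is about {one_hour_mb:.0f} MB,")
print(f"the same bandwidth as roughly {pages_equivalent:,.0f} small HTML pages.")
```

    At these assumed figures, a single hour of video costs a server as much outbound bandwidth as tens of thousands of ordinary page loads, which is why, before advertising and subscription revenue arrived, few sites could afford to stream.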

    There didn't seem to be much need to worry about high-bandwidth content on the Internet. Any server which wanted to serve such content would need to pay for the bandwidth to send it, and the costs of that would seem to be prohibitive.

    So broadband service providers (cable and telephone companies building network service to homes and apartments) began to develop shared and asymmetrical network connections, with much higher speeds for downloading content from the Internet than for uploading it. Neighborhoods increasingly shared switching systems (the problem of "the last mile" became a serious one for telecommunication service providers). If you paid a significant monthly fee to a broadband service and only loaded occasional web pages (but wanted them to load quite quickly), it might seem unfair if your connection were slow because your next-door neighbors were downloading large amounts of video. Interstream has a nice illustration of this problem in one of their articles about the net neutrality issue. This was an even greater issue with wireless (cellular) data services, where high-bandwidth consumers were enormously more expensive to serve than typical (low-bandwidth) users. Cell phone companies eventually phased out "unlimited wireless data" plans and moved to monthly data limits to prevent some people from dominating wireless bandwidth capacity, though this alienated and irritated some educated consumers (who tended to be the people consuming large amounts of wireless bandwidth). And broadband providers in the U.S. met with strong, articulate resistance from consumers when they attempted to introduce similar limits in the mid-2000s.
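    The "unfair neighbor" problem above can be sketched as a toy model of contended last-mile bandwidth. The link capacity and household count are hypothetical numbers chosen for illustration, and real provisioning is far more complicated, but the shape of the problem is the same:

```python
# Toy model of a shared ("contended") last-mile link: the neighborhood's
# capacity is split evenly among whoever is actively pulling data.
# All figures here are illustrative assumptions.

LINK_MBPS = 100          # assumed shared neighborhood link capacity
HOUSEHOLDS = 20          # assumed households sharing that link

def per_household_mbps(active_streamers):
    """Bandwidth available to each active household on the shared link."""
    active = max(active_streamers, 1)   # avoid division by zero on a quiet link
    return LINK_MBPS / active

print(per_household_mbps(2))    # a quiet evening: plenty for everyone
print(per_household_mbps(20))   # every household streaming video at once
```

    In this simple model, the occasional web browser gets a fast connection only when few neighbors are streaming; when everyone streams, each household's share collapses, which is the economic pressure behind data caps and, ultimately, behind the providers' interest in charging high-bandwidth sites.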

    By 2005-06, a few companies had developed ideas to generate revenue from streaming video. Google/YouTube encouraged users to upload video content and generated income from advertising. Netflix and Hulu Plus served media to subscribers who paid approximately $8 per month for access to streaming media. Numerous other companies, such as Amazon and Apple, began to follow this model. By 2013, it wasn't difficult to find stories suggesting that up to 50% of network traffic in the U.S. was being consumed by YouTube and Netflix streaming (see, for example, reporting on Forbes and CNET).

    Broadband providers selling data services to users at home were then confronted with a problem: the users who viewed YouTube and Netflix were much more expensive to serve than users who didn't. And because American consumers were already paying more for bandwidth than consumers in many other industrial nations, the idea of generating revenue from the sites "causing the problem" (i.e., Google/YouTube and Netflix) seemed to have some merit. So telecommunications companies began to lobby, as early as 2009-2010, to have the traditional "net neutrality" policies mandated by the FCC removed or reduced, so that they could either throttle websites that sent huge amounts of data across their networks, or block some content unless the companies providing high-bandwidth services reimbursed the telecom providers. This simply made sense to the telecom providers (no matter how terrible unlimited corporate ability to censor sites of their choosing seemed to those of us in academia).

    The FCC was concerned about establishing a system in which service providers could limit access to some sites. What if, for example, Comcast, a popular cable modem service provider in some regions of the country (which also owns NBC), elected to block or slow service to websites providing access to streaming video from rival television networks ABC or CBS? Or favored/partnered with Amazon, Netflix, or Hulu, to the detriment of the others? Comcast has been caught in the past throttling download speeds from services such as BitTorrent (see the EFF report from 2007). The new ruling will permit telecommunications service providers to limit or block bandwidth for any reason whatsoever.

    It's true that the U.S. Court of Appeals for the District of Columbia has been considered one of the more conservative jurisdictions, at least until the recent changes to Senate filibuster rules finally permitted the U.S. Senate to begin confirming judges for appellate courts again. But the decision isn't solely about corporate freedom from government. In 2010, the FCC decided to argue a diluted form of a "common carrier" argument; it suggested, in the case FCC vs. Independent Telephone & Telecommunications Alliance, that telecommunications providers (both wired and wireless) should be treated as something like common carriers. Under this theory, telecommunication providers were exempt from responsibility for some traffic carried across their networks (illegal traffic, for example) in exchange for being prohibited from blocking or limiting access to particular forms of media or specific providers. Telecommunications firms argued that this was "unfair," but did so fairly unconvincingly to popular audiences (look at the articles mentioned in the first paragraph, for example).

    But there was, at least, some legitimate justification for ISPs wishing a right to throttle, limit or block some sites.

    Don't get me wrong; I think the decision was a terrible one, with highly problematic implications for academics and digital humanities scholars. But the decision arose from a predictable legal theoretical position, and an actual problem faced by telecommunications providers—not simply from meanness. I believe that permitting telecommunications corporations to block, throttle, or limit access to particular websites selectively will be a terrible change, but not because they're "villains"; it's because the Web shouldn't have such structures of power.

    The problem is that if ISPs now seek to optimize their bandwidth for consumers, they'll begin to measure how much bandwidth comes from various websites. And the move toward teaching multimodal content production in English departments means that we're not all publishing low-bandwidth content, like, for example, the edition of Aphra Behn's The Rover I published on the EServer in 1996. Instead we may be publishing more work like the streaming video of Cheryl E. Ball's presentation "Evaluating and Assessing Digital Scholarship for Teaching and Research" we recorded at Iowa State in 2010.

    Or we may find that the 33,241 students at Iowa State University download enough material from our domain that commercial cable or telephone service providers (who provide the high-bandwidth Internet access for students, faculty and staff at home in Ames, Iowa) will deem our university's content high-bandwidth enough that we should be taxed, perhaps as much as Google/YouTube or Netflix.

    And we don't have the revenue models they do. Sites like the nonprofit website I run don't have much revenue at all. Sites like these have produced large amounts of content as a service to the public, or because of a commitment to open access, neither of which leads to much revenue with which to pay telecommunications firms not to limit their customers' access to our content. In 2013 the EServer served 5.33 terabytes of data to 27,093,365 readers (see my last post for details), but we didn't charge anyone for any of the content (nor do we charge scholars who want to publish material on our site; our income comes entirely from donations and grants). Scholarly journals which publish multimodal content, such as the rhetoric journal Kairos, might have to pay additional taxes to for-profit corporations, not just for the Internet access to upload our content to web readers, but to ask them not to throttle, block, or limit access to our content. This would pare the innovative, small, open-access projects so responsible for the modern internet's wealth of information down to only revenue-generating publishers. Significantly revenue-generating publishers.
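    A quick back-of-envelope calculation with the figures above (assuming binary units, i.e., 1 TB = 1024³ KB) shows just how modest the EServer's per-reader traffic is compared to streaming video:

```python
# Average data served per reader on the EServer in 2013, from the
# figures quoted above: 5.33 terabytes across 27,093,365 readers.

TOTAL_TB = 5.33
READERS = 27_093_365

total_kb = TOTAL_TB * 1024**3          # TB -> KB, binary units assumed
avg_kb_per_reader = total_kb / READERS

print(f"~{avg_kb_per_reader:.0f} KB per reader on average")   # ~211 KB
```

    A couple of hundred kilobytes per reader is a few seconds of standard-definition video; by any per-user measure, sites like ours are nothing like the "problem" traffic that motivated this lobbying, yet we could be swept up in the same toll structures.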

    I am a professor who does not speak for my university, and I write only from my own experience. But I believe that it is in our strong interest to press the FCC to enact guidelines which will prevent such dangers. This will probably require that the FCC persuade Congress to pass new legislation, which will not be an easy task (the telecommunications lobby has more and better lobbyists than I do, and the current FCC chair, Tom Wheeler, is a former cable and wireless industry lobbyist). But the dominant narrative I see in the popular press suggests that telecom corporations are villains, perhaps operating from scurrilous motives, and I would like readers who've made it this far in my post to understand that I believe that narrative will not help us persuade U.S. legislators to pass bills that would reinstate the FCC's "net neutrality" policy. Not when telecom corporations are regular contributors, and we seem to be political outsiders, naive about business exigencies.

    It is an important goal to reinstate net neutrality, if we don't wish to have digital humanities scholarship (which has made so many improvements in recent years) return to the mid-1990s publication of simple HTML editions of eighteenth-century drama. We love publishing that content, but we strongly prefer to build upon the freedoms that have enabled digital humanities scholars in the past decade to publish content in a variety of modes.


    Geoffrey Sauer is an associate professor at Iowa State University. He is also the director of the ISU Studio for New Media and the founding director of the nonprofit arts and humanities website He can be reached at

  • 1 Comment
  • I see an article on Ars Technica today about how Netflix performance on Verizon and Comcast has been dropping in recent months. This may derive from the issues discussed in my article above, either from problems the telecoms have providing sufficient bandwidth to their end users, or from mechanisms in their systems to limit bandwidth for high-demand websites (such as Netflix streaming). We'll see, in the weeks to come, which it is.

    If you want to read more, see: