Data-Driven Modernization of E-rate for Wi-Fi in Schools

By: Sarah Oh

Around the time of the E-rate program’s 2014 modernization order, I published an empirical study[1] on the distribution effects of the program’s rules between 1998 and 2012.  I found that the old rules, particularly the discount rate matrix, had distribution effects that perhaps needed reform.  Since the Universal Service Administrative Company (USAC) runs the E-rate program according to administrative rules, it was natural to ask whether those rules were producing particular outcomes.

My data analysis showed that per-student and per-school estimates of cumulative internal connection funds were higher in New York, California, and Texas than in the other 48 jurisdictions (the remaining 47 states plus Washington, D.C.).  I chose to study “internal connection” funds, which support payments for new Wi-Fi equipment.  I found that large school districts in major cities benefited more than smaller districts in suburban, town, and rural areas.

For instance, funds to New York, California, and Texas recipients amounted to an estimated $826 per student enrolled in the National School Lunch Program (NSLP).  In the other 48 jurisdictions, students in the same lunch program received less than half of that, an estimated $302 per student.  The rules created disparities at the per-school level as well: in New York, California, and Texas, schools received an average of $251,399 in cumulative funds, while schools in the other 48 jurisdictions received an average of $75,340 each.  These three states enroll approximately 14 million children in over 27,000 schools each year, fewer than half of the 38 million children in over 91,000 schools in the rest of the country.
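To make the arithmetic behind these comparisons concrete, the sketch below shows one way such per-student and per-school estimates can be computed from state-level totals.  It is a minimal illustration in Python, not the study’s actual code: the table, column names, and dollar figures are hypothetical placeholders, and only the grouping logic matters.

    # Minimal sketch (hypothetical data, not the study's dataset): given state-level
    # totals of cumulative E-rate internal connection disbursements, NSLP enrollment,
    # and school counts, compare per-student and per-school averages for NY/CA/TX
    # against the other 48 jurisdictions.
    import pandas as pd

    states = pd.DataFrame({
        "state": ["NY", "CA", "TX", "OH", "DC"],          # ...one row per jurisdiction
        "ic_funds": [1.9e9, 1.6e9, 1.1e9, 2.0e8, 5.0e7],  # cumulative internal connection $
        "nslp_students": [1.7e6, 3.4e6, 2.6e6, 6.0e5, 5.0e4],
        "schools": [4800, 10300, 8700, 3600, 230],
    })

    # Split jurisdictions into the two groups compared in the text.
    states["group"] = states["state"].apply(
        lambda s: "NY/CA/TX" if s in {"NY", "CA", "TX"} else "Other 48"
    )

    summary = states.groupby("group").agg(
        ic_funds=("ic_funds", "sum"),
        nslp_students=("nslp_students", "sum"),
        schools=("schools", "sum"),
    )
    summary["per_nslp_student"] = summary["ic_funds"] / summary["nslp_students"]
    summary["per_school"] = summary["ic_funds"] / summary["schools"]
    print(summary[["per_nslp_student", "per_school"]].round(0))

With actual USAC disbursement and enrollment data in place of the placeholder rows, the same grouping yields the kind of per-student and per-school comparison described above.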

There is greater need in New York, California, and Texas than in the other 48 jurisdictions.  For those familiar with the E-rate rules, an average discount rate of 80 in those states indicates a higher level of need than the average of 76 in the rest of the country.  However, New York students, with a discount rate of 77, reaped far more funds than students in the other 48 jurisdictions, whose discount rate of 76 indicates only slightly less need.  For every pupil enrolled in the school lunch program in New York, an estimated $1,285 has been spent, compared with an estimated $302 per student enrolled in the same program in the other 48 jurisdictions.

Funding discrepancies do not disappear simply by increasing E-rate funds.  The rules created distribution effects even though the discount matrix carefully incorporated school lunch program demographics and urban and rural locations.  Perhaps administrative resources at the school district level contributed to these outcomes.  The New York City Department of Education applied for and received $1.7 billion in E-rate internal connection funds over fifteen years; Los Angeles Unified School District, $738 million; San Diego Unified, $114 million; and the Dallas, Houston, and Laredo Independent School Districts, $145 million, $141 million, and $89 million, respectively.  School districts with larger operations perhaps benefited from greater administrative know-how and organizational resources to navigate a complicated process.  Perhaps smaller school districts in other states are limited by smaller economies of scale.  Might an intermediary be created to help these smaller schools?

The FCC’s data-driven modernization order of December 2014 will improve broadband connectivity in schools and libraries.  The FCC’s recent annual target of distributing $1 billion for Wi-Fi internal connections infrastructure will help connect more schools and school districts around the country.  Continued scrutiny could increase the effectiveness of Universal Service funds by making sure they reach smaller school districts as well.

Sarah Oh is a graduate student at George Mason University, where she studies economics.   

The opinions expressed in this piece are those of the author and may not necessarily represent the view of the Aspen Institute.

[1] http://www.sciencedirect.com/science/article/pii/S030859611400113X

Expanding the Telecommons: TV Neutrality After Net Neutrality

By:  Farid Ben Amor

Some commons, like roads and parks, deal in scarce tangible resources.  Cultural commons, however, deal in information scarcity and the national Zeitgeist.  In defense of a commons that uniquely straddles these two realms, many Americans vigorously embraced the Internet in 2014-15.  That embrace was brash, filled with passionate reactions supporting net neutrality (including unfortunate friendly fire like “dingo” namecalling), but understandable for a significant public and cultural resource thought to be at risk.  As we know, the FCC followed suit with a fundamental change rather than an incremental one, reclassifying Internet service providers as common carriers.  This distinction codified a business-model separation between the transmission and content provider elements almost a decade after much of the market did, when the most popular ISP-cum-content provider, AOL, tore down the remnants of its walled garden to compete on the open Internet.  Similarly, another commons, TV, which has recently come under threat of commodification from attempts to wall up gardens and from increased M&A activity, will be pried open to competition by two recent FCC proposals.

As with the Internet, many Americans have embraced the liberation of TV content from its platform shackles, beginning coincidentally about a decade ago with the launch of YouTube, Netflix’s online video service, and Hulu.  These services expanded access to popular programming, but consumers still cannot use them to watch top shows as they premiere.  Ten years later, we still need a traditional TV subscription to watch day-and-date primetime shows (like my favorite: the new X-Files) or major live events (for me, the Super Bowl… halftime show).  Following suit again, Chairman Wheeler and his office proposed two rulemakings to grow a neutral TV marketplace, making it easier for new online content providers to offer top broadcast and cable shows.  These rules would 1) reclassify facilities-based pay TV services (known as MVPDs) to include qualifying Internet video services, and 2) reform MVPD set top box requirements to switch from the current hardware-based security to IP-based security.  If enacted, these rules would not only preserve the TV commons (otherwise at risk due to consolidation) but also permit unlimited additions to it through competitive and innovative video platforms.

The first, MVPD reclassification, significantly reduces the commercial and practical barriers that limit online video services from carrying major network programming.  Here’s why: applying the same regulatory status to online TV platforms would enable them to a) choose to carry popular cable networks on commercially feasible terms, with the help of the program access rules; b) choose to carry broadcast networks under the long-established retransmission consent regime, subject to the Copyright Office acceding in kind (which it has indicated it would likely do); and c) neuter anticompetitive restrictions in existing affiliate agreements, which MVPDs routinely demand as a condition of network carriage and lock in for many years at a time.  These changes would effectively render the TV delivery platform a technology-neutral commons, freeing popular shows to appear on many more products.  And it’s important to note that these three benefits serve the network programmers as well, who suffer from the monopsony effect of having limited buyers to negotiate among, which further strengthens the similarity between TV neutrality and net neutrality.

The second, set top box reform, would open up choices not just for physical set top box devices but for virtual ones as well, regardless of which MVPD is provisioning the programming.  This means the technology in CableCARD, the current government standard for securing content, would be replaced with an updated security standard that can go over-the-top, thereby also liberating content in a different way: from the set top box itself.  This would open the TV guide to new hardware and software entrants competing in an otherwise oligopolistic MVPD set top box marketplace, without disturbing the MVPDs’ content agreements.  Naturally, the incumbents are sweating about this, afraid that tech companies such as Apple would get an easy “in” to their content agreements.  But competition can also benefit the MVPDs by steering their energies for this content navigation layer away from protectionism and toward innovation to retain their customers.  And for them, this separates the platform (the guide interface) from the content.  The platform market will transition into a “replaceable parts” system, which adds more choice to the TV commons.

Even though the tech sector often instinctively views government intervention with skepticism, as a drag on disruption, it should back these FCC rules.  They are field-leveling exercises to correct for the market failures that have permitted facilities-based incumbents to fortify against competition.  Indeed, the “virtue” of self-interest will always motivate this anti-competitive behavior in incumbents unless government prevents them from blocking new entrants and incentivizes innovation instead.  History proves this.  In 1992, the US government enabled satellite TV to compete successfully against cable TV using the same MVPD “fair playing field” expansion under consideration now.  Competition was also the emphasis of the 1996 Telecom Act, in which Congress gave a clear directive to “let anyone enter any communications business.”  Given the resultant expansion of TV in the late 90s and 2000s, we know that such regulatory neutrality only grows our telecommons, advancing our society toward digital equity and decreased censorship.  The Fairness Doctrine may have been a necessity when there were only a few choices for programming, but not when there are many.  Truly, more regulation ensuring fair competition at the platform level encourages deregulation of the content flowing down the pipe.

In other words, the more the TV commons are expanded, the less need there is for content regulation.  These two FCC rulemakings would accomplish just that.  Moreover, in the long run, the first proposal (MVPD reclassification) may even obviate the need for set top box reform.  But programming deals run long and can take time to negotiate, so the second proposal (set top box reform) is still compelling in that it enables consumers to sample competitive TV services easily in the short term.  In the best case, passage of both set top box reform and MVPD reclassification would let competition develop rapidly: programming rights would not slow down new TV guide competitors, while online video services efficiently transform the TV experience from the ground up.  Having both rules would help rapidly reduce costs by increasing competition among both MVPDs and set top box manufacturers.  Enacting these two FCC rulemakings together ensures neutrality of the TV commons for Americans at its two greatest chokepoints today, enhancing commercial opportunities for businesses and innovative products for consumers, and setting the stage for the development of TV into the Internet age.

Farid Ben Amor is a graduate student at the University of Southern California, where he researches media and telecommunication policy, and is director of business development at Pluto TV.  His views do not necessarily represent the views of his employer; however, Pluto filed comments with the FCC in favor of MVPD reclassification as discussed in this piece.

The opinions expressed in this piece are those of the author and may not necessarily represent the view of the Aspen Institute.

The Next Act

Article originally posted on Inside Sources. Written by Blair Levin.

Telecommunications policy raises numerous controversial issues, but two debates underlie all others: how is the United States doing in international rankings, and should Congress rewrite the current communications law? The interplay between these debates raises two questions: one ironic, the other serious.

The international debate features one side suggesting we are falling behind, while the other side, composed of telephone and cable companies and supportive policy experts, argues we’re doing great.  Advocates for the second view are also leading the charge for new legislation.  The ironic question is: if we are doing so well, why change?

I’m not accusing them of hypocrisy; we could be doing great and still need a new bill.  But it does raise the more serious question: what is the problem the legislation is meant to resolve?

The general answer is that we should modernize the law.  While not a bad outcome, it is hardly compelling, particularly as a legislative process tends to freeze investment in the sector.  Capital flows to lobbyists, not product innovation.  The question advocates of new legislation should be required to answer is this: where do we want capital to flow in telecommunications where it is not flowing now?

My own answer comes from the United States National Broadband Plan, which boils down to four strategies for improving broadband performance:

  1. Drive fiber deeper into the network;
  2. Use spectrum more efficiently;
  3. Get everyone on broadband; and
  4. Use the broadband platform to improve government performance.

As to fiber, there are signs of progress, including recent announcements by Google and AT&T about increasing their footprint of world-leading fiber networks.  These announcements are largely driven by changes in local policy, and it is not clear that federal legislation is required or even helpful.  With spectrum, Congress already authorized the FCC to reallocate spectrum more efficiently and the executive branch is engaged in similar efforts.  Notably, no advocates for new legislation suggest we are suffering from underinvestment in the wired or wireless sectors.

We have seen less progress, however, with the third and fourth strategies.  While some private efforts, such as Comcast Internet Essentials and the Google Fiber low-cost offerings, are praiseworthy, such private actions are unlikely to solve what is a public problem: tens of millions of Americans unconnected to, and ill-versed in working with, the core platform for economic and civic engagement in the 21st Century.

Moreover, government services are still, by and large, an analog enterprise in a digital world.  The problems of healthcare.gov revealed part of the problem, but the costs and opportunities are more extensive than the performance of a single website.  Several years ago, a coalition of high-tech CEOs proposed how the federal government could save over $1 trillion over ten years by using best practices, particularly in information technology, to improve productivity.  The Obama Administration has moved forward with several laudable efforts, such as open data and a “New Management Agenda,” but both lack the scale and political capital to provide the magnitude of change we need.

How to move forward?  Congress should authorize a Commission to draft a plan for how to use the tools of 21st Century technology to improve government operations. The Commission would be composed of executives with a record of managing similar technology transitions.  Congress should provide the Commission a broad mandate, including recommendations for benchmarking and improving interactions between the public and government agencies, moving all government services to the more efficient digital platform, improving economic growth through expedited processing and open data, lowering long-term costs through accelerated technology upgrades, and improving the supply chain and procurement process.  Some of the savings should be dedicated to a focused effort to overcome the lack of “digital readiness” that is emerging as the primary barrier keeping people off the Internet.  Further, as services move to the digital platform, the value of using it grows, causing more people to embrace its use.

The Commission should be structured like the base-closing commission: Congress would have a specified window to debate its recommendations and would be required to approve or reject them in a single up-or-down vote.

If such legislation were to move forward, capital would increase its flow toward next-generation government services.  Americans are proud of our role in developing technologies that have revolutionized how people everywhere communicate, exchange information, and improve their lives.  We should aspire to be just as proud of how our government uses that technology to improve how it performs. Instead of working on a bill with an ill-defined upside, Congress would better serve the country by looking at how to simultaneously prepare the rest of our country and our government services for the information-based 21st Century economy.

NBP@4: Surprises, Lessons, and Still in Beta

By: Blair Levin, Former Executive Director, National Broadband Plan
Information Technology and Innovation Foundation Forum
March 19, 2014

Every time there’s broadband related news—a court decision, a merger, a hearing—there’s a spate of articles in which the people who thought the U.S. was doing badly five years ago say we still are and the people who five years ago said we were doing great say we still are.

A debate where everyone recycles five-year old sound bites is not a recipe for progress.

Moreover, those debates about international rankings all reflect decisions made long ago.  What we need to focus on are the decisions we should make today to lead in the future.

Turns out, in our plan, and in every national broadband plan around the world, there are four strategies that dominate:

  1. Driving fiber deeper;
  2. Using spectrum more efficiently;
  3. Getting everyone online; and
  4. Using the platform to improve delivery of public goods.

So the right question is: are we improving in executing on those four strategies?

I’m a bit surprised, but delighted, that on 1 and 3 we are doing better.  Interestingly, though, it’s not due to federal government efforts but largely to private, non-profit, and local government efforts.  It may well be that on the 10th anniversary, the greatest impact of the plan will turn out to be how the process seeded the Google Fiber and Comcast Internet Essentials efforts.

On the fiber side, Google Fiber has not just sparked activities in nearly 40 communities; it has also caused pro-consumer competitive responses and inspired the efforts of others, including AT&T, CenturyLink, C Spire, and Gig.U.  Moreover, Google has led to a learning exercise about how municipalities can improve the economics of fiber deployment.  As a result, many, including AT&T CEO Randall Stephenson, have discerned a sea change in municipal reactions to efforts to upgrade networks.  As he noted, “(c)ities and municipalities are beginning to hold up their hands and say we would like you come in and invest. And they’re actually beginning to accommodate and tailor terms and conditions that makes it feasible and attractive for us to invest.”

At the same time, and I believe relatedly, an NBP recommendation (8.19) that I believed was important but thought had little political support has become much more popular: that the federal government should act against state laws that put up barriers to municipal broadband.  Prior to the plan, a number of states enacted such laws.  Since then, the tide has turned: recent efforts to pass such laws have failed, the FCC has become more engaged, and there is even an effort in Tennessee to reverse previously enacted restrictions.  Cities have become both more aggressive and more sophisticated about advancing their own bandwidth destiny.  Indeed, the way in which cities have become the principal government jurisdiction delivering on the promise of the national broadband plan has caused the biggest change in my own thinking.

In short, what looked like a fiber-deprived desert a few years ago is starting to sprout.  We are far from finished, but one can be optimistic that in a few years we might have a gigabit garden.  And with Chairman Wheeler adroitly jumpstarting the Plan’s call for an IP Transition (something that, in contrast with the municipal issue, I thought would move much faster), perhaps federal policy may align to incent bandwidth abundance, which holds promise for addressing a number of policy concerns.

On the adoption side, Comcast is doing a terrific job and deserves praise both for increasing its program’s momentum and for making it permanent.  One reason they succeed is that, like Google, they constantly learn and improve.  One of the folks they are learning from is the smartest researcher in the space, John Horrigan.  I’m delighted he is here to raise some important and previously overlooked questions about digital readiness.

One interesting question we will not know the answer to for a while is which approach will do the most to address the adoption problem: Comcast’s, which focuses on special offers for low-adoption communities, or Google’s, which creates a broad low-cost option of a one-time $300 fee for seven years of service.

Some of my friends on the left criticize Comcast’s efforts.  Some of my friends on the left criticize Google’s.  I think they are both wrong.  (I also have friends on the right who I think are wrong, but on different subjects.  And almost all my friends often think I am wrong.  They are often right.) I admire what both companies are doing: driving faster, better, cheaper broadband in ways that are sustainable.  Having two different models is great.  We don’t have to choose one, so let’s not.  Let’s learn from both.

My criticism would focus on what the government is not doing.  I gave a speech on this topic on the plan’s first anniversary.  I won’t repeat it here, except to summarize by saying the government has many tools to improve how it delivers service to low adoption communities in ways that will also provide incentives to use broadband. This, I still believe, is fertile ground for further work.

The success of Google Fiber and Comcast in improving our country’s performance is a far greater tribute to them than it is to the plan.  In my book, those who allocate capital, whether financial, political, or other types, to address a problem deserve the praise, and both companies have done so.  It is worth noting, however, that while the thrust of the planning effort was to make recommendations for changes in federal policy, there is significant value to planning efforts in simply sparking the right dialogue, and that federal planning efforts should be open to non-federal government solutions.

Other areas, however, can only be addressed by federal government action. One such area is spectrum.  Chapter 5, on Spectrum, has to be seen as a great success.  Not only did it correctly identify the need for diversity in allocations, with both licensed and unlicensed, it directly sparked a number of actions, including:

  • The President’s Executive memorandum setting the 500 MHz goal, which caused NTIA to look for more spectrum and led to the 1695-1710 MHz, 1755-1780 MHz, and 3.5 GHz bands being on the table;
  • The only communications legislation passed in a recent Congress, authorizing the incentive auction (and subsequent FCC rulemakings) as well as directing an auction of certain bands identified in the NBP;
  • Liberalization of MSS spectrum (S-band/AWS-4);
  • Improvement of WCS spectrum;
  • The current discussion of using 5 GHz for unlicensed; and
  • The discussion and possibility of action for a national TV White Spaces footprint in post-incentive auction guard bands.

There are a number of people who deserve praise for moving this agenda forward, including then-Wireless Bureau Chief Ruth Milkman and her deputy John Leibovitz, who both did the lion’s share of the work in writing and then implementing the chapter, and Larry Strickling and his team at NTIA.  I also think Commissioner Clyburn, while she was Interim Chair, did a great job untying the Gordian Knot on a couple of big issues, such as the 700 MHz interoperability order.

Of course, we do not know the outcome of the incentive auction.  I was personally glad to see Chairman Wheeler quickly and publicly state what everyone privately knew: that the previously announced date of 2014 for the auction was not going to happen.  He was right to prioritize getting it done right over getting it done fast.

I also think that while we should aspire to recapture as much spectrum as the market will justify, that number is likely to be lower than the 120 MHz identified in the Plan.  The final number will be a function of many factors, but primarily market forces.  In my view any number above 60 MHz will be a significant improvement over a spectrum future that would never, as far as we could project in 2009, have a significant portion of lower band spectrum come on the market in a coordinated fashion.

The fourth bucket, improving the delivery of public goods and services, I will leave to other panelists, as their expertise is greater than mine.  I am sorry that the change in date meant we do not have an opportunity to hear from Dr. Mohit Kaushal, who not only did a great job writing the chapter on health care and broadband but also, along with others on his incredibly talented team, is out in the market, taking the vision and executing on the opportunities to use broadband to improve the delivery of healthcare.

I am delighted to be together again with others who delivered great work with the plan and continue to do so in their post-plan work: John, whom I previously mentioned; Rear Admiral Jamie Barnett, who laid the groundwork for the long-overdue FirstNet project; Steve Midgley, who went on to write the Department of Education’s Tech Plan and who in many ways helped catalyze the activities that led to the FCC and others understanding that the E-Rate program required an upgrade; and Nick Sinai, who implemented his own Green Button energy data recommendation when he moved to the Office of Science and Technology Policy.

This leads me to a final thought.  As I have seen here and in other countries where I have been consulted on broadband plans, in the long run, the execution of a plan is more important than the plan itself.  Good execution can correct for any errors in the plan.  A great plan with lousy execution will ultimately fail.  As we said in the implementation chapter, “this plan is in beta and always will be.”  Good execution requires readjustment when facts change.  And despite what others might think, I think that in the five years since we started the plan, a lot has changed.

I have previously expressed disappointment with how some policy leaders here viewed the ideas in the Plan through the lens of one-day sound bites instead of long-term policy, so I am particularly grateful that ITIF is holding this event.   It enables us to have a candid discussion of how we are executing and what we can learn, so that, like Google or Comcast, our country can be stronger tomorrow than it is today.

Thank you.

The National Broadband Plan Four Years Later

By: Blair Levin, Fellow, Communications & Society Program, The Aspen Institute

One of the lessons of “Big History” is that the great advantage the human species has over other species is inter-generational “collective learning.”  A turtle today knows what the turtle knew 200,000 years ago.  Humans know something more than we did then because what we learn in a lifetime is shared, and not lost, when an individual dies.  And as a species, we are driven to test that knowledge, reject it when it proves inadequate, and improve on what we know.

It turns out that the same is true for countries and National Broadband Plans.  While many countries have such plans, the ones that thrive the most are those that continue to learn, share that learning and improve.  Some, like Singapore, where I was last month, do it well.  Others, like Myanmar and Ethiopia, where I also have met with government officials, well, not so much.

In an effort to be a bit more like Singapore and less like some others, a group of National Broadband Plan Alumni will be gathering at the Information Technology and Innovation Foundation on Wednesday, March 19 at 12:00 p.m. ET to discuss what we have learned in the four years since we published the plan.  Some results are counter-intuitive.  In a Plan with 200 recommendations for government officials, perhaps our two most important initiatives to emerge from the planning process are entirely private sector driven.  Others are surprising.

The recommendation that I thought had the least political support now has significant political momentum, while the one with the most political support was stalled for nearly four years.  Other lessons are more expected.  The chapter that has had the most success in changing federal policy is the one where the federal government held the most levers.

To find out what I mean, and more important, to find out what some of my colleagues think about lessons from the Plan and what is the agenda ahead for what should be thought of as the government’s IP transition, please join us Wednesday. Details and registration at http://www.itif.org/events/national-broadband-plan-four-years-later.