Modern Market Failures in News

In his stellar recent book, The Human Network, Matthew O. Jackson discusses reasons why the internet might erode news quality. First, the cost of starting an outlet to disseminate news is lower than ever. In many markets, lower entry barriers are good for consumers: they spur competition, resulting in higher quality and lower prices. In markets for information like news, however, there is a balancing force: it is costly for consumers to determine the quality of the information they receive. When there are many providers, the market for news becomes congested. Assessing the quality of different sources is hard and takes time, and that time and effort are wasted on low-quality or pernicious providers. Past research suggests people are not good at distinguishing fact from fiction on the internet.1
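
One stylized way to see the congestion cost (my own back-of-the-envelope illustration, not a model from the book): suppose a fraction q of outlets are reliable and vetting a randomly chosen outlet costs c. Sampling outlets at random until one proves reliable takes 1/q draws in expectation, so the expected cost of finding a trustworthy source is

\[ \mathbb{E}[\text{search cost}] = \frac{c}{q}. \]

If low entry barriers mostly add low-quality outlets, q falls and this cost rises even as the total number of outlets grows.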

A second problem is that social media and news aggregators reduce the payoff from investing in investigative journalism. High-quality reporting published on the internet may be immediately re-reported by other outlets and spread through social media feeds. To get basic, up-to-date information about current events, people do not need to read the original source. In the past, this was not the case. A hundred years ago, when a newspaper had a scoop, it took at least a day for other papers to disseminate the same information. Newspapers that produced an investigative piece had a big first-mover advantage, and the desire to be first encouraged healthy competition between papers. Consumers wanted to subscribe to the newspapers that gave them quality news before it went stale. Today there is little first-mover advantage to producing news. The speed of replication makes investigative journalism less lucrative, so less of it gets produced. The legwork is costly, and in the past few decades the benefit has often not been worth the cost, especially for local news.2
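
A minimal sketch of the first-mover logic (my notation, not the book's): suppose a scoop costs k to investigate and earns its publisher a revenue flow v per unit of time until rivals replicate it at time T, for a payoff of

\[ \pi = vT - k. \]

The story is worth producing only if T > k/v. When replication took a day or more, many stories cleared that bar; as the internet pushes T toward zero, fewer and fewer do.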

These two phenomena can combine to erode the quality of news. Creating high-quality news is hard, and pieces get replicated almost immediately; creating bad or fake news is easy; and people are not very good at distinguishing between the two, so the revenue gap between the two types of news is inefficiently small. A countervailing force is reputation. If people generally trust established news outlets more than less reputable ones, they may continue to patronize established outlets. But undermining the credibility of established outlets diminishes this reputational advantage.
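
To make the revenue-gap point concrete (a hypothetical illustration): let high-quality news cost k_H to produce and fake news cost k_L, with k_H much larger than k_L. If readers cannot tell the two apart, both earn roughly the same revenue R, and profits satisfy

\[ R - k_H < R - k_L, \]

pushing producers toward low-quality content. Reputation counteracts this by letting established outlets command a higher R than unknown ones; undermining that trust compresses the gap again.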

Good interventions to address these problems are not obvious. Increasing barriers to spreading information could be very costly if it limits access to legitimate knowledge. While diversity of information sources can congest the market for news, it also helps people learn, spurring innovation and making people better off in its own right. Technology platforms have grappled with the pitfalls of restricting information exposure in recent years: the algorithms social media companies develop to surface items on newsfeeds and filter out “bad” information are criticized both for censoring views and for exerting excessive, unaccountable, black-box control over what people see.

Similarly, encouraging the socially optimal level of investigative journalism is challenging, and the problem resists the frameworks used to incentivize R&D investment for many other types of goods. To encourage costly investment in research and development, countries around the world rely on, for example, patent laws, which prevent people from immediately copying inventions that were expensive to develop. Instituting patent or copyright protections for investigative journalism seems far less tenable. First, such protections may be impossible to enforce: lawsuits against everyone tweeting about the contents of an article would be socially wasteful and would restrict civil liberties. Further, there is compelling evidence that patent laws fail to boost innovation in the first place. Providing subsidies or prizes for investigative journalism is also tricky. Not all of it should be subsidized, least of all fake or misleading reporting. Ideally, higher-quality pieces would be subsidized more, but measuring quality is hard. Readership is likely not a good proxy for quality, and it is in any case hard to measure when people can get the same information from many different sources. Deciding what does and does not deserve a subsidy could introduce unwanted bureaucracy, concentrated power, and bias into news. Designing mechanisms that encourage high-quality investigative journalism seems like an important area for research.

There are market failures in journalism today, and they have likely been magnified in the past several years. The easy creation and rapid spread of information around the web can reduce the stock of quality news, and less investigative journalism means less accountability and slower diffusion of the high-quality knowledge that spurs innovation. Thinking about how to incentivize quality journalism while preserving the rapid proliferation of information on the internet is an interesting research question.

  1. See also work by Gentzkow and coauthors.
  2. Not all journalism suffers equally from reduced replication costs; I expect the greatest distortion occurs for shorter current-events pieces. Long-form journalism produced by outlets like the New Yorker may pack in a lot of information that is hard to express succinctly, or may convey value through the stylistic quality of the writing. It is harder for readers to get all the value of such an article from a tweet or summary, which may compel them to read the original piece.