Showing posts with label policy. Show all posts

Sunday, August 13, 2017

Agricultural Econ Potpourri

by Levi Russell

As we move into debate over the next Farm Bill, here is a great overview of the state of the discussion.

If you're curious about China's new policy on beef imports, check out this blog post.

Farm fixed expenses are finally moving down.

Even as fixed expenses move down, land values are up yet again this year.

Monday, May 15, 2017

Potpourri

by Levi Russell

Here's a collection of articles I've read over the last week or so.

David Henderson on Thoma on potential changes to banking regulation.

Economist Allan Meltzer recently passed away. Here and here are two commentaries on his work.

A major contribution of another well-known economist who recently passed away, William Baumol, is discussed here.

Don Boudreaux has a fantastic post on the importance of Econ 101. Here's a short excerpt:
To put the point a bit differently, ECON 101 instills the good habit of looking past stage 1, which is the stage at which most non-economists stop their investigations of economic consequences.  ECON 101 prompts those who grasp it to look also to stages 2, 3, and 4.  More-advanced economics courses – all the way to ECON 999 – teach that in theory there is also the possibility of stages 5, 6, 7, …. n.  Awareness of these theoretical possibilities is, of course, useful.  But awareness of stages 5, 6, 7, … n is either meaningless or, worse, practically dangerous without also an awareness of stages 2, 3, and 4.  And nearly all economic ignorance in the real world is simple unawareness of stages 2, 3, and 4.  (It’s also mistaken to conclude – as Kwak concludes – that awareness of stages 5, 6, 7, …. n regularly nullifies policy conclusions drawn from awareness of stages 1 through 4.)
Here's a great piece at Cato on the Net Neutrality issue.

Economics blogger Jim Rose corrects Noah Smith's oft-repeated claim that the economics profession was, until recently, dominated by right-wingers and libertarians.

Friday, November 18, 2016

Nudging the Nudgers with Better Regulatory Policy

by Levi Russell

A recent law blog post by Brian Mannix (hat tip to David Henderson for the link) discusses the significance of the Congressional Review Act (CRA) during a presidential transition period. In a nutshell, the CRA allows Congress to overturn a regulation written by an executive-branch agency with a bare majority in each house until the 60th day after it is issued. Of course, as with a bill, the president can veto the resolution, in which case Congress must muster a two-thirds majority to override the veto. Given that most presidents wouldn't want to stop their own branch of government from implementing regulations, the CRA doesn't often come into play. However, since Republicans control both houses and the incoming president is a Republican, some interesting things could happen given the Republicans' ostensible preference for less regulation. Mannix has some interesting examples in his post, so I suggest reading it.

What interested me was a specific line in the post:
Note that the CRA mechanism is distinct from the proposed REINS Act mechanism.  Under REINS, Congress would need to approve of major regulations before they become effective; under the CRA rules become effective if Congress refrains from disapproving.
This got me thinking about behavioral economics and the idea of "nudges." An example of a nudge is a change to the rules that makes the "best" option the default. Often nudges are suggested as part of government policy, but that's not always the case. An oft-repeated example is to make 401k enrollment with your employer the default option while still providing an "opt out" opportunity for those who don't want to contribute to a 401k.

Looking again at the last sentence in the quoted text above, the REINS Act strikes me as a great example of "nudging the nudgers": imposing a rule on those in the executive branch who are in charge of making and enforcing rules based on legislation. Specifically, the REINS Act would make the default position "no new regulations," and only those that passed additional scrutiny by elected representatives would actually be issued. I argue that, at least potentially, the REINS Act would lead to better regulation for a couple of reasons:

1) There would be more oversight by elected representatives of the regulatory process than there is currently.

2) Massive regulations like those written based on Dodd-Frank or the Affordable Care Act would probably be issued more slowly since the regulations would have to be approved by Congress. This would give us a chance to see how the initial regulations actually play out rather than relying on speculative cost/benefit analysis (side note - most regulations aren't subjected to cost/benefit analysis anyway).

Additionally, the REINS Act could reduce the problem of passing on more authority to the executive branch than it was designed to have. This problem arises when Congress passes a very general bill (which is more likely to garner enough votes to pass than a very narrowly-written bill) empowering the executive branch's bureaucracy to write regulations that are less likely to be in line with the will of the people.

The REINS Act was passed in the House last year and is currently sitting in the Senate. It would be great to see more discussion of this bill, but perhaps I just missed it. I'd love to read your thoughts in the comments below!

Bonus: Here's a great article on "nudging the nudgers."


Tuesday, November 15, 2016

Ignoring Positive Externalities

by Levi Russell

Recently ag economist Jayson Lusk visited UGA to speak on the future of food and the "food movement." One great point he made is that food is quite abundant now relative to any time in the past, and yet there is a very lucrative book and movie industry built around concerns with our food supply. Certainly our food system is far from perfect, but absolute poverty on a global scale has been curtailed dramatically.

A question from the audience particularly interested me: what about externalities related to our current food production methods such as pollution? Lusk's answer was very good. He acknowledged the existence of both negative and positive externalities in food production. He went on to state that both should be considered when designing policy. It certainly seems to me that a lot of attention is paid to the negative externalities associated with food production in the policy world and in the economics profession; relatively little attention is paid to measuring the positive externalities such as the fact that on average Americans spend only about 10% of their disposable income on food and are free to pursue all sorts of other interests.

This discussion reminded me of a paper I read awhile back. It's ungated, and I encourage you to read it if you're interested in this stuff. Here's the abstract:
This paper criticizes the treatment of externalities presented in modern undergraduate economic textbooks. Despite a tremendous scholarly push-back since 1920 to Pigou’s path-breaking writings, modern textbook authors fail to synthesize important critiques and extensions of externality theory and policy, especially those spawned by Coase. The typical textbook treatment: 1) makes no distinction between pecuniary and technological externalities; 2) is silent about the invisible hand’s unintended and emergent consequences as a positive externality; 3) overemphasizes negative externalities over positive ones; 4) ignores Coase’s critique of Pigouvian tax “solutions;” and 5) ignores the potential relevance of inframarginal external benefits in discussions of policy “solutions” to negative externalities. Aside from presentations of “The Coase Theorem” excerpted from only 4 pages of Coase’s voluminous writings, it is as though the typical textbook author slept through nearly a century of scholarly critique of Pigou.

Sunday, October 23, 2016

Monopoly Concerns with Baysanto

by Levi Russell

The recent merger of DuPont/Pioneer with Dow and the acquisition of Monsanto by Bayer have sparked a lot of discussion of market concentration, monopoly, and prices. A recent working paper published by the Agriculture and Food Policy Center (AFPC) at Texas A&M University, written by Henry Bryant, Aleksandre Maisashvili, Joe Outlaw, and James Richardson, estimates that, due to the mergers, corn, soybean, and cotton seed prices will rise by 2.3%, 1.9%, and 18.2%, respectively. They also find that "changes in market concentration that would result from the proposed mergers meet criteria such that the Department of Justice and Federal Trade Commission would consider them “likely to enhance market power” in the seed markets for corn and cotton." (pg 1) The paper is certainly an interesting read and I have no quibble with the analysis as written. However, some might draw conclusions from the analysis that, in light of other important work in industrial organization, are not well-founded.

The first thing I want to point out is that mergers and acquisitions can, at least potentially, result in innovations that would justify increases in the prices of the merged firms' products. To the extent that VRIO analysis is descriptive of firms' behavior with respect to innovation, we would expect that better entrepreneurs would be able to price above marginal cost. Harold Demsetz made this point in his 1973 paper "Industry Structure, Market Rivalry and Public Policy." The authors of the AFPC study point this out as well, but the problem is that, even though we have estimates of potential price increases due to the mergers, it is very difficult to determine whether any future change in price is actually attributable to market power or simply due to innovation in seed technology.

Secondly, the standard models of monopoly assert that pricing above marginal cost is at least potentially a sign of a firm exercising market power. Here, articles by Ronald Coase and Armen Alchian are relevant. I provided a discussion of the relevant portions in a previous post so I'll just briefly summarize here: pricing above marginal cost is an important signal that the current market demand is potentially not being met by the firms in the industry. It's a signal to other potential investors that entering the industry might be worth it. Further, there is an issue of measurement. Outside observers may calculate fixed cost, variable cost, and price and determine that a firm is pricing above marginal cost. However, there may be costs of which said observers are unaware. For example, there may be significant uncertainty (which is not the same as risk) about the future prospects of the industry. This is certainly possible in the biotechnology industry since the government heavily regulates firms in this sector. This is not to say that such regulation is bad or should be removed, simply that it presents costs that are difficult for outsiders to calculate.

Finally, I want to examine one part of the analysis in the AFPC paper. On pages 10 and 11, the authors write (citations deleted):
A market is contestable if there is freedom of entry and exit into the market, and there are little to no sunk costs. Because of the threat of new entrants, existing companies in a contestable market must behave in a reasonably competitive manner, even if they are few in number.
Concentrated markets do not necessarily imply the presence of market power. Key requirements for market contestability are: (a) Potential entrants must not be at a cost disadvantage to existing firms, and (b) entry and exit must be costless. For entry and exit to be costless or near costless, there must be no sunk costs. If there were low sunk costs, then new firms would use a hit and run strategy. In other words, they would enter industry [sic], undercut the price and exit before the existing firms have time to retaliate. However, if there are  high sunk costs, firms would not be able to exit without losing significant [sic] portion of their investment. Therefore, if there are high sunk costs, hit-and-run strategies are less profitable, firms keep prices above average costs, and markets are not contestable. 
I submit that under this definition, scarcely any industry on the planet is contestable, yet we see prices fall in many industries over time, even in those we would expect to have significant sunk costs and in which we would expect incumbents to have significant cost advantages over new entrants.

It's true that we sometimes must make simplifying assumptions that are at odds with reality to forecast future market conditions. However, some might infer from the AFPC paper (though I stress that the authors do not) that something must be done by anti-trust authorities to unwind the mergers and acquisitions under discussion. To infer this would be to commit the Nirvana Fallacy. To expect anything in the real world (whether in markets or in the policymaking arena) to be "costless" is an impossible standard.

It will be interesting to see what becomes of these mergers and whether seed prices move sharply upward in coming years. What is certain is that there is tremendous causal density in any complex system, such as the market for bio-engineered seed. Thus, policymakers should be humble and cautious about applying the results of theoretical and statistical analysis in their attempts to better our world.

Thursday, September 22, 2016

Political Economy of Crop Insurance

by Levi Russell

Last week my article on the political economy of crop insurance in the next farm bill, co-authored with Art Barnaby of Kansas State University, was published in Choices Magazine. I thought I'd reproduce the theme overview here and link to all four articles for those who are interested.

The Farm Bill, passed every four or five years, is a large piece of legislation which includes agricultural, food, conservation, and rural development programs. The most recent bill, passed in 2014, made significant cuts to commodity programs and increased budgeted spending on crop insurance. This change shifts the focus of farm risk management toward crop insurance, making it an even more important part of a producer’s toolkit. Looking ahead to the next farm bill in 2018/2019, this focus on crop insurance will likely continue.

The articles in this issue anticipate three discussions surrounding crop insurance’s role in the next farm bill: the political economy of crop insurance by Barnaby and Russell, economic evaluation of crop insurance’s role in the safety net by Zacharias and Paggi, and crop insurance’s role in specialty crop agriculture by Paggi.

Barnaby and Russell examine three crop insurance alternatives which are likely to be proposed in the debate over the next farm bill:

 1. Replacing crop insurance with a free, area-based disaster program,
 2. Making modifications to existing policy which would significantly reduce support to farmers and jeopardize the private delivery system, and
 3. Complete elimination of the safety net.

The article summarizes the political factors and their interaction with the economic effects of these proposals.

Zacharias and Paggi identify the key considerations for improving crop insurance’s role in the farm safety net. Among these are regional and commodity-specific considerations, government budget constraints, and interactions between crop insurance and other titles in the farm bill. They emphasize the importance of developing appropriate metrics for evaluating the simultaneous performance of crop insurance and commodity programs and conclude with a research agenda for examining these issues.

Paggi discusses the broader role of crop insurance as a risk management tool for specialty crop producers. Specialty crops are of interest due to the increase in specialty crops’ share of the total crop insurance liability over the last 15 years. Paggi details the connection between crop insurance and specialty crops and provides a discussion of factors affecting the future of this connection.

Finally, Woodard addresses issues surrounding the elasticity of demand for crop insurance. This key value determines how large a cut in USDA's share of the crop insurance premium could be made while still maintaining the politically acceptable level of farmer participation needed to forestall future ad hoc disaster programs. It is critical for policymakers to understand this elasticity to prevent unintended consequences from federal budget cuts to crop insurance. Not all budget cuts are equal, so how any cuts are made is extremely important.

Given the important role of crop insurance in the future of the farm safety net, political and economic factors affecting policy decisions are particularly of interest. This issue provides a first look at the conversations policy makers, industry representatives, and academic economists will have leading up to the next farm bill.

Thursday, September 15, 2016

Coase and Hog Cycles

by David Williamson

If you read this blog, then you're probably familiar with Ronald Coase's work on the importance of transaction costs. But did you know that Coase devoted a substantial portion of his early career to criticizing the Cobweb Model? He actually wrote four separate articles on the subject between 1935 and 1940, but not one makes Dylan Matthews's list of Coase's top-five papers. This work is fascinating in the context of the intellectual history of economics, so here is a quick summary!

The 1932 UK Reorganization Commission for Pigs and Pig Products Report

It all started when the UK Reorganization Commission for Pigs and Pig Products claimed in a 1932 report that government intervention was needed to stabilize prices in the hog industry. The Commission found that hog prices followed a 4-year cycle: two years rising and two years falling. The Commission explained this cyclical behavior using the Cobweb Model. In this model, products take time to produce. So, to know how much to produce, firms have to guess what the price will be when their product is ready to bring to the market. If producers are systematically mistaken about what prices will be, this could lead to predictable cycles in product spot prices.

The Cobweb Model

How forecasting errors can lead to cycles in product prices is illustrated in the figure below. Suppose we begin at period 1 and hog producers bring Q1 to the market to sell. Supply is essentially fixed this period because producers can't produce more hogs on the spot, so the price that prevails on the market will be P1. Since this price exceeds the marginal cost of production (represented by S), the individual producers wish they had produced more. Now, when the producers go back home to produce more hogs, they have to guess what the price will be when their hogs are ready to sell. Suppose it will take 2 years to produce more hogs. The UK Reorganization Commission argued that hog producers will assume the price of hogs next period will be the same as it was this period (in other words, that producers had "static" expectations about price). That means, in this context, hog producers think the price of hogs in 2 years will still be P1. So each producer will individually increase production accordingly.

However, when the producers return to the market in 2 years, they will find that everyone else increased production too and that quantity supplied is now Q2. As a result, the price plummets to P2 and the producers actually lose money. Not learning their lesson, the hog producers will again go home, assume that the price next period will be P2, and collectively cut back their production to Q3. Hopefully you see where this is going, even if the hog producers don't. The price will go up again in 2 years and then down again in 2 more. Thus, we have a 4-year cycle in hog prices. How long will this cycle continue? That depends on the elasticities of supply and demand. If demand is less elastic than supply, as was believed to be the case in the hog market, then the price swings will continue forever and only get bigger as time goes on.

[Figure: divergent cobweb diagram. Source: Wikipedia]
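The dynamics described above are easy to simulate. Below is a minimal sketch of a linear cobweb model with static expectations; all parameter values are purely illustrative, not estimates for the hog market. With inverse demand P = a - b·Q and lagged supply Q(t+1) = c + d·P(t), each period's price deviation from equilibrium is multiplied by -b·d, so prices alternate above and below equilibrium, widening when demand is steeper (less elastic) than supply and dampening otherwise.

```python
# Linear cobweb model with static price expectations.
# Inverse demand: P = a - b*Q.  Lagged supply: Q(t+1) = c + d*P(t),
# i.e., producers plan next period's output as if today's price will persist.

def cobweb(a, b, c, d, p0, periods):
    """Return the simulated price path [p0, p1, ..., p_periods]."""
    prices = [p0]
    for _ in range(periods):
        q_next = c + d * prices[-1]   # output planned at last period's price
        p_next = a - b * q_next       # market-clearing price when it arrives
        prices.append(p_next)
    return prices

# Demand steeper than supply (b*d > 1): swings grow each cycle (divergent).
divergent = cobweb(a=100, b=1.2, c=0, d=1.0, p0=40, periods=8)

# Demand flatter than supply (b*d < 1): swings shrink toward equilibrium.
convergent = cobweb(a=100, b=0.8, c=0, d=1.0, p0=40, periods=8)
```

Note that each simulated "period" is one production lag: with the Commission's assumed two-year lag, one up-down oscillation takes four years, which is exactly the cycle length it claimed to observe.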

Coase Takes the Model to the Data

The Cobweb Model is really clever, but does it actually capture the reality of the hog market? Coase and his co-author Ronald Fowler tried to answer that question by evaluating the model's assumptions. First, are hog producer expectations truly static? Expectations cannot be observed directly, but Coase and Fowler (1935) used market prices to try to infer whether producer expectations were static. It didn't seem like they were. Second, does it really take 2 years for hog producers to respond to higher prices? Coase and Fowler (1935) spend a lot of time discussing how hogs are actually produced. They found that the average age of a hog at slaughter is eight months and that the period of gestation is four months. So a producer could respond to unexpectedly higher hog prices in 12 months (possibly even sooner, since there were short-run changes producers could also make to increase production). So why does it take 24 months for prices to complete their descent? Even if we assumed producers have static expectations, shouldn't we expect the hog cycle to be 2 years instead of 4?

This evidence is hard to square with the Cobweb Model employed by the Reorganization Commission, but Coase's critics were not convinced. After all, if it wasn't forecasting errors that were driving the Hog Cycle, then what was? "They have, in effect, tried to overthrow the existing explanation without putting anything in its place," wrote Cohen and Barker (1935). Coase and Fowler (1937) attempted to provide an explanation, but this question would continue to be debated for decades.

The Next Chapter

Ultimately, John Muth (1961) proposed a model that assumed producers did not have systematically biased expectations about future prices (in other words, that they had "rational" expectations). Muth argued this model yielded implications that were more consistent with the empirical results found by Coase and others. For example, rational expectations models generated cycles that lasted longer than models that assumed static or adaptive expectations. So a 4-year hog cycle no longer seemed as much of a mystery. I'm not sure what happened to rational expectations after that. I hear they use it in Macro a bit. Anyway, if you are interested in a more detailed summary of Coase's work on the Hog Cycle, then check out Evans and Guesnerie (2016). I found this article on Google while preparing this post and it looks very good.

References

Evans, George W., and Roger Guesnerie. "Revisiting Coase on anticipations and the cobweb model." The Elgar Companion to Ronald H. Coase (2016): 51.

Coase, Ronald H., and Ronald F. Fowler. "Bacon production and the pig-cycle in Great Britain." Economica 2, no. 6 (1935): 142-167.

Coase, Ronald H., and Ronald F. Fowler. "The pig-cycle in Great Britain: an explanation." Economica 4, no. 13 (1937): 55-82.

Cohen, Ruth, and J. D. Barker. "The pig cycle: a reply." Economica 2, no. 8 (1935): 408-422.

Muth, John F. "Rational expectations and the theory of price movements." Econometrica: Journal of the Econometric Society (1961): 315-335.

Friday, September 9, 2016

Beef Trade and the TPP

by Levi Russell

As one of my colleagues recently pointed out at an Extension meeting, both major-party candidates are, or at least claim to be, opposed to international trade. It's true that trade restrictions would be harmful to many segments of the U.S. agriculture sector, including beef. I ran across a great article in Beef Magazine last month that shows the U.S.'s top beef trade partners. The chart below is lifted from the article.


As you can see, Australia is responsible for a substantial proportion of beef (not cattle) imports into the U.S. Our exports go primarily to Asian markets and our geographical neighbors. The article goes into some detail about the recent change in fresh beef imports from Brazil. The new policy is a tariff-rate-quota; details are available in the article and in this video.

Since I strive to tell the other side of the story as fairly as possible, I thought I'd link to what I believe is the most sophisticated argument against the Trans Pacific Partnership I've read. I recommend reading it, even if you are pro-TPP.

Tuesday, August 9, 2016

100 Years of Zoning

by Levi Russell

In the past I've discussed zoning laws, referencing articles that compare present vs past policies and that explain the unusual case of Houston, TX. More recently I read an article on Bloomberg with a provocative title: "Zoning has had a Good 100 Years, and That's Plenty." The author's main point is that the costs of zoning laws (primarily their negative effects on the poor) outweigh the benefits. Below are some passages I particularly liked.

Over the past few years, zoning has been blamed, mainly by economists bearing substantial empirical evidence, for an ever-growing litany of ills. The charge that zoning is used to keep poor people and minorities out of wealthy suburbs has been around for decades. But recent research has also blamed it for increasing income segregation, reducing economic mobility and depressing economic growth nationwide.

One can never be certain about these things, but it’s quite possible that excessive land-use restrictions are among the major causes of our long national economic malaise.  Jason Furman, chairman of the White House’s Council of Economic Advisers, made this very point in a speech in November. Yet the platform adopted at the Democratic National Convention this week made no mention of either “land use” or “zoning,” while the Republican platform mentioned them only to condemn the current administration’s purported efforts “to undermine zoning laws in order to socially engineer every community in the country.”
 ...

Dartmouth College economist William Fischel, whose excellent book “Zoning Rules!” has been my most important source on this topic, favors a different explanation. In the decades before the automobile, industrial and residential development was to a large extent constrained by the location of rail and streetcar lines. After trucks and buses became common, though, industrial businesses could locate far from railways (and wharves) and apartment developers could build far from streetcar lines. Anxious homeowners -- and in some cases, merchants -- clamored for rules to keep people from building factories next door.

This does seem to have been one of the motivating factors in New York. According to David W. Dunlap’s New York Times column Monday on the zoning anniversary, “the merchants of Fifth Avenue were losing their retail customers and watching the value of their properties drain away, as big loft buildings for garment manufacturers muscled in around them.” Still, as America’s least auto-centric city, New York also focused its zoning rules on concerns -- skyscraper design, for example -- that were less of an issue elsewhere in the country. It was to be another zoning ordinance adopted six years later in Euclid, Ohio, that ended up fully establishing zoning as a national institution. 
 ...

Sutherland [a Supreme Court Justice in the 1920s and 1930s - LR] affirmed that cities had every right to zone land without compensating landowners or businesses that were harmed. He also said -- unprompted by the facts of the case -- some strangely nasty things about apartment buildings. A sample:
Very often the apartment house is a mere parasite, constructed in order to take advantage of the open spaces and attractive surroundings created by the residential character of the district.
I find it really hard to read that as anything but an affluent guy justifying the legal exclusion of less-affluent people from his neighborhood. There’s been an element of class discrimination to zoning from the early days -- sometimes mixed in with racial discrimination. Still, there have always been other, more-positive aspects, too. In Fischel’s words, “zoning probably makes for more efficient provision of local services and better neighborhoods than would be available without it.”

After about 1970, though, zoning’s negative economic effects began to grow. Before then, housing prices were more or less the same across the country. Since then, prices in the metropolitan areas of the Northeast and West Coast have risen much faster than in most of the rest of the nation -- in the process increasing inequality, thwarting residential mobility and slowing economic growth. Ever-tougher zoning rules and restrictions on growth appear to be a major cause. Fischel has a long list of explanations for this intensification of zoning that I won’t go into here, other than to mention the one that drives me the craziest -- the dressing-up of self-interested economic arguments in the language of environmentalism and morality.

Thursday, August 4, 2016

Howard Baetjer on Regulators as Monopolists

by Levi Russell

Howard Baetjer (Towson University) has an interesting article arguing that, in many cases, regulators behave like monopolists. I've written on the subject of monopoly several times over the last year or so (this one and this one are particularly relevant) and I've personally thought a lot about the ideas Baetjer explores in his piece.

The whole thing is worth reading, but here are some really good paragraphs:

Among the most important services in society is assuring the quality and safety of goods and services. We want assurance, for example, that our taxi drivers are competent and their cars are safe, that our banks have adequate capital, that our medicines are safe and effective, and that our schools teach our children well.

And yet the government agencies that regulate the quality and safety of these are legal monopolies. Those they regulate are required to abide by the government agencies’ decisions; the regulated enterprises have no freedom to choose different quality-assurance services from some competing entity instead. Government regulatory agencies are thus not regulated by market forces and, accordingly, they are not directly accountable to the public they are supposed to serve. ... They are indirectly accountable to the public through the political process, but that process puts so much distance between the public and the government regulator that regulators are effectively left unregulated.

So, government regulators are unregulated monopolies.
 ...
To be clear, these regulatory agencies do not have monopolies in the strict sense that no other provider of quality assurance is allowed to operate. For example, some taxi companies may distinguish themselves by enforcing particularly high standards of cleanliness and punctuality; banks could join associations that certify their exceptionally large capital cushions; and name-brand drug manufacturers try to distinguish their products as better than generics. In all these cases, however, the government regulator is the only quality assurer to whose standards all the enterprises in the industry must by law conform. Additional requirements over and above what the government requires are allowed, but the government’s requirements are mandatory. In this sense government regulators have monopolies. 

Thursday, July 14, 2016

Regulating the Regulatory Process

by Levi Russell

I suppose this is Mercatus Center week, but I can't resist sharing some great analysis and commentary from their researchers.

Senior Research Fellow Patrick McLaughlin recently testified before Congress on the need for an established process of regulatory reform at the federal level. Drawing on the experience of the UK and Canada, McLaughlin presents several methods of establishing "regulatory budgeting." He describes this method of regulatory error correction this way:
Regulatory budgets, like other types of budgets, only work if they force the spender to identify and prioritize the most valuable options. The behavior of an agency with a budget differs from that of an agency without a budget. In today’s no-budget world, an agency’s objective is to fulfill its mission with the promulgation of rules. The effectiveness and efficiency of those rules are not evaluated in hindsight, and prospective evaluation of effectiveness and efficiency only occurs for less than one percent of all new rules. In contrast, an agency with a regulatory budget would act differently. First, the agency would avoid new regulations that would not achieve high benefits relative to their budgetary cost. Second, the agency would have incentive to eliminate old regulations that are found to be ineffective or intolerably inefficient. In other words, a regulatory budget process would resemble an error-correction process: it would lead to fewer new errors as well as aid in the identification and correction of existing ones.
McLaughlin goes on to explain methods of setting the regulatory budget limit and several measures of regulation that could be used in this approach. I highly recommend reading the whole testimony.

Here's the conclusion:
Regulators and legislators alike are not perfect. Regulations are perhaps unique in the sense that they are undeniably important to all actions in the economy, but are not subject to a process for error correction. These errors—most of which are probably undiagnosed owing to the lack of retrospective analysis—are far from benign. They contribute to regulatory accumulation, a force that disproportionately harms low-income households, deters innovation, and slows economic growth, without delivering offsetting benefits. The reduction of the error rate requires a process that ensures the development and application of high-quality information, both before and after the effects of regulations have been observed. Regulatory budgeting represents one option to achieve just that.

Regulatory budgeting would lead to the creation of better information about the effects of regulations. Simultaneously, it would create incentives for regulators to act upon that information, promulgating those regulations that offer the greatest benefit relative to costs and eliminating regulations that impose an undue burden on the American people.

Monday, July 4, 2016

Mandated GMO Labels: A Regressive Tax

by Levi Russell

The predictable effects of mandatory GMO labeling will be felt very soon in Vermont and those with low incomes will be especially hard-hit. Supermarkets in the state will lose some 3,000 products from their shelves. The video on this news story is telling: people don't seem to know much about GMOs and don't really think about the negative effects of mandatory labeling. Anti-GMO organizations such as Greenpeace have been accused of running a fear campaign that isn't supported by scientific evidence. There's no evidence that GMOs are harmful to people, but a law requiring them to be labeled very likely will be.

The federal law passed in the Senate will require companies to use QR codes or dedicated websites to provide information about the presence of genetically modified organisms in their food. The compliance costs associated with this law include the addition of the QR code or website URL to the packaging, the development of the databases with the required information, and the maintenance of this database as farming practices and ingredients change. The latter two will likely be far higher than the former and will affect food prices for the foreseeable future.

Here are some of the potential indirect effects:

1) Less consumer choice - The article linked above shows that this is already becoming a reality. I suspect those 3,000 products will come back to shelves eventually, but the development of new products is now more costly due to the necessity of adding information to GMO databases.

2) Higher prices - Additional costs to food companies will effectively shift the food supply curve to the left and raise prices.

3) Less innovation - Though "very small" food companies are exempted from the rule, many startups are created with the goal of becoming mass-market products (if you don't believe me, just watch an episode of "Shark Tank"). This requirement is one more tangle of red tape these companies must cut through to get their products onto store shelves.
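The price effect in point 2 is just a leftward supply shift in a textbook linear model. Here's a minimal sketch with made-up numbers (all parameters are illustrative assumptions, not estimates of the actual law): a per-unit compliance cost shifts supply left, raising the equilibrium price and splitting the burden between buyers and sellers.

```python
# Hypothetical linear supply/demand sketch of a per-unit compliance cost.
# Demand: Qd = a - b*p.  Supply with per-unit cost t: Qs = c + d*(p - t).

def equilibrium(a, b, c, d, t):
    """Solve a - b*p = c + d*(p - t) for price p; return (price, quantity)."""
    p = (a - c + d * t) / (b + d)
    q = a - b * p
    return p, q

p0, q0 = equilibrium(a=100, b=2, c=10, d=3, t=0.0)  # before the mandate
p1, q1 = equilibrium(a=100, b=2, c=10, d=3, t=5.0)  # with a $5/unit compliance cost

# The cost shifts supply left: price rises and quantity falls, but the price
# rises by less than t, so consumers and producers share the burden.
```

With these made-up parameters the price rises from 18 to 21, so consumers bear part (not all) of the $5 per-unit cost; the rest falls on producers.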

Maybe all these costs are worth it. Given the lack of scientific evidence of harm and the fact that humans have been modifying the genetics of food in a far more haphazard way for a very, very long time, I have my doubts. The reality is that the costs mentioned above will fall disproportionately on those with the lowest incomes. Those with moderate to high incomes will be able to pay more for the food they really want, but those who already spend a substantial portion of their income on food will find it harder to make other ends meet.

Friday, June 24, 2016

Brexit Stock Market Perspective

by Levi Russell

The financial press was abuzz before and after the recent UK referendum to leave the European Union (see here, here, here, here, here, and here). To be sure, the British Pound took a big hit and several stock market indices across the Western world were affected. I'd just chalk this up to political uncertainty, not a referendum on the referendum. After all, we don't actually know if the UK will leave the EU. Perhaps I'm biased.

Here's the perspective I promised in the title. First, a look at the 5-day charts (all taken from Yahoo Finance) of U.S. (Dow and S&P 500), British (FTSE), Spanish (IBEX), German (DAX), and French (CAC 40) stock exchange indices:







These are some pretty serious one-day drops. However, the Dow and S&P 500 fell more modestly than the others, and the FTSE (the British index) has recovered somewhat. The hardest hit so far are the continental European indices. Interesting, to be sure.

But what does this selloff look like over a 1 year time horizon? How far back in time do we have to go to see these indices at similar levels?







The Dow and S&P 500 are right about where they were last month. If you take out the big troughs in September 2015 and early this year, both indices are pretty much flat. The FTSE looks similar, though it has a bit more of a downward trend than the U.S. indices. The Spanish, German, and French indices are down a bit more relative to the past few months, but they seem to be continuing a downward trend that's been in place for the past 12 months.

I'm not saying the referendum had no effect on the markets but after looking at these charts I'm left asking "Where's the fire?" Maybe I'm missing something, or maybe my cavalier attitude to the stock market stems from the fact that I'm 29 years old.

Sunday, June 19, 2016

Specialization and Trade - A Reintroduction to Economics

That's the title of Arnold Kling's newest book. It's published by the Cato Institute and is available in e-book format on Amazon for a mere $3.19. You can also download a PDF copy here free. Arnold Kling is an MIT-trained economist who spent the bulk of his professional career at the Federal Reserve and Freddie Mac. Kling's blog, one of the best on the web in my opinion, is always thought-provoking. As the title of his blog suggests, he makes every effort to understand and fairly state the positions of those with whom he disagrees.

I read a couple of blurbs about the book last week and have only just finished the first chapter. So, rather than write a review, I'll reproduce a section of the Introduction that gives a short description of each chapter. Kling certainly has a unique perspective and I suspect I'll learn a lot from this relatively short book.
“Filling in Frameworks” wrestles with the misconception that economics is a science. This section looks at the difficulties that economists face in trying to adopt scientific methods. I suggest that economics differs from the natural sciences in that we have to rely much less on verifiable hypotheses and much more on hard-to-verify interpretative frameworks. Economic analysis is a challenge, because judging interpretive frameworks is actually harder than verifying scientific hypotheses. 
“Machine as Metaphor” attacks the misconception held by many economists and embodied in many textbooks that the economy can be analyzed like a machine. This section looks at a widely used but misguided approach to economic analysis, treating it as if it were engineering. The economic engineers are stuck in a mindset that grew out of the Second World War, a conflict that was dominated by airplanes, tanks, and other machines. Their approach fails to take account of the many nonmechanistic aspects of the economy. 
“Instructions and Incentives” deals with the misconception that economic activity is directed by planners. This section explains that although people within a firm are guided to tasks through instruction from managers, the economy as a whole is not coordinated that way. Instead, the price system functions as the coordination mechanism. 
“Choices and Commands” is concerned with the misconceptions held by socialists and others who disparage the market system. This section explains why a decentralized price system can work better than a centralized command system. Central planning faces an information problem, an incentive problem, and an innovation problem. 
“Specialization and Sustainability” exposes the misconception that we must undertake extraordinary efforts in order to conserve specific resources. This section explains how the price system guides the economy toward sustainable use of resources. In contrast, individuals who attempt to override the price system through their individual choices or by imposing government regulations can easily miscalculate the costs of their actions. 
“Trade and Trust” addresses the misconception among some libertarians that the institutional infrastructure needed to support specialization and trade is minimal. Instead, this section suggests that for specialization to thrive, societies must reward and punish people according to whether they play by rules that facilitate specialization and trade. A variety of cultural norms, civic organizations, and government institutions serve this purpose, but each of those institutions has its drawbacks. 
“Finance and Fluctuations” deals with the misconceptions about finance that are common among economists, who often fail to appreciate the process of financial intermediation. This section looks at the special role played by financial intermediaries in enabling specialization. Intermediation is particularly dependent on trust, and as that trust ebbs and flows, the financial sector can amplify fluctuations in the economy’s ability to create patterns of sustainable specialization and trade. 
“Policy in Practice” corrects the misconception that diagnosis and treatment of “market failure” is straightforward. This section looks at challenges facing economists and policymakers trying to use the theory of market failure. The example I use is housing finance policy during the run-up to the financial crisis of 2008. The policy process was overwhelmed by the complexity of the specialization that emerged in housing finance. Moreover, the basic thrust of policy was determined by interest-group influence. The lesson is that a very large gap exists between the economic theory of public goods and the practical execution of policy. 
“Macroeconomics and Misgivings” argues that it is a misconception, albeit one that is well entrenched in the minds of both professional economists and the general public, to think of the economy as an engine with spending as its gas pedal. This section presents an alternative to the mainstream Keynesian and monetarist traditions. I argue that fluctuations in employment arise from changes in the patterns of specialization and trade. Discovering new patterns of sustainable specialization and trade is more complex and subtle and less mechanical than what is assumed by the Keynesian and monetarist traditions.

Friday, June 10, 2016

We're All Utilitarians Now?

by Levi Russell

As an avid EconTalk listener, I often hear Russ Roberts, the host, talk about his skepticism of many aspects of modern economics. I'm usually at least a little sympathetic with Russ's point of view, but a recent Wall Street Journal interview featuring Roberts threw me off:
Economics fancies itself a science, and Mr. Roberts used to believe, as many of his peers do, that practitioners could draw dispassionate conclusions. But he has in recent years undergone something of a crisis of economic faith. "The problem is, you can't look at the data objectively most of the time," he says. "You have prior beliefs that are methodological or ideological about the impact of things, and that inevitably color the assumptions you make." 
A recent survey of 131 economists by Anthony Randazzo and Jonathan Haidt found that their answers to moral questions predicted their answers to empirical ones. An economist who defines "fairness" as equality of outcome might be more likely to say that austerity hurts growth, or that single-payer health care would bend the cost curve. The paper's authors quote Milton Friedman's brief for "value-free economics" and reply that such a thing "is no more likely to exist than is the frictionless world of high school physics problems."
I certainly think our interests and ideology can steer us into asking certain questions, but I'm not sure I agree that it affects the results of our analyses as much as Roberts seems to think. The deeper issue just might be the following: our cost/benefit analyses implicitly assume a utilitarian worldview. Thus, when asked about our policy views, we are more likely to narrow our own morality to fit within the confines of utilitarianism. If a cost/benefit analysis comes out in favor of Policy X, are we not expected to favor Policy X even if our analysis didn't include other moral goods such as freedom or justice? Are we, as economists, all utilitarians?

The other day I happened to run across an article by philosopher Rutger Claassen in the Journal of Institutional Economics entitled "Externalities as a Basis for Regulation: A Philosophical View" that addresses this deeper issue. Here's an excerpt from the introduction:
Thus, the main question of the paper simply is: when should an externality be reason for state intervention? Which externalities deserve internalization? The aim of the paper is to show that the utilitarian criterion for answering this question which is embedded in economic analyses is implausible. Instead, I will argue that we need to follow those philosophers who have argued in the line of John Stuart Mill, in favor of the harm principle. Externalities are structurally analogous to harms in political philosophy. Work on the harm principle, however, points to the need for a theory of basic human interests to operationalize the concept of harm/externalities. In the end, therefore we need to fill in judgments about externalities with judgments about basic human interests. If my analysis is convincing, then one overarching point of importance for the whole tradition of market failure theories emerges. It is that the customary attitude to the issue, juxtaposing economic theories and philosophical grounds for regulation, is highly problematic. It is telling that most handbooks on regulation start with an overview of market failures, and then add to these efficiency-based rationales some philosophical reasons for regulating: usually social justice (equity) reasons and moralistic/paternalist reasons. Instead we need to integrate both frameworks, by showing how philosophical pre-suppositions are at work within economic categories of market failure.
The author begins by discussing Pigovian and Coasean perspectives on externalities and how to deal with them. Claassen does a good job explaining both perspectives and mentions that transactions costs are a problem for both market participants and for government regulators.

The bulk of the article is dedicated to Claassen's criticism of the utilitarian perspective taken by the bulk of economic policy analysis, and discussing the harm principle as a better basis for normative analysis in economics. Specifically, he discusses 1) moral externalities which arise "from preferences about other people's behavior," 2) pecuniary externalities which are losses/gains due to changes in consumer preferences, technological innovation, or competition, and 3) positional externalities which "arise where consumers lose welfare because they compare themselves to others."

He concludes the section:
These cases point to different problems with a purely utilitarian calculus: it ignores issues of individual freedom (moralistic externalities) and justice (pecuniary externalities); and the calculus itself is highly indeterminate (positional externalities). However, philosophers thus far have been stronger at criticizing economic externality analyses than at providing an alternative. Can we find a more solid ground for a normative analysis of externalities?
The rest of the article develops his theory of "basic interests" and applies the theory to the Supreme Court's June 2012 verdict on the "individual mandate" found in the Affordable Care Act (Obamacare). I leave these to the interested reader.

Here's Claassen's conclusion:
This paper has aimed to establish three conclusions. First, economic externalities analyses are problematic because they ignore important normative considerations about individual freedom and justice, largely due to their utilitarian grounding (section 3). Second, some philosophers have proposed to exploit the analogy with the harm principle in liberal political philosophy. However, if we follow up on this suggestion and explore representative theories of harm (such as those by Joel Feinberg or Joseph Raz), this points to the need for a theory of basic human interests that does the real normative work in diagnosing harms. Such a theory is needed to evaluate which externalities call for state regulation (section 4). Third, what these basic interests are, in the end, is a matter of political dispute. Economists who have complained about the politicization of externality analyses have simply failed to accept the inherently political nature of questions about the organization of social and economic life. [emphasis mine]
Claassen's paper raises some important issues with the current moral underpinnings of economic analysis and challenges us to think more deeply about the assumptions we make about morality in normative analysis. As policymakers rely more and more on economic analysis, it's good to see these issues being addressed.

Wednesday, June 8, 2016

More Mercatus Center Research on State Tax Reform

by Levi Russell

In a previous post I shared a comparison of the results of tax reform in Utah and Kansas. That comparison was part of a broader analysis of reform efforts in 5 states: Kansas, Michigan, North Carolina, Rhode Island, and Utah. The report provides a detailed analysis of reform efforts and draws some general conclusions about how reform should be implemented.

The authors generally report good news for the states in terms of government fiscal health. Kansas is an exception. Here's one of the "common trends" identified in the report:
The most effective tax reforms seem to be those that both lower the rates of taxation and simultaneously broaden the scope of activities that are taxed. Such reforms improve the efficiency, convenience, and transparency of a tax system.
This is the opposite of what Kansas has done. Unlike North Carolina, Kansas politicians failed to couple the tax reform effort with orderly spending cuts. Further, as the report notes, Kansas narrowed its tax base in a distortionary way:
Kansas also made the decision to exempt “pass-through” profits from corporate taxation; that is, business income that is taxed on individual business owners’ tax returns. While this lowers the tax burden on businesses, it creates distortions in the way business owners choose to classify their operations. Moreover, it is inequitable because it disproportionately benefits high earners and creates an unfair playing field among businesses.
There has certainly been a lot of media coverage of Kansas' state budget situation. Another Mercatus paper compares state fiscal condition data from all 50 states and Puerto Rico in 2014; Kansas ranks 27th of the 51 states and territories examined. That doesn't sound consistent with the dominant narrative in the media.

How has the reform effort affected the private economies in these states? Below is a graph of private GDP indices for the five states listed above, the US as a whole, and two other states that are, to put it mildly, in big trouble fiscally: California and Illinois. It's tough to draw any general conclusions. Michigan, Utah and California are all doing quite well relative to the US as a whole. Michigan and Utah have had significant tax and spending reductions; California hasn't. Illinois, Kansas, North Carolina, and Rhode Island are all lagging relative to the US as a whole. Kansas and Illinois had pretty flat growth from 2012 to early 2014, but have picked up recently. Kansas in particular seems to be catching up to the US as a whole. North Carolina has been catching up at a feverish pace.


Quantity Index for Real Private State GDP - BEA
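The comparison in the chart relies on quantity indices: each state's real private GDP series is rebased so that a common base period equals 100, which makes growth paths comparable across states of very different sizes. A minimal sketch of the rebasing, with made-up numbers:

```python
# Rebase each series so the value in the base period equals 100.
# The GDP figures below are hypothetical, for illustration only.

def rebase(series, base_index=0):
    """Scale a series so the value at base_index equals 100."""
    base = series[base_index]
    return [100.0 * v / base for v in series]

kansas = [148.0, 149.2, 150.1, 152.0]  # hypothetical quarterly real private GDP
utah   = [95.0, 97.5, 100.1, 103.0]

kansas_idx = rebase(kansas)  # each series now starts at 100
utah_idx   = rebase(utah)    # Utah's faster growth is visible directly
```

After rebasing, a reader can compare slopes directly: a state that is twice as large no longer dominates the chart simply because of its level.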
Yet another Mercatus paper provides a short review of the literature on the relationship between state tax policy and the economic health of the state. Here's the relevant paragraph:
Research finds that higher state taxes are generally associated with lower economic performance. There is somewhat weaker evidence that state and local taxes can significantly reduce income growth within a state, particularly when the revenues raised are devoted to transfer payments. More recent research corroborates this finding in relation to net investment and employment. However, when additional tax revenue is used to improve the quality of public goods and services, economic growth may increase. When looking at business activity more broadly, more comprehensive reviews of the literature find higher taxes to be associated with less economic growth. They also find this relationship to be stronger within metropolitan areas than across metropolitan areas, which means that local taxes have a larger effect on economic growth when it is less costly for firms and taxpayers to relocate to avoid the tax.

Monday, June 6, 2016

Legal and Economic Implications of Farm Data

by Ashley Ellixson

Discussions of farm data are a hot topic not only in today’s agricultural industry but also across the legal field. I recently authored an article that describes the legal and economic concerns surrounding data ownership, privacy rights, and possible recourse in the event of an intentional data breach. The publication aims to answer questions such as “who owns farm data?”, “what happens when farm data is misappropriated?”, and “what can I do to protect my farm’s data?” These questions and many more are swirling around industry, legislatures, and farm organizations.

Until the law defines farm data or a court speaks to the protections of such data, experts in the field can only suggest best management practices (both at the farm level and the legal liability level). From the farm's perspective, both the law and the relative value of farm data will determine the optimal choice for damages, if any. Damages may be realized as a loss of local bargaining power or as a direct cost to the farmer; only time will tell. This collaborative effort between Kansas State University and the University of Maryland can be found on the AgManager.info website.


Guest Contributor

Monday, May 30, 2016

A Case Study of State Tax Reform Efforts

by Levi Russell

Adam Millsap of the Mercatus Center has a new case study on state tax reform. I have not yet read the study, but his Forbes column has some good stuff in it. I reproduce the sections on Utah and Kansas below.

Success in Utah

Of the five states studied, Utah’s 2006 reform appears to have been the most successful. The income tax was simplified from six brackets to one and many deductions were eliminated, which made it less distortionary and easier to understand. The study also notes that Utah was able to improve the efficiency of its tax system without experiencing severe drops in revenue.

According to the study, Utah’s tax reform was successful because its supporters were able to identify key stakeholders and include them in the reform process. This ensured that any reform that reached the governor’s desk had broad support. The study also points out that Utah has had a relatively high level of economic freedom for many years. This is a sign that the institutions and cultural attitude required for comprehensive tax reform were in place.

Problems in Kansas

Contrary to Utah’s experience, Kansas’ 2012 tax reform was more problematic. While the number of tax brackets was reduced from three to two and several tax credits were eliminated in order to broaden the base, Kansas’ reform also created a major distortion by exempting some business income from taxation.

This reform has allowed some businesses to avoid income taxes altogether, which encourages others to mimic that behavior in order to minimize their own tax burden. One such example is University of Kansas basketball coach Bill Self, who is primarily paid through his business entity, which is exempt from state income taxes. The distortion in Kansas’ tax code incentivizes this behavior.

Another problem with Kansas’ tax reform is that the decline in tax revenue due to the reform was not matched by a similar decline in spending. This has resulted in budget deficits. In Utah and the other cases studied tax reform was accompanied by reductions in state spending, which is crucial for maintaining a balanced budget.

Friday, May 27, 2016

Problems with the Definition of Food Deserts

by Brandon McFadden

The term “food desert” is often used to describe areas that have low access to food. In fact, many people now define food deserts as low-access, low-income areas. The Food Access Research Atlas is a map that shows tracts defined as food deserts throughout the U.S. The Atlas is made available by the USDA and can be accessed here. According to the USDA, “The Food Access Research Atlas maps census tracts that are both low income (li) and low access (la), as measured by the different distance demarcations. This tool provides researchers and other users multiple ways to understand the characteristics that can contribute to food deserts, including income level, distance to supermarkets, and vehicle access.”

However, the current definition of low access may be too general. A tract is defined as low income if: 1) the tract’s poverty rate is 20% or greater; or 2) the tract’s median family income is less than or equal to 80% of the state-wide median family income; or 3) the tract is in a metropolitan area and has a median family income less than or equal to 80% of the metropolitan area’s median family income. The original food desert measure defines low access as living more than one mile from a supermarket in urban areas and more than 10 miles in rural areas. For more information about how the USDA defines food deserts, read this.
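The two-part test described above can be sketched in code. This is an illustrative reading of the criteria as listed in this post, not the USDA's actual implementation; the function names and the example numbers are hypothetical.

```python
# Sketch of the food desert test described above: a tract is a food desert
# if it is BOTH low income AND low access. Thresholds come from the text.

def is_low_income(poverty_rate, tract_income, state_income, metro_income=None):
    """Low income if ANY of the three criteria in the text holds."""
    if poverty_rate >= 0.20:
        return True
    if tract_income <= 0.80 * state_income:
        return True
    if metro_income is not None and tract_income <= 0.80 * metro_income:
        return True
    return False

def is_low_access(miles_to_supermarket, urban):
    """Original measure: more than 1 mile in urban areas, 10 miles in rural."""
    threshold = 1.0 if urban else 10.0
    return miles_to_supermarket > threshold

def is_food_desert(poverty_rate, tract_income, state_income,
                   miles_to_supermarket, urban, metro_income=None):
    return (is_low_income(poverty_rate, tract_income, state_income, metro_income)
            and is_low_access(miles_to_supermarket, urban))
```

Under this reading, an urban tract with a 25% poverty rate sitting 1.2 miles from the nearest supermarket counts as a food desert, which is exactly the kind of borderline case the Gainesville example below calls into question.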

To illustrate that the current definition of low access may be too general, allow me to use Gainesville, FL as an example.  Below are two maps of Gainesville.  The map on the left is from USDA and the map on the right is a map from a Google search (the scaling for the two maps is not exact).  The green tracts in the USDA map represent the original food desert measure and the brown tracts represent a more stringent measure of access—0.5 miles from a supermarket in an urban area. 

From the Google map you can see that there are many Publix grocery stores in or near these green and brown tracts. Moreover, there are many other supermarkets in the map area that are not shown. Also in this map area are 3 Winn-Dixie grocery stores, 3 Wal-Marts, Target, Lucky’s Market, Earth Fare, Trader Joe’s, Fresh Market, Ward’s Supermarket, Earth Origins, several ethnic specialty stores, and a weekly farmer’s market. Something not captured by the Atlas is the availability of public transportation. For example, there is a bus system in Gainesville that increases access to supermarkets.



The high number of supermarkets in this map area makes me wonder how access could reasonably be increased in Gainesville. Consumers obviously need supermarkets, but consumers also need housing, green spaces, medical services, shops, etc. The point of this is not to trivialize the effects of access to food. Rather, the point is that the current measure of food deserts appears to be too liberal. If we are interested in the effects of access and income on diets, we need more realistic measures of low access and income.