Thursday, December 06, 2007

Corporate Tax War

December 4, 2007; Page A20

Word is that the Bush Administration will soon propose a cut in the U.S. corporate income tax, following House Democrat Charlie Rangel's proposal this fall to cut the rate to 30.5% from 35%. As a new study makes clear, such a reduction would give a lift to the U.S. economy when it really needs it.

The study, from the National Bureau of Economic Research, looked at corporate taxes in 85 countries from 1996 to 2005. Economists from the World Bank and Harvard University calculated the effective business tax rate for each country, because some nations have so many tax loopholes that the rate paid by companies can be one-half to one-third the statutory tax rate. The study found that corporate taxes have a statistically significant negative effect on economic performance.

High business taxes were found to reduce a nation's domestic capital investment, the amount of foreign investment into that country, and its overall growth in GDP. The authors conclude that "corporate taxation reduces the return on capital and thus discourages investment" and "reduces the cash flow of the firm" in such a way as to reduce the after-tax capital available for reinvestment.

The researchers also found that high corporate levies reduce entrepreneurship, which drives new industries and job growth. In many nations the corporate tax rate is paid both by large corporations and small businesses. In the U.S., small businesses are often organized under Subchapter S of the tax code and thus pay the personal income tax rate. However the tax is imposed, the study found, "a 10 percentage point rise in a nation's effective corporate tax rate causes a decline in the number of firms by 1.8 per 100 people (the average is 5 per 100 population)."
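
(my illustration, in Python — only the 1.8 and 5-per-100 figures come from the quote above; the percentage calculation is mine:)

    # Back-of-the-envelope arithmetic, not taken from the NBER study itself.
    baseline_firms_per_100 = 5.0   # average number of firms per 100 people cited above
    decline_per_10_points = 1.8    # decline per 10-percentage-point rise in the effective rate
    print(decline_per_10_points / baseline_firms_per_100)   # 0.36, i.e. roughly a 36% drop in firm density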

The clear implication is that raising the U.S. personal income tax rates would also stunt small business entrepreneurship. Yet this is precisely what all of the Democratic Presidential candidates, and even Mr. Rangel, propose. In Mr. Rangel's case, the benefits of his cut in the corporate tax for big business to 30% would be offset by the damage he'd do by raising the top marginal tax rate on individuals and small businesses to as high as 44%. The NBER research suggests this could discourage hundreds of thousands of small businesses from being formed in the next few years.

This study supports research earlier this year by economist Kevin Hassett of the American Enterprise Institute, which found that high business taxes also result in lower wages for workers. The higher rate means less capital investment in making workers productive, which translates into smaller paychecks.

What American CEOs understand, but most in the media and political class so far refuse to acknowledge, is that the U.S. is far behind the rest of the world in reducing corporate tax rates. The U.S. corporate income tax rate is the world's second highest after Japan's among developed nations. Even Mr. Rangel's proposed reduction would leave the U.S. well above the OECD average of 25%. In recent years, Germany, France, the United Kingdom, Vietnam, Poland and Singapore, among many other nations, have either cut or proposed to cut their business tax rates. These lower rates are attracting more investment and capital, and they pose a threat to America's economic competitiveness if Washington fails to act.

The NBER study is a reminder of how out-of-touch America's current political debate is with global economic trends. American politicians are proposing new barriers to trade, as well as new obstacles to capital formation, even as the rest of the world is welcoming more of both. The study is also a reminder that just because workers don't see a tax doesn't mean they don't feel its impact. If America is going to remain the developed world's leading job creator and economic engine, corporate tax rates are going to have to fall -- and by more than even Mr. Rangel has suggested.

URL for this article: http://online.wsj.com/article/SB119673397691112663.html
Copyright 2007 Dow Jones & Company, Inc. All Rights Reserved

Thursday, November 15, 2007

Taxation and Redistribution

TAXATION and REDISTRIBUTION (chapter) from THE CONSTITUTION OF LIBERTY by F.A. Hayek, 1960.

A series of excerpts (even though not blockquoted):

In many ways I wish I could omit this chapter. Its argument is directed against beliefs so widely held that it is bound to offend many.

Redistribution by progressive taxation has come to be almost universally accepted as just. Yet it would be disingenuous to avoid discussing this issue. Moreover, to do so would mean to ignore what seems to me not only the chief source of irresponsibility of democratic action but the crucial issue on which the whole character of future society will depend.

After a long period in which there was practically no questioning of the principle of progressive taxation and in which little discussion took place that was new, there has lately appeared a much more critical approach to the problem.

It should be said at once that the only progression with which we shall be concerned and which we believe cannot in the long run be reconciled with free institutions is the progression of taxation as a whole, that is, the more than proportionally heavy taxation of the larger incomes when all taxes are considered together. Individual taxes, and especially the income tax, may be graduated for a good reason – that is, so as to compensate for the tendency of many indirect taxes to place a proportionally heavier burden on the smaller incomes. This is the only valid argument in favor of progression.

It is clearly possible to bring about considerable redistribution under a system of proportional taxation. All that is necessary is to use a substantial part of the revenue to provide services which benefit mainly a particular class or to subsidize it directly.

As is true of many similar measures, progressive taxation has assumed its present importance as a result of having been smuggled in under false pretenses. When at the time of the French Revolution and again during the socialist agitation preceding the revolutions of 1848 it was frankly advocated as a means of redistributing incomes, it was decisively rejected.

J.R. McCulloch expressed the chief objection in the often quoted statement: “The moment you abandon the cardinal principle of exacting from all individuals the same proportion of their income or of their property, you are at sea without rudder or compass, and there is no amount of injustice and folly you may not commit.” In 1848 Karl Marx and Friedrich Engels frankly proposed “a heavy progressive or graduated income tax” as one of the measures by which, after the first stage of the revolution, “the proletariat will use its political supremacy to wrest, by degrees, all capital from the bourgeoisie, to centralize all instruments of production in the hands of the state.”

But the general attitude was still well summed up in A. Thiers’s statement that “proportionality is a principle, but progression is simply hateful arbitrariness,” or John Stuart Mill’s description of progression as “a mild form of robbery.”

But after this first onslaught had been repelled, the agitation for progressive taxation reappeared in a new form. The social reformers, while generally disavowing any desire to alter the distribution of incomes, began to contend that the total tax burden, assumed to be determined by other considerations, should be distributed according to “ability to pay” in order to secure “equality of sacrifice” and that this would be best achieved by taxing incomes at progressive rates. (Its basic conception is that of the decreasing marginal utility of successive acts of consumption.) Modern developments within the field of utility analysis itself have, however, completely destroyed the foundations of this argument. There can now be little doubt that the use of utility analysis in the theory of taxation was a regrettable mistake (in which some of the most distinguished economists of the time shared) and that the sooner we can rid ourselves of the confusion it has caused, the better.

Those who advocated progressive taxation during the latter part of the nineteenth century generally stressed that their aim was only to achieve equality of sacrifice and not a redistribution of income; also they generally held that this aim could justify only a “moderate” degree of progression and that its “excessive” use was, of course, to be condemned.

The suggestion that rates would not stay within these limits was treated as a malicious distortion of the argument, betraying a reprehensible lack of confidence in the wisdom of democratic government.

In 1891, Prussia introduced a progressive income tax rising from 0.67 to 4 per cent. In vain did Rudolf von Gneist, the venerable leader of the then recently consummated movement for the Rechtsstaat, protest in the Diet that this meant the abandonment of the fundamental principle of equality before the law, “of the most sacred principle of equality,” which provided the only barrier against encroachment on property.

Though some other Continental countries soon followed Prussia, it took nearly twenty years for the movement to reach the great Anglo-Saxon powers. It was only in 1910 and 1913 that Great Britain and the United States adopted graduated income taxes rising to the then spectacular figures of 8.25 and 7 per cent, respectively. Yet within thirty years these figures had risen to 97.5 and 91 per cent. Thus in the space of a single generation what nearly all the supporters of progressive taxation had for half a century asserted could not happen came to pass.

It has come to be generally accepted once more that the only ground on which a progressive scale of over-all taxation can be defended is the desirability of changing the distribution of income and that the defense cannot be based on any scientific argument but must be recognized as a frankly political postulate, that is, as an attempt to impose upon society a pattern of distribution determined by majority decision.

It seems at least probable (though nobody can speak on this with certainty) that under progressive taxation the gain to revenue is less than the reduction of real income which it causes. If the belief that the high rates levied on the rich make an indispensable contribution to total revenue is thus illusory, the claim that progression has served mainly to relieve the poorest classes is belied by what happened in the democracies during the greater part of the period since progression was introduced.

…once the principle of proportional taxation is abandoned, it is not necessarily those in greatest need but more likely the classes with the greatest voting strength that will profit…

The real reason why all the assurances that progression would remain moderate have proved false and why its development has gone far beyond the most pessimistic prognostications of its opponents is that all arguments in support of progression can be used to justify any degree of progression. … Unlike proportionality, progression provides no principle which tells us what the relative burden of different persons ought to be.

…democracy has yet to learn that, in order to be just, it must be guided in its action by general principles. What is true of individual action is equally true of collective action, except that a majority is perhaps even less likely to consider explicitly the long-term significance of its decision and therefore is even more in need of guidance by principles.

It is the great merit of proportional taxation that it provides a rule which is likely to be agreed upon by those who will pay absolutely more and those who will pay absolutely less and which, once accepted, raises no problem of a separate rule applying only to a minority. In no sense can a progressive scale of taxation be regarded as a general rule applicable equally to all – in no sense can it be said that a tax of 20 per cent on one person’s income and a tax of 75 per cent on the larger income of another person are equal. Progression provides no criterion whatever of what is and what is not to be regarded as just. …members of the majorities have found themselves again and again unexpectedly the victims of the discriminatory rates for which they had voted in the belief that they would not be affected. (my comment: think of 1970s bracket creep or, more recently, the alternative minimum tax created by the 1969 tax act)

…proportional taxation leaves the relations between the net remunerations of different kinds of work unchanged. It concerns the effect, not on the relations between individual incomes, but on the relations between the net remunerations for particular services performed, and it is this which is the economically relevant factor.

There can be no doubt, however, about whether or not the net remunerations for two services which before taxation were equal still stand in the same relation after taxes have been deducted. And this is where the effects of progressive taxation are significantly different from those of proportional taxation. The use that will be made of particular resources depends on the net reward for services, and, if the resources are to be used efficiently, it is important that taxation leave the relative recompenses that will be received for particular services as the market determines them. Progressive taxation alters this relation substantially by making net remuneration for a particular service dependent upon the other earnings of the individual over a certain period, usually a year.

…progressive taxation necessarily offends against what is probably the only universally recognized principle of economic justice, that of “equal pay for equal work.”

A man who has worked very hard, or for some reason is in greater demand, may receive a much smaller reward for further effort than one who has been idle or less lucky. Indeed, the more the consumers value a man’s services, the less worthwhile will it be for him to exert himself further. The fact that with progressive taxation the net remuneration for any service will vary with the time rate at which the earning accrues thus becomes a source not only of injustice but also of a misdirection of resources.

No practicable scheme of averaging incomes can do justice to the author or inventor, the artist or actor, who reaps the rewards of perhaps decades of effort in a few years. Nor should it be necessary to elaborate further on the effects of steeply progressive taxation on the willingness to undertake risky capital investments. It is obvious that such taxation discriminates against those risky ventures which are worthwhile only because, in case of success, they will bring a return big enough to compensate for the great risk of total loss.

Beyond the harmful effects on incentive and investment…, there are other effects which are less understood but at least equally important. Of these, one which perhaps still deserves emphasis is the frequent restriction or reduction of the division of labor. The tendency to “do it yourself” comes to produce the most absurd results when, for instance, a man who wishes to devote himself to more productive activities may have to earn in an hour twenty or even forty times as much in order to be able to pay another whose time is less valuable for an hour’s services.

We can also only briefly mention the very serious effect of progressive taxation on the supply of savings. The socialist answer to those who are concerned about this effect on savings is, in fact, no longer that these savings are not needed but that they should be supplied by the community, i.e., out of the funds raised from taxation. This, however, can be justified only if the long-term aim is socialism of the old kind, namely, government ownership of the means of production.

One of the chief reasons why progressive taxation has come to be so widely accepted is that the great majority of people have come to think of an appropriate income as the only legitimate and socially desirable form of reward. They think of income not as related to the value of the services rendered but as conferring what is regarded as an appropriate status in society. …this contention lacks all foundation and appeals only to emotion and prejudice… There is no necessary relation between the time an action takes and the benefit that society will derive from it. The whole attitude which regards large gains as unnecessary and socially undesirable springs from the state of mind of people who are used to selling their time for a fixed salary or fixed wages and who consequently regard a remuneration of so much per unit of time as the normal thing. … It is meaningless for men whose task is to administer resources at their own risk and responsibility and whose main aim is to increase the resources under their control out of their own earnings.

Profits and losses are mainly a mechanism for redistributing capital among these men rather than a means providing their current sustenance. The conception that current net receipts are normally intended for current consumption, though natural to the salaried man, is alien to the thinking of those whose aim is to build up a business. Even the conception of income itself is in their case largely an abstraction forced upon them by the income tax. …I doubt whether a society consisting mainly of “self-employed” individuals would ever have come to take the concept of income so much for granted as we do or would ever have thought of taxing the earnings from certain services according to the rate at which they accrued in time. It is questionable whether a society which will recognize no reward other than what appears to its majority as an appropriate income, and which does not regard the acquisition of a fortune in a relatively short time as a legitimate form of remuneration for certain kinds of activities, can in the long run preserve a system of private enterprise.

…the building-up of new enterprises is still and probably always will be done mainly by individuals controlling considerable resources. New developments, as a rule, will still have to be backed by a few persons intimately acquainted with particular opportunities; and it is certainly not to be wished that all future evolution should be dependent on the established financial and industrial corporations.

It is one of the advantages of a competitive system that successful new ventures are likely for a short time to bring very large profits and that thus the capital needed for development will be formed by the persons who have the best opportunity of using it.

Much of the individual formation of new capital, since it is offset by capital losses of others, should be realistically seen as part of a continuous process of redistribution of capital among the entrepreneurs. The taxation of such profits, at more or less confiscatory rates, amounts to a heavy tax on that turnover of capital which is part of the driving force of a progressive society.

The most serious consequence, however, of the discouragement of individual capital formation where there are temporary opportunities for large profits is the restriction of competition. …taxes today absorb the greater part of the newcomer’s “excessive” profits… The old firms do not need to fear his competition: they are sheltered by the tax collector. …what is more important for them is that it prevents the dangerous newcomer from accumulating any capital. They are virtually privileged by the tax system. In this sense progressive taxation checks economic progress and makes for rigidity.

An even more paradoxical and socially grave effect of progressive taxation is that, though intended to reduce inequality, it in fact helps to perpetuate existing inequalities and eliminates the most important compensation for that inequality which is inevitable in a free-enterprise society. … A system based on private property and control of the means of production presupposes that such property and control can be acquired by any successful man. If this is made impossible, even the men who otherwise would have been the most eminent capitalists of the new generation are bound to become the enemies of the established rich.

Can there be much doubt that poor countries, by preventing individuals from getting rich, will also slow down the general growth of wealth? And does not what applies to the poor countries apply equally to the rich?

It is probable that the practice is based on ideas which most people would not approve if they were stated abstractly. That a majority should be free to impose a discriminatory tax burden on a minority; that, in consequence, equal services should be remunerated differently; and that for a whole class, merely because its incomes are not in line with those of the rest, the normal incentives should be practically made ineffective – all these are principles which cannot be defended on grounds of justice.

Yet experience in this field shows how rapidly habit blunts the sense of justice and even elevates into a principle what in fact has no better basis than envy. If a reasonable system of taxation is to be achieved, people must recognize as a principle that the majority which determines what the total amount of taxation should be must also bear it at the maximum rate.

…some progression in personal income taxation is probably justified as a way of compensating for the effects of indirect taxation.

What is needed is a principle that will limit the maximum rate of direct taxation in some relation to the total burden of taxation. The most reasonable rule of the kind would seem to be one that fixed the maximum admissible (marginal) rate of direct taxation at that percentage of the total national income which the government takes in taxation. This would mean that if government took 25 per cent of the national income, 25 per cent would also be the maximum rate of direct taxation of any part of individual incomes.
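
(my illustration: Hayek's ceiling is simple enough to state as a one-line rule. A minimal sketch in Python — the 25 per cent example comes from the passage above; the function name and layout are mine:)

    # Hayek's proposed ceiling: the maximum admissible marginal rate of direct taxation
    # equals the share of national income that government takes in taxation overall.
    def max_marginal_rate(total_tax_revenue, national_income):
        return total_tax_revenue / national_income

    # Hayek's example: government takes 25 per cent of national income,
    # so no part of any individual income may face a marginal rate above 25 per cent.
    print(max_marginal_rate(25.0, 100.0))   # 0.25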

Tuesday, November 13, 2007

Historical example of coping with economic change

Putting the Past to Work

By JACK FALVEY

IN AN AGE OF INFORMATION OVERLOAD, identifying the most useful information in a timely fashion isn't easy -- and it may be some comfort to know it never was. Yet by studying the adaptive skills of earlier captains of commerce, entrepreneurs in even the most cutthroat businesses can learn how to smack down the competition.

The key: Embrace invention -- even that of your competitors -- and use it better and faster than they do.

In the 1870s, John D. Rockefeller had a telegraph line run to his Euclid Avenue home in Cleveland. When he came home for lunch, he could stay in touch with his Oil City, Pa., contacts for updates on gushers and dry holes. He could then telegraph his brother in New York to adjust the price of kerosene for the European market, and his brother could pass the price on to Europe by trans-Atlantic cable.

Although Standard Oil employed telegraphers, John D. Rockefeller sent and received his own "e-mails." Sending and receiving Morse code at commercial speeds were not easy skills to master, but Rockefeller was "computer-literate." He had to be skilled in the current technology to have the best information and act on it.

The oil business of that day was not a fuel business. Standard Oil sold illumination. Tallow and whale-oil concerns were its competitors. Kerosene lamps, especially with mantles that burned white-hot, were a great advance in technology. Standard Oil produced a lamp-fuel kerosene of such purity that explosions were greatly reduced. Its five-gallon branded blue tins became known around the world. (Meanwhile, the byproduct of kerosene distillation, gasoline, was discarded as a nuisance.)


Then came Thomas Alva Edison and his light bulb. Rockefeller had to find both a new product and a new customer. Henry Ford provided the opportunity; Rockefeller changed course to exploit it. Rockefeller had the information and vision to see and connect illumination and transportation.

[Illustration: In business, vanquishing obstacles and obsolescence requires the same skills now as in the days of the sextant.]

John Patterson was a coal dealer in the late 1800s who was troubled by cash pilferage. To catch the thief, he purchased locked wooden boxes that required transactions be punched into paper-tape rolls before the cash drawer would open -- with a spring mechanism ringing a bell so management would know the drawer had been opened. At the end of each day, he could unlock the side compartment, count the holes and have accurate sales numbers for the shift. Patterson was then able to fire the right man for cause.

In retirement from the coal business, he bought the cash-register company since he was impressed with its product. Patterson renamed the company National Cash Register and began his quest to change the business world.

One of his sales managers was not to his liking, and as a man in a hurry, Patterson sacked him forthwith. Infuriated with what he viewed as the injustice of his termination, the fired man went with a competitor, gained control of that company and renamed it. If Patterson was to be National Cash Register, NCR, his ex-employee would one-up him and be International Business Machines, IBM. Tom Watson joined the business-knowledge arms race with a vengeance. Accounting machines became the battleground. Yet, much of the valuable information that was generated in the distillation of the numbers was discarded.

Similarly, before the 1970s retailers counted and recounted products on shelves. When NCR and IBM introduced universal product codes and check-out scanners, sales numbers were continuously available. Yet most retailers continued taking inventory as they'd always done, until someone (likely a paid consultant) showed them how to use the tracking information to improve shelf-stocking efficiency.

Now, product-tracking covers not only stocking, ordering, and inventory-tallying functions; it has advanced beyond the point of sale. CVS, the drugstore chain, has tens of millions of customer-information cards in use. CVS can adjust prices in accordance with customer acceptance and calibrate that information to mesh with vast amounts of other store data. Each of more than 6,200 CVS locations will have the needed amount of merchandise as predicted by recent histories of supply-and-demand flow, paring the likelihood of out-of-stocks and overstocks at the end of a sale.

Whether you're measuring by sextant or global-positioning system, the objective is still the same. Consider this exchange:

Irish radio operator to British ship: "Alter course 20 degrees right to avoid collision."

British ship: "Alter your course 20 degrees left to avoid collision."

Irish: "Negative, alter your course 20 degrees right to avoid collision."

British: "This is His Majesty's aircraft carrier Royal Oak. I demand you alter your course 20 degrees left to avoid collision."

Irish: "Sir, I am a lighthouse."

There have always been immovable objects in our world. Navigating around them is still a sound strategy. While our new knowledge economy requires different navigation techniques, we can't discard the basics of all that we know about what has gone before. Building on history is the way to get to new heights.

Santayana said "Those who cannot remember the past are condemned to repeat it." Those who don't understand history will be baffled by the future. The knowledge economy is not a new challenge.


JACK FALVEY teaches at the Boston campus of the University of Massachusetts. He founded MakingTheNumbers.com (1). This article was excerpted from his forthcoming book, Getting It Done: Navigating in the Knowledge Economy.



URL for this article: http://online.barrons.com/article/SB119465204901488593.html
Hyperlinks in this Article: (1) http://www.makingthenumbers.com/
Copyright 2007 Dow Jones & Company, Inc. All Rights Reserved

Saturday, November 03, 2007

Darn Beliefs comment

Darn Beliefs comment

Doug North, Understanding the Process of Economic Change, pp. 57-58, 135, 137

Any discussion of the role of beliefs and values in shaping change inevitably turns to Max Weber’s pioneering work. His Protestant Ethic and the Spirit of Capitalism emphasizes the religious origins of such values. Yujiro Hayami has stressed the importance of moral codes in business transactions in Japan: “it was an admixture of Confucianism, Buddhism and Shintoism, but in substance it taught the same morals that Adam Smith considered to be the basis of the wealth of nations – frugality, industry, honesty and fidelity. Clearly this ideology was an important support for commercial and industrial development in the late Tokugawa period, as it suppressed moral hazards and reduced the costs of market transactions.”

Elsewhere

In his Protestant Ethic and the Spirit of Capitalism, Max Weber is concerned to show that the religious ethic embodied in Protestantism – and specifically Calvinism – contained values that promoted the growth of capitalism. But which way does the causation run; and how do we know that both the values and the growth of capitalism did not stem from some other source? Weber makes a connection between religious views and values, and between values and economic behavior; but he does not demonstrate how the consequent behavior would generate the growth of the specific institutions and organizations that produced a growing economic system. Moreover, Counter-Reformation Catholicism may have equally encouraged the same individualism and sense of discipline that Weber uniquely ascribes to Protestantism.

A long-standing view of many scholars has been that individualistic behavioral beliefs are congenial to economic growth. Alan Macfarlane’s controversial The Origins of English Individualism traces the sources of English individualism back to the thirteenth century or earlier. It paints a picture of a fluid, individualistically oriented set of attitudes toward the family, the organization of work, and the social structure of the community.

The belief structure embodied in Christian dogma was, despite some notorious contrary illustrations, amenable to evolving in directions that made it hospitable to economic growth. Both Ernst Benz and Lynn White maintain that Christian belief gradually evolved the view that nature should serve mankind and that therefore the universe could and should be controlled for economic purposes. Such an attitude is an essential precondition for technological progress. But it was particularly the unique institutional conditions of parts of medieval/early modern Europe that provided the sort of experiences that served as part of the catalyst to precipitate such perceptions. From this perspective Weber’s protestant ethic is a part of the story of this adaptation but is “downstream” from the originating sources.

Deepak Lal, Reviving the Invisible Hand, pp. 31, 34

Concerning the evolution of international property rights:

These international standards built on the system of commercial law that had been created as a result of Pope Gregory VII’s papal revolution in the eleventh century, which established the church-state, and a common commercial law for Christendom.

…in his detailed and careful study piecing together long-term estimates of world GDP and population, Maddison conclusively shows that the so-called Great Divergence, which led one small part of the Eurasian continent to begin a process that slowly but certainly led it to forge ahead of the other Eurasian civilizations, began in the eleventh century. This date also fits my argument in Unintended Consequences, that it was the change in the material beliefs of the West inaugurated by the second of the two papal revolutions which led slowly but certainly to the Great Divergence. Maddison finds that “Western Europe overtook China in per capita performance in the 14th century. Thereafter China and most of the rest of Asia were more or less stagnant in per capita terms until the second half of the 20th century.”

Friday, October 26, 2007

The Truth About the Top 1%

By ALAN REYNOLDS October 25, 2007; Page A23

Key legislators and presidential hopefuls in the Democratic Party have proposed raising the top two tax rates. They're also suggesting extra surtaxes for war, for alleviating the Alternative Minimum Tax, for Social Security, and for subsidizing compulsory health insurance. Barack Obama and John Edwards advocate taxing capital gains at 28%; Hillary Clinton favors taxing dividends at the surtaxed income-tax rates.

The argument for these proposals has nothing to do with the impact of higher tax rates on incentives and the economy. It is all about "fairness" -- defined as reducing the top 1%'s share of income.

[illustration]

This political exercise invariably begins by citing dubious statistics about pretax incomes among the top 1% (1.3 million tax returns) as an excuse for raising tax rates on the top 5%, among others. Echoing speeches from Sen. Clinton, Business Week recently exclaimed, "According to new Internal Revenue Service data announced last week, income inequality in the U.S. is at its worst since the 1920s (before the Great Depression). The top percentile of wealthy Americans earned 21.2% of all income in 2005, up from 19% in 2004."

These statistics are extremely misleading.

First of all, the figures do not describe the top percentile's share of "all income," but that group's share of "adjusted" gross income (AGI) reported on individual tax returns. For one thing, thousands of professionals and business owners who used to report most of their income under the corporate tax responded to lower individual income-tax rates after 1986 and 2003 by reporting more income under the individual tax as partnerships, LLCs and Sub-S corporations.

Peter Merrill of PricewaterhouseCoopers found that "since the Tax Reform Act of 1986 . . . the share of business income earned through pass-through entities has increased by 75% from 29% in 1987 to 52% in 2004." Business profits accounted for just 11.1% of the income reported by the top 1% in 1986, according to economists Thomas Piketty and Emmanuel Saez, but that business share leaped to 21.2% by 1988 and to 29.1% in 2005.

It is this bookkeeping shift, moving business income from the corporate to the individual tax, not CEO pay, which raised the top 1%'s share on individual tax returns. Income reported on W2 forms -- salaries, bonuses and exercised stock options -- accounted for only 57.2% of total income among the top 1% in 2005, down from 63% in 2000 and 65.7% in 1986. Real compensation among the top 1% actually fell 7% from 2000 to 2005.

Turning to the denominator of this ratio ("all income"), a huge portion of middle and lower income is no longer reported on tax returns. A larger and larger share of middle-class investment income is now accumulating outside of AGI because it is inside IRA, 401(k) and 529 savings plans.

The CBO reckons the top 1% accounted for more than 59% of all capital gains, interest, dividends and rent reported on individual tax returns by 2004. Yet estimates of the share of national wealth of the top 1% range from 21% to 33%.

If the top 1% own 21%-33% of all capital, how could they be collecting 59% of the income from capital? They can't and they aren't. The top 1% is simply reporting a rising share of capital income because those with more modest incomes are keeping a rising share of their capital income unreported -- in IRA, 401(k) and 529 accounts. Millions also shrink their "adjusted" incomes by subtracting contributions to IRAs unavailable to the rich.
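
(my illustration: Reynolds's "they can't and they aren't" can be made concrete with a little arithmetic. A minimal sketch in Python — the 21%-33% wealth shares and the 59% capital-income share come from the article; the implied-yield framing is mine:)

    # If the top 1% held a share w of all capital yet collected a share y of all reported
    # capital income, their capital would have to yield several times as much as everyone
    # else's -- which is Reynolds's point about unreported middle-class capital income.
    def implied_relative_yield(wealth_share, income_share):
        top = income_share / wealth_share                 # top 1% income per unit of capital
        rest = (1 - income_share) / (1 - wealth_share)    # everyone else's income per unit of capital
        return top / rest

    for w in (0.21, 0.33):
        print(round(implied_relative_yield(w, 0.59), 1))  # about 5.4 and 2.9, respectively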

Another huge swath of middle and lower income is excluded because AGI includes only the taxable portion of Social Security benefits and totally misses most other transfer payments such as Medicaid, food stamps and the Earned Income Credit. The Canberra Group, an international group of experts on income statistics brought together from 1996-2000 by the OECD, World Bank, U.N. and others, insisted household income must include everything that "increases the recipients' potential to consume or save." Government transfers amounted to $1.5 trillion in 2005 -- more than the total income of the top 1% in the basic Piketty and Saez estimates ($1.2 trillion).

As a result of such huge omissions, and tax avoidance, the AGI of $7.5 trillion in 2005 was $3.7 trillion smaller than pretax personal income (personal income was $10.3 trillion in 2005, after subtracting $875 billion of payroll taxes). Anyone suggesting AGI is a more accurate measure than personal income is obliged to argue that GDP in 2005 was exaggerated by 29.4%.
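
(my note: one way the figures appear to reconcile — my reading, since the article does not spell it out — is that "pretax personal income" means personal income plus the payroll taxes already netted out of it:)

    # Reconciling the 2005 figures quoted above, in trillions of dollars (my assumption
    # about what "pretax personal income" includes, not stated explicitly in the article).
    agi = 7.5
    personal_income = 10.3          # after subtracting payroll taxes
    payroll_taxes = 0.875
    pretax_personal_income = personal_income + payroll_taxes    # about 11.2
    print(round(pretax_personal_income - agi, 1))               # 3.7, the gap the article cites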

Second, estimated income shares from the IRS or Messrs. Piketty and Saez are not about income per household, but income per tax return. That matters because the top fifth of households average two salaries per tax return. The Census Bureau reports that the top fifth accounted for 26.8% of all full-time workers last year while the bottom fifth accounted for just 5.7%. In fact, 64.5% of the households in the bottom fifth had nobody working, not even part time for a few weeks. When labor economists discuss income inequality, they habitually switch to speculating about skill-based differences in hourly wages, totally ignoring differences in hours worked.

Third, the latest IRS figures are not comparable with those of 1986, much less with 1929, because the definition of AGI changes with changes in tax law. Such estimates differ greatly, with the IRS saying the top 1% received only 11.3% of income in 1986 (because AGI then excluded 60% of capital gains) while Messrs. Piketty and Saez put that year's figure at 13.1% and the CBO says it was 14%.

The IRS figures only go back to 1986, so the Business Week comparison with the 1920s is invalid. The new figure is from the IRS but the old one is from Messrs. Piketty and Saez. Their recent estimates are also not comparable to their prewar estimates. Before 1944, their figures were obtained by dividing top income shares by 80% of personal income. Their estimates for 2005 were obtained by dividing top incomes by the $6.8 trillion left on tax returns after excluding even taxable transfer payments.

If total income for 2005 was defined as it was for 1928, then the share of the top 1% would have dropped to 13.3% in 2005, compared with 19.8% in 1928. Besides, as Messrs. Piketty and Saez explained, "our long-run series are generally confined to top income and wealth shares and contain little information about bottom segments of the distribution."

A fundamental problem with all tax-based income data involves "taxable income elasticity." Numerous studies, some by Mr. Saez, show that the amount of top income reported on individual tax returns is highly sensitive to changes in marginal tax rates on individual income, corporate income and capital gains. After the tax on dividends was reduced in 2003, for example, top-bracket investors held more dividend-paying stocks in taxable accounts (rather than in nontaxable accounts) and fewer tax-exempt bonds.

When the top tax on capital gains was cut in 1997 and 2003, investors reacted by trading stocks more frequently and realizing more capital gains in taxable accounts. In the Piketty-Saez data, capital gains accounted for only 10.8% of the top 1%'s income from 1987 to 1996, when the capital gains tax was 28%. By contrast, capital gains accounted for 16.9% of the top 1%'s reported income from 1997 through 2002, when the tax was down to 20%.

Even if estimates of the top 1%'s income share were not so sensitive to changes in tax rates, they would still tell us nothing about what happened to incomes among the other 99%. The top 1%'s share always falls in recessions, even aside from capital gains. But that certainly doesn't mean recessions raise everyone else's income.

"It is a disputed question," wrote Messrs. Piketty and Saez, "whether the surge in reported top incomes has been caused by the reduction in taxation at the top through behavioral responses." In fact, their data clearly suggest that higher tax rates on top incomes, dividends and capital gains would sharply reduce top incomes, dividends and capital gains reported on individual tax returns. Such behavioral responses would have little impact on actual income or wealth at the top, while nonetheless leaving middle-income taxpayers stuck with a much larger share of the tax burden.

Mr. Reynolds, a senior fellow with the Cato Institute, is the author of "Income and Wealth" (Greenwood Press 2006).

URL for this article: http://online.wsj.com/article/SB119327553557070785.html
Copyright 2007 Dow Jones & Company, Inc. All Rights Reserved

Thursday, October 18, 2007

Mead Example

This is an example of referring to an offsite MP3 file hosted at www.freedrive.com. Click on the link below and your computer should automatically download and begin playing it (I think). Mead MP3

Thursday, October 11, 2007

Global Warming links at FreedomKeys

A collection of links from this website. "Let's be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world. In science consensus is irrelevant. What is relevant is reproducible results. The greatest scientists in history are great precisely because they broke with the consensus." -- Michael Crichton
Alarmist Claims are Melting Under Scrutiny
The earth has spent most of its time in ICE AGES. Are YOU smart enough to see WHERE WE ARE NOW on this chart? Also see: "Read the Sunspots"
WHY POLITICIZED SCIENCE IS DANGEROUS
SEE THE AMERICAN ECOLOGICAL CHURCH OF GLOBAL WARMING, HERE.

See: “the world’s southernmost continent shows that temperatures during the late 20th century did not climb as had been predicted by many global climate models” HERE
So far, 15,000 scientists have signed the petition saying that global warming is NOT caused enough by humans, if at all, to justify making public policy about it. See who they are HERE.
TOTAL greenhouse gas contributions FROM HUMANS add up to only about .0028 of the greenhouse effect.
Michael Crichton's State of Fear and the END of Radical Environmentalism

A divide thanks to regulation

Culture and Commerce, Atlantic Monthly, November 2007

Real estate may be as important as religion in explaining the infamous gap between red and blue states.

by Virginia Postrel

A Tale of Two Town Houses

In 2000, my husband and I moved out of our mid-1970s three-bedroom town house in Los Angeles and into a brand-new three-bedroom town house in Uptown Dallas. At the time, the two were worth about the same, but the Dallas place was 1,000 square feet bigger. We’ve moved back to L.A., and we’re glad we kept our old house. Over the past seven years, its value has roughly doubled. By contrast, we sold our Dallas place for $6,500 less than we paid for it.

It’s not that we bought into a declining Dallas neighborhood: Uptown is one of the hottest in the city, with block upon block of new construction. But the supply of housing in Dallas is elastic. When demand increases, because of growing population or rising incomes, so does the amount of housing; prices stay roughly the same. That’s true not only in the outlying suburbs, but also in old neighborhoods like ours, where dense clusters of town houses and multistory apartment buildings are replacing two-story fourplexes and single-family homes. It’s easy to build new housing in Dallas.

Not so in Los Angeles. There, increased demand generates little new supply. Even within zoning rules, it’s hard to get permission to build. When a local developer bought three small 1920s duplexes on our block, planning to replace them with a big condo building, neighbors campaigned to stop the project. The city declared the charming but architecturally undistinguished buildings historic landmarks, blocking demolition for a year. The developer gave up, leaving the neighborhood’s landscape—and its housing supply—unchanged. In Los Angeles, when demand for housing increases, prices rise.

Dallas and Los Angeles represent two distinct models for successful American cities, which both reflect and reinforce different cultural and political attitudes. One model fosters a family-oriented, middle-class lifestyle—the proverbial home-centered “balanced life.” The other rewards highly productive, work-driven people with a yen for stimulating public activities, for arts venues, world-class universities, luxury shopping, restaurants that aren’t kid-friendly. One makes room for a wide range of incomes, offering most working people a comfortable life. The other, over time, becomes an enclave for the rich. Since day-to-day experience shapes people’s sense of what is typical and normal, these differences in turn lead to contrasting perceptions of economic and social reality. It’s easy to believe the middle class is vanishing when you live in Los Angeles, much harder in Dallas. These differences also reinforce different norms and values—different ideas of what it means to live a good life. Real estate may be as important as religion in explaining the infamous gap between red and blue states.

The Dallas model, prominent in the South and Southwest, sees a growing population as a sign of urban health. Cities liberally permit housing construction to accommodate new residents. The Los Angeles model, common on the West Coast and in the Northeast Corridor, discourages growth by limiting new housing. Instead of inviting newcomers, this approach rewards longtime residents with big capital gains and the political clout to block projects they don’t like.

The direct results of these strategies are predictable: cheap, plentiful housing in some places, and expensive, scarce housing in others. A remodeler working on my L.A. town house a couple of years ago wistfully recalled visiting a cousin in Arlington, Texas, between Dallas and Fort Worth. He wanted to move there himself. In Arlington, he said, “you can buy a million-dollar house for $200,000.” According to Coldwell Banker’s annual survey, a 2,200-square-foot, four-bedroom “middle-management” home costs around $141,000 in Arlington (or, for big spenders, $288,000 in Dallas), compared with $1 million or more in the L.A. area. One man’s million-dollar dream home is another’s plain old tract house.

Many people do pack up and move, if not to Arlington, then to Las Vegas or Charlotte. Historically a magnet for educated migrants, California has begun losing college-educated residents, on net, to other states, in large part because of the high cost of housing. Most of the South’s population growth since the 1980s has come from the lure of cheap housing created by liberal permitting policies, according to new research by the Harvard economists Edward Glaeser and Kristina Tobio. By lowering the cost of housing, these policies give residents higher real incomes compared with similarly paid workers elsewhere—a strong incentive to move, even if you don’t like bugs or hot summers. The mobile middle class gravitates to the cities where housing is affordable. “If you’re your basic $85,000-a-year person, you can’t own in Los Angeles. You can’t do it,” says the Wharton School economist Joseph Gyourko. And if you’re your basic $45,000-a-year person, closer to the U.S. median household income, you’d better pack for Texas.

That doesn’t mean Los Angeles and San Francisco are in any danger of turning into Detroit and Buffalo. To the contrary, Gyourko calls them “superstar cities,” places that offer “a rare blend” of stimulating leisure activities and a highly productive work environment. A life that looks “rushed” and “materialistic” to the folks headed for North Carolina feels exciting and creative to die-hard urbanites. As a friend who recently moved from Manhattan to Santa Monica once said to me, “When people say a place is ‘good for raising children,’ that means it’s boring.” But not everyone with a taste for urban amenities can afford the superstar life. As the number of affluent Americans grows, the rich are bidding up the price of living in these special places, increasing the gap between the superstar cities and everyplace else.

People in these high-price areas respond that they have no control over housing costs. Everyone wants to live in California, and the land is already full of houses. This isn’t Texas, with its miles and miles of empty old cotton fields. True, land is cheaper and more plentiful in less-developed parts of the country. But high-price areas could put many more units on the land they have. Research by Gyourko, Glaeser, and Raven Saks found that the lowest-density areas around expensive cities tend to have the least new construction and the most land-use restrictions. It’s actually somewhat easier to build in more densely populated towns and neighborhoods—the opposite of what you’d expect if a shortage of empty land were the problem.

Some of the higher price of L.A. real estate does reflect the intrinsic pleasure of living there, as I’m reminded every time I walk out my door into the perfect weather. Some of the price reflects the productivity advantages of being near others doing similar work (try selling a screenplay from Arlington, Texas). All of these benefits—and the negatives of traffic and smog—are reflected in the price of land.

But what exactly is that price? Consider two ways of computing the price of a quarter acre of land. You can compare the value of a house on a quarter acre with that of a similar house on a half acre. Or you can take the price of a house on a quarter acre and subtract the cost of the house itself—the price of construction. Either way, you get the value of an empty quarter acre. The two numbers should be roughly the same. But they aren’t. The second one is always bigger, because it includes not just the property but the right to build. Expanding your quarter-acre lot to a half acre doesn’t give you permission to add a second house.

In a 2003 article, Glaeser and Gyourko calculated the two different land values for 26 cities (using data from 1999). They found wide disparities. In Los Angeles, an extra quarter acre cost about $28,000—the pure price of land. But the cost of empty land isn’t the whole story, or even most of it. A quarter-acre lot minus the cost of the house came out to about $331,000—nearly 12 times as much as the extra quarter acre. The difference between the first and second prices, around $303,000, was what L.A. home buyers paid for local land-use controls in bureaucratic delays, density restrictions, fees, political contributions. That’s the cost of the right to build.

And that right costs much less in Dallas. There, adding an extra quarter acre ran about $2,300—raw land really is much cheaper—and a quarter acre minus the cost of construction was about $59,000. The right to build was nearly a quarter million dollars less than in L.A. Hence the huge difference in housing prices. Land is indeed more expensive in superstar cities. But getting permission to build is way, way more expensive. These cities, says Gyourko, “just control the heck out of land use.”
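
(my illustration: the Glaeser-Gyourko comparison boils down to subtracting one land price from the other. A minimal sketch in Python using the 1999 figures quoted above — the function name and layout are mine:)

    # Two ways of pricing a quarter acre, per the article:
    #   1. the "extra land" price: a house on a half acre vs. the same house on a quarter acre
    #   2. the "implied land" price: a house's sale price minus the cost of construction
    # The gap between them is the estimated cost of the right to build (land-use regulation).
    def right_to_build(extra_quarter_acre, lot_minus_construction):
        return lot_minus_construction - extra_quarter_acre

    print(right_to_build(28_000, 331_000))   # Los Angeles: about $303,000
    print(right_to_build(2_300, 59_000))     # Dallas: about $56,700, nearly $250,000 less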

The unintended consequence of these land-use policies is that Americans are sorting themselves geographically by income and lifestyle—not across neighborhoods, as they used to, but across regions. People are more likely to live surrounded by others like themselves, creating a more-polarized cultural map. In the superstar cities, where opinion leaders congregate, the perception is growing that the country no longer has a place for middle-class life. Yet the same urban sophisticates who fret that you can’t live decently on less than $100,000 a year often argue vociferously that increasing density will degrade their quality of life. They may be right—but, like any other luxury good, that quality commands a high price.

The URL for this page is http://www.theatlantic.com/doc/200711/housing.

Thursday, September 27, 2007

Education spending

Money for Nothing September 27, 2007; Page A16

If any state has taken to heart the claim that more money is the key to improving public education for low-income students, it's Connecticut. The Nutmeg State, which ranks first in per capita income ($47,800), also leads the way in average teacher salary ($58,700) and is third in per-pupil spending ($11,000). Yet according to the latest National Assessment of Educational Progress released this week, Connecticut has the nation's largest achievement gap between poor and non-poor students.

The NAEP measured reading and math skills in grades four and eight. And while scores nationwide showed modest progress this year, Connecticut is moving in the opposite direction. Not only are low-income students falling further behind in Connecticut than anywhere else, but the state's overall ranking is also down. Since 2005, Connecticut has lost ground to other states in three of the four NAEP categories. In fourth-grade math, it's fallen to 16th from 9th. In eighth-grade math, it's fallen to 29th from 20th.

As usual, the supposed beneficiaries of Connecticut's education lucre are faring worst. Poor and minority students typically attend schools in urban districts that spend thousands of dollars more than the per-pupil state average. Yet the state ranking for Hispanic students declined in all four categories; for blacks, it fell in all but one category. Eighth-graders in Connecticut who qualify for free or reduced price lunches had the second-lowest math scores for poor students in the U.S.

Some Connecticut lawmakers, egged on by the teachers unions, will doubtless use these results to argue for throwing still more tax dollars at education. Earlier this year, Republican Governor Jodi Rell pushed (unsuccessfully) for a 10% personal income tax rate hike, which she said was necessary to fund additional school spending. But here's a better idea: Try focusing on how money is spent instead of merely how much.

Public charter schools in Connecticut regularly outperform traditional public schools, and do so on significantly smaller budgets. Hartford's lone charter school, Jumoke Academy, receives $8,000 per student from the state, while surrounding public schools receive $13,600 per kid. On the most recent state assessment test, 60% of Jumoke's students scored proficient in math, 70% scored proficient in reading and 95% scored proficient in writing. The corresponding results for the surrounding public schools were 22%, 30% and 27%.

Connecticut has only 16 charter schools operating today, thanks to political limits imposed in Hartford. Governor Rell's high job-approval ratings put her in a position to push for creating more, but she's been unwilling to take on the unions that oppose school choice for the underprivileged.

"The politicians are much freer with financial capital than with political capital," says Marc Porter Magee of ConnCan, an education reform group based in New Haven that has called for more charter schools. "They'll spend as much money as they can get through, but they won't take on the tough reforms when push comes to shove." The biggest losers from Ms. Rell's lack of political courage are the poorest kids in the state.

URL for this article: http://online.wsj.com/article/SB119085526233340818.html
Copyright 2007 Dow Jones & Company, Inc. All Rights Reserved

Thursday, August 30, 2007

In Nature’s Casino

It was Aug. 24, 2005, and New Orleans was still charming. Tropical Depression 12 was spinning from the Bahamas toward Florida, but the chances of an American city’s being destroyed by nature were remote, even for one below sea level. An entire industry of weather bookies — scientists who calculate the likelihood of various natural disasters — had in effect set the odds: a storm that destroys $70 billion of insured property should strike the United States only once every 100 years. New Orleanians had made an art form of ignoring threats far more likely than this; indeed, their carelessness was a big reason they were supposedly more charming than other Americans. And it was true: New Orleanians found pleasure even in oblivion. But in their blindness to certain threats, they could not have been more typically American. From Miami to San Francisco, the nation’s priciest real estate now faced beaches and straddled fault lines; its most vibrant cities occupied its most hazardous land. If, after World War II, you had set out to redistribute wealth to maximize the sums that might be lost to nature, you couldn’t have done much better than Americans had done. And virtually no one — not even the weather bookies — fully understood the true odds.

But there was an exception: an American so improbably prepared for the havoc Tropical Depression 12 was about to wreak that he might as well have planned it. His name was John Seo, he was 39 years old and he ran a hedge fund in Westport, Conn., whose chief purpose was to persuade investors to think about catastrophe in the same peculiar way that he did. He had invested nearly a billion dollars of other people’s money in buying what are known as “cat bonds.” The buyer of a catastrophe bond is effectively selling catastrophe insurance. He puts down his money and will lose it all if some specified bad thing happens within a predetermined number of years: a big hurricane hitting Miami, say, or some insurance company losing more than $1 billion on any single natural disaster. In exchange, the cat-bond seller — an insurance company looking to insure itself against extreme losses — pays the buyer a high rate of interest.
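For readers who want the mechanics spelled out, here is a minimal sketch of a cat bond's cash flows from the investor's side, with invented numbers (a $10 million principal, a 10 percent coupon, a three-year term) and a simple all-or-nothing trigger. Real deals are more elaborate (floating coupons, partial triggers), but the basic exchange is the one described above.

```python
# Illustrative cat-bond cash flows from the investor's side (hypothetical numbers,
# simple all-or-nothing trigger).

def cat_bond_cash_flows(principal, coupon_rate, years, event_year=None):
    """Yearly cash flows to the investor; event_year is the 1-indexed year the
    specified catastrophe occurs, or None if it never does."""
    flows = [-principal]  # year 0: the investor puts down the money
    for year in range(1, years + 1):
        if event_year is not None and year == event_year:
            flows.append(0.0)  # trigger hit: the principal is forfeited to the insurer
            break
        coupon = principal * coupon_rate
        # final year with no trigger: coupon plus the principal comes back
        flows.append(coupon + principal if year == years else coupon)
    return flows

print(cat_bond_cash_flows(10_000_000, 0.10, 3))                # no catastrophe
print(cat_bond_cash_flows(10_000_000, 0.10, 3, event_year=2))  # hurricane in year 2
```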

Whatever image pops to mind when you hear the phrase “hedge fund manager,” Seo (pronounced so) undermines it. On one hand, he’s the embodiment of what Wall Street has become: quantitative. But he’s quirky. Less interested in money and more interested in ideas than a Wall Street person is meant to be. He inherited not money but math. At the age of 14, in 1950, his mother fled North Korea on foot, walked through live combat, reached the United States and proceeded to become, reportedly, the first Korean woman ever to earn a Ph.D. in mathematics. His father, a South Korean, also came to the United States for his Ph.D. in math and became a professor of economic theory. Two of his three brothers received Ph.D.’s — one in biology, the other in electrical engineering. John took a physics degree from M.I.T. and applied to Harvard to study for his Ph.D. As a boy, he says, he conceived the idea that he would be a biophysicist, even though he didn’t really know what that meant, because, as he puts it, “I wanted to solve a big problem about life.” He earned his doctorate in biophysics from Harvard in three years, a department record.

His parents had raised him to think, but his thoughts were interrupted once he left Harvard. His wife was pregnant with their second child, and the health plan at Brandeis University, where he had accepted a job, declared her pregnancy a pre-existing condition. He had no money, his parents had no money, and so to cover the costs of childbirth, he accepted a temp job with a Chicago trading firm called O’Connor and Associates. O’Connor had turned a small army of M.I.T. scientists into options traders and made them rich. Seo didn’t want to be rich; he just wanted health insurance. To get it, he agreed to spend eight weeks helping O’Connor price esoteric financial options. When he was done, O’Connor offered him 40 grand and asked him to stay, at a starting salary of $250,000, 27 times his post-doc teaching salary. “Biophysics was starved for resources,” Seo says. “Finance was hurling resources at problems. It was almost as if I was taking it as a price signal. It was society’s way of saying, Please, will you start solving problems over here?”

His parents, he suspected, would be appalled. They had sacrificed a lot for his academic career. In the late 1980s, if you walked into the Daylight Donuts shop in Dallas, you would have found a sweet-natured Korean woman in her early 50s cheerfully serving up honey-glazed crullers: John’s mom. She had abandoned math for motherhood, and then motherhood for doughnuts, after her most promising son insisted on attending M.I.T. instead of S.M.U., where his tuition would have been free. She needed money, and she got it by buying this doughnut shop and changing the recipe so the glaze didn’t turn soggy. (Revenues tripled.) Whatever frustration she may have felt, she hid, as she did most of her emotions. But when John told her that he was leaving the university for Wall Street, she wept. His father, a hard man to annoy, said, “The devil has come to you as a prostitute and has asked you to lie down with her.”

A willingness to upset one’s mother is usually a promising first step to a conventional Wall Street career. But Seo soon turned Wall Street into his own private science lab, and his continued interest in deep questions mollified even his father. “Before he got into it, I strongly objected,” Tae Kun Seo says. “But now I think he’s not just grabbing money.” He has watched his son quit one firm to go to work for another, but never for a simple promotion; instead, John has moved to learn something new. Still, everywhere he goes, he has been drawn to a similar thorny problem: the right price to charge to insure against potential losses from extremely unlikely financial events. “Tail risk,” as it is known to quantitative traders, is so called for where it falls in a bell-shaped probability curve. Tail risk, broadly speaking, is whatever financial cataclysm is believed by markets to have a 1 percent chance or less of happening. In the foreign-exchange market, the tail event might be the dollar falling by one-third in a year; in the bond market, it might be interest rates moving 3 percent in six months; in the stock market, it might be a 30 percent crash. “If there’s been a theme to John’s life,” says his brother Nelson, “it’s pricing tail.”

And if there has been a theme of modern Wall Street, it’s that young men with Ph.D.’s who approach money as science can cause more trouble than a hurricane. John Seo is oddly sympathetic to the complaint. He thinks that much of the academic literature about finance is nonsense, for instance. “These academics couldn’t understand the fact that they couldn’t beat the markets,” he says. “So they just said it was efficient. And, ‘Oh, by the way, here’s a ton of math you don’t understand.’ ” He notes that smart risk-takers with no gift for theory often end up with smart solutions to taking extreme financial risk — answers that often violate the academic theories. (“The markets are usually way ahead of the math.”) He prides himself on his ability to square book smarts with horse sense. As one of his former bosses puts it, “John was known as the man who could price anything, and his pricing felt right to people who didn’t understand his math.”

In the mid-1990s, when Wall Street first noticed money to be made covering the financial risks associated with hurricanes and earthquakes, it was inevitable that someone would call John Seo to ask him if he could figure out how to make sense of it. Until then, he had specialized in financial, not natural, disasters. But there was a connection between financial catastrophe and natural catastrophe. Both were extreme, both were improbable and both needed to be insured against. The firm that called him was Lehman Brothers, whose offer enticed Seo to quit his job and spend his first year at Lehman learning all he could about the old-fashioned insurance industry.

Right away, he could see the problem with natural catastrophe. An insurance company could function only if it was able to control its exposure to loss. Geico sells auto insurance to more than seven million Americans. No individual car accident can be foreseen, obviously, but the total number of accidents over a large population is amazingly predictable. The company knows from past experience what percentage of the drivers it insures will file claims and how much those claims will cost. The logic of catastrophe is very different: either no one is affected or vast numbers of people are. After an earthquake flattens Tokyo, a Japanese earthquake insurer is in deep trouble: millions of customers file claims. If there were a great number of rich cities scattered across the planet that might plausibly be destroyed by an earthquake, the insurer could spread its exposure to the losses by selling earthquake insurance to all of them. The losses it suffered in Tokyo would be offset by the gains it made from the cities not destroyed by an earthquake. But the financial risk from earthquakes — and hurricanes — is highly concentrated in a few places.
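The statistical point can be made with a toy simulation: a book of independent auto claims produces almost the same total loss every year, while a book of correlated earthquake claims pays out almost nothing most years and then everything at once. Every claim rate and policy count below is invented for illustration.

```python
# Toy contrast between diversifiable and concentrated risk; every number is invented.
import random

random.seed(0)

def auto_book_loss(policies=100_000, claim_prob=0.05, avg_claim=4_000):
    """Independent claims: the total comes out close to the same figure every year."""
    claims = sum(1 for _ in range(policies) if random.random() < claim_prob)
    return claims * avg_claim

def quake_book_loss(policies=100_000, quake_prob=0.01, avg_payout=50_000):
    """Correlated claims: almost nothing most years, nearly everything at once otherwise."""
    return policies * avg_payout if random.random() < quake_prob else 0

auto = [round(auto_book_loss() / 1e6) for _ in range(20)]
quake = [round(quake_book_loss() / 1e6) for _ in range(20)]
print("auto book, 20 simulated years ($M): ", auto)   # hovers around $20M
print("quake book, 20 simulated years ($M):", quake)  # mostly 0, occasionally $5,000M
```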

There were insurance problems that were beyond the insurance industry’s means. Yet insurers continued to cover them, sometimes unenthusiastically, sometimes recklessly. Why didn’t insurance companies see this? Seo wondered, and then found the answer: They hadn’t listened closely enough to Karen Clark.

Thirteen years before what would become Tropical Storm Katrina churned toward Florida — on Monday, Aug. 24, 1992 — Karen Clark walked from her Boston office to a nearby Au Bon Pain. Several hours earlier, Hurricane Andrew had struck Florida, and she knew immediately that the event could define her career. Back in 1985, while working for an insurance company, Clark wrote a paper with the unpromising title “A Formal Approach to Catastrophe Risk Assessment and Management.” In it, she made the simple point that insurance companies had no idea how much money they might lose in a single storm. For decades Americans had been lurching toward catastrophe. The 1970s and ’80s were unusually free of major storms. At the same time, Americans were cramming themselves and their wealth onto the beach. The insurance industry had been oblivious to the trends and continued to price catastrophic risk just as it always had, by the seat of its pants. The big insurance companies ran up and down the Gulf Coast selling as many policies as they could. No one — not even the supposed experts at Lloyd’s of London — had any idea of the scope of new development and the exposure that the insurance industry now had.

To better judge the potential cost of catastrophe, Clark gathered very long-term historical data on hurricanes. “There was all this data that wasn’t being used,” she says. “You could take it, and take all the science that also wasn’t being used, and you could package it in a model that could spit out numbers companies could use to make decisions. It just seemed like such an obvious thing to do.” She combined the long-term hurricane record with new data on property exposure — building-replacement costs by ZIP code, engineering reports, local building codes, etc. — and wound up with a crude but powerful tool, both for judging the probability of a catastrophe striking any one area and for predicting the losses it might inflict. Then she wrote her paper about it.

The attention Clark’s paper attracted was mostly polite. Two years later, she visited Lloyd’s — pregnant with her first child, hauling a Stone Age laptop — and gave a speech to actual risk-takers. In nature’s casino, they had set themselves up as the house, and yet they didn’t know the odds. They assumed that even the worst catastrophe could generate no more than a few billion dollars in losses, but her model was generating insured losses of more than $30 billion for a single storm — and these losses were far more likely to occur than they had been in the previous few decades. She projected catastrophic storms from the distant past onto the present-day population and storms from the more recent past onto richer and more populated areas than they had actually hit. (If you reran today the hurricane that struck Miami in 1926, for instance, it would take out not the few hundred million dollars of property it destroyed at the time but $60 billion to $100 billion.) “But,” she says, “from their point of view, all of this was just in this computer.”

She spoke for 45 minutes but had no sense that she had been heard. “The room was very quiet,” she says. “No one got up and left. But no one asked questions either. People thought they had already figured it out. They were comfortable with their own subjective judgment.” Of course they were; they had made pots of money the past 20 years insuring against catastrophic storms. But — and this was her real point — there hadn’t been any catastrophic storms! The insurers hadn’t been smart. They had been lucky.

Clark soon found herself in a role for which she was, on the surface at least, ill suited: fanatic. “I became obsessed with it,” she says. One big player in the insurance industry took closer notice of her work and paid her enough to start a business. Applied Insurance Research, she called it, or A.I.R. Clark hired a few scientists and engineers, and she set to work acquiring more and better data and building better models. But what she really was doing — without quite realizing it — was waiting, waiting for a storm.

Hurricane Andrew made landfall at 5 on a Monday morning. By 9 she knew only the path of the storm and its intensity, but the information enabled her to estimate the losses: $13 billion, give or take. If builders in South Florida had ignored the building codes and built houses to lower standards, the losses might come in even higher. She faxed the numbers to insurers, then walked to Au Bon Pain. Everything was suddenly more vivid and memorable. She ordered a smoked-turkey and Boursin cheese sandwich on French bread, with lettuce and tomato, and a large Diet Coke. It was a nice sunny day in Boston. She sat outside at a small black table, alone. “It was too stressful to be with other people,” she says. “I didn’t want to even risk a conversation.” She ate in what she describes as “a catatonic state.” The scuttlebutt from Lloyd’s already had it that losses couldn’t possibly exceed $6 billion, and some thought they were looking at a loss of just a few hundred million. “No one believed it,” she says of her estimate. “No one thought it was right. No one said, ‘Yeah, $13 billion sounds like a reasonable number.’ ” As she ate, she wondered what $13 billion in losses looked like.

When she returned to the office, her phones were ringing. “People were outraged,” she says. “They thought I was crazy.” One insurance guy called her, chortling. “A few mobile homes and an Air Force base — how much could it be?” he said.

It took months for the insurers to tote up their losses: $15.5 billion. (Building codes in South Florida had not been strictly enforced.) Fifteen and a half billion dollars exceeded all of the insurance premiums ever collected in Dade County. Eleven insurance companies went bust. And this wasn’t anything like the perfect storm. If it had gone into Miami, it could have bankrupted the whole industry. Clark had been right: the potential financial losses from various catastrophes were too great, and too complicated, to be judged by human intuition. “No one ever called to congratulate me,” Clark says, laughing. “But I had a lot of people call and ask to buy the model.”

After Hurricane Andrew came a shift in the culture of catastrophe. “This one woman really created the method for valuing this risk,” says John Seo. Clark’s firm, A.I.R., soon had more than 25 Ph.D.’s on staff and two competitors, Eqecat and Risk Management Solutions. In its Bay Area offices, R.M.S. now houses more than 100 meteorologists, seismologists, oceanographers, physicists, engineers and statisticians, and they didn’t stop at hurricanes and earthquakes but moved on to flash floods, wildfires, extreme winter storms, tornadoes, tsunamis and an unpleasant phenomenon delicately known as “extreme mortality,” which, more roughly speaking, is the possibility that huge numbers of insured human beings will be killed off by something like a global pandemic.

The models these companies created differed from peril to peril, but they all had one thing in common: they accepted that the past was an imperfect guide to the future. No hurricane has hit the coast of Georgia, for instance, since detailed records have been kept. And so if you relied solely on the past, you would predict that no hurricane ever will hit the Georgia coast. But that makes no sense: the coastline above, in South Carolina, and below, in Florida, has been ravaged by storms. “You are dealing with a physical process,” says Robert Muir-Wood, the chief scientist for R.M.S. “There is no physical reason why Georgia has not been hit. Georgia’s just been lucky.” To evaluate the threat to a Georgia beach house, you need to see through Georgia’s luck. To do this, the R.M.S. modeler creates a history that never happened: he uses what he knows about actual hurricanes, plus what he knows about the forces that create and fuel hurricanes, to invent a 100,000-year history of hurricanes. Real history serves as a guide — it enables him to see, for instance, that the odds of big hurricanes making landfall north of Cape Hatteras are far below the odds of them striking south of Cape Hatteras. It allows him to assign different odds to different stretches of coastline without making the random distinctions that actual hurricanes have made in the last 100 years. Generate a few hundred thousand hurricanes, and you generate not only dozens of massive hurricanes that hit Georgia but also a few that hit, say, Rhode Island.
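A stripped-down version of that exercise might look like the sketch below: assign each stretch of coastline an assumed annual landfall rate (the rates here are invented for illustration, not R.M.S.'s), simulate 100,000 seasons, and read off how often Georgia gets hit even though the real record shows nothing.

```python
# Toy synthetic hurricane catalog: invented landfall rates, not a real model.
# The idea in miniature: don't lean on a short historical record; simulate a very
# long one from assumed rates and read the odds off the simulation.
import math
import random

random.seed(42)

# Assumed average number of major-hurricane landfalls per year, by coastline.
annual_rates = {
    "South Florida": 0.20,
    "Gulf Coast":    0.25,
    "Carolinas":     0.10,
    "Georgia":       0.03,  # rarely hit in the real record, but physically possible
    "Northeast":     0.02,
}

def poisson(lam):
    """Poisson-distributed count via Knuth's method (fine for small rates)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

years = 100_000
hit_years = {region: 0 for region in annual_rates}
for _ in range(years):
    for region, lam in annual_rates.items():
        if poisson(lam) > 0:
            hit_years[region] += 1

for region, count in hit_years.items():
    print(f"{region:>13}: a landfall year roughly once every {years / max(count, 1):.0f} years")
```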

The companies’ models disagreed here and there, but on one point they spoke with a single voice: four natural perils had outgrown the insurers’ ability to insure them — U.S. hurricane, California earthquake, European winter storm and Japanese earthquake. The insurance industry was prepared to lose $30 billion in a single event, once every 10 years. The models showed that a sole hurricane in Florida wouldn’t have to work too hard to create $100 billion in losses. There were concentrations of wealth in the world that defied the logic of insurance. And most of them were in America.

The more John Seo looked into the insurance industry, the more it seemed to be teetering at the edge of ruin. This had happened once before, in 1842, when the city of Hamburg burned to the ground and bankrupted the entire German insurance industry many times over. Out of the ashes was born a new industry, called reinsurance. The point of reinsurance was to take on the risk that the insurance industry couldn’t dilute through diversification — say, the risk of an entire city burning to the ground or being wiped off the map by a storm. The old insurance companies would still sell policies to the individual residents of Hamburg. But they would turn around and hand some of the premiums they collected to Cologne Re (short for reinsurance) in exchange for taking on losses over a certain amount. Cologne Re would protect itself by diversifying at a higher level — by selling catastrophic fire insurance to lots of other towns.

But by their very nature, the big catastrophic risks of the early 21st century couldn’t be diversified away. Wealth had become far too concentrated in a handful of extraordinarily treacherous places. The only way to handle them was to spread them widely, and the only way to do that was to get them out of the insurance industry and onto Wall Street. Today, the global stock markets are estimated at $59 trillion. A 1 percent drop in the markets — not an unusual event — causes $590 billion in losses. The losses caused by even the biggest natural disaster would be a drop in the bucket to the broader capital markets. “If you could take a Magnitude 8 earthquake and distribute its shock across the planet, no one would feel it,” Seo says. “The same principle applies here.” That’s where catastrophe bonds came in: they were the ideal mechanism for dissipating the potential losses to State Farm, Allstate and the other insurers by extending them to the broader markets.

Karen Clark’s model was, for Seo, the starting point. When he first stumbled upon it and the other companies’ models, he found them “guilty until proven innocent,” as he puts it. “I could see the uncertainty in them,” he says, “just by looking at the different numbers they generated for the same storm.” When they run numbers to see what would happen if the 1926 Miami hurricane hit the city today, A.I.R. puts the losses at $80 billion, R.M.S. at $106 billion and Eqecat at $63 billion. They can’t all be right. But they didn’t need to be exactly right, just sort of right, and the more he poked around inside them, the more he felt they were better than good enough to underpin financial decisions. They enabled you to get a handle on the risk as best you could while acknowledging that you would never know it exactly. And after all, how accurate were the models that forecast the likelihood that Enron would collapse? Next to what Wall Street investors tried to predict every day, natural disasters seemed almost stable. “In the financial markets, you have to care what other people think, even if what they think is screwed up,” Seo says. “Crowd dynamics build on each other. But these things — hurricanes, earthquakes — don’t exhibit crowd behavior. There’s a real underlying risk you have to understand. You have to be a value investor.”

The models were necessary but insufficient. True, they gave you a rough sense of the expected financial losses, but they said nothing about the rewards. Financial markets exist only as long as investors feel the odds are stacked in their favor. Investors — unlike roulette players — can honestly expect to make a gain (their share in the profits of productive enterprise). But how big a gain? How should the payout vary, from government bonds to blue-chip stocks to subprime mortgages? The rewards in each market tended to vary with investors’ moods, but those in catastrophe insurance were just incredibly volatile. Hurricane insurance rates would skyrocket after a big storm, then settle back down. This wouldn’t do: if big investors were going to be persuaded to take billions of dollars in catastrophic risk, they would need to feel there was some reason in the pricing of that risk. “The market,” as Seo puts it, “needs an acceptable mode of failure.”

In the spring of 2001, to the surprise of his colleagues, Seo left his big Wall Street firm and opened a hedge fund — which, he announced, wouldn’t charge its investors the standard 2 percent of assets and 20 percent of returns but a lower, flat fee. “It was quixotic,” says Paul Puleo, a former executive at Lehman who worked with Seo. “He quits this high-paying job to basically open a business in his garage in a market that doesn’t exist.” Seo opened his new shop with his younger brother Nelson and then brought in their older brother, Michael. (His third brother, Scott, had studied astrophysics but decided that “there was no future in astrophysics” and eventually turned himself into an ophthalmologist.) Seo named his firm Fermat Capital Management, after one of his intellectual heroes. “I had once read the letters between Pierre de Fermat and Blaise Pascal,” he wrote in a recent e-mail message. “From my father I had learned that most great mathematicians were nasty guys and total jerks (check out Isaac Newton . . . extra nasty guy), but when I read the Fermat-Pascal letters, you could see that Fermat was an exception to the stereotype . . . truly a noble person. I loved his character and found that his way of analyzing profitless games of chance (probability theory) was the key to understanding how to analyze profitable games of chance (investment theory).”

Four years later, Seo’s hedge fund still faced two problems. The smaller one was that investors were occasionally slow to see the appeal of an investment whose first name was catastrophe. As one investor put it, “My boss won’t let me buy bonds that I have to watch the Weather Channel to follow.” That objection doesn’t worry Seo much. “Investors who object to cat-bond investing usually say that it’s just gambling,” he says. “But the more mature guys say: ‘That’s what investing is. But it’s gambling with the odds in your favor.’ ”

His bigger problem was that insurance companies still didn’t fully understand their predicament: they had $500 billion in exposure to catastrophe but had sold only about $5 billion of cat bonds — a fifth of them to him. Still, he could see their unease in their prices: hurricane- and earthquake-insurance premiums bounced around madly from year to year. Right after Andrew, the entire industry quintupled its prices; a few tranquil years later, prices were back down nearly to where they had been before the storm. Financial markets bounced around wildly too, of course, but in the financial markets, the underlying risks (corporate earnings, people’s moods) were volatile. The risk in natural-disaster insurance was real, physical and, in principle, quantifiable, and from year to year it did not change much, if at all. In effect, the insurers weren’t insuring against disaster; they were only pretending to take the risk, without actually doing so, and billing their customers retroactively for whatever losses they incurred. At the same time, they were quietly sneaking away from catastrophe. Before the 1994 Northridge earthquake, more than a third of California homeowners had quake insurance; right after, the insurers fled the market, so that fewer than 15 percent of California homeowners have earthquake coverage in their policies today.

The market was broken: people on fault lines and beachfronts were stuck either paying far too much for their insurance or with no real coverage except the vague and corrupting hope that, in a crisis, the government would bail them out. A potentially huge, socially beneficial market was moments from birth. All it needed was a push from nature. And so on Aug. 24, 2005, John Seo was waiting, waiting for a storm. And here it came.

Wall Street is a machine for turning information nobody cares about into information people can get rich from. Back when banks lent people money to buy homes and then sat around waiting for interest payments, no one thought to explore how quickly homeowners would refinance their mortgages if interest rates fell. But then Wall Street created a market in mortgage bonds, and the trader with better information about how and when people refinance made a killing. There’s now a giant subindustry to analyze the inner financial life of the American homeowner.

Catastrophe bonds do something even odder: they financialize storms. Once there’s a market for cat bonds, there’s money to be made, even as a storm strikes, in marginally better weathermen. For instance, before the 2005 hurricane season, a Bermuda cat-bond hedge fund called Nephila found a team of oceanographers in Rhode Island called Accurate Environmental Forecasting, whose forecasts of hurricane seasons had been surprisingly good. Nephila rented the company’s services and traded bonds on the back of its reports. “They kind of chuckle at what we do,” says a Nephila founder, Frank Majors. “The fact that we’re making $10 million bets on whether Charley is going to hit Tampa or not. It made them a little nervous at first. We told them not to worry about what we’re going to do with the information. Just give it to us.”

As Katrina bore down on New Orleans, a cat bond named Kamp Re, issued by the insurance company Zurich, was suddenly at risk. If Zurich lost more than $1.2 billion on a single hurricane in about a two-year period, investors would lose all their money. If Zurich represented about 3 percent of the U.S. insurance market — that is, it was on the hook for about 3 percent of the losses — a hurricane would need to inflict about $40 billion in damage to trigger the default. Since no event as big as this had ever happened, it was hard to say just how likely it was to happen. According to R.M.S., there was a 1.08 percent chance that Kamp Re bond holders would lose all their money — assuming the scientists really understood the odds. The deal had been a success. One of its biggest buyers was John Seo.
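The $40 billion figure follows directly from the numbers in the paragraph above: divide the bond's $1.2 billion attachment point by Zurich's rough 3 percent market share. A quick restatement of that arithmetic:

```python
# Back-of-the-envelope for the Kamp Re trigger, using the figures in the text.
company_attachment = 1.2e9  # Zurich's hurricane loss that wipes out the bondholders
market_share = 0.03         # Zurich's rough share of the U.S. insurance market

industry_loss_needed = company_attachment / market_share
print(f"Industry-wide hurricane loss needed: ${industry_loss_needed / 1e9:.0f} billion")  # $40 billion

p_total_loss = 0.0108  # R.M.S.'s modeled chance that bondholders lose everything
print(f"Modeled chance of total loss: {p_total_loss:.2%}")  # 1.08%
```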

As Katrina spun, the players in nature’s casino gathered around the table. When the storm jogged east and struck not New Orleans directly but the less populated, and less wealthy, coastline between Louisiana and Mississippi, they all had the same reaction — relief — but Hemant Shah felt a special relief. Shah is one of the founders of R.M.S., and he was at that moment driving to catch a flight from San Francisco to New York, where he hoped to speak at a conference devoted to predicting terrorism. When he saw Katrina miss New Orleans, he said to himself, O.K., it’s big, but it’s not catastrophic, and he boarded his plane.

As he flew across the country, R.M.S. and its competitors replicated Katrina inside their computers in much the same way that Karen Clark had once replicated Hurricane Andrew. Just hours after landfall, all three firms sent clients in the insurance industry their best estimates of financial losses: R.M.S. put them at $10 billion to $25 billion; Eqecat called for a range between $9 billion and $16 billion; Clark’s A.I.R. had a range of $12.7 billion to $26.5 billion. Big, as Shah said, but not catastrophic. Traders who had underwritten Kamp Re took calls from an investor at a Japanese bank in London. Cheered by Katrina’s path, the fellow was looking to buy some Kamp Re bonds. The traders found another investor eager to unload his Kamp Re holdings. The London investor bought $10 million of Kamp Re at a price of $94.

John Seo just watched. For the past four years, he and his brothers had made money at such moments as this: “live” cat trading, it’s called. A few investors would inevitably become jittery and sell their cat bonds at big discounts, what with the Weather Channel all hysteria all the time. (“The worst place to go if you’re taking risks,” says one cat-bond investor, “is the Weather Channel. They’re just screaming all the time.”) But entering the 2005 hurricane season, the Seo brothers had reconsidered their habit of buying in a storm. “The word had gotten out that buying in the storm was the smart thing to do,” Seo says. “And we were afraid our past successes would give us an irrational interest in buying. Everything’s all fuzzy in these events. And when things are fuzzy, your brain gives you an excuse to push the envelope. So we adopted a policy, before the season, of staying out of the market.”

A few hours later, Hemant Shah’s plane landed in New York. Shah turned on his BlackBerry and discovered that the New Orleans levees had broken: much of the city would soon be underwater. “My first reaction,” Shah says, “was, Uh-oh, we have a problem.” In the imaginary 100,000-year history of hurricanes that R.M.S. had in its computers, no hypothetical storm that struck so far from New Orleans had ever caused the levees to fail. The models, like the intuition they replaced, had a blind spot.

The Kamp Re bonds collapsed, the price dropping from the mid-90s to the low 20s. A few weeks later, an announcement from Zurich American made it clear that the investors in Kamp Re wouldn’t be getting any money back, and Kamp Re’s price fell from $20 to 10 cents. But then the real trouble started: R.M.S., the modeling company, declared that it was rethinking the whole subject of hurricane risk. Since 1995, scientists had noted a distinct uptick in hurricane activity in the North Atlantic Basin. The uptick had been ignorable because the storms had not been making landfall. But between July 2004 and the end of 2005, seven of history’s most expensive hurricanes had struck the American coast, leaving behind 5.5 million insurance claims and $81 billion in insured losses. The rise in hurricane size and frequency was no longer ignorable. R.M.S. convened a panel of scientists. The scientists agreed that unusually warm sea-surface temperatures were causing unusually ferocious and frequent storms. The root cause might be global warming or merely the routine ups and downs of temperatures in the North Atlantic Basin. On cause they failed to agree. On consequence they were united. At the beginning of August 2005, R.M.S. had judged a Katrina-size catastrophe to be a once-in-40-years event. Seven months later, the company pegged it as a once-in-20-years event. The risk had doubled.

It had been just 13 years since Karen Clark’s model swept the industry, but the entire catastrophe risk-taking industry now lived at the mercy of these modelers. The scientists were, in effect, the new odds-makers. It was as if the casino owner had walked up to his roulette table, seen a pile of chips on 00 and announced that 00 would no longer pay 36:1 but would henceforth pay only 18:1. The agencies that rated the insurance companies — S & P, Moody’s, etc. — relied on the scientists to evaluate their exposure. When the scientists increased the likelihood of catastrophic storms, S & P and Moody’s demanded that the insurance companies raise more capital to cover their suddenly more probable losses. And so in addition to the more than $40 billion they had lost in Katrina, the insurance companies, by edict of the ratings agencies, needed to raise $82 billion from their shareholders just to keep their investment-grade rating. And suddenly they weren’t so eager to expose themselves to losses from hurricanes.

John Seo felt differently. Katrina had cost him millions. But at the same time, in a funny way, it had vindicated his ideas about catastrophe. He had lost only what he had expected to lose. He had found an acceptable mode of failure.

As a boy, John Seo learned everything he could about the Titanic. “It was considered unsinkable because it had a hull of 16 chambers,” he says. The chambers were stacked back to front. If the ship hit something head on, the object might puncture the front chamber, but it would likely have to puncture at least three more to sink the ship. “They probably said, What are the odds of four chambers going?” he says. “There might have been a one-in-a-hundred chance of puncturing a single chamber, but the odds of puncturing four chambers, they probably thought of as one in a million. That’s because they thought of them as independent chambers. And the chambers might have been independent if the first officer hadn’t gambled at the last minute and swerved. By swerving, the iceberg went down the side of the ship. If the officer had taken it head on, he might have killed a passenger or two, but the ship might not have sunk. The mistake was to turn. Often people associate action with lowering risk or controlling risk, but experience shows more often than not that by taking action you only make the risk worse.”

The Titanic offered another lesson for the investor in catastrophe: the threats that seem to us the most remote are those we know the least about. Catastrophe risk is fundamentally different from normal risk. It deals with events so rare that experience doesn’t help you much to predict them. How do you use history to judge the likelihood of a pandemic killing off 1 in every 200 Americans? You can’t. It has happened only once. (The Spanish flu epidemic of 1918.) You lack information. You don’t know what you don’t know. The further out into the tail you go — the less probable the event — the greater the uncertainty. The greater the uncertainty, the more an investor should be paid to live with it.

The financial markets, or, at any rate, the arcane corner of Wall Street that dealt exclusively with highly unlikely financial events, had figured this out. The traders who sold insurance against extreme market collapses — the tail risks — all tended to charge exactly the same price, between four and five times their expected losses. Expected loss could be defined like this: Say an investor wanted to buy $1 billion of insurance for a year against a once-in-100-years stock-market crash. The expected loss would be 1 in 100, 1 percent of $1 billion: $10 million. The insurance would thus cost $40 million to $50 million. The pattern held across Wall Street. The trader at Lehman Brothers who priced stock-market-crash insurance didn’t know the trader at Harvard Management who priced the insurance against drastic interest-rate changes, and he didn’t know the trader at O’Connor and Associates who priced the insurance against the dollar’s losing a third of its value. But their idea of a fair premium for insurance against financial disaster suggested they were reading the same books on the subject — only there were no books. “The reigning theory is that the taste for risk is as arbitrary as the value of a painting,” Seo says. “But if this is so, why are these preferences so consistent across markets?”
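The rule of thumb in that example, spelled out in a few lines that simply restate the arithmetic above:

```python
# The tail-risk pricing rule in the example: premium of roughly 4-5x expected loss.
notional = 1e9          # $1 billion of crash insurance for one year
annual_prob = 1 / 100   # a once-in-100-years stock-market crash

expected_loss = annual_prob * notional  # $10 million
premium_low, premium_high = 4 * expected_loss, 5 * expected_loss
print(f"Expected loss:  ${expected_loss / 1e6:.0f} million")
print(f"Premium range:  ${premium_low / 1e6:.0f} to ${premium_high / 1e6:.0f} million")
```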

Seo thought, Maybe risk is not like art. Maybe there is some deep rule that governs it. And maybe the market is groping its way to that rule all by itself.

Intuitively what the market was doing made sense. Highly improbable events were especially unsettling. The person who insured others against an unlikely event faced not only the problem of judging its likelihood; even if he knew how often it would occur, he didn’t know when it would occur. Even if you had complete certainty that a U.S. stock-market crash happened just once every 25 years, you still didn’t know which year. If you had set up a business to sell crash insurance in January 1987, you would have been bankrupted by the crash in October; on the other hand, if you had gone into the business in 1988, you would have gotten rich. There was no justice in it. The catastrophic risk-taker was a bit like a card counter at the blackjack table allowed to play only a few hands: yes, the odds are in his favor, but he doesn’t always get to play long enough for the odds to determine the outcome.

The uncertainty in these extreme, remote market risks meant that the person who took them should be paid more to do so. But how much more? Extreme events were treated on Wall Street as freak outliers that bore no relation to other, more normal events. There was a striking consistency in the pricing of these risks across Wall Street, but there was no hard logic under them: it was all being done by feel.

The logic is what Seo stumbled upon back in 2000 at Lehman Brothers after someone handed him a weird option to price. An industrial company had called Lehman with a problem. It operated factories in Japan and California, both near fault lines. It could handle one of the two being shut down by an earthquake, but not both at the same time. Could Lehman Brothers quote a price for an option that would pay the company $10 million if both Japan and California suffered earthquakes in the same year? Lehman turned to its employee with a reputation for being able to price anything. And Seo thought it over. The earthquakes that the industrial company was worried about were not all that improbable: roughly once-a-decade events. A sloppy solution would be simply to call an insurance company and buy $10 million in coverage for the Japanese quake and then another $10 million in coverage for the California quake; the going rate was $2 million for each policy. “If I had been lazy, I could have just quoted $4 million for the premium,” he says. “It would have been obnoxious to do so, but traders have been known to do it.” If either quake happened, but not both, he would have a windfall gain of $10 million. (One of his policies would pay him $10 million, but he would not be required to pay anything to the quake-fearing corporation, since it would get paid only if both earthquakes occurred.)

But there was a better solution. He needed to buy the California quake insurance for $2 million, its market price, but only if the Japanese quake happened in the same year. All Seo had to do, then, was buy enough Japanese quake insurance so that if the Japanese quake occurred, he could afford to pay the insurance company for his $10 million California insurance policy: $2 million. In other words, he didn’t need $10 million of Japanese quake insurance; he needed only $2 million. The cost of that was a mere $400,000. For that sum, he could insure the manufacturing company against its strange risk at little risk to himself. Anything he charged above $400,000 was pure profit for Lehman Brothers.
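Seo's construction, reduced to the article's own numbers (a $2 million market price for $10 million of once-a-decade quake coverage, and the same simplification the article makes, that the California policy is bought only after a Japanese quake):

```python
# Seo's nested hedge, using the article's numbers.
rate = 2_000_000 / 10_000_000     # market price: 20 cents per dollar of once-a-decade cover

payout_to_client = 10_000_000     # owed only if BOTH quakes strike in the same year
california_premium = rate * payout_to_client   # $2M, needed only after a Japanese quake

# Buy just enough Japanese quake cover today to fund that premium if Japan quakes.
japan_cover_needed = california_premium        # $2M of protection
upfront_cost = rate * japan_cover_needed       # $400,000
print(f"Upfront cost of the hedge: ${upfront_cost:,.0f}")
# Anything Lehman charged above that was profit (on the article's simplification,
# which ignores the ordering of the two quakes within the year).
```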

And that was that, except it wasn’t. He saw something. Each risk by itself was not unusual: the quakes being insured against were once-a-decade events. But since each earthquake had a 1-in-10 chance of happening in a year, the chances that both of them would occur were far more remote: 1 in 100 (10 percent of 10 percent). When you combined these more ordinary risks, you simulated extremely unlikely ones. “What I noticed, after the fact, is that this exotic option’s price was special,” he says. “It was related to tail pricing.” The risk of catastrophe wasn’t some freak outlier with no connection to more mainstream risks. It bore a fixed relationship to those risks. Indeed, one way of thinking about natural catastrophes was as a combination of more likely events.

Thus the hunches of Wall Street professionals found vindication in Seo’s arithmetic. The expected loss of the more ordinary risk of a single earthquake was $1 million (a 10 percent chance of a $10 million loss). The insurance cost $2 million, or twice the expected loss. The expected loss of the remote combined risk was $100,000 (a 1 percent chance of a $10 million loss). But the insurance cost $400,000: four times the expected loss. All those practical traders who were pricing tail risk at roughly four times the expected losses had been on to something. “Here I saw the beginnings of a market mechanism that directly links 1-in-10-year risk pricing to 1-in-100-year risk pricing,” Seo says. The intuitive reason that extreme, remote risk should be more highly priced than normal everyday risk was “a happy agreement between human psychological perception and hard mathematical logic.”
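And the ratios that caught his eye, assuming the two quakes are independent:

```python
# The premium-to-expected-loss ratios from the same example.
p_single = 0.10                 # once-a-decade quake
p_both = p_single * p_single    # both quakes in one year, assumed independent: 1 in 100
payout = 10_000_000

single_ratio = 2_000_000 / (p_single * payout)   # premium / expected loss
combined_ratio = 400_000 / (p_both * payout)
print(f"Single quake: {single_ratio:.0f}x expected loss")    # 2x
print(f"Both quakes:  {combined_ratio:.0f}x expected loss")  # 4x
```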

Seo’s math — which soon left middle school for graduate school — served two purposes: to describe this universal rule about the pricing of risk and to persuade investors that there was a deeper, hidden logic to investing in catastrophe. They could have some sense of what the price of the risk should be. It was an extraordinary idea: that catastrophe might be fair.

Then came Katrina. The reaction to the storm has put a fine point on Americans’ risk disorientation. The single biggest issue in Florida’s 2006 governor’s race, for instance, was the price of insurance. The Republican, Charlie Crist, got himself elected on the strength of his promise to reduce Floridians’ home-insurance rates by creating a state-subsidized pool of $28 billion in catastrophe insurance coverage. “Florida took this notion of spreading this risk and turned it on its head,” says one former state insurance commissioner. “They said, ‘We’re going to take all this risk ourselves.’ ” The state sold its citizens catastrophe insurance at roughly one-sixth the market rates, thus encouraging them to live in riskier places than they would if they had to pay what the market charged (and in the bargain, the state subsidized the well-to-do who live near the beach at the expense of the less-well-to-do who don’t). But if all the models are correct, $28 billion might not cover even one serious storm. The disaster waiting to happen in Florida grows bigger by the day, but for a man running for governor of Florida, ignoring it is a political no-brainer. If he’s lucky — if no big storms hit in his term — he looks like the genius who saved Floridians billions in catastrophic-risk premiums. If he’s unlucky, he bankrupts Florida and all hell breaks loose, but he can shake down the federal government to cover some of the losses.

Louisiana’s politicians are usually quicker than most to seize upon shrewd politics that generate terrible social policy, but in this case they could not afford to. Louisiana cannot generate and preserve wealth without insurance, and it cannot obtain insurance except at the market price. But that price remains a mystery. Billions of dollars in insurance settlements — received by local businesses and homeowners as payouts on their pre-Katrina policies — bloat New Orleans banks and brokerage houses. The money isn’t moving because the people are paralyzed. It’s as if they have been forced to shoot craps without knowing the odds. Businesses are finding it harder than ever to buy insurance, and homeowners are getting letters from Allstate, State Farm and the others telling them that their long relationship must now come to an end. “I’ve been in the business 45 years,” says a New Orleans insurance broker named Happy Crusel, “and I’ve never seen anything remotely like this.” An entire city is now being reshaped by an invisible force: the price of catastrophic risk. But it’s the wrong price.

Insurance companies, John Seo says, are charging customers too much — or avoiding their customers altogether — instead of sharing their risk with others, like himself, who would be glad to take it. New Orleans, as a result, is slower than it otherwise would be to rebuild. “The insurance companies are basically running away from society,” he says. “What they need to do is take the risk and kick it up to us.” They need to spread it as widely as possible across the investment world and, in the process, minimize the cost of insuring potential losses from catastrophes.

But this, too, is happening. The people on Wall Street who specialize in cat bonds now view Katrina as the single most important thing that ever happened to their business: overnight it went from a tiny backwater to a $14 billion market, and it is now stretching and straining to grow. In March of this year, a single insurer, Allstate, announced its intention to sell $4 billion in catastrophe bonds. A $14 billion market is a trivial sum next to the half-trillion or so dollars that the insurance industry stands to lose from megacatastrophes and next to the additional trillions of dollars worth of property that has gone uninsured in the places most likely to be destroyed by nature, like California, because the insurance is so expensive. But all around John Seo are signs of a shift in the culture of catastrophe. “It has all the features of providential action,” he says. “It’s like all the actions of man and nature serve to grow the cat-bond market.”

When Katrina struck and his Kamp Re bonds collapsed — from $100 to 0 — Seo was able to view his loss with detachment. The models had badly underestimated the risk, but it was in the nature of extreme risk that the prediction of it would sometimes be mistaken. “The important thing is that the money wasn’t lost in an unearned manner,” he says, by which he means that it wasn’t lost dishonestly or even unwisely or in what his community of investors would consider a professionally unacceptable manner. Investors will endure losses as long as they come in the context of a game they perceive as basically fair, which is why they don’t abandon the stock market after a crash. “That’s all I need to know,” Seo says. “That’s all my clients need to know.” Actually, he goes even further: “I would be embarrassed if we had a big event and our loss wasn’t commensurate with it. It would mean that we didn’t serve society. We failed society.”

Seo’s returns in 2005 were only slightly positive, compared with the roughly 10 to 12 percent he had been delivering, but the demand for his services boomed. He now controls $2 billion, or more than twice what he had before the most costly natural disaster in history. Big investors weren’t scared off by Katrina. Just the reverse. It has led many of them to turn to Seo and others like him to make money from catastrophe. And they probably will. But what interests Seo more is what might happen in the bargain, that the financial consequences of catastrophe will be turned into something they have never been: boringly normal.

Michael Lewis is a contributing writer. The paperback edition of his book “The Blind Side: Evolution of a Game” will be published next month.

Copyright 2007 The New York Times Company