Friday, December 23, 2005

Income mobility, wealth and standards of living

Great American Dream Machine

By STEPHEN MOORE and LINCOLN ANDERSON December 21, 2005; Page A18

New reports by the Census Bureau and the Federal Reserve Board on the economic well-being of the typical American family reveal that over the past three decades, the vast majority of families have experienced a rapid growth in their income and wealth. Now that nearly six out of 10 households own stock and two out of three own their own homes, the average family -- for the first time ever -- has net worth (assets minus liabilities) of more than $100,000. Median family income has climbed to more than $54,000 a year.

Almost no one in the national media has taken notice of this good news, which has been camouflaged by a barrage of misleading and gloomy stories on "stagnant wages," "the growing income gap between rich and poor," "the disappearing middle class" and "rising poverty in America." The reality is that if the economic growth, employment and family-finances numbers get any better, the media will soon have to start calling this the "Clinton economy."

What the reports tell us is that the vast majority of Americans have not bumped into an income glass ceiling, but rather are experiencing an astonishing pace of upward income mobility. The Census data from 1967 to 2004 provide the percentage of families falling within various income ranges, starting with $0 to $5,000, then $5,000 to $10,000, and so on, up to over $100,000 (all numbers here are adjusted for inflation). These data show, for example, that in 1967 only one in 25 families earned $100,000 or more in real terms, whereas now one in six do. The percentage of families with an income of more than $75,000 a year has tripled, from 9% to 27%.

But it's not just the rich that are getting richer. Virtually every income group has been lifted by the tide of growth in recent decades. The percentage of families with real incomes between $5,000 and $50,000 has been falling as more families move into higher income categories -- the figure has dropped by 19 percentage points since 1967. This huge move out of lower incomes and into middle- and higher-income categories shows that upward mobility is the rule, not the exception, in America today.

[Chart: Upward Mobility]

It is true that the median-income numbers have fallen slightly in recent years. But this has been the pattern during virtually every recession and immediate post-recession period of the last 40 years. Median-income growth stalls, and then, when the recovery picks up steam, incomes resume their inexorable march upward. That is why the long-term trend is what we should be paying attention to. And examining these data leaves no room for argument: The middle class has not been "shrinking" or losing ground; it has been getting richer. For example, the Census data indicate that the income cutoff to be considered "middle class" has risen steadily. Back in 1967, the income range for the middle class (i.e., the middle-income quintile) was between $28,000 and $39,500 a year (in today's dollars). Now that income range is between $38,000 and $59,000 a year, which is to say that the middle class is now roughly $11,000 a year richer than it was then. This helps explain why middle-income families can buy things like cable TV, air conditioning, DVD players, cell phones, second cars and so on that were considered mostly luxury items for the rich in the 1950s and '60s.

The upper-middle class is also richer. Those falling within the 60th to 80th percentile in family income have an income range today of between $55,000 and $88,000 a year, which is about $24,000 a year higher than in 1967. This rapid upward income mobility indicates that the great American Dream, in which each generation achieves a higher living standard than their parents, is alive and well.

Turning from income to wealth, data from the Fed provide further confirmation of economic gains for the middle class. The total net worth of Americans rose to just shy of $50 trillion in 2004. The Fed has not yet calculated the median household wealth for 2004, but we estimated that number by taking the average ratio of mean wealth to median family wealth over the past 10 years. This yields an estimate of $105,000 in 2004, almost double the median family-wealth level of 1983 and nearly triple the level of 1962. Until very recently, a family with six figures of wealth was considered quite rich. Despite all of the groans about the over-indebtedness of American households, the new Federal Reserve Board data suggest that the family balance sheet is not highly leveraged: the ratio of debt to assets is only 18.3%.
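For readers who want to see the arithmetic, here is a minimal sketch of that estimation method in Python. The household count and the historical mean-to-median ratios are hypothetical placeholders; only the roughly $50 trillion aggregate comes from the article.

    # Sketch of the authors' median-wealth estimate, with illustrative inputs.
    total_net_worth = 50e12   # aggregate U.S. household net worth, 2004 (from the article)
    households = 112e6        # hypothetical number of U.S. households

    mean_wealth_2004 = total_net_worth / households  # roughly $446,000

    # Hypothetical mean-to-median wealth ratios from prior Fed surveys
    mean_to_median = [4.2, 4.3, 4.25, 4.3, 4.2]
    avg_ratio = sum(mean_to_median) / len(mean_to_median)

    estimated_median_2004 = mean_wealth_2004 / avg_ratio
    print(f"Estimated 2004 median household wealth: ${estimated_median_2004:,.0f}")
    # prints roughly $105,000, in line with the article's figure

The point of the exercise is only that a median can be projected from an aggregate once the mean-to-median relationship is assumed stable; the Fed's own 2004 survey figures, once published, supersede any such estimate.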

Finally, we need to address the issue of whether the poor are being left behind in this era of wealth and income gains. It is true that there is one income category -- $0 to $5,000 a year -- where income mobility is hardly observable. This very low income group has stubbornly fluctuated between 2.3% and 3.6% of American families. We certainly think driving down the percentage of these "have nots" is a critical national priority, but we do not think their poverty is a result of a macro failure of the U.S. economy. Indeed, a very large percentage of these families do not even have a full-time worker participating in the labor force. We think poverty is best addressed by real competition-based reforms in education, by fighting crime and addiction, and by rebuilding American families.

The media and the poverty lobby have seized upon the news that the poverty rate spiked to 12.7% in 2004, up from 11.3% before the recession. This rise was widely reported and condemned, but again, it is a short-term phenomenon attributable to the aftershocks of the recession. What was not widely reported was that 12.7% is the lowest poverty rate coming out of any recession in the last 25 years, and that the rate has been below 12.7% in only five of those 25 years. It certainly is better than the 15.1% poverty-rate peak of 1993.

The Census family-finances data corroborate the common-sense notion that by far the best long-term anti-poverty program is growth and avoidance of recession -- because the downturns invariably hit the poor hardest. That's why President Bush's tax cuts should be extended: They have created a positive investment, jobs and growth climate that will, if history is any guide, reverse the recent uptick in poverty levels.

We can say with certainty that most working Americans are achieving levels of wealth and income that far surpass those of their parents. It's reassuring to know that the U.S. is still the pre-eminent meritocracy, where economic success is still predominantly powered by hard work and saving, not inheritance and privilege.

Mr. Moore is senior economics writer and a member of the Journal's editorial board. Mr. Anderson is chief economist at LPL Financial Services.

URL for this article: http://online.wsj.com/article/SB113513427028228173.html
Copyright 2005 Dow Jones & Company, Inc. All Rights Reserved

Wednesday, December 21, 2005

Tax games

'Stop Messing With Federal Tax Rates'

By EDWARD C. PRESCOTT December 20, 2005; Page A14

It looks like politics is getting the best of things when it comes to tax policy. In order to avoid a logjam, the two sides have agreed to a compromise that will raise taxes on individuals and corporations in an attempt to gather more revenue and shrink the country's budget deficit. This comes at a time when the economy is beginning to show signs of renewed energy and looks to be poised for a solid run of growth that would expand business investment and get more people working.

Is this a grim assessment of the U.S. economic situation? No -- such is the sad state of affairs in Germany, a country that just a few months ago seemed like it would finally emerge from the economic Dark Ages that have defined much of the Continent's situation in recent decades. Instead, one of Europe's most important economies may sink even further into its high-tax, high-unemployment and low-growth miasma. We can only hope that this economic power doesn't drag down too many of its neighbors. The flickering light of Europe's economic renaissance has been dimmed, alas, and a looming darkness yet pervades the great land.

But what of this country? If you were under the impression that the scenario I outlined in the first paragraph referred to the U.S., you weren't far off base. Congress has before it the question of whether to extend the tax reform package of 2003. The Senate recently passed a partial package that, among other things, would keep millions of U.S. taxpayers from paying the alternative minimum tax (AMT) next year, but its bill did not extend the reduced tax rates for capital gains and dividends beyond 2008, when they expire. This is unfinished business. Both rates currently stand at 15%; failure to act will move the rate on capital gains to 20% and tax dividends at an individual's marginal income-tax rate.

The House is one step ahead, voting to extend the 15% rate for two additional years, to 2010. Better, but still well short of the mark.

Of course, part of the problem here is the same thing that is bedeviling the Germans right now -- politics. Political parties have their own incentives and they don't necessarily line up with what may be the best economic policy. For example, when some politicians see the phrase "Bush tax cuts," they may have a very hard time getting beyond the first word. This is natural, I suppose, but not often helpful. So, it may be useful, for starters, for us to stop calling them the Bush tax cuts. That a particular president is in office when certain policies are passed is doubtless of significant political importance, but these considerations just muddy the economic debate, which should focus on the proper level of taxation.

In that spirit, let's drop the word "cuts," too. The problem with advocating a cut in something is that you are necessarily going to stir up political trouble from someone who will want to increase it again. So, even if you are fortunate enough to get your cut enacted, it is likely a matter of time before the political pendulum swings back and someone else gets their increase. And so we got the Reagan tax cuts, the Clinton tax hikes, the Bush temporary tax cuts and . . . who knows what is next? Both sides can't be right, which means something must be wrong.

In the meantime, taxpayers -- both consumers and businesses -- are left to wonder and worry what the next tax package will look like, and they are forced to scurry and scheme to take advantage of a current law or to avoid the penalties of the next. Or vice versa. This is no way for a government to treat its citizens.

That our current tax system is complicated and burdensome and absorbs unnecessary amounts of our limited resources is well accepted by most everyone, and this issue was a primary concern of the Advisory Panel on Federal Tax Reform that recently released its recommendations. This problem deserves to be seriously addressed, but we could take a big step in the right direction if we just stop messing with federal tax rates. Maybe Congress should take a cue from the Federal Reserve, which learned a long time ago that oversteering with its policy corrections wreaks havoc with market expectations and impedes economic growth. Just as the Federal Reserve has made it clear that it will strive to maintain low inflation, which has allowed businesses and consumers to invest and plan accordingly, Congress should establish good tax rates and walk away. The people will take it from there.

So what are good tax rates? It's useful to begin with consideration of a simple principle: Taxes distort behavior. From this powerful little sentence comes the key insight that should inform our thinking about setting tax rates. Any tax, even the lowest and the fairest, will cause people to consume less or work less. Taxes that are inordinately high only exacerbate this reaction, and the aggregate accumulation of these individual decisions can be devastating to an economy.

Good tax rates, then, need to be high enough to generate sufficient revenues, but not so high that they choke off growth and, perversely, decrease tax revenues. This, of course, is the tricky part, and it brings us to the task at hand: Should Congress extend the 15% rate on capital gains and dividends? Wrong question. Should Congress make the 15% rate permanent? Yes. (This assumes that a lower rate is politically impossible.)

These taxes are particularly cumbersome because they hit a market economy right in its collective heart, which is its entrepreneurial and risk-taking spirit. What makes this country's economy so vibrant is its participants' willingness to take chances, innovate, acquire financing, hire new people and break old molds. Every increase in taxes on capital gains and dividends is a direct tax on this vitality.

Americans aren't risk-takers by nature any more than Germans are intrinsically less willing to work than Americans. The reason the U.S. economy is so much more vibrant than Germany's is that people in each country are playing by different rules. But we shouldn't take our vibrancy for granted. Tax rates matter. A shift back to higher rates will have negative consequences.

And this isn't about giving tax breaks to the rich. The Wall Street Journal recently published a piece by former Secretary of Commerce Don Evans, who noted that "nearly 60% of those paying capital gains taxes earn less than $50,000 a year, and 85% of capital gains taxpayers earn less than $100,000." In addition, he wrote that lower tax rates on savings and investment benefited 24 million families to the tune of about $950 on their 2004 taxes.

Do wealthier citizens realize greater savings? Of course -- this is true by definition. But that doesn't make it wrong. Let's look at two examples: First, there are those entrepreneurs who have been working their tails off for years with little or no compensation and who, if they are lucky, finally realize a relatively big gain. What kind of Scrooge would snatch away this entrepreneurial carrot? As mentioned earlier, under a good system you have to provide for these rewards or you will discourage the risk taking that is the lifeblood of our economy. Additionally, those entrepreneurs create huge social surpluses in the form of new jobs and spin-off businesses. Entrepreneurs capture a small portion of the social surpluses that they create, but a small percentage of something big is, well, big. Congratulations, I say.

Another group of wealthier individuals includes those who, for a variety of reasons, earn more money than the rest of us. Again, I tip my hat. Does it make sense to try to capture more of those folks' money by raising rates on everyone? To persecute the few, should we punish the many? We need to remember that many so-called wealthy families are those with two wage-earners who are doing nothing more than trying to raise their children and pursue their careers. Research has shown that much of America's economic growth in recent decades is owing to this phenomenon -- we should encourage this dynamic, not squelch it.

But shouldn't we worry about federal deficits? Isn't it true that we need to raise the capital gains and dividends rate to capture more revenue and thus help close the widening deficit maw? The plain fact is that last fiscal year the debt-to-GDP ratio (broadly defined) went up only 0.2%. If the forecasted deficits over the next five years are correct, it will begin declining. Tax revenues will rise as economic activity continues to grow -- indeed, this has been the case in 2005. Besides, to raise tax rates and thereby dampen economic activity seems a perverse way to improve our economic situation, including our level of tax receipts -- 15% of something is better than 20% of nothing.
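The closing quip compresses a simple identity: revenue is the tax rate times the taxable base, and the base itself shrinks as the rate rises, since investors defer or avoid realizations. A toy sketch in Python, with an entirely invented behavioral response (no actual elasticity data), shows how a higher rate can collect less:

    # Toy illustration of "15% of something is better than 20% of nothing."
    # The sensitivity of realizations to the rate is made up for illustration.
    def realizations(rate, base=1000.0, sensitivity=3.0):
        """Taxable gains realized, shrinking as the rate rises."""
        return base * max(0.0, 1.0 - sensitivity * rate)

    for rate in (0.15, 0.20):
        revenue = rate * realizations(rate)
        print(f"rate {rate:.0%}: revenue {revenue:.1f}")

    # In this toy economy the 15% rate collects 82.5 and the 20% rate only 80.0;
    # the example is directional, not a claim about actual taxpayer behavior.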

Congress has some unfinished business regarding these key tax rates. On the assumption that they do not resolve this issue prior to their holiday break, I invite the senators and representatives to speak frankly with folks back home -- the small-time investor saving for retirement, the light manufacturer on the edge of town, the hardworking couple across the street -- and ask them how these tax rates affect their lives. Our elected officials might be surprised, but they shouldn't be.

Let's not fall back into old patterns of oversteering and overtaxing. Let's not keep trying to trick our citizens into accepting one tax one day, and another tax the next. Let's not try to tax our way to prosperity. It won't work for Germany, and it won't work for us.

Mr. Prescott, senior monetary adviser at the Federal Reserve Bank of Minneapolis and professor of economics at the W.P. Carey School of Business at Arizona State University, is a 2004 Nobel laureate in economics.

URL for this article: http://online.wsj.com/article/SB113503904805326884.html
Copyright 2005 Dow Jones & Company, Inc. All Rights Reserved

Pluralism

The Rise, Fall and Return of Pluralism
November 15, 2005

By Peter F. Drucker, a professor of social science and management at the Claremont Graduate University and a former president of the Society for the History of Technology. He is author, most recently, of "Management Challenges for the 21st Century," just out from HarperBusiness.

(This article originally appeared in The Wall Street Journal on June 1, 1999)

The history of society in the West during the last millennium can -- without much oversimplification -- be summed up in one phrase: The Rise, Fall and Rise of Pluralism.

By the year 1000 the West -- that is, Europe north of the Mediterranean and west of Greek Orthodoxy -- had become a startlingly new and distinct civilization and society, much later dubbed feudalism. At its core was the world's first, and all but invincible, fighting machine: the heavily armored knight fighting on horseback. What made possible fighting on horseback, and with it the armored knight, was the stirrup, an invention that had originated in Central Asia sometime around the year 600. The entire Old World had accepted the stirrup long before 1000; everybody riding a horse anywhere in the Old World rode with a stirrup.

But every other Old World civilization -- Islam, India, China, Japan -- rejected what the stirrup made possible: fighting on horseback. And the reason these civilizations rejected it, despite its tremendous military superiority, was that the armored knight on horseback had to be an autonomous power center beyond the control of central government. To support a single one of these fighting machines -- the knight and his three to five horses and their attendants; the five or more squires (knights in training) necessitated by the profession's high casualty rate; the unspeakably expensive armor -- required the economic output of 100 peasant families, that is, of some 500 people, about 50 times as many as were needed to support the best-equipped professional foot soldier, such as a Roman legionnaire or a Japanese samurai.

The knight exercised full political, economic and social control over the entire knightly enterprise, the fief. This, in short order, induced every other unit in medieval Western society -- secular or religious -- to become an autonomous power center, paying lip service to a central authority such as the pope or a king, but certainly paying nothing else, such as taxes. These separate power centers included barons and counts, bishops and the enormously wealthy monasteries, free cities and craft guilds and, a few decades later, the early universities and countless trading monopolies.

By 1066, when William the Conqueror's victory brought feudalism to England, the West had become totally pluralist. And every group tried constantly to gain more autonomy and more power: political and social control of its members and of access to the privileges membership conferred, its own judiciary, its own fighting force, the right to coin its own money and so on. By 1200 these "special interests" had all but taken over. Every one of them pursued only its goals and was concerned only with its own aggrandizement, wealth and power. No one was concerned with the common good; and the capacity to make societywide policy was all but gone.

The reaction began in the 13th century in the religious sphere, when -- feebly at first -- the papacy tried, at two councils in Lyon, France, to reassert control over bishops and monasteries. It finally established that control at the Council of Trent in the mid-16th century, by which time the pope and the Catholic Church had lost both England and Northern Europe to Protestantism. In the secular sphere, the counterattack against pluralism began 100 years later. The longbow -- a Welsh invention perfected by the English -- had by 1350 destroyed the knight's superiority on the battlefield. A few years later the cannon -- adapting to military uses the powder the Chinese had invented for their fireworks -- brought down the hitherto impregnable knight's castle.

From then on, for more than 500 years, Western history is the history of the advance of the national state as the sovereign, that is, as the only power center in society. The process was very slow; the resistance of the entrenched "special interests" was enormous. It was not until 1648, for instance -- in the Treaty of Westphalia, which ended Europe's Thirty Years War -- that private armies were abolished, with the nation-state acquiring a monopoly on maintaining armies and on fighting wars. But the process was steady. Step by step, pluralist institutions lost their autonomy. By the end of the Napoleonic Wars -- or shortly thereafter -- the sovereign national state had triumphed everywhere in Europe. Even the clergy in European countries had become civil servants, controlled by the state, paid by the state and subject to the sovereign, whether king or parliament.

The one exception was the United States. Here pluralism survived -- the main reason being America's almost unique religious diversity. And even in the U.S., religiously grounded pluralism was deprived of power by the separation of church and state. It is no accident that in sharp contrast to Continental Europe, no denominationally based party or movement has ever attracted more than marginal political support in the U.S.

By the middle of the last century, social and political theorists, including Hegel and the liberal political philosophers of England and America, proclaimed proudly that pluralism was dead beyond redemption. And at that very moment it came back to life. The first organization that had to have substantial power and substantial autonomy was the new business enterprise as it first arose, practically without precedent, between 1860 and 1870. It was followed in rapid order by a horde of other new institutions, scores of them by now, each requiring substantial autonomy and exercising considerable social control: the labor union, the civil service with its lifetime tenure, the hospital, the university. Each of them, like the pluralist institutions of 800 years ago, is a "special interest." Each needs -- and fights for -- its autonomy.

Not one of them is concerned with the common good. Consider what John L. Lewis, the powerful labor leader, said when FDR asked him to call off a coal miners' strike that threatened to cripple the war effort: "The president of the United States is paid to look after the interests of the nation; I am paid to look after the interest of the coal miners." That is only an especially blunt version of what the leaders of every one of today's "special interests" believe -- and what their constituents pay them for. As happened 800 years ago, this new pluralism threatens to destroy the capacity to make policy -- and with it social cohesion altogether -- in all developed countries.

But there is one essential difference between today's social pluralism and that of 800 years ago. Then, the pluralist institutions -- knights in armor, free cities, merchant guilds or "exempt" bishoprics -- were based on property and power. Today's autonomous organization -- business enterprise, labor union, university, hospital -- is based on function. It derives its capacity to perform squarely from its narrow focus on its single function. The one major attempt to restore the power monopoly of the sovereign state, Stalin's Russia, collapsed primarily because none of its institutions, being deprived of the needed autonomy, could or did function -- not even, it seems, the military, let alone businesses or hospitals.

Only yesterday most of the tasks today's organizations discharge were supposed to be done by the family. The family educated its members. It took care of the old and the sick. It found jobs for members who needed them. And not one of these jobs was actually done well, as even the most cursory look at 19th-century family letters or family histories shows. These tasks can be accomplished only by a truly autonomous institution, independent of either the community or the state.

The challenge of the next millennium, or rather of the next century (we won't have a thousand years), is to preserve the autonomy of our institutions -- and in some cases, like transnational business, autonomy over and beyond national sovereignties -- while at the same time restoring the unity of the polity that we have all but lost, at least in peacetime. We can only hope this can be done -- but so far no one knows how to do it. We do know that it will require something even less precedented than today's pluralism: the willingness and ability of each of today's institutions to maintain the focus on the narrow and specific function that gives them the capacity to perform, and yet the willingness and ability to work together and with political authority for the common good.

This is the enormous challenge the second millennium in the developed countries is bequeathing the third millennium.

URL for this article: http://online.wsj.com/article/SB113207898125297807.html
Copyright 2005 Dow Jones & Company, Inc. All Rights Reserved

Monday, December 12, 2005

Politicized Science?

December 11, 2005
The Way We Live Now

Madness About a Method

Science is the distinctive achievement and crowning glory of the modern age. So, at least, we are often told. It is also something that, relatively speaking, the United States is pretty good at. By many measures, this nation leads the world in scientific research, even if our dominance has been slipping of late. Oddly, though, Americans on the whole do not seem to care greatly for science.

Traditionalists, especially on the right, fear that science promotes godless materialism. Its insistence on finding purely natural explanations, they maintain, threatens to drain the world of moral purpose and spiritual meaning. On the left, fashionable thinkers of recent years have declared that science is an ideological prop of global capitalism. In the guise of giving us an objective picture of reality, they say, science encodes a hidden justification for the dominance of one class (bourgeois), one race (white) and one sex (male).

As for the great ruck of ordinary Americans, they are merely uninterested in, or perhaps bored by, science. Only one in five has bothered to take a physics course. Three out of four haven't heard that the universe is expanding. Nearly half, according to a recent survey, seem to believe that God created man in his present form within the last 10,000 years. Less than 10 percent of adult Americans, it is estimated, are in possession of basic scientific literacy.

This ignorance of science, flecked with outright hostility, is worth pondering at a moment when three of the nation's most contentious political issues - global warming, stem-cell research and the teaching of intelligent design - are scientific in character. One reason that has been cited for the dislike of science is that it is "irresistible" - that its influence tends to overwhelm and drive out competing values and authorities. But the Bush administration seems all too successful in resisting it. Time after time, critics say, the administration has manipulated and suppressed scientific findings for political reasons.

In rationalizing his opposition to the creation of new embryonic stem-cell lines, for example, the president informed the public that existing lines would be sufficient for medical purposes - a claim that left researchers flabbergasted and proved to be wildly off the mark. On the issue of climate change, American inaction on curbing greenhouse gas emissions is defended on the grounds that there is still some uncertainty about the magnitude and causes of global warming. Administration allies have even maligned the motives of climate researchers, arguing that their "alarmist" predictions are aimed at ensuring a steady flow of scientific grant money - and conveniently overlooking the fact that many global-warming skeptics are themselves financed by the energy industry. (As Richard Posner has observed, the industry with the keenest financial interest in getting climate change right - the insurance industry - is taking global warming very seriously, indeed.)

Are we to conclude that the Bush administration is anti-science? Not necessarily. Its selective aversion to scientific evidence may be more strategic than philosophical. Perhaps the administration accepts the authority of science but has a scheme for reckoning costs and benefits that it is not entirely candid about - a scheme in which, say, the next quarter's corporate profits outweigh rising sea levels or third world drought a half-century hence. When it comes to science, a cynic might remark, there is little point in "speaking truth to power": power already knows the truth.

In fairness, resistance to the authority of science can sometimes be detected even within the scientific community, and in its more progressive precincts, no less. Take the issue of race. One of the most durable sources of evil in the world has been the idea that humans are divided into races and that some races are naturally superior to others. So it was morally exhilarating to discover, with the rise of modern genetics, that racial differences are biologically trifling - merely "skin deep," in the popular phrase. For the last three decades, the scientific consensus has been that "race" is merely a social construct, since genetic variation among individuals of the same race is far greater than the variation between races. Recently, however, a fallacy in that reasoning - a rather subtle one - has been identified by the Cambridge University statistician A.W.F. Edwards. The concept of race may not be biologically meaningless after all; it might even have some practical use in deciding on medical treatments, at least until more complete individual genomic information becomes available. Yet in the interests of humane values, many scientists are reluctant to make even minor adjustments to the old orthodoxy. "One of the more painful spectacles of modern science," the developmental biologist Armand Marie Leroi has observed, "is that of human geneticists piously disavowing the existence of races even as they investigate the genetic relationships between 'ethnic groups.'"

For nonscientists, it may be the sheer difficulty of science - its remoteness from any "common sense" view of the world - that makes it seem alien and dangerous. Nothing could be more contrary to intuition than quantum mechanics, in which everyday categories of cause and effect break down completely; or the theory of the Big Bang, according to which the universe somehow leapt into existence from a pointlike singularity.

Science is also a rival to other worldviews that most people find more congenial. In hopes of allaying the sense of rivalry, it is often said that science and religious faith are compatible, since the former deals with "how" questions, the latter with "why" questions. As an empirical matter, however, that does not seem to be true. On the whole, around 9 in 10 Americans say they believe in a personal God. When scientists are surveyed, that figure falls to 4 in 10. Among the scientific elite - members of the National Academy of Sciences - fewer than 1 in 10 say they believe in God, with the biologists in particular professing agnosticism or atheism at a rate of 95 percent.

Vaclav Havel once observed, in a transport of anti-science afflatus, that "Modern science ... abolishes as mere fiction the innermost foundations of our natural world: it kills God and takes his place on the vacant throne, so henceforth it would be science that would hold the order of being in its hand as its sole legitimate guardian and so be the legitimate arbiter of all relevant truth." So what are the options for someone who is determined to resist this usurping arbiter? One of them is to insist that science can't possibly tell the whole story: by limiting itself to "natural" explanations, it blinds itself to the supernatural order that gives meaning to the universe. The problem is that no one has ever shown how supernatural causes can be accommodated by the scientific method, which relies on testability to produce consensus.

That suggests a second option. You might concede that science is a path to the truth but deny that it is the path. Here, though, you will find it difficult to locate much opposition, even among scientists. No one these days wants to be guilty of "scientism," the belief that science is a uniquely privileged form of knowledge and that everything else is at best poetry, at worst nonsense. Yet if science is merely one among many paths, it is a path that is inherently expansionist, absorbing others whenever it draws near. Is there a believer today who does not feel slightly threatened by current research into how the wiring of our brains might have evolved in a way that encourages faith in deities?

This leaves a still more radical option. You might deny that science is a path to truth at all. That is not quite so crazy as it sounds. Among philosophers of science, there is a perfectly respectable (if minority) view called "instrumentalism." According to this view, scientific theories do not yield a true picture of a mind-independent reality; they are merely useful tools that enable us to predict our experience and have a measure of control over it. History provides some support for instrumentalism. Scientific progress, it has been observed, takes place by funerals. Since past scientific theories have invariably proved false - phlogiston, anyone? - we can expect the same of our present and future theories. That does not take away from their utility as engines for turning out cures and weapons and gadgets, or at their most picturesque, as abstract stories to keep us in awe before the cosmos.

The problem with this line of thought is that it makes the success of science something of a miracle. How, asks the Oxford zoologist Richard Dawkins, do we account for science's "spectacular ability to make matter and energy jump through hoops on command" if not by assuming that the world, deep down, is more or less the way science says it is? Only a philosopher, and perhaps an oversubtle one, would advocate acting on science without believing it is really true. But to believe it and yet refuse to act on it - now, that takes a politician.

Jim Holt is a frequent contributor to the magazine.

Big Idea

December 11, 2005
Questions for Peter Watson

What's the Big Idea?

Q: Your ambitious new book, "Ideas: A History of Thought and Invention, From Fire to Freud," claims to chronicle all the major ideas in the world since the invention of the hand ax two million years ago. Are you trying to be a polymath?

My wife says I am the know-it-all from hell.

How does one go about deciding which ideas to put in and which to leave out? As they say, even taxi drivers have ideas.

Yes, taxi drivers have ideas. They have ideas about how to get from Eighth Street to 81st Street by missing the Midtown traffic. But what we are talking about here - let's be sensible - are ideas that have an impact on the lives of many people. We're not talking about just little ideas, are we?

On the other hand, not all big ideas are good ideas. In fact, most big ideas are probably terrible ideas. What do you think is the single worst idea in history?

Without question, ethical monotheism. The idea of one true god. The idea that our life and ethical conduct on earth determines how we will go in the next world. This has been responsible for most of the wars and bigotry in history.

But religion has also been responsible for investing countless lives with meaning and inner richness.

I lead a perfectly healthy, satisfactory life without being religious. And I think more people should try it.

It sounds as if you're starting your own church.

Not at all. I do not believe in the inner world. I think that the inner world comes from the exploration of the outer world - reading, traveling, talking. I do not believe that meditation or cogitation leads to wisdom or peace or the truth.

Then I don't understand why you would want to write a history of ideas, since inner reflection and dreaminess surely count at least as much as scientific experiment in the formation of new ideas.

To paraphrase the English philosopher John Gray, it is more sensible to look out on the world from a zoo than from a monastery. Science, or looking out, is better than contemplation, or looking in.

If that were true, how would you explain a novelist like Virginia Woolf, whose achievement was based on the rejection of the panoramic outward view in favor of inner sensibility?

The rise of the novel generally is a great turning in. But I don't think it has given a lot of satisfaction to people. It has not achieved anything collective. It's a lot of little personal turnings that lots of people love to connect with. But these are fugitive, evanescent truths. They don't stay with you very long or help you do much.

You strike me as deeply unanalyzed. Have you ever considered seeing a psychiatrist?

I was a psychiatrist. I left because I thought Freud was rubbish.

Where did you train?

The Tavistock Clinic in London. I left in the late '60s because I thought Freudian therapy was a waste of time. I don't believe there is any such thing as the unconscious or the id.

In that case, where do you think ideas come from?

I don't think they come out of daydreaming. Everybody who has had a great idea or made a great realization has been working very hard at it, and they often have failed many times. You don't go from nothing to a great idea without doing a lot of work.

I find I seldom have ideas away from my desk.

That is because ideas come from other ideas. I used to sleep with a piece of paper by my bed. But I never had an idea in bed. The other thing I noticed is that when you are out to dinner and you have a good idea and write it down, the next day when you're sober, it's terrible.

Perhaps if you went out less, you would have better ideas.

I think the interesting thing in life is not having an idea, but realizing it.

The Excluded Middle - book review

December 11, 2005
'Off Center,' by Jacob S. Hacker and Paul Pierson

The Excluded Middle

Over the last few years, in this time of Democratic despondency, there has emerged a new genre of comfort books for liberals - books that seek to expose the nefarious means by which conservatives have amassed power, while at the same time reassuring urban liberals that they bear none of the blame. Thomas Frank's best-selling "What's the Matter With Kansas?," for instance, advanced the premise that rural voters just aren't sophisticated enough to vote in their own interests. In "Don't Think of an Elephant!," the linguist George Lakoff took a slightly different angle, suggesting that these voters weren't dumb, exactly, but that their brain synapses had been rewired by the Republicans' skillful manipulation of language. Now come Jacob S. Hacker and Paul Pierson, political science professors at Yale and the University of California, Berkeley, with "Off Center: The Republican Revolution and the Erosion of American Democracy." Hacker and Pierson offer a variation on this same theme: voters can't make the right choices, they contend, because our system of government itself has dangerously malfunctioned.

The authors begin with the basic premise, buttressed with a sheaf of studies and laid out in clinical prose, that the American electorate is no more conservative than it ever was. "When Reagan was elected in 1980, the public mood was more conservative than in any year since 1952," Hacker and Pierson write. "But by the time of George W. Bush's election in 2000, Americans had grown substantially more liberal" and their views "were virtually identical to their aggregate opinions in 1972." This represents an immediate break with Frank, who argued that American voters have swerved hard right in response to social issues, but to anyone who's actually talked to voters around the country it is also a more plausible claim. For all the hype about the so-called religious right, most rural and exurban voters display little ideological zealotry; rather, they seem inclined toward mild conservatism on economics and foreign policy, along with a reverence for individual liberty - a combination which places them firmly in the historical mainstream of American politics.

By design, Hacker and Pierson argue, American democracy should force politicians to cater to this political center, and yet the Republican majority in Washington has all but ignored it. As a case study, the authors examine the G.O.P.'s tax cuts, which have reduced federal taxes to their lowest level since 1950. Even administration officials admitted, privately, that Americans weren't clamoring for a tax cut. And, as has often been reported, roughly 40 percent of these tax cuts went to the 1 percent of Americans who make the most money, while the average voter will get little more than years of budget deficits. How is it, Hacker and Pierson ask, that Republicans have managed to pursue such an extreme policy without incurring any political consequences?

Their answer is that Republicans have learned how to game the system shamelessly. Groups like Americans for Tax Reform and the Club for Growth enforced party discipline, forcing moderate Republicans to back the president or risk being unseated. The administration dishonestly marketed the tax cuts, promoting the idea that they would somehow transform the lives of middle-income families. And by inserting so-called sunset provisions into the final law, which made the tax cuts appear to expire when the intention was to renew them, they were able to disguise their true cost in the budgeting process. In all of these ways, the authors contend, Republicans managed to enact a policy that benefits the few while appearing to champion the many.

Hacker and Pierson go on to expand this argument into a broader indictment of Republican leaders, who, they claim, have sabotaged the checks and balances built into American government so that they can fool voters while governing in the narrow interests of extremists. In particular, they accuse Republicans of manipulating the media (as evidenced, for example, by the embarrassing revelation that the administration paid commentators for favorable coverage) and of ruthlessly redrawing congressional districts.

No doubt Hacker and Pierson wish their book could have arrived a year ago, rather than at this moment of upheaval in Washington, when their underlying argument - that Republicans have figured out a way to do whatever they want without fear of centrist rejection - is being effectively disproved. After all, President Bush's poll numbers have reached historic lows, and moderates in Congress, sensing that the president can no longer protect them, have refused to extend tax cuts or to slash critical spending. Bush's Social Security plan, which gets extended treatment in "Off Center," is effectively dead. The political center is apparently not the doormat the authors believe it to be. Nonetheless, the deeper message of Hacker and Pierson's book will no doubt resonate with a lot of readers - the idea that Republicans win elections by manipulating the electoral system and misrepresenting their policies, so that voters are unable to understand what they're voting for.

There is substantial truth to most of Hacker and Pierson's claims about Republican tactics. But is this duplicity really the sole reason, or even the main one, that so many moderate voters continue to help elect conservatives? Could it be, for instance, that the threat of terrorism and the G.O.P.'s historic advantage on issues of national security are contributing factors? (It's worth noting that Bush's presidency was in deep distress before the attacks of Sept. 11.) Hacker and Pierson flatly dismiss this idea; their book is concerned only with domestic policy, they say, because that's what voters care about most. What about the perception of Democratic elitism, which alienates a lot of rural voters who might otherwise vote for an alternative candidate? The authors reject this as a media exaggeration, and assert that poor and working-class Americans have never identified as strongly with the Democratic Party as they do today. This may be true of voters in the aggregate, but it conveniently fails to acknowledge what exit polls have made starkly clear - that middle-income white men have fled the party by the planeload in recent elections, providing the Republican margin of victory.

In fact, Hacker and Pierson can't seem to find any significant fault among Democrats at all, save for their chronic but so darn lovable disunity. Like Frank and Lakoff, the authors seem to prefer the more self-ennobling explanation that conservatives have seized power from an unwitting electorate. For all its pretensions to objectivity, "Off Center" deteriorates into just the latest example in our political discourse of what might be called confirmational analysis - that is, a work whose primary purpose is to confirm what its audience already believes.

In the end, for all its talk about the political center, there is a radical current that runs through "Off Center" - an insinuation that American democracy no longer works simply because Democrats haven't been winning. This is most pronounced in discussions of the Electoral College, which Hacker and Pierson dream of abolishing, and the structure of the Senate, which they deplore. Citing the New Yorker writer Hendrik Hertzberg, the authors point out that if each senator represents half his state, then the Democratic minority in the Senate actually represents 30 million more voters than the Republican majority. This is very interesting, but, as a couple of political scientists should know, it's also irrelevant; our democracy is not, in fact, an Athenian democracy but a republic of states that was designed to protect small states from the dictates of urban elites. That some liberals, Hacker and Pierson among them, would reinvent the system now for their own ends is highly ironic, given that this is precisely the kind of contempt for the traditions of American democracy of which they accuse their opponents. Which just goes to show you that the real threat to our system of government is ideological certainty - no matter whose ideology happens to be at issue.

Matt Bai covers national politics for The New York Times Magazine.

Tuesday, December 06, 2005

Milton Friedman

The New York Times
December 4, 2005
Everybody's Business

On Milton Friedman's Birthday, We Get the Present: Him

ONE of the infinite number of ways in which I have been blessed is in the friends I inherited from my parents. Peter M. Flanigan, investment banker and aide to my former boss, Richard M. Nixon, and ambassador to Spain. Roy L. Ash, a co-founder of Litton Industries and a budget director in the Nixon and Ford administrations. Mr. Nixon, president and peacemaker. Paul W. McCracken, genius economist and adviser to three presidents. C. Lowell Harriss, a super teacher and emeritus professor of economics at Columbia University. Anna Jacobson Schwartz, a renowned economist and statistician. And the list goes on and on.

But at the top of the totem pole for their contributions to man's progress and to human understanding comes Milton Friedman, and his wife, Rose D. Friedman. This week, a giant fête is being held in their honor (nominally about his 90th birthday, although he actually turned 93 in July) in Los Angeles. I was to be the master of ceremonies, but I was also needed to preach the good gospel of freedom and gratitude at the University of Utah that night, so I think I will tell a little bit about this giant of a man right now.

In addition to being an emeritus professor of economics at Chicago, where he taught from 1946 to 1976, Milton Friedman is:

1. A brilliant mathematician;

2. A spectacularly original and insightful economist (winner of one of the first Nobels in economic science, in 1976, when they were being given to truly heavy hitters instead of the uneven selection we sometimes see today);

3. A gifted writer;

4. A tireless crusader for education for the disadvantaged;

5. A loyal and helpful friend; and

6. Above all, a fearless fighter for individual freedom.

His wife has stood by him and collaborated in these triumphs.

I first came to know Milton and Rose Friedman as something other than pals of my Ma and Pa in 1964-65, when Professor Friedman was, for one fabulous year, the Wesley Clair Mitchell Research Professor of Economics at Columbia, while on leave from his beloved Chicago. I was a junior at Columbia and madly in love with a girl named Cathy, who was at Chicago. On my first visit to Professor Friedman's office, I was lamenting the fact that Cathy was so far away and that I could never possibly find a girl I loved as much in a small town like New York. "Benjy," Professor Friedman said, "I can tell you as a statistician that if there were only one right woman for every man, they would never find each other. Go out and find someone else." It was a flashing insight (and, indeed, I soon found a much more pleasant girlfriend named Mary right across the street at Barnard).

A few days later, when I was crossing Broadway at 116th Street to get to lunch at a Chock full o'Nuts restaurant with Professor Friedman, I urged him to run ahead and cross against the light. "Benjy," he said, "why should we risk the rest of our lives to save 20 seconds?"

These were my first hints that I was in the presence of genius. Although he was working in the graduate school, Professor Friedman let me be a pupil. If memory serves, the text we used was the book he had written with Dr. Schwartz (one of the truly great unsung heroes of economics and my late mother's best friend at Barnard), "A Monetary History of the United States." This book was the culmination of their analysis of the connection between the quantity of money and business cycles in the United States economy.

Until "The Monetary History," the prevailing view of economists was that the supply of money affected the price level but not the real level of economic activity. By dint of painstaking research and formulas, Professor Friedman and Dr. Schwartz showed that changes in the money supply greatly affected real levels of output and employment.

The main thesis of the book was that the Great Depression had been caused not by changes in tariff laws (always a questionable notion at best), not by the stock market crash (even more questionable), but by catastrophically wrongheaded decisions by the Federal Reserve Board in the period 1929-33 and again in 1936-37. The Fed - obsessed with fears of inflation even as the economy was collapsing - shrank the money supply drastically and basically choked the life out of the economy. (There was also a fascinating ethnic and racial angle to the story, which Professor Friedman later related to me: certain potentates at the Fed were doing this in part as an anti-Semitic reaction to the views of a Jewish Fed official named Eugene I. Meyer, who was also owner of The Washington Post and father of Katharine Meyer Graham.)

In the world of economics, this was a discovery on a par with the theory of relativity in physics or Copernican astronomy. There is simply no way to exaggerate its importance. The use of the money supply to regulate the economy and to prevent future depressions was largely born of the work by Professor Friedman and Dr. Schwartz. It was, in every way, lifesaving.

There was an immense additional amount of work by Professor Friedman in economics and econometrics, and much of it had to do with the relation of the price level, monetary aggregates, output and unemployment to one another, and especially the key role of expectations in thwarting economic policy. For this work, he won the Nobel. But economics was just the beginning of his work, along with his wife's.

In about 1965, the whole world was worshiping at the altar of John F. Kennedy's words, "Ask not what your country can do for you - ask what you can do for your country." Professor Friedman, in his writing, said that neither was a fitting question in a free society. People should be asking what they can do for themselves and their friends and communities, not how they can serve the state or what they could get as wards of the state.

This became the beginning of Professor Friedman's 1980 PBS series "Free to Choose," about the role of individual freedom in creating a prosperous, politically free society. At a time when the prevailing liberal ethos was all about the state, planning and direction from on high, Professor Friedman and his wife stood up for the glory of the rights and choices of the individual. From the individual, not from the state, came creativity, progress, freedom, prosperity. From the state came oppression and stagnation. While (just as a humble opinion) I see the state having a vital role in securing freedom for minorities, defending the society against aggression, and delivering the mail, Professor Friedman's basic point is certainly correct. (He saw major roles for the state in enforcing antitrust laws and insuring banks and in other areas, too.) One of Professor Friedman's most apt students was Ronald Wilson Reagan, and another is the governor of my home state of California, Arnold Schwarzenegger.

Now, in the autumn of his life, Professor Friedman works for school choice. He wants to give parents vouchers so they can send their children to better schools, removing them from failing public schools. He wants to end the state's near-monopoly on education, which has produced such disappointing results, especially among the nonrich and nonwhite. He faces an uphill struggle (and we parents of teenagers who refuse to do homework think the return of the lash might be a better innovation), but he is in there pitching for giving poor people the same choices rich people have in education. It can hardly be worse than what we have today.

THERE are a few really great names in standing up for the individual in a world where the beautiful people always seem to want the state to tell us how to live - William F. Buckley; Robert L. Bartley, the late editor of the Wall Street Journal's editorial page; and Mr. Reagan - but they all pale before the brain power and ingenuity of Milton and Rose Friedman. If we have a free society today, if we have avoided anything close to another Great Depression, if we have prosperity and fairly stable prices, we owe much of it to Milton Friedman. If we have a free market economy that will yet pull us through our many travails and will be the beacon of hope to the whole world, if we still have a majority of the economy not in the hands of the state, much of the credit goes to Milton Friedman.

It is not a stretch to say that when great buildings of Manhattan have vanished, intelligent people will still find inspiration in the works of Milton Friedman. He will be in a pantheon of economists along with Adam Smith, David Ricardo and John Stuart Mill. And he helped me find a girlfriend and kept me from getting run over. Happy birthday, Professor Friedman, and many more to come.

Ben Stein is a lawyer, writer, actor and economist. E-mail: ebiz@nytimes.com.

Tuesday, November 15, 2005

Tribute to Peter Drucker - Steve Forbes

A Tribute to Peter Drucker

By STEVE FORBES November 15, 2005; Page A22

What made Peter Drucker, who died last Friday just shy of his 96th birthday, the most influential management guru of the modern era?

Mr. Drucker's genius for extraordinarily farsighted insights came from a combination of intense curiosity, right principles and deep understanding of the perfections and imperfections of human nature. He never went stale intellectually, which is why business journalists, executives, entrepreneurs, leaders of nonprofit institutions, students and the occasionally wise politician eagerly sought to pick his brains right up to the time he died.

* * *

What helped make Mr. Drucker so insightful was a profound understanding of economics, an understanding that still eludes most economists today. Not for him was the notion of "macroeconomics," of seeing the economy as something of a machine that can achieve steady, stable growth. To him, traditional economic notions of "equilibrium" or Keynesian ideas of "aggregate demand" were nonsense. Innovation, constant change, and turmoil were the true constants of a progressing economy.

No surprise that the economist Joseph Schumpeter, a fellow Austrian (at least by birth), was Mr. Drucker's hero. In 1983, at the centennial of both Schumpeter and the then-legendary John Maynard Keynes, Mr. Drucker wrote in Forbes that Schumpeter's centenary would hardly be noticed. Yet "Schumpeter it is who will shape the thinking and inform the questions on economic theory and economic policy for the rest of this century, if not for the next 30 or 50 years." Today Schumpeter's emphasis on the crucial importance of entrepreneurship and "creative destruction" is commonplace.

WORDS OF WISDOM
Read samples of Peter Drucker's insights into management [1] and selected writings [2] for The Wall Street Journal.
• Manager's Journal: Sell the Mailroom [3]
• Review & Outlook: Drucker on Everything [4] (11/14/2005)
• The American CEO [5] (12/30/2004)
• The Rules of Executive Class [6] (10/01/2004)
• The Rise, Fall and Return of Pluralism [7] (06/01/1999)

As Mr. Drucker wrote over two decades ago, "The economy is forever going to change and is biological rather than mechanistic in nature. The innovator is the true subject of economics. Entrepreneurs that move resources from old and obsolescent to new and more productive employments are the very essence of economics and certainly of a modern economy. Innovation makes obsolete yesterday's capital earnings and capital investment. The more an economy progresses the more capital formation -- profits -- will it therefore need." Both men saw profits as a moral imperative, a genuine "cost" of staying in business, because "Nothing is predictable except that today's profitable business will become tomorrow's white elephant."

B.C. Forbes, our company's founder, who came to this country 100 years ago with little education and even less money, liked to say that you learn more about a company's prospects from observing its "head knocker" (what he called CEOs) than you will from its balance sheet. Mr. Drucker spent a lifetime hammering home the point that people are key. For instance, a leader who looks at workers as a cost instead of a resource is fatally flawed.

No surprise he long recognized the importance of entrepreneurs: "All great change in business has come from outside the firm, not from inside."

Mr. Drucker's ability to prophesy -- almost always correctly -- was uncanny. All of this is why he could come up with innovations that now seem commonplace, such as management by objectives. He continued to admonish executives to carve out time to think and make careful decisions, to focus on one or two tasks, and to delegate what they can't do well themselves. That's why, for example, Mr. Drucker remained a one-man shop, a soloist; he could easily have founded a large consulting firm and gotten immensely rich. But that would have gone against his profoundest instincts. He was at his best as a teacher -- gathering information, gaining insights and then getting others to gain understanding. Schumpeter believed asking the right questions was more important than the answers. Mr. Drucker agreed -- to a point, anyway.

Decades ago, Mr. Drucker foresaw the rise of "knowledge workers." After World War II, he realized the far-reaching consequences of the GI Bill of Rights, which enabled millions of veterans to go to college; this led him to predict, long before computer chips and the Internet, that "knowledge workers" would replace manual workers. Mr. Drucker also prophesied the breakdown of the traditional, thoroughly integrated, hierarchical industrial corporation. In the 1950s, he predicted the rise of Japan as a major economy, an astonishing insight when many experts thought the country would forever be a nation of small farmers and manufacturers of cheap, shoddy goods. He also saw Japan's subsequent troubles -- an aging population and a lack of vigorous entrepreneurship and worker flexibility.

Mr. Drucker long ago warned of the consequences of the rise of corporate and government pension funds, and the impact these vast accumulations of money -- and thus power -- would have on corporate governance, years before anyone had heard of Calpers. He also warned of a backlash from the extraordinary rise in CEO pay. "In the next economic downturn," he told Forbes readers nearly a decade ago, "there will be an outbreak of bitterness and contempt for these super corporate chieftains who pay themselves millions. In every major economic downturn in U.S. history, the villains have been the heroes during the preceding boom."

Mr. Drucker also told us to expect enormous changes in higher education, thanks to the rise of satellites and the Internet. "Thirty years from now big universities will be relics. Universities won't survive. It is as large a change as when we first got the printed book." He believed "High school graduates should work for at least five years before going on to college." It will be news to most college presidents and a lot of alumni that "higher education is in deep crisis. Colleges won't survive as residential institutions. Today's buildings are hopelessly unsuited and totally unneeded." All this from a lifelong academic.

He brooked no nonsense about some of the topics that obsess Chicken Little today. Outsourcing? He told Fortune in 2002 that "We import two to three times as many jobs as we export. Wage costs are of primary importance for very few industries. The industries that are losing jobs out of the U.S. are the more backward industries." He never tired of pointing out the huge advantage the U.S. has over Europe, Japan and other countries: the flexibility of American workers, not only in changing jobs but in physically moving from one area to another to pursue opportunities.

In fact, outsourcing is a necessity, Mr. Drucker said. Companies should have others do what is not their prime task. Outsourcing is not so much about cost cutting (which he called "illusory") as about improving the quality of work that others can do better than you: "You should outsource everything for which there is no career track that can lead to senior management."

* * *

How higher education is managed did not impress Mr. Drucker; what did was our continuing-education system, whether in community colleges or by computer. Also: "Our most important education system is in the employees' own organization." That is where most Americans learn the most. Mr. Drucker also came up with the admonition to pursue your opportunities and cut your losses: "A critical question for leaders is 'When do you stop pouring resources into things that have achieved their purpose?'" As he repeatedly told Pastor Rick Warren, founder of the 15,000-member Saddleback Community Church in Lake Forest, Calif., who has helped start another 60 churches around the world, "Don't tell me what you are doing, Rick, tell me what you stopped doing."

Until his last breath, Mr. Drucker himself never stopped doing and doing.

Mr. Forbes is president & CEO of Forbes, Inc. and editor-in-chief of Forbes magazine.

URL for this article: http://online.wsj.com/article/SB113202508406497251.html
Hyperlinks in this Article:
(1) http://online.wsj.com/article/SB113192826302796041.html
(2) http://online.wsj.com/article/SB113192891878696055.html
(3) http://online.wsj.com/article/SB113202230063197204.html
(4) http://online.wsj.com/article/SB113192462453195997.html
(5) http://online.wsj.com/article/SB110436476581112426.html
(6) http://online.wsj.com/article/SB108605270355625419.html
(7) http://online.wsj.com/article/SB928182059339889134.html
Copyright 2005 Dow Jones & Company, Inc. All Rights Reserved

Tuesday, October 18, 2005

Welcome

This blog is a supplement to greatguys.blogspot.com, where we park various articles that we'd like to refer to but don't want to post to our main blog.