Tuesday, December 30, 2008

History is replete with examples of empires mounting impressive military campaigns on the cusp of their impending economic collapse.
-Eric Alterman

Monday, December 29, 2008

So far as I can remember, there is not one word in the Gospels in praise of intelligence.
-Bertrand Russell

Sunday, December 28, 2008

A New Kind of Science

By Stephen Wolfram

Stephen Wolfram is a scientist, author, and business leader. He is the creator of Mathematica, the author of A New Kind of Science, and the founder and CEO of Wolfram Research. His career has been characterized by a sequence of original and significant achievements.

Born in London in 1959, Wolfram was educated at Eton, Oxford, and Caltech. He published his first scientific paper at the age of 15, and had received his Ph.D. in theoretical physics from Caltech by the age of 20. Wolfram's early scientific work was mainly in high-energy physics, quantum field theory, and cosmology, and included several now-classic results. Having started to use computers in 1973, Wolfram rapidly became a leader in the emerging field of scientific computing, and in 1979 he began the construction of SMP--the first modern computer algebra system--which he released commercially in 1981.

In recognition of his early work in physics and computing, Wolfram became in 1981 the youngest recipient of a MacArthur Prize Fellowship. Late in 1981 Wolfram then set out on an ambitious new direction in science aimed at understanding the origins of complexity in nature. Wolfram's first key idea was to use computer experiments to study the behavior of simple computer programs known as cellular automata. And starting in 1982 this allowed him to make a series of startling discoveries about the origins of complexity. The papers Wolfram published quickly had a major impact, and laid the groundwork for the emerging field that Wolfram called "complex systems research."
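
A minimal sketch, in Python, of the kind of computer experiment described above: evolving an elementary cellular automaton (here rule 30). The code is purely illustrative and not Wolfram's own; the rule number, grid width, and step count are arbitrary choices.

def step(cells, rule=30):
    """Apply one update of an elementary cellular automaton."""
    n = len(cells)
    out = []
    for i in range(n):
        # Each cell looks at its left neighbor, itself, and its right neighbor.
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right   # neighborhood encoded as 0..7
        out.append((rule >> index) & 1)               # read the matching bit of the rule number
    return out

width, steps = 31, 15
row = [0] * width
row[width // 2] = 1                                   # start from a single black cell
for _ in range(steps):
    print("".join("#" if c else "." for c in row))
    row = step(row)

Even from this one-cell starting condition, rule 30 produces an intricate, seemingly random pattern, which is the kind of behavior behind the discoveries mentioned above.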

Through the mid-1980s, Wolfram continued his work on complexity, discovering a number of fundamental connections between computation and nature, and inventing such concepts as computational irreducibility. Wolfram's work led to a wide range of applications--and provided the main scientific foundations for such initiatives as complexity theory and artificial life. Wolfram himself used his ideas to develop a new randomness generation system and a new approach to computational fluid dynamics--both of which are now in widespread use.

Following his scientific work on complex systems research, in 1986 Wolfram founded the first research center and the first journal in the field. Then, after a highly successful career in academia--first at Caltech, then at the Institute for Advanced Study in Princeton, and finally as Professor of Physics, Mathematics, and Computer Science at the University of Illinois--Wolfram launched Wolfram Research, Inc.

Wolfram began the development of Mathematica in late 1986. The first version of Mathematica was released on June 23, 1988, and was immediately hailed as a major advance in computing. In the years that followed, the popularity of Mathematica grew rapidly, and Wolfram Research became established as a world leader in the software industry, widely recognized for excellence in both technology and business.

Following the release of Mathematica Version 2 in 1991, Wolfram began to divide his time between Mathematica development and scientific research. Building on his work from the mid-1980s, and now with Mathematica as a tool, Wolfram made a rapid succession of major new discoveries. By the mid-1990s his discoveries led him to develop a fundamentally new conceptual framework, which he then spent the remainder of the 1990s applying not only to new kinds of questions, but also to many existing foundational problems in physics, biology, computer science, mathematics, and several other fields.

After more than ten years of highly concentrated work, Wolfram finally described his achievements in his 1200-page book A New Kind of Science. Released on May 14, 2002, the book was widely acclaimed and immediately became a bestseller. Its publication has been seen as initiating a paradigm shift of historic importance in science, with new implications emerging at an increasing rate every year.

Wolfram has been president and CEO of Wolfram Research since its founding in 1987. In addition to his business leadership, Wolfram is deeply involved in the development of the company's technology, and continues to be personally responsible for overseeing all aspects of the functional design of the core Mathematica system.

Wolfram has a lifelong commitment to research and education. In addition to providing software for a generation of scientists and students, Wolfram's company maintains some of the web's most visited sites for technical information. Wolfram is also increasingly active in defining new directions for education, especially in the science he has created.

Building on Mathematica, A New Kind of Science, and the success of Wolfram Research, Wolfram has recently launched several highly creative initiatives that can be expected to have major impacts in diverse areas.

Stephen Wolfram: A New Kind of Science Online - Table of Contents
Stephen Wolfram: Official Website
Mathematica Books Written by Stephen Wolfram
Stephen Wolfram: Articles on Cellular Automata

In every village there is a torch - the teacher; and an extinguisher - the clergyman.
-Victor Hugo

We can never solve our significant problems from the same level of thinking we were at when we created the problems.
-Albert Einstein

Democracy, Inequality, and Representation:

A Comparative Perspective

Pablo Beramendi and Christopher J. Anderson (editors)

The gap between the richest and poorest Americans has grown steadily over the last thirty years, and economic inequality is on the rise in many other industrialized democracies as well. But the magnitude and pace of the increase differs dramatically across nations. A country’s political system and its institutions play a critical role in determining levels of inequality in a society. Democracy, Inequality, and Representation argues that the reverse is also true—inequality itself shapes political systems and institutions in powerful and often overlooked ways.

In Democracy, Inequality, and Representation, distinguished political scientists and economists use a set of international databases to examine the political causes and consequences of income inequality. The volume opens with an examination of how differing systems of political representation contribute to cross-national variations in levels of inequality. Torben Iversen and David Soskice calculate that taxes and income transfers help reduce the poverty rate in Sweden by over 80 percent, while the comparable figure for the United States is only 13 percent. Noting that traditional economic models fail to account for this striking discrepancy, the authors show how variations in electoral systems lead to very different outcomes.

But political causes of disparity are only one part of the equation. The contributors also examine how inequality shapes the democratic process. Pablo Beramendi and Christopher Anderson show how disparity mutes political voices: at the individual level, citizens with the lowest incomes are the least likely to vote, while high levels of inequality in a society result in diminished electoral participation overall. Thomas Cusack, Iversen, and Philipp Rehm demonstrate that uncertainty in the economy changes voters’ attitudes; the mere risk of losing one’s job generates increased popular demand for income support policies almost as much as actual unemployment does. Ronald Rogowski and Duncan MacRae illustrate how changes in levels of inequality can drive reforms in political institutions themselves. Increased demand for female labor participation during World War II led to greater equality between men and women, which in turn encouraged many European countries to extend voting rights to women for the first time.

The contributors to this important new volume skillfully disentangle a series of complex relationships between economics and politics to show how inequality both shapes and is shaped by policy. Democracy, Inequality, and Representation provides deeply nuanced insight into why some democracies are able to curtail inequality—while others continue to witness a division that grows ever deeper.

PABLO BERAMENDI is assistant professor of political science at Duke University. CHRISTOPHER J. ANDERSON is professor of government at Cornell University.

Russell Sage Foundation

Wednesday, December 24, 2008

The minority, the ruling class at present, has the schools and press, usually the Church as well, under its thumb. This enables it to organize and sway the emotions of the masses, and make its tool of them.
-Albert Einstein

Laissez-Faire Capitalism

Should Be as Dead as Soviet Communism


The collapse of Communism as a political system sounded the death knell for Marxism as an ideology. But while laissez-faire capitalism has been a monumental failure in practice, and soundly defeated at the polls, the ideology is still alive and kicking.

The only place you can find an American Marxist these days is teaching a college linguistic theory class. But you can find all manner of free market fundamentalists still on the Senate floor or in Governor's mansions or showing up on TV trying to peddle the deregulation snake oil.

Take Sen. John Ensign, chairman of the National Republican Senatorial Committee, who went on Face the Nation and, with a straight face, said of the economic meltdown: "Unfortunately, it was allowed to be portrayed that this was a result of deregulation, when in fact it was a result of overregulation."

Or Gov. Mark Sanford, who told Joe Scarborough he was against bailing out the auto industry because it would "threaten the very market-based system that has created the wealth that this country has enjoyed."

If a politician announced he was running on a platform of "from each according to his ability, to each according to his need" he would be laughed off the stage. That is also the correct response to anyone who continues to make the case that markets do best when left alone.

It's time to drive the final nail into the coffin of laissez-faire capitalism by treating it like the discredited ideology it inarguably is. If not, the Dr. Frankensteins of the right will surely try to revive the monster and send it marauding through our economy once again.

We've only just begun to bury the financially dead, and the free market fundamentalists are already looking to deflect the blame.

In a comprehensive piece on what led to the mortgage crisis and the subsequent financial meltdown, the New York Times shows how the Bush administration's devotion to unregulated markets was a primary cause of our economy's ruin. But the otherwise fascinating piece puts too much focus on the "mistakes" the Bush team made by not paying attention to the warning signs popping up all around them.

"There is no question we did not recognize the severity of the problems," claimed Al Hubbard, Bush's former chief economic adviser. "Had we, we would have attacked them."

But the mistake wasn't in not recognizing the "severity of the problems" -- the mistake was the ideology that led to the problems. Communism didn't fail because Soviet leaders didn't execute it well enough. Same with free market fundamentalism. In fact, Bush and his team did a bang-up job executing a defective theory. The problem wasn't just the bathwater; the baby itself is rotten to the core.

William Seidman, the longtime GOP economic advisor who oversaw the S&L bailout in 1991, cuts to the chase: "This administration made decisions that allowed the free market to operate as a barroom brawl instead of a prize fight. To make the market work well, you have to have a lot of rules."

Even Alan Greenspan, whose owl-eyed visage would adorn a Mount Rushmore of unregulated capitalists, has begun to see the light, telling a House committee in October that he "made a mistake in presuming that the self-interests of organizations, specifically banks and others, were such that they were best capable of protecting their own shareholders and their equity in the firms."

But most Republicans are still refusing to see what's right in front of them. Especially Bush, our CEO president, who lays the blame not on the failures of the marketplace but on past administrations and corporate greed. "Wall Street got drunk," he says. Maybe so, but who made the last 8 years Happy Hour, and kept serving up the drinks?

Last week, Ben Smith reported that the GOP was launching "a new, in-house think tank aimed at reviving the party's policy heft." In a private memo explaining the think tank, RNC chairman Mike Duncan wrote: "We must show how our ideology can be applied to solve problems." But, of course, it's that very ideology that's causing the problems. It's like the old horror movie cliché: "We've traced the call -- it's coming from inside the house!"

We've got to do everything we can to make sure there will be no sequels to this political horror. The blame shifters cannot be allowed to make their case without the truth being pointed out at every turn. It's time to relegate free market fundamentalists to the same standing as Marxist ideologues: intellectual curiosities occasionally trotted out as relics of a failed philosophy.

Tuesday, December 23, 2008

My only wish is... to transform friends of God into friends of man, believers into thinkers, devotees of prayer into devotees of work, candidates for the hereafter into students of the world, Christians who, by their own profession and admission, are "half animal, half angel" into persons, into whole persons.
-Ludwig Feuerbach (Lectures on the Essence of Religion)

"Top Ten" Humanitarian Crises

Reveal Growing Insecurity, Neglected Health Needs

NEW YORK - December 22 - Massive forced civilian displacements, violence, and unmet medical needs in the Democratic Republic of Congo, Somalia, Iraq, Sudan, and Pakistan, along with neglected medical emergencies in Myanmar and Zimbabwe, are some of the worst humanitarian and medical emergencies in the world, the international medical humanitarian organization Doctors Without Borders/Médecins Sans Frontières (MSF) reported today in its annual list of the "Top Ten" humanitarian crises.

The report underscores major difficulties in bringing assistance to people affected by conflict. The lack of global attention to the growing prevalence of HIV-tuberculosis co-infection and the critical need for increased global efforts to prevent and treat childhood malnutrition—the underlying cause of death for up to five million children per year—are also included in the list.

View the Top Ten Humanitarian Crises of 2008.

"Working on the frontlines of crisis zones throughout the world, MSF medical teams witness firsthand the medical and psychological consequences people endure from extreme violence, displacement, and neglected—yet treatable—diseases and health needs," said MSF International Council President Dr. Christophe Fournier. "In some of these places, it is extremely difficult for aid groups to access populations requiring help. Where we are able to provide assistance, we have a special responsibility to bear witness and speak out about intolerable suffering and draw attention to basic humanitarian needs—needs that are often largely ignored."

Many of the countries on this year’s list illustrate the ever-shrinking space for impartial humanitarian action, making it extremely difficult to deliver aid to those most affected and vulnerable. Aid organizations now operate with increased security risks and in generally more hazardous and threatening environments. In highly politicized and volatile conflicts such as those in Somalia, Pakistan, Sudan, and Iraq, MSF—despite its neutrality and independence—is limited in its ability to directly address immense medical needs.

In Somalia, intensified violence, including direct attacks and threats against aid workers, meant that MSF had to curtail some of its operations in 2008, including the withdrawal of its international staff, significantly reducing the quality of assistance provided to an already weakened population. In Pakistan, hundreds of thousands of people fled air attacks and bombings from a counter-insurgency campaign in the northwest area earlier in the year. After aid workers in the country were threatened, attacked, and kidnapped, MSF restricted the number of international staff in its projects.

In places such as Myanmar and Zimbabwe—where governments fail to make health care a priority or view NGO interventions with suspicion—humanitarian organizations are either limited in the type of assistance they can provide or are left to deal with overwhelming health crises on their own. In Myanmar, where MSF is the main provider of HIV care, hundreds of thousands of people are needlessly dying due to a severe lack of HIV/AIDS treatment while the government does far too little to help its own people.

Governments are also ignoring the crisis of childhood malnutrition. In Niger, the government in 2008 forced the termination of MSF’s child nutrition program in the region of Maradi, where tens of thousands of children were suffering from acute malnutrition. As a result, they have not received proven and highly effective treatment. The closure came at a time when efforts to make progress in the fight against malnutrition globally are more possible—and more necessary—than ever.

"The reality on the ground is that the humanitarian community is unable to do nearly enough for populations in grave need of medical assistance," Dr. Fournier said. "With the release of this list, we hope to focus much needed attention on the millions of people who are trapped in conflict and war, affected by medical crises, whose immediate and essential health needs are neglected, and whose plight often goes unnoticed."

MSF began producing the "Top Ten" list in 1998, when a devastating famine in southern Sudan went largely unreported in U.S. media. Drawing on MSF’s emergency medical work, the list seeks to generate greater awareness of the magnitude and severity of crises that may or may not be reflected in media accounts.


Monday, December 22, 2008

Living Without God:

New Directions for Atheists, Agnostics, Secularists, and the Undecided


Ronald Aronson has a mission: to demonstrate that a life without religion can be coherent, moral, and committed. Optimistic and stirring, Living Without God is less interested in attacking religion than in developing a positive philosophy for atheists, agnostics, secular humanists, skeptics, and freethinkers. Aronson proposes contemporary answers to Immanuel Kant’s three great questions: What can I know? What ought I to do? What can I hope? Grounded in the sense that we are deeply dependent and interconnected beings who are rooted in the universe, nature, history, society, and the global economy, Living Without God explores the experience and issues of 21st-century secularists, especially in America. Reflecting on such perplexing questions as why we are grateful for life’s gifts, who or what is responsible for inequalities, and how to live in the face of aging and dying, Living Without God is also refreshingly topical, touching on such subjects as contemporary terrorism, the war in Iraq, affirmative action, and the remarkable rise of Barack Obama.

Review: "Without Illusions: Doug Ireland welcomes a passionate and practical approach
to secularism" (New Humanist Volume 123 Issue 5 September/October 2008)

Encountering Naturalism:

A Worldview and Its Uses

Book description:

Most of us have a worldview, an overarching context for life that helps to shape our beliefs, goals and actions. This book explores the science-based worldview known as naturalism – a comprehensive and fulfilling alternative to faith-based religion and other varieties of dualism.

Taking empirical science as the route to reliable knowledge, naturalism holds that we inhabit a single, natural world; there is no separate supernatural realm. We are fully physical beings whose origins lie in cosmic and biological evolution. We are therefore entirely at home in the universe.

By understanding and accepting our complete connection to the natural world, naturalism provides a secure foundation for human flourishing, an effective basis for achieving our purposes and addressing our deepest concerns. We don’t need belief in the supernatural to sustain us.

Nature, it turns out, is enough.

The Center for Naturalism
Naturalism.Org
Those who wish to seek out the cause of miracles, and to understand the things of nature as philosophers, and not to stare at them in astonishment like fools, are soon considered heretical and impious, and proclaimed as such by those whom the mob adores as the interpreters of nature and the gods. For these men know that once ignorance is put aside that wonderment would be taken away which is the only means by which their authority is preserved.
-Baruch Spinoza

Baruch Spinoza - Internet Encyclopedia of Philosophy
Baruch Spinoza (Stanford Encyclopedia of Philosophy)

Sunday, December 21, 2008

The Missing Class:

Portraits of the Near Poor in America

Authors: Katherine Newman and Victor Tan Chen

Fifty-seven million Americans, including 21 percent of the nation's children, live a notch above the poverty line, and yet the challenges they face are largely ignored. While government programs assist the poor, and politicians woo the more fortunate, the "Missing Class" is largely invisible and left to fend for itself.

Missing Class parents often work at a breakneck pace to preserve the progress they have made and are but one divorce or unexpected hospitalization away from sliding into poverty. Children face an even more perilous and uncertain future because their parents have so little time to help them with their schoolwork or guide them during their adolescent years. With little supervision, the younger generation often flounders in school, sometimes falling prey to the same problems that are prevalent in the much poorer communities that border Missing Class neighborhoods. Paradoxically, the very efforts that enabled parents to get ahead financially often inhibit their children from advancing; they are in real danger of losing what little ground their parents have gained.

The Missing Class is an urgent and timely exploration that describes, through the experiences of nine families, the unique problems faced by this growing class of people who are neither working poor nor middle class. Katherine Newman and Victor Tan Chen trace where these families came from, how they've struggled to make a decent living, and why they're stuck without a safety net. An eloquent argument for the need to think about inequality in a broader way, The Missing Class has much to tell us about whether the American dream still exists for those who are sacrificing daily to achieve it.

Beyond the Boycott: Labor Rights, Human Rights, and Transnational Activism

https://www.russellsage.org/publications/beyond-boycott

As the world economy becomes increasingly integrated, companies can shift production to wherever wages are lowest and unions weakest. How can workers defend their rights in an era of mobile capital? With national governments forced to compete for foreign investment by rolling back legal protections for workers, fair trade advocates are enlisting consumers to put market pressure on companies to treat their workers fairly. In Beyond the Boycott, sociologist Gay Seidman asks whether this non-governmental approach can reverse the “race to the bottom” in global labor standards.

Beyond the Boycott examines three campaigns in which activists successfully used the threat of a consumer boycott to pressure companies to accept voluntary codes of conduct and independent monitoring of work sites. The voluntary Sullivan Code required American corporations operating in apartheid-era South Africa to improve treatment of their workers; in India, the Rugmark inspection team provides ‘social labels’ for hand-knotted carpets made without child labor; and in Guatemala, COVERCO monitors conditions in factories producing clothing under contract for major American brands. Seidman compares these cases to explore the ingredients of successful campaigns, as well as the inherent limitations facing voluntary monitoring schemes. Despite activists’ emphasis on educating individual consumers to support ethical companies, Seidman finds that, in practice, they have been most successful when they mobilized institutions—such as universities, churches, and shareholder organizations. Moreover, although activists tend to dismiss states’ capabilities, all three cases involved governmental threats of trade sanctions against companies and countries with poor labor records. Finally, Seidman points to an intractable difficulty of independent workplace monitoring: since consumers rarely distinguish between monitoring schemes and labels, companies can handpick monitoring organizations, selecting those with the lowest standards for working conditions and the least aggressive inspections. Transnational consumer movements can increase the bargaining power of the global workforce, Seidman argues, but they cannot replace national governments or local campaigns to expand the meaning of citizenship.

As trade and capital move across borders in growing volume and with greater speed, civil society and human rights movements are also becoming more global. Highly original and thought-provoking, Beyond the Boycott vividly depicts the contemporary movement to humanize globalization—its present and its possible future.

GAY W. SEIDMAN is professor of sociology at the University of Wisconsin, Madison.

A Volume in the American Sociological Association’s Rose Series in Sociology

Saturday, December 20, 2008

Defining Right and Wrong in Brain Science


Essential Readings in Neuroethics

Defining Right and Wrong in Brain Science is an authoritative record of the emerging ideas that are defining neuroethics. Edited by University of Calgary philosophy professor Walter Glannon, it is an essential reference for anyone who wants to understand how these issues have taken shape.

Contributors include Adina Roskies, writing on neuroethics for the new millennium, Martha J. Farah and Paul Root Wolpe on monitoring and manipulating brain function, Antonio Damasio on the neural basis of social behavior, and Alan Leshner on ethical issues in taking neuroscience research from bench to bedside. Other thinkers represented in this collection are British Medical Research Council Chairman Colin Blakemore, Patricia Smith Churchland, Arthur Caplan, Paul McHugh, and Anjan Chatterjee.

This book will be indispensable to readers curious about how discoveries in brain science are stirring up classic--and new--questions of ethics.

This new volume is the fifth in The Dana Foundation Series on Neuroethics.

Friday, December 19, 2008

THREE-YEAR STUDY AT SEVEN MAJOR UNIVERSITIES

FINDS STRONG LINKS BETWEEN ARTS EDUCATION AND COGNITIVE DEVELOPMENT

Washington, DC, March 4, 2008 - Learning, Arts, and the Brain, a study three years in the making, is the result of research by cognitive neuroscientists from seven leading universities across the United States. In the Dana Consortium study, released today at a news conference at the Dana Foundation’s Washington, DC headquarters, researchers grappled with a fundamental question: Are smart people drawn to the arts or does arts training make people smarter?

For the first time, coordinated, multi-university scientific research brings us closer to answering that question. Learning, Arts, and the Brain advances our understanding of the effects of music, dance, and drama education on other types of learning. Children motivated in the arts develop attention skills and strategies for memory retrieval that also apply to other subject areas.

The research was led by Dr. Michael S. Gazzaniga of the University of California at Santa Barbara. “A life-affirming dimension is opening up in neuroscience,” said Dr. Gazzaniga. “To discover how the performance and appreciation of the arts enlarge cognitive capacities will be a long step forward in learning how better to learn and more enjoyably and productively to live. The consortium’s new findings and conceptual advances have clarified what now needs to be done.”
Participating researchers, using brain imaging studies and behavioral assessment, identified eight key points relevant to the interests of parents, students, educators, neuroscientists, and policy makers.

1. An interest in a performing art leads to a high state of motivation that produces the sustained attention necessary to improve performance and the training of attention that leads to improvement in other domains of cognition.

2. Genetic studies have begun to yield candidate genes that may help explain individual differences in interest in the arts.

3. Specific links exist between high levels of music training and the ability to manipulate information in both working and long-term memory; these links extend beyond the domain of music training.

4. In children, there appear to be specific links between the practice of music and skills in geometrical representation, though not in other forms of numerical representation.

5. Correlations exist between music training and both reading acquisition and sequence learning. One of the central predictors of early literacy, phonological awareness, is correlated with both music training and the development of a specific brain pathway.

6. Training in acting appears to lead to memory improvement through the learning of general skills for manipulating semantic information.

7. Adult self-reported interest in aesthetics is related to a temperamental factor of openness, which in turn is influenced by dopamine-related genes.

8. Learning to dance by effective observation is closely related to learning by physical practice, both in the level of achievement and also the neural substrates that support the organization of complex actions. Effective observational learning may transfer to other cognitive skills.

As several of the consortium members stressed at today’s news conference, much of their research was of a preliminary nature, yielding several tight correlations but not definitive causal relationships.

Although “there is still a lot of work to be done,” says Dr. Gazzaniga, the consortium’s research so far has clarified the way forward. “We now have further reasons to believe that training in the arts has positive benefits for more general cognitive mechanisms.”

Principal investigators, working with their colleagues, were:

1 How Arts Training Influences Cognition
Michael Posner, Ph.D.
University of Oregon

2 Musical Skill and Cognition
John Jonides, Ph.D.
University of Michigan

3 Effects of Music Instruction on Developing Cognitive Systems at the Foundations of Mathematics and Science
Elizabeth Spelke, Ph.D.
Harvard University

4 Training in the Arts, Reading, and Brain Imaging
Brian Wandell, Ph.D.
Stanford University

5 Dance and the Brain
Scott Grafton, M.D.
University of California at Santa Barbara

6 Developing and Implementing Neuroimaging Tools to Determine if Training in the Arts Impacts the Brain
Mark D’Esposito, M.D.
University of California, Berkeley

7 Arts Education, the Brain, and Language
Kevin Niall Dunbar, Ph.D.
University of Toronto at Scarborough (formerly at Dartmouth College)

8 Arts Education, the Brain, and Language
Laura-Ann Petitto, Ed.D.
University of Toronto at Scarborough (formerly at Dartmouth College)

9 Effects of Music Training on Brain and Cognitive Development in Under-Privileged 3- to 5-Year-Old Children: Preliminary Results
Helen Neville, Ph.D.
University of Oregon

The Dana Foundation is a private philanthropic organization with particular interests in neuroscience, immunology, and arts education.

Wednesday, December 17, 2008

The "Who Could Have Known?" Era

The Huffington Post

See if this sounds familiar:

An ambitious and risky undertaking carried out with hubris, and featuring the weeding out of anyone who raises alarm bells, little-to-no transparency, an oversight system in which no central authority is accountable, and the deliberate manufacturing of ambiguity and complexity so that if -- when -- it all falls to pieces, the excuse "who could have known?" can be used....

Is it Iraq? Fannie Mae? Citigroup? Bernie Madoff?

The correct answer is: all of the above.

When you look at the elements that were crucial to the creation of each of these debacles, it's amazing how much they all have in common. And not just in how they began but in how they ended: with those responsible being amazed at what happened, because...who could have known? Well, to paraphrase James Inhofe, I'm amazed at the amazement.

In fact, when historians look for a name that sums up the Bush II years, they could do worse than calling them The "Who Could Have Known?" Era.

Each of the disasters listed above was entirely predictable. And, indeed, was predicted. But those who rang the alarm bells were aggressively ignored, which is why it's important that we not let those responsible get away with the "Who Could Have Known?" excuse.

Let's start with Iraq -- specifically the reconstruction of Iraq. This weekend the New York Times got its hands on the unpublished 513-page federal history of the reconstruction. It's not pretty. As the Times puts it: it was "an effort crippled before the invasion by Pentagon planners who were hostile to the idea of rebuilding a foreign country, and then molded into a $100 billion failure by bureaucratic turf wars, spiraling violence and ignorance of the basic elements of Iraqi society and infrastructure." As a result, almost six years and $117 billion later, many essential services are only now reaching pre-war levels.

The report quotes Colin Powell on how the Pentagon, to cover up its failures, "kept inventing numbers of Iraqi security forces [that had reached readiness] -- the number would jump 20,000 a week! 'We now have 80,000, we now have 100,000, we now have 120,000.' "

Hmm, making up numbers to realize a short-term gain, but which end up making the inevitable long-term reckoning much worse? Sounds a lot like what was happening at Citigroup at around the same time.

In late 2002, Charles Prince was put in charge of the company's corporate and investment bank. The banking giant was already knee deep in toxic paper and aggressively looking the other way.

He was so successful at averting his eyes that when, five years later, Wall Street began to feel the initial shocks of the mortgage meltdown and he was told that the bank owned $43 billion in mortgage-related assets, it was the first he'd heard of it. Isn't that something he should have known? Or did he prefer not knowing?

Prince had plenty of help ignoring the obvious, particularly from Robert Rubin. According to a former Citigroup executive quoted in the long New York Times analysis of Citi's downfall, despite ascending to the top of the Citi food chain, Prince "didn't know a C.D.O. from a grocery list, so he looked for someone for advice and support. That person was Rubin."

When it all came tumbling down, both Rubin and Prince portrayed themselves as helpless victims of circumstance, because...Who Could Have Known?

"I've thought a lot about that," Rubin said when asked if he made mistakes at Citigroup. "I honestly don't know. In hindsight, there are a lot of things we'd do differently. But in the context of the facts as I knew them and my role, I'm inclined to think probably not."

What he means, of course, is the facts as he chose to know them.

Prince's head is even higher in the clouds: "Anything," he said, "based on human endeavor and certainly any business that involves risk-taking, you're going to have problems from time to time."

Sounds like he's reading from the same damage control playbook as former Fannie Mae CEO Franklin Raines. According to Raines, he can't be blamed for what happened at Fannie Mae because mortgage stuff is so, well, complicated. In fact, he can't even understand his own mortgage: "I know I can't and I've tried," Raines told a House committee last week. "To this day, I don't know what it said... It's impossible for the average person to understand" mortgage terms such as negative amortization. In other words, Who Could Have Known?

Committee chair Henry Waxman wasn't buying it: "These documents make clear that Fannie Mae and Freddie Mac knew what they were doing. Their own risk managers raised warning after warning about the dangers of investing heavily in the subprime and alternative mortgage markets."

Ignoring warning after warning is an essential element of the "Who Could Have Known?" excuse, as are rewriting history and shamelessly disregarding the foresight shown by those who sounded the alarm bells.

We're seeing the same ingredients in the Madoff affair. "We have worked with Madoff for nearly 20 years," said Jeffrey Tucker, a former federal regulator and the head of an investment firm facing losses of $7.5 billion. "We had no indication that we...were the victims of such a highly sophisticated, massive fraudulent scheme." It's a sentiment echoed by Arthur Levitt, the former chairman of the Securities and Exchange Commission: "I've known [Madoff] for nearly 35 years, and I'm absolutely astonished."

Who Could Have Known?

Well, Harry Markopolos, for one. In 1999, after researching Madoff's methods, Markopolos wrote a letter to the SEC saying, "Madoff Securities is the world's largest Ponzi Scheme." He pursued his claims with the feds for the next nine years, with little result.

Jim Vos, another investment adviser who had examined Madoff's firm, says: "There's no smoking gun, but if you added it all up you wonder why people either did not get it or chose to ignore the red flags."

The answer comes from Vos's cohort Jake Walthour Jr., who told HuffPost blogger Vicky Ward: "In a bull market no one bothers to ask how the returns are met, they just like the returns."

Hasn't the "Who Could Have Known?" excuse been exposed as a sham enough times to render it obsolete?

Apparently not. Here come the Bush Legacy Project's revisionists expecting us to believe that everyone thought Saddam had WMD -- even though many were on record saying he didn't.

In the wake of 9/11, Condi Rice assured us nobody "could have predicted" that someone "would try to use an airplane as a missile." Except, of course, the government report that in 1999 said, "Suicide bomber(s) belonging to al Qaeda's Martyrdom Battalion could crash-land an aircraft packed with high explosives (C-4 and semtex) into the Pentagon, the headquarters of the Central Intelligence Agency (CIA), or the White House."

After Katrina, the White House read from the "Who Could Have Known?" hymnal: No one could have predicted that the storm would be a Category 5, and that this could result in the levees being breached. We now know, of course, that plenty of people knew that the levees could be breached and said so before the storm hit.

Then there is Alan Greenspan, who, looking back in October of this year on the makings of the financial crisis he helped create (I mean, that just happened to come out of nowhere), delivered this "Who Could Have Known?" classic: "If all those extraordinarily capable people were unable to foresee the development of this critical problem...we have to ask ourselves: Why is that? And the answer is that we're not smart enough as people. We just cannot see events that far in advance."

The only problem is, many people did see events that far in advance.

Unlike Greenspan, I don't believe the problem is that we are "not smart enough as people." As we've seen time after time, smart enough people are all too willing to ignore facts they don't like. Or, even worse, they construct oversight systems designed to be ineffective -- and unable to provide to those in power information they don't really want to know.

Much has been made of the smartness of Obama's new team. But I'm hoping that their defining characteristic won't be their IQs but their willingness to confront reality and take responsibility for their decisions.

It's time to say goodbye to the "Who Could Have Known?" era. It's time to know things again. And to know that you know them.

Tuesday, December 16, 2008

Social Class: How Does It Work?

https://www.russellsage.org/publications/social-class

Class differences permeate the neighborhoods, classrooms, and workplaces where we lead our daily lives. But little is known about how class really works, and its importance is often downplayed or denied. In this important new volume, leading sociologists systematically examine how social class operates in the United States today. Social Class argues against the view that we are becoming a classless society. The authors show instead the decisive ways social class matters—from how long people live, to how they raise their children, to how they vote.

The distinguished contributors to Social Class examine how class works in a variety of domains including politics, health, education, gender, and the family. Michael Hout shows that class membership remains an integral part of identity in the U.S.—in two large national surveys, over 97 percent of Americans, when prompted, identify themselves with a particular class. Dalton Conley identifies an intangible but crucial source of class difference that he calls the “opportunity horizon”—children form aspirations based on what they have seen is possible. The best predictor of earning a college degree isn’t race, income, or even parental occupation—it is, rather, the level of education that one’s parents achieved. Annette Lareau and Elliot Weininger find that parental involvement in the college application process, which significantly contributes to student success, is overwhelmingly a middle-class phenomenon. David Grusky and Kim Weeden introduce a new model for measuring inequality that allows researchers to assess not just the extent of inequality, but also whether it is taking on a more polarized, class-based form. John Goldthorpe and Michelle Jackson examine the academic careers of students in three social classes and find that poorly performing students from high-status families do much better in many instances than talented students from less-advantaged families. Erik Olin Wright critically assesses the emphasis on individual life chances in many studies of class and calls for a more structural conception of class. In an epilogue, journalists Ray Suarez, Janny Scott, and Roger Hodge reflect on the media’s failure to report hardening class lines in the U.S., even when images on the nightly news—such as those involving health, crime, or immigration—are profoundly shaped by issues of class.

Until now, class scholarship has been highly specialized, with researchers working on only one part of a larger puzzle. Social Class gathers the most current research in one volume, and persuasively illustrates that class remains a powerful force in American society.

Annette Lareau is a Professor of Sociology at the University of Maryland, College Park. Her book, Unequal Childhoods: Class, Race, and Family Life, uses ethnographic methods to examine the impact of social class on children's daily lives. Unequal Childhoods won the best book award for the Sociology of Family Section, the Section on Childhood and Youth, and the Sociology of Culture Section (co-winner) of the American Sociological Association. Her first book, Home Advantage: Social Class and Parental Involvement in Elementary Education, is a study of the impact of social class on family-school relationships for parents of children in first grade. Home Advantage won the Willard Waller Award for the Sociology of Education Section of the American Sociological Association. With Jeff Shultz, she is the editor of Journeys Through Ethnography: Realistic Accounts of Fieldwork.

Dalton Conley is University Professor of the Social Sciences and Chair of Sociology at New York University. He also holds appointments at NYU's Wagner School of Public Service, as an Adjunct Professor of Community Medicine at Mount Sinai School of Medicine, and as a Research Associate at the National Bureau of Economic Research (NBER). In 2005, Conley became the first sociologist to win the NSF's Alan T. Waterman Award. His research focuses on the determinants of economic opportunity within and across generations. In this vein, he studies sibling differences in socioeconomic success; racial inequalities; the salience of physical appearance to economic status; the measurement of class; and how health and biology affect (and are affected by) social position.

Monday, December 15, 2008

Synaptic Self:

How Our Brains Become Who We Are

Research on the brain, one of the few genuine frontiers remaining in science, continues to fascinate us, as it offers a glimpse into the deepest foundations of humanity. But in spite of great progress in understanding specific mental functions, like perception, memory, and emotion, little has been learned about how the self - the essence of who a person is, both in his or her own mind and in the eyes of others - relates to the brain.

"A clear, up-to-date, and impressively fair-minded account of what neuroscience has established about human nature."
- Howard Gardner, author of Frames of Mind and Intelligence Reframed

"In this pathbreaking synthesis, Joseph LeDoux draws on dazzling insights from the cutting edge of neuroscience to generate a new conception of an enduring mystery: the nature of the self. Enlightening and engrossing, LeDoux's bold formulation will change the way you think about who you are"
- Daniel L. Schacter, Chairman of Psychology at Harvard University

"Synaptic Self is a wonderful tour of the brain circuitry behind some of the critical aspects of the mind. LeDoux is an expert tour guide and it is well worth listening. His perspective takes you deep into the cellular basis of what it is to be a thinking being".
- Antonio R. Damasio, neuroscientist, author of The Feeling of What Happens

Sunday, December 14, 2008

Symbiotic Planet:


A New Look at Evolution

Lynn Margulis, Distinguished University Professor in the Department of Geosciences at the University of Massachusetts-Amherst.

Description
Although Charles Darwin’s theory of evolution laid the foundations of modern biology, it did not tell the whole story. Most remarkably, The Origin of Species said very little about, of all things, the origins of species. Darwin and his modern successors have shown very convincingly how inherited variations are naturally selected, but they leave unanswered how variant organisms come to be in the first place. In Symbiotic Planet, renowned scientist Lynn Margulis shows that symbiosis, which simply means members of different species living in physical contact with each other, is crucial to the origins of evolutionary novelty. Ranging from bacteria, the smallest kinds of life, to the largest—the living Earth itself—Margulis explains the symbiotic origins of many of evolution’s most important innovations. The very cells we’re made of started as symbiotic unions of different kinds of bacteria. Sex—and its inevitable corollary, death—arose when failed attempts at cannibalism resulted in seasonally repeated mergers of some of our tiniest ancestors. Dry land became forested only after symbioses of algae and fungi evolved into plants. Since all living things are bathed by the same waters and atmosphere, all the inhabitants of Earth belong to a symbiotic union. Gaia, the finely tuned largest ecosystem of the Earth’s surface, is just symbiosis as seen from space. Along the way, Margulis describes her initiation into the world of science and the early steps in the present revolution in evolutionary biology; the importance of species classification for how we think about the living world; and the way “academic apartheid” can block scientific advancement. Written with enthusiasm and authority, this is a book that could change the way you view our living Earth.
Endosymbiosis: Lynn Margulis
Edge: Lynn Margulis

The Republican War on Science


Science has never been more crucial to deciding the political issues facing the country. Yet science and scientists have less influence with the federal government than at any time since the Eisenhower administration.

In the White House and Congress today, findings are reported in a politicized manner; spun or distorted to fit the speaker’s agenda; or, when they’re too inconvenient, ignored entirely. On a broad array of issues—stem cell research, climate change, abstinence education, mercury pollution, and many others—the Bush administration’s positions fly in the face of overwhelming scientific consensus.

In The Republican War on Science, Chris Mooney tied together the disparate strands of the attack on science into a compelling and frightening account of our government’s increasing unwillingness to distinguish between legitimate research and ideologically driven pseudoscience.
Now, in a revised and expanded paperback edition, Mooney brings us up to date on the war on science, relates the phenomenon to the Bush administration’s handling of the Iraq war and Hurricane Katrina—and ends with a call to arms to scientists and their allies.
Chris Mooney is senior correspondent for The American Prospect magazine and author of two books: the New York Times bestselling The Republican War on Science, dubbed “a landmark in contemporary political reporting” by Salon.com and a “well-researched, closely argued and amply referenced indictment of the right wing’s assault on science and scientists” by Scientific American; and Storm World: Hurricanes, Politics, and the Battle Over Global Warming, dubbed “riveting” by the Boston Globe and selected as a 2007 best book of the year in the science category by Publishers Weekly. He also writes “The Intersection” blog with Sheril Kirshenbaum.

Saturday, December 13, 2008

Rationality for Mortals

How people cope with uncertainty

By Gerd Gigerenzer, Director of the Center for Adaptive Behavior and Cognition, Max Planck Institute for Human Development, Berlin

What is the nature of human wisdom?

For many, the ideal image of rationality is a heavenly one: an omniscient God, a Laplacean demon, a supercomputer, or a fully consistent logical system. Gerd Gigerenzer argues, in contrast, that there are more efficient tools in our minds than logic; he calls them fast and frugal heuristics. These adaptive tools work in a world where the present is only partially known and the future is uncertain. Here, rationality is not logical but ecological, and this volume shows how this insight can help remedy even the widespread problem of statistical innumeracy.
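
To make the idea concrete, here is a minimal sketch, in Python, of one fast and frugal heuristic of the kind Gigerenzer studies, "take-the-best": compare two options cue by cue, in order of assumed cue validity, and decide on the first cue that discriminates. The cue names, their ordering, and the data below are invented for illustration, not taken from the book.

# Cues ordered by assumed validity (illustrative only).
CUES = ["has_team", "is_capital", "has_university"]

# Hypothetical knowledge about two options (say, which city is larger).
DATA = {
    "CityA": {"has_team": 1, "is_capital": 0, "has_university": 1},
    "CityB": {"has_team": 0, "is_capital": 0, "has_university": 1},
}

def take_the_best(a, b, cues=CUES, data=DATA):
    """Return the option favored by the first discriminating cue, or None to signal a guess."""
    for cue in cues:
        va, vb = data[a][cue], data[b][cue]
        if va != vb:            # stop searching: one discriminating cue is enough
            return a if va > vb else b
    return None                 # no cue discriminates; fall back to guessing

print(take_the_best("CityA", "CityB"))   # -> CityA, decided by the first cue alone

The point of the sketch is the stopping rule: instead of weighing all the evidence, the heuristic searches cues in order and commits as soon as one of them discriminates, which is what makes it fast and frugal.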

Rationality for Mortals (which follows on a previous collection, Adaptive Thinking) presents Gigerenzer's most recent articles, revised and updated where appropriate, and will appeal to psychologists, philosophers, physicians, biologists, economists, educators, and others who are curious about the nature of rationality and how humans are able to make wise decisions.

Oxford University Press

Thursday, December 11, 2008

Capitalist Fools

Behind the debate over remaking U.S. financial policy will be a debate over who’s to blame. It’s crucial to get the history right, writes a Nobel-laureate economist, identifying five key mistakes—under Reagan, Clinton, and Bush II—and one national delusion.
Vanity Fair, January 2009
by Joseph E. Stiglitz

There will come a moment when the most urgent threats posed by the credit crisis have eased and the larger task before us will be to chart a direction for the economic steps ahead. This will be a dangerous moment. Behind the debates over future policy is a debate over history—a debate over the causes of our current situation. The battle for the past will determine the battle for the present. So it’s crucial to get the history straight.

What were the critical decisions that led to the crisis? Mistakes were made at every fork in the road—we had what engineers call a “system failure,” when not a single decision but a cascade of decisions produces a tragic result. Let’s look at five key moments.

No. 1: Firing the Chairman
In 1987 the Reagan administration decided to remove Paul Volcker as chairman of the Federal Reserve Board and appoint Alan Greenspan in his place. Volcker had done what central bankers are supposed to do. On his watch, inflation had been brought down from more than 11 percent to under 4 percent. In the world of central banking, that should have earned him a grade of A+++ and assured his re-appointment. But Volcker also understood that financial markets need to be regulated. Reagan wanted someone who did not believe any such thing, and he found him in a devotee of the objectivist philosopher and free-market zealot Ayn Rand.

Greenspan played a double role. The Fed controls the money spigot, and in the early years of this decade, he turned it on full force. But the Fed is also a regulator. If you appoint an anti-regulator as your enforcer, you know what kind of enforcement you’ll get. A flood of liquidity combined with the failed levees of regulation proved disastrous.

Greenspan presided over not one but two financial bubbles. After the high-tech bubble popped, in 2000–2001, he helped inflate the housing bubble. The first responsibility of a central bank should be to maintain the stability of the financial system. If banks lend on the basis of artificially high asset prices, the result can be a meltdown—as we are seeing now, and as Greenspan should have known. He had many of the tools he needed to cope with the situation. To deal with the high-tech bubble, he could have increased margin requirements (the amount of cash people need to put down to buy stock). To deflate the housing bubble, he could have curbed predatory lending to low-income households and prohibited other insidious practices (the no-documentation—or “liar”—loans, the interest-only loans, and so on). This would have gone a long way toward protecting us. If he didn’t have the tools, he could have gone to Congress and asked for them.

Of course, the current problems with our financial system are not solely the result of bad lending. The banks have made mega-bets with one another through complicated instruments such as derivatives, credit-default swaps, and so forth. With these, one party pays another if certain events happen—for instance, if Bear Stearns goes bankrupt, or if the dollar soars. These instruments were originally created to help manage risk—but they can also be used to gamble. Thus, if you felt confident that the dollar was going to fall, you could make a big bet accordingly, and if the dollar indeed fell, your profits would soar. The problem is that, with this complicated intertwining of bets of great magnitude, no one could be sure of the financial position of anyone else—or even of one’s own position. Not surprisingly, the credit markets froze.

Here too Greenspan played a role. When I was chairman of the Council of Economic Advisers, during the Clinton administration, I served on a committee of all the major federal financial regulators, a group that included Greenspan and Treasury Secretary Robert Rubin. Even then, it was clear that derivatives posed a danger. We didn’t put it as memorably as Warren Buffett—who saw derivatives as “financial weapons of mass destruction”—but we took his point. And yet, for all the risk, the deregulators in charge of the financial system—at the Fed, at the Securities and Exchange Commission, and elsewhere—decided to do nothing, worried that any action might interfere with “innovation” in the financial system. But innovation, like “change,” has no inherent value. It can be bad (the “liar” loans are a good example) as well as good.

No. 2: Tearing Down the Walls
The deregulation philosophy would pay unwelcome dividends for years to come. In November 1999, Congress repealed the Glass-Steagall Act—the culmination of a $300 million lobbying effort by the banking and financial-services industries, and spearheaded in Congress by Senator Phil Gramm. Glass-Steagall had long separated commercial banks (which lend money) and investment banks (which organize the sale of bonds and equities); it had been enacted in the aftermath of the Great Depression and was meant to curb the excesses of that era, including grave conflicts of interest. For instance, without separation, if a company whose shares had been issued by an investment bank, with its strong endorsement, got into trouble, wouldn’t its commercial arm, if it had one, feel pressure to lend it money, perhaps unwisely? An ensuing spiral of bad judgment is not hard to foresee. I had opposed repeal of Glass-Steagall. The proponents said, in effect, Trust us: we will create Chinese walls to make sure that the problems of the past do not recur. As an economist, I certainly possessed a healthy degree of trust, trust in the power of economic incentives to bend human behavior toward self-interest—toward short-term self-interest, at any rate, rather than Tocqueville’s “self interest rightly understood.”

The most important consequence of the repeal of Glass-Steagall was indirect—it lay in the way repeal changed an entire culture. Commercial banks are not supposed to be high-risk ventures; they are supposed to manage other people’s money very conservatively. It is with this understanding that the government agrees to pick up the tab should they fail. Investment banks, on the other hand, have traditionally managed rich people’s money—people who can take bigger risks in order to get bigger returns. When repeal of Glass-Steagall brought investment and commercial banks together, the investment-bank culture came out on top. There was a demand for the kind of high returns that could be obtained only through high leverage and big risk-taking.

There were other important steps down the deregulatory path. One was the decision in April 2004 by the Securities and Exchange Commission, at a meeting attended by virtually no one and largely overlooked at the time, to allow big investment banks to increase their debt-to-capital ratio (from 12:1 to 30:1, or higher) so that they could buy more mortgage-backed securities, inflating the housing bubble in the process. In agreeing to this measure, the S.E.C. argued for the virtues of self-regulation: the peculiar notion that banks can effectively police themselves. Self-regulation is preposterous, as even Alan Greenspan now concedes, and as a practical matter it can’t, in any case, identify systemic risks—the kinds of risks that arise when, for instance, the models used by each of the banks to manage their portfolios tell all the banks to sell some security all at once.
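
To make concrete why that seemingly obscure rule change mattered, here is a minimal, hypothetical sketch (the figures and the helper function are illustrative, not from the article) of how a higher debt-to-capital ratio shrinks the fall in asset prices that a bank can absorb before its capital is wiped out:

```python
# Hypothetical illustration, not from the article: how a higher debt-to-capital
# ratio shrinks the asset decline a bank can absorb before its capital is gone.

def equity_after_loss(capital, debt_to_capital, asset_decline):
    """Remaining equity after assets fall by `asset_decline` (a fraction)."""
    assets = capital * (1 + debt_to_capital)  # own capital plus borrowed money
    loss = assets * asset_decline             # the loss falls entirely on equity
    return capital - loss

for ratio in (12, 30):
    wipeout = 1 / (1 + ratio)  # asset decline that erases all capital at this leverage
    remaining = equity_after_loss(1.0, ratio, 0.03)
    print(f"{ratio}:1 leverage: a {wipeout:.1%} asset decline erases all capital; "
          f"after a 3% decline, $1.00 of capital is worth ${remaining:.2f}")
```

On these assumed numbers, a bank levered 12:1 can survive a fall in asset prices of nearly 8 percent, while one levered 30:1 is insolvent after a fall of barely 3 percent.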

As we stripped back the old regulations, we did nothing to address the new challenges posed by 21st-century markets. The most important challenge was that posed by derivatives. In 1998 the head of the Commodity Futures Trading Commission, Brooksley Born, had called for derivatives to be regulated—a concern that took on urgency after the Fed, in that same year, engineered the bailout of Long-Term Capital Management, a hedge fund whose trillion-dollar-plus failure threatened global financial markets. But Secretary of the Treasury Robert Rubin, his deputy, Larry Summers, and Greenspan were adamant—and successful—in their opposition. Nothing was done.

No. 3: Applying the Leeches
Then along came the Bush tax cuts, enacted first on June 7, 2001, with a follow-on installment two years later. The president and his advisers seemed to believe that tax cuts, especially for upper-income Americans and corporations, were a cure-all for any economic disease—the modern-day equivalent of leeches. The tax cuts played a pivotal role in shaping the background conditions of the current crisis. Because they did very little to stimulate the economy, real stimulation was left to the Fed, which took up the task with unprecedented low-interest rates and liquidity. The war in Iraq made matters worse, because it led to soaring oil prices. With America so dependent on oil imports, we had to spend several hundred billion more to purchase oil—money that otherwise would have been spent on American goods. Normally this would have led to an economic slowdown, as it had in the 1970s. But the Fed met the challenge in the most myopic way imaginable. The flood of liquidity made money readily available in mortgage markets, even to those who would normally not be able to borrow. And, yes, this succeeded in forestalling an economic downturn; America’s household saving rate plummeted to zero. But it should have been clear that we were living on borrowed money and borrowed time.

The cut in the tax rate on capital gains contributed to the crisis in another way. It was a decision that turned on values: those who speculated (read: gambled) and won were taxed more lightly than wage earners who simply worked hard. But more than that, the decision encouraged leveraging, because interest was tax-deductible. If, for instance, you borrowed a million to buy a home or took a $100,000 home-equity loan to buy stock, the interest would be fully deductible every year. Any capital gains you made were taxed lightly—and at some possibly remote day in the future. The Bush administration was providing an open invitation to excessive borrowing and lending—not that American consumers needed any more encouragement.
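
A rough, hypothetical calculation makes the incentive concrete; the tax, interest, and return figures below are assumptions chosen for illustration only, not numbers from the article:

```python
# Hypothetical sketch of the incentive described above: deductible interest plus
# a low, deferred capital-gains rate rewards borrowing in order to speculate.
# All rates below are assumptions for illustration, not figures from the article.

loan = 100_000             # home-equity loan used to buy stock
interest_rate = 0.06       # assumed borrowing cost
income_tax_rate = 0.35     # assumed marginal rate at which the interest is deducted
capital_gains_rate = 0.15  # assumed rate on the eventual gain
stock_return = 0.08        # assumed one-year gain on the stock

after_tax_interest = loan * interest_rate * (1 - income_tax_rate)
after_tax_gain = loan * stock_return * (1 - capital_gains_rate)

print(f"After-tax interest cost:           ${after_tax_interest:,.0f}")
print(f"After-tax gain if the bet pays:    ${after_tax_gain:,.0f}")
print(f"Net profit, all on borrowed money: ${after_tax_gain - after_tax_interest:,.0f}")
```

Under these assumptions the speculator pockets a net gain on money that was entirely borrowed, with the tax code subsidizing the interest along the way.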

No. 4: Faking the Numbers
Meanwhile, on July 30, 2002, in the wake of a series of major scandals—notably the collapse of WorldCom and Enron—Congress passed the Sarbanes-Oxley Act. The scandals had involved every major American accounting firm, most of our banks, and some of our premier companies, and made it clear that we had serious problems with our accounting system. Accounting is a sleep-inducing topic for most people, but if you can’t have faith in a company’s numbers, then you can’t have faith in anything about a company at all. Unfortunately, in the negotiations over what became Sarbanes-Oxley a decision was made not to deal with what many, including the respected former head of the S.E.C. Arthur Levitt, believed to be a fundamental underlying problem: stock options. Stock options have been defended as providing healthy incentives toward good management, but in fact they are “incentive pay” in name only. If a company does well, the C.E.O. gets great rewards in the form of stock options; if a company does poorly, the compensation is almost as substantial but is bestowed in other ways. This is bad enough. But a collateral problem with stock options is that they provide incentives for bad accounting: top management has every incentive to provide distorted information in order to pump up share prices.

The incentive structure of the rating agencies also proved perverse. Agencies such as Moody’s and Standard & Poor’s are paid by the very people they are supposed to grade. As a result, they’ve had every reason to give companies high ratings, in a financial version of what college professors know as grade inflation. The rating agencies, like the investment banks that were paying them, believed in financial alchemy—that F-rated toxic mortgages could be converted into products that were safe enough to be held by commercial banks and pension funds. We had seen this same failure of the rating agencies during the East Asia crisis of the 1990s: high ratings facilitated a rush of money into the region, and then a sudden reversal in the ratings brought devastation. But the financial overseers paid no attention.

No. 5: Letting It Bleed
The final turning point came with the passage of a bailout package on October 3, 2008—that is, with the administration’s response to the crisis itself. We will be feeling the consequences for years to come. Both the administration and the Fed had long been driven by wishful thinking, hoping that the bad news was just a blip, and that a return to growth was just around the corner. As America’s banks faced collapse, the administration veered from one course of action to another. Some institutions (Bear Stearns, A.I.G., Fannie Mae, Freddie Mac) were bailed out. Lehman Brothers was not. Some shareholders got something back. Others did not.

The original proposal by Treasury Secretary Henry Paulson, a three-page document that would have provided $700 billion for the secretary to spend at his sole discretion, without oversight or judicial review, was an act of extraordinary arrogance. He sold the program as necessary to restore confidence. But it didn’t address the underlying reasons for the loss of confidence. The banks had made too many bad loans. There were big holes in their balance sheets. No one knew what was truth and what was fiction. The bailout package was like a massive transfusion to a patient suffering from internal bleeding—and nothing was being done about the source of the problem, namely all those foreclosures. Valuable time was wasted as Paulson pushed his own plan, “cash for trash,” buying up the bad assets and putting the risk onto American taxpayers. When he finally abandoned it, providing banks with money they needed, he did it in a way that not only cheated America’s taxpayers but failed to ensure that the banks would use the money to re-start lending. He even allowed the banks to pour out money to their shareholders as taxpayers were pouring money into the banks.

The other problem not addressed involved the looming weaknesses in the economy. The economy had been sustained by excessive borrowing. That game was up. As consumption contracted, exports kept the economy going, but with the dollar strengthening and Europe and the rest of the world declining, it was hard to see how that could continue. Meanwhile, states faced massive drop-offs in revenues—they would have to cut back on expenditures. Without quick action by government, the economy faced a downturn. And even if banks had lent wisely—which they hadn’t—the downturn was sure to mean an increase in bad debts, further weakening the struggling financial sector.

The administration talked about confidence building, but what it delivered was actually a confidence trick. If the administration had really wanted to restore confidence in the financial system, it would have begun by addressing the underlying problems—the flawed incentive structures and the inadequate regulatory system.

Was there any single decision which, had it been reversed, would have changed the course of history? Every decision—including decisions not to do something, as many of our bad economic decisions have been—is a consequence of prior decisions, an interlinked web stretching from the distant past into the future. You’ll hear some on the right point to certain actions by the government itself—such as the Community Reinvestment Act, which requires banks to make mortgage money available in low-income neighborhoods. (Defaults on C.R.A. lending were actually much lower than on other lending.) There has been much finger-pointing at Fannie Mae and Freddie Mac, the two huge mortgage lenders, which were originally government-owned. But in fact they came late to the subprime game, and their problem was similar to that of the private sector: their C.E.O.’s had the same perverse incentive to indulge in gambling.

The truth is most of the individual mistakes boil down to just one: a belief that markets are self-adjusting and that the role of government should be minimal. Looking back at that belief during hearings this fall on Capitol Hill, Alan Greenspan said out loud, “I have found a flaw.” Congressman Henry Waxman pushed him, responding, “In other words, you found that your view of the world, your ideology, was not right; it was not working.” “Absolutely, precisely,” Greenspan said. The embrace by America—and much of the rest of the world—of this flawed economic philosophy made it inevitable that we would eventually arrive at the place we are today.

Joseph E. Stiglitz is University Professor at Columbia University. His many books include Globalization and Its Discontents. He received the Nobel Prize in Economics in 2001 for research on the economics of information. Most recently, he is the co-author, with Linda Bilmes, of The Three Trillion Dollar War: The True Costs of the Iraq Conflict.
CommonDreams.Org

The American Financial Regime

CARD CHECK: "You have nothing to lose but your chains"
by Mike Whitney

Even though the Federal Reserve is now the biggest single participant in the financial system, the myth of a "free market" still lingers on. It's mind-boggling. The Fed has expanded its balance sheet by $2 trillion, guaranteed $8.3 trillion of dodgy mortgage-backed paper, provided a backstop for bank deposits, money markets, and commercial paper, and created 8 separate lending facilities to ensure that underwater financial institutions can still appear to be solvent. The whole system is a state-subsidized operation buoyed on a taxpayer-provided flotation device which bears no resemblance to an invisible hand. More astonishing is the massive power grab engineered by the Fed, which has taken place without the slightest protest from 535 shell-shocked congressmen and senators. Elected officials have either kept their finger in the air to see which way the political wind is blowing or timidly caved in to Treasury's every multi-billion dollar demand. It's flagrant blackmail and everyone knows it. Congressional oversight is an oxymoron.

Anyone who has followed the financial crisis from its origins knows that the Fed's bloody fingerprints are all over the crime scene. Still, that hasn't stopped well-meaning liberal economists (Krugman, Stiglitz, Reich) from supporting Bernanke's increasingly unorthodox attempts to flood the financial system with liquidity ("quantitative easing") and invoke whatever radical strategy pops into his head. In fact, many of the experts believe that Bernanke should do even more given the sheer size of the meltdown. There's growing support for a gigantic stimulus package ($700 billion) which will focus on road construction, infrastructure, state aid, extensions to unemployment benefits and green technologies. The Obama camp hopes that government programs and deficit spending will make up for the huge losses in aggregate demand which threaten to drag prices down even further in a self-reinforcing deflationary cycle. Even so, it's natural to wonder at the wisdom of giving even more power to the very people who created the mess to begin with and who seem more interested in proving their depression-fighting theories than throwing a lifeline to struggling homeowners, consumers or auto workers. Maybe it's time to try something different.

So far, Bernanke's monetarist approach has amounted to nothing. The stock indexes are off 45 percent and housing prices continue to plunge. The Fed's low interest rates and lending facilities have helped to keep the banking system from collapsing, but they've failed to get consumers or businesses spending again. The economy is tanking fast. Paul L. Kasriel, the Director of Economic Research at The Northern Trust Company, summed up Bernanke's dilemma like this:

"In a sustained housing bust that causes banks to take a big hit to their capital (low interest rates) simply will not matter. This is essentially what happened recently in Japan and also in the US during the Great Depression. Most people are not aware of actions the Fed took during the Great Depression. Bernanke claims that the Fed did not act strong enough during the great depression. This is simply not true. The Fed slashed interest rates and injected huge sums of base money but it did no good. More recently, Japan did the same thing. It also did no good. If default rates get high enough, banks will simply be unwilling to lend which will severely limit money and credit creation." (Interview with Paul Kasriel; Mish's Global Economic Trend Analysis)

In fact, the banks are just one part of the problem. Another part is the shortage of creditworthy borrowers now that home equity is drying up and the standards for loans have gotten tougher. Most people have seen their personal wealth vanish as housing prices fall and their 401(k)s shrivel to the size of a chickpea. The Fed chairman faces huge obstacles in trying to restart the credit engine and get maxed-out consumers spending again.

Bernanke has expanded the money supply at a record pace, but to little effect. The money is stagnating in pools because the financial plumbing is still gunked up from troubles in the banking system. The credit-transmission system has broken down, causing a generalized contraction throughout the economy. Business activity has dropped off a cliff and consumer confidence is at a 40-year low. In his Forbes article "What Would Keynes Do?", former Treasury Department economist Bruce Bartlett sheds light on a part of the problem that many of the pundits miss:

"Another problem that policymakers back then didn't grasp is that the money supply's effectiveness depends on how quickly people spend it; something economists call velocity. If velocity falls because people are hoarding cash, it may require a great deal more money to keep the economy operating.

Think of it this way: Velocity is the ratio of the gross domestic product to the money supply. If GDP is $10 trillion and money turns over 10 times per year, then $1 trillion in money supply will be sufficient. But if velocity falls to 9, a $1 trillion money supply will only support a $9 trillion GDP. If the Fed doesn't want GDP to shrink by 10%, it will have to increase the money supply by 10%.

This is essentially the problem we have today. Unlike in the 1930s, the Fed is not allowing the money supply to diminish. Also, we have programs like federal deposit insurance to prevent bank deposits from shrinking. But velocity is collapsing. Banks, businesses and households are all hoarding cash, not spending except for essentials. This is bringing on the deflation that is crippling the economy."
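
As a rough check on the arithmetic in the quoted passage, here is a minimal sketch of the identity money supply times velocity equals nominal GDP, using the same hypothetical figures. (Strictly, holding GDP at $10 trillion when velocity falls from 10 to 9 requires about an 11 percent increase in the money supply; the passage rounds this to 10 percent.)

```python
# Quantity identity: money supply * velocity = nominal GDP.
# Uses the hypothetical figures from the quoted passage (trillions of dollars).

gdp = 10.0
velocity = 10.0
money_supply = gdp / velocity
print(f"At velocity 10, ${money_supply:.2f}T of money supports ${money_supply * velocity:.1f}T of GDP")

velocity = 9.0
print(f"At velocity 9, the same ${money_supply:.2f}T supports only ${money_supply * velocity:.1f}T of GDP")

required = gdp / velocity  # money supply needed to hold GDP at $10T
print(f"Holding GDP at ${gdp:.0f}T requires ${required:.2f}T, "
      f"an increase of roughly {required / money_supply - 1:.0%}")
```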

This is why Bernanke has launched his radical intervention, buying bonds, stocks and anything else that will keep asset-prices from crashing. It's an attempt to reignite spending by goosing the market. When businesses and consumers can't sustain demand, the government has to step in and take their place. Otherwise, businesses have to cut costs even more dramatically, sending unemployment soaring while prices continue to nosedive.

The real worry is that Bernanke's pet theory is merely an academic pipe-dream which is doing more harm than good. After all, his strategy is based on a controversial reading of history that is only accepted by disciples of Milton Friedman. The idea that a normal recession morphed into the Great Depression because the money supply decreased by one-third between 1929 and 1932 is likely an oversimplification of a very complex situation. If Bernanke's calculations are correct, then where are the goods? Why haven't the zero-percent interest rates and the trillion-dollar lending facilities stimulated spending? Instead, the equities markets continue to tumble, corporate profits are down, foreclosures are on the rise, commodities are in freefall, and the unemployment lines are winding halfway across the continent. (Unemployment during the Great Depression didn't reach 25 percent for three years; it is actually accelerating faster in 2008 than it did in 1929.) So, where's the progress, Ben?

The present list of remedies fails to address the underlying rot in the system itself. That's the problem. There's no doubt that Timothy Geithner and Larry Summers will have better luck mitigating the effects of the slumping economy, but to what end? To stitch together a system which diverts a larger and larger portion of the national wealth to a smaller and smaller group of corporatists and bankers? Is that the measure of success?

(Note: Redistribution US Style: In the United States the top 1 percent of wealth holders in 2001 together owned more than twice as much as the bottom 80 percent of the population. If this were measured simply in terms of financial wealth, i.e., excluding equity in owner-occupied housing, the top 1 percent owned more than four times the bottom 80 percent!)

No thanks. Besides, the financial crisis is not an accident of nature, like a tornado or an avalanche. It's a self-inflicted wound that can be traced back to particular policies that were put in place to shift wealth from one class to another. The low interest rates, the massive leveraging, the undercapitalized institutions, the off-balance-sheet operations were all concocted with the same objective in mind. The Fed's repertoire may change, but the results are always the same; they reflect the deeply held class bias which orders the economy according to the interests of the rich and powerful.

Besides, there's reason to believe that Bernanke doesn't fully grasp the fundamental problem: that economic growth in recent years was predicated on a flawed model that can't be restored. Consumers were able to spend beyond their means because their personal assets were greatly inflated by the availability of easy credit and lax lending standards. Now that risk is being repriced, debt deflation has set in and prices are plummeting across the spectrum. Homeowners are feeling the pinch because they can't tap into their home equity, which amounted to $800 billion in 2006. The process of lowering interest rates by spreading risk throughout the system (securitization) has frozen over, sending investors fleeing from the stock markets to the safety of US Treasurys and cold hard cash. Bernanke's attempts to reflate the bubble by buying up Fannie and Freddie's mortgage-backed securities (MBS) and bundled credit card debt from finance companies are a sign of utter desperation. He's like a man pumping air into a punctured tire, pushing up and down furiously while the air hisses out the other side.

The economy is contracting because the excessive spending was based on artificially low interest rates and debt leveraging. In The End of Prosperity, Harry Magdoff and Paul Sweezy wrote:

“In the absence of a severe depression during which debts are forcefully wiped out or drastically reduced, government rescue measures to prevent collapse of the financial system merely lay the groundwork for still more layers of debt and additional strains during the next economic advance.” As Minsky put it, “Without a crisis and a debt-deflation process to offset beliefs in the success of speculative ventures, both an upward bias to prices and ever-higher financial layering are induced.” (John Bellamy Foster and Fred Magdoff, "Financial Implosion and Stagnation," Monthly Review)

This is the market model that Bernanke and Paulson are trying to resuscitate, but without much success. The credit that once gushed from the hedge funds and investment banks has slowed to a trickle. It's no longer possible to take complex debt-instruments and amplify their value 30 or 40 times over. Investors have seen through the swindle and boycotted the market for pools of debt packaged as securities. As foreclosures rise, the banks' balance sheets will continue to hemorrhage, forcing them to make margin calls that will push more and more financial institutions into bankruptcy. It can't be stopped. This is what happens when the underlying economy can no longer support an oversized financial system where wages have stagnated and workers are unable to make the interest payments on their loans. The whole system begins to buckle. John Bellamy Foster and Fred Magdoff explain the origins of "financialization" in their article "Financial Implosion and Stagnation":

"It was the reality of economic stagnation beginning in the 1970s, as heterodox economists Riccardo Bellofiore and Joseph Halevi have recently emphasized, that led to the emergence of “the new financialized capitalist regime,” a kind of “paradoxical financial Keynesianism” whereby demand in the economy was stimulated primarily “thanks to asset-bubbles.” Moreover, it was the leading role of the United States in generating such bubbles—despite (and also because of) the weakening of capital accumulation proper—together with the dollar’s reserve currency status, that made U.S. monopoly-finance capital the “catalyst of world effective demand.”

Foster and Magdoff's theory confirms that there was a plan to expand financial markets into riskier areas to compensate for the stagnation which unavoidably occurs in capitalist economies. The real problem is rooted in the hostility of corporate bosses towards workers, which translates into wages that don't keep pace with production. When wages languish, in an economy that is 70 percent consumer spending, the only way to increase GDP is by expanding credit. And that, in fact, is exactly how it has played out. Trickle-down ideologues, like Henry Paulson, make every effort to extend credit to anyone with a pulse and a body temperature of 98.6 degrees, but they fight tooth and nail to crush the unions or any attempt to raise salaries. And Paulson, of course, is not alone in waging class warfare; he is just an extreme example.

The bottom line is that financialization, which rests on the twin pillars of easy credit and ballooning debt, creates an inherently unstable system that is prone to wild swings and frequent busts. Bernanke is trying to restore this system while ignoring the fact that workers--whose personal balance sheets are already bleeding red--can no longer support it. No amount of tinkering in the credit markets will reduce the overcapacity bulging throughout the system or add one farthing to a poor man's bank account. There is a historic mismatch between supply and demand that cannot be reconciled by Bernanke's market meddling. Workers need a raise; that's how demand is created.

The same message goes out to Obama's economic team, too. The stimulus package might get the economy through the short-term rough patch, but if wages don't rise, the economy will continue to underperform. That's why the new commander in chief would be well advised to quickly pass the Employee Free Choice Act (also known as "card check"), which would let workers form a union once a majority sign authorization cards, rather than requiring a secret-ballot election. It may be the most important piece of legislation in a decade. Its passage would ease union organizing and help to grow union membership, which has dwindled to about 10 percent of the work force.

Forget about the fake differences between the two political parties. There aren't any. The only hope for deep structural change is to strengthen the unions and give workers a place at the policy table. That's the only peaceful way to dismantle this parasitic financial regime and bring about a more equitable distribution of wealth.

Mike Whitney is a frequent contributor to Global Research.