Archive for October 2009

Efficient Market Theory and the Crisis

October 29, 2009

Here is a defense of an economic theory with real substance, published in the Wall Street Journal!

=================

Efficient Market Theory and the Crisis
OCTOBER 27, 2009, 10:13 P.M. ET (http://online.wsj.com)
Neither the rating agencies’ mistakes nor the overleveraging by financial firms was the fault of an academic hypothesis.

By JEREMY J. SIEGEL

Financial journalist and best-selling author Roger Lowenstein didn’t mince words in a piece for the Washington Post this summer: “The upside of the current Great Recession is that it could drive a stake through the heart of the academic nostrum known as the efficient-market hypothesis.” In a similar vein, the highly respected money manager and financial analyst Jeremy Grantham wrote in his quarterly letter last January: “The incredibly inaccurate efficient market theory [caused] a lethally dangerous combination of asset bubbles, lax controls, pernicious incentives and wickedly complicated instruments [that] led to our current plight.”

But is the Efficient Market Hypothesis (EMH) really responsible for the current crisis? The answer is no. The EMH, originally put forth by Eugene Fama of the University of Chicago in the 1960s, states that the prices of securities reflect all known information that impacts their value. The hypothesis does not claim that the market price is always right. On the contrary, it implies that the prices in the market are mostly wrong, but at any given moment it is not at all easy to say whether they are too high or too low. The fact that the best and brightest on Wall Street made so many mistakes shows how hard it is to beat the market.


This does not mean the EMH can be used as an excuse by the CEOs of the failed financial firms or by the regulators who did not see the risks that subprime mortgage-backed securities posed to the financial stability of the economy. Regulators wrongly believed that financial firms were offsetting their credit risks, while the banks and credit rating agencies were fooled by faulty models that underestimated the risk in real estate.

After the 1982 recession, the U.S. and world economies entered into a long period where the fluctuations in variables such as gross domestic product, industrial production, and employment were significantly lower than they had been since World War II. Economists called this period the “Great Moderation” and attributed the increased stability to better monetary policy, a larger service sector and better inventory control, among other factors.

The economic response to the Great Moderation was predictable: risk premiums shrank and individuals and firms took on more leverage. Housing prices were boosted by historically low nominal and real interest rates and the development of the securitized subprime lending market.

According to data collected by Prof. Robert Shiller of Yale University, in the 61 years from 1945 through 2006 the maximum cumulative decline in the average price of homes was 2.84% in 1991. If this low volatility of home prices persisted into the future, a mortgage security composed of a nationally diversified portfolio of loans comprising the first 80% of a home’s value would have never come close to defaulting. The credit quality of home buyers was secondary because it was thought that underlying collateral—the home—could always cover the principal in the event the homeowner defaulted. These models led credit agencies to rate these subprime mortgages as “investment grade.”
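The collateral logic described above reduces to a few lines of arithmetic. A minimal sketch, with illustrative numbers only (this is not the rating agencies' actual model):

```python
# Sketch of the pre-crisis collateral reasoning: if the worst national
# home-price decline on record (1945-2006) was 2.84%, an 80% loan-to-value
# mortgage appears to leave a comfortable equity cushion.

def equity_cushion(home_value, ltv, worst_decline):
    """Collateral left over after the worst historical price drop."""
    loan = home_value * ltv
    stressed_value = home_value * (1 - worst_decline)
    return stressed_value - loan

cushion = equity_cushion(home_value=100.0, ltv=0.80, worst_decline=0.0284)
print(round(cushion, 2))  # 17.16: the home still covers the loan with room to spare
```

On this logic the borrower's credit quality looks irrelevant, which is exactly the trap the article describes: the model is only as good as the assumption that past price declines bound future ones.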

But this assessment was faulty. From 2000 through 2006, national home prices rose by 88.7%, far more than the 17.5% gain in the consumer price index or the paltry 1% rise in median household income. Never before have home prices jumped that far ahead of prices and incomes.
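The gap between those growth rates compounds into a striking real appreciation. A quick back-of-the-envelope check using the figures just quoted:

```python
# Real (CPI-deflated) home-price appreciation, 2000-2006,
# from the nominal figures cited in the text.
nominal_home_gain = 0.887   # national home prices, +88.7%
cpi_gain = 0.175            # consumer price index, +17.5%

real_gain = (1 + nominal_home_gain) / (1 + cpi_gain) - 1
print(f"{real_gain:.1%}")  # 60.6%: home prices beat inflation by this much in six years
```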

This should have sent up red flags and cast doubts on using models that looked only at historical declines to judge future risk. But these flags were ignored as Wall Street was reaping large profits bundling and selling the securities while Congress was happy that more Americans could enjoy the “American Dream” of home ownership. Indeed, through government-sponsored enterprises such as Fannie Mae and Freddie Mac, Washington helped fuel the subprime boom.

Neither the rating agencies’ mistakes nor the overleveraging by the financial firms in the subprime securities is the fault of the Efficient Market Hypothesis. The fact that the yields on these mortgages were high despite their investment-grade rating indicated that the market was rightly suspicious of the quality of the securities, and this should have served as a warning to prospective buyers.

With few exceptions (Goldman Sachs being one), financial firms ignored these warnings. CEOs failed to exercise their authority to monitor overall risk of the firm and instead put their faith in technicians whose narrow models could not capture the big picture. One can only wonder if the large investment banks would have taken on such risks when they were all partnerships and the lead partner had all his wealth in the firm, as they were just a few decades ago.

The misreading of these economic trends did not just reside within the private sector. Former Fed Chairman Alan Greenspan stated before congressional committees last December that he was “shocked” that the top executives of the financial firms exposed their stockholders to such risk. But had he looked at their balance sheets, he would have realized that not only did they put their own shareholders at risk, but their leveraged positions threatened the viability of the entire financial system.

As home prices continued to climb and subprime mortgages proliferated, Mr. Greenspan and current Fed Chairman Ben Bernanke were perhaps the only ones influential enough to sound an alarm and soften the oncoming crisis. But they did not. For all the deserved kudos the central bank received for its management of the crisis after the Lehman bankruptcy, its failure to see these problems building will stand as a permanent blot on the Fed’s record.

Our crisis wasn’t due to blind faith in the Efficient Market Hypothesis. The fact that risk premiums were low does not mean they were nonexistent and that market prices were right. Despite the recent recession, the Great Moderation is real and our economy is inherently more stable.

But this does not mean that risks have disappeared. To use an analogy, the fact that automobiles today are safer than they were years ago does not mean that you can drive at 120 mph. A small bump on the road, perhaps insignificant at lower speeds, will easily flip the best-engineered car. Our financial firms drove too fast, our central bank failed to stop them, and the housing deflation crashed the banks and the economy.

Mr. Siegel, a professor of finance at the University of Pennsylvania’s Wharton School, is the author of “Stocks for the Long Run,” now in its 4th edition from McGraw-Hill.

The Economics of Cloud Computing

October 28, 2009

The U.S. federal government announced that it would allocate some US$20 billion in its 2010 budget for IT infrastructure, but was concerned about the magnitude of this spending. The consulting firm Booz Allen stepped in and conducted a study to evaluate the impact of these investments. The firm confirmed the benefits such investments can bring, but issues a few warnings!

=========

The Economics of Cloud Computing
Addressing the benefits of infrastructure in the cloud.

Although $20 billion has been allocated in the President’s FY 2010 budget for federal IT infrastructure investments, the government is increasingly concerned about such massive expenditures. One viable option for reducing IT costs may involve adopting cloud computing as the federal IT infrastructure model.

Cloud computing provides computing resources, storage, and applications as internet-accessible services. The federal budget submitted to Congress in February of this year suggested adopting a cloud computing business model, virtualizing agency data centers, and consolidating data centers and operations. Long-term savings from this effort are expected to be many times the upfront investment costs in cloud computing technology.

Booz Allen Hamilton conducted an economic analysis to examine the plan’s potential savings and related implications. Called “The Economics of Cloud Computing: Addressing the Benefits of Infrastructure in the Cloud,” the study used Booz Allen’s proprietary cost model and extensive experience in economic analysis of IT programs to estimate lifecycle costs of implementing public, private, and hybrid clouds. Unlike most studies, “The Economics of Cloud Computing” considered transition costs, lifecycle operations, and migration schedules.

Study results indicate that cloud computing offers clear opportunities for agencies to reduce costs associated with server hardware and support and meet the long-term savings expectations. The amount of savings depends on the scale of the data center and the amount of time required to move operations into the cloud.  But to give one example, the benefit-to-cost ratio of a non-virtualized 1,000-server data center could reach 15.4:1 after implementation, and total lifecycle cost may be 66% lower than maintaining a traditional data center.
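Booz Allen's cost model is proprietary, but the headline ratios can be reproduced mechanically. In the sketch below, the dollar inputs are hypothetical placeholders chosen only so the outputs match the quoted 15.4:1 ratio and 66% saving; they are not the study's data:

```python
# Benefit-to-cost ratio and lifecycle savings for a data-center
# cloud migration, with hypothetical cost figures.

def benefit_cost_ratio(lifecycle_savings, migration_cost):
    return lifecycle_savings / migration_cost

def lifecycle_saving_pct(traditional_cost, cloud_cost):
    return 1 - cloud_cost / traditional_cost

# Hypothetical inputs ($ millions over the lifecycle).
bcr = benefit_cost_ratio(lifecycle_savings=154.0, migration_cost=10.0)
saving = lifecycle_saving_pct(traditional_cost=100.0, cloud_cost=34.0)
print(bcr, f"{saving:.0%}")  # 15.4 66%
```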

Several critical factors outlined in the study, however, would greatly impact the degree of economic benefit. Oversight organizations such as the Office of Management & Budget, National Institute of Standards and Technology, and General Services Administration must provide timely, well-coordinated support and drive the careful planning needed to enable agencies to select future cloud scenarios that yield the best trade-offs between costs, benefits, and risks.

Senior Associate Gwen Morton and Associate Ted Alford were part of the Booz Allen team that contributed to this study.

study posted October 6, 2009

The Chinese Disconnect

October 24, 2009

This intrigue between the U.S. and China keeps growing more intense. One of the most visited posts on this blog (more than 18,000 visits in total since 2007, about 30 a day), titled “O Conflito China, na política americana” (The China Conflict in American Politics), available at https://jccavalcanti.wordpress.com/2007/08/10/o-conflito-china-na-politica-americana/, gives an idea of how far this intrigue goes.

Now it is the distinguished Prof. Krugman, in his New York Times column, who devotes his time to it!

=======

The Chinese Disconnect

By PAUL KRUGMAN
Published: October 22, 2009
Senior monetary officials usually talk in code. So when Ben Bernanke, the Federal Reserve chairman, spoke recently about Asia, international imbalances and the financial crisis, he didn’t specifically criticize China’s outrageous currency policy.

But he didn’t have to: everyone got the subtext. China’s bad behavior is posing a growing threat to the rest of the world economy. The only question now is what the world — and, in particular, the United States — will do about it.

Some background: The value of China’s currency, unlike, say, the value of the British pound, isn’t determined by supply and demand. Instead, Chinese authorities set a target value for the yuan and enforce that target by buying or selling their currency in the foreign exchange market — a policy made possible by restrictions on the ability of private investors to move their money either into or out of the country.

There’s nothing necessarily wrong with such a policy, especially in a still poor country whose financial system might all too easily be destabilized by volatile flows of hot money. In fact, the system served China well during the Asian financial crisis of the late 1990s. The crucial question, however, is whether the target value of the yuan is reasonable.

Until around 2001, you could argue that it was: China’s overall trade position wasn’t too far out of balance. From then onward, however, the policy of keeping the yuan-dollar rate fixed came to look increasingly bizarre. First of all, the dollar slid in value, especially against the euro, so that by keeping the yuan/dollar rate fixed, Chinese officials were, in effect, devaluing their currency against everyone else’s. Meanwhile, productivity in China’s export industries soared; combined with the de facto devaluation, this made Chinese goods extremely cheap on world markets.
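The de facto devaluation described above is simple cross-rate arithmetic: with the yuan pegged to a falling dollar, the yuan falls against everything else. A sketch with rough illustrative exchange rates, not precise historical quotes:

```python
# With the yuan pegged to the dollar, a falling dollar drags the
# yuan down against every other currency automatically.
yuan_per_dollar = 8.28          # the peg (approximate pre-2005 rate)
dollar_per_euro_before = 0.90   # strong dollar, circa 2001 (illustrative)
dollar_per_euro_after = 1.40    # weak dollar, a few years later (illustrative)

yuan_per_euro_before = yuan_per_dollar * dollar_per_euro_before
yuan_per_euro_after = yuan_per_dollar * dollar_per_euro_after

depreciation = yuan_per_euro_after / yuan_per_euro_before - 1
print(f"{depreciation:.0%}")  # 56%: the euro buys that much more yuan, a big devaluation
```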

The result was a huge Chinese trade surplus. If supply and demand had been allowed to prevail, the value of China’s currency would have risen sharply. But Chinese authorities didn’t let it rise. They kept it down by selling vast quantities of the currency, acquiring in return an enormous hoard of foreign assets, mostly in dollars, currently worth about $2.1 trillion.

Many economists, myself included, believe that China’s asset-buying spree helped inflate the housing bubble, setting the stage for the global financial crisis. But China’s insistence on keeping the yuan/dollar rate fixed, even when the dollar declines, may be doing even more harm now.

Although there has been a lot of doomsaying about the falling dollar, that decline is actually both natural and desirable. America needs a weaker dollar to help reduce its trade deficit, and it’s getting that weaker dollar as nervous investors, who flocked into the presumed safety of U.S. debt at the peak of the crisis, have started putting their money to work elsewhere.

But China has been keeping its currency pegged to the dollar — which means that a country with a huge trade surplus and a rapidly recovering economy, a country whose currency should be rising in value, is in effect engineering a large devaluation instead.

And that’s a particularly bad thing to do at a time when the world economy remains deeply depressed due to inadequate overall demand. By pursuing a weak-currency policy, China is siphoning some of that inadequate demand away from other nations, which is hurting growth almost everywhere. The biggest victims, by the way, are probably workers in other poor countries. In normal times, I’d be among the first to reject claims that China is stealing other peoples’ jobs, but right now it’s the simple truth.

So what are we going to do?

U.S. officials have been extremely cautious about confronting the China problem, to such an extent that last week the Treasury Department, while expressing “concerns,” certified in a required report to Congress that China is not — repeat not — manipulating its currency. They’re kidding, right?

The thing is, right now this caution makes little sense. Suppose the Chinese were to do what Wall Street and Washington seem to fear and start selling some of their dollar hoard. Under current conditions, this would actually help the U.S. economy by making our exports more competitive.

In fact, some countries, most notably Switzerland, have been trying to support their economies by selling their own currencies on the foreign exchange market. The United States, mainly for diplomatic reasons, can’t do this; but if the Chinese decide to do it on our behalf, we should send them a thank-you note.

The point is that with the world economy still in a precarious state, beggar-thy-neighbor policies by major players can’t be tolerated. Something must be done about China’s currency.

China’s Dollar Problem

October 22, 2009

Prof. Kenneth Rogoff's most recent article at http://www.project-syndicate.org!

=============

China’s Dollar Problem
Kenneth Rogoff

CAMBRIDGE – When will China finally realize that it cannot accumulate dollars forever? It already has more than $2 trillion. Do the Chinese really want to be sitting on $4 trillion in another five to 10 years? With the United States government staring at the long-term costs of the financial bailout, as well as inexorably rising entitlement costs, shouldn’t the Chinese worry about a repeat of Europe’s experience from the 1970’s?

During the 1950’s and 1960’s, Europeans amassed a huge stash of US Treasury bills in an effort to maintain fixed exchange-rate pegs, much as China has done today. Unfortunately, the purchasing power of Europe’s dollars shriveled during the 1970’s, when the costs of waging the Vietnam War and a surge in oil prices ultimately contributed to a calamitous rise in inflation.
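The erosion Rogoff describes is just compound inflation working against a fixed nominal hoard. A sketch with an assumed inflation rate in the neighborhood of the 1970s US average:

```python
# Real purchasing power of a fixed dollar reserve after sustained inflation.
def purchasing_power(years, annual_inflation):
    return 1 / (1 + annual_inflation) ** years

# Assumption: roughly 7% average annual US inflation, in line with the 1970s.
remaining = purchasing_power(years=10, annual_inflation=0.07)
print(f"{remaining:.2f}")  # 0.51: about half the hoard's real value is gone
```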

Perhaps the Chinese should not worry. After all, the world leaders who just gathered at the G20 summit in Pittsburgh said that they would take every measure to prevent such a thing from happening again. A key pillar of their prevention strategy is to scale back “global imbalances,” a euphemism for the huge US trade deficit and the corresponding trade surpluses elsewhere, not least China.

The fact that world leaders recognize that global imbalances are a huge problem is welcome news. Many economists, including myself, believe that America’s thirst for foreign capital to finance its consumption binge played a critical role in the build-up of the crisis. Cheap money from abroad juiced an already fragile financial regulatory and supervisory structure that needed discipline more than cash.

Unfortunately, we have heard leaders – especially from the US – claim before that they recognized the problem. In the run-up to the financial crisis, the US external deficit was soaking up almost 70% of the excess funds saved by China, Japan, Germany, Russia, Saudi Arabia, and all the countries with current-account surpluses combined. But, rather than taking significant action, the US continued to grease the wheels of its financial sector. Europeans, who were called on to improve productivity and raise domestic demand, reformed their economies at a glacial pace, while China maintained its export-led growth strategy.

It took the financial crisis to put the brakes on the US borrowing train – America’s current-account deficit has now shrunk to just 3% of its annual income, compared to nearly 7% a few years ago. But will Americans’ newfound moderation last?

With the US government currently tapping financial markets for a whopping 12% of national income (roughly $1.5 trillion), foreign borrowing would be off the scale but for a sudden surge in US consumer and corporate savings. For the time being, America’s private sector is running a surplus that is sufficient to fund roughly 75% of the government’s voracious appetite. But how long will US private sector thrift last?
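The figures in this passage are mutually consistent, which is worth verifying. A back-of-the-envelope check using only the numbers quoted in the text:

```python
# Consistency check on the borrowing arithmetic quoted above.
government_borrowing = 1.5          # $ trillions, ~12% of national income
national_income = government_borrowing / 0.12
private_surplus_share = 0.75        # private sector funds ~75% of the deficit

foreign_borrowing = government_borrowing * (1 - private_surplus_share)
foreign_share_of_income = foreign_borrowing / national_income
print(round(national_income, 1), f"{foreign_share_of_income:.0%}")
# 12.5 3%: foreign borrowing of ~3% of income matches the
# current-account deficit cited a paragraph earlier
```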

As the economy normalizes, consumption and investment will resume. When they do – and assuming that the government does not suddenly tighten its belt (it has no credible plan to do so) – there is every likelihood that America’s appetite for foreign cash will surge again.

Of course, the US government claims to want to rein in borrowing. But, assuming the economy must claw its way out of recession for at least another year or two, it is difficult to see how the government can fulfill its Pittsburgh pledge.

Yes, the Federal Reserve could tighten monetary policy. But it will not worry too much about the next financial crisis while the aftermath of the current one still lingers. In our new book, This Time Is Different: Eight Centuries of Financial Folly, Carmen Reinhart and I find that if financial crises hold one lesson, it is that their aftereffects have a very long tail.

Any real change in the near term must come from China, which increasingly has the most to lose from a dollar debacle. So far, China has looked to external markets so that exporters can achieve the economies of scale needed to improve quality and move up the value chain. But there is no reason in principle that Chinese planners cannot follow the same model in reorienting the economy to a more domestic-demand-led growth strategy.

Yes, China needs to strengthen its social safety net and to deepen domestic capital markets before consumption can take off. But, with consumption accounting for 35% of national income (compared to 70% in the US!), there is vast room to grow.

Chinese leaders clearly realize that their hoard of T-Bills is a problem. Otherwise, they would not be calling so publicly for the International Monetary Fund to advance an alternative to the dollar as a global currency.

They are right to worry. A dollar crisis is not around the corner, but it is certainly a huge risk over the next five to 10 years. China does not want to be left holding a $4 trillion bag when it happens. It is up to China to take the lead on the post-Pittsburgh agenda.

The Business of Sustainability

October 21, 2009

An interesting report from the MIT Sloan Management Review: http://sloanreview.mit.edu/special-report/the-business-of-sustainability/!

=============

The Business of Sustainability

Findings From the First Annual Survey and Interview Project
 
Will sustainability change the competitive landscape and reshape the opportunities and threats that companies face? Will sustainability have a material impact on your company? What should you and your company do about it? Our special report explores these burning questions and many more.

Read the Report

The Business of Sustainability is produced by MIT Sloan Management Review and The Boston Consulting Group. It is part of our Sustainability Initiative, which studies how the challenges and opportunities presented by sustainability will transform management. Read the special report or a summary report.

 

The Future of Supercomputers is Optical

October 19, 2009

From MIT's www.technologyreview.com!

Friday, October 16, 2009

The Future of Supercomputers is Optical

An IBM researcher gives a timeline for developing the next generation of supercomputers.
By Katherine Bourzac

This week at the Frontiers in Optics conference in San Jose, Jeffrey Kash of IBM Research laid out his vision of the future of supercomputers.

The fastest supercomputer in the world, the Los Alamos National Laboratory’s IBM Roadrunner, can perform 1,000 trillion operations per second, which computer scientists call the petaflop scale. Getting up to the next level, the exaflop scale, which is three orders of magnitude faster, will require integrating more optical components to save on power consumption, Kash said. (Laser scientists at the conference are also looking towards the exascale, as I reported on Wednesday.)

Melinda Rose of photonics.com reported on Kash’s talk, which he stated represented his personal views and not those of IBM:

Because a 10x increase in performance means the machine will consume double the power, optics will need to be more widely used to make future supercomputers feasible to build and operate, he said. In 2008, a 1-petaflop computer cost $150 million to build and consumed 2.5 MW of power. Using the same technology, a 1-exaflop machine in 2020 would cost $500 million to build and consume 20 MW of power.
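Taken at face value, those figures imply a large required gain in energy efficiency, and they line up with the quoted rule of thumb that each 10x in performance doubles power. Simple ratios from the numbers above:

```python
# Petaflop machine (2008) vs. projected exaflop machine (2020, same
# technology): performance rises 1000x while power rises only 8x,
# consistent with "10x performance doubles power" (1000x = 10^3 -> 2^3 = 8x).
perf_ratio = 1e18 / 1e15        # exaflops vs. petaflops
power_ratio = 20.0 / 2.5        # 20 MW vs. 2.5 MW
cost_ratio = 500.0 / 150.0      # $500M vs. $150M

efficiency_gain = perf_ratio / power_ratio  # required flops-per-watt improvement
print(perf_ratio, power_ratio, efficiency_gain, round(cost_ratio, 2))
# 1000.0 8.0 125.0 3.33
```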

Kash gave a timeline that would find optics replacing electrical backplanes by 2012 and electrical printed circuit boards by 2016. By 2020, optics could be directly on the chip. In a less aggressive scenario, all off-chip communications would need to be optical by 2020, he said.

But for that to happen, to get optics up to millions of units in 2010, the price needs to drop to about $1 per Gb/s, he said. Today, the cost is about $10 per Gb/s.

Broadband without ICMS, the beginning of an era

October 19, 2009

São Paulo once again takes the lead on cutting taxes on ICTs! The piece below comes from the blog of journalist Ethevaldo Siqueira (http://www.ethevaldo.com.br).

==========
Broadband without ICMS, the beginning of an era
October 19, 2009

It would be very positive for Brazil if the governors of the other Brazilian states followed the example of São Paulo's governor, José Serra, who on Thursday (10/15), before a large audience at Futurecom 2009, signed the decree eliminating the ICMS (the state tax on the circulation of goods and services) for popular broadband connection services, that is, at speeds from 200 kilobits per second (kbps) to 1 megabit per second (1 Mbps).

This edition of Futurecom was the second held in São Paulo. The event, well organized and with good content, drew authorities from several sectors. In the communications area, the two notable absences were the Minister of Communications, Hélio Costa, and the president of Anatel (the National Telecommunications Agency), Ambassador Ronaldo Sardenberg.

The choice of broadband as the main theme was most timely and aroused great interest. A climate of confidence and optimism dominated the speeches of government officials and industry leaders alike, especially regarding the recovery of the world economy and the outlook for Brazil.

The great majority of executives and industry leaders, from operators and manufacturers alike, were optimistic about the investment prospects for the next seven years, with Brazil hosting the 2014 World Cup and the 2016 Olympics in Rio de Janeiro. The country's preparations for these mega-events are expected to generate demand for investments on the order of R$150 billion over the next five years to expand telecommunications infrastructure, in particular fixed and mobile broadband.

One of the most positive notes was the presence of authorities such as Governor José Serra, São Paulo mayor Gilberto Kassab, and members of Congress. Laudálio Veiga Filho announced an international edition of Futurecom to be held in Chile in July 2010, as well as an event aimed at marketing and communications professionals, to take place by the first half of next year.

A special study

The product of a partnership between the consulting firm and website Teleco and Huawei, a study offering a detailed profile of the evolution of mobile broadband in the country was released last week. The study is updated quarterly and can be downloaded at: http://www.huawei.com/pt/catalog.do?id=1779

Recent strategies of the large ICT corporations, part II: Google

October 13, 2009

In this week's Creativante newsletter, titled “Estratégias recentes das grandes corporações de TICs-II” (Recent strategies of the large ICT corporations, part II), which you can access here, we give special attention to Google.

To complement this post, we would also like to point to a Businessweek link to a slide show titled “Google's 20 Hottest Tools”!

An institutional economics prize

October 12, 2009

Prof. Paul Krugman, last year's Nobel laureate in economics, comments on this year's winners on his blog (http://krugman.blogs.nytimes.com/).

=============

October 12, 2009, 8:12 am

An institutional economics prize

Congratulations to Elinor Ostrom and Oliver Williamson. What a day for them!

The way to think about this prize is that it’s an award for institutional economics, or maybe more specifically New Institutional Economics.

Neoclassical economics basically assumes that the units of economic decision-making are a given, and focuses on how they interact in markets. It’s not much good at explaining the creation of these units — at explaining, in particular, why some activities are carried out by large corporations, while others aren’t. That’s obviously an interesting question, and in many cases an important one. For example, in my own home field of international trade, the basic models don’t assign any particular role to multinational corporations; how do we get them into the story, and what difference do they make?

There was an old tradition of economics that focused on the origins and nature of economic institutions. This tradition was very influential before World War II.

But it proved not at all helpful during the Great Depression. My caricature version is that when the Depression hit, institutional economics, asked for advice about what to do, replied that well, it’s all very complicated, and has deep historical roots, and … Meanwhile, Keynesian economists, using very simple mathematical models, basically said “Push this button — we need more G”.

And this had a somewhat perverse effect. The rise of Keynesian economics also meant the rise of the equations guys (Samuelson in particular), and in the end the equations crowded out institutional economics even as Keynes fell into disfavor.

But the questions didn’t go away. And institutional economics has been making a quiet comeback for the past several decades.

Oliver Williamson’s work underlies a tremendous amount of modern economic thinking; I know it because of the attempts to model multinational corporations, almost all of which rely to some degree on his ideas. I wasn’t familiar with Ostrom’s work, but even a quick scan shows why she shared the prize: if the goal is to understand the creation of economic institutions, it’s crucial to be aware that there is more variety in institutions, a wider range of strategies that work, than simply the binary divide between individuals and firms.

The prize is also, of course, a happy reminder that most of the profession is not caught up in the macro wars!

Add: Don’t tell Senator Coburn, but the NSF Political Science program has supported a lot of Elinor Ostrom’s research.

The Future of Television

October 12, 2009


An interesting report prepared by Businessweek on the future of television, from the perspective of the American context!

See http://www.businessweek.com/bwdaily/dnflash/special_reports/20090422the_future_of_tv.htm!

