Archive for June 2009

Consumer Confidence Rises 4th Straight Month for the First Time Since the End of the 2001 Recession

June 29, 2009

A June 26 post from Prof. Mark Perry's blog indicates that, with each passing day, the worst of the crisis (exactly where it began) is being left further behind!


Consumer Confidence Rises 4th Straight Month for the First Time Since the End of the 2001 Recession


LA TIMES: Confidence among U.S. consumers rose this month for a fourth straight time, reflecting signs that the worst of the recession has passed. The Reuters/University of Michigan final index of consumer sentiment gained to 70.8, the highest level since February 2008, from 68.7 in May.

Recent reports show some areas of the economy, such as housing and manufacturing, are seeing a smaller pace of decline, consistent with the Federal Reserve’s projection this week that the slump is “slowing.” Government data today indicated that efforts to revive the economy are allowing consumers to spend even with unemployment at a 25-year high. The data also showed savings surged to the highest level since 1993.

MP: The last time the Michigan consumer sentiment index increased in four consecutive months was the period from October 2001 to January 2002, which signalled the end of the 2001 recession (see shaded area in chart above). The four-month cumulative increase of 14.5 points in consumer sentiment from March to June 2009 (see shaded area in chart) is even greater than the 11.2 point increase in late 2001-early 2002.
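As a quick sanity check on the arithmetic, the cumulative gain can be reproduced in a few lines of Python. Only the May (68.7) and June (70.8) readings appear in the text above; the February baseline is implied by the stated 14.5-point four-month gain, and the March and April figures here are assumptions for illustration.

```python
# Only May and June come from the article; Feb is implied by the stated
# 14.5-point gain (70.8 - 14.5 = 56.3), and Mar/Apr are assumed.
readings = {"Feb": 56.3, "Mar": 57.3, "Apr": 65.1, "May": 68.7, "Jun": 70.8}

values = list(readings.values())
monthly_changes = [round(b - a, 1) for a, b in zip(values, values[1:])]

# Four straight monthly increases...
assert all(change > 0 for change in monthly_changes)

# ...whose sum telescopes to the June level minus the February baseline.
cumulative = round(sum(monthly_changes), 1)
print(monthly_changes)  # [1.0, 7.8, 3.6, 2.1]
print(cumulative)       # 14.5
```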

Ballmer: Advertising Is All About Content

June 28, 2009

Steve Ballmer, Microsoft's CEO, talks about advertising at a global industry event. The post came from the blog!


Ballmer: Advertising Is All About Content

Microsoft’s CEO lays out his perspective on the future of online advertising at European ad conclave.

June 26, 2009
By Stuart J. Johnston:

Microsoft’s CEO Steve Ballmer presented an overview of his vision for the future of advertising to a gathering of global advertising professionals meeting on the French Riviera.

His message laid out Microsoft’s (NASDAQ: MSFT) vision of advertising in the future — and it’s not just about moving media from paper or television to online.

Among other things, it depends on technology evolution such as the merging of computing and television. It also means that the worlds of commerce and advertising need to be re-imagined.

“Five years ago, I’d have said we are mostly trying to do the offline world online,” Ballmer said.

Ballmer made the statements during his keynote speech to attendees at the Cannes Lions International Advertising Festival 2009 in Cannes, France, on Wednesday.

His speech comes as the company’s newly launched search engine tries to compete with rivals Google (NASDAQ: GOOG) and Yahoo (NASDAQ: YHOO) better than its predecessors, Live Search and MSN Search.

The ultimate prize: money from advertising. Search advertising is crucial, but so is figuring out how to deal with the continuing merger of television — talk about an advertising vehicle — with computing.

“TV is probably where the development or the future of what’s going to happen to interactive digital content is least clear,” Ballmer said, envisioning hybrid devices like Microsoft’s Xbox 360 game console and media hub, or wireless devices.

“At the end of the day you can put a PC, some form of a PC … next to the TV or embedded in the TV and we’ll have to, as we’re repackaging the Internet and the digital content for the phone, we’re going to see content get repackaged and repurposed to really make sense of a TV-based experience,” Ballmer continued.

Ballmer’s “Mister Obvious” approach may have been calming for his audience, confronted with rapidly changing business models and explosions in communications technologies, as advertising tries to bridge the gap between the physical and online worlds.

Nonetheless, over time all media will move to digital form and most paper publications will migrate online or die, he said.

“We ought to be able to drive everybody to consume the communications and information, the information in their world digitally. All information will be social, you’ll expect to be able to collaborate and share thoughts, perspectives and interactions in any piece of content or any experience socially,” he added.

Of course, Microsoft has its own agenda for online advertising as it tries to evolve its business to embrace services “in the cloud,” much of which will be subsidized by advertising.

To that end, over the past several years, Microsoft has spent lavishly — both internally and through acquisitions — to make itself a credible presence in online advertising, with some success.

“Microsoft has now grown an advertising business in excess of $2 billion and that’s a big number for our company, a big number, and we’re very serious about it,” Ballmer said.

The company’s largest single acquisition was the purchase of online advertising powerhouse aQuantive for $6 billion in 2007.

That media buy has helped Microsoft’s play for a larger chunk of the online advertising pie. In the first nine months of Microsoft’s fiscal 2009, which ends June 30, the company brought in $1.7 billion in online advertising despite the recession, according to its latest 10-Q. That’s not a small amount for a company that grossed $60 billion last fiscal year.

Nod to Apple

Much of Microsoft’s future advertising earnings will depend on its Azure cloud computing platform, meant to deliver services to users wherever they are and whatever device they’re using.

Microsoft launched Azure with much fanfare last fall. However, the service is still in beta and the company still has not explained how it will make money — though advertising will surely play a large role.

Ballmer also gave a nod to Apple for creating its App Store, a model that Microsoft followed when it introduced its own Windows Marketplace for Mobile earlier this year.

“[I] give a lot of credit to Apple and what they’ve done with the App Store … people are trying to figure out how to take the PC-based Internet and repackage it for a small screen,” he added.

He wasn’t quite so generous when it came to mobile devices.

“Microsoft will continue to push forward with our Windows Mobile program, which despite the fact that we’ve certainly seen a little momentum in our competitors this year; we think the notion that there’s a software platform that can be available on many hardware devices, is key to platform standardization on the phone.”

A Value Proposition for Enterprise Architecture

June 25, 2009

An interesting take on Enterprise Architecture from Richard Veryard's blog!


Monday, June 15, 2009

A Value Proposition for Enterprise Architecture


Something else I took away from Sally Bean’s and Peter Haine’s workshop Reflecting on EA at EAC 2009 was the relationship between two sets of questions. What is EA all about, what is the (emerging, changing) identity of EA? What is the value proposition for EA?

I personally find the second of these two questions much more important and interesting than the first. Questions of identity often result in entrenched positions, and (as I discussed in my previous post Think Globally, Act Locally) can produce division between different views. In the case of Enterprise Architecture, there is a traditional view (EA-as-IT-planning) and an emerging view (EA-as-business-strategy).

I think the question about the value proposition opens up a much more interesting discussion about the possible evolution and potential multi-tasking of enterprise architecture teams. (EA itself needs to understand its business model to help it survive. The full exploration of the value proposition needs something like the Osterwalder business model canvas we discussed in our workshop on Business Modelling for Business Improvement, so I’ll have a go at that. See below for the Osterwalder template.)

One of the key questions for the value proposition is the timescale. Sometimes enterprise architecture is described in terms of longer-term value: through-life coordination and capability management. Some people are still comfortable with this; but other people see a difficulty with the fact that business needs to wait so long to realise this kind of value, or even to measure it properly. In contrast, there is growing support for a much shorter-term delivery of value.

Essentially, that means EA must deliver value within the same timescale as projects. And what is the nature of this value? Roger Sessions points to the massive failure rate in IT projects, and argues that’s something EA can and should be fixing. In other words, the EA value proposition can be defined in terms of improving IT project success ratios.

I see two difficulties with this. The first is one of perspective. If EA is working closely with the projects, then how is the EA perspective any different from the project perspective? And if the problem is that projects are doing things wrong, then how can EA fix the problem from within the project perspective? The EA view of business requirements is hardly going to be very different from that of the good business analyst on the project. If EA is no longer taking the long view, then its value proposition is largely based on the hypothesis that the architects may have a bit more knowledge and experience than the business analysts, and some slightly superior tools and techniques. But we might achieve this outcome more efficiently and effectively by simply upgrading the business analysis practice and redeploying the architects as senior business analysts. Indeed, some IT organizations seem to be moving in this direction, although they haven’t taken away the formal job titles yet.

The second difficulty is that the job of overseeing projects and ensuring project success is hugely duplicated. Within a large IT organization, we might have project management, programme management, IT governance, tools and methods, quality management (control and/or assurance) as well as enterprise architecture, each with its own “body of knowledge”, each trying to prevent projects from getting things wrong (and claim the credit). The word “silo” springs to mind here. (All of those roles might possibly be held by the same person, but does that remove the complexity?)

From a systems-thinking perspective, this looks completely crazy. If the value proposition for EA is simply to correct things that projects are doing wrong, then this counts as “failure demand”. If the value proposition for EA is to make sure that projects are successful, then that’s putting the responsibility in the wrong place. It is the project’s job to be outstandingly successful. If they can achieve this unaided, this appears to make EA redundant.

In summary, I’m not convinced that the traditional value proposition for enterprise architecture is convincing to its customers, whoever they are. (Who are the customers anyway, the CIO or CFO who have to pay for it, or the business line management and IT project managers who are being asked to spend time and effort on EA “for their own good”, and are not always grateful for EA attention?) I think the top priority for the enterprise architecture discipline is to find and formulate a viable and meaningful value proposition. And it really doesn’t matter whether we call it “enterprise architecture” or not.

Thanks to several Twitter friends for today’s discussion: A Jangbrand, Anders Østergaard, Andrew Townley, Brenda Michelson, Colin Beverage, Roger Sessions, Todd Biske. (Did I miss anyone?)

Building a 21st Century infrastructure

June 23, 2009

A good post from , published in March of this year!


Building a 21st Century infrastructure


By CBR staff writer

Enterprises are under increasing pressure to do more with less. Rapidly expanding infrastructure means increased power consumption, complexity and cost. CBR takes a look at the ways in which a company can reduce these factors without compromising efficiency or performance.

In days gone by, the solution to tackling increasing data centre demand was to simply throw more hardware at it. More servers equalled more capacity. But that required more space, power consumption, man hours and more bits and bobs to keep everything running.

The explosion of data – one report suggests 15 petabytes of information is being generated every day – is driving up data centre demand, which in turn is pushing up energy costs.

During the economic boom of the late 1990s and early 2000s this was not an issue; companies had money to throw at data centres. But that is no longer the case. IBM suggests that data centre costs, such as energy and space, have risen eightfold since 1996, and companies can no longer afford to throw good money after bad at the problem.

So what can enterprises do? Data, applications and any other information contained in a data centre have to stay there, and they have to be available whenever the company needs them. Data centres need to run all the time; switching the infrastructure off at night, along with the lights, is not an option.

Simply reducing costs is not enough. Enterprises need to spend their money more wisely. That does not have to mean spending less money; instead, the pressure is on to find more effective investment opportunities. Is it a sensible option to cap server power consumption when that could impact the response time of applications and SLAs?

Enterprises are increasingly turning toward virtualisation as a way of reducing costs without having to worry about a decrease in the availability of business-critical operations. Virtualisation can be described as the abstraction of “a form of technology away from its original environment – a literal and physical form – and redeliver it in a virtual form.”

The idea of virtualisation is not new. Companies such as IBM have been involved in it, in one form or another, since the 1960s, “in the specific context of separate logical partitions running in parallel on a shared mainframe,” according to the company. Since those days, virtualisation has grown to cover systems, storage, networks and applications.

Data centre consolidation, shifting the functionality of many servers onto fewer servers, enables a company to manage the infrastructure as a single entity and results in reduced space, power and cooling requirements, as well as management costs.
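The consolidation described above is essentially a bin-packing problem: fit the workloads of many lightly-loaded servers onto as few hosts as possible without exceeding a target utilisation. A minimal first-fit-decreasing sketch in Python follows; the utilisation figures and the 80% headroom target are illustrative assumptions, not numbers from the article.

```python
# Back-of-the-envelope consolidation sketch (first-fit decreasing bin packing).
# All figures are illustrative, not from the article.
def consolidate(loads, host_capacity=1.0):
    """Pack per-server utilisations onto the fewest hosts (first-fit decreasing)."""
    hosts = []  # remaining free capacity on each host
    for load in sorted(loads, reverse=True):
        for i, free in enumerate(hosts):
            if load <= free:
                hosts[i] = free - load  # fits on an existing host
                break
        else:
            hosts.append(host_capacity - load)  # open a new host
    return len(hosts)

# Ten lightly-loaded servers (8-20% utilisation each) fit on two hosts
# kept below an 80% utilisation ceiling.
loads = [0.15, 0.10, 0.20, 0.12, 0.18, 0.15, 0.08, 0.14, 0.16, 0.11]
print(consolidate(loads, host_capacity=0.80))  # 2
```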

Having all these systems in place is one thing, but getting the best out of them is another. IBM says its technology can also improve performance, with what it is calling a dynamic infrastructure. Unveiled at IBM Pulse 2009 in Las Vegas, it aims to close the gap between where a business is and where it needs to be, to enable it to operate more efficiently.

Richard Esposito, vice president of IT strategy and architecture services, IBM Global Technology, said: “A dynamic infrastructure is the intelligent connection of underlying business and IT assets that are highly-automated to reduce costs, increase service levels and better manage risk.”

Esposito went on to say that the line between business assets and IT assets is becoming blurred and that is actually improving organisational efficiency. Taking a more business-minded approach to IT assets means companies are getting greater visibility of their infrastructure, which in turn is driving greater utilisation and performance.

Al Zollar, IBM Tivoli general manager, said that this is where IBM’s dynamic infrastructure sits. “Our approach is holistic, it’s not just based on our equipment but connecting with the equipment provided by others, such as power distribution units and air conditioning units,” he said. “Then we bring all of those assets into a single unified view where we can get a single set of measurements.”

This helps provide a much clearer picture of what is happening within a data centre, says Zollar. “We are bringing not just traditional IT assets but assets that are being enabled with IT. Data centre assets like cooling and power distribution are being smartened up with sensors that can communicate with control systems that can be driven by automated actions,” Zollar said.

The improvements to data centre optimisation look set to continue. Many industry analysts have suggested that Fibre Channel over Ethernet (FCoE) has the potential to revolutionise next-generation data centres.

The ability to leverage 10Gb Ethernet networks should not only improve the performance of the data centre network but it could also reduce power and cooling costs as well as the amount of physical space needed, as the number of network interface cards required should be reduced.

In a recent interview with CBR, Dante Malagrino, marketing director, data centre solutions at Cisco Systems offered a ringing endorsement of FCoE: “Network infrastructures can be very costly to build and manage, and it’s too complicated. So as we look at IT simplification and reducing costs, FCoE will consolidate lots of different environments on one piece of cable,” he said.

Malagrino’s comments were backed up by Craig Nunes, VP of marketing at utility storage vendor 3PAR. He recently told CBR: “We’ve seen a lot of I/O transitions come and go and they always take longer than predicted, but it is clear that FCoE has the momentum. You get better protocol consolidation, it’s easier to deal with and you get better leverage of your data centre equipment,” Nunes said.

But Nunes did issue a word of warning. “All the signs are there that it’s going to be an important interconnect from storage to host. That said, it won’t be today or tomorrow or later this year,” he said.

That still leaves enterprises with the issue of what they can do now. One company helping to deliver IBM’s dynamic infrastructure is Ilog, headquartered in Gentilly, France. In August 2008, the company was acquired by IBM for $340m, with the deal completing in January 2009.

CBR recently caught up with Jeremy Bloom, senior product marketing manager for optimisation at Ilog. He said his company is providing optimisation technologies which fit with IBM’s Smarter Planet initiative, particularly in the power supply field.

“We’re delivering a few applications for the dynamic infrastructure. I think there is a great potential there. One of the ideas behind it is that it enables you to locate and fix a lot of issues through sensors and automation,” Bloom said.

This is an important development. Enabling a company to proactively monitor and manage its infrastructure should improve resiliency. If a sensor detects that a server is getting close to operational capacity, business-critical applications can be automatically moved to a different server, with no drop-off in performance or availability.
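The sensor-driven automation described above can be sketched as a simple threshold-and-migrate loop. Everything below is hypothetical: the class names, the 80% threshold and the placement rule are illustrative, not taken from any IBM or Ilog product.

```python
# Hypothetical sketch of threshold-driven workload migration; names and the
# 80% utilisation threshold are illustrative, not from any vendor's product.
from dataclasses import dataclass, field

@dataclass
class Server:
    name: str
    capacity: float                            # total units of work it can host
    apps: dict = field(default_factory=dict)   # app name -> load

    @property
    def load(self):
        return sum(self.apps.values())

def rebalance(servers, threshold=0.80):
    """Move apps off any server whose utilisation exceeds `threshold`."""
    moves = []
    for hot in servers:
        while hot.load / hot.capacity > threshold and hot.apps:
            app, load = max(hot.apps.items(), key=lambda kv: kv[1])
            # pick the least-utilised server that can absorb the app
            target = min((s for s in servers if s is not hot
                          and (s.load + load) / s.capacity <= threshold),
                         key=lambda s: s.load / s.capacity, default=None)
            if target is None:
                break  # nowhere to go; a real system would raise an alarm
            del hot.apps[app]
            target.apps[app] = load
            moves.append((app, hot.name, target.name))
    return moves

a = Server("a", capacity=100, apps={"crm": 70, "mail": 20})
b = Server("b", capacity=100, apps={"web": 10})
print(rebalance([a, b]))  # [('crm', 'a', 'b')]
```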


CBR opinion

Integrating both the business and IT infrastructures, what IBM terms a dynamic infrastructure, should enable a company to consolidate an existing infrastructure by using virtualisation technologies to operate much more effectively. It is hard to argue with the benefits: higher efficiency, improved performance and reduced energy and management costs.

Photo credit: Pandiyan on Flickr, CC licence

In Search of Innovation

June 22, 2009

An excellent article on innovation from today's The Wall Street Journal!


  • JUNE 22, 2009, 5:46 A.M. ET

    In Search of Innovation
    When companies try to come up with new ideas, they too often look only where they always look. That won’t get them anywhere.


    If you want to understand why some companies lack innovative ideas, think about the man who can’t find his car keys.

    His friend asks him why he’s looking for the keys under the lamppost when he dropped them over on the lawn. “Because there’s more light over here,” the man explains.

    For too many companies, that describes their search for new ideas, and it pretty much guarantees they won’t go anywhere fast. While such a company can marginally improve what it’s already good at, it misses out on the breakthroughs—those eureka moments when a new concept pops up, as if from nowhere, and changes a company’s fortunes forever.

    Those ideas, however, don’t really come from nowhere. Instead, they are typically at the edge of a company’s radar screen, and sometimes a bit beyond: trends in peripheral industries, unserved needs in foreign markets, activities that aren’t part of the company’s core business. To be truly innovative, companies sometimes have to change their frames of reference, extend their search space. New ways of thinking and organization can be required as well.

    In other words, they have to look away from the lamppost.

    None of this is easy to do. But companies that succeed may just recognize the next great opportunity, or looming threat, before their competitors do. And that’s important in tumultuous economic times with rapidly changing technologies. Indeed, every once in a while, that blip on the horizon turns out to be a tsunami.

    For the past several years, we and other researchers have participated in workshops with more than 100 companies discussing and experimenting with new ways of looking for and developing innovations. Here are nine examples of practices with the potential to produce a company’s eureka moment.


    Many companies use teams of writers with diverse perspectives to create complex scenarios of what future markets may look like. The writers try to imagine detailed opportunities and threats for their companies, partners and collaborators. An oil company that wants to explore energy opportunities in cities of the future, for example, might want to work on scenarios with writers from construction, water and utility-management companies.

    Industry organizations and government agencies use scenarios, too, sometimes in collaboration with companies. Bord Bia, the Irish food agency, works on scenarios with global food companies based in Ireland like Kerry Group PLC and Glanbia PLC. Danish pharmaceutical giant Novo Nordisk AS has shared scenarios with the Oxford Health Alliance, a British nonprofit. Novo Nordisk thus helps the cause and broadens its own views by gaining the input of alliance members.


    A few companies have created Web sites that act as literal marketplaces of ideas. InnoCentive is a site where people and companies look for help in solving scientific and business challenges. Posters of challenges sometimes offer cash rewards for solutions: amounts have ranged from $5,000 to $1 million. The site began as an in-house tool for research scientists at Eli Lilly & Co. to help one another. Now it is independent, with Indianapolis-based Lilly as a founding shareholder.

          [Far and Wide

    • The Situation: Companies looking for innovative ideas often limit their searches to fields they’re already familiar with.
    • The Problem: That can help with incremental progress, but it seldom leads to the kinds of breakthroughs or inspirations that generate new markets and dramatic growth.
    • The Solution: Companies need to look at the edge of their radar screens, and sometimes a bit beyond, to experience eureka moments. The authors describe methods that successful companies use to keep innovation strong.]

    By opening the site up, Lilly gets wider access to individuals and companies with ideas that may be of value. Problem solvers can be professionals, retired scientists, students or anyone who can answer a problem that has stumped a company’s own researchers. InnoCentive, based in Waltham, Mass., says the site gives solutions to about 40% of the problems posed.

    Other companies use their own Web sites as open invitations for new ideas. BMW AG, for example, through what it calls its Virtual Innovation Agency, invites ideas from “small and medium-sized innovative companies” on the Web site.


    Ideas and insights from so-called lead users can be the starting point for new markets, products and services.

    Lead users are innovators themselves. They tend to be people working in or using products in a specific market who are frustrated by the tools, goods or services currently available and yearn for something better. Many medical devices, for example, originate from sketches drawn by surgeons, surgical nurses and other medical staff who feel driven to experiment with new ideas because current products aren’t meeting their needs. They are often supportive, and tend to tolerate product failures as part of a process that helps bring about improvements. Software and other online-products companies have shown interest in lead users for perpetual beta testing and other product development.

    British Broadcasting Corp. sponsors a Web site for lead users, Backstage. Several times a year the BBC uses the site to host what it calls “hack days,” when it lets subscribers play around with source code the BBC uses for such online applications as live news feeds, weather and TV listings. BBC staff look at what the Backstage subscribers come up with to see what can be useful. Some of this work is feeding through to applications used on the BBC Web sites and elsewhere. For example, one idea from a hack day led the BBC to link its iPlayer, a tool for watching BBC video on the Web, with Facebook, the social-networking site. Facebook users can set up an iPlayer window on their pages to watch BBC programming.

    Interest has surged in market research that uses detailed, firsthand observation to learn more about consumers’ needs or wants. Deep diving is one of many terms used to describe the approach, which resembles an anthropological study in the way researchers immerse themselves in the lives of the target consumers.

    Such approaches can help uncover underserved or unserved markets and give clues to new directions and new frames in which to search for innovative ideas.

    Novo Nordisk, for one, mobilized teams in several developing countries to research how health systems with limited resources were handling diabetes care. Researchers compiled detailed interviews and observations—documenting cases by interviewing patients and recording them on video, and spending time in hospitals, rural clinics and the health ministry. The result: a rich picture of the market, of needs that weren’t being met, and fertile suggestions for alternative products and services that might be delivered.


    Some companies design probe-and-learn strategies that study opportunities in segments of markets the company isn’t active or strong in. This strategy goes further than deep diving by actively experimenting with new ideas in a new context. The experiments might not always work, but they will give valuable insight about future directions of markets.

    BT Group PLC, the British telecommunications company, for instance, is looking at ways to help the elderly live longer at home. By 2026, about 30% of the U.K. population will be more than 60 years old. As part of its probe-and-learn exercise, BT is conducting a test service in which it places sensors in the homes of elderly customers to monitor their movement; if the sensors detect unusual activity, or none, they trigger an alarm. BT says that the service already is generating revenue, but that its greater significance is as a stepping-stone to help the company learn more about what will be a huge and very different market in the future.


    By engaging more of its own workers in the search for innovation, a company can broaden its vision. For example, the duties of procurement, sales or finance groups can be expanded to include learning about trends they encounter that ordinarily might be considered not of primary interest to the company.

    Reckitt Benckiser PLC, the U.K.-based maker of household-cleaning and personal-hygiene products, has mobilized a large number of its agents in purchasing, marketing and customer relations to be on the lookout for relevant new market trends. A small in-house team attempts to verify reported insights and to build on them. The team reports regularly to senior managers, who decide which concepts to pursue further. A company spokeswoman adds that 40% of revenue in 2007 resulted from innovations launched in the prior three years.


    Innovation can bubble up inside a company as well—when the organization follows practices that favor it.

    Clear policies that reserve blocks of time for scientists or engineers to explore their own ideas have worked well at some companies. At 3M Co., based in St. Paul, Minn., scientists can spend 15% of their time on projects they dream up themselves, and the company has set procedures to take bright ideas forward, including grants and venture funding. Google Inc. takes a similar approach, allowing researchers to devote 20% of their schedules to play time, pursuing their own ideas and projects. The company credits this policy with fostering many of its important product innovations, including Gmail, its popular Web-based email service.

    It helps to have an established pathway to make sure the best new ideas get taken forward. In some cases, informal networking has pushed innovations to the forefront—below the radar screen of formal corporate systems. BMW, for example, has experience with what it calls “U-boat” projects, which run along below the surface of formal management approval. The Series 3 Touring car came into being not because of a formal product plan but as a consequence of efforts below the radar screen. The team responsible often worked at night, and welded together a prototype made from whatever bits they could scavenge.


    Sometimes innovations arise when different departments talk to each other. But what’s the best way to start the conversation?

    Many companies set up so-called communities of practice, which are typically internal Web sites where employees are encouraged to share knowledge and skills important to the company.

    A British company, meanwhile, has taken the idea a step further. To encourage more interactions and exchanges of ideas, the U.K.-based engineering-services company Arup Group has developed something it calls a “knowledge map” depicting the company’s areas of expertise and how workers and departments are connected to one another in terms of information flows.
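Arup's actual tool is not publicly specified, but the core of such a knowledge map can be sketched as a bipartite mapping between people and areas of expertise, which answers the two questions a map like this serves: who knows about a topic, and which workers are connected. The names below are invented for illustration.

```python
# Hypothetical sketch of a "knowledge map": a bipartite mapping between
# people and areas of expertise. Names are invented; this is not Arup's tool.
from collections import defaultdict

expertise = {
    "alice": {"acoustics", "structures"},
    "bob":   {"structures", "fire safety"},
    "carol": {"acoustics", "facades"},
}

def build_map(expertise):
    """Invert person -> areas into area -> people, the map's main lookup."""
    areas = defaultdict(set)
    for person, topics in expertise.items():
        for topic in topics:
            areas[topic].add(person)
    return dict(areas)

def connected(expertise, a, b):
    """Two people are connected if they share at least one area of expertise."""
    return bool(expertise[a] & expertise[b])

kmap = build_map(expertise)
print(sorted(kmap["structures"]))              # ['alice', 'bob']
print(connected(expertise, "alice", "carol"))  # True (shared: acoustics)
```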


    Close, long-term relationships—depending too much on the same customers, partners or suppliers for innovation ideas—can reinforce old ways of doing things and make changing a frame of reference difficult.

    Some companies seek innovation partners with whom they wouldn’t normally work, and who might bring a fresh perspective. Doctors at the Great Ormond Street Hospital for Children in London, for example, consulted with members of a pit-stop crew from Italy’s Ferrari Formula One motor-racing team to explore ways of improving how children were being moved out of heart surgery and into intensive care.

    Some companies are also recruiting staff with very different perspectives to spice up their knowledge mix. The Danish enzyme maker Novozymes actively recruits experienced entrepreneurs. Such characters aren’t afraid to challenge corporate perspectives and to make waves. As one manager put it, they create a little grit to stimulate the oyster to produce pearls.

    –Dr. Bessant is a professor at the Imperial College Business School in London. Dr. Möslein is a professor at the School of Business and Economics at the University of Erlangen-Nuremberg in Nuremberg, Germany. Dr. von Stamm is director and catalyst at the Innovation Leadership Forum in North Wootton, England. They can be reached at

    Why Inflation Isn’t the Danger (in the U.S.)

    June 21, 2009

    While the usual commentators here in Brazil are busy wondering whether COPOM, the Central Bank’s Monetary Policy Committee, will cut the interest rate further or not, in the U.S. the debate is over whether the measures adopted to overcome the crisis will bring inflation.

    In a piece published yesterday in The New York Times, Prof. Alan Blinder gives his take on the question!


    Why Inflation Isn’t the Danger

    Published: June 20, 2009

    SOME people with hypersensitive sniffers say the whiff of future inflation is in the air. What’s that, you say? Aren’t we experiencing deflation right now? The answer is yes. But, apparently, for those who are sufficiently hawkish, the recent activities of the Federal Reserve conjure up visions of inflation.



    The central bank is holding the Fed funds rate at nearly zero and has created a mountain of bank reserves to fight the financial crisis. Yes, these moves are unusual, but these are unusual times. Concluding that the Fed is leading us into inflation assumes a degree of incompetence that I simply don’t buy. Let me explain.

    First, the clear and present danger, both now and for the next year or two, is not inflation but deflation. Using the 12-month change in the Consumer Price Index as the measure, inflation has now been negative for three consecutive months.

    It’s true that falling oil prices, now behind us, were the main reason for the deflation. Core C.P.I. inflation, which excludes food and energy prices, has been solidly in the range of 1.7 percent to 1.9 percent for six consecutive months. But history teaches us that weak economies drag down inflation — and ours will be weak for some time. Core inflation near zero, or even negative, is a live possibility for 2010 or 2011.

    Ben S. Bernanke, the Fed chairman, is a keen student of the 1930s, and he and his colleagues have been working overtime to dodge the deflation bullet. To this end, they cut the Fed funds rate to virtually zero last December and have since relied on a variety of extraordinary policies known as quantitative easing to restore the flow of credit.

    These policies basically amount to creating new bank reserves by either buying or lending against a variety of assets. But quantitative easing is universally agreed to be weak medicine compared with cutting interest rates. So the Fed is administering a large dose — which is where all those reserves come from.

    The mountain of reserves on banks’ balance sheets has, in turn, filled the inflation hawks with apprehension. But their concerns are misplaced. To understand why, start with the basic economics of banking, money and inflation.

    In normal times, banks don’t want excess reserves, which yield them no profit. So they quickly lend out any idle funds they receive. Under such conditions, Fed expansions of bank reserves lead to expansions of credit and the money supply and, if there is too much of that, to higher inflation.

    In abnormal times like these, however, providing frightened banks with the reserves they demand will fuel neither money nor credit growth — and is therefore not inflationary.
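The mechanism behind these two cases can be put in simple arithmetic. A minimal sketch, using the standard textbook deposit-multiplier model; the 10% reserve ratio and the $100 billion injection are illustrative assumptions, not figures from the article:

```python
def money_multiplier(reserve_ratio: float) -> float:
    """Textbook deposit multiplier: each reserve dollar, lent out and
    redeposited repeatedly, supports 1 / reserve_ratio dollars of deposits."""
    return 1.0 / reserve_ratio

new_reserves = 100.0  # billions, an illustrative Fed injection

# Normal times: banks lend out excess reserves, so money and credit expand.
print(new_reserves * money_multiplier(0.10))  # 1000.0 -> inflationary if excessive

# Abnormal times: frightened banks simply hold the reserves,
# so neither money nor credit grows.
print(new_reserves)  # 100.0 -> not inflationary
```

The second case is the article’s point: reserves that sit idle on bank balance sheets never enter the multiplier process at all.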

    Rather, it’s more like a grand version of what the Fed does every Christmas season. The Fed always puts more currency into circulation during this prime shopping period because people demand it, and then withdraws the “excess” currency in January.

    True inflation hawks worry about that last step. (Did someone say, “Bah, humbug”?) Will the Fed really withdraw all those reserves fast enough as the financial storm abates? If not, we could indeed experience inflation. Although the Fed is not infallible, I’d make three important points:

    The possibilities for error are two-sided. Yes, the Fed might err by withdrawing bank reserves too slowly, thereby leading to higher inflation. But it also might err by withdrawing reserves too quickly, thereby stunting the recovery and leading to deflation. I fail to see why advocates of price stability should worry about one sort of error but not the other.

    The Fed is well aware of the exit problem. It is planning for it, is competent enough to carry out its responsibilities and has committed itself to an inflation target of just under 2 percent. Of course, none of that assures us that the Fed will hit the bull’s-eye. It might miss and produce, say, inflation of 3 percent or 4 percent at the end of the crisis — but not 8 or 10 percent.

    The Fed will start the exit process when the economy is still below full employment and inflation is below target. So some modest rise in inflation will be welcome. The Fed won’t have to clamp down hard.

    SKEPTICAL? Then let’s see what the bond market vigilantes really think.

    The market’s implied forecast of future inflation is indicated by the difference between the nominal interest rates on regular Treasury debt and the corresponding real interest rates on Treasury Inflation Protected Securities, or TIPS. These estimates change daily. But on Friday, the five-year expected inflation rate was about 1.6 percent and the 10-year expected rate was about 1.9 percent. Notice that the latter matches the Fed’s inflation target. I don’t think that’s a coincidence.
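The implied-inflation forecast described above is just a yield spread. A minimal sketch; the nominal and TIPS yields below are assumptions chosen to reproduce the article’s 1.6% and 1.9% breakevens, not actual market quotes:

```python
def breakeven_inflation(nominal_yield_pct: float, tips_real_yield_pct: float) -> float:
    """Market-implied average inflation expectation, in percent:
    nominal Treasury yield minus the real yield on TIPS of the same maturity."""
    return nominal_yield_pct - tips_real_yield_pct

# Illustrative (assumed) yields matching the article's June 2009 breakevens:
print(breakeven_inflation(2.8, 1.2))   # ~1.6% expected over five years
print(breakeven_inflation(3.8, 1.9))   # ~1.9% expected over ten years
```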

    But if the inflation outlook is so benign, why have Treasury borrowing rates skyrocketed in the last few months? Is it because markets fear that the Fed will lose control of inflation? I think not. Rising Treasury rates are mainly a return to normalcy.

    In January, the markets were expecting about zero inflation over the coming five years, and only about 0.6 percent average inflation over the next decade. The difference between then and now is that markets were in a panicky state in January, braced for financial Armageddon; they have since calmed down.

    My conclusion? The markets’ extraordinarily low expected inflation in January was both aberrant and worrisome — not today’s. As long as expected inflation doesn’t rise much further, you should find something else to worry about. Unfortunately, choices abound.

    Alan S. Blinder is a professor of economics and public affairs at Princeton and former vice chairman of the Federal Reserve. He has advised many Democratic politicians.

    The Cloud as Innovation Platform: Early Examples

    June 19, 2009

    The topic of cloud computing is indeed gaining momentum. Below is a Computerworld post, which appeared on the blog, that gives a small idea of the initiatives under way in this area!

    The Cloud as Innovation Platform: Early Examples

    Bernard Golden
    June 18, 2009, 18:00 | IDG News Service
    I am privileged to serve as co-chair of the Cloud Services SIG for a Silicon Valley-based non-profit, the SDForum, which is a great resource for technologists, entrepreneurs, and investors to meet and investigate new technologies. We’ve been running the SIG since January, and it’s been a great experience to see what people are doing with cloud computing (if you’re located in Silicon Valley, please come to one of our meetings; you’ll enjoy it and learn a lot). In addition, just by virtue of being located in Silicon Valley, I get the opportunity to see lots of great new technologies – like yesterday, when I attended the Amazon Web Services Start-Up Event at the Plug and Play Tech Center in Sunnyvale.

    It is striking how companies are leveraging cloud computing to create new products or services. I thought I would write about a few of them this week to give some insight into how people are taking advantage of the characteristics of cloud computing.

    Big Data: As you know, I am a big believer in the big data theme – that organizations are moving beyond transactions and into relationships and content, thereby exponentially increasing the amount of data under storage – and requiring much more (and deeper) analytics. Moreover, the traditional tools used to manage data, both from a pure storage perspective as well as a tool perspective (i.e., database engines, etc.), don’t scale very well, either technically or economically. At the Cloud Services SIG last month, several companies presented, discussing how they integrate with the cloud to better address the big data problem.

    First off, we had a Google representative, who discussed Google Datastore, which is a robust key/value pair storage mechanism designed to provide massive scalability. While not offering the extensibility or flexibility of a relational database, Google Datastore addresses common cloud storage requirements, which are typically very large amounts of relatively simple data.

    We then heard from Cloudera, which distributes a supported Hadoop distribution. Hadoop is a great tool to enable parallel processing of very large amounts of data with an aim of performing a relatively simple operation on some portion of the data. Hadoop, which I wrote about a few months ago, has a distributed file system that is redundantly spread throughout a set of servers, which is used to store and retrieve data. A query is launched against the data store, executing a map/reduce function, which then performs some operation on the resulting data set. Hadoop is widely used for very large data sets that outstrip the capacity limits of traditional databases and data warehouses. Incidentally, as I mentioned in my earlier blog piece, Amazon offers Hadoop functionality directly as an AWS offering.
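The map/reduce pattern described above can be shown with a minimal single-machine word-count sketch. Hadoop runs these same two phases distributed across a cluster and its redundant file system; this toy version only illustrates the shape of the computation:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (key, value) pairs -- here, (word, 1) for each word seen."""
    for line in records:
        for word in line.split():
            yield (word, 1)

def reduce_phase(pairs):
    """Reduce: group pairs by key and combine their values -- here, by summing."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}

data = ["big data big cloud", "cloud data"]
counts = reduce_phase(map_phase(data))
print(counts)  # {'big': 2, 'data': 2, 'cloud': 2}
```

In Hadoop, the map tasks run in parallel next to the data blocks, and a shuffle step performs the grouping before the reducers run; the per-record logic is the same.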

    The last speaker of the night represented Aster Data, a parallel database company whose product is self-organizing (that is to say, if a new server is put into the parallel pool, the data automagically repartitions itself without need for manual intervention). Aster Data can be run in a cloud environment – indeed, with its ability to incorporate new servers, it is particularly well-suited to one. The product also includes map/reduce functionality, which gives a developer great flexibility to decide at run-time which type of query is best for a particular task. By the way, at next week’s Cloud Services SIG, Greenplum will present. Greenplum is somewhat analogous to Aster Data, and recently announced a cloud product oriented toward enabling organizations to build an internal cloud of data warehouse capacity that can be doled out as needed. To give an idea of how large the data sets companies are now addressing, eBay uses Greenplum to manage a 6-petabyte data warehouse with 17 trillion (!) rows (that’s a lot of Canned Cloud – read down the page a bit).

    Turning to the Amazon event, four Amazon customers presented and discussed their use of cloud computing (my discussion of the following is from notes and memory, as the slides are not yet available). One company was ShareThis, which allows people to share interesting content with friends and colleagues. ShareThis keeps track of all the share events (one might think of these as transactions – Person A shares Content X with Person B, there’s three data elements to keep track of and aggregate for statistics). The numbers of events ShareThis has in its data store is mind-boggling; it uses Amazon SimpleDB to track all of them.
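The share event described (Person A shares Content X with Person B) maps naturally onto a record-and-aggregate pattern. A minimal sketch; the field names are my assumptions, and at ShareThis’s scale SimpleDB would hold the events rather than an in-memory list:

```python
from collections import Counter

# Each share event carries the three data elements the article mentions.
events = [
    {"sharer": "alice", "content": "article-42", "recipient": "bob"},
    {"sharer": "alice", "content": "article-42", "recipient": "carol"},
    {"sharer": "dave",  "content": "video-7",    "recipient": "bob"},
]

# Aggregate statistics per content item and per sharing user.
shares_by_content = Counter(e["content"] for e in events)
shares_by_user = Counter(e["sharer"] for e in events)

print(shares_by_content.most_common(1))  # [('article-42', 2)]
print(shares_by_user["alice"])           # 2
```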

    The second company was really interesting – Pathwork Diagnostics, a biotech firm that uses Amazon to evaluate oncology diagnostics. The presenter said that they run large Hadoop-based queries on 240 Amazon EC2 instances for a couple of days, and then shut them down. This is another instance of big data that could not be easily processed in traditional fashion.

    Next up was SmugMug, which offers a photo upload and sharing service. Another big data story, but with a twist. SmugMug’s data challenge is not a searching issue; rather it is a physical capacity issue. Digital photos aren’t necessarily that large, but in the quantities that SmugMug deals with, total storage requirements are immense. SmugMug relies on Amazon’s S3 service for storage. The presenter, a former data center ops guy, was asked if he didn’t miss having his own equipment. He seemed to sigh for a moment, reminiscing (it appeared to me) about the good old days of racks of equipment, then quickly shuddered when he thought about how much equipment would be necessary. He also mentioned that, since SmugMug is self-funded, investing in large amounts of equipment would be cost-prohibitive.

    The last, but certainly not least, speaker represented Netflix. Yes, the DVD-by-mail company. Except it isn’t just DVDs-by-mail anymore. You can view a significant part of the Netflix inventory online – and Amazon is used as part of that process. Every digital video object must be encoded to run in the Netflix viewer, since the native format is not supported (nor secure) for remote viewing. Netflix leverages Amazon EC2 instances to perform that encoding.

    A couple of questions were posed at the Amazon event regarding the cost of running on the Amazon service. Since the standing criticism of external clouds is that they must be more expensive than internal data resources, the questions make sense. Because several of these companies operate at very large scale, one might think that whatever crossover point exists, at which internal resources become less expensive than outside resources, must have been crossed. According to the presenters, using Amazon still made economic sense, even at the scale of computing they were implementing. For SmugMug, attempting to obtain and manage the resources necessary to store all the digital assets under management would be prohibitive (the presenter said that SmugMug has only around fifteen employees, so employing enough people to manage enough hardware to store all the assets isn’t possible in any reasonable economic scenario). The presenter from Pathwork noted that the alternative for his company wouldn’t be 240 servers, since they couldn’t afford them; it would be two or four servers. The tradeoff is that the jobs would take weeks instead of hours, which for a startup seeking competitive advantage is unacceptable.
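Pathwork’s tradeoff reduces to throughput arithmetic over a fixed amount of compute. A rough sketch; apart from the 240-instance figure, the numbers are illustrative assumptions:

```python
def job_duration_days(total_instance_hours: float, servers: int) -> float:
    """Days needed to finish a fixed compute job on a fleet running 24h/day."""
    return total_instance_hours / (servers * 24.0)

# Assume the run needs 240 instances for 2 days, as the article describes:
work = 240 * 2 * 24  # 11,520 instance-hours in total

print(job_duration_days(work, servers=240))  # 2.0 days on the rented cloud fleet
print(job_duration_days(work, servers=4))    # 120.0 days on four in-house servers
```

Months of wall-clock time instead of days is exactly the competitive penalty the presenter described; the cloud lets a startup rent the whole fleet briefly instead of owning a small one permanently.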

    The applications described in this piece represent products designed to take advantage of cloud computing characteristics. Instead of people creating the same type of apps they would have done in the resource-constrained world of an internal data center, they leverage the scalability and ability to shut off capacity when it’s no longer required. This approach – creating applications designed for cloud environments instead of hosting data center-oriented apps in a cloud environment – is what we mean when we discuss with clients the importance of creating cloud applications, not just putting applications in the cloud. When thinking about the cloud, it’s important not to consider it as just a data center with a different IP address; it provides the opportunity for computing-based innovation.

    Bernard Golden is CEO of consulting firm HyperStratus, which specializes in virtualization, cloud computing and related issues. He is also the author of “Virtualization for Dummies,” the best-selling book on virtualization to date.

    Cloud Computing Seminars

    HyperStratus is offering three one-day seminars. The topics are:

    1. Cloud fundamentals: key technologies, market landscape, adoption drivers, benefits and risks, creating an action plan
    2. Cloud applications: selecting cloud-appropriate applications, application architectures, lifecycle management, hands-on exercises
    3. Cloud deployment: private vs. public options, creating a private cloud, key technologies, system management

    The seminars can be delivered individually or in combination. For more information, see

    His Excellency Was Not Convincing

    June 17, 2009

    Since I am deeply outraged by what is happening in Brazil’s Senate, I reproduce here an article from late yesterday by political scientist Lúcia Hippolito, on her blog!

    In short: “Out, Sarney!”


    Sarney’s Defense

    His Excellency was not convincing

    Very nervous, mangling the Portuguese language, the president of the Senate, Senator José Sarney, took the floor to defend himself against criticisms that, according to him, are deeply unjust and fail to respect his biography.

    He was not convincing. He listed various facts from his biography. He spoke of his 50 years in public life, and mixed events from the dictatorship with his own actions as President of the Republic.

    He absolved himself of any and all responsibility for the complete demoralization the Senate of the Republic is going through. He repeated countless times that the crisis is not his, it is the Senate’s.

    I am sorry, but Senator José Sarney is the person most responsible for the crisis.

    This is not about denying or erasing the distinguished parliamentarian’s biography. Far from it. Rewriting the past was the business of Soviet historians. José Sarney’s history is well known.

    What is most curious about the nearly half-hour speech is José Sarney’s total refusal to engage with the last ten or fifteen years of the Senate’s history. Sarney spoke as if he had arrived at the presidency of the House only yesterday.

    As if he were not presiding over the Senate for the third time. As if he were not personally responsible for creating some 50 of the 181 directorships recently discovered in the House.

    As if he were not personally responsible for appointing Agaciel Maia as the Senate’s director-general. As if he had not legitimized a series of acts by Agaciel Maia and by the director of Human Resources, João Carlos Zoghbi.

    It is no trivial thing to privatize the Senate the way Senator José Sarney did. Until just the other day he had a grandson and two nieces on the payroll. He collected a housing allowance while owning a private residence in Brasília and having at his disposal, since February, the Senate’s official residence.

    His campaign strategist was also a Senate director. Removed from her post to work on the campaign, she then had the removal cancelled (all through confidential documents).

    His house in São Luís was protected by Senate security guards… even though he is a senator for Amapá.

    Last week, His Excellency was a sponsor at the wedding of the daughter of Agaciel Maia, the Senate’s former director-general. Who now, after rendering relevant services to Senator Sarney, is being thrown to the lions. By Senator Sarney.

    Senator José Sarney has no right to claim that the crisis is not his.

    When he tries to dilute the Brazilian Senate’s crisis into the broader crisis of representation affecting many parliaments around the world, the senator is attempting a clever maneuver.

    It is true that there are crises in other countries, but there the parliamentarians resign, apologize publicly, and return the diverted money. Some even kill themselves.

    No radical gesture is expected from Senator Sarney. Not even his resignation from the Senate presidency will come of his own free will.

    But the mood of rebellion among Senate staff is evident. So is the strong reaction of public opinion.

    Senator José Sarney once hired the Fundação Getúlio Vargas to diagnose the Senate’s situation and propose measures. It worked out fine: nothing happened.

    This time, he has repeated the maneuver. But I strongly suspect it will not work.

    Times have changed, Your Excellency.

    Why No One Can Guess When Main Street Recovery will Occur

    June 16, 2009

    Here we have an article by one of the greatest names in world economics, Professor Paul Samuelson, winner of the 1970 Nobel Prize in Economics. From the height of his 94 years, Prof. Samuelson issues a serious warning about the state of the American economy.

    His concern is strongly focused on the U.S.-China relationship. His article below appeared on the blog, and was today’s recommendation from Prof. Greg Mankiw’s blog!


    Why No One Can Guess When Main Street Recovery will Occur
    Paul A. Samuelson

    Henry Ford said, “History is bunk.” Even more cynically, Napoleon said, “History is a set of fables agreed upon.”

    Both had a point.

    But back in the early 1930s, during the Great Depression, President John F. Kennedy’s father, Old Joe Kennedy, made two fortunes betting that stocks would keep falling and unemployment would keep growing.

    He disbelieved in early New Deal recoveries.

    By contrast, the leading U.S. economist at Yale, Professor Irving Fisher, after (1) marrying a fortune; and (2) earning a second fortune by inventing a profitable visual filing system, nevertheless ended up losing no less than three fortunes!

    The story of these two opposites illustrates how and why economics can never be an exact science.

    Joseph Kennedy Sr. was a tough and crafty speculator. Apparently he sold stocks short from 1929 to, say, 1931 or 1932.

    Professor Fisher early on first lost his own fortune when the stocks he bought went bust. So he restudied the Wall Street and Main Street statistics.

    Admitting that he had been too optimistic, Fisher wrote a new book. In it he admitted his previous error. But his new book said: The stock market is now a bargain.

    Alas, his heiress wife’s assets collapsed under this guidance. Stubborn Fisher persisted in his optimism. He went on to advise his sister-in-law, the president (I believe) of Wellesley College, to stay with stocks! This time she balked and fired him as investment adviser.

    While Fisher was going broke, Joe Kennedy persevered by selling short the stocks that still were falling. No paradox.

    However, as the New Deal recovery program finally began to succeed, Kennedy Sr. left the stock market and bought the Chicago Merchandise Mart Building — the biggest structure in the world at that time.

    Which speculator was right? And which was wrong between these two well-informed giants? No sage can answer that question.

    Today, Federal Reserve Chairman Ben Bernanke glimpses a possible recovery by year’s end.

    He is a cautious scholar, backed by the best forecasters in the world at the Federal Reserve Board.

    I would be a rash fool to quarrel with this official’s quasi-optimistic view that by year’s end some stability will occur.

    You and I should hope that there will indeed be a glimmer of light at the end of the tunnel ahead.

    But shift our vision now to the future.


    Even if the short run prospect for a 2009-2010 recovery turns out to be good, I must warn once again that the long-run outlook for the U.S. dollar is hazardous.

    China is the new important factor.

    Up until now, China has been willing to hold her recycled resources in the form of lowest-yield U.S. Treasury bills. That’s still good news. But almost certainly it cannot and will not last.

    Some day — maybe even soon — China will turn pessimistic on the U.S. dollar.

    That means lethal troubles for the future U.S. economy.

    When a disorderly run against the dollar occurs, I believe a truly global financial panic is to be feared. China, Japan and Korea now hold dollars not because they think dollars will stay safe.

    Why then? They do this primarily because that is a way that can prolong their export-led growth.

    I am not alone in this paranoid future balance-of-payment fear.

    Warren Buffett, for one, has turned protectionist. Alas, protectionism may well soon become more maligned.

    President Obama struggles to support free trade. But as a canny centrist president, he will be very pressed to compromise.

    And he will be under new chronic pressures. His experts should right now be making plans for America to become subordinate to China where world economic leadership is concerned.

    The Obama team is a good one.

    But will they act prudently to adjust to America’s becoming the secondary global society?

    In the chess game of geopolitics between now and 2050, much stormy weather will take place. Now is the time to prepare for what the future will likely be.



    Out, Sarney!

    June 15, 2009

    I do not usually post regularly about Brazilian politics on this blog, but the state of this country’s Senate, as well as the brazen privatization of that public space now under way, leads me to express my total indignation at what is happening.

    Since all of this involves a group of politicians who have, at the very least, only helped drag the image of the National Congress through the mud, and since it involves politicians like Senator José Sarney, I take this opportunity to express my total indignation by reproducing here the article that best represents my feelings at this moment: today’s piece by journalist Ricardo Noblat!

    So: “Out, Sarney!”


    Out, Sarney!


    Every now and then we read about a Japanese politician who killed himself after being accused of corruption.

    The most recent was Toshikatsu Matsuoka, Minister of Agriculture, in May 2007. He accepted a bribe from a businessman and claimed reimbursement for expenses that had always been covered by his office.

    Facing prosecution and possibly prison, he chose to hang himself.

    Next Sunday will be a sadly historic day for England. For the second time ever, a Speaker of the House of Commons, the equivalent of our Chamber of Deputies, will resign from the post, accused of misconduct. The first to resign was Sir John Trevor, in 1695. His crime? Pocketing money from a merchant in exchange for supporting the passage of a law.

    Michael Martin, 63, Speaker of the House of Commons for nearly ten years, sold himself to no one and took no illicit advantage of the office. But he was complicit with colleagues who did.

    MPs entitled to an allowance to cover housing in London obtained reimbursement for expenses such as repairing tennis courts, cleaning septic tanks, and buying massage chairs and flat-screen television sets. The boldest even billed for pornographic film rentals.

    The compliant Martin endorsed the abuses. Once the press uncovered them, he tried to cover them up. When that proved impossible, he asked the police to help identify the journalists’ sources. The police did not lift a finger.

    In the end, Martin gave in. He will follow the example set by Trevor 314 years ago.

    Here we have already watched presidents of the Chamber and of the Senate resign while entangled in accusations of breach of decorum. That was the case of Severino Cavalcanti, president of the Chamber. And of Jader Barbalho, Antonio Carlos Magalhães and Renan Calheiros, presidents of the Senate.

    Unlike Trevor in the past, and Martin now, they did not leave office driven by a sense of shame. They resigned to avoid being stripped of their mandates. It was a shameless act. That way they preserved their political rights and returned to Congress re-elected.

    José Sarney has been in the eye of the hurricane sweeping the Senate ever since he was elected last February to preside over it for the third time. The first time was in 1995.

    What is rotten in the Senate is not his doing alone. A Senate president cannot do everything, much less by himself.

    But it is a mockery for Sarney to keep pretending he has nothing to do with the gravest crisis in the Senate’s history. He does not merely have something to do with it: Sarney is the person most responsible for it. The seed of the crisis was planted during his first term as Senate president.

    “I can only thank Dr. Agaciel Maia for the relevant services he has rendered,” Sarney said as he bade farewell to the Senate’s former director-general, ousted from the post because of the crisis.

    Agaciel was appointed by Sarney. Over 14 years, he accumulated power and committed all manner of abuses with the explicit or tacit consent of Sarney and of those who succeeded him at the head of the Senate.

    Last week, to the theme music from “The Godfather,” Agaciel married off his daughter Mayanna under the blessings of Sarney, Renan Calheiros and two other former Senate presidents, Garibaldi Alves and Edison Lobão.

    Beyond the bloated Senate payroll, the payment of overtime never worked, the creation of phantom directorships, the approval of suspect procurement processes and the signing of secret decrees, there are facts that concern Sarney directly and leave him looking bad.

    Owner of a property in Brasília and occupant of the mansion reserved for the president of the Senate, Sarney received for more than a year a monthly housing allowance of R$ 3,800.00 meant for senators without a roof of their own.

    Caught out, he first denied receiving it. Then he borrowed Lula’s refrain and said he did not know.

    A 22-year-old grandson of Sarney served for more than a year as an aide to Senator Epitácio Cafeteira (PTB-MA). It was the way Cafeteira found, as he himself admitted, to thank the young man’s father for bringing him back into Sarney’s good graces.

    One of Sarney’s nieces is assigned to the former Senate office of his daughter Roseana Sarney, now governor of Maranhão. And another is employed in the office of Senator Delcídio Amaral (PT-MS) in Campo Grande. The latter is paid without working.

    Is it possible to believe that the father of the crisis is truly committed to resolving it? Or that he is in any position to do so? And who says his peers are interested in refounding the Senate?

    At this point, only one thing truly depends on Sarney: resigning the presidency of the Senate to lessen the recent stains on his biography.
