Archive for February 2010

Asset-driven Enterprise Architecture

February 24, 2010

A post from the blog http://advice.cio.com!

==============

Tue, Feb 23, 2010 22:02 EST
Asset-driven Enterprise Architecture

Should Enterprise Architecture be driven by asset-preservation goals?

Posted by: Simona Lovin in Questions

In his e-book “Primer for Enterprise Engineering and Manufacturing”, Zachman makes an interesting distinction between make-to-order engineering (aka job-shop design, where each project is viewed and treated as single-purpose) and assemble-to-order engineering (by which he means the mass customization of architecture practices).

In Zachman’s view, the assemble-to-order architectural paradigm is the way to go: it presupposes the development of reusable, primitive models for each cell of the Zachman framework; it is process-oriented and integration-oriented; its inherent reusability orientation promotes faster deployment times; and it is asset-based.

Generally speaking, the main features advertised by Zachman’s assemble-to-order architecture have counterparts in today’s enterprise and segment architectures. Enterprise architectures – sliced and diced into segment architectures – are multi-layered, model-driven concoctions anchored to a solid foundation of business processes which, at least in principle, support the identification of potential redundancies (be it in terms of investments, system capabilities, data, or technologies). What stands out in Zachman’s paradigm is the focus on assets as a driver for architectural priorities.

Which triggers my question: does anyone out there use asset preservation (as opposed to cost avoidance or cost reduction) as the criterion for identifying high-priority segments or areas to be architected?

If so, what industry are you in and what do you consider to be your organization’s primary assets?

SAP’s Business Objects Problems Go Beyond Gartner’s Quadrant

February 21, 2010

Who said that corporate mergers & acquisitions are simple things? Here is some evidence to the contrary!

It is a post from http://www.itbusinessedge.com!

==========

SAP’s Business Objects Problems Go Beyond Gartner’s Quadrant
Posted by Ann All Feb 19, 2010 3:03:18 PM

Gartner has long displeased vendors who don’t land in prime spots on its Magic Quadrant. ZL Technologies even sued Gartner in October, claiming paying clients received favorable treatment during the analyst company’s quadrant selection process. (The suit was dismissed a month later.)

Admittedly, I am not the most connected writer in business technology publishing. I confess I hadn’t even heard of ZL Technologies before the suit was filed. But now mega-vendor SAP is taking public umbrage with part of the Magic Quadrant assessment for BusinessObjects, its business intelligence product.

As InformationWeek reports, SAP/BusinessObjects was ranked second to IBM on Gartner’s “completeness of vision” axis, but behind a bunch of other vendors (Oracle, Microsoft, IBM, SAS, and MicroStrategy, to be exact) for its “ability to execute.” SAP was so peeved it commissioned a blind survey of 24 CIOs, conducted by the Gerson Lehrman Group.

But maybe SAP protesteth too much? Strativa Managing Partner Frank Scavo, writing on his Enterprise System Spectator blog, says SAP’s complaint simply brings more attention to the legitimate issue of a badly handled transition for customers of BusinessObjects, the BI specialist SAP acquired in 2007. Executive board member John Schwarz, who oversaw the BusinessObjects business for SAP, just left the company. Uh-oh.

To be fair, both Gartner and Scavo point out that SAP competitors IBM and Oracle suffered transition problems following their acquisitions of BI companies Cognos and Hyperion. But SAP’s problems, which IT Business Edge’s Loraine Lawson wrote about in the summer of 2008, sound more serious than the usual merger-and-acquisition bumps. She cited an InfoWorld report that many BusinessObjects customers did not receive their IDs to access a newly integrated support system. According to the article, SAP lacked e-mail addresses for some customers, had outdated contact information for others, and had to fall back on regular mail. As Loraine wrote:

You would think a business intelligence company would keep better records, but apparently you’d be wrong.

Scavo’s take on SAP’s Gartner complaint:

The fact that BusinessObjects customers are still having support problems — two years after they found themselves in SAP’s customer base — suggests that SAP should spend less time trying to disprove Gartner’s findings and more time getting its own support systems and processes in order. 

Accenture and SAS Partner on Analytics Solutions

February 18, 2010

This will be the year of Analytics! That is how I opened Creativante’s newsletters this year. And the market is responding!

The article below, from InformationWeek, appeared today on http://www.itweb.com.br!

================ 
Accenture and SAS Partner on Analytics Solutions

by Katherine Burger | InformationWeek USA

02/18/2010

The companies’ offering will focus on the financial services, healthcare, and government verticals

Accenture and SAS announced the formation of the Accenture SAS Analytics Group as a way to expand their strategic relationship and to jointly develop, implement, and manage a “next generation” of predictive analytics solutions. The companies’ goal is to help businesses and government agencies understand and deploy predictive analytics.

As part of the expansion, the two companies agreed to invest in developing solutions focused on specific industries; initially, the products will target the financial services, healthcare, and public sector verticals. The companies also plan to deliver analytics capabilities as a managed service.

“Our relationship with SAS is a key element of Accenture’s strategy to move aggressively on opportunities around the next generation of analytics solutions,” said William D. Green, chairman and CEO of Accenture, in a press release. “Companies that use predictive analytics in decision making can improve business performance and outpace the competition over the long term.”

Jim Goodnight, CEO of SAS, added: “In our experience providing analytics to our customers, we have never seen such readiness among company leaders to adopt this kind of solution. The Accenture SAS Analytics Group will help these companies take business results to the next level by applying the best predictive analytics solutions and the best decision-making practices.”

Initially, the Accenture SAS Analytics Group plans to develop a series of solutions for the financial services industry, including insurance and banking; for healthcare, such as pharmaceutical companies and health plan operators; and for government.

“During the economic downturn, companies succeeded in using data to produce competitive advantage,” said Dave Rich, a director at Accenture Analytics. “Now, these companies can use predictive analytics to enable faster decision making and to improve business performance.”

Internet Usage Statistics

February 17, 2010

Saturday’s post from Prof. Mark Perry’s blog (http://mjperry.blogspot.com/)!

========

Internet Usage Statistics

The Fourth Paradigm – Data-Intensive Scientific Discovery

February 17, 2010

I have just seen on Otavio Coelho’s blog (http://blogs.msdn.com/otavio/) a link to a new Microsoft Research book (from October 2009), titled The Fourth Paradigm – Data-Intensive Scientific Discovery, available for download, which deals with a new reality of science today.

In Otavio’s own words:

“Jim Gray, one of the developers of System R in the early days of relational databases, of the WorldWide Telescope, and other projects, author of one of the best computing books I have ever read (Transaction Processing: Concepts and Techniques) and a Microsoft Fellow until 2007, when he suffered a tragic accident, put the problem this way: science is entering its fourth paradigm, which will rest on an IT that does not yet exist and that we have to build.

Historically, science has been going through the following paradigm shifts:

  1. Empirical science: scientists collected data from their direct observations and analyzed that information themselves, arriving at some rules of nature;
  2. Theoretical science: scientists build analytical models (formulas) consistent with the observations and begin to make predictions;
  3. Computational science: scientists add to their toolkit computers capable of simulating analytical models, validating them, and building predictions (e.g., simulating the collision of galaxies or the formation and death of a star);
  4. eScience: computers generate simulation data (demanding heavy computation), receive events acquired through instrumentation, store data in files or databases that are often distributed (demanding great storage capacity), and support scientists in capturing, organizing, summarizing, analyzing, and visualizing this monumental mass of data (more computation), making it feasible to find new correlations and analytical models capable of new predictions.
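The eScience pattern in item 4, where the data mass is reduced where it lives and only small summaries travel, can be sketched in miniature. The chunks below are invented stand-ins for distributed files, and `summarize`/`combine` are hypothetical names for the two halves of the pattern:

```python
# Toy sketch of the eScience pattern: each data chunk is reduced to a
# small local summary, and only the summaries are combined globally,
# so the monumental mass of data itself never has to move.

def summarize(chunk):
    """Local pass over one chunk: a count and a sum, cheap to merge."""
    return len(chunk), sum(chunk)

def combine(summaries):
    """Merge the per-chunk summaries into a global mean."""
    n = sum(count for count, _ in summaries)
    total = sum(subtotal for _, subtotal in summaries)
    return total / n

# Stand-ins for files scattered across a distributed store.
chunks = [[1.0, 2.0, 3.0], [4.0, 5.0], [6.0]]
print(combine([summarize(c) for c in chunks]))  # global mean of all values
```

Real systems (MapReduce-style frameworks, parallel data warehouses) industrialize exactly this split between a cheap local pass and a small global merge.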

In this new era, the computer plays the fundamental role of analyzing an ever-growing mass of information (today on the order of many petabytes, and expected to double over the next two years) and making it coherent.

Computing, used this way, would be comparable to Galileo Galilei’s experience with the telescope.

Companies will probably face a similar challenge. They, too, receive a whirlwind of data (market data, prices, sales, customer actions, etc.) and need to understand quickly what is happening in order to act correctly and in time.

Enormous investments have been made to handle this new volume of data. New technologies for OLAP cubes (see SQL Server 2008 R2 Parallel Data Warehouse), services for massive event collection (SQL streaming), and parallel computing, whether via HPC (Beowulf computing) or cloud computing (such as Azure), are examples of real alternatives today.

In establishing this 4th paradigm, Jim Gray seems to set a direction much as John Kennedy did for American science when he pointed to a man on the moon as the goal.

Who wins? Everyone. As in the space race, all of society receives its benefits, direct or indirect. But much can already be done now, and the direction is not that of one very large computer, but of a network of storage and parallel computing.

As for the book, I recommend reading the first article, by Jim Gray himself.”

That’s it! I downloaded it (more than 90 MB) and am going to read this interesting book!

Innovation Incentives Are ‘Illusory’, Says IEDI

February 12, 2010

A recent study by the Instituto de Estudos para o Desenvolvimento Industrial (IEDI) shows that innovation incentives in Brazil are illusory.

This study, produced by a reputable institution regarded as the think tank of Brazilian industry, takes some of the shine off the government’s boast that under its watch the country has “never invested so much in innovation”!

More about the study can be found at http://www.iedi.org.br!

==================

Innovation Incentives Are ‘Illusory’, Says IEDI
O Estado de São Paulo – 08/02/2010

About two thirds of the funds the government counts as incentives for research and development come from the Lei de Informática (Informatics Law)

Marcelo Rehder

A priority of the Lula administration’s Productive Development Policy (PDP), innovation is still an activity restricted to a small number of companies in Brazil. Of a total of 4.4 million companies across different segments, only 30,000 declare themselves innovators and only 6,000 carry out research and development (R&D) activities, according to the Instituto de Estudos para o Desenvolvimento Industrial (Iedi). For the institute, Brazil’s system of incentives is not effective enough at leveraging private R&D spending to radically change the country’s innovation picture.

“Brazil needs to build its innovative capacity in order to have a more sophisticated mix of industrial production and services. We will only manage to compete in the global market if we have a strong innovation base,” says Iedi’s president, Pedro Passos.

About two thirds of all the funds the government counts as incentives for private-sector R&D – some R$ 3.2 billion, in 2008 values – are largely illusory, an Iedi study argues. These are the funds that come from tax incentives and subsidies under the so-called Informatics Law (1991), the main mechanism of public support for private R&D effort.

In the institute’s assessment, the Informatics Law is much more a contingency of the need to balance the incentives granted in the Manaus Free Trade Zone against the tax reality of the rest of the country than a genuine R&D law. “If the incentive did not exist, production would migrate en masse to the Free Trade Zone or would be imported,” explains Iedi economist Rogério César de Souza.

“We have no criticism of the Informatics Law. We simply do not think it is correct to include it in the set of innovation incentives,” comments Júlio Sergio Gomes de Almeida, a former Secretary of Economic Policy at the Finance Ministry who contributed to the Iedi document. “It solves a problem, but not the innovation problem,” he stresses.

Considering all instruments, public support for private R&D spending is on the order of R$ 5.2 billion (2008 figure), equivalent to 0.18% of Gross Domestic Product (GDP), which would place Brazil among the countries that most incentivize innovation. Excluding the Informatics Law, however, the amount falls to around R$ 2 billion, or 0.07% of GDP, a percentage considered low compared with other countries, especially Brazil’s main competitors.
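The proportions quoted above are easy to check with a few lines of arithmetic (R$ figures in billions, 2008 values; the variable names are mine):

```python
total_support = 5.2            # R$ bn: all public support for private R&D (2008)
informatica_share = 3.2        # R$ bn: portion attributed to the Informatics Law

# GDP implied by the statement that R$ 5.2 bn equals 0.18% of GDP.
gdp = total_support / 0.0018   # roughly R$ 2,889 bn

without_informatica = total_support - informatica_share
print(f"Support excluding the Informatics Law: "
      f"{without_informatica / gdp:.2%} of GDP")  # about 0.07%, as the article says
```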

“With the Informatics Law, our level of incentives is comparable to France’s, above Japan’s and the United States’. Without it, it is Mexico’s,” says Gomes de Almeida.

Lula Calls for More Innovation

Last week, both President Lula and the president of the Banco Nacional do Desenvolvimento Econômico e Social (BNDES), Luciano Coutinho, called on business leaders to invest in innovation. “Now is the time to invest in technological innovation, and that will make all the difference for Brazil’s growth and development,” Lula said on Wednesday on his radio program “Café com o Presidente.”

Iedi’s president is of the same opinion. For him, the fact that Brazil still invests little in innovation is not just a government problem, but also one of mobilizing the private sector. “We need to create mechanisms that make it easier for the private sector to come in, but it also needs to embrace this cause and believe that it works and delivers results,” says Passos.

Corporate R&D spending reached R$ 15.16 billion in 2008, corresponding to 0.51% of GDP. To raise that share to 0.65%, as proposed by the current industrial policy, the way companies are incentivized would have to be revised, the Iedi document says. It would require government support to rise from 0.07% to 0.09% of GDP, an increase of 30%.

High Taxes Are the Biggest Obstacle to Research
High interest rates and a lack of financing also get in the way, says Fiesp

Companies want to innovate, but the cost is still considered high for the risk of the investment, according to a survey by the Federação das Indústrias do Estado de São Paulo (Fiesp). For companies, the main obstacles to investing in innovation stem from the imbalance in the exchange-rate/interest-rate/tax-burden trio.

Over the past two months, the federation heard from 334 companies, which listed the difficulties they face in investing in innovation. The most cited – by 59% of respondents – was the heavy tax burden on research and development (R&D) spending, followed by high interest rates and financing costs (58%) and the appreciation of the currency (55%).

“Serious problems persist related to the economic risks that negatively affect the country’s capacity for innovation,” says José Ricardo Roriz Coelho, director of Fiesp’s Department of Competitiveness and Technology, who coordinated the survey.

“Especially in the case of small and midsize companies, investing in something new for uncertain demand is a bold move that only a restricted group can pull off.”

Brapenta, a midsize company that makes inspection equipment and production-line metal detectors, has had to perform “gymnastics” to keep investing 10% of its revenue in research and innovation. “We export to 30 countries and find ourselves at a great disadvantage, fully exposed to crossfire from abroad,” says the company’s president, Martin Izarra.

According to him, the structural, bureaucratic, and economic difficulties that make investment in the country more expensive are so great that they inspired the company’s internal motto: “We have to be smarter than our competitors to make up for the Brazil cost.”

Roriz Coelho notes that although financing lines for innovation are a relatively recent phenomenon in the country, many advances can already be seen. In his assessment, however, financing needs to be made better suited to companies. “The financing structure for R&D activities needs to be better distributed and less concentrated in companies’ own funds,” he says.

In addition, the Fiesp director adds, the instruments for tax relief on innovation investments are today restricted to companies taxed under the lucro real (actual profit) regime, which represent only 10% of the country’s companies.

In that select group is Natura, the leader in the cosmetics sector. The company’s research investments reach R$ 120 million a year, equivalent to 3% of its net revenue. “This has sustained quite a strong rate of innovation,” says Pedro Paulo Passos, co-chairman of Natura’s board of directors, who also presides over the Instituto de Estudos para o Desenvolvimento Industrial (Iedi).

“More than 65% of our revenue comes from products launched in the past two years,” Passos explains. A good part of the new technologies under study at the company is in the hands of a network of partners that already comprises a hundred universities and researchers inside and outside Brazil.

The 4 Ways IT is Driving Innovation

February 11, 2010

A super-interesting interview with one of the most renowned specialists in the IT business field: Prof. Erik Brynjolfsson, of MIT (USA).

The original article (which can be read here) includes a video of the interview!

============= 
The 4 Ways IT is Driving Innovation

An Interview with Erik Brynjolfsson

February 4, 2010

MIT Sloan economist and digital-business expert Erik Brynjolfsson tells how the rising data flood, and emerging tools for analyzing it, are changing the ways innovation gets done.

It’s no surprise that, historically, the ways that companies either embrace technology or resist it—what IT innovation thinker Erik Brynjolfsson calls the gap between the leaders and the laggards—have mattered.

What’s new is that while the gap was fairly steady for decades, a real divergence started happening around 1995. Credit the rise of systems like ERP (enterprise resource planning), the expanded use of the Internet, and the fact that every dollar buys incrementally more computerization.

What Brynjolfsson has found is that not only has the gap between companies grown in the past decade, but the divergence between leaders and laggards has been greatest in IT-intensive industries. And what that suggests, he argues, is that the way companies use information technology—the way they measure, experiment with, share, and replicate it—has become critical.

In a conversation with MIT Sloan Management Review editor-in-chief Michael S. Hopkins, Brynjolfsson, the director of the MIT Center for Digital Business and the Schussel Family Professor at the MIT Sloan School of Management, talks about how smart companies have learned to tap the flood of data created by information technology and process it with what he calls a “higher information metabolism.”

Your research and work with the MIT Center for Digital Business focuses on the ways that information technology is linked to innovation. Let’s start with your big picture overview.

In the long run, our competitive advantage and all of our living standards depend on innovation, and I would argue that for our era, the most important driver of innovation is information technology. Thanks to Moore’s law, the adjusted power delivered, for instance, by computers has grown tremendously. That has directly led to quantifiable increases in productivity.

Information technology is also a catalyst for complementary changes: it’s what economists call a “general purpose technology,” one that sets off waves of complementary innovations in things like business processes, new ways of reaching customers, new ways of connecting to suppliers, and the internal organization of the firm. These complementary changes are often ten times as large as the initial investments in the IT itself and have profound and long-lasting effects on our ability to create goods and services.

But there’s a factor that has not been studied very much and, frankly, is not very well understood. And that is the possibility that IT can change the innovation process itself.

This is something that we haven’t seen much in the economic literature. But when I go and visit companies, I see it happening all the time in the 10 or 20 percent of businesses that are on the leading edge. And the way that they’ve been changing innovation is, I think, a harbinger for some more profound changes in the economy as a whole.

Is it that companies are using information technology to measure what they do in especially smart ways?

Yes, but it’s not just measurement. IT is setting off a revolution in innovation on four dimensions simultaneously: measurement, experimentation, sharing, and replication. Each of these is important in and of itself, but, more profoundly, they reinforce each other. They magnify the impact of each other. Improved measurement makes experimentation much more valuable, which in turn becomes more valuable still if you can share those results to the other locations. And, ultimately, if those results are important, you want to be able to scale those results up.

By doing all four of these changes together, companies are, in essence, creating a new kind of R&D.

Let’s go through those four one by one. The first, you’ve said, is measurement.

It’s more like radically improved measurement, through the use of what I call nano data. That includes clickstream data, Google trends, detailed email data—the billions and trillions of bits of information that are thrown off by enterprise planning systems. Even without any conscious effort on the part of the designers, this information is simply generated. But by studying these data very carefully, companies can have much better knowledge of their customers, of their business processes, of their product quality, and of defects in their supply chains. The field of business intelligence has been tapping into this explosion of data.
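At the smallest possible scale, mining this kind of nano data is just aggregation over raw event records. The sketch below, with made-up clickstream rows, turns events into per-page views and unique visitors; production pipelines do the same over billions of events:

```python
from collections import Counter

# Made-up clickstream records, shaped like web server log entries.
events = [
    {"user": "u1", "page": "/product/42", "ts": "2010-02-19T10:00:05"},
    {"user": "u1", "page": "/cart",       "ts": "2010-02-19T10:01:12"},
    {"user": "u2", "page": "/product/42", "ts": "2010-02-19T10:02:30"},
    {"user": "u2", "page": "/product/42", "ts": "2010-02-19T10:05:00"},
]

# Page views: every event counts.
views = Counter(event["page"] for event in events)

# Unique visitors per page: distinct users only.
visitors = {page: len({e["user"] for e in events if e["page"] == page})
            for page in views}

for page in views:
    print(f"{page}: {views[page]} views, {visitors[page]} unique visitors")
```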

If companies are measuring information, then they have the means to use IT to experiment with things like how they’re selling to their customers. You say that’s the second category of IT-driven innovation.

Yes. IT-based experimentation is most obvious in companies like Amazon, which regularly conducts what it calls “A/B experiments,” tests of its web pages that deliver different versions of the same page at the same time to different visitors, monitoring customer experience and follow-through. Google, similarly, does 200 to 300 experiments on any given day. But it’s also quite common in catalog companies, like credit card companies and direct mail companies, and even in mainstream brick and mortar companies like the casino chain Harrah’s.

The big advantage of an experimental approach that uses IT is that you can get at causality in a way that you can’t with just pure measurement and observation. And that, of course, is the gold standard for being able to have actionable knowledge about what’s really happening in your business, what innovations are paying off and which ones aren’t.
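The A/B mechanics described here can be sketched in a few lines: serve two versions of a page, count conversions, and ask whether the difference is larger than chance alone would explain. The traffic numbers below are invented, and the two-proportion z-test is one standard way (not specific to Amazon or Google) to read such a result:

```python
import math

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how many standard errors apart are the
    conversion rates of page variants A and B?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # rate under "no difference"
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented experiment: 10,000 visitors per variant.
z = ab_z_score(conv_a=500, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z = {z:.2f}")  # |z| above ~1.96 is significant at the 5% level
```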

I’ll ask you about that Harrah’s example in a minute. Let’s first talk about the other two dimensions you mentioned, sharing and replication.

A third thing that’s changed a lot in businesses over the past five to ten years is the way that companies can share not only data but insights. The internet and information technology are, of course, uniquely well suited to this kind of sharing.

An example is what happened spontaneously at Cisco Systems, where the central IS department did not support Macintosh computers. There were about 10,000 users of Macs, and they set up their own wiki internally to share tips and tricks on things like how to install new software and how to get their Macs to work with the company’s Linux printers. This creation of a wiki shows how not just big innovations, but smaller ones, like figuring out how to network with a printer more quickly, can be easily shared.

We often think of grand innovations, like the invention of the light bulb, as what drives economic growth. But equally important, and perhaps more important, are the 1,001 small innovations that regular business managers and line workers do every day at their jobs. If we can find more effective ways of sharing those micro-innovations with one another so that each person doesn’t have to reinvent the wheel or reinvent the printer routine, then we’re much more likely to be able to get a faster, more steady pace of economic growth—and improved competitive advantage for the companies that make that easy.

The fourth change is replication. What do you mean by that?

IT makes it dramatically easier to replicate and scale up innovations once they’ve been identified. The first three approaches help companies find and share new innovations, but then IT makes it possible to take that innovation and copy it many times.

Now, the most obvious examples are innovations that are made of bits, like software and music and web pages. Those get replicated thousands, hundreds of thousands, millions of times, and that process of replication has obviously completely changed those industries.

However, what we also see is that business processes themselves can be replicated by leveraging information technology. A nice example is what Andrew McAfee at our Center for Digital Business described in his study of CVS. The company implemented an improved business process for prescription drug ordering at one of its pharmacies, which improved customer satisfaction significantly. But what happened next is what’s really important. Managers took that business process and embedded it in an enterprise information technology system, and then they replicated it to 4,000 other pharmacies in 4,000 other CVS stores within a year.

We’re seeing that not just in retailing but also in manufacturing, in banking, in industry after industry.

Let’s go back to Harrah’s. You say it’s an example of an offline company—not an Amazon, but a business with staff people who interact, in person, with customers in the real world—that has used IT in all four ways to drive innovation. What is Harrah’s doing now that it wasn’t doing before?

The CEO there, Gary Loveman, was a PhD student here at MIT with me. And I think he’s an exemplar of a new kind of senior executive that we’re going to be seeing in the coming years. Gary has created a culture where employees at Harrah’s are regularly doing business experiments and carefully measuring their results through their information systems. The successful findings are shared with business managers at other locations and then scaled up to become part of corporate policy going forward.

When he first came to Harrah’s, it was, frankly, sort of a second-tier, also-ran casino company. But it did have a great deal of data. Most of that data was not being used effectively, and he brought a culture of experimentation and analytics that has propelled Harrah’s to being the leading casino company.

How did he do it? Well, he’s really good with numbers. And while a lot of his competitors were working on having fancier fountains and more incredible spectacles in Las Vegas, Gary was checking through the numbers to see what was really driving profitability. This kind of analysis is something that he has spread not just into the CEO suite, but all throughout the company.

In fact, when he came to speak at my MBA class last year, he told me that there were really just two things that could get you fired from Harrah’s. One is if they catch you stealing from the company. The other is if they catch you running an experiment without a proper control group. Now, that kind of culture, of taking experimentation and methodology that seriously, is something new—and something that IT makes a lot more feasible.

So, Harrah’s runs dozens of experiments. For instance, they will see whether different kinds of discounts and coupons can entice people that normally come for two days to come for three days, or get people who normally bet the $5 machines to bet the $25 machines. They bring experimentation to figure out what work practices can get their waiters and waitresses to serve customers more effectively and get higher customer satisfaction scores. This is a mentality that they bring to every aspect of their business.

It’s interesting that Loveman studied at MIT before he became CEO at Harrah’s. What kinds of training or changes in attitude do you think this “new kind of senior executive of the future” will need?

One of the things that I see changing is a shift away from long-term planning. Instead, there’s more sense and respond: experiment so that you can learn what your customers’ needs are, what supply chain changes could make a difference, how to redesign your products.

This is a mentality that requires much quicker cycle times. It requires people from the organization to be flexible and nimble. It requires a much higher information metabolism.

You have to have really high quality, intelligent people working for you who are getting the data they need to be able to make rapid decisions and then propagate the effects of those decisions equally rapidly.

You know, to be successful at this experimentation approach requires a unique set of skills, one that hasn’t been that common among most types of managers, and one that, frankly, we at business schools need to work harder at bringing together. Specifically, these managers need knowledge of business analytics, the way to understand the numbers to drive the statistics and to design intelligent experiments—but also deep knowledge of the business itself, to know how to ask the right questions.

In coming years, I think the real bottleneck will be finding people who combine those sets of skills, who can design experiments that get at genuine business problems in a way that can be analyzed through controlled business experiments. That’s something that we don’t see a whole lot currently.

In theory, companies have had access to data and been doing experiments forever. Isn’t the big problem—or, let’s call it the big challenge—that there’s just so much information that it’s hard to know where to start?

I think so. Most companies have just been overwhelmed with the flood of data that’s been created by their information systems. Much of that data arrives almost accidentally, when they install, say, a new enterprise resource planning system. Suddenly billions of bits of information are generated about their operations, about their customers, about their suppliers. And most of it just gets stored, never used, never looked at again.

Gary describes coming to Harrah’s as finding a gleaming new F-16 fighter, but with no pilot. Just all this wonderful data that had nobody to steer it and take advantage of it. And I think that’s more the norm than the exception at companies as they implement information systems. The original systems often have very specific operational goals, but ultimately, the data that they generate may be even more important if it leads to innovations and changes in business practice.

What we’re going to see in the coming decade are companies whose whole culture is based on continuous improvement and experimentation—not just of specific processes, but of the entire way the company runs. I think this revolution can be fairly compared to the scientific revolution that happened centuries ago. Great revolutions in science have almost always been preceded by great revolutions in measurement. Management historically has not had that kind of careful measurement or experimentation. But it’s time that we catch up.

Sounds like a massive opportunity. Where do companies start?

Well, like I’ve described, companies are going to have to nurture a mentality of experimentation, an expertise in how to run those kinds of business experiments, and an infrastructure that makes it possible to replicate and scale up successful innovations.

Paradoxically, this leads to a simultaneous centralization and decentralization of decision making. On one hand, the opportunities for innovation and experimentation need to be decentralized, because only the people who are on the spot are going to have the local, specific knowledge to know what kinds of experiments are likely to be valuable. On the other hand, to be truly successful, companies will have to find ways to embed the resulting innovations into a platform that can be scaled up and replicated. That’s easy to do in digital companies like Amazon or Google, and a little harder to do, say, in retail or manufacturing companies, but it can be done through the aid of enterprise information technology. Many business processes can be embedded in these systems. And when you find a better way of managing that process, if it can be leveraged or even fully embedded in a business process, it can be replicated. So, centralization of those parts of the business, with decentralization of the discovery phase.

We’ve started calling these companies “digital organizations.” For my book with Adam Saunders, Wired for Innovation (MIT Press, 2009), we identified their characteristics through a survey of several hundred companies. Over time, I think we’ll be able to get more nuance on when companies are likely to be most successful. But we’ve summarized what we know so far in this book.

What will be most difficult?

I think we’re furthest along in having a platform for replicating and scaling up the experiments. Enterprise resource planning systems are a great example of that.

The skill set is one that we’re in the process of working on. Frankly, it’s going to take a generation to fully work its way through. It’s not just knowledge of the experimental design and the mathematics to handle statistics and to understand what the data are saying. It’s also a culture of creativity to be able to bring together those kinds of hard skills with the flash of insight, the ah-ha moment that comes from really knowing your business, knowing your customers, and bringing those two together. That is, unfortunately, a fairly rare combination. It’s one that I think that we at business schools can do more to teach and bring to businesses.

Changing culture is probably the most difficult challenge. It requires a tolerance for failure and a desire to have employees try new things. Greg Linden, who was at Amazon for a while, has said that genius is the fruit of a thousand failures. That’s different than the old mentality of figuring out all the possibilities and then locking in on one. Instead, it’s an approach of rapidly prototyping many different options, seeing which ones pan out, and using the information infrastructure to get the feedback quickly. Cutting the losses quickly, pruning the failures, and then ramping up the successes.

What do you see as the biggest impediments for companies?

The reality is that most organizations are like a finely tuned watch. My watch has got little gears inside of it. It’s a mechanical watch. If I wanted to make this a digital watch, I suppose I could open it up and get some integrated circuits from a digital watch and kind of put them in there one-by-one. But that would not make this keep better time. That’s not the way to create a digital watch from an analog watch.

Yet many people think that you can take an existing organization that’s based on 20th century principles and add some of the elements of successful digital organizations one-by-one and get a more successful digital organization. I wish that were true, but in most cases that only makes things worse.

What’s required is an understanding of how all these components fit together. Half the battle is understanding that changing just an incentive system or a hiring practice or a technology infrastructure by itself is unlikely to lead to desired results unless all the other components are also matched together.

Now, trying to change that many things simultaneously is a daunting task. What companies can do to manage the scope is to narrow the change along other dimensions. They can focus on a particular geography or a particular product line.

Have you seen this work in real time?

Yes. One company I worked with wanted to change the way its factories ran from a 20th century Taylorist approach to what they called modern manufacturing. It involved changes on a dozen specific practices that they had identified, from incentive systems, training, and inventory flow to product mix and technology. Eventually, they implemented the new technology and business practices in a new location, isolated from the old workforce and old physical surroundings. They got the new system to work quite well in this new location, and over time, they back-propagated it to their other locations and were able to get the new system to work throughout the entire organization. But it was something that required them to, on one hand, make lots of changes simultaneously, and, on the other hand, isolate those changes from the rest of the organization so that they could focus on them to get them to work.

I really think that the way companies implement business processes, organizational change, and IT-driven innovation is what will differentiate the leaders from the laggards. Rather than leveling the playing field, IT has actually led to greater discrepancies. In most industries, the top companies are pulling further away from the companies in the middle and the bottom of the competitive spectrum. Rather than having a compression, we’re seeing a growing spread in performance on multiple dimensions.

We’re in a period of tremendous change and turbulence. People have called this The Great Recession. But it’s been said, “In chaos, lies opportunity.” And when historians look back on this era, I think many people will call it not just The Great Recession, but perhaps The Great Restructuring because of the way that businesses are changing how they’re working and because of the central role that IT has in driving some of those changes.

Defining Business Architecture

fevereiro 10, 2010

Post do blog http://soa.sys-con.com!

=============

Defining Business Architecture

‘Business architecture’ helps business and IT leaders decide on and communicate changes at the new speed of business

February 9, 2010 03:36 PM EST

SOA & WOA Magazine on Ulitzer

What’s the difference between enterprise architecture (EA) and business architecture (BA)? We pose the question to Tim Westbrock, Managing Director of EAdirections.

The discussion is moderated by me, Dana Gardner, principal analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: I really enjoyed your presentation today. Can you tell us a little bit about some of the high-level takeaways? Principally, how do you define BA?

Westbrock:

Well, the premise of my discussion today is that, in order for EA to remain relevant and continue to evolve, we have to go outside the domain of IT. Hence, the conversation about BA. To me, BA is an intrinsic component of EA, but what most people really perform in most organizations that I see is IT architecture.

A real business-owned enterprise business architecture and enterprise information architecture are really the differentiating factors for me. I’m not one of these guys who is strict about definitions. You’ve got to get a sense from the words that you use.

To me enterprise business architecture is a set of artifacts and methods that helps business leaders make decisions about direction and communicate the changes that are required in order to achieve that vision.

Gardner: How do we get here? What’s been the progression? And, why has there been such a gulf between what the IT people eat, sleep, and drink, and what the business people expect?

Westbrock: There are a lot of factors in that. Back in the late ’80s and early ’90s, we got really good at providing solutions really quickly in isolated spots. What happened in most organizations is that you had really good isolated solutions all over the place. Integrated? No. Was there a need to integrate? Eventually. And, that’s when we began really piling up the complexity.

We went from an environment, where we had one main vendor or two main vendors, to every specific solution having multiple vendors contributing to the software and the hardware environment.

That complexity is something that the business doesn’t really understand, and we haven’t done a real good job of getting the business to understand the implications of that complexity. But, it’s not something they should really be worried about. It’s our excuse sometimes that it’s too complex to change quickly.

Focus on capabilities

We really need to focus the conversation on capabilities. Part of my presentation talked about deriving capabilities as the next layer of abstraction down from business strategy, business outcomes, and business objectives. It’s a more finite discussion of the real changes that have to happen in an organization, to the channel, to the marketing approach, to the skill mix, and to the compensation. They’re real things that have to change for an organization to achieve its strategies.

In IT architecture, we talk about the changes in the systems. What are the changes in the data? What are the changes in the infrastructure? Those are capabilities that need to change as well. But, we don’t need to talk about the details of that. We need to understand the capabilities that the business requires. So, we talk to folks a lot about understanding capabilities and deriving them from business direction.

Gardner: It seems to me that, over the past 20 or 30 years, the pace of IT technological change was very rapid — business change, not so much. But now, it seems as if the technology change is not quite as fast, but the business change is. Is that a fair characterization?

Westbrock: It’s unbelievably fast now. It amazes me when I come across an organization now that’s surviving and they can’t get a new product out the door in less than a year — 18 months, 24 months. How in the world are they responding to what their customers are looking for, if it takes that long to get system changes and products out the door?

We’re looking at organizations trying monthly, every six weeks, every two months, quarterly to get significant product system changes out the door in production. You’ve got to be able to respond that quickly.

Gardner: So, in the past, the IT people had to really adapt and change to the technology that was so rapidly shifting around them, but now the IT people need to think about the rapidly shifting business environment around them.

Westbrock: “Think about,” yes, but not “figure out.” That’s the whole point. BA is a means by which we can engage as IT professionals with the business leadership, the business decision-makers who are really deciding how the business is going to change.

Some of that change is a natural response to government regulations, competitive pressures, political pressures, and demographics, but some of it is strategic, conscious decisions, and there are implications and dependencies that come along with that.

Sometimes, the businesses are aware of them and sometimes they’re not. Sometimes, we understand as IT professionals — some not all — about those dependencies and those implications. By having that meaningful dialogue on an ongoing basis, not just as a result of the big implementation, we can start to shorten that time to market.

Gardner: So, the folks who are practitioners of BA, rather than more narrowly EA, have to fill this role of Rosetta Stone in the organization. They have to translate cultural frames of mind and ideas about the priorities between that IT side and the business side.

Understanding your audience

Westbrock: This isn’t a technical skill, but understanding your audience is a big part of doing this. We like to joke about executives being ADD and not really being into the details, but you know what, some are. We’ve got to figure out the right way to communicate with this set of leadership that’s really steering the course for our enterprise.

That’s why there’s no, “This is the artifact to create.” There’s no, “This is the type of information that they require.” There is no, “This is the specific set of requirements to discuss.”

That’s why we like to start broad. Can you build the picture of the enterprise on one page and have conversations maybe that zero in on a particular part of that? Then, you go down to other levels of detail. But, you don’t know that until you start having the conversation.

Gardner: Okay, as we close out, you mentioned something called “strategic capability changes.” Explain that for us.

Westbrock: To me, so many organizations have great vision and strategy. It comes from their leadership. They understand it. They think about it. But, there’s a missing linkage between that vision, that strategy, that direction, and the actual activities that are going on in an organization. Decisions are being made about who to hire, the kinds of projects we decide to invest in, and where we’re going to build our next manufacturing facility. All those are real decisions and real activities that are going on on a daily basis.

This jump from high-level strategy down to tactical daily decision-making and activities is too broad of a gap. So, we talk about strategic capability changes as being the vehicle that folks can use to have that conversation and to bring that discussion down to another level.

When we talk about strategic capability changes, it’s the answer to the question, “What capabilities do we need to change about our enterprise in order to achieve our strategy?” But, that’s a little bit too high level still. So, we help people carve out the specific questions that you would ask about business capability changes, about information capability changes, system, and technology.

Anatomy of Agile Enterprise

fevereiro 8, 2010

Post do blog http://www.ebizq.net!

==============
Anatomy of Agile Enterprise
Janne J. Korhonen

From Fragile to Agile: Enterprise Architecture Reform through BPM and SOA

By Janne J. Korhonen on February 7, 2010 8:26 AM

Companies have invested vast amounts of money and effort in information systems. In the course of time, these systems have been integrated with idiosyncratic point-to-point solutions to address larger business transactions. While this appeared simple and practical at the outset, over time the increasing complexity has resulted in “integration spaghetti” that is prohibitively difficult and expensive to manage and maintain. The problem is aggravated by the increasing need for business agility that cannot be achieved with cast-in-concrete integration solutions.

 

While this deeply ingrained “technology mess” is too overwhelming to be dismantled and replaced, the truth is out there and can be disentangled. By mapping physical data to logical information objects and technical functions to business activities, a canonical information and operational model can be elicited. To this end, a business has to define its own, consistent ontology, which may partly be adopted from the normative nomenclature of its respective industry vertical. The data in the underlying information systems can be mapped to these canonical concepts using appropriate integration technologies and exposed as data-centric SOA services. By analyzing the functional requirements of business processes, a set of logic-centric services can also be identified.
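The mapping the author describes can be illustrated with a minimal sketch that translates system-specific field names into canonical concepts; all system and field names below are invented for illustration:

```python
# Canonical ontology mapping: per-system physical field names are
# translated into shared logical concepts (all names are hypothetical).
CANONICAL_MAP = {
    "crm_system":  {"cust_nm": "customer_name", "cust_no": "customer_id"},
    "billing_app": {"name":    "customer_name", "acct":   "customer_id"},
}

def to_canonical(source_system: str, record: dict) -> dict:
    """Translate a system-specific record into canonical concepts,
    dropping fields that have no canonical counterpart."""
    mapping = CANONICAL_MAP[source_system]
    return {mapping[f]: v for f, v in record.items() if f in mapping}

print(to_canonical("crm_system", {"cust_nm": "Acme", "cust_no": "C-17"}))
# {'customer_name': 'Acme', 'customer_id': 'C-17'}
```

Once every source system is expressed in the same canonical vocabulary, a data-centric service can expose one consistent view regardless of which physical system holds the record.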

In Service-Oriented Architecture (SOA), the intricacies of applications and technology infrastructure are encapsulated behind well-defined, self-describing service interfaces that expose the contained information and functionality as reusable, context-independent services. The underlying implementation of a service can be changed as long as the service contract is maintained. SOA services provide modular building blocks that can be composed into higher-level constructs, e.g. orchestrations or composite applications. This coarse-grained modularity reduces the inherent complexity of the enterprise architecture and improves business agility. The service abstraction insulates business changes from IT development and thereby synchronizes the business and IT life-cycles.
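The service-contract idea in the paragraph above can be sketched as follows (the service and class names are invented): consumers depend only on the interface, so the implementation behind it can be swapped without touching them.

```python
from abc import ABC, abstractmethod

class CustomerLookupService(ABC):
    """Service contract: callers depend only on this interface."""
    @abstractmethod
    def find_by_id(self, customer_id: str) -> dict: ...

class LegacyDbLookup(CustomerLookupService):
    """Initial implementation, e.g. wrapping a legacy database."""
    def find_by_id(self, customer_id: str) -> dict:
        return {"id": customer_id, "source": "legacy-db"}

class CanonicalApiLookup(CustomerLookupService):
    """Replacement implementation; the contract stays unchanged."""
    def find_by_id(self, customer_id: str) -> dict:
        return {"id": customer_id, "source": "canonical-api"}

def greet(service: CustomerLookupService, customer_id: str) -> str:
    # Consumer code is written against the contract, not the implementation
    record = service.find_by_id(customer_id)
    return f"customer {record['id']} via {record['source']}"

print(greet(LegacyDbLookup(), "42"))      # customer 42 via legacy-db
print(greet(CanonicalApiLookup(), "42"))  # customer 42 via canonical-api
```

Replacing `LegacyDbLookup` with `CanonicalApiLookup` changes nothing for `greet`, which is exactly the insulation of business logic from implementation churn that the paragraph claims for SOA.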

In the following, I will attempt to outline how a “fragile” architecture, characterized by “technology mess” and “integration spaghetti”, could be reformed towards a more agile architecture through a concerted effort by business and IT, in which the business-driven top-down BPM (Business Process Management) meets the IT-driven bottom-up SOA. The stages can also be considered as architectural maturity levels.

  1. BPM: Review and specify the organization’s strategy. Describe the external context (position in the overall value configuration) and create a process map. Identify key business processes and specify their high-level key performance indicators (KPIs) and critical success factors (CSFs). This ensures that the global process performance is in line with the strategic objectives and provides the basis for more specific, accurate and correctly aligned objectives. SOA: Start the architecture reform within a pilot domain by replacing integration spaghetti with lasagna: route the idiosyncratic point-to-point integrations through Enterprise Application Integration (EAI) middleware. The EAI approach simplifies application integration by providing a common integration backbone with routing and transformation capabilities.
  2. BPM: Devise the process architecture that provides the foundation for the initial BPM initiative and all subsequent process management projects. Outline the process roles and sub-processes in the key processes. Choose one of these sub-processes in the same domain as the integration pilot and model the process in line with the process architecture guidelines. SOA: With the lessons learned from the pilot, expand the EAI approach to the rest of the organization. Within the pilot domain, implement the first basic SOA services. The EAI tool can be used as the initial SOA platform.
  3. BPM: Re-engineer the pilot process in sync with the SOA undertaking and start modeling other processes leveraging the experiences from the pilot exercise. SOA: Start implementing basic services in other domains. Automate selected parts of the pilot process by orchestrating implemented services into executable process flows. Also implement pertinent context-sensitive portal views to the workflow. As a result of this stage, one business process has been described and implemented top-down and bottom-up, meeting in the middle, where services are bound to processes.
  4. BPM: Once the business process is modeled and automated for relevant parts, it can be readily measured and continuously improved. Channel the business benefits of the pilot initiative to fund re-engineering other business processes in a similar vein. SOA: Expand service orchestration to new business process initiatives as needed. Build full-scale SOA infrastructure including Enterprise Service Bus (ESB), Registry and Repository, and SOA Management faculties. 
  5. BPM: When several domain-specific business processes have been subjected to Business Process Management, the “white space” between these processes can be considered. Whereas orchestration describes the control logic within a single process flow, choreography describes the coordination between these processes in the overall cross-domain end-to-end process. Deploy a full-fledged Business Process Management System (BPMS) to manage the choreography of collaborative, event-driven processes. SOA: Provide the improved capabilities to external partners as enterprise services and contract external services as needed.
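Stage 3’s notion of orchestrating implemented services into an executable process flow can be sketched in miniature; the order process and the three services below are hypothetical stand-ins for real SOA services:

```python
# Minimal orchestration sketch: an order process composed from
# independent services, each taking and returning the process context.
def check_inventory(order):
    # Hypothetical rule: at most 10 units are in stock
    order["in_stock"] = order["quantity"] <= 10
    return order

def reserve_stock(order):
    order["reserved"] = order["in_stock"]
    return order

def notify_customer(order):
    order["status"] = "confirmed" if order["reserved"] else "backordered"
    return order

def orchestrate(order, steps):
    """Run each service in sequence, threading the context through —
    the control logic lives here, not inside any single service."""
    for step in steps:
        order = step(order)
    return order

result = orchestrate({"quantity": 3},
                     [check_inventory, reserve_stock, notify_customer])
print(result["status"])  # confirmed
```

In a real BPMS the flow would be expressed declaratively (e.g. in BPEL or BPMN) rather than in code, but the principle is the same: the services stay reusable and the process definition owns the sequencing, which is what makes the flow cheap to change.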

It is important to think big, but start small; maintain a long-term grand vision, but realize it sustainably in incremental steps, learning from spearhead pilots, the success of which brings about confidence. Set the target level of maturity based on your business needs. Do not invest in top-notch infrastructure until you have reached the maturity level at which the technology is required. And do not try to jump over maturity levels. Slower is faster.

