2.
The Traditional Approach
87% of companies use hand-coding for Data Integration

Project types: Data Migration, Data Warehouse, Master Data Management, Customer Applications, Real-Time Regulatory Reporting, Application Consolidation

Hand-coded point-to-point technologies in play: Teradata FastLoad, Oracle SQL/PL-SQL, Queuing/Custom, Custom IMS Code, Teradata TPump, Oracle SQL*Loader, ALE/IDoc, Oracle LogMiner (Change Capture), IMS Change Capture, Oracle SQL, Teradata MultiLoad, BAPI, Web Services, Custom IMS DML

The result:
• Lack of visibility
• High maintenance cost
• Heterogeneous data from many sources
• Different requirements and means for consuming data
5.
The Traditional Approach
87% of companies use hand-coding for Data Integration

Data sources: Application, Database, Interaction Data, Cloud Computing, Unstructured, …

Data integration project types: Data Warehouse, Data Integration & Migration, Test Data & Archiving, Master Data Management, Data Synchronization, B2B Data Exchange, Data Consolidation
7.
Questions at project kickoff
• Where is my information?
• How and when can I collect it?
• What does it mean (to the business)?
• Is it trustworthy?
• How do I get it into the format I need?
• What collection strategy ensures effective delivery?
• How can I monitor its quality?
8.
Guarantee Data Delivery in any Project!

Integration Platform: Providers → Data Warehouse → Consumers

• Access: any data source, in batch or real time
• Integrate: transform and consolidate data
• Cleanse: validate, correct, standardize, and enrich any type of data
• Analyze: search and investigate data in any source
• Deliver: provide correct data, at the expected time and in the expected format
10. 10
B2B Data ExchangeB2B Data Exchange
Informatica supports the requirements of
cross-organizational data exchange, so
users apply familiar & trusted data
integration tools and techniques to the
growing practice of B2B data integration.
Cloud Data IntegrationCloud Data IntegrationEnterprise Data IntegrationEnterprise Data Integration
Complex Event ProcessingComplex Event Processing
AgentLogic provides the means for
Informatica to couple event processing
with its identity resolution and change
data capture technologies that can
further help pinpoint changed events
Ultra MessagingUltra Messaging
29West is best-known for its low-latency
Latency Busters Messaging (LBM), which
is used extensively in financial markets
for streaming market-data applications.
29West is used by seven of the top 10
investment banks
Data QualityData Quality Master Data ManagementMaster Data Management
Application ILMApplication ILM
Informatica's heritage in data movement
and integration technology, expertise, and
experience makes it an obvious choice for
the application ILM needs of harried data
management professionals. It also expands
the company's addressable market to
encompass an adjacent growth category.
ULTRA
MESSAGING
COMPLEX
EVENT
PROCESSING
B2B DATA
EXCHANGE
CLOUD
DATA
INTEGRATION
ENTERPRISE
DATA
INTEGRATION
APPLICATION
ILM
DATA
QUALITY
MASTER
DATA
MANAGEMENT
Liderança Tecnológica Comprovada…
11.
INFORMATICA: The #1 Independent Leader in Data Integration
• Founded: 1993
• 2012 revenue: US$ 811.6 M
• Average growth: 17% (over the last 7 years)
• Employees: 2,810+
• Partners: 450+, including the leading SI, ISV, OEM, and on-demand vendors
• Customers: 5,000+, in 82 countries
• Direct presence in 28 countries
• No. 1 in customer loyalty (7 successive years)
• 84 of the Fortune 100
• 87%+ of the Dow Jones
• Government organizations in 20 countries
12.
Informatica in Brazil
São Paulo branch: IS Informatica Ltda, Av. Nações Unidas 12901 (CENU), 3º andar, São Paulo - SP - Brasil, 04578-910
Sales/services offices: Rio de Janeiro, RJ; Brasília, DF
• Operations began in 2005
• Offices in São Paulo, Rio de Janeiro, and Brasília
• Established Support, Professional Services, and Training practices
• Current customer base across many industries (Retail, Oil, Telecommunications, Finance, Government, Mining, Healthcare, Insurance, Gas, etc.)
13.
The Informatica Approach…
A Comprehensive, Unified, Open, and Economical platform

Data integration projects: Data Warehouse, Data Migration, Data/System Archiving, Master Data Management, Data Synchronization, Partner (B2B) Integration, Data/System Consolidation, Complex Event Processing, Ultra Messaging
14.
Banco Central reduces mainframe resource consumption by 20% with Informatica solutions

DATA INTEGRATION PROJECT: Data Synchronization

THE CHALLENGE
Improve the quality of the Credit Information Service and reduce mainframe resource (MIPS) consumption.
• Modernize, expand, and refine process management
• Obtain current financial information from all platforms (mainframe and distributed systems)
• Reduce the operational cost of implementing and maintaining the solution

IT INITIATIVE
Reduce mainframe resource (MIPS) consumption.

KEY BUSINESS IMPERATIVES
• Reduce mainframe resource consumption
• Synchronize 700 million DB2 mainframe records per week with distributed systems
• Improve service quality for the financial institutions (~6,017)

INFORMATICA DIFFERENTIATORS
• Enabled real-time integration on an agile, flexible infrastructure
• Guaranteed on-demand access to heterogeneous systems without hand coding
• Simplified the data synchronization process

RESULTS/BENEFITS
• Improved service quality for the financial institutions
• Reduced mainframe MIPS consumption by 20%
• Increased operational security, with no intermediate files
• Rapid implementation at low cost
Problem: data fragmentation leads to a costly "integration hairball." As systems and applications have proliferated across the enterprise, a huge number of heterogeneous data sources have come into play, and different teams have used different technologies and approaches to integrate data across those systems. In part this is because different consuming systems (reporting tools, portals, ERP applications, etc.) have different requirements for what data they need and how they need to consume it. To support all these different needs, an "integration hairball" has resulted, with thousands of point-to-point connections built up over time. This hairball is extremely complex and brittle: it is far too difficult to maintain, much less change, because every change can ripple outward and break other integration points and systems. And because of the hairball's complexity, it is almost impossible to have visibility and transparency into your important data as it flows through the business.
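The "hairball" grows for a simple combinatorial reason. As a rough illustration (the formulas below are standard combinatorics, not figures from the slides), point-to-point integration of n systems can require a link for every pair, while a hub-and-spoke platform needs only one link per system:

```python
def point_to_point_interfaces(n_systems: int) -> int:
    """Worst case: each pair of systems gets its own connection, n*(n-1)/2 links."""
    return n_systems * (n_systems - 1) // 2


def hub_and_spoke_interfaces(n_systems: int) -> int:
    """Each system connects once to a central integration platform: n links."""
    return n_systems


for n in (10, 50, 200):
    print(n, point_to_point_interfaces(n), hub_and_spoke_interfaces(n))
# 10 systems: 45 vs 10 links; 200 systems: 19,900 vs 200 links
```

The quadratic growth on the left-hand side is why thousands of connections accumulate and why each change carries a ripple risk.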
Data is one of the most valuable assets for any enterprise. It is the lifeblood of business operations and decision making. But enterprises like yours are dealing with overwhelming data challenges: huge volumes of data; data scattered across the enterprise, and now outside the enterprise; and a big effort to turn all that data into valuable information for the business. If you can harness all your data and turn it into valuable information, it is an extraordinary asset to the business. But if you cannot integrate and manage all this data, it can become a major liability.
Key point: because of the complexity of most IT environments, it is important to start with understanding; integration cannot succeed without a baseline of understanding of this complexity.

So let's drill down on why it is important to have these functions together in a unified platform, and why it is important to start with understanding. Many of our IT environments look like the diagram on this slide (which came from an actual customer environment, and is actually slide 1 of 2). Our systems have grown over time in strange ways: through acquisitions, through tactical "quick fix" projects, and through new system implementations where the old systems were never actually retired. What we have ended up with is a very complex picture, filled with fragile point-to-point connections, redundant data, and disaggregated information.

Within this environment it is very difficult to answer even very simple questions:
- Where is my information? How do I find my customer in this mess, for example? Customer information is probably spread all over the place in this picture.
- What does the information mean? How does each system define "customer"?
- Can I trust it? Is this data correct? Which system has the best information?
- How do I get timely information where it needs to go?
- And most of all, how do I control information in the face of all this complexity?

We need to understand all the places where information is kept, exactly what is in each data source, its fitness for the intended purpose, and how different sources are related to each other, and then use this understanding to automate and assist integration.
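The "understanding" step is what data profiling tools automate. A minimal sketch of the idea, with invented sample data (real profiling tools such as Informatica Data Explorer do far more, e.g. pattern and cross-source analysis):

```python
from collections import Counter


def profile(rows: list[dict]) -> dict:
    """Per-column profile: null count, distinct values, and most common value."""
    columns: dict = {}
    for row in rows:
        for col, val in row.items():
            stats = columns.setdefault(col, {"nulls": 0, "values": Counter()})
            if val is None or val == "":
                stats["nulls"] += 1
            else:
                stats["values"][val] += 1
    return {
        col: {
            "nulls": s["nulls"],
            "distinct": len(s["values"]),
            "top": s["values"].most_common(1),
        }
        for col, s in columns.items()
    }


# Hypothetical customer records scattered across a source system.
customers = [
    {"id": 1, "country": "BR"},
    {"id": 2, "country": "BR"},
    {"id": 3, "country": None},
]
print(profile(customers))
```

Even this toy report answers two of the questions above for one source: "is this data complete?" (null counts) and "what values does it actually hold?" (distinct/top values).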
So how does the Informatica product platform help you start your data warehousing initiatives right within your IT architecture? Our platform is comprised of four products that, together, automate the entire data integration lifecycle. These four products are well integrated, complement one another, and can be extended through a number of add-on options (from Informatica and others) to deliver the best overall data services functionality within the enterprise's IT architecture.

You start by accessing data. With Informatica PowerExchange, you can access all data types in any system, in batch or real time, in a very flexible manner, so your data is more usable. <click> The next steps in the data integration lifecycle are discovering what is in those data sources and cleansing all types of data. With Informatica Data Explorer, you can profile and search your data to understand what your data issues may be. With Informatica Data Quality, you can validate, correct, and standardize all the types of data you have. <click> The final two steps in the lifecycle are integrating and delivering the data. With Informatica PowerCenter, you can transform and reconcile all data types from all different sources, including unstructured and semi-structured data, and then deliver that data to the right place, at the right time, and in the right format. <click>

This breadth of connectivity is crucial when comparing against hand coding. In a hand-coded environment, most of the things we cannot get to directly require additional extracts or staging. For example, how do you write PL/SQL to get SAP data? You don't. Someone hand-codes an ABAP program to dump a file, and then you write PL/SQL to read the file or stage it to a database. Now you have two or three different programs and technologies in play just to provide SAP access, versus a tool that provides drivers and connects directly to a wide variety of data. This connectivity provides a seamless architecture for data sources now and in the future.
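The lifecycle the notes describe (access, discover, cleanse, integrate, deliver) can be sketched as plain functions. This is a hedged illustration only: the data, cleansing rules, and function names are all invented, and each step stands in for what a dedicated product does at scale:

```python
def access() -> list[dict]:
    # Step 1: access. In practice, connectors read RDBMS, SAP, mainframe, files, ...
    return [
        {"name": " ana ", "amount": "10.5"},
        {"name": "BOB", "amount": "x"},  # dirty row: non-numeric amount
    ]


def cleanse(rows: list[dict]) -> list[dict]:
    # Steps 2-3: discover issues, then standardize and validate values.
    out = []
    for r in rows:
        name = r["name"].strip().title()  # standardize formatting
        try:
            amount = float(r["amount"])   # validate the numeric field
        except ValueError:
            continue                      # reject rows that fail validation
        out.append({"name": name, "amount": amount})
    return out


def deliver(rows: list[dict]) -> list[dict]:
    # Steps 4-5: integrate and deliver in the format the consumer expects.
    return sorted(rows, key=lambda r: r["name"])


print(deliver(cleanse(access())))
# → [{'name': 'Ana', 'amount': 10.5}]
```

The point of a unified platform is that these stages share one repository and one set of metadata, rather than being three unrelated hand-coded programs.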
Let's say you get a new data source that is a PDF. How would you source a PDF with Java code or PL/SQL? You can't at all. With the Informatica platform, an additional connector (PowerExchange for B2B) is all you need, and the same scheduling and development architecture is leveraged for the new data source. These are key points to consider. In data warehousing you are always seeing your source data structures change as applications are upgraded, enhanced, or sometimes migrated or replaced. How do you respond quickly and effectively in a hand-coded environment? You don't. This is where a tool really provides that level of adaptability.

From an IT perspective, it is important to have robust, easy-to-use development and management capabilities that can be used in each of these five steps, to help your IT teams develop logic around your data and to better collaborate with each other. By leveraging a common repository and shared metadata, enterprise data is made more consistent, productive, and reusable across many different integration projects.
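One way to see the adaptability argument is that in metadata-driven tools the source-to-target mapping lives in data rather than in code, so when a source structure changes, only the mapping entry changes. A minimal sketch of that idea (the column names are hypothetical, and this is only an analogy for what a mapping repository provides):

```python
# Target column -> source column. If the source renames CUST_NM, only this
# dictionary changes; the transformation code below is untouched.
MAPPING = {
    "customer_id": "CUST_ID",
    "customer_name": "CUST_NM",
}


def apply_mapping(source_row: dict, mapping: dict) -> dict:
    """Project a source row onto the target schema defined by the mapping."""
    return {target: source_row[source] for target, source in mapping.items()}


row = {"CUST_ID": 42, "CUST_NM": "Acme", "UNUSED_COL": "x"}
print(apply_mapping(row, MAPPING))
# → {'customer_id': 42, 'customer_name': 'Acme'}
```

Hand-coded extracts tend to bake the source structure into many programs at once, which is exactly what makes responding to upgrades slow.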
In almost all enterprises, data is fragmented across multiple departmental databases and applications like ERP and CRM. It is a huge challenge just to integrate all this data. But a lot of important enterprise data, such as planning data, is locked up in unstructured documents like Microsoft Excel. It is critical to integrate all these various types of data to manage the enterprise holistically. And it gets even more complex: as companies adopt cloud applications and cloud computing, more and more data is now hosted in the cloud, and more and more data is being shared with trading partners, including suppliers and customers.

To tackle this multi-faceted, growing data fragmentation and delivery challenge, we deliver five best-in-class technologies. Enterprise Data Integration integrates all the data silos managed within the enterprise, including unstructured data. Cloud Data Integration helps you retain control over off-premise data managed in the cloud. B2B Data Exchange enables you to share and manage data with partners. In certain cases, such as stock trading, where extremely low-latency, high-throughput delivery and dissemination of data is critical, we provide Ultra Messaging. And to address the growing volumes of data, we provide Information Lifecycle Management to cost-effectively and securely manage that data. These five capabilities are the core foundation for integrating, moving, and managing data across the extended enterprise.

But that is not enough. Once you have integrated and managed the data, you still need to turn it into information that is of value to the business. To unlock the business value of all this data, we deliver three best-in-class technologies. Data Quality cleanses the data and ensures that the business can trust it. Master Data Management governs the most strategic information assets and ensures you can gain competitive advantage from them.
And Complex Event Processing senses events as they occur, so you can act on all the data coming in. The Informatica Platform provides all of these capabilities, to ensure you can integrate and manage all your data and transform it into valuable information for the business.