Copyright © 2019 HashiCorp
Cloud Security with Vault
Background
• Senior Solutions Engineer
• At HashiCorp for a year and a half
• Background in development, consulting, and sales
• Brazilian
• Used to play in a heavy metal band
@stenio123
stenio@hashicorp.com
Agenda
• Digital transformation
• Security challenges when moving to the cloud
• Demo 1: secure authentication in the cloud
• Demo 2: secure access to cloud services
• Next steps
• Q&A
Digital Transformation
Digital Transformation
Traditional Datacenter: "Static", dedicated infrastructure, "ticket-based"
Modern Datacenter: "Dynamic", Private Cloud + AWS + Azure + GCP, "self-service"
Digital Transformation
Traditional Datacenter: "Static", dedicated infrastructure
Modern Datacenter: "Dynamic", Private Cloud + AWS + Azure + GCP
Why?
• CapEx to OpEx
• Scale, repeatability, maintainability
• Access to new technologies
High-Level Challenges During Digital Transformation
An Integrated Approach with the HashiCorp Suite
• Provision (Operations)
• Secure (Security)
• Run (Development)
• Connect (Networking)
Private Cloud + AWS + Azure + GCP
Security During Digital Transformation
Security Challenges During Digital Transformation
• Secrets management
Passwords, API keys, PKI certificates, encryption keys
• Authentication
Identity of the requester: "I am who I claim to be, and this is my credential"
• Authorization
Permissions associated with an identity: "User X can read secret Y"
• Encryption at rest
Sensitive information is stored as ciphertext
• Encryption in transit
Sensitive information exchanged between servers is ciphertext
• Scaling across multiple datacenters, DR, auditing, approval workflows, compliance, and more
Authentication Challenges in the Cloud
So:
How do I let an application running in the cloud access a secret without hardcoding it?
How do I let a client (human or application), running anywhere, access cloud services without hardcoded API keys?
Standard approach:
Static API keys / service accounts
Limitations:
• Eventually hardcoded - higher risk of compromise
• Lifecycle is hard to manage - higher risk of exposure
• Hard to audit - can be shared by the original user with others
Authentication Challenges in the Cloud
How do I let an application running in the cloud access a secret without hardcoding it?
-> Platform-based Vault authentication
How do I let a client (human or application), running anywhere, access cloud services without hardcoded API keys?
-> Dynamic Secrets with Vault
Platform-Based Vault Authentication
Overview
• Instead of hardcoding credentials into an application, use the identity provided by the platform
• Example: AWS uses the IAM profile associated with the EC2 instance or Lambda function; Azure uses the service account
• These platforms expose a local API endpoint from which the identity token can be retrieved
• The logic can be coded by hand into the application or externalized using agents - Vault Agent, Consul Template, a cron job
• Supported: cloud platforms, Kubernetes, PCF, AppRole
Steps (a minimal CLI sketch follows the list)
1. An admin creates credentials and configures Vault
2. The application retrieves its identity from the platform
3. The application sends a login request to Vault along with that identity
4. Vault validates the identity with the platform and returns an access token to the application
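As a rough sketch of this flow on AWS (the role name, policy name, and ARN below are illustrative placeholders, not taken from the demo repository):

# Admin side: enable the AWS auth method and bind a Vault role to an IAM principal
$ vault auth enable aws
$ vault write auth/aws/role/my-app-role \
    auth_type=iam \
    bound_iam_principal_arn="arn:aws:iam::123456789012:role/my-app" \
    policies=my-app-policy \
    ttl=1h

# On the EC2 instance or in the Lambda function, the application (or Vault Agent)
# logs in with the platform identity instead of a stored credential:
$ vault login -method=aws role=my-app-role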
Dynamic Secrets with Vault
Overview
• Instead of a long-lived API key or service account, Vault can generate one dynamically, on demand
• These keys are ephemeral, with an associated TTL
• Vault is responsible for revoking the key once the TTL expires
• A Vault administrator can also revoke keys, or groups of keys (by prefix), without having to touch the platforms
• Platform-specific options: generated secrets can be unique, existing service accounts can be reused, and password rotation is supported (Active Directory)
• Dynamic secrets for: cloud providers, databases, plugins
Steps (a database example follows the list)
1. An admin creates credentials and configures Vault
2. A client authenticated with Vault requests a secret
3. Vault creates the secret and returns it to the client
4. When the TTL expires, Vault revokes the secret
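As an illustration with the database secrets engine (connection string, database name, and role name are placeholders), the admin configures Vault once and each client read then yields a unique, short-lived credential:

# Admin side: configure the database connection and a role that defines how users are created
$ vault secrets enable database
$ vault write database/config/app-db \
    plugin_name=postgresql-database-plugin \
    connection_url="postgresql://{{username}}:{{password}}@db.example.com:5432/app" \
    allowed_roles=readonly \
    username=vault-admin password=<database-admin-password>
$ vault write database/roles/readonly \
    db_name=app-db \
    creation_statements="CREATE ROLE \"{{name}}\" WITH LOGIN PASSWORD '{{password}}' VALID UNTIL '{{expiration}}';" \
    default_ttl=1h max_ttl=24h

# Client side: each read creates a brand-new database user that Vault drops when the lease expires
$ vault read database/creds/readonly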
Demos
Demo 1 - Authentication
Context
• An application is running on AWS
• It needs to access a database, so it needs database credentials
• Instead of hardcoding those credentials, we store them in Vault
• But how does the application authenticate to Vault? Hardcoding Vault credentials would defeat the purpose
• We can use Vault's AWS authentication to abstract the credentials away
• Once the application has authenticated to Vault, it can request the database credentials it needs
TL;DR: the application needs a secret. It gets it from Vault without any explicit credentials (see the sketch below)
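Condensed, the flow on the instance looks roughly like this (role and secret path are illustrative placeholders, not necessarily those used in the linked demo repository):

# Authenticate using the EC2 instance's IAM role - no Vault credential is stored on disk
$ vault login -method=aws role=my-app-role
# Then read the database credentials that were stored in Vault
$ vault kv get secret/my-app/database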
Demo 2 - Dynamic Secrets
Context
• A developer needs to access an AWS S3 bucket
• To access that AWS resource, they need AWS API keys
• Instead of long-lived static API keys, Vault generates these keys dynamically when they are needed
• As a human I can authenticate to Vault with traditional credentials, or, as an application, I can use a workflow like the one shown in demo 1
• Once authenticated to Vault, I can request the API keys
• A Vault administrator can define the lifetime of these keys, their permissions, and which identities are allowed to request them
• Each generated key is unique
TL;DR: the client needs cloud keys. Vault generates the keys and revokes them after the TTL (see the sketch below)
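A sketch of both sides of this workflow (role, policy file, bucket, and user names are placeholders):

# Admin side: enable the AWS secrets engine and define a role scoped to S3 read access
$ vault secrets enable aws
$ vault write aws/config/root access_key=$AWS_ACCESS_KEY_ID secret_key=$AWS_SECRET_ACCESS_KEY region=us-east-1
$ vault write aws/roles/s3-read credential_type=iam_user policy_document=@s3-read-policy.json

# Client side: log in, request fresh keys, and use them until the TTL expires
$ vault login -method=userpass username=dev-user
$ vault read aws/creds/s3-read
$ AWS_ACCESS_KEY_ID=<returned key> AWS_SECRET_ACCESS_KEY=<returned secret> aws s3 ls s3://my-bucket

# An admin can revoke every key issued under this role at any time, without touching AWS:
$ vault lease revoke -prefix aws/creds/s3-read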
Next Steps
Improving the Demos
Resilience
• Run Vault as a high-availability cluster
• Do not hardcode Vault secrets in Terraform (use Replication and DR)
• Use immutable infrastructure (server images)
Connectivity
• Use TLS for connections to Vault
• Place Vault in a private subnet
• Do not allow direct SSH to the instances; use bastion hosts
• Use a service mesh to protect communication between the bastion host and the server (Consul)
Permissions
• Use Linux users with restricted permissions
• Use Vault users with restricted permissions
• Use cloud users with restricted permissions
Management
• Use Vault Agent to manage the cloud auth logic (see the sketch below)
• Run Vault as part of a service discovery cluster (Consul)
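A minimal sketch of the Vault Agent idea (address, role, and sink path are placeholders): the agent performs the AWS login on the application's behalf and keeps a valid token in a file the application can read.

$ cat > agent.hcl <<'EOF'
vault {
  address = "https://vault.example.com:8200"
}
auto_auth {
  method "aws" {
    mount_path = "auth/aws"
    config = {
      type = "iam"
      role = "my-app-role"
    }
  }
  sink "file" {
    config = {
      path = "/srv/app/.vault-token"
    }
  }
}
EOF
$ vault agent -config=agent.hcl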
Resources
Vault Agent tutorial:
https://learn.hashicorp.com/vault/developer/vault-agent-aws
Discussion of immutable infrastructure:
https://www.hashicorp.com/resources/what-is-mutable-vs-immutable-infrastructure
Discussion of dynamic secrets:
https://www.hashicorp.com/blog/why-we-need-dynamic-secrets
How Adobe uses Vault to manage secrets across clouds:
https://youtu.be/THlpkBioAWQ
Demos:
https://github.com/stenio123/vault_cloud_security
www.hashicorp.com
hello@hashicorp.com
Thank you


Editor's Notes

  • #5, #10, #16, #19 <note to presenter> Frame the discussion to indicate that there are really three pictures that matter: #1 is the transition in infrastructure, #2 is how we think about it in layers, #3 is what success looks like in terms of core Terraform, Vault, and Consul as a shared service.
  • #6-#8, #11-#15, #17-#18, #20-#21 Talk about what is happening in the world of infrastructure, where we are going through a transition that happens in our industry every 20 years: this time from one which is largely dedicated servers in a private datacenter to a pool of compute capacity available on demand. In simple terms, this is a shift from "static" infrastructure to "dynamic" infrastructure, which is the reality of cloud. And while the first cloud provider was AWS, it is clear that it will be a multi-cloud world. Each of these platforms has its own key advantages, so it is inevitable that most G2K organizations will use more than one. This is not about moving applications around (since data gravity is a constraint) but rather creates a need for a common operating model across these distinct platforms that allows different teams to leverage the platform of their choice.