
[DW Webinar] Effective Management of APIs and the Edge when Adopting Kubernetes

In this webinar, we discussed the two most important challenges with an API gateway when adopting Kubernetes:

- How to scale the management of 100s of services and the associated APIs across your teams
- How the gateway can support a broad range of microservice architectures, protocols, and configurations that typically span the entire edge stack

Daniel then presented a whistle-stop tour of the three main strategies we see the industry converging on for preparing the edge for Kubernetes:

1. Deploy an additional Kubernetes API gateway
2. Extend an existing API gateway
3. Deploy a comprehensive self-service edge stack



  1. Effective Management of APIs and the Edge when Adopting Kubernetes (Daniel Bryant, Product Architect, Datawire)
  2. tl;dr: Microservices and Kubernetes present two primary challenges at the edge: (1) how to scale the management of many services and APIs; (2) supporting a broad range of microservice architectures and protocols. There are three primary strategies for managing APIs in this new context: (1) deploying an additional gateway; (2) extending the existing gateway; (3) deploying an edge stack.
  3. @danielbryantuk
  4. The Increasing Importance of the Edge
  5. The Increasing Importance of the Edge
  6. API Gateway: A Focal Point with Microservices https://blog.christianposta.com/microservices/api-gateways-are-going-through-an-identity-crisis/
  7. The Two Most Important Challenges with an API Gateway When Adopting Kubernetes
  8. Challenge #1: Scaling Edge Management (Operations / Platform Team vs. Development Team)
  9. Challenge #1: Scaling Edge Management https://netflixtechblog.com/full-cycle-developers-at-netflix-a08c31f83249
  10. Challenge #1: Scaling Edge Management
  11. Challenge #2: Supporting Diverse Edge Requirements (REST/JSON over HTTP)
  12. Challenge #2: Supporting Diverse Edge Requirements (JSON over HTTP, gRPC, WebSockets)
  13. Challenge #2: Supporting Diverse Edge Requirements
  14. Two Challenges: Key Takeaways. Microservices help to scale development, and Kubernetes helps to scale deployment and the runtime. We must scale the associated release/management mechanisms, and we must provide support for a diverse range of requirements. Bottom line: more microservices means more developer interaction at the edge.
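To make Challenge #2 concrete, the sketch below shows how one gateway might route several protocols from a single edge. It follows the shape of an Ambassador-style Mapping CRD (the presenter works on Ambassador; `grpc` and `use_websocket` are fields from older versions of that CRD), but the service names and ports are purely illustrative:

```yaml
# Illustrative only: one edge, several protocols.
# Service names and ports are made up for this sketch.
apiVersion: getambassador.io/v2
kind: Mapping
metadata:
  name: search-grpc
spec:
  prefix: /search/
  service: search:8081
  grpc: true               # route gRPC (HTTP/2) traffic to the search service
---
apiVersion: getambassador.io/v2
kind: Mapping
metadata:
  name: chat-ws
spec:
  prefix: /chat/
  service: chat:8082
  use_websocket: true      # allow WebSocket upgrade for the chat service
```

The point is not the specific CRD but that the gateway must speak JSON-over-HTTP, gRPC, and WebSockets side by side, per route.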
  15. Three Strategies for Managing APIs and the Edge with Kubernetes
  16. Choosing a “strategy” can sound scary
  17. Overview: #1: Deploy an Additional Kubernetes API Gateway; #2: Extend Existing API Gateway; #3: Deploy a Comprehensive Self-Service Edge Stack https://www.getambassador.io/resources/strategies-managing-apis-edge-kubernetes/
  18. #1: Deploy an Additional Kubernetes API Gateway. Simply deploy an additional “in-cluster” gateway, below the existing gateway or below the load balancer. Management: development teams are responsible, or the existing ops team manages this.
  19. #1: Deploy an Additional Kubernetes API Gateway. Pros: minimal change to the core edge infrastructure; enables easy incremental migration; provides a limited blast radius. Cons: increased management overhead from working with different components; increased operational overhead; challenging to expose the functionality to each independent microservice team; challenging to locate and debug where within the edge stack issues occur.
  20. #1: Deploy an Additional Kubernetes API Gateway. As much edge functionality as possible should be pushed into the Kubernetes API gateway and directly exposed to application developers. For edge functionality that needs to remain centralized, the operations team should create a workflow for application developers and support this with SLAs; application development teams should use these SLAs in their release planning to minimize release delays. Existing edge components should be configured to pass through traffic on specific routes to the additional gateway.
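As a minimal sketch of strategy #1, the additional in-cluster gateway can be exposed as a plain Kubernetes Service that the existing edge forwards specific routes to. All names, ports, and the namespace below are assumptions for illustration; the slides do not prescribe them:

```yaml
# Hypothetical sketch: expose an additional in-cluster API gateway so
# the existing load balancer can pass through e.g. /k8s/* to it.
# "k8s-gateway", the namespace, and all ports are illustrative.
apiVersion: v1
kind: Service
metadata:
  name: k8s-gateway
  namespace: edge
spec:
  type: NodePort           # existing edge forwards selected routes to this port
  selector:
    app: k8s-gateway
  ports:
    - port: 80
      targetPort: 8080
      nodePort: 30080
```

The existing gateway or load balancer would then be configured (in its own configuration language) to send traffic for the chosen prefixes to node port 30080, which keeps the blast radius limited to those routes.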
  21. Strategy #2: Extend Existing API Gateway. Implemented by modifying or augmenting the existing API gateway solution: enable synchronization between the API endpoints and the location of Kubernetes services, typically via a custom ingress controller for the existing API gateway or load balancer.
  22. Strategy #2: Extend Existing API Gateway. Pros: reuse the existing tried-and-trusted API gateway; leverage existing integrations with on-premises infrastructure and services; minimal need to learn Kubernetes networking technologies. Cons: workflows must change to preserve a single source of truth for the API gateway configuration; only a limited set of configuration parameters can be expressed via Kubernetes annotations; additional education and training may be required for developers.
  23. Strategy #2: Extend Existing API Gateway. It is recommended to shift away from the traditional API/UI-driven configuration model of the existing gateway. In addition, a standardized set of scripts should be used so that any modification of routes to services running outside the Kubernetes cluster does not conflict with the services running inside the new cluster. Before adopting this strategy, an architectural review of current and anticipated edge requirements for microservices is essential.
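In strategy #2, the custom ingress controller typically watches standard Ingress resources and translates them into the existing gateway's configuration, with vendor-specific behavior expressed as annotations. The sketch below uses the standard `networking.k8s.io/v1` Ingress shape; the annotation key, ingress class name, and service details are placeholders for whatever the vendor's controller actually defines:

```yaml
# Hypothetical sketch: driving an existing (non-Kubernetes-native)
# gateway from an Ingress resource via its custom ingress controller.
# The annotation key and class name are placeholders, not a real vendor API.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: orders-api
  annotations:
    example-gateway.io/rate-limit: "100"   # vendor-specific; illustrative only
spec:
  ingressClassName: existing-gateway
  rules:
    - http:
        paths:
          - path: /orders
            pathType: Prefix
            backend:
              service:
                name: orders
                port:
                  number: 80
```

This also illustrates the listed con: anything the annotation vocabulary cannot express still has to be configured in the gateway directly, which is how the single source of truth gets split.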
  24. #3: Deploy a Comprehensive Self-Service Edge Stack. Deploy a Kubernetes-native API gateway with integrated supporting edge components, installed in each of the new Kubernetes clusters and replacing the existing edge. The ops team owns the stack and provides sane defaults; dev teams are responsible for configuring the edge stack as part of their normal workflow.
  25. #3: Deploy a Comprehensive Self-Service Edge Stack. Pros: good integration with the new API gateway and the surrounding “cloud native” stack; edge management is simplified into a single stack that is configured and deployed via the same mechanisms as any other Kubernetes configuration; supports cloud-native best practices such as a “single source of truth” and GitOps. Cons: potentially a large architectural shift; the platform team must learn new proxy technologies and potentially new edge component technologies; changes to the engineering workflow and increased responsibility for devs.
  26. #3: Deploy a Comprehensive Self-Service Edge Stack. Each microservice team is empowered to maintain the edge configuration specific to its microservices, and the edge stack aggregates the distributed configuration into a single consistent configuration for the edge. To support the diversity of edge services, adopt an edge stack built on a modern L7 proxy with a strong community, such as the Cloud Native Computing Foundation’s Envoy Proxy. The breadth of the community helps ensure that the core routing engine in the edge stack can support a diversity of use cases.
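The self-service model described above can be sketched as a per-team routing resource that lives next to the team's own manifests. This assumes an Ambassador-style Mapping CRD (a natural fit given the presenter's affiliation with Datawire); the team, namespace, and service names are illustrative:

```yaml
# Sketch of self-service edge configuration, assuming an
# Ambassador-style Mapping CRD. Names are illustrative.
# The payments team keeps this file in its own repo/namespace;
# the edge stack aggregates all such Mappings into one edge config.
apiVersion: getambassador.io/v2
kind: Mapping
metadata:
  name: payments-mapping
  namespace: payments        # owned by the payments team, not by ops
spec:
  prefix: /payments/
  service: payments:8080
```

Because each Mapping is plain Kubernetes configuration, it flows through the same GitOps pipeline as the service it routes to, which is what keeps the edge configuration a single source of truth while remaining distributed across teams.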
  27. Conclusions. Managing the edge of the system has always been complicated, and more services with a diversity of architectures increases the demands on the edge. Two primary challenges: (1) how to scale the management of the edge; (2) how the gateway can support a broad range of requirements. Three patterns at the “cloud native” edge: additional gateway, extended gateway, edge stack. We think the future is the self-service edge stack.
  28. Effective Management of APIs and the Edge when Adopting Kubernetes (Daniel Bryant, Product Architect, Datawire)
