Original source: Google Cloud Blog
Many organizations are moving all or part of their IT operations to the cloud to operate more efficiently. In practice, however, large-scale migration poses real difficulties: many important resources that enterprises keep on on-premises servers cannot be migrated to the cloud directly.
In addition, the move from on-premises infrastructure to the cloud can be constrained by policy requirements, security concerns, and established ways of operating.
The following should help you understand how we can help your enterprise move smoothly to the cloud.
Move to the open cloud stack
At Google Cloud Next ’18, the Cloud Services Platform, a managed platform built on Google’s open-source technologies, was announced. The platform provides tools for containerized, microservices-based application architectures that users can leverage to quickly migrate IT resources and build applications.
Cloud Services Platform combines the container orchestration tool Kubernetes with the service management platform Istio to give users the best experience in architecture, security, and operability. The goal is to improve cloud efficiency and reliability while adapting more easily to business growth. Here’s how to do this with Istio on GKE.
Istio is preferred when setting up services
Istio is a key tool for helping users adopt microservices architecture best practices. Its advantage is that it provides better visibility and security, making containerized workloads easier to manage. With Istio, we integrate directly with Kubernetes services and simplify container lifecycle management. Google Cloud was one of the first cloud providers to offer this capability.
Istio provides service-to-service authentication and can integrate with a user’s other services. Mutual TLS (mTLS) is added to service communications to ensure that all traffic is encrypted in transit. Istio issues an identity to each service, allowing users to enforce a separate policy for each application while providing strong authentication.
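As an illustration, enforcing mTLS for every workload in a namespace can be done with a small Istio policy resource. The sketch below uses the `PeerAuthentication` resource from newer Istio releases (the Istio 1.0-era equivalent used `Policy`/`MeshPolicy`); the namespace name is hypothetical.

```yaml
# Illustrative sketch: require strict mTLS for all workloads
# in the hypothetical "payments" namespace.
apiVersion: security.istio.io/v1beta1
kind: PeerAuthentication
metadata:
  name: default
  namespace: payments
spec:
  mtls:
    mode: STRICT
```

Applying this with `kubectl apply -f` causes the Istio sidecars in that namespace to reject any plaintext traffic, so every connection carries a verifiable service identity.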
In addition, because Istio integrates with Stackdriver, GCP’s native monitoring tool, you benefit from its visualization capabilities. This integration sends metrics, logs, and telemetry to Stackdriver, which lets you monitor information such as traffic, error rates, and latency for each service in GKE.
Istio 1.0 is a key step toward hybrid cloud management, since workloads run in different environments on-premises and in the cloud, whether as containerized microservices or monolithic virtual machines. By using Istio on GKE, you get containerized applications that are more visible, secure, and resilient, delivered through a simple add-on that works with existing applications.
Use Istio in GKE
The service-level view and security provided by Istio are particularly important for distributed applications that containerize microservice deployments, and Istio on GKE allows you to deploy Istio to a Kubernetes cluster with one click.
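A minimal sketch of enabling Istio at cluster-creation time follows. The flag names here follow the beta `gcloud` syntax of the Istio 1.0 era and may differ in current releases; the cluster name and zone are hypothetical.

```shell
# Hypothetical sketch: create a GKE cluster with the Istio add-on enabled,
# using permissive mTLS so existing plaintext clients keep working.
gcloud beta container clusters create my-istio-cluster \
    --addons=Istio \
    --istio-config=auth=MTLS_PERMISSIVE \
    --zone=us-central1-a
```

Permissive mode is a common starting point: services accept both mTLS and plaintext traffic while you migrate workloads, after which you can tighten policy to strict mTLS.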
Istio on GKE works with both new and existing container deployments. It lets you adopt features such as Istio security incrementally to keep existing deployments secure, and it simplifies Istio lifecycle management by automatically upgrading your Istio deployment when a new version is released.
Istio on GKE, currently in beta, is the latest in our efforts to make GKE the ideal choice for enterprises. You can try Istio on GKE from the Google Cloud Platform console. For more information, visit the Istio documentation at cloud.google.com/istio or the GKE documentation.
Optimize the GKE network
Earlier this year, we announced a number of new networking features for GKE, including VPC-native clusters, Shared VPC, container-native load balancing, and container-native network services for applications running on GKE and Kubernetes on Google Cloud.
- With VPC-native clusters, GKE natively supports VPC features such as scaling, IP management, security checks, and hybrid connectivity.
- Shared VPC allows you to delegate cluster management responsibilities to a cluster administrator while ensuring that critical network resources remain managed by a network administrator.
- Container-native load balancing allows you to create load balancers that use containers directly as endpoints, for better load distribution.
- Network services allow you to use Cloud Armor, Cloud CDN, and Identity-Aware Proxy with container workloads.
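For example, container-native load balancing is turned on by annotating a Service so that the load balancer places Pods directly into network endpoint groups (NEGs) instead of routing through node IPs. The service name, selector, and ports below are illustrative.

```yaml
# Hypothetical Service using container-native load balancing:
# the NEG annotation makes Pods direct load-balancer endpoints.
apiVersion: v1
kind: Service
metadata:
  name: web-backend
  annotations:
    cloud.google.com/neg: '{"ingress": true}'
spec:
  selector:
    app: web
  ports:
    - port: 80
      targetPort: 8080
```

Because traffic goes straight to Pods rather than being double-hopped through nodes, health checks and load distribution reflect the actual containers serving requests.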
We also announced a number of new features to simplify the configuration of container deployments, including some enhancements to both back-end and front-end configurations. These improvements make many operations and configurations easier, whether it’s identity and access management of network resources, or control of CDN, Cloud Armor, or load balancing.
Improved GKE security
GCP helps users secure container environments at every stage of the build and deployment lifecycle with software supply chain and runtime security tools. This includes integrations with multiple security partners, all built on Google’s security-centric infrastructure and practices. New features such as automatic node upgrades and private clusters add to the security options available to GKE users.
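A minimal sketch of combining these options at cluster creation follows; the flags shown are real `gcloud` options, but the cluster name, zone, and CIDR range are hypothetical and the exact flag set may vary by gcloud version.

```shell
# Hypothetical sketch: a private GKE cluster whose nodes have no public IPs,
# with node auto-upgrade enabled to pick up security patches automatically.
gcloud container clusters create my-private-cluster \
    --enable-private-nodes \
    --enable-ip-alias \
    --master-ipv4-cidr=172.16.0.0/28 \
    --enable-autoupgrade \
    --zone=us-central1-a
```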
You can read more about GKE’s new security features in “Exploring Container Security: This Is the Year of Security.”
Distribute Kubernetes applications through GCP Marketplace
Companies typically work with many partners in their IT environments, whether in the cloud or on-premises. Six months ago, we showed you how to deliver Kubernetes applications through GCP Marketplace. Kubernetes applications do more than provide container images: they are integrated, production-grade solutions published for one-click deployment on GKE, and they are managed as whole applications, which simplifies resource management. You can also deploy Kubernetes applications to non-GKE Kubernetes clusters, whether on-premises or in the cloud, for rapid development and unified billing across multiple containers.
Define your cloud services on your terms
If you use containers and Kubernetes, you’ll be familiar with how they optimize infrastructure resources, reduce operational overhead, and improve application portability. By standardizing on Kubernetes, you also set the stage for improved service management and security, and for simplified application procurement and deployment across the cloud and on-premises.
Stay tuned for more information on Kubernetes, microservices and cloud service platforms in the coming months.
Make Google Cloud support you better
This year, the Google Cloud team will work more closely with the Google developer community to provide better support and service to Google Cloud users. We will not only produce more high-quality documentation and technical blog posts to solve problems users encounter, but also invest more resources in establishing direct contact with Google Cloud users and answering their questions face to face.
We sincerely hope that more Google Cloud users and developers will participate in building the Google Cloud technology ecosystem, so that we can better understand your needs, the problems you encounter, and anything else you want to tell us.
Scan the QR code below to take part in the Google Cloud team’s online survey and receive first-hand learning materials and technical resources. We will select 10 participants to receive official Google Cloud merchandise and the opportunity to join the Google Cloud team’s offline community events.