Multi-region, multi-cluster Istio on GKE/GCP

Hello! Does anyone have experience running Istio in the single-control-plane configuration, connected to multiple k8s clusters spanning more than one GCP region?

The single-region case works perfectly, but we’re looking for a better availability story, and we’ll also eventually need to support workloads distributed globally across k8s clusters. We’re running in a “flat” 10.0.0.0/8 VPC, with subnets carved out for the different k8s clusters.

The problem we’re running into seems to be more of a limitation of Google’s cloud network infrastructure: a LoadBalancer Service with the “Internal” annotation can’t be exposed across multiple regions, because internal LBs are regional. So when we expose the pilot, policy, and telemetry IPs to give remote clusters a stable address to connect to, any cluster that isn’t in the same region as the control plane service IPs can’t reach the control plane.
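For reference, this is roughly how we’re exposing pilot today, a minimal sketch assuming the GCP internal-LB annotation (the key is `cloud.google.com/load-balancer-type` on our GKE version; newer clusters use `networking.gke.io/load-balancer-type`). The Service name is illustrative; the selector and port follow Istio’s default pilot chart:

```yaml
# Sketch: expose pilot via a GCP internal LB. Works only within one
# region, which is exactly the limitation described above.
apiVersion: v1
kind: Service
metadata:
  name: istio-pilot-ilb        # illustrative name
  namespace: istio-system
  annotations:
    cloud.google.com/load-balancer-type: "Internal"
spec:
  type: LoadBalancer
  selector:
    istio: pilot               # default label on pilot pods
  ports:
  - name: https-xds
    port: 15011                # pilot's mTLS XDS port
    targetPort: 15011
```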

An obvious workaround is to expose the Service IPs as external rather than internal. The drawback, however, is that our control-plane traffic is then exposed on public addresses.
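If we did go external, one partial mitigation (a sketch, not something we’ve validated) would be to pin the allowed clients with `loadBalancerSourceRanges`, which GCP enforces as firewall rules on the forwarding rule; the 15011 traffic is mTLS anyway, so exposure is mostly about attack surface. The range below is a placeholder for the remote clusters’ egress IPs:

```yaml
# Sketch: external LB for pilot, restricted to known client ranges.
apiVersion: v1
kind: Service
metadata:
  name: istio-pilot-external   # illustrative name
  namespace: istio-system
spec:
  type: LoadBalancer
  selector:
    istio: pilot
  loadBalancerSourceRanges:
  - 203.0.113.0/24             # placeholder: remote clusters' NAT/egress range
  ports:
  - name: https-xds
    port: 15011
    targetPort: 15011
```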

Another workaround is to carve out stable IPs for pilot, mixer, et al., assign them to a GCE VM running an LB such as Envoy in the same region as the control plane, and let that do the load balancing to the service IPs.
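For concreteness, a minimal sketch of what that Envoy VM might run: a plain TCP proxy passing the mTLS XDS traffic through to pilot, with analogous listeners for policy/telemetry. The backend address is a placeholder for the regional internal LB, and exact field names vary across Envoy versions:

```yaml
# Sketch: Envoy on a GCE VM forwarding pilot traffic (TLS passthrough).
static_resources:
  listeners:
  - name: pilot_forward
    address:
      socket_address: { address: 0.0.0.0, port_value: 15011 }
    filter_chains:
    - filters:
      - name: envoy.tcp_proxy
        config:
          stat_prefix: pilot
          cluster: pilot
  clusters:
  - name: pilot
    connect_timeout: 5s
    type: STRICT_DNS
    lb_policy: ROUND_ROBIN
    hosts:
    - socket_address: { address: 10.0.1.10, port_value: 15011 }  # placeholder ILB IP
```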

Wondering if anyone has run up against the same thing, or even if we’re going about this the wrong way? If there are GCP people here and there’s a better place to ask, please let me know.

Thanks!

We’re running many clusters across many regions in GKE. We’re avoiding the single-control-plane option completely. We have multiple clusters for resiliency (that’s one reason), and we don’t want to couple all network traffic to a control plane running in a single cluster. So for us, I believe we’ll be using external LBs for cross-region gateway traffic. (We’re still building this right now.)
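We haven’t finalized anything, but as a reference point for that model: with a control plane per cluster, cross-cluster calls go through each cluster’s ingress gateway, and the remote service is described locally with a ServiceEntry pointing at the other cluster’s gateway on the SNI passthrough port (15443 in Istio’s replicated-control-planes setup). Rough sketch with placeholder hostnames and IPs:

```yaml
# Sketch: reach a service in a remote cluster via that cluster's
# ingress gateway. Host, VIP, and gateway IP are placeholders.
apiVersion: networking.istio.io/v1alpha3
kind: ServiceEntry
metadata:
  name: httpbin-remote
spec:
  hosts:
  - httpbin.bar.global         # placeholder ".global" hostname
  location: MESH_INTERNAL
  ports:
  - name: http
    number: 8000
    protocol: http
  resolution: DNS
  addresses:
  - 240.0.0.2                  # placeholder VIP, routable only in-mesh
  endpoints:
  - address: 198.51.100.10     # placeholder: remote gateway's external IP
    ports:
      http: 15443              # SNI passthrough port on the remote gateway
```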

Hey @dwradcliffe, I’d be happy to hear more about your experience. I’m facing a similar decision right now, and any information would help me better understand my options.