Application stops responding when exposed via VirtualService + Gateway

Firstly, I’d like to preface this by saying that I’m new to Kubernetes and Istio, so if the question below is a stupid one, please have mercy on me!

I have a Kubernetes 1.13.10 cluster with Istio 1.3.2, running in Azure Kubernetes Service. It runs a server that talks over a single port using TCP. I’m running one replica and it comes up fine - I can, for example, port-forward to the pod and connect to the server successfully.
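
To be concrete, this is roughly how I’ve been checking the server directly (the pod/deployment name here is just what my manifests below use):

# forward local port 25001 to the container port and connect to it locally
kubectl port-forward deployment/server 25001:25001
# in another terminal my client connects to localhost:25001 and works fine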

apiVersion: apps/v1
kind: Deployment
metadata:
  name: server
  labels:
    app: server
spec:
  selector:
    matchLabels:
      app: server
  template:
    metadata:
      name: server
      labels:
        app: server
    spec:
      containers:
        - image: myserverimage:latest
          name: server
          args: ["./Server", "3600"]
          ports:
            - containerPort: 25001
              protocol: TCP
              name: tcp
---
apiVersion: v1
kind: Service
metadata:
  name: server
  labels:
    app: server
spec:
  selector:
    app: server
  ports:
    - port: 50090
      targetPort: 25001
      name: tcp
  type: ClusterIP
---
apiVersion: networking.istio.io/v1alpha3
kind: VirtualService
metadata:
  name: server
spec:
  gateways:
  - server
  hosts:
  - '*'
  tcp:
  - match:
    - port: 50090
    route:
    - destination:
        host: server
        port:
          number: 50090
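
As far as I can tell the Service itself is wired up correctly at this point - this is what I’ve been looking at to confirm it has endpoints, and nothing looks unusual to me:

kubectl get svc server
kubectl get endpoints server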

However, when I add a Gateway, the server starts to receive traffic and stops working - for example, I can no longer connect when port-forwarding to the pod.

apiVersion: networking.istio.io/v1alpha3
kind: Gateway
metadata:
  labels:
    app: server
  name: server
spec:
  selector:
    istio: ingressgateway
  servers:
  - port:
      number: 50090
      name: tcp
      protocol: TCP
    hosts:
    - "*"

Here are the Envoy access logs from the istio-proxy sidecar, starting from the point at which I add the Gateway. The first two requests are successful, but after that they seem to start failing (I’ve included one failing request, but it continues like that):

{'authority': '-',
 'bytes_received': '0',
 'bytes_sent': '8501',
 'downstream_local_address': '10.244.3.127:25001',
 'downstream_remote_address': '10.244.2.189:46338',
 'duration': '4217',
 'istio_policy_status': '-',
 'method': '-',
 'path': '-',
 'protocol': '-',
 'request_id': '-',
 'requested_server_name': '-',
 'response_code': '0',
 'response_flags': '-',
 'route_name': '-',
 'start_time': '2019-10-22T14:16:08.442Z',
 'upstream_cluster': 'inbound|50090|tcp|server.namespace.svc.cluster.local',
 'upstream_host': '127.0.0.1:25001',
 'upstream_local_address': '127.0.0.1:41060',
 'upstream_service_time': '-',
 'upstream_transport_failure_reason': '-',
 'user_agent': '-',
 'x_forwarded_for': '-'}
{'authority': '-',
 'bytes_received': '0',
 'bytes_sent': '8501',
 'downstream_local_address': '10.244.3.127:25001',
 'downstream_remote_address': '10.244.2.189:46342',
 'duration': '4456',
 'istio_policy_status': '-',
 'method': '-',
 'path': '-',
 'protocol': '-',
 'request_id': '-',
 'requested_server_name': '-',
 'response_code': '0',
 'response_flags': '-',
 'route_name': '-',
 'start_time': '2019-10-22T14:16:08.660Z',
 'upstream_cluster': 'inbound|50090|tcp|server.namespace.svc.cluster.local',
 'upstream_host': '127.0.0.1:25001',
 'upstream_local_address': '127.0.0.1:41062',
 'upstream_service_time': '-',
 'upstream_transport_failure_reason': '-',
 'user_agent': '-',
 'x_forwarded_for': '-'}
{'authority': '-',
 'bytes_received': '0',
 'bytes_sent': '0',
 'downstream_local_address': '10.244.3.127:25001',
 'downstream_remote_address': '10.244.2.189:46546',
 'duration': '460',
 'istio_policy_status': '-',
 'method': '-',
 'path': '-',
 'protocol': '-',
 'request_id': '-',
 'requested_server_name': '-',
 'response_code': '0',
 'response_flags': 'UF,URX',
 'route_name': '-',
 'start_time': '2019-10-22T14:16:28.660Z',
 'upstream_cluster': 'inbound|50090|tcp|server.namespace.svc.cluster.local',
 'upstream_host': '127.0.0.1:25001',
 'upstream_local_address': '-',
 'upstream_service_time': '-',
 'upstream_transport_failure_reason': '-',
 'user_agent': '-',
 'x_forwarded_for': '-'}
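
For reference, I pulled those entries from the sidecar with something like the command below (the pod name is a placeholder). If I’m reading the Envoy docs correctly, the UF and URX response flags on the failing entry mean an upstream connection failure with the connect attempts exhausted, i.e. the sidecar could no longer connect to my server on 127.0.0.1:25001:

kubectl logs <server-pod-name> -c istio-proxy --tail=100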

I’m something of a beginner, so I’m confused about what’s going on:

  1. Where is this traffic coming from? Is this the LoadBalancer querying my service?
  2. What is going wrong?
  3. What approach can I take to debugging this?

I suspect there might be something wrong with the server in my pod and that the traffic above is what’s breaking it. But before I chase down that angle, I’d like to understand whether this traffic is coming from Istio itself or whether I’ve overlooked something.
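
In case it’s relevant to question 3, these are the istioctl commands I’ve been poking at so far, though I’m not sure I’m interpreting their output correctly (pod name and namespace are placeholders):

# check the sidecar and gateway are in sync with Pilot
istioctl proxy-status
# inspect the listeners and clusters Pilot has pushed to the sidecar
istioctl proxy-config listeners <server-pod-name>.<namespace>
istioctl proxy-config clusters <server-pod-name>.<namespace>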