VMware Horizon geolocation heatmap in Kibana


Because of the Covid crisis most of our employees work from home. Fortunately we already had a VMware Horizon environment ready, so our users could use that. This situation did bring our servicedesk some new problems: users calling that they can't connect to our environment.

So my idea was to build a VMware Horizon dashboard within Kibana to pinpoint problems outside our environment, e.g. a user's ISP. If, for example, a group of users calls about connection problems and we see that all those users are in the same geographic location, a local outage is probably the cause.

And of course it's cool to create these kinds of heatmaps in Kibana.

We need an ELK stack with Logstash listening for syslog on UDP port 514.

Set up an external syslog server in VMware Horizon
  1. In Horizon Administrator, select View Configuration > Event Configuration.
  2. In the Syslog area, click Add next to Send to syslog servers to configure the Connection Server to send events to a syslog server, which in our case is the Logstash server. Supply the server name or IP address and the UDP port number of your Logstash server.
Elastic index template

To create maps within Kibana you need a field in Elasticsearch that contains geo information. To do so, first create an index template that maps this field to the geo_point type.

{
  "properties": {
    "ClientIpAddress": {
      "type": "ip"
    },
    "location": {
      "type": "geo_point"
    }
  }
}
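
The snippet above is only the mappings part. To actually register it, wrap it in an index template that targets your syslog indices. A minimal sketch via the Kibana Dev Tools console, assuming Elasticsearch 7.8 or later (older versions use the legacy _template API) and the syslog-%{+YYYY.MM.dd} index name from the Logstash output below; the template name syslog-geo is just an example:

PUT _index_template/syslog-geo
{
  "index_patterns": ["syslog-*"],
  "template": {
    "mappings": {
      "properties": {
        "ClientIpAddress": {
          "type": "ip"
        },
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}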
Configure Logstash

Now we can set up our syslog input, output and filters for Logstash. You can put everything in one file, but I created two separate files: one for input/output and one for filters. I find this easier for maintaining the filters and keeping an overview. I am using a TLS connection to my Elasticsearch, which is why the output sets ssl to true and points to a CA certificate file; this is not needed if you don't use TLS.
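
On a package-based install these files typically live in /etc/logstash/conf.d, where Logstash concatenates all config files in lexical order, hence the numeric prefixes:

/etc/logstash/conf.d/
├── 0100-syslog.conf                 (input and output)
└── 0101-syslog-vmware-filter.conf   (filters)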

0100-syslog.conf

input {
  # note: binding to a port below 1024 normally requires root
  # privileges or the CAP_NET_BIND_SERVICE capability
  tcp {
    port => 514
    type => "syslog"
    tags => [ "syslog" ]
  }
  udp {
    port => 514
    type => "syslog"
    tags => [ "syslog" ]
  }
}

output {
  if "syslog" in [tags] {
    elasticsearch {
      hosts => ["https://elasticsearch:9200"]
      index => "syslog-%{+YYYY.MM.dd}"
      user => "user"
      password => "password"
      # TLS settings; drop these if you are not using TLS
      ssl => true
      ssl_certificate_verification => false
      cacert => "/certs/ca.crt"
    }
    # also print each event to stdout, handy while debugging
    stdout { codec => rubydebug }
  }
}

0101-syslog-vmware-filter.conf

I will break this filter down into pieces so it's easier to understand.

First I want to tag these events so that I know the logs come from the VMware View servers. What I did is add the IPs of the VMware View server cluster to this filter: when the event comes from one of these IPs and the type of the log is syslog, the vmware_view_logs tag is added.

filter {
  # note the parentheses: without them "and" binds tighter than "or",
  # so the type check would only apply to the first address
  if [type] == "syslog" and ([server] == "100.100.100.100" or [server] == "101.101.101.101" or [server] == "102.102.102.102" or [server] == "103.103.103.103") {
    mutate {
      add_tag => [ "vmware_view_logs" ]
    }
  }
}

Next I will be using the Logstash KV filter plugin because of the way VMware builds up its log messages. VMware uses the field=value format, and the KV filter extracts those fields and values. Below you can see a log line that VMware creates; the field we are looking for is ClientIpAddress.

Logstash KV Filter info

<166>1 2021-03-09T16:32:25.906+01:00 AB-WX-FL-138.domain.nl View - 1002 [View@6876 Severity="INFO" Module="Agent" EventType="AGENT_SHUTDOWN" DesktopId="GR-WX-FL" DesktopDisplayName="Windows 10 - Floating" PoolId="ab-wx-fl" MachineId="e793069c-e5ed-49b7-bd0b-05d482225aba" MachineName="AB-WX-FL-138" MachineDnsName="gr-wx-fl-138.domain.nl"] The agent running on machine AB-WX-FL-138 has shut down, this machine will be unavailable ClientIpAddress="123.123.123.123"

This is the kv filter configuration I am using. I've added some characters I wanted removed from the values to get a cleaner output.

filter {
  kv {
    # strip angle brackets, square brackets and commas from values
    remove_char_value => "<>\[\],"
  }
}
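
To illustrate, feeding the sample log line above through this kv filter should yield fields along these lines (an illustrative subset, not captured from a live pipeline):

{
  "Severity": "INFO",
  "Module": "Agent",
  "EventType": "AGENT_SHUTDOWN",
  "DesktopDisplayName": "Windows 10 - Floating",
  "MachineName": "AB-WX-FL-138",
  "ClientIpAddress": "123.123.123.123"
}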

Next I created a filter that checks whether the ClientIpAddress field contains an external IP. I don't want to log internal IPs when users connect from within our company; that defeats the whole purpose. I also added 0.0.0.0 and 127.0.0.1, just in case something like that gets output somehow.

filter {
  # dots are escaped so "." doesn't match arbitrary characters
  if [ClientIpAddress] =~ "^100\.100\." or [ClientIpAddress] =~ "^127\.0\." or [ClientIpAddress] == "0.0.0.0" {
    mutate {
      add_tag => [ "internal" ]
    }
  }
}
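
As an alternative to regular expressions, Logstash's cidr filter plugin (bundled with the default distribution) matches addresses against network ranges directly, which avoids the dot-escaping pitfalls. A sketch, assuming the internal ranges here are 100.100.0.0/16 and the loopback network:

filter {
  cidr {
    # tag the event as internal when the client IP falls in these ranges
    address => [ "%{ClientIpAddress}" ]
    network => [ "100.100.0.0/16", "127.0.0.0/8" ]
    add_tag => [ "internal" ]
  }
}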

So now comes the part where we look up the geo information for the IP address in the ClientIpAddress field.

Since internal IPs are already tagged, the condition checks that the internal tag is absent. We also only want the geoip lookup to run when ClientIpAddress exists in the syslog message, which isn't always the case. When these conditions are met, a vmware_horizon_geo_city tag is added.

I am using the default database type City, with the ClientIpAddress field as source. I've set geoip as the target and added a tag on failure.

Now comes the tricky part: Kibana is really picky about how coordinates are written. There are a few options:

  • Geo-points expressed as strings are ordered as "latitude,longitude". Geo-points expressed as arrays are ordered as the reverse: [longitude, latitude].
  • Geo-shapes expressed as geojson provide coordinates as [longitude, latitude]

We will be using geo-points in the array format, ordered [longitude, latitude]. The geoip filter creates, among others, the fields geoip.longitude and geoip.latitude, and I want to combine those two values into the single location field. In Logstash, adding the same field twice turns it into an array, so adding longitude first and latitude second gives exactly the [lon, lat] ordering the mapping expects. Earlier we created a geo_point template mapping for the location field, so that is where this info is written.

I also wanted information about the client's ISP, so I added a second geoip function that uses the ASN database type. Because I don't want to create a heatmap of ISPs, the geo_point conversion isn't needed there.

if "internal" not in [tags] and "vmware_view_logs" in [tags] and [ClientIpAddress] {
	geoip {
		add_tag => [ "vmware_horizon_geo_city" ]
        default_database_type => "City"
		source => ["ClientIpAddress"]
		target => "geoip"
		tag_on_failure => ["geoip-city-failed"]
        add_field => [ "[location]", "%{[geoip][longitude]}" ]
        add_field => [ "[location]", "%{[geoip][latitude]}"  ]
	}
    geoip {
		add_tag => [ "vmware_horizon_geo_asn" ]
		default_database_type => "ASN"
		source => "ClientIpAddress"
		target => "geo"
		tag_on_failure => ["geoip-asn-failed"]
	}
	mutate {
    convert => [ "[location]", "float"]
    }
 }
}
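
With all filters in place, an event for an external client should end up looking roughly like this (an illustrative shape with made-up values, assuming the legacy non-ECS geoip field names that the %{[geoip][longitude]} references above rely on):

{
  "ClientIpAddress": "123.123.123.123",
  "location": [ 4.9, 52.37 ],
  "geoip": {
    "longitude": 4.9,
    "latitude": 52.37,
    "city_name": "Amsterdam"
  },
  "geo": {
    "asn": 64496,
    "as_org": "Example ISP"
  },
  "tags": [ "syslog", "vmware_view_logs", "vmware_horizon_geo_city", "vmware_horizon_geo_asn" ]
}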
Create heatmap in Kibana

When you start Logstash with these filters you should start getting syslog events, and we can create the heatmap. Select your index pattern within the Maps section of Kibana.
