Monitoring AWS Route Table size with Zabbix

Brandon Strohmeyer
3 min read · Mar 29, 2018

Having been bitten by the 100 prefix limit in an AWS route table, it was time to develop some kind of check that would alert when a route table approached that limit. While it would be possible to accomplish this with Lambda and SNS, we already have a robust monitoring platform in place that is more than adequate for the job.

To get started, we need a script that Zabbix can use to automatically discover route tables in the same VPC as the Zabbix proxy. The script below uses the link-local EC2 instance metadata service as well as the boto3 library.

#!/usr/bin/env python
# discover_rtb.py
import boto3
import json
import requests

metadata_url = 'http://169.254.169.254/latest/dynamic/instance-identity/document'


def get_metadata(url, data):
    response = requests.get(url=url).json()
    return response[data]


def get_vpc_id(instance_id):
    region = get_metadata(metadata_url, 'region')
    client = boto3.client('ec2', region_name=region)
    response = client.describe_instances(
        InstanceIds=[
            instance_id
        ]
    )
    return response['Reservations'][0]['Instances'][0]['VpcId']


def describe_route_tables():
    rtb_list = []
    json_output = {}
    region = get_metadata(metadata_url, 'region')
    instance_id = get_metadata(metadata_url, 'instanceId')
    vpc_id = get_vpc_id(instance_id)
    client = boto3.client('ec2', region_name=region)
    response = client.describe_route_tables(
        Filters=[
            {
                'Name': 'vpc-id',
                'Values': [
                    vpc_id
                ]
            }
        ]
    )
    for route_table in response['RouteTables']:
        rtb_dict = {}
        rtb_dict['{#RTBID}'] = route_table['RouteTableId']
        rtb_list.append(rtb_dict)
    json_output['data'] = rtb_list
    return json.dumps(json_output, indent=4)


def main():
    route_tables = describe_route_tables()
    print(route_tables)


if __name__ == "__main__":
    main()

This returns JSON in the format Zabbix expects for low-level discovery:

{
    "data": [
        {
            "{#RTBID}": "rtb-dbc5xxxx"
        },
        {
            "{#RTBID}": "rtb-1fb0xxxx"
        },
        {
            "{#RTBID}": "rtb-b322xxxx"
        }
    ]
}

Next, we need a script that returns the size of a given route table. It takes a route table ID as input and outputs a single number; no special formatting is required by Zabbix.

#!/usr/bin/env python
# check_rtb_size.py
import boto3
import requests
import argparse

metadata_url = 'http://169.254.169.254/latest/dynamic/instance-identity/document'


def arguments():
    parser = argparse.ArgumentParser(description='Returns number of prefixes in specified route '
                                                 'table')
    parser.add_argument('rtb',
                        action='store',
                        help='Route Table ID')
    args = parser.parse_args()
    return args


def get_metadata(url, data):
    response = requests.get(url=url).json()
    return response[data]


def get_rtb_size(rtb):
    table_length = None
    region = get_metadata(metadata_url, 'region')
    client = boto3.client('ec2', region_name=region)
    response = client.describe_route_tables(
        RouteTableIds=[rtb]
    )
    for route_table in response['RouteTables']:
        table_length = len(route_table['Routes'])
    return table_length


def main():
    args = arguments()
    table_length = get_rtb_size(args.rtb)
    print(table_length)


if __name__ == "__main__":
    main()

As these scripts run directly on the EC2 instance hosting the Zabbix proxy, an IAM role should be created and attached to that instance to allow the required API calls. You can either roll your own policy or attach the AWS-managed ‘AmazonEC2ReadOnlyAccess’ policy.
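
If you prefer to scope the role down, the scripts only make two describe calls, so a minimal custom policy could look something like the sketch below (adjust to your own requirements):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ec2:DescribeInstances",
                "ec2:DescribeRouteTables"
            ],
            "Resource": "*"
        }
    ]
}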

Manually testing should give you the number of routes in a specified route table:

[brandon@zabbix-proxy]$ ./check_rtb_size.py rtb-887axxxx
19

Now that the external checks to discover route tables and report their size are done, we can build the discovery template in Zabbix.
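
With the scripts copied to the proxy’s ExternalScripts directory, the discovery rule itself is just an external check. As a rough sketch (exact field names vary slightly between Zabbix versions):

Name: Discover VPC route tables
Type: External check
Key: discover_rtb.py
Update interval: 1h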

{#RTBID} corresponds to the route table ID and is used as a parameter in the key of the item prototype.
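
The item prototype is another external check that passes the discovered ID to the sizing script; something along these lines (names here are placeholders):

Name: Route table size for {#RTBID}
Type: External check
Key: check_rtb_size.py["{#RTBID}"]
Type of information: Numeric (unsigned)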

Two triggers are created: a warning when a route table reaches 80% of capacity, and a higher-priority trigger when the route table is at the maximum.
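
Assuming the item key above and a template named ‘Template AWS Route Tables’ (both placeholders), the trigger prototypes could look like:

Warning: {Template AWS Route Tables:check_rtb_size.py["{#RTBID}"].last()}>=80
High:    {Template AWS Route Tables:check_rtb_size.py["{#RTBID}"].last()}>=100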

Finally, check Latest Data to confirm that Zabbix correctly auto-discovers VPC route tables and is pulling the route table size.

The Python scripts, as well as an export of this Zabbix template, can be found on GitHub.

Originally published at https://stro.io on March 29, 2018.
