Batfish Configuration Validation Testing
Author: Nathan Winemiller
Network Engineer - Dropbox
Large networks generally comprise hundreds to thousands of devices that perform different functions and are connected to each other in various topologies. In many organizations the configurations for these devices are generated from information that lives in a network database. This information is considered the source of truth for how these devices are supposed to be configured.
However, while provisioning devices with the generated config is common, in my experience managing subsequent changes to these devices through the same configuration generation system is not (your mileage may vary here). Since operators routinely perform changes by hand, configuration drift is inevitable. A tool that assesses whether a device's current configuration aligns with the configuration generated from the source of truth is therefore extremely useful.
Goals
- Compare the running configuration on a device to a configuration created by a configuration generation system
- Return differences between the running config and generated config in a way that is understandable (at first by a human and eventually by an automation system)
- Analyze configuration from multiple platforms / vendors (NXOS, EOS, Junos)
- Must be extendable to include new platforms / protocols / tests
Configurations to Verify
- BGP Peer configuration - Required
- Physical Interface configuration - Required
- LAG configuration - Required
- NTP/TACACS/Syslog/DNS/DHCP/SNMP - Required but flexible depending on feature set
Batfish / Pybatfish
Batfish is an open source network configuration analysis tool which can be used to test the correctness of network configuration. More information is available in the Batfish documentation. Batfish primarily works by reading in a configuration and building vendor-agnostic models for each portion of the configuration. Questions can then be run against these models for verification. Pybatfish is a Python client library that can be used to interact with the Batfish service.
Batfish can also do advanced analysis based on topology, RIB data, etc. However, for the purposes of this system, we will only be using it to do configuration comparison and validation.
Batfish Question Sets (that we probably care about)
- nodeProperties - Configuration settings of the node (generally things that are globally significant)
- interfaceProperties - Configuration settings that pertain to the properties of a given interface (includes some properties for LAGs)
- bgpProcessConfiguration - Configuration that is global to a BGP process (router ID, etc.)
- bgpPeerConfiguration - Configuration settings for each BGP peer (includes inherited properties from peer groups)
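As a sketch of how these four question sets might be pulled together, something like the following could run them all and return each answer as a pandas DataFrame. The function name is hypothetical; it assumes a Batfish service is reachable, load_questions() has been called, and a snapshot has been initialized (the import is deferred so the sketch can be read without a Batfish install):

```python
def run_question_suite():
    """Run the four question sets above against the current snapshot.

    Hypothetical helper: assumes a live Batfish service, load_questions()
    already called, and a snapshot already initialized.
    """
    from pybatfish.question import bfq  # deferred: needs a pybatfish install

    return {
        "node": bfq.nodeProperties().answer().frame(),
        "interface": bfq.interfaceProperties().answer().frame(),
        "bgp_process": bfq.bgpProcessConfiguration().answer().frame(),
        "bgp_peer": bfq.bgpPeerConfiguration().answer().frame(),
    }
```

Each value in the returned dict is a pandas DataFrame, which makes the results easy to filter and diff downstream.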
The tester will gather a running configuration from a specified device and also generate a configuration for the same device via the config gen system. The tester will load both configurations into Batfish and run the specified question sets against them. Differences will be printed to stdout and reviewed.
Currently the Batfish service is running in a Docker container on a local machine.
Pybatfish was installed via pip and will be used to feed data into the Batfish service and to ask the required questions.
Currently the running configs will be saved and SCP’d from the device in question (although plans are to pull these from RANCID at a later date) and put in a directory named “running”. Generated configs will be created at the time the tests are executed and will be put into a directory named “generated”.
The configuration structure required to generate network snapshots for Batfish is described in the Batfish documentation.
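For reference, Batfish expects each snapshot directory to contain a configs/ subdirectory with one file per device. A small stdlib-only helper along these lines (function name and hostname are placeholders) can lay out both snapshots:

```python
from pathlib import Path


def write_snapshot(snapshot_dir: str, hostname: str, config_text: str) -> Path:
    """Write one device config into <snapshot>/configs/, as Batfish expects."""
    configs = Path(snapshot_dir) / "configs"
    configs.mkdir(parents=True, exist_ok=True)
    path = configs / f"{hostname}.cfg"
    path.write_text(config_text)
    return path


# e.g. the SCP'd running config and the freshly generated config:
write_snapshot("running", "n3k-sw1", "hostname n3k-sw1\n")
write_snapshot("generated", "n3k-sw1", "hostname n3k-sw1\n")
```

Both directories can then be handed to bf_init_snapshot as-is.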
The running configuration will be modified slightly based on the test and reverted back to its original state before the next test.
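The modify-and-revert step can be wrapped in a context manager so each test's mutation is guaranteed to be undone before the next test, even if the test fails. A stdlib-only sketch (the file path and substitution in the usage note are placeholders):

```python
from contextlib import contextmanager
from pathlib import Path


@contextmanager
def mutated_config(path, old, new):
    """Temporarily replace `old` with `new` in a config file, then revert."""
    p = Path(path)
    original = p.read_text()
    try:
        p.write_text(original.replace(old, new))
        yield p
    finally:
        p.write_text(original)  # restore the original before the next test
```

Usage might look like: `with mutated_config("running/configs/n3k-sw1.cfg", "ntp server 10.0.0.1", "ntp server 10.9.9.9"): ...` with the Batfish comparison run inside the block.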
Initial tests will be conducted on the Cisco Nexus 3K platform.
Initial Code - Very rough POC
from pybatfish.client.commands import *  # bf_session, bf_set_network, bf_init_snapshot, ...
from pybatfish.question.question import load_questions, list_questions
from pybatfish.question import bfq  # question templates, populated by load_questions()
import pandas as pd  # Batfish answers come back as pandas DataFrames
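A possible continuation of the POC, wiring the pieces together. The network/snapshot names and the localhost service address are assumptions, and the imports are deferred into the function so the sketch can be read without a Batfish install; the differential answer mechanism (snapshot vs. reference_snapshot) is standard Pybatfish:

```python
def compare_snapshots(running_dir="running", generated_dir="generated"):
    """Load both snapshots into Batfish and return a differential answer
    showing node properties that differ between them."""
    from pybatfish.client.commands import (  # deferred: needs pybatfish
        bf_session, bf_set_network, bf_init_snapshot,
    )
    from pybatfish.question.question import load_questions
    from pybatfish.question import bfq

    bf_session.host = "localhost"        # Batfish docker container
    bf_set_network("config-validation")  # arbitrary network name
    load_questions()

    bf_init_snapshot(running_dir, name="running", overwrite=True)
    bf_init_snapshot(generated_dir, name="generated", overwrite=True)

    # Each row of the differential frame is a property whose value differs
    # between the running and generated snapshots.
    return (bfq.nodeProperties()
               .answer(snapshot="running", reference_snapshot="generated")
               .frame())
```

Printing the returned DataFrame gives the human-readable diff called for in the goals; the same frame can later be consumed by an automation system.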