# Getting Started with the App
This document provides a step-by-step tutorial on how to get the App going and how to use it.
## Install the App
To install the App, please follow the instructions detailed in the Installation Guide.
## First steps with the App
After you have installed the app and set up the device with Nornir within Nautobot, you need to configure the app to run the checks.
When first installed, the Operational Compliance app only provides the data structures for checks; no checks are defined. You need to define the checks that you want to run.
The main components of the app are, in the order that they are configured:
- Checks: The core building blocks of the app, defining the check type, description, check type options (what to match on in the comparison), and logging severity.
- Check commands: The commands run on the devices to gather data for a check. Each defines a platform-specific command to run and a JMESPath expression for data extraction, and is associated with an existing check.
- Check groups: Associate one or more check commands with a group, and define the order in which the commands are run.
- Collect checks: Run a check or a check group on the devices to gather the data for the checks.
- Compare checks: Compare the data from two collect checks to determine whether the check passed or failed.
The high level steps to configure the app are:
- Create a check
- Create a check command
- Create a check group
- Run a collect check twice with the same check group to get two sets of data to compare
- Run a compare check to compare the two sets of data
## Quick Start
If you want to get started quickly, you can use the following script to create checks, check commands, and a check group for an IOS device. It creates a check for the device facts (gathered with NAPALM) and a check for the `show version` command (parsed with TextFSM), associates the corresponding check commands with a check group, and enables the jobs so that the checks can be run. You can paste it into a `nautobot-server nbshell` session.
```python
from nautobot_operational_compliance.models import Check, CheckCommand, CheckGroup, CheckTypeChoices, CheckParserChoices
from nautobot.dcim.models import Platform
from nautobot.extras.models import Job


def create_operational_compliance_checks():
    """Create operational compliance checks for the Catalyst 8000."""
    # Get the platform
    ios_platform = Platform.objects.get(name="cisco_ios")

    # Create check group
    check_group, _ = CheckGroup.objects.get_or_create(
        name="Catalyst 8000 Operational Checks",
        defaults={"description": "Operational compliance checks for Catalyst 8000 router"},
    )

    # Define checks for Catalyst 8000
    checks_data = {
        "Device Facts": {
            "desc": "Gather device facts using NAPALM",
            "check_type": CheckTypeChoices.EXACT_MATCH,
            "commands": [[ios_platform, "*", CheckParserChoices.NAPALM, "get_facts"]],
        },
        "OS Version": {
            "desc": "Check OS version information",
            "check_type": CheckTypeChoices.EXACT_MATCH,
            "commands": [[ios_platform, "[*].version", CheckParserChoices.TEXTFSM, "show version"]],
        },
    }

    for check_name, check_details in checks_data.items():
        check, created = Check.objects.get_or_create(
            name=check_name,
            defaults={
                "check_type": check_details["check_type"],
                "description": check_details["desc"],
                "check_type_options": check_details.get("options"),
            },
        )
        if created:
            print(f"Created check: {check_name}")
        else:
            print(f"Check already exists: {check_name}")

        # Create check commands
        for command_info in check_details["commands"]:
            try:
                check_comm, created = CheckCommand.objects.get_or_create(
                    parser=command_info[2],
                    path=command_info[1],
                    check_object=check,
                    command=command_info[3],
                    platform=command_info[0],
                )
                if created:
                    print(f"  Created command: {command_info[3]}")
            except Exception as e:
                print(f"  Error creating command {command_info[3]}: {e}")

        # Add check to group
        check_group.checks.add(check)

    check_group.save()
    print(f"Created check group: {check_group.name}")
    return check_group


def enable_jobs():
    """Enable all operational compliance jobs."""
    jobs = Job.objects.filter(name__icontains="operational")
    for job in jobs:
        job.enabled = True
        job.save()
        print(f"Enabled job: {job.name}")


# Run the setup and enable the jobs
create_operational_compliance_checks()
enable_jobs()
```
Even though the script populates sample checks, check commands, and check groups, you will still need to run the collect checks yourself. These sample checks also do not define any check type options, so the comparisons will be very simple.
## Creating a Check
Let's start by walking through a simple example of creating a check. This example will create a check for the device facts on an IOS device.
Creating a check in the Nautobot UI begins under the Operations menu, and under the Operational Compliance submenu.
Click the **+** button to create a new check.
The check creation form will appear. Fill in the form with the following values:
- Name: `Sample Device Facts Check`
- Check Type: `EXACT_MATCH`
- Check Type Options: `{}`
- Logging Severity: `INFO`
Click the **Save** button to create the check.
## Creating a Check Command
The next step is to create a check command. A check command is a command that is run on the device to gather the data for the check. You can find the Check Commands page under the Operations menu, and under the Operational Compliance submenu.
Click the **+** button to create a new check command.
The check command creation form will appear. Fill in the form with the following values:
- Parser: `NAPALM` (the parser that will be used to parse the output of the command)
- NAPALM Getter: `get_facts` (the command that will be run on the device; in this case, the NAPALM plugin gathers the device facts and parses the output)
- Check Object: `Check: Sample Device Facts Check` (the check that the check command will be associated with, created in the previous step)
- Platform: `cisco_ios` (the platform of the device)
- Path: `*` (a JMESPath expression that matches all values from the command output; you can also specify a more specific path)
Note
The JMESPath expression is used to extract data from the command output. In this case, we are using the `*` wildcard to match all values from the command output. This is a good way to get all the data, but you can also specify a path to extract a specific value. For example, to extract the `os_version` from the command output, you would use the JMESPath expression `[*].os_version`.
Click the **Save** button to create the check command.
## Creating a Check Group
The next step is to create a check group. A check group is a group of check commands that are run on the device to gather the data for the check. You can find the Check Groups page under the Operations menu, and under the Operational Compliance submenu.
Click the **+** button to create a new check group.
The check group creation form will appear. Fill in the form with the following values:
- Name: `Sample Device Facts Check Group` (the name of the check group)
- Check Commands: `Check Command: Sample Device Facts Check` (the check command that will be associated with the check group, created in the previous step)
Note
You can add multiple check commands to a check group, and they will be run in the order that they are added.
Click the **Save** button to create the check group.
## Running a Collect Check
Now that you have created the check, check command, and check group, you can run a collect check. You can find the Collect Checks page under the Operations menu, and under the Operational Compliance submenu.
Click the **+** button to create a new collect check.
The collect check creation form will appear. Fill in the form with the following values:
- Check: `Sample Device Facts Check` (the check created earlier)
- Check Group: `Sample Device Facts Check Group` (the check group created earlier)
- Devices: Select the device that you want to run the check on
You can also optionally select other fields in the form to add additional filters and metadata to the collect check.
Click the **Collect Checks** button to run the collect check job.
The collect check job will run, and a pop-up will appear to confirm if the job was successful or if there were any errors.
You can view the results through the **Job Results** link in the pop-up, or by closing the pop-up and going back to the Collect Checks page.
The screenshot below shows the results of the collect check job. The data that was collected is within the Output field.
## Running a Compare Check
In order to run a compare check, you need to run a collect check twice with the same check group. This will create two sets of data to compare.
Run the collect check job again, and this time select the same check group and device as the first collect check. You will now have two collect checks with the same check group and device.
Now that you have two collect checks with the same check group and device, you can run a compare check. You can find the Compare Checks page under the Operations menu, and under the Operational Compliance submenu.
Click the **+** button to create a new compare check.
The compare check creation form will appear. Fill in the form with the following values:
- Device: Select the device that you want to run the check on
- Check: `Check: Sample Device Facts Check` (the check created earlier)
- Collect Check 1: Select the first collect check that you ran
- Collect Check 2: Select the second collect check that you ran
Click the **Compare Checks** button to run the compare check job. You will see a pop-up displaying the results of the compare check job.
The screenshot below shows the results of the compare check job. The Match field will be `True` if the two collect checks match, and `False` if they do not. The Diff field shows the difference between the two collect checks. In this example, the only difference was the `uptime`, which makes sense since time has passed between the two collect checks.
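Conceptually, an exact-match comparison diffs the two collected payloads key by key. The following is a minimal stdlib sketch of that idea (an illustration of the concept, not the app's actual implementation), using made-up collected data:

```python
def exact_match(collect_1: dict, collect_2: dict) -> tuple[bool, dict]:
    """Return (match, diff), where diff maps each differing key to its (old, new) pair."""
    diff = {
        key: (collect_1.get(key), collect_2.get(key))
        for key in set(collect_1) | set(collect_2)
        if collect_1.get(key) != collect_2.get(key)
    }
    return (not diff, diff)


# Two hypothetical collected payloads, taken some time apart.
first = {"hostname": "rtr1", "os_version": "17.3.4", "uptime": 86400}
second = {"hostname": "rtr1", "os_version": "17.3.4", "uptime": 90000}

match, diff = exact_match(first, second)
print(match)  # False
print(diff)   # {'uptime': (86400, 90000)}
```

Keys present in only one payload also show up in the diff, with `None` on the missing side.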
## What are the next steps?
You can check out the Use Cases section for more examples.