Testing Helpers & Logging Integration
The spx_python.helpers and logging extensions are designed to make test code shorter, more robust, and observable. This guide shows how to:
- Bootstrap models and instances specifically for tests.
- Capture unit-test assertions and test case results into SPX instance attributes.
- Integrate logging with both unittest and pytest.
- Inspect the resulting log structures for debugging and reporting.
The examples are based on spx_python.helpers, spx_python.unittest_logging, and tests such as tests/test_helpers.py, tests/test_unittest_logging.py, and tests/test_pytest_logging_integration.py from the spx-python repository.
Unittest integration
Bootstrapping a model + instance
For integration-style tests it is common to create a real model and instance and reuse them across multiple test methods.
```python
import os
import unittest
from pathlib import Path

import spx_python
from spx_python.helpers import (
    bootstrap_model_instance,
    SpxAssertionLoggingMixin,
    spx_ensure_attribute,
)

BASE_URL = os.getenv("SPX_BASE_URL", "http://localhost:8000")
PRODUCT_KEY = os.environ["SPX_PRODUCT_KEY"]


class HeaterTests(SpxAssertionLoggingMixin, unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        model_path = Path("models/heater.yaml")
        client, instance, _changed = bootstrap_model_instance(
            spx_module=spx_python,
            product_key=PRODUCT_KEY,
            base_url=BASE_URL,
            model_path=model_path,
            model_key="tests_heater",
            instance_key="tests_heater_inst",
        )
        cls.client = client
        cls.instance = instance

        # Tell the mixin where to log
        cls.spx_log_instance = instance
        cls.spx_log_attr = "test_logs"
        spx_ensure_attribute(instance, cls.spx_log_attr, default=[])
```
This follows the same pattern as tests/test_helpers.py::TestHelperIntegration: the model and instance are created once, then reused by all tests in the class.
Capturing assertion logs with SpxAssertionLoggingMixin
SpxAssertionLoggingMixin wraps all standard unittest.TestCase.assert* methods. Each assertion call appends a log entry to spx_log_attr on spx_log_instance:
kind:"assertion"label: the assertion method name (e.g.,"assertEqual")status:"pass"or"fail"args/kwargs: JSON-safe copies of the assertion argumentsmessage: the failure message (for failed assertions)ts: timestamp (seconds since epoch)
Minimal usage:
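A minimal sketch; the make_instance helper stands in for however you obtain an SPX instance (for example the bootstrap_model_instance pattern above):

```python
import unittest

from spx_python.helpers import SpxAssertionLoggingMixin


class MinimalLoggingTests(SpxAssertionLoggingMixin, unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        cls.spx_log_instance = make_instance()  # hypothetical helper
        cls.spx_log_attr = "test_logs"

    def test_arithmetic(self):
        # Appends an "assertion" entry to test_logs on the instance.
        self.assertEqual(2 + 2, 4)
```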
If spx_log_instance or spx_log_attr is not set, assertions behave normally without logging, as shown in tests/test_helpers_logging.py::TestSpxAssertionLoggingMixin.
Recording test case start/end with spx_log_test_case
The decorator spx_log_test_case emits a pair of "testcase" entries around a test method:
event="start"before the body runs.event="end", withstatus="pass"orstatus="fail", after the body.
You can override the attribute path per test:
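A sketch of such an override; the keyword name for the attribute path is an assumption, so check the decorator's signature in spx_python.unittest_logging:

```python
from spx_python.unittest_logging import spx_log_test_case


class HeaterTests(SpxAssertionLoggingMixin, unittest.TestCase):
    @spx_log_test_case(attr_path="heatup_test_logs")  # keyword name assumed
    def test_heatup_curve(self):
        # "testcase" start/end entries land in heatup_test_logs instead of
        # the class-wide spx_log_attr.
        self.assertIsNotNone(self.instance)
```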
See tests/test_unittest_logging.py::test_spx_log_test_case_records_entries for the expected log shape.
Pytest integration
Overview of SpxPytestLoggerPlugin
SpxPytestLoggerPlugin provides two fixtures and an automatic hook:
- spx_instance: a class-scoped fixture that creates/returns the SPX instance used for logging and ensures the log attribute exists.
- spx_log: a function-scoped fixture that appends custom entries (for example "note" records) to the same attribute.
- pytest_runtest_makereport hook: automatically logs "testcase" entries with status and duration for each test function.
All three write into ATTR_PATH (by default "test_logs") on the instance produced by an instance_factory callable.
Wiring the plugin in conftest.py
A typical configuration mirrors tests/test_pytest_logging_integration.py but can be simplified for your own project.
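A minimal sketch of such a conftest.py; the plugin's import path and constructor signature are assumptions modeled on the description above, so check the spx-python repository for the real ones:

```python
# conftest.py
import os

import spx_python
from spx_python.helpers import bootstrap_model_instance
from spx_python.pytest_logging import SpxPytestLoggerPlugin  # import path assumed


def _instance_factory():
    # Create (or reuse) the SPX instance that will receive the log entries.
    _client, instance, _changed = bootstrap_model_instance(
        spx_module=spx_python,
        product_key=os.environ["SPX_PRODUCT_KEY"],
        base_url=os.getenv("SPX_BASE_URL", "http://localhost:8000"),
        model_path="models/heater.yaml",
        model_key="tests_heater",
        instance_key="tests_heater_inst",
    )
    return instance


def pytest_configure(config):
    # Register the plugin for the whole session so its fixtures and
    # report hook apply to every test.
    config.pluginmanager.register(
        SpxPytestLoggerPlugin(instance_factory=_instance_factory),
        name="spx-logger",
    )
```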
This setup:
- Registers the plugin globally.
- Exposes the spx_instance and spx_log fixtures to all tests.
- Ensures the log attribute exists on the backing SPX instance before tests run.
Using spx_log in tests
The spx_log fixture appends arbitrary entries to the configured attribute. Each call produces a payload like:
- kind: value of the first argument (e.g., "note", "step", "measurement").
- message: optional human-readable message.
- Additional keyword arguments: merged into the payload as metadata.
- ts: timestamp.
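A sketch of a test using the fixture; the call signature is inferred from the payload fields above:

```python
def test_sensor_warmup(spx_log):
    # The first argument becomes "kind"; extra keyword arguments are
    # merged into the payload as metadata.
    spx_log("note", message="starting warmup", target_temp_c=45)
    spx_log("measurement", message="warmup complete", duration_s=12.5)
```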
This is the same pattern used in tests/test_pytest_logging_integration.py::test_pytest_log_fixture_can_append.
Automatic "testcase" entries from the plugin hook
"testcase" entries from the plugin hookSpxPytestLoggerPlugin.pytest_runtest_makereport runs after each test and appends a "testcase" entry when:
- The phase is "call" (setup/teardown are ignored).
- A corresponding SPX instance can be found:
  - Prefer spx_instance from item.funcargs if present.
  - Otherwise fall back to the instance cached for the node ID (created by spx_log).
  - As a last resort, use cls.spx_log_instance on the test class if defined.
The payload includes:
kind:"testcase"event:"end"status:"passed","failed", or"skipped"nodeid: pytest node ID (test_file.py::TestClass::test_method)duration: test duration in secondsmessage: stringified failure details for failed tests
In tests/test_pytest_logging_integration.py the verify_pytest_logging_entries fixture asserts that:
- There are entries of kind == "note" produced by spx_log.
- There are "testcase" entries for at least two tests, all with event == "end" and status == "passed".
Inspecting and consuming logs
All logging helpers ultimately append JSON-safe payloads to an SPX attribute, typically attributes/test_logs/internal_value:
- Unittest mixin: kind == "assertion" and kind == "testcase" entries.
- Pytest plugin: kind == "note" (or any label you choose) for explicit spx_log(...) calls, and kind == "testcase" from the plugin hook.
You can inspect them directly from Python:
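One way to read the entries back, sketched here assuming the attribute value is reachable through the client's attribute accessors (the exact API depends on your spx_python version):

```python
# Read the raw list of payloads back from the instance.
entries = instance.attributes["test_logs"].internal_value  # accessor assumed
for entry in entries:
    print(entry["kind"], entry.get("status"), entry.get("message"))

# e.g. filter failed assertions for a quick report
failures = [e for e in entries if e["kind"] == "assertion" and e["status"] == "fail"]
```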
Or via the SPX UI by browsing to the instance and inspecting the corresponding attribute value.
Because the payloads are consistent across unittest and pytest, you can post-process them in dashboards, CI reports, or custom analytics regardless of which test framework produced them.
Concrete end-to-end examples from spx-examples
This section shows how the logging helpers fit into real scenarios taken from the spx-examples repository. The goal is to make it trivial to lift the patterns into your own test suites.
BLE Vital Signs Monitor (unittest)
The ble_vital_signs_monitor model exposes vital sign telemetry over a BLE-like SUT helper. The integration test tests/test_ble_vital_signs_monitor_sut.py prepares the model and instance, then drives scenarios such as brisk_walk.
A logging-aware variant of that setup could look like this:
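A sketch reusing the bootstrap pattern from the first section; the model path and the scenario-driving helper are assumptions:

```python
import os
import unittest
from pathlib import Path

import spx_python
from spx_python.helpers import (
    bootstrap_model_instance,
    SpxAssertionLoggingMixin,
    spx_ensure_attribute,
)

BASE_URL = os.getenv("SPX_BASE_URL", "http://localhost:8000")
PRODUCT_KEY = os.environ["SPX_PRODUCT_KEY"]


class BleVitalSignsTests(SpxAssertionLoggingMixin, unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        _client, instance, _changed = bootstrap_model_instance(
            spx_module=spx_python,
            product_key=PRODUCT_KEY,
            base_url=BASE_URL,
            model_path=Path("models/ble_vital_signs_monitor.yaml"),  # path assumed
            model_key="tests_ble_vital_signs_monitor",
            instance_key="tests_ble_vital_signs_monitor_inst",
        )
        cls.instance = instance
        cls.spx_log_instance = instance
        cls.spx_log_attr = "test_logs"
        spx_ensure_attribute(instance, cls.spx_log_attr, default=[])

    def test_brisk_walk_raises_heart_rate(self):
        # Drive the brisk_walk scenario through your SUT helper, then assert;
        # each assert* call is mirrored into test_logs by the mixin.
        heart_rate = run_brisk_walk_scenario(self.instance)  # hypothetical helper
        self.assertGreater(heart_rate, 90)
```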
After this test runs, SPX will contain a test_logs attribute on tests_ble_vital_signs_monitor_inst with entries for the assertion (and any additional test case decorators you apply).
Modbus Vacuum Gauge (unittest)
The Modbus vacuum gauge example (tests/test_modbus_vacuum_gauge_sut_example.py) uses bootstrap_model_instance and wait_for_condition to validate pressure dynamics and relay outputs.
To add structured logging around these assertions:
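One way to wire it in, shown as a sketch; the bootstrap call is elided and the gauge-reading helper is hypothetical:

```python
import unittest

from spx_python.helpers import SpxAssertionLoggingMixin, spx_ensure_attribute


class ModbusVacuumGaugeTests(SpxAssertionLoggingMixin, unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # Bootstrap exactly as in tests/test_modbus_vacuum_gauge_sut_example.py,
        # keeping the returned instance on cls.instance, then point the mixin at it.
        cls.spx_log_instance = cls.instance
        cls.spx_log_attr = "test_logs"
        spx_ensure_attribute(cls.instance, cls.spx_log_attr, default=[])

    def test_pumpdown_reaches_setpoint(self):
        # wait_for_condition (from spx_python.helpers) polls the gauge as in the
        # original test; the assertion below is then captured as an "assertion" entry.
        pressure_mbar = read_pressure_mbar(self.instance)  # hypothetical SUT helper
        self.assertLess(pressure_mbar, 1e-3)
```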
The full original test contains several assertions on pressure trajectories and relay flags; wrapping them in the mixin gives you a complete assertion trace inside SPX.
Pytest with a concrete model (generic MQTT environment sensor)
The MQTT environment sensor example in tests/test_mqtt_environment_sensor_sut_example.py uses ensure_model and ensure_instance to prepare an instance backed by library/domains/iot/generic/environment_sensor__mqtt.yaml.
Below is a sketch of how to integrate the SpxPytestLoggerPlugin into a similar scenario:
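A conftest.py sketch; the return shapes of ensure_model/ensure_instance and the plugin's import path are assumptions:

```python
# conftest.py
from spx_python.helpers import ensure_model, ensure_instance
from spx_python.pytest_logging import SpxPytestLoggerPlugin  # import path assumed


def _instance_factory():
    model = ensure_model(
        model_path="library/domains/iot/generic/environment_sensor__mqtt.yaml",
        model_key="tests_generic_mqtt_environment_sensor",  # key assumed
    )
    instance = ensure_instance(
        model=model,
        instance_key="tests_generic_mqtt_environment_sensor_inst",
    )
    return instance


def pytest_configure(config):
    config.pluginmanager.register(
        SpxPytestLoggerPlugin(instance_factory=_instance_factory),
        name="spx-mqtt-logger",
    )
```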
In a test file you can now use spx_log while exercising the MQTT SUT:
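For example (a sketch; the MQTT publishing helper is hypothetical):

```python
def test_reports_temperature(spx_log):
    # Exercise the MQTT SUT, recording notable steps as "note" entries; the
    # plugin hook adds the matching "testcase" entry automatically.
    publish_temperature(22.5)  # hypothetical MQTT SUT helper
    spx_log("note", message="published temperature reading", topic="env/temperature")
```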
Together with the automatic "testcase" entries from the plugin hook, this gives you a timeline of MQTT-driven events and test outcomes attached to tests_generic_mqtt_environment_sensor_inst.