Install the pybind11 GridPACK Python module, run HADREC dynamic simulations and state estimation, and execute parallel power grid studies with mpiexec.
GridPACK exposes a subset of its C++ framework through a pybind11-based Python extension module. The primary capability currently accessible from Python is the HADREC application — a combined power flow and dynamic simulation driver — together with the full dynamic simulation (DSFullApp) and state estimation (SEApp) modules. This page covers installation, environment setup, and practical usage examples drawn directly from the scripts in python/src/.
Before installing the Python wrapper, verify that these prerequisites are satisfied:
GridPACK ≥ 3.4, built and installed as shared libraries
All GridPACK dependencies (PETSc, Global Arrays, Boost) also built as shared libraries
Python 3 (Python 2.7 is no longer recommended)
pybind11 ≥ 2.4 (source included as a git submodule)
Python packages: setuptools, mpi4py, numpy
GridPACK must be installed — not just built in-tree — before the Python wrapper can be built. Pass -DCMAKE_INSTALL_PREFIX=/your/install/path to CMake and run cmake --install build/ after a successful build.
On Red Hat Enterprise Linux and CentOS, the stock OpenMPI packages can cause import failures. Set the RHEL_OPENMPI_HACK environment variable to work around this:
```shell
export RHEL_OPENMPI_HACK=yes
```
This is not needed with MPICH or OpenMPI on other Linux distributions.
```python
import gridpack

# Every script must create an Environment before any other GridPACK objects.
# It initialises MPI, Global Arrays, and the math libraries.
env = gridpack.Environment()
comm = gridpack.Communicator()
print("Rank:", comm.rank(), "of", comm.size())

# Clean up in reverse order to avoid MPI teardown issues
del comm
del env
```
Always delete GridPACK objects in reverse creation order before the script exits. pybind11 calls C++ destructors at Python garbage-collection time, and MPI must still be active when those destructors run.
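The reasoning behind this rule can be illustrated without GridPACK at all. In this toy sketch (`Runtime` and `Handle` are stand-ins for `gridpack.Environment` and the objects that depend on it), dependents must be destroyed while the runtime is still alive:

```python
# Toy illustration of why deletion order matters: objects that depend on
# a "runtime" must be destroyed while the runtime is still alive.
log = []

class Runtime:            # stands in for gridpack.Environment
    def __del__(self):
        log.append("runtime down")

class Handle:             # stands in for Communicator / application objects
    def __init__(self, rt):
        self.rt = rt
    def __del__(self):
        log.append("handle down")

rt = Runtime()
h = Handle(rt)
del h      # destroy dependents first...
del rt     # ...then the runtime
print(log)  # ['handle down', 'runtime down']
```

Deleting in the opposite order (or letting interpreter shutdown collect the objects in an arbitrary order) is what causes the MPI teardown crashes mentioned above.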
HADREC (Hybrid Application for Dynamic Reliability Evaluation and Control) is the main capability exposed through the Python interface. It combines power flow initialisation with full dynamic simulation and supports real-time control actions.
```python
import gridpack
import gridpack.hadrec

env = gridpack.Environment()
comm = gridpack.Communicator()
hadapp = gridpack.hadrec.Module()

# Solve power flow using the first RAW file listed in the XML input
hadapp.solvePowerFlowBeforeDynSimu("input_39bus_step005_v33.xml", 0)
hadapp.transferPFtoDS()

# Define a bus fault event
fault = gridpack.dynamic_simulation.Event()
fault.start = 1.0     # fault starts at t = 1.0 s
fault.end = 1.1       # fault clears at t = 1.1 s
fault.step = 0.005    # internal time step during fault
fault.isBus = True
fault.bus_idx = 22    # bus number
faultlist = gridpack.dynamic_simulation.EventVector([fault])

# Initialise the dynamic simulation with the fault list
hadapp.initializeDynSimu(faultlist, 0)

# Step through the simulation
while not hadapp.isDynSimuDone():
    hadapp.executeDynSimuOneStep()

del hadapp
del comm
del env
```
After each time step you can retrieve network observations for monitoring or machine-learning applications:
```python
# Get the list of observed quantities defined in the XML input
(obs_genBus, obs_genIDs,
 obs_loadBuses, obs_loadIDs, obs_busIDs) = hadapp.getObservationLists()

while not hadapp.isDynSimuDone():
    hadapp.executeDynSimuOneStep()
    ob_vals = hadapp.getObservations()
    # ob_vals is a flat list: generator speeds, angles, P, Q, bus voltages, angles
```
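Because the observation vector is flat, it usually has to be sliced back into named quantities before use. The sketch below does this in pure Python with dummy data; the ordering (generator speeds, angles, P, Q, then bus voltage magnitudes and angles) follows the comment above, but it is an assumption you should verify against your own XML observation list:

```python
# Unpack a flat observation vector into per-quantity lists.
# ASSUMPTION: ordering is gen speeds, gen angles, gen P, gen Q,
# then bus voltage magnitudes and bus voltage angles.
def unpack_observations(ob_vals, n_gen, n_bus):
    i = 0
    out = {}
    for name, count in [("speed", n_gen), ("angle", n_gen),
                        ("P", n_gen), ("Q", n_gen),
                        ("vmag", n_bus), ("vang", n_bus)]:
        out[name] = ob_vals[i:i + count]
        i += count
    assert i == len(ob_vals), "unexpected observation vector length"
    return out

# Dummy data: 2 observed generators, 3 observed buses
ob_vals = [1.0, 1.0,   0.1, 0.2,   5.0, 6.0,   1.5, 1.6,
           1.03, 1.01, 0.99,   0.0, -0.1, -0.2]
obs = unpack_observations(ob_vals, n_gen=2, n_bus=3)
print(obs["vmag"])  # [1.03, 1.01, 0.99]
```

In a real script, `n_gen` and `n_bus` would come from the lengths of the lists returned by `getObservationLists()`.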
DSFullApp is an alternative to the HADREC module that provides finer-grained control, including the ability to query bus and generator data by field name after each step.
DSFullApp exposes typed accessors to read bus and generator data by keyword:
```python
for bus in range(ds_app.totalBuses()):
    bus_num = ds_app.getBusInfoInt(bus, "BUS_NUMBER")
    bus_name = ds_app.getBusInfoString(bus, "BUS_NAME")
    vmag = ds_app.getBusInfoReal(bus, "BUS_VMAG_CURRENT")
    for g in range(ds_app.numGenerators(bus)):
        model = ds_app.getBusInfoString(bus, "GENERATOR_MODEL", g)
        pg = ds_app.getBusInfoReal(bus, "GENERATOR_PG_CURRENT", g)
```
The gridpack.state_estimation submodule wraps the state estimation application module. Measurements are passed in as a MeasurementVector:
```python
import gridpack
import gridpack.state_estimation as gse

env = gridpack.Environment()
conf = gridpack.Configuration()
conf.open("se_input.xml", gridpack.Communicator())

app = gse.SEApp()
app.readNetwork(conf)
app.initialize()
app.readMeasurements()

# Retrieve measurements from the configuration, modify if needed,
# then push them back in
measurements = app.getMeasurements(conf)
app.setMeasurements(measurements)

app.solve()
if app.hasConverged():
    app.write()
    app.saveData()
```
Measurement objects carry bus/branch identifiers, a value, a standard deviation, and a type string ("VA", "VM", "PIJ", "QIJ", etc.):
```python
m = gse.Measurement()
m.busid = 5
m.value = 1.02
m.deviation = 0.001
m.type("VM")  # voltage magnitude at bus 5
```
The python/src/example/ directory contains ready-to-run scripts for the 39-bus IEEE test case. After installing the module and setting PYTHONPATH:
```shell
cd python/src/example

# Serial runs
python 39bus_test_example.py
python 39bus_test_example_dsf.py
python 39bus_scatterload_steptest_new_itr.py
python 39bus_scatterload_steptest_new_itr_dsf.py
python 39bus_scatterload_steptest_new_itr_compensateY.py
python 39bus_test_pfdata.py

# Parallel runs with MPI
mpiexec -np 2 python 39bus_scatterload_steptest_new_itr.py
mpiexec -np 2 python 39bus_scatterload_steptest_new_itr_dsf.py
```
The 39bus_scatterload_steptest_new_itr.py script demonstrates how to change bus loads dynamically during a simulation — a common pattern in reinforcement-learning and control studies.
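The stepping pattern itself is simple and worth seeing in isolation. In this pure-Python sketch, `apply_load` is a hypothetical stand-in for the hadrec module's actual load-scatter call (see the script itself for the real method name and signature); here it just records each request so the ramp logic can be checked:

```python
# Sketch of the periodic load-stepping pattern used in the scatter-load
# examples. apply_load is a HYPOTHETICAL stand-in for the real GridPACK
# call; it records each request instead of touching a network.
applied = []

def apply_load(bus_numbers, p_mw, q_mvar):
    applied.append((list(bus_numbers), list(p_mw), list(q_mvar)))

buses = [3, 4]                 # illustrative bus numbers
base_p = [322.0, 500.0]        # illustrative base active load (MW)
base_q = [2.4, 184.0]          # illustrative base reactive load (MVAr)

# Raise the loads by 5% every 100 simulation steps
for step in range(300):
    if step % 100 == 0:
        scale = 1.0 + 0.05 * (step // 100)
        apply_load(buses,
                   [p * scale for p in base_p],
                   [q * scale for q in base_q])

print(len(applied))  # 3 load updates issued
```

In the real script the update would sit inside the `executeDynSimuOneStep()` loop, so an external controller (e.g. a reinforcement-learning agent) can choose the new load values at each decision point.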
After installation, a dsf2.py script is placed in $GRIDPACK_DIR/bin/. It is a Python drop-in replacement for the compiled dsf2.x binary and can run any dynamic simulation XML input file.
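The invocation mirrors the compiled binary; the input file name below is illustrative, not a file shipped with the module:

```shell
# run the Python driver the same way as dsf2.x (input name is illustrative)
mpiexec -np 2 python $GRIDPACK_DIR/bin/dsf2.py input_39bus.xml
```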
The gridpack Python module is MPI-aware. Launch scripts with mpiexec and the desired process count, and the module picks up the MPI environment automatically:
```shell
mpiexec -np 4 python my_gridpack_script.py
```
gridpack.Communicator() returns the world communicator by default. Sub-communicators can be created for task-parallel simulations in exactly the same way as in C++.
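Task-parallel studies typically assign independent simulations (e.g. contingencies) to ranks. The assignment logic is plain Python; in a real script `rank` and `size` would come from `gridpack.Communicator()`:

```python
# Round-robin assignment of task indices to MPI ranks.
# In a real script: comm = gridpack.Communicator(); rank, size = comm.rank(), comm.size()
def my_tasks(rank, size, n_tasks):
    return [t for t in range(n_tasks) if t % size == rank]

# 10 contingencies split across 4 ranks
for r in range(4):
    print(r, my_tasks(r, 4, 10))
```

Each rank (or sub-communicator group) then runs its own simulations over its task list, with no coordination needed until results are gathered.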
mpi4py is required as a build dependency (pyproject.toml), but the gridpack module handles MPI initialisation internally through the Environment object. Do not call MPI_Init manually in scripts that use gridpack.Environment().