formak.python

Module Contents

Classes

Config

Options for generating Python.

BasicBlock

A run of statements without control flow.

Model

Python implementation of the model.

SensorModel

ExtendedKalmanFilter

SklearnEKFAdapter

Functions

assert_valid_covariance

Check that the covariance array is well formed:

nearest_positive_definite

compile

compile_ekf

force_to_ndarray

Data

DEFAULT_MODULES

StateAndCovariance

API

formak.python.DEFAULT_MODULES = ('scipy', 'numpy', 'math')
class formak.python.Config

Options for generating Python.

common_subexpression_elimination:

Remove common shared computation

python_modules:

Modules allowed as dependencies of the generated code; math comes from the Python standard library

extra_validation:

Catch errors earlier in exchange for increased compute time

common_subexpression_elimination: bool = True
python_modules: tuple[Any, Any, Any, Any] = None
extra_validation: bool = False
max_dt_sec: float = 0.1
innovation_filtering: float | None = 5.0
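
A minimal sketch of constructing a Config. It assumes the fields above are accepted as keyword arguments (the class reads like a dataclass, but that is an assumption); the values shown are illustrative, not recommendations.

```python
from formak import python

config = python.Config(
    common_subexpression_elimination=True,
    extra_validation=False,
    max_dt_sec=0.1,
    # Per the float | None annotation above, None presumably disables
    # innovation filtering; 5.0 is the documented default.
    innovation_filtering=5.0,
)
```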
class formak.python.BasicBlock(*, arglist: list[str], statements: list[Any], config: formak.python.Config)

A run of statements without control flow.

All statements can be reordered or changed to improve performance.

Initialization

__len__()
_compile()
execute(*args, **kwargs)
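
The common_subexpression_elimination option and the reorderable statements in BasicBlock are both about avoiding repeated work. The snippet below is a standalone illustration of common subexpression elimination using sympy.cse; it is not FormaK's internal implementation.

```python
import sympy

x, y = sympy.symbols("x y")
exprs = [sympy.sin(x + y) ** 2, sympy.sin(x + y) * sympy.cos(x + y)]

# sympy.cse factors out shared subexpressions (such as sin(x + y)) so they
# are computed once and reused.
replacements, reduced = sympy.cse(exprs)
# replacements lists the factored-out subexpressions;
# reduced contains the rewritten expressions that reuse them.
```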
class formak.python.Model(symbolic_model, config, calibration_map=None)

Python implementation of the model.

Initialization

model(dt, state, control=None)
class formak.python.SensorModel(state_model, sensor_model, calibration_map, config)

Initialization

__len__()
model(state_vector)
formak.python.StateAndCovariance = 'namedtuple(...)'
formak.python.assert_valid_covariance(covariance: numpy.typing.NDArray, *, name: str = 'Covariance', negative_tol: float = -1e-15)

Check that the covariance array is well formed:

  • symmetric (approximately)

  • positive semidefinite (approximately)

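A quick sketch of the check using the signature above. The exception type raised on failure is not documented here, so the example only assumes that an invalid matrix does not pass silently.

```python
import numpy as np
from formak import python

# Symmetric and positive semidefinite: expected to pass.
python.assert_valid_covariance(np.eye(3))

# A negative eigenvalue violates positive semidefiniteness; expected to fail.
# The exact exception type is an assumption, so we catch broadly here.
try:
    python.assert_valid_covariance(np.diag([1.0, -1.0]), name="ExampleCovariance")
except Exception:
    pass
```
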
formak.python.nearest_positive_definite(covariance: dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float])
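
For intuition, the usual way to recover a nearby positive definite matrix is an eigenvalue projection (Higham-style). The sketch below is a standalone numpy illustration of that idea; it is not FormaK's implementation and does not use the dict-based signature above.

```python
import numpy as np


def nearest_pd_sketch(a: np.ndarray) -> np.ndarray:
    """Project a matrix to a nearby positive definite one (illustration only)."""
    # Symmetrize, then clip negative eigenvalues to zero.
    b = (a + a.T) / 2
    eigval, eigvec = np.linalg.eigh(b)
    c = eigvec @ np.diag(np.clip(eigval, 0.0, None)) @ eigvec.T
    # Add growing jitter until a Cholesky factorization succeeds,
    # which guarantees strict positive definiteness.
    jitter = 1e-12
    while True:
        candidate = c + jitter * np.eye(c.shape[0])
        try:
            np.linalg.cholesky(candidate)
            return candidate
        except np.linalg.LinAlgError:
            jitter *= 10
```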
class formak.python.ExtendedKalmanFilter(state_model, process_noise: dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float], sensor_models: dict[str, sympy.core.expr.Expr], sensor_noises: dict[str, dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float]], config: formak.python.Config, calibration_map: dict[sympy.Symbol, float] | None = None)

Initialization

_construct_process(state_model, process_noise: dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float], calibration_map: dict[sympy.Symbol, float], config: formak.python.Config) None
_construct_sensors(state_model: formak.common.UiModelBase, sensor_models: dict[str, sympy.core.expr.Expr], sensor_noises: dict[str, dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float]], calibration_map: dict[sympy.Symbol, float], config: formak.python.Config) None
make_reading(key, *, data=None, **kwargs)
process_jacobian(dt, state, control)
control_jacobian(dt, state, control)
sensor_jacobian(sensor_key, state)
process_model(dt, state, covariance, control=None)
remove_innovation(innovation: numpy.typing.NDArray, S_inv: numpy.typing.NDArray) bool
sensor_model(state, covariance, *, sensor_key, sensor_reading)
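
A sketch of one predict/update cycle using the methods above. It assumes `ekf` was built with compile_ekf (shown below), that state vectors are numpy column vectors, and that process_model and sensor_model return the StateAndCovariance namedtuple in (state, covariance) order; all of those layout details are assumptions.

```python
import numpy as np

state = np.array([[0.0], [1.0]])  # assumed column-vector layout, ordered per the state model
covariance = np.eye(2)

# Predict forward by dt seconds.
state, covariance = ekf.process_model(0.1, state, covariance, control=np.array([[0.0]]))

# Fold in a sensor reading; make_reading packages raw data for a sensor key.
reading = ekf.make_reading("position", data=np.array([[0.5]]))
state, covariance = ekf.sensor_model(
    state, covariance, sensor_key="position", sensor_reading=reading
)
```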
formak.python.compile(symbolic_model, calibration_map=None, *, config=None)
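
A minimal end-to-end sketch: build a symbolic model with formak.ui (that interface lives outside this module, so the ui calls here are assumptions), compile it, and step it. The numpy column-vector layout for state and control is also an assumption.

```python
import numpy as np
from formak import python, ui

dt = ui.Symbol("dt")
x, v, a = ui.Symbol("x"), ui.Symbol("v"), ui.Symbol("a")

# Constant-velocity model with acceleration as the control input.
symbolic_model = ui.Model(
    dt=dt,
    state={x, v},
    control={a},
    state_model={x: x + dt * v, v: v + dt * a},
)

model = python.compile(symbolic_model, config=python.Config())
next_state = model.model(0.1, np.array([[0.0], [1.0]]), np.array([[0.0]]))
```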
formak.python.compile_ekf(symbolic_model: formak.common.UiModelBase, process_noise: dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float], sensor_models: dict[str, sympy.core.expr.Expr], sensor_noises, calibration_map: dict[sympy.Symbol, float] | None = None, *, config=None) formak.python.ExtendedKalmanFilter
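
A sketch of building a filter with compile_ekf, reusing symbolic_model, x, and a from the compile example above. The dictionary shapes follow the type annotations in this module; the noise values and the "position" sensor key are illustrative only.

```python
ekf = python.compile_ekf(
    symbolic_model=symbolic_model,
    process_noise={a: 0.1},                # noise keyed by symbol (here the control input)
    sensor_models={"position": x},         # sensor key -> measurement expression, per the
                                           # annotation above; some FormaK examples nest a
                                           # reading-symbol dict here instead
    sensor_noises={"position": {x: 1.0}},  # sensor key -> noise per measured symbol
    config=python.Config(innovation_filtering=5.0),
)
```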
formak.python.force_to_ndarray(mat: Any) numpy.typing.NDArray | None
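
A small sketch of the coercion helper; the None behavior is inferred from the return annotation and is an assumption.

```python
mat = python.force_to_ndarray([[1.0], [2.0]])  # nested list -> numpy array
maybe_missing = python.force_to_ndarray(None)  # assumed to pass through as None
```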
class formak.python.SklearnEKFAdapter(symbolic_model: formak.common.UiModelBase | None = None, process_noise: dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float] | None = None, sensor_models: dict[sympy.Symbol, sympy.core.expr.Expr] | None = None, sensor_noises: dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float] | None = None, calibration_map: dict[sympy.Symbol, float] | None = None, *, config: formak.python.Config | None = None)

Bases: sklearn.base.BaseEstimator

Initialization

allowed_keys = ['symbolic_model', 'process_noise', 'sensor_models', 'sensor_noises', 'calibration_map', 'config']
classmethod Create(symbolic_model: formak.common.UiModelBase, process_noise: dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float], sensor_models: dict[str, sympy.core.expr.Expr], sensor_noises: dict[str, dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float]], calibration_map: dict[sympy.Symbol, float] | None = None, *, config: formak.python.Config | None = None)

Provide an interface with required arguments to be more structured and opinionated about how to Create this class.

scikit-learn guides towards doing no construction or validation of the inputs in the __init__ method; some is done in this method instead, with the goal of guiding the user earlier in the process.
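
A sketch of the structured construction path described above, reusing symbolic_model, x, and a from the compile example; the argument shapes follow the signature documented here.

```python
adapter = python.SklearnEKFAdapter.Create(
    symbolic_model=symbolic_model,
    process_noise={a: 0.1},
    sensor_models={"position": x},
    sensor_noises={"position": {x: 1.0}},
    config=python.Config(),
)
```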

_flatten_process_noise(process_noise: dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float])
_sensor_noise_to_array(sensor_noises: dict[str, dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float]])
_compile_sensor_models(sensor_models: dict[str, sympy.core.expr.Expr])
_flatten_dict_diagonal(mapping: dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float], arglist: list[sympy.Symbol]) Iterator[float]
_inverse_flatten_dict_diagonal(vector, arglist) dict[sympy.Symbol | tuple[sympy.Symbol, sympy.Symbol], float]
_flatten_scoring_params() list[float]

Note: Known limitation, this only flattens the diagonals to simplify the fit optimization problem.

_inverse_flatten_scoring_params(flattened: list[float]) dict[str, Any]
fit(X: Any, y: Any | None = None, sample_weight: numpy.typing.NDArray | None = None) formak.python.SklearnEKFAdapter
mahalanobis(X: Any) numpy.typing.NDArray
score(X: Any, y: Any | None = None, sample_weight: Any | None = None, explain_score: bool = False) float | tuple[float, tuple[float, float, float, float, float, float]]
transform(X: Any, include_states=False) numpy.typing.NDArray | tuple[numpy.typing.NDArray, numpy.typing.NDArray, numpy.typing.NDArray]
fit_transform(X, y=None) numpy.typing.NDArray | tuple[numpy.typing.NDArray, numpy.typing.NDArray, numpy.typing.NDArray]
get_params(deep: bool = True) dict[str, Any]
set_params(**params) formak.python.SklearnEKFAdapter
export_python() formak.python.ExtendedKalmanFilter
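
A sketch of the scikit-learn style workflow on the adapter above. The layout of X (one row per time step, with columns for the control and sensor readings expected by the model) is an assumption; check the FormaK examples for the exact format.

```python
import numpy as np

X = np.zeros((20, 2))  # assumed: one row per time step (control + reading columns)

adapter = adapter.fit(X)            # returns self, per the signature above
innovations = adapter.transform(X)  # ndarray output, per the return annotation
quality = adapter.score(X)          # scalar score for the fitted filter
ekf = adapter.export_python()       # hand back a formak.python.ExtendedKalmanFilter
```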