Parameter Estimation¶
Example created by Wilson Rocha Lacerda Junior
Here we import the NARMAX model, the metric for model evaluation, and the methods to generate sample data for the tests. We also import pandas to display the results.
import pandas as pd
from sysidentpy.model_structure_selection import FROLS
from sysidentpy.basis_function._basis_function import Polynomial
from sysidentpy.parameter_estimation import (
TotalLeastSquares,
RecursiveLeastSquares,
NonNegativeLeastSquares,
LeastMeanSquares,
AffineLeastMeanSquares,
)
from sysidentpy.metrics import root_relative_squared_error
from sysidentpy.utils.generate_data import get_siso_data
from sysidentpy.utils.display_results import results
Generating 1 input 1 output sample data¶
The data is generated by simulating the following model:
$y_k = 0.2y_{k-1} + 0.1y_{k-1}x_{k-1} + 0.9x_{k-1} + e_{k}$
If colored_noise is set to True:
$e_{k} = 0.8\nu_{k-1} + \nu_{k}$
where $x$ is a uniformly distributed random variable and $\nu$ is a Gaussian random variable with $\mu=0$ and $\sigma=0.1$.
In the next example we will generate a dataset with 1000 samples with white noise, using 90% of the data to train the model.
x_train, x_valid, y_train, y_valid = get_siso_data(
n=1000, colored_noise=False, sigma=0.001, train_percentage=90
)
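For reference, the same difference equation can be simulated directly with NumPy. This is only an illustrative sketch (the input range and seed are assumptions here; get_siso_data is the supported generator and also splits the data into train and validation sets):
import numpy as np

rng = np.random.default_rng(42)
n = 1000
x = rng.uniform(-1, 1, n)  # uniformly distributed input (range assumed here)
nu = rng.normal(0, 0.001, n)  # white Gaussian noise, sigma matching the call above
y = np.zeros(n)
for k in range(1, n):
    y[k] = 0.2 * y[k - 1] + 0.1 * y[k - 1] * x[k - 1] + 0.9 * x[k - 1] + nu[k]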
There are several methods available for parameter estimation¶
- Least Squares
- Total Least Squares
- Recursive Least Squares
- Ridge Regression
- NonNegative Least Squares
- Least Squares Minimal Residues
- Bounded Variable Least Squares
- Least Mean Squares
- Affine Least Mean Squares
- Least Mean Squares Sign Error
- Normalized Least Mean Squares
- Least Mean Squares Normalized Sign Error
- Least Mean Squares Sign Regressor
- Least Mean Squares Normalized Sign Regressor
- Least Mean Squares Sign Sign
- Least Mean Squares Normalized Sign Sign
- Least Mean Squares Normalized Leaky
- Least Mean Squares Leaky
- Least Mean Squares Fourth
- Least Mean Squares Mixed Norm
Polynomial NARMAX models are linear in the parameters, so least squares based methods work well in most cases (combined with the extended least squares algorithm when dealing with colored noise).
However, the user can also choose recursive and stochastic gradient descent methods (in this case, the least mean squares algorithm and its variants) for that task.
Choosing the method is straightforward: pass any of the estimators mentioned above as the estimator parameter.
- Note: Each algorithm has specific parameters that need to be tuned. In the following examples we will use the default values. More examples regarding tuned parameters will be available soon. For now, the user can read the method documentation for more information.
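Because the model is linear in the parameters, the ordinary least squares estimate has the closed form $\hat{\theta} = (\Psi^\top\Psi)^{-1}\Psi^\top y$, where $\Psi$ is the regressor matrix built from the selected terms. A minimal NumPy sketch of this idea on toy data (illustrative only; sysidentpy builds $\Psi$ internally during fit):
import numpy as np

rng = np.random.default_rng(0)
psi = rng.normal(size=(100, 3))  # toy regressor matrix (columns = model terms)
theta_true = np.array([0.9, 0.2, 0.1])
y = psi @ theta_true + rng.normal(0, 0.001, size=100)
# np.linalg.lstsq is numerically safer than forming (psi.T @ psi)^-1 explicitly
theta_hat, *_ = np.linalg.lstsq(psi, y, rcond=None)
print(theta_hat)  # close to [0.9, 0.2, 0.1]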
Total Least Squares¶
basis_function = Polynomial(degree=2)
estimator = TotalLeastSquares()
model = FROLS(
order_selection=False,
n_terms=3,
ylag=2,
xlag=2,
estimator=estimator,
basis_function=basis_function,
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
results(
model.final_model,
model.theta,
model.err,
model.n_terms,
err_precision=8,
dtype="sci",
),
columns=["Regressors", "Parameters", "ERR"],
)
print(r)
0.0021167167052431584
      Regressors  Parameters             ERR
0        x1(k-2)  9.0000E-01  9.56200123E-01
1         y(k-1)  1.9995E-01  4.05078042E-02
2  x1(k-1)y(k-1)  1.0004E-01  3.28866604E-03
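Unlike ordinary least squares, total least squares also accounts for noise in the regressor matrix itself. A minimal NumPy sketch of the classic SVD-based solution (illustrative; not necessarily the exact implementation used by TotalLeastSquares):
import numpy as np

def total_least_squares(psi, y):
    # stack regressors and output, then take the right singular vector
    # associated with the smallest singular value of [psi | y]
    n = psi.shape[1]
    augmented = np.column_stack([psi, y])
    _, _, vt = np.linalg.svd(augmented)
    v = vt.T
    return -v[:n, n] / v[n, n]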
Recursive Least Squares¶
# recursive least squares
basis_function = Polynomial(degree=2)
estimator = RecursiveLeastSquares()
model = FROLS(
order_selection=False,
n_terms=3,
ylag=2,
xlag=2,
estimator=estimator,
basis_function=basis_function,
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
results(
model.final_model,
model.theta,
model.err,
model.n_terms,
err_precision=8,
dtype="sci",
),
columns=["Regressors", "Parameters", "ERR"],
)
print(r)
0.0020703083403116164
      Regressors  Parameters             ERR
0        x1(k-2)  9.0012E-01  9.56200123E-01
1         y(k-1)  2.0021E-01  4.05078042E-02
2  x1(k-1)y(k-1)  9.9550E-02  3.28866604E-03
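RecursiveLeastSquares updates the estimate one sample at a time, which makes it suitable for online identification. A compact NumPy sketch of the textbook update with a forgetting factor (variable names and defaults here are illustrative, not sysidentpy's exact keywords):
import numpy as np

def rls(psi, y, lam=0.98, delta=1000.0):
    n = psi.shape[1]
    theta = np.zeros(n)
    P = delta * np.eye(n)  # large initial covariance
    for psi_k, y_k in zip(psi, y):
        k = P @ psi_k / (lam + psi_k @ P @ psi_k)  # gain vector
        theta = theta + k * (y_k - psi_k @ theta)  # correct by the a priori error
        P = (P - np.outer(k, psi_k @ P)) / lam  # covariance update with forgetting
    return theta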
Least Mean Squares¶
basis_function = Polynomial(degree=2)
estimator = LeastMeanSquares()
model = FROLS(
order_selection=False,
n_terms=3,
ylag=2,
xlag=2,
estimator=estimator,
basis_function=basis_function,
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
results(
model.final_model,
model.theta,
model.err,
model.n_terms,
err_precision=8,
dtype="sci",
),
columns=["Regressors", "Parameters", "ERR"],
)
print(r)
0.015488793944313425
      Regressors  Parameters             ERR
0        x1(k-2)  8.9775E-01  9.56200123E-01
1         y(k-1)  2.0085E-01  4.05078042E-02
2  x1(k-1)y(k-1)  7.5708E-02  3.28866604E-03
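The LMS family replaces the matrix updates of recursive least squares with a single stochastic gradient step per sample, $\theta_{k+1} = \theta_k + \mu e_k \psi_k$. A minimal NumPy sketch (the step size $\mu$ is the main parameter to tune; see the LeastMeanSquares documentation for the exact keyword):
import numpy as np

def lms(psi, y, mu=0.01):
    theta = np.zeros(psi.shape[1])
    for psi_k, y_k in zip(psi, y):
        e_k = y_k - psi_k @ theta  # a priori prediction error
        theta = theta + mu * e_k * psi_k  # stochastic gradient step
    return theta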
Affine Least Mean Squares¶
basis_function = Polynomial(degree=2)
estimator = AffineLeastMeanSquares()
model = FROLS(
order_selection=False,
n_terms=3,
ylag=2,
xlag=2,
estimator=estimator,
basis_function=basis_function,
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
results(
model.final_model,
model.theta,
model.err,
model.n_terms,
err_precision=8,
dtype="sci",
),
columns=["Regressors", "Parameters", "ERR"],
)
print(r)
0.0021441596280611167
      Regressors  Parameters             ERR
0        x1(k-2)  8.9989E-01  9.56200123E-01
1         y(k-1)  1.9992E-01  4.05078042E-02
2  x1(k-1)y(k-1)  1.0003E-01  3.28866604E-03
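The affine variant reuses a window of the last $L$ regressors at every step, which typically converges faster than plain LMS for correlated inputs, at a higher cost per sample. A sketch of the standard affine projection update (window length, step size, and regularization here are illustrative assumptions; AffineLeastMeanSquares may differ in its details):
import numpy as np

def affine_lms(psi, y, mu=0.5, L=5, eps=1e-6):
    theta = np.zeros(psi.shape[1])
    for k in range(L, len(y)):
        A = psi[k - L : k]  # last L regressor rows, shape (L, n)
        e = y[k - L : k] - A @ theta  # window of a priori errors
        G = A @ A.T + eps * np.eye(L)  # regularized Gram matrix
        theta = theta + mu * A.T @ np.linalg.solve(G, e)
    return theta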
NonNegative Least Squares¶
basis_function = Polynomial(degree=2)
estimator = NonNegativeLeastSquares()
model = FROLS(
order_selection=False,
n_terms=3,
ylag=2,
xlag=2,
estimator=estimator,
basis_function=basis_function,
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
results(
model.final_model,
model.theta,
model.err,
model.n_terms,
err_precision=8,
dtype="sci",
),
columns=["Regressors", "Parameters", "ERR"],
)
print(r)
0.0021170157359329173
      Regressors  Parameters             ERR
0        x1(k-2)  9.0000E-01  9.56200123E-01
1         y(k-1)  1.9995E-01  4.05078042E-02
2  x1(k-1)y(k-1)  1.0004E-01  3.28866604E-03
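Nonnegative least squares solves the same regression subject to $\theta \geq 0$, which is useful when the parameters have a physical meaning that forbids negative values (as in this example, where all true parameters are positive). For reference, SciPy solves the same constrained problem:
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
psi = rng.uniform(0, 1, size=(100, 3))  # toy regressor matrix
y = psi @ np.array([0.9, 0.2, 0.1])
theta_hat, residual_norm = nnls(psi, y)
print(theta_hat)  # every entry is >= 0 by construction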