Parameter Estimation¶
Example created by Wilson Rocha Lacerda Junior
Here we import the NARMAX model, the metric for model evaluation, and the method to generate sample data for testing. We also import pandas to display the results.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sysidentpy.model_structure_selection import FROLS
from sysidentpy.basis_function._basis_function import Polynomial
from sysidentpy.metrics import root_relative_squared_error
from sysidentpy.utils.generate_data import get_siso_data
from sysidentpy.utils.display_results import results
Generating 1 input 1 output sample data¶
The data is generated by simulating the following model:
\(y_k = 0.2y_{k-1} + 0.1y_{k-1}x_{k-1} + 0.9x_{k-1} + e_{k}\)
If colored_noise is set to True:
\(e_{k} = 0.8\nu_{k-1} + \nu_{k}\)
where \(x\) is a uniformly distributed random variable and \(\nu\) is a Gaussian random variable with \(\mu=0\) and \(\sigma=0.1\).
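For reference, here is a minimal sketch of how this simulation could be reproduced by hand with NumPy (the input range and the seed are assumptions made for illustration; get_siso_data below does this for us):

rng = np.random.default_rng(seed=42)
n = 1000
x = rng.uniform(-1, 1, n)   # uniformly distributed input (range assumed)
nu = rng.normal(0, 0.1, n)  # Gaussian noise with mu=0, sigma=0.1
y = np.zeros(n)
for k in range(1, n):
    e = nu[k]  # white noise case; colored case: e = 0.8 * nu[k - 1] + nu[k]
    y[k] = 0.2 * y[k - 1] + 0.1 * y[k - 1] * x[k - 1] + 0.9 * x[k - 1] + e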
In the next example we generate 1000 samples with white noise and select 90% of the data to train the model.
x_train, x_valid, y_train, y_valid = get_siso_data(
    n=1000,
    colored_noise=False,
    sigma=0.001,
    train_percentage=90
)
There are several methods available for parameter estimation¶
least_squares
total_least_squares
recursive_least_squares
least_mean_squares
affine_least_mean_squares
least_mean_squares_sign_error
normalized_least_mean_squares
least_mean_squares_normalized_sign_error
least_mean_squares_sign_regressor
least_mean_squares_normalized_sign_regressor
least_mean_squares_sign_sign
least_mean_squares_normalized_sign_sign
least_mean_squares_normalized_leaky
least_mean_squares_leaky
least_mean_squares_fourth
least_mean_squares_mixed_norm
Polynomial NARMAX models are linear-in-the-parameters, so least squares based methods work well in most cases (combined with the extended least squares algorithm when dealing with colored noise).
However, the user can also choose recursive and stochastic gradient descent methods (in this case, the least mean squares algorithm and its variants) for that task.
Choosing the method is straightforward: pass any of the methods mentioned above to the estimator parameter.
Note: the adaptive filters have specific parameters that need to be tuned. In the following examples we use the default values. More examples regarding tuned parameters will be available soon. For now, the user can read the method documentation for more information.
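As a hedged sketch of what such tuning can look like: in SysIdentPy versions that take the estimator name as a string (as in this notebook), the adaptive filter hyperparameters are typically keyword arguments of the FROLS constructor. The `mu` (step size) keyword below is an assumption; check the documentation of your installed version:

# assumption: this sysidentpy version forwards the LMS step size `mu`
# through the FROLS constructor
model = FROLS(
    order_selection=False,
    n_terms=3,
    extended_least_squares=False,
    ylag=2, xlag=2,
    estimator='least_mean_squares',
    mu=0.05,  # assumed keyword; larger steps adapt faster but less stably
    basis_function=Polynomial(degree=2)
)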
Total Least Squares¶
basis_function = Polynomial(degree=2)
model = FROLS(
    order_selection=False,
    n_terms=3,
    extended_least_squares=False,
    ylag=2, xlag=2,
    estimator='total_least_squares',
    basis_function=basis_function
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
    results(
        model.final_model, model.theta, model.err,
        model.n_terms, err_precision=8, dtype='sci'
    ),
    columns=['Regressors', 'Parameters', 'ERR']
)
print(r)
0.0017546043817803368
Regressors Parameters ERR
0 x1(k-2) 8.9994E-01 9.55863828E-01
1 y(k-1) 2.0002E-01 4.06706802E-02
2 x1(k-1)y(k-1) 1.0004E-01 3.46188050E-03
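Since matplotlib is already imported, we can also compare the predicted and measured signals visually (a quick sketch; the 100-sample window is arbitrary):

plt.plot(y_valid[:100], label='measured')
plt.plot(yhat[:100], linestyle='--', label='predicted')
plt.xlabel('Samples')
plt.ylabel('y')
plt.legend()
plt.show()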
Recursive Least Squares¶
# recursive least squares
basis_function = Polynomial(degree=2)
model = FROLS(
    order_selection=False,
    n_terms=3,
    extended_least_squares=False,
    ylag=2, xlag=2,
    estimator='recursive_least_squares',
    basis_function=basis_function
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
    results(
        model.final_model, model.theta, model.err,
        model.n_terms, err_precision=8, dtype='sci'
    ),
    columns=['Regressors', 'Parameters', 'ERR']
)
print(r)
0.0017428918921948222
Regressors Parameters ERR
0 x1(k-2) 9.0005E-01 9.55863828E-01
1 y(k-1) 1.9991E-01 4.06706802E-02
2 x1(k-1)y(k-1) 1.0030E-01 3.46188050E-03
Least Mean Squares¶
# least mean squares
basis_function = Polynomial(degree=2)
model = FROLS(
    order_selection=False,
    n_terms=3,
    extended_least_squares=False,
    ylag=2, xlag=2,
    estimator='least_mean_squares',
    basis_function=basis_function
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
    results(
        model.final_model, model.theta, model.err,
        model.n_terms, err_precision=8, dtype='sci'
    ),
    columns=['Regressors', 'Parameters', 'ERR']
)
print(r)
0.011788905296672262
Regressors Parameters ERR
0 x1(k-2) 8.9844E-01 9.55863828E-01
1 y(k-1) 1.9880E-01 4.06706802E-02
2 x1(k-1)y(k-1) 8.3552E-02 3.46188050E-03
Affine Least Mean Squares¶
# affine least mean squares
basis_function = Polynomial(degree=2)
model = FROLS(
    order_selection=False,
    n_terms=3,
    extended_least_squares=False,
    ylag=2, xlag=2,
    estimator='affine_least_mean_squares',
    basis_function=basis_function
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
    results(
        model.final_model, model.theta, model.err,
        model.n_terms, err_precision=8, dtype='sci'
    ),
    columns=['Regressors', 'Parameters', 'ERR']
)
print(r)
0.0017565655363110896
Regressors Parameters ERR
0 x1(k-2) 8.9983E-01 9.55863828E-01
1 y(k-1) 2.0000E-01 4.06706802E-02
2 x1(k-1)y(k-1) 1.0002E-01 3.46188050E-03
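To wrap up, the four estimators demonstrated above can be compared in a single loop (a sketch reusing the same model structure; the printed RRSE values should match the individual runs above):

for estimator in [
    'total_least_squares',
    'recursive_least_squares',
    'least_mean_squares',
    'affine_least_mean_squares',
]:
    model = FROLS(
        order_selection=False,
        n_terms=3,
        extended_least_squares=False,
        ylag=2, xlag=2,
        estimator=estimator,
        basis_function=Polynomial(degree=2)
    )
    model.fit(X=x_train, y=y_train)
    yhat = model.predict(X=x_valid, y=y_valid)
    print(estimator, root_relative_squared_error(y_valid, yhat))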