Parameter Estimation¶
Example created by Wilson Rocha Lacerda Junior
Here we import the NARMAX model building method (FROLS), the metric for model evaluation, and the methods to generate sample data for testing. We also import pandas to display the results.
import pandas as pd
from sysidentpy.model_structure_selection import FROLS
from sysidentpy.basis_function._basis_function import Polynomial
from sysidentpy.metrics import root_relative_squared_error
from sysidentpy.utils.generate_data import get_siso_data
from sysidentpy.utils.display_results import results
Generating 1-input, 1-output sample data¶
The data is generated by simulating the following model:
$y_k = 0.2y_{k-1} + 0.1y_{k-1}x_{k-1} + 0.9x_{k-1} + e_{k}$
If colored_noise is set to True:
$e_{k} = 0.8\nu_{k-1} + \nu_{k}$
where $x$ is a uniformly distributed random variable and $\nu$ is a Gaussian distributed variable with $\mu=0$ and $\sigma=0.1$.
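Before calling get_siso_data, it may help to see the difference equation as code. A minimal NumPy sketch of the simulated system (get_siso_data handles this internally; the input range $[-1, 1]$ is an assumption made here for illustration):
# Minimal sketch of the simulated system above. get_siso_data handles this
# internally; the input range [-1, 1] is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(seed=42)
n = 1000
x = rng.uniform(-1, 1, n)      # uniformly distributed input (assumed range)
e = rng.normal(0, 0.001, n)    # white Gaussian noise
y = np.zeros(n)
for k in range(1, n):
    y[k] = 0.2 * y[k - 1] + 0.1 * y[k - 1] * x[k - 1] + 0.9 * x[k - 1] + e[k]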
In the following example we generate 1000 samples with white noise and select 90% of the data to train the model.
x_train, x_valid, y_train, y_valid = get_siso_data(
n=1000, colored_noise=False, sigma=0.001, train_percentage=90
)
There are several methods available for parameter estimation¶
- least_squares
- total_least_squares
- recursive_least_squares
- least_mean_squares
- affine_least_mean_squares
- least_mean_squares_sign_error
- normalized_least_mean_squares
- least_mean_squares_normalized_sign_error
- least_mean_squares_sign_regressor
- least_mean_squares_normalized_sign_regressor
- least_mean_squares_sign_sign
- least_mean_squares_normalized_sign_sign
- least_mean_squares_normalized_leaky
- least_mean_squares_leaky
- least_mean_squares_fourth
- least_mean_squares_mixed_norm
Polynomial NARMAX models are linear in the parameters, so least squares based methods work well in most cases (combined with the extended least squares algorithm when dealing with colored noise). However, the user can also choose recursive and stochastic gradient descent methods (in this case, the least mean squares algorithm and its variants) for that task.
Choosing a method is straightforward: pass any of the names listed above as the estimator parameter. Note that passing a string is deprecated and will raise an error in v0.4.0, when the estimator will have to be passed as an object (see the FutureWarning reproduced below and the sketch after the following note).
- Note: the adaptive filters have specific parameters that need to be tuned. In the following examples we use the default values. More examples regarding tuned parameters will be available soon; for now, the user can check the method documentation for more information.
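A minimal sketch of the object-based estimator API announced by the FutureWarning, assuming the estimator classes are exposed in sysidentpy.parameter_estimation (check your installed version before relying on this import path):
# Sketch of the object-based estimator API announced by the FutureWarning.
# The import path below is an assumption based on the warning message.
from sysidentpy.parameter_estimation import LeastSquares

model = FROLS(
    order_selection=False,
    n_terms=3,
    ylag=2,
    xlag=2,
    estimator=LeastSquares(),
    basis_function=Polynomial(degree=2),
)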
Total Least Squares¶
basis_function = Polynomial(degree=2)
model = FROLS(
order_selection=False,
n_terms=3,
extended_least_squares=False,
ylag=2,
xlag=2,
estimator="total_least_squares",
basis_function=basis_function,
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
results(
model.final_model,
model.theta,
model.err,
model.n_terms,
err_precision=8,
dtype="sci",
),
columns=["Regressors", "Parameters", "ERR"],
)
print(r)
FutureWarning: Passing a string to define the estimator will rise an error in v0.4.0. You'll have to use FROLS(estimator=LeastSquares()) instead. The only change is that you'll have to define the estimator first instead of passing a string like 'least_squares'. This change will make easier to implement new estimators and it'll improve code readability.
0.002052875265903371
      Regressors  Parameters             ERR
0        x1(k-2)  8.9997E-01  9.56561336E-01
1         y(k-1)  1.9994E-01  4.00308024E-02
2  x1(k-1)y(k-1)  1.0009E-01  3.40419218E-03
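For reference, the metric printed above is the root relative squared error, which compares the model residuals against a mean predictor:
$RRSE = \sqrt{\frac{\sum_{k=1}^{n}(y_k - \hat{y}_k)^2}{\sum_{k=1}^{n}(y_k - \bar{y})^2}}$
where $\hat{y}_k$ is the model prediction and $\bar{y}$ is the mean of the validation data. Values close to zero indicate that the model explains almost all of the output variance.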
Recursive Least Squares¶
# recursive least squares
basis_function = Polynomial(degree=2)
model = FROLS(
order_selection=False,
n_terms=3,
extended_least_squares=False,
ylag=2,
xlag=2,
estimator="recursive_least_squares",
basis_function=basis_function,
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
results(
model.final_model,
model.theta,
model.err,
model.n_terms,
err_precision=8,
dtype="sci",
),
columns=["Regressors", "Parameters", "ERR"],
)
print(r)
0.0020701726710049364
      Regressors  Parameters             ERR
0        x1(k-2)  9.0015E-01  9.56561336E-01
1         y(k-1)  1.9989E-01  4.00308024E-02
2  x1(k-1)y(k-1)  9.9799E-02  3.40419218E-03
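The recursive estimator also exposes tunable hyperparameters, such as the forgetting factor. A hedged sketch, assuming the forgetting factor is exposed as the lam keyword (the exact name and default should be verified in the estimator documentation):
# Hypothetical tuning sketch: the `lam` keyword (forgetting factor) is an
# assumption here; check the recursive least squares documentation.
model = FROLS(
    order_selection=False,
    n_terms=3,
    ylag=2,
    xlag=2,
    estimator="recursive_least_squares",
    lam=0.99,  # values close to 1 weight past samples more heavily
    basis_function=Polynomial(degree=2),
)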
Least Mean Squares¶
# least mean squares
basis_function = Polynomial(degree=2)
model = FROLS(
order_selection=False,
n_terms=3,
extended_least_squares=False,
ylag=2,
xlag=2,
estimator="least_mean_squares",
basis_function=basis_function,
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
results(
model.final_model,
model.theta,
model.err,
model.n_terms,
err_precision=8,
dtype="sci",
),
columns=["Regressors", "Parameters", "ERR"],
)
print(r)
0.0076840766645651214
      Regressors  Parameters             ERR
0        x1(k-2)  8.9806E-01  9.56561336E-01
1         y(k-1)  1.9729E-01  4.00308024E-02
2  x1(k-1)y(k-1)  9.1871E-02  3.40419218E-03
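The least mean squares family is driven mainly by the step size. A hedged sketch, assuming the step size is exposed as the mu keyword (verify the name and default in the estimator documentation):
# Hypothetical tuning sketch: the `mu` keyword (step size) is an assumption
# here; check the least mean squares documentation.
model = FROLS(
    order_selection=False,
    n_terms=3,
    ylag=2,
    xlag=2,
    estimator="least_mean_squares",
    mu=0.01,  # larger steps adapt faster but can become unstable
    basis_function=Polynomial(degree=2),
)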
Affine Least Mean Squares¶
# affine least mean squares
basis_function = Polynomial(degree=2)
model = FROLS(
order_selection=False,
n_terms=3,
extended_least_squares=False,
ylag=2,
xlag=2,
estimator="affine_least_mean_squares",
basis_function=basis_function,
)
model.fit(X=x_train, y=y_train)
yhat = model.predict(X=x_valid, y=y_valid)
rrse = root_relative_squared_error(y_valid, yhat)
print(rrse)
r = pd.DataFrame(
results(
model.final_model,
model.theta,
model.err,
model.n_terms,
err_precision=8,
dtype="sci",
),
columns=["Regressors", "Parameters", "ERR"],
)
print(r)
0.0020846666245813036
      Regressors  Parameters             ERR
0        x1(k-2)  8.9986E-01  9.56561336E-01
1         y(k-1)  1.9991E-01  4.00308024E-02
2  x1(k-1)y(k-1)  1.0007E-01  3.40419218E-03
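Because the fit/predict workflow is identical for every estimator, comparing them on the same split takes only a loop. This sketch reuses only the calls already shown above:
# Compare several estimators on the same train/validation split.
for estimator in ["total_least_squares", "recursive_least_squares", "least_mean_squares"]:
    model = FROLS(
        order_selection=False,
        n_terms=3,
        ylag=2,
        xlag=2,
        estimator=estimator,
        basis_function=Polynomial(degree=2),
    )
    model.fit(X=x_train, y=y_train)
    yhat = model.predict(X=x_valid, y=y_valid)
    print(estimator, root_relative_squared_error(y_valid, yhat))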