Building NARX models using general estimators

Example created by Wilson Rocha Lacerda Junior

In this example we will create NARX models using different estimators such as GradientBoostingRegressor, Bayesian Ridge Regression, Automatic Relevance Determination (ARD) Regression, and CatBoost.

pip install sysidentpy
Note: you may need to restart the kernel to use updated packages.
import matplotlib.pyplot as plt
from sysidentpy.metrics import mean_squared_error
from sysidentpy.utils.generate_data import get_siso_data
from sysidentpy.general_estimators import NARX
from sklearn.linear_model import BayesianRidge, ARDRegression
from sklearn.ensemble import GradientBoostingRegressor
from catboost import CatBoostRegressor
# simulated dataset
x_train, x_valid, y_train, y_valid = get_siso_data(n=10000,
                                                   colored_noise=False,
                                                   sigma=0.01,
                                                   train_percentage=80)

Importance of the NARX architecture

To get an idea of the importance of the NARX architecture, let's take a look at the performance of the models without the NARX configuration.

# Base estimators fitted directly on the measured data (no lagged inputs or outputs)
catboost = CatBoostRegressor(iterations=300,
                             learning_rate=0.1,
                             depth=6)
gb = GradientBoostingRegressor(loss='quantile', alpha=0.90,
                                n_estimators=250, max_depth=10,
                                learning_rate=.1, min_samples_leaf=9,
                                min_samples_split=9)
def plot_results(yvalid, yhat):
    # Plot the first 200 validation samples against the model prediction
    _, ax = plt.subplots(figsize=(14, 8))
    ax.plot(yvalid[:200], label='Data', marker='o')
    ax.plot(yhat[:200], label='Prediction', marker='*')
    ax.set_xlabel("$n$", fontsize=18)
    ax.set_ylabel("$y[n]$", fontsize=18)
    ax.grid()
    ax.legend(fontsize=18)
    plt.show()
catboost.fit(x_train, y_train, verbose=False)

plot_results(y_valid, catboost.predict(x_valid))
../_images/general_estimators_9_0.png
gb.fit(x_train, y_train)

plot_results(y_valid, gb.predict(x_valid))
C:\Users\wilso\miniconda3\envs\sysidentpy\lib\site-packages\sklearn\utils\validation.py:63: DataConversionWarning: A column-vector y was passed when a 1d array was expected. Please change the shape of y to (n_samples, ), for example using ravel().
  return f(*args, **kwargs)
../_images/general_estimators_10_1.png
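
For a numeric comparison with the NARX versions below, you can also compute the mean squared error of these direct (non-NARX) fits. This is a minimal sketch reusing the mean_squared_error function imported above; the exact values depend on the random data drawn by get_siso_data, so they are not reported here.

# MSE of the estimators fitted directly on x[k] -> y[k], without any lagged terms.
# Values will vary from run to run because get_siso_data generates random data.
catboost_mse = mean_squared_error(y_valid, catboost.predict(x_valid).reshape(-1, 1))
gb_mse = mean_squared_error(y_valid, gb.predict(x_valid).reshape(-1, 1))
print(f"CatBoost (no NARX): {catboost_mse}")
print(f"Gradient Boosting (no NARX): {gb_mse}")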

Introducing the NARX configuration using SysIdentPy

As you can see, you just need to pass the base estimator you want to the NARX class from SysIdentPy to build the NARX model! You can choose the lags of the input and output variables used to build the regressor matrix.

We keep the fit/predict method to make the process straightforward.
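
To make the meaning of xlag and ylag concrete, the sketch below builds a lagged regressor matrix by hand for xlag=2 and ylag=2. This is only an illustration of the idea, not SysIdentPy's internal implementation (the library's own regressor matrix may contain additional terms; the logs below report 5 features).

import numpy as np

def build_lagged_matrix(x, y, xlag=2, ylag=2):
    # Illustrative helper: each row holds y[k-1], ..., y[k-ylag], x[k-1], ..., x[k-xlag],
    # so a conventional regressor can map past samples to the current output y[k].
    x, y = x.ravel(), y.ravel()
    p = max(xlag, ylag)
    rows = [
        [y[k - i] for i in range(1, ylag + 1)] + [x[k - i] for i in range(1, xlag + 1)]
        for k in range(p, len(y))
    ]
    return np.array(rows), y[p:]

features, target = build_lagged_matrix(x_train, y_train)
print(features.shape, target.shape)  # (n_samples - 2, 4), (n_samples - 2,)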

NARX with CatBoost

from sysidentpy.general_estimators import NARX

catboost_narx = NARX(base_estimator=CatBoostRegressor(iterations=300,
                                                      learning_rate=0.1,
                                                      depth=6),
                     xlag=2,
                     ylag=2,
                     fit_params={'verbose': False}
)

catboost_narx.fit(x_train, y_train)
yhat = catboost_narx.predict(x_valid, y_valid)
print(mean_squared_error(y_valid, yhat))

ee, ex, extras, lam = catboost_narx.residuals(x_valid, y_valid, yhat)
catboost_narx.plot_result(y_valid, yhat, ee, ex, n=200)
09-22 16:13:44 - INFO - Training the model
09-22 16:13:44 - INFO - Creating the regressor matrix
09-22 16:13:44 - INFO - The regressor matrix have 5 features
09-22 16:13:45 - INFO - Done! Model is built!
0.00014978326704096954
../_images/general_estimators_13_5.png

NARX with Gradient Boosting

from sysidentpy.general_estimators import NARX

gb_narx = NARX(base_estimator=GradientBoostingRegressor(loss='quantile', alpha=0.90,
                                n_estimators=250, max_depth=10,
                                learning_rate=.1, min_samples_leaf=9,
                                min_samples_split=9),
              xlag=2,
              ylag=2
)

gb_narx.fit(x_train, y_train)
yhat = gb_narx.predict(x_valid, y_valid)
print(mean_squared_error(y_valid, yhat))

ee, ex, extras, lam = gb_narx.residuals(x_valid, y_valid, yhat)
gb_narx.plot_result(y_valid, yhat, ee, ex, n=200)
09-22 16:13:47 - INFO - Training the model
09-22 16:13:47 - INFO - Creating the regressor matrix
09-22 16:13:47 - INFO - The regressor matrix have 5 features
09-22 16:13:55 - INFO - Done! Model is built!
0.0010765485009297116
../_images/general_estimators_15_6.png

NARX with ARD

from sysidentpy.general_estimators import NARX

ARD_narx = NARX(base_estimator=ARDRegression(),
                xlag=2,
                ylag=2
)

ARD_narx.fit(x_train, y_train)
yhat = ARD_narx.predict(x_valid, y_valid)
print(mean_squared_error(y_valid, yhat))

ee, ex, extras, lam = ARD_narx.residuals(x_valid, y_valid, yhat)
ARD_narx.plot_result(y_valid, yhat, ee, ex, n=200)
09-22 16:13:56 - INFO - Training the model
09-22 16:13:56 - INFO - Creating the regressor matrix
09-22 16:13:56 - INFO - The regressor matrix have 5 features
09-22 16:13:56 - INFO - Done! Model is built!
0.0009871679464452691
../_images/general_estimators_17_6.png

NARX with Bayesian Ridge

from sysidentpy.general_estimators import NARX

BayesianRidge_narx = NARX(base_estimator=BayesianRidge(),
                          xlag=2,
                          ylag=2
)

BayesianRidge_narx.fit(x_train, y_train)
yhat = BayesianRidge_narx.predict(x_valid, y_valid)
print(mean_squared_error(y_valid, yhat))

ee, ex, extras, lam = BayesianRidge_narx.residuals(x_valid, y_valid, yhat)
BayesianRidge_narx.plot_result(y_valid, yhat, ee, ex, n=200)
09-22 16:13:57 - INFO - Training the model
09-22 16:13:57 - INFO - Creating the regressor matrix
09-22 16:13:57 - INFO - The regressor matrix have 5 features
09-22 16:13:57 - INFO - Done! Model is built!
0.0009856741830581606
../_images/general_estimators_19_6.png
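
To wrap up, you can gather the MSE values printed above into a single summary. The numbers below are the ones reported in this particular run; yours will differ slightly.

# MSE values printed in the cells above (this run), sorted from best to worst
results = {
    "CatBoost NARX": 0.00014978326704096954,
    "Gradient Boosting NARX": 0.0010765485009297116,
    "ARD NARX": 0.0009871679464452691,
    "Bayesian Ridge NARX": 0.0009856741830581606,
}
for name, mse in sorted(results.items(), key=lambda item: item[1]):
    print(f"{name}: {mse:.6f}")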