Complexity Reduction of Local Linear Models Extracted from Neural Networks

Authors

  • Tamás Kenesei
  • Balázs Feil
  • János Abonyi

Keywords

model reduction, model transformation, knowledge discovery

Abstract

Nonlinear black-box models have become increasingly important not only in research but also in industrial practice. Their main disadvantage, however, is that they are often too complex and not interpretable; validating them by human experts is therefore a hard and complex task. It is a challenge to utilize a priori knowledge and integrate it into the black-box modeling approach. This can be a difficult multi-stage process, and one of its steps can be the reduction of the identified model. Model reduction is also important to avoid overparameterization and to reduce the time and computational demand of the model. This article shows how model reduction techniques can be used to reduce the complexity of local linear models extracted from neural networks. One applicable family of methods is orthogonal techniques. These methods can roughly be divided into two groups: rank-revealing ones, such as the SVD-QR algorithm, and those that evaluate the individual contribution of the rules or local models, such as the orthogonal least-squares (OLS) approach. The latter technique requires more computation, but for system identification purposes it is preferable because it gives a better approximation result. Apart from these, other methods can also be used to reduce the number of local models: the most similar local models can be merged. The analyzed methods are applied for knowledge discovery from neural networks.
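
As an illustration of the two orthogonal reduction strategies mentioned in the abstract, the sketch below shows a minimal rank-revealing SVD-QR selection and a greedy OLS ranking based on error reduction ratios, applied to a matrix P whose columns correspond to the local models (for example their outputs or firing strengths on the identification data). The function names, the tolerance parameter, and the use of NumPy/SciPy are illustrative assumptions for this sketch, not the paper's own implementation.

import numpy as np
from scipy.linalg import qr


def svd_qr_select(P, tol=0.01):
    # Rank-revealing SVD-QR subset selection (sketch).
    # P: (N, M) matrix, one column per local model evaluated on N data
    # points; tol: relative singular-value threshold used to estimate
    # the effective rank r.
    U, s, Vt = np.linalg.svd(P, full_matrices=False)
    r = int(np.sum(s > tol * s[0]))          # effective rank estimate
    # QR with column pivoting on the first r right singular vectors;
    # the pivot order ranks the columns (local models) by importance.
    _, _, piv = qr(Vt[:r, :], pivoting=True)
    return np.sort(piv[:r])


def ols_rank(P, y, n_select):
    # Greedy orthogonal least-squares selection (sketch): at each step,
    # pick the column with the largest error reduction ratio (ERR) with
    # respect to the target y, after orthogonalizing it against the
    # columns already selected.
    selected, W = [], []
    yty = float(y @ y)
    for _ in range(n_select):
        best_j, best_err, best_w = None, -1.0, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            w = P[:, j].astype(float).copy()
            for wk in W:
                w = w - (wk @ P[:, j]) / (wk @ wk) * wk
            denom = float(w @ w)
            if denom < 1e-12:
                continue                      # numerically dependent column
            err = float(w @ y) ** 2 / (denom * yty)
            if err > best_err:
                best_j, best_err, best_w = j, err, w
        if best_j is None:
            break
        selected.append(best_j)
        W.append(best_w)
    return selected

A simple comparison under these assumptions would be keep = svd_qr_select(P) followed by ols_rank(P, y, len(keep)): the first routine ranks local models purely by the structure of P, while the second also takes the model output y into account, which is why the abstract notes that OLS is preferable for system identification despite its higher computational cost.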

Published

2007-02-15

How to Cite

Complexity Reduction of Local Linear Models Extracted from Neural Networks. (2007). ACTA AGRARIA KAPOSVARIENSIS, 11(2), 259-271. https://journal.uni-mate.hu/index.php/aak/article/view/1887
