Research Article

Searching Most Efficient Neural Network Architecture Using Akaike's Information Criterion (AIC)

by  Gaurang Panchal, Amit Ganatra, Y.P.Kosta, Devyani Panchal
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 1 - Issue 5
Published: February 2010
Authors: Gaurang Panchal, Amit Ganatra, Y.P.Kosta, Devyani Panchal
DOI: 10.5120/126-242

Gaurang Panchal, Amit Ganatra, Y.P.Kosta, Devyani Panchal. Searching Most Efficient Neural Network Architecture Using Akaike's Information Criterion (AIC). International Journal of Computer Applications. 1, 5 (February 2010), 41-44. DOI=10.5120/126-242

                        @article{ 10.5120/126-242,
                        author  = { Gaurang Panchal and Amit Ganatra and Y. P. Kosta and Devyani Panchal },
                        title   = { Searching Most Efficient Neural Network Architecture Using Akaike's Information Criterion (AIC) },
                        journal = { International Journal of Computer Applications },
                        year    = { 2010 },
                        volume  = { 1 },
                        number  = { 5 },
                        pages   = { 41-44 },
                        doi     = { 10.5120/126-242 },
                        publisher = { Foundation of Computer Science (FCS), NY, USA }
                        }
                        %0 Journal Article
                        %D 2010
                        %A Gaurang Panchal
                        %A Amit Ganatra
                        %A Y.P.Kosta
                        %A Devyani Panchal
                        %T Searching Most Efficient Neural Network Architecture Using Akaike's Information Criterion (AIC)
                        %J International Journal of Computer Applications
                        %V 1
                        %N 5
                        %P 41-44
                        %R 10.5120/126-242
                        %I Foundation of Computer Science (FCS), NY, USA
Abstract

The problem of model selection is of considerable importance for achieving a high level of generalization capability in supervised learning. Neural networks are widely used in engineering applications because of their good generalization properties, yet it is not easy to decide the optimal size of a neural network because of its strong nonlinearity. Ecologists have long relied on hypothesis testing to include or exclude variables in models, although the conclusions often depend on the approach used; the advent of methods based on information theory, also known as information-theoretic approaches, has changed the way we look at model selection. Akaike's Information Criterion (AIC) has been used successfully in model selection. We discuss problems with commonly used information criteria and propose a model selection method: an ensemble neural network algorithm based on AIC.
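The abstract's idea of scoring candidate network sizes with AIC can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the least-squares form AIC = n·ln(RSS/n) + 2k, a single-hidden-layer network whose parameter count k is weights plus biases, and made-up residual sums of squares for candidate hidden-layer sizes.

```python
import numpy as np

def aic(rss, n, k):
    """Akaike's Information Criterion for a least-squares fit:
    AIC = n * ln(RSS / n) + 2k, where k is the number of free parameters."""
    return n * np.log(rss / n) + 2 * k

def mlp_param_count(n_in, n_hidden, n_out):
    # Single-hidden-layer network: (weights + bias) per hidden unit,
    # then (weights + bias) per output unit.
    return (n_in + 1) * n_hidden + (n_hidden + 1) * n_out

# Hypothetical training results: residual sum of squares obtained after
# training networks with 2..10 hidden neurons on 200 samples (4 inputs, 1 output).
n_samples = 200
rss_by_hidden = {2: 95.0, 4: 60.0, 6: 41.0, 8: 39.5, 10: 39.0}

# Larger networks fit the data slightly better (lower RSS), but AIC's 2k
# penalty favors the smallest network that explains the data well.
scores = {h: aic(rss, n_samples, mlp_param_count(4, h, 1))
          for h, rss in rss_by_hidden.items()}
best_hidden = min(scores, key=scores.get)
print(best_hidden)  # the hidden-layer size with the lowest AIC
```

With these illustrative numbers, AIC selects the 6-neuron network: going from 6 to 8 hidden neurons reduces RSS only marginally, so the penalty on the 12 extra parameters outweighs the improvement in fit.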

References
  • S. Rajasekaran, G. A. Vijayalakshmi Pai.: Neural Networks, Fuzzy Logic and Genetic Algorithms. PHI Publication (2005).
  • Jiawei Han, Micheline Kamber.: Data Mining: Concepts and Techniques. Morgan Kaufmann Publication (2005).
  • Pythia Version 1.02, The Neural Network Designer, http://www.runtime.org
  • Alyuda NeuroIntelligence 2.2, http://www.aluyada.com.
  • AIC, Wikipedia, http://www.wikipidia.com
  • AIC, http://www.modelselection.org
  • AIC, http://en.wikipedia.org/wiki/Residual_sum_of_squares
Index Terms
Computer Science
Information Sciences
Keywords

Neural Network, Hidden Neurons, Akaike's Information Criterion (AIC), Correct Classification Rate (CRR)
