Conditioning Optimization of Extreme Learning Machine by Multitask Beetle Antennae Swarm Algorithm

Reading time: 5 minutes
...

📝 Original Info

  • Title: Conditioning Optimization of Extreme Learning Machine by Multitask Beetle Antennae Swarm Algorithm
  • ArXiv ID: 1811.09100
  • Date: 2023-05-18
  • Authors: G.B. Huang, Wong et al., Xu et al., Rong et al., and others

📝 Abstract

Extreme learning machine (ELM), a simple and rapid neural network, has shown good performance in various areas. Unlike the general single hidden layer feedforward neural network (SLFN), the input weights and biases of ELM's hidden layer are generated randomly, so training the model incurs only a small computational overhead. However, selecting input weights and biases at random may result in an ill-posed problem. Aiming to optimize the conditioning of ELM, we propose an effective particle swarm heuristic algorithm called the Multitask Beetle Antennae Swarm Algorithm (MBAS), inspired by the structures of the Artificial Bee Colony (ABC) algorithm and the Beetle Antennae Search (BAS) algorithm. The proposed MBAS is then applied to optimize the input weights and biases of ELM. Experimental results show that the proposed method simultaneously reduces the condition number and the regression error, and achieves good generalization performance.

💡 Deep Analysis

Figure 1

📄 Full Content

Extreme learning machine (ELM), proposed by G.B. Huang [1], is a feasible single hidden layer feedforward network (SLFN). It is composed of three core components: an input layer, a hidden layer, and an output layer, and it has been successfully applied to many research and engineering problems. As a flexible and fast SLFN, ELM assigns the input weights and biases of its hidden layer randomly, which greatly increases training speed and allows large volumes of data to be processed in a short time. ELM is therefore a good choice for tasks requiring real-time responses. Wong et al. used ELM to detect real-time fault signals in a gas turbine generator system [2]. Xu et al. proposed an ELM-based predictor for immediate assessment of electrical power systems [3][4]. Moreover, ELM and its variants have been shown to outperform most classical machine learning methods in image processing [5][6][27], speech recognition [7][8][9], biomedical sciences [10][11][12], and so on.
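The training scheme described above can be sketched in a few lines: the hidden-layer weights and biases are drawn at random and frozen, and only the output weights are solved in closed form. This is a minimal illustrative sketch, not the paper's implementation; the activation (tanh), weight range, and function names are assumptions.

```python
import numpy as np

def train_elm(X, y, n_hidden=20, seed=0):
    """Minimal ELM sketch: random hidden layer, least-squares output layer."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Input weights W and biases b are drawn at random and never updated.
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = np.tanh(X @ W + b)  # hidden-layer output matrix
    # Output weights beta solve H @ beta = y in the least-squares sense,
    # beta = pinv(H) @ y -- the only "training" step ELM performs.
    beta = np.linalg.pinv(H) @ y
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression: learn y = sin(x) on [0, pi].
X = np.linspace(0.0, np.pi, 50).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = train_elm(X, y)
err = np.mean((predict_elm(X, W, b, beta) - y) ** 2)
```

Because the only learned parameters come from a single pseudoinverse, training cost is dominated by one matrix factorization, which is why ELM handles large data sets quickly.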

Many efforts have focused on enhancing the accuracy of ELM by adjusting the network architecture or changing the number of hidden nodes according to certain rules. An incremental constructive ELM, which adds hidden nodes on the basis of convex optimization and neural network theory, was proposed by Huang et al. in 2007 [13]. Rong et al. instead dropped irrelevant hidden nodes by measuring the statistical correlation of each node [14]. This pruned ELM not only raises precision but also reduces computational cost. However, it is inappropriate to evaluate a model by testing accuracy alone, because stability is also one of the most significant criteria for machine learning models. Choosing the coefficients of the hidden nodes randomly cannot ensure the numerical stability of ELM; on the contrary, it increases the risk of an ill-conditioned problem, meaning that the network output may change tremendously even when the input values fluctuate only slightly.

To obtain a well-conditioned ELM network, heuristic algorithms are generally adopted to optimize the parameters of the hidden nodes. Recently, a promising heuristic called the Beetle Antennae Search (BAS) algorithm was proposed [20]. The beetle model has two antennae pointing in different directions to detect new solutions, and it always moves in the direction of the antenna with the better result. Even though its velocity-update mechanism is uncomplicated, BAS performs very well in many applications. Nevertheless, it is hard for BAS to find an acceptable solution in non-ideal scenarios, since its search ability is sensitive to the initial step length: if the step is too large, the search may miss the globally optimal solution, while a step that is too small leads to "false convergence" as the step size decreases.
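The two-antennae mechanism can be sketched as follows: at each iteration the beetle probes the objective at two points along a random direction and steps toward the better one, with the step length decaying geometrically. This is a generic sketch of BAS under assumed parameter values (step size, decay rate, sensing length), not the exact variant from [20].

```python
import math
import random

def bas_minimize(f, x0, step=1.0, d0=0.5, eta=0.95, iters=200, seed=0):
    """Beetle Antennae Search (sketch): probe two antennae along a random
    direction, then step toward the antenna with the better objective."""
    rng = random.Random(seed)
    x = list(x0)
    n = len(x)
    for _ in range(iters):
        # Random unit direction for the pair of antennae.
        d = [rng.gauss(0.0, 1.0) for _ in range(n)]
        norm = math.sqrt(sum(v * v for v in d)) or 1.0
        d = [v / norm for v in d]
        left = [xi + d0 * di for xi, di in zip(x, d)]
        right = [xi - d0 * di for xi, di in zip(x, d)]
        # Move one step toward the better-smelling antenna.
        sign = 1.0 if f(left) < f(right) else -1.0
        x = [xi + sign * step * di for xi, di in zip(x, d)]
        step *= eta  # shrinking step: coarse search early, fine search late
        d0 *= eta
    return x

# Minimize a simple quadratic from a distant start.
best = bas_minimize(lambda v: sum(vi * vi for vi in v), [3.0, -2.0])
```

The geometric decay of `step` is exactly the sensitivity discussed above: if the initial `step` is too large the beetle can jump past the optimum, and if it is too small the step vanishes before the beetle arrives, producing "false convergence".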

The beetle in BAS always moves toward the better direction. Since improving the conditioning of an ELM network is not a unimodal problem, a single beetle particle easily falls into a local optimum. To enhance the search ability of BAS, we add more beetle particles. Unlike Particle Swarm Optimization (PSO) [16], we propose a novel particle swarm algorithm called the Multitask Beetle Antennae Swarm Algorithm (MBAS), built on the framework of the Artificial Bee Colony (ABC) algorithm [22], in which different particles follow different update rules. We add follower particles and explorer particles to enlarge the search range and prevent the swarm from falling into local optima. Several particle-swarm-based algorithms [17][18][19] have been adopted to increase the accuracy of ELM. Analogously, we include the condition number of ELM in the fitness function of MBAS to optimize the input weights and biases of the hidden layer.
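Putting the condition number into the fitness function might look like the sketch below: a candidate solution encodes the flattened input weights and biases, and its fitness combines the regression error with a conditioning penalty on H. The weighted-sum form, the log scaling, and the weight `alpha` are assumptions for illustration; the paper's exact fitness is defined in Section 3.

```python
import numpy as np

def elm_fitness(params, X, y, n_hidden, alpha=1e-3):
    """Fitness sketch for conditioning optimization: trade off regression
    error against the condition number of the hidden-layer matrix H.
    The flat parameter layout [W, b] and the weight alpha are assumptions."""
    n_features = X.shape[1]
    W = params[: n_features * n_hidden].reshape(n_features, n_hidden)
    b = params[n_features * n_hidden :]
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ y          # closed-form output weights
    err = np.mean((H @ beta - y) ** 2)    # training regression error
    # Penalize ill-conditioning; lower fitness is better for the swarm.
    return err + alpha * np.log(np.linalg.cond(H))

rng = np.random.default_rng(1)
X = np.linspace(0.0, np.pi, 40).reshape(-1, 1)
y = np.sin(X).ravel()
n_hidden = 10
params = rng.uniform(-1.0, 1.0, size=1 * n_hidden + n_hidden)
score = elm_fitness(params, X, y, n_hidden)
```

Each beetle particle would evaluate candidates with this fitness, so the swarm is driven to positions whose hidden layers are both accurate and well conditioned.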

The main contributions of this paper are as follows: (1) a novel beetle swarm optimization algorithm, extending BAS, is proposed to enhance the ability to search for the optimal solution. A beetle group of fixed population size is defined, in which each beetle plays a different role so as to enlarge the search route around the solution. (2) An improved ELM, named Multitask Beetle Antennae Swarm Algorithm Extreme Learning Machine (MBAS-ELM), is then proposed by combining the beetle swarm algorithm with ELM to optimize the parameters of the hidden nodes. Experimental results show that MBAS-ELM is able to lower both the condition number and the testing error for regression. More details about MBAS and MBAS-ELM are discussed in Section 3.

The remainder of this paper is organized as follows. Section 2 outlines ELM and BAS. Section 3 introduces the proposed MBAS and MBAS-ELM in detail. Section 4 presents experimental results and discussion. Finally, Section 5 concludes our work.

Extreme learning machine (ELM) prese


Reference

This content is AI-processed based on open access ArXiv data.
