INFORMATION TECHNOLOGY

Scientists propose attention spiking neural networks


Recently, Li Guoqi, a researcher at the Laboratory of Brain Atlas and Brain-like Intelligence of the Institute of Automation, Chinese Academy of Sciences, and Zhao Guangshe, a professor at Xi'an Jiaotong University, published a paper entitled "Attention Spiking Neural Networks" in IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI). The work integrates the attention mechanism into million-scale spiking neural networks and, for the first time, achieves performance comparable to conventional artificial neural networks on the ImageNet-1K dataset, with a theoretical energy efficiency 31.8 times that of an artificial neural network of the same architecture. The method significantly improves task performance while greatly reducing network energy consumption, offering a new direction for the development of low-power neuromorphic systems.

In recent years, deep learning based on conventional artificial neural networks has approached or surpassed human performance on some tasks, but these achievements have come at an enormous energy cost. The human brain, by contrast, performs tasks of the same or greater complexity with very little energy. Making machine intelligence work as efficiently as the brain is a long-standing goal. Neuromorphic computing based on spiking neural networks offers an attractive low-energy alternative to conventional artificial intelligence. Spiking neurons model the complex spatiotemporal dynamics of biological neurons, so their expressive power is in theory greater than that of existing artificial neurons. They also inherit the spike-based communication of biological neurons, which is the key to the low power consumption of spiking neural networks: on the one hand, neuromorphic systems need only perform low-energy synaptic additions; on the other hand, their event-driven nature means computation is triggered only when a neuron emits a spike. How to achieve high task performance at a low spike firing rate is therefore a central problem in neuromorphic computing. The human brain can naturally and efficiently pick out important information in complex scenes, an ability known as attention. Attention mechanisms are widely used in deep learning and have achieved remarkable results, but applying them in neuromorphic computing remains challenging.
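To make the spike-based, event-driven picture concrete, the following is a minimal leaky integrate-and-fire (LIF) neuron sketch in Python/NumPy. It illustrates the general principle described above rather than code from the paper; the constants, shapes, and names are assumptions.

import numpy as np

def lif_step(v, x, tau=2.0, v_th=1.0):
    """One discrete-time LIF update: leak, integrate, fire, hard reset."""
    v = v + (x - v) / tau                # leaky integration of input current x
    spike = (v >= v_th).astype(v.dtype)  # binary spike where threshold is crossed
    v = v * (1.0 - spike)                # reset the neurons that fired
    return v, spike

rng = np.random.default_rng(0)
v = np.zeros(8)              # membrane potentials of 8 neurons
w = rng.normal(size=(4, 8))  # synapses from the 8 neurons to 4 downstream neurons
for t in range(10):
    v, s = lif_step(v, rng.random(8))
    # Event-driven: only the weight columns selected by spikes contribute, so
    # the downstream update is a sum of synaptic additions, with no multiplies.
    downstream = w[:, s.astype(bool)].sum(axis=1)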

In order to incorporate attention mechanisms into spiking neural networks, three fundamental questions must be considered. First, the high energy efficiency of spiking neural networks rests on event-driven, spike-based communication, and the attention mechanism must not destroy this property. Second, spiking neural networks are used in a wide range of scenarios, so a variety of attention designs are needed to ensure effectiveness across them. Third, binary spike communication makes deep spiking neural networks prone to performance degradation caused by vanishing or exploding gradients, and adding an attention mechanism should at least not aggravate this problem.

As shown in Figure 1, attention in the human brain manifests mainly as the regulation of spike firing across different brain regions and neurons. Inspired by this, the study uses the attention mechanism to optimize the membrane potential distribution inside a spiking neural network, focusing on important features and suppressing unnecessary ones, and thereby regulating spike firing; the network architecture is shown in Figure 2. Further, to adapt attention spiking neural networks to diverse application scenarios, the study combines attention along the three dimensions of time, channel, and space (Figure 3), learning "when", "what", and "where" is important; a minimal code sketch of this idea follows the figure captions below.

Figure 1. The brain applies attention mechanisms at multiple structural levels

Figure 2. Multidimensional attention spiking neural network, in which attention mechanisms adjust the membrane potential distribution

Figure 3. Schematic diagram of the attention mechanism in the time, channel, and spatial dimensions
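To give a concrete sense of the idea, the following PyTorch sketch applies attention along one of the three dimensions (channel): attention weights rescale membrane potentials before thresholding, so unimportant channels fire fewer spikes. This is a hedged illustration of the concept, not the authors' exact module; the squeeze-and-excitation-style layer, tensor sizes, and fixed threshold are assumptions, and temporal and spatial attention would follow the same pattern along their respective dimensions.

import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel attention that modulates membrane potentials (illustrative)."""
    def __init__(self, channels, r=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // r), nn.ReLU(),
            nn.Linear(channels // r, channels), nn.Sigmoid())

    def forward(self, u):                 # u: membrane potentials, (B, C, H, W)
        s = u.mean(dim=(2, 3))            # squeeze the spatial dims -> (B, C)
        a = self.fc(s)[:, :, None, None]  # per-channel weights in (0, 1)
        return u * a                      # focus/suppress features before firing

attn = ChannelAttention(channels=16)
u = torch.randn(2, 16, 8, 8)              # membrane potentials
spikes = (attn(u) >= 1.0).float()         # fire where the modulated potential
                                          # crosses the threshold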

The research team evaluated the proposed multidimensional attention spiking neural network on event-based action recognition datasets and on the static image classification dataset ImageNet-1K. Experiments show that adding the attention module significantly improves the performance of the spiking neural network while also reducing the number of spikes in the network, thereby lowering the model's energy consumption. On the DVS128 Gait dataset, the multidimensional attention module reduced the spike firing of the original spiking neural network by 81.6% while improving performance by 4.7% (Table 1). On ImageNet-1K, the attention spiking neural network matches the performance of conventional artificial neural networks for the first time, with a theoretical energy efficiency 31.8 times that of an artificial neural network of the same architecture (Table 2). A rough sketch of how such theoretical energy comparisons are computed follows the table captions below.

Table 1. Performance comparison on DVS128 Gesture/Gait

Table 2. Performance comparison on ImageNet-1K
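Figures like "31.8 times" are typically derived by counting operations: an artificial neural network pays a multiply-accumulate (MAC) per synapse per inference, while a spiking network pays a cheaper accumulate (AC) only when a spike actually arrives. The sketch below shows the general shape of such a calculation; the per-operation energies are the commonly cited 45 nm process estimates, and the operation counts, timesteps, and firing rate are placeholders rather than the paper's numbers.

E_MAC, E_AC = 4.6e-12, 0.9e-12    # joules per 32-bit operation (45 nm estimates)

macs = 1.8e9                      # ANN: MACs per image (placeholder)
timesteps, firing_rate = 4, 0.15  # SNN: simulation steps and mean spike rate

ann_energy = macs * E_MAC                           # every synapse multiplies
snn_energy = macs * timesteps * firing_rate * E_AC  # only active synapses add

print(f"ANN {ann_energy * 1e3:.2f} mJ, SNN {snn_energy * 1e3:.2f} mJ, "
      f"ratio {ann_energy / snn_energy:.1f}x")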

The study also proposes a new visualization method to analyze why the attention module improves network performance while reducing spike firing. As shown in Figures 4 and 5, the spiking neural network with the attention mechanism suppresses unimportant background noise while focusing on important information (each pixel in a feature map represents the firing rate of one neuron: the redder the color, the higher the firing rate; the bluer, the lower). Across the feature maps, noisy channels and neurons have the highest firing rates, so suppressing noise information significantly reduces spike firing in the network. A minimal sketch of this firing-rate visualization follows the figure captions below.

Figure 4. Case studies on the Gait dataset

Figure 5. Spike responses on the DVS128 Gait dataset; the attention mechanism significantly suppresses background noise
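The visualization itself reduces to averaging binary spike tensors over time and rendering the per-neuron firing rates as a heat map. Below is a minimal Python/matplotlib sketch of that idea, with random data standing in for real spike recordings; the shapes and names are assumptions.

import numpy as np
import matplotlib.pyplot as plt

# (T, H, W) binary spike trains for a 32x32 feature map over 20 timesteps
spikes = np.random.default_rng(0).random((20, 32, 32)) < 0.1
rate = spikes.mean(axis=0)        # firing rate of each neuron, in [0, 1]

plt.imshow(rate, cmap="jet")      # red = higher firing rate, blue = lower
plt.colorbar(label="firing rate")
plt.title("Per-neuron spike firing rate")
plt.savefig("firing_rate_map.png")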

Furthermore, using block dynamical isometry theory, the study proves that a deep spiking neural network can still achieve dynamical isometry after the proposed attention module is added; that is, the attention module does not cause performance degradation in deep spiking neural networks.
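For context, here is a standard informal statement of dynamical isometry (a textbook definition, not taken from the article): a depth-L network trains stably when the singular values of its input-output Jacobian concentrate near 1, so backpropagated gradients are neither amplified nor attenuated as depth grows.

\[
  J \;=\; \frac{\partial \mathbf{y}}{\partial \mathbf{x}}
    \;=\; \prod_{l=1}^{L} J_l ,
  \qquad
  \sigma_i(J) \approx 1 \quad \text{for all } i ,
\]

where J_l is the Jacobian of layer l and σ_i(J) denotes a singular value of J. Block dynamical isometry extends this criterion to networks composed of larger blocks, which is what allows an inserted attention module to be analyzed as a single unit.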

In summary, this work explores how to use attention mechanisms in spiking neural networks and finds that inserting attention as an auxiliary module significantly improves task performance while greatly reducing the spikes the network emits. Visualizing the spike responses of the original and attention spiking neural networks shows that the attention mechanism helps the network focus on important information while suppressing noisy information, which accounts for a large share of the spikes in noisy channels and neurons. Neuromorphic computing based on spiking neural networks can therefore, like the human brain, achieve better performance at lower energy consumption.

The research was supported by the Beijing Science Foundation for Outstanding Young Scholars, a key project of the National Natural Science Foundation of China, and the Joint Fund for Regional Innovation and Development, among other sources. Researchers from Peking University and Tsinghua University participated in the work. (Source: Institute of Automation, Chinese Academy of Sciences)

Related paper information: https://doi.org/10.1109/TPAMI.2023.3241201



