Optimizing Temporal Hyperparameters Through Ablation Studies in Spiking Neural Networks
Neural networks are machine learning models that learn from existing data to make predictions on new data, and they are used throughout academia, industry, and everyday life to simplify and optimize tasks. Our research concerns a specific type of neural network, the Spiking Neural Network (SNN). SNNs are highly power efficient and well suited to spatio-temporal learning because their neurons communicate through binary spikes distributed over time. However, this non-differentiable spiking output makes them harder to train, so they often fall short of the accuracy and precision of more traditional artificial neural networks. Our current focus is this temporality: we study how certain temporal constants affect SNN accuracy through an ablation study, varying each constant across several datasets. After visualizing the results, we assess the significance of each variable using box-and-whisker plots and random forest feature importance. Our main findings are that some variables have clear optimal ranges, while others are codependent, reaching high accuracy only in combination. The strongest effects are found in the decay after a spike (𝜏_s), the ratio between the decay after a spike and the decay after an input (𝜏_s/𝜏_m), and a warmup constant (a). As such, we are currently experimenting with additional values and alternative ways to analyze the data. Understanding the relationships among fixed constants, temporal variables, and accuracy points us toward viable directions for future experiments.
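The abstract does not specify the exact neuron model, so the sketch below assumes a standard current-based leaky integrate-and-fire (LIF) neuron as an illustration of what the swept temporal constants control: one decay constant governs the synaptic current and the other the membrane potential, and their ratio (𝜏_s/𝜏_m) shapes how long input evidence persists before a spike. All names and values here are hypothetical, not taken from the study.

```python
import numpy as np

def lif_forward(inputs, tau_m=20.0, tau_s=5.0, v_th=1.0, dt=1.0):
    """Simulate one current-based LIF neuron over len(inputs) time steps.

    Hypothetical sketch: tau_m and tau_s stand in for the membrane and
    synaptic decay constants ablated in the study; the study's actual
    model and parameter names may differ.
    """
    alpha = np.exp(-dt / tau_m)  # per-step membrane decay factor
    beta = np.exp(-dt / tau_s)   # per-step synaptic-current decay factor
    v, i_syn = 0.0, 0.0
    spikes = np.zeros(len(inputs))
    for t, x in enumerate(inputs):
        i_syn = beta * i_syn + x   # input current decays between spikes
        v = alpha * v + i_syn      # membrane potential decays between inputs
        if v >= v_th:              # threshold crossing emits a binary spike
            spikes[t] = 1.0
            v = 0.0                # hard reset after the spike
    # The binary, non-differentiable output produced here is why SNNs
    # typically need surrogate gradients to train with backpropagation.
    return spikes
```

With dt fixed, an ablation study of the kind described above amounts to sweeping tau_s, tau_m (and hence their ratio) over a grid and recording accuracy per dataset.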
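For the analysis step, a hedged sketch of extracting variable significance with a random forest (the "random forest feature importance" mentioned above) might look like the following, assuming the ablation results are tabulated as one row per run. The column names and accuracy values are illustrative placeholders, not results from the study.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical ablation results; columns and values are illustrative only.
results = pd.DataFrame({
    "tau_s":    [2.0, 2.0, 4.0, 4.0, 8.0, 8.0],
    "tau_m":    [10.0, 20.0, 10.0, 20.0, 10.0, 20.0],
    "warmup_a": [0.1, 0.5, 0.1, 0.5, 0.1, 0.5],
    "accuracy": [0.81, 0.88, 0.84, 0.91, 0.79, 0.86],
})
# Derived ratio feature, one of the variables the study highlights.
results["tau_ratio"] = results["tau_s"] / results["tau_m"]

features = ["tau_s", "tau_m", "warmup_a", "tau_ratio"]
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(results[features], results["accuracy"])

# Impurity-based importances rank how strongly each temporal
# hyperparameter drives accuracy across the ablation grid.
for name, imp in sorted(zip(features, rf.feature_importances_),
                        key=lambda p: -p[1]):
    print(f"{name}: {imp:.3f}")
```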
Project Mentor: Richard Boone
Faculty Mentor: Peng Li