Enhancing Additive Recurrent OCOS Neural Networks with Chebyshev Polynomial Activation Functions for Signal Processing and AI Applications
• Rakesh Sengupta
SR University

Corresponding Author: rakesh.sengupta@sru.edu.in


Abstract

Neural networks have proven highly effective in signal processing and AI, particularly in handling complex, high-dimensional data with temporal dependencies. Additive recurrent On-Center Off-Surround (OCOS) networks are inspired by biological neural systems and are known for their ability to enhance contrast and selectively process information. In this work, we propose the use of Chebyshev polynomials as activation functions in additive recurrent OCOS networks to improve their performance in signal processing tasks. Chebyshev polynomials offer near-minimax approximation properties and can be evaluated cheaply via a three-term recurrence, making them well suited to dynamic systems. We demonstrate that integrating these polynomials as activation functions enhances the network’s ability to extract features, recognize patterns, and remain stable during real-time signal analysis. Through empirical evaluations, we show that networks using Chebyshev polynomial activations outperform those using traditional activation functions in stability, accuracy, and computational efficiency. The proposed framework is applied to a range of AI tasks, including real-time data analysis and adaptive filtering, highlighting its advantages in dynamic environments. Our results suggest that Chebyshev polynomials, when combined with additive recurrent OCOS networks, provide a robust and efficient approach to solving complex problems in AI and signal processing.
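
As a rough, self-contained illustration of the approach described above (not the paper's actual implementation), the sketch below evaluates a truncated Chebyshev series with the standard three-term recurrence T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x) and uses it as the activation in an Euler-discretized, Grossberg-style on-center off-surround shunting network. All function names, coefficients, and parameter values here are illustrative assumptions.

import numpy as np

def chebyshev_activation(x, coeffs):
    # Truncated Chebyshev series f(x) = sum_n c_n T_n(x), evaluated with the
    # three-term recurrence T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x).
    x = np.clip(x, -1.0, 1.0)            # T_n is defined on [-1, 1]
    t_prev, t_curr = np.ones_like(x), x  # T_0 = 1, T_1 = x
    out = coeffs[0] * t_prev
    if len(coeffs) > 1:
        out = out + coeffs[1] * t_curr
    for c in coeffs[2:]:
        t_prev, t_curr = t_curr, 2.0 * x * t_curr - t_prev
        out = out + c * t_curr
    return out

def ocos_step(x, I, coeffs, A=1.0, B=1.0, dt=0.01):
    # One Euler step of a shunting on-center off-surround network
    # (Grossberg-type dynamics; parameters A, B, dt are illustrative):
    # dx_i/dt = -A x_i + (B - x_i)(f(x_i) + I_i) - x_i * sum_{k != i} f(x_k)
    f = chebyshev_activation(x, coeffs)
    off_surround = f.sum() - f           # recurrent inhibition from all other units
    dx = -A * x + (B - x) * (f + I) - x * off_surround
    return x + dt * dx

# Illustrative run: 5 units, external input peaked at the center unit,
# activation f(x) = T_1(x) + 0.25 T_3(x), which is faster than linear.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 0.2, size=5)
I = np.array([0.0, 0.3, 1.0, 0.3, 0.0])
coeffs = np.array([0.0, 1.0, 0.0, 0.25])   # c_0 .. c_3 of the Chebyshev series
for _ in range(2000):
    x = ocos_step(x, I, coeffs)
print(np.round(x, 3))   # the strongly driven center unit should dominate

With a faster-than-linear activation such as this one, the recurrent off-surround inhibition tends to suppress weakly driven units while the most strongly driven unit persists, which is the contrast-enhancement behavior the abstract refers to.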
Submission History

07 Oct 2024: Submitted to Mathematical Methods in the Applied Sciences
08 Oct 2024: Submission Checks Completed
08 Oct 2024: Assigned to Editor
17 Oct 2024: Review(s) Completed, Editorial Evaluation Pending
13 Dec 2024: Reviewer(s) Assigned