Omar Tahmi et al.

The proliferation of the Internet of Things (IoT) and the rapid advancement of Neural Networks (NNs) are jointly giving rise to the Internet of Artificially Intelligent Things. However, the accompanying shift towards cloud-based solutions raises significant privacy concerns, as sensitive data may be misused during NN inference (prediction) tasks. Homomorphic Encryption (HE) addresses these concerns by allowing computations to be performed directly on encrypted data. Integrating NNs with HE is challenging, however, because current HE schemes support only linear and polynomial functions, making it difficult to introduce non-linearity into the model. This typically forces the use of approximations for traditional activation functions (AFs), which raises several issues: (i) fixing an AF from the outset limits the model’s flexibility and may not yield the best fit for the data; (ii) approximating a fixed AF requires significant effort to explore approximation techniques and usually trades classification performance for reduced computational complexity; (iii) avoiding approximation altogether, where possible, preserves accuracy but shifts extra operations and communication onto the client. To overcome these challenges, we introduce CryptoKANs (Kolmogorov-Arnold Networks over Encrypted Data), a novel approach that enables NNs to operate over encrypted data by leveraging learnable AFs and symbolization, enhancing privacy-preserving machine learning (PPML) for inference tasks. Rather than approximating a fixed AF after training, CryptoKANs learn HE-suitable AFs as part of the training process itself. Experimental results demonstrate that CryptoKANs outperform traditional PPML models, achieving superior accuracy and, in some cases, even surpassing the original models operating on plaintext data. These findings underscore the potential of CryptoKANs to deliver efficient, interpretable, accurate, and scalable private inference, marking a significant advancement in the field of PPML.
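The abstract does not spell out the architecture, but the core mechanism it describes (learnable, HE-suitable activation functions) can be sketched. Below is a minimal, hypothetical PyTorch illustration of a KAN-style layer whose per-edge activations are learnable low-degree polynomials; the name PolyKANLayer, the fixed degree, and the coefficient parameterization are assumptions for illustration, not the paper's actual design (which, per the abstract, involves symbolization of the learned AFs).

```python
import torch
import torch.nn as nn


class PolyKANLayer(nn.Module):
    """KAN-style layer: each input->output edge carries its own learnable
    low-degree polynomial activation. The forward pass uses only additions
    and multiplications, so a trained layer maps onto leveled HE schemes
    (e.g., CKKS) without a separate activation-approximation step."""

    def __init__(self, in_dim: int, out_dim: int, degree: int = 3):
        super().__init__()
        self.degree = degree
        # coeffs[o, i, k]: coefficient of x^k on the edge from input i to output o
        self.coeffs = nn.Parameter(0.1 * torch.randn(out_dim, in_dim, degree + 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, in_dim)
        # Monomial basis [1, x, x^2, ..., x^degree]: shape (batch, in_dim, degree+1)
        powers = torch.stack([x ** k for k in range(self.degree + 1)], dim=-1)
        # Evaluate every edge polynomial, then sum contributions over the inputs
        return torch.einsum('bik,oik->bo', powers, self.coeffs)


# Usage sketch: train as usual on plaintext; the learned coefficients then
# define the exact polynomials a server would evaluate under HE.
model = nn.Sequential(PolyKANLayer(4, 8), PolyKANLayer(8, 3))
logits = model(torch.randn(16, 4))  # -> shape (16, 3)
```

Because such a forward pass has a bounded multiplicative depth (roughly the polynomial degree times the number of layers), the trained activations can in principle be evaluated under a scheme like CKKS with no client-side workarounds for non-linearities, which is the property the abstract highlights.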