Adaptive Method for Selecting Basis Functions in Kolmogorov–Arnold Networks for Magnetic Resonance Image Enhancement
- Authors: Penkin M.A., Krylov A.S.
- Affiliation: Faculty of Computational Mathematics and Cybernetics, Moscow State University
- Issue: No. 3 (2025)
- Pages: 63–69
- Section: COMPUTER GRAPHICS AND VISUALIZATION
- URL: https://modernonco.orscience.ru/0132-3474/article/view/688122
- DOI: https://doi.org/10.31857/S0132347425030061
- EDN: https://elibrary.ru/GRJEJW
- ID: 688122

Abstract
A way to improve the quality of magnetic resonance image processing using Kolmogorov–Arnold networks for deep feature filtering in a convolutional neural network is studied. The recently proposed Kolmogorov–Arnold networks are inspired by the representation theorem of the same name from real analysis and approximation theory, which states that every multivariate continuous function on a compact set can be represented as a superposition of continuous single-variable functions. However, the subsequent application of gradient descent requires the inner Kolmogorov functions to be at least differentiable; in practice, they are therefore sought in a linear span of B-splines or other differentiable basis functions. In this study we propose an adaptive method of basis function selection performed by the model itself during training, mitigating the rule-of-thumb choice of these basis functions. The method is based on the attention mechanism successfully used in state-of-the-art transformers. The proposed approach is tested on magnetic resonance image enhancement on the IXI dataset and demonstrates the best average PSNR and TV over the synthetic testing dataset. Without loss of generality, the system of basis functions included B-splines, Chebyshev polynomials, and Hermite functions.
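For context, the Kolmogorov–Arnold representation theorem states that any continuous function f: [0,1]^n → R can be written as f(x_1, …, x_n) = Σ_{q=0}^{2n} Φ_q(Σ_{p=1}^{n} φ_{q,p}(x_p)) for some continuous univariate functions Φ_q and φ_{q,p}; in KANs these univariate functions are parameterized in a linear span of differentiable basis functions. The PyTorch sketch below is a minimal illustration of the adaptive idea summarized in the abstract, not the authors' implementation: Chebyshev and Hermite expansions of a scalar input are mixed by an input-conditioned softmax "attention" over the candidate bases. The class and function names, the polynomial degree, and the exact attention form are assumptions made for illustration, and the B-spline branch used in the paper is omitted for brevity.

```python
# Illustrative sketch of attention-based adaptive basis selection in a
# KAN-style univariate nonlinearity. Hypothetical code, not the authors'
# implementation; B-splines are omitted for brevity.
import math
import torch
import torch.nn as nn


def chebyshev_basis(x: torch.Tensor, degree: int) -> torch.Tensor:
    """Chebyshev polynomials T_0..T_degree; the input is squashed into [-1, 1]."""
    x = torch.tanh(x)
    return torch.stack([torch.cos(k * torch.acos(x)) for k in range(degree + 1)], dim=-1)


def hermite_basis(x: torch.Tensor, degree: int) -> torch.Tensor:
    """Hermite functions h_0..h_degree via the standard three-term recurrence."""
    h = [torch.exp(-x ** 2 / 2) / math.pi ** 0.25]
    if degree > 0:
        h.append(math.sqrt(2.0) * x * h[0])
    for n in range(2, degree + 1):
        h.append(math.sqrt(2.0 / n) * x * h[-1] - math.sqrt((n - 1) / n) * h[-2])
    return torch.stack(h, dim=-1)


class AdaptiveBasisKANUnit(nn.Module):
    """One KAN-style univariate function whose basis is selected adaptively.

    Each candidate basis expands the scalar input into (degree + 1) features,
    which are reduced to a scalar by learned coefficients; a softmax over the
    candidates, conditioned on the input, mixes the per-basis responses.
    """

    def __init__(self, degree: int = 5):
        super().__init__()
        self.degree = degree
        self.bases = (chebyshev_basis, hermite_basis)
        self.coeffs = nn.Parameter(0.1 * torch.randn(len(self.bases), degree + 1))
        self.attn = nn.Linear(1, len(self.bases))  # one attention score per basis

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-basis responses: linear combinations of the basis functions of x.
        responses = torch.stack(
            [(b(x, self.degree) * self.coeffs[i]).sum(-1) for i, b in enumerate(self.bases)],
            dim=-1,
        )
        weights = torch.softmax(self.attn(x.unsqueeze(-1)), dim=-1)  # basis attention
        return (responses * weights).sum(-1)


if __name__ == "__main__":
    unit = AdaptiveBasisKANUnit(degree=5)
    y = unit(torch.linspace(-2.0, 2.0, 8))
    print(y.shape)  # torch.Size([8])
```

In this toy form the attention weights depend only on the scalar input; in the convolutional setting described in the abstract they could instead be conditioned on deep feature maps, with the rest of the construction unchanged.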

About the authors
M. Penkin
Faculty of Computational Mathematics and Cybernetics, Moscow State University
Corresponding author
Email: penkin97@gmail.com
Russia, Moscow, 119991
A. Krylov
Faculty of Computational Mathematics and Cybernetics, Moscow State University
Email: kryl@cs.msu.ru
Laboratory of Mathematical Methods of Image Processing
Russia, Moscow, 119991
Supplementary files
