Omnidirectional images (ODIs) capture 360-degree scenes but require extremely high resolution to deliver a truly immersive experience. However, most existing ODIs are captured at insufficient resolution, which degrades visual quality. While prior methods such as OSRT have made progress on this problem, they still fail to fully exploit the geometric properties of equirectangular projection (ERP), particularly in handling geometric distortions and preserving fine details. To overcome these challenges, we propose a Multi-scale Fractal Analysis and Self-adaptive Spherical Harmonics (MFASH) method, which builds on the OSRT framework to improve omnidirectional image super-resolution (ODISR). MFASH introduces fractal geometry to adaptively capture varying levels of detail, self-adaptive spherical harmonics for flexible geometric representation, and nonlinear diffusion equations for edge preservation and noise reduction. By integrating these mathematical models into the OSRT architecture, our method better handles the complexities of ODIs, including geometric distortion and texture preservation. Extensive experiments demonstrate that MFASH outperforms state-of-the-art methods in both quantitative metrics and visual quality, especially when reconstructing fine details and complex textures in ODIs.
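The abstract only names multi-scale fractal analysis without detailing it. As a hedged illustration of the general idea behind fractal-based detail characterization (not the paper's actual MFASH module, whose design is not specified here), the sketch below estimates the box-counting fractal dimension of a binary detail map with NumPy; the function name and box sizes are illustrative assumptions.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting (fractal) dimension of a 2-D binary mask.

    Assumes the mask contains at least one foreground pixel so every
    box count is positive. For each box size s, counts the s x s boxes
    containing foreground, then fits log(count) vs. log(1 / s).
    """
    mask = np.asarray(mask, dtype=bool)
    counts = []
    for s in sizes:
        # Crop so both dimensions are divisible by the box size s.
        h = mask.shape[0] - mask.shape[0] % s
        w = mask.shape[1] - mask.shape[1] % s
        sub = mask[:h, :w]
        # An s x s box is "occupied" if it contains any foreground pixel.
        boxes = sub.reshape(h // s, s, w // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    # The fractal dimension is the slope of the log-log fit.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope
```

A smooth filled region yields a dimension near 2, a thin contour near 1; in a detail-adaptive pipeline such a score could distinguish texture-rich patches from flat ones.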