Human movement emerges from the interplay between the nervous, muscular, and skeletal systems as they interact with the environment. Understanding these processes is crucial for developing wearable robotic technologies that restore movement following neuromuscular injuries. Movement neuromechanics is often studied via computer models of the composite neuromusculoskeletal system, which use static optimization, dynamic optimization, or reinforcement learning to estimate muscle activations and resulting mechanical forces from kinematic and kinetic data. However, such approaches often fail to capture the variability in multi-muscle neural recruitment and force generation across movements, anatomies, and conditions (e.g., ageing or injury). Electromyography (EMG)-driven musculoskeletal modeling uses measured EMGs and joint angles to simulate muscle-tendon-level mechanics with no assumptions on how muscles are recruited by the central nervous system. EMG-driven models have enabled task-agnostic, myoelectric model-based controllers for bionic limbs and exoskeletons. However, real-time neuromechanical models remain largely proprietary, hindering their widespread use, progress, and standardization. Here, we introduce CEINMS-RT, a freely available, open-source neuromechanical modeling framework for the real-time myoelectric model-based control of wearable robots, including exoskeletons, exosuits, haptic devices, and bionic limbs. CEINMS-RT explicitly models person-specific movement neuromechanics and estimates EMG-dependent variables including muscle activation, muscle-tendon force, and resulting joint dynamics. It represents an open-source alternative to end-to-end neural regressors, which do not estimate intermediate biomechanical variables critical for robust wearable robot control (e.g., joint stiffness, damping, or underlying muscle-tendon impedance). In the following, we describe the CEINMS-RT framework and present application results in the context of wearable robot control.