Gallium nitride (GaN) high electron mobility transistors (HEMTs) have attracted considerable attention owing to their high electron mobility, wide bandgap, and other advantageous properties. However, the self-heating that occurs at high power densities has emerged as a significant challenge that restricts the potential of HEMTs in high-performance applications. This study introduces a novel approach for extracting channel temperature distributions and thermal resistance that accounts for the bias dependence of the heat source model and the electro-thermal coupling among multiple gate fingers. The thermal resistance extracted by the optimized thermal model differs from that of the traditional thermal model by 14.8% at a power density of 8 W/mm and a base temperature of 100 °C. The accuracy of the model is validated through infrared (IR) thermography, showing only a 1.76% discrepancy between the filtered model's peak surface temperature and the measured value. To evaluate the model's applicability across various conditions, its predictions are compared with measurements over a range of ambient temperatures and power dissipation levels, revealing maximum errors of 4.1% at 75 °C and 2.7% at 100 °C. Finally, the influence of thermal boundary resistance (TBR) on the total thermal resistance is explored to provide guidance for device modeling and thermal management.
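For reference, the thermal resistance figures quoted above are read here in the conventional sense of a peak channel temperature rise divided by the dissipated power; the expression below is a sketch under that assumption, where T_ch,max, T_base, and P_diss (symbols introduced here for illustration, not defined in the abstract) denote the peak channel temperature, the base temperature, and the total dissipated power.

$$ R_{\mathrm{th}} = \frac{T_{\mathrm{ch,max}} - T_{\mathrm{base}}}{P_{\mathrm{diss}}} \quad [\mathrm{K/W}] $$

The paper's optimized extraction may refine how T_ch,max is obtained (bias-dependent heat source, multi-finger coupling), but the 14.8% comparison between models is interpreted relative to this quantity.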