Recent experiments have systematically explored rock friction under crustal earthquake conditions, revealing that faults undergo abrupt dynamic weakening. Processes related to heating and weakening of the fault surface have been invoked to explain this pronounced velocity weakening. Both the contact asperity temperature $T_a$ and the background temperature $T$ of the slip zone evolve significantly during high-velocity slip, owing to heat sources (frictional work), heat sinks (e.g. the latent heat of decomposition processes) and diffusion. Using carefully calibrated High Velocity Rotary Friction experiments, we test the compatibility of four thermal weakening models with the data: (1) a model in which friction depends only on $T$ through a highly simplified, Arrhenius-like thermal dependence; (2) a flash heating model that accounts for the evolution of both the slip velocity $V$ and $T$; (3) the same, but with heat sinks included in the thermal balance; (4) the same, but also including the thermal dependence of diffusivity and heat capacity. All models reproduce the experimental results, but model (1) yields unrealistically low temperatures, and model (2) reproduces the restrengthening phase only if its parameters are adjusted for each experimental condition. The dissipative heat sinks in model (3) significantly affect $T$ and, in turn, the friction, allowing a better joint fit of the initial weakening and the final strength recovery across a range of experiments. The temperature is significantly altered by the thermal dependence of material properties in model (4); however, similar results can be obtained with models (3) and (4) by adjusting the energy sinks. To compute temperature in this type of problem, we compare the efficiency of three numerical solutions: finite differences, wavenumber summation, and a discrete integral.
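
As a purely illustrative aid, the sketch below shows how a friction law of the kind listed above (a flash-heating-style dependence on $V$ and $T$ with an optional heat sink, in the spirit of models (2)–(3)) can be coupled to a finite-difference thermal balance, one of the three numerical approaches compared. All parameter values, the functional form of the weakening law, and the sink term are assumptions chosen for illustration, not the calibrated models or data of this study.

```python
import numpy as np

# Illustrative parameters (assumed values, not from the experiments):
rho_c   = 2.7e6    # volumetric heat capacity rho*c [J m^-3 K^-1]
kappa   = 1.0e-6   # thermal diffusivity [m^2 s^-1]
sigma_n = 5.0e6    # normal stress [Pa]
mu_0, mu_w = 0.7, 0.2   # low-velocity and fully weakened friction coefficients
T_w   = 900.0      # weakening temperature [deg C]
T_dec = 600.0      # onset temperature of the endothermic sink [deg C]
L_sink = 3.0e8     # sink strength, a crude latent-heat proxy [W m^-3]

def friction(V, T, T0=20.0):
    # Flash-heating-style law (assumed form): friction drops once the slip
    # velocity V exceeds a weakening velocity V_w, which shrinks as the
    # background temperature T approaches the weakening temperature T_w.
    V_w = max(1e-3 * (T_w - T) / (T_w - T0), 1e-6)
    return mu_w + (mu_0 - mu_w) * min(V_w / max(V, 1e-12), 1.0)

def solve_temperature(V_of_t, dt, t_end, w=1e-3, nx=201, x_max=0.05):
    # Explicit (FTCS) finite differences for
    #   rho_c dT/dt = rho_c kappa d2T/dx2 + q_source - q_sink,
    # with the frictional source tau*V spread over a slip-zone half-width w
    # and a temperature-activated sink standing in for decomposition.
    x = np.linspace(0.0, x_max, nx)
    dx = x[1] - x[0]
    assert kappa * dt / dx**2 < 0.5, "explicit-scheme stability limit"
    T = np.full(nx, 20.0)                    # initial temperature [deg C]
    in_zone = (x < w).astype(float)          # slip-zone indicator
    history = []
    for step in range(int(t_end / dt)):
        t = step * dt
        V = V_of_t(t)
        tau = friction(V, T[0]) * sigma_n    # shear stress on the fault [Pa]
        q_source = tau * V * in_zone / w     # frictional heating [W m^-3]
        q_sink = L_sink * in_zone * (T > T_dec)   # endothermic sink [W m^-3]
        lap = np.zeros_like(T)
        lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
        lap[0] = 2.0 * (T[1] - T[0]) / dx**2      # symmetry at the fault plane
        T = T + dt * (kappa * lap + (q_source - q_sink) / rho_c)
        T[-1] = 20.0                              # far-field boundary
        history.append((t, T[0], friction(V, T[0])))
    return np.array(history)                      # columns: t, T_fault, mu

# Example: 1 m/s slip for 2 s, then near-zero velocity (restrengthening phase).
hist = solve_temperature(lambda t: 1.0 if t < 2.0 else 1e-6, dt=1e-3, t_end=4.0)
```

The explicit scheme is used here only for transparency; its stability constraint ties the time step to the grid spacing, which is one of the efficiency trade-offs that motivates comparing it with wavenumber-summation and discrete-integral solutions.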