Dear Jason and all,
I appreciate your patience and continued support in this matter.
As shown below, for one particular simulation scenario, the mean and standard deviation (SD) of the fore-aft bending moment between the yaw bearing and the top tower node (node 9) follow the overall trend: both values decrease progressively from the bottom of the tower to the top (e.g., between node 8 and node 9).
However, when I examine a shorter time window (for example, a 30-second snippet), this trend disappears, and I instead observe an increase in SD between node 9 and the yaw bearing.
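To illustrate why the two views can disagree, here is a small, purely hypothetical sketch (synthetic data, not from my simulation): when a signal contains slow, low-frequency content, the SD computed over the full record can differ noticeably from the SD of a 30-second snippet, because the short window only "sees" part of the slow variation.

```python
import numpy as np

# Hypothetical illustration: full-run statistics vs. a short-window snippet.
rng = np.random.default_rng(0)
dt = 0.05                      # time step [s] (assumed)
t = np.arange(0.0, 600.0, dt)  # 10-minute record

# Synthetic "fore-aft moment": slow drift + fast oscillation + noise
signal = (5.0 * np.sin(2 * np.pi * t / 300.0)
          + np.sin(2 * np.pi * 0.3 * t)
          + 0.2 * rng.standard_normal(t.size))

def window_stats(x, t, t_start, t_end):
    """Mean and SD of x restricted to t_start <= t < t_end."""
    mask = (t >= t_start) & (t < t_end)
    return x[mask].mean(), x[mask].std()

full_mean, full_sd = signal.mean(), signal.std()
snip_mean, snip_sd = window_stats(signal, t, 100.0, 130.0)  # 30-s snippet

print(f"full run : mean={full_mean:+.3f}, SD={full_sd:.3f}")
print(f"30-s snip: mean={snip_mean:+.3f}, SD={snip_sd:.3f}")
```

In this toy case the snippet SD is smaller than the full-run SD, but depending on where the window lands relative to the slow content, a snippet can just as easily show a larger SD at one node than another, which may be part of what I am seeing.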
In a realistic situation, even without active yaw control, I would expect some compliance in the yaw degree of freedom (Yaw DOF), which is why I kept it enabled. Setting YawDOF = True allows relative load transfer between the rigid nacelle and the yaw bearing, which is also influenced by the YawSpr and YawDamp parameters.
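My reading of this behavior (a hedged sketch, not an authoritative statement of the ElastoDyn implementation) is that with the Yaw DOF enabled, the yaw moment carried at the yaw bearing acts like a linear torsional spring-damper about the neutral yaw angle; the function name and example values below are hypothetical:

```python
# Hedged sketch of a linear torsional spring-damper yaw moment.
# yaw_spr, yaw_damp, q_yaw, q_neutral, qdot_yaw are illustrative names,
# not ElastoDyn variable names.
def yaw_bearing_moment(yaw_spr, yaw_damp, q_yaw, q_neutral, qdot_yaw):
    """Restoring yaw moment [N·m]: spring about the neutral angle plus damping."""
    return -yaw_spr * (q_yaw - q_neutral) - yaw_damp * qdot_yaw

# Hypothetical parameter values (not from any specific turbine model)
M = yaw_bearing_moment(yaw_spr=9.0e9, yaw_damp=2.0e7,
                       q_yaw=0.01, q_neutral=0.0, qdot_yaw=0.001)
print(M)  # negative, i.e. restoring, for a positive yaw offset
```

If that picture is right, the stiffness and damping values would directly shape how much of the tower-top load variation shows up across the yaw bearing, which seems consistent with the sensitivity to YawSpr and YawDamp mentioned above.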
After reviewing the forum discussion "Nacelle Yaw Parameter (YawSpr and YawDamp)", I am now considering whether disabling the Yaw DOF might help me diagnose the cause of this discrepancy.
Would you recommend switching off the Yaw DOF for a diagnostic run?
Regards,
Abhinay Goga

