TurbSim and mean wind speed

Hi again everybody,
I performed a simple check (that I never tried before) on the output of TurbSim V1.0, and I am not understanding the results.
I computed the average of the “U” column, and it is not what I expect.

I need only a hh (hub height) file for the purpose of a simple MathCAD simulation. The important inputs are as follows:

---------Runtime Options-----------------------------------
30061958 First random seed (1-999999999)
RANLUX Second random seed (1-999999999)
--------Turbine/Model Specifications-----------------------
2 Vertical grid-point matrix dimension
2 Horizontal grid-point matrix dimension
0.05 Time step [seconds]
600 Length of analysis time series [seconds]
120 Usable length of output time series [seconds]
25 Hub height [m] (should be > 0.5*GridHeight)
20.00 Grid height [m]
20.00 Grid width [m] (should be >= 2*(RotorRadius+ShaftLength))
0 Vertical mean flow (uptilt) angle [degrees]
0 Horizontal mean flow (skew) angle [degrees]

--------Meteorological Boundary Conditions-------------------
“IECKAI” Turbulence model
50 IEC turbulence characteristic
24 Height of the reference wind speed [m]
2.0 Mean wind speed at the reference height [m/s]
default Surface roughness length [m] (or “default”)

--------Non-IEC Meteorological Boundary Conditions------------
default Site latitude [degrees] (or “default”)
0.05 Gradient Richardson number
default Power law exponent (or “default”)
default Friction or shear velocity [m/s] (or “default”)
default Mixing layer depth [m] (or “default”)
default U’W’ cross-correlation coefficient (or “default”)
default U’V’ cross-correlation coefficient (or “default”)
default V’W’ cross-correlation coefficient (or “default”)
default U-component coherence decrement (or “default”)
default V-component coherence decrement (or “default”)
default W-component coherence decrement (or “default”)
default Coherence exponent (or “default”)

As you can see, I am using a very low wind speed and a high turbulence (I am using this to simulate the turbine startup).

I am aware that the input “2.0” as mean wind speed at the reference height will be LESS than the mean horizontal wind speed (because of the vector sum). But in this example, from the .dat file, I get 2.85 as the mean of the “U” column.
From the .hh file, the mean is 2.95, coherent with the “Uh” mean of the .dat file. The .sum files states, instead, that the U mean is 2.02…
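For reference, this is roughly how I take the column average. A minimal sketch; the column names, units row, and layout here are assumptions for illustration, not the exact .hh/.dat format:

```python
def mean_of_column(text, name):
    """Average the named column of a whitespace-delimited table whose
    first non-blank row holds the column names."""
    lines = [l for l in text.splitlines() if l.strip()]
    idx = lines[0].split().index(name)
    vals = []
    for l in lines[1:]:
        try:
            vals.append(float(l.split()[idx]))
        except ValueError:
            continue  # skip a units row like "(sec) (m/s)"
    return sum(vals) / len(vals)

# Hypothetical three-sample excerpt, just to exercise the function:
sample = """Time  U     V     W
(sec) (m/s) (m/s) (m/s)
0.00  2.10  0.05 -0.02
0.05  2.90 -0.10  0.01
0.10  3.55  0.02  0.03
"""
print(round(mean_of_column(sample, "U"), 2))
```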

I really hope someone can point out some silly error in my reasoning…
Any help will be appreciated.
Thanks a lot!

Best regards


The differences you see between the mean wind speed printed in the .sum file and the values you calculated in the .dat and .hh output files are a result of record length.

The .sum file statistics are based on the “length of the analysis time series” from the input file. However, the output files are based on the “usable length of output time series” from the input file. If those two inputs are not the same, the .sum file and output files may have very different statistics.

In your case, the input file requests that the 600-second analysis time series have a 2 m/s mean wind speed at 24 m. The output files contain only the first ~130 seconds of that series (120 s usable length + (20 m grid width)/(2.016 m/s hub-height mean wind speed)).
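As a quick arithmetic check of that record length:

```python
# Length of the record TurbSim writes to the output files, as described
# above: the usable length plus one grid-transit time at the hub-height
# mean wind speed (values taken from this thread).
usable = 120.0   # "usable length of output time series" [s]
grid_w = 20.0    # grid width [m]
u_hub = 2.016    # hub-height mean wind speed from the .sum file [m/s]

output_len = usable + grid_w / u_hub
print(round(output_len, 1))  # roughly 130 s of the 600 s analysis series
```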

As a word of caution, the “length of the analysis time series” determines the frequency content of the generated time series; if it is too short, the output may show strange results. I’d recommend leaving it at around 600 seconds. If you want only 120 seconds of usable data and you’d like the mean wind speed of that segment to be closer to the input value, you could try using different random seeds.
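To see why a short slice of a longer record can miss the target mean, here is a toy sketch. This is NOT TurbSim’s algorithm, just plain Gaussian noise with the full-record mean pinned to the target; the point is that the mean of the first 130 s varies with the random seed even though the 600 s mean is fixed:

```python
import random

def slice_mean(seed, total_s=600.0, slice_s=130.0, dt=0.05, target=2.0):
    """Mean of the first slice_s seconds of a synthetic total_s-second
    series whose full-record mean is forced to exactly `target`."""
    rng = random.Random(seed)
    n = int(total_s / dt)
    u = [rng.gauss(0.0, 1.0) for _ in range(n)]
    shift = target - sum(u) / n   # pin the 600 s mean to the target
    u = [x + shift for x in u]
    m = int(slice_s / dt)
    return sum(u[:m]) / m

# Different seeds, same full-record mean, different slice means:
for seed in (1, 2, 3):
    print(round(slice_mean(seed), 3))
```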

I hope this helps.