Coherence function

Hi all,

I have a theoretical question concerning the coherence function as defined in the TurbSim user’s manual.

Preamble: using Matlab, I am trying to code my own wind file generator, in accordance with Sandia theory and the Kaimal spectrum. Now, assuming that N = Ny*Nz, I generally calculate a distance matrix of size N*N; the frequency array, instead, is a NumFreq*1 array.

Therefore, when applying the coherence formula there is a clear size mismatch between the frequency vector and the distance matrix: I expect the coherence matrix to have the same size as the distance matrix, that is, an N*N coherence matrix.

How can I overcome this size issue? Would you mind shedding some light on it? My code, so far, looks like this:

[code]clc, clear all, close all

HubHt = 110;            % Hub height
GridWidth = 150;        % Grid length along Y axis
GridHeight = 150;       % Grid length along Z axis
RotorDiameter = min(GridWidth,GridHeight); % Turbine diameter
Ny = 7;
Nz = 7;
N = Ny*Nz;
AnalysisTime = 600;
UsableTime = 40;        % Time as output
TimeStep = 0.05;
Uhub = 8;               % Average wind speed at hub height

TI = 'A';               % Turbulence intensity class

% set lambda1 according to the guidelines (chap. 6)
if HubHt < 60
    lambda1 = 0.7*HubHt;
else
    lambda1 = 42;
end

% IEC, Table 1, p. 22
b = 5.6;                % [m/s]
if isequal(TI,'A')
    Iref = 0.16;
    sigma1 = Iref*(0.75*Uhub + b);
elseif isequal(TI,'B')
    Iref = 0.14;
    sigma1 = Iref*(0.75*Uhub + b);
elseif isequal(TI,'C')
    Iref = 0.12;
    sigma1 = Iref*(0.75*Uhub + b);
else
    sigma1 = str2num(TI)*Uhub/100; % numeric TI given in percent
end

% derived constants
l   = 0.8*lambda1;      % IEC, (B.12) (von Karman scale; not used below)
xLu = 8.1*lambda1;      % Integral length scale along X axis (Kaimal, Table B.1)
xLv = 2.7*lambda1;      % Integral length scale along Y axis
xLw = 0.66*lambda1;     % Integral length scale along Z axis

% Frequency definition
NumOutSteps = ceil((UsableTime + GridWidth/Uhub)/TimeStep);
NumSteps    = max(ceil(AnalysisTime/TimeStep), NumOutSteps);
INumSteps   = 1.0/NumSteps;
NumFreq     = NumSteps/2;
DelF        = 1.0/(NumSteps*TimeStep);
DelF5       = 0.5*DelF; % half frequency step (unused here)
f           = (1:NumFreq)*DelF;

% Grid definition
dy = GridWidth/(Ny-1);
dz = GridHeight/(Nz-1);
if isequal(mod(Ny,2),0)
    iky = [(-Ny/2:-1) (1:Ny/2)];
else
    iky = -floor(Ny/2):ceil(Ny/2-1);
end
if isequal(mod(Nz,2),0)
    ikz = [(-Nz/2:-1) (1:Nz/2)];
else
    ikz = -floor(Nz/2):ceil(Nz/2-1);
end

[Y, Z] = ndgrid(iky*dy, ikz*dz + HubHt);

% define distance matrix
dist = pdist2([Y(:), Z(:)], [Y(:), Z(:)]);[/code]

Hi Francesco,

You will need an N*N coherence matrix at each frequency. If you store the full N*N*NumFreq array, it will get very large! In TurbSim, we store an N*N matrix for coherence that gets calculated at each frequency (inside a loop).
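A minimal NumPy sketch of that per-frequency loop (the function name, the toy grid, and all parameter values here are illustrative assumptions, not TurbSim's actual code):

```python
import numpy as np

def iec_coherence(dist, f, u_hub, lc, a=12.0):
    """N x N IEC-style exponential coherence matrix for one frequency f [Hz]."""
    return np.exp(-a * np.sqrt((f * dist / u_hub) ** 2
                               + (0.12 * dist / lc) ** 2))

# toy 3-point grid: (y, z) coordinates in meters
pts = np.array([[0.0, 100.0], [0.0, 110.0], [10.0, 110.0]])
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

# one N x N matrix per frequency, computed inside the loop and then discarded,
# instead of one huge N x N x NumFreq array
for f in (0.01, 0.1, 1.0):
    coh = iec_coherence(dist, f, u_hub=8.0, lc=5.67 * 60.0)
    # ...factorize / use coh here, then move on to the next frequency
```

Each matrix is symmetric with a unit diagonal (zero separation gives coherence 1), and the coherence decays as frequency or separation grows.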

Hope that helps.

Dear Bonnie,

First of all, thanks for the quick reaction to my post.

Secondly, I already had a hunch that I should pursue a procedure like the one you advised. I have therefore tried to adjust my code accordingly:


[code]% Define normalized power spectra (Kaimal)
sigma2 = 0.8*sigma1; % IEC, Table B.1
sigma3 = 0.5*sigma1; % IEC, Table B.1

psd = zeros(length(f),3);
psd(:,1) = (4*(sigma1^2)*(xLu/Uhub))./((1 + 6*f*(xLu/Uhub)).^(5/3));
psd(:,2) = (4*(sigma2^2)*(xLv/Uhub))./((1 + 6*f*(xLv/Uhub)).^(5/3));
psd(:,3) = (4*(sigma3^2)*(xLw/Uhub))./((1 + 6*f*(xLw/Uhub)).^(5/3));

% Coherence constant parameters
a = 12;
Lc = 5.67*min(60,HubHt); % coherence scale parameter

Coh = zeros(N,N,3);
Coh_vw = eye(N); % no spatial coherence for v and w

PSD(:,:,1) = repmat(psd(:,1),[1,N]);
PSD(:,:,2) = repmat(psd(:,2),[1,N]);
PSD(:,:,3) = repmat(psd(:,3),[1,N]);
PSD = permute(PSD,[2 1 3]);

H = zeros(N,length(f),3);
for jj = 1:length(f)
    HH = exp(-a.*sqrt(((f(jj).*dist)./Uhub).^2 + (0.12.*(dist./Lc)).^2));
    Coh(:,:,1) = chol(HH,'lower');
    Coh(:,:,2) = Coh_vw;
    Coh(:,:,3) = Coh_vw;
    for kk = 1:3
        H(:,jj,kk) = Coh(:,:,kk)*sqrt(PSD(:,jj,kk));
    end
end[/code]

Is this correct, or is there any relevant flaw? Any suggestion would be more than welcome.

Best regards,
Francesco Perrone

Dear Francesco,

I don’t have time to thoroughly review your code, but at first glance it looks okay. (Keep in mind that some codes do assume that there is some coherence in v and w, but the IEC standard doesn’t specify anything.)
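One quick sanity check on the Cholesky-weighting step, sketched here in NumPy with a toy matrix (illustrative names, not the MATLAB code above): because the coherence matrix has a unit diagonal, every row of its lower Cholesky factor has unit 2-norm, so scaling the factor by sqrt(PSD) preserves the single-point spectrum at each grid point.

```python
import numpy as np

def weighted_factor(coh, psd_value):
    """Lower Cholesky factor of a coherence matrix, scaled by sqrt of the PSD."""
    return np.linalg.cholesky(coh) * np.sqrt(psd_value)

# toy symmetric positive-definite coherence matrix with unit diagonal
coh = np.array([[1.0, 0.5, 0.2],
                [0.5, 1.0, 0.4],
                [0.2, 0.4, 1.0]])

H = weighted_factor(coh, psd_value=3.0)

# each row's sum of squares recovers the target PSD value (3.0 here),
# since sum_k L[i,k]^2 = (L @ L.T)[i,i] = coh[i,i] = 1
row_power = np.sum(H ** 2, axis=1)
```

If this property does not hold for a given frequency, the coherence matrix being factorized is not what was intended.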

Dear Bonnie,

I thank you again for your reply. In the end, I have been able to implement the whole code for generating turbulent wind fields in MATLAB.

As you pointed out, I am currently implementing no coherence for the v- and w-components: only the Dirlik function is taken into account for those two components.

To prove the correctness of the code, I am running some easy tests:

- mean values and standard deviations
- Gaussian distribution test
- integral length scale test

I ran into some issues when checking the integral length scales: I expect [340.2, 113.4, 27.72] m to be, at least on average, the representative length scales recovered from each triplet of simulated wind components. Which procedure would you suggest for computing the length scales? At the moment I am proceeding this way (e.g. for one time series in the x-direction):

[code][acorr, lags] = xcorr(U(1,:),U(1,:),'coeff'); % auto-correlation of the time series
zero_cross = crossing(acorr,lags); % find the zero crossings of the auto-correlation function[/code]

Then, I integrate the auto-correlation function between its maximum (i.e. 1, at zero lag) and the first zero crossing:

[code]Lu = trapz(lags,acorr);[/code]

I expected this procedure to yield the above values for the integral length scales; unfortunately, it has not worked so far.

What would you suggest? I thank you in advance.

Hi Francesco,

My apologies for the late response. I’ve been extremely busy with another project and haven’t had time to read the forum or respond to email for many weeks.

It has been a while since I’ve thought about the integral length scales. They are tricky to calculate, and I don’t recall if I ever directly tried to estimate them using TurbSim outputs. Several years ago, I wrote an appendix in an AMS journal article describing some methods we used for estimating the time scales from measurements (not simulated data): Pichugina et al. (2008), “Horizontal-Velocity and Variance Measurements in the Stable Boundary Layer Using Doppler Lidar: Sensitivity to Averaging Procedures,” J. Atmos. Oceanic Technol., 25, 1307–1327 (see Appendix B). The zero-crossing method didn’t work well with our data, either.
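For reference, the procedure described above (normalize the autocorrelation, integrate up to its first zero crossing, then apply Taylor's frozen-turbulence hypothesis, L = U * T) can be sketched in Python; the function name is illustrative, and `dt` and `u_mean` are assumed inputs:

```python
import numpy as np

def integral_length_scale(u, dt, u_mean):
    """Integral length scale of one time series via its autocorrelation."""
    up = u - np.mean(u)                     # fluctuating part
    ac = np.correlate(up, up, mode="full")  # raw autocorrelation, all lags
    ac = ac[ac.size // 2:]                  # keep lags >= 0 only
    ac = ac / ac[0]                         # normalize so rho(0) = 1
    zero = np.argmax(ac <= 0.0)             # index of first zero crossing
    if zero == 0:                           # never crosses zero: use all lags
        zero = ac.size - 1
    # trapezoidal integral of rho up to the first zero crossing -> time scale T
    T = dt * (np.sum(ac[:zero + 1]) - 0.5 * (ac[0] + ac[zero]))
    return u_mean * T                       # Taylor's hypothesis: L = U * T
```

Two details matter here: the lags must be converted to seconds (via `dt`) before integrating, and the integration must stop at the first zero crossing rather than run over all lags, otherwise the positive and negative lobes largely cancel.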

I hope that helps!