If you remove the arguments to DISCON required by the super controller in FAST.Farm (inputs from_SCglob and from_SC and output to_SC), you should be able to use the controller you have written in C in standalone OpenFAST (uncoupled from FAST.Farm). Does that work as expected, i.e., is your DISCON in C properly communicating with the Fortran-based OpenFAST? I’m trying to isolate where the problem is: is it in the DISCON-OpenFAST coupling or in the FAST.Farm-super controller coupling?
The DISCON in C sends a pitch command and a demanded torque to OpenFAST with the avrSWAP array. I think it is working fine, since the turbine seems to follow my commands.
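For reference, a minimal sketch of such a standalone Bladed-style DISCON in C might look like the following. The record numbers assume the usual Bladed convention (record 45 = demanded collective pitch [rad], record 47 = demanded generator torque [Nm], i.e., indices 44 and 46 with C's 0-based indexing); the demand values here are purely illustrative.

```c
#include <stddef.h>

/* Minimal Bladed-style DISCON sketch in C (no super-controller
 * arguments).  Record numbers follow the Bladed convention:
 * avrSWAP(45) = demanded collective pitch [rad] and
 * avrSWAP(47) = demanded generator torque [Nm], which are
 * avrSWAP[44] and avrSWAP[46] with C's 0-based indexing. */
void DISCON(float *avrSWAP, int *aviFAIL,
            const char *accINFILE, char *avcOUTNAME, char *avcMSG)
{
    (void)accINFILE; (void)avcOUTNAME; (void)avcMSG;

    /* Hypothetical fixed demands, just to show where the outputs go. */
    const float pitch_demand_rad = 0.05f;    /* ~2.9 deg            */
    const float torque_demand_Nm = 43000.0f; /* illustrative torque */

    avrSWAP[44] = pitch_demand_rad;  /* record 45: demanded pitch  */
    avrSWAP[46] = torque_demand_Nm;  /* record 47: demanded torque */

    *aviFAIL = 0;  /* 0 = success */
}
```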
I think I managed to solve the issue by modifying my C code. The problem was in the FAST.Farm-super controller coupling.
Specifically, I was passing from_SC as
in the preamble of the code. Then, for some reason, the values were corrupted when from_SC was passed as an input to one of the functions in the C code (controllerStep).
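Since the original declaration is not shown, the exact cause is unclear, but one common way an array gets "corrupted" crossing the Fortran/C boundary is a type-width mismatch: the Fortran side fills from_SC with 4-byte REALs, and a C prototype declaring the array as `double *` reinterprets the bytes. A purely hypothetical illustration (function names are mine, not from the FAST.Farm API):

```c
#include <string.h>

/* Hypothetical illustration: the Fortran side fills from_SC with
 * 4-byte REALs.  Reading the buffer with the matching 4-byte width
 * recovers the values; reading it with an 8-byte width reinterprets
 * the bytes and yields garbage, which looks like "corruption". */

/* Correct: element width matches Fortran REAL(4). */
float read_as_float(const void *from_SC, int i)
{
    float v;
    memcpy(&v, (const char *)from_SC + (size_t)i * sizeof(float), sizeof(v));
    return v;
}

/* Wrong: two 4-byte REALs are reinterpreted as one 8-byte double. */
double read_as_double(const void *from_SC, int i)
{
    double v;
    memcpy(&v, (const char *)from_SC + (size_t)i * sizeof(double), sizeof(v));
    return v;
}
```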
I think things are now good, even if I am not really sure why. Still, thanks a lot for the prompt help.
I am using the super controller in FAST.Farm to assign a power demand to my wind turbines. I do so with the from_SC array. The super controller is called every 3 seconds during the simulation.
The problem I am encountering is that the value carried by from_SC oscillates before settling at the value I want.
For instance, if I ask for 1.52 MW from turbine 1 at t = 3 s with the super controller, the value seen by the turbine controller is
1.49 MW at t = 3.001 s,
1.54 MW at t = 3.002 s, and finally,
1.52 MW at t = 3.003 s.
This is harming my results, since strong oscillations are introduced every 3 seconds.
Based on your comment, yes, I believe the extrapolation is causing the behavior you are seeing. Is there no way to smooth the output from the super controller, e.g., by applying a low-pass filter to the data before outputting the signal?
Thank you for the suggestion. A low-pass filter improves things considerably.
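For reference, the low-pass filter I used is essentially a first-order discrete filter along these lines (the struct and names are my own, not part of the FAST.Farm interface):

```c
#include <math.h>

/* Minimal first-order low-pass filter sketch for smoothing the
 * from_SC power command inside the turbine controller.  tau is the
 * filter time constant [s]; dt is the controller time step [s]. */
typedef struct {
    double y;      /* filter state (last output)        */
    double alpha;  /* smoothing factor, dt / (tau + dt) */
} LowPass;

void lowpass_init(LowPass *f, double tau, double dt, double y0)
{
    f->alpha = dt / (tau + dt);
    f->y = y0;
}

double lowpass_step(LowPass *f, double u)
{
    /* Move the state a fraction alpha toward the new input. */
    f->y += f->alpha * (u - f->y);
    return f->y;
}
```

With dt = 0.006 s and, say, tau = 0.5 s, the filtered command rides through the brief extrapolation transients while still settling to the demanded power within a couple of seconds.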
There is an additional thing I do not understand. It seems to me that there is a delay between from_SC and to_SC.
For example, if I request 1 MW at low-resolution timestep 1 with from_SC, I see this demand received by the Turbine Controller (.so) only at low-resolution timestep 3. Does this make sense to you? Thanks a lot.
Thank you for your answer. Sorry it took me a while to answer, but I wanted to double-check whether I was doing something wrong.
I am doing active power control of wind farms. To do so, I use a Super Controller. The procedure is the following
Between one call to the Super Controller and the next, the turbine controller averages the load and the generated power. This is done with a Bladed-DLL style controller (.so) by filling the to_SC array.
When the Super Controller is called, it takes these average values to compute a power demand, and sends it back to each turbine. This operation is done with the sc.f90 file provided by NREL, which I edited. For this, the from_SC array is used.
This is repeated many times in the course of a simulation. The turbine controller is called every 0.006 seconds (dt), while the super controller is called every 2 seconds (DT).
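The per-window averaging step above can be sketched as follows (the struct and names are illustrative, not the actual interface; n_steps would be DT/dt, e.g. 2.0/0.006 rounded to the configured ratio):

```c
/* Sketch of per-window averaging: the turbine controller accumulates
 * generator power every dt and, once per super-controller period DT,
 * writes the mean into to_SC and resets for the next window. */
typedef struct {
    double sum;    /* accumulated power [W]  */
    int    count;  /* samples in this window */
} WindowAvg;

/* Call every controller step; returns 1 and writes *mean_out when a
 * window of n_steps samples completes, 0 otherwise. */
int windowavg_step(WindowAvg *w, double power, int n_steps, double *mean_out)
{
    w->sum += power;
    if (++w->count >= n_steps) {
        *mean_out = w->sum / w->count;  /* mean ready for to_SC */
        w->sum = 0.0;
        w->count = 0;
        return 1;
    }
    return 0;
}
```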
The problem I have is that the average value computed by the turbine controller is not immediately sent to the super controller, which is what I would like/expect.
Instead, there is a delay in the exchange of information. For instance, when the turbine controller finishes its average at t = N·DT (for a generic N), I see the value reach the Super Controller only at t = (N+2)·DT.
This is problematic for me because my controller gets super slow and I cannot track power very well.
I’m not understanding why you are seeing a delay between the output from the individual turbine controllers and the input to the super controller (to_SC). As discussed in the FAST.Farm User’s Guide and Theory Manual, DISCON is called before SC at each time step other than zero. So, while the SC output is extrapolated for use by DISCON, the DISCON output for use by SC is not extrapolated or delayed. See the following links for more information: openfast.readthedocs.io/en/main … f-parallel.
Thank you for your answer. I added the print statements
in my dll.c file (then compiled into a dynamic library, .so), and
in sc.f90 (also compiled into a dynamic library, .so), more specifically in the SC_CalcOutputs subroutine,
for the turbine controller (sending to_SC) and for the super controller (receiving to_SC), respectively.
My understanding is that your turbine controller (DLL.c source file) is called every OpenFAST time step (0.006 s in your case), whereas the super controller (sc.f90 source file) is called every FAST.Farm low-resolution time step (2.0 s). However, the math doesn’t quite work out, because 0.006 is not an integer divisor of 2 (which FAST.Farm requires). Also, in your screen print, I see the results printed every 1 s from both the turbine and super controllers, and I don’t see the turbine controller statement printed far more frequently than the statement printed by the super controller. Can you clarify?
What you say is correct. I apologize; I have actually been modifying the low-resolution timestep (DT_LOW). In fact, I made it smaller to try to improve my power-tracking performance.
In the example shown, the low-resolution timestep is indeed 1.0 s. The OpenFAST timestep is 0.00625 s (DT_HIGH), so that DT_LOW/DT_HIGH = 160.
The print statement in the turbine controller is executed only when the time t is a multiple of DT_LOW. What is printed in this case is the mean power output within a low-resolution timestep. If I print without this constraint, I see messages displayed every 0.00625 s.
The print statement in the super controller is executed immediately upon entering SC_CalcOutputs.
OK, thanks for clarifying. I looked into the code and see why there is a one time-step delay. The transfer within FAST.Farm of to_SC, which is an output from OpenFAST and an input to the super controller, happens after the call to the super controller CalcOutput, which causes a one time-step delay. (You can see this in SUBROUTINE FARM_CalcOutput of source file FAST_Farm_Subs.f90.) If the transfer were made before the call to the super controller CalcOutput, this delay would be eliminated. I looked at the FAST.Farm glue code implementation plan (not published); the source code implementation follows the plan, but the FAST.Farm documentation is not correct because it does not mention this time-step delay. I’m questioning, however, why the implementation is done this way, because it may be worth changing to eliminate the delay.
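The ordering effect can be reproduced with a toy sketch (all names hypothetical, not the actual FAST.Farm code): copying to_SC into the super controller's input after the "CalcOutput" call means the super controller always reads the previous step's value.

```c
#define NSTEPS 5

/* Toy model of the glue-code loop: each low-resolution step the
 * turbine writes to_SC; depending on whether the transfer into the
 * SC input happens before or after the "SC_CalcOutput" read, the SC
 * sees the current value or a one-step-old one. */
void run_farm(int transfer_before_calc, double seen_by_sc[NSTEPS])
{
    double to_SC = 0.0;     /* turbine output this step          */
    double sc_input = 0.0;  /* what "SC_CalcOutput" actually reads */

    for (int n = 1; n <= NSTEPS; ++n) {
        to_SC = (double)n;                 /* turbine produces value n */
        if (transfer_before_calc) sc_input = to_SC;
        seen_by_sc[n - 1] = sc_input;      /* "SC_CalcOutput" reads here */
        if (!transfer_before_calc) sc_input = to_SC;
    }
}
```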
Thanks for the answer and for checking the source files.
I now finally understand the reason for the time-step delay. However, for my application this is not really ideal.
For this reason, I was wondering if I could modify the FAST_Farm_Subs.f90 file to make the transfer occur before CalcOutput. Do you know if this is possible? If so, do I then need to recompile the whole FAST.Farm?
Yes, it looks like the transfer (line 2216 of FAST_Farm_Subs.f90) could easily be moved a few lines earlier (before line 2207). Making this change would require a recompile of the FAST.Farm source code.
As I said, I see the implemented order in our plans and source code, but it is not clear to me now why this ordering was chosen versus an ordering that would not introduce a one time-step delay. This is something I still need to think through.
But in the interim, I don’t see a reason why you can’t change the source code and recompile as you want.
I would like to follow up on my previous post, quoted below.
I have read the user’s guide, but I am still confused by the extrapolation procedure applied to the from_SC array. From the FAST.Farm User’s Guide and Theory Manual:
In my case, I would just like the from_SC array to be updated at every call of the super controller and its value to remain constant until the next call. I introduced a low-pass filter as recommended; this helped, but it slows down my controller a little.
Therefore, I was wondering whether it would be possible to modify the source code to remove the extrapolation. I think this could be worthwhile, since in many instances (e.g., a sudden inflow velocity deficit due to high turbulence intensity) the super controller output needs to react just as suddenly, and, given that the super controller is called at the rather large low-resolution time step, this may imply step changes.
I have searched in the source code, but I cannot figure out where/at which point the extrapolation takes place.
Thanks for the attention.
The extrapolation of the super controller inputs (as well as all other inputs to the DISCON controller) happens in SUBROUTINE SrvD_UpdateStates() at line 1084 of the ServoDyn.f90 source file (in OpenFAST v3.0) in the CALL to SrvD_Input_ExtrapInterp(). To eliminate this extrapolation, you should send the actual inputs at time “t”, Inputs(1), as an argument to SUBROUTINE DLL_controller_call() instead of the extrapolated inputs, u_interp.
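As a toy illustration of why extrapolating a stepped command produces the transients seen earlier in this thread (function names are mine, not OpenFAST's): given the samples u(t−DT) and u(t) on either side of a step, linear extrapolation to t+dt continues the slope past the commanded value, whereas holding the value at time t does not.

```c
/* Toy comparison of linear extrapolation versus a zero-order hold of
 * a sampled command.  With a step from u_prev to u_now, extrapolation
 * continues the step's slope and overshoots; the hold does not. */
double extrapolate_linear(double u_prev, double u_now, double DT, double dt)
{
    return u_now + (u_now - u_prev) / DT * dt;
}

double zero_order_hold(double u_now)
{
    return u_now;  /* what using Inputs(1) at time t amounts to */
}
```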