I just installed OpenFAST on my macOS system and tried to execute the regression tests as described in the manual. Most of the tests passed; however, all of the tests using HydroDyn fail. The infinity norms for some channels are very high when compared against the baseline results.
Do you have any idea why?
Thank you for your help,
Could this be related to the following open issues in OpenFAST?:
That is probably the problem. I had not seen your posts on GitHub; thank you for pointing them out.
However, when I enter RanLux as the keyword for WaveSeed, I get an error: "Invalid numerical input for file."
Do I have to modify the code somewhere else ?
Thank you very much,
The OpenFAST issue #89 (github.com/OpenFAST/openfast/issues/89) is an enhancement that I’ve proposed be implemented in HydroDyn to resolve the problem you and others have reported. However, this enhancement has not yet been implemented in source code. It is proposed future work.
All right, I understand. Thank you for the help.
I’m trying to run some of the regression tests with OpenFAST on Cygwin64, e.g. the 5MW_OC4Jckt_DLL_WTurb_WavesIrr_MGrowth test, but I keep getting the same error, "Segmentation fault (core dumped)", without changing any input file. Could someone help me, please?
I’ll leave a screenshot so you can see in more detail.
Thank you very much.
I’m not familiar enough with Cygwin to know the answer to your question. I suggest posting your question on OpenFAST github issues (github.com/OpenFAST/openfast/issues) to ensure that it is read by others on the OpenFAST team who may not check this forum regularly.
Ok, thank you very much for the quick response. I’ll post it there as you say.
I’m trying to run the test by other means than Cygwin64, but now the simulation doesn’t end; it just stays at the “Timestep: 0 of 60 seconds.” step. Can you help me, please?
I’m not familiar with OpenFAST simply “hanging” at time zero. Perhaps the controller is not loading properly and is causing the simulation to hang? Can you run the simulation without the controller by setting CompServo = 0 in the FAST primary input file and GenDOF = False in the ElastoDyn primary input file?
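For reference, the two switches mentioned above look roughly like this in the input files (the surrounding entries and exact comment text vary by OpenFAST version and model; this fragment only illustrates the settings, it is not a complete file):

```
! In the OpenFAST primary (.fst) input file:
0        CompServo  - Compute control and electrical-drive dynamics (switch) {0=None; 1=ServoDyn}

! In the ElastoDyn primary input file:
False    GenDOF     - Generator DOF (flag)
```

With CompServo = 0 the controller DLL is never loaded, and with GenDOF = False the rotor spins at its initial speed, which isolates whether the hang is caused by the controller.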
I’ve just done what you suggested and run the simulation without the controller, but now I encounter the attached problem. I guess I have to add the parameters that FAST can’t allocate to the HydroDyn input file, but where? I’ve been inspecting the HydroDyn file but still can’t figure it out.
Your screenshot is reporting an error about “allocating space”, which is a question that has recently been answered in the following forum topic: http://forums.nrel.gov/t/problems-in-test-in-fast-8/1868/3.
I recently installed OpenFAST on Ubuntu 18.04. I followed the first example for building OpenFAST and the test suite in https://openfast.readthedocs.io/en/master/source/testing/regression_test.html#regression-test-example. The command ‘make install’ appeared to complete correctly. However, after executing ‘ctest’, 18 of the 37 tests failed. I also made sure that the three DISCON controllers for the NREL 5MW turbine were present (the documentation says they belong in a folder named ‘ServoDyn’, but I guess it is actually ‘ServoData’, since there isn’t any folder named ‘ServoDyn’ in ‘5MW_Baseline’). A file with the ctest results is attached; after checking them against the ‘CTestList.cmake’ file, the majority of the failing tests (but not all) use AeroDyn15. I have not been able to find a reason why so many tests failed. Does anyone know what could have happened, or a way to find out?
Thank you for your help,
ctest_results.txt (5.69 KB)
Thanks for pointing out the bug in the docs, I’ve opened an issue on GitHub here: github.com/OpenFAST/openfast/issues/new/choose.
The fact that 19 tests passed means that OpenFAST compiled fine, but differences in your system can still cause tests to fail. These tests can fail for a variety of reasons because they are very sensitive to your environment. For example, using a different compiler or math-library version than the one used to generate the baseline is a common cause. My suggestion is to plot the results and visually inspect the failing cases.
Since you’ve already run the cases, you can just generate the plots with this python script located in openfast/reg_tests:
python3 manualRegressionTest.py ../build/glue-codes/openfast/openfast Linux GNU 1e-5 -n -p
The -n means don’t run the cases and -p means generate the plots. Note that you need matplotlib installed for this to work. After it finishes successfully, you’ll have an HTML file in openfast/build/reg_tests which you can open in any browser to navigate through the failed test cases.
We are working on making this plotting feature more robust and ready for general use. Once it’s “finished,” I’ll be sure to add documentation for it.
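The pass/fail criterion behind these tests is essentially a norm of the difference between each local output channel and its baseline, compared against a tolerance (1e-5 in the command above). A minimal Python sketch of the idea follows; the function names are illustrative, and the actual implementation in openfast/reg_tests normalizes the error differently:

```python
# Illustrative sketch of an infinity-norm channel comparison, similar in
# spirit to the r-test pass/fail check. NOT the actual r-test code: the
# real logic in openfast/reg_tests differs in normalization and details.

def max_abs_diff(local, baseline):
    """Infinity norm of the difference between two equal-length channels."""
    return max(abs(a - b) for a, b in zip(local, baseline))

def channel_passes(local, baseline, tol=1e-5):
    """Flag a channel as passing if the infinity norm is within tolerance."""
    return max_abs_diff(local, baseline) <= tol
```

A tiny difference in compiler or math library can shift every time step slightly, so a strict tolerance like this can flag results that are still well within engineering accuracy, which is why visually inspecting the plots is recommended.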
I compiled openfast via Visual Studio as explained in the documentation. I am using Visual Studio 2019 and Intel Parallel Studio 2020.
I did the regression tests, of which 19 passed and 10 failed (including the linear ones, which, I read, do not count), as shown in the image below.
Most of the 5MW configurations fail. The DISCON controllers should be compiled correctly; in fact, some of the 5MW tests passed.
Then I generated the plots and inspected the results. All solutions seem good to me.
Relative errors exceeding the tolerance are usually of order 1e-5, some are of order 1e-4, and they are generally related to loads or yaw. For the land configuration, which is the one I’ll be working on, the maximum relative error is of order 1e-3. An image of the 5MW_Land_BD_DLL_WTurb errors is posted below.
What I’m wondering is: is it fine to start working with my setup, as I believe, or should I try recompiling everything with different versions of Visual Studio, etc.?
Thank you for your attention.
If the plots look reasonable to you, I’m sure the solution is fine and you can proceed with running OpenFAST for your analysis.
The OpenFAST r-test is highly sensitive and will indicate failures due to differences in compiler, platform, precision, source code changes, etc., even though the results are clearly within engineering accuracy. This is something that needs to be fixed with the testing framework in the future.
I have compiled OpenFAST with Windows Visual Studio 2019 and I tried running the regression tests.
23 out of 33 tests passed, but the others show quite high residuals at some point (I attach the results of 5MW_Land_BD_DLL_WTurb).
I was wondering if I can proceed with my work or if there is something wrong.
Thank you very much in advance,
How do the plots look for the cases that are failing?
Dear Mr. Jonkman,
here is the report of the 5MW_Land_BD_DLL_WTurb case.
The PDF is split into two parts to respect the attachment size constraint.
part2.pdf (2.69 MB)
Part1.pdf (2.47 MB)
The differences shown in your plots look reasonable given the possibility of significant differences between your system and the one used to generate the baseline results. In my opinion, you are ok to proceed using the executable you’ve obtained.