Python Fatigue tools

Hi,

I was looking for Python-based tools to do the activities below and searched in the repositories pasted below, but I was not able to find the code. Did I miss it, or is it still under construction?

  • Fatigue DEL calculation from simulations from cut-in to cut-out, both for the lifetime and for a single time series
  • Markov matrix, LDD, etc. calculations
  • Extreme loads post-processing

Any help is highly appreciated,

Best regards,
Karthik

Hi Karthik,

We recognize that our Python OpenFAST utilities are a bit fragmented and require some familiarity with reading Python code. Apologies for that upfront. Consolidating them is on our long-term roadmap.

There is the pCrunch utility (develop branch) that builds an extreme event table and uses the fatpack library for the fatigue calculation post-processing. Here is a brief example of usage. pCrunch is also part of the WEIS stack, where OpenFAST is embedded in a design loop and the extreme loads or fatigue loads are available as objectives or constraints.
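If you only need a quick standalone number for a single time series, fatpack by itself gets you most of the way. Here is a minimal sketch; the channel file name, S-N slope, and equivalent-cycle frequency are illustrative assumptions, not pCrunch defaults:

import numpy as np
import fatpack  # rainflow counting library used by pCrunch

m = 10.0          # Woehler (S-N) exponent, example value
f_eq = 1.0        # equivalent-cycle frequency [Hz], example value
T = 600.0         # simulation length [s]
n_eq = f_eq * T   # number of equivalent cycles

signal = np.loadtxt("RootMyb1.txt")   # hypothetical single-channel load series

# Rainflow count the load ranges, then form the damage-equivalent load:
# DEL = ( sum_i S_i^m / n_eq )^(1/m)
ranges = fatpack.find_rainflow_ranges(signal)
DEL = (np.sum(ranges**m) / n_eq) ** (1.0 / m)
print(DEL)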

Emmanuel may chime in with some guides on the python tools you linked to as well.

Cheers,
Garrett

Hi all,
For the python-toolbox you can find my answer on the separate thread:
Python based post-processing tool · Issue #46 · OpenFAST/python-toolbox · GitHub

Cheers,

Emmanuel


Hi @Garrett.Barter

Many thanks for getting back. Does the example code calculate the DEL for a set of time series? For example, if I have time series from 3 to 25 m/s and I want the overall DEL, will the code be able to do that?

I did a

pip install pCrunch

Then I pasted the example code into a separate folder, placed the .out file at the same location, and tried to run the script, but it gave the following error:

Best regards,
Karthik

You will need to work off of the develop branch of pCrunch. I actually didn’t know it was pip-installable, but if that command worked, then you got an old release, not the develop-branch version. You will have to clone the repo, check out the develop branch, and then install from source.
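Roughly, installing from source amounts to something like the standard git/pip steps:

git clone https://github.com/NREL/pCrunch.git
cd pCrunch
git checkout develop
pip install -e .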

For your DEL question, yes pCrunch is intended to capture the full DEL across a DLC, such as a wind speed sweep. You can also provide probability weightings for each wind speed.
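For reference, the usual way per-wind-speed DELs are rolled up into one weighted DEL is a probability-weighted sum on the S-N slope. A minimal sketch follows; the DEL values, probabilities, and slope are made up for illustration, and each bin is assumed to contribute the same number of equivalent cycles:

import numpy as np

dels  = np.array([4.2, 5.1, 6.3, 5.8])       # hypothetical per-bin short-term DELs
probs = np.array([0.30, 0.35, 0.25, 0.10])   # bin probabilities, must sum to 1
m = 10.0                                     # Woehler exponent

# DEL_weighted = ( sum_j p_j * DEL_j^m )^(1/m)
DEL_weighted = (np.sum(probs * dels**m)) ** (1.0 / m)
print(DEL_weighted)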

Thanks @Garrett.Barter for getting back.

I didn’t quite understand what you meant by install from source. Did you mean to follow the steps under **Development setup** in the link GitHub - NREL/pCrunch at develop?

I am still learning Python and so my inferences below may be basic, pardon me for that.

How is lifetime defined in analysis.py? I see that it is defined in the class FatigueParams. Is that class used elsewhere? I couldn’t find it being called anywhere.

In the same analysis.py, in the definition of the method get_DELs shown below, lifetime is used:

    for chan, fatparams in self._fc.items():
        try:
            DELs[chan], D[chan] = self._compute_del(
                output[chan], output.elapsed_time,
                fatparams.lifetime,
                fatparams.load2stress, fatparams.slope,
                fatparams.ult_stress, fatparams.S_intercept,
                **kwargs
            )

But I didn’t understand how lifetime is set or stored by the above code, and from where (for example, in fatparams)?

When I ran the tool, I got the below error.

(screenshot of the error message)

Any help is highly appreciated.

Best regards,
Karthik

Hi Karthik,

You have to initialize a FatigueParams object and use that as the input to the loads analysis calculation. Here is some code in WEIS that creates those objects and then passes them to the analysis constructor, which is executed directly after the time series is loaded from OpenFAST (it may seem a little intimidating at first).

Hopefully that helps. The processing example in the pCrunch repo may not be up-to-date or the best standalone example (and I’m not sure if it is regularly tested). If you have suggestions for improvements, we would welcome a PR or suggestions.

Cheers,
Garrett

Hi Garrett,

Many thanks for getting back. A basic query: I couldn’t see any instance of a FatigueParams object (lss_fatigue) being passed into the LoadsAnalysis class. Please see the figure below, and pardon me for my naive queries.

I am using the code below and have defined the class instance:

fatigueparme = FatigueParams(lifetime=20.0, load2stress=1.0, slope=4.0, ult_stress=1.0, S_intercept=0.0)

When I pass the class instance fatigueparme to the LoadsAnalysis class as an argument, I get the warning:

__author__ = ["Nikhar Abbas", "Jake Nunemaker"]
__copyright__ = "Copyright 2020, National Renewable Energy Laboratory"
__maintainer__ = ["Nikhar Abbas", "Jake Nunemaker"]
__email__ = ["nikhar.abbas@nrel.gov", "jake.nunemaker@nrel.gov"]


import os
from fnmatch import fnmatch

import numpy as np
import pandas as pd
import ruamel.yaml as ry

from pCrunch import LoadsAnalysis, PowerProduction, FatigueParams
from pCrunch.io import load_FAST_out
from pCrunch.utility import save_yaml, get_windspeeds, convert_summary_stats


def valid_extension(fp):
    return any([fnmatch(fp, ext) for ext in ["*.outb", "*.out"]])


# Define input files paths
output_dir = "C:/Users/prakakar/source/pCrunch/pCrunch/postProcessing/"
results_dir = os.path.join(output_dir, "results")
save_results = True


# Find outfiles
outfiles = [
    os.path.join(output_dir, f)
    for f in os.listdir(output_dir)
    if valid_extension(f)
]

# Configure pCrunch
magnitude_channels = {
    "RootMb1": ["RootMxb1", "RootMyb1", "RootMzb1"],

}

fatigue_channels = {"RootMb1": 10}

channel_extremes = [
    "RotSpeed",
    "RotTorq",
    "RootMb1",

]



fatigueparme = FatigueParams(lifetime=20.0, load2stress=1.0, slope=4.0, ult_stress=1.0, S_intercept=0.0)

# =============================================================================
# # Configure pCrunch
# magnitude_channels = {
#     "RootMc1": ["RootMxc1", "RootMyc1", "RootMzc1"],
#     "RootMc2": ["RootMxc2", "RootMyc2", "RootMzc2"],
#     "RootMc3": ["RootMxc3", "RootMyc3", "RootMzc3"],
# }
# 
# fatigue_channels = {"RootMc1": 10, "RootMc2": 10, "RootMc3": 10}
# 
# channel_extremes = [
#     "RotSpeed",
#     "RotThrust",
#     "RotTorq",
#     "RootMc1",
#     "RootMc2",
#     "RootMc3",
# ]
# =============================================================================


# Run pCrunch
la = LoadsAnalysis(
    outfiles,
    magnitude_channels=magnitude_channels,
    fatigue_channels=fatigue_channels,
    extreme_channels=channel_extremes,
    trim_data=(0,),
)
la.process_outputs(cores=1)

if save_results:
    save_yaml(
        results_dir,
        "summary_stats.yaml",
        convert_summary_stats(la.summary_stats),
    )

I think fatigue_channels should be a dictionary of FatigueParams objects. This is the part you are missing, I think. Here is one of the lines in WEIS where that is stored.

In your example, you would do:

fatigueparme = FatigueParams(lifetime=20.0, load2stress=1.0, slope=4.0, ult_stress=1.0, S_intercept=0.0)
fatigue_channels = {"RootMb1": fatigueparme}

Many thanks @Garrett.Barter for getting back. I tried that approach and am pasting the code below. The code ran without any error, but I could not see the results, nor was the results folder created.

__author__ = ["Nikhar Abbas", "Jake Nunemaker"]
__copyright__ = "Copyright 2020, National Renewable Energy Laboratory"
__maintainer__ = ["Nikhar Abbas", "Jake Nunemaker"]
__email__ = ["nikhar.abbas@nrel.gov", "jake.nunemaker@nrel.gov"]


import os
from fnmatch import fnmatch

import numpy as np
import pandas as pd
import ruamel.yaml as ry

from pCrunch import LoadsAnalysis, PowerProduction, FatigueParams
from pCrunch.io import load_FAST_out
from pCrunch.utility import save_yaml, get_windspeeds, convert_summary_stats


def valid_extension(fp):
    return any([fnmatch(fp, ext) for ext in ["*.outb", "*.out"]])


# Define input files paths
output_dir = "C:/Users/prakakar/source/pCrunch/pCrunch/postProcessing/"
results_dir = os.path.join(output_dir, "results")
save_results = True


# Find outfiles
outfiles = [
    os.path.join(output_dir, f)
    for f in os.listdir(output_dir)
    if valid_extension(f)
]

# Configure pCrunch
magnitude_channels = {
    "RootMb1": ["RootMxb1", "RootMyb1", "RootMzb1"],

}


fatigueparme = FatigueParams(lifetime=20.0, load2stress=1.0, slope=10.0, ult_stress=1.0, S_intercept=1.0)

#fatigue_channels = {"RootMb1": 10}
fatigue_channels = {"RootMb1": fatigueparme}
#fatigue_channels = {"RootMb1": fatigueparme}

channel_extremes = [
    "RotSpeed",
    "RotTorq",
    "RootMb1",

]



# fatigueparme = FatigueParams(lifetime=20.0, load2stress=1.0, slope=10.0, ult_stress=1.0, S_intercept=1.0)

# =============================================================================
# # Configure pCrunch
# magnitude_channels = {
#     "RootMc1": ["RootMxc1", "RootMyc1", "RootMzc1"],
#     "RootMc2": ["RootMxc2", "RootMyc2", "RootMzc2"],
#     "RootMc3": ["RootMxc3", "RootMyc3", "RootMzc3"],
# }
# 
# fatigue_channels = {"RootMc1": 10, "RootMc2": 10, "RootMc3": 10}
# 
# channel_extremes = [
#     "RotSpeed",
#     "RotThrust",
#     "RotTorq",
#     "RootMc1",
#     "RootMc2",
#     "RootMc3",
# ]
# =============================================================================


# Run pCrunch
la = LoadsAnalysis(
    outfiles,
    magnitude_channels=magnitude_channels,
    fatigue_channels=fatigue_channels,
    extreme_channels=channel_extremes,
    trim_data=(0,),
)

#la = LoadsAnalysis( outfiles, magnitude_channels=magnitude_channels, fatigue_channels=fatigue_channels, extreme_channels=channel_extremes, trim_data=(0,),fatigueparme)
la.process_outputs(cores=1)

if save_results:
    save_yaml(
        results_dir,
        "summary_stats.yaml",
        convert_summary_stats(la.summary_stats),
    )

# =============================================================================
# # Load case matrix into dataframe
# fname_case_matrix = os.path.join(output_dir, "case_matrix.yaml")
# with open(fname_case_matrix, "r") as f:
#     case_matrix = ry.load(f, Loader=ry.Loader)
# cm = pd.DataFrame(case_matrix)
# 
# # Get wind speeds for processed runs
# windspeeds, seed, IECtype, cm_wind = get_windspeeds(cm, return_df=True)
# 
# # Get AEP
# turbine_class = 1
# pp = PowerProduction(turbine_class)
# AEP, perf_data = pp.AEP(
#     la.summary_stats,
#     windspeeds,
#     ["GenPwr", "RtAeroCp", "RotSpeed", "BldPitch1"],
# )
# print(f"AEP: {AEP}")
# =============================================================================

# # ========== Plotting ==========
# an_plts = Analysis.wsPlotting()
# #  --- Time domain analysis ---
# filenames = [outfiles[0][2], outfiles[1][2]] # select the 2nd run from each dataset
# cases = {'Baseline': ['Wind1VelX', 'GenPwr', 'BldPitch1', 'GenTq', 'RotSpeed']}
# fast_dict = fast_io.load_FAST_out(filenames, tmin=30)
# fast_pl.plot_fast_out(cases, fast_dict)

# # Plot some spectral cases
# spec_cases = [('RootMyb1', 0), ('TwrBsFyt', 0)]
# twrfreq = .0716
# fig,ax = fast_pl.plot_spectral(fast_dict, spec_cases, show_RtSpeed=True,
#                         add_freqs=[twrfreq], add_freq_labels=['Tower'],
#                         averaging='Welch')
# ax.set_title('DLC1.1')

# # Plot a data distribution
# channels = ['RotSpeed']
# caseid = [0, 1]
# an_plts.distribution(fast_dict, channels, caseid, names=['DLC 1.1', 'DLC 1.3'])

# # --- Batch Statistical analysis ---
# # Bar plot
# fig,ax = an_plts.stat_curve(windspeeds, stats, 'RotSpeed', 'bar', names=['DLC1.1', 'DLC1.3'])

# # Turbulent power curve
# fig,ax = an_plts.stat_curve(windspeeds, stats, 'GenPwr', 'line', stat_idx=0, names=['DLC1.1'])

# plt.show()

Hi Karthik,

I’m not sure what format of output you were expecting or wanting. The save_yaml method is likely meant for Pandas DataFrame data. I would suggest using your debugger to explore the data structures in the pCrunch objects. Using the code as a guide will likely help.
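For example, something along these lines; the attribute names follow the develop branch and may differ in your copy, so check with dir(la) in the debugger:

# la is the LoadsAnalysis object after la.process_outputs()
print(type(la.summary_stats))    # typically a pandas DataFrame of channel statistics
print(la.summary_stats.head())

if hasattr(la, "DELs"):
    print(la.DELs)               # damage-equivalent loads per fatigue channel

# Dumping the statistics to CSV is an easy way to inspect them:
la.summary_stats.to_csv("summary_stats.csv")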

Cheers,
Garrett


Hi @Garrett.Barter

Many thanks for the help.

Best regards,
Karthik

Hi @Garrett.Barter

Debugging like you suggested worked. Many thanks for that.

Also, is there any set of tools in Python that can calculate LDDs, Markov matrices, etc.?

Best regards,
Karthik

Not sure if it has everything you are looking for, but take a look at fatpack:
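If it doesn’t, both quantities are straightforward to assemble yourself from numpy and the rainflow output. A generic sketch follows; the file name, time step, and bin counts are arbitrary assumptions, and this is not an existing pCrunch feature:

import numpy as np
import fatpack

signal = np.loadtxt("RootMyb1.txt")   # hypothetical single-channel load series
dt = 0.05                             # hypothetical sample step [s]

# Load duration distribution (LDD): time spent in each load-level bin
level_bins = np.linspace(signal.min(), signal.max(), 51)
counts, _ = np.histogram(signal, bins=level_bins)
ldd_seconds = counts * dt

# Markov matrix: rainflow cycles binned by (mean, range)
reversals, _ = fatpack.find_reversals(signal)
cycles, residue = fatpack.find_rainflow_cycles(reversals)
means = cycles.mean(axis=1)
ranges = np.abs(cycles[:, 1] - cycles[:, 0])
markov, mean_edges, range_edges = np.histogram2d(means, ranges, bins=25)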
