aixcalibuha package
Python package to calibrate models created in Modelica or possibly other simulation software.
Subpackages
- aixcalibuha.calibration package
- aixcalibuha.sensitivity_analysis package
- Submodules
- aixcalibuha.sensitivity_analysis.fast module
- aixcalibuha.sensitivity_analysis.morris module
- aixcalibuha.sensitivity_analysis.pawn module
- aixcalibuha.sensitivity_analysis.plotting module
- aixcalibuha.sensitivity_analysis.sensitivity_analyzer module
SenAnalyzer
SenAnalyzer.analysis_function()
SenAnalyzer.analysis_variables
SenAnalyzer.create_problem()
SenAnalyzer.create_sampler_demand()
SenAnalyzer.eval_statistical_measure()
SenAnalyzer.generate_samples()
SenAnalyzer.load_from_csv()
SenAnalyzer.plot()
SenAnalyzer.run()
SenAnalyzer.run_time_dependent()
SenAnalyzer.save_for_reproduction()
SenAnalyzer.select_by_threshold()
SenAnalyzer.select_by_threshold_verbose()
SenAnalyzer.simulate_samples()
- aixcalibuha.sensitivity_analysis.sobol module
- aixcalibuha.utils package
MaxIterationsReached
validate_cal_class_input()
- Submodules
- aixcalibuha.utils.configuration module
- aixcalibuha.utils.visualizer module
CalibrationLogger
CalibrationLogger.calibrate_new_class()
CalibrationLogger.calibration_callback_func()
CalibrationLogger.calibration_class
CalibrationLogger.cd
CalibrationLogger.decimal_prec
CalibrationLogger.error()
CalibrationLogger.goals
CalibrationLogger.integer_prec
CalibrationLogger.log()
CalibrationLogger.log_initial_names()
CalibrationLogger.log_intersection_of_tuners()
CalibrationLogger.save_calibration_result()
CalibrationLogger.tuner_paras
CalibrationLogger.validation_callback_func()
CalibrationVisualizer
short_name()
Submodules
aixcalibuha.data_types module
Module containing data types to enable automatic usage of the other modules in the Python package.
- class aixcalibuha.data_types.CalibrationClass(name, start_time, stop_time, goals=None, tuner_paras=None, relevant_intervals=None, **kwargs)[source]
Bases:
object
Class used for calibration of time-series data.
- Parameters:
name (str) – Name of the class, e.g. ‘device on’
goals (Goals) – Goals parameters which are relevant in this class. As this class may be used in the classifier, a Goals-Class may not be available at all times and can be added later.
tuner_paras (TunerParas) – As this class may be used in the classifier, a TunerParas-Class may not be available at all times and can be added later.
relevant_intervals (list) – List of time-intervals relevant for the calibration. Each list element has to be a tuple with the first element being the start-time as float/int and the second element being the end-time of the interval as float/int. E.g.: for a class with start_time=0 and stop_time=1000, the intervals [(0, 100), (150, 200), (500, 600)] will only evaluate the data between 0-100, 150-200 and 500-600. The given intervals may overlap. Furthermore, the intervals do not need to be in ascending order or be limited to the start_time and stop_time parameters.
inputs ((pd.DataFrame, ebcpy.data_types.TimeSeriesData)) – TimeSeriesData or DataFrame that holds input data for the simulation to run. The time-index should be a float index and match the overall range set by start- and stop-time.
input_kwargs (dict) – If inputs are provided, additional input keyword-args passed to the simulation API can be specified. Using FMUs, you don’t need to specify anything. Using the DymolaAPI, you have to specify ‘table_name’ and ‘file_name’.
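The interval selection described for relevant_intervals can be sketched in plain Python. This is an illustrative helper, not the library's implementation; the name in_relevant_intervals is hypothetical:

```python
def in_relevant_intervals(t, intervals):
    """Return True if time t lies inside any (start, end) interval.
    Intervals may overlap and need not be sorted (sketch only)."""
    return any(start <= t <= end for start, end in intervals)

# Intervals from the example above: evaluate only 0-100, 150-200 and 500-600
intervals = [(0, 100), (150, 200), (500, 600)]
times = [50, 120, 175, 550, 700]
relevant = [t for t in times if in_relevant_intervals(t, intervals)]
# relevant == [50, 175, 550]
```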
- property inputs: TimeSeriesData | DataFrame
Get the inputs for this calibration class
- property name
Get name of calibration class
- property tuner_paras: TunerParas
Get the tuner parameters of the calibration-class
- class aixcalibuha.data_types.Goals(meas_target_data: TimeSeriesData | DataFrame, variable_names: dict, statistical_measure: str, weightings: list | None = None)[source]
Bases:
object
Class for one or multiple goals. Used to evaluate the difference between current simulation and measured data
- Parameters:
meas_target_data ((ebcpy.data_types.TimeSeriesData, pd.DataFrame)) – The dataset of the measurement. It acts as a point of reference for the simulation output. If the dimensions of the given DataFrame and later added simulation-data are not equal, an error is raised. Has to hold all variables listed under the MEASUREMENT_NAME variable in the variable_names dict.
variable_names (dict) –
A dictionary to construct the goals-DataFrame using pandas MultiIndex functionality. The dict has to follow this structure:
variable_names = {VARIABLE_NAME: [MEASUREMENT_NAME, SIMULATION_NAME]}
VARIABLE_NAME: A string which holds the actual name of the variable you use as a goal. E.g.:
VARIABLE_NAME="Temperature_Condenser_Outflow"
MEASUREMENT_NAME: Either a string or a tuple. Holds the name the variable has inside the given meas_target_data. If you want to specify a tag, pass a tuple like:
(MEASUREMENT_NAME, TAG_NAME)
Else just pass a string. E.g.: MEASUREMENT_NAME="HydraulicBench[4].T_Out"
or MEASUREMENT_NAME=("HydraulicBench[4].T_Out", "preprocessed")
SIMULATION_NAME is either a string or a tuple, just like MEASUREMENT_NAME. E.g. (for Modelica):
SIMULATION_NAME="HeatPump.Condenser.Vol.T"
You may use a tuple instead of a list, or a dict with key “meas” for measurement and key “sim” for simulation. These options may improve the readability of your code. E.g.:
variable_names = {VARIABLE_NAME: {"meas":MEASUREMENT_NAME, "sim": SIMULATION_NAME}}
statistical_measure (str) – Measure used to calculate the scalar of the objective. One of the supported methods in ebcpy.utils.statistics_analyzer.StatisticsAnalyzer, e.g. RMSE, MAE, NRMSE.
weightings (list) – Values between 0 and 1 to account for multiple Goals being evaluated. If multiple goals are selected and weightings is None, each weighting will be equal to 1/(number of goals). The weightings are scaled so that their sum equals 1.
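The three accepted variable_names layouts described above, side by side. The variable and signal names are taken from the examples in this section; the key "TCon_out" is an arbitrary illustrative VARIABLE_NAME:

```python
# All three forms describe the same goal variable; list, tuple, and dict
# with "meas"/"sim" keys are interchangeable per the documentation above.
as_list = {"TCon_out": ["HydraulicBench[4].T_Out", "HeatPump.Condenser.Vol.T"]}
as_tuple = {"TCon_out": ("HydraulicBench[4].T_Out", "HeatPump.Condenser.Vol.T")}
as_dict = {"TCon_out": {"meas": "HydraulicBench[4].T_Out",
                        "sim": "HeatPump.Condenser.Vol.T"}}

# A tag can be attached to the measurement name via a tuple:
with_tag = {"TCon_out": [("HydraulicBench[4].T_Out", "preprocessed"),
                         "HeatPump.Condenser.Vol.T"]}
```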
- eval_difference(verbose=False, penaltyfactor=1)[source]
Evaluate the difference of the measurement and simulated data based on the chosen statistical_measure.
- Parameters:
verbose (bool) – If True, a dict with the difference values of all goals and the corresponding weightings is returned together with the total difference. This can be useful to better understand which goals are performing well in an optimization and which are not.
penaltyfactor (float) – Multiply the result with this factor to account for penalties of some sort.
- Returns:
float total_difference: Weighted output for all goals.
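The weighted combination described above can be sketched as follows. This is a hypothetical helper, not the library's code; it assumes the weightings already sum to 1 and that each goal's statistical measure has been computed separately:

```python
def weighted_total(goal_errors, weightings, penaltyfactor=1.0):
    """Combine per-goal statistical measures into one scalar objective,
    optionally scaled by a penalty factor (sketch only)."""
    return penaltyfactor * sum(w * e for w, e in zip(weightings, goal_errors))

# Two goals with RMSE-like errors, equally weighted
total = weighted_total([2.0, 4.0], [0.5, 0.5])           # -> 3.0
penalised = weighted_total([2.0, 4.0], [0.5, 0.5], 1.5)  # -> 4.5
```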
- get_meas_frequency()[source]
Get the frequency of the measurement data.
- Returns:
float: Mean frequency of the index
- get_sim_var_names()[source]
Get the names of the simulation variables.
- Returns list sim_var_names:
Names of the simulation variables as a list
- meas_tag_str = 'meas'
- set_relevant_time_intervals(intervals)[source]
For many calibration use cases, different time-intervals of the measured and simulated data are relevant. Set the intervals to be used with this function. This will change both measured and simulated data. Therefore, the eval_difference function can be called at any time.
- Parameters:
intervals (list) – List of time-intervals. Each list element has to be a tuple with the first element being the start_time as float or int and the second element being the end_time of the interval as float or int. E.g.: [(0, 100), (150, 200), (500, 600)]
- set_sim_target_data(sim_target_data)[source]
Alter the object with new simulation data self._sim_target_data based on the given dataframe sim_target_data.
- Parameters:
sim_target_data (TimeSeriesData) – Object with simulation target data. This data should be the output of a simulation, hence “sim”-target-data.
- sim_tag_str = 'sim'
- property statistical_measure
The statistical measure of this Goal instance
- class aixcalibuha.data_types.TunerParas(names, initial_values, bounds=None)[source]
Bases:
object
Class for tuner parameters. Tuner parameters are parameters of a model which are constant during simulation but are varied during calibration or other analysis.
- Parameters:
names (list) – List of names of the tuner parameters
initial_values (float,int) – Initial values for optimization. Even though some optimization methods don’t require an initial guess, specifying an initial guess based on expected values or experience helps to better check the results of the calibration.
bounds (list,tuple) – Tuple or list of floats or ints for the lower and upper bound of each tuner parameter. The bounds object is optional, but highly recommended for calibration or optimization in general. As soon as you tune parameters with different units, such as capacity and heat conductivity, the solver will otherwise fail to find good solutions.
Example:
>>> tuner_paras = TunerParas(names=["C", "m_flow_2", "heatConv_a"],
>>>                          initial_values=[5000, 0.02, 200],
>>>                          bounds=[(4000, 6000), (0.01, 0.1), (10, 300)])
>>> print(tuner_paras)
            initial_value      min     max    scale
names
C                 5000.00  4000.00  6000.0  2000.00
m_flow_2             0.02     0.01     0.1     0.09
heatConv_a         200.00    10.00   300.0   290.00
- property bounds
Get property bounds
- descale(scaled)[source]
Converts the given scaled value to a descaled one.
- Parameters:
scaled (np.array,list) – Scaled input value between 0 and 1
- Returns:
np.array descaled: descaled value based on bounds.
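Scaling maps each tuner parameter onto [0, 1] using its bounds; note that the scale column in the TunerParas table above equals max - min. A minimal sketch, assuming the conventional min-max transform (these helper names are hypothetical, not the class methods):

```python
def scale(value, low, high):
    """Map value from [low, high] onto [0, 1]."""
    return (value - low) / (high - low)

def descale(scaled, low, high):
    """Inverse of scale: map a value in [0, 1] back onto [low, high]."""
    return scaled * (high - low) + low

# Round-trip for the "C" parameter from the example above (bounds 4000-6000)
s = scale(5000, 4000, 6000)      # 0.5
value = descale(s, 4000, 6000)   # 5000.0
```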
- remove_names(names)[source]
Remove the given list of names from the tuner parameters.
- Parameters:
names (list) – List with names inside of the TunerParas-dataframe
- aixcalibuha.data_types.merge_calibration_classes(calibration_classes)[source]
Given a list of multiple calibration-classes, this function merges the given objects by their “name” attribute. Relevant intervals are set in order to maintain the start- and stop-time information.
- Parameters:
calibration_classes (list) – List containing multiple CalibrationClass-Objects
- Returns:
list cal_classes_merged: A list containing one CalibrationClass-Object for each different “name” of class.
Example:
>>> cal_classes = [CalibrationClass("on", 0, 100),
>>>                CalibrationClass("off", 100, 200),
>>>                CalibrationClass("on", 200, 300)]
>>> merged_classes = merge_calibration_classes(cal_classes)
Is equal to:
>>> merged_classes = [CalibrationClass("on", 0, 300,
>>>                                    relevant_intervals=[(0, 100), (200, 300)]),
>>>                   CalibrationClass("off", 100, 200)]
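The merge-by-name behaviour shown in the example can be sketched in plain Python. This is a hypothetical illustration operating on (name, start, stop) tuples rather than CalibrationClass objects; it is not the library's implementation:

```python
def merge_by_name(classes):
    """classes: list of (name, start, stop) tuples.
    Merge entries sharing a name; keep each original span as a
    relevant interval so no time range is lost (sketch only)."""
    merged = {}
    for name, start, stop in classes:
        if name not in merged:
            merged[name] = {"start": start, "stop": stop, "intervals": []}
        entry = merged[name]
        entry["start"] = min(entry["start"], start)
        entry["stop"] = max(entry["stop"], stop)
        entry["intervals"].append((start, stop))
    return merged

merged = merge_by_name([("on", 0, 100), ("off", 100, 200), ("on", 200, 300)])
# merged["on"] spans 0-300 with relevant intervals [(0, 100), (200, 300)]
```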