GAMCR.model.GAMCR module#
- class GAMCR.model.GAMCR.GAMCR(max_lag=240, features={}, n_splines=10, lam=10)[source]#
Bases: Dataset, Trainer, ComputeStatistics

Main class of the GAMCR package to learn transfer functions of a given watershed; a minimal instantiation sketch is shown after the attribute list below.
…
- features#
dictionary of the different features used in the model
- Type:
dict
- lam#
regularization parameter related to the smoothing penalty in the GAM
- Type:
positive float
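A minimal instantiation sketch, assuming the class is imported from its documented module path; the keys of the features dictionary below are illustrative placeholders, not part of the documented API:

```python
from GAMCR.model.GAMCR import GAMCR

# Illustrative feature configuration: the keys (and the expected value format)
# are assumptions for this sketch, not documented here.
features = {
    'precipitation': {},
    'temperature': {},
}

# Constructor arguments follow the documented signature and defaults.
model = GAMCR(max_lag=240, features=features, n_splines=10, lam=10)
```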
- predict_streamflow(matJ)[source]#
Predict the hydrograph from the matrix matJ (obtained from the method ‘get_GAMdesign’ of the class ‘Dataset’).
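A short sketch continuing from the instantiation above, assuming matJ was already built with the ‘get_GAMdesign’ method of the ‘Dataset’ class (whose call signature is not documented in this section):

```python
# matJ is assumed to have been produced by Dataset.get_GAMdesign.
streamflow_hat = model.predict_streamflow(matJ)
```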
- train(X, matJ, Y, dates=None, lr=0.001, max_iter=200, warm_start=False, save_folder=None, name_model='', normalization_loss=1, lam_global=0)[source]#
Train the model. A usage sketch follows the parameter list below.
- Parameters:
- X array
Design matrix of the GAM computed from the method ‘get_design’. X has dimensions: number of timepoints × number of features.
- matJ array
Matrix used in the convolution to get the streamflow values (obtained from the method ‘get_GAMdesign’ of the class ‘Dataset’).
- dates array, optional
Array of dates.
- lr float, optional
Initial value of the learning rate. Note that the learning rate is automatically adjusted to ensure a strict decrease of the training loss.
- max_iter int, optional
Maximum number of iterations of the projected gradient descent algorithm.
- warm_start bool, optional
If True, the model parameters are initialized from the parameters saved in the loaded model.
- save_folder str, optional
Path to the folder of the studied site where the optimized model will be saved.
- name_model str, optional
Custom name of the model that will be saved.
- normalization_loss positive float, optional
Normalization factor for the loss (should be kept at 1).
- lam_global positive float, optional
Regularization parameter for the smoothing penalty applied to the transfer functions.
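A hedged training sketch using the documented keyword arguments; X, matJ, and Y are assumed to have been prepared beforehand (X from ‘get_design’, matJ from ‘get_GAMdesign’, Y as the observed streamflow targets), and the folder path and model name are placeholders:

```python
# X: GAM design matrix (timepoints x features), assumed from 'get_design'
# matJ: convolution matrix, assumed from 'get_GAMdesign'
# Y: observed streamflow values (assumed training target)
model.train(
    X, matJ, Y,
    dates=None,               # optional array of dates
    lr=1e-3,                  # initial learning rate, adjusted automatically
    max_iter=200,             # cap on projected gradient descent iterations
    warm_start=False,         # do not reuse previously saved parameters
    save_folder='path/to/site_folder',  # placeholder path
    name_model='gamcr_demo',            # placeholder model name
    lam_global=0,             # global smoothing penalty on transfer functions
)
```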