API
This page details the methods and classes provided by the nnogada package.
Nnogada
class nnogada.Nnogada.MLP(ncols, noutput, numneurons=200, numlayers=3, dropout=0.5)[source]
Bases: torch.nn.modules.module.Module
Multilayer Perceptron class for regression.
__init__(ncols, noutput, numneurons=200, numlayers=3, dropout=0.5)[source]
Initialization method.
Parameters: - ncols (int) – Number of input columns (attributes).
- noutput (int) – Number of outputs of the network.
- numneurons (int) – Number of neurons per layer.
- numlayers (int) – Number of layers.
- dropout (float) – Dropout probability.
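A minimal instantiation sketch, assuming the module follows the standard PyTorch forward-pass convention; the sizes below are illustrative, not defaults:

    import torch
    from nnogada.Nnogada import MLP

    # Illustrative sizes: 5 input columns, 1 regression output.
    model = MLP(ncols=5, noutput=1, numneurons=100, numlayers=3, dropout=0.2)

    # Forward pass on a random batch (assumes the usual nn.Module call).
    x = torch.randn(16, 5)
    y_pred = model(x)
    print(y_pred.shape)  # expected: torch.Size([16, 1])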
class nnogada.Nnogada.Nnogada(hyp_to_find, X_train, Y_train, X_val, Y_val, regression=True, verbose=False, **kwargs)[source]
Bases: object
Main class for nnogada.
__init__(hyp_to_find, X_train, Y_train, X_val, Y_val, regression=True, verbose=False, **kwargs)[source]
Initialization of the Nnogada class.
Parameters: - hyp_to_find (dict) – Dictionary with the free hyperparameters of the neural network. The names must match the names in the hyperparameters.py file. Ex: hyperparams = {'deep': [2, 3], 'num_units': [100, 200], 'batch_size': [8, 32]}
- X_train (numpy.ndarray) – Set of attributes, or independent variables, for training.
- Y_train (numpy.ndarray) – Set of labels or dependent variable for training.
- X_val (numpy.ndarray) – Set of attributes, or independent variables, for testing/validation.
- Y_val (numpy.ndarray) – Set of labels or dependent variable for testing/validation.
- regression (bool) – If True, a regression task is assumed; otherwise, classification. This affects the default activation function of the last layer: linear for regression, softmax for classification.
- **kwargs (kwargs) – Optional arguments:
  - deep (Hyperparameter) – Number of layers.
  - num_units (Hyperparameter) – Number of nodes per layer.
  - batch_size (Hyperparameter) – Batch size.
  - learning_rate (Hyperparameter) – Learning rate for the Adam optimizer.
  - epochs (Hyperparameter) – Number of epochs for training.
  - act_fn (Hyperparameter) – Activation function for the hidden layers.
  - last_act_fn (Hyperparameter) – Activation function for the last layer.
  - loss_fn (Hyperparameter) – Loss function.
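A minimal construction sketch, assuming the training and validation arrays are already prepared; the random data and hyperparameter grid below are illustrative, and any further setup the search may need before running is not shown:

    import numpy as np
    from nnogada.Nnogada import Nnogada

    # Illustrative regression data: 5 attributes, 1 label.
    X_train, Y_train = np.random.rand(100, 5), np.random.rand(100, 1)
    X_val, Y_val = np.random.rand(20, 5), np.random.rand(20, 1)

    # Free hyperparameters; names must match those in hyperparameters.py.
    hyperparams = {'deep': [2, 3], 'num_units': [100, 200], 'batch_size': [8, 32]}

    nnogada_search = Nnogada(hyp_to_find=hyperparams,
                             X_train=X_train, Y_train=Y_train,
                             X_val=X_val, Y_val=Y_val,
                             regression=True)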
eaSimpleWithElitism(population, toolbox, cxpb, mutpb, ngen, stats=None, halloffame=None, pbar=None)[source]
Method based on https://github.com/PacktPublishing/Hands-On-Genetic-Algorithms-with-Python.
The individuals contained in the halloffame are directly injected into the next generation and are not subject to the genetic operators of selection, crossover and mutation.
Parameters: - population (list) – List of individuals.
- toolbox (deap.base.Toolbox object) – Toolbox that contains the genetic operators.
- cxpb (float) – The probability of crossover between two individuals.
- mutpb (float) – Probability of mutation.
- ngen (int) – Number of generations.
- stats (deap.tools.Statistics object) – A Statistics object that is updated in place, optional.
- halloffame (deap.tools.HallOfFame object) – Object that will contain the best individuals, optional.
- pbar (bool) – Flag to use a progress bar with the tqdm library.
Returns: - population (list) – List of individuals.
- logbook (deap.tools.Logbook object) – Statistics of the evolution.
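For illustration, a sketch of the elitism step in plain DEAP terms; the helper function below (one_generation_with_elitism) is hypothetical and only mirrors the idea described above, not the package's exact implementation:

    from deap import algorithms

    def one_generation_with_elitism(population, toolbox, cxpb, mutpb, halloffame):
        # Select parents for every slot except those reserved for the elite.
        offspring = toolbox.select(population, len(population) - len(halloffame))
        # Apply crossover and mutation to the selected offspring only.
        offspring = algorithms.varAnd(offspring, toolbox, cxpb, mutpb)
        # Re-inject the hall-of-fame members, untouched by the genetic operators.
        offspring.extend(halloffame.items)
        # Evaluate new individuals and refresh the hall of fame.
        invalid = [ind for ind in offspring if not ind.fitness.valid]
        for ind, fit in zip(invalid, map(toolbox.evaluate, invalid)):
            ind.fitness.values = fit
        halloffame.update(offspring)
        return offspring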
ga_with_elitism(population_size, max_generations, gene_length, k, pmutation=0.5, pcrossover=0.5, hof=1)[source]
Simple genetic algorithm with elitism.
Parameters: - population_size (int) – Population size.
- max_generations (int) – Maximum number of generations.
- gene_length (int) – Length of each gene.
- k (int) – k parameter for the tournament selection method.
- pmutation (float) – Probability of mutation, between 0 and 1.
- pcrossover (float) – Probability of crossover, between 0 and 1.
- hof (int) – Number of individuals to stay in the hall of fame.
Returns: best_population – Individuals in the last population.
Return type: list
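A hypothetical call continuing the construction sketch above; the argument values are arbitrary placeholders, not recommended settings:

    # Run the genetic search over the hyperparameter grid.
    best_population = nnogada_search.ga_with_elitism(population_size=10,
                                                     max_generations=5,
                                                     gene_length=2,
                                                     k=3)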
neural_train_evaluate(ga_individual_solution)[source]
Trains and evaluates the neural network models with the different solutions proposed by the genetic algorithm.
Parameters: ga_individual_solution – Individual of the genetic algorithm.
Returns: loss – Last value of the loss function.
Return type: float
Hyperparameter
class nnogada.Hyperparameter.Hyperparameter(name, values, val, vary=False)[source]
Bases: object
This class defines a hyperparameter object.
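A minimal construction sketch based on the signature above; the argument values are illustrative, and the meanings of values, val and vary are assumptions inferred from their names:

    from nnogada.Hyperparameter import Hyperparameter

    # Assumed semantics: candidate values [8, 32], current value 8,
    # and vary=True so the genetic search is allowed to explore it.
    batch_size = Hyperparameter('batch_size', [8, 32], 8, vary=True)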