API

This page details the classes and methods provided by nnogada.

Nnogada

class nnogada.Nnogada.MLP(ncols, noutput, numneurons=200, numlayers=3, dropout=0.5)[source]

Bases: torch.nn.modules.module.Module

Multilayer Perceptron class for regression.

__init__(ncols, noutput, numneurons=200, numlayers=3, dropout=0.5)[source]

Initialization method.

Parameters:
  • ncols (int) – Number of attributes.
  • noutput (int) – Size of the output.
  • numneurons (int) – Number of neurons for the hidden layers.
  • numlayers (int) – Number of hidden layers.
  • dropout (float) – Dropout probability.
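
For instance, a regression network could be built as follows (a minimal sketch; the layer sizes and dropout probability are illustrative, not recommended values):

    from nnogada.Nnogada import MLP

    # Illustrative: 4 input attributes, 1 output, 2 hidden layers
    # of 100 neurons each, with a dropout probability of 0.3.
    model = MLP(ncols=4, noutput=1, numneurons=100, numlayers=2, dropout=0.3)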
forward(x)[source]

Forward pass using the activation functions and other operations defined in the torch architecture.

Parameters:x (torch.Tensor) – Input tensor.
Returns:x – Output tensor after the forward pass.
Return type:torch.Tensor
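
Since MLP subclasses torch.nn.Module, calling the model invokes forward() in the usual way (a sketch, assuming the model built in the example above):

    import torch

    x = torch.randn(8, 4)   # hypothetical batch of 8 samples, 4 attributes each
    y = model(x)            # equivalent to model.forward(x); expected shape (8, 1)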
init_weights(m)[source]

Initialization of the ANN weights.

Parameters:m (MLP) – Multilayer perceptron model.
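
The signature follows the standard torch pattern of passing an initializer to Module.apply, which calls it on every submodule (a sketch; nnogada may already apply this internally during construction):

    # Apply the weight initialization to every layer of the model.
    model.apply(model.init_weights)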
class nnogada.Nnogada.Nnogada(hyp_to_find, X_train, Y_train, X_val, Y_val, regression=True, verbose=False, **kwargs)[source]

Bases: object

Main class for nnogada.

__init__(hyp_to_find, X_train, Y_train, X_val, Y_val, regression=True, verbose=False, **kwargs)[source]

Initialization of Nnogada class.

Parameters:
  • hyp_to_find (dict) – Dictionary with the free hyperparameters of the neural net. The names must match the names in the hyperparameters.py file. Ex: hyperparams = {'deep': [2, 3], 'num_units': [100, 200], 'batch_size': [8, 32]}
  • X_train (numpy.ndarray) – Set of attributes, or independent variables, for training.
  • Y_train (numpy.ndarray) – Set of labels or dependent variable for training.
  • X_val (numpy.ndarray) – Set of attributes, or independent variables, for testing/validation.
  • Y_val (numpy.ndarray) – Set of labels or dependent variable for testing/validation.
  • regression (bool) – If True, a regression task is assumed; otherwise classification. This affects the default activation function for the last layer: linear for regression, softmax for classification.
  • **kwargs (kwargs) –

    Optional arguments:

    deep (Hyperparameter) – Number of layers.
    num_units (Hyperparameter) – Number of nodes per layer.
    batch_size (Hyperparameter) – Batch size.
    learning_rate (Hyperparameter) – Learning rate for the Adam optimizer.
    epochs (Hyperparameter) – Number of epochs for training.
    act_fn (Hyperparameter) – Activation function for the hidden layers.
    last_act_fn (Hyperparameter) – Activation function for the last layer.
    loss_fn (Hyperparameter) – Loss function.
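
A construction sketch (assuming X_train, Y_train, X_val and Y_val are numpy arrays already prepared for a regression task):

    from nnogada.Nnogada import Nnogada

    # Free hyperparameters and the values the GA may choose from.
    hyperparams = {'deep': [2, 3], 'num_units': [100, 200], 'batch_size': [8, 32]}

    net_finder = Nnogada(hyp_to_find=hyperparams,
                         X_train=X_train, Y_train=Y_train,
                         X_val=X_val, Y_val=Y_val,
                         regression=True)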
eaSimpleWithElitism(population, toolbox, cxpb, mutpb, ngen, stats=None, halloffame=None, pbar=None)[source]

Method based on https://github.com/PacktPublishing/Hands-On-Genetic-Algorithms-with-Python.

The individuals contained in the halloffame are directly injected into the next generation and are not subject to the genetic operators of selection, crossover and mutation.

Parameters:
  • population (list) – List of individuals.
  • toolbox (deap.base.Toolbox object) – Toolbox that contains the genetic operators.
  • cxpb (float) – The probability of crossover between two individuals.
  • mutpb (float) – Probability of mutation.
  • ngen (int) – Number of generations.
  • stats (deap.tools.Statistics object) – A Statistics object that is updated in place, optional.
  • halloffame (deap.tools.HallOfFame object) – Object that will contain the best individuals, optional.
  • pbar (bool) – Flag to use a progress bar from the tqdm library.
Returns:

  • population (list) – List of individuals.
  • logbook (deap.tools.Logbook object) – Statistics of the evolution.

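The core of one generation with elitism can be sketched as follows (a simplified illustration based on the Packt reference above, not the verbatim implementation; population, toolbox, halloffame, cxpb and mutpb are as in the parameter list):

    from deap import algorithms

    # Select and vary only the non-elite slots of the population...
    hof_size = len(halloffame.items)
    offspring = toolbox.select(population, len(population) - hof_size)
    offspring = algorithms.varAnd(offspring, toolbox, cxpb, mutpb)

    # Re-evaluate individuals whose fitness was invalidated by variation.
    invalid = [ind for ind in offspring if not ind.fitness.valid]
    for ind, fit in zip(invalid, map(toolbox.evaluate, invalid)):
        ind.fitness.values = fit

    # ...then inject the hall-of-fame members unchanged (elitism).
    offspring.extend(halloffame.items)
    halloffame.update(offspring)
    population[:] = offspring
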
ga_with_elitism(population_size, max_generations, gene_length, k, pmutation=0.5, pcrossover=0.5, hof=1)[source]

Simple genetic algorithm with elitism.

Parameters:
  • population_size (int) – Population size.
  • max_generations (int) – Maximum number of generations.
  • gene_length (int) – Length of each gene.
  • k (int) – k parameter for the tournament selection method.
  • pmutation (float) – Probability of mutation, between 0 and 1.
  • pcrossover (float) – Probability of crossover, between 0 and 1.
  • hof (int) – Number of individuals to stay in the hall of fame.
Returns:

best_population – Individuals in the last population.

Return type:

list

neural_train_evaluate(ga_individual_solution)[source]

This method trains and evaluates the neural network models with the different solutions proposed by the Genetic Algorithm.

Parameters:ga_individual_solution – Individual of the genetic algorithm.
Returns:loss – Last value for the loss function.
Return type:float
set_hyperparameters()[source]

This small routine marks the hyperparameters indicated in the hyp_to_find dictionary as free (variable).

Hyperparameter

class nnogada.Hyperparameter.Hyperparameter(name, values, val, vary=False)[source]

Bases: object

This class defines a hyperparameter object.

__init__(name, values, val, vary=False)[source]
Parameters:
  • name (str) – Name of the hyperparameter.
  • values (list) – Possible values of the hyperparameter.
  • val (float) – Value of the hyperparameter if vary is False.
  • vary (bool) – Flag that indicates if the hyperparameter is fixed (vary=False) or not (vary=True).
setVal(new_val)[source]
Parameters:new_val (float) – New value for the hyperparameter object.
setValues(values)[source]
Parameters:values (list) – New list of possible values for the hyperparameter.
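
A usage sketch (the hyperparameter shown and its values are illustrative):

    from nnogada.Hyperparameter import Hyperparameter

    # Batch size fixed at 32; the listed values become the search
    # space if vary is set to True.
    batch_size = Hyperparameter('batch_size', [8, 16, 32], 32, vary=False)

    batch_size.setVal(16)              # change the fixed value
    batch_size.setValues([8, 32, 64])  # change the candidate values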