
Configuration used for training models.

internal

Hierarchy

  • Config

Index

Properties

dropout

dropout: number

The rate to use for the dropout layer, which helps prevent overfitting.

This must be a value greater than 0 and less than 1.
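To illustrate what this rate controls, here is a minimal inverted-dropout sketch (not this library's implementation): each value is zeroed with probability `dropout`, and the survivors are rescaled so the expected magnitude is unchanged. The function name and injectable `rng` parameter are illustrative.

```typescript
// Minimal inverted-dropout sketch: zero each value with probability
// `rate`, and rescale survivors by 1 / (1 - rate) so the expected
// sum is unchanged. `rng` is injectable for deterministic testing.
function applyDropout(
  values: number[],
  rate: number,
  rng: () => number = Math.random,
): number[] {
  if (rate <= 0 || rate >= 1) {
    throw new RangeError("dropout must be greater than 0 and less than 1");
  }
  return values.map((v) => (rng() < rate ? 0 : v / (1 - rate)));
}
```

With `rate` close to 1 almost every activation is dropped, which is why the value must stay strictly below 1.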

embeddingDimension

embeddingDimension: number

The output dimension used for the embedding layer.

See here for further details.

epochs

epochs: number

The total number of iterations to use when training a model.

The actual number of iterations may be lower than this if the conditions for earlyStopping are met.

lstmUnits

lstmUnits: number

The number of memory units in the Long Short-Term Memory (LSTM) layer.

See lstm for more details.

maxSequenceLength

maxSequenceLength: number

The maximum sequence length.

This value corresponds to the length that each sequence is normalised to.

Larger values are better, but take longer to train and use more memory.
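The normalisation described above can be sketched as follows (a hypothetical helper, not this library's API; the padding token and padding side are assumptions): longer sequences are truncated and shorter ones padded so every sequence has exactly `maxSequenceLength` entries.

```typescript
// Normalise a token sequence to a fixed length: truncate if too long,
// zero-pad if too short. 0 is assumed to be the padding token.
function normaliseSequence(tokens: number[], maxSequenceLength: number): number[] {
  const result = tokens.slice(0, maxSequenceLength);
  while (result.length < maxSequenceLength) {
    result.push(0);
  }
  return result;
}
```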

patience

patience: number

Number of epochs with no improvement after which training will be stopped.

See earlyStopping for more details.
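How `patience` interacts with early stopping can be sketched like this (illustrative names, not this library's API): training stops once `patience` consecutive epochs pass without the monitored loss improving.

```typescript
// Given the per-epoch losses a training run would produce, return how
// many epochs actually run before early stopping triggers. Training
// stops after `patience` consecutive epochs without a new best loss.
function epochsBeforeStopping(losses: number[], patience: number): number {
  let best = Infinity;
  let sinceImprovement = 0;
  let epochsRun = 0;
  for (const loss of losses) {
    epochsRun++;
    if (loss < best) {
      best = loss;
      sinceImprovement = 0;
    } else if (++sinceImprovement >= patience) {
      break; // early stopping triggered
    }
  }
  return epochsRun;
}
```

A larger `patience` tolerates longer plateaus before giving up, at the cost of extra training time.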

trainingRatio

trainingRatio: number

The fraction of messages to use for training; the remainder is used for validation.

This must be a value greater than 0 and less than 1.
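A minimal sketch of such a split (hypothetical helper; the library's actual splitting, e.g. whether it shuffles first, may differ): the first `trainingRatio` fraction of the messages trains the model and the rest validates it.

```typescript
// Split messages into a training set and a validation set according
// to `trainingRatio` (must be strictly between 0 and 1).
function splitMessages<T>(
  messages: T[],
  trainingRatio: number,
): { training: T[]; validation: T[] } {
  if (trainingRatio <= 0 || trainingRatio >= 1) {
    throw new RangeError("trainingRatio must be greater than 0 and less than 1");
  }
  const cut = Math.floor(messages.length * trainingRatio);
  return { training: messages.slice(0, cut), validation: messages.slice(cut) };
}
```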

vocabSize

vocabSize: number

The maximum vocabulary size.

This value corresponds to the maximum number of distinct symbols (usually words) stored.

Larger values are better, but take longer to train and use more memory.
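Capping a vocabulary at `vocabSize` can be sketched as keeping only the most frequent symbols seen in the corpus (illustrative only; the library's actual tokenisation may differ):

```typescript
// Build a vocabulary of at most `vocabSize` symbols by counting word
// frequencies and keeping the most frequent ones.
function buildVocab(words: string[], vocabSize: number): string[] {
  const counts = new Map<string, number>();
  for (const w of words) counts.set(w, (counts.get(w) ?? 0) + 1);
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, vocabSize)
    .map(([word]) => word);
}
```

Symbols that fall outside the vocabulary are typically mapped to a shared out-of-vocabulary token, which is why a larger `vocabSize` preserves more information.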

Generated using TypeDoc