trainer

nfflr.train provides an ignite-based trainer utility (and helper functions) for training force fields and general atomistic models. The trainer supports multi-GPU, data-parallel distributed training via ignite.distributed.
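A typical run constructs a TrainingConfig and passes it to train along with a model and dataset. The sketch below relies only on the names documented on this page (TrainingConfig, train, and the experiment_dir, output_dir, and local_rank parameters); the model and dataset placeholders and the directory values are illustrative, not prescribed by NFFLr:

    from nfflr.train import TrainingConfig, train

    # Placeholders: supply any NFFLr-compatible atomistic model
    # (a torch.nn.Module) and dataset; their construction is not
    # covered on this page.
    model = ...
    dataset = ...

    # experiment_dir and output_dir are optional TrainingConfig fields
    # shown in the signature below; remaining fields keep their defaults.
    config = TrainingConfig(
        experiment_dir="experiments/demo",
        output_dir="experiments/demo/output",
    )

    # Single-process run; local_rank is the optional argument used when
    # launching data-parallel training under ignite.distributed.
    train(model, dataset, config)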

Configuration

TrainingConfig([experiment_dir, output_dir, ...])
    NFFLr configuration for the optimization process.

ignite-based trainer

train(model, dataset, config[, local_rank])
    NFFLr trainer entry point.

lr(model, dataset, config[, local_rank])
    NFFLr learning rate finder entry point.
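Since lr mirrors train's call signature, a learning-rate sweep can reuse the same objects before committing to a full run. A minimal sketch, assuming the model, dataset, and config from the example above:

    from nfflr.train import lr

    # model, dataset, and config as constructed in the earlier sketch;
    # lr runs the learning rate finder instead of full training.
    lr(model, dataset, config)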

training setup

get_dataflow(dataset, config)
    Configure training and validation datasets.

setup_model_and_optimizer(model, dataset, config)
    Initialize model, criterion, and optimizer.

setup_trainer(model, criterion, optimizer, ...)
    Create ignite trainer and attach common event handlers.

setup_checkpointing(state, config)
    Configure model and trainer checkpointing.

setup_evaluators(model, prepare_batch, ...)
    Configure train and validation evaluators.
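For customized workflows, these setup helpers can be composed directly. The sketch below is speculative where the summaries above are silent: the return values of get_dataflow and setup_model_and_optimizer are guesses for illustration, and setup_trainer, setup_checkpointing, and setup_evaluators appear only in comments because their remaining arguments (and the contents of state) are not spelled out in the one-line summaries:

    from nfflr.train import TrainingConfig, get_dataflow, setup_model_and_optimizer

    model, dataset = ..., ...  # placeholders, as in the earlier sketches
    config = TrainingConfig()

    # Wires up training and validation data; the exact return type is not
    # documented in the one-line summary above.
    dataflow = get_dataflow(dataset, config)

    # Per its summary this initializes the model, criterion, and optimizer;
    # the three-way unpacking is an assumption for illustration.
    model, criterion, optimizer = setup_model_and_optimizer(model, dataset, config)

    # setup_trainer, setup_checkpointing, and setup_evaluators would follow
    # (building the ignite trainer, checkpoint handlers, and evaluators), but
    # their elided arguments and the contents of `state` are not documented
    # in the summaries above, so they are not expanded here.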