composer.core.evaluator#
A wrapper for a dataloader to include metrics that apply to a specific dataset.
Functions

- ensure_evaluator – Ensure that evaluator is an Evaluator.
- evaluate_periodically – Helper function to generate an evaluation interval callable.

Classes

- Evaluator – A wrapper for a dataloader to include metrics that apply to a specific dataset.
- class composer.core.evaluator.Evaluator(*, label, dataloader, metrics, subset_num_batches=None, eval_interval=None)[source]#
A wrapper for a dataloader to include metrics that apply to a specific dataset.
For example, a CrossEntropyLoss metric for NLP models.

>>> from torchmetrics.classification.accuracy import Accuracy
>>> eval_evaluator = Evaluator(label="myEvaluator", dataloader=eval_dataloader, metrics=Accuracy())
>>> trainer = Trainer(
...     model=model,
...     train_dataloader=train_dataloader,
...     eval_dataloader=eval_evaluator,
...     optimizers=optimizer,
...     max_duration="1ep",
... )
- Parameters

  - label (str) – Name of the Evaluator.
  - dataloader (DataSpec | Iterable | Dict[str, Any]) – Iterable that yields batches, a DataSpec for evaluation, or a Dict of DataSpec kwargs.
  - metrics (Metric | MetricCollection) – torchmetrics.Metric to log. metrics will be deep-copied to ensure that each evaluator updates only its metrics.
  - subset_num_batches (int, optional) – The maximum number of batches to use for each evaluation. Defaults to None, which means that the eval_subset_num_batches parameter from the Trainer will be used. Set to -1 to evaluate the entire dataloader.
  - eval_interval (int | str | Time | (State, Event) -> bool, optional) – An integer, which will be interpreted as epochs, a str (e.g. "1ep" or "10ba"), a Time object, or a callable. Defaults to None, which means that the eval_interval parameter from the Trainer will be used.

    If an integer (in epochs), Time string, or Time instance, the evaluator will be run with this frequency. Time strings or Time instances must have units of TimeUnit.BATCH or TimeUnit.EPOCH. Set to 0 to disable evaluation.

    If a callable, it should take two arguments (State, Event) and return a bool representing whether the evaluator should be invoked. The event will be either Event.BATCH_END or Event.EPOCH_END.

    When specifying eval_interval, the evaluator(s) are also run at Event.FIT_END if it does not evenly divide the training duration.
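For the callable form of eval_interval, a minimal sketch is shown below. The function name and the every-100-batches policy are illustrative, not part of Composer's API; the sketch assumes the state argument exposes a timestamp.batch counter, as Composer's State does.

```python
def eval_every_100_batches(state, event) -> bool:
    # Called by the trainer with Composer's State and Event (either
    # Event.BATCH_END or Event.EPOCH_END). Return True to run the
    # evaluator now: here, at every 100th batch boundary.
    return event.name == "BATCH_END" and int(state.timestamp.batch) % 100 == 0
```

Such a predicate would then be passed as Evaluator(..., eval_interval=eval_every_100_batches).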
- composer.core.evaluator.ensure_evaluator(evaluator, default_metrics)[source]#
Ensure that evaluator is an Evaluator.

- Parameters

  - evaluator (Evaluator | DataSpec | Iterable | Dict[str, Any]) – A dataloader, DataSpec instance, Dict of DataSpec kwargs, or existing Evaluator.
  - default_metrics (Metric | MetricCollection) – The metrics to use if evaluator is not already an Evaluator.

- Returns

  Evaluator – An evaluator.
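The wrap-or-pass-through behavior described above can be sketched as follows. This is a simplified stand-in built around a hypothetical EvaluatorStub class, not Composer's actual implementation (which, per the class docs above, also deep-copies metrics and accepts DataSpec inputs).

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class EvaluatorStub:
    # Minimal stand-in for composer.core.evaluator.Evaluator.
    label: str
    dataloader: Any
    metrics: Any

def ensure_evaluator_sketch(evaluator: Any, default_metrics: Any) -> EvaluatorStub:
    # An existing evaluator passes through unchanged; anything else is
    # treated as a raw dataloader and wrapped with the default metrics.
    if isinstance(evaluator, EvaluatorStub):
        return evaluator
    return EvaluatorStub(label="eval", dataloader=evaluator, metrics=default_metrics)
```

Calling the sketch twice is idempotent: a raw dataloader is wrapped once, and an already-wrapped evaluator is returned as-is.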