tao classification_tf1 train [-h] -e <spec file>
                             -k <encoding key>
                             -r <result directory>
                             [--gpus <num GPUs>]
                             [--num_processes <number_of_processes>]
                             [--gpu_index <gpu_index>]
                             [--use_amp]
                             [--log_file <log_file_path>]

Required Arguments

-r, --results_dir: Path to a fo...
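As a concrete illustration, a full training invocation might look like the sketch below. The spec file path, encoding key, and results directory are assumed placeholders, not values from this document; only the flags listed in the usage above are used:

```shell
tao classification_tf1 train -e /workspace/specs/classification_spec.cfg \
                             -k <your_encoding_key> \
                             -r /workspace/results/classification \
                             --gpus 2 \
                             --log_file /workspace/results/classification/train.log
```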
Sample Command for a Classification TF1/TF2/PyT Model

To generate an .onnx file for Classification TF1/TF2/PyT, refer to the Classification documentation. You can also refer to the Classification TAO-Deploy documentation for instructions on generating an INT8 calibration file.

trtexec...
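For reference, a trtexec invocation that builds an INT8 engine from the exported ONNX model might look like the following sketch. The paths are assumed placeholders; --onnx, --saveEngine, --int8, and --calib are standard trtexec options, with --calib pointing at the calibration cache generated via TAO-Deploy:

```shell
trtexec --onnx=/workspace/models/classification.onnx \
        --saveEngine=/workspace/models/classification.engine \
        --int8 \
        --calib=/workspace/models/cal.bin
```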
int8-calib-file=<Path to optional INT8 calibration cache>
labelfile-path=<Path to classification_labels.txt>
onnx-file=<Path to Classification onnx model>
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=0
# process-mode: 2 - inferences on crops from primary detector, 1 - inferences ...
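For context, properties like these typically sit inside the [property] group of a DeepStream nvinfer configuration file. A minimal sketch for an INT8 secondary classifier, with assumed paths and example values (not taken from this document), might look like:

```ini
[property]
gpu-id=0
onnx-file=/opt/models/classification.onnx
labelfile-path=/opt/models/classification_labels.txt
int8-calib-file=/opt/models/cal.bin
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=1
# process-mode: 2 - inferences on crops from primary detector
process-mode=2
# network-type: 1 - classifier
network-type=1
output-blob-names=predictions/Softmax
classifier-threshold=0.2
gie-unique-id=2
```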
For classification, set this argument to predictions/Softmax.

Optional Arguments

-e: The path to save the engine to. The default value is ./saved.engine.
-t: The engine data type; this argument generates a calibration cache if in INT8 mode. The default value is fp32. The options are...
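Assuming these options belong to the tao-converter tool (consistent with the -e and -t defaults described above, and with -o as the output-node option), a full conversion command might look like this sketch; the model path and encoding key are placeholders:

```shell
tao-converter <path_to_model.etlt> \
              -k <your_encoding_key> \
              -o predictions/Softmax \
              -e /workspace/engines/classification.engine \
              -t fp16
```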