```python
import tensorflow as tf

# model = tf.keras.applications.mobilenet.MobileNet(weights='imagenet')
model = tf.keras.applications.mobilenet.MobileNet(weights=None)
```

With reference to https://leimao.github.io/blog/Save-Load-Inference-From-TF2-Frozen-Graph/ and https://github.com/leimao/Frozen_Graph_TensorFlow/blob/...
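The referenced blog post covers saving and loading a frozen graph in TF2. A minimal sketch of that freezing step for the MobileNet model above, assuming TF 2.x and following the blog's approach (tracing a concrete function and constant-folding the variables):

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

# Build the model as in the snippet (weights=None avoids downloading ImageNet weights).
model = tf.keras.applications.mobilenet.MobileNet(weights=None)

# Wrap the model call in a tf.function and trace it with a concrete input signature.
full_model = tf.function(lambda x: model(x))
concrete_func = full_model.get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype)
)

# Fold the captured variables into constants, yielding a frozen GraphDef.
frozen_func = convert_variables_to_constants_v2(concrete_func)
graph_def = frozen_func.graph.as_graph_def()

# Serialize the frozen graph to disk; the filename is an arbitrary choice here.
tf.io.write_graph(graph_def, logdir=".", name="mobilenet_frozen.pb", as_text=False)
```

The resulting `.pb` contains only constants and ops, so it can be fed to downstream converters that expect a frozen graph.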
```python
import transformers
help(transformers)
```

```
Help on package transformers:

NAME
    transformers

PACKAGE CONTENTS
    activations
    activations_tf
    benchmark (package)
    commands (package)
    configuration_utils
    convert_graph_to_onnx
    convert_pytorch_checkpoint_to_tf2
    convert_slow_tokenizer
    convert_slow_tokenizers_checkpoints_to_...
```
Description

My current workflow is pb -> onnx -> tensorrt. Thanks to @jignparm (see onnx/tensorflow-onnx#994), I finally converted the original pb to an ONNX model. But I received an import error on the Pad operator when converting the ONNX model to a TensorRT engine...
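For context, the pb -> onnx -> tensorrt workflow described above typically runs through two command-line tools. A sketch, assuming a frozen graph and placeholder tensor/file names (`frozen_graph.pb`, `input:0`, `output:0` are illustrative, not from the issue):

```shell
# pb -> onnx: convert the frozen graph with tf2onnx.
# Substitute your graph's actual input/output tensor names.
python -m tf2onnx.convert \
    --input frozen_graph.pb \
    --inputs input:0 \
    --outputs output:0 \
    --opset 11 \
    --output model.onnx

# onnx -> tensorrt: build an engine with trtexec.
trtexec --onnx=model.onnx --saveEngine=model.engine
```

Pad-related failures in the second step usually mean the ONNX Pad node uses a form the installed TensorRT parser does not support; re-exporting with a different `--opset` is a common first thing to try.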