variable scope as it was in the restored checkpoints. Solution: we customized the official TensorFlow
base script resnet_model.py and placed the variables in the same variable scope name
"resnet_model" as in the previously downloaded official checkpoints. To do so, we added the
line with tf.variable_scope("resnet_model"): in the model function. For more information,
see What's the difference of name scope and a variable scope in TensorFlow [8].
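A minimal sketch of that change, assuming a TensorFlow 1.x Estimator model function (the single dense layer below merely stands in for the full ResNet-50 graph built by resnet_model.py):

import tensorflow as tf  # TensorFlow 1.x

def model_fn(features, labels, mode, params):
    # Create every variable under the "resnet_model" scope so that variable
    # names match those stored in the official pre-trained checkpoint.
    with tf.variable_scope("resnet_model"):
        # The full ResNet-50 graph from resnet_model.py is built here;
        # a single layer stands in for it in this sketch.
        logits = tf.layers.dense(tf.layers.flatten(features["input"]), units=14)
    # ... PREDICT / TRAIN / EVAL branches follow ...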
TensorFlow Serving for Inference. When building and training the custom model, save the
trained model for inference with TensorFlow Serving. To do so, export the trained model as a
SavedModel with the Estimator function export_savedmodel. Also handle the PREDICT mode in the
Estimator's model function to enable inference. For predict mode, the export_outputs argument
must be provided to the EstimatorSpec; it defines the signatures used by TensorFlow Serving
when serving a SavedModel. Specify the input and output node names, which will be needed
later by the TensorRT™ library. See Serving Pre-Modeled and Custom TensorFlow Estimator
with Tensorflow Serving [10].
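A minimal sketch of this export path, assuming the TensorFlow 1.x Estimator API; the tensor names (input_tensor, sigmoid_tensor), the 14-label sigmoid head, and the directories are illustrative placeholders, not taken from the paper's scripts:

import tensorflow as tf  # TensorFlow 1.x Estimator API

def model_fn(features, labels, mode, params):
    with tf.variable_scope("resnet_model"):
        # Stand-in for the ResNet-50 graph; CheXNet-style 14-label head.
        logits = tf.layers.dense(tf.layers.flatten(features["input"]), units=14)
    predictions = {"probabilities": tf.nn.sigmoid(logits, name="sigmoid_tensor")}
    if mode == tf.estimator.ModeKeys.PREDICT:
        # export_outputs defines the serving signature that TensorFlow Serving
        # exposes for the exported SavedModel.
        return tf.estimator.EstimatorSpec(
            mode=mode,
            predictions=predictions,
            export_outputs={
                "predict": tf.estimator.export.PredictOutput(predictions)})
    # ... TRAIN and EVAL branches omitted in this sketch ...

def serving_input_receiver_fn():
    # The placeholder name becomes the input node name needed later by TensorRT.
    image = tf.placeholder(tf.float32, [None, 224, 224, 3], name="input_tensor")
    return tf.estimator.export.ServingInputReceiver(
        features={"input": image}, receiver_tensors={"input": image})

# After training has produced a checkpoint in model_dir, export the SavedModel:
estimator = tf.estimator.Estimator(model_fn=model_fn, model_dir="/tmp/chexnet_model")
estimator.export_savedmodel("/tmp/chexnet_export", serving_input_receiver_fn)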
ValueError: Negative dimension size caused by subtracting 8 from 7 for
'import/resnet_model/average_pooling2d/AvgPool' (op: 'AvgPool') with input shapes:
[128,7,7,2048]. The error occurs because an 8x8 pooling window cannot fit the 7x7 feature map
when padding='VALID'. Problem solved by updating the base model script resnet_model.py: in the
model function, change padding='VALID' to padding='SAME':
inputs = tf.layers.average_pooling2d(inputs=inputs, pool_size=pool_size, strides=1, padding='SAME', data_format=data_format)
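A quick standalone check (not part of the paper's scripts) showing why padding='SAME' resolves the error for the [128, 7, 7, 2048] activation reported above:

import tensorflow as tf  # TensorFlow 1.x

# Shape reported by the error: batch 128, 7x7 spatial map, 2048 channels.
inputs = tf.placeholder(tf.float32, [128, 7, 7, 2048])

# padding='VALID' would need (7 - 8) + 1 output positions -> negative dimension.
# padding='SAME' zero-pads the 7x7 map so the 8x8 window fits at every position.
pooled = tf.layers.average_pooling2d(
    inputs=inputs, pool_size=8, strides=1,
    padding='SAME', data_format='channels_last')

print(pooled.shape)  # (128, 7, 7, 2048)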