TensorFlow Serving Setup

Install TensorFlow Serving from Source

  • clone the repo
$ mkdir src

$ cd src

$ git clone --recurse-submodules https://github.com/tensorflow/serving

$ cd serving
  • configure tensorflow
$ cd tensorflow

$ ./configure

$ cd ..

Build TensorFlow Serving Model Servers and Example Models

  • build the tensorflow serving model servers
# build server (tensorflow serving server)

$ bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server

# build server with a custom --output_user_root (local bazel cache)

$ bazel --output_user_root=./tf_bazel_cache/ build -c opt //tensorflow_serving/model_servers:tensorflow_model_server
  • build the tensorflow serving example models
# build models (tensorflow serving example models)

# point PYTHON_LIB_PATH at your Python site-packages

$ export PYTHON_LIB_PATH=...

# e.g.

$ export PYTHON_LIB_PATH=/home/yg/anaconda2/lib/python2.7/site-packages

# general form

$ bazel --output_user_root=./tf_bazel_cache/ build -c opt //tensorflow_serving/example:[...]

$ bazel --output_user_root=./tf_bazel_cache/ build -c opt //tensorflow_serving/example:mnist_saved_model

$ bazel --output_user_root=./tf_bazel_cache/ build -c opt //tensorflow_serving/example:inception_saved_model
  • build the tensorflow serving example model clients
$ bazel --output_user_root=./tf_bazel_cache/ build -c opt //tensorflow_serving/example:mnist_client

$ bazel --output_user_root=./tf_bazel_cache/ build -c opt //tensorflow_serving/example:inception_client

Inspect TensorFlow Serving Example Models

mnist_saved_model

mnist_saved_model includes a training phase, whereas the inception model, like the customized models we will implement later, recovers its network from a pre-trained, pre-saved checkpoint.
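
For orientation, the export half of mnist_saved_model.py boils down to TensorFlow's SavedModelBuilder. A condensed sketch of that pattern follows (the actual script builds its signature more verbosely; the tensor and signature names here are illustrative):

import os
import tensorflow as tf

def export_for_serving(sess, x, y, export_base, version):
    # servables are versioned directories: export_base/<version>/
    export_path = os.path.join(export_base, str(version))
    builder = tf.saved_model.builder.SavedModelBuilder(export_path)

    # wrap input/output tensors into a named serving signature
    # (names 'images'/'scores'/'predict_images' are illustrative)
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'images': x}, outputs={'scores': y})

    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={'predict_images': signature})
    builder.save()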

  • setup saved model as servable
# run mnist_saved_model to export a frozen graph and weights for serving

# check ./tensorflow_serving/example/mnist_saved_model.py for options

$ bazel-bin/tensorflow_serving/example/mnist_saved_model \
    [--training_iteration=x] \
    [--model_version=y] \
    export_dir

$ mkdir -p tf_servables/mnist

$ bazel-bin/tensorflow_serving/example/mnist_saved_model \
    --training_iteration=1000 \
    --model_version=1 \
    tf_servables/mnist
  • serve
# use tensorflow_model_server to serve the model servable

$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server [...]

# note --model_base_path requires an absolute path, so use environment variable $PWD to provide an absolute path prefix

$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
    --model_name=mnist \
    --model_base_path=$PWD/tf_servables/mnist/ \
    --port=9000
  • client
# open another terminal (ctrl + alt + t)

$ bazel-bin/tensorflow_serving/example/mnist_client \
    --server=localhost:9000 \
    --num_tests=1000
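
Each request mnist_client sends is a small gRPC call. A minimal sketch of a single PredictRequest against the server above, using the beta gRPC API these examples were written against (the zero image is only a placeholder input):

import numpy
import tensorflow as tf
from grpc.beta import implementations
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2

channel = implementations.insecure_channel('localhost', 9000)
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = 'mnist'                    # must match --model_name
request.model_spec.signature_name = 'predict_images'
image = numpy.zeros(784, dtype=numpy.float32)        # placeholder 28x28 image
request.inputs['images'].CopyFrom(
    tf.contrib.util.make_tensor_proto(image, shape=[1, image.size]))

result = stub.Predict(request, 10.0)                 # 10 second timeout
print(result.outputs['scores'])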
  • tensorflow_model_server auto-discovers new versions
# run mnist_saved_model again to export a new model version

# keep the tensorflow model server serving the mnist saved model 

$ bazel-bin/tensorflow_serving/example/mnist_saved_model \
    --training_iteration=2000 \
    --model_version=2 \
    tf_servables/mnist

# an updated frozen graph and weights for serving are available

$ ls tf_servables/mnist
> 1 2

# back in the terminal where tensorflow_model_server is serving mnist
> I tensorflow_serving/core/loader_harness.cc:86] Successfully loaded servable version {name: mnist version: 2}
> I tensorflow_serving/core/loader_harness.cc:137] Quiescing servable version {name: mnist version: 1}
> I tensorflow_serving/core/loader_harness.cc:144] Done quiescing servable version {name: mnist version: 1}
> I tensorflow_serving/core/loader_harness.cc:119] Unloading servable version {name: mnist version: 1}
> I ./tensorflow_serving/core/simple_loader.h:294] Calling MallocExtension_ReleaseToSystem() after servable unload with 60736
> I tensorflow_serving/core/loader_harness.cc:127] Done unloading servable version {name: mnist version: 1}

$ bazel-bin/tensorflow_serving/example/mnist_client \
    --server=localhost:9000 \
    --num_tests=1000
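
By default a request is routed to the newest loaded version. To compare versions side by side, the request's ModelSpec can pin a specific one; continuing the client sketch above (the version field is defined in tensorflow_serving/apis/model.proto):

# pin the request to servable version 1 instead of the newest
request.model_spec.name = 'mnist'
request.model_spec.version.value = 1
result = stub.Predict(request, 10.0)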

inception-v3

inception_saved_model requires a pre-trained, pre-saved checkpoint of the inception-v3 model.

  • download the Google-published inception-v3 checkpoint file
# an investigation follows this path

- ./tensorflow_serving/example/BUILD
...
py_binary(
    name = "inception_saved_model",
    srcs = [
        "inception_saved_model.py",
    ],
    srcs_version = "PY2AND3",
    deps = [
        "@inception_model//inception",
        "@org_tensorflow//tensorflow:tensorflow_py",
    ],
)
...

- ./tensorflow_serving/workspace.bzl
...
def tf_serving_workspace():
  native.new_local_repository(
      name = "inception_model",
      path = "tf_models/research/inception",
      build_file = "tf_models/research/inception/inception/BUILD",
  )
  ...
...

# so, this inception-v3 is defined at ./tf_models/research/inception


# to be clear, there are at least two places where inception-v3 is defined:
# one is ./tf_models/research/inception, which uses the slim module;
# the other is ./tf_models/research/slim, which includes its own inception-v3 model
# the two definitions differ slightly, so the model definition must match the downloaded checkpoint

# see https://github.com/tensorflow/models/tree/master/research/inception
# the model in ./tf_models/research/inception matches the checkpoint http://download.tensorflow.org/models/image/imagenet/inception-v3-2016-03-01.tar.gz

# see https://github.com/tensorflow/models/tree/master/research/slim
# the model in ./tf_models/research/slim matches the checkpoint http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz

$ mkdir -p tf_checkpoints/inception

$ wget -O tf_checkpoints/inception/inception-v3-2016-03-01.tar.gz http://download.tensorflow.org/models/image/imagenet/inception-v3-2016-03-01.tar.gz

$ tar -xvzf tf_checkpoints/inception/inception-v3-2016-03-01.tar.gz -C tf_checkpoints/inception

$ ls tf_checkpoints/inception/inception-v3
> checkpoint  model.ckpt-157585  README.txt

$ rm tf_checkpoints/inception/inception-v3-2016-03-01.tar.gz

# check ./tensorflow_serving/example/inception_saved_model.py for options

# ckpt = tf.train.get_checkpoint_state(FLAGS.checkpoint_dir) is called in
# ./tensorflow_serving/example/inception_saved_model.py to get the checkpoint
# state and model_checkpoint_path; this requires a file named checkpoint
# to be present in checkpoint_dir

# create a checkpoint file if not present

$ echo 'model_checkpoint_path: "model.ckpt-157585"' > ./tf_checkpoints/inception/inception-v3/checkpoint
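
To check that the file written above parses the way inception_saved_model.py expects, a quick sanity check in Python (sketch):

import tensorflow as tf

ckpt = tf.train.get_checkpoint_state('tf_checkpoints/inception/inception-v3')
# prints the model.ckpt-157585 path recorded in the checkpoint file
print(ckpt.model_checkpoint_path)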
  • inspect checkpoint file
# https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/inspect_checkpoint.py

$ cd tensorflow

$ bazel --output_user_root=../tf_bazel_cache build -c opt //tensorflow/python/tools:inspect_checkpoint

$ bazel-bin/tensorflow/python/tools/inspect_checkpoint --file_name=../tf_checkpoints/inception/inception-v3/model.ckpt-157585

$ cd ..
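
inspect_checkpoint is a thin wrapper over TensorFlow's checkpoint reader, so the same variable listing can also be produced directly from Python (sketch):

import tensorflow as tf

reader = tf.train.NewCheckpointReader(
    'tf_checkpoints/inception/inception-v3/model.ckpt-157585')
# variable name -> shape map, the same listing inspect_checkpoint prints
for name, shape in sorted(reader.get_variable_to_shape_map().items()):
    print('%s %s' % (name, shape))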
  • setup saved model as servable
# check ./tensorflow_serving/example/inception_saved_model.py for options

$ mkdir -p tf_servables/inception/inception-v3

$ bazel-bin/tensorflow_serving/example/inception_saved_model \
    --checkpoint_dir=tf_checkpoints/inception/inception-v3 \
    --output_dir=tf_servables/inception/inception-v3 \
    --model_version=1 \
    --image_size=299
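
Before serving it, the exported servable can be reloaded to confirm it is valid and to list its signatures (sketch, TF 1.x loader API):

import tensorflow as tf

with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING],
        'tf_servables/inception/inception-v3/1')
    # signature names the model server will expose, e.g. predict_images
    print(list(meta_graph.signature_def.keys()))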
  • serve
# use tensorflow_model_server to serve the model servable

$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server [...]

# --model_name=inception must match the model name the client requests (see ./tensorflow_serving/example/inception_client.py)
# --model_base_path must be an absolute path to the servable exported above
# --port=9090 since mnist is already serving on 9000

$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
    --model_name=inception \
    --model_base_path=$PWD/tf_servables/inception/inception-v3 \
    --port=9090
  • client
# check ./tensorflow_serving/example/inception_client.py for options

$ bazel-bin/tensorflow_serving/example/inception_client \
    --server=localhost:9090 \
    --image=/path/to/my_favorite_image.jpg
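
As with mnist_client, the inception client essentially ships the raw JPEG bytes in a PredictRequest. Its core looks roughly like this (sketch, reusing the imports and stub setup from the mnist client sketch, with the port changed to 9090):

with open('/path/to/my_favorite_image.jpg', 'rb') as f:
    data = f.read()

request = predict_pb2.PredictRequest()
request.model_spec.name = 'inception'
request.model_spec.signature_name = 'predict_images'
# the servable's input is a batch of encoded image strings
request.inputs['images'].CopyFrom(
    tf.contrib.util.make_tensor_proto(data, shape=[1]))
result = stub.Predict(request, 10.0)
print(result)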

Blind Build and Test All (NODO)

$ bazel build tensorflow_serving/...

$ bazel test tensorflow_serving/...
