Ubuntu 16.04 TensorFlow Serving

Prerequisites

Bazel

Using Bazel custom APT repository (recommended)

  • Install JDK 8
$ sudo apt-get install openjdk-8-jdk
  • Add Bazel distribution URI as a package source (one time setup)
$ echo "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8"@H_403_12@ | sudo tee /etc/apt/sources.list@H_403_12@.d@H_403_12@/bazel.list@H_403_12@
$ curl https://bazel.build@H_403_12@/bazel-release.pub@H_403_12@.gpg@H_403_12@ | sudo apt-key add@H_403_12@ -

If you want to install the testing version of Bazel, replace stable with testing.

  • Install and update Bazel
$ sudo apt-get update && sudo apt-get install bazel

Once installed, you can upgrade to a newer version of Bazel with:

$ sudo apt-get upgrade bazel
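
To confirm the toolchain is in place, you can print the installed Bazel version as a quick sanity check:

$ bazel version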

gRPC Python

$ sudo pip install grpcio
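
A quick way to verify the package is importable (the __version__ attribute is available in grpcio 1.0 and later):

$ python -c "import grpc; print(grpc.__version__)"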

Package dependencies

$ sudo apt-get update &@H_403_12@&@H_403_12@ sudo apt-get install -y \ @H_403_12@ build-essential \ @H_403_12@ curl \ @H_403_12@ libcurl3-dev \ @H_403_12@ git \ @H_403_12@ libfreetype6-dev \ @H_403_12@ libpng12-dev \ @H_403_12@ libzmq3-dev \ @H_403_12@ pkg-config \ @H_403_12@ python-dev \ @H_403_12@ python-numpy \ @H_403_12@ python-pip \ @H_403_12@ software-properties-common \ @H_403_12@ swig \ @H_403_12@ zip \ @H_403_12@ zlib1g-dev@H_403_12@

TensorFlow Serving Python API PIP package

$ pip install tensorflow-serving-api
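
To check that the Serving APIs are importable (these module names are what the package provided at the time of writing and may differ in newer releases):

$ python -c "from tensorflow_serving.apis import predict_pb2, prediction_service_pb2"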

Installing from source

Clone the TensorFlow Serving repository

$ git clone --recurse-submodules https://github.com/tensorflow/serving
$ cd serving

--recurse-submodules is required to fetch TensorFlow, gRPC, and other libraries that TensorFlow Serving depends on.
Note that these instructions will install the latest master branch of TensorFlow Serving. If you want to install a specific branch (such as a release branch), pass -b <branchname> to the git clone command.
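
For example, to clone a release branch instead of master (the branch name r1.4 here is only an illustration; check the repository for the branches that actually exist):

$ git clone -b r1.4 --recurse-submodules https://github.com/tensorflow/serving
$ cd serving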

Install prerequisites

Follow the Prerequisites section above to install all dependencies. To configure TensorFlow, run

$ cd tensorflow
$ ./configure
$ cd ..

Build

$ bazel build -c opt tensorflow_serving/...

Binaries are placed in the bazel-bin directory, and can be run using a command like:

$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server

To test your installation, execute:

$ sudo pip install autograd # handle "ImportError: No module named autograd"
$ bazel test -c opt tensorflow_serving/...

Serving a TensorFlow Model

Train And Export TensorFlow Model

  • step 0: remove any previously exported model
$ rm -rf /tmp/mnist_model
  • step 1: build and run the example exporter
$ bazel build -c opt //tensorflow_serving/example:mnist_saved_model
$ bazel-bin/tensorflow_serving/example/mnist_saved_model /tmp/mnist_model
Training model...

...

Done training!
Exporting trained model to /tmp/mnist_model
Done exporting!

OR

$ python tensorflow_serving/example/mnist_saved_model.py /tmp/mnist_model
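
For reference, the export that mnist_saved_model.py performs follows the standard TF 1.x SavedModelBuilder pattern sketched below. This is a minimal sketch, not the example's exact code: the toy softmax graph is a stand-in, while the 'predict_images' signature and the 'images'/'scores' keys mirror the signature the example is documented to export.

import tensorflow as tf

# Minimal TF 1.x SavedModel export sketch. The toy softmax graph below is a
# stand-in for the real MNIST model; the signature and tensor key names are
# illustrative and follow the example's documented export.
export_dir = '/tmp/mnist_model/1'  # one numbered version sub-directory

x = tf.placeholder(tf.float32, [None, 784], name='images')
w = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
scores = tf.nn.softmax(tf.matmul(x, w) + b, name='scores')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'images': x}, outputs={'scores': scores})
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={'predict_images': signature})
    builder.save()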

list the model files

$ ls /tmp/mnist_model
1
$ ls /tmp/mnist_model/1
saved_model.pb variables

Each version sub-directory contains the following files:

  • saved_model.pb is the serialized tensorflow::SavedModel. It includes one or more graph definitions of the model, as well as metadata of the model such as signatures.
  • variables are files that hold the serialized variables of the graphs.
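
If the tensorflow pip package is installed, the export can also be inspected with the saved_model_cli tool that ships with recent TF 1.x releases; it prints the tag-sets and signatures stored in saved_model.pb:

$ saved_model_cli show --dir /tmp/mnist_model/1 --all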

Load Exported Model With Standard TensorFlow ModelServer

$ bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server # not needed if tensorflow_serving/... was already built above
$ bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/

Test The Server

$ bazel build -c opt //tensorflow_serving/example:mnist_client
$ bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:9000
...
Inference error rate: 10.4%

OR

$ python tensorflow_serving/example/mnist_client.py --num_tests=1000 --server=localhost:9000
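
For reference, the core of what mnist_client.py does is sketched below using the beta gRPC API that the tensorflow-serving-api package exposed at the time of writing. The model, signature, and tensor key names ('mnist', 'predict_images', 'images', 'scores') follow the example's export above and may differ in other setups; the random input is a stand-in for real MNIST test data.

import numpy
import tensorflow as tf
from grpc.beta import implementations
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2

# Connect to the model server started above on port 9000 (beta gRPC API, as
# exposed by the tensorflow-serving-api package of this era).
channel = implementations.insecure_channel('localhost', 9000)
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

# Build a PredictRequest for the 'mnist' model. A random vector stands in for
# a real 28x28 MNIST test image here.
image = numpy.random.rand(784).astype(numpy.float32)
request = predict_pb2.PredictRequest()
request.model_spec.name = 'mnist'
request.model_spec.signature_name = 'predict_images'
request.inputs['images'].CopyFrom(
    tf.contrib.util.make_tensor_proto(image, shape=[1, 784]))

# Synchronous call with a 10-second deadline; 'scores' holds the class
# probabilities returned by the model.
result = stub.Predict(request, 10.0)
print(result.outputs['scores'])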
