Building TensorFlow Serving
The instructions provided here specify the steps to build TensorFlow Serving version 2.16.1 on Linux on IBM Z for the following distributions:
- Ubuntu (20.04, 22.04, 24.04)
- When following the steps below, please use a standard permission user unless otherwise specified.
- A directory /<source_root>/ will be referred to in these instructions; this is a temporary writable directory that can be placed anywhere you like.
TensorFlow Serving can be built manually by following STEP 1.2. Alternatively, use the following commands to build TensorFlow Serving with the build script. Please ensure wget is installed.
wget -q https://raw.githubusercontent.com/linux-on-ibm-z/scripts/master/TensorflowServing/2.16.1/build_tensorflow_serving.sh
bash build_tensorflow_serving.sh [Provide -t option for executing build with tests]
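For example, to execute the build along with the tests:
bash build_tensorflow_serving.sh -t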
If the build completes successfully, go to STEP 2. In case of error, check the logs for more details or go to STEP 1.2 to follow the manual build steps.
- Set the environment variable SOURCE_ROOT:
export SOURCE_ROOT=/<source_root>/
export PATCH_URL="https://raw.githubusercontent.com/linux-on-ibm-z/scripts/master/TensorflowServing/2.16.1/patch"
cd $SOURCE_ROOT
wget -q https://raw.githubusercontent.com/linux-on-ibm-z/scripts/master/Tensorflow/2.16.1/build_tensorflow.sh
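# The sed commands below insert extra steps into build_tensorflow.sh that download tf.patch and apply it to the TensorFlow source, and download upb_str_fix.patch and copy it into third_party/grpc/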
sed -i "161i curl -o tf.patch $PATCH_URL/tf.patch" build_tensorflow.sh
sed -i '162i patch -p1 < tf.patch' build_tensorflow.sh
sed -i "163i curl -o upb_str_fix.patch $PATCH_URL/upb_str_fix.patch" build_tensorflow.sh
sed -i "164i cp upb_str_fix.patch third_party/grpc/" build_tensorflow.sh
bash build_tensorflow.sh -y
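Once the script finishes, a quick sanity check of the TensorFlow build can be run as below, assuming the script installed the resulting TensorFlow wheel into the active python3 environment (this may not hold for every configuration):
python3 -c "import tensorflow as tf; print(tf.__version__)"  # expected to print 2.16.1 if the wheel was installed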
-
Install dependencies
sudo apt-get update
sudo apt-get install -y wget automake libtool
-
Download source code
cd $SOURCE_ROOT
git clone https://github.com/tensorflow/serving
cd serving
git checkout 2.16.1
-
Apply patches
curl -o tfs.patch $PATCH_URL/tfs.patch
curl -o rules.patch $PATCH_URL/rules.patch
sed -i "s?SOURCE_ROOT?$SOURCE_ROOT?" tfs.patch
git apply tfs.patch
cp rules.patch third_party/
touch third_party/BUILD
curl -o tfs_libevent.patch $PATCH_URL/tfs_libevent.patch  # Only for Ubuntu 24.04
patch -p1 < tfs_libevent.patch  # Only for Ubuntu 24.04
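Optionally, the patches applied above can be verified with a dry run first; these checks make no changes to the tree (run them after the sed substitution that fills in SOURCE_ROOT):
git apply --check tfs.patch               # confirms tfs.patch applies cleanly without modifying files
patch -p1 --dry-run < tfs_libevent.patch  # Only for Ubuntu 24.04; simulates the patch without applying it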
-
Build TensorFlow Serving
TensorFlow Serving can be built as follows:
cd $SOURCE_ROOT/serving
bazel build tensorflow_serving/...
Note: The TensorFlow Serving build is a resource-intensive operation. If the build continues to fail, try increasing the swap space or reducing the number of concurrent jobs by specifying --jobs=n in the build command above, where n is the number of concurrent jobs.
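A minimal sketch with a reduced job count (the value 4 is illustrative; choose a value that suits the available memory):
cd $SOURCE_ROOT/serving
bazel build --jobs=4 tensorflow_serving/...  # limit Bazel to 4 concurrent jobs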
Copy the binary so that it can be accessed from anywhere, and make sure /usr/local/bin is in $PATH. Run the following commands:
sudo cp bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server /usr/local/bin/
tensorflow_model_server --version
The following output should be seen in the console:
TensorFlow ModelServer: 2.16.1+dev.sha.0d726954
TensorFlow Library: 2.16.1
Install the TensorFlow Serving Python API package:
sudo pip3 install tensorflow-serving-api==2.16.1
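As an optional sanity check (assuming pip installed the package into the default python3 environment), confirm that the API package can be imported:
python3 -c "from tensorflow_serving.apis import predict_pb2; print('tensorflow-serving-api OK')"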
-
Run TensorFlow Serving from the command line
tensorflow_model_server --rest_api_port=8501 --model_name=<model_name> --model_base_path=<model_path> &
- For example:
export TESTDATA="$SOURCE_ROOT/serving/tensorflow_serving/servables/tensorflow/testdata"
# Start the TensorFlow Serving model server and open the REST API port
tensorflow_model_server --rest_api_port=8501 --model_name=half_plus_two --model_base_path=$TESTDATA/saved_model_half_plus_two_cpu &
# Query the model using the predict API
curl -d '{"instances": [1.0, 2.0, 5.0]}' -X POST http://localhost:8501/v1/models/half_plus_two:predict
Output should look like:
{ "predictions": [2.5, 3.0, 4.5 ] }
-
Run complete testsuite
cd $SOURCE_ROOT/serving
bazel test tensorflow_serving/...
All test cases should pass.
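To re-run only a subset of the tests, a single target can be specified instead; the target below is just an example, and --test_output=errors limits the console output to failing tests:
cd $SOURCE_ROOT/serving
bazel test --test_output=errors tensorflow_serving/model_servers:tensorflow_model_server_test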
The information provided in this article is accurate at the time of writing, but ongoing development in the open-source projects involved may make the information incorrect or obsolete. Please open an issue or contact us on the IBM Z Community if you have any questions or feedback.