Building TensorFlow Serving

The instructions provided here specify the steps to build TensorFlow Serving version 2.5.1 on Linux on IBM Z for the following distributions:

  • Ubuntu (18.04, 20.04, 21.04)

General Notes:

  • When following the steps below, please use a user with standard permissions unless otherwise specified.
  • These instructions refer to a directory /<source_root>/; this is a temporary writable directory that you may place anywhere you like.

Step 1: Build and Install TensorFlow Serving

1.1) Build using script

Alternatively, TensorFlow Serving can be built manually by following STEP 1.2.

Use the following commands to build TensorFlow Serving using the build script. Please ensure wget is installed.

wget -q https://raw.githubusercontent.com/linux-on-ibm-z/scripts/master/TensorflowServing/2.5.1/build_tensorflow_serving.sh

bash build_tensorflow_serving.sh    [Provide the -t option to run the build with tests]

If the build completes successfully, go to STEP 2. In case of error, check the logs for more details or follow the manual build steps in STEP 1.2.
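For example, a scripted run with tests enabled can be captured to a log for troubleshooting; the log file name below is only an illustration and is not something the script itself produces:

# Run the scripted build with tests enabled (-t) and keep a log of the output
bash build_tensorflow_serving.sh -t 2>&1 | tee build_tensorflow_serving.log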

1.2) Install the dependencies

export SOURCE_ROOT=/<source_root>/
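For example, $SOURCE_ROOT can point at any writable scratch directory; the path below is purely illustrative:

# Illustrative choice of build root (any writable location works)
export SOURCE_ROOT=$HOME/tfserving_build
mkdir -p $SOURCE_ROOT
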
  • Ubuntu 18.04
 sudo apt-get update
 sudo apt-get install sudo wget git unzip zip python3-dev python3-pip openjdk-11-jdk pkg-config libhdf5-dev libssl-dev libblas-dev liblapack-dev gfortran -y
 sudo ldconfig
 sudo pip3 install --upgrade pip
 sudo pip3 install --no-cache-dir numpy==1.19.5 wheel scipy portpicker protobuf==3.13.0
 sudo pip3 install keras_preprocessing --no-deps
  • Ubuntu 20.04
 sudo apt-get update
 sudo apt-get install sudo wget git unzip zip python3-dev python3-pip openjdk-11-jdk pkg-config libhdf5-dev libssl-dev libblas-dev liblapack-dev gfortran -y
 sudo ldconfig
 sudo pip3 install --upgrade pip
 sudo pip3 install --no-cache-dir numpy==1.19.5 wheel scipy==1.6.3 portpicker protobuf==3.13.0
 sudo pip3 install keras_preprocessing --no-deps
  • Ubuntu 21.04
 sudo apt-get update
 sudo apt-get install sudo wget git unzip zip python3-dev python3-pip openjdk-11-jdk pkg-config libhdf5-dev libssl-dev libblas-dev liblapack-dev gfortran -y
 sudo ldconfig
 sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-7 60 --slave /usr/bin/g++ g++ /usr/bin/g++-7
 sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-10 40 --slave /usr/bin/g++ g++ /usr/bin/g++-10
 sudo update-alternatives --auto gcc
 sudo pip3 install --upgrade pip
 sudo pip3 install --no-cache-dir numpy==1.19.5 wheel scipy==1.6.3 portpicker protobuf==3.13.0
 sudo pip3 install keras_preprocessing --no-deps
  • Ensure /usr/bin/python points to Python3 to build TensorFlow in a Python3 environment
  sudo update-alternatives --install /usr/bin/python python /usr/bin/python3 40
  • Install grpcio
 export GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=True
 sudo -E pip3 install grpcio
  • Build Bazel v3.7.2 -- Instructions for building Bazel can be found here.
    Note: The Bazel community does not yet officially support Ubuntu 20.04 or 21.04, but you can still follow the build instructions above on those distributions. If you intend to use the Bazel build script on Ubuntu 20.04 or 21.04, edit line 211 to change the case pattern "ubuntu-18.04" to "ubuntu-18.04" | "ubuntu-20.04" | "ubuntu-21.04"; a minimal sketch of this edit follows this list.

  • Build TensorFlow v2.5.0 -- Instructions for building TensorFlow can be found here.
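A minimal sketch of the Bazel build-script edit described in the note above follows. The script file name (build_bazel.sh) and the exact case pattern on line 211 are assumptions here, so adjust them to match the actual script:

    # Hypothetical one-liner extending the distro check in the Bazel build script
    #   before: "ubuntu-18.04")
    #   after:  "ubuntu-18.04" | "ubuntu-20.04" | "ubuntu-21.04")
    sed -i 's/"ubuntu-18.04")/"ubuntu-18.04" | "ubuntu-20.04" | "ubuntu-21.04")/' build_bazel.sh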

1.3) Install BoringSSL

  • Download source code and apply patch for s390x support

    cd $SOURCE_ROOT
    wget https://github.com/google/boringssl/archive/80ca9f9f6ece29ab132cce4cf807a9465a18cfac.tar.gz
    tar -zxvf 80ca9f9f6ece29ab132cce4cf807a9465a18cfac.tar.gz
    mv boringssl-80ca9f9f6ece29ab132cce4cf807a9465a18cfac/ boringssl/
    cd boringssl/
    sed -i '/set(ARCH "ppc64le")/a \elseif (${CMAKE_SYSTEM_PROCESSOR} STREQUAL "s390x")\n\ set(ARCH "s390x")' src/CMakeLists.txt
    sed -i '/OPENSSL_PNACL/a \#elif defined(__s390x__)\n\#define OPENSSL_64_BIT' src/include/openssl/base.h
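The two sed commands above add an s390x branch to the architecture detection in src/CMakeLists.txt and define OPENSSL_64_BIT for s390x in src/include/openssl/base.h. An optional check that the edits landed (the strings come directly from the sed expressions above):

    # Confirm the s390x patch applied to both files
    grep -n 's390x' src/CMakeLists.txt
    grep -n 'OPENSSL_64_BIT' src/include/openssl/base.h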

1.4) Build TensorFlow Serving

  • Download source code

    cd $SOURCE_ROOT
    git clone https://github.com/tensorflow/serving
    cd serving
    git checkout 2.5.1
  • Apply patches

    export PATCH_URL="https://raw.githubusercontent.com/linux-on-ibm-z/scripts/master/TensorflowServing/2.5.1/patch"
    wget -O tfs_patch.diff $PATCH_URL/tfs_patch.diff
    sed -i "s?source_root?$SOURCE_ROOT?" tfs_patch.diff
    git apply tfs_patch.diff
    cd $SOURCE_ROOT/tensorflow
    wget -O tf_patch.diff $PATCH_URL/tf_patch.diff
    git apply tf_patch.diff
  • Build TensorFlow Serving

    TensorFlow Serving can be built as follows:

    • To build entire tree:
    cd $SOURCE_ROOT/serving
    bazel --host_jvm_args="-Xms1024m" --host_jvm_args="-Xmx2048m" build --verbose_failures -c opt tensorflow_serving/...
    • To build tensorflow_model_server target:
    cd $SOURCE_ROOT/serving
    bazel --host_jvm_args="-Xms1024m" --host_jvm_args="-Xmx2048m" build --verbose_failures -c opt tensorflow_serving/model_servers:tensorflow_model_server

    Note: The TensorFlow Serving build is a resource-intensive operation. If the build continues to fail, try increasing the swap space and reducing the number of concurrent jobs by specifying --jobs=n in the build command above, where n is the number of concurrent jobs.

    Copy the binary so it can be run from anywhere, and make sure /usr/local/bin is in $PATH:

    sudo cp bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server /usr/local/bin/
    tensorflow_model_server --version

1.5) Install TensorFlow Serving API

sudo pip3 install tensorflow-serving-api==2.5.1
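A quick optional sanity check that the API installed cleanly; the imports below are the standard protobuf stubs shipped with tensorflow-serving-api, so adjust them if your installed version lays the package out differently:

# Confirm the serving API protos and gRPC stubs import without errors
python3 -c "from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc; print('tensorflow-serving-api OK')"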

Step 2: Verify TensorFlow Serving (Optional)

  • Run TensorFlow Serving from the command line

     tensorflow_model_server --rest_api_port=8501 --model_name=<model_name> --model_base_path=<model_path> &
     curl -d '{"instances": [1.0, 2.0, 5.0]}'     -X POST http://localhost:8501/v1/models/<model_name>:predict
    • For example:
     export TESTDATA="$SOURCE_ROOT/serving/tensorflow_serving/servables/tensorflow/testdata"
     tensorflow_model_server --rest_api_port=8501 --model_name=half_plus_two --model_base_path=$TESTDATA/saved_model_half_plus_two_cpu &
     curl -d '{"instances": [1.0, 2.0, 5.0]}'     -X POST http://localhost:8501/v1/models/half_plus_two:predict

    Output should look like:

    {
        "predictions": [2.5, 3.0, 4.5
        ]
    }
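
    When verification is done, stop the model server that was started in the background. Either of these standard shell commands works; kill %1 assumes the server is the most recent background job in this shell, and the pkill pattern assumes no other matching processes are running:

    # Stop the background model server started above
    kill %1
    # or, if the job reference is no longer valid in your shell:
    pkill -f tensorflow_model_server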
    

Step 3: Execute Test Suite (Optional)

  • Run the complete test suite

    cd $SOURCE_ROOT/serving
     bazel --host_jvm_args="-Xms1024m" --host_jvm_args="-Xmx2048m" test --test_tag_filters=-gpu,-benchmark-test -k --build_tests_only --test_output=errors --verbose_failures -c opt tensorflow_serving/...

    Note: The tensorflow_model_server_test, //tensorflow_serving/servables/tensorflow:tflite_interpreter_pool_test, tflite_session_test, and saved_model_bundle_factory_test test cases require model files to be regenerated on s390x. For TFLite, create and use new model files as follows (modify the Python library path below to match your local environment and Python version):

    cd $SOURCE_ROOT/tensorflow
    bazel build --host_javabase="@local_jdk//:jdk" //tensorflow/lite/tools/signature:signature_def_utils
    cp -r bazel-bin/tensorflow/lite/tools/signature/* tensorflow/lite/tools/signature/
    sudo rm -rf /usr/local/lib/python3.8/dist-packages/tensorflow/lite/tools
    sudo ln -s $SOURCE_ROOT/tensorflow/tensorflow/lite/tools /usr/local/lib/python3.8/dist-packages/tensorflow/lite/tools
    
    sudo rm -rf /tmp/saved_model_half_plus_two*
    sudo python $SOURCE_ROOT/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two.py
    sudo cp /tmp/saved_model_half_plus_two_tflite/model.tflite $SOURCE_ROOT/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_tflite/00000123/
    sudo cp /tmp/saved_model_half_plus_two_tflite_with_sigdef/model.tflite $SOURCE_ROOT/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_tflite_with_sigdef/00000123/
    
    mkdir /tmp/parse_example_tflite
    python $SOURCE_ROOT/serving/tensorflow_serving/servables/tensorflow/testdata/parse_example_tflite.py
    cp /tmp/parse_example_tflite/model.tflite $SOURCE_ROOT/serving/tensorflow_serving/servables/tensorflow/testdata/parse_example_tflite/00000123/model.tflite

    Note: The tensorflow_model_server_test test case may still fail after regenerating the models. The root cause is that the underlying TF Text model uses a little-endian ICU normalizer; how to regenerate the model once the ICU issue is resolved is still under investigation: https://github.com/tensorflow/serving/issues/1897

    Note: The tflite_interpreter_pool_test test case may still fail after regenerating the models, because one of the models it uses (MobileNetModel) cannot be generated from the open source release of TensorFlow Serving.
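
    After regenerating the models, the affected targets can be re-run individually instead of repeating the whole suite. The target labels below are inferred from the notes above (only tflite_interpreter_pool_test is given with its full label there), so adjust them if they differ in your source tree:

    cd $SOURCE_ROOT/serving
    bazel --host_jvm_args="-Xms1024m" --host_jvm_args="-Xmx2048m" test -c opt --test_output=errors \
      //tensorflow_serving/model_servers:tensorflow_model_server_test \
      //tensorflow_serving/servables/tensorflow:tflite_interpreter_pool_test \
      //tensorflow_serving/servables/tensorflow:tflite_session_test \
      //tensorflow_serving/servables/tensorflow:saved_model_bundle_factory_test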
