This repository contains a reference implementation of an attested OHTTP client for Azure AI confidential inferencing on Linux.
We support two client profiles: a command-line client (available as a pre-built container image or built from source) and a Python package (pyohttp). Both can be built from this repository. We support development and builds using GitHub Codespaces and dev containers; the repository includes a devcontainer configuration that installs all dependencies.
Build the pyohttp package as follows.

```shell
./scripts/build-pyohttp.sh
```
Install the package from `target/wheels` using pip. The sample Python script shows how to use this package to make an attested OHTTP inference request to a confidential Whisper endpoint.
You can use pre-built attested OHTTP container images to send inferencing requests and print completions.
Set the inferencing endpoint and access key as follows.

```shell
export TARGET_URI=<URL for your endpoint>
export API_KEY=<key for accessing the endpoint>
```
Set the Key Management Service (KMS) endpoint as follows.

```shell
export KMS_URL=https://accconfinferenceproduction.confidential-ledger.azure.com
```
Run inferencing using a pre-packaged audio file.

```shell
docker run -e KMS_URL=${KMS_URL} mcr.microsoft.com/acc/samples/attested-ohttp-client:latest \
  ${TARGET_URI} -F "file=@/examples/audio.mp3" -O "api-key: ${API_KEY}" -F "response_format=json"
```
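Because the request sets `response_format=json`, the completion comes back as a JSON body. As a sketch, assuming a Whisper-style response with a `text` field (an assumption — check your endpoint's actual schema), the transcription can be extracted like this:

```python
import json

# Hypothetical response body; the "text" field name is an assumption
# based on the common Whisper-style JSON schema.
raw = '{"text": "sample transcription"}'
transcription = json.loads(raw)["text"]
print(transcription)
```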
Run inferencing using a pre-packaged audio file and receive the attestation token. The attestation token will be returned as a blob. It can be decoded at jwt.io.
```shell
docker run -e KMS_URL=${KMS_URL} mcr.microsoft.com/acc/samples/attested-ohttp-client:latest \
  ${TARGET_URI} -F "file=@/examples/audio.mp3" -O "api-key: ${API_KEY}" -O "x-attestation-token:true" \
  -F "response_format=json"
```
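Instead of pasting the blob into jwt.io, the token's payload can also be inspected locally. A JWT is three base64url-encoded segments (`header.payload.signature`), so a minimal sketch that decodes the payload — without verifying the signature — looks like this (the token below is a locally built stand-in, not a real attestation token):

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    # JWTs are three base64url segments: header.payload.signature.
    # This decodes the payload only; it does NOT verify the signature.
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Illustrative token built locally for demonstration:
header = base64.urlsafe_b64encode(b'{"alg":"none"}').rstrip(b"=").decode()
payload = base64.urlsafe_b64encode(b'{"iss":"example"}').rstrip(b"=").decode()
token = f"{header}.{payload}."
print(decode_jwt_payload(token))  # → {'iss': 'example'}
```

For a real attestation token, the decoded claims should still be validated against the signature and issuer before being trusted.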
Run inferencing using your own audio file by mounting the file into the container. The maximum audio file size supported is 25MB.
```shell
export INPUT_PATH=<path_to_your_input_audio_file_excluding_name>
export INPUT_FILE=<name_of_your_audio_file>
export MOUNTED_PATH=/test

docker run -e KMS_URL=${KMS_URL} --volume ${INPUT_PATH}:${MOUNTED_PATH} \
  mcr.microsoft.com/acc/samples/attested-ohttp-client:latest \
  ${TARGET_URI} -F "file=@${MOUNTED_PATH}/${INPUT_FILE}" -O "api-key: ${API_KEY}" -F "response_format=json"
```
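Before mounting your own file, it can help to confirm it is under the size limit. A minimal sketch (the 25 MB constant comes from the limit stated above; the helper name is illustrative):

```python
import os
import tempfile

MAX_AUDIO_BYTES = 25 * 1024 * 1024  # the endpoint's 25 MB audio limit

def fits_audio_limit(path: str) -> bool:
    """Return True if the file is within the 25 MB inference limit."""
    return os.path.getsize(path) <= MAX_AUDIO_BYTES

# Quick check against a small temporary file standing in for an audio file:
with tempfile.NamedTemporaryFile(suffix=".mp3", delete=False) as f:
    f.write(b"\x00" * 1024)  # 1 KiB placeholder
print(fits_audio_limit(f.name))  # → True
```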
You can build the client container using Docker.

```shell
docker build -f docker/Dockerfile -t attested-ohttp-client .
```
Set up the environment by installing dependencies.

```shell
sudo apt update
sudo apt install -y curl build-essential jq libssl-dev
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```
Build the client using cargo.

```shell
cargo build --bin ohttp-client-cli
```
Run the CLI as follows, first fetching the KMS service certificate and passing it to the client.

```shell
curl -s -k ${KMS_URL}/node/network | jq -r .service_certificate > /tmp/service_cert.pem

cargo run --bin=ohttp-client-cli -- ${TARGET_URI} -F "file=examples/audio.mp3" \
  -O "api-key: ${API_KEY}" --kms-url=${KMS_URL} --kms-cert=/tmp/service_cert.pem
```