Building Apache Kafka
The instructions provided below specify the steps to build Apache Kafka 2.8.0 on Linux on IBM Z for the following distributions:
- RHEL (7.8, 7.9, 8.2, 8.3, 8.4)
- SLES (12 SP5, 15 SP2, 15 SP3)
- Ubuntu (18.04, 20.04, 21.04)
General Notes:
- When following the steps below, please use a standard permission user unless otherwise specified.
- A directory /<source_root>/ will be referred to in these instructions; this is a temporary writable directory anywhere you'd like to place it.
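For example, you might set it up like this (the path shown is only an illustration; any writable location works):
mkdir -p $HOME/kafka_build        # example location
export SOURCE_ROOT=$HOME/kafka_build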
If you want to build Apache Kafka using manual steps, go to STEP 2.
Use the following commands to build Apache Kafka using the build script. Please make sure you have wget installed.
- With Eclipse Adoptium Temurin Runtime (previously known as AdoptOpenJDK hotspot, also the recommended runtime)
wget -q https://raw.githubusercontent.com/linux-on-ibm-z/scripts/master/ApacheKafka/2.8.0/build_kafka_AdoptiumTemurin.sh
# Build Apache Kafka
bash build_kafka_AdoptiumTemurin.sh
- Alternatively, you can use IBM Semeru Runtime (previously known as AdoptOpenJDK openj9) Java
wget -q https://raw.githubusercontent.com/linux-on-ibm-z/scripts/master/ApacheKafka/2.8.0/build_kafka_IBMSemeru.sh
# Build Apache Kafka
bash build_kafka_IBMSemeru.sh
If the build completes successfully, go to STEP 6 for test execution. In case of error, check the logs for more details or go to STEP 2 to follow the manual build steps.
export SOURCE_ROOT=/<source_root>/
- RHEL (7.8, 7.9, 8.2, 8.3, 8.4)
- With Eclipse Adoptium Temurin Runtime (previously known as AdoptOpenJDK hotspot, also the recommended runtime)
sudo yum install -y wget tar git curl ca-certificates
- Download and install Eclipse Adoptium Temurin Runtime (Java 11) from here.
- With IBM Semeru Runtime (previously known as AdoptOpenJDK openj9)
sudo yum install -y wget tar git curl ca-certificates
- Download and install AdoptOpenJDK (OpenJDK11 with Eclipse OpenJ9 Large Heap) from here.
- SLES (12 SP5, 15 SP2, 15 SP3)
- With Eclipse Adoptium Temurin Runtime (previously known as AdoptOpenJDK hotspot)
sudo zypper install -y wget tar git curl
- Download and install Eclipse Adoptium Temurin Runtime (Java 11) from here.
- With IBM Semeru Runtime (previously known as AdoptOpenJDK openj9)
sudo zypper install -y wget tar git curl
- Download and install AdoptOpenJDK (OpenJDK11 with Eclipse OpenJ9 Large Heap) from here.
- Ubuntu (18.04, 20.04, 21.04)
- With Eclipse Adoptium Temurin Runtime (previously known as AdoptOpenJDK hotspot)
sudo apt-get update
sudo apt-get -y install wget tar git curl
- Download and install Eclipse Adoptium Temurin Runtime (Java 11) from here.
- With IBM Semeru Runtime (previously known as AdoptOpenJDK openj9)
sudo apt-get update
sudo apt-get -y install wget tar git curl
- Download and install AdoptOpenJDK (OpenJDK11 with Eclipse OpenJ9 Large Heap) from here.
Note:
- At the time of creation of these build instructions, Apache Kafka was verified with JDK 11 versions Temurin-11.0.12+7 and jdk-11.0.10+9_openj9-0.24.0.
- Eclipse Adoptium Temurin Runtime Java is recommended.
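For reference, a minimal sketch of unpacking a downloaded Temurin JDK 11 tarball and putting it on the PATH; the archive and directory names below are illustrative and depend on the exact release downloaded from the link above:
cd $SOURCE_ROOT
tar -xf OpenJDK11U-jdk_s390x_linux_hotspot_11.0.12_7.tar.gz    # example archive name
export JAVA_HOME=$SOURCE_ROOT/jdk-11.0.12+7                    # example extracted directory
export PATH=$JAVA_HOME/bin:$PATH
java -version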
cd $SOURCE_ROOT
export PATCH_URL="https://raw.githubusercontent.com/linux-on-ibm-z/scripts/master/ApacheKafka/2.8.0/patch"
git clone https://github.com/apache/kafka.git
cd kafka
git checkout 2.8.0
curl -sSL $PATCH_URL/scala-2.13.3.patch | git apply
curl -sSL $PATCH_URL/IBMSemeru.patch | git apply    # Only for IBM Semeru Runtime Java
./gradlew jar
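To confirm the build produced the Kafka artifacts, you can list the Gradle output directories (jar names below assume the default Scala 2.13 build of Kafka 2.8.0):
ls $SOURCE_ROOT/kafka/core/build/libs/       # expect kafka_2.13-2.8.0*.jar
ls $SOURCE_ROOT/kafka/clients/build/libs/    # expect kafka-clients-2.8.0*.jar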
Note: Building rocksdbjni requires Java 8. It can be installed from here.
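If you prefer distribution packages over the linked download, the stock OpenJDK 8 packages are an alternative (package names are an assumption and may vary by distro release):
sudo yum install -y java-1.8.0-openjdk-devel       # RHEL
sudo zypper install -y java-1_8_0-openjdk-devel    # SLES
sudo apt-get install -y openjdk-8-jdk              # Ubuntu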
- Set JAVA_HOME and PATH for JDK 8
export JAVA_HOME=<path to java8>
export PATH=$JAVA_HOME/bin:$PATH
- Install the dependencies
- RHEL (7.8, 7.9)
sudo yum install -y wget tar git hostname unzip procps snappy binutils bzip2 bzip2-devel curl gcc-c++ make which zlib-devel diffutils
- Build GCC 7.3.0:
cd $SOURCE_ROOT
mkdir gcc
cd gcc
wget https://ftpmirror.gnu.org/gcc/gcc-7.3.0/gcc-7.3.0.tar.xz
tar -xf gcc-7.3.0.tar.xz
cd gcc-7.3.0
./contrib/download_prerequisites
mkdir objdir
cd objdir
../configure --prefix=/opt/gcc --enable-languages=c,c++ --with-arch=zEC12 --with-long-double-128 \
    --build=s390x-linux-gnu --host=s390x-linux-gnu --target=s390x-linux-gnu \
    --enable-threads=posix --with-system-zlib --disable-multilib
make -j 8
sudo make install
sudo ln -sf /opt/gcc/bin/gcc /usr/bin/gcc
sudo ln -sf /opt/gcc/bin/g++ /usr/bin/g++
sudo ln -sf /opt/gcc/bin/g++ /usr/bin/c++
export PATH=/opt/gcc/bin:"$PATH"
export LD_LIBRARY_PATH=/opt/gcc/lib64:"$LD_LIBRARY_PATH"
export C_INCLUDE_PATH=/opt/gcc/lib/gcc/s390x-linux-gnu/7.3.0/include
export CPLUS_INCLUDE_PATH=/opt/gcc/lib/gcc/s390x-linux-gnu/7.3.0/include
sudo ln -sf /opt/gcc/lib64/libstdc++.so.6.0.24 /lib64/libstdc++.so.6
sudo ln -sf /opt/gcc/lib64/libatomic.so.1 /lib64/libatomic.so.1
- RHEL (8.2, 8.3, 8.4)
sudo yum install -y wget tar git hostname unzip procps snappy binutils bzip2 bzip2-devel curl gcc-c++ make which zlib-devel diffutils
- SLES (12 SP5)
sudo zypper install -y wget tar unzip snappy-devel libzip2 bzip2 curl gcc7 gcc7-c++ make which zlib-devel git
sudo ln -sf /usr/bin/gcc-7 /usr/bin/gcc
sudo ln -sf /usr/bin/g++-7 /usr/bin/g++
sudo ln -sf /usr/bin/gcc /usr/bin/cc
- SLES (15 SP2, 15 SP3)
sudo zypper install -y unzip snappy-devel libzip5 bzip2 curl gcc-c++ make which zlib-devel tar wget git gzip gawk
- Ubuntu (18.04, 20.04, 21.04)
sudo apt-get update
sudo apt-get -y install wget tar hostname unzip zlib1g-dev libbz2-dev liblz4-dev libzstd-dev git make gcc-7 g++-7 curl
sudo rm -rf /usr/bin/gcc /usr/bin/g++ /usr/bin/cc
sudo ln -sf /usr/bin/gcc-7 /usr/bin/gcc
sudo ln -sf /usr/bin/g++-7 /usr/bin/g++
sudo ln -sf /usr/bin/gcc /usr/bin/cc
- Build and Create Rocksdb Jar
cd $SOURCE_ROOT
git clone https://github.com/facebook/rocksdb.git
cd rocksdb
git checkout v5.18.4
sed -i '1656s/ARCH/MACHINE/g' Makefile
export DEBUG_LEVEL=0
PORTABLE=1 make shared_lib
make -j8 rocksdbjava
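As a quick sanity check that the s390x native library made it into the jar (jar path as used in the copy step below):
ls $SOURCE_ROOT/rocksdb/java/target/rocksdbjni-5.18.4-linux64.jar
jar tf $SOURCE_ROOT/rocksdb/java/target/rocksdbjni-5.18.4-linux64.jar | grep librocksdbjni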
- Replace Rocksdbjni jar
cp $SOURCE_ROOT/rocksdb/java/target/rocksdbjni-5.18.4-linux64.jar $HOME/.gradle/caches/modules-2/files-2.1/org.rocksdb/rocksdbjni/5.18.4/def7af83920ad2c39eb452f6ef9603777d899ea0/rocksdbjni-5.18.4.jar
cp $SOURCE_ROOT/rocksdb/java/target/rocksdbjni-5.18.4-linux64.jar $SOURCE_ROOT/kafka/streams/examples/build/dependant-libs-2.13.3/rocksdbjni-5.18.4.jar
cp $SOURCE_ROOT/rocksdb/java/target/rocksdbjni-5.18.4-linux64.jar $SOURCE_ROOT/kafka/streams/build/dependant-libs-2.13.3/rocksdbjni-5.18.4.jar
cp $SOURCE_ROOT/rocksdb/java/target/rocksdbjni-5.18.4-linux64.jar $SOURCE_ROOT/kafka/streams/streams-scala/build/dependant-libs-2.13.3/rocksdbjni-5.18.4.jar
cp $SOURCE_ROOT/rocksdb/java/target/rocksdbjni-5.18.4-linux64.jar $SOURCE_ROOT/kafka/streams/test-utils/build/dependant-libs-2.13.3/rocksdbjni-5.18.4.jar
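Note that the hash directory under the Gradle cache (def7af... above) can differ between environments; if the first copy fails, one way to locate the cached jar is:
find $HOME/.gradle/caches/modules-2/files-2.1/org.rocksdb/rocksdbjni/5.18.4 -name rocksdbjni-5.18.4.jar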
- Update the JAVA_HOME and PATH variables to switch back to your JDK 11
export JAVA_HOME=<path to java11>
export PATH=$JAVA_HOME/bin:$PATH
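Optionally, confirm the active JDK before running the tests:
java -version    # should report a Java 11 runtime (Temurin or Semeru)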
cd $SOURCE_ROOT/kafka
./gradlew test -PscalaVersion=2.13.3 --continue
Note:
- If any test fails due to timeout, try running it individually (see the example after this list).
- We are using a lower Scala version for testing, as some tests from the core module fail with an error like java.lang.IllegalAccessError: Class 'scala.util.Random' no access to: class 'scala.collection.immutable.LazyList$State'. More details can be found here.
- The org.apache.kafka.streams.state.internals.RocksDBTimestampedStoreTest.shouldMigrateDataFromDefaultToTimestampColumnFamily test fails on s390x; it passes with the RocksDB 6.19.3 version present on Kafka's trunk branch.
- The org.apache.kafka.streams.state.internals.ThreadCacheTest.cacheOverheadsSmallValues test fails on both s390x and Intel and can be ignored.
- The tests below fail in the clients:test module and a complete fix is being added. Refer to this for more details.
- org.apache.kafka.clients.ClientUtilsTest.testResolveDnsLookupResolveCanonicalBootstrapServers
- org.apache.kafka.clients.ClientUtilsTest.testResolveDnsLookupAllIps
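For example, to re-run one of the above tests on its own with Gradle's --tests filter (module and test names taken from the list above):
./gradlew clients:test -PscalaVersion=2.13.3 --tests 'org.apache.kafka.clients.ClientUtilsTest.testResolveDnsLookupAllIps'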
Follow the official quickstart guide given here to verify the installation. Follow the instructions given here for using Kafka Exporter with Kafka 2.8.0.
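For convenience, a condensed version of the quickstart commands as shipped with Kafka 2.8.0 (see the official guide for the full walkthrough; run the broker and clients in separate terminals):
cd $SOURCE_ROOT/kafka
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties
bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092
bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092
bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092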
The information provided in this article is accurate at the time of writing, but ongoing development in the open-source projects involved may make the information incorrect or obsolete. Please open an issue or contact us on the IBM Z Community if you have any questions or feedback.