
Commit 7b95d90

Chenyu-Shi and octaviansima authored and committed
Separate IN PR (#124)
* finishing the IN expression; adding more tests and null support. Need confirmation on null behavior, and I also wonder why an integer field is sufficient for strings
* adding additional test
* adding additional test
* saving concat implementation; it's passing basic functionality tests
* adding type-aware comparison and a better error message for the IN operator
* adding null checking for the concat operator and adding one additional test
* cleaning up IN & Concat PR
* deleting concat and prepping the in branch for the IN PR
* fixing null behavior: now it's only null when there's no match and there's null input
* Build failed

Co-authored-by: Ubuntu <[email protected]>
Co-authored-by: Wenting Zheng <[email protected]>
Co-authored-by: Wenting Zheng <[email protected]>

Separate Concat PR (#125)

Implementation of the CONCAT expression.

Co-authored-by: Ubuntu <[email protected]>
Co-authored-by: Wenting Zheng <[email protected]>

Removed calls to toSet in TPC-H tests (#140)

* removed calls to toSet
* added calls to toSet back where queries are unordered

Documentation update (#148)

Cluster Remote Attestation Fix (#146)

The existing code only had remote attestation working when run locally. This PR adds a five-second sleep to make sure that all executors are spun up successfully before attestation begins. Closes #147

upgrade to 3.0.1 (#144)

Update two TPC-H queries (#149)

Tests for TPC-H 12 and 19 pass.

TPC-H 20 Fix (#142)

* string to StringType error
* tpch 20 passes
* cleanup
* implemented changes
* decimal.toFloat

Co-authored-by: Wenting Zheng <[email protected]>

Join update (#145)

Migrate from Travis CI to Github Actions (#156)

* matching in strategies.scala
* set up class thing
* cleanup
* added test cases for non-equi left anti join
* rename to serializeEquiJoinExpression
* added isEncrypted condition
* set up keys
* JoinExpr now has condition
* rename
* serialization does not throw compile error for BNLJ
* split up
* added condition in ExpressionEvaluation.h
* zipPartitions cpp put in place
* typo
* added func to header
* two loops in place
* update tests
* condition fixed scala
* loop interchange rows added
* tags
* ensure cached == match
* working comparison
* decoupling in ExpressionEvaluation
* save
* compiles and condition works
* is printing
* fix swap outer/inner
* o_i_match
* show() has the same result
* tests pass
* test cleanup
* added test cases for different condition
* BuildLeft works
* optional keys in scala
* started C++
* passes the operator tests
* comments, cleanup
* attempting to do it the ~right~ way
* comments to distinguish between primary/secondary, operator tests pass
* cleanup
* comments, about to begin implementation for distinct agg ops
* is_distinct added
* test case
* serializing with isDistinct
* is_distinct in ExpressionEvaluation.h
* removed unused code from join implementation
* remove RowWriter/Reader in condition evaluation (join)
* easier test
* serialization done
* correct checking in Scala
* set is set up
* spaghetti but it finally works
* function for clearing values
* condition_eval instead of condition
* goto comment
* remove explain from test, need to fix distinct aggregation for >1 partitions
* started impl of multiple partitions
* fix
* added rangepartitionexec that runs partitioning
* cleanup
* serialization properly
* comments, generalization for > 1 distinct function
* comments
* about to refactor into logical.Aggregation
* the new case has distinct in result expressions
* need to match on distinct
* removed new case (doesn't make a difference?)
* works
* remove traces of distinct
* more cleanup

Upgrade to OE 0.12 (#153)

Update README.md

Support for scalar subquery (#157)

This PR implements the scalar subquery expression, which is triggered whenever a subquery returns a scalar value. There were two main problems that needed to be solved. First, support for matching the scalar subquery expression is necessary. Spark implements this by wrapping a SparkPlan within the expression and calling executeCollect, then constructing a literal with that value. However, this is problematic for us because that value should not be decrypted by the driver and serialized into an expression, since it is an intermediate value. Therefore, the second issue addressed here is support for an encrypted literal. This is implemented by serializing an encrypted ciphertext into a base64-encoded string and wrapping a Decrypt expression on top of it. The expression is then evaluated in the enclave and returns a literal (see the Scala sketch below). Note that, in order to test our implementation, we also implement a Decrypt expression in Scala. However, it should never be evaluated on the driver side and serialized into a plaintext literal, because Decrypt is designated as a Nondeterministic expression and will therefore always evaluate on the workers.

Add TPC-H Benchmarks (#139)

* logic decoupling in TPCH.scala for easier benchmarking
* added TPCHBenchmark.scala
* Benchmark.scala rewrite
* done adding all supported TPC-H query benchmarks
* changed command-line arguments that benchmark takes
* TPCHBenchmark takes in parameters
* fixed issue with spark conf
* size error handling, --help flag
* add Utils.force, break cluster mode
* comment out logistic regression benchmark
* ensureCached right before temp view created/replaced
* upgrade to 3.0.1
* upgrade to 3.0.1
* 10 scale factor
* persistData
* almost done refactor
* more cleanup
* compiles
* 9 passes
* cleanup
* collect instead of force, sf_none
* remove sf_none
* defaultParallelism
* no removing trailing/leading whitespace
* add sf_med
* hdfs works in local case
* cleanup, added new CLI argument
* added newly supported tpch queries
* function for running all supported tests
* address comments
* added one test case
* non-null case working
* rename equi join
* split Join.cpp into two files, outer and default joins
* split up
* not handling nulls at all
* first test case works
* force_null to all appends
* test, matching in scala
* non-nulls working
* it works for anti and outer
* cleanup
* test cases added
* one row is not being added in the sort merge implementation
* tpc-h 13 passes
* comments
* outer/inner swap, breaks a bunch of things
* Update App.cpp
* fixed swap issues
* for loop instead of flatten
* concatEncryptedBlocks
* tpch 13 test passes
* one more swap stream/broadcast
* concatEncryptedBlocks, remove import iostream
* comment for for loop
* added comments explaining constraints with broadcast side
* comments
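The encrypted-literal flow from #157 above can be illustrated in isolation. The following is a minimal, self-contained Scala sketch of the idea, using AES-GCM via javax.crypto plus base64 as stand-ins for Opaque's real key management, serialization, and enclave boundary; `EncryptedLiteralSketch` and its method names are hypothetical, for illustration only.

```scala
import java.util.Base64
import javax.crypto.{Cipher, KeyGenerator, SecretKey}
import javax.crypto.spec.GCMParameterSpec

// Hypothetical stand-in for Opaque's encrypted literal plus Decrypt expression.
object EncryptedLiteralSketch {
  // In Opaque the key lives inside the enclave; a locally generated key is used here.
  private val key: SecretKey = {
    val kg = KeyGenerator.getInstance("AES")
    kg.init(128)
    kg.generateKey()
  }

  // "Driver side": encrypt the subquery's scalar result and embed it as a
  // base64 string, so no plaintext appears in the serialized expression.
  def encryptLiteral(value: String): String = {
    val iv = new Array[Byte](12)
    new java.security.SecureRandom().nextBytes(iv)
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv))
    Base64.getEncoder.encodeToString(iv ++ cipher.doFinal(value.getBytes("UTF-8")))
  }

  // "Enclave side": the Decrypt expression decodes and decrypts, producing the
  // literal only inside the trusted boundary.
  def decryptLiteral(encoded: String): String = {
    val bytes = Base64.getDecoder.decode(encoded)
    val (iv, ciphertext) = bytes.splitAt(12)
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv))
    new String(cipher.doFinal(ciphertext), "UTF-8")
  }

  def main(args: Array[String]): Unit = {
    val encoded = encryptLiteral("42")  // the scalar subquery's result
    println(decryptLiteral(encoded))    // prints 42, as the enclave would recover it
  }
}
```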
1 parent 0a20d71 commit 7b95d90

43 files changed: +1772 -423 lines

.github/scripts/build.sh

Lines changed: 25 additions & 0 deletions
@@ -0,0 +1,25 @@
+# Install OpenEnclave 0.12.0
+echo 'deb [arch=amd64] https://download.01.org/intel-sgx/sgx_repo/ubuntu bionic main' | sudo tee /etc/apt/sources.list.d/intel-sgx.list
+wget -qO - https://download.01.org/intel-sgx/sgx_repo/ubuntu/intel-sgx-deb.key | sudo apt-key add -
+echo "deb http://apt.llvm.org/bionic/ llvm-toolchain-bionic-7 main" | sudo tee /etc/apt/sources.list.d/llvm-toolchain-bionic-7.list
+wget -qO - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add -
+echo "deb [arch=amd64] https://packages.microsoft.com/ubuntu/18.04/prod bionic main" | sudo tee /etc/apt/sources.list.d/msprod.list
+wget -qO - https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
+
+sudo apt update
+sudo apt -y install clang-7 libssl-dev gdb libsgx-enclave-common libsgx-enclave-common-dev libprotobuf10 libsgx-dcap-ql libsgx-dcap-ql-dev az-dcap-client open-enclave=0.12.0
+
+# Install Opaque Dependencies
+sudo apt -y install wget build-essential openjdk-8-jdk python libssl-dev
+
+wget https://github.com/Kitware/CMake/releases/download/v3.15.6/cmake-3.15.6-Linux-x86_64.sh
+sudo bash cmake-3.15.6-Linux-x86_64.sh --skip-license --prefix=/usr/local
+
+# Generate keypair for attestation
+openssl genrsa -out ./private_key.pem -3 3072
+
+source opaqueenv
+source /opt/openenclave/share/openenclave/openenclaverc
+export MODE=SIMULATE
+
+build/sbt test

.github/workflows/main.yml

Lines changed: 40 additions & 0 deletions
@@ -0,0 +1,40 @@
+name: CI
+
+# Controls when the action will run.
+on:
+  # Triggers the workflow on push or pull request events but only for the master branch
+  push:
+    branches: [ master ]
+  pull_request:
+    branches: [ master ]
+
+  # Allows you to run this workflow manually from the Actions tab
+  workflow_dispatch:
+
+# A workflow run is made up of one or more jobs that can run sequentially or in parallel
+jobs:
+  build:
+    # Define the OS to run on
+    runs-on: ubuntu-18.04
+    # Steps represent a sequence of tasks that will be executed as part of the job
+    steps:
+      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
+      - uses: actions/checkout@v2
+      # Specify the version of Java that is installed
+      - uses: actions/setup-java@v1
+        with:
+          java-version: '8'
+      # Caching (from https://www.scala-sbt.org/1.x/docs/GitHub-Actions-with-sbt.html)
+      - uses: coursier/cache-action@v5
+      # Run the test
+      - name: Install dependencies, set environment variables, and run sbt tests
+        run: |
+          ./.github/scripts/build.sh
+
+          rm -rf "$HOME/.ivy2/local" || true
+          find $HOME/Library/Caches/Coursier/v1 -name "ivydata-*.properties" -delete || true
+          find $HOME/.ivy2/cache -name "ivydata-*.properties" -delete || true
+          find $HOME/.cache/coursier/v1 -name "ivydata-*.properties" -delete || true
+          find $HOME/.sbt -name "*.lock" -delete || true
+        shell: bash

.travis.yml

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ before_install:
   - sudo apt update
   - sudo apt -y install clang-7 libssl-dev gdb libsgx-enclave-common libsgx-enclave-common-dev libprotobuf10 libsgx-dcap-ql libsgx-dcap-ql-dev
   - sudo apt-get -y install wget build-essential openjdk-8-jdk python libssl-dev
-  - sudo apt-get -y install open-enclave=0.9.0
+  - sudo apt-get -y install open-enclave=0.12.0
   - wget https://github.com/Kitware/CMake/releases/download/v3.15.6/cmake-3.15.6-Linux-x86_64.sh
   - sudo bash cmake-3.15.6-Linux-x86_64.sh --skip-license --prefix=/usr/local
   - export PATH=/usr/local/bin:"$PATH"

README.md

Lines changed: 42 additions & 8 deletions
@@ -8,11 +8,12 @@ Opaque is a package for Apache Spark SQL that enables encryption for DataFrames
 
 This project is based on the following NSDI 2017 paper [1]. The oblivious execution mode is not included in this release.
 
-This is an alpha preview of Opaque, which means the software is still in development (not production-ready!). It currently has the following limitations:
+This is an alpha preview of Opaque, but the software is still in active development. It currently has the following limitations:
 
 - Unlike the Spark cluster, the master must be run within a trusted environment (e.g., on the client).
 
-- Not all Spark SQL operations are supported. UDFs must be [implemented in C++](#user-defined-functions-udfs).
+- Not all Spark SQL operations are supported (see the [list of supported operations](#supported-functionalities)).
+  UDFs must be [implemented in C++](#user-defined-functions-udfs).
 
 - Computation integrity verification (section 4.2 of the NSDI paper) is currently work in progress.
 
@@ -23,7 +24,7 @@ This is an alpha preview of Opaque, which means the software is still in develop
 
 After downloading the Opaque codebase, build and test it as follows.
 
-1. Install dependencies and the [OpenEnclave SDK](https://github.com/openenclave/openenclave/blob/v0.9.x/docs/GettingStartedDocs/install_oe_sdk-Ubuntu_18.04.md). We currently support OE version 0.9.0 (so please install with `open-enclave=0.9.0`) and Ubuntu 18.04.
+1. Install dependencies and the [OpenEnclave SDK](https://github.com/openenclave/openenclave/blob/v0.12.0/docs/GettingStartedDocs/install_oe_sdk-Ubuntu_18.04.md). We currently support OE version 0.12.0 (so please install with `open-enclave=0.12.0`) and Ubuntu 18.04.
 
    ```sh
    # For Ubuntu 18.04:
@@ -59,7 +60,9 @@ After downloading the Opaque codebase, build and test it as follows.
 
 ## Usage
 
-Next, run Apache Spark SQL queries with Opaque as follows, assuming [Spark 3.0](https://www.apache.org/dyn/closer.lua/spark/spark-3.0.1/spark-3.0.1-bin-hadoop2.7.tgz) (`wget http://apache.mirrors.pair.com/spark/spark-3.0.1/spark-3.0.1-bin-hadoop2.7.tgz`) is already installed:
+Next, run Apache Spark SQL queries with Opaque as follows, assuming [Spark 3.0.1](https://www.apache.org/dyn/closer.lua/spark/spark-3.0.1/spark-3.0.1-bin-hadoop2.7.tgz) (`wget http://apache.mirrors.pair.com/spark/spark-3.0.1/spark-3.0.1-bin-hadoop2.7.tgz`) is already installed:
+
+\* Opaque needs Spark's `'spark.executor.instances'` property to be set. This can be done in a custom config file, the default config file found at `/opt/spark/conf/spark-defaults.conf`, or as a `spark-submit` or `spark-shell` argument: `--conf 'spark.executor.instances=<value>'`.
 
 1. Package Opaque into a JAR:
 
@@ -136,6 +139,41 @@ Next, run Apache Spark SQL queries with Opaque as follows, assuming [Spark 3.0](
 // | baz|    5|
 // +----+-----+
 ```
+
+## Supported functionalities
+
+This section lists Opaque's supported functionalities, which are a subset of those of Spark SQL. Note that the syntax for these functionalities is the same as in Spark SQL -- Opaque simply replaces the execution so that it works with encrypted data.
+
+### Data types
+Out of the existing [Spark SQL types](https://spark.apache.org/docs/latest/sql-ref-datatypes.html), Opaque supports
+
+- All numeric types except `DecimalType`, which is currently converted into `FloatType`
+- `StringType`
+- `BinaryType`
+- `BooleanType`
+- `TimestampType`, `DateType`
+- `ArrayType`, `MapType`
+
+### Functions
+We currently support a subset of the Spark SQL functions, including both scalar and aggregate-like functions.
+
+- Scalar functions: `case`, `cast`, `concat`, `contains`, `if`, `in`, `like`, `substring`, `upper`
+- Aggregate functions: `average`, `count`, `first`, `last`, `max`, `min`, `sum`
+
+UDFs are not supported directly, but one can [extend Opaque with additional functions](#user-defined-functions-udfs) by writing them in C++.
+
+
+### Operators
+
+Opaque supports the core SQL operators:
+
+- Projection
+- Filter
+- Global aggregation and grouping aggregation
+- Order by, sort by
+- Inner join
+- Limit
+
 
 ## User-Defined Functions (UDFs)
 
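To make the list of supported functionalities concrete, here is a hedged spark-shell sketch that exercises a few of them over encrypted data. It assumes the `df.encrypted` conversion and the `initSQLContext` setup shown in the Usage section of this README; whether every column expression below maps onto Opaque's encrypted operators is an assumption, not something this diff confirms.

```scala
// Setup as in the Usage section (spark-shell assumed, so `spark` is predefined).
import edu.berkeley.cs.rise.opaque.implicits._
import org.apache.spark.sql.functions._
edu.berkeley.cs.rise.opaque.Utils.initSQLContext(spark.sqlContext)

val df = spark.createDataFrame(Seq(("foo", 4), ("bar", 1), ("baz", 5))).toDF("word", "count")
val dfEncrypted = df.encrypted  // every operator below executes over encrypted data

// Scalar functions from the list above (`in` via isin, `concat`, `upper`)
// combined with a grouping aggregation (`sum`).
val result = dfEncrypted
  .filter(col("word").isin("foo", "baz"))
  .select(concat(upper(col("word")), lit("!")).as("shout"), col("count"))
  .groupBy("shout")
  .agg(sum("count").as("total"))

result.show()
```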
@@ -168,7 +206,3 @@ Now we can port this UDF to Opaque as follows:
    ```
 
 3. Finally, implement the UDF in C++. In [`FlatbuffersExpressionEvaluator#eval_helper`](src/enclave/Enclave/ExpressionEvaluation.h), add a case for `tuix::ExprUnion_DotProduct`. Within that case, cast the expression to a `tuix::DotProduct`, recursively evaluate the left and right children, perform the dot product computation on them, and construct a `DoubleField` containing the result.
-
-## Contact
-
-If you want to know more about our project or have questions, please contact Wenting ([email protected]) and/or Ankur ([email protected]).
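For reference alongside step 3 above, here is what the same dot-product UDF looks like as an ordinary (unencrypted) Spark UDF. This plain-Scala sketch only illustrates the semantics being ported; it is not Opaque's enclave implementation, and `DotProductReference` is a hypothetical name.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

object DotProductReference {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("dot-product-reference").master("local[*]").getOrCreate()
    import spark.implicits._

    // Element-wise multiply the two vectors and sum: the semantics that the
    // C++ tuix::DotProduct case reproduces over encrypted rows.
    val dot = udf((xs: Seq[Double], ys: Seq[Double]) => xs.zip(ys).map { case (x, y) => x * y }.sum)

    val df = Seq((Seq(1.0, 2.0), Seq(3.0, 4.0))).toDF("u", "v")
    df.select(dot($"u", $"v").as("dot")).show()  // 1*3 + 2*4 = 11.0

    spark.stop()
  }
}
```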

build.sbt

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ scalaVersion := "2.12.10"
 
 spName := "amplab/opaque"
 
-sparkVersion := "3.0.0"
+sparkVersion := "3.0.1"
 
 sparkComponents ++= Seq("core", "sql", "catalyst")

src/enclave/App/App.cpp

Lines changed: 22 additions & 21 deletions
@@ -519,7 +519,7 @@ JNIEXPORT jbyteArray JNICALL Java_edu_berkeley_cs_rise_opaque_execution_SGXEncla
 }
 
 JNIEXPORT jbyteArray JNICALL
-Java_edu_berkeley_cs_rise_opaque_execution_SGXEnclave_ScanCollectLastPrimary(
+Java_edu_berkeley_cs_rise_opaque_execution_SGXEnclave_NonObliviousSortMergeJoin(
   JNIEnv *env, jobject obj, jlong eid, jbyteArray join_expr, jbyteArray input_rows) {
   (void)obj;
 
@@ -535,16 +535,16 @@ Java_edu_berkeley_cs_rise_opaque_execution_SGXEnclave_ScanCollectLastPrimary(
   size_t output_rows_length = 0;
 
   if (input_rows_ptr == nullptr) {
-    ocall_throw("ScanCollectLastPrimary: JNI failed to get input byte array.");
+    ocall_throw("NonObliviousSortMergeJoin: JNI failed to get input byte array.");
   } else {
-    oe_check_and_time("Scan Collect Last Primary",
-                      ecall_scan_collect_last_primary(
+    oe_check_and_time("Non-Oblivious Sort-Merge Join",
+                      ecall_non_oblivious_sort_merge_join(
                         (oe_enclave_t*)eid,
                         join_expr_ptr, join_expr_length,
                         input_rows_ptr, input_rows_length,
                         &output_rows, &output_rows_length));
   }
-
+
   jbyteArray ret = env->NewByteArray(output_rows_length);
   env->SetByteArrayRegion(ret, 0, output_rows_length, (jbyte *) output_rows);
   free(output_rows);
@@ -556,44 +556,45 @@ Java_edu_berkeley_cs_rise_opaque_execution_SGXEnclave_ScanCollectLastPrimary(
 }
 
 JNIEXPORT jbyteArray JNICALL
-Java_edu_berkeley_cs_rise_opaque_execution_SGXEnclave_NonObliviousSortMergeJoin(
-  JNIEnv *env, jobject obj, jlong eid, jbyteArray join_expr, jbyteArray input_rows,
-  jbyteArray join_row) {
+Java_edu_berkeley_cs_rise_opaque_execution_SGXEnclave_BroadcastNestedLoopJoin(
+  JNIEnv *env, jobject obj, jlong eid, jbyteArray join_expr, jbyteArray outer_rows, jbyteArray inner_rows) {
   (void)obj;
 
   jboolean if_copy;
 
   uint32_t join_expr_length = (uint32_t) env->GetArrayLength(join_expr);
   uint8_t *join_expr_ptr = (uint8_t *) env->GetByteArrayElements(join_expr, &if_copy);
 
-  uint32_t input_rows_length = (uint32_t) env->GetArrayLength(input_rows);
-  uint8_t *input_rows_ptr = (uint8_t *) env->GetByteArrayElements(input_rows, &if_copy);
+  uint32_t outer_rows_length = (uint32_t) env->GetArrayLength(outer_rows);
+  uint8_t *outer_rows_ptr = (uint8_t *) env->GetByteArrayElements(outer_rows, &if_copy);
 
-  uint32_t join_row_length = (uint32_t) env->GetArrayLength(join_row);
-  uint8_t *join_row_ptr = (uint8_t *) env->GetByteArrayElements(join_row, &if_copy);
+  uint32_t inner_rows_length = (uint32_t) env->GetArrayLength(inner_rows);
+  uint8_t *inner_rows_ptr = (uint8_t *) env->GetByteArrayElements(inner_rows, &if_copy);
 
   uint8_t *output_rows = nullptr;
   size_t output_rows_length = 0;
 
-  if (input_rows_ptr == nullptr) {
-    ocall_throw("NonObliviousSortMergeJoin: JNI failed to get input byte array.");
+  if (outer_rows_ptr == nullptr) {
+    ocall_throw("BroadcastNestedLoopJoin: JNI failed to get outer byte array.");
+  } else if (inner_rows_ptr == nullptr) {
+    ocall_throw("BroadcastNestedLoopJoin: JNI failed to get inner byte array.");
   } else {
-    oe_check_and_time("Non-Oblivious Sort-Merge Join",
-                      ecall_non_oblivious_sort_merge_join(
+    oe_check_and_time("Broadcast Nested Loop Join",
+                      ecall_broadcast_nested_loop_join(
                         (oe_enclave_t*)eid,
                         join_expr_ptr, join_expr_length,
-                        input_rows_ptr, input_rows_length,
-                        join_row_ptr, join_row_length,
+                        outer_rows_ptr, outer_rows_length,
+                        inner_rows_ptr, inner_rows_length,
                         &output_rows, &output_rows_length));
   }
-
+
   jbyteArray ret = env->NewByteArray(output_rows_length);
   env->SetByteArrayRegion(ret, 0, output_rows_length, (jbyte *) output_rows);
   free(output_rows);
 
   env->ReleaseByteArrayElements(join_expr, (jbyte *) join_expr_ptr, 0);
-  env->ReleaseByteArrayElements(input_rows, (jbyte *) input_rows_ptr, 0);
-  env->ReleaseByteArrayElements(join_row, (jbyte *) join_row_ptr, 0);
+  env->ReleaseByteArrayElements(outer_rows, (jbyte *) outer_rows_ptr, 0);
+  env->ReleaseByteArrayElements(inner_rows, (jbyte *) inner_rows_ptr, 0);
 
   return ret;
 }

src/enclave/App/CMakeLists.txt

Lines changed: 5 additions & 2 deletions
@@ -7,7 +7,10 @@ set(SOURCES
   ${CMAKE_CURRENT_BINARY_DIR}/Enclave_u.c)
 
 add_custom_command(
-  COMMAND oeedger8r --untrusted ${CMAKE_SOURCE_DIR}/Enclave/Enclave.edl --search-path ${CMAKE_SOURCE_DIR}/Enclave
+  COMMAND oeedger8r --untrusted ${CMAKE_SOURCE_DIR}/Enclave/Enclave.edl
+    --search-path ${CMAKE_SOURCE_DIR}/Enclave
+    --search-path ${OE_INCLUDEDIR}
+    --search-path ${OE_INCLUDEDIR}/openenclave/edl/sgx
   DEPENDS ${CMAKE_SOURCE_DIR}/Enclave/Enclave.edl
   OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/Enclave_u.h ${CMAKE_CURRENT_BINARY_DIR}/Enclave_u.c ${CMAKE_CURRENT_BINARY_DIR}/Enclave_args.h)
 
@@ -22,6 +25,6 @@ if ("$ENV{MODE}" STREQUAL "SIMULATE")
   target_compile_definitions(enclave_jni PUBLIC -DSIMULATE)
 endif()
 
-target_link_libraries(enclave_jni openenclave::oehost openenclave::oehostverify)
+target_link_libraries(enclave_jni openenclave::oehost)
 
 install(TARGETS enclave_jni DESTINATION lib)

src/enclave/App/SGXEnclave.h

Lines changed: 2 additions & 2 deletions
@@ -38,11 +38,11 @@ extern "C" {
     JNIEnv *, jobject, jlong, jbyteArray, jbyteArray);
 
   JNIEXPORT jbyteArray JNICALL
-  Java_edu_berkeley_cs_rise_opaque_execution_SGXEnclave_ScanCollectLastPrimary(
+  Java_edu_berkeley_cs_rise_opaque_execution_SGXEnclave_NonObliviousSortMergeJoin(
     JNIEnv *, jobject, jlong, jbyteArray, jbyteArray);
 
   JNIEXPORT jbyteArray JNICALL
-  Java_edu_berkeley_cs_rise_opaque_execution_SGXEnclave_NonObliviousSortMergeJoin(
+  Java_edu_berkeley_cs_rise_opaque_execution_SGXEnclave_BroadcastNestedLoopJoin(
    JNIEnv *, jobject, jlong, jbyteArray, jbyteArray, jbyteArray);
 
  JNIEXPORT jobject JNICALL

src/enclave/CMakeLists.txt

Lines changed: 8 additions & 7 deletions
@@ -1,13 +1,17 @@
 cmake_minimum_required(VERSION 3.13)
 
 project(OpaqueEnclave)
-
 enable_language(ASM)
 
 option(FLATBUFFERS_LIB_DIR "Location of Flatbuffers library headers.")
 option(FLATBUFFERS_GEN_CPP_DIR "Location of Flatbuffers generated C++ files.")
 
-find_package(OpenEnclave CONFIG REQUIRED)
+set(OE_MIN_VERSION 0.12.0)
+find_package(OpenEnclave ${OE_MIN_VERSION} CONFIG REQUIRED)
+
+set(OE_CRYPTO_LIB
+    mbed
+    CACHE STRING "Crypto library used by enclaves.")
 
 include_directories(App)
 include_directories(${CMAKE_BINARY_DIR}/App)
@@ -18,7 +22,7 @@ include_directories(${CMAKE_BINARY_DIR}/Enclave)
 include_directories(ServiceProvider)
 include_directories(${FLATBUFFERS_LIB_DIR})
 include_directories(${FLATBUFFERS_GEN_CPP_DIR})
-include_directories("/opt/openenclave/include")
+include_directories(${OE_INCLUDEDIR})
 
 if(CMAKE_SIZEOF_VOID_P EQUAL 4)
   set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -m32")
@@ -31,14 +35,11 @@ set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} -O0 -g -DDEBUG -UNDEBUG -UED
 set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} -O2 -DNDEBUG -DEDEBUG -UDEBUG")
 set(CMAKE_CXX_FLAGS_PROFILE "${CMAKE_CXX_FLAGS_PROFILE} -O2 -DNDEBUG -DEDEBUG -UDEBUG -DPERF")
 
-message("openssl rsa -in $ENV{OPAQUE_HOME}/private_key.pem -pubout -out $ENV{OPAQUE_HOME}/public_key.pub")
-message("$ENV{OPAQUE_HOME}/public_key.pub")
-
 add_custom_target(run ALL
   DEPENDS $ENV{OPAQUE_HOME}/public_key.pub)
 
 add_custom_command(
-  COMMAND openssl rsa -in $ENV{OPAQUE_HOME}/private_key.pem -pubout -out $ENV{OPAQUE_HOME}/public_key.pub
+  COMMAND openssl rsa -in $ENV{PRIVATE_KEY_PATH} -pubout -out $ENV{OPAQUE_HOME}/public_key.pub
   OUTPUT $ENV{OPAQUE_HOME}/public_key.pub)
 
 add_subdirectory(App)
