
Commit 6f978a7

Authored by andrewlawhh (Andrew Law), eric-feng-2011 (Eric Feng), and chester-leung
Fix merge errors in the test cases (#176)
* Support for multiple branched CaseWhen
* Interval (#116)
  * add date_add; interval SQL still running into issues
  * Add Interval SQL support
  * uncomment the other tests
  * resolve comments
  * change interval equality
  Co-authored-by: Eric Feng <[email protected]>
* Remove partition ID argument from enclaves
* Fix comments
* updates
* Modifications to integrate crumb, log-mac, and all-outputs-mac (WIP)
* Store log mac after each output buffer; add all-outputs-mac to each EncryptedBlocks (WIP)
* Add all_outputs_mac to all EncryptedBlocks once all log_macs have been generated
* Almost builds
* cpp builds
* Use ubyte for all_outputs_mac
* Use Mac for all_outputs_mac
* Hopefully this works for flatbuffers all_outputs_mac mutation; cpp builds
* Scala builds now too; running into error with union
* Everything builds; error with all_outputs_mac serialization. This commit uses all_outputs_mac as a Mac table
* Fixed bug; basic encryption / show works
* All single-partition tests pass; multiple-partition passes until TPC-H 9
* All tests pass except TPC-H 9 and skew join
* Comment TPC-H back in
* Check for the same number of ecalls per partition, with an exception for scanCollectLastPrimary
* First attempt at constructing the executed DAG
* Fix typos
* Rework graph
* Add log macs to graph nodes
* Construct expected DAG and refactor JobNode; refactor construction of executed DAG
* Implement "paths to sink" for a DAG
* Add crumb for last ecall
* Fix NULL handling for aggregation (#130)
  * Modify COUNT and SUM to correctly handle NULL values
  * Change AVERAGE to support NULL values
  * Fix
* Change operator matching from logical to physical (#129)
  * WIP
  * Fix
  * Unapply change
* Aggregation rewrite (#132)
* Updated build/sbt file (#135)
* Travis update (#137)
* Update breeze (#138)
* TPC-H test suite added (#136)
  * added TPC-H SQL files
  * functions updated to save temp view
  * main function skeleton done
  * load and clear done
  * fix clear
  * performQuery done
  * import cleanup; use OPAQUE_HOME
  * TPC-H 9 refactored to use SQL rather than DataFrame operations
  * removed ": Unit" and unused imports
  * added TestUtils.scala
  * moved all common initialization to TestUtils
  * update name
  * begin rewriting TPCH.scala to store persistent tables
  * invalid table name error
  * TPCH conversion to class started
  * compiles
  * added second case; cleared up names
  * added TPC-H 6 to check that persistent state has no issues
  * added functions for the last two tables
  * addressed most logic changes
  * DataFrame only loaded once
  * apply method in companion object
  * full test suite added
  * added testFunc parameter to testAgainstSpark
  * ignore #18
* Separate IN PR (#124)
  * finishing the IN expression: adding more tests and null support; need confirmation on null behavior, and it is unclear why an integer field is sufficient for strings
  * adding additional test
  * adding additional test
  * saving the concat implementation; it passes basic functionality tests
  * adding type-aware comparison and a better error message for the IN operator
  * adding null checking for the concat operator and one additional test
  * cleaning up the IN & Concat PR
  * deleting concat and prepping the branch for the IN PR
  * fixing null behavior: the result is now only null when there is no match and there is a null input
  * Build failed
  Co-authored-by: Ubuntu <[email protected]>
  Co-authored-by: Wenting Zheng <[email protected]>
  Co-authored-by: Wenting Zheng <[email protected]>
* Merge new aggregate
* Uncomment log_mac_lst clear
* Clean up comments
* Separate Concat PR (#125): implementation of the CONCAT expression.
  Co-authored-by: Ubuntu <[email protected]>
  Co-authored-by: Wenting Zheng <[email protected]>
* Clean up comments in other files
* Update pathsEqual to be less conservative
* Remove print statements from unit tests
* Removed calls to toSet in TPC-H tests (#140)
  * removed calls to toSet
  * added calls to toSet back where queries are unordered
* Documentation update (#148)
* Cluster Remote Attestation Fix (#146): the existing code only had RA working when run locally. This PR adds a five-second sleep to make sure that all executors are spun up successfully before attestation begins. Closes #147
* Upgrade to 3.0.1 (#144)
* Update two TPC-H queries (#149): tests for TPC-H 12 and 19 pass.
* TPC-H 20 Fix (#142)
  * string to StringType error
  * TPC-H 20 passes
  * cleanup
  * implemented changes
  * Decimal.toFloat
  Co-authored-by: Wenting Zheng <[email protected]>
* Add expected operator DAG generation from the executedPlan string
* Rebase
* Merge join update
* Integrate new join
* Add expected operator for SortExec
* Merge comp-integrity with join update
* Remove some print statements
* Construct expected DAG from the DataFrame physical plan
* Refactor collect and add an integrity-checking helper function to OpaqueOperatorTests
* Remove addExpectedOperator from JobVerificationEngine; add comments
* Implement expected DAG construction by doing graph manipulation on the DataFrame field instead of string parsing
* Fix merge errors in the test cases

Co-authored-by: Andrew Law <[email protected]>
Co-authored-by: Eric Feng <[email protected]>
Co-authored-by: Eric Feng <[email protected]>
Co-authored-by: Chester Leung <[email protected]>
Co-authored-by: Wenting Zheng <[email protected]>
Co-authored-by: octaviansima <[email protected]>
Co-authored-by: Chenyu Shi <[email protected]>
Co-authored-by: Ubuntu <[email protected]>
Co-authored-by: Wenting Zheng <[email protected]>
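Two behaviors mentioned in the history above are worth spelling out, since the test cases touched by this commit exercise them: Spark's `concat` returns NULL as soon as any argument is NULL, and `IN` evaluates to NULL only when there is no match and a NULL appears among the compared values. The following is a minimal plain-Spark sketch of both behaviors; it uses only vanilla Spark APIs (no Opaque encrypted DataFrames), and the column names are illustrative.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, concat, lit}

object NullSemanticsSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("null-semantics-sketch").master("local[*]").getOrCreate()
    import spark.implicits._

    // CONCAT: a single NULL argument makes the whole result NULL.
    val df = Seq(("abc", Some(1)), ("def", Option.empty[Int])).toDF("str", "x")
    df.select(concat(col("str"), lit(","), col("x"))).show()
    // First row -> "abc,1"; second row -> null, because column x is NULL there.

    // IN: TRUE on a match, FALSE when there is no match and no NULL,
    // NULL when there is no match but a NULL appears among the values.
    spark.sql("SELECT 1 IN (1, NULL) AS hit, 2 IN (1, 3) AS miss, 2 IN (1, NULL) AS unknown").show()
    // hit = true, miss = false, unknown = null

    spark.stop()
  }
}
```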
1 parent 5dc5561 commit 6f978a7

File tree

1 file changed: 0 additions, 48 deletions


src/test/scala/edu/berkeley/cs/rise/opaque/OpaqueOperatorTests.scala

Lines changed: 0 additions & 48 deletions
@@ -525,54 +525,6 @@ trait OpaqueOperatorTests extends OpaqueTestsBase { self =>
     integrityCollect(result)
   }
 
-  testAgainstSpark("concat with string") { securityLevel =>
-    val data = for (i <- 0 until 256) yield ("%03d".format(i) * 3, i.toString)
-    val df = makeDF(data, securityLevel, "str", "x")
-    df.select(concat(col("str"), lit(","), col("x"))).collect
-  }
-
-  testAgainstSpark("concat with other datatype") { securityLevel =>
-    // float causes a formatting issue where Opaque outputs 1.000000 and Spark produces 1.0, so the following line is commented out
-    // val data = for (i <- 0 until 3) yield ("%03d".format(i) * 3, i, 1.0f)
-    // dates cannot be serialized, so they are not supported either
-    // Opaque does not support byte
-    val data = for (i <- 0 until 3) yield ("%03d".format(i) * 3, i, null.asInstanceOf[Int], "")
-    val df = makeDF(data, securityLevel, "str", "int", "null", "emptystring")
-    df.select(concat(col("str"), lit(","), col("int"), col("null"), col("emptystring"))).collect
-  }
-
-  testAgainstSpark("isin1") { securityLevel =>
-    val ids = Seq((1, 2, 2), (2, 3, 1))
-    val df = makeDF(ids, securityLevel, "x", "y", "id")
-    val c = $"id" isin ($"x", $"y")
-    val result = df.filter(c)
-    result.collect
-  }
-
-  testAgainstSpark("isin2") { securityLevel =>
-    val ids2 = Seq((1, 1, 1), (2, 2, 2), (3, 3, 3), (4, 4, 4))
-    val df2 = makeDF(ids2, securityLevel, "x", "y", "id")
-    val c2 = $"id" isin (1, 2, 4, 5, 6)
-    val result = df2.filter(c2)
-    result.collect
-  }
-
-  testAgainstSpark("isin with string") { securityLevel =>
-    val ids3 = Seq(("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa", "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa", "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"), ("b", "b", "b"), ("c", "c", "c"), ("d", "d", "d"))
-    val df3 = makeDF(ids3, securityLevel, "x", "y", "id")
-    val c3 = $"id" isin ("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa", "b", "c", "d", "e")
-    val result = df3.filter(c3)
-    result.collect
-  }
-
-  testAgainstSpark("isin with null") { securityLevel =>
-    val ids4 = Seq((1, 1, 1), (2, 2, 2), (3, 3, null.asInstanceOf[Int]), (4, 4, 4))
-    val df4 = makeDF(ids4, securityLevel, "x", "y", "id")
-    val c4 = $"id" isin (null.asInstanceOf[Int])
-    val result = df4.filter(c4)
-    result.collect
-  }
-
   testAgainstSpark("between") { securityLevel =>
     val data = for (i <- 0 until 256) yield (i.toString, i)
     val df = makeDF(data, securityLevel, "word", "count")
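For context on what the deleted blocks were doing: each test body builds a DataFrame through `makeDF` at the given `securityLevel`, and the suite checks the result against what vanilla Spark produces for the same query. Below is a rough, self-contained sketch of that compare-against-Spark pattern for the "concat with string" case, using plain Spark on both sides; in the real suite the second branch would go through Opaque's encrypted DataFrame path, and the structure here is a simplified assumption rather than the project's actual test harness.

```scala
import org.apache.spark.sql.{DataFrame, Row, SparkSession}
import org.apache.spark.sql.functions.{col, concat, lit}

object CompareAgainstSparkSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("compare-against-spark-sketch").master("local[*]").getOrCreate()
    import spark.implicits._

    val data = for (i <- 0 until 256) yield ("%03d".format(i) * 3, i.toString)

    // The query under test, written once so both sides run identical logic.
    def query(df: DataFrame): Seq[Row] =
      df.select(concat(col("str"), lit(","), col("x"))).collect().toSeq

    // Baseline: plain Spark.
    val expected = query(data.toDF("str", "x"))

    // Stand-in for the encrypted side: in the real suite this DataFrame would be
    // built with makeDF(data, securityLevel, "str", "x") and executed inside enclaves.
    val actual = query(data.toDF("str", "x"))

    assert(expected == actual, "encrypted result should match the Spark baseline")
    spark.stop()
  }
}
```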
