@@ -74,13 +74,13 @@ You can link against this library in your program at the following coordinates:
 </tr>
 <tr>
 <td>
-  <pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.11<br>version: 2.7.5</pre>
+  <pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.11<br>version: 2.7.6</pre>
 </td>
 <td>
-  <pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.12<br>version: 2.7.5</pre>
+  <pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.12<br>version: 2.7.6</pre>
 </td>
 <td>
-  <pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.13<br>version: 2.7.5</pre>
+  <pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.13<br>version: 2.7.6</pre>
 </td>
 </tr>
 </table>
@@ -91,17 +91,17 @@ This package can be added to Spark using the `--packages` command line option. F
 
 ### Spark compiled with Scala 2.11
 ```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.11:2.7.5
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.11:2.7.6
 ```
 
 ### Spark compiled with Scala 2.12
 ```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.7.5
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.7.6
 ```
 
 ### Spark compiled with Scala 2.13
 ```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.13:2.7.5
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.13:2.7.6
 ```
 
 ## Usage
@@ -238,18 +238,18 @@ to decode various binary formats.
 
 The jars that you need to get are:
 
-* spark-cobol_2.12-2.7.5.jar
-* cobol-parser_2.12-2.7.5.jar
+* spark-cobol_2.12-2.7.6.jar
+* cobol-parser_2.12-2.7.6.jar
 * scodec-core_2.12-1.10.3.jar
 * scodec-bits_2.12-1.1.4.jar
 
 > Versions older than 2.7.1 also need `antlr4-runtime-4.8.jar`.
 
 After that you can specify these jars in `spark-shell` command line. Here is an example:
 ```
-$ spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.7.5
+$ spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.7.6
 or
-$ spark-shell --master yarn --deploy-mode client --driver-cores 4 --driver-memory 4G --jars spark-cobol_2.12-2.7.5.jar,cobol-parser_2.12-2.7.5.jar,scodec-core_2.12-1.10.3.jar,scodec-bits_2.12-1.1.4.jar
+$ spark-shell --master yarn --deploy-mode client --driver-cores 4 --driver-memory 4G --jars spark-cobol_2.12-2.7.6.jar,cobol-parser_2.12-2.7.6.jar,scodec-core_2.12-1.10.3.jar,scodec-bits_2.12-1.1.4.jar
 
 Setting default log level to "WARN".
 To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
@@ -320,7 +320,7 @@ The fat jar will have '-bundle' suffix. You can also download pre-built bundles
 
 Then, run `spark-shell` or `spark-submit` adding the fat jar as the option.
 ```sh
-$ spark-shell --jars spark-cobol_2.12_3.3-2.7.6-SNAPSHOT-bundle.jar
+$ spark-shell --jars spark-cobol_2.12_3.3-2.7.7-SNAPSHOT-bundle.jar
 ```
 
 > <b>A note for building and running tests on Windows</b>
@@ -1771,6 +1771,14 @@ at org.apache.hadoop.io.nativeio.NativeIO$POSIX.getStat(NativeIO.java:608)
 A: Update hadoop dll to version 3.2.2 or newer.
 
 ## Changelog
+- #### 2.7.6 released 26 September 2024.
+  - [#710](https://github.com/AbsaOSS/cobrix/issues/710) Fix index generation for files with record length fields or expressions.
+  - [#712](https://github.com/AbsaOSS/cobrix/issues/712) Add an option for explicitly logging layout positions (`false` by default).
+    ```scala
+    // Enable logging of layout positions
+    .option("debug_layout_positions", "true")
+    ```
+
 - #### 2.7.5 released 19 August 2024.
   - [#703](https://github.com/AbsaOSS/cobrix/issues/703) Add maximum length for generated segment id fields, like `seg_id0`, `seg_id1`, etc.
 
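For context on the changelog entry above: the new `debug_layout_positions` option is set like any other `spark-cobol` read option. A minimal sketch of where it fits, assuming a running `spark-shell` session with the 2.7.6 package loaded; the copybook and data paths are hypothetical placeholders, not from this commit:

```scala
// Sketch only: read a mainframe file with spark-cobol 2.7.6,
// asking the reader to log field layout positions (#712).
val df = spark.read
  .format("cobol")
  .option("copybook", "/path/to/copybook.cpy")     // hypothetical path
  .option("debug_layout_positions", "true")        // off ("false") by default
  .load("/path/to/ebcdic/data")                    // hypothetical path
df.printSchema()
```

With the option enabled, the computed record layout is written to the logs, which can help diagnose copybook mapping issues; leaving it at the default keeps the logs quiet.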