This makes three changes in preparation for switching the docs to
Asciidoctor:
1. Fixes a broken link. As a side effect, this fixes a missing emphasis
in Asciidoctor that was caused by parsing issues with the `_` in the old
link.
2. Fixes an `added` macro that renders "funny" in Asciidoctor.
3. Replaces a tab in a code example with spaces. AsciiDoc was doing this
automatically but Asciidoctor preserves the tab. We don't need the tab.
docs/src/reference/asciidoc/core/configuration.adoc: 4 additions & 0 deletions
@@ -533,12 +533,16 @@ added[2.1]
 added[2.2]
 `es.net.proxy.https.host`:: Https proxy host name
+
 added[2.2]
 `es.net.proxy.https.port`:: Https proxy port
+
 added[2.2]
 `es.net.proxy.https.user`:: Https proxy user name
+
 added[2.2]
 `es.net.proxy.https.pass`:: Https proxy password
+
 added[2.2]
 `es.net.proxy.https.use.system.props`(default yes):: Whether the use the system Https proxy properties (namely `https.proxyHost` and `https.proxyPort`) or not
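As context for the settings touched above, here is a minimal sketch of one way the `es.net.proxy.https.*` options might be supplied, through a `SparkConf`. The proxy host, port, and credentials are placeholders, and routing the settings through `SparkConf` is an assumption about the usual elasticsearch-hadoop configuration path rather than part of this change:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Placeholder values; only the es.net.proxy.https.* keys come from the docs above.
val conf = new SparkConf()
  .setAppName("es-https-proxy-sketch")
  .set("es.net.proxy.https.host", "proxy.example.com")
  .set("es.net.proxy.https.port", "8443")
  .set("es.net.proxy.https.user", "proxyuser")
  .set("es.net.proxy.https.pass", "proxypass")
  // Skip the JVM-wide https.proxyHost/https.proxyPort lookup and use the values above.
  .set("es.net.proxy.https.use.system.props", "false")

val sc = new SparkContext(conf)
```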
-{es} allows each document to have its own http://www.elasticsearch.org/guide/en/elasticsearch/guide/current/\_document\_metadata.html[metadata]. As explained above, through the various <<cfg-mapping, mapping>> options one can customize these parameters so that their values are extracted from their belonging document. Further more, one can even include/exclude what parts of the data are sent back to {es}. In Spark, {eh} extends this functionality allowing metadata to be supplied _outside_ the document itself through the use of http://spark.apache.org/docs/latest/programming-guide.html#working-with-key-value-pairs[_pair_ ++RDD++s].
+{es} allows each document to have its own {ref}/mapping-fields.html[metadata]. As explained above, through the various <<cfg-mapping, mapping>> options one can customize these parameters so that their values are extracted from their belonging document. Further more, one can even include/exclude what parts of the data are sent back to {es}. In Spark, {eh} extends this functionality allowing metadata to be supplied _outside_ the document itself through the use of http://spark.apache.org/docs/latest/programming-guide.html#working-with-key-value-pairs[_pair_ ++RDD++s].
 
 In other words, for ++RDD++s containing a key-value tuple, the metadata can be extracted from the key and the value used as the document source.
 
 The metadata is described through the +Metadata+ Java http://docs.oracle.com/javase/tutorial/java/javaOO/enum.html[enum] within +org.elasticsearch.spark.rdd+ package which identifies its type - +id+, +ttl+, +version+, etc...
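Since this passage describes supplying metadata through pair ++RDD++s keyed by the `Metadata` enum, a short Scala sketch may help. It assumes the `saveToEsWithMeta` enrichment that elasticsearch-spark adds to pair RDDs; the index name and sample documents are invented for illustration:

```scala
import org.apache.spark.SparkContext
import org.elasticsearch.spark._               // assumed: enriches pair RDDs with saveToEsWithMeta
import org.elasticsearch.spark.rdd.Metadata._  // the Metadata enum mentioned above (ID, TTL, VERSION, ...)

// Assumes an existing SparkContext named `sc`; documents and index are illustrative.
val otp = Map("iata" -> "OTP", "name" -> "Otopeni")
val muc = Map("iata" -> "MUC", "name" -> "Munich")

// The key of each tuple carries the metadata, the value is the document source.
val otpMeta = Map(ID -> 1, TTL -> "3h")
val mucMeta = Map(ID -> 2, VERSION -> "23")

val airportsRDD = sc.makeRDD(Seq((otpMeta, otp), (mucMeta, muc)))
airportsRDD.saveToEsWithMeta("airports/2015")
```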
@@ -924,7 +924,7 @@ jssc.start();
 [[spark-streaming-write-meta]]
 ==== Handling document metadata
 
-{es} allows each document to have its own http://www.elasticsearch.org/guide/en/elasticsearch/guide/current/\_document\_metadata.html[metadata]. As explained above, through the various <<cfg-mapping, mapping>> options one can customize these parameters so that their values are extracted from their belonging document. Further more, one can even include/exclude what parts of the data are sent back to {es}. In Spark, {eh} extends this functionality allowing metadata to be supplied _outside_ the document itself through the use of http://spark.apache.org/docs/latest/programming-guide.html#working-with-key-value-pairs[_pair_ ++RDD++s].
+{es} allows each document to have its own {ref}/mapping-fields.html[metadata]. As explained above, through the various <<cfg-mapping, mapping>> options one can customize these parameters so that their values are extracted from their belonging document. Further more, one can even include/exclude what parts of the data are sent back to {es}. In Spark, {eh} extends this functionality allowing metadata to be supplied _outside_ the document itself through the use of http://spark.apache.org/docs/latest/programming-guide.html#working-with-key-value-pairs[_pair_ ++RDD++s].
 
 This is no different in Spark Streaming. For ++DStreams++s containing a key-value tuple, the metadata can be extracted from the key and the value used as the document source.
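By analogy with the RDD case, a hedged sketch of the Spark Streaming variant follows. It assumes a `saveToEsWithMeta` enrichment for pair ++DStream++s (via `org.elasticsearch.spark.streaming._`); the queue-backed stream and index name are purely illustrative:

```scala
import scala.collection.mutable

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.elasticsearch.spark.streaming._     // assumed: adds saveToEsWithMeta to pair DStreams
import org.elasticsearch.spark.rdd.Metadata._  // ID, TTL, VERSION, ...

// Assumes an existing SparkContext named `sc`.
val ssc = new StreamingContext(sc, Seconds(1))

val doc = Map("iata" -> "SFO", "name" -> "San Francisco")
val meta = Map(ID -> 1, TTL -> "1d")

// A queue-backed DStream keyed by metadata, valued by the document source.
val rddQueue = mutable.Queue(sc.makeRDD(Seq((meta, doc))))
ssc.queueStream(rddQueue).saveToEsWithMeta("airports/2015")

ssc.start()
```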