@@ -50,13 +50,38 @@ You can configure the following properties when reading data from MongoDB in str
         with a comma.
       |
       | To learn more about specifying multiple collections, see :ref:`spark-specify-multiple-collections`.
-
+
   * - ``comment``
     - | The comment to append to the read operation. Comments appear in the
         :manual:`output of the Database Profiler </reference/database-profiler>`.
       |
       | **Default:** None

+   * - ``mode``
+     - | The parsing strategy to use when handling documents that don't match the
+         expected schema. This option accepts the following values:
+
+       - ``ReadConfig.ParseMode.FAILFAST``: Throws an exception when parsing a document
+         that doesn't match the schema.
+       - ``ReadConfig.ParseMode.PERMISSIVE``: Sets fields to ``null`` when data types don't
+         match the schema. To store each invalid document as an extended JSON string,
+         combine this value with the ``columnNameOfCorruptRecord`` option.
+       - ``ReadConfig.ParseMode.DROPMALFORMED``: Ignores any document that doesn't match
+         the schema.
+
+       |
+       | **Default:** ``ReadConfig.ParseMode.FAILFAST``
+
+   * - ``columnNameOfCorruptRecord``
+     - | If you set the ``mode`` option to ``ReadConfig.ParseMode.PERMISSIVE``,
+         this option specifies the name of the new column that stores the invalid
+         document as extended JSON. If you're using an explicit schema, it must
+         include the name of the new column. If you're using an inferred schema,
+         the {+connector-short+} adds the new column to the end of the schema.
+       |
+       | **Default:** None
+
   * - ``mongoClientFactory``
     - | MongoClientFactory configuration key.
       | You can specify a custom implementation, which must implement the
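Below is a minimal PySpark sketch of how the new ``mode`` and ``columnNameOfCorruptRecord`` options could be combined with ``comment`` in a streaming read. The connection URI, database, collection, and the ``invalid`` column name are placeholders. The short string value ``"PERMISSIVE"`` (rather than the ``ReadConfig.ParseMode`` constant named in the table) and the unprefixed option keys are assumptions about how the connector accepts these settings, so treat this as an illustration rather than a verified snippet.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructField, StructType, StringType

# Placeholder connection string -- replace with your own deployment details.
spark = (
    SparkSession.builder
    .appName("mongodb-streaming-read-example")
    .config("spark.mongodb.read.connection.uri", "mongodb://127.0.0.1/")
    .getOrCreate()
)

# Explicit schema: with PERMISSIVE mode, documents that don't match it are
# stored as extended JSON in the corrupt-record column, so that column
# ("invalid" here, a hypothetical name) must be declared in the schema.
schema = StructType([
    StructField("name", StringType()),
    StructField("status", StringType()),
    StructField("invalid", StringType()),
])

stream_df = (
    spark.readStream
    .format("mongodb")
    .schema(schema)
    .option("database", "sales")                     # placeholder database
    .option("collection", "orders")                  # placeholder collection
    .option("comment", "nightly streaming ingest")   # appears in Database Profiler output
    .option("mode", "PERMISSIVE")                    # null out mismatched fields instead of failing
    .option("columnNameOfCorruptRecord", "invalid")  # keep each invalid document as extended JSON
    .load()
)

# Console sink only so the stream has somewhere to go while you inspect it.
query = (
    stream_df.writeStream
    .format("console")
    .trigger(processingTime="10 seconds")
    .start()
)
query.awaitTermination()
```

Per the table above, the same mismatched documents would abort the stream under the default ``FAILFAST`` setting and would be silently skipped under ``DROPMALFORMED``, so ``PERMISSIVE`` plus a corrupt-record column is the combination that keeps invalid documents inspectable.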