(PDB-2256) Fix terminus bug with large binary catalog data #1792
The terminus includes code to help users diagnose the source of data
that cannot be converted to UTF-8. There are several possible causes,
one of which is an incorrect (or unknown) character set for portions of
the catalog. When the invalid character data was large, the terminus
could hang while trying to include the debugging information.
This patch changes the terminus to look only for the first instance of
bad data. It also skips the extra calculations needed to build the
error context unless debug mode is enabled, so when debug mode is off
there should be no performance impact.
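As an illustrative sketch of the approach (in Python rather than the terminus's actual Ruby code; the function names and the `CONTEXT_BYTES` value are hypothetical, not taken from this patch), the idea is to stop at the first undecodable byte instead of scanning the whole payload, and to compute the surrounding context only when debug logging is enabled:

```python
import logging

logger = logging.getLogger("terminus")

CONTEXT_BYTES = 20  # hypothetical amount of surrounding data to include


def first_invalid_utf8(data: bytes):
    """Return (offset, context) for the first invalid UTF-8 sequence, or None."""
    try:
        data.decode("utf-8")
        return None
    except UnicodeDecodeError as err:
        # err.start is the offset of the first undecodable byte; stop here
        # rather than locating every bad sequence in a possibly huge payload.
        start = max(err.start - CONTEXT_BYTES, 0)
        end = min(err.start + CONTEXT_BYTES, len(data))
        return err.start, data[start:end]


def report_bad_data(data: bytes):
    # Only pay for the context calculation when debug output will be emitted.
    if not logger.isEnabledFor(logging.DEBUG):
        return
    found = first_invalid_utf8(data)
    if found:
        offset, context = found
        logger.debug("Invalid UTF-8 at byte %d, surrounding data: %r",
                     offset, context)
```

Guarding the work behind the debug-level check is what keeps the non-debug path free of any extra cost.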