Fix JSONDecodeError
when deserializing streamed JSON documents
#790
This PR fixes a bug in databroker where it attempts, and fails, to deserialize partially streamed JSON objects, because HTTP chunk boundaries do not align with the boundaries of the JSON objects in the data layer.
Description

The databroker.documents() interface fails to list all of the documents in certain cases, especially when large serialized JSON documents are involved. A test case, test_large_document, is written to exercise the large-JSON deserialization bug, and the bug is patched by modifying the databroker client's documents() interface to maintain a buffer (tail) until a complete JSON object has been streamed.

Motivation and Context
This bug was encountered while testing Prefect workflows at the NSLS-II CHX beamline. Databroker failed to list the documents associated with certain Bluesky runs, raising a JSONDecodeError.

How Has This Been Tested?
A unit test is added in databroker/tests/test_broker.py which uses the demo server at https://tiled-demo.blueskyproject.io to list documents associated with a hard-coded uid known to reproduce the error without the patch.
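For illustration, the tail-buffer approach described above can be sketched roughly as follows. This is a minimal, self-contained example assuming the server streams newline-delimited JSON; the function name iter_json_lines is hypothetical and is not databroker's actual API:

```python
import json


def iter_json_lines(chunks):
    """Reassemble newline-delimited JSON objects from arbitrary byte chunks.

    HTTP chunk boundaries need not align with JSON object boundaries, so the
    trailing partial line is kept in a buffer (``tail``) until the rest of
    the object arrives in a later chunk. Hypothetical sketch, not the actual
    databroker implementation.
    """
    tail = b""
    for chunk in chunks:
        lines = (tail + chunk).split(b"\n")
        # The last element is either empty (chunk ended on a newline) or an
        # incomplete JSON object; hold it back until more data arrives.
        tail = lines.pop()
        for line in lines:
            if line.strip():
                yield json.loads(line)
    # Flush any complete object left in the buffer at end of stream.
    if tail.strip():
        yield json.loads(tail)


# Example: the second JSON document is split across two HTTP chunks.
chunks = [b'{"a": 1}\n{"b":', b' 2}\n']
print(list(iter_json_lines(chunks)))  # -> [{'a': 1}, {'b': 2}]
```

Without the tail buffer, a naive json.loads on each chunk would raise JSONDecodeError on the fragment b'{"b":', which is the failure mode this PR addresses.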