Is there (or will there be) an option to validate a very large JSON file (up to 5GB) in chunks, e.g. via streaming, so that the whole JSON file never has to be held in memory?
In principle, I think it is possible, as serde supports deserialization without buffering, and the file could be accessed through mmap. I am not sure how much effort it would require, but I'd be happy to have support for this feature here :)
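For illustration, here is a minimal sketch of what the streaming side could look like in Rust today, assuming the large input can be split into independent records (e.g. newline-delimited JSON) rather than one giant top-level document, and using serde_json's StreamDeserializer together with a pre-compiled jsonschema-rs schema. This is not an existing jsonschema-rs streaming API, just one possible shape for it; the file name and per-record schema are made up for the example.

```rust
use std::fs::File;
use std::io::BufReader;

use jsonschema::JSONSchema;
use serde_json::Value;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical schema describing a single record; compiled once up front.
    let schema: Value = serde_json::json!({
        "type": "object",
        "properties": { "id": { "type": "integer" } },
        "required": ["id"]
    });
    let compiled = JSONSchema::compile(&schema).expect("schema must be valid");

    // Stream records from a large file without buffering the whole file.
    // Assumes the file is a sequence of JSON values (e.g. NDJSON),
    // not a single top-level array.
    let reader = BufReader::new(File::open("large.ndjson")?);
    let stream = serde_json::Deserializer::from_reader(reader).into_iter::<Value>();

    for (i, record) in stream.enumerate() {
        let record = record?;
        if let Err(errors) = compiled.validate(&record) {
            for error in errors {
                eprintln!("record {}: {}", i, error);
            }
        }
    }
    Ok(())
}
```

Validating a single 5 GB JSON document (rather than a stream of records) would still need support inside the validator itself, which is what this issue asks for.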
Can you give a first guess about when it might be available? I would love to use jsonschema-rs for our microservice, but since we need the service to handle very large JSON files within 2 weeks, I wonder whether I'll have to port it to a JVM language to be able to use https://github.com/worldturner/medeia-validator...
This would be awesome, since I haven't found any other JSON Schema validator in Python that is able to do this. For other languages, there is e.g. https://github.com/worldturner/medeia-validator.