[FEATURE REQUEST] DataLakeFileClient read from InputStream #19612
Comments
Hi @ppalaga Thank you for reporting this issue.
Thanks for the explanation, @gapra-msft. I agree this can be solved by improving the documentation. While I see that there are other endpoints of the API that allow getting the file without the newline appended, …
@ppalaga Ah, I see the problem now. We do not currently support reading Data Lake files from an InputStream (and correspondingly writing to them via an OutputStream), but it is on our radar, and I'd be happy to tag this as a feature request. That said, as a temporary workaround, we do currently support reading from an InputStream using our blobs library. You can instantiate a BlobClient that points to the Data Lake file (just replace the dfs in the endpoint with blob) and use the openInputStream() API to read the file; this will work as expected.
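For illustration, a minimal sketch of that workaround. Only the "blob instead of dfs" endpoint swap and `openInputStream()` come from the comment above; the account, container, path, and SAS credential are placeholders:

```java
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobClientBuilder;
import com.azure.storage.blob.specialized.BlobInputStream;

public class DataLakeReadWorkaround {
    public static void main(String[] args) throws Exception {
        // Point a BlobClient at the Data Lake file: note "blob" in the endpoint
        // where the Data Lake endpoint would have "dfs".
        BlobClient blobClient = new BlobClientBuilder()
                .endpoint("https://myaccount.blob.core.windows.net") // placeholder account
                .sasToken("<sas-token>")                             // placeholder credential
                .containerName("my-filesystem")  // the Data Lake file system name
                .blobName("path/to/file.csv")    // the Data Lake file path
                .buildClient();

        // openInputStream() streams the blob content instead of buffering it all in memory.
        try (BlobInputStream in = blobClient.openInputStream()) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                // process buffer[0..n)
            }
        }
    }
}
```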
That would be nice, thanks!
What a trick! Thanks!
Dev notes for the feature request: OutputStream will call into DataLakeFileClient's buffered upload, using a similar pattern to BlockBlobOutputStream.
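A rough sketch of how such an OutputStream could delegate to the existing append/flush APIs, mirroring the buffered-upload pattern mentioned above. This is not the SDK's actual implementation: buffering, retries, and thread safety are deliberately simplified, and the client is assumed to point at an already-created file.

```java
import com.azure.storage.file.datalake.DataLakeFileClient;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.OutputStream;

class SimpleDataLakeOutputStream extends OutputStream {
    private final DataLakeFileClient client;
    private final byte[] buffer = new byte[4 * 1024 * 1024]; // 4 MiB staging buffer
    private int count = 0;       // bytes currently staged in the buffer
    private long fileOffset = 0; // total bytes already appended to the file

    SimpleDataLakeOutputStream(DataLakeFileClient client) {
        this.client = client;
    }

    @Override
    public void write(int b) throws IOException {
        if (count == buffer.length) {
            stage();
        }
        buffer[count++] = (byte) b;
    }

    private void stage() {
        // Append the staged bytes at the current offset without committing them yet.
        client.append(new ByteArrayInputStream(buffer, 0, count), fileOffset, count);
        fileOffset += count;
        count = 0;
    }

    @Override
    public void close() throws IOException {
        if (count > 0) {
            stage();
        }
        // Commit all appended data up to the final offset.
        client.flush(fileOffset, true);
    }
}
```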
Hey @gapra-msft, …
Hi @omarsmak Thank you for posting your question. I am following up with my team to figure out what the invalid parameters for the query method are. I believe \0 is one such character for the record separator.
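For context, the record separator referred to here is configured on the query serialization. A minimal sketch of where it is set, assuming the `FileQueryDelimitedSerialization` and `FileQueryOptions` types from `azure-storage-file-datalake` (exact package names may differ by SDK version):

```java
import com.azure.storage.file.datalake.models.FileQueryDelimitedSerialization;
import com.azure.storage.file.datalake.options.FileQueryOptions;

FileQueryDelimitedSerialization csv = new FileQueryDelimitedSerialization()
        .setEscapeChar('\0')
        .setColumnSeparator(',')
        .setRecordSeparator('\n') // appended after each record in the query output
        .setFieldQuote('\0')
        .setHeadersPresent(false);

FileQueryOptions options = new FileQueryOptions("SELECT * from BlobStorage")
        .setInputSerialization(csv)
        .setOutputSerialization(csv);
// Pass `options` to the query overload that accepts FileQueryOptions.
```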
Thanks for the feedback. If indeed …
Hi @omarsmak It looks like there are two issues at play here.
Add "Pending" to LabProperties status (Azure#19612) Add "Pending" to LabProperties status, an extensible enum
Closing as #21322 added …
My intuitive expectation is that (1) `DataLakeFileClient.openQueryInputStream("SELECT * from BlobStorage")` is equivalent to (2) `DataLakeFileClient.read(OutputStream)`. However, the result of (1) always contains a `\n` character appended at the end, which (2) does not.

Steps to reproduce:
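(The original reproduction code was not captured here; below is a minimal sketch of the comparison described above, assuming a file whose content does not end with a newline, and placeholder endpoint and credentials.)

```java
import com.azure.storage.file.datalake.DataLakeFileClient;
import com.azure.storage.file.datalake.DataLakePathClientBuilder;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;

public class QueryVsReadRepro {
    public static void main(String[] args) throws Exception {
        DataLakeFileClient fileClient = new DataLakePathClientBuilder()
                .endpoint("https://myaccount.dfs.core.windows.net") // placeholder account
                .sasToken("<sas-token>")                            // placeholder credential
                .fileSystemName("my-filesystem")
                .pathName("path/to/file.csv") // content does not end with '\n'
                .buildFileClient();

        // (1) Read via the query API.
        byte[] queried;
        try (InputStream in = fileClient.openQueryInputStream("SELECT * from BlobStorage")) {
            queried = in.readAllBytes();
        }

        // (2) Read via the plain download API.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        fileClient.read(out);
        byte[] read = out.toByteArray();

        // Observed: `queried` has a trailing '\n' that `read` does not.
        System.out.println(java.util.Arrays.equals(queried, read));
    }
}
```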
Expected: both tests pass
Actual:
Perhaps the behavior is not an error; if so, where is it documented?