Use Streams instead of Events #186
Comments
As far as a time frame, there is none. Everyone wants this to happen; it just isn't a priority for anyone right now. If it's a priority for you, you're likely going to be more motivated to get it done than any of the maintainers. Speaking for myself and @rossipedia, our priorities have been to fix bugs and implement missing features we need (bulk insert, SQL Server Availability Groups), not performance. I believe we would eventually get around to this, but I don't know when.

I think the way to learn the codebase is to dive in. I haven't really given a ton of thought to how streams could be integrated, but as a primer: the token-stream-parser reads from the TDS stream in the nextToken() method, which calls the token parsers. In particular, the one you probably care about most is the row-token-parser, which returns a

Make sure to talk often about how you're going about the implementation so we can discuss the impacts before it gets out of control, and make sure it's something we'll accept.
@stephen-bartell have you started working on this?

@bretcope my assumption is: the TDS protocol response packet size is negotiated at the beginning (http://msdn.microsoft.com/en-us/library/dd305039.aspx, as specified here https://github.com/pekim/tedious/blob/master/src/login7-payload.coffee line 307), but after the packet is received and understood by tedious, based on the incoming row token (https://github.com/pekim/tedious/blob/master/src/token/token-stream-parser.coffee) the row token parser uses event emitters. And all of this operates on buffers. In https://github.com/pekim/tedious/blob/master/src/token/token-stream-parser.coffee (line 36) there is an addBuffer action which allows adding the incoming binary data from TDS.

@stephen-bartell so when you pointed out replacing the event emitters with streams, did you mean translating the data into readable streams instead of events? Please let me know if I am completely wrong about all this :)
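The addBuffer idea mentioned above exists because TDS payload bytes can arrive split across network reads, so the parser has to keep any partial token and resume once more bytes arrive. A minimal sketch of that buffering pattern (the class name and the one-byte length-prefixed token format are assumptions for illustration, not tedious's real token layout):

```javascript
// Toy incremental parser: each token is 1 length byte followed by that
// many payload bytes. Partial tokens are held until the rest arrives.
class BufferingParser {
  constructor(onToken) {
    this.pending = Buffer.alloc(0);
    this.onToken = onToken;
  }

  addBuffer(chunk) {
    this.pending = Buffer.concat([this.pending, chunk]);
    while (this.pending.length >= 1) {
      const len = this.pending.readUInt8(0);
      if (this.pending.length < 1 + len) break; // wait for more data
      this.onToken(this.pending.toString('utf8', 1, 1 + len));
      this.pending = this.pending.subarray(1 + len);
    }
  }
}
```

Even if a token straddles two `addBuffer` calls, it is emitted exactly once when the second chunk completes it.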
Hi, please let me know your feedback on the note above. Is that the correct direction?
There's nothing that stands out to me as wrong at first glance. |
I think it's probably the best approach to emit events for metadata (such as login events and column metadata and such), and save the A
Yeah, a transform on the stream is what we'd want. |
Some work is being done on this in #225. I'm closing this issue in favor of keeping progress updates on the PR.
I'm aware that request is not a real stream, just an event emitter. It would be very advantageous if request streamed. I've seen it mentioned in a couple of threads here that streaming is somewhat planned, like a nice-to-have there just hasn't been time to do.
So, first, is there any solid idea on when that time might come?
Second, I'm willing to do it; I'm just totally unfamiliar with the codebase. So if you guys are down to have another body in the code, could I get some cheat codes on where to start looking to implement this feature?
Please inform either way; a lack of streams is making me throw things at the wall ;) I'd like to help.