Prefetch while consuming from Stream #346
-
Hi @vivek-singh2,

See rabbitmq-stream-dotnet-client/RabbitMQ.Stream.Client/RawConsumer.cs, lines 414 to 424 at fa8f66b. At the moment this is hidden from the user; you can only configure the initial credits, like this:

```csharp
var c = await system.CreateRawConsumer(new RawConsumerConfig(streamsList[0])
{
    OffsetSpec = new OffsetTypeFirst(),
    InitialCredits = 1, // be careful when using high values here
});
```

The client asks for more credits only when a chunk is terminated.

About that: can you please elaborate more on the use case?
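To make the credit mechanics concrete, here is a minimal end-to-end sketch. It assumes the `MessageHandler` delegate takes `(consumer, context, message)` and uses a placeholder stream name; check `RawConsumerConfig` in your client version for the exact shape:

```csharp
using System.Threading.Tasks;
using RabbitMQ.Stream.Client;

// Connect with default settings (localhost); adjust StreamSystemConfig as needed.
var system = await StreamSystem.Create(new StreamSystemConfig());

var consumer = await system.CreateRawConsumer(new RawConsumerConfig("my-stream") // placeholder name
{
    OffsetSpec = new OffsetTypeFirst(),
    // Each credit lets the server send one chunk. The client grants the next
    // credit only after the current chunk is done, so a small value acts as a
    // coarse prefetch limit (per chunk, not per message).
    InitialCredits = 2,
    // Assumed delegate shape: (RawConsumer, MessageContext, Message) => Task.
    MessageHandler = async (rawConsumer, context, message) =>
    {
        // Handle the message here; keeping this fast keeps credits flowing.
        await Task.CompletedTask;
    }
});
```

Keep in mind that a single chunk can contain many messages, so this is a much coarser knob than the classic-queue prefetch count.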
-
Thanks @Gsantomaggio for the input. Is the initial credit only used when the consumer starts, or is it also used later? My use case: I need to consume messages from a super stream in near real time (the offset lag should stay close to 0, or a few hundred at most). The expected consumption rate is 11K messages per second. I am using three stream partitions at the moment and consuming about 900 messages per second, but the offset lag keeps increasing quickly and reaches 100K within a few minutes. The processing on the consumer side is just one DB hit per message, which takes 3-4 milliseconds. My aim is to increase consumer throughput and reach the 11K consumption rate.
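A rough back-of-the-envelope from the numbers above: a 3-4 ms synchronous DB hit caps one consumer at roughly 250-330 messages per second, so three partitions give about 750-1,000 messages per second, which matches the observed ~900/s and is far below the 11K target regardless of credits. One way to attack that bottleneck is to decouple consumption from the DB write and batch the writes. A sketch under those assumptions (`SaveBatchAsync` is a hypothetical bulk insert, and the `MessageHandler` shape is assumed):

```csharp
using System.Collections.Generic;
using System.Threading.Channels;
using System.Threading.Tasks;
using RabbitMQ.Stream.Client;

// Bounded buffer between the stream consumer and the DB writer; the bound
// provides back-pressure if the DB cannot keep up.
var buffer = Channel.CreateBounded<Message>(10_000);

var system = await StreamSystem.Create(new StreamSystemConfig());

// Consumer side: the handler only enqueues, so each message costs microseconds
// instead of a 3-4 ms DB round trip.
var consumer = await system.CreateRawConsumer(new RawConsumerConfig("my-stream") // placeholder name
{
    OffsetSpec = new OffsetTypeFirst(),
    InitialCredits = 2,
    MessageHandler = async (rawConsumer, context, message) =>
    {
        await buffer.Writer.WriteAsync(message);
    }
});

// Writer side: drain the buffer and do one DB round trip per batch.
_ = Task.Run(async () =>
{
    var batch = new List<Message>(500);
    await foreach (var message in buffer.Reader.ReadAllAsync())
    {
        batch.Add(message);
        if (batch.Count >= 500)
        {
            await SaveBatchAsync(batch); // hypothetical bulk insert into the DB
            batch.Clear();
        }
    }
});

// Hypothetical bulk-insert helper; replace with your own data-access code.
static Task SaveBatchAsync(IReadOnlyList<Message> batch) => Task.CompletedTask;
```

Raising InitialCredits alone will likely not close the gap if the handler itself is the bottleneck; it only changes how much data the server is allowed to push ahead of consumption.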
-
Is there a way to specify prefetch (QoS) while consuming from a stream (super stream)? By default the RabbitMQ UI dashboard shows a prefetch of 0 for the consumer, which I guess means unlimited messages. I am running a load test and want to control how many messages are read from the stream. Or, if streams work differently from classic queues here, kindly suggest another way.