When I sync MongoDB to Elasticsearch, I need to write into multiple Elasticsearch indexes because the MongoDB table is very large. Since the stream contains update operations, I can't use Elasticsearch's built-in rollover indexes; instead I need to split the indexes by each record's creation time. Currently, Flink CDC does not support routing a single MongoDB table to multiple downstream tables.
Could the Elasticsearch sink support writing to different indexes based on a specific field? Or would it be better to enhance the route function to support field-based routing? Or is this just a very niche use case?
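For illustration, here is a minimal sketch of the kind of per-record index routing I have in mind, written against the vanilla Flink Elasticsearch 7 connector's DataStream API rather than the Flink CDC pipeline API. The document type `OrderDoc`, its fields, and the `orders-` index prefix are all made up for the example:

```java
import org.apache.flink.connector.elasticsearch.sink.Elasticsearch7SinkBuilder;
import org.apache.flink.connector.elasticsearch.sink.ElasticsearchSink;
import org.apache.http.HttpHost;
import org.elasticsearch.action.update.UpdateRequest;

import java.io.Serializable;
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;
import java.util.Map;

public class TimeRoutedEsSink {

    // Hypothetical document shape: createdAt is set once on insert and
    // never changes, so inserts and later updates target the same index.
    public static class OrderDoc implements Serializable {
        public String id;
        public long createdAt;              // epoch millis, immutable after insert
        public Map<String, Object> fields;  // remaining document fields
    }

    private static final DateTimeFormatter MONTH =
        DateTimeFormatter.ofPattern("yyyy-MM").withZone(ZoneOffset.UTC);

    public static ElasticsearchSink<OrderDoc> build() {
        return new Elasticsearch7SinkBuilder<OrderDoc>()
            .setHosts(new HttpHost("localhost", 9200, "http"))
            .setEmitter((doc, context, indexer) -> {
                // Derive the target index from the record's creation time,
                // e.g. "orders-2023-07".
                String index = "orders-"
                    + MONTH.format(Instant.ofEpochMilli(doc.createdAt));
                // Upsert so that both inserts and updates from the CDC
                // stream are applied to the same time-bucketed index.
                indexer.add(new UpdateRequest(index, doc.id)
                    .doc(doc.fields)
                    .docAsUpsert(true));
            })
            .build();
    }
}
```

Because `createdAt` never changes after insert, an update is always routed back to the index that already holds the document, which is exactly the guarantee a rollover index cannot give me. Having something equivalent expressible in the Flink CDC route rules (or the Elasticsearch pipeline sink) is what I'm asking about.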