[Backport 2.x] Fix SuggestSearch.testSkipDuplicates by forcing refresh when indexing its test documents (#11068) #11141

Merged: 2 commits on Nov 10, 2023
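The fix itself is small: each test document is now indexed with the WAIT_UNTIL refresh policy, so the index request does not return until a refresh has made the document visible to search, and the suggest query issued right afterwards cannot race ahead of indexing. Below is a minimal sketch of that pattern outside the test framework, assuming an OpenSearch 2.x node Client; the WaitUntilIndexing class, the indexSuggestion helper, and its parameters are illustrative and not part of this PR.

import static org.opensearch.action.support.WriteRequest.RefreshPolicy.WAIT_UNTIL;

import java.util.Map;

import org.opensearch.action.index.IndexResponse;
import org.opensearch.client.Client;

final class WaitUntilIndexing {
    // Index one completion suggestion and return only after a refresh has made it
    // searchable, so a suggest query issued immediately afterwards can see it.
    // Class, method, and parameter names are illustrative only.
    static IndexResponse indexSuggestion(Client client, String index, String field, String suggestion, int weight) {
        return client.prepareIndex(index)
            .setRefreshPolicy(WAIT_UNTIL) // block the request until the document is visible to search
            .setSource(Map.of(field, Map.of("input", suggestion, "weight", weight)))
            .get();
    }
}

In the diff that follows, the same policy is applied by calling .setRefreshPolicy(WAIT_UNTIL) on each IndexRequestBuilder before the documents are handed to indexRandom(true, indexRequestBuilders).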
CHANGELOG.md (1 change: 1 addition & 0 deletions)
@@ -59,6 +59,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
### Fixed
- Fix class_cast_exception when passing int to _version and other metadata fields in ingest simulate API ([#10101](https://github.com/opensearch-project/OpenSearch/pull/10101))
- Adding version condition while adding geoshape doc values to the index, to ensure backward compatibility ([#11095](https://github.com/opensearch-project/OpenSearch/pull/11095))
- Fix SuggestSearch.testSkipDuplicates by forcing refresh when indexing its test documents ([#11068](https://github.com/opensearch-project/OpenSearch/pull/11068))

### Security

@@ -79,6 +79,7 @@
import java.util.Set;

import static org.opensearch.action.support.WriteRequest.RefreshPolicy.IMMEDIATE;
import static org.opensearch.action.support.WriteRequest.RefreshPolicy.WAIT_UNTIL;
import static org.opensearch.cluster.metadata.IndexMetadata.SETTING_NUMBER_OF_REPLICAS;
import static org.opensearch.cluster.metadata.IndexMetadata.SETTING_NUMBER_OF_SHARDS;
import static org.opensearch.common.xcontent.XContentFactory.jsonBuilder;
@@ -1171,6 +1172,7 @@ public void testSkipDuplicates() throws Exception {
createIndexAndMapping(mapping);
int numDocs = randomIntBetween(10, 100);
int numUnique = randomIntBetween(1, numDocs);
logger.info("Suggestion duplicate parameters: numDocs {} numUnique {}", numDocs, numUnique);
List<IndexRequestBuilder> indexRequestBuilders = new ArrayList<>();
int[] weights = new int[numUnique];
Integer[] termIds = new Integer[numUnique];
@@ -1180,8 +1182,10 @@ public void testSkipDuplicates() throws Exception {
int weight = randomIntBetween(0, 100);
weights[id] = Math.max(weight, weights[id]);
String suggestion = "suggestion-" + String.format(Locale.ENGLISH, "%03d", id);
logger.info("Creating {}, id {}, weight {}", suggestion, i, id, weight);
indexRequestBuilders.add(
client().prepareIndex(INDEX)
.setRefreshPolicy(WAIT_UNTIL)
.setSource(
jsonBuilder().startObject()
.startObject(FIELD)
@@ -1195,10 +1199,12 @@ public void testSkipDuplicates() throws Exception {
indexRandom(true, indexRequestBuilders);

Arrays.sort(termIds, Comparator.comparingInt(o -> weights[(int) o]).reversed().thenComparingInt(a -> (int) a));
logger.info("Expected terms id ordered {}", (Object[]) termIds);
String[] expected = new String[numUnique];
for (int i = 0; i < termIds.length; i++) {
expected[i] = "suggestion-" + String.format(Locale.ENGLISH, "%03d", termIds[i]);
}
logger.info("Expected suggestions field values {}", (Object[]) expected);
CompletionSuggestionBuilder completionSuggestionBuilder = SuggestBuilders.completionSuggestion(FIELD)
.prefix("sugg")
.skipDuplicates(true)
@@ -1207,6 +1213,7 @@
SearchResponse searchResponse = client().prepareSearch(INDEX)
.suggest(new SuggestBuilder().addSuggestion("suggestions", completionSuggestionBuilder))
.get();
logger.info("Search Response with Suggestions {}", searchResponse);
assertSuggestions(searchResponse, true, "suggestions", expected);
}
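To make the expected ordering above easier to follow, here is the same comparator applied to a few hypothetical weights (the values and the ExpectedOrderDemo class are illustrative, not taken from a test run): term ids are sorted by weight in descending order, with ties broken by ascending id.

import java.util.Arrays;
import java.util.Comparator;

final class ExpectedOrderDemo {
    public static void main(String[] args) {
        // Hypothetical weights per unique term id: id 0 -> 40, id 1 -> 90, id 2 -> 40.
        int[] weights = { 40, 90, 40 };
        Integer[] termIds = { 0, 1, 2 };
        // Same ordering as the test: weight descending, ties broken by ascending term id.
        Arrays.sort(termIds, Comparator.comparingInt((Integer o) -> weights[o]).reversed().thenComparingInt(a -> a));
        // Prints [1, 0, 2], i.e. suggestion-001, suggestion-000, suggestion-002.
        System.out.println(Arrays.toString(termIds));
    }
}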
