Merge branch 'main' into feature/wlm-cancellation
Signed-off-by: Ankit Jain <akjain@amazon.com>
jainankitk authored Sep 11, 2024
2 parents b78ca02 + 9354dd9 commit 7bb6b2c
Showing 15 changed files with 172 additions and 35 deletions.
5 changes: 4 additions & 1 deletion .github/workflows/assemble.yml
@@ -32,7 +32,10 @@ jobs:
if: runner.os == 'macos'
continue-on-error: true
run: |
brew install docker colima coreutils
# Force QEMU 9.0.2 usage
curl https://raw.githubusercontent.com/Homebrew/homebrew-core/f1a9cf104a9a51779c7a532b658c490f69974839/Formula/q/qemu.rb > qemu.rb
brew install qemu.rb
HOMEBREW_NO_INSTALLED_DEPENDENTS_CHECK=1 HOMEBREW_NO_AUTO_UPDATE=1 brew install docker colima coreutils
gtimeout 15m colima start
shell: bash
- name: Run Gradle (assemble)
7 changes: 2 additions & 5 deletions CHANGELOG.md
@@ -6,19 +6,15 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
## [Unreleased 2.x]
### Added
- Adding WithFieldName interface for QueryBuilders with fieldName ([#15705](https://github.com/opensearch-project/OpenSearch/pull/15705))
- Relax the join validation for Remote State publication ([#15471](https://github.com/opensearch-project/OpenSearch/pull/15471))
- Static RemotePublication setting added, removed experimental feature flag ([#15478](https://github.com/opensearch-project/OpenSearch/pull/15478))
- MultiTermQueries in keyword fields now default to `indexed` approach and gated behind cluster setting ([#15637](https://github.com/opensearch-project/OpenSearch/pull/15637))
- [Remote Publication] Upload incremental cluster state on master re-election ([#15145](https://github.com/opensearch-project/OpenSearch/pull/15145))
- Making _cat/allocation API use indexLevelStats ([#15292](https://github.com/opensearch-project/OpenSearch/pull/15292))
- Memory optimisations in _cluster/health API ([#15492](https://github.com/opensearch-project/OpenSearch/pull/15492))
- [Workload Management] QueryGroup resource cancellation framework changes ([#15651](https://github.com/opensearch-project/OpenSearch/pull/15651))

### Dependencies
- Bump `com.azure:azure-identity` from 1.13.0 to 1.13.2 ([#15578](https://github.com/opensearch-project/OpenSearch/pull/15578))
- Bump `protobuf` from 3.22.3 to 3.25.4 ([#15684](https://github.com/opensearch-project/OpenSearch/pull/15684))
- Bump `org.apache.logging.log4j:log4j-core` from 2.23.1 to 2.24.0 ([#15858](https://github.com/opensearch-project/OpenSearch/pull/15858))
- Bump `peter-evans/create-pull-request` from 6 to 7 ([#15863](https://github.com/opensearch-project/OpenSearch/pull/15863))
- Bump `com.nimbusds:oauth2-oidc-sdk` from 11.9.1 to 11.19.1 ([#15862](https://github.com/opensearch-project/OpenSearch/pull/15862))

### Changed

@@ -28,6 +24,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
### Removed

### Fixed
- Fix wildcard query containing escaped character ([#15737](https://github.com/opensearch-project/OpenSearch/pull/15737))

### Security

4 changes: 2 additions & 2 deletions gradle/wrapper/gradle-wrapper.properties
@@ -11,7 +11,7 @@

distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-8.10-all.zip
distributionUrl=https\://services.gradle.org/distributions/gradle-8.10.1-all.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionSha256Sum=682b4df7fe5accdca84a4d1ef6a3a6ab096b3efd5edf7de2bd8c758d95a93703
distributionSha256Sum=fdfca5dbc2834f0ece5020465737538e5ba679deeff5ab6c09621d67f8bb1a15
2 changes: 1 addition & 1 deletion plugins/repository-azure/build.gradle
@@ -62,7 +62,7 @@ dependencies {
api 'com.microsoft.azure:msal4j-persistence-extension:1.3.0'
api "net.java.dev.jna:jna-platform:${versions.jna}"
api 'com.microsoft.azure:msal4j:1.17.0'
api 'com.nimbusds:oauth2-oidc-sdk:11.9.1'
api 'com.nimbusds:oauth2-oidc-sdk:11.19.1'
api 'com.nimbusds:nimbus-jose-jwt:9.40'
api 'com.nimbusds:content-type:2.3'
api 'com.nimbusds:lang-tag:1.7'
@@ -0,0 +1 @@
58db85a807a56ae76baffa519772271ad5808195

This file was deleted.

4 changes: 4 additions & 0 deletions release-notes/opensearch.release-notes-2.17.0.md
@@ -50,6 +50,10 @@
- ClusterManagerTaskThrottler Improvements ([#15508](https://github.com/opensearch-project/OpenSearch/pull/15508))
- Relax the join validation for Remote State publication ([#15471](https://github.com/opensearch-project/OpenSearch/pull/15471))
- Reset DiscoveryNodes in all transport node actions request ([#15131](https://github.com/opensearch-project/OpenSearch/pull/15131))
- [Remote Publication] Upload incremental cluster state on master re-election ([#15145](https://github.com/opensearch-project/OpenSearch/pull/15145))
- Static RemotePublication setting added, removed experimental feature flag ([#15478](https://github.com/opensearch-project/OpenSearch/pull/15478))
- Making _cat/allocation API use indexLevelStats ([#15292](https://github.com/opensearch-project/OpenSearch/pull/15292))
- Memory optimisations in _cluster/health API ([#15492](https://github.com/opensearch-project/OpenSearch/pull/15492))

### Dependencies
- Bump `netty` from 4.1.111.Final to 4.1.112.Final ([#15081](https://github.com/opensearch-project/OpenSearch/pull/15081))
@@ -1278,7 +1278,7 @@ public void writeVerifiableTo(BufferedChecksumStreamOutput out) throws IOException {
out.writeByte(state.id());
writeSettingsToStream(settings, out);
out.writeVLongArray(primaryTerms);
out.writeMapValues(mappings, (stream, val) -> val.writeTo(stream));
out.writeMapValues(mappings, (stream, val) -> val.writeVerifiableTo((BufferedChecksumStreamOutput) stream));
out.writeMapValues(aliases, (stream, val) -> val.writeTo(stream));
out.writeMap(customData, StreamOutput::writeString, (stream, val) -> val.writeTo(stream));
out.writeMap(
@@ -1293,6 +1293,44 @@ public void writeVerifiableTo(BufferedChecksumStreamOutput out) throws IOException {
}
}

@Override
public String toString() {
return new StringBuilder().append("IndexMetadata{routingNumShards=")
.append(routingNumShards)
.append(", index=")
.append(index)
.append(", version=")
.append(version)
.append(", state=")
.append(state)
.append(", settingsVersion=")
.append(settingsVersion)
.append(", mappingVersion=")
.append(mappingVersion)
.append(", aliasesVersion=")
.append(aliasesVersion)
.append(", primaryTerms=")
.append(Arrays.toString(primaryTerms))
.append(", aliases=")
.append(aliases)
.append(", settings=")
.append(settings)
.append(", mappings=")
.append(mappings)
.append(", customData=")
.append(customData)
.append(", inSyncAllocationIds=")
.append(inSyncAllocationIds)
.append(", rolloverInfos=")
.append(rolloverInfos)
.append(", isSystem=")
.append(isSystem)
.append(", context=")
.append(context)
.append("}")
.toString();
}

public boolean isSystem() {
return isSystem;
}
@@ -40,8 +40,10 @@
import org.opensearch.common.xcontent.XContentFactory;
import org.opensearch.common.xcontent.XContentHelper;
import org.opensearch.core.common.bytes.BytesReference;
import org.opensearch.core.common.io.stream.BufferedChecksumStreamOutput;
import org.opensearch.core.common.io.stream.StreamInput;
import org.opensearch.core.common.io.stream.StreamOutput;
import org.opensearch.core.common.io.stream.VerifiableWriteable;
import org.opensearch.core.xcontent.XContentBuilder;
import org.opensearch.index.mapper.DocumentMapper;
import org.opensearch.index.mapper.MapperService;
@@ -60,7 +62,7 @@
* @opensearch.api
*/
@PublicApi(since = "1.0.0")
public class MappingMetadata extends AbstractDiffable<MappingMetadata> {
public class MappingMetadata extends AbstractDiffable<MappingMetadata> implements VerifiableWriteable {
public static final MappingMetadata EMPTY_MAPPINGS = new MappingMetadata(MapperService.SINGLE_MAPPING_NAME, Collections.emptyMap());

private final String type;
@@ -164,6 +166,13 @@ public void writeTo(StreamOutput out) throws IOException {
out.writeBoolean(routingRequired);
}

@Override
public void writeVerifiableTo(BufferedChecksumStreamOutput out) throws IOException {
out.writeString(type());
source().writeVerifiableTo(out);
out.writeBoolean(routingRequired);
}

@Override
public boolean equals(Object o) {
if (this == o) return true;
@@ -38,6 +38,7 @@
import org.opensearch.common.xcontent.XContentFactory;
import org.opensearch.core.common.bytes.BytesArray;
import org.opensearch.core.common.bytes.BytesReference;
import org.opensearch.core.common.io.stream.BufferedChecksumStreamOutput;
import org.opensearch.core.common.io.stream.StreamInput;
import org.opensearch.core.common.io.stream.StreamOutput;
import org.opensearch.core.compress.Compressor;
@@ -169,6 +170,10 @@ public void writeTo(StreamOutput out) throws IOException {
out.writeByteArray(bytes);
}

public void writeVerifiableTo(BufferedChecksumStreamOutput out) throws IOException {
out.writeInt(crc32);
}
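The new `writeVerifiableTo` here writes only the CRC-32 of the compressed source rather than the full byte array, so two streams carrying the same mapping content produce the same checksum without re-serializing the payload. A minimal standalone sketch of that idea (hypothetical class and method names, not the OpenSearch stream classes):

```java
import java.util.zip.CRC32;

public class VerifiableWriteSketch {
    // Sketch of a "verifiable write": record only a CRC-32 of the payload,
    // mirroring the crc32-only write in the diff above.
    static int crc32Of(byte[] bytes) {
        CRC32 crc = new CRC32();
        crc.update(bytes, 0, bytes.length);
        return (int) crc.getValue(); // low 32 bits of the checksum
    }

    public static void main(String[] args) {
        byte[] a = "{\"type\":\"keyword\"}".getBytes();
        byte[] b = "{\"type\":\"keyword\"}".getBytes();
        // Equal payloads yield equal checksums, so two nodes serializing the
        // same mapping produce identical verifiable streams.
        if (crc32Of(a) != crc32Of(b)) throw new AssertionError();
        System.out.println("checksums match");
    }
}
```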

@Override
public boolean equals(Object o) {
if (this == o) return true;
@@ -40,7 +40,6 @@
import org.apache.lucene.util.automaton.RegExp;
import org.opensearch.common.lucene.BytesRefs;
import org.opensearch.common.lucene.Lucene;
import org.opensearch.common.regex.Regex;
import org.opensearch.common.unit.Fuzziness;
import org.opensearch.core.xcontent.XContentParser;
import org.opensearch.index.analysis.IndexAnalyzers;
@@ -430,22 +429,27 @@ public Query wildcardQuery(String value, MultiTermQuery.RewriteMethod method, bo
finalValue = value;
}
Predicate<String> matchPredicate;
if (value.contains("?")) {
Automaton automaton = WildcardQuery.toAutomaton(new Term(name(), finalValue));
CompiledAutomaton compiledAutomaton = new CompiledAutomaton(automaton);
Automaton automaton = WildcardQuery.toAutomaton(new Term(name(), finalValue));
CompiledAutomaton compiledAutomaton = new CompiledAutomaton(automaton);
if (compiledAutomaton.type == CompiledAutomaton.AUTOMATON_TYPE.SINGLE) {
// when type equals SINGLE, #compiledAutomaton.runAutomaton is null
matchPredicate = s -> {
if (caseInsensitive) {
s = s.toLowerCase(Locale.ROOT);
}
BytesRef valueBytes = BytesRefs.toBytesRef(s);
return compiledAutomaton.runAutomaton.run(valueBytes.bytes, valueBytes.offset, valueBytes.length);
return s.equals(finalValue);
};
} else if (compiledAutomaton.type == CompiledAutomaton.AUTOMATON_TYPE.ALL) {
return existsQuery(context);
} else if (compiledAutomaton.type == CompiledAutomaton.AUTOMATON_TYPE.NONE) {
return new MatchNoDocsQuery("Wildcard expression matches nothing");
} else {
matchPredicate = s -> {
if (caseInsensitive) {
s = s.toLowerCase(Locale.ROOT);
}
return Regex.simpleMatch(finalValue, s);
BytesRef valueBytes = BytesRefs.toBytesRef(s);
return compiledAutomaton.runAutomaton.run(valueBytes.bytes, valueBytes.offset, valueBytes.length);
};
}
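The branch on `compiledAutomaton.type` above special-cases patterns that need no automaton at all: a pattern that compiles to `SINGLE` matches exactly one string, `ALL` degenerates to an exists query, `NONE` matches nothing, and only the general case runs the automaton. A rough pure-Java stand-in for that decision (using `java.util.regex` in place of Lucene's `CompiledAutomaton`; the names and the regex fallback are illustrative, not the OpenSearch implementation):

```java
import java.util.function.Predicate;
import java.util.regex.Pattern;

public class WildcardPredicateSketch {
    // Stand-in for the diff's branch on CompiledAutomaton type:
    // no wildcards -> plain equality (SINGLE); "*" -> match-all (ALL);
    // otherwise fall back to a full pattern match (regex here,
    // a run automaton in the real code).
    static Predicate<String> matchPredicate(String pattern) {
        if (!pattern.contains("*") && !pattern.contains("?")) {
            return s -> s.equals(pattern);   // SINGLE: one possible value
        } else if (pattern.equals("*")) {
            return s -> true;                // ALL: behaves like an exists query
        } else {
            // Escape literal dots, then map '?' -> '.', '*' -> '.*'
            String regex = pattern.replace(".", "\\.").replace("?", ".").replace("*", ".*");
            Pattern p = Pattern.compile(regex);
            return s -> p.matcher(s).matches();
        }
    }

    public static void main(String[] args) {
        if (!matchPredicate("foo").test("foo")) throw new AssertionError();
        if (matchPredicate("foo").test("bar")) throw new AssertionError();
        if (!matchPredicate("*").test("anything")) throw new AssertionError();
        if (!matchPredicate("f?o*").test("fooBar")) throw new AssertionError();
        System.out.println("ok");
    }
}
```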

@@ -468,22 +472,30 @@ public Query wildcardQuery(String value, MultiTermQuery.RewriteMethod method, bo
// Package-private for testing
static Set<String> getRequiredNGrams(String value) {
Set<String> terms = new HashSet<>();

if (value.isEmpty()) {
return terms;
}

int pos = 0;
String rawSequence = null;
String currentSequence = null;
if (!value.startsWith("?") && !value.startsWith("*")) {
// Can add prefix term
currentSequence = getNonWildcardSequence(value, 0);
rawSequence = getNonWildcardSequence(value, 0);
currentSequence = performEscape(rawSequence);
if (currentSequence.length() == 1) {
terms.add(new String(new char[] { 0, currentSequence.charAt(0) }));
} else {
terms.add(new String(new char[] { 0, currentSequence.charAt(0), currentSequence.charAt(1) }));
}
} else {
pos = findNonWildcardSequence(value, pos);
currentSequence = getNonWildcardSequence(value, pos);
rawSequence = getNonWildcardSequence(value, pos);
}
while (pos < value.length()) {
boolean isEndOfValue = pos + currentSequence.length() == value.length();
boolean isEndOfValue = pos + rawSequence.length() == value.length();
currentSequence = performEscape(rawSequence);
if (!currentSequence.isEmpty() && currentSequence.length() < 3 && !isEndOfValue && pos > 0) {
// If this is a prefix or suffix of length < 3, then we already have a longer token including the anchor.
terms.add(currentSequence);
@@ -502,16 +514,16 @@ static Set<String> getRequiredNGrams(String value) {
terms.add(new String(new char[] { a, b, 0 }));
}
}
pos = findNonWildcardSequence(value, pos + currentSequence.length());
currentSequence = getNonWildcardSequence(value, pos);
pos = findNonWildcardSequence(value, pos + rawSequence.length());
rawSequence = getNonWildcardSequence(value, pos);
}
return terms;
}
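`getRequiredNGrams` above anchors prefix and suffix terms with a 0 character, so `"abc*"` demands a term beginning with the anchor while `"*abc"` demands one ending with it; single-character sequences get a two-character anchored term. A small sketch of that anchoring convention (the helper names mirror `prefixAnchored`/`suffixAnchored` used in the tests below, but this is a simplified restatement, not the real extraction):

```java
public class NGramAnchorSketch {
    // The diff marks string boundaries with a NUL (0) char: a required
    // prefix n-gram carries the anchor in front, a suffix n-gram behind.
    static String prefixAnchored(String s) {
        return "\0" + s;
    }

    static String suffixAnchored(String s) {
        return s + "\0";
    }

    public static void main(String[] args) {
        // A single-char prefix like the literal '*' in "\*..." becomes
        // the two-char term {0, '*'}, matching the length==1 branch above.
        String term = prefixAnchored("*");
        if (term.length() != 2 || term.charAt(0) != 0) throw new AssertionError();
        if (suffixAnchored("*").charAt(1) != 0) throw new AssertionError();
        System.out.println("ok");
    }
}
```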

private static String getNonWildcardSequence(String value, int startFrom) {
for (int i = startFrom; i < value.length(); i++) {
char c = value.charAt(i);
if (c == '?' || c == '*') {
if ((c == '?' || c == '*') && (i == 0 || value.charAt(i - 1) != '\\')) {
return value.substring(startFrom, i);
}
}
@@ -529,6 +541,22 @@ private static int findNonWildcardSequence(String value, int startFrom) {
return value.length();
}

private static String performEscape(String str) {
StringBuilder sb = new StringBuilder();
for (int i = 0; i < str.length(); i++) {
if (str.charAt(i) == '\\' && (i + 1) < str.length()) {
char c = str.charAt(i + 1);
if (c == '*' || c == '?') {
i++;
}
}
sb.append(str.charAt(i));
}
assert !sb.toString().contains("\\*");
assert !sb.toString().contains("\\?");
return sb.toString();
}
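`performEscape` above strips a backslash only when it escapes `*` or `?`, so that n-gram extraction sees the literal wildcard character instead of the escape sequence. A standalone restatement of that loop with a couple of worked cases (same escape rules as the diff, isolated from the mapper class):

```java
public class EscapeSketch {
    // Mirrors performEscape from the diff: drop a backslash only when it
    // escapes '*' or '?'; every other character is copied through.
    static String performEscape(String str) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < str.length(); i++) {
            if (str.charAt(i) == '\\' && (i + 1) < str.length()) {
                char c = str.charAt(i + 1);
                if (c == '*' || c == '?') {
                    i++; // skip the backslash, emit the literal wildcard char
                }
            }
            sb.append(str.charAt(i));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // "\*foo\?" unescapes to the literal characters *foo?
        if (!performEscape("\\*foo\\?").equals("*foo?")) throw new AssertionError();
        // Backslashes not followed by a wildcard are preserved.
        if (!performEscape("a\\b").equals("a\\b")) throw new AssertionError();
        System.out.println("ok");
    }
}
```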

@Override
public Query regexpQuery(
String value,
@@ -199,7 +199,30 @@ public void testWriteVerifiableTo() throws IOException {
),
randomNonNegativeLong()
);

String mappings = " {\n"
+ " \"_doc\": {\n"
+ " \"properties\": {\n"
+ " \"actiongroups\": {\n"
+ " \"type\": \"text\",\n"
+ " \"fields\": {\n"
+ " \"keyword\": {\n"
+ " \"type\": \"keyword\",\n"
+ " \"ignore_above\": 256\n"
+ " }\n"
+ " }\n"
+ " },\n"
+ " \"allowlist\": {\n"
+ " \"type\": \"text\",\n"
+ " \"fields\": {\n"
+ " \"keyword\": {\n"
+ " \"type\": \"keyword\",\n"
+ " \"ignore_above\": 256\n"
+ " }\n"
+ " }\n"
+ " }\n"
+ " }\n"
+ " }\n"
+ " }";
IndexMetadata metadata1 = IndexMetadata.builder("foo")
.settings(
Settings.builder()
@@ -220,11 +243,13 @@
.putRolloverInfo(info1)
.putRolloverInfo(info2)
.putInSyncAllocationIds(0, Set.of("1", "2", "3"))
.putMapping(mappings)
.build();

BytesStreamOutput out = new BytesStreamOutput();
BufferedChecksumStreamOutput checksumOut = new BufferedChecksumStreamOutput(out);
metadata1.writeVerifiableTo(checksumOut);
assertNotNull(metadata1.toString());

IndexMetadata metadata2 = IndexMetadata.builder(metadata1.getIndex().getName())
.settings(
@@ -246,6 +271,7 @@
.putRolloverInfo(info2)
.putRolloverInfo(info1)
.putInSyncAllocationIds(0, Set.of("3", "1", "2"))
.putMapping(mappings)
.build();

BytesStreamOutput out2 = new BytesStreamOutput();
@@ -88,6 +88,32 @@ public void testWildcardQuery() {
);
}

public void testEscapedWildcardQuery() {
MappedFieldType ft = new WildcardFieldMapper.WildcardFieldType("field");
Set<String> expectedTerms = new HashSet<>();
expectedTerms.add(prefixAnchored("*"));
expectedTerms.add(suffixAnchored("*"));

BooleanQuery.Builder builder = new BooleanQuery.Builder();
for (String term : expectedTerms) {
builder.add(new TermQuery(new Term("field", term)), BooleanClause.Occur.FILTER);
}

assertEquals(
new WildcardFieldMapper.WildcardMatchingQuery("field", builder.build(), "\\**\\*"),
ft.wildcardQuery("\\**\\*", null, null)
);

assertEquals(new WildcardFieldMapper.WildcardMatchingQuery("field", builder.build(), "\\*"), ft.wildcardQuery("\\*", null, null));

expectedTerms.remove(suffixAnchored("*"));
builder = new BooleanQuery.Builder();
for (String term : expectedTerms) {
builder.add(new TermQuery(new Term("field", term)), BooleanClause.Occur.FILTER);
}
assertEquals(new WildcardFieldMapper.WildcardMatchingQuery("field", builder.build(), "\\**"), ft.wildcardQuery("\\**", null, null));
}

public void testMultipleWildcardsInQuery() {
final String pattern = "a?cd*efg?h";
MappedFieldType ft = new WildcardFieldMapper.WildcardFieldType("field");