Merge branch 'ing-bank:release/next' into add-test
sualeh authored Nov 19, 2023
2 parents 69ff689 + 9ffb712 commit ae3d11b
Showing 17 changed files with 551 additions and 209 deletions.
6 changes: 5 additions & 1 deletion CHANGELOG.md
@@ -38,7 +38,11 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
### Fixed
- Fix `NullPointerException` issue [#38](https://github.com/ing-bank/cassandra-jdbc-wrapper/issues/38) when a null
type name pattern is specified in a call to `CassandraDatabaseMetaData.getUDTs(String, String, String, int[])`.
- Add null safety on some methods of `CassandraMetadataResultSet`.
- Fix issue [#39](https://github.com/ing-bank/cassandra-jdbc-wrapper/issues/39): return `false` when the method
`isSearchable(int)` is called on the metadata of a result set without table or schema name (typically on
`CassandraMetadataResultSet`s).
- Fix issue preventing the retrieval of the metadata of an empty `CassandraMetadataResultSet`.
- Add null safety on some methods of `CassandraResultSet` and `CassandraMetadataResultSet`.

## [4.10.2] - 2023-11-01
### Fixed
22 changes: 18 additions & 4 deletions CONTRIBUTING.md
@@ -52,6 +52,12 @@ To run the tests, execute the following command:
```
mvn test
```
To run the tests, you need **Docker** installed on your machine: most of the tests in this project are based on
[Testcontainers for Cassandra](https://java.testcontainers.org/modules/databases/cassandra/), because testing a JDBC API
implementation requires ensuring that the driver is able to connect to a database and execute queries correctly.
For comparison, widely used JDBC drivers such as those for [PostgreSQL](https://github.com/pgjdbc/pgjdbc) or
[MS SQL Server](https://github.com/Microsoft/mssql-jdbc/) are also tested against a real database.
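
As an illustration only (this exact test is not part of the repository's test suite), here is a minimal sketch of what
a Testcontainers-based test can look like. The JUnit 5 annotations, the `cassandra:4.1` image tag, the CQL port 9042,
the default `datacenter1` data center name and the `localdatacenter` URL parameter are assumptions to adapt to the
actual test setup:
```java
// Illustrative sketch only: a Testcontainers-based connection smoke test.
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.CassandraContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import static org.junit.jupiter.api.Assertions.assertTrue;

@Testcontainers
class ConnectionSmokeTest {

    // Starts a disposable Cassandra instance shared by the tests of this class.
    @Container
    private static final CassandraContainer<?> cassandra = new CassandraContainer<>("cassandra:4.1");

    @Test
    void shouldConnectAndQuerySystemKeyspaces() throws Exception {
        // Build a JDBC URL pointing to the containerized Cassandra instance.
        final String url = String.format("jdbc:cassandra://%s:%d/system_schema?localdatacenter=datacenter1",
            cassandra.getHost(), cassandra.getMappedPort(9042));
        try (Connection connection = DriverManager.getConnection(url);
             Statement statement = connection.createStatement();
             ResultSet rs = statement.executeQuery("SELECT keyspace_name FROM system_schema.keyspaces")) {
            // At least the system keyspaces should be returned.
            assertTrue(rs.next());
        }
    }
}
```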

### Submit a pull request

@@ -63,12 +69,20 @@ Once your changes and tests are ready for review, submit them:
to verify it or simply run `mvn clean install` and check the logs).

3. Rebase your changes: update your local repository with the most recent code from the original repository, and rebase
your branch on top of the latest `release/next` branch. It is better that your initial changes are squashed into a
single commit. If more changes are required to validate the pull request, we invite you to add them as separate
commits.

4. Finally, push your local changes to your forked repository and submit a pull request into the branch `release/next`
with a title which sums up the changes that you have made (try to not exceed 50 characters), and provide more details
in the body. If necessary, also mention the number of the issue solved by your changes, e.g. "Closes #123".

### About dependencies

If your changes require adding a new dependency or updating an existing one, be sure to check these points first:
* the dependency is the latest stable version of the library compatible with JDK 8
* the dependency does not introduce vulnerabilities
* the version of the dependency is specified in a property `<artifactId>.version` in `pom.xml`.

### License headers

16 changes: 16 additions & 0 deletions README.md
@@ -50,6 +50,11 @@ To compile and run tests, execute the following Maven command:
mvn clean package
```

To build a bundled version of the JDBC wrapper, run the following command:
```bash
mvn clean package -Pbundle
```

#### Some considerations about running tests

If for some reason the tests using DataStax Enterprise server (`*DseContainerTest`) fail in your local environment, you
@@ -90,6 +95,17 @@ JetBrains DataGrip, you can have a look to the following links:
this example uses Astra JDBC driver (based on this project), so refer to the "Usage" section below to adapt driver
class and JDBC URL values.

This JDBC wrapper for Apache Cassandra® can also be used to run
[Liquibase for Cassandra databases](https://github.com/liquibase/liquibase-cassandra) (since Liquibase 4.25.0). To execute
Liquibase scripts on your Cassandra database, specify the following properties in your Liquibase properties file:
```
driver: com.ing.data.cassandra.jdbc.CassandraDriver
url: jdbc:cassandra://<host>:<port>/<keyspaceName>?compliancemode=Liquibase
```
See the "Usage" section below for further details about the allowed parameters in the JDBC URL.
For further details about Liquibase usage, please check the
[official documentation](https://contribute.liquibase.com/extensions-integrations/directory/database-tutorials/cassandra/).
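
For illustration only, here is a minimal sketch of opening a plain JDBC connection with the same driver class and URL
shape as in the Liquibase properties above. The host, port, keyspace, local data center name and the sample query are
placeholder assumptions to adapt; see the "Usage" section below for the authoritative list of URL parameters.
```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CassandraJdbcConnectionExample {

    public static void main(final String[] args) throws Exception {
        // Same driver class and URL shape as in the Liquibase properties above.
        // Host, port, keyspace and the "localdatacenter" parameter value are placeholders to adapt.
        Class.forName("com.ing.data.cassandra.jdbc.CassandraDriver");
        final String url =
            "jdbc:cassandra://localhost:9042/my_keyspace?compliancemode=Liquibase&localdatacenter=datacenter1";
        try (Connection connection = DriverManager.getConnection(url);
             Statement statement = connection.createStatement();
             ResultSet rs = statement.executeQuery("SELECT release_version FROM system.local")) {
            while (rs.next()) {
                System.out.println("Connected to Cassandra " + rs.getString("release_version"));
            }
        }
    }
}
```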

## Usage

Connect to a Cassandra cluster using the following arguments:
2 changes: 1 addition & 1 deletion pom.xml
@@ -119,7 +119,7 @@
<lombok.version>1.18.30</lombok.version>
<mockito.version>3.12.4</mockito.version>
<slf4j.version>1.7.36</slf4j.version>
<testcontainers.version>1.19.1</testcontainers.version>
<testcontainers.version>1.19.2</testcontainers.version>
<astra-sdk.version>0.6.11</astra-sdk.version>
<!-- Versions for plugins -->
<maven-checkstyle-plugin.version>3.3.0</maven-checkstyle-plugin.version>
12 changes: 8 additions & 4 deletions src/main/java/com/ing/data/cassandra/jdbc/AbstractResultSet.java
@@ -55,8 +55,9 @@ abstract class AbstractResultSet implements Wrapper {
* @param columnIndex The column index (first column is 1).
* @param type The data type to check.
* @return {@code true} if the column CQL data type is the given one, {@code false} otherwise.
* @throws SQLException when the CQL type cannot be determined for the given column.
*/
boolean isCqlType(final int columnIndex, @Nonnull final DataTypeEnum type) {
boolean isCqlType(final int columnIndex, @Nonnull final DataTypeEnum type) throws SQLException {
final String columnType = StringUtils.substringBefore(DataTypeEnum.cqlName(getCqlDataType(columnIndex)), "<");
return type.cqlType.equalsIgnoreCase(columnType);
}
@@ -67,8 +68,9 @@ boolean isCqlType(final int columnIndex, @Nonnull final DataTypeEnum type) {
* @param columnLabel The column name.
* @param type The data type to check.
* @return {@code true} if the column CQL data type is the given one, {@code false} otherwise.
* @throws SQLException when the CQL type cannot be determined for the given column.
*/
boolean isCqlType(final String columnLabel, @Nonnull final DataTypeEnum type) {
boolean isCqlType(final String columnLabel, @Nonnull final DataTypeEnum type) throws SQLException {
final String columnType = StringUtils.substringBefore(DataTypeEnum.cqlName(getCqlDataType(columnLabel)), "<");
return type.cqlType.equalsIgnoreCase(columnType);
}
@@ -78,16 +80,18 @@ boolean isCqlType(final String columnLabel, @Nonnull final DataTypeEnum type) {
*
* @param columnIndex The column index (first column is 1).
* @return The CQL data type of the column.
* @throws SQLException when the CQL type cannot be determined for the given column.
*/
abstract DataType getCqlDataType(int columnIndex);
abstract DataType getCqlDataType(int columnIndex) throws SQLException;

/**
* Gets the CQL type of the column with the given name.
*
* @param columnLabel The column name.
* @return The CQL data type of the column.
* @throws SQLException when the CQL type cannot be determined for the given column.
*/
abstract DataType getCqlDataType(String columnLabel);
abstract DataType getCqlDataType(String columnLabel) throws SQLException;

public boolean absolute(final int row) throws SQLException {
throw new SQLFeatureNotSupportedException(NOT_SUPPORTED);
@@ -196,13 +196,25 @@ private void populateColumns() {
}

@Override
DataType getCqlDataType(final int columnIndex) {
return this.currentRow.getColumnDefinitions().getType(columnIndex - 1);
DataType getCqlDataType(final int columnIndex) throws SQLException {
if (this.currentRow != null && this.currentRow.getColumnDefinitions() != null) {
return this.currentRow.getColumnDefinitions().getType(columnIndex - 1);
}
if (this.driverResultSet != null && this.driverResultSet.getColumnDefinitions() != null) {
return this.driverResultSet.getColumnDefinitions().getType(columnIndex - 1);
}
throw new SQLException(UNABLE_TO_RETRIEVE_METADATA);
}

@Override
DataType getCqlDataType(final String columnLabel) {
return this.currentRow.getColumnDefinitions().getType(columnLabel);
DataType getCqlDataType(final String columnLabel) throws SQLException {
if (this.currentRow != null && this.currentRow.getColumnDefinitions() != null) {
return this.currentRow.getColumnDefinitions().getType(columnLabel);
}
if (this.driverResultSet != null && this.driverResultSet.getColumnDefinitions() != null) {
return this.driverResultSet.getColumnDefinitions().getType(columnLabel);
}
throw new SQLException(UNABLE_TO_RETRIEVE_METADATA);
}

@Override
@@ -1071,7 +1083,7 @@ public String getColumnLabel(final int column) throws SQLException {

@Override
public String getColumnName(final int column) throws SQLException {
if (currentRow != null) {
if (currentRow != null && currentRow.getColumnDefinitions() != null) {
return currentRow.getColumnDefinitions().getName(column - 1);
}
if (driverResultSet != null && driverResultSet.getColumnDefinitions() != null) {
@@ -1085,7 +1097,7 @@ public int getColumnDisplaySize(final int column) {
try {
final AbstractJdbcType<?> jdbcEquivalentType;
final ColumnDefinitions.Definition columnDefinition;
if (currentRow != null) {
if (currentRow != null && currentRow.getColumnDefinitions() != null) {
columnDefinition = currentRow.getColumnDefinitions().asList().get(column - 1);
} else if (driverResultSet != null && driverResultSet.getColumnDefinitions() != null) {
columnDefinition = driverResultSet.getColumnDefinitions().asList().get(column - 1);
@@ -1105,7 +1117,7 @@ }
}

@Override
public int getColumnType(final int column) {
public int getColumnType(final int column) throws SQLException {
final DataType type;
if (currentRow != null) {
type = getCqlDataType(column);
@@ -1118,7 +1130,7 @@ public int getColumnType(final int column) {
}

@Override
public String getColumnTypeName(final int column) {
public String getColumnTypeName(final int column) throws SQLException {
// Specification says "database specific type name"; for Cassandra this means the AbstractType.
final DataType type;
if (currentRow != null) {
@@ -1141,7 +1153,7 @@ public int getScale(final int column) {
try {
final AbstractJdbcType<?> jdbcEquivalentType;
final ColumnDefinitions.Definition columnDefinition;
if (currentRow != null) {
if (currentRow != null && currentRow.getColumnDefinitions() != null) {
columnDefinition = currentRow.getColumnDefinitions().asList().get(column - 1);
} else if (driverResultSet != null && driverResultSet.getColumnDefinitions() != null) {
columnDefinition = driverResultSet.getColumnDefinitions().asList().get(column - 1);
@@ -1171,7 +1183,7 @@ public String getSchemaName(final int column) throws SQLException {
@Override
public String getTableName(final int column) {
final String tableName;
if (currentRow != null) {
if (currentRow != null && currentRow.getColumnDefinitions() != null) {
tableName = currentRow.getColumnDefinitions().getTable(column - 1);
} else if (driverResultSet != null && driverResultSet.getColumnDefinitions() != null) {
tableName = driverResultSet.getColumnDefinitions().getTable(column - 1);
@@ -1231,9 +1243,16 @@ public boolean isSearchable(final int column) throws SQLException {
return false;
}
final String columnName = getColumnName(column);
final String schemaName = getSchemaName(column);
final String tableName = getTableName(column);
// If the schema or table name is not defined, always returns false since we cannot determine if the column
// is searchable in this context.
if (StringUtils.isEmpty(schemaName) || StringUtils.isEmpty(tableName)) {
return false;
}
final AtomicBoolean searchable = new AtomicBoolean(false);
statement.connection.getSession().getMetadata().getKeyspace(getSchemaName(column))
.flatMap(metadata -> metadata.getTable(getTableName(column)))
statement.connection.getSession().getMetadata().getKeyspace(schemaName)
.flatMap(metadata -> metadata.getTable(tableName))
.ifPresent(tableMetadata -> {
boolean result;
// Check first if the column is a clustering column or in a partitioning key.
@@ -1250,7 +1269,7 @@ }
}

@Override
public boolean isSigned(final int column) {
public boolean isSigned(final int column) throws SQLException {
final DataType type;
if (currentRow != null) {
type = getCqlDataType(column);
22 changes: 18 additions & 4 deletions src/main/java/com/ing/data/cassandra/jdbc/CassandraResultSet.java
@@ -100,6 +100,7 @@
import static com.ing.data.cassandra.jdbc.utils.ErrorConstants.NOT_SUPPORTED;
import static com.ing.data.cassandra.jdbc.utils.ErrorConstants.NO_INTERFACE;
import static com.ing.data.cassandra.jdbc.utils.ErrorConstants.UNABLE_TO_READ_VALUE;
import static com.ing.data.cassandra.jdbc.utils.ErrorConstants.UNABLE_TO_RETRIEVE_METADATA;
import static com.ing.data.cassandra.jdbc.utils.ErrorConstants.UNSUPPORTED_JSON_TYPE_CONVERSION;
import static com.ing.data.cassandra.jdbc.utils.ErrorConstants.UNSUPPORTED_TYPE_CONVERSION;
import static com.ing.data.cassandra.jdbc.utils.ErrorConstants.VALID_LABELS;
@@ -270,12 +271,18 @@ private void populateColumns() {

@Override
DataType getCqlDataType(final int columnIndex) {
return this.currentRow.getColumnDefinitions().get(columnIndex - 1).getType();
if (this.currentRow != null) {
return this.currentRow.getColumnDefinitions().get(columnIndex - 1).getType();
}
return this.driverResultSet.getColumnDefinitions().get(columnIndex - 1).getType();
}

@Override
DataType getCqlDataType(final String columnLabel) {
return this.currentRow.getColumnDefinitions().get(columnLabel).getType();
if (this.currentRow != null) {
return this.currentRow.getColumnDefinitions().get(columnLabel).getType();
}
return this.driverResultSet.getColumnDefinitions().get(columnLabel).getType();
}

@Override
@@ -1780,9 +1787,16 @@ public boolean isSearchable(final int column) throws SQLException {
return false;
}
final String columnName = getColumnName(column);
final String schemaName = getSchemaName(column);
final String tableName = getTableName(column);
// If the schema or table name is not defined (this should not happen here, but better to be careful),
// always returns false since we cannot determine if the column is searchable in this context.
if (StringUtils.isEmpty(schemaName) || StringUtils.isEmpty(tableName)) {
return false;
}
final AtomicBoolean searchable = new AtomicBoolean(false);
statement.connection.getSession().getMetadata().getKeyspace(getSchemaName(column))
.flatMap(metadata -> metadata.getTable(getTableName(column)))
statement.connection.getSession().getMetadata().getKeyspace(schemaName)
.flatMap(metadata -> metadata.getTable(tableName))
.ifPresent(tableMetadata -> {
boolean result;
// Check first if the column is a clustering column or in a partitioning key.
12 changes: 12 additions & 0 deletions src/main/java/com/ing/data/cassandra/jdbc/ColumnDefinitions.java
@@ -16,6 +16,7 @@
package com.ing.data.cassandra.jdbc;

import com.datastax.oss.driver.api.core.type.DataType;
import org.apache.commons.lang3.StringUtils;

import javax.annotation.Nonnull;
import java.util.Arrays;
@@ -310,6 +311,17 @@ public Definition(final String keyspace, final String table, final String name,
this.type = type;
}

/**
* Builds a column definition in an anonymous table (useful for metadata result sets built programmatically).
*
* @param name The column name.
* @param type The column type.
* @return A new column definition instance.
*/
public static Definition buildDefinitionInAnonymousTable(final String name, final DataType type) {
return new Definition(StringUtils.EMPTY, StringUtils.EMPTY, name, type);
}

/**
* Gets the name of the keyspace this column is part of.
*
@@ -15,13 +15,16 @@

package com.ing.data.cassandra.jdbc.metadata;

import com.datastax.oss.driver.api.core.type.DataTypes;
import com.ing.data.cassandra.jdbc.CassandraMetadataResultSet;
import com.ing.data.cassandra.jdbc.CassandraStatement;

import java.sql.DatabaseMetaData;
import java.sql.SQLException;
import java.util.ArrayList;

import static com.ing.data.cassandra.jdbc.ColumnDefinitions.Definition.buildDefinitionInAnonymousTable;

/**
* Utility class building metadata result sets ({@link CassandraMetadataResultSet} objects) related to catalogs.
*/
@@ -52,10 +55,14 @@ public CatalogMetadataResultSetBuilder(final CassandraStatement statement) throw
*/
public CassandraMetadataResultSet buildCatalogs() throws SQLException {
final ArrayList<MetadataRow> catalogs = new ArrayList<>();
final MetadataRow row = new MetadataRow().addEntry(TABLE_CATALOG_SHORTNAME,
this.statement.getConnection().getCatalog());
catalogs.add(row);
return CassandraMetadataResultSet.buildFrom(this.statement, new MetadataResultSet().setRows(catalogs));
final MetadataRow.MetadataRowTemplate rowTemplate = new MetadataRow.MetadataRowTemplate(
buildDefinitionInAnonymousTable(TABLE_CATALOG_SHORTNAME, DataTypes.TEXT)
);

catalogs.add(new MetadataRow().withTemplate(rowTemplate, this.statement.getConnection().getCatalog()));

return CassandraMetadataResultSet.buildFrom(this.statement,
new MetadataResultSet(rowTemplate).setRows(catalogs));
}

}