
[Feature][Connector-V2][JDBC] support sqlite Source & Sink #3089

Merged
merged 32 commits into from
Nov 24, 2022

Conversation

@nutsjian (Contributor) commented Oct 13, 2022:

Purpose of this pull request

[JDBC] SqLite Connector

Due to sqlite3's type affinity and dynamic data types, SqliteTypeMapper.mapping cannot rely on columnTypeName alone to determine the real SeaTunnel data type.
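For context on why the declared column type is not enough: SQLite columns carry a type *affinity* rather than a strict type, so a single column can hold values of different storage classes. A minimal illustration (Python's stdlib sqlite3 used here for brevity; the connector itself is Java):

```python
import sqlite3

# A column declared INT has INTEGER affinity, but SQLite will still store
# text in it when the value cannot be converted, so the declared column
# type alone does not determine the runtime type of each value.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (v INT)")
conn.execute("INSERT INTO t VALUES (1)")
conn.execute("INSERT INTO t VALUES ('abc')")  # stored as TEXT despite INT affinity
rows = conn.execute("SELECT v, typeof(v) FROM t").fetchall()
print(rows)
```

This is why a mapper keyed only on the declared type name can misclassify values row by row.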

Check list

@liugddx (Member) left a comment:

Please migrate the test case to seatunnel-connector-v2-e2e module.

Comment on lines 69 to 76
String createSource = "CREATE TABLE source (\n" +
"age INT NOT NULL,\n" +
"name VARCHAR(255) NOT NULL\n" +
")";
String createSink = "CREATE TABLE sink (\n" +
"age INT NOT NULL,\n" +
"name VARCHAR(255) NOT NULL\n" +
")";
Member:

Please test all field types
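SQLite resolves declared types into five affinity classes (INTEGER, TEXT, BLOB, REAL, NUMERIC). A hypothetical wider test table covering one column per affinity class, sketched in Python for brevity:

```python
import sqlite3

# Hypothetical test table with one column per SQLite affinity class,
# instead of only age/name as in the snippet above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE all_types ("
             "c_int INTEGER, c_text TEXT, c_blob BLOB, c_real REAL, c_num NUMERIC)")
conn.execute("INSERT INTO all_types VALUES (?, ?, ?, ?, ?)",
             (42, "hello", b"\x00\x01", 3.5, 7))
row = conn.execute("SELECT * FROM all_types").fetchone()
type_names = [type(v).__name__ for v in row]
print(type_names)
```

Note that the NUMERIC column stores the integer 7 as an integer, again showing that storage class follows the value, not just the declaration.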

Contributor Author:

Roger that. I'm fixing it.

@hailin0 (Member) left a comment:

user = ""
password = ""
type_affinity = true
query = "select age, name from source"

Contributor Author:

Thanks man. I'm fixing it. SQLite uses a more general dynamic type system with type affinity (https://www.sqlite.org/datatype3.html), so I'm considering testing all supported data types, following the JDBC MySQL e2e test. Workable?

Member:

good

@nutsjian nutsjian requested review from hailin0 and liugddx and removed request for hailin0 and liugddx October 14, 2022 08:09
@liugddx (Member) left a comment:

Please test all field types

}

@Test
public void testJdbcMysqlSourceAndSinkDataType() throws Exception {
Member:

Please check the method name.

}

@AfterEach
public void closeGreenplumContainer() throws SQLException, IOException {
Member:

Please check the method name.

}

@Test
public void testJdbcMysqlSourceAndSinkDataType() throws Exception {
Member:

Please check the method name.

}

@AfterEach
public void closeGreenplumContainer() throws SQLException, IOException {
Member:

Please check the method name.

@nutsjian (Contributor Author) commented Oct 14, 2022:

thx, solved.

@nutsjian nutsjian requested a review from liugddx October 14, 2022 09:47
@EricJoy2048 EricJoy2048 added the First-time contributor First-time contributor label Oct 15, 2022
@nutsjian (Contributor Author):

@EricJoy2048 thx, man. I fixed the e2e test. Please approve running workflows.

@nutsjian nutsjian requested review from hailin0 and removed request for liugddx October 16, 2022 01:17

private Connection jdbcConnection;

@Test
Member:

Remove @Test

Contributor Author:

thx, I'll fix it.

@Hisoka-X previously approved these changes Nov 11, 2022
@Hisoka-X (Member) left a comment:

LGTM

@nutsjian (Contributor Author):

@EricJoy2048 please help me rerun the CI workflows to see if the unit test error has been resolved. thx.

2022-11-11T02:48:26.4091671Z [ERROR] Failed to execute goal on project connector-jdbc-spark-e2e: Could not resolve dependencies for project org.apache.seatunnel:connector-jdbc-spark-e2e:jar:2.1.3-SNAPSHOT: Failed to collect dependencies at org.testcontainers:postgresql:jar:1.17.3: Failed to read artifact descriptor for org.testcontainers:postgresql:jar:1.17.3: Could not transfer artifact org.testcontainers:postgresql:pom:1.17.3 from/to central (https://repo.maven.apache.org/maven2): transfer failed for https://repo.maven.apache.org/maven2/org/testcontainers/postgresql/1.17.3/postgresql-1.17.3.pom: Connection reset -> [Help 1]

@Hisoka-X (Member):

Please resolve the conflict, thanks!

@nutsjian (Contributor Author):

@Hisoka-X @EricJoy2048 conflict resolved, please help me rerun the CI workflows, thx.

@liugddx (Member) left a comment:

1. Update the change log in the connector document. For more details you can refer to connector-v2.

2. Update plugin-mapping.properties and add the new connector information to it.

3. Update the pom file of seatunnel-dist.

@@ -115,6 +115,7 @@ there are some reference value for params above.
| phoenix | org.apache.phoenix.queryserver.client.Driver | jdbc:phoenix:thin:url=http://localhost:8765;serialization=PROTOBUF | / | https://mvnrepository.com/artifact/com.aliyun.phoenix/ali-phoenix-shaded-thin-client |
| sqlserver | com.microsoft.sqlserver.jdbc.SQLServerDriver | jdbc:microsoft:sqlserver://localhost:1433 | com.microsoft.sqlserver.jdbc.SQLServerXADataSource | https://mvnrepository.com/artifact/com.microsoft.sqlserver/mssql-jdbc |
| oracle | oracle.jdbc.OracleDriver | jdbc:oracle:thin:@localhost:1521/xepdb1 | oracle.jdbc.xa.OracleXADataSource | https://mvnrepository.com/artifact/com.oracle.database.jdbc/ojdbc8 |
| sqlite | org.sqlite.JDBC | Jdbc:sqlite:test.db | / | https://mvnrepository.com/artifact/org.xerial/sqlite-jdbc |
Member:

Driver right?

Contributor Author:

done

}

private void batchInsertData() throws SQLException {
String sql = "insert into source(age, name) values(?, ?)";
Member:

Can you test all data types?
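The prepared-statement insert above can be exercised in batch form; a minimal sketch in Python's stdlib sqlite3 (sample rows hypothetical), mirroring the same parameterized SQL:

```python
import sqlite3

# Batch-insert sample rows through the same parameterized statement
# as the Java snippet above (sample data is hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source (age INT NOT NULL, name VARCHAR(255) NOT NULL)")
rows = [(20, "a"), (31, "b"), (42, "c")]
conn.executemany("insert into source(age, name) values(?, ?)", rows)
conn.commit()
count = conn.execute("SELECT COUNT(*) FROM source").fetchone()[0]
print(count)
```

executemany plays the role of JDBC's addBatch/executeBatch here: one compiled statement, many parameter tuples.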

}

private void initializeJdbcTable() throws Exception {
URI resource = Objects.requireNonNull(JdbcMysqlIT.class.getResource("/jdbc/init_sql/sqlite_init.conf")).toURI();
Member:

JdbcMysqlIT?

}

private void checkSinkDataTypeTable() throws Exception {
URI resource = Objects.requireNonNull(JdbcMysqlIT.class.getResource("/jdbc/init_sql/sqlite_init.conf")).toURI();
Member:

JdbcMysqlIT?

Contributor Author:

I had already fixed and committed this before. Is there any problem with my git operation steps? Requesting comments.

}

private void initializeJdbcTable() throws Exception {
URI resource = Objects.requireNonNull(JdbcMysqlIT.class.getResource("/jdbc/init_sql/sqlite_init.conf")).toURI();
Member:

JdbcMysqlIT?

@Slf4j
public class SqliteTypeMapper implements JdbcDialectTypeMapper {

private static final Logger LOG = LoggerFactory.getLogger(SqliteTypeMapper.class);
Member:

Useless code

<groupId>org.xerial</groupId>
<artifactId>sqlite-jdbc</artifactId>
<version>${sqlite.version}</version>
<scope>provided</scope>
Member:

duplicated code.

Contributor Author:

done

@@ -135,6 +137,14 @@ private SeaTunnelRowType initTableField(Connection conn) {
}
} catch (Exception e) {
LOG.warn("get row type info exception", e);
} finally {
if (Objects.nonNull(resultSet)) {
try {
Member:

What is the role of this code?
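The finally block in the diff above appears to close the ResultSet so the cursor is released even when type inspection throws. The same pattern, sketched in Python's stdlib sqlite3 for brevity:

```python
import sqlite3

# Release the cursor in finally, whether or not the query work raised,
# mirroring the ResultSet cleanup in the Java diff above.
conn = sqlite3.connect(":memory:")
cur = None
try:
    cur = conn.execute("SELECT 1")
    value = cur.fetchone()[0]
finally:
    if cur is not None:
        cur.close()  # runs on both the success and exception paths
print(value)
```

Without such cleanup, a failure inside the try body would leak the open cursor/ResultSet.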

@@ -36,6 +36,7 @@
<module>connector-neo4j-spark-e2e</module>
<module>connector-kafka-spark-e2e</module>
<module>connector-iceberg-spark-e2e</module>
<module>connector-elasticsearch-spark-e2e</module>
Member:

Does this code belong to this PR?

@@ -36,6 +36,7 @@
<module>connector-iceberg-flink-e2e</module>
<module>connector-neo4j-flink-e2e</module>
<module>connector-kafka-flink-e2e</module>
<module>connector-elasticsearch-flink-e2e</module>
Member:

Does this code belong to this PR?

Contributor Author:

(quoting the review) "Please add e2e in the flink/spark e2e modules so that e2e can be started in CI."

thx man, done. I have merged upstream dev and added the flink/spark connector-elasticsearch-e2e modules.

This code is for that comment: I merged upstream dev and added elasticsearch-flink-e2e in pom.xml.

@Hisoka-X (Member):

Please fix the CI problem. Thanks!

@Hisoka-X (Member) left a comment:

LGTM

5 participants