[Bug] [Connector-V2][FLINK-Hive-Sink] Caused by: java.io.FileNotFoundException while mysql data write to hive #3203
Comments
Hi @EricJoy2048 @TyrantLucifer, can you help me solve this problem?
Could you please offer a more detailed log?
And here is the job manager log:
Are there any other detailed logs I can offer you?
That's enough. BTW, could you please provide some example data from MySQL and the Hive table information? I need them to reproduce the issue locally. Thank you.
Here is the MySQL table:

CREATE TABLE `sg_cti_call_record` (
`ID` varchar(40) NOT NULL,
`CREATED_USER` varchar(40) DEFAULT NULL,
`CREATED_TIME` datetime DEFAULT NULL,
`LAST_UPDATE_USER` varchar(40) DEFAULT NULL,
`LAST_UPDATE_TIME` datetime DEFAULT NULL,
`VERSION_NUMBER` int(11) DEFAULT '0',
`RECORD_STATUS` varchar(40) DEFAULT 'VALID',
`CALL_ID` varchar(40) DEFAULT NULL COMMENT '',
`MAIN_CALLID` varchar(60) DEFAULT NULL COMMENT '',
`CALLED_NUMBER` varchar(15) DEFAULT NULL COMMENT '',
`CALLING_NUMBER` varchar(15) DEFAULT NULL COMMENT '',
`CNO` varchar(40) DEFAULT NULL COMMENT '',
`GROUP_ID` varchar(40) DEFAULT NULL COMMENT '',
`TOTAL_DURATION` int(11) DEFAULT NULL COMMENT '',
`TALK_TIME_LONG` int(11) DEFAULT '0' COMMENT '',
`CONTENT` varchar(255) DEFAULT NULL COMMENT '',
`CALL_TYPE` varchar(20) DEFAULT NULL COMMENT '',
`STATUS` varchar(40) DEFAULT NULL,
`START_TIME` datetime DEFAULT NULL COMMENT '',
`ANSWER_TIME` datetime DEFAULT NULL COMMENT '',
`END_TIME` datetime DEFAULT NULL COMMENT '',
`IVR_KEY` varchar(30) DEFAULT NULL COMMENT '',
`RING_TIME_LONG` int(11) DEFAULT '0' COMMENT '',
`SATISFACTION` varchar(255) CHARACTER SET utf8mb4 DEFAULT NULL COMMENT '',
`WORD_PROCESSING_LENGTH` int(11) DEFAULT NULL,
`HANG_UP_REASON` varchar(255) DEFAULT NULL,
`RECORD_URL` varchar(255) DEFAULT NULL,
`RECORD_FILE_TYPE` varchar(20) DEFAULT NULL COMMENT '',
`LOCAL_RECORD_FILE` varchar(255) DEFAULT NULL COMMENT '',
`QNO` varchar(10) DEFAULT NULL COMMENT '',
`QUEUE_NAME` varchar(30) DEFAULT NULL COMMENT '',
`END_REASON` varchar(10) DEFAULT NULL,
`SUBMIT_TIME` datetime DEFAULT NULL,
PRIMARY KEY (`ID`),
KEY `CALL_ID` (`CALL_ID`) USING BTREE,
KEY `CREATED_TIME` (`CREATED_TIME`),
KEY `START_TIME` (`START_TIME`),
KEY `END_TIME` (`END_TIME`),
KEY `ANSWER_TIME` (`ANSWER_TIME`),
KEY `MAIN_CALLID` (`MAIN_CALLID`),
KEY `QNO` (`QNO`),
KEY `RECORD_URL` (`RECORD_URL`(18)),
KEY `idx_sg_cti_call_record_y01` (`RECORD_FILE_TYPE`,`LOCAL_RECORD_FILE`),
KEY `idx_CNO_START_TIME` (`CNO`,`START_TIME`) USING BTREE,
KEY `idx_lastupdate` (`LAST_UPDATE_TIME`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 ROW_FORMAT=COMPACT;

MySQL example data:

INSERT INTO `sg_cti_call_record` (`ID`) VALUES ('000000db55d8432b90f318c8052dfe96');
INSERT INTO `sg_cti_call_record` (`ID`) VALUES ('0000035ccd044d0f93d67cdd7da4ae95');
INSERT INTO `sg_cti_call_record` (`ID`) VALUES ('000003d5e9914a9c9203430632bd7b8e');
INSERT INTO `sg_cti_call_record` (`ID`) VALUES ('0000046480924703bb754f49a6fe461f');
INSERT INTO `sg_cti_call_record` (`ID`) VALUES ('00000504d1e04b2197a6b4d707945f5e');
INSERT INTO `sg_cti_call_record` (`ID`) VALUES ('0000059d7bb447d78810e3d5725da0c7');
INSERT INTO `sg_cti_call_record` (`ID`) VALUES ('000005ada16f4eb4889d90d2b23f4991');
INSERT INTO `sg_cti_call_record` (`ID`) VALUES ('0000068dba0b432bb1e525c6eba5ff02');
INSERT INTO `sg_cti_call_record` (`ID`) VALUES ('00000894fdd346d2be5ee1e1713b40de');
INSERT INTO `sg_cti_call_record` (`ID`) VALUES ('00000b245fea4e9983c9fbf736d9d952');

And here is the Hive table:
And here is the SeaTunnel config file:
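For readers hitting a similar error: the reporter's actual config was attached as an image and is not recoverable here, but a minimal MySQL-to-Hive job for the Connector-V2 engine generally looks like the sketch below. All hosts, credentials, and the exact option names are placeholders/assumptions; check the SeaTunnel 2.2.0-beta connector documentation for the precise parameter names in that release.

```hocon
env {
  # Run as a one-shot batch job on the Flink engine
  execution.parallelism = 1
}

source {
  Jdbc {
    # Placeholder connection details; replace with your own
    url = "jdbc:mysql://mysql-host:3306/your_db"
    driver = "com.mysql.cj.jdbc.Driver"
    user = "your_user"
    password = "your_password"
    query = "SELECT ID, CREATED_TIME, CALL_ID FROM sg_cti_call_record"
  }
}

sink {
  Hive {
    # Hive metastore thrift address and target table (placeholders)
    metastore_uri = "thrift://hive-metastore-host:9083"
    table_name = "default.sg_cti_call_record"
  }
}
```

A `java.io.FileNotFoundException` from a file-based Hive sink is often a path/permission mismatch between the sink's temporary staging directory and the final HDFS table location, so comparing those paths in the job manager log against the actual HDFS layout is a reasonable first check.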
Hi @TyrantLucifer, any idea about this bug?
Sorry, I'm busy these days. I'll let you know tonight.
Could you please add my WeChat:
Search before asking
What happened
I used SeaTunnel to test writing MySQL data to Hive. It threw an exception, causing the data not to be written to Hive.
SeaTunnel Version
seatunnel-version: 2.2.0-beta
flink-version:1.13.3
hive-version:3.0.0
mysql-version:5.7
SeaTunnel Config
Running Command
Error Exception
Flink or Spark Version
flink-version:1.13.3
Java or Scala Version
No response
Screenshots
No response
Are you willing to submit PR?
Code of Conduct