shezaixing / dsk-dsc-flink · Commits · 965215e7

Commit 965215e7, authored Dec 03, 2024 by liaowenwu
Parent: 5abd8d28

Commit message: 修改bug (fix bug)

Showing 2 changed files, with 5 additions and 5 deletions:

  src/main/java/com/dsk/flink/dsc/common/function/AsyncMysqlDataTransferFunctionNew.java (+4 / -4)
  src/main/java/com/dsk/flink/dsc/sync/SyncCustomerDataSource.java (+1 / -1)
src/main/java/com/dsk/flink/dsc/common/function/AsyncMysqlDataTransferFunctionNew.java

@@ -98,9 +98,9 @@ public class AsyncMysqlDataTransferFunctionNew extends RichAsyncFunction<JSONObj
             excueteSql = tranferInsertSql(table, dataObj, mysqlType);
         }
         if ("UPDATE".equals(type)) {
-            JSONObject oldDataObj = oldDataList.getJSONObject(0);
-            excueteSql = tranferUpdateSql(table, dataObj, oldDataObj, mysqlType, pkNameSet);
-            //excueteSql = tranferInsertSql(table,dataObj,mysqlType);
+            //JSONObject oldDataObj = oldDataList.getJSONObject(0);
+            //excueteSql = tranferUpdateSql(table,dataObj,oldDataObj,mysqlType,pkNameSet);
+            excueteSql = tranferInsertSql(table, dataObj, mysqlType);
         }
         if ("DELETE".equals(type)) {

@@ -158,7 +158,7 @@ public class AsyncMysqlDataTransferFunctionNew extends RichAsyncFunction<JSONObj
         String valueString = String.join(",", valueList);
         //return String.format("INSERT INTO %s (%s) values (%s) ON DUPLICATE KEY UPDATE %s;",table,columnString,valueString,updateString);
-        return String.format("INSERT INTO %s (%s) values (%s);", table, columnString, valueString);
+        return String.format("REPLACE INTO %s (%s) values (%s);", table, columnString, valueString);
     }

     private String tranferUpdateSql(String table, JSONObject dataObj, JSONObject oldDataObj, JSONObject mysqlType, Set<String> pkNameSet) {
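Taken together, the two hunks change the write strategy: UPDATE events no longer build a dedicated UPDATE statement but go through the insert path, and the generated statement now uses REPLACE INTO, which deletes any existing row with the same primary or unique key before inserting. A minimal sketch of that idea follows; the helper below is hypothetical and only mirrors the shape of the project's tranferInsertSql, it is not the actual implementation.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ReplaceSqlSketch {
    // Hypothetical stand-in for tranferInsertSql: builds a REPLACE INTO
    // statement from a column -> value map. Because REPLACE INTO removes
    // any row with the same key before inserting, one statement shape
    // covers both INSERT and UPDATE change events.
    static String buildReplaceSql(String table, Map<String, String> row) {
        List<String> cols = new ArrayList<>();
        List<String> vals = new ArrayList<>();
        for (Map.Entry<String, String> e : row.entrySet()) {
            cols.add(e.getKey());
            // Naive quoting for illustration only; real code must escape properly.
            vals.add("'" + e.getValue().replace("'", "''") + "'");
        }
        return String.format("REPLACE INTO %s (%s) values (%s);",
                table, String.join(",", cols), String.join(",", vals));
    }

    public static void main(String[] args) {
        Map<String, String> row = new LinkedHashMap<>();
        row.put("id", "1");
        row.put("name", "alice");
        System.out.println(buildReplaceSql("t_user", row));
        // prints: REPLACE INTO t_user (id,name) values ('1','alice');
    }
}
```

One trade-off worth noting: unlike INSERT ... ON DUPLICATE KEY UPDATE (the approach left commented out in the diff), REPLACE INTO is a delete-plus-insert, so columns absent from the event are reset to their defaults rather than preserved.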
src/main/java/com/dsk/flink/dsc/sync/SyncCustomerDataSource.java

@@ -69,7 +69,7 @@ public class SyncCustomerDataSource {
         System.out.println("获取到的kafka消费组:->" + EtlUtils.getKafkaGroup(envProps));
         FlinkKafkaConsumer<String> kafkaConsumer = new FlinkKafkaConsumer<String>(envProps.getKafka_topic(), new SimpleStringSchema(), EtlUtils.getKafkaConfig(envProps.getKafka_brokers(), EtlUtils.getKafkaGroup(envProps), envProps.getKafka_username(), envProps.getKafka_password()));
         //System.out.println(envProps.getKafka_topic());
-        long defaultOffset = LocalDateTime.now().minusMinutes(30).atZone(ZoneId.systemDefault()).toInstant().toEpochMilli();
+        long defaultOffset = LocalDateTime.now().minusMinutes(10).atZone(ZoneId.systemDefault()).toInstant().toEpochMilli();
         kafkaConsumer.setStartFromTimestamp(defaultOffset);
         //kafkaConsumer.setStartFromLatest();
         //偏移量
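The second file's change only shrinks the consumer's start position from 30 minutes ago to 10 minutes ago, so on a restart the job replays less Kafka history. The epoch-millisecond computation passed to setStartFromTimestamp can be sketched with plain java.time, no Flink dependency needed; the class and method names below are illustrative, not from the project.

```java
import java.time.LocalDateTime;
import java.time.ZoneId;

public class StartOffsetSketch {
    // Mirrors the defaultOffset expression in the diff: epoch milliseconds
    // for "N minutes ago" in the system default time zone. Flink's
    // FlinkKafkaConsumer.setStartFromTimestamp(long) expects a value of
    // exactly this kind.
    static long minutesAgoMillis(long minutes) {
        return LocalDateTime.now()
                .minusMinutes(minutes)
                .atZone(ZoneId.systemDefault())
                .toInstant()
                .toEpochMilli();
    }

    public static void main(String[] args) {
        // The new 10-minute offset is a later (larger) timestamp than the
        // old 30-minute one, hence less replayed history on startup.
        System.out.println(minutesAgoMillis(10) > minutesAgoMillis(30));
        // prints: true
    }
}
```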