zhaowei / dlink / Commits / e2e7c5a3

Commit e2e7c5a3 authored Aug 21, 2021 by wenmo
Flink execution graph
parent 3df7d3bb

Showing 21 changed files with 297 additions and 45 deletions (+297 -45)
README.md  +11 -13
.../src/main/java/com/dlink/controller/StudioController.java  +8 -0
...-admin/src/main/java/com/dlink/service/StudioService.java  +3 -0
...c/main/java/com/dlink/service/impl/StudioServiceImpl.java  +11 -0
...com/dlink/executor/custom/CustomTableEnvironmentImpl.java  +3 -3
...com/dlink/executor/custom/CustomTableEnvironmentImpl.java  +4 -3
...com/dlink/executor/custom/CustomTableEnvironmentImpl.java  +3 -3
dlink-common/pom.xml  +8 -0
dlink-core/src/main/java/com/dlink/executor/Executor.java  +1 -2
dlink-core/src/main/java/com/dlink/explainer/Explainer.java  +21 -3
...rc/main/java/com/dlink/explainer/trans/AbstractTrans.java  +1 -1
...k-core/src/main/java/com/dlink/explainer/trans/Trans.java  +1 -1
...c/main/java/com/dlink/explainer/trans/TransGenerator.java  +2 -3
dlink-core/src/main/java/com/dlink/job/JobManager.java  +6 -0
dlink-core/src/main/java/com/dlink/plus/FlinkSqlPlus.java  +1 -1
dlink-web/package.json  +1 -1
...eb/src/components/Studio/StudioConsole/StudioCA/index.tsx  +4 -4
...b/src/components/Studio/StudioMenu/StudioGraph/index.less  +0 -0
...eb/src/components/Studio/StudioMenu/StudioGraph/index.tsx  +117 -0
dlink-web/src/components/Studio/StudioMenu/index.tsx  +82 -7
dlink-web/src/pages/FlinkSqlStudio/service.ts  +9 -0
README.md

@@ -171,6 +171,8 @@ dlink -- parent project
 | node.js | 14.17.0 |
 | jdk | 1.8.0_201 |
 | maven | 3.6.0 |
+| lombok | 1.18.16 |
+| mysql | 5.7+ |
 ```shell
 mvn clean install -Dmaven.test.skip=true

@@ -191,7 +193,7 @@ mvn clean install -Dmaven.test.skip=true
 The Flink version is determined by dlink-client-1.12.jar under lib.
 The current version defaults to the Flink 1.12.4 API.
-Submitting jobs to clusters of other versions may be problematic; 1.13, 1.11 and 1.10 will be implemented in the future.
+Submitting jobs to clusters of other versions may be problematic; 1.11, 1.12 and 1.13 are already implemented. To switch versions, just replace the corresponding dependency under lib and restart.
 ## User Manual

@@ -203,11 +205,9 @@ The Flink version is determined by dlink-client-1.12.jar under lib.
 #### Cluster Center
-Register Flink cluster addresses in the host:port format, separated by commas.
+When registering a Flink cluster address, use the host:port format, separated by commas; that is, add the REST API address of the Flink cluster's JobManager. In HA mode, separate the addresses with commas, e.g. 192.168.123.101:8081,192.168.123.102:8081,192.168.123.103:8081.
-Adding and modifying take a while because the latest JobManager address needs to be detected.
+Adding and modifying take a while because the latest JM address needs to be recalculated.
-Heartbeat checks are triggered manually and update the cluster status and the JobManager address.
+Heartbeat checks are triggered manually and update the cluster status and the JM address.
 #### Studio

@@ -215,7 +215,6 @@ The Flink version is determined by dlink-client-1.12.jar under lib.
 2. Write FlinkSQL in the central editing area.
 3. Configure execution parameters on the right.
 4. With Fragment enabled, the enhanced SQL fragment syntax can be used:
 ```sql
 sf:=select * from;
 tb:=student;
 ${sf} ${tb}

@@ -223,7 +222,6 @@ ${sf} ${tb}
 select * from student
 ```
 5. Built-in enhanced SQL syntax for table-valued aggregation:
 ```sql
 CREATE AGGTABLE aggdemo AS
 SELECT myField,value,rank

@@ -233,11 +231,11 @@ AGG BY TOP2(value) as (value,rank);
 ```
 6. MaxRowNum is the maximum number of result rows previewed when executing a Select in batch or streaming mode; default 100, maximum 9999.
 7. SavePointPath: the current version uses non-Jar submission, so this is not yet available.
-8. Flink shared sessions share the Catalogue.
+8. Flink shared sessions share the Catalog.
-9. Connectors show the table information in the Catalogue; the clear button destroys the current session.
+9. Connectors show the table information in the Catalog; the clear button destroys the current session.
 10. In Local mode, use a small amount of test data; use a remote cluster for real data.
 11. When executing SQL, if part of the SQL is selected, only the selection is executed; otherwise everything is executed.
-12. The little rocket's submit function asynchronously submits the FlinkSQL and configuration saved for the current task to the cluster. Drafts cannot be submitted.
+12. The little rocket's submit function asynchronously submits the FlinkSQL and configuration already saved for the current task to the cluster. Drafts cannot be submitted.
 13. The very long string in the execution info or history is the JobId on the cluster.
 14. Drafts cannot be submitted remotely and asynchronously; they can only be executed synchronously.
 15. Gray buttons indicate features to be implemented soon.

@@ -256,7 +254,7 @@ AGG BY TOP2(value) as (value,rank);
 [Mybatis Plus](https://github.com/baomidou/mybatis-plus)
-[ant-design-pro](https://github.com/aiwenmo/ant-design-pro)
+[ant-design-pro](https://github.com/ant-design/ant-design-pro)
 [Monaco Editor](https://github.com/Microsoft/monaco-editor)
dlink-admin/src/main/java/com/dlink/controller/StudioController.java

@@ -47,6 +47,14 @@ public class StudioController {
         return Result.succeed(studioService.explainSql(studioExecuteDTO), "解释成功");
     }
 
+    /**
+     * 解释Sql
+     */
+    @PostMapping("/getStreamGraph")
+    public Result getStreamGraph(@RequestBody StudioExecuteDTO studioExecuteDTO) {
+        return Result.succeed(studioService.getStreamGraph(studioExecuteDTO), "获取执行图成功");
+    }
+
     /**
      * 进行DDL操作
      */
dlink-admin/src/main/java/com/dlink/service/StudioService.java

@@ -11,6 +11,7 @@ import com.dlink.result.SelectResult;
 import com.dlink.result.SqlExplainResult;
 import com.dlink.session.SessionInfo;
 import com.fasterxml.jackson.databind.JsonNode;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 
 import java.util.List;

@@ -28,6 +29,8 @@ public interface StudioService {
     List<SqlExplainResult> explainSql(StudioExecuteDTO studioExecuteDTO);
 
+    ObjectNode getStreamGraph(StudioExecuteDTO studioExecuteDTO);
+
     SelectResult getJobData(String jobId);
 
     SessionInfo createSession(SessionDTO sessionDTO, String createUser);
dlink-admin/src/main/java/com/dlink/service/impl/StudioServiceImpl.java

@@ -23,6 +23,7 @@ import com.dlink.session.SessionInfo;
 import com.dlink.session.SessionPool;
 import com.dlink.trans.Operations;
 import com.fasterxml.jackson.databind.JsonNode;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.stereotype.Service;

@@ -71,6 +72,16 @@ public class StudioServiceImpl implements StudioService {
         return jobManager.explainSql(studioExecuteDTO.getStatement());
     }
 
+    @Override
+    public ObjectNode getStreamGraph(StudioExecuteDTO studioExecuteDTO) {
+        JobConfig config = studioExecuteDTO.getJobConfig();
+        if (!config.isUseSession()) {
+            config.setAddress(clusterService.buildEnvironmentAddress(config.isUseRemote(), studioExecuteDTO.getClusterId()));
+        }
+        JobManager jobManager = JobManager.build(config);
+        return jobManager.getStreamGraph(studioExecuteDTO.getStatement());
+    }
+
     @Override
     public SelectResult getJobData(String jobId) {
         return JobManager.getJobData(jobId);
dlink-client/dlink-client-1.11/src/main/java/com/dlink/executor/custom/CustomTableEnvironmentImpl.java

@@ -3,9 +3,9 @@ package com.dlink.executor.custom;
 import com.dlink.result.SqlExplainResult;
 import org.apache.flink.api.common.typeinfo.TypeInformation;
 import org.apache.flink.api.dag.Transformation;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.core.JsonProcessingException;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.ObjectMapper;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
 import org.apache.flink.streaming.api.graph.JSONGenerator;
 import org.apache.flink.streaming.api.graph.StreamGraph;
dlink-client/dlink-client-1.12/src/main/java/com/dlink/executor/custom/CustomTableEnvironmentImpl.java

 package com.dlink.executor.custom;
 
 import com.dlink.result.SqlExplainResult;
+import com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 import org.apache.flink.api.common.typeinfo.TypeInformation;
 import org.apache.flink.api.dag.Transformation;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonProcessingException;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
 import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
 import org.apache.flink.streaming.api.graph.JSONGenerator;
 import org.apache.flink.streaming.api.graph.StreamGraph;
dlink-client/dlink-client-1.13/src/main/java/com/dlink/executor/custom/CustomTableEnvironmentImpl.java

@@ -3,9 +3,9 @@ package com.dlink.executor.custom;
 import com.dlink.result.SqlExplainResult;
 import org.apache.flink.api.common.typeinfo.TypeInformation;
 import org.apache.flink.api.dag.Transformation;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonProcessingException;
+import com.fasterxml.jackson.core.JsonProcessingException;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.ObjectMapper;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
 import org.apache.flink.streaming.api.graph.JSONGenerator;
 import org.apache.flink.streaming.api.graph.StreamGraph;
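All three client modules keep the JSONGenerator and StreamGraph imports, which is presumably how CustomTableEnvironmentImpl turns the pending job into the JSON plan returned up the getStreamGraph chain (the method body itself is collapsed in this diff). A minimal sketch of that step, assuming a StreamExecutionEnvironment to which the statements have already been applied; the wrapper class and helper name are hypothetical, not the project's actual method:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.graph.JSONGenerator;
import org.apache.flink.streaming.api.graph.StreamGraph;

public class StreamGraphJsonSketch {

    // Hypothetical helper: render the environment's current StreamGraph as the
    // ObjectNode that the REST layer can hand back to the web UI.
    public static ObjectNode toJsonGraph(StreamExecutionEnvironment env) throws Exception {
        StreamGraph streamGraph = env.getStreamGraph();           // graph of the pending transformations
        String json = new JSONGenerator(streamGraph).getJSON();   // Flink's built-in JSON plan generator
        return (ObjectNode) new ObjectMapper().readTree(json);    // unshaded Jackson, matching the new imports
    }
}
```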
dlink-common/pom.xml

@@ -15,6 +15,14 @@
         <groupId>org.projectlombok</groupId>
         <artifactId>lombok</artifactId>
     </dependency>
+    <dependency>
+        <groupId>com.fasterxml.jackson.core</groupId>
+        <artifactId>jackson-annotations</artifactId>
+    </dependency>
+    <dependency>
+        <groupId>com.fasterxml.jackson.core</groupId>
+        <artifactId>jackson-databind</artifactId>
+    </dependency>
 </dependencies>
 </project>
\ No newline at end of file
dlink-core/src/main/java/com/dlink/executor/Executor.java

@@ -2,13 +2,12 @@ package com.dlink.executor;
 import com.dlink.executor.custom.CustomTableEnvironmentImpl;
 import com.dlink.result.SqlExplainResult;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 import org.apache.flink.api.common.JobExecutionResult;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
 import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
 import org.apache.flink.table.api.ExplainDetail;
 import org.apache.flink.table.api.Table;
 import org.apache.flink.table.api.TableResult;
-import org.apache.flink.table.catalog.Catalog;
 import org.apache.flink.table.catalog.CatalogManager;
 import org.apache.flink.table.functions.ScalarFunction;
 import org.apache.flink.table.functions.UserDefinedFunction;
dlink-core/src/main/java/com/dlink/explainer/Explainer.java

@@ -13,7 +13,8 @@ import com.dlink.explainer.trans.TransGenerator;
 import com.dlink.interceptor.FlinkInterceptor;
 import com.dlink.result.SqlExplainResult;
 import com.dlink.utils.SqlUtil;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 import org.apache.flink.table.api.ExplainDetail;
 import org.apache.flink.table.catalog.CatalogManager;
 import org.apache.flink.table.catalog.ObjectIdentifier;

@@ -33,6 +34,7 @@ import java.util.Optional;
 public class Explainer {
 
     private Executor executor;
+    private ObjectMapper mapper = new ObjectMapper();
 
     public Explainer(Executor executor) {
         this.executor = executor;

@@ -67,6 +69,22 @@ public class Explainer {
         return sqlExplainRecords;
     }
 
+    public ObjectNode getStreamGraph(String statement) {
+        List<SqlExplainResult> sqlExplainRecords = explainSqlResult(statement);
+        List<String> strPlans = new ArrayList<>();
+        for (int i = 0; i < sqlExplainRecords.size(); i++) {
+            if (Asserts.isNotNull(sqlExplainRecords.get(i).getType())
+                    && sqlExplainRecords.get(i).getType().contains(FlinkSQLConstant.DML)) {
+                strPlans.add(sqlExplainRecords.get(i).getSql());
+            }
+        }
+        if (strPlans.size() > 0) {
+            return translateObjectNode(strPlans.get(0));
+        } else {
+            return mapper.createObjectNode();
+        }
+    }
+
     private List<TableCAResult> generateTableCA(String statement, boolean onlyTable) {
         List<SqlExplainResult> sqlExplainRecords = explainSqlResult(statement);
         List<String> strPlans = new ArrayList<>();

@@ -134,8 +152,8 @@ public class Explainer {
         return results;
     }
 
-    private ObjectNode translateObjectNode(String strPlans) {
-        return executor.getStreamGraph(strPlans);
+    private ObjectNode translateObjectNode(String statement) {
+        return executor.getStreamGraph(statement);
     }
 
     private List<Trans> translateTrans(ObjectNode plan) {
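The new Explainer.getStreamGraph filters the explained statements down to DML and renders only the first match, falling back to an empty ObjectNode when nothing qualifies. A minimal caller sketch under those assumptions; the wrapper class, method name and FlinkSQL text are illustrative, and the Executor is assumed to be built elsewhere:

```java
import com.dlink.executor.Executor;
import com.dlink.explainer.Explainer;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class ExplainerSketch {

    // Ask for the JSON plan of the first DML statement in a FlinkSQL script.
    public static ObjectNode planOf(Executor executor, String flinkSql) {
        Explainer explainer = Explainer.build(executor);
        // DDL is ignored; an empty ObjectNode comes back if the script contains no DML.
        return explainer.getStreamGraph(flinkSql);
    }
}
```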
dlink-core/src/main/java/com/dlink/explainer/trans/AbstractTrans.java

 package com.dlink.explainer.trans;
 
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
+import com.fasterxml.jackson.databind.JsonNode;
 import java.util.ArrayList;
 import java.util.List;
dlink-core/src/main/java/com/dlink/explainer/trans/Trans.java

 package com.dlink.explainer.trans;
 
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
+import com.fasterxml.jackson.databind.JsonNode;
 import java.util.List;
dlink-core/src/main/java/com/dlink/explainer/trans/TransGenerator.java

 package com.dlink.explainer.trans;
 
 import com.dlink.assertion.Asserts;
-import com.dlink.exception.SqlException;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+import com.fasterxml.jackson.databind.JsonNode;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 import java.util.ArrayList;
 import java.util.HashMap;
dlink-core/src/main/java/com/dlink/job/JobManager.java

@@ -15,6 +15,7 @@ import com.dlink.session.SessionConfig;
 import com.dlink.session.SessionInfo;
 import com.dlink.session.SessionPool;
 import com.dlink.trans.Operations;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 import org.apache.flink.api.common.JobID;
 import org.apache.flink.table.api.TableResult;

@@ -302,4 +303,9 @@ public class JobManager extends RunTime {
         Explainer explainer = Explainer.build(executor);
         return explainer.explainSqlResult(statement);
     }
 
+    public ObjectNode getStreamGraph(String statement) {
+        Explainer explainer = Explainer.build(executor);
+        return explainer.getStreamGraph(statement);
+    }
+
 }
dlink-core/src/main/java/com/dlink/plus/FlinkSqlPlus.java

@@ -5,7 +5,7 @@ import com.dlink.explainer.Explainer;
 import com.dlink.explainer.ca.ColumnCAResult;
 import com.dlink.explainer.ca.TableCAResult;
 import com.dlink.result.SqlExplainResult;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+import com.fasterxml.jackson.databind.node.ObjectNode;
 import org.apache.flink.table.api.ExplainDetail;
 import java.util.ArrayList;
dlink-web/package.json

@@ -46,7 +46,7 @@
     "not ie <= 10"
   ],
   "dependencies": {
-    "@ant-design/charts": "^1.1.18",
+    "@ant-design/charts": "^1.2.10",
     "@ant-design/icons": "^4.5.0",
     "@ant-design/pro-descriptions": "^1.6.8",
     "@ant-design/pro-form": "^1.18.3",
dlink-web/src/components/Studio/StudioConsole/StudioCA/index.tsx

@@ -106,7 +106,7 @@ const StudioCA = (props:any) => {
     });
     res.then((result) => {
       if (result.code == 0) {
-        setOneTableCAData(convertTreeData(result.datas[0]));
+        setOneTableCAData(fullTreeData(result.datas[0]));
       } else {
         setOneTableCAData(null);
       }

@@ -120,18 +120,18 @@ const StudioCA = (props:any) => {
     });
     res.then((result) => {
       if (result.code == 0) {
-        setOneColumnCAData(convertTreeData(result.datas[0]));
+        setOneColumnCAData(fullTreeData(result.datas[0]));
       } else {
         setOneColumnCAData(null);
       }
     })
   };
 
-  const convertTreeData = (node) => {
+  const fullTreeData = (node) => {
     if (node) {
       node.body = node.columns.toString();
       for (let i in node.children) {
-        node.children[i] = convertTreeData(node.children[i])
+        node.children[i] = fullTreeData(node.children[i])
       }
       return node;
     }
dlink-web/src/components/Studio/StudioMenu/StudioGraph/index.less  (new file, 0 → 100644, empty)
dlink-web/src/components/Studio/StudioMenu/StudioGraph/index.tsx  (new file, 0 → 100644)

import {Empty} from "antd";
import {FlowAnalysisGraph} from '@ant-design/charts';
import {StateType} from "@/pages/FlinkSqlStudio/model";
import {connect} from "umi";
import styles from "./index.less";
import React, {useState} from "react";

const StudioGraph = (props: any) => {
  const {graphData, current, currentSession} = props;

  const config = {
    data: graphData,
    nodeCfg: {
      size: [140, 25],
      items: {
        padding: 6,
        containerStyle: {
          fill: '#fff',
        },
        style: (cfg, group, type) => {
          const styles = {
            icon: {
              width: 12,
              height: 12,
            },
            value: {
              fill: '#f00',
            },
            text: {
              fill: '#aaa',
            },
          };
          return styles[type];
        },
      },
      nodeStateStyles: {
        hover: {
          stroke: '#1890ff',
          lineWidth: 2,
        },
      },
      title: {
        containerStyle: {
          fill: 'transparent',
        },
        style: {
          fill: '#000',
          fontSize: 12,
        },
      },
      style: {
        fill: '#E6EAF1',
        stroke: '#B2BED5',
        radius: [2, 2, 2, 2],
      },
    },
    edgeCfg: {
      label: {
        style: {
          fill: '#aaa',
          fontSize: 12,
          fillOpacity: 1,
        },
      },
      style: (edge) => {
        const stroke = edge.target === '0' ? '#c86bdd' : '#5ae859';
        return {
          stroke,
          lineWidth: 1,
          strokeOpacity: 0.5,
        };
      },
      edgeStateStyles: {
        hover: {
          lineWidth: 2,
          strokeOpacity: 1,
        },
      },
    },
    markerCfg: (cfg) => {
      const {edges} = graphData;
      return {
        position: 'right',
        show: edges.find((item) => item.source == cfg.id),
        collapsed: !edges.find((item) => item.source == cfg.id),
      };
    },
    behaviors: ['drag-canvas', 'zoom-canvas', 'drag-node'],
  };

  /*const buildGraphEdges = (nodes) => {
    let edges = [];
    for (let i in nodes) {
      if (nodes[i].predecessors) {
        for (let j in nodes[i].predecessors) {
          edges.push({
            source: nodes[i].predecessors[j].id.toString(),
            target: nodes[i].id.toString(),
            value: nodes[i].predecessors[j].ship_strategy
          })
        }
      }
    }
    return edges;
  };*/

  return (
    <>
      {graphData ? <FlowAnalysisGraph {...config} /> :
        <Empty image={Empty.PRESENTED_IMAGE_SIMPLE} />
      }
    </>
  );
};

export default connect(({Studio}: {Studio: StateType}) => ({
  current: Studio.current,
  currentSession: Studio.currentSession,
}))(StudioGraph);
dlink-web/src/components/Studio/StudioMenu/index.tsx

@@ -12,8 +12,9 @@ import Breadcrumb from "antd/es/breadcrumb/Breadcrumb";
 import {StateType} from "@/pages/FlinkSqlStudio/model";
 import {connect} from "umi";
 import {handleAddOrUpdate, postDataArray} from "@/components/Common/crud";
-import {executeSql, explainSql} from "@/pages/FlinkSqlStudio/service";
+import {executeSql, explainSql, getStreamGraph} from "@/pages/FlinkSqlStudio/service";
 import StudioHelp from "./StudioHelp";
+import StudioGraph from "./StudioGraph";
 import {showCluster, showTables} from "@/components/Studio/StudioEvent/DDL";
 import {useState} from "react";
 import StudioExplain from "../StudioConsole/StudioExplain";

@@ -29,7 +30,9 @@ const StudioMenu = (props: any) => {
   const {tabs, current, currentPath, form, refs, dispatch, currentSession} = props;
   const [modalVisible, handleModalVisible] = useState<boolean>(false);
+  const [graphModalVisible, handleGraphModalVisible] = useState<boolean>(false);
   const [explainData, setExplainData] = useState([]);
+  const [graphData, setGraphData] = useState();
 
   const execute = () => {
     let selectsql = null;

@@ -131,7 +134,7 @@ const StudioMenu = (props: any) => {
     });
   };
 
   const onCheckSql = () => {
     let selectsql = null;
     if (current.monaco.current) {
       let selection = current.monaco.current.editor.getSelection();

@@ -171,6 +174,65 @@ const StudioMenu = (props: any) => {
     })
   };
 
+  const onGetStreamGraph = () => {
+    let selectsql = null;
+    if (current.monaco.current) {
+      let selection = current.monaco.current.editor.getSelection();
+      selectsql = current.monaco.current.editor.getModel().getValueInRange(selection);
+    }
+    if (selectsql == null || selectsql == '') {
+      selectsql = current.value;
+    }
+    let useSession = !!currentSession.session;
+    let param = {
+      useSession: useSession,
+      session: currentSession.session,
+      useRemote: current.task.useRemote,
+      clusterId: current.task.clusterId,
+      useResult: current.task.useResult,
+      maxRowNum: current.task.maxRowNum,
+      statement: selectsql,
+      fragment: current.task.fragment,
+      jobName: current.task.jobName,
+      parallelism: current.task.parallelism,
+      checkPoint: current.task.checkPoint,
+      savePointPath: current.task.savePointPath,
+    };
+    const res = getStreamGraph(param);
+    handleGraphModalVisible(true);
+    res.then((result) => {
+      if (result.code == 0) {
+        setGraphData(buildGraphData(result.datas));
+      } else {
+        setGraphData(undefined);
+      }
+    })
+  };
+
+  const buildGraphData = (data) => {
+    let edges = [];
+    for (let i in data.nodes) {
+      data.nodes[i].id = data.nodes[i].id.toString();
+      data.nodes[i].value = {
+        title: data.nodes[i].pact,
+        items: [
+          {
+            text: data.nodes[i].contents,
+          },
+        ],
+      };
+      if (data.nodes[i].predecessors) {
+        for (let j in data.nodes[i].predecessors) {
+          edges.push({
+            source: data.nodes[i].predecessors[j].id.toString(),
+            target: data.nodes[i].id.toString(),
+            value: data.nodes[i].predecessors[j].ship_strategy
+          })
+        }
+      }
+    }
+    data.edges = edges;
+    return data;
+  };
+
   const saveSqlAndSettingToTask = async () => {
     const fieldsValue = await form.validateFields();
     if (current.task) {

@@ -288,14 +350,17 @@ const StudioMenu = (props: any) => {
       <Tooltip title="检查当前的 FlinkSql">
         <Button
           type="text"
           icon={<SafetyCertificateTwoTone />}
           onClick={onCheckSql}
         />
       </Tooltip>
+      <Tooltip title="获取当前的 FlinkSql 的执行图">
         <Button
           type="text"
-          icon={<FlagTwoTone twoToneColor="#ddd" />}
+          icon={<FlagTwoTone />}
+          onClick={onGetStreamGraph}
         />
+      </Tooltip>
       <Tooltip title="执行当前的 FlinkSql">
         <Button
           type="text"

@@ -358,6 +423,16 @@ const StudioMenu = (props: any) => {
         modalVisible={modalVisible}
         data={explainData}
       />
+      <Modal
+        width={1200}
+        bodyStyle={{padding: '32px 40px 48px'}}
+        destroyOnClose
+        title="FlinkSQL 的 StreamGraph"
+        visible={graphModalVisible}
+        onCancel={() => handleGraphModalVisible(false)}
+      >
+        <StudioGraph graphData={graphData} />
+      </Modal>
     </Row>
   );
 };
dlink-web/src/pages/FlinkSqlStudio/service.ts

@@ -28,6 +28,15 @@ export async function explainSql(params: StudioParam) {
   });
 }
 
+export async function getStreamGraph(params: StudioParam) {
+  return request<API.Result>('/api/studio/getStreamGraph', {
+    method: 'POST',
+    data: {
+      ...params,
+    },
+  });
+}
+
 export async function getJobData(jobId: string) {
   return request<API.Result>('/api/studio/getJobData', {
     method: 'GET',
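With the frontend service in place, the new endpoint is a plain JSON POST, so it can also be exercised outside the web UI. A minimal sketch using Java 11's HttpClient; the host, port and field values are illustrative assumptions, not taken from this commit:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class GetStreamGraphRequest {

    public static void main(String[] args) throws Exception {
        // Field names mirror the param object built in onGetStreamGraph; values are only examples.
        String body = "{\"useSession\":false,\"useRemote\":false,\"fragment\":false,"
                + "\"maxRowNum\":100,\"statement\":\"INSERT INTO sink SELECT * FROM source\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://127.0.0.1:8888/api/studio/getStreamGraph")) // assumed local dlink-admin address
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The response wraps the StreamGraph plan: nodes with id, pact, contents and predecessors,
        // which buildGraphData converts into nodes/edges for FlowAnalysisGraph.
        System.out.println(response.body());
    }
}
```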