feature(hudi-query): implement Hudi table timeline query
Timeline queries against the archived timeline do not appear to return the merged timeline state directly; the timeline contents will need separate post-processing later.
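The filtering contract introduced by this commit can be stated independently of Hudi: when `filter_type` is omitted, both active and archived instants are included; `filter_action` and `filter_state` each narrow the result only when supplied; and the merged list is sorted by timestamp. A minimal, self-contained Java sketch of that contract (the `TimelineInstant` record and plain `java.util` collections are illustrative stand-ins, not the service's actual `HudiInstant` or Eclipse Collections types):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative stand-in for the service's HudiInstant entity.
record TimelineInstant(String type, String timestamp, String action, String state) {}

public class TimelineFilterSketch {
    // An empty/absent filter_type means "both sources"; an empty filter_action
    // or filter_state means "no filtering on that field" -- mirroring the diff.
    static List<TimelineInstant> filter(List<TimelineInstant> all,
                                        List<String> types,
                                        List<String> actions,
                                        List<String> states) {
        List<String> effectiveTypes =
                (types == null || types.isEmpty()) ? List.of("archive", "active") : types;
        List<TimelineInstant> out = new ArrayList<>();
        for (TimelineInstant i : all) {
            if (!effectiveTypes.contains(i.type())) continue;
            if (actions != null && !actions.isEmpty() && !actions.contains(i.action())) continue;
            if (states != null && !states.isEmpty() && !states.contains(i.state())) continue;
            out.add(i);
        }
        // Hudi instant timestamps are sortable strings (yyyyMMddHHmmss...),
        // so a plain string sort yields chronological order.
        out.sort(Comparator.comparing(TimelineInstant::timestamp));
        return out;
    }

    public static void main(String[] args) {
        List<TimelineInstant> all = List.of(
                new TimelineInstant("active", "20230501103000", "commit", "COMPLETED"),
                new TimelineInstant("archive", "20230401103000", "clean", "COMPLETED"),
                new TimelineInstant("active", "20230501104500", "rollback", "INFLIGHT"));
        System.out.println(filter(all, null, null, null).size());                              // 3
        System.out.println(filter(all, List.of("active"), null, List.of("COMPLETED")).size()); // 1
    }
}
```

The real service expresses the same narrowing with Eclipse Collections `select` calls guarded by `ObjectUtil.isEmpty` checks, as shown in the TimelineService hunk below.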
@@ -0,0 +1,23 @@
+package com.lanyuanxiaoyao.service.forest.service;
+
+import com.dtflys.forest.annotation.BaseRequest;
+import com.dtflys.forest.annotation.Get;
+import com.dtflys.forest.annotation.Query;
+import com.lanyuanxiaoyao.service.configuration.entity.hudi.HudiInstant;
+import java.util.Map;
+import org.eclipse.collections.api.list.ImmutableList;
+
+/**
+ * Hudi operations
+ *
+ * @author lanyuanxiaoyao
+ * @date 2023-05-01
+ */
+@BaseRequest(baseURL = "http://service-hudi-query")
+public interface HudiService {
+    @Get("/timeline/list")
+    ImmutableList<HudiInstant> timelineList(@Query Map<String, Object> queryMap);
+
+    @Get("/timeline/list_hdfs")
+    ImmutableList<HudiInstant> timelineHdfsList(@Query Map<String, Object> queryMap);
+}
@@ -3,6 +3,8 @@ package com.lanyuanxiaoyao.service.hudi.controller;
 import com.lanyuanxiaoyao.service.configuration.entity.hudi.HudiInstant;
 import com.lanyuanxiaoyao.service.hudi.service.TimelineService;
 import java.io.IOException;
+import java.util.List;
+import org.eclipse.collections.api.factory.Lists;
 import org.eclipse.collections.api.list.ImmutableList;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -29,12 +31,34 @@ public class TimelineController {
     }

     @GetMapping("list")
-    public ImmutableList<HudiInstant> allInstants(@RequestParam("flink_job_id") Long flinkJobId, @RequestParam("alias") String alias) throws IOException {
-        return timelineService.timeline(flinkJobId, alias);
+    public ImmutableList<HudiInstant> allInstants(
+            @RequestParam("flink_job_id") Long flinkJobId,
+            @RequestParam("alias") String alias,
+            @RequestParam(value = "filter_type", required = false) List<String> filterType,
+            @RequestParam(value = "filter_action", required = false) List<String> filterAction,
+            @RequestParam(value = "filter_state", required = false) List<String> filterState
+    ) throws IOException {
+        return timelineService.timeline(
+                flinkJobId,
+                alias,
+                Lists.immutable.ofAll(filterType),
+                Lists.immutable.ofAll(filterAction),
+                Lists.immutable.ofAll(filterState)
+        );
     }

     @GetMapping("list_hdfs")
-    public ImmutableList<HudiInstant> allInstants(@RequestParam("hdfs") String hdfs) throws IOException {
-        return timelineService.timeline(hdfs);
+    public ImmutableList<HudiInstant> allInstants(
+            @RequestParam("hdfs") String hdfs,
+            @RequestParam(value = "filter_type", required = false) List<String> filterType,
+            @RequestParam(value = "filter_action", required = false) List<String> filterAction,
+            @RequestParam(value = "filter_state", required = false) List<String> filterState
+    ) throws IOException {
+        return timelineService.timeline(
+                hdfs,
+                Lists.immutable.ofAll(filterType),
+                Lists.immutable.ofAll(filterAction),
+                Lists.immutable.ofAll(filterState)
+        );
     }
 }
@@ -1,5 +1,6 @@
 package com.lanyuanxiaoyao.service.hudi.service;

+import cn.hutool.core.util.ObjectUtil;
 import com.eshore.odcp.hudi.connector.entity.TableMeta;
 import com.lanyuanxiaoyao.service.configuration.entity.hudi.HudiInstant;
 import com.lanyuanxiaoyao.service.forest.service.InfoService;
@@ -8,7 +9,9 @@ import java.io.IOException;
 import org.apache.hadoop.conf.Configuration;
 import org.apache.hudi.common.table.HoodieTableMetaClient;
 import org.apache.hudi.common.table.timeline.HoodieInstant;
+import org.eclipse.collections.api.factory.Lists;
 import org.eclipse.collections.api.list.ImmutableList;
+import org.eclipse.collections.api.list.MutableList;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.cache.annotation.Cacheable;
@@ -34,23 +37,40 @@ public class TimelineService {

     @Cacheable(value = "timeline", sync = true, key = "#flinkJobId.toString()+#alias")
     @Retryable(Throwable.class)
-    public ImmutableList<HudiInstant> timeline(Long flinkJobId, String alias) throws IOException {
+    public ImmutableList<HudiInstant> timeline(Long flinkJobId, String alias, ImmutableList<String> filterType, ImmutableList<String> filterAction, ImmutableList<String> filterState) throws IOException {
         TableMeta meta = infoService.tableMetaDetail(flinkJobId, alias);
-        return timeline(meta.getHudi().getTargetHdfsPath());
+        return timeline(meta.getHudi().getTargetHdfsPath(), filterType, filterAction, filterState);
     }

+    private static final String INSTANT_TYPE_ACTIVE = "active";
+    private static final String INSTANT_TYPE_ARCHIVE = "archive";
+
     @Cacheable(value = "timeline", sync = true, key = "#hdfs")
     @Retryable(Throwable.class)
-    public ImmutableList<HudiInstant> timeline(String hdfs) throws IOException {
+    public ImmutableList<HudiInstant> timeline(String hdfs, ImmutableList<String> filterType, ImmutableList<String> filterAction, ImmutableList<String> filterState) throws IOException {
         HoodieTableMetaClient client = HoodieTableMetaClient.builder()
                 .setConf(new Configuration())
                 .setBasePath(hdfs)
                 .build();
-        ImmutableList<HudiInstant> activeInstants = HoodieUtils.getAllInstants(client, HoodieTableMetaClient::getActiveTimeline)
-                .collect(instant -> covert("active", instant));
-        ImmutableList<HudiInstant> archiveInstants = HoodieUtils.getAllInstants(client, HoodieTableMetaClient::getArchivedTimeline)
-                .collect(instant -> covert("archive", instant));
-        return activeInstants.newWithAll(archiveInstants)
+        MutableList<HudiInstant> instants = Lists.mutable.empty();
+        if (ObjectUtil.isEmpty(filterType)) {
+            filterType = Lists.immutable.of(INSTANT_TYPE_ARCHIVE, INSTANT_TYPE_ACTIVE);
+        }
+        if (filterType.contains(INSTANT_TYPE_ARCHIVE)) {
+            HoodieUtils.getAllInstants(client, HoodieTableMetaClient::getArchivedTimeline)
+                    .collect(instant -> covert(INSTANT_TYPE_ARCHIVE, instant))
+                    .select(instant -> ObjectUtil.isEmpty(filterAction) || filterAction.contains(instant.getAction()))
+                    .select(instant -> ObjectUtil.isEmpty(filterState) || filterState.contains(instant.getState()))
+                    .forEach(instants::add);
+        }
+        if (filterType.contains(INSTANT_TYPE_ACTIVE)) {
+            HoodieUtils.getAllInstants(client, HoodieTableMetaClient::getActiveTimeline)
+                    .collect(instant -> covert(INSTANT_TYPE_ACTIVE, instant))
+                    .select(instant -> ObjectUtil.isEmpty(filterAction) || filterAction.contains(instant.getAction()))
+                    .select(instant -> ObjectUtil.isEmpty(filterState) || filterState.contains(instant.getState()))
+                    .forEach(instants::add);
+        }
+        return instants
                 .toSortedList(HudiInstant::compareTo)
                 .toImmutable();
     }
@@ -0,0 +1,56 @@
+package com.lanyuanxiaoyao.service.web.controller;
+
+import cn.hutool.core.util.ObjectUtil;
+import com.lanyuanxiaoyao.service.configuration.entity.AmisResponse;
+import com.lanyuanxiaoyao.service.forest.service.HudiService;
+import java.util.List;
+import org.eclipse.collections.api.factory.Maps;
+import org.eclipse.collections.api.map.MutableMap;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.web.bind.annotation.GetMapping;
+import org.springframework.web.bind.annotation.RequestMapping;
+import org.springframework.web.bind.annotation.RequestParam;
+import org.springframework.web.bind.annotation.RestController;
+
+/**
+ * Hudi endpoints
+ *
+ * @author lanyuanxiaoyao
+ * @date 2023-05-01
+ */
+@RestController
+@RequestMapping("hudi")
+public class HudiController extends BaseController {
+    private static final Logger logger = LoggerFactory.getLogger(HudiController.class);
+
+    private final HudiService hudiService;
+
+    @SuppressWarnings("SpringJavaInjectionPointsAutowiringInspection")
+    public HudiController(HudiService hudiService) {
+        this.hudiService = hudiService;
+    }
+
+    @GetMapping("/timeline/list")
+    public AmisResponse timeline(
+            @RequestParam("flink_job_id") Long flinkJobId,
+            @RequestParam("alias") String alias,
+            @RequestParam(value = "filter_type", required = false) List<String> filterType,
+            @RequestParam(value = "filter_action", required = false) List<String> filterAction,
+            @RequestParam(value = "filter_state", required = false) List<String> filterState
+    ) {
+        MutableMap<String, Object> queryMap = Maps.mutable.empty();
+        queryMap.put("flink_job_id", flinkJobId);
+        queryMap.put("alias", alias);
+        if (ObjectUtil.isNotEmpty(filterType)) {
+            queryMap.put("filter_type", filterType);
+        }
+        if (ObjectUtil.isNotEmpty(filterAction)) {
+            queryMap.put("filter_action", filterAction);
+        }
+        if (ObjectUtil.isNotEmpty(filterState)) {
+            queryMap.put("filter_state", filterState);
+        }
+        return responseCrudData(hudiService.timelineList(queryMap));
+    }
+}
@@ -659,3 +659,49 @@ function publishTypeMapping(field) {
     },
   }
 }
+
+function hudiTimelineActionMapping(field) {
+  return {
+    type: 'mapping',
+    value: `\${${field}}`,
+    map: {
+      'commit': "<span class='label label-info'>Commit</span>",
+      'deltacommit': "<span class='label label-info'>Delta Commit</span>",
+      'clean': "<span class='label label-info'>Clean</span>",
+      'rollback': "<span class='label label-danger'>Rollback</span>",
+      'savepoint': "<span class='label label-info'>Savepoint</span>",
+      'replacecommit': "<span class='label label-warning'>Replace Commit</span>",
+      'compaction': "<span class='label label-success'>Compaction</span>",
+      'restore': "<span class='label label-warning'>Restore</span>",
+      'indexing': "<span class='label label-info'>Indexing</span>",
+      'schemacommit': "<span class='label label-warning'>Schema Commit</span>",
+      '*': `<span class='label bg-gray-300'>\${${field}}</span>`
+    },
+  }
+}
+
+function hudiTimelineStateMapping(field) {
+  return {
+    type: 'mapping',
+    value: `\${${field}}`,
+    map: {
+      'REQUESTED': "<span class='label label-info'>已提交</span>",
+      'INFLIGHT': "<span class='label label-warning'>操作中</span>",
+      'COMPLETED': "<span class='label label-success'>已完成</span>",
+      'INVALID': "<span class='label label-danger'>错误</span>",
+      '*': `<span class='label bg-gray-300'>\${${field}}</span>`
+    },
+  }
+}
+
+function hudiTimelineTypeMapping(field) {
+  return {
+    type: 'mapping',
+    value: `\${${field}}`,
+    map: {
+      'active': "<span class='label label-info'>活跃</span>",
+      'archive': "<span class='label bg-gray-300'>归档</span>",
+      '*': `<span class='label bg-gray-300'>\${${field}}</span>`
+    },
+  }
+}
@@ -68,7 +68,7 @@ function tableTab() {
     filterDefaultVisible: true,
     stopAutoRefreshWhenModalIsOpen: true,
     resizable: false,
-    perPage: 10,
+    perPage: 20,
     headerToolbar: [
       "reload",
       'filter-toggler',
@@ -255,6 +255,55 @@ function tableTab() {
           actionType: 'dialog',
           dialog: simpleYarnDialog('compaction', '压缩详情')
         },
+        {
+          label: '时间线',
+          type: 'action',
+          level: 'link',
+          actionType: 'dialog',
+          dialog: {
+            title: 'Hudi 表时间线',
+            actions: [],
+            size: 'lg',
+            body: {
+              type: 'crud',
+              api: {
+                method: 'get',
+                url: '${base}/hudi/timeline/list',
+                data: {
+                  flink_job_id: '${flinkJobId}',
+                  alias: '${tableMeta.alias}',
+                  filter_type: 'active'
+                },
+              },
+              syncLocation: false,
+              columns: [
+                {
+                  name: 'timestamp',
+                  label: '时间点',
+                },
+                {
+                  name: 'action',
+                  label: '类型',
+                  ...hudiTimelineActionMapping('action'),
+                },
+                {
+                  name: 'state',
+                  label: '状态',
+                  ...hudiTimelineStateMapping('state'),
+                },
+                {
+                  name: 'fileName',
+                  label: '文件名',
+                },
+                {
+                  name: 'type',
+                  label: '来源',
+                  ...hudiTimelineTypeMapping('type'),
+                },
+              ],
+            }
+          }
+        },
         {
           label: '队列',
           type: 'action',