
Getting the application list from hadoop-yarn


1. Background

In general, things running on a Hadoop cluster need to be monitored, and sometimes jobs need to be killed, so you have to call the YARN APIs. Below is the basic operation: listing the Flink programs that are currently running. The same report also exposes CPU, memory, and other usage information.

2. Code

    import java.io.IOException;
    import java.util.EnumSet;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;
    import org.apache.hadoop.yarn.api.records.ApplicationReport;
    import org.apache.hadoop.yarn.api.records.YarnApplicationState;
    import org.apache.hadoop.yarn.client.api.YarnClient;
    import org.apache.hadoop.yarn.conf.YarnConfiguration;
    import org.apache.hadoop.yarn.exceptions.YarnException;

    public class YarnAppLister {

        public static void main(String[] args) throws IOException, YarnException {
            // YarnConfLoader is a project helper (not part of the Hadoop API) that builds a
            // YarnConfiguration from the yarn-site.xml/core-site.xml in the given directory.
            YarnConfiguration yarnConf = YarnConfLoader.getYarnConf("/Users/qqr/Downloads/yarn-conf");
            YarnClient yarnClient = YarnClient.createYarnClient();
            yarnClient.init(yarnConf);
            yarnClient.start();
            // Filter by ApplicationType: Flink jobs register on YARN as "Apache Flink".
            Set<String> types = new HashSet<>();
            types.add("Apache Flink");
            // Only list applications that are currently RUNNING.
            EnumSet<YarnApplicationState> states = EnumSet.noneOf(YarnApplicationState.class);
            states.add(YarnApplicationState.RUNNING);
            List<ApplicationReport> reportList = yarnClient.getApplications(types, states);
            for (ApplicationReport report : reportList) {
                System.out.println("id-name:" + report.getApplicationId() + ":" + report.getName());
                System.out.println("mem:" + report.getApplicationResourceUsageReport().getNeededResources().getMemory());
                System.out.println();
            }
            yarnClient.stop();
        }
    }

Output:

id-name:application_1546069948045_128203:huoguo
mem:2048

id-name:application_1542000413153_5869:binlog_hbase
mem:2048
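
The background also mentions killing jobs and reading actual CPU/memory usage, which the listing code above does not cover. Below is a minimal sketch under the same setup; the method name printUsageAndKill is illustrative, and it additionally uses ApplicationId and ApplicationResourceUsageReport from org.apache.hadoop.yarn.api.records:

    // Sketch: print actual resource usage for one application and then kill it.
    // Assumes the same YarnClient as above, already init()-ed and start()-ed.
    public static void printUsageAndKill(YarnClient yarnClient, ApplicationId appId)
            throws IOException, YarnException {
        ApplicationReport report = yarnClient.getApplicationReport(appId);
        ApplicationResourceUsageReport usage = report.getApplicationResourceUsageReport();
        // Resources currently held by the application: containers, memory (MB), virtual cores.
        System.out.println("containers:" + usage.getNumUsedContainers());
        System.out.println("used-mem:" + usage.getUsedResources().getMemory());
        System.out.println("used-vcores:" + usage.getUsedResources().getVirtualCores());
        // Ask the ResourceManager to stop the application (same effect as `yarn application -kill`).
        yarnClient.killApplication(appId);
    }

The ApplicationId from report.getApplicationId() in the listing loop can be passed in directly as appId.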
