diff --git a/.gitignore b/.gitignore new file mode 100644 index 0000000..a205d36 Binary files /dev/null and b/.gitignore differ diff --git a/README.md b/README.md index 20ddaa9..d3d7b80 100644 --- a/README.md +++ b/README.md @@ -1,7 +1,91 @@ +# We're moving... + +We've been replaced! Well, actually we're growing to more than just hdfs. A new project has been created to reflect this and is now the active replacement. Please see `hadoopcli` for all the same capabilities, plus more! + +[Hadoop-Cli](https://github.com/dstreev/hadoop-cli) + ## HDFS-CLI HDFS-CLI is an interactive command line shell that makes interacting with the Hadoop Distributed Filesystem (HDFS) -simpler and more intuitive than the standard command-line tools that come with Hadoop. If you are familiar with OS X, Linux, or even Windows terminal/console-based applications, then you are likely familiar with features such as tab completion, command history, and ANSI formatting. +simpler and more intuitive than the standard command-line tools that come with Hadoop. If you're familiar with OS X, Linux, or even Windows terminal/console-based applications, then you are likely familiar with features such as tab completion, command history, and ANSI formatting. + +### Binary Package + +[Pre-Built Distribution](https://github.com/dstreev/hdfs-cli/releases) + +Download the release files to a temp location. As the root user, chmod +x the 3 shell script files, then run 'setup.sh'. This will create and install the hdfscli application to your path. + +Try it out on a host with default configs: + + hdfscli + +To use an alternate HADOOP_CONF_DIR: + + hdfscli --config /var/hadoop/dev-cfg + +### Release Notes + +#### 2.3.2-SNAPSHOT (in-progress) + +##### Behavioural Changes +- Due to the complexities and requirements of connecting to an environment, I've removed the +ability to connect manually to an environment by simply using 'connect'.
This option was there from the beginning, but as more and more features are added, I'm finding myself hacking +away at recreating the settings and controls enabled through the configurations available in +`hdfs-site.xml` and `core-site.xml`. Therefore, the options `-k for kerberos` and `-a for auto connect` are no longer available. Unless specified via the `--config` option, the `hdfs-site.xml` and `core-site.xml` files in the default location of `/etc/hadoop/conf` will be used to establish all of the environment variables needed to connect. You can have multiple +directories with various `hdfs|core-site.xml` files in them and use the `--config` option to +enable connectivity to alternate hadoop environments. +##### Removed Options +`-a` Auto Connect. Use `--config` for alternate site files, or nothing for the default `/etc/hadoop/conf`. +`-k` Kerberos Option + +##### Enhancements +I noticed some pauses coming from inquiries into the Namenode JMX for `nnstat`. Instead of requesting the +entire Namenode JMX stack, we now target only the JMX beans that we're interested in. This will reduce +the observed pauses and relieve the Namenode of some unnecessary work. + +#### 2.3.1-SNAPSHOT (in-progress) + - Added new 'nnstat' function to collect Namenode JMX stats for long-term analysis. + +See [NN Stat Feature](https://youtu.be/CZxx_BxCX4Y) + +Also check out how to auto-fetch the stats via a script. Use this technique to run cron jobs that gather the stats. + +Here `-i stats` defines the initialization file. See [Auto nnstat](https://youtu.be/gy43_Hg2RXk) +``` + hdfscli -i stats +``` + +#### 2.3.0-SNAPSHOT + - Added new 'lsp' function. Consider it an 'ls' PLUS. + +#### 2.2.1-SNAPSHOT + + - External Config Support (See Below) + - Supports NN HA (through hdfs and core site files) + - Auto Config support using a default config directory.
+ +#### 2.2.0-SNAPSHOT + + - Setup Script to help deploy (bin/setup.sh) + - hdfscli shell script to launch (bin/hdfscli.sh) + - Support for initialization Script (-i ) + - Kerberos Support via default and --config option for hadoop site files. + +#### 2.1.0 + + - Added support for create/delete/rename Snapshot + +#### 2.0.0 + + - Initial release, forked from P. Taylor Goetz. + - Update to 2.6.0 Hadoop Libraries + - Re-wrote Command Implementation to use FSShell as the basis for issuing commands. + - Provide Context Feedback in the command window to show local and remote context. + - Added several hdfs dfs commands that were previously missing. + +### Building + +This project requires the artifacts from https://github.com/dstreev/stemshell, a fork enhanced to process command-line parameters and handle quoted variables. ### Basic Usage HDFS-CLI works much like a command-line ftp client: You first establish a connection to a remote HDFS filesystem, @@ -9,7 +93,15 @@ then manage local/remote files and transfers. To start HDFS-CLI, run the following command: - java -jar hdfs-cli-0.0.1-SNAPSHOT.jar + java -jar hdfs-cli-full-bin.jar + +### Command Documentation + +Help for any command can be obtained by executing the `help` command: + + help pwd + +Note that currently, documentation may be limited. #### Local vs. Remote Commands When working within an HDFS-CLI session, you manage both local (on your computer) and remote (HDFS) files. By convention, commands that apply to both local and remote filesystems are differentiated by prepending an `l` @@ -23,14 +115,115 @@ For example: Every HDFS-CLI session keeps track of both the local and remote current working directories. +### Support for External Configurations (core-site.xml,hdfs-site.xml) + +By default, hdfs-cli will use `/etc/hadoop/conf` as the default location to search for +`core-site.xml` and `hdfs-site.xml`.
If you want to use an alternate location, use the `--config` +option when starting up hdfs-cli. + +The `--config` option takes one parameter, a local directory. This directory should contain hdfs-site.xml and core-site.xml files. When used, you'll automatically be connected to hdfs and changed to your HDFS home directory. + +Example connection parameters: + + # Use the hadoop files in the input directory to configure and connect to HDFS. + hdfscli --config ../mydir + +This can be used in conjunction with the 'Startup' Init option below to run a set of commands automatically after the connection is made. The 'connect' option should NOT be used in the initialization script. + +### Startup Initialization Option + +When the CLI is launched with the option '-i ', it will run all the commands in the file. + +The file needs to be located in the $HOME/.hdfs-cli directory. For example: + + # If you're using the helper shell script + hdfscli -i test + + # If you're using the java command + java -jar hdfs-cli-full-bin.jar -i test + + +This will initialize the session with the command(s) in $HOME/.hdfs-cli/test. One command per line. + +The contents can be any set of valid commands that you would use in the cli. For example: + + cd user/dstreev + +### NN Stats + +Collect Namenode stats from the available Namenode JMX URLs.
+ +Three types of stats are currently collected and written to HDFS (with the -o option) or to the screen (no option specified) + +The 'default' delimiter for all records is '\u0001' (Cntl-A) + +>> Namenode Information: (optionally written to the directory 'nn_info') + Fields: Timestamp, HostAndPort, State, Version, Used, Free, Safemode, TotalBlocks, TotalFiles, NumberOfMissingBlocks, NumberOfMissingBlocksWithReplicationFactorOne + +>> Filesystem State: (optionally written to the directory 'fs_state') + Fields: Timestamp, HostAndPort, State, CapacityUsed, CapacityRemaining, BlocksTotal, PendingReplicationBlocks, UnderReplicatedBlocks, ScheduledReplicationBlocks, PendingDeletionBlocks, FSState, NumLiveDataNodes, NumDeadDataNodes, NumDecomLiveDataNodes, NumDecomDeadDataNodes, VolumeFailuresTotal + +>> Top User Operations: (optionally written to the directory 'top_user_ops') + Fields: Timestamp, HostAndPort, State, WindowLenMs, Operation, User, Count + +[Hive Table DDL for NN Stats](./src/main/hive/nn_stats.ddl) + +### Enhanced Directory Listing (lsp) + +Like 'ls', you can fetch many details about a file. But with this, you can also add information about the file that includes: +- Block Size +- Access Time +- Ratio of File Size to Block Size +- Datanode information for the file's blocks (Host and Block Id) + +Use help to get the options: + + help lsp + +``` +usage: stats [OPTION ...] [ARGS ...] +Options: + -d,--maxDepth Depth of Recursion (default 5), use '-1' + for unlimited + -f,--format Comma separated list of one or more: + permissions_long,replication,user,group,siz + e,block_size,ratio,mod,access,path,datanode + _info (default all of the above) + -o,--output Output File (HDFS) (default System.out) +``` + +When no argument is specified, it will use the current directory.
+ +Examples: + + # Using the default format, output a listing of the files in `/user/dstreev/perf` to `/tmp/test.out` + lsp -o /tmp/test.out /user/dstreev/perf + +Output with the default format of: + + permissions_long,replication,user,group,size,block_size,ratio,mod,access,path,datanode_info + +``` + rw-------,3,dstreev,hdfs,429496700,134217728,3.200,2015-10-24 12:26:39.689,2015-10-24 12:23:27.406,/user/dstreev/perf/teragen_27/part-m-00004,10.0.0.166,d2.hdp.local,blk_1073747900 + rw-------,3,dstreev,hdfs,429496700,134217728,3.200,2015-10-24 12:26:39.689,2015-10-24 12:23:27.406,/user/dstreev/perf/teragen_27/part-m-00004,10.0.0.167,d3.hdp.local,blk_1073747900 + rw-------,3,dstreev,hdfs,33,134217728,2.459E-7,2015-10-24 12:27:09.134,2015-10-24 12:27:06.560,/user/dstreev/perf/terasort_27/_partition.lst,10.0.0.166,d2.hdp.local,blk_1073747909 + rw-------,3,dstreev,hdfs,33,134217728,2.459E-7,2015-10-24 12:27:09.134,2015-10-24 12:27:06.560,/user/dstreev/perf/terasort_27/_partition.lst,10.0.0.167,d3.hdp.local,blk_1073747909 + rw-------,1,dstreev,hdfs,543201700,134217728,4.047,2015-10-24 12:29:28.706,2015-10-24 12:29:20.882,/user/dstreev/perf/terasort_27/part-r-00002,10.0.0.167,d3.hdp.local,blk_1073747920 + rw-------,1,dstreev,hdfs,543201700,134217728,4.047,2015-10-24 12:29:28.706,2015-10-24 12:29:20.882,/user/dstreev/perf/terasort_27/part-r-00002,10.0.0.167,d3.hdp.local,blk_1073747921 +``` + +With the file in HDFS, you can build a [hive table](./src/main/hive/lsp.ddl) on top of it to do some analysis. One of the reasons I created this was to be able to review a directory used by some process and get a bearing on the file construction and distribution across the cluster. + +#### Use Cases +- The ratio can be used to identify files that are below the block size (small files). +- With the Datanode information, you can determine if a dataset is hot-spotted on a cluster. All you need is a full list of hosts to join the results with.
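As a sketch of the hot-spot use case above (assuming the `hdfs_analysis` and `hosts` tables from `src/main/hive/lsp.ddl` have been built, and that `hosts` was populated manually so every cluster node is present), a Hive query along these lines would surface how the dataset's blocks are spread across datanodes:

```sql
-- Hypothetical hot-spot check: count blocks per datanode for the dataset.
-- Starting from `hosts` with a LEFT OUTER JOIN keeps nodes that hold
-- zero blocks visible, which is itself a sign of skewed distribution.
SELECT h.hostname,
       COUNT(a.block_id) AS block_count
FROM hosts h
LEFT OUTER JOIN hdfs_analysis a
  ON h.hostname = a.hostname
GROUP BY h.hostname
ORDER BY block_count DESC;
```

A node whose `block_count` is far above (or below) the cluster average is a candidate hot spot; this mirrors the note in `lsp.ddl` about building the hosts list manually so that nodes holding no blocks are accounted for.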
### Available Commands #### Common Commands - connect connect to a remote HDFS instance help display help information put upload local files to the remote HDFS - + get (todo) retrieve remote files from HDFS to Local Filesystem #### Remote (HDFS) Commands cd change current working directory @@ -38,9 +231,21 @@ Every HDFS-CLI session keeps track of both the local and remote current working rm delete files/directories pwd print working directory path cat print file contents + chown change ownership + chmod change permissions + chgrp change group head print first few lines of a file mkdir create directories + count Count the number of directories, files and bytes under the paths that match the specified file pattern. + stat Print statistics about the file/directory in the specified format. + tail Displays the last kilobyte of the file to stdout. + text Takes a source file and outputs the file in text format. + touchz Create a file of zero length. + usage Return the help for an individual command. + createSnapshot Create Snapshot + deleteSnapshot Delete Snapshot + renameSnapshot Rename Snapshot #### Local (Local File System) Commands lcd change current working directory @@ -50,19 +255,25 @@ Every HDFS-CLI session keeps track of both the local and remote current working lcat print file contents lhead print first few lines of a file lmkdir create directories - -### Command Documentation -Help for any command can be obtained by executing the `help` command: - - help pwd - -Note that currently, documentation may be limited. +#### Tools and Utilities + lsp ls plus. Includes Block information and locations.
+ nnstat Namenode Statistics ### Known Bugs/Limitations * No support for paths containing spaces * No support for Windows XP +* Path Completion for chown, chmod, chgrp, rm is broken + +### Road Map + +- Support input variables +- Expand to support Extended ACL's (get/set) +- Add Support for setrep +- HA Commands + - NN and RM + diff --git a/bin/setup.sh b/bin/setup.sh new file mode 100755 index 0000000..22486ba --- /dev/null +++ b/bin/setup.sh @@ -0,0 +1,28 @@ +#!/usr/bin/env bash + +# Should be run as root. + +cd `dirname $0` + +mkdir -p /usr/local/hdfs-cli/bin +mkdir -p /usr/local/hdfs-cli/lib + +cp -f hdfscli /usr/local/hdfs-cli/bin +cp -f JCECheck /usr/local/hdfs-cli/bin + +if [ -f ../target/hdfs-cli-full-bin.jar ]; then + cp -f ../target/hdfs-cli-full-bin.jar /usr/local/hdfs-cli/lib +fi + +if [ -f hdfs-cli-full-bin.jar ]; then + cp -f hdfs-cli-full-bin.jar /usr/local/hdfs-cli/lib +fi + +chmod -R +r /usr/local/hdfs-cli +chmod +x /usr/local/hdfs-cli/bin/hdfscli +chmod +x /usr/local/hdfs-cli/bin/JCECheck + +ln -sf /usr/local/hdfs-cli/bin/JCECheck /usr/local/bin/JCECheck +ln -sf /usr/local/hdfs-cli/bin/hdfscli /usr/local/bin/hdfscli + + diff --git a/pom.xml b/pom.xml index 817f94b..e5f7d04 100644 --- a/pom.xml +++ b/pom.xml @@ -3,33 +3,53 @@ 4.0.0 com.instanceone.hdfs hdfs-cli - 0.0.1-SNAPSHOT + 2.3.2-SNAPSHOT Hadoop HDFS CLI Hadoop HDFS Command Line Interface - 0.20.205.0 + 2.6.0 + 4.11 + 2.7.3 + 2.4 + com.github.ptgoetz stemshell - 0.1.0-SNAPSHOT + 0.3.2-SNAPSHOT org.apache.hadoop - hadoop-core + hadoop-client ${hadoop.version} + + commons-io + commons-io + ${commons-io.version} + + + com.fasterxml.jackson.core + jackson-core + ${jackson.parser} + + + junit + junit + ${junit.version} + test + org.apache.maven.plugins maven-shade-plugin - 1.4 - - true - + 2.0 + + + package @@ -37,9 +57,15 @@ shade + true + hdfs-cli-full-bin + true + + implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/> + com.instanceone.hdfs.shell.HdfsShell @@ 
-51,8 +77,8 @@ org.apache.maven.plugins maven-compiler-plugin - 1.6 - 1.6 + 1.7 + 1.7 diff --git a/src/main/hive/lsp.ddl b/src/main/hive/lsp.ddl new file mode 100644 index 0000000..d7d1b77 --- /dev/null +++ b/src/main/hive/lsp.ddl @@ -0,0 +1,36 @@ +create database if not exists ${DB}; + +use ${DB}; + +drop table hdfs_analysis; + +create external table if not exists hdfs_analysis ( +permissions_long string, +replication int, +user_ string, +group_ string, +size bigint, +block_size bigint, +ratio double, +mod string, +access string, +path_ string, +ip_address string, +hostname string, +block_id string +) +ROW FORMAT DELIMITED + FIELDS TERMINATED BY ',' +STORED AS TEXTFILE +LOCATION '${LOCATION}'; + +drop table hosts; + +-- Build a hosts table from known blocks. Might be better to manually build this +-- record set to ensure ALL hosts are accounted for!! +create table hosts +stored as orc + as +select distinct hostname from hdfs_analysis; + +select * from hosts; \ No newline at end of file diff --git a/src/main/hive/nn_stats.ddl b/src/main/hive/nn_stats.ddl new file mode 100644 index 0000000..2d4d78a --- /dev/null +++ b/src/main/hive/nn_stats.ddl @@ -0,0 +1,52 @@ +use ${DB}; + +drop table if exists nn_info; +create external table if not exists nn_info ( +date_ String, +HostAndPort String, +State_ String, +Version String, +Used DOUBLE, +Free_ DOUBLE, +Safemode STRING, +TotalBlocks BIGINT, +TotalFiles BIGINT, +NumberOfMissingBlocks BIGINT, +NumberOfMissingBlocksWithReplicationFactorOne BIGINT +) +STORED AS TEXTFILE +LOCATION '${BASE_DIR}/nn_info'; + +drop table if exists fs_state; +create external table if not exists fs_state ( +date_ String, +HostAndPort String, +State_ String, +CapacityUsed BIGINT, +CapacityRemaining BIGINT, +BlocksTotal BIGINT, +PendingReplicationBlocks BIGINT, +UnderReplicatedBlocks BIGINT, +ScheduledReplicationBlocks BIGINT, +PendingDeletionBlocks BIGINT, +FSState String, +NumLiveDataNodes INT, +NumDeadDataNodes INT, +NumDecomLiveDataNodes INT, 
+NumDecomDeadDataNodes INT, +VolumeFailuresTotal INT +) +STORED AS TEXTFILE +LOCATION '${BASE_DIR}/fs_state'; + +drop table if exists top_user_ops; +create external table if not exists top_user_ops ( +date_ String, +HostAndPort String, +State_ String, +WindowLenMs BIGINT, +Operation String, +User_ String, +Count_ BIGINT) +STORED AS TEXTFILE +LOCATION '${BASE_DIR}/top_user_ops'; \ No newline at end of file diff --git a/src/main/java/JCECheck.java b/src/main/java/JCECheck.java new file mode 100644 index 0000000..70a9548 --- /dev/null +++ b/src/main/java/JCECheck.java @@ -0,0 +1,38 @@ +import javax.crypto.Cipher; +import java.security.NoSuchAlgorithmException; +import java.security.Provider; +import java.security.Security; +import java.util.Iterator; + +/** + * Created by dstreev on 2015-09-29. + */ +public class JCECheck { + public static void main(final String[] args) { + final Provider[] providers = Security.getProviders(); + for (int i = 0; i < providers.length; i++) { + final String name = providers[i].getName(); + final double version = providers[i].getVersion(); + System.out.println("Provider[" + i + "]:: " + name + " " + version); + if (args.length > 0) { + final Iterator it = providers[i].keySet().iterator(); + while (it.hasNext()) { + final String element = (String) it.next(); + if (element.toLowerCase().startsWith(args[0].toLowerCase()) + || args[0].equals("-all")) + System.out.println("\t" + element); + } + } + } + try { + int keyLength = Cipher.getMaxAllowedKeyLength("AES"); + if (keyLength > 128) { + System.out.println("JCE is available for Unlimited encryption. ["+keyLength+"]"); + } else { + System.out.println("JCE is NOT available for Unlimited encryption. 
["+keyLength+"]"); + } + } catch (NoSuchAlgorithmException e) { + e.printStackTrace(); + } + } +} diff --git a/src/main/java/com/dstreev/hadoop/util/HdfsLsPlus.java b/src/main/java/com/dstreev/hadoop/util/HdfsLsPlus.java new file mode 100644 index 0000000..325acd9 --- /dev/null +++ b/src/main/java/com/dstreev/hadoop/util/HdfsLsPlus.java @@ -0,0 +1,377 @@ +package com.dstreev.hadoop.util; + +import com.dstreev.hdfs.shell.command.Constants; +import com.dstreev.hdfs.shell.command.Direction; +import com.instanceone.hdfs.shell.command.HdfsAbstract; +import com.instanceone.stemshell.Environment; +import jline.console.ConsoleReader; +import org.apache.commons.cli.CommandLine; +import org.apache.commons.cli.Option; +import org.apache.commons.cli.Options; +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.FSDataOutputStream; +import org.apache.hadoop.fs.FileStatus; +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.Path; +import org.apache.hadoop.fs.shell.PathData; +import org.apache.hadoop.hdfs.DFSClient; +import org.apache.hadoop.hdfs.protocol.DatanodeInfo; +import org.apache.hadoop.hdfs.protocol.LocatedBlock; +import org.apache.hadoop.hdfs.protocol.LocatedBlocks; + +import java.io.IOException; +import java.math.BigDecimal; +import java.math.MathContext; +import java.math.RoundingMode; +import java.net.URI; +import java.text.DateFormat; +import java.text.SimpleDateFormat; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.Date; +import java.util.List; + +import static com.dstreev.hadoop.util.HdfsLsPlus.PRINT_OPTION.*; + +/** + * Created by dstreev on 2016-02-15. + *

+ * The intent here is to provide a means of querying the Namenode and + * producing Metadata about the directory AND the files in it. + */ +public class HdfsLsPlus extends HdfsAbstract { + + private FileSystem fs = null; + + private static DateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS"); + + // TODO: Extended ACL's + private static String DEFAULT_FORMAT = "permissions_long,replication,user,group,size,block_size,ratio,mod,access,path,datanode_info"; + + enum PRINT_OPTION { + PERMISSIONS_LONG, + PERMISSIONS_SHORT, + REPLICATION, + USER, + GROUP, + SIZE, + BLOCK_SIZE, + RATIO, + MOD, + ACCESS, + PATH, + DATANODE_INFO + } + + // default + private PRINT_OPTION[] print_options = new PRINT_OPTION[]{PERMISSIONS_LONG, PATH, REPLICATION, + USER, + GROUP, + SIZE, + BLOCK_SIZE, + RATIO, + MOD, + ACCESS, + DATANODE_INFO}; + + private static int DEFAULT_DEPTH = 5; + private static String DEFAULT_SEPARATOR = ","; + private static String DEFAULT_NEWLINE = "\n"; + private int currentDepth = 0; + private int maxDepth = DEFAULT_DEPTH; +// private Boolean recurse = Boolean.TRUE; + private String format = DEFAULT_FORMAT; + private Configuration configuration = null; + private DFSClient dfsClient = null; + private FSDataOutputStream outFS = null; + private static MathContext mc = new MathContext(4, RoundingMode.HALF_UP); + private int count = 0; + + public HdfsLsPlus(String name) { + super(name); + } + + public HdfsLsPlus(String name, Environment env, Direction directionContext) { + super(name, env, directionContext); + } + + public HdfsLsPlus(String name, Environment env, Direction directionContext, int directives) { + super(name, env, directionContext, directives); + } + + public HdfsLsPlus(String name, Environment env, Direction directionContext, int directives, boolean directivesBefore, boolean directivesOptional) { + super(name, env, directionContext, directives, directivesBefore, directivesOptional); + } + + public HdfsLsPlus(String name, Environment env) { + 
super(name, env); + } + + public void setMaxDepth(int maxDepth) { + this.maxDepth = maxDepth; + } + +// public void setRecurse(Boolean recurse) { +// this.recurse = recurse; +// } + + public void setFormat(String format) { + this.format = format; + String[] strOptions = this.format.split(","); + List options_list = new ArrayList<>(); + for (String strOption: strOptions) { + PRINT_OPTION in = PRINT_OPTION.valueOf(strOption.toUpperCase()); + if (in != null) { + options_list.add(in); + } + } + print_options = new PRINT_OPTION[strOptions.length]; + print_options = options_list.toArray(print_options); + } + + private void writeItem(PathData item, FileStatus itemStatus) { + try { + StringBuilder sb = new StringBuilder(); + + boolean in = false; + for (PRINT_OPTION option : print_options) { + if (in && option != DATANODE_INFO) + sb.append(DEFAULT_SEPARATOR); + in = true; + switch (option) { + case PERMISSIONS_SHORT: + sb.append(itemStatus.getPermission().toExtendedShort()); + break; + case PERMISSIONS_LONG: + sb.append(itemStatus.getPermission()); + break; + case REPLICATION: + sb.append(itemStatus.getReplication()); + break; + case USER: + sb.append(itemStatus.getOwner()); + break; + case GROUP: + sb.append(itemStatus.getGroup()); + break; + case SIZE: + sb.append(itemStatus.getLen()); + break; + case BLOCK_SIZE: + sb.append(itemStatus.getBlockSize()); + break; + case RATIO: + Double blockRatio = (double) itemStatus.getLen() / itemStatus.getBlockSize(); + BigDecimal ratioBD = new BigDecimal(blockRatio, mc); + sb.append(ratioBD.toString()); + break; + case MOD: + sb.append(df.format(new Date(itemStatus.getModificationTime()))); + break; + case ACCESS: + sb.append(df.format(new Date(itemStatus.getAccessTime()))); + break; + case PATH: + sb.append(item.toString()); + break; + } + } + if (Arrays.asList(print_options).contains(DATANODE_INFO)) { + LocatedBlocks blocks = null; + blocks = dfsClient.getLocatedBlocks(item.toString(), 0, Long.MAX_VALUE); + for (LocatedBlock block : 
blocks.getLocatedBlocks()) { + DatanodeInfo[] datanodeInfo = block.getLocations(); +// System.out.println("\tBlock: " + block.getBlock().getBlockName()); + + for (DatanodeInfo dni : datanodeInfo) { +// System.out.println(dni.getIpAddr() + " - " + dni.getHostName()); + StringBuilder sb1 = new StringBuilder(sb); + sb1.append(DEFAULT_SEPARATOR); + sb1.append(dni.getIpAddr()).append(DEFAULT_SEPARATOR); + sb1.append(dni.getHostName()).append(DEFAULT_SEPARATOR); + sb1.append(block.getBlock().getBlockName()); + postItem(sb1.append(DEFAULT_NEWLINE).toString()); + } + } + } else { + postItem(sb.append(DEFAULT_NEWLINE).toString()); + } + + } catch (IOException e) { + e.printStackTrace(); + } + } + + private void postItem(String line) { + if (outFS != null) { + try { + outFS.write(line.getBytes()); + } catch (IOException e) { + e.printStackTrace(); + } + count++; + if (count % 10 == 0) + System.out.print("."); + if (count % 1000 == 0) + System.out.println(); + if (count % 10000 == 0) + System.out.println("----------"); + } else { + System.out.print(line); + } + } + + private void processPath(PathData path) { + // + currentDepth++; + if (maxDepth == -1 || currentDepth <= (maxDepth + 1)) { + FileStatus fileStatus = path.stat; + if (fileStatus.isDirectory()) { + PathData[] pathDatas = new PathData[0]; + try { + pathDatas = path.getDirectoryContents(); + } catch (IOException e) { + e.printStackTrace(); + } + for (PathData intPd : pathDatas) { + processPath(intPd); + } + } else { + // Go through contents. + writeItem(path, fileStatus); + } + } else { + System.out.println("Max Depth of: " + maxDepth + " reached. Sub-folders will not be traversed beyond this depth. Increase, or set to -1 for unlimited depth"); + } + currentDepth--; + } + + @Override + public void execute(Environment environment, CommandLine cmd, ConsoleReader consoleReader) { + // reset counter.
+ count = 0; + System.out.println("Beginning 'lsp' collection."); + + // Get the Filesystem + configuration = (Configuration) env.getValue(Constants.CFG); + + String hdfs_uri = (String) env.getProperty(Constants.HDFS_URL); + + fs = (FileSystem) env.getValue(Constants.HDFS); + + if (fs == null) { + System.out.println("Please connect first"); + return; + } + + URI nnURI = fs.getUri(); + + try { + dfsClient = new DFSClient(nnURI, configuration); + } catch (IOException e) { + e.printStackTrace(); + return; + } + + Option[] cmdOpts = cmd.getOptions(); + String[] cmdArgs = cmd.getArgs(); + + if (cmd.hasOption("maxDepth")) { + setMaxDepth(Integer.parseInt(cmd.getOptionValue("maxDepth"))); + } else { + setMaxDepth(DEFAULT_DEPTH); + } + + if (cmd.hasOption("format")) { + setFormat(cmd.getOptionValue("format")); + } else { + setFormat(DEFAULT_FORMAT); + } + + String outputFile = null; + + if (cmd.hasOption("output")) { + outputFile = buildPath2(fs.getWorkingDirectory().toString().substring(((String) env.getProperty(Constants.HDFS_URL)).length()), cmd.getOptionValue("output")); + Path pof = new Path(outputFile); + try { + if (fs.exists(pof)) + fs.delete(pof, false); + outFS = fs.create(pof); + } catch (IOException e) { + e.printStackTrace(); + } + + } + + String targetPath = null; + if (cmdArgs.length > 0) { + String pathIn = cmdArgs[0]; + targetPath = buildPath2(fs.getWorkingDirectory().toString().substring(((String) env.getProperty(Constants.HDFS_URL)).length()), pathIn); + } else { + targetPath = fs.getWorkingDirectory().toString().substring(((String) env.getProperty(Constants.HDFS_URL)).length()); + + } + + currentDepth = 0; + + PathData targetPathData = null; + try { + targetPathData = new PathData(targetPath, configuration); + } catch (IOException e) { + e.printStackTrace(); + return; + } + + processPath(targetPathData); + + if (outFS != null) + try { + outFS.close(); + } catch (IOException e) { + e.printStackTrace(); + } finally { + outFS = null; + } + 
System.out.println(); + System.out.println("'lsp' complete."); + + } + + @Override + public Options getOptions() { + Options opts = super.getOptions(); + +// opts.addOption("r", "recurse", false, "recurse (default false)"); + + Option formatOption = Option.builder("f").required(false) + .argName("output-format") + .desc("Comma separated list of one or more: permissions_long,replication,user,group,size,block_size,ratio,mod,access,path,datanode_info (default all of the above)") + .hasArg(true) + .numberOfArgs(1) + .valueSeparator(',') + .longOpt("format") + .build(); + opts.addOption(formatOption); + + Option depthOption = Option.builder("d").required(false) + .argName("maxDepth") + .desc("Depth of Recursion (default 5), use '-1' for unlimited") + .hasArg(true) + .numberOfArgs(1) + .longOpt("maxDepth") + .build(); + opts.addOption(depthOption); + + Option outputOption = Option.builder("o").required(false) + .argName("output") + .desc("Output File (HDFS) (default System.out)") + .hasArg(true) + .numberOfArgs(1) + .longOpt("output") + .build(); + opts.addOption(outputOption); + + return opts; + } + +} \ No newline at end of file diff --git a/src/main/java/com/dstreev/hadoop/util/HdfsNNStats.java b/src/main/java/com/dstreev/hadoop/util/HdfsNNStats.java new file mode 100644 index 0000000..e2560bc --- /dev/null +++ b/src/main/java/com/dstreev/hadoop/util/HdfsNNStats.java @@ -0,0 +1,362 @@ +package com.dstreev.hadoop.util; + +import com.dstreev.hdfs.shell.command.Constants; +import com.dstreev.hdfs.shell.command.Direction; +import com.instanceone.hdfs.shell.command.HdfsAbstract; +import com.instanceone.stemshell.Environment; +import jline.console.ConsoleReader; +import org.apache.commons.cli.CommandLine; +import org.apache.commons.cli.Option; +import org.apache.commons.cli.Options; +import org.apache.commons.io.IOUtils; +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.FSDataOutputStream; +import org.apache.hadoop.fs.Path; +import 
org.apache.hadoop.hdfs.DFSClient; +import org.apache.hadoop.hdfs.DistributedFileSystem; + +import java.io.IOException; +import java.net.MalformedURLException; +import java.net.URI; +import java.net.URL; +import java.net.URLConnection; +import java.text.DateFormat; +import java.text.SimpleDateFormat; +import java.util.Date; +import java.util.LinkedHashMap; +import java.util.List; +import java.util.Map; + +/** + * Created by dstreev on 2016-02-15. + *

+ * The intent here is to provide a means of querying the Namenode and + * producing Metadata about the directory AND the files in it. + */ +public class HdfsNNStats extends HdfsAbstract { + private Configuration configuration = null; +// private DFSClient dfsClient = null; + private FSDataOutputStream outFS = null; + private String baseOutputDir = null; + + private DistributedFileSystem fs = null; + + private static String DEFAULT_FILE_FORMAT = "yyyy-MM"; + + private DateFormat dfFile = null; + +// private static String FS_STATE_JMX_BEAN = "Hadoop:service=NameNode,name=FSNamesystemState"; +// private static String NN_INFO_JMX_BEAN = "Hadoop:service=NameNode,name=NameNodeInfo"; + +// enum TARGET_BEAN { +// FS_STATE_JMX_BEAN("Hadoop:service=NameNode,name=FSNamesystemState"), +//// NN_STATUS_JMX_BEAN("Hadoop:service=NameNode,name=NameNodeStatus"), +// NN_INFO_JMX_BEAN("Hadoop:service=NameNode,name=NameNodeInfo"); +// +// private String beanName; +// +// public String getBeanName() { +// return beanName; +// } +// private TARGET_BEAN(String beanName) { +// this.beanName = beanName; +// } +// } + + public HdfsNNStats(String name) { + super(name); + } + + public HdfsNNStats(String name, Environment env, Direction directionContext) { + super(name, env, directionContext); + } + + public HdfsNNStats(String name, Environment env, Direction directionContext, int directives) { + super(name, env, directionContext, directives); + } + + public HdfsNNStats(String name, Environment env, Direction directionContext, int directives, boolean directivesBefore, boolean directivesOptional) { + super(name, env, directionContext, directives, directivesBefore, directivesOptional); + } + + public HdfsNNStats(String name, Environment env) { + super(name, env); + } + + protected void getHelp() { + StringBuilder sb = new StringBuilder(); + sb.append("Collect Namenode stats from the available Namenode JMX url's.").append("\n"); + sb.append("").append("\n"); + sb.append("3 Type of stats are 
currently collected and written to HDFS (with -o option) or to screen (no option specified)").append("\n"); + sb.append("The 'default' delimiter for all records is '\u0001' (Cntl-A)").append("\n"); + sb.append("").append("\n"); + sb.append(">> Namenode Information: (optionally written to the directory 'nn_info')").append("\n"); + sb.append("Fields: Timestamp, HostAndPort, State, Version, Used, Free, Safemode, TotalBlocks, TotalFiles, NumberOfMissingBlocks, NumberOfMissingBlocksWithReplicationFactorOne").append("\n"); + sb.append("").append("\n"); + sb.append(">> Filesystem State: (optionally written to the directory 'fs_state')").append("\n"); + sb.append("Fields: Timestamp, HostAndPort, State, CapacityUsed, CapacityRemaining, BlocksTotal, PendingReplicationBlocks, UnderReplicatedBlocks, ScheduledReplicationBlocks, PendingDeletionBlocks, FSState, NumLiveDataNodes, NumDeadDataNodes, NumDecomLiveDataNodes, NumDecomDeadDataNodes, VolumeFailuresTotal").append("\n"); + sb.append("").append("\n"); + sb.append(">> Top User Operations: (optionally written to the directory 'top_user_ops')").append("\n"); + sb.append("Fields: Timestamp, HostAndPort, State, WindowLenMs, Operation, User, Count").append("\n"); + sb.append("").append("\n"); + System.out.println(sb.toString()); + } + + @Override + public void execute(Environment environment, CommandLine cmd, ConsoleReader consoleReader) { + + if (cmd.hasOption("help")) { + getHelp(); + return; + } + + System.out.println("Beginning 'Namenode Stat' collection."); + + // Get the Filesystem + configuration = (Configuration) env.getValue(Constants.CFG); + + fs = (DistributedFileSystem) env.getValue(Constants.HDFS); + + if (fs == null) { + System.out.println("Please connect first"); + return; + } + + URI nnURI = fs.getUri(); + + // Find the hdfs http urls.
+ Map<URL, Map<NamenodeJmxBean, URL>> namenodeJmxUrls = getNamenodeHTTPUrls(configuration); + + Option[] cmdOpts = cmd.getOptions(); + String[] cmdArgs = cmd.getArgs(); + + if (cmd.hasOption("fileFormat")) { + dfFile = new SimpleDateFormat(cmd.getOptionValue("fileFormat")); + } else { + dfFile = new SimpleDateFormat(DEFAULT_FILE_FORMAT); + } + + if (cmd.hasOption("output")) { + baseOutputDir = buildPath2(fs.getWorkingDirectory().toString().substring(((String) env.getProperty(Constants.HDFS_URL)).length()), cmd.getOptionValue("output")); + } else { + baseOutputDir = null; + } + + // For each URL. + for (Map.Entry<URL, Map<NamenodeJmxBean, URL>> entry : namenodeJmxUrls.entrySet()) { + try { + + URLConnection statusConnection = entry.getKey().openConnection(); + String statusJson = IOUtils.toString(statusConnection.getInputStream()); + + + for (Map.Entry<NamenodeJmxBean, URL> innerEntry : entry.getValue().entrySet()) { +// System.out.println(entry.getKey() + ": " + innerEntry.getValue()); + + URLConnection httpConnection = innerEntry.getValue().openConnection(); + String beanJson = IOUtils.toString(httpConnection.getInputStream()); + + NamenodeJmxParser njp = null; + + String outputFilename = dfFile.format(new Date()) + ".txt"; + njp = new NamenodeJmxParser(statusJson, beanJson); + + // URL Query should match key. + switch (innerEntry.getKey()) { + case NN_INFO_JMX_BEAN: + // Get and Save NN Info. + String nnInfo = njp.getNamenodeInfo(); + // Open NN Info File.
+ if (baseOutputDir != null) { + String nnInfoPathStr = null; + if (baseOutputDir.endsWith("/")) { + nnInfoPathStr = baseOutputDir + "nn_info/" + outputFilename; + } else { + nnInfoPathStr = baseOutputDir + "/nn_info/" + outputFilename; + } + + Path nnInfoPath = new Path(nnInfoPathStr); + FSDataOutputStream nnInfoOut = null; + if (fs.exists(nnInfoPath)) { +// System.out.println("NN Info APPENDING"); + nnInfoOut = fs.append(nnInfoPath); + } else { +// System.out.println("NN Info CREATING"); + nnInfoOut = fs.create(nnInfoPath); + } + nnInfoOut.write(nnInfo.getBytes()); + // Newline + nnInfoOut.write("\n".getBytes()); + nnInfoOut.close(); + } else { + System.out.println(">> Namenode Info: "); + System.out.println(nnInfo); + } + break; + case FS_STATE_JMX_BEAN: + List topUserOps = njp.getTopUserOpRecords(); + // Get and Save TopUserOps + // Open TopUserOps file. + if (topUserOps.size() > 0) { + if (baseOutputDir != null) { + String topUserOpsPathStr = null; + if (baseOutputDir.endsWith("/")) { + topUserOpsPathStr = baseOutputDir + "top_user_ops/" + outputFilename; + } else { + topUserOpsPathStr = baseOutputDir + "/top_user_ops/" + outputFilename; + } + + Path topUserOpsPath = new Path(topUserOpsPathStr); + FSDataOutputStream topUserOpsOut = null; + if (fs.exists(topUserOpsPath)) { +// System.out.println("Top User Ops APPENDING"); + topUserOpsOut = fs.append(topUserOpsPath); + } else { +// System.out.println("Top User Ops CREATING"); + topUserOpsOut = fs.create(topUserOpsPath); + } + for (String record : topUserOps) { + topUserOpsOut.write(record.getBytes()); + // Newline + topUserOpsOut.write("\n".getBytes()); + + } + topUserOpsOut.close(); + } else { + System.out.println(">> Top User Operations: "); + for (String record : topUserOps) { + System.out.println(record); + } + } + } + + // Get and Save FS State + String fsState = njp.getFSState(); + // Open FS State Stat File. 
+ if (baseOutputDir != null) { + String fsStatePathStr = null; + if (baseOutputDir.endsWith("/")) { + fsStatePathStr = baseOutputDir + "fs_state/" + outputFilename; + } else { + fsStatePathStr = baseOutputDir + "/fs_state/" + outputFilename; + } + + Path fsStatePath = new Path(fsStatePathStr); + FSDataOutputStream fsStateOut = null; + if (fs.exists(fsStatePath)) { +// System.out.println("FS State APPENDING"); + fsStateOut = fs.append(fsStatePath); + } else { +// System.out.println("FS State CREATING"); + fsStateOut = fs.create(fsStatePath); + } + fsStateOut.write(fsState.getBytes()); + // Newline + fsStateOut.write("\n".getBytes()); + fsStateOut.close(); + } else { + System.out.println(">> Filesystem State: "); + System.out.println(fsState); + } + break; + } + + + } + } catch (Throwable e) { + e.printStackTrace(); + } + + } + // Get Namenode Info + // Open File for Append. + if (fs instanceof DistributedFileSystem) { + System.out.println("Filesystem Reference is Distributed"); + } else { + System.out.println("Filesystem Reference is NOT distributed"); + } + + } + + private Map<URL, Map<NamenodeJmxBean, URL>> getNamenodeHTTPUrls(Configuration configuration) { + Map<URL, Map<NamenodeJmxBean, URL>> rtn = new LinkedHashMap<URL, Map<NamenodeJmxBean, URL>>(); + + // Determine if HA is enabled. + // Look for 'dfs.nameservices', if present, then HA is enabled. + String nameServices = configuration.get("dfs.nameservices"); + if (nameServices != null) { + // HA Enabled + String[] nnRefs = configuration.get("dfs.ha.namenodes." + nameServices).split(","); + + // Get the http addresses. + for (String nnRef : nnRefs) { + String hostAndPort = configuration.get("dfs.namenode.http-address." + nameServices + "."
+ nnRef); + if (hostAndPort != null) { + try { + URL statusURL = new URL("http://" + hostAndPort + "/jmx?qry=" + NamenodeJmxParser.NN_STATUS_JMX_BEAN); + rtn.put(statusURL, getURLMap(hostAndPort)); + } catch (MalformedURLException e) { + e.printStackTrace(); + } + } + } + } else { + // Standalone + String hostAndPort = configuration.get("dfs.namenode.http-address"); + try { + URL statusURL = new URL("http://" + hostAndPort + "/jmx?qry=" + NamenodeJmxParser.NN_STATUS_JMX_BEAN); + rtn.put(statusURL, getURLMap(hostAndPort)); + } catch (MalformedURLException e) { + e.printStackTrace(); + } + } + + return rtn; + } + + private Map getURLMap(String hostAndPort) { + Map rtn = new LinkedHashMap<>(); + try { + for (NamenodeJmxBean target_bean : NamenodeJmxBean.values()) { + rtn.put(target_bean, new URL("http://" + hostAndPort + "/jmx?qry=" + target_bean.getBeanName())); + } + } catch (MalformedURLException e) { + e.printStackTrace(); + } + return rtn; + } + + @Override + public Options getOptions() { + Options opts = super.getOptions(); + + Option helpOption = Option.builder("h").required(false) + .argName("help") + .desc("Help") + .hasArg(false) + .longOpt("help") + .build(); + opts.addOption(helpOption); + + Option formatOption = Option.builder("ff").required(false) + .argName("fileFormat") + .desc("Output filename format. 
Value must be a pattern of 'SimpleDateFormat' format options.") + .hasArg(true) + .numberOfArgs(1) + .longOpt("fileFormat") + .build(); + opts.addOption(formatOption); + + Option outputOption = Option.builder("o").required(false) + .argName("output") + .desc("Output Base Directory (HDFS) (default System.out) from which all other sub-directories are based.") + .hasArg(true) + .numberOfArgs(1) + .longOpt("output") + .build(); + opts.addOption(outputOption); + + return opts; + } + +} \ No newline at end of file diff --git a/src/main/java/com/dstreev/hadoop/util/HdfsRepair.java b/src/main/java/com/dstreev/hadoop/util/HdfsRepair.java new file mode 100644 index 0000000..947b136 --- /dev/null +++ b/src/main/java/com/dstreev/hadoop/util/HdfsRepair.java @@ -0,0 +1,67 @@ +package com.dstreev.hadoop.util; + +import com.dstreev.hdfs.shell.command.Direction; +import com.instanceone.hdfs.shell.command.HdfsAbstract; +import com.instanceone.stemshell.Environment; +import jline.console.ConsoleReader; +import org.apache.commons.cli.CommandLine; +import org.apache.commons.cli.Options; + +/** + * Created by dstreev on 2015-11-22. 
+ * + * Syntax: + * repair [-n <count>] [path] + * repair -n 100 /user + */ +public class HdfsRepair extends HdfsAbstract { + + public HdfsRepair(String name) { + super(name); + } + + public HdfsRepair(String name, Environment env, Direction directionContext ) { + super(name, env, directionContext); + } + + public HdfsRepair(String name, Environment env, Direction directionContext, int directives ) { + super(name,env,directionContext,directives); + } + + public HdfsRepair(String name, Environment env, Direction directionContext, int directives, boolean directivesBefore, boolean directivesOptional ) { + super(name,env,directionContext,directives,directivesBefore,directivesOptional); + } + + public HdfsRepair(String name, Environment env) { + super(name,env); + } + + @Override + public void execute(Environment environment, CommandLine commandLine, ConsoleReader consoleReader) { + System.out.println("Not implemented yet... :( "); + /* + + Run fsck from the current hdfs path. + + Issue the repairs as they come in. We don't want to get the whole list and then repair. + Question: Will these be blocking? Probably not, so we need another thread to work from a queue of entries + that's populated by this process. + + From the OutputStream, get the files listed as "Under replicated" + + Split the line on ":" and issue an HDFS setrep to 3 on the file.
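+ + A sketch of that last step (hypothetical: the line parsing is an assumption since fsck output formatting varies by version, but FileSystem.setReplication(Path, short) is a real HDFS API): + + // for each fsck output line flagged "Under replicated": + // String filePath = line.substring(0, line.indexOf(':')); + // fs.setReplication(new Path(filePath), (short) 3);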
+ + + */ + + } + + @Override + public Options getOptions() { + Options opts = super.getOptions(); + opts.addOption("n", true, "repair file count limit"); + opts.addOption("r", true, "override replication value"); + return opts; + } + +} diff --git a/src/main/java/com/dstreev/hadoop/util/JmxJsonParser.java b/src/main/java/com/dstreev/hadoop/util/JmxJsonParser.java new file mode 100644 index 0000000..10b942f --- /dev/null +++ b/src/main/java/com/dstreev/hadoop/util/JmxJsonParser.java @@ -0,0 +1,67 @@ +package com.dstreev.hadoop.util; + +import org.codehaus.jackson.JsonNode; +import org.codehaus.jackson.JsonParser; +import org.codehaus.jackson.map.ObjectMapper; + +import java.io.IOException; +import java.io.InputStream; +import java.util.Iterator; +import java.util.Map; +import java.util.TreeMap; + +/** + * Created by dstreev on 2016-03-21. + */ +public class JmxJsonParser { + + private ObjectMapper mapper = null; + private JsonNode root = null; + private JsonNode beanArrayNode = null; + + private static String NAME = "name"; + + private Map> beanMap = new TreeMap>(); + + public JmxJsonParser(String input) throws Exception { + mapper = new ObjectMapper(); + try { + root = mapper.readValue(input, JsonNode.class); + beanArrayNode = root.get("beans"); + if (beanArrayNode == null) { + throw new Exception("Couldn't locate Jmx 'Beans' array."); + } + } catch (IOException e) { + e.printStackTrace(); + } + + } + + public Map getJmxBeanContent(String name) { + Map rtn = null; + rtn = beanMap.get(name); + if (rtn == null) { + for (JsonNode beanNode : beanArrayNode) { + if (beanNode.get(NAME).asText().equals(name)) { + Map content = new TreeMap(); +// Iterator iNodes = beanNode.iterator(); + Iterator> iEntries = beanNode.getFields(); + + while (iEntries.hasNext()) { + Map.Entry entry = iEntries.next(); + + if (entry.getValue().isNumber()) { + content.put(entry.getKey(), entry.getValue().getNumberValue()); + } else { + content.put(entry.getKey(), entry.getValue().asText()); + } + 
} + beanMap.put(name, content); + rtn = content; + break; + } + } + } + return rtn; + } +} diff --git a/src/main/java/com/dstreev/hadoop/util/NamenodeJmxBean.java b/src/main/java/com/dstreev/hadoop/util/NamenodeJmxBean.java new file mode 100644 index 0000000..518a7d3 --- /dev/null +++ b/src/main/java/com/dstreev/hadoop/util/NamenodeJmxBean.java @@ -0,0 +1,20 @@ +package com.dstreev.hadoop.util; + +/** + * Created by dstreev on 2016-03-24. + */ +public enum NamenodeJmxBean { + FS_STATE_JMX_BEAN("Hadoop:service=NameNode,name=FSNamesystemState"), + NN_INFO_JMX_BEAN("Hadoop:service=NameNode,name=NameNodeInfo"); + + private String beanName; + + public String getBeanName() { + return beanName; + } + + NamenodeJmxBean(String beanName) { + this.beanName = beanName; + } + +} diff --git a/src/main/java/com/dstreev/hadoop/util/NamenodeJmxParser.java b/src/main/java/com/dstreev/hadoop/util/NamenodeJmxParser.java new file mode 100644 index 0000000..2725afa --- /dev/null +++ b/src/main/java/com/dstreev/hadoop/util/NamenodeJmxParser.java @@ -0,0 +1,250 @@ +package com.dstreev.hadoop.util; + +import org.apache.commons.io.IOUtils; +import org.codehaus.jackson.JsonNode; +import org.codehaus.jackson.map.ObjectMapper; + +import java.io.IOException; +import java.io.InputStream; +import java.text.DateFormat; +import java.text.SimpleDateFormat; +import java.util.*; + +/** + * Created by dstreev on 2016-03-21. 
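+ * + * Usage sketch (based on the constructors and accessors below; the JSON strings are assumed to come from the Namenode /jmx servlet, as in HdfsNNStats): + * NamenodeJmxParser njp = new NamenodeJmxParser(statusJson, beanJson); + * String fsStateRecord = njp.getFSState(); // delimited stat record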
+ */ +public class NamenodeJmxParser { + + public static String NN_STATUS_JMX_BEAN = "Hadoop:service=NameNode,name=NameNodeStatus"; + + private static String TOP_USER_OPS_COUNT = "TopUserOpCounts"; + + private JmxJsonParser jjp = null; + private String delimiter = "\u0001"; + + private Map metadata = new LinkedHashMap(); + + + public NamenodeJmxParser(String resource) throws Exception { + ClassLoader classLoader = getClass().getClassLoader(); + + InputStream resourceStream = classLoader.getResourceAsStream(resource); +// InputStream dataIn = classLoader.getResourceAsStream(resource); + String resourceJson = IOUtils.toString(resourceStream); + + init(resourceJson, resourceJson); + } + + public NamenodeJmxParser(String statusIn, String dataIn) throws Exception { + init(statusIn, dataIn); + } + + private void init(String statusIn, String dataIn) throws Exception { + JmxJsonParser statusjp = new JmxJsonParser(statusIn); + jjp = new JmxJsonParser(dataIn); + + // Get Host Info + Map nnStatusMap = statusjp.getJmxBeanContent(NN_STATUS_JMX_BEAN); + String[] statusInclude = {"HostAndPort", "State"}; + + DateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ssZ"); + metadata.put("Timestamp", df.format(new Date())); + + transferTo(metadata, nnStatusMap, statusInclude); + + } + + public String getDelimiter() { + return delimiter; + } + + public void setDelimiter(String delimiter) { + this.delimiter = delimiter; + } + + private void transferTo(Map target, Map source, String[] transferItems) { + if (transferItems == null) { + for (Map.Entry entry : source.entrySet()) { + target.put(entry.getKey(), entry.getValue()); + } + } else { + for (String transferItem : transferItems) { + if (source.containsKey(transferItem)) { + target.put(transferItem, source.get(transferItem)); + } else { + System.out.println("Source doesn't contain key: " + transferItem); + } + } + } + } + + private List mapValuesToList(Map map, String[] fields) { + List rtn = new LinkedList<>(); + if (fields == null) 
{ + for (Map.Entry entry : map.entrySet()) { + rtn.add(entry.getValue().toString()); + } + } else { + for (String field: fields) { + if (map.containsKey(field)) { + rtn.add(map.get(field).toString()); + } else { + System.out.println("Map doesn't contain key: " + field); + } + } + } + return rtn; + } + + public String listToString(List list) { + StringBuilder sb = new StringBuilder(); + Iterator iItems = list.iterator(); + while (iItems.hasNext()) { + sb.append(iItems.next()); + if (iItems.hasNext()) { + sb.append(delimiter); + } + } + return sb.toString(); + } + + public List getTopUserOpRecords() { + List rtn = null; + Map fsState = jjp.getJmxBeanContent(NamenodeJmxBean.FS_STATE_JMX_BEAN.getBeanName()); + + ObjectMapper mapper = new ObjectMapper(); + String tuocStr = fsState.get(TOP_USER_OPS_COUNT).toString(); + + try { + JsonNode tuocNode = mapper.readValue(tuocStr.getBytes(), JsonNode.class); + rtn = buildTopUserOpsCountRecords(tuocNode); + } catch (IOException e) { + e.printStackTrace(); + } + + return rtn; + } + + public String getNamenodeInfo() { + + List working = mapValuesToList(metadata, null); + + Map nnInfo = jjp.getJmxBeanContent(NamenodeJmxBean.NN_INFO_JMX_BEAN.getBeanName()); + + String[] fields = {"Version", "Used", "Free", "Safemode", "TotalBlocks", "TotalFiles", "NumberOfMissingBlocks", "NumberOfMissingBlocksWithReplicationFactorOne"}; + + List fieldList = mapValuesToList(nnInfo, fields); + + working.addAll(fieldList); + + return listToString(working); +// return working; + + /* + "name" : "Hadoop:service=NameNode,name=NameNodeInfo", + "modelerType" : "org.apache.hadoop.hdfs.server.namenode.FSNamesystem", + "Total" : 1254254346240, + "UpgradeFinalized" : true, + "ClusterId" : "CID-b255ee79-e4f1-44a8-b134-044c25d7bfd4", + "Version" : "2.7.1.2.3.5.1-68, rfe3c6b6dd1526d3c46f61a2e8fab9bb5eb649989", + "Used" : 43518906368, + "Free" : 1203736371256, + "Safemode" : "", + "NonDfsUsedSpace" : 6999068616, + "PercentUsed" : 3.4697034, + "BlockPoolUsedSpace" : 
43518906368, + "PercentBlockPoolUsed" : 3.4697034, + "PercentRemaining" : 95.972275, + "CacheCapacity" : 0, + "CacheUsed" : 0, + "TotalBlocks" : 7813, + "TotalFiles" : 9555, + "NumberOfMissingBlocks" : 0, + "NumberOfMissingBlocksWithReplicationFactorOne" : 0, + + */ + } + + public String getFSState() { + + List working = mapValuesToList(metadata, null); + + Map fsState = jjp.getJmxBeanContent(NamenodeJmxBean.FS_STATE_JMX_BEAN.getBeanName()); + + String[] fields = {"CapacityUsed", "CapacityRemaining", "BlocksTotal", "PendingReplicationBlocks", "UnderReplicatedBlocks", "ScheduledReplicationBlocks", "PendingDeletionBlocks", "FSState", "NumLiveDataNodes", "NumDeadDataNodes", "NumDecomLiveDataNodes", "NumDecomDeadDataNodes", "VolumeFailuresTotal"}; + + + List fieldList = mapValuesToList(fsState, fields); + + working.addAll(fieldList); + + return listToString(working); + + /* + "CapacityTotal" : 1254254346240, + "CapacityUsed" : 43518906368, + "CapacityRemaining" : 1203736371256, + "TotalLoad" : 36, + "SnapshotStats" : "{\"SnapshottableDirectories\":4,\"Snapshots\":8}", + "FsLockQueueLength" : 0, + "BlocksTotal" : 7813, + "MaxObjects" : 0, + "FilesTotal" : 9555, + "PendingReplicationBlocks" : 0, + "UnderReplicatedBlocks" : 4, + "ScheduledReplicationBlocks" : 0, + "PendingDeletionBlocks" : 0, + "BlockDeletionStartTime" : 1458341216736, + "FSState" : "Operational", + "NumLiveDataNodes" : 3, + "NumDeadDataNodes" : 0, + "NumDecomLiveDataNodes" : 0, + "NumDecomDeadDataNodes" : 0, + "VolumeFailuresTotal" : 0, + "EstimatedCapacityLostTotal" : 0, + "NumDecommissioningDataNodes" : 0, + "NumStaleDataNodes" : 0, + "NumStaleStorages" : 0, + + */ + } + + private List buildTopUserOpsCountRecords(JsonNode topNode) { + List rtn = null; + if (topNode != null) { + rtn = new ArrayList(); + try { + StringBuilder sbHeader = new StringBuilder(); + // Build the Key for the Record. 
+ for (String key : metadata.keySet()) { + sbHeader.append(metadata.get(key)).append(delimiter); + } + + // Cycle through the Windows + for (JsonNode wNode : topNode.get("windows")) { + StringBuilder sbWindow = new StringBuilder(sbHeader); + + sbWindow.append(wNode.get("windowLenMs").asText()).append(delimiter); + // Cycle through the Operations. + for (JsonNode opNode : wNode.get("ops")) { + // Get Op Type + StringBuilder sbOp = new StringBuilder(sbWindow); + sbOp.append(opNode.get("opType").asText()).append(delimiter); + // Cycle through the Users. + for (JsonNode userNode : opNode.get("topUsers")) { + StringBuilder sbUser = new StringBuilder(sbOp); + sbUser.append(userNode.get("user").asText()).append(delimiter); + sbUser.append(userNode.get("count").asText()); + // Add to the list. + rtn.add(sbUser.toString()); + } + } + } + + } catch (Throwable t) { + t.printStackTrace(); + } + } + return rtn; + } +} diff --git a/src/main/java/com/dstreev/hdfs/shell/command/Constants.java b/src/main/java/com/dstreev/hdfs/shell/command/Constants.java new file mode 100644 index 0000000..3b7e502 --- /dev/null +++ b/src/main/java/com/dstreev/hdfs/shell/command/Constants.java @@ -0,0 +1,13 @@ +package com.dstreev.hdfs.shell.command; + +/** + * Created by dstreev on 2015-11-22. + */ +public interface Constants { + + String HDFS_URL = "hdfs.url"; + String HDFS = "hdfs.fs"; + String LOCAL_FS = "local.fs"; + String CFG = "config"; + +} diff --git a/src/main/java/com/dstreev/hdfs/shell/command/Direction.java b/src/main/java/com/dstreev/hdfs/shell/command/Direction.java new file mode 100644 index 0000000..be1a0ca --- /dev/null +++ b/src/main/java/com/dstreev/hdfs/shell/command/Direction.java @@ -0,0 +1,11 @@ +package com.dstreev.hdfs.shell.command; + +/** + * Created by dstreev on 2015-11-22. 
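+ * + * Indicates the transfer direction for commands that move data; for example, in HdfsShell below: + * env.addCommand(new HdfsCommand("put", env, Direction.LOCAL_REMOTE)); + * env.addCommand(new HdfsCommand("get", env, Direction.REMOTE_LOCAL));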
+ */ +public enum Direction { + LOCAL_REMOTE, + REMOTE_LOCAL, + REMOTE_REMOTE, + NONE +} diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsCat.java b/src/main/java/com/dstreev/hdfs/shell/command/LocalCat.java similarity index 85% rename from src/main/java/com/instanceone/hdfs/shell/command/HdfsCat.java rename to src/main/java/com/dstreev/hdfs/shell/command/LocalCat.java index fad97a7..ed2bc3f 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/HdfsCat.java +++ b/src/main/java/com/dstreev/hdfs/shell/command/LocalCat.java @@ -1,5 +1,3 @@ -// Copyright (c) 2012 P. Taylor Goetz (ptgoetz@gmail.com) - package com.instanceone.hdfs.shell.command; import java.io.BufferedReader; @@ -7,6 +5,7 @@ import java.io.InputStream; import java.io.InputStreamReader; +import com.dstreev.hdfs.shell.command.Constants; import jline.console.ConsoleReader; import jline.console.completer.Completer; @@ -18,21 +17,25 @@ import com.instanceone.hdfs.shell.completers.FileSystemNameCompleter; import com.instanceone.stemshell.Environment; -public class HdfsCat extends HdfsCommand { +/** + * Created by dstreev on 2015-11-22. + */ + +public class LocalCat extends HdfsCommand { public static final int LINE_COUNT = 10; private Environment env; private boolean local = false; - public HdfsCat(String name, Environment env, boolean local) { - super(name); - this.env = env; + public LocalCat(String name, Environment env, boolean local) { + super(name, env); +// this.env = env; this.local = local; } public void execute(Environment env, CommandLine cmd, ConsoleReader console) { - FileSystem hdfs = this.local ? (FileSystem)env.getValue(LOCAL_FS) : (FileSystem)env.getValue(HDFS); + FileSystem hdfs = this.local ? 
(FileSystem)env.getValue(Constants.LOCAL_FS) : (FileSystem)env.getValue(Constants.HDFS); logv(cmd, "CWD: " + hdfs.getWorkingDirectory()); if(cmd.getArgs().length == 1){ @@ -62,8 +65,7 @@ public void execute(Environment env, CommandLine cmd, ConsoleReader console) { } else{ // usage(); } - - + FSUtil.prompt(env); } @Override diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsHead.java b/src/main/java/com/dstreev/hdfs/shell/command/LocalHead.java similarity index 87% rename from src/main/java/com/instanceone/hdfs/shell/command/HdfsHead.java rename to src/main/java/com/dstreev/hdfs/shell/command/LocalHead.java index ecf66b0..50e40a3 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/HdfsHead.java +++ b/src/main/java/com/dstreev/hdfs/shell/command/LocalHead.java @@ -1,5 +1,3 @@ -// Copyright (c) 2012 P. Taylor Goetz (ptgoetz@gmail.com) - package com.instanceone.hdfs.shell.command; import java.io.BufferedReader; @@ -7,6 +5,7 @@ import java.io.InputStream; import java.io.InputStreamReader; +import com.dstreev.hdfs.shell.command.Constants; import jline.console.ConsoleReader; import jline.console.completer.Completer; @@ -18,22 +17,26 @@ import com.instanceone.hdfs.shell.completers.FileSystemNameCompleter; import com.instanceone.stemshell.Environment; -public class HdfsHead extends HdfsCommand { +/** + * Created by dstreev on 2015-11-22. + */ + +public class LocalHead extends HdfsCommand { public static final int LINE_COUNT = 10; private Environment env; private boolean local = false; - public HdfsHead(String name, Environment env, boolean local) { - super(name); + public LocalHead(String name, Environment env, boolean local) { + super(name, env); this.env = env; this.local = local; } public void execute(Environment env, CommandLine cmd, ConsoleReader console) { - FileSystem hdfs = this.local ? (FileSystem) env.getValue(LOCAL_FS) - : (FileSystem) env.getValue(HDFS); + FileSystem hdfs = this.local ? 
(FileSystem) env.getValue(Constants.LOCAL_FS) + : (FileSystem) env.getValue(Constants.HDFS); logv(cmd, "CWD: " + hdfs.getWorkingDirectory()); if (cmd.getArgs().length == 1) { @@ -68,6 +71,7 @@ public void execute(Environment env, CommandLine cmd, ConsoleReader console) { else { // usage(); } + FSUtil.prompt(env); } @Override diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsMkdir.java b/src/main/java/com/dstreev/hdfs/shell/command/LocalMkdir.java similarity index 73% rename from src/main/java/com/instanceone/hdfs/shell/command/HdfsMkdir.java rename to src/main/java/com/dstreev/hdfs/shell/command/LocalMkdir.java index 71fd8f3..2d15fca 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/HdfsMkdir.java +++ b/src/main/java/com/dstreev/hdfs/shell/command/LocalMkdir.java @@ -1,9 +1,11 @@ -// Copyright (c) 2012 P. Taylor Goetz (ptgoetz@gmail.com) - -package com.instanceone.hdfs.shell.command; +package com.dstreev.hdfs.shell.command; import java.io.IOException; +import com.dstreev.hdfs.shell.command.Constants; +import com.instanceone.hdfs.shell.command.FSUtil; +import com.instanceone.hdfs.shell.command.HdfsCommand; + import jline.console.ConsoleReader; import jline.console.completer.Completer; @@ -15,22 +17,26 @@ import com.instanceone.hdfs.shell.completers.FileSystemNameCompleter; import com.instanceone.stemshell.Environment; -public class HdfsMkdir extends HdfsCommand { +/** + * Created by dstreev on 2015-11-22. + */ + +public class LocalMkdir extends HdfsCommand { public static final int LINE_COUNT = 10; private Environment env; private boolean local = false; - public HdfsMkdir(String name, Environment env, boolean local) { - super(name); - this.env = env; + public LocalMkdir(String name, Environment env, boolean local) { + super(name, env); +// this.env = env; this.local = local; } public void execute(Environment env, CommandLine cmd, ConsoleReader console) { - FileSystem hdfs = this.local ? 
(FileSystem) env.getValue(LOCAL_FS) - : (FileSystem) env.getValue(HDFS); + FileSystem hdfs = this.local ? (FileSystem) env.getValue(Constants.LOCAL_FS) + : (FileSystem) env.getValue(Constants.HDFS); logv(cmd, "CWD: " + hdfs.getWorkingDirectory()); if (cmd.getArgs().length == 1) { @@ -48,6 +54,7 @@ public void execute(Environment env, CommandLine cmd, ConsoleReader console) { } else { } + FSUtil.prompt(env); } @Override diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsRm.java b/src/main/java/com/dstreev/hdfs/shell/command/LocalRm.java similarity index 71% rename from src/main/java/com/instanceone/hdfs/shell/command/HdfsRm.java rename to src/main/java/com/dstreev/hdfs/shell/command/LocalRm.java index 907ffd4..b29110d 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/HdfsRm.java +++ b/src/main/java/com/dstreev/hdfs/shell/command/LocalRm.java @@ -1,7 +1,8 @@ -// Copyright (c) 2012 P. Taylor Goetz (ptgoetz@gmail.com) - -package com.instanceone.hdfs.shell.command; +package com.dstreev.hdfs.shell.command; +import com.dstreev.hdfs.shell.command.Constants; +import com.instanceone.hdfs.shell.command.FSUtil; +import com.instanceone.hdfs.shell.command.HdfsCommand; import jline.console.ConsoleReader; import org.apache.commons.cli.CommandLine; @@ -11,18 +12,22 @@ import com.instanceone.stemshell.Environment; -public class HdfsRm extends HdfsCommand { +/** + * Created by dstreev on 2015-11-22. + */ + +public class LocalRm extends HdfsCommand { private boolean local = false; - public HdfsRm(String name, boolean local) { + public LocalRm(String name, boolean local) { super(name); this.local = local; } public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { try { - FileSystem hdfs = this.local ? (FileSystem) env.getValue(LOCAL_FS) - : (FileSystem) env.getValue(HDFS); + FileSystem hdfs = this.local ? 
(FileSystem) env.getValue(Constants.LOCAL_FS) + : (FileSystem) env.getValue(Constants.HDFS); String remoteFile = cmd.getArgs()[0]; logv(cmd, "HDFS file: " + remoteFile); @@ -33,6 +38,8 @@ public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { logv(cmd, "Deleting recursively..."); hdfs.delete(hdfsPath, recursive); + FSUtil.prompt(env); + } catch (Throwable e) { log(cmd, "Error: " + e.getMessage()); diff --git a/src/main/java/com/instanceone/hdfs/shell/HdfsShell.java b/src/main/java/com/instanceone/hdfs/shell/HdfsShell.java index ad54e77..c989b0a 100644 --- a/src/main/java/com/instanceone/hdfs/shell/HdfsShell.java +++ b/src/main/java/com/instanceone/hdfs/shell/HdfsShell.java @@ -2,54 +2,241 @@ package com.instanceone.hdfs.shell; -import com.instanceone.hdfs.shell.command.HdfsCat; -import com.instanceone.hdfs.shell.command.HdfsCd; -import com.instanceone.hdfs.shell.command.HdfsConnect; -import com.instanceone.hdfs.shell.command.HdfsHead; -import com.instanceone.hdfs.shell.command.HdfsLs; -import com.instanceone.hdfs.shell.command.HdfsMkdir; -import com.instanceone.hdfs.shell.command.HdfsPut; -import com.instanceone.hdfs.shell.command.HdfsPwd; -import com.instanceone.hdfs.shell.command.HdfsRm; -import com.instanceone.hdfs.shell.command.LocalCd; -import com.instanceone.hdfs.shell.command.LocalLs; -import com.instanceone.hdfs.shell.command.LocalPwd; +import com.dstreev.hadoop.util.HdfsLsPlus; +import com.dstreev.hadoop.util.HdfsNNStats; +import com.dstreev.hdfs.shell.command.Direction; +import com.dstreev.hdfs.shell.command.LocalMkdir; +import com.dstreev.hdfs.shell.command.LocalRm; +import com.instanceone.hdfs.shell.command.*; +import com.dstreev.hadoop.util.HdfsRepair; +import com.instanceone.hdfs.shell.command.LocalCat; +import com.instanceone.hdfs.shell.command.LocalHead; import com.instanceone.stemshell.Environment; import com.instanceone.stemshell.commands.Env; import com.instanceone.stemshell.commands.Exit; import 
com.instanceone.stemshell.commands.Help; import com.instanceone.stemshell.commands.HistoryCmd; +import jline.console.ConsoleReader; +import org.apache.commons.cli.*; -public class HdfsShell extends com.instanceone.stemshell.Shell{ - - public static void main(String[] args) throws Exception{ +import java.io.BufferedReader; +import java.io.File; +import java.io.FileReader; +import java.io.IOException; + +public class HdfsShell extends com.instanceone.stemshell.Shell { + + public static void main(String[] args) throws Exception { new HdfsShell().run(args); } + private Options getOptions() { + // create Options object + Options options = new Options(); + + // add i option + Option initOption = Option.builder("i").required(false) + .argName("init set").desc("Initialize with set") + .longOpt("init") + .hasArg(true).numberOfArgs(1) + .build(); + options.addOption(initOption); + + Option helpOption = Option.builder("?").required(false) + .longOpt("help") + .build(); + options.addOption(helpOption); + + // TODO: Scripting + //options.addOption("f", true, "Script file"); + + return options; + + } + + @Override + protected void preProcessInitializationArguments(String[] arguments) { + super.preProcessInitializationArguments(arguments); + + // create Options object + Options options = getOptions(); + + CommandLineParser parser = new DefaultParser(); + CommandLine cmd = null; + + try { + cmd = parser.parse(options, arguments); + } catch (ParseException pe) { + HelpFormatter formatter = new HelpFormatter(); + formatter.printHelp("hdfs-cli", options); + } + + if (cmd.hasOption("help")) { + HelpFormatter formatter = new HelpFormatter(); + formatter.printHelp("hdfs-cli", options); + System.exit(-1); + } + + } + + @Override + protected void postProcessInitializationArguments(String[] arguments, ConsoleReader reader) { + super.postProcessInitializationArguments(arguments, reader); + + // create Options object + Options options = getOptions(); + + CommandLineParser parser = new 
DefaultParser(); + CommandLine cmd = null; + + try { + cmd = parser.parse(options, arguments); + } catch (ParseException pe) { + HelpFormatter formatter = new HelpFormatter(); + formatter.printHelp("hdfs-cli", options); + } + + autoConnect(reader); + + if (cmd.hasOption("init")) { + initialSet(cmd.getOptionValue("init"), reader); + } + + } + + private void initialSet(String set, ConsoleReader reader) { + System.out.println("-- Initializing with set: " + set); + + File dir = new File(System.getProperty("user.home"), "." + + this.getName()); + if (dir.exists() && dir.isFile()) { + throw new IllegalStateException( + "Default configuration file exists and is not a directory: " + + dir.getAbsolutePath()); + } else if (!dir.exists()) { + dir.mkdir(); + } + // directory created, touch history file + File setFile = new File(dir, set); + if (!setFile.exists()) { + try { + if (!setFile.createNewFile()) { + throw new IllegalStateException( + "Unable to create set file: " + + setFile.getAbsolutePath()); + } else { + System.out.println("New Initialization File created in: " + System.getProperty("user.home") + System.getProperty("file.separator") + "." + this.getName() + System.getProperty("file.separator") + set + ". 
Add commands to this file to initialize the next session"); + } + } catch (IOException ioe) { + ioe.printStackTrace(); + } + } else { + try { + BufferedReader br = new BufferedReader(new FileReader(setFile)); + String line = null; + while ((line = br.readLine()) != null) { + System.out.println(line); + String line2 = line.trim(); + if (line2.length() > 0 && !line2.startsWith("#")) { + processInput(line2, reader); + } + } + } catch (Exception e) { + e.printStackTrace(); + } + + } + + } + + private void autoConnect(ConsoleReader reader) { + try { + String userName = System.getProperty("user.name"); + processInput("connect", reader); + processInput("cd /user/" + userName, reader); + } catch (Exception e) { + e.printStackTrace(); + } + + } + + private void runScript(String file, ConsoleReader reader) { + + } + @Override public void initialize(Environment env) throws Exception { - + env.addCommand(new Exit("exit")); env.addCommand(new LocalLs("lls", env)); env.addCommand(new LocalPwd("lpwd")); env.addCommand(new LocalCd("lcd", env)); - env.addCommand(new HdfsLs("ls")); env.addCommand(new HdfsCd("cd", env)); env.addCommand(new HdfsPwd("pwd")); - env.addCommand(new HdfsPut("put", env)); - env.addCommand(new HdfsHead("head", env, false)); - env.addCommand(new HdfsHead("lhead", env, true)); - env.addCommand(new HdfsCat("cat", env, false)); - env.addCommand(new HdfsCat("lcat", env, true)); - env.addCommand(new HdfsMkdir("mkdir", env, false)); - env.addCommand(new HdfsMkdir("lmkdir", env, true)); - env.addCommand(new HdfsRm("rm", false)); - env.addCommand(new HdfsRm("lrm", true)); + + // remote local + env.addCommand(new HdfsCommand("get", env, Direction.REMOTE_LOCAL)); + env.addCommand(new HdfsCommand("copyFromLocal", env, Direction.LOCAL_REMOTE)); + // local remote + env.addCommand(new HdfsCommand("put", env, Direction.LOCAL_REMOTE)); + env.addCommand(new HdfsCommand("copyToLocal", env, Direction.REMOTE_LOCAL)); + // src dest + env.addCommand(new HdfsCommand("cp", env, 
Direction.REMOTE_REMOTE)); + + // amend to context path, if present + env.addCommand(new HdfsCommand("chown", env, Direction.NONE, 1)); + env.addCommand(new HdfsCommand("chmod", env, Direction.NONE, 1)); + env.addCommand(new HdfsCommand("chgrp", env, Direction.NONE, 1)); + + env.addCommand(new HdfsCommand("createSnapshot", env, Direction.NONE, 1, false, true)); + env.addCommand(new HdfsCommand("deleteSnapshot", env, Direction.NONE, 1, false, false)); + env.addCommand(new HdfsCommand("renameSnapshot", env, Direction.NONE, 2, false, false)); + + env.addCommand(new HdfsCommand("du", env, Direction.NONE)); + env.addCommand(new HdfsCommand("df", env, Direction.NONE)); + env.addCommand(new HdfsCommand("dus", env, Direction.NONE)); + env.addCommand(new HdfsCommand("ls", env, Direction.NONE)); + env.addCommand(new HdfsCommand("lsr", env, Direction.NONE)); +// env.addCommand(new HdfsCommand("find", env, Direction.NONE, 1, false)); + + + env.addCommand(new HdfsCommand("mkdir", env, Direction.NONE)); + + env.addCommand(new HdfsCommand("count", env, Direction.NONE)); + env.addCommand(new HdfsCommand("stat", env, Direction.NONE)); + env.addCommand(new HdfsCommand("tail", env, Direction.NONE)); + env.addCommand(new HdfsCommand("head", env, Direction.NONE)); +// env.addCommand(new HdfsCommand("test", env, Direction.NONE)); + env.addCommand(new HdfsCommand("touchz", env, Direction.NONE)); + + env.addCommand(new HdfsCommand("rm", env, Direction.NONE)); + env.addCommand(new HdfsCommand("rmdir", env, Direction.NONE)); + env.addCommand(new HdfsCommand("mv", env, Direction.REMOTE_REMOTE)); + env.addCommand(new HdfsCommand("cat", env, Direction.NONE)); + env.addCommand(new HdfsCommand("text", env, Direction.NONE)); + env.addCommand(new HdfsCommand("checksum", env, Direction.NONE)); + env.addCommand(new HdfsCommand("usage", env)); + + // Security Help +// env.addCommand(new HdfsUGI("ugi")); +// env.addCommand(new HdfsKrb("krb", env, Direction.NONE, 1)); + + // HDFS Utils + 
//env.addCommand(new HdfsRepair("repair", env, Direction.NONE, 2, true, true)); + + env.addCommand(new LocalHead("lhead", env, true)); + env.addCommand(new LocalCat("lcat", env, true)); + env.addCommand(new LocalMkdir("lmkdir", env, true)); + env.addCommand(new LocalRm("lrm", true)); env.addCommand(new Env("env")); env.addCommand(new HdfsConnect("connect")); env.addCommand(new Help("help", env)); env.addCommand(new HistoryCmd("history")); - + + // HDFS Tools + env.addCommand(new HdfsLsPlus("lsp", env, Direction.NONE)); + env.addCommand(new HdfsNNStats("nnstat", env, Direction.NONE)); + } @Override diff --git a/src/main/java/com/instanceone/hdfs/shell/command/FSUtil.java b/src/main/java/com/instanceone/hdfs/shell/command/FSUtil.java index 323f041..0bdf2ce 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/FSUtil.java +++ b/src/main/java/com/instanceone/hdfs/shell/command/FSUtil.java @@ -7,9 +7,12 @@ import java.text.SimpleDateFormat; import java.util.Date; +import com.dstreev.hdfs.shell.command.Constants; +import com.instanceone.stemshell.Environment; import org.apache.hadoop.fs.FileStatus; import com.instanceone.stemshell.format.ANSIStyle; +import org.apache.hadoop.fs.FileSystem; public class FSUtil { @@ -19,18 +22,20 @@ private FSUtil() { public static String longFormat(FileStatus file) { String retval = (file.isDir() ? "d" : "-") + file.getPermission() - + " " + + (file.getPermission().getAclBit() ? "+":"") + + (file.getPermission().getEncryptedBit() ? "#":"") + + "\t" + file.getOwner() - + " " + + "\t" + file.getGroup() - + " " + + "\t" + formatFileSize(file.getLen()) - + " " + + "\t" + formatDate(file.getModificationTime()) - + " " + + "\t" + (file.isDir() ? 
ANSIStyle.style(file.getPath() - .getName(), ANSIStyle.FG_BLUE) : file + .getName(), ANSIStyle.FG_GREEN) : file .getPath().getName()); return retval; @@ -38,19 +43,40 @@ public static String longFormat(FileStatus file) { public static String shortFormat(FileStatus file) { String retval = (file.isDir() ? ANSIStyle.style(file.getPath() - .getName(), ANSIStyle.FG_BLUE) : file.getPath() + .getName(), ANSIStyle.FG_GREEN) : file.getPath() .getName()); return retval; } - + + public static void prompt(Environment env) { + try { + StringBuilder sb = new StringBuilder(); + + FileSystem localfs = (FileSystem) env.getValue(Constants.LOCAL_FS); + FileSystem hdfs = (FileSystem) env.getValue(Constants.HDFS); + + String hdfswd = hdfs.getWorkingDirectory().toString(); + String localwd = localfs.getWorkingDirectory().toString(); + + String hwd = ANSIStyle.style(hdfswd, ANSIStyle.FG_GREEN) ; + + String lwd = ANSIStyle.style(localwd, ANSIStyle.FG_YELLOW); + + env.setPrompt(" " + hwd + "\n " + lwd + "\n$ "); + + } catch (Exception e) { + e.printStackTrace(); + } + } + public static String formatDate(long millis){ Date date = new Date(millis); return formatDate(date); } public static String formatDate(Date date){ - DateFormat df = new SimpleDateFormat("MM/dd/yyyy HH:mm:ss z"); + DateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss z"); return df.format(date); } diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsAbstract.java b/src/main/java/com/instanceone/hdfs/shell/command/HdfsAbstract.java new file mode 100644 index 0000000..16fdd92 --- /dev/null +++ b/src/main/java/com/instanceone/hdfs/shell/command/HdfsAbstract.java @@ -0,0 +1,274 @@ +// Copyright (c) 2012 P. 
Taylor Goetz (ptgoetz@gmail.com) + +package com.instanceone.hdfs.shell.command; + +import com.dstreev.hdfs.shell.command.Constants; +import com.dstreev.hdfs.shell.command.Direction; +import com.instanceone.hdfs.shell.completers.FileSystemNameCompleter; +import com.instanceone.stemshell.Environment; +import com.instanceone.stemshell.command.AbstractCommand; +import jline.console.completer.Completer; +import org.apache.hadoop.fs.FileSystem; + +public abstract class HdfsAbstract extends AbstractCommand { + protected Environment env; + +// public static final String HDFS_URL = "hdfs.url"; +// public static final String HDFS = "hdfs.fs"; +// public static final String LOCAL_FS = "local.fs"; +// public static final String CFG = "config"; + +// public enum Direction { +// LOCAL_REMOTE, +// REMOTE_LOCAL, +// REMOTE_REMOTE, +// NONE; +// } + + enum Side { + LEFT,RIGHT + } + + protected Direction directionContext = null; + + protected int directives = 0; + protected boolean directivesBefore = true; + protected boolean directivesOptional = false; + + public HdfsAbstract(String name) { + super(name); + } + + public HdfsAbstract(String name, Environment env, Direction directionContext ) { + super(name); + this.env = env; + this.directionContext = directionContext; + } + + public HdfsAbstract(String name, Environment env, Direction directionContext, int directives ) { + super(name); + this.env = env; + this.directionContext = directionContext; + this.directives = directives; + } + + public HdfsAbstract(String name, Environment env, Direction directionContext, int directives, boolean directivesBefore, boolean directivesOptional ) { + super(name); + this.env = env; + this.directionContext = directionContext; + this.directives = directives; + this.directivesBefore = directivesBefore; + this.directivesOptional = directivesOptional; + } + + public HdfsAbstract(String name, Environment env) { + super(name); + this.env = env; + } +// +// public void execute(Environment env, CommandLine 
cmd, ConsoleReader reader) { +// FsShell shell = new FsShell(); +// +// Configuration conf = (Configuration)env.getValue(Constants.CFG); +// +// String hdfs_uri = (String)env.getProperty(Constants.HDFS_URL); +// +// FileSystem hdfs = (FileSystem) env.getValue(Constants.HDFS); +// +// if (hdfs == null) { +// System.out.println("Please connect first"); +// } +// conf.set("fs.defaultFS", hdfs_uri); +// +// conf.setQuietMode(false); +// shell.setConf(conf); +// int res; +// String[] argv = null; +// +// Option[] cmdOpts = cmd.getOptions(); +// String[] cmdArgs = cmd.getArgs(); +// +// // TODO: Need to Handle context aware file operations. +// // put, get, mv, copy.., chmod, chown, chgrp, count +// int pathCount = 0; +// +// String leftPath = null; +// String rightPath = null; +// +// switch (directionContext) { +// case REMOTE_LOCAL: +// pathCount += 2; // Source and Destination Path Elements. +// break; +// case LOCAL_REMOTE: +// pathCount += 2; // Source and Destination Path Elements. +// +// break; +// case REMOTE_REMOTE: +// pathCount += 2; // Source and Destination Path Elements. 
+// +// break; +// default: // NONE +// pathCount += 1; +// } +// +// leftPath = buildPath(Side.LEFT, cmdArgs, directionContext); +// if (directionContext != Direction.NONE) { +// rightPath = buildPath(Side.RIGHT, cmdArgs, directionContext); +// } +// +// String[] newCmdArgs = new String[pathCount]; +// if (rightPath != null) { +// newCmdArgs[0] = leftPath; +// newCmdArgs[1] = rightPath; +// } else { +// newCmdArgs[0] = leftPath; +// } +// +// argv = new String[cmdOpts.length + newCmdArgs.length + 1 + directives]; +// +// int pos = 1; +// +// for (Option opt: cmdOpts) { +// argv[pos++] = "-" + opt.getOpt(); +// } +// +// if (directivesBefore) { +// for (int i = 0; i < directives; i++) { +// argv[pos++] = cmdArgs[i]; +// } +// } +// +// for (String arg: newCmdArgs) { +// argv[pos++] = arg; +// } +// +// if (!directivesBefore) { +// for (int i = directives; i > 0; i--) { +// try { +// argv[pos++] = cmdArgs[cmdArgs.length - (i)]; +// } catch (Exception e) { +// // Can happen when args are optional +// } +// } +// } +// +// argv[0] = "-" + getName(); +// +// System.out.println("HDFS Command: " + Arrays.toString(argv)); +// +// try { +// res = ToolRunner.run(shell, argv); +// } catch (Exception e) { +// e.printStackTrace(); +// } finally { +// try { +// shell.close(); +// } catch (IOException e) { +// e.printStackTrace(); +// } +// } +// } + + protected String buildPath(Side side, String[] args, Direction context) { + String rtn = null; + + FileSystem localfs = (FileSystem)env.getValue(Constants.LOCAL_FS); + FileSystem hdfs = (FileSystem) env.getValue(Constants.HDFS); + + String in = null; + + switch (side) { + case LEFT: + if (args.length > 0) + if (directivesBefore) { + in = args[directives]; + } else { + if (directivesOptional) { + if (args.length > directives) { + in = args[args.length-(directives+1)]; + } else { + // in is null + } + } else { + in = args[args.length-(directives+1)]; + } + } + switch (context) { + case REMOTE_LOCAL: + case REMOTE_REMOTE: + case 
NONE: + rtn = buildPath2(hdfs.getWorkingDirectory().toString().substring(((String)env.getProperty(Constants.HDFS_URL)).length()), in); + break; + case LOCAL_REMOTE: + rtn = buildPath2(localfs.getWorkingDirectory().toString().substring(5), in); + break; + } + break; + case RIGHT: + if (args.length > 1) + if (directivesBefore) + in = args[directives + 1]; + else + in = args[args.length-(directives+1)]; + switch (context) { + case REMOTE_LOCAL: + rtn = buildPath2(localfs.getWorkingDirectory().toString().substring(5), in); + break; + case LOCAL_REMOTE: + case REMOTE_REMOTE: + rtn = buildPath2(hdfs.getWorkingDirectory().toString().substring(((String)env.getProperty(Constants.HDFS_URL)).length()), in); + break; + case NONE: + break; + } + break; + } + if (rtn != null && rtn.contains(" ")) { + rtn = "'" + rtn + "'"; + } + return rtn; + } + + protected String buildPath2(String current, String input) { + if (input != null) { + if (input.startsWith("/")) + return input; + else + return current + "/" + input; + } else { + return current; + } + } + +// @Override +// public Options getOptions() { +// Options opts = super.getOptions(); +// opts.addOption("l", false, "show extended file attributes"); +// opts.addOption("R", false, "recurse"); +// opts.addOption("f", false, "force / is file"); +// opts.addOption("p", false, "preserve"); +// opts.addOption("h", false, "human readable"); +// opts.addOption("s", false, "summary"); +// opts.addOption("q", false, "query"); +// opts.addOption("d", false, "dump / path is directory"); +// opts.addOption("e", false, "encoding / path exists"); +// opts.addOption("t", false, "sort by Timestamp"); +// opts.addOption("S", false, "sort by Size / path not empty"); +// opts.addOption("r", false, "reverse"); +// opts.addOption("z", false, "file length is 0"); +// opts.addOption("u", false, "user access time"); +// opts.addOption("skipTrash", false, "Skip Trash"); +// opts.addOption("ignorecrc", false, "ignorecrc"); +// opts.addOption("crc", false, 
"crc"); +// +//// opts.addOption("ignore-fail-on-non-empty", false, "ignore-fail-on-non-empty"); +// return opts; +// } + + @Override + public Completer getCompleter() { + return new FileSystemNameCompleter(this.env, false); + } + + +} diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsCd.java b/src/main/java/com/instanceone/hdfs/shell/command/HdfsCd.java index 5ff6df9..aa9fadc 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/HdfsCd.java +++ b/src/main/java/com/instanceone/hdfs/shell/command/HdfsCd.java @@ -4,6 +4,7 @@ import java.io.IOException; +import com.dstreev.hdfs.shell.command.Constants; import jline.console.ConsoleReader; import jline.console.completer.Completer; @@ -18,13 +19,14 @@ public class HdfsCd extends HdfsCommand { private Environment env; public HdfsCd(String name, Environment env) { - super(name); + super(name, env); this.env = env; } public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { + FileSystem hdfs = null; try { - FileSystem hdfs = (FileSystem)env.getValue(HDFS); + hdfs = (FileSystem)env.getValue(Constants.HDFS); String dir = cmd.getArgs().length == 0 ? 
"/" : cmd.getArgs()[0]; logv(cmd, "CWD before: " + hdfs.getWorkingDirectory()); @@ -32,7 +34,7 @@ public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { Path newPath = null; if(dir.startsWith("/")){ - newPath = new Path(env.getProperty(HDFS_URL), dir); + newPath = new Path(env.getProperty(Constants.HDFS_URL), dir); } else{ newPath = new Path(hdfs.getWorkingDirectory(), dir); } @@ -49,6 +51,8 @@ public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { } catch (IOException e) { System.out.println(e.getMessage()); + } finally { + FSUtil.prompt(env); } } diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsCommand.java b/src/main/java/com/instanceone/hdfs/shell/command/HdfsCommand.java index 71a3b49..22f3b5a 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/HdfsCommand.java +++ b/src/main/java/com/instanceone/hdfs/shell/command/HdfsCommand.java @@ -2,16 +2,175 @@ package com.instanceone.hdfs.shell.command; -import com.instanceone.stemshell.command.AbstractCommand; +import com.dstreev.hdfs.shell.command.Constants; +import com.dstreev.hdfs.shell.command.Direction; +import com.instanceone.stemshell.Environment; +import jline.console.ConsoleReader; +import org.apache.commons.cli.CommandLine; +import org.apache.commons.cli.Option; +import org.apache.commons.cli.Options; +import org.apache.hadoop.conf.Configuration; +import org.apache.hadoop.fs.FileSystem; +import org.apache.hadoop.fs.FsShell; +import org.apache.hadoop.util.ToolRunner; + +import java.io.IOException; +import java.util.Arrays; + +public class HdfsCommand extends HdfsAbstract { -public abstract class HdfsCommand extends AbstractCommand { - public static final String HDFS_URL = "hdfs.url"; - public static final String HDFS = "hdfs.fs"; - public static final String LOCAL_FS = "local.fs"; - public HdfsCommand(String name) { super(name); } + public HdfsCommand(String name, Environment env, Direction directionContext ) { + super(name, env, 
directionContext); + } + + public HdfsCommand(String name, Environment env, Direction directionContext, int directives ) { + super(name,env,directionContext,directives); + } + + public HdfsCommand(String name, Environment env, Direction directionContext, int directives, boolean directivesBefore, boolean directivesOptional ) { + super(name,env,directionContext,directives,directivesBefore,directivesOptional); + } + + public HdfsCommand(String name, Environment env) { + super(name,env); + } + + public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { + FsShell shell = new FsShell(); + + Configuration conf = (Configuration)env.getValue(Constants.CFG); + + String hdfs_uri = (String)env.getProperty(Constants.HDFS_URL); + + FileSystem hdfs = (FileSystem) env.getValue(Constants.HDFS); + + if (hdfs == null) { + System.out.println("Please connect first"); + } + conf.set("fs.defaultFS", hdfs_uri); + + conf.setQuietMode(false); + shell.setConf(conf); + int res; + String[] argv = null; + + Option[] cmdOpts = cmd.getOptions(); + String[] cmdArgs = cmd.getArgs(); + + // TODO: Need to Handle context aware file operations. + // put, get, mv, copy.., chmod, chown, chgrp, count + int pathCount = 0; + + String leftPath = null; + String rightPath = null; + + switch (directionContext) { + case REMOTE_LOCAL: + pathCount += 2; // Source and Destination Path Elements. + break; + case LOCAL_REMOTE: + pathCount += 2; // Source and Destination Path Elements. + + break; + case REMOTE_REMOTE: + pathCount += 2; // Source and Destination Path Elements. 
+ + break; + default: // NONE + pathCount += 1; + } + + leftPath = buildPath(Side.LEFT, cmdArgs, directionContext); + if (directionContext != Direction.NONE) { + rightPath = buildPath(Side.RIGHT, cmdArgs, directionContext); + } + + String[] newCmdArgs = new String[pathCount]; + if (rightPath != null) { + newCmdArgs[0] = leftPath; + newCmdArgs[1] = rightPath; + } else { + newCmdArgs[0] = leftPath; + } + + argv = new String[cmdOpts.length + newCmdArgs.length + 1 + directives]; + + int pos = 1; + + for (Option opt: cmdOpts) { + argv[pos++] = "-" + opt.getOpt(); + } + + if (directivesBefore) { + for (int i = 0; i < directives; i++) { + argv[pos++] = cmdArgs[i]; + } + } + + for (String arg: newCmdArgs) { + argv[pos++] = arg; + } + + if (!directivesBefore) { + for (int i = directives; i > 0; i--) { + try { + argv[pos++] = cmdArgs[cmdArgs.length - (i)]; + } catch (Exception e) { + // Can happen when args are optional + } + } + } + + argv[0] = "-" + getName(); + + System.out.println("HDFS Command: " + Arrays.toString(argv)); + + try { + res = ToolRunner.run(shell, argv); + } catch (Exception e) { + e.printStackTrace(); + } finally { + try { + shell.close(); + } catch (IOException e) { + e.printStackTrace(); + } + } + } + + @Override + public Options getOptions() { + Options opts = super.getOptions(); + opts.addOption("l", false, "show extended file attributes"); + opts.addOption("R", false, "recurse"); + opts.addOption("f", false, "force / is file"); + opts.addOption("p", false, "preserve"); + opts.addOption("h", false, "human readable"); + opts.addOption("s", false, "summary"); + opts.addOption("q", false, "query"); + opts.addOption("d", false, "dump / path is directory"); + opts.addOption("e", false, "encoding / path exists"); + opts.addOption("t", false, "sort by Timestamp"); + opts.addOption("S", false, "sort by Size / path not empty"); + opts.addOption("r", false, "reverse"); + opts.addOption("z", false, "file length is 0"); + opts.addOption("u", false, "user access 
time"); + opts.addOption("skipTrash", false, "Skip Trash"); + opts.addOption("ignorecrc", false, "ignorecrc"); + opts.addOption("crc", false, "crc"); + +// opts.addOption("ignore-fail-on-non-empty", false, "ignore-fail-on-non-empty"); + return opts; + } + +// @Override +// public Completer getCompleter() { +// return new FileSystemNameCompleter(this.env, false); +// } + } diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsConnect.java b/src/main/java/com/instanceone/hdfs/shell/command/HdfsConnect.java index 51a7616..9349e93 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/HdfsConnect.java +++ b/src/main/java/com/instanceone/hdfs/shell/command/HdfsConnect.java @@ -2,9 +2,12 @@ package com.instanceone.hdfs.shell.command; +import java.io.File; import java.io.IOException; import java.net.URI; +import com.dstreev.hdfs.shell.command.Constants; +import com.instanceone.stemshell.command.AbstractCommand; import jline.console.ConsoleReader; import jline.console.completer.Completer; import jline.console.completer.StringsCompleter; @@ -15,40 +18,71 @@ import org.apache.hadoop.fs.Path; import com.instanceone.stemshell.Environment; +import org.apache.hadoop.security.UserGroupInformation; -public class HdfsConnect extends HdfsCommand { +public class HdfsConnect extends AbstractCommand { + + public static final String HADOOP_CONF_DIR = "HADOOP_CONF_DIR"; + private static final String[] HADOOP_CONF_FILES = {"core-site.xml", "hdfs-site.xml"}; public HdfsConnect(String name) { super(name); - Completer completer = new StringsCompleter("hdfs://localhost:9000/", "hdfs://hdfshost:9000/"); + Completer completer = new StringsCompleter("hdfs://localhost:8020/", "hdfs://hdfshost:8020/"); this.completer = completer; } public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { try { - if(cmd.getArgs().length > 0){ - Configuration config = new Configuration(); - FileSystem hdfs = FileSystem.get(URI.create(cmd.getArgs()[0]), - config); - 
env.setValue(HDFS, hdfs); - // set working dir to root - hdfs.setWorkingDirectory(hdfs.makeQualified(new Path("/"))); - - FileSystem local = FileSystem.getLocal(new Configuration()); - env.setValue(LOCAL_FS, local); - env.setProperty(HDFS_URL, hdfs.getUri().toString()); - - - log(cmd, "Connected: " + hdfs.getUri()); - logv(cmd, "HDFS CWD: " + hdfs.getWorkingDirectory()); - logv(cmd, "Local CWD: " + local.getWorkingDirectory()); - - } - } - catch (IOException e) { + // Get a value that overrides the default; if unset, fall back to the default. +// Requires Java 1.8... +// String hadoopConfDirProp = System.getenv().getOrDefault(HADOOP_CONF_DIR, "/etc/hadoop/conf"); + + String hadoopConfDirProp = System.getenv().get(HADOOP_CONF_DIR); + // Set a default + if (hadoopConfDirProp == null) + hadoopConfDirProp = "/etc/hadoop/conf"; + + Configuration config = new Configuration(false); + + File hadoopConfDir = new File(hadoopConfDirProp).getAbsoluteFile(); + for (String file : HADOOP_CONF_FILES) { + File f = new File(hadoopConfDir, file); + if (f.exists()) { + config.addResource(new Path(f.getAbsolutePath())); + } + } + + FileSystem hdfs = null; + try { + hdfs = FileSystem.get(config); + } catch (Throwable t) { + t.printStackTrace(); + } + + env.setValue(Constants.CFG, config); + env.setValue(Constants.HDFS, hdfs); + // set working dir to root + hdfs.setWorkingDirectory(hdfs.makeQualified(new Path("/"))); + + FileSystem local = FileSystem.getLocal(new Configuration()); + env.setValue(Constants.LOCAL_FS, local); + env.setProperty(Constants.HDFS_URL, hdfs.getUri().toString()); + + FSUtil.prompt(env); + + log(cmd, "Connected: " + hdfs.getUri()); + logv(cmd, "HDFS CWD: " + hdfs.getWorkingDirectory()); + logv(cmd, "Local CWD: " + local.getWorkingDirectory()); + + } catch (IOException e) { log(cmd, e.getMessage()); } } - + @Override + public Completer getCompleter() { + return this.completer; + } + + } diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsLs.java 
b/src/main/java/com/instanceone/hdfs/shell/command/HdfsLs.java deleted file mode 100644 index 43b40c7..0000000 --- a/src/main/java/com/instanceone/hdfs/shell/command/HdfsLs.java +++ /dev/null @@ -1,54 +0,0 @@ -// Copyright (c) 2012 P. Taylor Goetz (ptgoetz@gmail.com) - -package com.instanceone.hdfs.shell.command; - -import static com.instanceone.hdfs.shell.command.FSUtil.longFormat; -import static com.instanceone.hdfs.shell.command.FSUtil.shortFormat; - -import java.io.IOException; - -import jline.console.ConsoleReader; - -import org.apache.commons.cli.CommandLine; -import org.apache.commons.cli.Options; -import org.apache.hadoop.fs.FileStatus; -import org.apache.hadoop.fs.FileSystem; -import org.apache.hadoop.fs.Path; - -import com.instanceone.stemshell.Environment; - -public class HdfsLs extends HdfsCommand { - - public HdfsLs(String name) { - super(name); - } - - public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { - try { - FileSystem hdfs = (FileSystem) env.getValue(HDFS); - Path srcPath = cmd.getArgs().length == 0 ? hdfs - .getWorkingDirectory() : new Path( - hdfs.getWorkingDirectory(), cmd.getArgs()[0]); - FileStatus[] files = hdfs.listStatus(srcPath); - for (FileStatus file : files) { - if (cmd.hasOption("l")) { - log(cmd, longFormat(file)); - } - else { - log(cmd, shortFormat(file)); - } - } - } - catch (IOException e) { - log(cmd, e.getMessage()); - } - } - - @Override - public Options getOptions() { - Options opts = super.getOptions(); - opts.addOption("l", false, "show extended file attributes"); - return opts; - } - -} diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsPut.java b/src/main/java/com/instanceone/hdfs/shell/command/HdfsPut.java deleted file mode 100644 index 786756c..0000000 --- a/src/main/java/com/instanceone/hdfs/shell/command/HdfsPut.java +++ /dev/null @@ -1,81 +0,0 @@ -// Copyright (c) 2012 P. 
Taylor Goetz (ptgoetz@gmail.com) - -package com.instanceone.hdfs.shell.command; - -import java.io.File; -import java.io.FilenameFilter; -import java.io.IOException; -import java.util.ArrayList; -import java.util.regex.Pattern; - -import jline.console.ConsoleReader; -import jline.console.completer.ArgumentCompleter; -import jline.console.completer.Completer; - -import org.apache.commons.cli.CommandLine; -import org.apache.hadoop.fs.FileSystem; -import org.apache.hadoop.fs.Path; - -import com.instanceone.hdfs.shell.completers.FileSystemNameCompleter; -import com.instanceone.stemshell.Environment; - -public class HdfsPut extends HdfsCommand { - private Environment env; - - public HdfsPut(String name, Environment env) { - super(name); - this.env = env; - } - - public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { - - try { - - FileSystem hdfs = (FileSystem)env.getValue(HDFS); - FileSystem localfs = (FileSystem)env.getValue(LOCAL_FS); - String localFile = cmd.getArgs()[0]; - - - String localFileRegex = localFile.replaceAll("\\*", ".*"); - FilenameFilter regexFilter = new RegexFilenameFilter(localFileRegex); - File cwdFile = new File(localfs.getWorkingDirectory().toString().substring(5)); - File[] files = cwdFile.listFiles(regexFilter); - //Path[] filesToUpload = new Path[files.length]; - Path hdfsPath = cmd.getArgs().length > 1 ? 
new Path(hdfs.getWorkingDirectory(), cmd.getArgs()[1]) : hdfs.getWorkingDirectory(); - logv(cmd,"Remote path: " + hdfsPath); - for(int i = 0; i completers = new ArrayList(); - completers.add(new FileSystemNameCompleter(env, true)); - completers.add(new FileSystemNameCompleter(env, false)); - ArgumentCompleter completer = new ArgumentCompleter(completers); - - return completer; - } - - public static class RegexFilenameFilter implements FilenameFilter { - private String regex; - public RegexFilenameFilter(String regex){ - this.regex = regex; - } - public boolean accept(File dir, String name) { - return Pattern.matches(this.regex, name); - } - } - -} diff --git a/src/main/java/com/instanceone/hdfs/shell/command/HdfsPwd.java b/src/main/java/com/instanceone/hdfs/shell/command/HdfsPwd.java index e811363..0723be6 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/HdfsPwd.java +++ b/src/main/java/com/instanceone/hdfs/shell/command/HdfsPwd.java @@ -2,6 +2,7 @@ package com.instanceone.hdfs.shell.command; +import com.dstreev.hdfs.shell.command.Constants; import jline.console.ConsoleReader; import org.apache.commons.cli.CommandLine; @@ -17,14 +18,15 @@ public HdfsPwd(String name) { } public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { - FileSystem hdfs = (FileSystem) env.getValue(HDFS); + FileSystem hdfs = (FileSystem) env.getValue(Constants.HDFS); String wd = hdfs.getWorkingDirectory().toString(); if (cmd.hasOption("l")) { log(cmd, wd); } else { - log(cmd, wd.substring(env.getProperty(HDFS_URL).length())); + log(cmd, wd.substring(env.getProperty(Constants.HDFS_URL).length())); } + FSUtil.prompt(env); } diff --git a/src/main/java/com/instanceone/hdfs/shell/command/LocalCd.java b/src/main/java/com/instanceone/hdfs/shell/command/LocalCd.java index 4042547..9a1d412 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/LocalCd.java +++ b/src/main/java/com/instanceone/hdfs/shell/command/LocalCd.java @@ -4,6 +4,7 @@ import 
java.io.IOException; +import com.dstreev.hdfs.shell.command.Constants; import jline.console.ConsoleReader; import jline.console.completer.Completer; @@ -18,14 +19,14 @@ public class LocalCd extends HdfsCommand { private Environment env; public LocalCd(String name, Environment env) { - super(name); - this.env = env; + super(name,env); +// this.env = env; } public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { try { - FileSystem localfs = (FileSystem) env.getValue(LOCAL_FS); + FileSystem localfs = (FileSystem) env.getValue(Constants.LOCAL_FS); String dir = cmd.getArgs().length == 0 ? System .getProperty("user.home") : cmd.getArgs()[0]; logv(cmd, "Change Dir to: " + dir); @@ -46,6 +47,7 @@ public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { else { log(cmd, "No such directory: " + dir); } + FSUtil.prompt(env); } catch (IOException e) { log(cmd, e.getMessage()); diff --git a/src/main/java/com/instanceone/hdfs/shell/command/LocalLs.java b/src/main/java/com/instanceone/hdfs/shell/command/LocalLs.java index 99574f2..e15bc6a 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/LocalLs.java +++ b/src/main/java/com/instanceone/hdfs/shell/command/LocalLs.java @@ -7,6 +7,7 @@ import java.io.IOException; +import com.dstreev.hdfs.shell.command.Constants; import jline.console.ConsoleReader; import jline.console.completer.Completer; @@ -23,12 +24,12 @@ public class LocalLs extends HdfsCommand { private Environment env; public LocalLs(String name, Environment env) { - super(name); + super(name, env); } public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { try { - FileSystem localfs = (FileSystem)env.getValue(LOCAL_FS); + FileSystem localfs = (FileSystem)env.getValue(Constants.LOCAL_FS); Path srcPath = cmd.getArgs().length == 0 ? 
localfs.getWorkingDirectory() : new Path(localfs.getWorkingDirectory(), cmd.getArgs()[0]); FileStatus[] files = localfs.listStatus(srcPath); for (FileStatus file : files) { @@ -39,6 +40,7 @@ public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { log(cmd, shortFormat(file)); } } + FSUtil.prompt(env); } catch (IOException e) { log(cmd, e.getMessage()); diff --git a/src/main/java/com/instanceone/hdfs/shell/command/LocalPwd.java b/src/main/java/com/instanceone/hdfs/shell/command/LocalPwd.java index 0bcfd7e..a8d2124 100644 --- a/src/main/java/com/instanceone/hdfs/shell/command/LocalPwd.java +++ b/src/main/java/com/instanceone/hdfs/shell/command/LocalPwd.java @@ -2,6 +2,7 @@ package com.instanceone.hdfs.shell.command; +import com.dstreev.hdfs.shell.command.Constants; import jline.console.ConsoleReader; import org.apache.commons.cli.CommandLine; @@ -17,7 +18,7 @@ public LocalPwd(String name) { } public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { - FileSystem localfs = (FileSystem)env.getValue(LOCAL_FS); + FileSystem localfs = (FileSystem)env.getValue(Constants.LOCAL_FS); String wd = localfs.getWorkingDirectory().toString(); if (cmd.hasOption("l")) { @@ -27,6 +28,7 @@ public void execute(Environment env, CommandLine cmd, ConsoleReader reader) { // strip off prefix: "file:" log(cmd, wd.substring(5)); } + FSUtil.prompt(env); } @Override diff --git a/src/main/java/com/instanceone/hdfs/shell/completers/FileSystemNameCompleter.java b/src/main/java/com/instanceone/hdfs/shell/completers/FileSystemNameCompleter.java index e8ecf64..abd8dcb 100644 --- a/src/main/java/com/instanceone/hdfs/shell/completers/FileSystemNameCompleter.java +++ b/src/main/java/com/instanceone/hdfs/shell/completers/FileSystemNameCompleter.java @@ -3,13 +3,13 @@ import java.io.IOException; import java.util.List; +import com.dstreev.hdfs.shell.command.Constants; import jline.console.completer.Completer; import org.apache.hadoop.fs.FileStatus; import 
org.apache.hadoop.fs.FileSystem; import org.apache.hadoop.fs.Path; -import com.instanceone.hdfs.shell.command.HdfsCommand; import com.instanceone.stemshell.Environment; public class FileSystemNameCompleter implements Completer { @@ -36,11 +36,11 @@ public int complete(String buffer, final int cursor, String prefix; if (!this.local) { - fs = (FileSystem) env.getValue(HdfsCommand.HDFS); - prefix = env.getProperty(HdfsCommand.HDFS_URL); + fs = (FileSystem) env.getValue(Constants.HDFS); + prefix = env.getProperty(Constants.HDFS_URL); } else { - fs = (FileSystem) env.getValue(HdfsCommand.LOCAL_FS); + fs = (FileSystem) env.getValue(Constants.LOCAL_FS); prefix = "file:" + (buffer != null && buffer.startsWith("/") ? "/" : ""); } if(fs == null){ diff --git a/src/main/resources/banner.txt b/src/main/resources/banner.txt index d47fc73..3d9d62e 100644 --- a/src/main/resources/banner.txt +++ b/src/main/resources/banner.txt @@ -21,6 +21,6 @@ _ _ ___ ___ ___ ___ _ ___ | || | \| __/ __|___ / __| | |_ _| | __ | |) | _|\__ \___| (__| |__ | | - |_||_|___/|_| |___/ \___|____|___| + |_||_|___/|_| |___/ \___|____|___| v2.3.1-SNAPSHOT -------------------------------------------------------------------- diff --git a/src/test/java/com/dstreev/hadoop/util/DirectoryReaderTest.java b/src/test/java/com/dstreev/hadoop/util/DirectoryReaderTest.java new file mode 100644 index 0000000..3dcf97c --- /dev/null +++ b/src/test/java/com/dstreev/hadoop/util/DirectoryReaderTest.java @@ -0,0 +1,7 @@ +package com.dstreev.hadoop.util; + +/** + * Created by dstreev on 2016-02-15. 
+ */ +public class DirectoryReaderTest { +} diff --git a/src/test/java/com/dstreev/hadoop/util/NamenodeParserTest.java b/src/test/java/com/dstreev/hadoop/util/NamenodeParserTest.java new file mode 100644 index 0000000..c871b22 --- /dev/null +++ b/src/test/java/com/dstreev/hadoop/util/NamenodeParserTest.java @@ -0,0 +1,132 @@ +package com.dstreev.hadoop.util; + +import org.junit.Test; + +import java.util.List; + +import static org.junit.Assert.assertFalse; +import static org.junit.Assert.assertTrue; + +/** + * Created by dstreev on 2016-03-21. + */ +public class NamenodeParserTest { + + @Test + public void getTopUserOpsString() { + try { + NamenodeJmxParser njp = new NamenodeJmxParser("nn_2.3.5.1_active.json"); + List<String> userOpsList = njp.getTopUserOpRecords(); + for (String userOps : userOpsList) { + System.out.println(userOps); + } +// assertTrue(njp.getTopUserOpRecords() != null); + } catch (Exception e) { + e.printStackTrace(); + assertTrue(false); + } + } + + @Test + public void getFSState() { + try { + NamenodeJmxParser njp = new NamenodeJmxParser("nn_2.3.5.1_active.json"); + String fsState = njp.getFSState(); + System.out.println(fsState); + } catch (Exception e) { + e.printStackTrace(); + assertTrue(false); + } + } + + @Test + public void getNamenodeInfo() { + try { + NamenodeJmxParser njp = new NamenodeJmxParser("nn_2.3.5.1_active.json"); + String nnInfo = njp.getNamenodeInfo(); + System.out.println(nnInfo); + } catch (Exception e) { + e.printStackTrace(); + assertTrue(false); + } + } + + @Test + public void getTopUserOpsStringStandby() { + try { + NamenodeJmxParser njp = new NamenodeJmxParser("nn_2.3.5.1_standby.json"); + List<String> userOpsList = njp.getTopUserOpRecords(); + for (String userOps : userOpsList) { + System.out.println(userOps); + } +// assertTrue(njp.getTopUserOpRecords() != null); + } catch (Exception e) { + e.printStackTrace(); + assertTrue(false); + } + } + + @Test + public void getFSStateStandby() { + try { + NamenodeJmxParser njp = new
NamenodeJmxParser("nn_2.3.5.1_standby.json"); + String fsState = njp.getFSState(); + System.out.println(fsState); + } catch (Exception e) { + e.printStackTrace(); + assertTrue(false); + } + } + + @Test + public void getNamenodeInfoStandby() { + try { + NamenodeJmxParser njp = new NamenodeJmxParser("nn_2.3.5.1_standby.json"); + String nnInfo = njp.getNamenodeInfo(); + System.out.println(nnInfo); + } catch (Exception e) { + e.printStackTrace(); + assertTrue(false); + } + } + + @Test + public void getTopUserOpsStringStandalone() { + try { + NamenodeJmxParser njp = new NamenodeJmxParser("nn_2.3.2.0_standalone.json"); + List<String> userOpsList = njp.getTopUserOpRecords(); + for (String userOps : userOpsList) { + System.out.println(userOps); + } +// assertTrue(njp.getTopUserOpRecords() != null); + } catch (Exception e) { + e.printStackTrace(); + assertTrue(false); + } + } + + @Test + public void getFSStateStandalone() { + try { + NamenodeJmxParser njp = new NamenodeJmxParser("nn_2.3.2.0_standalone.json"); + String fsState = njp.getFSState(); + System.out.println(fsState); + } catch (Exception e) { + e.printStackTrace(); + assertTrue(false); + } + } + + @Test + public void getNamenodeInfoStandalone() { + try { + NamenodeJmxParser njp = new NamenodeJmxParser("nn_2.3.2.0_standalone.json"); + String nnInfo = njp.getNamenodeInfo(); + System.out.println(nnInfo); + } catch (Exception e) { + e.printStackTrace(); + assertTrue(false); + } + } + +} diff --git a/src/test/resources/nn_2.3.5.1_active.json b/src/test/resources/nn_2.3.5.1_active.json new file mode 100644 index 0000000..6fc5d8c --- /dev/null +++ b/src/test/resources/nn_2.3.5.1_active.json @@ -0,0 +1,1282 @@ +{ + "beans" : [ { + "name" : "Hadoop:service=NameNode,name=JvmMetrics", + "modelerType" : "JvmMetrics", + "tag.Context" : "jvm", + "tag.ProcessName" : "NameNode", + "tag.SessionId" : null, + "tag.Hostname" : "m2.hdp.local", + "MemNonHeapUsedM" : 107.579124, + "MemNonHeapCommittedM" : 109.66797, + "MemNonHeapMaxM" : -1.0, +
"MemHeapUsedM" : 257.37604, + "MemHeapCommittedM" : 1011.25, + "MemHeapMaxM" : 1011.25, + "MemMaxM" : 1011.25, + "GcCountParNew" : 770, + "GcTimeMillisParNew" : 12062, + "GcCountConcurrentMarkSweep" : 2, + "GcTimeMillisConcurrentMarkSweep" : 82, + "GcCount" : 772, + "GcTimeMillis" : 12144, + "GcNumWarnThresholdExceeded" : 0, + "GcNumInfoThresholdExceeded" : 0, + "GcTotalExtraSleepTime" : 1540, + "ThreadsNew" : 0, + "ThreadsRunnable" : 9, + "ThreadsBlocked" : 0, + "ThreadsWaiting" : 10, + "ThreadsTimedWaiting" : 132, + "ThreadsTerminated" : 0, + "LogFatal" : 0, + "LogError" : 5, + "LogWarn" : 60, + "LogInfo" : 260936 + }, { + "name" : "JMImplementation:type=MBeanServerDelegate", + "modelerType" : "javax.management.MBeanServerDelegate", + "MBeanServerId" : "m2.hdp.local_1458337616064", + "SpecificationName" : "Java Management Extensions", + "SpecificationVersion" : "1.4", + "SpecificationVendor" : "Oracle Corporation", + "ImplementationName" : "JMX", + "ImplementationVersion" : "1.8.0_73-b02", + "ImplementationVendor" : "Oracle Corporation" + }, { + "name" : "java.lang:type=Runtime", + "modelerType" : "sun.management.RuntimeImpl", + "BootClassPath" : "/usr/java/jdk1.8.0_73/jre/lib/resources.jar:/usr/java/jdk1.8.0_73/jre/lib/rt.jar:/usr/java/jdk1.8.0_73/jre/lib/sunrsasign.jar:/usr/java/jdk1.8.0_73/jre/lib/jsse.jar:/usr/java/jdk1.8.0_73/jre/lib/jce.jar:/usr/java/jdk1.8.0_73/jre/lib/charsets.jar:/usr/java/jdk1.8.0_73/jre/lib/jfr.jar:/usr/java/jdk1.8.0_73/jre/classes", + "LibraryPath" : ":/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native:/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native", + "Uptime" : 230167455, + "VmName" : "Java HotSpot(TM) 64-Bit Server VM", + "VmVendor" : "Oracle Corporation", + "VmVersion" : "25.73-b02", + "BootClassPathSupported" : true, + "InputArguments" : [ "-Dproc_namenode", "-Xmx1024m", "-Dhdp.version=2.3.5.1-68", "-Djava.net.preferIPv4Stack=true", 
"-Dhdp.version=", "-Djava.net.preferIPv4Stack=true", "-Dhdp.version=", "-Djava.net.preferIPv4Stack=true", "-Dhadoop.log.dir=/var/log/hadoop/hdfs", "-Dhadoop.log.file=hadoop.log", "-Dhadoop.home.dir=/usr/hdp/2.3.5.1-68/hadoop", "-Dhadoop.id.str=hdfs", "-Dhadoop.root.logger=INFO,console", "-Djava.library.path=:/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native", "-Dhadoop.policy.file=hadoop-policy.xml", "-Djava.net.preferIPv4Stack=true", "-Dhdp.version=2.3.5.1-68", "-Dhadoop.log.dir=/var/log/hadoop/hdfs", "-Dhadoop.log.file=hadoop-hdfs-namenode-m2.hdp.local.log", "-Dhadoop.home.dir=/usr/hdp/2.3.5.1-68/hadoop", "-Dhadoop.id.str=hdfs", "-Dhadoop.root.logger=INFO,RFA", "-Djava.library.path=:/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native:/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native", "-Dhadoop.policy.file=hadoop-policy.xml", "-Djava.net.preferIPv4Stack=true", "-XX:ParallelGCThreads=8", "-XX:+UseConcMarkSweepGC", "-XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log", "-XX:NewSize=128m", "-XX:MaxNewSize=128m", "-Xloggc:/var/log/hadoop/hdfs/gc.log-201603181746", "-verbose:gc", "-XX:+PrintGCDetails", "-XX:+PrintGCTimeStamps", "-XX:+PrintGCDateStamps", "-Xms1024m", "-Xmx1024m", "-Dhadoop.security.logger=INFO,DRFAS", "-Dhdfs.audit.logger=INFO,DRFAAUDIT", "-XX:OnOutOfMemoryError=\"/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node\"", "-Dorg.mortbay.jetty.Request.maxFormContentSize=-1", "-XX:ParallelGCThreads=8", "-XX:+UseConcMarkSweepGC", "-XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log", "-XX:NewSize=128m", "-XX:MaxNewSize=128m", "-Xloggc:/var/log/hadoop/hdfs/gc.log-201603181746", "-verbose:gc", "-XX:+PrintGCDetails", "-XX:+PrintGCTimeStamps", "-XX:+PrintGCDateStamps", "-Xms1024m", "-Xmx1024m", "-Dhadoop.security.logger=INFO,DRFAS", "-Dhdfs.audit.logger=INFO,DRFAAUDIT", 
"-XX:OnOutOfMemoryError=\"/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node\"", "-Dorg.mortbay.jetty.Request.maxFormContentSize=-1", "-XX:ParallelGCThreads=8", "-XX:+UseConcMarkSweepGC", "-XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log", "-XX:NewSize=128m", "-XX:MaxNewSize=128m", "-Xloggc:/var/log/hadoop/hdfs/gc.log-201603181746", "-verbose:gc", "-XX:+PrintGCDetails", "-XX:+PrintGCTimeStamps", "-XX:+PrintGCDateStamps", "-Xms1024m", "-Xmx1024m", "-Dhadoop.security.logger=INFO,DRFAS", "-Dhdfs.audit.logger=INFO,DRFAAUDIT", "-XX:OnOutOfMemoryError=\"/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node\"", "-Dorg.mortbay.jetty.Request.maxFormContentSize=-1", "-Dhadoop.security.logger=INFO,RFAS" ], + "ManagementSpecVersion" : "1.2", + "SpecName" : "Java Virtual Machine Specification", + "SpecVendor" : "Oracle Corporation", + "SpecVersion" : "1.8", + "SystemProperties" : [ { + "key" : "awt.toolkit", + "value" : "sun.awt.X11.XToolkit" + }, { + "key" : "file.encoding.pkg", + "value" : "sun.io" + }, { + "key" : "java.specification.version", + "value" : "1.8" + }, { + "key" : "sun.cpu.isalist", + "value" : "" + }, { + "key" : "sun.jnu.encoding", + "value" : "UTF-8" + }, { + "key" : "java.class.path", + "value" : 
"/usr/hdp/current/hadoop-client/conf:/usr/hdp/2.3.5.1-68/hadoop/lib/ojdbc6.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-hdfs-plugin-shim-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-plugin-classloader-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-yarn-plugin-shim-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/azure-storage-2.2.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-
68/hadoop/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/zookeeper-3.4.6.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68
/hadoop/.//hadoop-annotations-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-auth-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-auth.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-aws.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-azure.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-nfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/./:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/hdp/2.3.5.1-68/
hadoop-hdfs/lib/okhttp-2.4.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/okio-1.4.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-nfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/objenesis-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/l
ib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/zookeeper-3.4.6.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/fst-2.24.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/zookeeper-3.4.6.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson
-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/javassist-3.18.1-GA.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-api-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-client-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-registry-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar
:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-tests-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-ma
preduce/lib/junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-streaming-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.3.5.1-68/hadoop-
mapreduce/.//commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop
-mapreduce-client-jobclient.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-ant-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//joda-time-2.9.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-archives-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-auth-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-openstack-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-datajoin-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-distcp-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-rumen-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-extras-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-gridmix-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.7.1.2.3
.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-sls-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//zookeeper-3.4.6.2.3.5.1-68.jar:::/usr/share/java/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-hdp-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-api-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-common-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-dag-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-examples-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-ext-service-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-history-parser-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-javadoc-tools-0
.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-job-analyzer-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-mapreduce-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-internals-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-library-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-cache-plugin-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-acls-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-fs-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/async-http-client-1.8.16.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/tez/lib/slf4j-api-1.7
.10.jar:/usr/hdp/2.3.5.1-68/tez/conf::/usr/share/java/mysql-connector-java.jar::/usr/share/java/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-hdp-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-api-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-common-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-dag-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-examples-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-ext-service-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-history-parser-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-javadoc-tools-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-job-analyzer-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-mapreduce-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-internals-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-library-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-cache-plugin-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-acls-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-fs-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/async-http-client-1.8.16.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.
1-68/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/tez/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.3.5.1-68/tez/conf"
+ }, {
+ "key" : "java.vm.vendor",
+ "value" : "Oracle Corporation"
+ }, {
+ "key" : "sun.arch.data.model",
+ "value" : "64"
+ }, {
+ "key" : "java.vendor.url",
+ "value" : "http://java.oracle.com/"
+ }, {
+ "key" : "user.timezone",
+ "value" : "America/New_York"
+ }, {
+ "key" : "os.name",
+ "value" : "Linux"
+ }, {
+ "key" : "java.vm.specification.version",
+ "value" : "1.8"
+ }, {
+ "key" : "user.country",
+ "value" : "US"
+ }, {
+ "key" : "sun.java.launcher",
+ "value" : "SUN_STANDARD"
+ }, {
+ "key" : "sun.boot.library.path",
+ "value" : "/usr/java/jdk1.8.0_73/jre/lib/amd64"
+ }, {
+ "key" : "sun.java.command",
+ "value" : "org.apache.hadoop.hdfs.server.namenode.NameNode"
+ }, {
+ "key" : "hdfs.audit.logger",
+ "value" : "INFO,DRFAAUDIT"
+ }, {
+ "key" : "sun.cpu.endian",
+ "value" : "little"
+ }, {
+ "key" : "user.home",
+ "value" : "/home/hdfs"
+ }, {
+ "key" : "user.language",
+ "value" : "en"
+ }, {
+ "key" : "org.mortbay.jetty.Request.maxFormContentSize",
+ "value" : "-1"
+ }, {
+ "key" : "java.specification.vendor",
+ "value" : "Oracle Corporation"
+ }, {
+ "key" : "hdp.version",
+ "value" : "2.3.5.1-68"
+ }, {
+ "key" : "java.home",
+ "value" : "/usr/java/jdk1.8.0_73/jre"
+ }, {
+ "key" : "file.separator",
+ "value" : "/"
+ }, {
+ "key" : "line.separator",
+ "value" : "\n"
+ }, {
+ "key" : "java.vm.specification.vendor",
+ "value" : "Oracle Corporation"
+ }, {
+ "key" : "java.specification.name",
+ "value" : "Java Platform API Specification"
+ }, {
+ "key" : "java.awt.graphicsenv",
+ "value" : "sun.awt.X11GraphicsEnvironment"
+ }, {
+ "key" : "hadoop.security.logger",
+ "value" : "INFO,RFAS"
+ }, {
+ "key" : "hadoop.log.dir",
+ "value" : "/var/log/hadoop/hdfs"
+ }, {
+ "key" : "sun.boot.class.path",
+ "value" : "/usr/java/jdk1.8.0_73/jre/lib/resources.jar:/usr/java/jdk1.8.0_73/jre/lib/rt.jar:/usr/java/jdk1.8.0_73/jre/lib/sunrsasign.jar:/usr/java/jdk1.8.0_73/jre/lib/jsse.jar:/usr/java/jdk1.8.0_73/jre/lib/jce.jar:/usr/java/jdk1.8.0_73/jre/lib/charsets.jar:/usr/java/jdk1.8.0_73/jre/lib/jfr.jar:/usr/java/jdk1.8.0_73/jre/classes"
+ }, {
+ "key" : "sun.management.compiler",
+ "value" : "HotSpot 64-Bit Tiered Compilers"
+ }, {
+ "key" : "java.runtime.version",
+ "value" : "1.8.0_73-b02"
+ }, {
+ "key" : "hadoop.log.file",
+ "value" : "hadoop-hdfs-namenode-m2.hdp.local.log"
+ }, {
+ "key" : "java.net.preferIPv4Stack",
+ "value" : "true"
+ }, {
+ "key" : "user.name",
+ "value" : "hdfs"
+ }, {
+ "key" : "hadoop.home.dir",
+ "value" : "/usr/hdp/2.3.5.1-68/hadoop"
+ }, {
+ "key" : "path.separator",
+ "value" : ":"
+ }, {
+ "key" : "hadoop.id.str",
+ "value" : "hdfs"
+ }, {
+ "key" : "os.version",
+ "value" : "3.10.0-229.el7.x86_64"
+ }, {
+ "key" : "java.endorsed.dirs",
+ "value" : "/usr/java/jdk1.8.0_73/jre/lib/endorsed"
+ }, {
+ "key" : "java.runtime.name",
+ "value" : "Java(TM) SE Runtime Environment"
+ }, {
+ "key" : "hadoop.root.logger",
+ "value" : "INFO,RFA"
+ }, {
+ "key" : "file.encoding",
+ "value" : "UTF-8"
+ }, {
+ "key" : "java.vm.name",
+ "value" : "Java HotSpot(TM) 64-Bit Server VM"
+ }, {
+ "key" : "java.vendor.url.bug",
+ "value" : "http://bugreport.sun.com/bugreport/"
+ }, {
+ "key" : "java.io.tmpdir",
+ "value" : "/tmp"
+ }, {
+ "key" : "java.version",
+ "value" : "1.8.0_73"
+ }, {
+ "key" : "user.dir",
+ "value" : "/usr/hdp/2.3.5.1-68/hadoop"
+ }, {
+ "key" : "os.arch",
+ "value" : "amd64"
+ }, {
+ "key" : "java.vm.specification.name",
+ "value" : "Java Virtual Machine Specification"
+ }, {
+ "key" : "java.awt.printerjob",
+ "value" : "sun.print.PSPrinterJob"
+ }, {
+ "key" : "hadoop.policy.file",
+ "value" : "hadoop-policy.xml"
+ }, {
+ "key" : "sun.os.patch.level",
+ "value" : "unknown"
+ }, {
+ "key" : "proc_namenode",
+ "value" : ""
+ }, {
+ "key" : "java.library.path",
+ "value" : ":/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native:/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native"
+ }, {
+ "key" : "java.vm.info",
+ "value" : "mixed mode"
+ }, {
+ "key" : "java.vendor",
+ "value" : "Oracle Corporation"
+ }, {
+ "key" : "java.vm.version",
+ "value" : "25.73-b02"
+ }, {
+ "key" : "java.ext.dirs",
+ "value" : "/usr/java/jdk1.8.0_73/jre/lib/ext:/usr/java/packages/lib/ext"
+ }, {
+ "key" : "sun.io.unicode.encoding",
+ "value" : "UnicodeLittle"
+ }, {
+ "key" : "java.class.version",
+ "value" : "52.0"
+ } ],
+ "StartTime" : 1458337614944,
+ "Name" : "11624@m2.hdp.local",
+ "ClassPath" :
"/usr/hdp/current/hadoop-client/conf:/usr/hdp/2.3.5.1-68/hadoop/lib/ojdbc6.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-hdfs-plugin-shim-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-plugin-classloader-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-yarn-plugin-shim-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/azure-storage-2.2.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-
68/hadoop/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/zookeeper-3.4.6.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68
/hadoop/.//hadoop-annotations-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-auth-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-auth.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-aws.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-azure.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-nfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/./:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/hdp/2.3.5.1-68/
hadoop-hdfs/lib/okhttp-2.4.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/okio-1.4.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-nfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/objenesis-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/l
ib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/zookeeper-3.4.6.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/fst-2.24.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/zookeeper-3.4.6.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson
-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/javassist-3.18.1-GA.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-api-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-client-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-registry-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar
:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-tests-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-ma
preduce/lib/junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-streaming-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.3.5.1-68/hadoop-
mapreduce/.//commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop
-mapreduce-client-jobclient.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-ant-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//joda-time-2.9.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-archives-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-auth-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-openstack-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-datajoin-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-distcp-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-rumen-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-extras-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-gridmix-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.7.1.2.3
.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-sls-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//zookeeper-3.4.6.2.3.5.1-68.jar:::/usr/share/java/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-hdp-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-api-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-common-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-dag-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-examples-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-ext-service-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-history-parser-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-javadoc-tools-0
.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-job-analyzer-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-mapreduce-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-internals-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-library-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-cache-plugin-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-acls-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-fs-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/async-http-client-1.8.16.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/tez/lib/slf4j-api-1.7
.10.jar:/usr/hdp/2.3.5.1-68/tez/conf::/usr/share/java/mysql-connector-java.jar::/usr/share/java/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-hdp-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-api-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-common-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-dag-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-examples-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-ext-service-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-history-parser-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-javadoc-tools-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-job-analyzer-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-mapreduce-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-internals-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-library-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-cache-plugin-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-acls-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-fs-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/async-http-client-1.8.16.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.
1-68/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/tez/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.3.5.1-68/tez/conf",
+ "ObjectName" : "java.lang:type=Runtime"
+ }, {
+ "name" : "Hadoop:service=NameNode,name=StartupProgress",
+ "modelerType" : "StartupProgress",
+ "tag.Hostname" : "m2.hdp.local",
+ "ElapsedTime" : 55411,
+ "PercentComplete" : 1.0,
+ "LoadingFsImageCount" : 0,
+ "LoadingFsImageElapsedTime" : 902,
+ "LoadingFsImageTotal" : 0,
+ "LoadingFsImagePercentComplete" : 1.0,
+ "LoadingEditsCount" : 781,
+ "LoadingEditsElapsedTime" : 81,
+ "LoadingEditsTotal" : 781,
+ "LoadingEditsPercentComplete" : 1.0,
+ "SavingCheckpointCount" : 0,
+ "SavingCheckpointElapsedTime" : 0,
+ "SavingCheckpointTotal" : 0,
+ "SavingCheckpointPercentComplete" : 1.0,
+ "SafeModeCount" : 7732,
+ "SafeModeElapsedTime" : 48581,
+ "SafeModeTotal" : 7741,
+ "SafeModePercentComplete" : 1.0
+ }, {
+ "name" : "java.lang:type=Threading",
+ "modelerType" : "sun.management.ThreadImpl",
+ "ThreadAllocatedMemoryEnabled" : true,
+ "ThreadAllocatedMemorySupported" : true,
+ "DaemonThreadCount" : 146,
+ "PeakThreadCount" : 157,
+ "CurrentThreadCpuTimeSupported" : true,
+ "ObjectMonitorUsageSupported" : true,
+ "SynchronizerUsageSupported" : true,
+ "ThreadContentionMonitoringSupported" : true,
+ "ThreadCpuTimeEnabled" : true,
+ "CurrentThreadCpuTime" : 144352306,
+ "CurrentThreadUserTime" : 130000000,
+ "TotalStartedThreadCount" : 30834,
+ "ThreadCpuTimeSupported" : true,
+ "ThreadCount" : 151,
+ "ThreadContentionMonitoringEnabled" : false,
+ "AllThreadIds" : [ 30844, 30843, 30842, 30841, 30840, 30833, 30829, 30828, 30813, 30794, 23342, 23272, 200, 196, 194, 193, 192, 191, 190, 189, 178, 177, 176, 159, 158, 157, 156, 155, 154, 153, 152, 151, 150, 149, 148, 147, 146, 145, 144, 143, 142, 141, 140, 139, 138, 137, 136, 135, 134, 133, 132, 131, 130, 129, 128, 127, 126, 125, 124, 123, 122, 121, 120, 119, 118, 117, 116, 115, 114, 113, 112, 111, 110, 109, 108, 107, 106, 105, 104, 103, 102, 101, 100, 99, 98, 97, 96, 95, 94, 93, 92, 91, 90, 89, 88, 87, 86, 85, 84, 83, 82, 81, 80, 79, 78, 77, 76, 75, 74, 73, 72, 71, 70, 69, 68, 67, 66, 65, 64, 63, 62, 61, 60, 59, 42, 45, 58, 56, 55, 54, 53, 52, 51, 49, 24, 25, 48, 47, 44, 43, 23, 22, 21, 19, 17, 16, 15, 5, 3, 2, 1 ],
+ "ObjectName" : "java.lang:type=Threading"
+ }, {
+ "name" : "java.lang:type=OperatingSystem",
+ "modelerType" : "sun.management.OperatingSystemImpl",
+ "MaxFileDescriptorCount" : 128000,
+ "OpenFileDescriptorCount" : 442,
+ "CommittedVirtualMemorySize" : 3212779520,
+ "FreePhysicalMemorySize" : 2036277248,
+ "FreeSwapSpaceSize" : 8321495040,
+ "ProcessCpuLoad" : 0.00782472613458529,
+ "ProcessCpuTime" : 3930550000000,
+ "SystemCpuLoad" : 0.025039123630672927,
+ "TotalPhysicalMemorySize" : 16472367104,
+ "TotalSwapSpaceSize" : 8321495040,
+ "AvailableProcessors" : 4,
+ "Arch" : "amd64",
+ "SystemLoadAverage" : 0.27,
+ "Version" : "3.10.0-229.el7.x86_64",
+ "Name" : "Linux",
+ "ObjectName" : "java.lang:type=OperatingSystem"
+ }, {
+ "name" : "Hadoop:service=NameNode,name=FSNamesystem",
+ "modelerType" : "FSNamesystem",
+ "tag.Context" : "dfs",
+ "tag.HAState" : "active",
+ "tag.TotalSyncTimes" : "33 38 ",
+ "tag.Hostname" : "m2.hdp.local",
+ "MissingBlocks" : 0,
+ "MissingReplOneBlocks" : 0,
+ "ExpiredHeartbeats" : 0,
+ "TransactionsSinceLastCheckpoint" : 637,
+ "TransactionsSinceLastLogRoll" : 4,
+ "LastWrittenTransactionId" : 2096709,
+ "LastCheckpointTime" : 1458553696500,
+ "CapacityTotal" : 1254254346240,
+ "CapacityTotalGB" : 1168.0,
+ "CapacityUsed" : 43518906368,
+ "CapacityUsedGB" : 41.0,
+ "CapacityRemaining" : 1203736371256,
+ "CapacityRemainingGB" : 1121.0,
+ "CapacityUsedNonDFS" : 6999068616,
+ "TotalLoad" : 36,
+ "SnapshottableDirectories" : 4,
+ "Snapshots" : 8,
+ "LockQueueLength" : 0,
+ "BlocksTotal" : 7813,
+ "NumFilesUnderConstruction" : 7,
+ "NumActiveClients" : 6,
+ "FilesTotal" : 9555,
+ "PendingReplicationBlocks" : 0,
+ "UnderReplicatedBlocks" : 4,
+ "CorruptBlocks" : 0,
+ "ScheduledReplicationBlocks" : 0,
+ "PendingDeletionBlocks" : 0,
+ "ExcessBlocks" : 0,
+ "PostponedMisreplicatedBlocks" : 0,
+ "PendingDataNodeMessageCount" : 0,
+ "MillisSinceLastLoadedEdits" : 0,
+ "BlockCapacity" : 2097152,
+ "StaleDataNodes" : 0,
+ "TotalFiles" : 9555,
+ "TotalSyncCount" : 4
+ }, {
+ "name" : "java.lang:type=MemoryPool,name=Code Cache",
+ "modelerType" : "sun.management.MemoryPoolImpl",
+ "Valid" : true,
+ "CollectionUsage" : null,
+ "MemoryManagerNames" : [ "CodeCacheManager" ],
+ "PeakUsage" : {
+ "committed" : 43319296,
+ "init" : 2555904,
+ "max" : 251658240,
+ "used" : 42902784
+ },
+ "Usage" : {
+ "committed" : 43319296,
+ "init" : 2555904,
+ "max" : 251658240,
+ "used" : 42902784
+ },
+ "UsageThreshold" : 0,
+ "UsageThresholdCount" : 0,
+ "CollectionUsageThresholdSupported" : false,
+ "UsageThresholdExceeded" : false,
+ "UsageThresholdSupported" : true,
+ "Name" : "Code Cache",
+ "Type" : "NON_HEAP",
+ "ObjectName" : "java.lang:type=MemoryPool,name=Code Cache"
+ }, {
+ "name" : "Hadoop:service=NameNode,name=IPCLoggerChannel-10.0.0.161-8485",
+ "modelerType" : "IPCLoggerChannel-10.0.0.161-8485",
+ "tag.Context" : "dfs",
+ "tag.IsOutOfSync" : "false",
+ "tag.Hostname" : "m2.hdp.local",
+ "CurrentLagTxns" : 0,
+ "QueuedEditsSize" : 0,
+ "LagTimeMillis" : 5
+ }, {
+ "name" : "java.nio:type=BufferPool,name=direct",
+ "modelerType" : "sun.management.ManagementFactoryHelper$1",
+ "MemoryUsed" : 8057023,
+ "TotalCapacity" : 8057023,
+ "Count" : 272,
+ "Name" : "direct",
+ "ObjectName" : "java.nio:type=BufferPool,name=direct"
+ }, {
+ "name" : "java.lang:type=Compilation",
+ "modelerType" : "sun.management.CompilationImpl",
+ "CompilationTimeMonitoringSupported" : true,
+ "TotalCompilationTime" : 118865,
+ "Name" : "HotSpot 64-Bit Tiered Compilers",
+ "ObjectName" : "java.lang:type=Compilation"
+ }, {
+ "name" : "java.lang:type=MemoryManager,name=CodeCacheManager",
+ "modelerType" : "sun.management.MemoryManagerImpl",
+ "Valid" : true,
+ "MemoryPoolNames" : [ "Code Cache" ],
+ "Name" : "CodeCacheManager",
+ "ObjectName" : "java.lang:type=MemoryManager,name=CodeCacheManager"
+ }, {
+ "name" : "java.util.logging:type=Logging",
+ "modelerType" : "sun.management.ManagementFactoryHelper$PlatformLoggingImpl",
+ "ObjectName" : "java.util.logging:type=Logging",
+ "LoggerNames" : [ "javax.management.snmp.daemon", "com.sun.jersey.api.client.ClientResponse", "com.sun.jersey.api.client.RequestWriter", "javax.management.notification", "com.google.common.cache.CacheBuilder", "com.google.common.util.concurrent.ExecutionList", "javax.management.misc", "javax.management", "javax.management.relation", "com.google.common.collect.MapMakerInternalMap", "com.google.common.util.concurrent.UncaughtExceptionHandlers$Exiter", "com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider", "sun.net.www.protocol.http.HttpURLConnection", "com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider", "com.sun.jersey.core.spi.component.ProviderServices", "com.sun.jersey.api.client.Client", "global", "com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider", "javax.management.snmp", "javax.management.mbeanserver", "com.sun.jersey.core.impl.provider.entity.EntityHolderReader", "com.sun.jersey.spi.service.ServiceFinder", "com.sun.jersey.spi.inject.Errors", "javax.management.modelmbean", "com.sun.jersey.core.spi.component.ProviderFactory", "javax.management.mlet", "javax.management.timer", "com.google.common.cache.LocalCache", "", "javax.management.monitor" ]
+ }, {
+ "name" : "java.lang:type=ClassLoading",
+ "modelerType" : "sun.management.ClassLoadingImpl",
+ "TotalLoadedClassCount" : 8353,
+ "Verbose" : false,
+ "LoadedClassCount" : 8353,
+ "UnloadedClassCount" : 0,
+ "ObjectName" : "java.lang:type=ClassLoading"
+ }, {
+ "name" : "java.lang:type=MemoryManager,name=Metaspace Manager",
+ "modelerType" : "sun.management.MemoryManagerImpl",
+ "Valid" : true,
+ "MemoryPoolNames" : [ "Metaspace", "Compressed Class Space" ],
+ "Name" : "Metaspace Manager",
+ "ObjectName" : "java.lang:type=MemoryManager,name=Metaspace Manager"
+ }, {
+ "name" : "Hadoop:service=NameNode,name=RpcDetailedActivityForPort8020",
+ "modelerType" : "RpcDetailedActivityForPort8020",
+ "tag.port" : "8020",
+ "tag.Context" : "rpcdetailed",
+ "tag.Hostname" : "m2.hdp.local",
+ "GetServiceStatusNumOps" : 229749,
+ "GetServiceStatusAvgTime" : 0.16666666666666669,
+ "MonitorHealthNumOps" : 229743,
+ "MonitorHealthAvgTime" : 0.16666666666666669,
+ "SendHeartbeatNumOps" : 230148,
+ "SendHeartbeatAvgTime" : 0.0,
+ "TransitionToStandbyNumOps" : 2,
+ "TransitionToStandbyAvgTime" : 0.0,
+ "VersionRequestNumOps" : 3,
+ "VersionRequestAvgTime" : 8.0,
+ "RegisterDatanodeNumOps" : 3,
+ "RegisterDatanodeAvgTime" : 55.0,
+ "StandbyExceptionNumOps" : 1,
+ "StandbyExceptionAvgTime" : 0.0,
+ "BlockReportNumOps" : 35,
+ "BlockReportAvgTime" : 5.0,
+ "TransitionToActiveNumOps" : 1,
+ "TransitionToActiveAvgTime" : 1584.0,
+ "RenewLeaseNumOps" : 46007,
+ "RenewLeaseAvgTime" : 0.0,
+ "GetListingNumOps" : 76097,
+ "GetListingAvgTime" : 0.47058823529411764,
+ "GetFileInfoNumOps" : 17067,
+ "GetFileInfoAvgTime" : 0.14285714285714288,
+ "MkdirsNumOps" : 2559,
+ "MkdirsAvgTime" : 12.5,
+ "CreateNumOps" : 323,
+ "CreateAvgTime" : 17.8,
+ "DeleteNumOps" : 1534,
+ "DeleteAvgTime" : 11.0,
+ "RollEditLogNumOps" : 1912,
+ "RollEditLogAvgTime" : 167.0,
+ "AddBlockNumOps" : 323,
+ "AddBlockAvgTime" : 12.2,
+ "GetServerDefaultsNumOps" : 259,
+ "GetServerDefaultsAvgTime" : 0.0,
+ "BlockReceivedAndDeletedNumOps" : 977,
+ "BlockReceivedAndDeletedAvgTime" : 0.13333333333333336,
+ "CompleteNumOps" : 319,
+ "CompleteAvgTime" : 11.2,
+ "FsyncNumOps" : 256,
+ "FsyncAvgTime" : 10.5,
+ "SetTimesNumOps" : 256,
+ "SetTimesAvgTime" : 11.25,
+ "RenameNumOps" : 256,
+ "RenameAvgTime" : 11.5,
+ "CommitBlockSynchronizationNumOps" : 1,
+ "CommitBlockSynchronizationAvgTime" : 13.0
+ }, {
+ "name" : "Hadoop:service=NameNode,name=MetricsSystem,sub=Stats",
+ "modelerType" : "MetricsSystem,sub=Stats",
+ "tag.Context" : "metricssystem",
+ "tag.Hostname" : "m2.hdp.local",
+ "NumActiveSources" : 11,
+ "NumAllSources" : 11,
+ "NumActiveSinks" : 1,
+ "NumAllSinks" : 0,
+ "Sink_timelineNumOps" : 3836,
+ "Sink_timelineAvgTime" : 27.0,
+ "Sink_timelineDropped" : 0,
+ "Sink_timelineQsize" : 0,
+ "SnapshotNumOps" : 46032,
+ "SnapshotAvgTime" : 0.0,
+ "PublishNumOps" : 3836,
+ "PublishAvgTime" : 0.0,
+ "DroppedPubAll" : 0
+ }, {
+ "name" : "Hadoop:service=NameNode,name=IPCLoggerChannel-10.0.0.162-8485",
+ "modelerType" : "IPCLoggerChannel-10.0.0.162-8485",
+ "tag.Context" : "dfs",
+ "tag.IsOutOfSync" : "false",
+ "tag.Hostname" : "m2.hdp.local",
+ "CurrentLagTxns" : 0,
+ "QueuedEditsSize" : 0,
+ "LagTimeMillis" : 5
+ }, {
+ "name" : "Hadoop:service=NameNode,name=MetricsSystem,sub=Control",
+ "modelerType" : "org.apache.hadoop.metrics2.impl.MetricsSystemImpl"
+ }, {
+ "name" : "java.lang:type=MemoryPool,name=Metaspace",
+ "modelerType" : "sun.management.MemoryPoolImpl",
+ "Valid" : true,
+ "CollectionUsage" : null,
+ "MemoryManagerNames" : [ "Metaspace Manager" ],
+ "PeakUsage" : {
+ "committed" : 64790528,
+ "init" : 0,
+ "max" : -1,
+ "used" : 63364832
+ },
+ "Usage" : {
+ "committed" : 64790528,
+ "init" : 0,
+ "max" : -1,
+ "used" : 63364832
+ },
+ "UsageThreshold" : 0,
+ "UsageThresholdCount" : 0,
+ "CollectionUsageThresholdSupported" : false,
+ "UsageThresholdExceeded" : false,
+ "UsageThresholdSupported" : true,
+ "Name" : "Metaspace",
+ "Type" : "NON_HEAP",
+ "ObjectName" : "java.lang:type=MemoryPool,name=Metaspace"
+ }, {
+ "name" : "Hadoop:service=NameNode,name=NameNodeActivity",
+ "modelerType" : "NameNodeActivity",
+ "tag.ProcessName" : "NameNode",
+ "tag.SessionId" : null,
+ "tag.Context" : "dfs",
+ "tag.Hostname" : "m2.hdp.local",
+ "CreateFileOps" : 323,
+ "FilesCreated" : 2882,
+ "FilesAppended" : 0,
+ "GetBlockLocations" : 0,
+ "FilesRenamed" : 256,
+ "FilesTruncated" : 0,
+ "GetListingOps" : 60561,
+ "DeleteFileOps" : 1534,
+ "FilesDeleted" : 2812,
+ "FileInfoOps" : 17067,
+ "AddBlockOps" : 323,
+ "GetAdditionalDatanodeOps" : 0,
+ "CreateSymlinkOps" : 0,
+ "GetLinkTargetOps" : 0,
+ "FilesInGetListingOps" : 122537,
+ "AllowSnapshotOps" : 0,
+ "DisallowSnapshotOps" : 0,
+ "CreateSnapshotOps" : 0,
+ "DeleteSnapshotOps" : 0,
+ "RenameSnapshotOps" : 0,
+ "ListSnapshottableDirOps" : 0,
+ "SnapshotDiffReportOps" : 0,
+ "BlockReceivedAndDeletedOps" : 977,
+ "StorageBlockReportOps" : 35,
+ "TransactionsNumOps" : 10303,
+ "TransactionsAvgTime" : 0.33333333333333337,
+ "SyncsNumOps" : 9652,
+ "SyncsAvgTime" : 11.0,
+ "TransactionsBatchedInSync" : 1,
+ "BlockReportNumOps" : 35,
+ "BlockReportAvgTime" : 4.0,
+ "CacheReportNumOps" : 0,
+ "CacheReportAvgTime" : 0.0,
+ "SafeModeTime" : 56010,
+ "FsImageLoadTime" : 6784,
+ "GetEditNumOps" : 0,
+ "GetEditAvgTime" : 0.0,
+ "GetImageNumOps" : 0,
+ "GetImageAvgTime" : 0.0,
+ "PutImageNumOps" : 10,
+ "PutImageAvgTime" : 37.0,
+ "TotalFileOps" : 80064
+ }, {
+ "name" : "java.lang:type=MemoryPool,name=Par Eden Space",
+ "modelerType" : "sun.management.MemoryPoolImpl",
+ "Valid" : true,
+ "CollectionUsage" : {
+ "committed" : 107479040,
+ "init" : 107479040,
+ "max" : 107479040,
+ "used" : 0
+ },
+ "CollectionUsageThreshold" : 0,
+ "CollectionUsageThresholdCount" : 0,
+ "MemoryManagerNames" : [ "ConcurrentMarkSweep", "ParNew" ],
+ "PeakUsage" : {
+ "committed" : 107479040,
+ "init" : 107479040,
+ "max" : 107479040,
+ "used" : 107479040
+ },
+ "Usage" : {
+ "committed" : 107479040,
+ "init" : 107479040,
+ "max" : 107479040,
+ "used" : 41269440
+ },
+ "CollectionUsageThresholdExceeded" : false,
+ "CollectionUsageThresholdSupported" : true,
+ "UsageThresholdSupported" : false,
+ "Name" : "Par Eden Space",
+ "Type" : "HEAP",
+ "ObjectName" : "java.lang:type=MemoryPool,name=Par Eden Space"
+ }, {
+ "name" : "java.lang:type=GarbageCollector,name=ParNew",
+ "modelerType" : "sun.management.GarbageCollectorImpl",
+ "LastGcInfo" : {
+ "GcThreadCount" : 11,
+ "duration" : 26,
+ "endTime" : 230036939,
+ "id" : 770,
+ "memoryUsageAfterGc" : [ {
+ "key" : "Par Survivor Space",
+ "value" : {
+ "committed" : 13369344,
+ "init" : 13369344,
+ "max" : 13369344,
+ "used" : 5520400
+ }
+ }, {
+ "key" : "Compressed Class Space",
+ "value" : {
+ "committed" : 6754304,
+ "init" : 0,
+ "max" : 1073741824,
+ "used" : 6470656
+ }
+ }, {
+ "key" : "Metaspace",
+ "value" : {
+ "committed" : 64135168,
+ "init" : 0,
+ "max" : -1,
+ "used" : 62790832
+ }
+ }, {
+ "key" : "Code Cache",
+ "value" : {
+ "committed" : 43253760,
+ "init" : 2555904,
+ "max" : 251658240,
+ "used" : 42688640
+ }
+ }, {
+ "key" : "Par Eden Space",
+ "value" : {
+ "committed" : 107479040,
+ "init" : 107479040,
+ "max" : 107479040,
+ "used" : 0
+ }
+ }, {
+ "key" : "CMS Old Gen",
+ "value" : {
+ "committed" : 939524096,
+ "init" : 939524096,
+ "max" : 939524096,
+ "used" : 223574952
+ }
+ } ],
+ "memoryUsageBeforeGc" : [ {
+ "key" : "Par Survivor Space",
+ "value" : {
+ "committed" : 13369344,
+ "init" : 13369344,
+ "max" : 13369344,
+ "used" : 5865080
+ }
+ }, {
+ "key" : "Compressed Class Space",
+ "value" : {
+ "committed" : 6754304,
+ "init" : 0,
+ "max" : 1073741824,
+ "used" : 6470656
+ }
+ }, {
+ "key" : "Metaspace",
+ "value" : {
+ "committed" : 64135168,
+ "init" : 0,
+ "max" : -1,
+ "used" : 62790832
+ }
+ }, {
+ "key" : "Code Cache",
+ "value" : {
+ "committed" : 43253760,
+ "init" : 2555904,
+ "max" : 251658240,
+ "used" : 42688640
+ }
+ }, {
+ "key" : "Par Eden Space",
+ "value" : {
+ "committed" : 107479040,
+ "init" : 107479040,
+ "max" : 107479040,
+ "used" : 107479040
+ }
+ }, {
+ "key" : "CMS Old Gen",
+ "value" : {
+ "committed" : 939524096,
+ "init" : 939524096,
+ "max" : 939524096,
+ "used" : 223546904
+ }
+ } ],
+ "startTime" : 230036913
+ },
+ "CollectionCount" : 770,
+ "CollectionTime" : 12062,
+ "Valid" : true,
+ "MemoryPoolNames" : [ "Par Eden Space", "Par Survivor Space" ],
+ "Name" : "ParNew",
+ "ObjectName" : "java.lang:type=GarbageCollector,name=ParNew"
+ }, {
+ "name" : "java.lang:type=GarbageCollector,name=ConcurrentMarkSweep",
+ "modelerType" : "sun.management.GarbageCollectorImpl",
+ "LastGcInfo" : {
+ "GcThreadCount" : 11,
+ "duration" : 613,
+ "endTime" : 11914,
+ "id" : 2,
+ "memoryUsageAfterGc" : [ {
+ "key" : "Par Survivor Space",
+ "value" : {
+ "committed" : 13369344,
+ "init" : 13369344,
+ "max" : 13369344,
+ "used" : 13369344
+ }
+ }, {
+ "key" : "Compressed Class Space",
+ "value" : {
+ "committed" : 5181440,
+ "init" : 0,
+ "max" : 1073741824,
+ "used" : 5074064
+ }
+ }, {
+ "key" : "Metaspace",
+ "value" : {
+ "committed" : 44752896,
+ "init" : 0,
+ "max" : -1,
+ "used" : 43916784
+ }
+ }, {
+ "key" : "Code Cache",
+ "value" : {
+ "committed" : 10485760,
+ "init" : 2555904,
+ "max" : 251658240,
+ "used" : 10171776
+ }
+ }, {
+ "key" : "Par Eden Space",
+ "value" : {
+ "committed" : 107479040,
+ "init" : 107479040,
+ "max" : 107479040,
+ "used" : 8163416
+ }
+ }, {
+ "key" : "CMS Old Gen",
+ "value" : {
+ "committed" : 939524096,
+ "init" : 939524096,
+ "max" : 939524096,
+ "used" : 34717304
+ }
+ } ],
+ "memoryUsageBeforeGc" : [ {
+ "key" : "Par Survivor Space",
+ "value" : {
+ "committed" : 13369344,
+ "init" : 13369344,
+ "max" : 13369344,
+ "used" : 11139672
+ }
+ }, {
+ "key" : "Compressed Class Space",
+ "value" : {
+ "committed" : 5103616,
+ "init" : 0,
+ "max" : 1073741824,
+ "used" : 4935288
+ }
+ }, {
+ "key" : "Metaspace",
+ "value" : {
+ "committed" : 43810816,
+ "init" : 0,
+ "max" : -1,
+ "used" : 42961272
+ }
+ }, {
+ "key" : "Code Cache",
+ "value" : {
+ "committed" : 10485760,
+ "init" : 2555904,
+ "max" : 251658240,
+ "used" : 9712448
+ }
+ }, {
+ "key" : "Par Eden Space",
+ "value" : {
+ "committed" : 107479040,
+ "init" : 107479040,
+ "max" : 107479040,
+ "used" : 82683744
+ }
+ }, {
+ "key" : "CMS Old Gen",
+ "value" : {
+ "committed" : 939524096,
+ "init" : 939524096,
+ "max" : 939524096,
+ "used" : 20681144
+ }
+ } ],
+ "startTime" : 11301
+ },
+ "CollectionCount" : 2,
+ "CollectionTime" : 82,
+ "Valid" : true,
+ "MemoryPoolNames" : [ "Par Eden Space", "Par Survivor Space", "CMS Old Gen" ],
+ "Name" : "ConcurrentMarkSweep",
+ "ObjectName" : "java.lang:type=GarbageCollector,name=ConcurrentMarkSweep"
+ }, {
+ "name" : "Hadoop:service=NameNode,name=NameNodeStatus",
+ "modelerType" : "org.apache.hadoop.hdfs.server.namenode.NameNode",
+ "NNRole" : "NameNode",
+ "HostAndPort" : "m2.hdp.local:8020",
+ "SecurityEnabled" : false,
+ "LastHATransitionTime" : 1458337689771,
+ "BytesWithFutureGenerationStamps" : 0,
+ "State" : "active"
+ }, {
+ "name" : "Hadoop:service=NameNode,name=NameNodeInfo",
+ "modelerType" : "org.apache.hadoop.hdfs.server.namenode.FSNamesystem",
+ "Total" : 1254254346240,
+ "UpgradeFinalized" : true,
+ "ClusterId" : "CID-b255ee79-e4f1-44a8-b134-044c25d7bfd4",
+ "Version" : "2.7.1.2.3.5.1-68, rfe3c6b6dd1526d3c46f61a2e8fab9bb5eb649989",
+ "Used" : 43518906368,
+ "Free" : 1203736371256,
+ "Safemode" : "",
+ "NonDfsUsedSpace" : 6999068616,
+ "PercentUsed" : 3.4697034,
+ "BlockPoolUsedSpace" : 43518906368,
+ "PercentBlockPoolUsed" : 3.4697034,
+ "PercentRemaining" : 95.972275,
+ "CacheCapacity" : 0,
+ "CacheUsed" : 0,
+ "TotalBlocks" : 7813,
+ "TotalFiles" : 9555,
+ "NumberOfMissingBlocks" : 0,
+ "NumberOfMissingBlocksWithReplicationFactorOne" : 0,
+ "LiveNodes" : "{\"d2.hdp.local:50010\":{\"infoAddr\":\"10.0.0.166:50075\",\"infoSecureAddr\":\"10.0.0.166:0\",\"xferaddr\":\"10.0.0.166:50010\",\"lastContact\":0,\"usedSpace\":14322884608,\"adminState\":\"In Service\",\"nonDfsUsedSpace\":2070821528,\"capacity\":418084782080,\"numBlocks\":7794,\"version\":\"2.7.1.2.3.5.1-68\",\"used\":14322884608,\"remaining\":401691075944,\"blockScheduled\":0,\"blockPoolUsed\":14322884608,\"blockPoolUsedPercent\":3.4258327,\"volfails\":0},\"d1.hdp.local:50010\":{\"infoAddr\":\"10.0.0.165:50075\",\"infoSecureAddr\":\"10.0.0.165:0\",\"xferaddr\":\"10.0.0.165:50010\",\"lastContact\":0,\"usedSpace\":14866464768,\"adminState\":\"In Service\",\"nonDfsUsedSpace\":2284001944,\"capacity\":418084782080,\"numBlocks\":7799,\"version\":\"2.7.1.2.3.5.1-68\",\"used\":14866464768,\"remaining\":400934315368,\"blockScheduled\":0,\"blockPoolUsed\":14866464768,\"blockPoolUsedPercent\":3.555849,\"volfails\":0},\"d3.hdp.local:50010\":{\"infoAddr\":\"10.0.0.167:50075\",\"infoSecureAddr\":\"10.0.0.167:0\",\"xferaddr\":\"10.0.0.167:50010\",\"lastContact\":0,\"usedSpace\":14329556992,\"adminState\":\"In Service\",\"nonDfsUsedSpace\":2644245144,\"capacity\":418084782080,\"numBlocks\":7795,\"version\":\"2.7.1.2.3.5.1-68\",\"used\":14329556992,\"remaining\":401110979944,\"blockScheduled\":0,\"blockPoolUsed\":14329556992,\"blockPoolUsedPercent\":3.4274285,\"volfails\":0}}",
+ "DeadNodes" : "{}",
+ "DecomNodes" : "{}",
+ "BlockPoolId" : "BP-331388818-10.0.0.160-1443469980189",
+ "NameDirStatuses" : "{\"active\":{\"/data/hadoop/hdfs/namenode\":\"IMAGE_AND_EDITS\"},\"failed\":{}}",
+ "NodeUsage" : "{\"nodeUsage\":{\"min\":\"3.43%\",\"median\":\"3.43%\",\"max\":\"3.56%\",\"stdDev\":\"0.06%\"}}",
+ "NameJournalStatus" : "[{\"manager\":\"QJM to [10.0.0.160:8485, 10.0.0.161:8485, 10.0.0.162:8485]\",\"stream\":\"Writing segment beginning at txid 2096706. \\n10.0.0.160:8485 (Written txid 2096709), 10.0.0.161:8485 (Written txid 2096709), 10.0.0.162:8485 (Written txid 2096709)\",\"disabled\":\"false\",\"required\":\"true\"},{\"manager\":\"FileJournalManager(root=/data/hadoop/hdfs/namenode)\",\"stream\":\"EditLogFileOutputStream(/data/hadoop/hdfs/namenode/current/edits_inprogress_0000000000002096706)\",\"disabled\":\"false\",\"required\":\"false\"}]",
+ "JournalTransactionInfo" : "{\"MostRecentCheckpointTxId\":\"2096072\",\"LastAppliedOrWrittenTxId\":\"2096709\"}",
+ "NNStarted" : "Fri Mar 18 17:46:56 EDT 2016",
+ "CompileInfo" : "2016-03-10T05:05Z by jenkins from (HEAD detached at fe3c6b6)",
+ "CorruptFiles" : "[]",
+ "DistinctVersionCount" : 1,
+ "DistinctVersions" : [ {
+ "key" : "2.7.1.2.3.5.1-68",
+ "value" : 3
+ } ],
+ "SoftwareVersion" : "2.7.1.2.3.5.1-68",
+ "RollingUpgradeStatus" : null,
+ "Threads" : 151
+ }, {
+ "name" : "Hadoop:service=NameNode,name=SnapshotInfo",
+ "modelerType" : "org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager",
+ "SnapshottableDirectories" : [ {
+ "group" : "hdfs",
+ "modificationTime" : 1456355116813,
+ "owner" : "hive",
+ "path" : "/apps/hive/warehouse",
+ "permission" : 777,
+ "snapshotNumber" : 5,
+ "snapshotQuota" : 65536
+ }, {
+ "group" : "hdfs",
+ "modificationTime" : 1449767160188,
+ "owner" : "fred",
+ "path" : "/user/fred",
+ "permission" : 700,
+ "snapshotNumber" : 0,
+ "snapshotQuota" : 65536
+ }, {
+ "group" : "hdfs",
+ "modificationTime" : 1457370883059,
+ "owner" : "dstreev",
+ "path" : "/user/dstreev",
+ "permission" : 755,
+ "snapshotNumber" : 3,
+ "snapshotQuota" : 65536
+ }, {
+ "group" : "hdfs",
+ "modificationTime" : 1456358248544,
+ "owner" : "dstreev",
+ "path" : "/tmp/dstreev",
+ "permission" : 700,
+ "snapshotNumber" : 0,
+ "snapshotQuota" : 65536
+ } ],
+ "Snapshots" : [ {
+ "modificationTime" : 1456354755947,
+ "snapshotDirectory" : "/apps/hive/warehouse/.snapshot/2016-02-24-1500",
+ "snapshotID" : "2016-02-24-1500"
+ }, {
+ "modificationTime" : 1456355116813,
+ "snapshotDirectory" : "/apps/hive/warehouse/.snapshot/2016-02-24-1515",
+ "snapshotID" : "2016-02-24-1515"
+ }, {
+ "modificationTime" : 1456346696470,
+ "snapshotDirectory" : "/apps/hive/warehouse/.snapshot/s20160224-154300.702",
+ "snapshotID" : "s20160224-154300.702"
+ }, {
+ "modificationTime" : 1456346937065,
+ "snapshotDirectory" : "/apps/hive/warehouse/.snapshot/s20160224-154753.719",
+ "snapshotID" : "s20160224-154753.719"
+ }, {
+ "modificationTime" : 1456354515321,
+ "snapshotDirectory" : "/apps/hive/warehouse/.snapshot/s20160224-175513.358",
+ "snapshotID" : "s20160224-175513.358"
+ }, {
+ "modificationTime" : 1456355116813,
+ "snapshotDirectory" : "/user/dstreev/.snapshot/2016-02-24-1515",
+ "snapshotID" : "2016-02-24-1515"
+ }, {
+ "modificationTime" : 1456347057346,
+ "snapshotDirectory" : "/user/dstreev/.snapshot/s20160224-155029.640",
+ "snapshotID" : "s20160224-155029.640"
+ }, {
+ "modificationTime" : 1456347177621,
+ "snapshotDirectory" : "/user/dstreev/.snapshot/s20160224-155109.634",
+ "snapshotID" : "s20160224-155109.634"
+ } ]
+ }, {
+ "name" : "Hadoop:service=NameNode,name=BlockStats",
+ "modelerType" : "org.apache.hadoop.hdfs.server.blockmanagement.BlockManager",
+ "StorageTypeStats" : [ {
+ "key" : "DISK",
+ "value" : {
+ "blockPoolUsed" : 43518906368,
+ "capacityRemaining" : 1203736371256,
+ "capacityTotal" : 1254254346240,
+ "capacityUsed" : 43518906368,
+ "nodesInService" : 3
+ }
+ } ]
+ }, {
+ "name" : "Hadoop:service=NameNode,name=IPCLoggerChannel-10.0.0.160-8485",
+ "modelerType" : "IPCLoggerChannel-10.0.0.160-8485",
+ "tag.Context" : "dfs",
+ "tag.IsOutOfSync" : "false",
+ "tag.Hostname" : "m2.hdp.local",
+ "CurrentLagTxns" : 0,
+ "QueuedEditsSize" : 0,
+ "LagTimeMillis" : 5
+ }, {
+ "name" : "java.lang:type=MemoryPool,name=Compressed Class Space",
+ "modelerType" : "sun.management.MemoryPoolImpl",
+ "Valid" : true,
+ "CollectionUsage" : null,
+ "MemoryManagerNames" : [ "Metaspace Manager" ],
+ "PeakUsage" : {
+ "committed" : 6885376,
+ "init" : 0,
+ "max" : 1073741824,
+ "used" : 6537272
+ },
+ "Usage" : {
+ "committed" : 6885376,
+ "init" : 0,
+ "max" : 1073741824,
+ "used" : 6537272
+ },
+ "UsageThreshold" : 0,
+ "UsageThresholdCount" : 0,
+ "CollectionUsageThresholdSupported" : false,
+ "UsageThresholdExceeded" : false,
+ "UsageThresholdSupported" : true,
+ "Name" : "Compressed Class Space",
+ "Type" : "NON_HEAP",
+ "ObjectName" : "java.lang:type=MemoryPool,name=Compressed Class Space"
+ }, {
+ "name" : "java.lang:type=Memory",
+ "modelerType" : "sun.management.MemoryImpl",
+ "Verbose" : true,
+ "HeapMemoryUsage" : {
+ "committed" : 1060372480,
+ "init" : 1073741824,
+ "max" : 1060372480,
+ "used" : 270546056
+ },
+ "NonHeapMemoryUsage" : {
+ "committed" : 114995200,
+ "init" : 2555904,
+ "max" : -1,
+ "used" : 112804888
+ },
+ "ObjectPendingFinalizationCount" : 0,
+ "ObjectName" : "java.lang:type=Memory"
+ }, {
+ "name" : "Hadoop:service=NameNode,name=FSNamesystemState",
+ "modelerType" : "org.apache.hadoop.hdfs.server.namenode.FSNamesystem",
+ "CapacityTotal" : 1254254346240,
+ "CapacityUsed" : 43518906368,
+ "CapacityRemaining" : 1203736371256,
+ "TotalLoad" : 36,
+ "SnapshotStats" : "{\"SnapshottableDirectories\":4,\"Snapshots\":8}",
+ "FsLockQueueLength" : 0,
+ "BlocksTotal" : 7813,
+ "MaxObjects" : 0,
+ "FilesTotal" : 9555,
+ "PendingReplicationBlocks" : 0,
+ "UnderReplicatedBlocks" : 4,
+ "ScheduledReplicationBlocks" : 0,
+ "PendingDeletionBlocks" : 0,
+ "BlockDeletionStartTime" : 1458341216736,
+ "FSState" : "Operational",
+ "NumLiveDataNodes" : 3,
+ "NumDeadDataNodes" : 0,
+ "NumDecomLiveDataNodes" : 0,
+ "NumDecomDeadDataNodes" : 0,
+ "VolumeFailuresTotal" : 0,
+ "EstimatedCapacityLostTotal" : 0,
+ "NumDecommissioningDataNodes" : 0,
+ "NumStaleDataNodes" : 0,
+ "NumStaleStorages" : 0,
+ "TopUserOpCounts" : "{\"timestamp\":\"2016-03-21T09:43:02-0400\",\"windows\":[{\"ops\":[{\"opType\":\"getfileinfo\",\"topUsers\":[{\"user\":\"ambari-qa\",\"count\":10},{\"user\":\"hbase\",\"count\":4}],\"totalCount\":14},{\"opType\":\"mkdirs\",\"topUsers\":[{\"user\":\"ambari-qa\",\"count\":2}],\"totalCount\":2},{\"opType\":\"listStatus\",\"topUsers\":[{\"user\":\"hbase\",\"count\":15},{\"user\":\"mapred\",\"count\":1},{\"user\":\"oozie\",\"count\":1}],\"totalCount\":17},{\"opType\":\"*\",\"topUsers\":[{\"user\":\"hbase\",\"count\":19},{\"user\":\"ambari-qa\",\"count\":13},{\"user\":\"mapred\",\"count\":1},{\"user\":\"oozie\",\"count\":1}],\"totalCount\":34},{\"opType\":\"delete\",\"topUsers\":[{\"user\":\"ambari-qa\",\"count\":1}],\"totalCount\":1}],\"windowLenMs\":60000},{\"ops\":[{\"opType\":\"getfileinfo\",\"topUsers\":[{\"user\":\"ambari-qa\",\"count\":20},{\"user\":\"hbase\",\"count\":4}],\"totalCount\":24},{\"opType\":\"mkdirs\",\"topUsers\":[{\"user\":\"ambari-qa\",\"count\":4}],\"totalCount\":4},{\"opType\":\"listStatus\",\"topUsers\":[{\"user\":\"hbase\",\"count\":52},{\"user\":\"mapred\",\"count\":5},{\"user\":\"oozie\",\"count\":5}],\"totalCount\":62},{\"opType\":\"*\",\"topUsers\":[{\"user\":\"hbase\",\"count\":58},{\"user\":\"ambari-qa\",\"count\":26},{\"user\":\"mapred\",\"count\":5},{\"user\":\"oozie\",\"count\":5}],\"totalCount\":94},{\"opType\":\"delete\",\"topUsers\":[{\"user\":\"ambari-qa\",\"count\":2}],\"totalCount\":2}],\"windowLenMs\":300000},{\"ops\":[{\"opType\":\"getfileinfo\",\"topUsers\":[{\"user\":\"ambari-qa\",\"count\":60},{\"user\":\"hbase\",\"count\":25}],\"totalCount\":85},{\"opType\":\"rename\",\"topUsers\":[{\"user\":\"hbase\",\"count\":4}],\"totalCount\":4},{\"opType\":\"mkdirs\",\"topUsers\":[{\"user\":\"ambari-qa\",\"count\":12}],\"totalCount\":12},{\"opType\":\"listStatus\",\"topUsers\":[{\"user\":\"hbase\",\"count\":237},{\"user\":\"mapred\",\"count\":23},{\"user\":\"oozie\",\"count\":18}],\"totalCount\":278},{\"opType\":\"create\",\"topUsers\":[{\"user\":\"hbase\",\"count\":5}],\"totalCount\":5},{\"opType\":\"*\",\"topUsers\":[{\"user\":\"hbase\",\"count\":196},{\"user\":\"ambari-qa\",\"count\":78},{\"user\":\"mapred\",\"count\":23},{\"user\":\"oozie\",\"count\":18}],\"totalCount\":315},{\"opType\":\"setTimes\",\"topUsers\":[{\"user\":\"hbase\",\"count\":4}],\"totalCount\":4},{\"opType\":\"delete\",\"topUsers\":[{\"user\":\"ambari-qa\",\"count\":6},{\"user\":\"hbase\",\"count\":4}],\"totalCount\":10}],\"windowLenMs\":1500000}]}",
+ "TotalSyncCount" : 4,
+ "TotalSyncTimes" : "33 38 "
+ }, {
+ "name" : "java.nio:type=BufferPool,name=mapped",
+ "modelerType" : "sun.management.ManagementFactoryHelper$1",
+ "MemoryUsed" : 24812,
+ "TotalCapacity" : 24812,
+ "Count" : 3,
+ "Name" : "mapped",
+ "ObjectName" : "java.nio:type=BufferPool,name=mapped"
+ }, {
+ "name" : "java.lang:type=MemoryPool,name=Par Survivor Space",
+ "modelerType" : "sun.management.MemoryPoolImpl",
+ "Valid" : true,
+ "CollectionUsage" : {
+ "committed" : 13369344,
+ "init" : 13369344,
+ "max" : 13369344,
+ "used" : 5520400
+ },
+ "CollectionUsageThreshold" : 0,
+ "CollectionUsageThresholdCount" : 0,
+ "MemoryManagerNames" : [ "ConcurrentMarkSweep", "ParNew" ],
+ "PeakUsage" : {
+ "committed" : 13369344,
+ "init" : 13369344,
+ "max" : 13369344,
+ "used" : 13369344
+ },
+ "Usage" : {
+ "committed" : 13369344,
+ "init" : 13369344,
+ "max" : 13369344,
+ "used" : 5520400
+ },
+ "CollectionUsageThresholdExceeded" : false,
+ "CollectionUsageThresholdSupported" : true,
+ "UsageThresholdSupported" : false,
+ "Name" : "Par Survivor Space",
+ "Type" : "HEAP",
+ "ObjectName" : "java.lang:type=MemoryPool,name=Par Survivor Space"
+ }, {
+ "name" : "Hadoop:service=NameNode,name=RetryCache.NameNodeRetryCache",
+ "modelerType" : "RetryCache.NameNodeRetryCache",
+ "tag.Context" : "rpc",
+ "tag.Hostname" : "m2.hdp.local",
+ "CacheHit" : 0,
+ "CacheCleared" : 0,
+ "CacheUpdated" : 2274
+ }, {
+ "name" : "com.sun.management:type=DiagnosticCommand",
+ "modelerType" : "sun.management.DiagnosticCommandImpl"
+ }, {
+ "name" : "Hadoop:service=NameNode,name=RpcActivityForPort8020",
+ "modelerType" : "RpcActivityForPort8020",
+ "tag.port" : "8020",
+ "tag.Context" : "rpc",
+ "tag.Hostname" : "m2.hdp.local",
+ "ReceivedBytes" : 202304113,
+ "SentBytes" : 41683107,
+ "RpcQueueTimeNumOps" : 837831,
+ "RpcQueueTimeAvgTime" : 0.19444444444444448,
+ "RpcProcessingTimeNumOps" : 837831,
+ "RpcProcessingTimeAvgTime" : 0.2777777777777778,
+ "RpcAuthenticationFailures" : 0,
+ "RpcAuthenticationSuccesses" : 0,
+ "RpcAuthorizationFailures" : 0,
+ "RpcAuthorizationSuccesses" : 52470,
+ "RpcClientBackoff" : 0,
+ "RpcSlowCalls" : 0,
+ "NumOpenConnections" : 11,
+ "CallQueueLength" : 0
+ }, {
+ "name" : "Hadoop:service=NameNode,name=UgiMetrics",
+ "modelerType" : "UgiMetrics",
+ "tag.Context" : "ugi",
+ "tag.Hostname" : "m2.hdp.local",
+ "LoginSuccessNumOps" : 0,
+ "LoginSuccessAvgTime" : 0.0,
+ "LoginFailureNumOps" : 0,
+ "LoginFailureAvgTime" : 0.0,
+ "GetGroupsNumOps" : 3619,
+ "GetGroupsAvgTime" : 0.0
+ }, {
+ "name" : "com.sun.management:type=HotSpotDiagnostic",
+ "modelerType" : "sun.management.HotSpotDiagnostic",
+ "DiagnosticOptions" : [ {
+ "name" : "HeapDumpBeforeFullGC",
+ "origin" : "DEFAULT",
+ "value" : "false",
+ "writeable" : true
+ }, {
+ "name" : "HeapDumpAfterFullGC",
+ "origin" : "DEFAULT",
+ "value" : "false",
+ "writeable" : true
+ }, {
+ "name" : "HeapDumpOnOutOfMemoryError",
+ "origin" : "DEFAULT",
+ "value" : "false",
+ "writeable" : true
+ }, {
+ "name" : "HeapDumpPath",
+ "origin" : "DEFAULT",
+ "value" : "",
+ "writeable" : true
+ }, {
+ "name" : "CMSAbortablePrecleanWaitMillis",
+ "origin" : "DEFAULT",
+ "value" : "100",
+ "writeable" : true
+ }, {
+ "name" : "CMSWaitDuration",
+ "origin" : "DEFAULT",
+ "value" : "2000",
+ "writeable" : true
+ }, {
+ "name" : "CMSTriggerInterval",
+ "origin" : "DEFAULT",
+ "value" : "-1",
+ "writeable" : true
+ }, {
+ "name" : "PrintGC",
+ "origin" : "VM_CREATION",
"value" : "true", + "writeable" : true + }, { + "name" : "PrintGCDetails", + "origin" : "VM_CREATION", + "value" : "true", + "writeable" : true + }, { + "name" : "PrintGCDateStamps", + "origin" : "VM_CREATION", + "value" : "true", + "writeable" : true + }, { + "name" : "PrintGCTimeStamps", + "origin" : "VM_CREATION", + "value" : "true", + "writeable" : true + }, { + "name" : "PrintGCID", + "origin" : "DEFAULT", + "value" : "false", + "writeable" : true + }, { + "name" : "PrintClassHistogramBeforeFullGC", + "origin" : "DEFAULT", + "value" : "false", + "writeable" : true + }, { + "name" : "PrintClassHistogramAfterFullGC", + "origin" : "DEFAULT", + "value" : "false", + "writeable" : true + }, { + "name" : "PrintClassHistogram", + "origin" : "DEFAULT", + "value" : "false", + "writeable" : true + }, { + "name" : "MinHeapFreeRatio", + "origin" : "DEFAULT", + "value" : "40", + "writeable" : true + }, { + "name" : "MaxHeapFreeRatio", + "origin" : "DEFAULT", + "value" : "70", + "writeable" : true + }, { + "name" : "PrintConcurrentLocks", + "origin" : "DEFAULT", + "value" : "false", + "writeable" : true + }, { + "name" : "UnlockCommercialFeatures", + "origin" : "DEFAULT", + "value" : "false", + "writeable" : true + } ], + "ObjectName" : "com.sun.management:type=HotSpotDiagnostic" + }, { + "name" : "java.lang:type=MemoryPool,name=CMS Old Gen", + "modelerType" : "sun.management.MemoryPoolImpl", + "Valid" : true, + "CollectionUsage" : { + "committed" : 939524096, + "init" : 939524096, + "max" : 939524096, + "used" : 34717304 + }, + "CollectionUsageThreshold" : 0, + "CollectionUsageThresholdCount" : 0, + "MemoryManagerNames" : [ "ConcurrentMarkSweep" ], + "PeakUsage" : { + "committed" : 939524096, + "init" : 939524096, + "max" : 939524096, + "used" : 223574952 + }, + "Usage" : { + "committed" : 939524096, + "init" : 939524096, + "max" : 939524096, + "used" : 223574952 + }, + "UsageThreshold" : 0, + "UsageThresholdCount" : 0, + "CollectionUsageThresholdExceeded" : false, + 
"CollectionUsageThresholdSupported" : true, + "UsageThresholdExceeded" : false, + "UsageThresholdSupported" : true, + "Name" : "CMS Old Gen", + "Type" : "HEAP", + "ObjectName" : "java.lang:type=MemoryPool,name=CMS Old Gen" + } ] +} \ No newline at end of file diff --git a/src/test/resources/nn_2.3.5.1_standby.json b/src/test/resources/nn_2.3.5.1_standby.json new file mode 100644 index 0000000..2eb25e0 --- /dev/null +++ b/src/test/resources/nn_2.3.5.1_standby.json @@ -0,0 +1,1266 @@ +{ + "beans" : [ { + "name" : "Hadoop:service=NameNode,name=JvmMetrics", + "modelerType" : "JvmMetrics", + "tag.Context" : "jvm", + "tag.ProcessName" : "NameNode", + "tag.SessionId" : null, + "tag.Hostname" : "m1.hdp.local", + "MemNonHeapUsedM" : 101.98388, + "MemNonHeapCommittedM" : 104.12109, + "MemNonHeapMaxM" : -1.0, + "MemHeapUsedM" : 121.64624, + "MemHeapCommittedM" : 1011.25, + "MemHeapMaxM" : 1011.25, + "MemMaxM" : 1011.25, + "GcCountParNew" : 806, + "GcTimeMillisParNew" : 9556, + "GcCountConcurrentMarkSweep" : 2, + "GcTimeMillisConcurrentMarkSweep" : 93, + "GcCount" : 808, + "GcTimeMillis" : 9649, + "GcNumWarnThresholdExceeded" : 0, + "GcNumInfoThresholdExceeded" : 0, + "GcTotalExtraSleepTime" : 590, + "ThreadsNew" : 0, + "ThreadsRunnable" : 7, + "ThreadsBlocked" : 0, + "ThreadsWaiting" : 7, + "ThreadsTimedWaiting" : 134, + "ThreadsTerminated" : 0, + "LogFatal" : 0, + "LogError" : 0, + "LogWarn" : 58561, + "LogInfo" : 21176 + }, { + "name" : "JMImplementation:type=MBeanServerDelegate", + "modelerType" : "javax.management.MBeanServerDelegate", + "MBeanServerId" : "m1.hdp.local_1458337609789", + "SpecificationName" : "Java Management Extensions", + "SpecificationVersion" : "1.4", + "SpecificationVendor" : "Oracle Corporation", + "ImplementationName" : "JMX", + "ImplementationVersion" : "1.8.0_73-b02", + "ImplementationVendor" : "Oracle Corporation" + }, { + "name" : "java.lang:type=Runtime", + "modelerType" : "sun.management.RuntimeImpl", + "InputArguments" : [ "-Dproc_namenode", 
"-Xmx1024m", "-Dhdp.version=2.3.5.1-68", "-Djava.net.preferIPv4Stack=true", "-Dhdp.version=", "-Djava.net.preferIPv4Stack=true", "-Dhdp.version=", "-Djava.net.preferIPv4Stack=true", "-Dhadoop.log.dir=/var/log/hadoop/hdfs", "-Dhadoop.log.file=hadoop.log", "-Dhadoop.home.dir=/usr/hdp/2.3.5.1-68/hadoop", "-Dhadoop.id.str=hdfs", "-Dhadoop.root.logger=INFO,console", "-Djava.library.path=:/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native", "-Dhadoop.policy.file=hadoop-policy.xml", "-Djava.net.preferIPv4Stack=true", "-Dhdp.version=2.3.5.1-68", "-Dhadoop.log.dir=/var/log/hadoop/hdfs", "-Dhadoop.log.file=hadoop-hdfs-namenode-m1.hdp.local.log", "-Dhadoop.home.dir=/usr/hdp/2.3.5.1-68/hadoop", "-Dhadoop.id.str=hdfs", "-Dhadoop.root.logger=INFO,RFA", "-Djava.library.path=:/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native:/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native", "-Dhadoop.policy.file=hadoop-policy.xml", "-Djava.net.preferIPv4Stack=true", "-XX:ParallelGCThreads=8", "-XX:+UseConcMarkSweepGC", "-XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log", "-XX:NewSize=128m", "-XX:MaxNewSize=128m", "-Xloggc:/var/log/hadoop/hdfs/gc.log-201603181746", "-verbose:gc", "-XX:+PrintGCDetails", "-XX:+PrintGCTimeStamps", "-XX:+PrintGCDateStamps", "-Xms1024m", "-Xmx1024m", "-Dhadoop.security.logger=INFO,DRFAS", "-Dhdfs.audit.logger=INFO,DRFAAUDIT", "-XX:OnOutOfMemoryError=\"/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node\"", "-Dorg.mortbay.jetty.Request.maxFormContentSize=-1", "-XX:ParallelGCThreads=8", "-XX:+UseConcMarkSweepGC", "-XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log", "-XX:NewSize=128m", "-XX:MaxNewSize=128m", "-Xloggc:/var/log/hadoop/hdfs/gc.log-201603181746", "-verbose:gc", "-XX:+PrintGCDetails", "-XX:+PrintGCTimeStamps", "-XX:+PrintGCDateStamps", "-Xms1024m", "-Xmx1024m", "-Dhadoop.security.logger=INFO,DRFAS", 
"-Dhdfs.audit.logger=INFO,DRFAAUDIT", "-XX:OnOutOfMemoryError=\"/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node\"", "-Dorg.mortbay.jetty.Request.maxFormContentSize=-1", "-XX:ParallelGCThreads=8", "-XX:+UseConcMarkSweepGC", "-XX:ErrorFile=/var/log/hadoop/hdfs/hs_err_pid%p.log", "-XX:NewSize=128m", "-XX:MaxNewSize=128m", "-Xloggc:/var/log/hadoop/hdfs/gc.log-201603181746", "-verbose:gc", "-XX:+PrintGCDetails", "-XX:+PrintGCTimeStamps", "-XX:+PrintGCDateStamps", "-Xms1024m", "-Xmx1024m", "-Dhadoop.security.logger=INFO,DRFAS", "-Dhdfs.audit.logger=INFO,DRFAAUDIT", "-XX:OnOutOfMemoryError=\"/usr/hdp/current/hadoop-hdfs-namenode/bin/kill-name-node\"", "-Dorg.mortbay.jetty.Request.maxFormContentSize=-1", "-Dhadoop.security.logger=INFO,RFAS" ], + "ManagementSpecVersion" : "1.2", + "SpecName" : "Java Virtual Machine Specification", + "SpecVendor" : "Oracle Corporation", + "SpecVersion" : "1.8", + "SystemProperties" : [ { + "key" : "awt.toolkit", + "value" : "sun.awt.X11.XToolkit" + }, { + "key" : "file.encoding.pkg", + "value" : "sun.io" + }, { + "key" : "java.specification.version", + "value" : "1.8" + }, { + "key" : "sun.cpu.isalist", + "value" : "" + }, { + "key" : "sun.jnu.encoding", + "value" : "UTF-8" + }, { + "key" : "java.class.path", + "value" : 
"/usr/hdp/current/hadoop-client/conf:/usr/hdp/2.3.5.1-68/hadoop/lib/ojdbc6.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-hdfs-plugin-shim-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-plugin-classloader-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-yarn-plugin-shim-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/azure-storage-2.2.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-
68/hadoop/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/zookeeper-3.4.6.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68
/hadoop/.//hadoop-annotations-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-auth-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-auth.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-aws.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-azure.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-nfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/./:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/hdp/2.3.5.1-68/
hadoop-hdfs/lib/okhttp-2.4.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/okio-1.4.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-nfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/objenesis-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/l
ib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/zookeeper-3.4.6.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/fst-2.24.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/zookeeper-3.4.6.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson
-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/javassist-3.18.1-GA.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-api-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-client-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-registry-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar
:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-tests-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-ma
preduce/lib/junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-streaming-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.3.5.1-68/hadoop-
mapreduce/.//commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop
-mapreduce-client-jobclient.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-ant-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//joda-time-2.9.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-archives-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-auth-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-openstack-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-datajoin-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-distcp-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-rumen-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-extras-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-gridmix-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.7.1.2.3
.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-sls-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//zookeeper-3.4.6.2.3.5.1-68.jar:::/usr/share/java/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-hdp-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-api-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-common-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-dag-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-examples-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-ext-service-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-history-parser-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-javadoc-tools-0
.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-job-analyzer-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-mapreduce-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-internals-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-library-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-cache-plugin-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-acls-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-fs-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/async-http-client-1.8.16.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/tez/lib/slf4j-api-1.7
.10.jar:/usr/hdp/2.3.5.1-68/tez/conf::/usr/share/java/mysql-connector-java.jar::/usr/share/java/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-hdp-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-api-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-common-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-dag-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-examples-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-ext-service-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-history-parser-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-javadoc-tools-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-job-analyzer-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-mapreduce-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-internals-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-library-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-cache-plugin-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-acls-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-fs-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/async-http-client-1.8.16.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.
1-68/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/tez/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.3.5.1-68/tez/conf"
+ }, {
+ "key" : "java.vm.vendor",
+ "value" : "Oracle Corporation"
+ }, {
+ "key" : "sun.arch.data.model",
+ "value" : "64"
+ }, {
+ "key" : "java.vendor.url",
+ "value" : "http://java.oracle.com/"
+ }, {
+ "key" : "user.timezone",
+ "value" : "America/New_York"
+ }, {
+ "key" : "os.name",
+ "value" : "Linux"
+ }, {
+ "key" : "java.vm.specification.version",
+ "value" : "1.8"
+ }, {
+ "key" : "user.country",
+ "value" : "US"
+ }, {
+ "key" : "sun.java.launcher",
+ "value" : "SUN_STANDARD"
+ }, {
+ "key" : "sun.boot.library.path",
+ "value" : "/usr/java/jdk1.8.0_73/jre/lib/amd64"
+ }, {
+ "key" : "sun.java.command",
+ "value" : "org.apache.hadoop.hdfs.server.namenode.NameNode"
+ }, {
+ "key" : "hdfs.audit.logger",
+ "value" : "INFO,DRFAAUDIT"
+ }, {
+ "key" : "sun.cpu.endian",
+ "value" : "little"
+ }, {
+ "key" : "user.home",
+ "value" : "/home/hdfs"
+ }, {
+ "key" : "user.language",
+ "value" : "en"
+ }, {
+ "key" : "org.mortbay.jetty.Request.maxFormContentSize",
+ "value" : "-1"
+ }, {
+ "key" : "java.specification.vendor",
+ "value" : "Oracle Corporation"
+ }, {
+ "key" : "hdp.version",
+ "value" : "2.3.5.1-68"
+ }, {
+ "key" : "java.home",
+ "value" : "/usr/java/jdk1.8.0_73/jre"
+ }, {
+ "key" : "file.separator",
+ "value" : "/"
+ }, {
+ "key" : "line.separator",
+ "value" : "\n"
+ }, {
+ "key" : "java.vm.specification.vendor",
+ "value" : "Oracle Corporation"
+ }, {
+ "key" : "java.specification.name",
+ "value" : "Java Platform API Specification"
+ }, {
+ "key" : "java.awt.graphicsenv",
+ "value" : "sun.awt.X11GraphicsEnvironment"
+ }, {
+ "key" : "hadoop.security.logger",
+ "value" : "INFO,RFAS"
+ }, {
+ "key" : "hadoop.log.dir",
+ "value" : "/var/log/hadoop/hdfs"
+ }, {
+ "key" : "sun.boot.class.path",
+ "value" : "/usr/java/jdk1.8.0_73/jre/lib/resources.jar:/usr/java/jdk1.8.0_73/jre/lib/rt.jar:/usr/java/jdk1.8.0_73/jre/lib/sunrsasign.jar:/usr/java/jdk1.8.0_73/jre/lib/jsse.jar:/usr/java/jdk1.8.0_73/jre/lib/jce.jar:/usr/java/jdk1.8.0_73/jre/lib/charsets.jar:/usr/java/jdk1.8.0_73/jre/lib/jfr.jar:/usr/java/jdk1.8.0_73/jre/classes"
+ }, {
+ "key" : "sun.management.compiler",
+ "value" : "HotSpot 64-Bit Tiered Compilers"
+ }, {
+ "key" : "java.runtime.version",
+ "value" : "1.8.0_73-b02"
+ }, {
+ "key" : "hadoop.log.file",
+ "value" : "hadoop-hdfs-namenode-m1.hdp.local.log"
+ }, {
+ "key" : "java.net.preferIPv4Stack",
+ "value" : "true"
+ }, {
+ "key" : "user.name",
+ "value" : "hdfs"
+ }, {
+ "key" : "hadoop.home.dir",
+ "value" : "/usr/hdp/2.3.5.1-68/hadoop"
+ }, {
+ "key" : "path.separator",
+ "value" : ":"
+ }, {
+ "key" : "hadoop.id.str",
+ "value" : "hdfs"
+ }, {
+ "key" : "os.version",
+ "value" : "3.10.0-229.el7.x86_64"
+ }, {
+ "key" : "java.endorsed.dirs",
+ "value" : "/usr/java/jdk1.8.0_73/jre/lib/endorsed"
+ }, {
+ "key" : "java.runtime.name",
+ "value" : "Java(TM) SE Runtime Environment"
+ }, {
+ "key" : "hadoop.root.logger",
+ "value" : "INFO,RFA"
+ }, {
+ "key" : "file.encoding",
+ "value" : "UTF-8"
+ }, {
+ "key" : "java.vm.name",
+ "value" : "Java HotSpot(TM) 64-Bit Server VM"
+ }, {
+ "key" : "java.vendor.url.bug",
+ "value" : "http://bugreport.sun.com/bugreport/"
+ }, {
+ "key" : "java.io.tmpdir",
+ "value" : "/tmp"
+ }, {
+ "key" : "java.version",
+ "value" : "1.8.0_73"
+ }, {
+ "key" : "user.dir",
+ "value" : "/usr/hdp/2.3.5.1-68/hadoop"
+ }, {
+ "key" : "os.arch",
+ "value" : "amd64"
+ }, {
+ "key" : "java.vm.specification.name",
+ "value" : "Java Virtual Machine Specification"
+ }, {
+ "key" : "java.awt.printerjob",
+ "value" : "sun.print.PSPrinterJob"
+ }, {
+ "key" : "hadoop.policy.file",
+ "value" : "hadoop-policy.xml"
+ }, {
+ "key" : "sun.os.patch.level",
+ "value" : "unknown"
+ }, {
+ "key" : "proc_namenode",
+ "value" : ""
+ }, {
+ "key" : "java.library.path",
+ "value" : ":/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native:/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native"
+ }, {
+ "key" : "java.vm.info",
+ "value" : "mixed mode"
+ }, {
+ "key" : "java.vendor",
+ "value" : "Oracle Corporation"
+ }, {
+ "key" : "java.vm.version",
+ "value" : "25.73-b02"
+ }, {
+ "key" : "java.ext.dirs",
+ "value" : "/usr/java/jdk1.8.0_73/jre/lib/ext:/usr/java/packages/lib/ext"
+ }, {
+ "key" : "sun.io.unicode.encoding",
+ "value" : "UnicodeLittle"
+ }, {
+ "key" : "java.class.version",
+ "value" : "52.0"
+ } ],
+ "BootClassPath" : "/usr/java/jdk1.8.0_73/jre/lib/resources.jar:/usr/java/jdk1.8.0_73/jre/lib/rt.jar:/usr/java/jdk1.8.0_73/jre/lib/sunrsasign.jar:/usr/java/jdk1.8.0_73/jre/lib/jsse.jar:/usr/java/jdk1.8.0_73/jre/lib/jce.jar:/usr/java/jdk1.8.0_73/jre/lib/charsets.jar:/usr/java/jdk1.8.0_73/jre/lib/jfr.jar:/usr/java/jdk1.8.0_73/jre/classes",
+ "LibraryPath" : ":/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native:/usr/hdp/2.3.5.1-68/hadoop/lib/native/Linux-amd64-64:/usr/hdp/2.3.5.1-68/hadoop/lib/native",
+ "Uptime" : 232798102,
+ "VmName" : "Java HotSpot(TM) 64-Bit Server VM",
+ "VmVendor" : "Oracle Corporation",
+ "VmVersion" : "25.73-b02",
+ "BootClassPathSupported" : true,
+ "StartTime" : 1458337608748,
+ "Name" : "17447@m1.hdp.local",
+ "ClassPath" :
"/usr/hdp/current/hadoop-client/conf:/usr/hdp/2.3.5.1-68/hadoop/lib/ojdbc6.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-hdfs-plugin-shim-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-plugin-classloader-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/ranger-yarn-plugin-shim-0.5.0.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/mockito-all-1.8.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/aws-java-sdk-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/azure-storage-2.2.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-
68/hadoop/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/slf4j-log4j12-1.7.10.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/zookeeper-3.4.6.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/hadoop/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68
/hadoop/.//hadoop-annotations-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-annotations.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-auth-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-auth.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-aws.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-azure.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common-tests.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-common.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-nfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop/.//hadoop-nfs.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/./:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/netty-all-4.0.23.Final.jar:/usr/hdp/2.3.5.1-68/
hadoop-hdfs/lib/okhttp-2.4.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/okio-1.4.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-nfs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-nfs.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-hdfs/.//hadoop-hdfs.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/objenesis-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/l
ib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/zookeeper-3.4.6.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/fst-2.24.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guice-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/zookeeper-3.4.6.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-annotations-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson
-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-databind-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/javassist-3.18.1-GA.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/javax.inject-1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-api-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-api.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-distributedshell-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-distributedshell.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-client-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-client.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-common.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-registry-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-registry.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-applicationhistoryservice.jar
:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-common.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-nodemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-nodemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-resourcemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-resourcemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-sharedcachemanager.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-tests-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-timeline-plugins.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-web-proxy-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-yarn/.//hadoop-yarn-server-web-proxy.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/guice-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-ma
preduce/lib/junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/lib/xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jaxb-impl-2.2.3-1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//activation-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-streaming-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//apacheds-i18n-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//apacheds-kerberos-codec-2.0.0-M15.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hamcrest-core-1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//api-asn1-api-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-streaming.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//api-util-1.0.0-M20.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//netty-3.6.2.Final.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//asm-3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-core-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//avro-1.7.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//httpcore-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-beanutils-1.7.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-beanutils-core-1.8.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//htrace-core-3.1.0-incubating.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs.jar:/usr/hdp/2.3.5.1-68/hadoop-
mapreduce/.//commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//httpclient-4.2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-compress-1.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-configuration-1.6.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-core-2.2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-digester-1.8.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-core-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-httpclient-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jersey-server-1.9.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jets3t-0.9.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-jaxrs-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-lang3-3.3.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//java-xmlbuilder-0.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-logging-1.1.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-mapper-asl-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jettison-1.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//commons-net-3.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jackson-xc-1.9.13.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-client-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jaxb-api-2.2.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-framework-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//paranamer-2.3.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//curator-recipes-2.7.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//gson-2.2.4.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop
-mapreduce-client-jobclient.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-ant-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//joda-time-2.9.2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-ant.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-archives-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsch-0.1.42.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-archives.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-examples.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-auth-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsp-api-2.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-auth.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-openstack-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-datajoin-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-datajoin.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-openstack.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-distcp-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//junit-4.11.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-distcp.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-rumen-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-extras-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//mockito-all-1.8.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-extras.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-rumen.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//xz-1.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-gridmix-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//log4j-1.2.17.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-gridmix.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-app-2.7.1.2.3
.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-app.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-core.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-sls-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-common.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//metrics-core-3.0.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-sls.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-hs-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.3.5.1-68-tests.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-jobclient-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-client-shuffle-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//hadoop-mapreduce-examples-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//snappy-java-1.0.4.1.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//stax-api-1.0-2.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//xmlenc-0.52.jar:/usr/hdp/2.3.5.1-68/hadoop-mapreduce/.//zookeeper-3.4.6.2.3.5.1-68.jar:::/usr/share/java/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-hdp-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-api-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-common-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-dag-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-examples-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-ext-service-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-history-parser-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-javadoc-tools-0
.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-job-analyzer-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-mapreduce-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-internals-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-library-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-cache-plugin-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-acls-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-fs-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/async-http-client-1.8.16.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/tez/lib/slf4j-api-1.7
.10.jar:/usr/hdp/2.3.5.1-68/tez/conf::/usr/share/java/mysql-connector-java.jar::/usr/share/java/mysql-connector-java.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/hadoop-shim-hdp-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-api-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-common-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-dag-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-examples-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-ext-service-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-history-parser-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-javadoc-tools-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-job-analyzer-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-mapreduce-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-internals-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-runtime-library-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-tests-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-cache-plugin-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-acls-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/tez-yarn-timeline-history-with-fs-0.8.2.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/async-http-client-1.8.16.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-cli-1.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-codec-1.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-collections4-4.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-io-2.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-lang-2.6.jar:/usr/hdp/2.3.5.1-68/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/2.3.5.1-68/tez/lib/guava-11.0.2.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-aws-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-azure-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-mapreduce-client-common-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.
1-68/tez/lib/hadoop-mapreduce-client-core-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/hadoop-yarn-server-timeline-plugins-2.7.1.2.3.5.1-68.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-client-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jersey-json-1.9.jar:/usr/hdp/2.3.5.1-68/tez/lib/jettison-1.3.4.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jetty-util-6.1.26.hwx.jar:/usr/hdp/2.3.5.1-68/tez/lib/jsr305-3.0.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/2.3.5.1-68/tez/lib/servlet-api-2.5.jar:/usr/hdp/2.3.5.1-68/tez/lib/slf4j-api-1.7.10.jar:/usr/hdp/2.3.5.1-68/tez/conf", + "ObjectName" : "java.lang:type=Runtime" + }, { + "name" : "Hadoop:service=NameNode,name=StartupProgress", + "modelerType" : "StartupProgress", + "tag.Hostname" : "m1.hdp.local", + "ElapsedTime" : 43115, + "PercentComplete" : 1.0, + "LoadingFsImageCount" : 0, + "LoadingFsImageElapsedTime" : 1013, + "LoadingFsImageTotal" : 0, + "LoadingFsImagePercentComplete" : 1.0, + "LoadingEditsCount" : 781, + "LoadingEditsElapsedTime" : 139, + "LoadingEditsTotal" : 781, + "LoadingEditsPercentComplete" : 1.0, + "SavingCheckpointCount" : 0, + "SavingCheckpointElapsedTime" : 0, + "SavingCheckpointTotal" : 0, + "SavingCheckpointPercentComplete" : 1.0, + "SafeModeCount" : 7736, + "SafeModeElapsedTime" : 36295, + "SafeModeTotal" : 7741, + "SafeModePercentComplete" : 1.0 + }, { + "name" : "java.lang:type=Threading", + "modelerType" : "sun.management.ThreadImpl", + "ThreadAllocatedMemoryEnabled" : true, + "ThreadAllocatedMemorySupported" : true, + "DaemonThreadCount" : 143, + "PeakThreadCount" : 167, + "CurrentThreadCpuTimeSupported" : true, + "ObjectMonitorUsageSupported" : true, + "SynchronizerUsageSupported" : true, + "ThreadContentionMonitoringSupported" : true, + "ThreadCpuTimeEnabled" : true, + "CurrentThreadCpuTime" : 99831384, + "CurrentThreadUserTime" : 90000000, + "TotalStartedThreadCount" : 50816, + 
"ThreadCpuTimeSupported" : true, + "ThreadCount" : 148, + "ThreadContentionMonitoringEnabled" : false, + "AllThreadIds" : [ 50826, 50825, 50824, 50822, 50821, 50820, 50819, 50817, 50818, 50816, 50815, 50810, 50801, 50800, 50763, 205, 204, 202, 201, 191, 155, 154, 153, 152, 151, 150, 149, 148, 147, 146, 145, 144, 143, 142, 141, 140, 139, 138, 137, 136, 135, 134, 133, 132, 131, 130, 129, 128, 127, 126, 125, 124, 123, 122, 121, 120, 119, 118, 117, 116, 115, 114, 113, 112, 111, 110, 109, 108, 107, 106, 105, 104, 103, 102, 101, 100, 99, 98, 97, 96, 95, 94, 93, 92, 91, 90, 89, 88, 87, 86, 85, 84, 83, 82, 81, 80, 79, 78, 77, 76, 75, 74, 73, 72, 71, 70, 69, 68, 67, 66, 65, 64, 63, 62, 61, 60, 59, 58, 57, 56, 55, 38, 41, 54, 52, 51, 50, 49, 48, 47, 45, 24, 25, 44, 43, 40, 39, 23, 22, 21, 19, 17, 16, 15, 5, 3, 2, 1 ], + "ObjectName" : "java.lang:type=Threading" + }, { + "name" : "java.lang:type=OperatingSystem", + "modelerType" : "sun.management.OperatingSystemImpl", + "MaxFileDescriptorCount" : 128000, + "OpenFileDescriptorCount" : 432, + "CommittedVirtualMemorySize" : 3236540416, + "FreePhysicalMemorySize" : 3616194560, + "FreeSwapSpaceSize" : 8321495040, + "ProcessCpuLoad" : 0.01906519065190652, + "ProcessCpuTime" : 3690090000000, + "SystemCpuLoad" : 0.03381494005533354, + "TotalPhysicalMemorySize" : 16472367104, + "TotalSwapSpaceSize" : 8321495040, + "Arch" : "amd64", + "SystemLoadAverage" : 0.11, + "AvailableProcessors" : 4, + "Version" : "3.10.0-229.el7.x86_64", + "Name" : "Linux", + "ObjectName" : "java.lang:type=OperatingSystem" + }, { + "name" : "Hadoop:service=NameNode,name=FSNamesystem", + "modelerType" : "FSNamesystem", + "tag.Context" : "dfs", + "tag.HAState" : "standby", + "tag.TotalSyncTimes" : "", + "tag.Hostname" : "m1.hdp.local", + "MissingBlocks" : 0, + "MissingReplOneBlocks" : 0, + "ExpiredHeartbeats" : 0, + "TransactionsSinceLastCheckpoint" : -9666, + "TransactionsSinceLastLogRoll" : 0, + "LastWrittenTransactionId" : 2086406, + "LastCheckpointTime" : 
1458553696292, + "CapacityTotal" : 1254254346240, + "CapacityTotalGB" : 1168.0, + "CapacityUsed" : 43499128218, + "CapacityUsedGB" : 41.0, + "CapacityRemaining" : 1203736229944, + "CapacityRemainingGB" : 1121.0, + "CapacityUsedNonDFS" : 7018988078, + "TotalLoad" : 36, + "SnapshottableDirectories" : 4, + "Snapshots" : 8, + "LockQueueLength" : 0, + "BlocksTotal" : 7818, + "NumFilesUnderConstruction" : 7, + "NumActiveClients" : 6, + "FilesTotal" : 9560, + "PendingReplicationBlocks" : 44, + "UnderReplicatedBlocks" : 7687, + "CorruptBlocks" : 0, + "ScheduledReplicationBlocks" : 4, + "PendingDeletionBlocks" : 0, + "ExcessBlocks" : 0, + "PostponedMisreplicatedBlocks" : 0, + "PendingDataNodeMessageCount" : 0, + "MillisSinceLastLoadedEdits" : 86596, + "BlockCapacity" : 2097152, + "StaleDataNodes" : 0, + "TotalFiles" : 9560, + "TotalSyncCount" : 0 + }, { + "name" : "java.lang:type=MemoryPool,name=Code Cache", + "modelerType" : "sun.management.MemoryPoolImpl", + "CollectionUsage" : null, + "MemoryManagerNames" : [ "CodeCacheManager" ], + "PeakUsage" : { + "committed" : 35192832, + "init" : 2555904, + "max" : 251658240, + "used" : 34810944 + }, + "Usage" : { + "committed" : 35192832, + "init" : 2555904, + "max" : 251658240, + "used" : 34808000 + }, + "UsageThreshold" : 0, + "UsageThresholdCount" : 0, + "CollectionUsageThresholdSupported" : false, + "UsageThresholdExceeded" : false, + "UsageThresholdSupported" : true, + "Valid" : true, + "Name" : "Code Cache", + "Type" : "NON_HEAP", + "ObjectName" : "java.lang:type=MemoryPool,name=Code Cache" + }, { + "name" : "Hadoop:service=NameNode,name=IPCLoggerChannel-10.0.0.161-8485", + "modelerType" : "IPCLoggerChannel-10.0.0.161-8485", + "tag.Context" : "dfs", + "tag.IsOutOfSync" : "false", + "tag.Hostname" : "m1.hdp.local", + "QueuedEditsSize" : 0, + "LagTimeMillis" : 0, + "CurrentLagTxns" : 0 + }, { + "name" : "java.nio:type=BufferPool,name=direct", + "modelerType" : "sun.management.ManagementFactoryHelper$1", + "MemoryUsed" : 
7988074, + "TotalCapacity" : 7988074, + "Count" : 316, + "Name" : "direct", + "ObjectName" : "java.nio:type=BufferPool,name=direct" + }, { + "name" : "java.lang:type=Compilation", + "modelerType" : "sun.management.CompilationImpl", + "TotalCompilationTime" : 99907, + "CompilationTimeMonitoringSupported" : true, + "Name" : "HotSpot 64-Bit Tiered Compilers", + "ObjectName" : "java.lang:type=Compilation" + }, { + "name" : "java.lang:type=MemoryManager,name=CodeCacheManager", + "modelerType" : "sun.management.MemoryManagerImpl", + "MemoryPoolNames" : [ "Code Cache" ], + "Valid" : true, + "Name" : "CodeCacheManager", + "ObjectName" : "java.lang:type=MemoryManager,name=CodeCacheManager" + }, { + "name" : "java.util.logging:type=Logging", + "modelerType" : "sun.management.ManagementFactoryHelper$PlatformLoggingImpl", + "ObjectName" : "java.util.logging:type=Logging", + "LoggerNames" : [ "com.sun.jersey.server.impl.resource.PerRequestFactory", "com.sun.jersey.api.client.RequestWriter", "global", "com.sun.jersey.api.core.ScanningResourceConfig", "com.sun.jersey.spi.container.ContainerRequest", "com.sun.jersey.server.impl.wadl.WadlApplicationContextImpl", "com.sun.jersey.server.impl.application.ResourceMethodDispatcherFactory", "com.sun.jersey.server.impl.monitoring.GlassFishMonitoringInitializer", "com.sun.jersey.spi.container.servlet.WebComponent", "javax.management.monitor", "com.sun.jersey.api.client.Client", "com.sun.jersey.server.impl.container.filter.FilterFactory", "com.google.common.cache.LocalCache", "com.sun.xml.bind.v2.runtime.ClassBeanInfoImpl", "com.sun.xml.bind.v2.runtime.JaxBeanInfo", "com.sun.jersey.server.impl.cdi.CDIComponentProviderFactoryInitializer", "com.sun.jersey.api.core.PackagesResourceConfig", "javax.management", "com.sun.jersey.server.impl.ejb.EJBComponentProviderFactoryInitilizer", "javax.management.snmp", "com.google.common.util.concurrent.ExecutionList", "javax.management.notification", 
"com.sun.jersey.server.impl.managedbeans.ManagedBeanComponentProviderFactoryInitilizer", "javax.management.snmp.daemon", "com.sun.jersey.core.impl.provider.xml.SAXParserContextProvider", "com.sun.xml.bind.v2.runtime.reflect.opt.OptimizedAccessorFactory", "com.sun.xml.bind.v2.ContextFactory", "com.sun.jersey.server.impl.wadl.WadlResource", "com.sun.jersey.json.impl.provider.entity.JSONWithPaddingProvider", "javax.management.modelmbean", "com.sun.xml.bind.v2.ClassFactory", "com.sun.jersey.server.impl.application.CloseableServiceFactory", "sun.net.www.protocol.http.HttpURLConnection", "com.google.common.util.concurrent.UncaughtExceptionHandlers$Exiter", "com.sun.jersey.server.impl.resource.SingletonFactory", "com.sun.xml.bind.v2.runtime.reflect.opt.OptimizedTransducedAccessorFactory", "javax.management.misc", "com.sun.xml.bind.v2.runtime.reflect.opt.AccessorInjector", "com.sun.jersey.spi.container.ContainerResponse", "javax.management.mlet", "com.sun.jersey.core.impl.provider.entity.EntityHolderReader", "javax.xml.bind", "com.sun.jersey.core.spi.component.ProviderServices", "javax.management.timer", "com.sun.xml.bind.v2.runtime.reflect.Accessor$FieldReflection", "com.sun.jersey.server.impl.application.RootResourceUriRules", "com.sun.jersey.api.core.ResourceConfig", "com.sun.xml.bind.v2.runtime.reflect.opt.Injector", "com.sun.jersey.api.wadl.config.WadlGeneratorLoader", "com.google.common.collect.MapMakerInternalMap", "com.sun.jersey.server.impl.application.WebApplicationImpl", "com.sun.jersey.server.impl.model.method.dispatch.MultipartFormDispatchProvider", "javax.management.relation", "com.sun.jersey.spi.service.ServiceFinder", "com.google.common.cache.CacheBuilder", "com.sun.jersey.api.client.ClientResponse", "com.sun.xml.bind.v2.bytecode.ClassTailor", "com.sun.jersey.server.wadl.generators.WadlGeneratorJAXBGrammarGenerator", "com.sun.jersey.server.impl.modelapi.annotation.IntrospectionModeller", "com.sun.jersey.server.impl.wadl.WadlFactory", 
"com.sun.jersey.core.impl.provider.xml.DocumentBuilderFactoryProvider", "com.sun.jersey.core.spi.component.ProviderFactory", "", "com.sun.jersey.spi.inject.Errors", "javax.management.mbeanserver" ] + }, { + "name" : "java.lang:type=ClassLoading", + "modelerType" : "sun.management.ClassLoadingImpl", + "LoadedClassCount" : 9319, + "UnloadedClassCount" : 4, + "TotalLoadedClassCount" : 9323, + "Verbose" : false, + "ObjectName" : "java.lang:type=ClassLoading" + }, { + "name" : "java.lang:type=MemoryManager,name=Metaspace Manager", + "modelerType" : "sun.management.MemoryManagerImpl", + "MemoryPoolNames" : [ "Metaspace", "Compressed Class Space" ], + "Valid" : true, + "Name" : "Metaspace Manager", + "ObjectName" : "java.lang:type=MemoryManager,name=Metaspace Manager" + }, { + "name" : "Hadoop:service=NameNode,name=RpcDetailedActivityForPort8020", + "modelerType" : "RpcDetailedActivityForPort8020", + "tag.port" : "8020", + "tag.Context" : "rpcdetailed", + "tag.Hostname" : "m1.hdp.local", + "VersionRequestNumOps" : 3, + "VersionRequestAvgTime" : 8.0, + "GetServiceStatusNumOps" : 232360, + "GetServiceStatusAvgTime" : 0.0, + "MonitorHealthNumOps" : 232352, + "MonitorHealthAvgTime" : 0.0, + "StandbyExceptionNumOps" : 1304, + "StandbyExceptionAvgTime" : 0.0, + "RegisterDatanodeNumOps" : 3, + "RegisterDatanodeAvgTime" : 58.33333333333333, + "SendHeartbeatNumOps" : 232777, + "SendHeartbeatAvgTime" : 0.11111111111111113, + "TransitionToActiveNumOps" : 1, + "TransitionToActiveAvgTime" : 2080.0, + "RetriableExceptionNumOps" : 20, + "RetriableExceptionAvgTime" : 0.6666666666666667, + "GetListingNumOps" : 40, + "GetListingAvgTime" : 1.0, + "BlockReportNumOps" : 36, + "BlockReportAvgTime" : 3.0, + "GetFileInfoNumOps" : 8, + "GetFileInfoAvgTime" : 1.25, + "SetSafeModeNumOps" : 4, + "SetSafeModeAvgTime" : 0.0, + "RenewLeaseNumOps" : 4, + "RenewLeaseAvgTime" : 0.0, + "CreateNumOps" : 1, + "CreateAvgTime" : 44.0, + "TransitionToStandbyNumOps" : 2, + "TransitionToStandbyAvgTime" : 1.0, + 
"BlockReceivedAndDeletedNumOps" : 989, + "BlockReceivedAndDeletedAvgTime" : 0.25000000000000006 + }, { + "name" : "Hadoop:service=NameNode,name=MetricsSystem,sub=Stats", + "modelerType" : "MetricsSystem,sub=Stats", + "tag.Context" : "metricssystem", + "tag.Hostname" : "m1.hdp.local", + "NumActiveSources" : 11, + "NumAllSources" : 11, + "NumActiveSinks" : 1, + "NumAllSinks" : 0, + "Sink_timelineNumOps" : 3879, + "Sink_timelineAvgTime" : 44.0, + "Sink_timelineDropped" : 0, + "Sink_timelineQsize" : 0, + "SnapshotNumOps" : 46548, + "SnapshotAvgTime" : 0.0, + "PublishNumOps" : 3879, + "PublishAvgTime" : 0.0, + "DroppedPubAll" : 0 + }, { + "name" : "Hadoop:service=NameNode,name=IPCLoggerChannel-10.0.0.162-8485", + "modelerType" : "IPCLoggerChannel-10.0.0.162-8485", + "tag.Context" : "dfs", + "tag.IsOutOfSync" : "false", + "tag.Hostname" : "m1.hdp.local", + "QueuedEditsSize" : 0, + "LagTimeMillis" : 0, + "CurrentLagTxns" : 0 + }, { + "name" : "Hadoop:service=NameNode,name=MetricsSystem,sub=Control", + "modelerType" : "org.apache.hadoop.metrics2.impl.MetricsSystemImpl" + }, { + "name" : "java.lang:type=MemoryPool,name=Metaspace", + "modelerType" : "sun.management.MemoryPoolImpl", + "CollectionUsage" : null, + "MemoryManagerNames" : [ "Metaspace Manager" ], + "PeakUsage" : { + "committed" : 66482176, + "init" : 0, + "max" : -1, + "used" : 64996216 + }, + "Usage" : { + "committed" : 66482176, + "init" : 0, + "max" : -1, + "used" : 64996216 + }, + "UsageThreshold" : 0, + "UsageThresholdCount" : 0, + "CollectionUsageThresholdSupported" : false, + "UsageThresholdExceeded" : false, + "UsageThresholdSupported" : true, + "Valid" : true, + "Name" : "Metaspace", + "Type" : "NON_HEAP", + "ObjectName" : "java.lang:type=MemoryPool,name=Metaspace" + }, { + "name" : "Hadoop:service=NameNode,name=NameNodeActivity", + "modelerType" : "NameNodeActivity", + "tag.ProcessName" : "NameNode", + "tag.SessionId" : null, + "tag.Context" : "dfs", + "tag.Hostname" : "m1.hdp.local", + "CreateFileOps" 
: 1, + "FilesCreated" : 1, + "FilesAppended" : 0, + "GetBlockLocations" : 0, + "FilesRenamed" : 0, + "FilesTruncated" : 0, + "GetListingOps" : 32, + "DeleteFileOps" : 0, + "FilesDeleted" : 0, + "FileInfoOps" : 1305, + "AddBlockOps" : 0, + "GetAdditionalDatanodeOps" : 0, + "CreateSymlinkOps" : 0, + "GetLinkTargetOps" : 0, + "FilesInGetListingOps" : 81, + "AllowSnapshotOps" : 0, + "DisallowSnapshotOps" : 0, + "CreateSnapshotOps" : 0, + "DeleteSnapshotOps" : 0, + "RenameSnapshotOps" : 0, + "ListSnapshottableDirOps" : 0, + "SnapshotDiffReportOps" : 0, + "BlockReceivedAndDeletedOps" : 989, + "StorageBlockReportOps" : 36, + "TransactionsNumOps" : 3, + "TransactionsAvgTime" : 1.0, + "SyncsNumOps" : 10, + "SyncsAvgTime" : 17.0, + "TransactionsBatchedInSync" : 0, + "BlockReportNumOps" : 36, + "BlockReportAvgTime" : 3.0, + "CacheReportNumOps" : 0, + "CacheReportAvgTime" : 0.0, + "SafeModeTime" : 43871, + "FsImageLoadTime" : 6785, + "GetEditNumOps" : 0, + "GetEditAvgTime" : 0.0, + "GetImageNumOps" : 0, + "GetImageAvgTime" : 0.0, + "PutImageNumOps" : 0, + "PutImageAvgTime" : 0.0, + "TotalFileOps" : 1338 + }, { + "name" : "java.lang:type=MemoryPool,name=Par Eden Space", + "modelerType" : "sun.management.MemoryPoolImpl", + "CollectionUsage" : { + "committed" : 107479040, + "init" : 107479040, + "max" : 107479040, + "used" : 0 + }, + "CollectionUsageThreshold" : 0, + "CollectionUsageThresholdCount" : 0, + "MemoryManagerNames" : [ "ConcurrentMarkSweep", "ParNew" ], + "PeakUsage" : { + "committed" : 107479040, + "init" : 107479040, + "max" : 107479040, + "used" : 107479040 + }, + "Usage" : { + "committed" : 107479040, + "init" : 107479040, + "max" : 107479040, + "used" : 42931304 + }, + "CollectionUsageThresholdExceeded" : false, + "CollectionUsageThresholdSupported" : true, + "UsageThresholdSupported" : false, + "Valid" : true, + "Name" : "Par Eden Space", + "Type" : "HEAP", + "ObjectName" : "java.lang:type=MemoryPool,name=Par Eden Space" + }, { + "name" : 
"java.lang:type=GarbageCollector,name=ParNew", + "modelerType" : "sun.management.GarbageCollectorImpl", + "LastGcInfo" : { + "GcThreadCount" : 11, + "duration" : 5, + "endTime" : 232710394, + "id" : 806, + "memoryUsageAfterGc" : [ { + "key" : "Par Survivor Space", + "value" : { + "committed" : 13369344, + "init" : 13369344, + "max" : 13369344, + "used" : 1851568 + } + }, { + "key" : "Compressed Class Space", + "value" : { + "committed" : 7372800, + "init" : 0, + "max" : 1073741824, + "used" : 7071000 + } + }, { + "key" : "Metaspace", + "value" : { + "committed" : 65826816, + "init" : 0, + "max" : -1, + "used" : 64423048 + } + }, { + "key" : "Code Cache", + "value" : { + "committed" : 34996224, + "init" : 2555904, + "max" : 251658240, + "used" : 34488192 + } + }, { + "key" : "Par Eden Space", + "value" : { + "committed" : 107479040, + "init" : 107479040, + "max" : 107479040, + "used" : 0 + } + }, { + "key" : "CMS Old Gen", + "value" : { + "committed" : 939524096, + "init" : 939524096, + "max" : 939524096, + "used" : 83181712 + } + } ], + "memoryUsageBeforeGc" : [ { + "key" : "Par Survivor Space", + "value" : { + "committed" : 13369344, + "init" : 13369344, + "max" : 13369344, + "used" : 1731144 + } + }, { + "key" : "Compressed Class Space", + "value" : { + "committed" : 7372800, + "init" : 0, + "max" : 1073741824, + "used" : 7071000 + } + }, { + "key" : "Metaspace", + "value" : { + "committed" : 65826816, + "init" : 0, + "max" : -1, + "used" : 64423048 + } + }, { + "key" : "Code Cache", + "value" : { + "committed" : 34996224, + "init" : 2555904, + "max" : 251658240, + "used" : 34488192 + } + }, { + "key" : "Par Eden Space", + "value" : { + "committed" : 107479040, + "init" : 107479040, + "max" : 107479040, + "used" : 107479032 + } + }, { + "key" : "CMS Old Gen", + "value" : { + "committed" : 939524096, + "init" : 939524096, + "max" : 939524096, + "used" : 83181712 + } + } ], + "startTime" : 232710389 + }, + "CollectionCount" : 806, + "CollectionTime" : 9556, + 
"MemoryPoolNames" : [ "Par Eden Space", "Par Survivor Space" ], + "Valid" : true, + "Name" : "ParNew", + "ObjectName" : "java.lang:type=GarbageCollector,name=ParNew" + }, { + "name" : "java.lang:type=GarbageCollector,name=ConcurrentMarkSweep", + "modelerType" : "sun.management.GarbageCollectorImpl", + "LastGcInfo" : { + "GcThreadCount" : 11, + "duration" : 773, + "endTime" : 11891, + "id" : 2, + "memoryUsageAfterGc" : [ { + "key" : "Par Survivor Space", + "value" : { + "committed" : 13369344, + "init" : 13369344, + "max" : 13369344, + "used" : 13369344 + } + }, { + "key" : "Compressed Class Space", + "value" : { + "committed" : 5406720, + "init" : 0, + "max" : 1073741824, + "used" : 5233200 + } + }, { + "key" : "Metaspace", + "value" : { + "committed" : 46452736, + "init" : 0, + "max" : -1, + "used" : 45552032 + } + }, { + "key" : "Code Cache", + "value" : { + "committed" : 11206656, + "init" : 2555904, + "max" : 251658240, + "used" : 10636352 + } + }, { + "key" : "Par Eden Space", + "value" : { + "committed" : 107479040, + "init" : 107479040, + "max" : 107479040, + "used" : 58201664 + } + }, { + "key" : "CMS Old Gen", + "value" : { + "committed" : 939524096, + "init" : 939524096, + "max" : 939524096, + "used" : 35747536 + } + } ], + "memoryUsageBeforeGc" : [ { + "key" : "Par Survivor Space", + "value" : { + "committed" : 13369344, + "init" : 13369344, + "max" : 13369344, + "used" : 13369344 + } + }, { + "key" : "Compressed Class Space", + "value" : { + "committed" : 5251072, + "init" : 0, + "max" : 1073741824, + "used" : 5080856 + } + }, { + "key" : "Metaspace", + "value" : { + "committed" : 44752896, + "init" : 0, + "max" : -1, + "used" : 43891832 + } + }, { + "key" : "Code Cache", + "value" : { + "committed" : 11141120, + "init" : 2555904, + "max" : 251658240, + "used" : 10064000 + } + }, { + "key" : "Par Eden Space", + "value" : { + "committed" : 107479040, + "init" : 107479040, + "max" : 107479040, + "used" : 1982896 + } + }, { + "key" : "CMS Old Gen", + 
"value" : { + "committed" : 939524096, + "init" : 939524096, + "max" : 939524096, + "used" : 37478184 + } + } ], + "startTime" : 11118 + }, + "CollectionCount" : 2, + "CollectionTime" : 93, + "MemoryPoolNames" : [ "Par Eden Space", "Par Survivor Space", "CMS Old Gen" ], + "Valid" : true, + "Name" : "ConcurrentMarkSweep", + "ObjectName" : "java.lang:type=GarbageCollector,name=ConcurrentMarkSweep" + }, { + "name" : "Hadoop:service=NameNode,name=NameNodeStatus", + "modelerType" : "org.apache.hadoop.hdfs.server.namenode.NameNode", + "NNRole" : "NameNode", + "HostAndPort" : "m1.hdp.local:8020", + "SecurityEnabled" : false, + "LastHATransitionTime" : 1458337688158, + "BytesWithFutureGenerationStamps" : 0, + "State" : "standby" + }, { + "name" : "Hadoop:service=NameNode,name=NameNodeInfo", + "modelerType" : "org.apache.hadoop.hdfs.server.namenode.FSNamesystem", + "Total" : 1254254346240, + "UpgradeFinalized" : true, + "ClusterId" : "CID-b255ee79-e4f1-44a8-b134-044c25d7bfd4", + "Version" : "2.7.1.2.3.5.1-68, rfe3c6b6dd1526d3c46f61a2e8fab9bb5eb649989", + "Used" : 43499128218, + "Free" : 1203736229944, + "Safemode" : "", + "NonDfsUsedSpace" : 7018988078, + "PercentUsed" : 3.4681265, + "BlockPoolUsedSpace" : 43499128218, + "PercentBlockPoolUsed" : 3.4681265, + "PercentRemaining" : 95.97227, + "CacheCapacity" : 0, + "CacheUsed" : 0, + "TotalBlocks" : 7818, + "TotalFiles" : 9560, + "NumberOfMissingBlocks" : 0, + "NumberOfMissingBlocksWithReplicationFactorOne" : 0, + "LiveNodes" : "{\"d2.hdp.local:50010\":{\"infoAddr\":\"10.0.0.166:50075\",\"infoSecureAddr\":\"10.0.0.166:0\",\"xferaddr\":\"10.0.0.166:50010\",\"lastContact\":0,\"usedSpace\":14314893790,\"adminState\":\"In 
Service\",\"nonDfsUsedSpace\":2078859450,\"capacity\":418084782080,\"numBlocks\":7799,\"version\":\"2.7.1.2.3.5.1-68\",\"used\":14314893790,\"remaining\":401691028840,\"blockScheduled\":0,\"blockPoolUsed\":14314893790,\"blockPoolUsedPercent\":3.423921,\"volfails\":0},\"d1.hdp.local:50010\":{\"infoAddr\":\"10.0.0.165:50075\",\"infoSecureAddr\":\"10.0.0.165:0\",\"xferaddr\":\"10.0.0.165:50010\",\"lastContact\":0,\"usedSpace\":14858473950,\"adminState\":\"In Service\",\"nonDfsUsedSpace\":2292039866,\"capacity\":418084782080,\"numBlocks\":7804,\"version\":\"2.7.1.2.3.5.1-68\",\"used\":14858473950,\"remaining\":400934268264,\"blockScheduled\":0,\"blockPoolUsed\":14858473950,\"blockPoolUsedPercent\":3.553938,\"volfails\":0},\"d3.hdp.local:50010\":{\"infoAddr\":\"10.0.0.167:50075\",\"infoSecureAddr\":\"10.0.0.167:0\",\"xferaddr\":\"10.0.0.167:50010\",\"lastContact\":0,\"usedSpace\":14325760478,\"adminState\":\"In Service\",\"nonDfsUsedSpace\":2648088762,\"capacity\":418084782080,\"numBlocks\":7800,\"version\":\"2.7.1.2.3.5.1-68\",\"used\":14325760478,\"remaining\":401110932840,\"blockScheduled\":0,\"blockPoolUsed\":14325760478,\"blockPoolUsedPercent\":3.4265206,\"volfails\":0}}", + "DeadNodes" : "{}", + "DecomNodes" : "{}", + "BlockPoolId" : "BP-331388818-10.0.0.160-1443469980189", + "NameDirStatuses" : "{\"active\":{\"/data/hadoop/hdfs/namenode\":\"IMAGE_AND_EDITS\"},\"failed\":{}}", + "NodeUsage" : "{\"nodeUsage\":{\"min\":\"3.42%\",\"median\":\"3.43%\",\"max\":\"3.55%\",\"stdDev\":\"0.06%\"}}", + "NameJournalStatus" : "[{\"manager\":\"QJM to [10.0.0.160:8485, 10.0.0.161:8485, 10.0.0.162:8485]\",\"stream\":\"open for read\",\"disabled\":\"false\",\"required\":\"true\"}]", + "JournalTransactionInfo" : "{\"MostRecentCheckpointTxId\":\"2096072\",\"LastAppliedOrWrittenTxId\":\"2096831\"}", + "NNStarted" : "Fri Mar 18 17:46:50 EDT 2016", + "CompileInfo" : "2016-03-10T05:05Z by jenkins from (HEAD detached at fe3c6b6)", + "CorruptFiles" : "[]", + "DistinctVersionCount" : 1, + 
"DistinctVersions" : [ { + "key" : "2.7.1.2.3.5.1-68", + "value" : 3 + } ], + "SoftwareVersion" : "2.7.1.2.3.5.1-68", + "RollingUpgradeStatus" : null, + "Threads" : 148 + }, { + "name" : "Hadoop:service=NameNode,name=SnapshotInfo", + "modelerType" : "org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager", + "SnapshottableDirectories" : [ { + "group" : "hdfs", + "modificationTime" : 1456355116813, + "owner" : "hive", + "path" : "/apps/hive/warehouse", + "permission" : 777, + "snapshotNumber" : 5, + "snapshotQuota" : 65536 + }, { + "group" : "hdfs", + "modificationTime" : 1449767160188, + "owner" : "fred", + "path" : "/user/fred", + "permission" : 700, + "snapshotNumber" : 0, + "snapshotQuota" : 65536 + }, { + "group" : "hdfs", + "modificationTime" : 1457370883059, + "owner" : "dstreev", + "path" : "/user/dstreev", + "permission" : 755, + "snapshotNumber" : 3, + "snapshotQuota" : 65536 + }, { + "group" : "hdfs", + "modificationTime" : 1456358248544, + "owner" : "dstreev", + "path" : "/tmp/dstreev", + "permission" : 700, + "snapshotNumber" : 0, + "snapshotQuota" : 65536 + } ], + "Snapshots" : [ { + "modificationTime" : 1456354755947, + "snapshotDirectory" : "/apps/hive/warehouse/.snapshot/2016-02-24-1500", + "snapshotID" : "2016-02-24-1500" + }, { + "modificationTime" : 1456355116813, + "snapshotDirectory" : "/apps/hive/warehouse/.snapshot/2016-02-24-1515", + "snapshotID" : "2016-02-24-1515" + }, { + "modificationTime" : 1456346696470, + "snapshotDirectory" : "/apps/hive/warehouse/.snapshot/s20160224-154300.702", + "snapshotID" : "s20160224-154300.702" + }, { + "modificationTime" : 1456346937065, + "snapshotDirectory" : "/apps/hive/warehouse/.snapshot/s20160224-154753.719", + "snapshotID" : "s20160224-154753.719" + }, { + "modificationTime" : 1456354515321, + "snapshotDirectory" : "/apps/hive/warehouse/.snapshot/s20160224-175513.358", + "snapshotID" : "s20160224-175513.358" + }, { + "modificationTime" : 1456355116813, + "snapshotDirectory" : 
"/user/dstreev/.snapshot/2016-02-24-1515", + "snapshotID" : "2016-02-24-1515" + }, { + "modificationTime" : 1456347057346, + "snapshotDirectory" : "/user/dstreev/.snapshot/s20160224-155029.640", + "snapshotID" : "s20160224-155029.640" + }, { + "modificationTime" : 1456347177621, + "snapshotDirectory" : "/user/dstreev/.snapshot/s20160224-155109.634", + "snapshotID" : "s20160224-155109.634" + } ] + }, { + "name" : "Hadoop:service=NameNode,name=BlockStats", + "modelerType" : "org.apache.hadoop.hdfs.server.blockmanagement.BlockManager", + "StorageTypeStats" : [ { + "key" : "DISK", + "value" : { + "blockPoolUsed" : 43499128218, + "capacityRemaining" : 1203736229944, + "capacityTotal" : 1254254346240, + "capacityUsed" : 43499128218, + "nodesInService" : 3 + } + } ] + }, { + "name" : "Hadoop:service=NameNode,name=IPCLoggerChannel-10.0.0.160-8485", + "modelerType" : "IPCLoggerChannel-10.0.0.160-8485", + "tag.Context" : "dfs", + "tag.IsOutOfSync" : "false", + "tag.Hostname" : "m1.hdp.local", + "QueuedEditsSize" : 0, + "LagTimeMillis" : 0, + "CurrentLagTxns" : 0 + }, { + "name" : "java.lang:type=MemoryPool,name=Compressed Class Space", + "modelerType" : "sun.management.MemoryPoolImpl", + "CollectionUsage" : null, + "MemoryManagerNames" : [ "Metaspace Manager" ], + "PeakUsage" : { + "committed" : 7503872, + "init" : 0, + "max" : 1073741824, + "used" : 7137088 + }, + "Usage" : { + "committed" : 7503872, + "init" : 0, + "max" : 1073741824, + "used" : 7137088 + }, + "UsageThreshold" : 0, + "UsageThresholdCount" : 0, + "CollectionUsageThresholdSupported" : false, + "UsageThresholdExceeded" : false, + "UsageThresholdSupported" : true, + "Valid" : true, + "Name" : "Compressed Class Space", + "Type" : "NON_HEAP", + "ObjectName" : "java.lang:type=MemoryPool,name=Compressed Class Space" + }, { + "name" : "java.lang:type=Memory", + "modelerType" : "sun.management.MemoryImpl", + "HeapMemoryUsage" : { + "committed" : 1060372480, + "init" : 1073741824, + "max" : 1060372480, + "used" : 
128128416
+    },
+    "NonHeapMemoryUsage" : {
+      "committed" : 109178880,
+      "init" : 2555904,
+      "max" : -1,
+      "used" : 106941304
+    },
+    "ObjectPendingFinalizationCount" : 0,
+    "Verbose" : true,
+    "ObjectName" : "java.lang:type=Memory"
+  }, {
+    "name" : "Hadoop:service=NameNode,name=FSNamesystemState",
+    "modelerType" : "org.apache.hadoop.hdfs.server.namenode.FSNamesystem",
+    "CapacityTotal" : 1254254346240,
+    "CapacityUsed" : 43499128218,
+    "CapacityRemaining" : 1203736229944,
+    "TotalLoad" : 36,
+    "SnapshotStats" : "{\"SnapshottableDirectories\":4,\"Snapshots\":8}",
+    "FsLockQueueLength" : 0,
+    "BlocksTotal" : 7818,
+    "MaxObjects" : 0,
+    "FilesTotal" : 9560,
+    "PendingReplicationBlocks" : 44,
+    "UnderReplicatedBlocks" : 7687,
+    "ScheduledReplicationBlocks" : 4,
+    "PendingDeletionBlocks" : 0,
+    "BlockDeletionStartTime" : 1458341210409,
+    "FSState" : "Operational",
+    "NumLiveDataNodes" : 3,
+    "NumDeadDataNodes" : 0,
+    "NumDecomLiveDataNodes" : 0,
+    "NumDecomDeadDataNodes" : 0,
+    "VolumeFailuresTotal" : 0,
+    "EstimatedCapacityLostTotal" : 0,
+    "NumDecommissioningDataNodes" : 0,
+    "NumStaleDataNodes" : 0,
+    "NumStaleStorages" : 0,
+    "TopUserOpCounts" : "{\"timestamp\":\"2016-03-21T10:26:46-0400\",\"windows\":[{\"windowLenMs\":60000,\"ops\":[]},{\"windowLenMs\":300000,\"ops\":[]},{\"windowLenMs\":1500000,\"ops\":[]}]}",
+    "TotalSyncCount" : 0,
+    "TotalSyncTimes" : ""
+  }, {
+    "name" : "java.nio:type=BufferPool,name=mapped",
+    "modelerType" : "sun.management.ManagementFactoryHelper$1",
+    "MemoryUsed" : 24812,
+    "TotalCapacity" : 24812,
+    "Count" : 3,
+    "Name" : "mapped",
+    "ObjectName" : "java.nio:type=BufferPool,name=mapped"
+  }, {
+    "name" : "java.lang:type=MemoryPool,name=Par Survivor Space",
+    "modelerType" : "sun.management.MemoryPoolImpl",
+    "CollectionUsage" : {
+      "committed" : 13369344,
+      "init" : 13369344,
+      "max" : 13369344,
+      "used" : 1851568
+    },
+    "CollectionUsageThreshold" : 0,
+    "CollectionUsageThresholdCount" : 0,
+    "MemoryManagerNames" : [ "ConcurrentMarkSweep", "ParNew" ],
+    "PeakUsage" : {
+      "committed" : 13369344,
+      "init" : 13369344,
+      "max" : 13369344,
+      "used" : 13369344
+    },
+    "Usage" : {
+      "committed" : 13369344,
+      "init" : 13369344,
+      "max" : 13369344,
+      "used" : 1851568
+    },
+    "CollectionUsageThresholdExceeded" : false,
+    "CollectionUsageThresholdSupported" : true,
+    "UsageThresholdSupported" : false,
+    "Valid" : true,
+    "Name" : "Par Survivor Space",
+    "Type" : "HEAP",
+    "ObjectName" : "java.lang:type=MemoryPool,name=Par Survivor Space"
+  }, {
+    "name" : "Hadoop:service=NameNode,name=RetryCache.NameNodeRetryCache",
+    "modelerType" : "RetryCache.NameNodeRetryCache",
+    "tag.Context" : "rpc",
+    "tag.Hostname" : "m1.hdp.local",
+    "CacheHit" : 7,
+    "CacheCleared" : 0,
+    "CacheUpdated" : 2297
+  }, {
+    "name" : "com.sun.management:type=DiagnosticCommand",
+    "modelerType" : "sun.management.DiagnosticCommandImpl"
+  }, {
+    "name" : "Hadoop:service=NameNode,name=RpcActivityForPort8020",
+    "modelerType" : "RpcActivityForPort8020",
+    "tag.port" : "8020",
+    "tag.Context" : "rpc",
+    "tag.Hostname" : "m1.hdp.local",
+    "ReceivedBytes" : 175348215,
+    "SentBytes" : 29317275,
+    "RpcQueueTimeNumOps" : 699904,
+    "RpcQueueTimeAvgTime" : 0.12000000000000002,
+    "RpcProcessingTimeNumOps" : 699904,
+    "RpcProcessingTimeAvgTime" : 0.04000000000000001,
+    "RpcAuthenticationFailures" : 0,
+    "RpcAuthenticationSuccesses" : 0,
+    "RpcAuthorizationFailures" : 2585,
+    "RpcAuthorizationSuccesses" : 1328,
+    "RpcClientBackoff" : 0,
+    "RpcSlowCalls" : 0,
+    "NumOpenConnections" : 4,
+    "CallQueueLength" : 0
+  }, {
+    "name" : "Hadoop:service=NameNode,name=UgiMetrics",
+    "modelerType" : "UgiMetrics",
+    "tag.Context" : "ugi",
+    "tag.Hostname" : "m1.hdp.local",
+    "LoginSuccessNumOps" : 0,
+    "LoginSuccessAvgTime" : 0.0,
+    "LoginFailureNumOps" : 0,
+    "LoginFailureAvgTime" : 0.0,
+    "GetGroupsNumOps" : 2071,
+    "GetGroupsAvgTime" : 2.0
+  }, {
+    "name" : "com.sun.management:type=HotSpotDiagnostic",
+    "modelerType" : "sun.management.HotSpotDiagnostic",
+    "DiagnosticOptions" : [ {
+      "name" : "HeapDumpBeforeFullGC",
+      "origin" : "DEFAULT",
+      "value" : "false",
+      "writeable" : true
+    }, {
+      "name" : "HeapDumpAfterFullGC",
+      "origin" : "DEFAULT",
+      "value" : "false",
+      "writeable" : true
+    }, {
+      "name" : "HeapDumpOnOutOfMemoryError",
+      "origin" : "DEFAULT",
+      "value" : "false",
+      "writeable" : true
+    }, {
+      "name" : "HeapDumpPath",
+      "origin" : "DEFAULT",
+      "value" : "",
+      "writeable" : true
+    }, {
+      "name" : "CMSAbortablePrecleanWaitMillis",
+      "origin" : "DEFAULT",
+      "value" : "100",
+      "writeable" : true
+    }, {
+      "name" : "CMSWaitDuration",
+      "origin" : "DEFAULT",
+      "value" : "2000",
+      "writeable" : true
+    }, {
+      "name" : "CMSTriggerInterval",
+      "origin" : "DEFAULT",
+      "value" : "-1",
+      "writeable" : true
+    }, {
+      "name" : "PrintGC",
+      "origin" : "VM_CREATION",
+      "value" : "true",
+      "writeable" : true
+    }, {
+      "name" : "PrintGCDetails",
+      "origin" : "VM_CREATION",
+      "value" : "true",
+      "writeable" : true
+    }, {
+      "name" : "PrintGCDateStamps",
+      "origin" : "VM_CREATION",
+      "value" : "true",
+      "writeable" : true
+    }, {
+      "name" : "PrintGCTimeStamps",
+      "origin" : "VM_CREATION",
+      "value" : "true",
+      "writeable" : true
+    }, {
+      "name" : "PrintGCID",
+      "origin" : "DEFAULT",
+      "value" : "false",
+      "writeable" : true
+    }, {
+      "name" : "PrintClassHistogramBeforeFullGC",
+      "origin" : "DEFAULT",
+      "value" : "false",
+      "writeable" : true
+    }, {
+      "name" : "PrintClassHistogramAfterFullGC",
+      "origin" : "DEFAULT",
+      "value" : "false",
+      "writeable" : true
+    }, {
+      "name" : "PrintClassHistogram",
+      "origin" : "DEFAULT",
+      "value" : "false",
+      "writeable" : true
+    }, {
+      "name" : "MinHeapFreeRatio",
+      "origin" : "DEFAULT",
+      "value" : "40",
+      "writeable" : true
+    }, {
+      "name" : "MaxHeapFreeRatio",
+      "origin" : "DEFAULT",
+      "value" : "70",
+      "writeable" : true
+    }, {
+      "name" : "PrintConcurrentLocks",
+      "origin" : "DEFAULT",
+      "value" : "false",
+      "writeable" : true
+    }, {
+      "name" : "UnlockCommercialFeatures",
+      "origin" : "DEFAULT",
+      "value" : "false",
+      "writeable" : true
+    } ],
+    "ObjectName" : "com.sun.management:type=HotSpotDiagnostic"
+  }, {
+    "name" : "java.lang:type=MemoryPool,name=CMS Old Gen",
+    "modelerType" : "sun.management.MemoryPoolImpl",
+    "CollectionUsage" : {
+      "committed" : 939524096,
+      "init" : 939524096,
+      "max" : 939524096,
+      "used" : 35747536
+    },
+    "CollectionUsageThreshold" : 0,
+    "CollectionUsageThresholdCount" : 0,
+    "MemoryManagerNames" : [ "ConcurrentMarkSweep" ],
+    "PeakUsage" : {
+      "committed" : 939524096,
+      "init" : 939524096,
+      "max" : 939524096,
+      "used" : 83181712
+    },
+    "Usage" : {
+      "committed" : 939524096,
+      "init" : 939524096,
+      "max" : 939524096,
+      "used" : 83181712
+    },
+    "UsageThreshold" : 0,
+    "UsageThresholdCount" : 0,
+    "CollectionUsageThresholdExceeded" : false,
+    "CollectionUsageThresholdSupported" : true,
+    "UsageThresholdExceeded" : false,
+    "UsageThresholdSupported" : true,
+    "Valid" : true,
+    "Name" : "CMS Old Gen",
+    "Type" : "HEAP",
+    "ObjectName" : "java.lang:type=MemoryPool,name=CMS Old Gen"
+  } ]
+}
\ No newline at end of file
diff --git a/src/test/resources/topUserOpsCount.json b/src/test/resources/topUserOpsCount.json
new file mode 100644
index 0000000..e511c16
--- /dev/null
+++ b/src/test/resources/topUserOpsCount.json
@@ -0,0 +1,342 @@
+{
+  "timestamp": "2016-03-21T15:00:57-0400",
+  "windows": [
+    {
+      "windowLenMs": 300000,
+      "ops": [
+        {
+          "opType": "delete",
+          "topUsers": [
+            {
+              "user": "hive",
+              "count": 8
+            },
+            {
+              "user": "ambari-qa",
+              "count": 3
+            }
+          ],
+          "totalCount": 11
+        },
+        {
+          "opType": "*",
+          "topUsers": [
+            {
+              "user": "citi_2769197d",
+              "count": 284
+            },
+            {
+              "user": "wre",
+              "count": 152
+            },
+            {
+              "user": "ststrt",
+              "count": 78
+            },
+            {
+              "user": "hive",
+              "count": 63
+            },
+            {
+              "user": "ambari-qa",
+              "count": 21
+            },
+            {
+              "user": "logiq3",
+              "count": 6
+            },
+            {
+              "user": "oozie",
+              "count": 5
+            },
+            {
+              "user": "mapred",
+              "count": 4
+            }
+          ],
+          "totalCount": 613
+        },
+        {
+          "opType": "mkdirs",
+          "topUsers": [
+            {
+              "user": "hive",
+              "count": 14
+            },
+            {
+              "user": "ambari-qa",
+              "count": 8
+            }
+          ],
+          "totalCount": 22
+        },
+        {
+          "opType": "listStatus",
+          "topUsers": [
+            {
+              "user": "oozie",
+              "count": 5
+            },
+            {
+              "user": "mapred",
+              "count": 4
+            }
+          ],
+          "totalCount": 9
+        },
+        {
+          "opType": "getfileinfo",
+          "topUsers": [
+            {
+              "user": "citi_2769197d",
+              "count": 284
+            },
+            {
+              "user": "wre",
+              "count": 152
+            },
+            {
+              "user": "ststrt",
+              "count": 78
+            },
+            {
+              "user": "hive",
+              "count": 49
+            },
+            {
+              "user": "ambari-qa",
+              "count": 20
+            },
+            {
+              "user": "logiq3",
+              "count": 6
+            }
+          ],
+          "totalCount": 589
+        }
+      ]
+    },
+    {
+      "windowLenMs": 1500000,
+      "ops": [
+        {
+          "opType": "delete",
+          "topUsers": [
+            {
+              "user": "hive",
+              "count": 30
+            },
+            {
+              "user": "ambari-qa",
+              "count": 12
+            }
+          ],
+          "totalCount": 42
+        },
+        {
+          "opType": "*",
+          "topUsers": [
+            {
+              "user": "citi_2769197d",
+              "count": 720
+            },
+            {
+              "user": "wre",
+              "count": 544
+            },
+            {
+              "user": "hive",
+              "count": 242
+            },
+            {
+              "user": "ststrt",
+              "count": 234
+            },
+            {
+              "user": "ambari-qa",
+              "count": 134
+            },
+            {
+              "user": "mapred",
+              "count": 18
+            },
+            {
+              "user": "oozie",
+              "count": 10
+            },
+            {
+              "user": "logiq3",
+              "count": 6
+            }
+          ],
+          "totalCount": 1908
+        },
+        {
+          "opType": "mkdirs",
+          "topUsers": [
+            {
+              "user": "hive",
+              "count": 44
+            },
+            {
+              "user": "ambari-qa",
+              "count": 25
+            }
+          ],
+          "totalCount": 69
+        },
+        {
+          "opType": "listStatus",
+          "topUsers": [
+            {
+              "user": "mapred",
+              "count": 18
+            },
+            {
+              "user": "oozie",
+              "count": 11
+            }
+          ],
+          "totalCount": 29
+        },
+        {
+          "opType": "getfileinfo",
+          "topUsers": [
+            {
+              "user": "citi_2769197d",
+              "count": 720
+            },
+            {
+              "user": "wre",
+              "count": 492
+            },
+            {
+              "user": "ststrt",
+              "count": 234
+            },
+            {
+              "user": "hive",
+              "count": 182
+            },
+            {
+              "user": "ambari-qa",
+              "count": 98
+            },
+            {
+              "user": "logiq3",
+              "count": 6
+            }
+          ],
+          "totalCount": 1732
+        }
+      ]
+    },
+    {
+      "windowLenMs": 60000,
+      "ops": [
+        {
+          "opType": "delete",
+          "topUsers": [
+            {
+              "user": "hive",
+              "count": 1
+            },
+            {
+              "user": "ambari-qa",
+              "count": 1
+            }
+          ],
+          "totalCount": 2
+        },
+        {
+          "opType": "*",
+          "topUsers": [
+            {
+              "user": "ststrt",
+              "count": 78
+            },
+            {
+              "user": "wre",
+              "count": 26
+            },
+            {
+              "user": "ambari-qa",
+              "count": 12
+            },
+            {
+              "user": "hive",
+              "count": 11
+            },
+            {
+              "user": "logiq3",
+              "count": 6
+            },
+            {
+              "user": "mapred",
+              "count": 2
+            },
+            {
+              "user": "oozie",
+              "count": 1
+            }
+          ],
+          "totalCount": 136
+        },
+        {
+          "opType": "mkdirs",
+          "topUsers": [
+            {
+              "user": "hive",
+              "count": 2
+            },
+            {
+              "user": "ambari-qa",
+              "count": 2
+            }
+          ],
+          "totalCount": 4
+        },
+        {
+          "opType": "listStatus",
+          "topUsers": [
+            {
+              "user": "mapred",
+              "count": 2
+            },
+            {
+              "user": "oozie",
+              "count": 1
+            }
+          ],
+          "totalCount": 3
+        },
+        {
+          "opType": "getfileinfo",
+          "topUsers": [
+            {
+              "user": "ststrt",
+              "count": 78
+            },
+            {
+              "user": "wre",
+              "count": 26
+            },
+            {
+              "user": "ambari-qa",
+              "count": 9
+            },
+            {
+              "user": "hive",
+              "count": 7
+            },
+            {
+              "user": "logiq3",
+              "count": 6
+            }
+          ],
+          "totalCount": 126
+        }
+      ]
+    }
+  ]
+}
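
The `topUserOpsCount.json` fixture above nests `windows` → `ops` → `topUsers`. For anyone consuming this data outside the CLI, a minimal Python sketch of how it can be traversed; the helper name `top_users_for_op` and the trimmed-down sample document are illustrative only, not part of the project:

```python
import json

# Trimmed sample mirroring the fixture's structure (one window, one op);
# the real file carries three windows (60s, 300s, 1500s) and several opTypes.
sample = """
{
  "timestamp": "2016-03-21T15:00:57-0400",
  "windows": [
    {
      "windowLenMs": 60000,
      "ops": [
        {
          "opType": "delete",
          "topUsers": [
            {"user": "hive", "count": 1},
            {"user": "ambari-qa", "count": 1}
          ],
          "totalCount": 2
        }
      ]
    }
  ]
}
"""

def top_users_for_op(doc, window_len_ms, op_type):
    """Return {user: count} for one opType within one reporting window."""
    for window in doc["windows"]:
        if window["windowLenMs"] == window_len_ms:
            for op in window["ops"]:
                if op["opType"] == op_type:
                    return {u["user"]: u["count"] for u in op["topUsers"]}
    return {}

doc = json.loads(sample)
print(top_users_for_op(doc, 60000, "delete"))  # {'hive': 1, 'ambari-qa': 1}
```

The same shape also appears, serialized as a string, in the `TopUserOpCounts` attribute of the NameNode's `FSNamesystemState` JMX bean shown earlier, so a second `json.loads` on that attribute yields the identical structure.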