MOST IMPORTANT ISSUES

We found the following problems in this heap dump.
For each problem, where possible, we report its overhead: how much memory you would save
(in Kbytes and as a percentage of used memory) if the problem were eliminated.

Problem: Thread throwing OutOfMemoryError: found
Details

This dump was generated after the thread below threw an OutOfMemoryError.
This may be caused by one or more of the following:
- your heap is too small
- your app wastes memory somewhere
- your app has a memory leak
- your app tries to allocate an array bigger than the JVM limit
See the other report sections for more concrete clues and solutions.
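As a first check for the "heap is too small" case, the configured limits can be inspected at runtime. A minimal sketch (the class name is illustrative, the `Runtime` calls are standard JDK API):

```java
// Prints the JVM heap limits: useful when deciding whether the heap is
// simply too small (-Xmx) or whether memory is actually wasted or leaked.
public class HeapLimits {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxMb = rt.maxMemory() / (1024 * 1024);     // upper bound (-Xmx)
        long totalMb = rt.totalMemory() / (1024 * 1024); // currently committed
        long freeMb = rt.freeMemory() / (1024 * 1024);   // free within committed
        System.out.println("max=" + maxMb + "Mb committed=" + totalMb
                + "Mb used=" + (totalMb - freeMb) + "Mb");
    }
}
```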

Found thread with name: "main"



Problem: High percentage of memory is used by instances of some class(es).
Details

Instances of some class(es) take up a large percentage of memory.
That can happen when the number of instances is very high, or when their size
is very large (e.g. for arrays). Depending on whether all these instances
are really necessary, and on whether they are live or garbage,
this may or may not be a problem. See the other report sections for more details
about these objects.

Threshold exceeded for:

  Total size            Source
  1,106,967Kb (68.0%)   Objects of class byte[]
    243,821Kb (15.0%)   Objects of class j.u.HashSet



Problem: High percentage of memory retained by one or more GC roots, or memory leak.
Details

Some GC root(s) retain a large percentage of memory, and/or there are signs of a memory leak.
When one GC root holds most of the objects, that may be by design, or it may signal that
more objects than expected have accumulated in memory. Alternatively, when a single object,
typically a collection or array, references a very large number of other objects, AND those
objects take up a lot of memory, it may signal a memory leak. However, we cannot say for sure
that there is a real problem here. You need to carefully inspect the object trees in the
report section below and check whether they match your expectations.
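When the suspect root is a static map, as with the gWorkMap candidate reported below, a common cause is entries that are added per task but never removed, so everything stays reachable from the static root. A hypothetical sketch of that pattern and its usual fix (all names here are illustrative, not Hive's actual code):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of a static-map leak candidate: values are cached
// per key but never evicted, so the static field (a GC root) retains them
// all. The fix is to remove entries once the associated work is finished.
public class WorkCache {
    private static final Map<String, byte[]> CACHE = new ConcurrentHashMap<>();

    static void put(String key, byte[] work) {
        CACHE.put(key, work);
    }

    // Without this call, every cached byte[] stays strongly reachable
    // from the static root for the lifetime of the classloader.
    static void done(String key) {
        CACHE.remove(key);
    }

    static int size() {
        return CACHE.size();
    }
}
```

Whether eviction, a bounded cache, or weak references is the right fix depends on how the real map is used; the report section below shows which object trees actually accumulate.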

Found leak candidate(s) in Object tree for GC root(s) Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap

Threshold exceeded for:

  Total size            Source
  1,047,624Kb (64.3%)   Object tree for GC root(s) Java Static java.beans.ThreadGroupContext.contexts
    571,788Kb (35.1%)   Object tree for GC root(s) Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap



Problem: Memory is wasted by bad (underutilized or otherwise suboptimal) collections. Overhead: 15.0% (244,518Kb)
Details

There are some collections, such as j.u.HashSet, with high overhead per element.
Each collection needs memory for its implementation objects, and there can be many of them.
When a collection contains no elements (is empty), all of its memory is wasted.
Consider creating such collections lazily. When a collection contains only one or two elements,
in some situations you may be able to store those elements in a more economical data structure,
such as an array, or at least create the collection with the correct (small) capacity.
Otherwise, the memory used by the collection itself is disproportionately high relative to
its payload. See the report section below for concrete findings and suggestions
on how to fix them.
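The two fixes suggested above — lazy creation and a correct small capacity — can be sketched as follows (the class and field names are illustrative):

```java
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

// Illustrative sketch: the set is allocated only on first use, so objects
// that never get a tag carry no empty HashSet; and when an element does
// arrive, the set is created with a small initial capacity instead of
// HashSet's default of 16 buckets.
public class LazyTags {
    private Set<String> tags; // null until first use -> no empty HashSet

    void addTag(String tag) {
        if (tags == null) {
            tags = new HashSet<>(4); // small expected size
        }
        tags.add(tag);
    }

    Set<String> getTags() {
        // Shared immutable singleton instead of allocating an empty set.
        return tags == null ? Collections.emptySet() : tags;
    }
}
```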

Threshold exceeded for:

  Overhead              Source
  243,793Kb (15.0%)     Instances of empty j.u.HashSet



Problem: Memory is wasted by primitive arrays with many zeroes and/or other problems. Overhead: 64.6% (1,052,826Kb)
Details

There are some primitive arrays, such as byte[], with high overhead per element.
Each primitive array has a fixed-size internal, JVM-managed header, most commonly
16 bytes in the HotSpot JVM and 12 bytes on Android. Each array slot takes
memory as well, so an empty array (with all elements equal to 0) wastes all of its memory.
Consider creating such arrays lazily. Another common problem is underutilized buffers:
arrays where less than half of the capacity is ever used. See if such buffers can be
allocated with a more appropriate capacity. When an array contains just one or two elements,
in some situations you may be able to store those elements more economically, e.g. as
data fields in the enclosing class, or at least create the array with the correct
(small) size. Otherwise, the memory used by the array internals is disproportionately
high relative to its payload. See the report section below for concrete findings.
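Lazy allocation and right-sized growth for buffers can be sketched like this (an illustrative class, not from this application):

```java
import java.util.Arrays;

// Illustrative sketch of a lazily allocated, right-sized byte buffer: the
// array is only created on first write, and grows to the needed size,
// instead of being pre-allocated at a large fixed capacity full of zeroes.
public class LazyBuffer {
    private byte[] data; // null until first write -> no all-zero array

    void write(int pos, byte b) {
        if (data == null) {
            data = new byte[Math.max(8, pos + 1)]; // start small
        } else if (pos >= data.length) {
            // Grow geometrically, but never past what is actually needed
            // by less than the requested position.
            data = Arrays.copyOf(data, Math.max(data.length * 2, pos + 1));
        }
        data[pos] = b;
    }

    int capacity() {
        return data == null ? 0 : data.length;
    }
}
```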

Threshold exceeded for:

  Overhead               Source
  1,047,628Kb (64.3%)    Empty byte[] arrays



Problem: Memory is wasted by duplicate objects other than Strings. Overhead: 3.7% (60,519Kb)
Details

There are duplicate objects of some class(es), such as org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg.
That is, there are multiple instances of some class(es) with the same contents.
Object duplication often occurs when, for example, an app repeatedly reads the same
frequently used piece of data from the DB, creating a new Java object for it every time.
See the report section below for concrete findings.
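A common remedy for such duplicates is a canonicalization map, a hand-rolled analogue of String.intern() (the class below is an illustrative sketch, not part of Hive):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative de-duplication via a canonicalization map: repeated reads of
// an equal value reuse one shared instance instead of keeping a new copy.
// This only works for values that are immutable and implement
// equals()/hashCode() consistently.
public class Canonicalizer<T> {
    private final Map<T, T> pool = new HashMap<>();

    T canonical(T value) {
        T existing = pool.putIfAbsent(value, value);
        return existing != null ? existing : value;
    }
}
```

Note that the pool itself then retains its entries; if that is undesirable, a weak-reference-based map is the usual alternative.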



1. Top-Level Stats
Generated by JXRay version 2.7

Heap dump hs2-hive-jira-20153.hprof created on Wed Jul 11 08:07:13 PDT 2018
JVM version: 1.8.0_144

           Instances            Object arrays      Primitive arrays       Total
 Objects   17,932,614           999,814            601,294                19,533,722
 Bytes     456,336Kb (28.0%)    55,852Kb (3.4%)    1,116,733Kb (68.6%)    1,628,922Kb (100.0%)


           Live                   Garbage           Total
 Objects   19,517,030             16,692            19,533,722
 Bytes     1,628,172Kb (100.0%)   749Kb (< 0.1%)    1,628,922Kb (100.0%)


  Number of classes   Number of threads
  6,870               11


  JVM pointer size   Object header size
  4                  12




2. Thread throwing OutOfMemoryError: found
This dump was created after an OutOfMemoryError in the following thread:

Thread name: "main", daemon: false
java.lang.OutOfMemoryError.<init>(OutOfMemoryError.java:48)
java.util.regex.Matcher.<init>(Matcher.java:225)
  Local variables java.util.regex.Matcher(parentPattern : java.util.regex.Pattern@c0939488, groups : null, from : 0, to : 0, lookbehindTo : 0, text : "@Kcat@VArts & Entertainment > Hobbies & Creative Arts > Artwork > Posters@V", acceptMode : 0, first : -1, last : 0, oldLast : -1, lastAppendPosition : 0, locals : null, hitEnd : false, requireEnd : false, transparentBounds : false, anchoringBounds : true)

java.util.regex.Pattern(pattern : "@K(.*?)@V(.*?)@V", flags : 0, compiled : true, normalizedPattern : "@K(.*?)@V(.*?)@V", root : java.util.regex.Pattern$Start@c0939518, matchRoot : java.util.regex.Pattern$Slice@c0939530, buffer : null, namedGroups : null, groupNodes : null, temp : null, capturingGroupCount : 3, localCount : 2, cursor : 16, patternLength : 0, hasSupplementary : false)

java.util.regex.Pattern.matcher(Pattern.java:1093)
com.criteo.hadoop.hive.udf.UDFExtraDataToMap.toMap(UDFExtraDataToMap.java:42)
  Local variables com.criteo.hadoop.hive.udf.UDFExtraDataToMap(container : j.u.LinkedHashMap(size: 1))

com.criteo.hadoop.hive.udf.UDFExtraDataToMap.evaluate(UDFExtraDataToMap.java:82)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject.get(ExprNodeGenericFuncEvaluator.java:88)
  Local variables org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject(eager : false, eval : org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator@c4f50b30, evaluated : false, version : 4412885, obj : j.u.LinkedHashMap(size: 1), this$0 : org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator@c4f51398)

org.apache.hadoop.hive.ql.udf.generic.GenericUDFIndex.evaluate(GenericUDFIndex.java:100)
  Local variables "stars"

org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject.get(ExprNodeGenericFuncEvaluator.java:88)
  Local variables org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject(eager : false, eval : org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator@c4f51398, evaluated : false, version : 4412885, obj : null, this$0 : org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator@c4f51340)

org.apache.hadoop.hive.ql.udf.generic.GenericUDFWhen.evaluate(GenericUDFWhen.java:104)
  Local variables org.apache.hadoop.hive.ql.udf.generic.GenericUDFWhen(argumentOIs : org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector[](size: 3), returnOIResolver : org.apache.hadoop.hive.ql.udf.generic.GenericUDFUtils$ReturnObjectInspectorResolver@c4f509e8)

org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluatorHead._evaluate(ExprNodeEvaluatorHead.java:44)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:68)
org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
  Local variables org.apache.hadoop.hive.ql.exec.SelectOperator(configuration : org.apache.hadoop.mapred.JobConf@80045688, cContext : null, childOperators : j.u.ArrayList(size: 1), parentOperators : j.u.ArrayList(size: 1), operatorId : "SEL_1", abortOp : java.util.concurrent.atomic.AtomicBoolean@c073b3e8, execContext : org.apache.hadoop.hive.ql.exec.mr.ExecMapperContext@c073d798, rootInitializeCalled : true, runTimeNumRows : 4412884, indexForTezUnion : -1, hconf : org.apache.hadoop.mapred.JobConf@80045688, asyncInitOperations : j.u.HashSet(size: 0), state : org.apache.hadoop.hive.ql.exec.Operator$State@c0713228, useBucketizedHiveInputFormat : false, conf : org.apache.hadoop.hive.ql.plan.SelectDesc@c073b438, done : false, rowSchema : org.apache.hadoop.hive.ql.exec.RowSchema@c076aa60, statsMap : j.u.HashMap(size: 0), out : null, LOG : org.slf4j.impl.Log4jLoggerAdapter@c05255d0, PLOG : org.slf4j.impl.Log4jLoggerAdapter@c05227a0, isLogInfoEnabled : false, isLogDebugEnabled : false, isLogTraceEnabled : false, alias : null, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, id : "1", inputObjInspectors : org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector[](size: 1), outputObjInspector : org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@c4b385d8, colExprMap : j.u.HashMap(size: 12), jobCloseDone : false, childOperatorsArray : org.apache.hadoop.hive.ql.exec.Operator[](size: 1), childOperatorsTag : int[](size: 1), groupKeyObject : null, eval : org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator[](size: 12), output : Object[](size: 12), isSelectStarNoCompute : false)

Object[2]{org.apache.hadoop.hive.serde2.columnar.ColumnarStruct(lengthNullSequence : 2, ...), Object[](1)@c0aa5078}

org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
  Local variables org.apache.hadoop.hive.ql.exec.TableScanOperator(configuration : org.apache.hadoop.mapred.JobConf@80045688, cContext : null, childOperators : j.u.ArrayList(size: 1), parentOperators : j.u.ArrayList(size: 1), operatorId : "TS_0", abortOp : java.util.concurrent.atomic.AtomicBoolean@c076b2b0, execContext : org.apache.hadoop.hive.ql.exec.mr.ExecMapperContext@c073d798, rootInitializeCalled : true, runTimeNumRows : 4412885, indexForTezUnion : -1, hconf : org.apache.hadoop.mapred.JobConf@80045688, asyncInitOperations : j.u.HashSet(size: 0), state : org.apache.hadoop.hive.ql.exec.Operator$State@c0713228, useBucketizedHiveInputFormat : false, conf : org.apache.hadoop.hive.ql.plan.TableScanDesc@c076b300, done : false, rowSchema : org.apache.hadoop.hive.ql.exec.RowSchema@c076b490, statsMap : j.u.HashMap(size: 0), out : null, LOG : org.slf4j.impl.Log4jLoggerAdapter@c05226c8, PLOG : org.slf4j.impl.Log4jLoggerAdapter@c05227a0, isLogInfoEnabled : false, isLogDebugEnabled : false, isLogTraceEnabled : false, alias : null, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, id : "0", inputObjInspectors : org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector[](size: 1), outputObjInspector : org.apache.hadoop.hive.serde2.objectinspector.UnionStructObjectInspector@c06c6a00, colExprMap : null, jobCloseDone : false, childOperatorsArray : org.apache.hadoop.hive.ql.exec.Operator[](size: 1), childOperatorsTag : int[](size: 1), groupKeyObject : null, jc : null, inputFileChanged : true, tableDesc : null, currentStat : null, stats : null, rowLimit : -1, currCount : 0, insideView : false, defaultPartitionName : null, schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", schemaEvolutionColumnsTypes : 
"string,array<struct<h:int,coltp:int,catname:string,cattype:string>>,int,string,string,int,string,boolean,int,string,boolean,string,string,in ...[length 210]")

Object[2]{org.apache.hadoop.hive.serde2.columnar.ColumnarStruct(lengthNullSequence : 2, ...), Object[](1)@c0aa5078}

org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:148)
org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:547)
  Local variables org.apache.hadoop.hive.ql.exec.MapOperator(configuration : org.apache.hadoop.mapred.JobConf@80045688, cContext : org.apache.hadoop.hive.ql.CompilationOpContext@c07366f8, childOperators : j.u.ArrayList(size: 1), parentOperators : j.u.ArrayList(size: 0), operatorId : "MAP_0", abortOp : java.util.concurrent.atomic.AtomicBoolean@c076c660, execContext : org.apache.hadoop.hive.ql.exec.mr.ExecMapperContext@c073d798, rootInitializeCalled : true, runTimeNumRows : 0, indexForTezUnion : -1, hconf : org.apache.hadoop.mapred.JobConf@80045688, asyncInitOperations : j.u.HashSet(size: 0), state : org.apache.hadoop.hive.ql.exec.Operator$State@c0713228, useBucketizedHiveInputFormat : false, conf : org.apache.hadoop.hive.ql.plan.MapWork@c076c6c0, done : false, rowSchema : null, statsMap : j.u.HashMap(size: 2), out : null, LOG : org.slf4j.impl.Log4jLoggerAdapter@c0523508, PLOG : org.slf4j.impl.Log4jLoggerAdapter@c05227a0, isLogInfoEnabled : false, isLogDebugEnabled : false, isLogTraceEnabled : false, alias : null, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, id : "0", inputObjInspectors : org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector[](size: 1), outputObjInspector : null, colExprMap : null, jobCloseDone : false, childOperatorsArray : org.apache.hadoop.hive.ql.exec.Operator[](size: 0), childOperatorsTag : int[](size: 0), groupKeyObject : null, deserialize_error_count : org.apache.hadoop.io.LongWritable@c0740358, recordCounter : org.apache.hadoop.io.LongWritable@c0740340, numRows : 4412884, connectedOperators : j.u.TreeMap(size: 0), normalizedPaths : j.u.HashMap(size: 674), cntr : 1, logEveryNRows : 0, opCtxMap : j.u.HashMap(size: 1,000), childrenOpToOI : j.u.HashMap(size: 1), currentCtxs : org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx[](size: 1))

org.apache.hadoop.hive.serde2.columnar.BytesRefArrayWritable(bytesRefWritables : org.apache.hadoop.hive.serde2.columnar.BytesRefWritable[](size: 23), valid : 23)

org.apache.hadoop.hive.ql.exec.mr.ExecMapperContext(lastInputPath : org.apache.hadoop.fs.Path@c0503200, currentInputPath : org.apache.hadoop.fs.Path@c0503200, inputFileChecked : true, fileId : null, localWork : null, fetchOperators : null, jc : org.apache.hadoop.mapred.JobConf@80045688, ioCxt : org.apache.hadoop.hive.ql.io.IOContext@c06ff8c0, currentBigBucketFile : null)

org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx[1]{(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=704}", ...)}

org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx(alias : "$hdt$_0:partnerdb_catalogs", op : org.apache.hadoop.hive.ql.exec.TableScanOperator@c0736768, partDesc : org.apache.hadoop.hive.ql.plan.PartitionDesc@c0aa4770, partObjectInspector : org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@c06c6958, vcsObjectInspector : null, rowObjectInspector : org.apache.hadoop.hive.serde2.objectinspector.UnionStructObjectInspector@c06c6a00, partTblObjectInspectorConverter : org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters$IdentityConverter@c0aa5050, rowWithPart : Object[](size: 2), rowWithPartAndVC : null, deserializer : org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe@c0aa50a0, tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=704}", vcs : null, vcValues : null)

Object[2]{org.apache.hadoop.hive.serde2.columnar.ColumnarStruct(lengthNullSequence : 2, ...), Object[](1)@c0aa5078}

org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:160)
  Local variables org.apache.hadoop.hive.ql.exec.mr.ExecMapper(mo : org.apache.hadoop.hive.ql.exec.MapOperator@c0736640, oc : org.apache.hadoop.mapred.MapTask$OldOutputCollector@c5008b80, jc : org.apache.hadoop.mapred.JobConf@80045688, abort : false, rp : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, localWork : null, execContext : org.apache.hadoop.hive.ql.exec.mr.ExecMapperContext@c073d798)

org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
  Local variables org.apache.hadoop.mapred.MapRunner(mapper : org.apache.hadoop.hive.ql.exec.mr.ExecMapper@c05d69f8, incrProcCount : false)

org.apache.hadoop.mapred.MapTask$TrackedRecordReader(rawIn : org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader@c05d6c68, fileInputByteCounter : org.apache.hadoop.mapred.Counters$Counter@c057eac8, inputRecordCounter : org.apache.hadoop.mapred.Counters$Counter@c057ec28, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, bytesInPrev : 0, bytesInCurr : 0, fsStats : null, this$0 : org.apache.hadoop.mapred.MapTask@801532d8)

org.apache.hadoop.mapred.MapTask$OldOutputCollector(partitioner : org.apache.hadoop.hive.ql.io.DefaultHivePartitioner@c5008b98, collector : org.apache.hadoop.mapred.MapTask$MapOutputBuffer@c0612f40, numPartitions : 1009)

org.apache.hadoop.mapred.Task$TaskReporter(umbilical : com.sun.proxy.$Proxy9@80183238, split : org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit@c0502ce0, taskProgress : org.apache.hadoop.util.Progress@8015bd88, pingThread : j.l.Thread@c0503a50, done : false, lock : Object@c0503e00, progressFlag : java.util.concurrent.atomic.AtomicBoolean@c0503e10, this$0 : org.apache.hadoop.mapred.MapTask@801532d8)

org.apache.hadoop.hive.shims.CombineHiveKey(key : org.apache.hadoop.io.LongWritable@db2fc0b8)

org.apache.hadoop.hive.serde2.columnar.BytesRefArrayWritable(bytesRefWritables : org.apache.hadoop.hive.serde2.columnar.BytesRefWritable[](size: 23), valid : 23)

org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:459)
  Local variables org.apache.hadoop.mapred.MapTask(jobFile : "job.xml", user : "sz.ho", taskId : org.apache.hadoop.mapred.TaskAttemptID@801533b8, partition : 1303, encryptedSpillKey : byte[](size: 1), taskStatus : org.apache.hadoop.mapred.MapTaskStatus@80153498, jobRunStateForCleanup : null, jobCleanup : false, jobSetup : false, taskCleanup : false, extraData : org.apache.hadoop.io.BytesWritable@80198e20, skipRanges : org.apache.hadoop.mapred.SortedRanges@80198e50, skipping : false, writeSkipRecs : true, currentRecStartIndex : 0, currentRecIndexIterator : org.apache.hadoop.mapred.SortedRanges$SkipRangeIterator@80198eb8, pTree : org.apache.hadoop.yarn.util.ProcfsBasedProcessTree@c0502ad8, initCpuCumulativeTime : 9080, conf : org.apache.hadoop.mapred.JobConf@80045688, mapOutputFile : org.apache.hadoop.mapred.YarnOutputFiles@80198f18, lDirAlloc : org.apache.hadoop.fs.LocalDirAllocator@80198f40, jobContext : org.apache.hadoop.mapred.JobContextImpl@c0502c70, taskContext : org.apache.hadoop.mapred.TaskAttemptContextImpl@c0503e20, outputFormat : null, committer : org.apache.hadoop.hive.ql.io.HiveFileFormatUtils$NullOutputCommitter@c0503e78, spilledRecordsCounter : org.apache.hadoop.mapred.Counters$Counter@80172d48, failedShuffleCounter : org.apache.hadoop.mapred.Counters$Counter@80172cb8, mergedMapOutputsCounter : org.apache.hadoop.mapred.Counters$Counter@80172dd8, numSlotsRequired : 1, umbilical : com.sun.proxy.$Proxy9@80183238, tokenSecret : javax.crypto.spec.SecretKeySpec@80198f50, shuffleSecret : javax.crypto.spec.SecretKeySpec@801915d8, gcUpdater : org.apache.hadoop.mapred.Task$GcTimeUpdater@80173280, taskProgress : org.apache.hadoop.util.Progress@8015bd88, counters : org.apache.hadoop.mapred.Counters@80172b58, taskDone : java.util.concurrent.atomic.AtomicBoolean@80172ad0, statisticUpdaters : j.u.HashMap(size: 3), splitMetaInfo : org.apache.hadoop.mapreduce.split.JobSplit$TaskSplitIndex@8015b858, mapPhase : org.apache.hadoop.util.Progress@c0517640, sortPhase : 
org.apache.hadoop.util.Progress@c05175b0)

org.apache.hadoop.mapred.JobConf(quietmode : true, allowNullValueProperties : false, resources : j.u.ArrayList(size: 1), finalParameters : j.u.Collections$SetFromMap@800457f0, loadDefaults : true, updatingResource : j.u.concurrent.ConcurrentHashMap(size: 1,913), properties : j.u.Properties(size: 1,915), overlay : j.u.Properties(size: 28), classLoader : sun.misc.Launcher$AppClassLoader@8001d198, credentials : org.apache.hadoop.security.Credentials@8001fbc8)

org.apache.hadoop.mapreduce.split.JobSplit$TaskSplitIndex(splitLocation : "viewfs://root/tmp/hadoop-yarn/sz.ho/.staging/job_1531301354100_57311/job.split", startOffset : 909032)

com.sun.proxy.$Proxy9(h : org.apache.hadoop.ipc.WritableRpcEngine$Invoker@80183248)

org.apache.hadoop.mapred.Task$TaskReporter(umbilical : com.sun.proxy.$Proxy9@80183238, split : org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit@c0502ce0, taskProgress : org.apache.hadoop.util.Progress@8015bd88, pingThread : j.l.Thread@c0503a50, done : false, lock : Object@c0503e00, progressFlag : java.util.concurrent.atomic.AtomicBoolean@c0503e10, this$0 : org.apache.hadoop.mapred.MapTask@801532d8)

org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit(paths : null, startoffset : null, lengths : null, locations : null, totLength : 0, job : null, shrinkedLength : 0, _isShrinked : false, inputFormatClassName : "org.apache.hadoop.hive.ql.io.RCFileInputFormat", inputSplitShim : org.apache.hadoop.hive.shims.HadoopShimsSecure$InputSplitShim@c0502da8, pathToPartitionInfo : null)

org.apache.hadoop.mapred.MapTask$TrackedRecordReader(rawIn : org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader@c05d6c68, fileInputByteCounter : org.apache.hadoop.mapred.Counters$Counter@c057eac8, inputRecordCounter : org.apache.hadoop.mapred.Counters$Counter@c057ec28, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, bytesInPrev : 0, bytesInCurr : 0, fsStats : null, this$0 : org.apache.hadoop.mapred.MapTask@801532d8)

org.apache.hadoop.mapred.MapTask$MapOutputBuffer(partitions : 1009, job : org.apache.hadoop.mapred.JobConf@80045688, reporter : org.apache.hadoop.mapred.Task$TaskReporter@c0502cb0, keyClass : class org.apache.hadoop.hive.ql.io.HiveKey, valClass : class org.apache.hadoop.io.BytesWritable, comparator : org.apache.hadoop.hive.ql.io.HiveKey$Comparator@c0613080, serializationFactory : org.apache.hadoop.io.serializer.SerializationFactory@c06130a0, keySerializer : org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer@c0613140, valSerializer : org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer@c06131c0, combinerRunner : null, combineCollector : null, codec : org.apache.hadoop.io.compress.SnappyCodec@c06131d8, kvmeta : java.nio.ByteBufferAsIntBufferL@c06131e8, kvstart : 268173308, kvend : 268173308, kvindex : 268173308, equator : 0, bufstart : 0, bufend : 0, bufmark : 0, bufindex : 0, bufvoid : 1072693248, kvbuffer : byte[](size: 1,072,693,248), b0 : byte[](size: 0), maxRec : 67043328, softLimit : 858154624, spillInProgress : false, bufferRemaining : 858154624, sortSpillException : null, numSpills : 0, minSpillsForCombine : 3, sorter : org.apache.hadoop.util.QuickSort@c0613260, spillLock : java.util.concurrent.locks.ReentrantLock@c0613270, spillDone : java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@c0613280, spillReady : java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@c0612da0, bb : org.apache.hadoop.mapred.MapTask$MapOutputBuffer$BlockingBuffer@c0613158, spillThreadRunning : true, spillThread : org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread@c0612c20, rfs : org.apache.hadoop.fs.RawLocalFileSystem@c0613298, mapOutputByteCounter : org.apache.hadoop.mapred.Counters$Counter@c057ebc8, mapOutputRecordCounter : org.apache.hadoop.mapred.Counters$Counter@c057ebf8, fileOutputByteCounter : org.apache.hadoop.mapred.Counters$Counter@c057eb98, indexCacheList : j.u.ArrayList(size: 0), 
totalIndexCacheMemory : 0, indexCacheMemoryLimit : 1048576, mapTask : org.apache.hadoop.mapred.MapTask@801532d8, mapOutputFile : org.apache.hadoop.mapred.YarnOutputFiles@80198f18, sortPhase : org.apache.hadoop.util.Progress@c05175b0, spilledRecordsCounter : org.apache.hadoop.mapred.Counters$Counter@80172d48, META_BUFFER_TMP : byte[](size: 16))

org.apache.hadoop.mapred.MapRunner(mapper : org.apache.hadoop.hive.ql.exec.mr.ExecMapper@c05d69f8, incrProcCount : false)

org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
  Local variables org.apache.hadoop.mapred.MapTask(jobFile : "job.xml", user : "sz.ho", taskId : org.apache.hadoop.mapred.TaskAttemptID@801533b8, partition : 1303, encryptedSpillKey : byte[](size: 1), taskStatus : org.apache.hadoop.mapred.MapTaskStatus@80153498, jobRunStateForCleanup : null, jobCleanup : false, jobSetup : false, taskCleanup : false, extraData : org.apache.hadoop.io.BytesWritable@80198e20, skipRanges : org.apache.hadoop.mapred.SortedRanges@80198e50, skipping : false, writeSkipRecs : true, currentRecStartIndex : 0, currentRecIndexIterator : org.apache.hadoop.mapred.SortedRanges$SkipRangeIterator@80198eb8, pTree : org.apache.hadoop.yarn.util.ProcfsBasedProcessTree@c0502ad8, initCpuCumulativeTime : 9080, conf : org.apache.hadoop.mapred.JobConf@80045688, mapOutputFile : org.apache.hadoop.mapred.YarnOutputFiles@80198f18, lDirAlloc : org.apache.hadoop.fs.LocalDirAllocator@80198f40, jobContext : org.apache.hadoop.mapred.JobContextImpl@c0502c70, taskContext : org.apache.hadoop.mapred.TaskAttemptContextImpl@c0503e20, outputFormat : null, committer : org.apache.hadoop.hive.ql.io.HiveFileFormatUtils$NullOutputCommitter@c0503e78, spilledRecordsCounter : org.apache.hadoop.mapred.Counters$Counter@80172d48, failedShuffleCounter : org.apache.hadoop.mapred.Counters$Counter@80172cb8, mergedMapOutputsCounter : org.apache.hadoop.mapred.Counters$Counter@80172dd8, numSlotsRequired : 1, umbilical : com.sun.proxy.$Proxy9@80183238, tokenSecret : javax.crypto.spec.SecretKeySpec@80198f50, shuffleSecret : javax.crypto.spec.SecretKeySpec@801915d8, gcUpdater : org.apache.hadoop.mapred.Task$GcTimeUpdater@80173280, taskProgress : org.apache.hadoop.util.Progress@8015bd88, counters : org.apache.hadoop.mapred.Counters@80172b58, taskDone : java.util.concurrent.atomic.AtomicBoolean@80172ad0, statisticUpdaters : j.u.HashMap(size: 3), splitMetaInfo : org.apache.hadoop.mapreduce.split.JobSplit$TaskSplitIndex@8015b858, mapPhase : org.apache.hadoop.util.Progress@c0517640, sortPhase : 
org.apache.hadoop.util.Progress@c05175b0)

org.apache.hadoop.mapred.JobConf(quietmode : true, allowNullValueProperties : false, resources : j.u.ArrayList(size: 1), finalParameters : j.u.Collections$SetFromMap@800457f0, loadDefaults : true, updatingResource : j.u.concurrent.ConcurrentHashMap(size: 1,913), properties : j.u.Properties(size: 1,915), overlay : j.u.Properties(size: 28), classLoader : sun.misc.Launcher$AppClassLoader@8001d198, credentials : org.apache.hadoop.security.Credentials@8001fbc8)

com.sun.proxy.$Proxy9(h : org.apache.hadoop.ipc.WritableRpcEngine$Invoker@80183248)

org.apache.hadoop.mapred.Task$TaskReporter(umbilical : com.sun.proxy.$Proxy9@80183238, split : org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit@c0502ce0, taskProgress : org.apache.hadoop.util.Progress@8015bd88, pingThread : j.l.Thread@c0503a50, done : false, lock : Object@c0503e00, progressFlag : java.util.concurrent.atomic.AtomicBoolean@c0503e10, this$0 : org.apache.hadoop.mapred.MapTask@801532d8)

org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
  Local variables org.apache.hadoop.mapred.YarnChild$2(val$taskFinal : org.apache.hadoop.mapred.MapTask@801532d8, val$job : org.apache.hadoop.mapred.JobConf@80045688, val$umbilical : com.sun.proxy.$Proxy9@80183238)

java.security.AccessController.doPrivileged(Native method)
javax.security.auth.Subject.doAs(Subject.java:422)
  Local variables javax.security.auth.Subject(principals : j.u.Collections$SynchronizedSet@80171fd8, pubCredentials : j.u.Collections$SynchronizedSet@801720a0, privCredentials : j.u.Collections$SynchronizedSet@801720f0, readOnly : false)

org.apache.hadoop.mapred.YarnChild$2(val$taskFinal : org.apache.hadoop.mapred.MapTask@801532d8, val$job : org.apache.hadoop.mapred.JobConf@80045688, val$umbilical : com.sun.proxy.$Proxy9@80183238)

java.security.AccessControlContext(context : java.security.ProtectionDomain[](size: 2), isPrivileged : false, isAuthorized : true, privilegedContext : null, combiner : null, permissions : null, parent : null, isWrapped : false, isLimited : false, limitedContext : null)

org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
  Local variables org.apache.hadoop.security.UserGroupInformation(subject : javax.security.auth.Subject@80171fb8, user : org.apache.hadoop.security.User@80172040, isKeytab : false, isKrbTkt : false, isLoginExternal : false)

org.apache.hadoop.mapred.YarnChild$2(val$taskFinal : org.apache.hadoop.mapred.MapTask@801532d8, val$job : org.apache.hadoop.mapred.JobConf@80045688, val$umbilical : com.sun.proxy.$Proxy9@80183238)

org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
  Local variables String[4]{"10.224.22.18", "46104", "attempt_1531301354100_57311_m_001303_1", "285873023226794"}

org.apache.hadoop.mapred.JobConf(quietmode : true, allowNullValueProperties : false, resources : j.u.ArrayList(size: 1), finalParameters : j.u.Collections$SetFromMap@800457f0, loadDefaults : true, updatingResource : j.u.concurrent.ConcurrentHashMap(size: 1,913), properties : j.u.Properties(size: 1,915), overlay : j.u.Properties(size: 28), classLoader : sun.misc.Launcher$AppClassLoader@8001d198, credentials : org.apache.hadoop.security.Credentials@8001fbc8)

"10.224.22.18"

java.net.InetSocketAddress(holder : java.net.InetSocketAddress$InetSocketAddressHolder@800469b0)

org.apache.hadoop.mapred.TaskAttemptID(id : 1, taskId : org.apache.hadoop.mapred.TaskID@8004ee28)

org.apache.hadoop.mapred.JVMId(isMap : true, jobId : org.apache.hadoop.mapred.JobID@8004edd8, jvmId : 285873023226794)

org.apache.hadoop.security.Credentials(secretKeysMap : j.u.HashMap(size: 1), tokenMap : j.u.HashMap(size: 5))

org.apache.hadoop.security.UserGroupInformation(subject : javax.security.auth.Subject@80046a18, user : org.apache.hadoop.security.User@80046aa0, isKeytab : false, isKrbTkt : false, isLoginExternal : false)

org.apache.hadoop.security.token.Token(identifier : byte[](size: 24), password : byte[](size: 20), kind : org.apache.hadoop.io.Text@80046e68, service : org.apache.hadoop.io.Text@80046db8, renewer : null)

com.sun.proxy.$Proxy9(h : org.apache.hadoop.ipc.WritableRpcEngine$Invoker@80183248)

org.apache.hadoop.mapred.JvmContext(jvmId : org.apache.hadoop.mapred.JVMId@8004ed70, pid : "-1000")

org.apache.hadoop.mapred.MapTask(jobFile : "job.xml", user : "sz.ho", taskId : org.apache.hadoop.mapred.TaskAttemptID@801533b8, partition : 1303, encryptedSpillKey : byte[](size: 1), taskStatus : org.apache.hadoop.mapred.MapTaskStatus@80153498, jobRunStateForCleanup : null, jobCleanup : false, jobSetup : false, taskCleanup : false, extraData : org.apache.hadoop.io.BytesWritable@80198e20, skipRanges : org.apache.hadoop.mapred.SortedRanges@80198e50, skipping : false, writeSkipRecs : true, currentRecStartIndex : 0, currentRecIndexIterator : org.apache.hadoop.mapred.SortedRanges$SkipRangeIterator@80198eb8, pTree : org.apache.hadoop.yarn.util.ProcfsBasedProcessTree@c0502ad8, initCpuCumulativeTime : 9080, conf : org.apache.hadoop.mapred.JobConf@80045688, mapOutputFile : org.apache.hadoop.mapred.YarnOutputFiles@80198f18, lDirAlloc : org.apache.hadoop.fs.LocalDirAllocator@80198f40, jobContext : org.apache.hadoop.mapred.JobContextImpl@c0502c70, taskContext : org.apache.hadoop.mapred.TaskAttemptContextImpl@c0503e20, outputFormat : null, committer : org.apache.hadoop.hive.ql.io.HiveFileFormatUtils$NullOutputCommitter@c0503e78, spilledRecordsCounter : org.apache.hadoop.mapred.Counters$Counter@80172d48, failedShuffleCounter : org.apache.hadoop.mapred.Counters$Counter@80172cb8, mergedMapOutputsCounter : org.apache.hadoop.mapred.Counters$Counter@80172dd8, numSlotsRequired : 1, umbilical : com.sun.proxy.$Proxy9@80183238, tokenSecret : javax.crypto.spec.SecretKeySpec@80198f50, shuffleSecret : javax.crypto.spec.SecretKeySpec@801915d8, gcUpdater : org.apache.hadoop.mapred.Task$GcTimeUpdater@80173280, taskProgress : org.apache.hadoop.util.Progress@8015bd88, counters : org.apache.hadoop.mapred.Counters@80172b58, taskDone : java.util.concurrent.atomic.AtomicBoolean@80172ad0, statisticUpdaters : j.u.HashMap(size: 3), splitMetaInfo : org.apache.hadoop.mapreduce.split.JobSplit$TaskSplitIndex@8015b858, mapPhase : org.apache.hadoop.util.Progress@c0517640, sortPhase : 
org.apache.hadoop.util.Progress@c05175b0)

org.apache.hadoop.security.UserGroupInformation(subject : javax.security.auth.Subject@80171fb8, user : org.apache.hadoop.security.User@80172040, isKeytab : false, isKrbTkt : false, isLoginExternal : false)

j.u.concurrent.Executors$DelegatedScheduledExecutorService(e : j.u.concurrent.ScheduledThreadPoolExecutor@8001f838, e : j.u.concurrent.ScheduledThreadPoolExecutor@8001f838)

org.apache.hadoop.mapred.JvmTask(t : org.apache.hadoop.mapred.MapTask@801532d8, shouldDie : false)

org.apache.hadoop.mapred.MapTask(jobFile : "job.xml", user : "sz.ho", taskId : org.apache.hadoop.mapred.TaskAttemptID@801533b8, partition : 1303, encryptedSpillKey : byte[](size: 1), taskStatus : org.apache.hadoop.mapred.MapTaskStatus@80153498, jobRunStateForCleanup : null, jobCleanup : false, jobSetup : false, taskCleanup : false, extraData : org.apache.hadoop.io.BytesWritable@80198e20, skipRanges : org.apache.hadoop.mapred.SortedRanges@80198e50, skipping : false, writeSkipRecs : true, currentRecStartIndex : 0, currentRecIndexIterator : org.apache.hadoop.mapred.SortedRanges$SkipRangeIterator@80198eb8, pTree : org.apache.hadoop.yarn.util.ProcfsBasedProcessTree@c0502ad8, initCpuCumulativeTime : 9080, conf : org.apache.hadoop.mapred.JobConf@80045688, mapOutputFile : org.apache.hadoop.mapred.YarnOutputFiles@80198f18, lDirAlloc : org.apache.hadoop.fs.LocalDirAllocator@80198f40, jobContext : org.apache.hadoop.mapred.JobContextImpl@c0502c70, taskContext : org.apache.hadoop.mapred.TaskAttemptContextImpl@c0503e20, outputFormat : null, committer : org.apache.hadoop.hive.ql.io.HiveFileFormatUtils$NullOutputCommitter@c0503e78, spilledRecordsCounter : org.apache.hadoop.mapred.Counters$Counter@80172d48, failedShuffleCounter : org.apache.hadoop.mapred.Counters$Counter@80172cb8, mergedMapOutputsCounter : org.apache.hadoop.mapred.Counters$Counter@80172dd8, numSlotsRequired : 1, umbilical : com.sun.proxy.$Proxy9@80183238, tokenSecret : javax.crypto.spec.SecretKeySpec@80198f50, shuffleSecret : javax.crypto.spec.SecretKeySpec@801915d8, gcUpdater : org.apache.hadoop.mapred.Task$GcTimeUpdater@80173280, taskProgress : org.apache.hadoop.util.Progress@8015bd88, counters : org.apache.hadoop.mapred.Counters@80172b58, taskDone : java.util.concurrent.atomic.AtomicBoolean@80172ad0, statisticUpdaters : j.u.HashMap(size: 3), splitMetaInfo : org.apache.hadoop.mapreduce.split.JobSplit$TaskSplitIndex@8015b858, mapPhase : org.apache.hadoop.util.Progress@c0517640, sortPhase : 
org.apache.hadoop.util.Progress@c05175b0)




3. Where Memory Goes, by Class

  # instances    Shallow size           Impl-incl. size        Retained memory        Class name
      492,169    1,106,967Kb (68.0%)    1,106,967Kb (68.0%)    1,106,967Kb (68.0%)    byte[]
Reference chains
Expensive data fields

1,047,552Kb (64.3%): byte[]: 1 object

 Random sample 
byte[1072693248]{(all elements are 0s)}

 ↖org.apache.hadoop.mapred.MapTask$MapOutputBuffer.kvbuffer
48,016Kb (2.9%): byte[]: 488,662 objects

 Random sample 
byte[46]{'MAISON MARGIELA contrast sleeves bomber jacket'}
byte[110]{'Cahaya Terang UP Flash LED Selfie Luminous Hard Case untuk IPhone 7 Plus dengan USB Tanggal Line-Internasional'}
byte[54]{'Occidental Leather 5019 Pro Leather Utility Bag - intl'}
byte[32]{'Jes MaHarry Sky Dome Ring female'}
byte[64]{'Giclee Print: Black Dog's Treasure Island by Bill Bell : 24x18in'}
byte[52]{'Art Print: Floral Eclipse I by Paul Duncan : 20x20in'}
byte[56]{'2012 Mazda 2 1.5 (ปี 09-14) Sports Maxx Hatchback AT'}
byte[99]{'BYT Ultra Tipis Window Flip Cover Case untuk Apple IPhone 7 Plus WithCard Slot & Stand (Biru) -Intl'}
byte[58]{'R-Just Metal Aluminum Phone Case For Iphone 6 Plus (Black)'}
byte[73]{'Sketsa Bear Art Pola Phone Case untuk Motorola MOTO E (2014) (Hitam)-Intl'}
byte[60]{'Stretched Canvas Print: Lighthouse by Mina Teslaru : 36x48in'}
byte[55]{'Art Print: Hockey Shoe Patent by Cole Borders : 24x18in'}
byte[41]{'Jubilee Crown Chandelier - 3 Light - Pink'}
byte[33]{'Our Legacy rib knit cuffs sweater'}
byte[99]{'9 H Tempered Steel Glass Ultra Tipis HD Anti Gores Layar Film Pelindung untuk Nokia Lumia 640- INTL'}
byte[80]{'Ocean Springbed Magic Wonder Size 200 x 200 - Mattress Only - Khusus Jabodetabek'}
byte[62]{'Dompet Flip Leather Case untuk HUAWEI Mate 9 Lite (Brown)-Intl'}
byte[63]{'Stretched Canvas Print: Avocados 3 by Suren Nersisyan : 24x18in'}
byte[64]{'Art Print: Los Angeles Watercolor Street Map by NaxArt : 12x12in'}
byte[34]{'Mặt nạ Laneige clear C miếng'}

 ↖org.apache.hadoop.io.Text.bytes
8,719Kb (0.5%): byte[]: 11 objects

 Random sample 
byte[74736]{'\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N', ... (33184 trailing 0s)}
byte[186840]{'155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598', ... (88850 trailing 0s)}
byte[88284]{'102000\N\N161000\N\N\N\N285000\N\N124000\N\N145000\N\N124000128000\N\N\N145000\N\N128000132000132000\N\N161000132000\N\N', ... (35224 trailing 0s)}
byte[86628]{'97000\N\N154000\N\N\N\N271000\N\N85000\N\N95000\N\N8500085000\N\N\N85000\N\N850008400084000\N\N10400084000\N\N\N\N61000\', ... (33498 trailing 0s)}
byte[527936]{'Chevron Anchor Boat Pola Phone Case untuk LenovoA6000 (Hitam)-Intl\N\NFor Samsung Galaxy J2 2016 Leather Case Luxury Sta', ...}
byte[2753314]{'@Ksamsung_ids@V0@V@Kseller_id@V13234@V@Ksimple_sku@VOE427ELAA8YCPQANID-20966951@V@Ksimple_sku_2@VOE427ELAA8YCPQANID-2096', ... (1170803 trailing 0s)}
byte[1343412]{'https://id-live-01.slatic.net/p/2/chevron-anchor-boat-pola-phone-case-untuk-lenovoa6000-hitam-intl-1488460338-01793051-e', ... (615522 trailing 0s)}
byte[2344694]{'http://i1.static-shopcade.com/270x270x100/000000000000000000000000xxxkhu/aHR0cHM6Ly9zYy1wcm9kdWN0LWltYWdlcy5zMy5hbWF6b25', ... (1047053 trailing 0s)}
byte[115190]{'237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704', ... (35190 trailing 0s)}
byte[1114812]{'\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N', ... (538688 trailing 0s)}
byte[292730]{'\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N', ...}

 ↖org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data


Full reference chains

1,047,552Kb (64.3%): byte[]: 1 object

 Random sample 
byte[1072693248]{(all elements are 0s)}

org.apache.hadoop.mapred.MapTask$MapOutputBuffer.kvbuffer
org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.this$0
j.l.Thread[]
j.l.ThreadGroup.threads
java.beans.WeakIdentityMap$Entry.referent
java.beans.WeakIdentityMap$Entry[]
java.beans.ThreadGroupContext$1.table
↖Java Static java.beans.ThreadGroupContext.contexts
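The all-zero byte[1072693248] above is the map-side sort buffer (MapOutputBuffer.kvbuffer). In stock Hadoop MapReduce that buffer is sized from the mapreduce.task.io.sort.mb property, so a quick division (a sketch, assuming the buffer was allocated at exactly the configured size) recovers the setting:

```python
# Back-of-the-envelope check: kvbuffer length from the sample above,
# converted to MiB (assumption: allocated directly from io.sort.mb).
kvbuffer_len = 1_072_693_248          # byte[1072693248], all zeros
mib = kvbuffer_len // (1024 * 1024)
print(mib)                            # -> 1023 MiB
```

A 1023 MiB buffer suggests mapreduce.task.io.sort.mb was set near its practical maximum; lowering it would free most of this 64.3% of the heap, at the cost of more frequent spills to disk.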
47,987Kb (2.9%): byte[]: 487,567 objects

 Random sample 
byte[46]{'MAISON MARGIELA contrast sleeves bomber jacket'}
byte[110]{'Cahaya Terang UP Flash LED Selfie Luminous Hard Case untuk IPhone 7 Plus dengan USB Tanggal Line-Internasional'}
byte[54]{'Occidental Leather 5019 Pro Leather Utility Bag - intl'}
byte[32]{'Jes MaHarry Sky Dome Ring female'}
byte[64]{'Giclee Print: Black Dog's Treasure Island by Bill Bell : 24x18in'}
byte[52]{'Art Print: Floral Eclipse I by Paul Duncan : 20x20in'}
byte[56]{'2012 Mazda 2 1.5 (ปี 09-14) Sports Maxx Hatchback AT'}
byte[99]{'BYT Ultra Tipis Window Flip Cover Case untuk Apple IPhone 7 Plus WithCard Slot & Stand (Biru) -Intl'}
byte[58]{'R-Just Metal Aluminum Phone Case For Iphone 6 Plus (Black)'}
byte[73]{'Sketsa Bear Art Pola Phone Case untuk Motorola MOTO E (2014) (Hitam)-Intl'}
byte[60]{'Stretched Canvas Print: Lighthouse by Mina Teslaru : 36x48in'}
byte[55]{'Art Print: Hockey Shoe Patent by Cole Borders : 24x18in'}
byte[41]{'Jubilee Crown Chandelier - 3 Light - Pink'}
byte[33]{'Our Legacy rib knit cuffs sweater'}
byte[99]{'9 H Tempered Steel Glass Ultra Tipis HD Anti Gores Layar Film Pelindung untuk Nokia Lumia 640- INTL'}
byte[80]{'Ocean Springbed Magic Wonder Size 200 x 200 - Mattress Only - Khusus Jabodetabek'}
byte[62]{'Dompet Flip Leather Case untuk HUAWEI Mate 9 Lite (Brown)-Intl'}
byte[63]{'Stretched Canvas Print: Avocados 3 by Suren Nersisyan : 24x18in'}
byte[64]{'Art Print: Los Angeles Watercolor Street Map by NaxArt : 12x12in'}
byte[34]{'Mặt nạ Laneige clear C miếng'}

org.apache.hadoop.io.Text.bytes
Object[]
org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys
{j.u.HashMap}.keys
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
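The 487,567 small Text.bytes arrays in this chain are the group-by keys held by the map-side hash aggregation. Simple arithmetic on the figures above (nothing beyond what the report states) gives the average key payload:

```python
# Average per-key payload for the 47,987Kb of Text.bytes arrays above
total_kb, arrays = 47_987, 487_567
avg_bytes = total_kb * 1024 / arrays
print(round(avg_bytes))               # -> 101 bytes per group key
```

So the keys themselves are cheap (~100 bytes each); the bulk of the per-group cost sits in the aggregation buffers discussed in the next sections.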
4,942Kb (0.3%): byte[]: 7 objects

 Random sample 
byte[74736]{'\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N', ... (33184 trailing 0s)}
byte[186840]{'155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598', ... (88850 trailing 0s)}
byte[88284]{'102000\N\N161000\N\N\N\N285000\N\N124000\N\N145000\N\N124000128000\N\N\N145000\N\N128000132000132000\N\N161000132000\N\N', ... (35224 trailing 0s)}
byte[86628]{'97000\N\N154000\N\N\N\N271000\N\N85000\N\N95000\N\N8500085000\N\N\N85000\N\N850008400084000\N\N10400084000\N\N\N\N61000\', ... (33498 trailing 0s)}
byte[527936]{'Chevron Anchor Boat Pola Phone Case untuk LenovoA6000 (Hitam)-Intl\N\NFor Samsung Galaxy J2 2016 Leather Case Luxury Sta', ...}
byte[2753314]{'@Ksamsung_ids@V0@V@Kseller_id@V13234@V@Ksimple_sku@VOE427ELAA8YCPQANID-20966951@V@Ksimple_sku_2@VOE427ELAA8YCPQANID-2096', ... (1170803 trailing 0s)}
byte[1343412]{'https://id-live-01.slatic.net/p/2/chevron-anchor-boat-pola-phone-case-untuk-lenovoa6000-hitam-intl-1488460338-01793051-e', ... (615522 trailing 0s)}

org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.cachedByteArrayRef
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
Object[]
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
3,776Kb (0.2%): byte[]: 4 objects

 Random sample 
byte[2344694]{'http://i1.static-shopcade.com/270x270x100/000000000000000000000000xxxkhu/aHR0cHM6Ly9zYy1wcm9kdWN0LWltYWdlcy5zMy5hbWF6b25', ... (1047053 trailing 0s)}
byte[115190]{'237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704', ... (35190 trailing 0s)}
byte[1114812]{'\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N', ... (538688 trailing 0s)}
byte[292730]{'\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N', ...}

org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.cachedByteArrayRef
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.cachedLazyStruct
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx[]
org.apache.hadoop.hive.ql.exec.MapOperator.currentCtxs
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


    3,900,708    60,948Kb (3.7%)    243,821Kb (15.0%)    244,022Kb (15.0%)    j.u.HashSet
Reference chains
Expensive data fields

182,843Kb (11.2%): j.u.HashSet: 2,925,498 objects

 Random sample 
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg.uniqueObjects
60,947Kb (3.7%): j.u.HashSet: 975,166 objects

 Random sample 
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg.uniqueObjects


Full reference chains

182,843Kb (11.2%): j.u.HashSet: 2,925,492 objects

 Random sample 
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg.uniqueObjects
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
60,947Kb (3.7%): j.u.HashSet: 975,164 objects

 Random sample 
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}
j.u.HashSet(size: 0) {}

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg.uniqueObjects
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
10Kb (< 0.1%): j.u.HashSet: 1 object

 Random sample 
j.u.HashSet<j.l.Class>(size: 275, capacity: 512) {class org.apache.hadoop.hive.ql.udf.ptf.MatchPath$MatchPathResolver, ...}

j.u.Collections$SynchronizedSet.c
org.apache.hadoop.hive.ql.exec.Registry.builtIns
↖Java Static org.apache.hadoop.hive.ql.exec.FunctionRegistry.system
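Nearly all of these HashSets are empty `uniqueObjects` sets, one per aggregation buffer. Per the report's own numbers (243,821Kb impl-inclusive across 3,900,708 instances), each empty set costs about 64 bytes in this dump's object layout, which reproduces both totals above:

```python
# Each empty j.u.HashSet ~64 bytes here (HashSet plus its backing HashMap;
# 243,821Kb / 3,900,708 instances ~ 64b per the report's impl-incl. size)
per_set = 64
count_aggs, sum_aggs = 2_925_492, 975_164
print(count_aggs * per_set // 1024)   # -> 182,843 Kb (CountAgg.uniqueObjects)
print(sum_aggs * per_set // 1024)     # -> 60,947 Kb  (SumLongAgg.uniqueObjects)
```

In Hive, `uniqueObjects` is only populated for COUNT(DISTINCT ...) / SUM(DISTINCT ...); for plain aggregates the sets stay empty, so the practical saving comes from keeping fewer groups in flight, e.g. via hive.map.aggr.hash.percentmemory or by disabling map-side aggregation with hive.map.aggr=false (both standard Hive properties; which is appropriate depends on the query).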


    2,925,498    68,566Kb (4.2%)    68,566Kb (4.2%)    251,409Kb (15.4%)    org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg
Reference chains
Expensive data fields

68,566Kb (4.2%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg: 2,925,492 objects (37% of all objects referenced here)

 Random sample 
(uniqueObjects : j.u.HashSet(size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 2)
(uniqueObjects : (size: 0), value : 4)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 3)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 0)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

68,566Kb (4.2%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg: 2,925,492 objects (37% of all objects referenced here)

 Random sample 
(uniqueObjects : j.u.HashSet(size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 2)
(uniqueObjects : (size: 0), value : 4)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 3)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 1)
(uniqueObjects : (size: 0), value : 0)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
144b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg: 6 objects (37% of all objects referenced here)

 Random sample 
(uniqueObjects : j.u.HashSet(size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)
(uniqueObjects : (size: 0), value : 0)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
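The CountAgg objects themselves add a second per-group cost on top of the empty sets. From the report's figures, each CountAgg is about 24 bytes shallow, and the arithmetic below (a sketch based only on the numbers above) matches the 68,566Kb total:

```python
# CountAgg shallow size ~24 bytes (68,566Kb / 2,925,492 instances);
# together with its empty uniqueObjects HashSet (~64b) each group pays
# roughly 88 bytes per COUNT column.
count_aggs = 2_925_492
print(count_aggs * 24 // 1024)        # -> 68,566 Kb, matching the report
print(24 + 64)                        # -> 88 bytes per group per COUNT
```

With ~2.9 million in-flight groups, these per-group slivers add up to the 251,409Kb retained by CountAgg shown in the table above.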


      487,583    38,092Kb (2.3%)    38,092Kb (2.3%)    438,479Kb (26.9%)    org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
Reference chains
Expensive data fields

38,092Kb (2.3%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]: 487,582 objects

 Random sample 
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cf685df8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c35408b8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c5b4a8b0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c7c77758, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c9da5058, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cbf1e5c0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ce05fc28, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@d019c970, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c404fe58, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c6658518, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c878e338, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ca8b0ca0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cca34190, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ceb734d8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c2a317e8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c505b138, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c7167a58, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c929a340, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cb3c3788, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cd548fa0, ...}

{j.u.HashMap}.values org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

38,092Kb (2.3%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]: 487,582 objects
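The 38,092Kb figure matches the shallow size of the arrays alone (not the aggregation objects they reference). A minimal back-of-envelope check, assuming a 64-bit JVM with compressed oops (16-byte array header, 4-byte references) — these layout numbers are assumptions, not something the report states:

```java
public class AggBufferArraySize {
    public static void main(String[] args) {
        long headerBytes = 16;            // array object header (assumed: 64-bit JVM, compressed oops)
        long slotBytes   = 4;             // one compressed reference per slot
        long perArray    = headerBytes + 16 * slotBytes;   // AggregationBuffer[16] -> 80 bytes
        long totalKb     = 487_582L * perArray / 1024;     // shallow size of all 487,582 arrays

        System.out.println(perArray + " bytes per array, " + totalKb + "Kb total");
        // -> 80 bytes per array, 38092Kb total, matching the report line above
    }
}
```

The 80-byte shallow size is also corroborated by the single-array cluster reported further below ("80b (< 0.1%) ... 1 object"). The bulk of the memory is therefore not in the arrays themselves but in the SumLongAgg/CountAgg/MinAgg objects they point to, plus the HashMap keys.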

 Random sample 
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cf685df8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c35408b8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c5b4a8b0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c7c77758, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c9da5058, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cbf1e5c0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ce05fc28, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@d019c970, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c404fe58, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c6658518, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c878e338, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ca8b0ca0, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cca34190, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@ceb734d8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c2a317e8, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c505b138, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c7167a58, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c929a340, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cb3c3788, ...}
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(1), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : false, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 1, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@cd548fa0, ...}

{j.u.HashMap}.values org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
80b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]: 1 object

 Random sample 
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[16]{org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : true, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg(empty : true, sum : j.l.Long(0), ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg(value : 0, ...), org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg@c501bcf8, ...}

org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations {j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


 1,950,332 objects  30,473Kb (1.9%)  30,473Kb (1.9%)  32,585Kb (2.0%)  org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg
Reference chains
Expensive data fields

30,473Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 1,950,328 objects (25% of all objects referenced here)

 Random sample 
20 copies of (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

30,473Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 1,950,328 objects (25% of all objects referenced here)

 Random sample 
20 copies of (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
64b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 4 objects (25% of all objects referenced here)

 Random sample 
4 copies of (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


 1,950,332 objects  30,473Kb (1.9%)  30,473Kb (1.9%)  32,585Kb (2.0%)  org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg
Reference chains
Expensive data fields

30,473Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg: 1,950,328 objects (25% of all objects referenced here)

 Random sample 
20 copies of (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

30,473Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg: 1,950,328 objects (25% of all objects referenced here)

 Random sample 
20 copies of (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
64b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg: 4 objects (25% of all objects referenced here)

 Random sample 
4 copies of (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


 975,166 objects  22,855Kb (1.4%)  22,855Kb (1.4%)  83,805Kb (5.1%)  org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg
Reference chains
Expensive data fields

22,855Kb (1.4%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg: 975,164 objects (12% of all objects referenced here)

 Random sample 
(empty : false, sum : j.l.Long(0), uniqueObjects : j.u.HashSet(size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

22,855Kb (1.4%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg: 975,164 objects (12% of all objects referenced here)

 Random sample 
(empty : false, sum : j.l.Long(0), uniqueObjects : j.u.HashSet(size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (1), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))
(empty : false, sum : (0), uniqueObjects : (size: 0))

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
48b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg: 2 objects (12% of all objects referenced here)

 Random sample 
(empty : true, sum : j.l.Long(0), uniqueObjects : j.u.HashSet(size: 0))
(empty : true, sum : (0), uniqueObjects : (size: 0))

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
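
Note that every SumLongAgg sampled above carries a uniqueObjects HashSet of size 0, while the summary at the top of this report attributes 243,821Kb (15.0%) of used memory to j.u.HashSet. If a large share of those sets are these never-used per-group instances (an assumption worth verifying against the full HashSet section of the report), allocating the set lazily would eliminate that cost. A minimal sketch of the pattern, not Hive's actual implementation:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch: allocate the per-group de-duplication set only on
// first use, so groups that never see a DISTINCT value pay nothing.
class LazyDistinctAgg {
    long sum;
    private Set<Object> uniqueObjects;  // stays null until actually needed

    void addDistinct(Object o) {
        if (uniqueObjects == null) {
            uniqueObjects = new HashSet<>();  // first DISTINCT value for this group
        }
        if (uniqueObjects.add(o)) {          // only count each value once
            sum += ((Number) o).longValue();
        }
    }

    boolean hasSet() { return uniqueObjects != null; }
}
```

With ~975,000 aggregation buffers live at once, even a few dozen bytes saved per empty set adds up to tens of megabytes.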


 3,901,164 objects  182,867Kb (11.2%)  19,599Kb (1.2%)  573,665Kb (35.2%)  j.u.HashMap
Reference chains
Expensive data fields

19,333Kb (1.2%): j.u.HashMap: 1 object

 Random sample 
j.u.HashMap<org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper, org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]>(size: 487582, capacity: 1048576) {(key:(hashcode : 2047900177, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@d5309e40), (key:(-1907323306, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c4b2d818), (key:(1322274519, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c44c19c8), (key:(1036008904, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@d9c4e5b0), (key:(-1604280216, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@dd28a590), (key:(1890611389, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c469e808), (key:(-1981773329, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c4697318), (key:(210766976, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@e1e1d4a8), (key:(1387287201, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c400d998), (key:(-826224943, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@e370ee48), ...}

 ↖org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

19,333Kb (1.2%): j.u.HashMap: 1 object

 Random sample 
j.u.HashMap<org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper, org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]>(size: 487582, capacity: 1048576) {(key:(hashcode : 2047900177, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@d5309e40), (key:(-1907323306, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c4b2d818), (key:(1322274519, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c44c19c8), (key:(1036008904, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@d9c4e5b0), (key:(-1604280216, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@dd28a590), (key:(1890611389, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c469e808), (key:(-1981773329, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c4697318), (key:(210766976, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@e1e1d4a8), (key:(1387287201, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@c400d998), (key:(-826224943, ...), val:org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[](16)@e370ee48), ...}

org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations {j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
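
Dividing the per-class totals reported in this section by the 487,582 entries of the sampled hashAggregations map gives a rough per-group footprint. A back-of-the-envelope check using only the shallow sizes reported above (it ignores the Count aggregation buffers, the AggregationBuffer[16] arrays, and the IntWritable/Text key objects, so it understates the true cost):

```java
// Rough per-group memory cost of the GroupByOperator hash aggregation map,
// computed from the per-class shallow-size totals in this report.
public class HashAggFootprint {
    public static void main(String[] args) {
        long entries      = 487_582;          // size of the sampled hashAggregations map
        long mapKb        = 19_333;           // the j.u.HashMap itself (table + nodes)
        long keyWrapperKb = 15_236;           // ListKeyWrapper instances
        long keysArrayKb  = 11_427;           // the Object[2] key arrays
        long minMaxKb     = 30_473 + 30_473;  // MinAgg + MaxAgg buffers
        long sumAggKb     = 22_855;           // SumLongAgg buffers

        long totalKb = mapKb + keyWrapperKb + keysArrayKb + minMaxKb + sumAggKb;
        long perGroupBytes = totalKb * 1024 / entries;
        System.out.println(totalKb + "Kb total, ~" + perGroupBytes + "b per group");
        // → 129797Kb total, ~272b per group
    }
}
```

Even this partial sum shows each distinct GROUP BY key costing on the order of a few hundred bytes, which is why half a million live groups (and the ~2 million groups implied by the Min/Max buffer counts before garbage collection) dominate this heap.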
39Kb (< 0.1%): j.u.HashMap: 1 object

 Random sample 
j.u.HashMap<String, j.u.LinkedHashMap>(size: 1000, capacity: 2048) {(key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=86", val:(size : 1, modCount : 1, threshold : 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=85", val:(1, 1, 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=88", val:(1, 1, 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=87", val:(1, 1, 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=82", val:(1, 1, 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=81", val:(1, 1, 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=84", val:(1, 1, 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=83", val:(1, 1, 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=850", val:(1, 1, 12, ...)), (key:"viewfs://root/user/bidata/partnerdb/catalogs/20180709/partner_partition=851", val:(1, 1, 12, ...)), ...}

org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap {j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
35Kb (< 0.1%): j.u.HashMap: 1 object

 Random sample 
j.u.HashMap<String, org.apache.hadoop.hive.conf.HiveConf$ConfVars>(size: 877, capacity: 2048) {(key:"hive.exec.reducers.bytes.per.reducer", val:(name : "BYTESPERREDUCER", varname : "hive.exec.reducers.bytes.per.reducer", defaultExpr : "256000000", ...)), (key:"hive.metastore.client.capability.check", val:("METASTORE_CAPABILITY_CHECK", "hive.metastore.client.capability.check", "true", ...)), (key:"datanucleus.storeManagerType", val:("METASTORE_STORE_MANAGER_TYPE", "datanucleus.storeManagerType", "rdbms", ...)), (key:"hive.aux.jars.path", val:("HIVEAUXJARS", "hive.aux.jars.path", "", ...)), (key:"hive.exec.stagingdir", val:("STAGINGDIR", "hive.exec.stagingdir", ".hive-staging", ...)), (key:"hive.metastore.hbase.aggregate.stats.false.positive.probability", val:("METASTORE_HBASE_AGGREGATE_STATS_CACHE_FALSE_POSITIVE_PROBABILITY", "hive.metastore.hbase.aggregate.stats.false.positive.probability", "0.01", ...)), (key:"hive.exec.default.partition.name", val:("DEFAULTPARTITIONNAME", "hive.exec.default.partition.name", "__HIVE_DEFAULT_PARTITION__", ...)), (key:"mapreduce.input.fileinputformat.split.minsize.per.rack", val:("MAPREDMINSPLITSIZEPERRACK", "mapreduce.input.fileinputformat.split.minsize.per.rack", "1", ...)), (key:"hive.metastore.event.expiry.duration", val:("METASTORE_EVENT_EXPIRY_DURATION", "hive.metastore.event.expiry.duration", "0s", ...)), (key:"hive.exec.mode.local.auto.input.files.max", val:("LOCALMODEMAXINPUTFILES", "hive.exec.mode.local.auto.input.files.max", "4", ...)), ...}

 ↖Java Static org.apache.hadoop.hive.conf.HiveConf.vars


 487,583 objects  15,236Kb (0.9%)  15,236Kb (0.9%)  93,698Kb (5.8%)  org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper
Reference chains
Expensive data fields

15,236Kb (0.9%): org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper: 487,582 objects

 Random sample 
(hashcode : -277661827, keys : Object[](size: 2), equalComparer : org.apache.hadoop.hive.serde2.objectinspector.ListObjectsEqualComparer@c4b38458, this$0 : org.apache.hadoop.hive.ql.exec.KeyWrapperFactory@c4b38558)
(hashcode : -1559694932, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1692096880, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -747006738, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1293666561, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 426306545, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -200973971, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1852010205, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -260380835, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1768075374, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1613810886, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1468453948, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1803465267, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1021019654, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 189431968, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -503903149, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1048294277, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 87405895, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 2077607202, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1773541156, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)

{j.u.HashMap}.keys org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

15,236Kb (0.9%): org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper: 487,582 objects

 Random sample 
(hashcode : -277661827, keys : Object[](size: 2), equalComparer : org.apache.hadoop.hive.serde2.objectinspector.ListObjectsEqualComparer@c4b38458, this$0 : org.apache.hadoop.hive.ql.exec.KeyWrapperFactory@c4b38558)
(hashcode : -1559694932, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1692096880, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -747006738, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1293666561, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 426306545, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -200973971, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1852010205, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -260380835, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1768075374, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1613810886, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1468453948, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1803465267, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -1021019654, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 189431968, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : -503903149, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1048294277, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 87405895, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 2077607202, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)
(hashcode : 1773541156, keys : (size: 2), equalComparer : @c4b38458, this$0 : @c4b38558)

{j.u.HashMap}.keys org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
32b (< 0.1%): org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper: 1 object

 Random sample 
(hashcode : 1569083627, keys : Object[](size: 2), equalComparer : org.apache.hadoop.hive.serde2.objectinspector.ListObjectsEqualComparer@c4b38888, this$0 : org.apache.hadoop.hive.ql.exec.KeyWrapperFactory@c4b38558)

org.apache.hadoop.hive.ql.exec.GroupByOperator.newKeys {j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
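
The half-million live ListKeyWrapper keys indicate that the map-side GROUP BY accumulated one hash table entry per distinct key without spilling. Hive's GroupByOperator is designed to flush partial aggregates downstream when the hash table grows too large (controlled by settings such as hive.map.aggr.hash.percentmemory and hive.map.aggr.hash.min.reduction). Conceptually the mechanism looks like this hypothetical sketch, which is not Hive's code; class and method names are invented for illustration:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: a map-side aggregation table that emits partial
// results and clears itself once it exceeds a bound on group count,
// analogous to how Hive flushes when its memory threshold is exceeded.
class BoundedAggTable {
    private final int maxGroups;
    private final Map<String, long[]> groups = new HashMap<>();
    int flushes = 0;  // how many times partials were emitted downstream

    BoundedAggTable(int maxGroups) { this.maxGroups = maxGroups; }

    void add(String key, long value) {
        groups.computeIfAbsent(key, k -> new long[1])[0] += value;
        if (groups.size() >= maxGroups) {
            // In Hive, the partial aggregates would be forwarded to the
            // reducer here; this sketch just counts the flush.
            flushes++;
            groups.clear();
        }
    }

    int size() { return groups.size(); }
}
```

If the real operator's flush never fires (e.g. the memory threshold is set too high for the container's heap, or the per-entry size estimate is too low), the table grows until an OutOfMemoryError, which matches the "main" thread failure reported at the top of this dump.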


 495,823 objects  11,972Kb (0.7%)  11,711Kb (0.7%)  84,208Kb (5.2%)  Object[]
Reference chains
Expensive data fields

11,427Kb (0.7%): Object[]: 487,583 objects

 Random sample 
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 21, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 76, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 161, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 17, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 49, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 46, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 19704), org.apache.hadoop.io.Text(length : 83, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 120, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 109, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 33, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 69, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 49, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 10704), org.apache.hadoop.io.Text(length : 48, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 22, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 111, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 61, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 59, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 61, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 67, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 46704), org.apache.hadoop.io.Text(length : 66, ...)}

 ↖org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys


Full reference chains

11,427Kb (0.7%): Object[]: 487,582 objects

 Random sample 
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 21, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 76, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 161, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 17, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 49, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 46, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 19704), org.apache.hadoop.io.Text(length : 83, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 120, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 109, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 33, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 69, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 49, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 10704), org.apache.hadoop.io.Text(length : 48, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 23704), org.apache.hadoop.io.Text(length : 22, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 111, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 61, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 15598), org.apache.hadoop.io.Text(length : 59, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 61, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 1704), org.apache.hadoop.io.Text(length : 67, ...)}
Object[2]{org.apache.hadoop.io.IntWritable(value : 46704), org.apache.hadoop.io.Text(length : 66, ...)}

org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys {j.u.HashMap}.keys
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
159Kb (< 0.1%): Object[]: 2,433 objects (36% of all objects referenced here)

 Random sample 
Object[20](element class String){null, null, null, null, null, "GET", null, "TRACE", null, null, null, null, null, null, null, "POST", "HEAD", "OPTIONS", "PUT", "DELETE"}
Object[3](element class String){",name=", "monitor", "control"}
Object[6](element class String){null, null, null, null, null, "9223372036854775808"}
Object[1]{null}
Object[2]{(all elements are nulls)}
Object[2]{(all elements are nulls)}
Object[26](element class String){"control", null, null, null, null, ".handlers", "java.util.logging.config.class", null, null, "", "java.util.logging.config.file", "java.home", null, "lib", "logging.properties", "config", null, null, null, null, ...}
Object[17](element class String){"http://www.w3.org/TR/1998/REC-xml-19980210", "MSG_GRAMMAR_NOT_FOUND", "RootElementTypeMustMatchDoctypedecl", "ElementUnterminated", "http://www.w3.org/TR/1999/REC-xml-names-19990114", "ElementXMLNSPrefix", "ElementPrefixUnbound", "AttributePrefixUnbound", "AttributeNSNotUnique", "AttributeNotUnique", "ElementEntityMismatch", "EqRequiredInAttribute", "CantBindXMLNS", "CantBindXML", "EmptyPrefixedAttName", "ETagRequired", "ETagUnterminated"}
Object[7](element class String){"Malformed \uxxxx encoding.", "#", "8859_1", "=", "UTF-8", "-- listing properties --", "..."}
Object[11](element class String){"env:", "system:", null, null, null, null, null, "", "Variable substitution depth is deeper than ", " for expression ", "\$\{[^\}\$ ]+\}"}
Object[1](element class String){"spark"}
Object[17](element class String){null, null, null, null, null, null, null, null, null, null, "modifyThread", null, null, null, null, null, null (16 elements are nulls)}
Object[8](element class String){"os.arch", "", "i386", "x86", "amd64", null, null, null}
Object[1]{null}
Object[6](element class String){"META-INF/services/", null, null, null, null, null}
Object[2]{(all elements are nulls)}
Object[2]{(all elements are nulls)}
Object[9]{(all elements are nulls)}
Object[2]{(all elements are nulls)}
Object[7](element class String){"ISO-8859-1", "#!", null, null, "include", "line.separator", null}

↖Unreachable
  All or some objects may start life as:

48b (< 0.1%): Object[]: 1 object

 Random sample 
Object[8](element class String){"core-default.xml", "core-site.xml", "mapred-default.xml", "mapred-site.xml", "yarn-default.xml", "yarn-site.xml", "hdfs-default.xml", "hdfs-site.xml"}

j.u.concurrent.CopyOnWriteArrayList.array ↖Java Static org.apache.hadoop.conf.Configuration.defaultResources
40b (< 0.1%): Object[]: 1 object

 Random sample 
Object[6](element class String){"INPUT__FILE__NAME", "BLOCK__OFFSET__INSIDE__FILE", "ROW__OFFSET__INSIDE__BLOCK", "RAW__DATA__SIZE", "GROUPING__ID", "ROW__ID"}

com.google.common.collect.RegularImmutableSet.elements ↖Java Static org.apache.hadoop.hive.ql.metadata.VirtualColumn.VIRTUAL_COLUMN_NAMES
40b (< 0.1%): Object[]: 1 object

 Random sample 
Object[6](element class j.l.Integer){null, null, null, null, (2048), null}

 ↖Java Static org.apache.xerces.impl.XMLEntityManager.PROPERTY_DEFAULTS
504b (< 0.1%): Object[]: 12 objects

 Random sample 
Object[1](element class java.lang.invoke.LambdaForm$Name){(index : 0, ...)}
Object[8](element class java.lang.invoke.LambdaForm$Name){(index : 1, ...), (2, ...), (3, ...), (4, ...), (5, ...), (6, ...), (7, ...), (8, ...)}
Object[1](element class java.lang.invoke.LambdaForm$Name){(index : 0, ...)}
Object[9](element class java.lang.invoke.LambdaForm$Name){(index : 1, ...), (2, ...), (3, ...), (4, ...), (5, ...), (6, ...), (7, ...), (8, ...), (9, ...)}
Object[1](element class java.lang.invoke.LambdaForm$Name){(index : 0, ...)}
Object[10](element class java.lang.invoke.LambdaForm$Name){(index : 1, ...), (2, ...), (3, ...), (4, ...), (5, ...), (6, ...), (7, ...), (8, ...), (9, ...), (10, ...)}
Object[1](element class java.lang.invoke.LambdaForm$Name){(index : 0, ...)}
Object[11](element class java.lang.invoke.LambdaForm$Name){(index : 1, ...), (2, ...), (3, ...), (4, ...), (5, ...), (6, ...), (7, ...), (8, ...), (9, ...), (10, ...), (11, ...)}
Object[1](element class java.lang.invoke.LambdaForm$Name){(index : 0, ...)}
Object[1](element class java.lang.invoke.LambdaForm$Name){(index : 0, ...)}
Object[12](element class java.lang.invoke.LambdaForm$Name){(index : 1, ...), (2, ...), (3, ...), (4, ...), (5, ...), (6, ...), (7, ...), (8, ...), (9, ...), (10, ...), (11, ...), (12, ...)}
Object[13](element class java.lang.invoke.LambdaForm$Name){(index : 1, ...), (2, ...), (3, ...), (4, ...), (5, ...), (6, ...), (7, ...), (8, ...), (9, ...), (10, ...), (11, ...), (12, ...), (13, ...)}

java.lang.invoke.LambdaForm$Name.arguments
java.lang.invoke.LambdaForm$Name[]
java.lang.invoke.LambdaForm.names
java.lang.invoke.DirectMethodHandle.form
java.lang.invoke.MethodHandle[]
↖Java Static java.lang.invoke.MethodHandleImpl$Lazy.FILL_ARRAYS


23Kb (< 0.1%): Object[]: 999 objects (99% of all objects referenced here)

 Random sample 
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 933)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 38)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 153)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 825)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 248)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 370)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 425)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 638)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 901)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 203)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 335)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 517)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 931)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 318)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 124)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 422)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 665)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 775)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 753)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 456)}

Object[] org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
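The chain above ends at the static field org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap, so every MapWork registered there, together with its whole operator tree and row buffers, stays strongly reachable for the lifetime of the class. A minimal sketch of this retention pattern (the class and field names below are illustrative, not Hive's actual API):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical illustration: a JVM-wide static registry keeps every
// registered work object (and everything it references) strongly
// reachable until the entry is explicitly removed.
public class StaticRegistryLeak {
    static final Map<String, Object> gWorkMap = new HashMap<>();

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            // Each entry retains its value even after the "query" finishes.
            gWorkMap.put("plan-" + i, new byte[1024]);
        }
        System.out.println(gWorkMap.size());   // 1000 plans still retained

        // The fix: drop entries once the corresponding work completes,
        // so the retained subgraph becomes collectible.
        gWorkMap.clear();
        System.out.println(gWorkMap.size());   // 0
    }
}
```

Whether this is a leak or by design depends on whether the application expects finished query plans to stay cached; the report cannot decide that by itself.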


org.apache.hadoop.io.Text: 498,673 objects, 11,687Kb (0.7%) shallow, 11,687Kb (0.7%) impl-inclusive, 59,704Kb (3.7%) retained
Reference chains
Expensive data fields

11,427Kb (0.7%): org.apache.hadoop.io.Text: 487,567 objects (49% of all objects referenced here)

 Random sample 
(bytes : byte[](size: 81), length : 81)
(bytes : (size: 125), length : 125)
(bytes : (size: 63), length : 63)
(bytes : (size: 105), length : 105)
(bytes : (size: 67), length : 67)
(bytes : (size: 72), length : 72)
(bytes : (size: 54), length : 54)
(bytes : (size: 59), length : 59)
(bytes : (size: 62), length : 62)
(bytes : (size: 125), length : 125)
(bytes : (size: 98), length : 98)
(bytes : (size: 77), length : 77)
(bytes : (size: 139), length : 139)
(bytes : (size: 118), length : 118)
(bytes : (size: 69), length : 69)
(bytes : (size: 57), length : 57)
(bytes : (size: 73), length : 73)
(bytes : (size: 100), length : 100)
(bytes : (size: 79), length : 79)
(bytes : (size: 42), length : 42)

Object[] org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys


Full reference chains

11,427Kb (0.7%): org.apache.hadoop.io.Text: 487,567 objects (49% of all objects referenced here)

 Random sample 
(bytes : byte[](size: 81), length : 81)
(bytes : (size: 125), length : 125)
(bytes : (size: 63), length : 63)
(bytes : (size: 105), length : 105)
(bytes : (size: 67), length : 67)
(bytes : (size: 72), length : 72)
(bytes : (size: 54), length : 54)
(bytes : (size: 59), length : 59)
(bytes : (size: 62), length : 62)
(bytes : (size: 125), length : 125)
(bytes : (size: 98), length : 98)
(bytes : (size: 77), length : 77)
(bytes : (size: 139), length : 139)
(bytes : (size: 118), length : 118)
(bytes : (size: 69), length : 69)
(bytes : (size: 57), length : 57)
(bytes : (size: 73), length : 73)
(bytes : (size: 100), length : 100)
(bytes : (size: 79), length : 79)
(bytes : (size: 42), length : 42)

Object[] org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys
{j.u.HashMap}.keys
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
233Kb (< 0.1%): org.apache.hadoop.io.Text: 9,980 objects

 Random sample 
(bytes : byte[](size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)
(bytes : (size: 0), length : 0)

org.apache.hadoop.hive.serde2.lazy.LazyString.data
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.field
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.cachedLazyStruct
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
11Kb (< 0.1%): org.apache.hadoop.io.Text: 500 objects

 Random sample 
(bytes : byte[](size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)
(bytes : (size: 2), length : 2)

org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.nullSequence
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.cachedLazyStruct
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


String: 102,739 objects, 2,407Kb (0.1%) shallow, 11,290Kb (0.7%) impl-inclusive, 11,290Kb (0.7%) retained
Reference chains
Expensive data fields

2,111Kb (0.1%): String: 13,710 objects

 Random sample 
"http://static.criteo.net/images/produits/14842/stars/2.png"
4 copies of "http://static.criteo.net/images/produits/14842/stars/0.png"
3 copies of "https://www.viator.com/images/stars/orange/16_5.gif"
4 copies of "http://static.criteo.net/images/produits/14842/stars/4.png"
4 copies of "http://static.criteo.net/images/produits/14842/stars/5.png"
"https://www.viator.com/images/stars/orange/16_4.gif"
2 copies of "http://static.criteo.net/images/produits/14842/stars/3.png"
"https://www.viator.com/images/stars/orange/16_4_5.gif"

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg.o
2,111Kb (0.1%): String: 13,710 objects

 Random sample 
6 copies of "http://static.criteo.net/images/produits/14842/stars/5.png"
4 copies of "http://static.criteo.net/images/produits/14842/stars/3.png"
3 copies of "https://www.viator.com/images/stars/orange/16_4_5.gif"
3 copies of "http://static.criteo.net/images/produits/14842/stars/4.png"
2 copies of "https://www.viator.com/images/stars/orange/16_5.gif"
2 copies of "http://static.criteo.net/images/produits/14842/stars/0.png"

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg.o


Full reference chains

2,111Kb (0.1%): String: 13,710 objects

 Random sample 
6 copies of "http://static.criteo.net/images/produits/14842/stars/5.png"
4 copies of "http://static.criteo.net/images/produits/14842/stars/3.png"
3 copies of "https://www.viator.com/images/stars/orange/16_4_5.gif"
3 copies of "http://static.criteo.net/images/produits/14842/stars/4.png"
2 copies of "https://www.viator.com/images/stars/orange/16_5.gif"
2 copies of "http://static.criteo.net/images/produits/14842/stars/0.png"

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg.o
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
2,111Kb (0.1%): String: 13,710 objects

 Random sample 
"http://static.criteo.net/images/produits/14842/stars/2.png"
4 copies of "http://static.criteo.net/images/produits/14842/stars/0.png"
3 copies of "https://www.viator.com/images/stars/orange/16_5.gif"
4 copies of "http://static.criteo.net/images/produits/14842/stars/4.png"
4 copies of "http://static.criteo.net/images/produits/14842/stars/5.png"
"https://www.viator.com/images/stars/orange/16_4.gif"
2 copies of "http://static.criteo.net/images/produits/14842/stars/3.png"
"https://www.viator.com/images/stars/orange/16_4_5.gif"

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg.o
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
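The samples above contain many identical copies of the same URL strings, each pinned by a separate min/max aggregation buffer. If the aggregated values are treated as read-only, such duplicates can be collapsed through a canonicalizing cache. A hedged sketch of the idea (the cache class here is hypothetical, not part of Hive):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: return one shared instance per distinct string so
// that repeated values retain a single copy instead of one per row.
public class StringCanonicalizer {
    private final Map<String, String> cache = new HashMap<>();

    public String canonicalize(String s) {
        // putIfAbsent returns the previously cached instance, if any.
        String prev = cache.putIfAbsent(s, s);
        return prev != null ? prev : s;
    }

    public static void main(String[] args) {
        StringCanonicalizer c = new StringCanonicalizer();
        // new String(...) forces distinct instances, as deserialization would.
        String a = c.canonicalize(new String("http://example.com/stars/5.png"));
        String b = c.canonicalize(new String("http://example.com/stars/5.png"));
        System.out.println(a == b);   // true: one retained instance, not two
    }
}
```

String.intern() achieves a similar effect but with JVM-global lifetime; a local cache like this one can be discarded with the query.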
1,365Kb (< 0.1%): String: 22,977 objects

 Random sample 
"createddate"
2 copies of "hashcode"
"reallyrecommendable"
3 copies of "extra"
"lasttouch"
"id"
"statflag"
2 copies of "bigimage"
"retailprice"
3 copies of "recommendable"
"filtered"
"additionalimagelinks"
"name"
"partnerid"

String[] j.u.Arrays$ArrayList.a
org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.columnNames
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serdeParams
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


org.apache.hadoop.io.IntWritable: 493,612 objects, 7,712Kb (0.5%) shallow, 7,712Kb (0.5%) impl-inclusive, 7,712Kb (0.5%) retained
Reference chains
Expensive data fields

7,618Kb (0.5%): org.apache.hadoop.io.IntWritable: 487,582 objects (50% of all objects referenced here)

 Random sample 
(value : 8598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)

Object[] org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys


Full reference chains

7,618Kb (0.5%): org.apache.hadoop.io.IntWritable: 487,582 objects (50% of all objects referenced here)

 Random sample 
(value : 8598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 8598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)
(value : 15598)

Object[] org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper.keys
{j.u.HashMap}.keys
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
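The sample above shows roughly 488,000 IntWritable instances carrying only a couple of distinct values (8598 and 15598). Where grouping keys are never mutated after insertion, one boxed instance per distinct value would suffice. A sketch of per-value caching, using a stand-in class since org.apache.hadoop.io.IntWritable is not on a default classpath (note that the real IntWritable is mutable, so sharing is safe only if nothing calls set() on a shared instance):

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for org.apache.hadoop.io.IntWritable: an immutable int box.
class IntBox {
    final int value;
    IntBox(int value) { this.value = value; }
}

// Illustrative cache: when keys repeat a handful of distinct ints,
// share one boxed instance per value instead of allocating one per row.
public class IntBoxCache {
    private final Map<Integer, IntBox> cache = new HashMap<>();

    IntBox get(int v) {
        return cache.computeIfAbsent(v, IntBox::new);
    }

    public static void main(String[] args) {
        IntBoxCache c = new IntBoxCache();
        IntBox a = c.get(15598);
        IntBox b = c.get(15598);
        System.out.println(a == b);   // true: single shared instance
    }
}
```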
77Kb (< 0.1%): org.apache.hadoop.io.IntWritable: 4,990 objects

 Random sample 
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)
(value : 0)

org.apache.hadoop.hive.serde2.lazy.LazyInteger.data
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.field
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.cachedLazyStruct
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
15Kb (< 0.1%): org.apache.hadoop.io.IntWritable: 999 objects

 Random sample 
(value : 830)
(value : 269)
(value : 49)
(value : 491)
(value : 778)
(value : 866)
(value : 357)
(value : 712)
(value : 392)
(value : 570)
(value : 846)
(value : 562)
(value : 699)
(value : 384)
(value : 669)
(value : 133)
(value : 594)
(value : 625)
(value : 622)
(value : 310)

Object[] Object[]
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


j.u.Properties: 2,026 objects, 94Kb (< 0.1%) shallow, 2,279Kb (0.1%) impl-inclusive, 3,572Kb (0.2%) retained
Reference chains
Full reference chains

967Kb (< 0.1%): j.u.Properties: 999 objects

 Random sample 
j.u.Properties<String, String>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187675"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1981871005"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187686"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"504182310"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187797"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"79466840"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187687"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"122594275"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187808"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"98299908"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187663"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1849065731"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187675"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"65571096"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187821"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1104855732"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187846"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"428864255"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187763"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"93614981"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187786"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"193388137"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187676"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"52798226"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187678"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"309248856"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187774"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1443123224"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187797"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"52523300"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187654"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"52489296"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187656"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"767087998"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187811"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"192229924"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187740"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"195071729"), (key:"bucket_count", val:"-1"), ...}
j.u.Properties<>(size: 23, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"EXTERNAL", val:"TRUE"), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"partition_columns", val:"partner_partition"), (key:"last_modified_time", val:"1531187688"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"455181474"), (key:"bucket_count", val:"-1"), ...}

org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.tableProperties
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serdeParams
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
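The sampled Properties objects are near-identical: 23 entries each, differing only in values such as totalSize and last_modified_time. java.util.Properties supports a defaults chain, so in principle each per-partition object could hold only its own overrides on top of one shared table-level object. A small sketch of that idea (the table name and keys are copied from the samples; the sharing itself is a suggestion, not what Hive currently does):

```java
import java.util.Properties;

// Sketch: keep common table-level keys in one shared Properties object
// and let each partition store only the keys that differ.
public class SharedTableProps {
    public static void main(String[] args) {
        Properties tableDefaults = new Properties();
        tableDefaults.setProperty("name", "bi_data.partnerdb_catalogs");
        tableDefaults.setProperty("serialization.lib",
            "org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe");

        // A partition-level view: constructor argument is the defaults chain.
        Properties partition = new Properties(tableDefaults);
        partition.setProperty("totalSize", "1981871005");

        // getProperty falls through to the defaults for inherited keys...
        System.out.println(partition.getProperty("name"));
        // ...while size() counts only the locally stored override.
        System.out.println(partition.size());   // 1
    }
}
```

With ~2,000 Properties objects each repeating the same 20+ entries, this kind of sharing would cut the duplicated key/value storage to one copy per table.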
937Kb (< 0.1%): j.u.Properties: 1,000 objects

 Random sample 
j.u.Properties<String, String>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187741"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"68788226"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187808"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"98299908"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187665"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"93558648"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187675"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1981871005"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187809"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1194311409"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187808"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"28142108"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187775"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"223325319"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187789"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"53200451"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187847"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"81117519"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187765"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"942260599"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187667"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"230310100"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187869"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"125401724"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187822"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"935602865"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187700"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"180979907"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187666"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"211268469"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187761"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"57333119"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187775"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1022378400"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187810"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"111697975"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187812"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"1057809571"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}
j.u.Properties<>(size: 22, capacity: 47) {(key:"name", val:"bi_data.partnerdb_catalogs"), (key:"column.name.delimiter", val:","), (key:"file.inputformat", val:"org.apache.hadoop.hive.ql.io.RCFileInputFormat"), (key:"last_modified_time", val:"1531187833"), (key:"partition_columns", val:"partner_partition"), (key:"serialization.lib", val:"org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe"), (key:"file.outputformat", val:"org.apache.hadoop.hive.ql.io.RCFileOutputFormat"), (key:"totalSize", val:"72236656"), (key:"bucket_count", val:"-1"), (key:"columns.comments", val:""), ...}

org.apache.hadoop.hive.ql.plan.PartitionDesc.properties
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.pathToPartitionInfo
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
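The chain above shows roughly a thousand near-identical Properties objects (only totalSize and last_modified_time differ across partitions) pinned by the static Utilities.gWorkMap. One generic way to shrink such duplication is to keep the ~20 shared entries in a single table and let each per-partition object carry only its distinct values, via the Properties(defaults) constructor. This is a minimal stdlib-only sketch with illustrative names, not Hive's actual code:

```java
import java.util.Properties;

public class SharedDefaultsDemo {
    // Shared table built once; only two of the ~20 common keys are shown.
    static Properties sharedDefaults() {
        Properties shared = new Properties();
        shared.setProperty("name", "bi_data.partnerdb_catalogs");
        shared.setProperty("serialization.lib",
                "org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe");
        return shared;
    }

    // Per-partition object stores only what differs; getProperty() falls
    // back to the shared defaults table for everything else.
    static Properties forPartition(Properties shared, String totalSize) {
        Properties part = new Properties(shared);
        part.setProperty("totalSize", totalSize);
        return part;
    }

    public static void main(String[] args) {
        Properties part = forPartition(sharedDefaults(), "98299908");
        System.out.println(part.getProperty("serialization.lib")); // resolved via defaults
        System.out.println(part.size()); // 1: size() counts only local entries
    }
}
```

Whether this applies depends on how Hive mutates these tables: Properties defaults are shared by reference, so the common table must stay effectively immutable.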
143Kb (< 0.1%): j.u.Properties: 2 objects

 Random sample 
j.u.Properties<String, String>(size: 1903, capacity: 3071) {(key:"yarn.app.mapreduce.client.job.retry-interval", val:"2000"), (key:"s3.stream-buffer-size", val:"4096"), (key:"dfs.namenode.servicerpc-address.preprod-pa3.9c-b6-54-7d-1c-7c", val:"9c-b6-54-7d-1c-7c.hpc.criteo.preprod:8018"), (key:"dfs.namenode.http-address.yarn-experimental.a4-5d-36-fc-ef-54", val:"a4-5d-36-fc-ef-54.hpc.criteo.preprod:50070"), (key:"hive.metastore.hbase.aggr.stats.cache.entries", val:"10000"), (key:"hive.allow.udf.load.on.demand", val:"false"), (key:"dfs.cluster.administrators", val:" Gu-sre-Lake,gu-sre-lake"), (key:"hadoop.registry.zk.root", val:"/registry"), (key:"hive.server2.map.fair.scheduler.queue", val:"true"), (key:"hive.txn.manager", val:"org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager"), ...}
j.u.Properties<>(size: 1903, capacity: 3071) {(key:"yarn.app.mapreduce.client.job.retry-interval", val:"2000"), (key:"s3.stream-buffer-size", val:"4096"), (key:"dfs.namenode.servicerpc-address.preprod-pa3.9c-b6-54-7d-1c-7c", val:"9c-b6-54-7d-1c-7c.hpc.criteo.preprod:8018"), (key:"dfs.namenode.http-address.yarn-experimental.a4-5d-36-fc-ef-54", val:"a4-5d-36-fc-ef-54.hpc.criteo.preprod:50070"), (key:"hive.metastore.hbase.aggr.stats.cache.entries", val:"10000"), (key:"hive.allow.udf.load.on.demand", val:"false"), (key:"dfs.cluster.administrators", val:" Gu-sre-Lake,gu-sre-lake"), (key:"hadoop.registry.zk.root", val:"/registry"), (key:"hive.server2.map.fair.scheduler.queue", val:"true"), (key:"hive.txn.manager", val:"org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager"), ...}

org.apache.hadoop.conf.Configuration.properties
org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider.conf
org.apache.hadoop.io.retry.RetryInvocationHandler.proxyProvider
com.sun.proxy.$Proxy15.h
org.apache.hadoop.hdfs.DFSClient.namenode
org.apache.hadoop.hdfs.DistributedFileSystem.dfs
{j.u.HashMap}.values
org.apache.hadoop.fs.FileSystem$Cache.map
↖Java Static org.apache.hadoop.fs.FileSystem.CACHE


 2,274  124Kb (< 0.1%)  539Kb (< 0.1%)  572,182Kb (35.1%)  j.u.LinkedHashMap
Reference chains
Full reference chains

171Kb (< 0.1%): j.u.LinkedHashMap: 1,000 objects

 Random sample 
j.u.LinkedHashMap<String, String>(size: 1, capacity: 16) {(key:"partner_partition", val:"492")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"252")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"395")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"467")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"975")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"733")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"617")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"677")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"898")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"470")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"577")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"980")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"787")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"708")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"168")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"678")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"613")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"757")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"323")}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:"partner_partition", val:"10")}

org.apache.hadoop.hive.ql.plan.PartitionDesc.partSpec
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.pathToPartitionInfo
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
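Every partSpec map above holds a single entry inside a default 16-bucket table; over ~1,000 PartitionDesc instances the empty buckets are pure overhead. A stdlib-only sketch of right-sizing such maps (illustrative names, not Hive code):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class PartSpecSizing {
    // As dumped: default construction allocates a 16-bucket table for one entry.
    static Map<String, String> defaultSized(String partition) {
        Map<String, String> m = new LinkedHashMap<>();
        m.put("partner_partition", partition);
        return m;
    }

    // Right-sized: initial capacity 2 is enough for one entry at the
    // default 0.75 load factor, so no empty-bucket waste and no resize.
    static Map<String, String> rightSized(String partition) {
        Map<String, String> m = new LinkedHashMap<>(2);
        m.put("partner_partition", partition);
        return m;
    }

    // Immutable alternative when the map never grows at all.
    static Map<String, String> immutable(String partition) {
        return Collections.singletonMap("partner_partition", partition);
    }

    public static void main(String[] args) {
        System.out.println(rightSized("492").equals(defaultSized("492"))); // true
        System.out.println(immutable("492").get("partner_partition"));    // 492
    }
}
```

All three variants are interchangeable for lookups; only the last one rejects later mutation.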
171Kb (< 0.1%): j.u.LinkedHashMap: 1,000 objects

 Random sample 
j.u.LinkedHashMap<org.apache.hadoop.hive.ql.exec.TableScanOperator, org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=813}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=794}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=895}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=958}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=230}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=859}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=356}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=713}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=477}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=361}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=636}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=485}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=986}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=674}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=69}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=364}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=594}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=887}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=649}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=964}", ...))}

{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
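The 999 columnTypes lists above are content-equal (they even reference the same TypeInfo instances, e.g. ListTypeInfo@c06c5fe0), so a canonicalizing cache could in principle collapse the list wrappers themselves to one shared read-only copy. A hypothetical stdlib-only interner, not part of Hive:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ListInterner<T> {
    // Content-equal lists map to one canonical, unmodifiable instance.
    private final Map<List<T>, List<T>> canonical = new HashMap<>();

    public List<T> intern(List<T> list) {
        // Defensive copy, frozen so the cache key can never mutate.
        List<T> frozen = Collections.unmodifiableList(new ArrayList<>(list));
        List<T> prev = canonical.putIfAbsent(frozen, frozen);
        return prev != null ? prev : frozen;
    }

    public static void main(String[] args) {
        ListInterner<String> interner = new ListInterner<>();
        List<String> a = interner.intern(Arrays.asList("string", "array<string>"));
        List<String> b = interner.intern(Arrays.asList("string", "array<string>"));
        System.out.println(a == b); // true: one copy serves both callers
    }
}
```

This only helps when callers treat the lists as read-only, which the identical samples suggest but the dump alone cannot prove.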
47Kb (< 0.1%): j.u.LinkedHashMap: 1 object

 Random sample 
j.u.LinkedHashMap<org.apache.hadoop.fs.Path, org.apache.hadoop.hive.ql.plan.PartitionDesc>(size: 1000, capacity: 2048) {(key:@c0b969c8, val:(baseFileName : "partner_partition=295", ...)), (key:@c0c04f90, val:("partner_partition=198", ...)), (key:@c0c5fc58, val:("partner_partition=384", ...)), (key:@c0b07c88, val:("partner_partition=570", ...)), (key:@c0b75da8, val:("partner_partition=287", ...)), (key:@c0b7c6d8, val:("partner_partition=473", ...)), (key:@c0b6b0d0, val:("partner_partition=465", ...)), (key:@c0b151f8, val:("partner_partition=554", ...)), (key:@c0b77d70, val:("partner_partition=279", ...)), (key:@c0b798f0, val:("partner_partition=562", ...)), ...}

org.apache.hadoop.hive.ql.plan.MapWork.pathToPartitionInfo
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


 2,584  60Kb (< 0.1%)  292Kb (< 0.1%)  569,239Kb (34.9%)  j.u.ArrayList
Reference chains
Full reference chains

171Kb (< 0.1%): j.u.ArrayList: 999 objects

 Random sample 
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}
j.u.ArrayList(size: 23, capacity: 33) {org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo(typeName : "string"), org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo@c06c5fe0, ...}

org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.columnTypes
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serdeParams
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
78Kb (< 0.1%): j.u.ArrayList: 1,000 objects

 Random sample 
j.u.ArrayList<String>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}
j.u.ArrayList<>(size: 1, capacity: 10) {"$hdt$_0:partnerdb_catalogs"}

{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.pathToAliases
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
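Each pathToAliases value above is a growable ArrayList (capacity 10) holding the single alias "$hdt$_0:partnerdb_catalogs". For a list that never changes after construction, Collections.singletonList stores one reference with no spare slots. An illustrative stdlib-only sketch, not Hive's code:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class AliasList {
    // As dumped: a growable list whose backing array has 10 slots
    // after the first add, 9 of them forever empty.
    static List<String> growable(String alias) {
        List<String> l = new ArrayList<>();
        l.add(alias);
        return l;
    }

    // One object, one reference, no spare capacity (immutable).
    static List<String> compact(String alias) {
        return Collections.singletonList(alias);
    }

    public static void main(String[] args) {
        System.out.println(growable("$hdt$_0:partnerdb_catalogs")
                .equals(compact("$hdt$_0:partnerdb_catalogs"))); // true
    }
}
```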
3Kb (< 0.1%): j.u.ArrayList: 1 object

 Random sample 
j.u.ArrayList<String>(size: 557, capacity: 823) {"1", "2", "3", "5", "8", "9", "10", "11", "12", "13", "14", "15", "16", "18", "19", "20", "21", "22", "23", "25", ...}

 ↖Java Local(j.u.ArrayList)






4. Where Memory Goes, by GC Root.  Leak candidate(s) found

1,047,624Kb (64.3%) Object tree for GC root(s) Java Static java.beans.ThreadGroupContext.contexts
Root object java.beans.ThreadGroupContext$1@8055e958 java.beans.ThreadGroupContext$1(queue : j.l.r.ReferenceQueue@8055e978, table : java.beans.WeakIdentityMap$Entry[](size: 8), threshold : 6, size : 1)
                              1. byte[] ↘1,047,552Kb (64.3%), self 1,047,552Kb (64.3%), 1 object(s)
                            1. ... 15 more referenced objects retaining 704b (< 0.1%)
                        1. ... 5 more referenced objects retaining 416b (< 0.1%)
                        1. ... 23 referenced objects retaining 6Kb (< 0.1%)
                  1. String ↘48b (< 0.1%), self 48b (< 0.1%), 1 object(s)
                1. ... 2 referenced objects retaining 64Kb (< 0.1%)
          1. j.l.r.ReferenceQueue$Lock ↘16b (< 0.1%), self 16b (< 0.1%), 1 object(s)
571,788Kb (35.1%) Object tree for GC root(s) Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap LEAK CANDIDATE(S) FOUND 
Root object org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory@c078c300 org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory(threadLocalWorkMap : null, gWorkMap : j.u.HashMap(size: 1), dummy : org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory$DummyMap@c078c918)
                                                  1. j.u.HashSet ↘182,843Kb (11.2%), self 182,843Kb (11.2%), 2,925,492 object(s)
                                                  1. j.u.HashSet ↘60,947Kb (3.7%), self 60,947Kb (3.7%), 975,164 object(s)
                                                  1. j.l.Long ↘2Kb (< 0.1%), self 2Kb (< 0.1%), 98 object(s)
                                                  1. String ↘2,111Kb (0.1%), self 2,111Kb (0.1%), 13,710 object(s)
                                                  1. String ↘2,111Kb (0.1%), self 2,111Kb (0.1%), 13,710 object(s)
                                                      1. byte[] ↘47,987Kb (2.9%), self 47,987Kb (2.9%), 487,567 object(s)
                                                  1. org.apache.hadoop.io.IntWritable ↘7,618Kb (0.5%), self 7,618Kb (0.5%), 487,582 object(s)
                                                      1. org.apache.hadoop.hive.serde2.objectinspector.ListObjectsEqualComparer$FieldComparer ↘160b (< 0.1%), self 160b (< 0.1%), 2 object(s)
                                    1. ... 31 more referenced objects retaining 24Kb (< 0.1%)
                            1. ... 14 more referenced objects retaining 14Kb (< 0.1%)
                                                1. ... 5494 referenced objects retaining 2,950Kb (0.2%)
                                                        1. ... 45908 referenced objects retaining 1,380Kb (< 0.1%)
                                                1. ... 1498 more referenced objects retaining 70Kb (< 0.1%)
                                            1. ... 3996 more referenced objects retaining 1,061Kb (< 0.1%)
                                                              1. byte[] ↘4,942Kb (0.3%), self 4,942Kb (0.3%), 7 object(s)
                                                          1. ... 23 referenced objects retaining 2Kb (< 0.1%)
                                                  1. int[] ↘48b (< 0.1%), self 48b (< 0.1%), 1 object(s)
                                                  1. org.apache.hadoop.io.IntWritable ↘15Kb (< 0.1%), self 15Kb (< 0.1%), 999 object(s)
                                        1. ... 1998 more referenced objects retaining 101Kb (< 0.1%)
                                    1. ... 1002 more referenced objects retaining 15Kb (< 0.1%)
                                1. ... 1001 more referenced objects retaining 187Kb (< 0.1%)
                                                          1. byte[] ↘3,776Kb (0.2%), self 3,776Kb (0.2%), 4 object(s)
                                                    1. ... 26 more referenced objects retaining 1,044Kb (< 0.1%)
                                            1. ... 2 more referenced objects retaining 96b (< 0.1%)
                                        1. ... 5 more referenced objects retaining 4Kb (< 0.1%)
                                    1. ... 2 more referenced objects retaining 104b (< 0.1%)
                            1. ... 16 more referenced objects retaining 26Kb (< 0.1%)
                    1. ... 13 more referenced objects retaining 16Kb (< 0.1%)
                  1. String ↘96b (< 0.1%), self 96b (< 0.1%), 1 object(s)
                        1. ... 12138 referenced objects retaining 927Kb (< 0.1%)
                    1. ... 2000 more referenced objects retaining 319Kb (< 0.1%)
                1. ... 977 more referenced objects retaining 745Kb (< 0.1%)
            1. ... 11 more referenced objects retaining 148Kb (< 0.1%)
                1. ... 6 referenced objects retaining 1Kb (< 0.1%)
      1. org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory$DummyMap ↘16b (< 0.1%), self 16b (< 0.1%), 1 object(s)
1,953Kb (0.1%) Object tree for GC root(s) Java Static org.apache.hive.com.esotericsoftware.reflectasm.AccessClassLoader.selfContextParentClassLoader
Root object sun.misc.Launcher$AppClassLoader@8001d198 sun.misc.Launcher$AppClassLoader(parent : sun.misc.Launcher$ExtClassLoader@8001d1f8, parallelLockMap : j.u.concurrent.ConcurrentHashMap(size: 4,821), package2certs : j.u.concurrent.ConcurrentHashMap(size: 283), classes : j.u.Vector(size: 4,373), defaultDomain : java.security.ProtectionDomain@80218448, domains : j.u.Collections$SynchronizedSet@801fcc10, packages : j.u.HashMap(size: 281), nativeLibraries : j.u.Vector(size: 1), assertionLock : Object@80200d18, defaultAssertionStatus : false, packageAssertionStatus : null, classAssertionStatus : null, initialized : true, pdcache : j.u.HashMap(size: 30), ucp : sun.misc.URLClassPath@80177708, acc : java.security.AccessControlContext@8004e2b0, closeables : j.u.WeakHashMap(size: 5), ucp : sun.misc.URLClassPath@80177708)
    1. ... 13 referenced objects retaining 1,953Kb (0.1%)
... and 7245 more GC roots together retaining 6,276Kb (0.4%)
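The 571,788Kb leak candidate above hangs off a single JVM-wide static field, org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap. The sketch below is a hypothetical, simplified illustration of why such a static map leaks (WorkRegistry and its methods are invented names, not Hive APIs): everything the map references stays reachable from a GC root until it is explicitly removed.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical illustration of the leak pattern: a static map is itself a GC
// root, so every value it ever receives (and everything that value references,
// e.g. operator trees and aggregation buffers) stays live indefinitely.
final class WorkRegistry {
    private static final Map<String, Object> WORK = new HashMap<>();

    static void register(String planPath, Object work) {
        WORK.put(planPath, work);
    }

    // Without a cleanup call like this after each task finishes, entries
    // accumulate for the lifetime of the JVM.
    static void clear(String planPath) {
        WORK.remove(planPath);
    }

    static int size() {
        return WORK.size();
    }

    public static void main(String[] args) {
        register("plan-1", new byte[1024]);
        System.out.println(size()); // 1: the byte[] is pinned by the static map
        clear("plan-1");
        System.out.println(size()); // 0: the byte[] is now collectible
    }
}
```

Whether the retention seen in this dump is a genuine leak or by design depends on whether the application (or Hive itself) clears these entries when each unit of work completes; the tree above suggests roughly a thousand MapWork entries were still registered at the time of the OutOfMemoryError.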




5. Live vs Garbage Objects.  No significant overhead.

              Live                   Garbage          Total
  Objects     19,517,030             16,692           19,533,722
  Bytes       1,628,172Kb (100.0%)   749Kb (< 0.1%)   1,628,922Kb (100.0%)

Details:

  #instances garbage   Shallow size garbage   #instances live   Shallow size live     Class name
  40                   968b (< 0.1%)          492,129           1,106,966Kb (68.0%)   byte[]
  96                   4Kb (< 0.1%)           3,901,068         182,862Kb (11.2%)     j.u.HashMap
  0                    0b (0.0%)              2,925,498         68,566Kb (4.2%)       org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg
  6                    96b (< 0.1%)           3,900,702         60,948Kb (3.7%)       j.u.HashSet
  0                    0b (0.0%)              487,583           38,092Kb (2.3%)       org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
  0                    0b (0.0%)              1,950,332         30,473Kb (1.9%)       org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg
  0                    0b (0.0%)              1,950,332         30,473Kb (1.9%)       org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg
  0                    0b (0.0%)              975,166           22,855Kb (1.4%)       org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg
  265                  8Kb (< 0.1%)           493,950           15,435Kb (0.9%)       j.u.HashMap$Node
  0                    0b (0.0%)              487,583           15,236Kb (0.9%)       org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper
  2,568                164Kb (< 0.1%)         493,255           11,807Kb (0.7%)       Object[]
  0                    0b (0.0%)              498,673           11,687Kb (0.7%)       org.apache.hadoop.io.Text
  3,776                297Kb (< 0.1%)         99,161            8,933Kb (0.5%)        char[]
  0                    0b (0.0%)              493,612           7,712Kb (0.5%)        org.apache.hadoop.io.IntWritable
  239                  13Kb (< 0.1%)          2,439             4,354Kb (0.3%)        j.u.HashMap$Node[]
  3,775                88Kb (< 0.1%)          98,964            2,319Kb (0.1%)        String
  0                    0b (0.0%)              56,301            1,759Kb (0.1%)        j.u.Hashtable$Entry




6. Fixed per-object overhead.  Overhead 14.1%  ( 228,910Kb )

  Fixed per-object overhead   Total overhead
  12b                         228,910Kb (14.1%)

Details:

  #instances   (Average) object size   Total overhead per class   Class name
  3,901,164    48b                     45,716Kb (2.8%)            j.u.HashMap
  3,900,708    16b                     45,711Kb (2.8%)            j.u.HashSet
  2,925,498    24b                     34,283Kb (2.1%)            org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg
  1,950,332    16b                     22,855Kb (1.4%)            org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg
  1,950,332    16b                     22,855Kb (1.4%)            org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg
  975,166      24b                     11,427Kb (0.7%)            org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg
  498,673      24b                     5,843Kb (0.4%)             org.apache.hadoop.io.Text
  495,823      24b                     5,810Kb (0.4%)             Object[]
  494,215      32b                     5,791Kb (0.4%)             j.u.HashMap$Node
  493,612      16b                     5,784Kb (0.4%)             org.apache.hadoop.io.IntWritable
  492,169      2Kb                     5,767Kb (0.4%)             byte[]
  487,583      32b                     5,713Kb (0.4%)             org.apache.hadoop.hive.ql.exec.KeyWrapperFactory$ListKeyWrapper
  487,583      80b                     5,713Kb (0.4%)             org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]




7. Memory Retained by Objects Awaiting Finalization.  No significant overhead.





8. Duplicate Strings.  Overhead 0.5%  ( 7,397Kb )

  Total strings   Unique strings   Duplicate values   Overhead
  102,739         31,356           5,942              7,397Kb (0.5%)

Top duplicate strings

  Overhead           # char[]s   # objects   Value
  1,387Kb (< 0.1%)   8,884       8,884       "http://static.criteo.net/images/produits/14842/stars/5.png"
  825Kb (< 0.1%)     5,281       5,281       "http://static.criteo.net/images/produits/14842/stars/4.png"
  726Kb (< 0.1%)     4,649       4,649       "http://static.criteo.net/images/produits/14842/stars/0.png"
  612Kb (< 0.1%)     3,921       3,921       "http://static.criteo.net/images/produits/14842/stars/3.png"
  396Kb (< 0.1%)     2,824       2,824       "https://www.viator.com/images/stars/orange/16_5.gif"
  124Kb (< 0.1%)     842         842         "https://www.viator.com/images/stars/orange/16_4_5.gif"
  95Kb (< 0.1%)      2,033       2,033       "root"
  93Kb (< 0.1%)      666         666         "https://www.viator.com/images/stars/orange/16_4.gif"
  85Kb (< 0.1%)      1,001       1,001       "transient_lastDdlTime"
  85Kb (< 0.1%)      1,000       1,000       "COLUMN_STATS_ACCURATE"
  78Kb (< 0.1%)      1,003       1,003       "serialization.format"
  78Kb (< 0.1%)      1,002       1,002       "reallyrecommendable"
  78Kb (< 0.1%)      1,002       1,002       "additionalimagelinks"
  78Kb (< 0.1%)      1,001       1,001       "last_modified_time"
  70Kb (< 0.1%)      1,002       1,002       "recommendable"
  70Kb (< 0.1%)      1,001       1,001       "last_modified_by"
  63Kb (< 0.1%)      1,014       1,014       "description"
  62Kb (< 0.1%)      1,003       1,003       "partnerid"
  62Kb (< 0.1%)      1,002       1,002       "producturl"
  62Kb (< 0.1%)      1,002       1,002       "smallimage"



Reference Chains for Duplicate Strings

Expensive data fields

2,111Kb (0.1%), 13,710 / 100% dup strings (8 unique), 13,710 dup backing arrays:

  Num strings   String value
  4,380         "http://static.criteo.net/images/produits/14842/stars/5.png"
  2,641         "http://static.criteo.net/images/produits/14842/stars/4.png"
  2,375         "http://static.criteo.net/images/produits/14842/stars/0.png"
  1,971         "http://static.criteo.net/images/produits/14842/stars/3.png"
  1,412         "https://www.viator.com/images/stars/orange/16_5.gif"
  421           "https://www.viator.com/images/stars/orange/16_4_5.gif"
  333           "https://www.viator.com/images/stars/orange/16_4.gif"
  177           "http://static.criteo.net/images/produits/14842/stars/2.png"

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg.o
2,111Kb (0.1%), 13,710 / 100% dup strings (8 unique), 13,710 dup backing arrays:

  Num strings   String value
  4,504         "http://static.criteo.net/images/produits/14842/stars/5.png"
  2,640         "http://static.criteo.net/images/produits/14842/stars/4.png"
  2,274         "http://static.criteo.net/images/produits/14842/stars/0.png"
  1,950         "http://static.criteo.net/images/produits/14842/stars/3.png"
  1,412         "https://www.viator.com/images/stars/orange/16_5.gif"
  421           "https://www.viator.com/images/stars/orange/16_4_5.gif"
  333           "https://www.viator.com/images/stars/orange/16_4.gif"
  176           "http://static.criteo.net/images/produits/14842/stars/2.png"

 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg.o
1,365Kb (< 0.1%), 23,002 / 100% dup strings (25 unique), 23,002 dup backing arrays:

  Num strings   String value
  1,000         "price"
  1,000         "retailprice"
  1,000         "name"
  1,000         "additionalimagelinks"
  1,000         "componentid"
  1,000         "instock"
  1,000         "extra"
  1,000         "discount"
  1,000         "category"
  1,000         "hashcode"
... and 5,000 more strings, of which 15 are unique

String[] j.u.Arrays$ArrayList.a


Full reference chains

2,111Kb (0.1%), 13,710 / 100% dup strings (8 unique), 13,710 dup backing arrays:

  Num strings   String value
  4,380         "http://static.criteo.net/images/produits/14842/stars/5.png"
  2,641         "http://static.criteo.net/images/produits/14842/stars/4.png"
  2,375         "http://static.criteo.net/images/produits/14842/stars/0.png"
  1,971         "http://static.criteo.net/images/produits/14842/stars/3.png"
  1,412         "https://www.viator.com/images/stars/orange/16_5.gif"
  421           "https://www.viator.com/images/stars/orange/16_4_5.gif"
  333           "https://www.viator.com/images/stars/orange/16_4.gif"
  177           "http://static.criteo.net/images/produits/14842/stars/2.png"

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg.o
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
2,111Kb (0.1%), 13,710 / 100% dup strings (8 unique), 13,710 dup backing arrays:

  Num strings   String value
  4,504         "http://static.criteo.net/images/produits/14842/stars/5.png"
  2,640         "http://static.criteo.net/images/produits/14842/stars/4.png"
  2,274         "http://static.criteo.net/images/produits/14842/stars/0.png"
  1,950         "http://static.criteo.net/images/produits/14842/stars/3.png"
  1,412         "https://www.viator.com/images/stars/orange/16_5.gif"
  421           "https://www.viator.com/images/stars/orange/16_4_5.gif"
  333           "https://www.viator.com/images/stars/orange/16_4.gif"
  176           "http://static.criteo.net/images/produits/14842/stars/2.png"

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg.o
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
1,364Kb (< 0.1%), 22,977 / 100% dup strings (23 unique), 22,977 dup backing arrays:

  Num strings   String value
  999           "price"
  999           "name"
  999           "retailprice"
  999           "hashcode"
  999           "additionalimagelinks"
  999           "sqlid"
  999           "instock"
  999           "createddate"
  999           "discount"
  999           "extra"
... and 2,997 more strings, of which 13 are unique

String[] j.u.Arrays$ArrayList.a
org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.columnNames
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serdeParams
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
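All of the chains above end at the same GC root, and the duplicate values are a handful of URL and column-name strings repeated thousands of times. A common remedy (not something this report prescribes) is to canonicalize strings before storing them in long-lived structures. The sketch below is illustrative; the StringInterner class and its method are invented names, not Hive or report APIs:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical canonicalizer: returns one shared String instance per distinct
// value, so a value repeated thousands of times is stored in memory only once.
final class StringInterner {
    private static final Map<String, String> POOL = new ConcurrentHashMap<>();

    static String intern(String s) {
        if (s == null) return null;
        String prev = POOL.putIfAbsent(s, s); // null if s was newly added
        return prev != null ? prev : s;
    }

    public static void main(String[] args) {
        String a = new String("http://static.criteo.net/images/produits/14842/stars/5.png");
        String b = new String("http://static.criteo.net/images/produits/14842/stars/5.png");
        // Distinct objects with equal values...
        System.out.println(a == b);                 // false
        // ...collapse to a single shared instance after interning.
        System.out.println(intern(a) == intern(b)); // true
    }
}
```

Alternatives worth weighing: the JDK's own String.intern() (backed by the JVM string table), or, on the G1 collector, the -XX:+UseStringDeduplication flag, which deduplicates the backing character arrays automatically but leaves the String wrapper objects themselves distinct.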





9. Bad Collections.  Overhead 15.0%  ( 244,518Kb )

  Total collections   Bad collections   Overhead
  7,809,571           3,905,512         244,518Kb (15.0%)

What can I do to fix Bad Collections?


The overhead (waste) of all 3,905,512 bad collections is 244,518Kb, or 15.0% of used heap.
That's how much memory you could save if there were no empty collections.

All of the collections referenced by the data structures below are empty.
They waste 243,791Kb, or 15.0% of used heap. Does your app really use/need them?
  org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg.uniqueObjects : references 2,925,498 empty j.u.HashSet instances, wasting 182,843Kb (11.2% of used heap)
  org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg.uniqueObjects : references 975,166 empty j.u.HashSet instances, wasting 60,947Kb (3.7% of used heap)

How to fix?

All of the collections above are empty.
It means that these collections are initialized, but likely never used.
Every initialized collection uses some memory even when it's empty.
For example, you may have a class such as:
  class Foo<K, V> {
    Map<K, V> map = new HashMap<>(); // Allocated eagerly, with default capacity 16
    void addToMap(K key, V val) { map.put(key, val); }
    ...
  }

but addToMap() is never called.

So, if the collections above are not used at all, the respective code is dead and should be removed.
Otherwise, such collections should be initialized lazily in method(s) that access them, e.g.
  void addToMap(K key, V val) {
    if (map == null) map = new HashMap<>();
    map.put(key, val);
  }
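Putting the two snippets together, a minimal self-contained version of the lazy-initialization fix might look like this (Foo, addToMap, and isEmpty follow the report's illustrative names; they are not real Hive classes):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative holder class: the map is only allocated on the first write, so
// instances that never receive an entry carry a null field instead of an empty
// HashMap object plus its eventual 16-slot bucket table.
class Foo<K, V> {
    private Map<K, V> map; // starts as null, not new HashMap<>()

    void addToMap(K key, V val) {
        if (map == null) map = new HashMap<>(); // lazy allocation
        map.put(key, val);
    }

    boolean isEmpty() {
        return map == null || map.isEmpty();
    }

    public static void main(String[] args) {
        Foo<String, Integer> f = new Foo<>();
        System.out.println(f.isEmpty()); // true: no HashMap allocated yet
        f.addToMap("key", 1);
        System.out.println(f.isEmpty()); // false: map created on first put
    }
}
```

Every read path must then tolerate a null field (as isEmpty() does above); that is the usual price of this pattern, paid in exchange for not allocating millions of never-used collections.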




For detailed information about bad collections in this dump, check the tables below.



Top bad collections:

  Overhead            Problem   # objects         Type
  243,793Kb (15.0%)   empty     3,900,687 / 99%   j.u.HashSet


Reference Chains for Bad Collections

Expensive data fields

182,843Kb (11.2%): j.u.HashSet: 2,925,498 / 100% of empty 182,843Kb (11.2%)
 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg.uniqueObjects
60,947Kb (3.7%): j.u.HashSet: 975,166 / 100% of empty 60,947Kb (3.7%)
 ↖org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg.uniqueObjects


Full reference chains

182,843Kb (11.2%): j.u.HashSet: 2,925,492 / 100% of empty 182,843Kb (11.2%)
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator$CountAgg.uniqueObjects
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
60,947Kb (3.7%): j.u.HashSet: 975,164 / 100% of empty 60,947Kb (3.7%)
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFSum$GenericUDAFSumLong$SumLongAgg.uniqueObjects
org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[]
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
171Kb (< 0.1%): j.u.LinkedHashMap: 1,000 / 100% of 1-elem 171Kb (< 0.1%)

 Random sample of non-empty containers 
j.u.LinkedHashMap<org.apache.hadoop.hive.ql.exec.TableScanOperator, org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=813}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=794}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=895}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=958}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=230}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=859}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=356}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=713}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=477}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=361}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=636}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=485}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=986}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=674}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=69}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=364}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=594}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=887}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=649}", ...))}
j.u.LinkedHashMap<>(size: 1, capacity: 16) {(key:(operatorId : "TS_0", id : "0", schemaEvolutionColumns : "bigimage,category,componentid,createddate,description,discount,extra,filtered,hashcode,id,instock,lasttouch,name,partnerid,price,producturl, ...[length 232]", ...), val:(alias : "$hdt$_0:partnerdb_catalogs", tableName : "bi_data.partnerdb_catalogs", partName : "{partner_partition=964}", ...))}

{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap





10. Bad Object Arrays.  No significant overhead.

  Total object arrays   Bad object arrays   Overhead
  999,807               7,916               268Kb (< 0.1%)


Reference Chains for Bad Object Arrays

Full reference chains

77Kb (< 0.1%): Object[]: 815 / 12% of empty 36Kb (< 0.1%), 140 / 2% of sparse 18Kb (< 0.1%), 565 / 8% of 1-length 13Kb (< 0.1%), 190 / 2% of 1-elem 9Kb (< 0.1%)

 Random sample of non-empty containers 
Object[50](element class String){"X509", ".SF", ".DSA", ".RSA", ".EC", null, null, null, "1.0", null, null, null, "./", "/", null, null, "META-INF/MANIFEST.MF", null, null, null, ... (37 elements are nulls)}
Object[41](element class String){null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, ... (36 elements are nulls)}
Object[55](element class String){"webhdfs", null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, ... (53 elements are nulls)}
Object[17](element class String){null, null, null, null, null, "log4j.defaultInitOverride", null, "log4j.configuration", "log4j.configuratorClass", null, null, "Using URL [", "] for automatic log4j configuration.", null, null, null, null (12 elements are nulls)}
Object[15](element class String){null, null, null, null, null, null, null, null, null, null, "JobToken", "MapReduceShuffleToken", "MapReduceEncryptedSpillKey", null, null (12 elements are nulls)}
Object[94](element class String){null, null, null, null, null, null, null, null, null, "registerMBean", null, null, null, null, null, null, null, null, null, null, ... (92 elements are nulls)}
Object[7](element class String){null, null, null, null, null, "com.sun.proxy.", "$Proxy"}
Object[23](element class String){"hdfs", null, null, null, null, null, null, null, null, null, "dfs.client.retry.policy.enabled", "dfs.client.retry.policy.spec", "10000,6,60000,10", "dfs.client.failover.proxy.provider.", null, null, "Interface %s is not a NameNode protocol", null, null, null, ... (17 elements are nulls)}
Object[47](element class String){null, null, null, "serialization.format", null, null, null, null, "serialization.lib", null, null, null, null, null, null, null, null, null, null, null, ... (44 elements are nulls)}
Object[40](element class String){null, null, "IPC Client (", ") connection to ", " from ", null, null, null, null, null, "javax.security.sasl.qop", "Negotiated QOP is :", null, null, null, null, null, null, null, null, ... (35 elements are nulls)}
Object[34](element class String){null, null, null, null, null, null, null, null, null, null, "Get token info proto:", " info:", null, null, null, null, null, null, null, null, ... (30 elements are nulls)}
Object[51](element class String){null, null, null, null, null, "mapreduce.task.id", "mapreduce.task.attempt.id", "mapreduce.task.ismap", "mapreduce.task.partition", "mapreduce.job.id", null, "mapreduce.job.process-tree.class", "JVM_PID", " Using ResourceCalculatorProcessTree : ", "mapreduce.task.max.status.length", null, null, null, null, null, ... (38 elements are nulls)}
Object[14](element class String){null, null, null, null, null, null, null, null, null, null, "mapreduce.Counters", "org.apache.hadoop.mapred.Task$Counter", "org.apache.hadoop.mapred.JobInProgress$Counter", "FileSystemCounters" (10 elements are nulls)}
Object[21](element class String){"org.apache.hadoop.mapred.JobConf", "org.apache.hadoop.mapred.JobConfigurable", "configure", null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null (18 elements are nulls)}
Object[16](element class String){null, null, null, null, null, null, null, null, null, null, null, ": adding tracer ", null, "TracerPool(", ")", "Global" (12 elements are nulls)}
Object[31](element class String){"not a method or field: ", "not invocable, no method type", null, null, null, null, null, null, null, null, null, null, null, null, "<init>", null, null, null, null, null, ... (26 elements are nulls)}
Object[7](element class String){null, null, null, "yes", "no", null, null}
Object[7](element class String){null, null, null, "xml", "&", null, null}
Object[33](element class String){".", "hadoop-metrics2-", ".properties", "hadoop-metrics2.properties", "loaded properties from ", "Cannot locate configuration", null, null, null, null, "*.", null, null, null, null, null, null, "([^.*]+)\..+", null, null, ... (25 elements are nulls)}
Object[25](element class String){"hftp", null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, ... (23 elements are nulls)}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1](element class String){"+"}
Object[1]{null}
Object[1]{null}
Object[1](element class String){"java.util.Arrays.useLegacyMergeSort"}
Object[1]{null}
Object[1](element class String){"metric info"}
Object[1](element class String){"java.lang:type=ClassLoading"}
Object[1]{null}
Object[1]{null}
Object[1](element class String){""}
Object[1](element class String){"auth.policy.provider"}
Object[1]{null}
Object[1]{null}
Object[1](element class String){"attachment"}
Object[1](element class String){"Detect premature EOF"}
Object[1]{null}
Object[40](element class String){"har", null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, ... (39 elements are nulls)}
Object[2](element class String){"DFSOpsCountStatistics", null}
Object[13](element class String){null, null, null, "org.apache.xerces.impl.dv.ObjectFactory", null, null, null, null, null, null, null, null, null (12 elements are nulls)}
Object[3](element class java.lang.invoke.SimpleMethodHandle){(customizationCount : 0, ...), null, null}
Object[3](element class String){null, null, "INSTANCE"}
Object[4](element class String){null, null, null, ""}
Object[4](element class String){null, "/", null, null}
Object[2](element class String){"IPC Parameter Sending Thread #%d", null}
Object[30](element class String){null, null, null, "X.509", null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, ... (29 elements are nulls)}
Object[4](element class String){"Weights must be non-negative", null, null, null}
Object[6](element class String){null, "attempt", null, null, null, null}
Object[5](element class String){"job", null, null, null, null}
Object[6](element class String){"jvm", null, null, null, null, null}
Object[13](element class String){null, null, null, null, null, null, "-07-", null, null, null, null, null, null (12 elements are nulls)}
Object[2](element class String){"io.file.buffer.size", null}
Object[12](element class String){null, null, null, null, null, null, "/", null, null, null, null, null (11 elements are nulls)}
Object[17](element class String){null, null, null, null, null, null, null, null, null, null, "modifyThread", null, null, null, null, null, null (16 elements are nulls)}
Object[2](element class String){"ltrim", null}
Object[3](element class String){"ISO-2022-KR", null, null}
Object[12](element class String){null, null, null, null, null, null, null, null, null, null, null, "Null key for a Map not allowed in JSON (use a converting NullKeySerializer?)" (11 elements are nulls)}

↖Unreachable
  All or some of these objects may have started out live, referenced as follows:

2Kb (< 0.1%): Object[]: 40 / 42% of 1-elem 2Kb (< 0.1%), 16 / 17% of sparse 512b (< 0.1%)

 Random sample of non-empty containers 
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.fs.ftp.FTPFileSystem", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "SecurityLogger.org.apache.hadoop.ipc.Server", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.io.nativeio.NativeIO", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.fs.azure.NativeAzureFileSystem", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.serde2.lazy.objectinspector.LazyListObjectInspector", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.conf.valcoersion.JavaIOTmpdirVariableCoercion", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.ql.session.SessionState", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hive.common.util.HiveVersionInfo", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.mapreduce.v2.util.MRApps", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "com.criteo.hadoop.hive.ql.io.PailOrCombineHiveInputFormat", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "com.criteo.hadoop.hive.ql.io.PailOrCombineHiveInputFormat", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "com.criteo.hadoop.hive.ql.io.PailOrCombineHiveInputFormat", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.ql.log.PerfLogger", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.conf.Configuration", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.io.serializer.SerializationFactory", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.mapreduce.lib.partition.KeyFieldBasedPartitioner", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "com.criteo.hadoop.hive.ql.io.PailOrCombineHiveInputFormat", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.ql.exec.mr.ExecMapper", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "SecurityLogger.org.apache.hadoop.ipc.Server", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.serde2.binarysortable.BinarySortableSerDe", additive : true, ...), null, null, null, null, null, null, null, null, null (9 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.yarn.security.ContainerTokenIdentifier", additive : true, ...), ("org.apache.hadoop.yarn.security.NMTokenIdentifier", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.mapreduce.counters.FrameworkCounterGroup", additive : true, ...), ("org.apache.hadoop.mapreduce.counters.FileSystemCounterGroup", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider", additive : true, ...), ("org.apache.hadoop.hdfs.server.namenode.NameNode", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider", additive : true, ...), ("org.apache.hadoop.hdfs.server.namenode.NameNode", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveVarcharObjectInspector", additive : true, ...), ("org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hive.common.HiveCompat", additive : true, ...), ("org.apache.hive.common.util.HiveVersionInfo", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.serde2.io.HiveIntervalYearMonthWritable", additive : true, ...), ("org.apache.hadoop.hive.serde2.io.HiveIntervalDayTimeWritable", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.mapreduce.lib.input.FileInputFormat", additive : true, ...), ("org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.shims.ShimLoader", additive : true, ...), ("org.apache.hadoop.hive.shims.HadoopShimsSecure", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.common.FileUtils", additive : true, ...), ("org.apache.hadoop.hive.common.JavaUtils", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient", additive : true, ...), ("org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.io.compress.zlib.ZlibFactory", additive : true, ...), ("org.apache.hadoop.io.compress.zlib.ZlibCompressor", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hive.common.HiveCompat", additive : true, ...), ("org.apache.hive.common.util.HiveVersionInfo", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe", additive : true, ...), ("org.apache.hadoop.hive.serde2.columnar.ColumnarStruct", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.net.unix.DomainSocketWatcher", additive : true, ...), ("org.apache.hadoop.net.unix.DomainSocket", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}
Object[10](element class org.apache.log4j.Logger){(name : "org.apache.hadoop.hive.serde2.lazybinary.LazyBinarySerDe", additive : true, ...), ("org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct", true, ...), null, null, null, null, null, null, null, null (8 elements are nulls)}

org.apache.log4j.ProvisionNode.elementData {j.u.Hashtable}.values
org.apache.log4j.Hierarchy.ht
org.apache.log4j.spi.DefaultRepositorySelector.repository
↖Java Static org.apache.log4j.LogManager.repositorySelector
648b (< 0.1%): Object[]: 14 / 21% of 1-elem 384b (< 0.1%), 10 / 15% of 1-length 240b (< 0.1%), 1 / 1% of sparse 24b (< 0.1%)

 Random sample of non-empty containers 
Object[2](element class com.google.common.cache.LocalCache$WeakAccessEntry){(hash : 669824604, accessTime : 13272560961758173, ...), null}
Object[2](element class com.google.common.cache.LocalCache$WeakAccessEntry){null, (159327113, 13272560286997461, ...)}
Object[4](element class com.google.common.cache.LocalCache$WeakAccessEntry){null, (477836833, 13272560480298972, ...), null, null}
Object[2](element class com.google.common.cache.LocalCache$WeakAccessEntry){null, (699439663, 13272560406736352, ...)}
Object[2](element class com.google.common.cache.LocalCache$WeakAccessEntry){null, (1863767155, 13272560281587131, ...)}
Object[4](element class com.google.common.cache.LocalCache$WeakAccessEntry){null, null, null, (1791980911, 13272560502631805, ...)}
Object[4](element class com.google.common.cache.LocalCache$WeakAccessEntry){null, null, null, (1899845515, 13272560495582537, ...)}
Object[2](element class com.google.common.cache.LocalCache$WeakAccessEntry){null, (-1965648947, 13272560346357946, ...)}
Object[2](element class com.google.common.cache.LocalCache$WeakAccessEntry){(hash : -1941699450, accessTime : 13272560240359217, ...), null}
Object[4](element class com.google.common.cache.LocalCache$WeakAccessEntry){null, (-1489747911, 13272560491196906, ...), null, null}
Object[2](element class com.google.common.cache.LocalCache$WeakAccessEntry){(hash : -1443131286, accessTime : 13272560274078248, ...), null}
Object[2](element class com.google.common.cache.LocalCache$WeakAccessEntry){null, (-1176900155, 13272560482853859, ...)}
Object[4](element class com.google.common.cache.LocalCache$WeakAccessEntry){null, (-390643163, 13272560493581128, ...), null, null}
Object[4](element class com.google.common.cache.LocalCache$WeakAccessEntry){(hash : -245919480, accessTime : 13272560374887781, ...), null, null, null}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[1]{null}
Object[8](element class com.google.common.cache.LocalCache$WeakAccessEntry){(hash : -1134648168, accessTime : 13272560386829157, ...), null, null, null, null, null, null, (-1135483097, 13272560488628100, ...)}

java.util.concurrent.atomic.AtomicReferenceArray.array com.google.common.cache.LocalCache$Segment.table
com.google.common.cache.LocalCache$Segment[]
com.google.common.cache.LocalCache.segments
com.google.common.cache.LocalCache$LocalManualCache.localCache
↖Java Static org.apache.hive.common.util.ReflectionUtil.CONSTRUCTOR_CACHE
3Kb (< 0.1%): Object[]: 1 / 100% of sparse 3Kb (< 0.1%)

 Random sample of non-empty containers 
Object[1024](element class Object[]){null, Object[](2)@80345cd8, ... (831 elements are nulls)}

sun.nio.cs.StandardCharsets$Aliases.ht sun.nio.cs.StandardCharsets.aliasMap
↖Java Static java.nio.charset.Charset.standardProvider
23Kb (< 0.1%): Object[]: 999 / 99% of 1-length 23Kb (< 0.1%)

 Random sample of non-empty containers 
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 933)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 38)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 153)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 825)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 248)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 370)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 425)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 638)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 901)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 203)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 335)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 517)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 931)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 318)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 124)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 422)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 665)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 775)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 753)}
Object[1](element class org.apache.hadoop.io.IntWritable){(value : 456)}

Object[] org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap


23Kb (< 0.1%): Object[]: 998 / 99% of 1-elem 23Kb (< 0.1%)

 Random sample of non-empty containers 
Object[2](element class Object[]){null, Object[](1)@c0e187f8}
Object[2](element class Object[]){null, Object[](1)@c0bc0e70}
Object[2](element class Object[]){null, Object[](1)@c1199fd8}
Object[2](element class Object[]){null, Object[](1)@c0b31c38}
Object[2](element class Object[]){null, Object[](1)@c0d3cae0}
Object[2](element class Object[]){null, Object[](1)@c10511c8}
Object[2](element class Object[]){null, Object[](1)@c0814c20}
Object[2](element class Object[]){null, Object[](1)@c09954d0}
Object[2](element class Object[]){null, Object[](1)@c0fa1940}
Object[2](element class Object[]){null, Object[](1)@c07b4990}
Object[2](element class Object[]){null, Object[](1)@c10c4860}
Object[2](element class Object[]){null, Object[](1)@c0cb6938}
Object[2](element class Object[]){null, Object[](1)@c0f15ef8}
Object[2](element class Object[]){null, Object[](1)@c10a85a8}
Object[2](element class Object[]){null, Object[](1)@c1186d30}
Object[2](element class Object[]){null, Object[](1)@c07f52c8}
Object[2](element class Object[]){null, Object[](1)@c089eb00}
Object[2](element class Object[]){null, Object[](1)@c0fb3ce0}
Object[2](element class Object[]){null, Object[](1)@c0f723d0}
Object[2](element class Object[]){null, Object[](1)@c0b279b0}

org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart {j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap





11. Bad Primitive Arrays.  Overhead 64.6% (1,052,826Kb)

  Total primitive arrays   Bad primitive arrays   Overhead
  601,294                  7,267                  1,052,826Kb (64.6%)

Top bad primitive arrays:

  Overhead              Problem    # objects    Type
  1,047,628Kb (64.3%)   empty      1,076 / 0%   byte[]
  4,530Kb (0.3%)        trail-0s   186 / 0%     byte[]


Reference Chains for Bad Primitive Arrays

Expensive data fields

1,047,552Kb (64.3%): byte[]: 1 / 100% of empty 1,047,552Kb (64.3%)
 ↖org.apache.hadoop.mapred.MapTask$MapOutputBuffer.kvbuffer
3,513Kb (0.2%): byte[]: 9 / 81% of trail-0s 3,513Kb (0.2%)

 Random sample of non-empty containers 
byte[74736]{'\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N', ... (33184 trailing 0s)}
byte[88284]{'102000\N\N161000\N\N\N\N285000\N\N124000\N\N145000\N\N124000128000\N\N\N145000\N\N128000132000132000\N\N161000132000\N\N', ... (35224 trailing 0s)}
byte[86628]{'97000\N\N154000\N\N\N\N271000\N\N85000\N\N95000\N\N8500085000\N\N\N85000\N\N850008400084000\N\N10400084000\N\N\N\N61000\', ... (33498 trailing 0s)}
byte[186840]{'155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598', ... (88850 trailing 0s)}
byte[1343412]{'https://id-live-01.slatic.net/p/2/chevron-anchor-boat-pola-phone-case-untuk-lenovoa6000-hitam-intl-1488460338-01793051-e', ... (615522 trailing 0s)}
byte[2753314]{'@Ksamsung_ids@V0@V@Kseller_id@V13234@V@Ksimple_sku@VOE427ELAA8YCPQANID-20966951@V@Ksimple_sku_2@VOE427ELAA8YCPQANID-2096', ... (1170803 trailing 0s)}
byte[115190]{'237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704', ... (35190 trailing 0s)}
byte[1114812]{'\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N', ... (538688 trailing 0s)}
byte[2344694]{'http://i1.static-shopcade.com/270x270x100/000000000000000000000000xxxkhu/aHR0cHM6Ly9zYy1wcm9kdWN0LWltYWdlcy5zMy5hbWF6b25', ... (1047053 trailing 0s)}

 ↖org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data


Full reference chains

1,047,552Kb (64.3%): byte[]: 1 / 100% of empty 1,047,552Kb (64.3%)
org.apache.hadoop.mapred.MapTask$MapOutputBuffer.kvbuffer org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.this$0
j.l.Thread[]
j.l.ThreadGroup.threads
java.beans.WeakIdentityMap$Entry.referent
java.beans.WeakIdentityMap$Entry[]
java.beans.ThreadGroupContext$1.table
↖Java Static java.beans.ThreadGroupContext.contexts
1,930Kb (0.1%): byte[]: 6 / 85% of trail-0s 1,930Kb (0.1%)

 Random sample of non-empty containers 
byte[74736]{'\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N', ... (33184 trailing 0s)}
byte[88284]{'102000\N\N161000\N\N\N\N285000\N\N124000\N\N145000\N\N124000128000\N\N\N145000\N\N128000132000132000\N\N161000132000\N\N', ... (35224 trailing 0s)}
byte[86628]{'97000\N\N154000\N\N\N\N271000\N\N85000\N\N95000\N\N8500085000\N\N\N85000\N\N850008400084000\N\N10400084000\N\N\N\N61000\', ... (33498 trailing 0s)}
byte[186840]{'155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598155981559815598', ... (88850 trailing 0s)}
byte[1343412]{'https://id-live-01.slatic.net/p/2/chevron-anchor-boat-pola-phone-case-untuk-lenovoa6000-hitam-intl-1488460338-01793051-e', ... (615522 trailing 0s)}
byte[2753314]{'@Ksamsung_ids@V0@V@Kseller_id@V13234@V@Ksimple_sku@VOE427ELAA8YCPQANID-20966951@V@Ksimple_sku_2@VOE427ELAA8YCPQANID-2096', ... (1170803 trailing 0s)}

org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.cachedByteArrayRef
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
Object[]
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.rowWithPart
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
1,582Kb (< 0.1%): byte[]: 3 / 75% of trail-0s 1,582Kb (< 0.1%)

 Random sample of non-empty containers 
byte[115190]{'237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704237042370423704', ... (35190 trailing 0s)}
byte[1114812]{'\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N', ... (538688 trailing 0s)}
byte[2344694]{'http://i1.static-shopcade.com/270x270x100/000000000000000000000000xxxkhu/aHR0cHM6Ly9zYy1wcm9kdWN0LWltYWdlcy5zMy5hbWF6b25', ... (1047053 trailing 0s)}

org.apache.hadoop.hive.serde2.lazy.ByteArrayRef.data org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo.cachedByteArrayRef
org.apache.hadoop.hive.serde2.columnar.ColumnarStructBase$FieldInfo[]
org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.fieldInfoList
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.cachedLazyStruct
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx[]
org.apache.hadoop.hive.ql.exec.MapOperator.currentCtxs
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
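The dominant entry above is a single all-zero byte[] of 1,047,552Kb (exactly 1023Mb) held by org.apache.hadoop.mapred.MapTask$MapOutputBuffer.kvbuffer: the map task's sort buffer, which Hadoop allocates up front at its configured maximum size regardless of how much map output actually arrives. The report classifies it as empty, i.e. all zeroes, so at dump time it held no useful data while occupying nearly two thirds of the heap. If the job's maps spill little data, shrinking the buffer reclaims most of this memory. A hedged configuration sketch (mapreduce.task.io.sort.mb is the Hadoop 2.x property name, io.sort.mb in 1.x; the value below is a hypothetical starting point to validate against the job's spilled-records counters, not a recommendation):

```xml
<!-- Sketch only: cap the map-side sort buffer. This heap apparently ran with
     the buffer near 1Gb; 256Mb is a hypothetical starting value - raise it
     only if the spilled-records counter shows the smaller buffer overflowing. -->
<property>
  <name>mapreduce.task.io.sort.mb</name>
  <value>256</value>
</property>
```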





12. Boxed Numbers.  no significant overhead

  Total boxed objects   Overhead
  1,306                 23Kb (< 0.1%)


Reference Chains for Boxed Numbers

Full reference chains

4Kb (< 0.1%): j.l.Byte: 253 objects

 Random sample 
j.l.Byte(82)
(-68)
(-117)
(-112)
(29)
(50)
(88)
(32)
(7)
(-52)
(35)
(-50)
(-49)
(28)
(59)
(-26)
(-8)
(62)
(-22)
(-42)

j.l.Byte[] ↖Java Static java.lang.Byte$ByteCache.cache
4Kb (< 0.1%): j.l.Short: 255 objects

 Random sample 
j.l.Short(-63)
(-108)
(-125)
(-126)
(30)
(51)
(-57)
(33)
(3)
(101)
(36)
(103)
(104)
(85)
(60)
(127)
(-10)
(63)
(-111)
(111)

j.l.Short[] ↖Java Static java.lang.Short$ShortCache.cache
3Kb (< 0.1%): j.l.Long: 178 objects

 Random sample 
j.l.Long(92)
(-18)
(-49)
(-68)
(-47)
(-14)
(99)
(-24)
(-100)
(-42)
(-41)
(-40)
(-39)
(-58)
(-5)
(85)
(-112)
(-2)
(-21)
(-32)

j.l.Long[] ↖Java Static java.lang.Long$LongCache.cache
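All three chains end in the JDK's own boxed-value caches (Byte$ByteCache, Short$ShortCache, Long$LongCache), which are populated eagerly with the boxed values -128..127 (all 256 values for Byte). This is a fixed few-Kb cost paid once per JVM, not application allocation, which is consistent with the "no significant overhead" verdict. A small sketch of the caching behavior these static arrays implement:

```java
public class BoxedCacheDemo {
    public static void main(String[] args) {
        // Byte.valueOf is backed by Byte$ByteCache: every byte value is
        // pre-boxed, so two calls with the same value return the same object.
        System.out.println(Byte.valueOf((byte) 100) == Byte.valueOf((byte) 100)); // true

        // Long$LongCache covers only -128..127; the JLS guarantees identity
        // inside that range but not outside it.
        System.out.println(Long.valueOf(100L) == Long.valueOf(100L));   // true
        // Outside the cache, value equality still holds even though identity may not.
        System.out.println(Long.valueOf(100_000L).equals(100_000L));    // true
    }
}
```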





13. Duplicate Objects.  Overhead 3.7% (60,519Kb)

  Total objects of types with duplicates   Unique objects   Duplicate values   Overhead
  3,900,664                                27,422           2                  60,519Kb (3.7%)

Types of duplicate objects:

  Overhead          # objects   Unique objects   Class name
  30,259Kb (1.9%)   1,950,332   13,711           org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg
  30,259Kb (1.9%)   1,950,332   13,711           org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg


Top duplicate objects

  Overhead          # objects   Value
  30,259Kb (1.9%)   1,936,622   org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg(o : null)
  30,259Kb (1.9%)   1,936,622   org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg(o : null)



Reference Chains for Duplicate Objects

Expensive data fields

30,259Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 1,936,618 / 24% dup objects (1 unique)

  Num objects  Object value 
 1,936,618    (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
30,259Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg: 1,936,618 / 24% dup objects (1 unique)

  Num objects  Object value 
 1,936,618    (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations


Full reference chains

30,259Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 1,936,618 / 24% dup objects (1 unique)

  Num objects  Object value 
 1,936,618    (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
30,259Kb (1.9%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMin$GenericUDAFMinEvaluator$MinAgg: 1,936,618 / 24% dup objects (1 unique)

  Num objects  Object value 
 1,936,618    (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] {j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GroupByOperator.hashAggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
48b (< 0.1%): org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMax$GenericUDAFMaxEvaluator$MaxAgg: 4 / 25% dup objects (1 unique)

  Num objects  Object value 
 4    (o : null)

org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator$AggregationBuffer[] org.apache.hadoop.hive.ql.exec.GroupByOperator.aggregations
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.SelectOperator.childOperators
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.childOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
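Both chains terminate in GroupByOperator.hashAggregations: Hive's map-side hash aggregation keeps one AggregationBuffer per grouping key per aggregate function, so roughly 1.9 million in-memory groups yield 1.9 million identical MaxAgg(o : null) and 1.9 million MinAgg(o : null) buffers. The duplication itself is inherent to the operator; the practical lever is to bound how many groups the map side accumulates before flushing. A hedged sketch using standard Hive session properties (the values are illustrative starting points, not tuned recommendations for this job):

```sql
-- Sketch only: flush the map-side hash aggregation earlier so fewer
-- AggregationBuffer instances accumulate (fractions of heap; illustrative values).
set hive.map.aggr.hash.percentmemory=0.3;
set hive.map.aggr.hash.force.flush.memory.threshold=0.7;

-- If the grouping keys are nearly unique, map-side aggregation buys little;
-- disabling it removes the hash table entirely.
set hive.map.aggr=false;
```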





14. Duplicate Primitive Arrays.  no significant overhead

  Total primitive arrays   Unique arrays   Duplicate values   Overhead
  498,565                  489,682         264                542Kb (< 0.1%)

Types of duplicate objects:

  Overhead         # objects   Unique objects   Class name
  271Kb (< 0.1%)   1,086       40               boolean[]
  130Kb (< 0.1%)   492,169     488,472          byte[]
  113Kb (< 0.1%)   5,062       1,005            int[]
  26Kb (< 0.1%)    102,937     140              char[]
  592b (< 0.1%)    11          2                short[]
  32b (< 0.1%)     4           2                float[]
  32b (< 0.1%)     17          15               long[]
  32b (< 0.1%)     8           6                double[]


Top duplicate arrays

  Overhead         # objects   Value
  265Kb (< 0.1%)   1,001       boolean[256]{(all elements are 0s)}
  47Kb (< 0.1%)    1,006       byte[32]{(all elements are 0s)}
  46Kb (< 0.1%)    1,000       int[7]{13, 14, 18, 12, 0, 19, 6 (all elements with high bits 0)}
  33Kb (< 0.1%)    2,158       int[0]{}
  23Kb (< 0.1%)    1,001       byte[2]{'\N'}
  23Kb (< 0.1%)    1,001       byte[8]{1, 2, 3, 4, 5, 6, 7, 8}
  16Kb (< 0.1%)    3           byte[8192]{(all elements are 0s)}
  16Kb (< 0.1%)    2           char[8192]{(all elements are 0s)}
  7Kb (< 0.1%)     309         byte[1]{'?'}
  5Kb (< 0.1%)     152         int[6]{67, 82, 73, 84, 69, 79 (all elements with high bits 0)}
  4Kb (< 0.1%)     10          char[256]{(unprintable characters), ...}
  4Kb (< 0.1%)     136         int[4]{80, 82, 79, 68 (all elements with high bits 0)}
  2Kb (< 0.1%)     27          int[17]{1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 (all elements with high bits 0)}
  2Kb (< 0.1%)     72          int[3]{72, 80, 67 (all elements with high bits 0)}
  1Kb (< 0.1%)     76          int[2]{65, 68 (all elements with high bits 0)}
  1Kb (< 0.1%)     7           boolean[256]{false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, ... (210 trailing 0s)}
  1Kb (< 0.1%)     68          byte[4]{-2, -54, 0, 0}
  1Kb (< 0.1%)     4           char[256]{(all elements are 0s)}
  1Kb (< 0.1%)     61          byte[6]{0, 1, 0, 'J', 0, 0}
  1Kb (< 0.1%)     26          char[19]{(all elements are 0s)}



Reference Chains for Duplicate Primitive Arrays

Full reference chains

265Kb (< 0.1%): boolean[]: 999 / 100% dup arrays (1 unique)

  Num objects  Object value 
 999    boolean[256]{(all elements are 0s)}

org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.needsEscape org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serdeParams
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
46Kb (< 0.1%): byte[]: 999 / 100% dup arrays (1 unique)

  Num objects  Object value 
 999    byte[32]{(all elements are 0s)}

org.apache.hadoop.hive.serde2.ByteStream$Output.buf org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.serializeStream
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
46Kb (< 0.1%): int[]: 998 / 100% dup arrays (1 unique)

  Num objects   Object value
  998           int[7]{13, 14, 18, 12, 0, 19, 6 (all elements with high bits 0)}

org.apache.hadoop.hive.serde2.columnar.ColumnarStruct.prjColIDs
org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe.cachedLazyStruct
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.deserializer
{j.u.LinkedHashMap}.values
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.MapOperator.opCtxMap
{j.u.ArrayList}
org.apache.hadoop.hive.ql.exec.TableScanOperator.parentOperators
{j.u.LinkedHashMap}.values
org.apache.hadoop.hive.ql.plan.MapWork.aliasToWork
{j.u.HashMap}.values
org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.gWorkMap
↖Java Static org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap
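All three chains above end at the same static root, org.apache.hadoop.hive.ql.exec.Utilities.gWorkMap, and each holds 999 identical arrays, apparently one per cached ColumnarSerDe. A generic mitigation for duplicates like these is to intern identical arrays into one canonical instance. The sketch below assumes the arrays are treated as read-only after initialization; `BooleanArrayInterner` is a hypothetical helper, not Hive code:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: collapse N identical boolean[] instances (e.g. the
// 999 duplicate needsEscape arrays) into one shared canonical array.
// Safe only if callers never mutate the interned arrays.
class BooleanArrayInterner {
    private static final Map<Key, boolean[]> POOL = new HashMap<>();

    static synchronized boolean[] intern(boolean[] a) {
        // First array with a given content becomes the canonical instance;
        // later content-equal arrays are dropped in favor of it.
        return POOL.computeIfAbsent(new Key(a), k -> a);
    }

    // Wrapper giving boolean[] content-based equality for use as a map key.
    private static final class Key {
        final boolean[] a;
        Key(boolean[] a) { this.a = a; }
        @Override public int hashCode() { return Arrays.hashCode(a); }
        @Override public boolean equals(Object o) {
            return o instanceof Key && Arrays.equals(a, ((Key) o).a);
        }
    }
}
```

With this in place, a second content-equal array becomes garbage immediately after the `intern` call, so the 265Kb of duplicate boolean[256]s above would shrink to one 256-byte array.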





15. Duplicate Object Arrays.  no significant overhead



16. Duplicate Lists.  no significant overhead



17. WeakHashMaps with hard references from values to keys.  no significant overhead



18. Off-heap memory used by java.nio.DirectByteBuffers.  no significant overhead

  # objects   Total capacity   Total size
  18          529Kb (< 0.1%)   529Kb (< 0.1%)


Reference Chains for java.nio.DirectByteBuffers

Full reference chains

393Kb (< 0.1%): java.nio.DirectByteBuffer: 7 objects

 Random sample 
(mark : -1, position : 12839, limit : 65536, capacity : 65536, address : 140051011641680, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : sun.misc.Cleaner@c053cf10)
(mark : -1, position : 17866, limit : 17866, capacity : 65536, address : 140051017466848, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @c053cee8)
(mark : -1, position : 764, limit : 65536, capacity : 65536, address : 140051008614592, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @c053cec0)
(mark : -1, position : 64146, limit : 64146, capacity : 65536, address : 140051016144560, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @c053ce98)
(mark : -1, position : 64, limit : 64, capacity : 8192, address : 140050564898576, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @e21b39b8)
(mark : -1, position : 0, limit : 131231, capacity : 131231, address : 140051009967360, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @e3703988)
(mark : -1, position : 874, limit : 874, capacity : 874, address : 140050982019408, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : @e3726010)

sun.misc.Cleaner.referent
sun.misc.Cleaner.{prev}
java.nio.DirectByteBuffer.cleaner
java.nio.DirectByteBufferR.att
↖Java Static org.apache.hadoop.hdfs.DFSInputStream.EMPTY_BUFFER
128Kb (< 0.1%): java.nio.DirectByteBuffer: 1 object

 Random sample 
(mark : -1, position : 3512, limit : 131072, capacity : 131072, address : 140051009967519, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : java.nio.DirectByteBuffer@e3703928, cleaner : null)

org.apache.hadoop.hdfs.RemoteBlockReader2.curDataSlice
org.apache.hadoop.hdfs.DFSInputStream.blockReader
org.apache.hadoop.hdfs.client.HdfsDataInputStream.in
org.apache.hadoop.hive.ql.io.RCFile$Reader.in
org.apache.hadoop.hive.ql.io.RCFileRecordReader.in
org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.recordReader
org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.curReader
org.apache.hadoop.mapred.MapTask$TrackedRecordReader.rawIn
↖Java Local(org.apache.hadoop.mapred.MapTask$TrackedRecordReader)

8Kb (< 0.1%): java.nio.DirectByteBuffer: 1 object

 Random sample 
(mark : -1, position : 16, limit : 16, capacity : 8192, address : 140050990345872, hb : null, offset : 0, isReadOnly : false, bigEndian : true, nativeByteOrder : false, fd : null, att : null, cleaner : sun.misc.Cleaner@8006d9a8)

sun.misc.Cleaner.referent
sun.misc.Cleaner.next
java.nio.DirectByteBuffer.cleaner
java.nio.DirectByteBufferR.att
↖Java Static org.apache.hadoop.hdfs.DFSInputStream.EMPTY_BUFFER
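The off-heap totals in this section are computed from DirectByteBuffer capacities found in the dump. On a live JVM the same figure can be cross-checked with the JDK's standard "direct" buffer pool MXBean; this is a generic JDK technique, not part of this report's tooling:

```java
import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;

// Print count, total capacity, and memory used by direct (off-heap)
// ByteBuffers of the running JVM, mirroring the table in this section.
class DirectMemoryProbe {
    public static void main(String[] args) {
        for (BufferPoolMXBean pool :
                ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
            if ("direct".equals(pool.getName())) {
                System.out.printf("direct buffers: count=%d, capacity=%dKb, used=%dKb%n",
                        pool.getCount(),
                        pool.getTotalCapacity() / 1024,
                        pool.getMemoryUsed() / 1024);
            }
        }
    }
}
```

Note that buffers whose Cleaner has not yet run still count here, so a live reading can exceed what the application believes it is using.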





19. Heap Size Configuration.  no significant overhead
We estimate that the maximum heap size (-Xmx) is configured properly for your current working set of 1,628,922Kb.



20. Very Long (Over 1000 Elements) Reference Chains.  not found



21. Thread stacks (number of threads: 11)
Thread name: "main", daemon: false
java.lang.OutOfMemoryError.<init>(OutOfMemoryError.java:48)
java.util.regex.Matcher.<init>(Matcher.java:225)
java.util.regex.Pattern.matcher(Pattern.java:1093)
com.criteo.hadoop.hive.udf.UDFExtraDataToMap.toMap(UDFExtraDataToMap.java:42)
com.criteo.hadoop.hive.udf.UDFExtraDataToMap.evaluate(UDFExtraDataToMap.java:82)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject.get(ExprNodeGenericFuncEvaluator.java:88)
org.apache.hadoop.hive.ql.udf.generic.GenericUDFIndex.evaluate(GenericUDFIndex.java:100)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator$DeferredExprObject.get(ExprNodeGenericFuncEvaluator.java:88)
org.apache.hadoop.hive.ql.udf.generic.GenericUDFWhen.evaluate(GenericUDFWhen.java:104)
org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator._evaluate(ExprNodeGenericFuncEvaluator.java:187)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluatorHead._evaluate(ExprNodeEvaluatorHead.java:44)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:80)
org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:68)
org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:148)
org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:547)
org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:160)
org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:459)
org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
java.security.AccessController.doPrivileged(Native method)
javax.security.auth.Subject.doAs(Subject.java:422)
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
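The top frames show the OutOfMemoryError surfacing while Pattern.matcher() allocated a new Matcher inside UDFExtraDataToMap.toMap. That allocation is tiny, so it is only the tipping point of an already-full heap, but allocating a fresh Matcher per row does add avoidable pressure. A common mitigation is to reuse one Matcher per thread via reset(); the sketch below is hypothetical (countPairs and its pattern are illustrative, not the actual UDF code):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: compile the Pattern once and reuse a single Matcher
// per thread instead of calling Pattern.matcher() (which allocates a new
// Matcher, the frame where this dump's OOM was thrown) on every row.
class ReusableMatcherExample {
    private static final Pattern KV = Pattern.compile("(\\w+)=(\\w+)");
    private static final ThreadLocal<Matcher> MATCHER =
            ThreadLocal.withInitial(() -> KV.matcher(""));

    // Count key=value pairs in one input row, reusing the thread's Matcher.
    static int countPairs(String input) {
        Matcher m = MATCHER.get().reset(input);
        int n = 0;
        while (m.find()) n++;
        return n;
    }
}
```

Matcher is not thread-safe, hence the ThreadLocal; inside a single-threaded map task a plain instance field would do.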


Thread name: UNKNOWN, daemon: false
sun.misc.Unsafe.park(Native method)
java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.run(MapTask.java:1531)


Thread name: "Thread for syncLogs", daemon: true
org.apache.log4j.Hierarchy.getCurrentLoggers(Hierarchy.java:314)
org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:304)
org.apache.hadoop.mapred.TaskLog$3.run(TaskLog.java:350)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)


Thread name: "communication thread", daemon: true
java.lang.ref.SoftReference.get(SoftReference.java:113)
java.lang.StringCoding.deref(StringCoding.java:66)
java.lang.StringCoding.encode(StringCoding.java:330)
java.lang.String.getBytes(String.java:918)
java.io.UnixFileSystem.getBooleanAttributes0(Native method)
java.io.UnixFileSystem.getBooleanAttributes(UnixFileSystem.java:242)
java.io.File.isDirectory(File.java:849)
org.apache.hadoop.yarn.util.ProcfsBasedProcessTree.getProcessList(ProcfsBasedProcessTree.java:510)
org.apache.hadoop.yarn.util.ProcfsBasedProcessTree.updateProcessTree(ProcfsBasedProcessTree.java:209)
org.apache.hadoop.mapred.Task.updateResourceCounters(Task.java:898)
org.apache.hadoop.mapred.Task.updateCounters(Task.java:1067)
org.apache.hadoop.mapred.Task.access$500(Task.java:82)
org.apache.hadoop.mapred.Task$TaskReporter.run(Task.java:786)
java.lang.Thread.run(Thread.java:748)


Thread name: UNKNOWN, daemon: false
java.lang.Object.wait(Native method)
java.util.TimerThread.mainLoop(Timer.java:552)
java.util.TimerThread.run(Timer.java:505)


Thread name: UNKNOWN, daemon: false
java.lang.Object.wait(Native method)
java.lang.Object.wait(Object.java:502)
java.lang.ref.Reference.tryHandlePending(Reference.java:191)
java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153)


Thread name: UNKNOWN, daemon: false
java.lang.Object.wait(Native method)
java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164)
java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:209)


Thread name: "IPC Parameter Sending Thread #1", daemon: true
sun.misc.Unsafe.park(Native method)
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)


Thread name: UNKNOWN, daemon: false
java.lang.Thread.sleep(Native method)
org.apache.hadoop.hdfs.PeerCache.run(PeerCache.java:255)
org.apache.hadoop.hdfs.PeerCache.access$000(PeerCache.java:46)
org.apache.hadoop.hdfs.PeerCache$1.run(PeerCache.java:124)
java.lang.Thread.run(Thread.java:748)


Thread name: "org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner", daemon: true
java.lang.Object.wait(Native method)
java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164)
org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner.run(FileSystem.java:3212)
java.lang.Thread.run(Thread.java:748)


Thread name: "Thread-7", daemon: true
org.apache.hadoop.net.unix.DomainSocketWatcher.doPoll0(Native method)
org.apache.hadoop.net.unix.DomainSocketWatcher.access$900(DomainSocketWatcher.java:52)
org.apache.hadoop.net.unix.DomainSocketWatcher$2.run(DomainSocketWatcher.java:509)
java.lang.Thread.run(Thread.java:748)





22. System Properties (result of java.lang.System.getProperties())

  Key                        Value
  awt.toolkit                sun.awt.X11.XToolkit
  file.encoding              UTF-8
  file.encoding.pkg          sun.io
  file.separator             /
  hadoop.metrics.log.level   WARN
  hadoop.root.logfile        syslog
  hadoop.root.logger         WARN,CLA
  java.awt.graphicsenv       sun.awt.X11GraphicsEnvironment
  java.awt.printerjob        sun.print.PSPrinterJob
 java.class.path/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034:/etc/hadoop/conf:/usr/lib/hadoop/parquet-format-javadoc.jar:/usr/lib/hadoop/parquet-format-sources.jar:/usr/lib/hadoop/parquet-format.jar:/usr/lib/hadoop/parquet-avro.jar:/usr/lib/hadoop/parquet-cascading.jar:/usr/lib/hadoop/parquet-column.jar:/usr/lib/hadoop/parquet-common.jar:/usr/lib/hadoop/parquet-encoding.jar:/usr/lib/hadoop/parquet-generator.jar:/usr/lib/hadoop/parquet-hadoop-bundle.jar:/usr/lib/hadoop/parquet-hadoop.jar:/usr/lib/hadoop/parquet-jackson.jar:/usr/lib/hadoop/parquet-pig-bundle.jar:/usr/lib/hadoop/parquet-pig.jar:/usr/lib/hadoop/parquet-protobuf.jar:/usr/lib/hadoop/parquet-test-hadoop2.jar:/usr/lib/hadoop/parquet-thrift.jar:/usr/lib/hadoop/parquet-tools.jar:/usr/lib/hadoop/hadoop-annotations-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-aws.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-azure-datalake.jar:/usr/lib/hadoop/hadoop-common-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop/hadoop-common-tests.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-nfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/hadoop-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-aws-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-auth-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/logredactor-1.0.3.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/avro.jar:/usr/lib/hadoop/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop/lib/a
ws-java-sdk-core-1.10.6.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/aws-java-sdk-dynamodb-1.10.6.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/aws-java-sdk-kms-1.10.6.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/aws-java-sdk-s3-1.10.6.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/aws-java-sdk-sts-1.10.6.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/azure-data-lake-store-sdk-2.1.4.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.5.jar:/usr/lib/hadoop/lib/commons-beanutils-1.9.2.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/slf4j-log4j12.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/zookeeper.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.15-cdh5.11.0.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/hadoop-lzo.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop/lib/httpclient-4.
2.5.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-tests.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hd
fs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/avro.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.9.2.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-el-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop-m
apreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/httpcore-4.2.5.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/jasper-compiler-5.5.23.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archive-logs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-mapreduce/hadoop-archive-logs.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-tests.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/ha
doop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.2.jar:/usr/lib/hadoop-mapreduce/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/okhttp-2.4.0.jar:/usr/lib/hadoop-mapreduce/okio-1.4.0.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.
4.1.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/zookeeper.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/avro.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/lib/javax.inject-1.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-distributedshell.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-applications-unmanaged-am-launcher.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-registry.jar:/usr/lib/hadoop-yarn/hadoop-yarn-api-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-no
demanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-tests.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-web-proxy.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-resourcemanager-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-nodemanager-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-server-applicationhistoryservice-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/hadoop-yarn-client-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-yarn/lib/activation-1.1.jar:/usr/lib/hadoop-yarn/lib/aopalliance-1.0.jar:/usr/lib/hadoop-yarn/lib/asm-3.2.jar:/usr/lib/hadoop-yarn/lib/commons-cli-1.2.jar:/usr/lib/hadoop-yarn/lib/commons-codec-1.4.jar:/usr/lib/hadoop-yarn/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop-yarn/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-yarn/lib/commons-io-2.4.jar:/usr/lib/hadoop-yarn/lib/commons-lang-2.6.jar:/usr/lib/hadoop-yarn/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-yarn/lib/guava-11.0.2.jar:/usr/lib/hadoop-yarn/lib/guice-3.0.jar:/usr/lib/hadoop-yarn/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-yarn/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-yarn/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop-yarn/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-yarn/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop-yarn/lib/javax.inject-1.jar:/usr/lib/hadoop-yarn/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop-yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-yarn/lib/jersey-client-1.9.jar:/usr/lib/hadoop-yarn/lib/jersey-core-1.9.jar:/usr/lib/hadoop-yarn/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-yarn/lib/jersey-json-1.9.jar:/usr/lib/hadoop-yarn/lib/jersey-server-1.9.jar:/usr/lib/hadoop-yarn/lib/jettison-1.1.jar:/usr/lib/hadoop-yarn/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/h
adoop-yarn/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-yarn/lib/jline-2.11.jar:/usr/lib/hadoop-yarn/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-yarn/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/lib/hadoop-yarn/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-yarn/lib/servlet-api-2.5.jar:/usr/lib/hadoop-yarn/lib/stax-api-1.0-2.jar:/usr/lib/hadoop-yarn/lib/xz-1.0.jar:/usr/lib/hadoop-yarn/lib/zookeeper.jar:/usr/lib/hadoop-mapreduce/share/hadoop/mapreduce/*:/usr/lib/hadoop-mapreduce/share/hadoop/mapreduce/lib/*:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/criteo-datadisco-hive-17394-uber.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/criteo-hadoop-hiveutils-hive2-17394-uber.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/job.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/glup-schemas-17394-uber.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034:/etc/hadoop/conf:/usr/lib/hadoop/parquet-format-javadoc.jar:/usr/lib/hadoop/parquet-format-sources.jar:/usr/lib/hadoop/parquet-format.jar:/usr/lib/hadoop/parquet-avro.jar:/usr/lib/hadoop/parquet-cascading.jar:/usr/lib/hadoop/parquet-column.jar:/usr/lib/hadoop/parquet-common.jar:/usr/lib/hadoop/parquet-encoding.jar:/usr/lib/hadoop/parquet-generator.jar:/usr/lib/hadoop/parquet-hadoop-bundle.jar:/usr/lib/hadoop/parquet-hadoop.jar:/usr/lib/hadoop/parquet-jackson.jar:/usr/lib/hadoop/parquet-pig-b
undle.jar:/usr/lib/hadoop/parquet-pig.jar:/usr/lib/hadoop/parquet-protobuf.jar:/usr/lib/hadoop/parquet-test-hadoop2.jar:/usr/lib/hadoop/parquet-thrift.jar:/usr/lib/hadoop/parquet-tools.jar:/usr/lib/hadoop/hadoop-annotations-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-annotations.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-aws.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-azure-datalake.jar:/usr/lib/hadoop/hadoop-common-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop/hadoop-common-tests.jar:/usr/lib/hadoop/hadoop-common.jar:/usr/lib/hadoop/hadoop-nfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-nfs.jar:/usr/lib/hadoop/hadoop-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-aws-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/hadoop-auth-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop/lib/activation-1.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop/lib/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop/lib/logredactor-1.0.3.jar:/usr/lib/hadoop/lib/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/api-util-1.0.0-M20.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/avro.jar:/usr/lib/hadoop/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop/lib/aws-java-sdk-core-1.10.6.jar:/usr/lib/hadoop/lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop/lib/aws-java-sdk-dynamodb-1.10.6.jar:/usr/lib/hadoop/lib/paranamer-2.3.jar:/usr/lib/hadoop/lib/aws-java-sdk-kms-1.10.6.jar:/usr/lib/hadoop/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop/lib/aws-java-sdk-s3-1.10.6.jar:/usr/lib/hadoop/lib/servlet-api-2.5.jar:/usr/lib/hadoop/lib/aws-java-sdk-sts-1.10.6.jar:/usr/lib/hadoop/lib/jersey-core-1.9.jar:/usr/lib/hadoop/lib/azure-data-lake-store-sdk-2.1.4.jar:/usr/lib/hadoop/lib/slf4j-api-1.7.5.jar:/usr/lib/hadoop/lib/commons-beanutils-1.9.2.jar:/usr/lib/hadoop/lib/jersey-json-1.9.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/li
b/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/slf4j-log4j12.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/jersey-server-1.9.jar:/usr/lib/hadoop/lib/commons-collections-3.2.2.jar:/usr/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop/lib/jets3t-0.9.0.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/stax-api-1.0-2.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/lib/hadoop/lib/commons-lang-2.6.jar:/usr/lib/hadoop/lib/xz-1.0.jar:/usr/lib/hadoop/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop/lib/zookeeper.jar:/usr/lib/hadoop/lib/commons-math3-3.1.1.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/curator-client-2.7.1.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.15-cdh5.11.0.jar:/usr/lib/hadoop/lib/curator-framework-2.7.1.jar:/usr/lib/hadoop/lib/hadoop-lzo.jar:/usr/lib/hadoop/lib/curator-recipes-2.7.1.jar:/usr/lib/hadoop/lib/gson-2.2.4.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop/lib/jettison-1.1.jar:/usr/lib/hadoop/lib/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop/lib/httpclient-4.2.5.jar:/usr/lib/hadoop/lib/httpcore-4.2.5.jar:/usr/lib/hadoop/lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop/lib/jackson-xc-1.8.8.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.23.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop/lib/java-xmlbuilder-0.4.jar:/usr/lib/hadoop/lib/jaxb-api-2.2.2.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/jsp-api-2.1.jar:/usr/lib/hadoop/lib/jsr305-3.0.0.jar:/usr/lib/hadoop/lib/junit-4.11.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-nfs.jar:/usr/lib/h
adoop-hdfs/hadoop-hdfs-tests.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-hdfs/lib/asm-3.2.jar:/usr/lib/hadoop-hdfs/lib/commons-cli-1.2.jar:/usr/lib/hadoop-hdfs/lib/commons-codec-1.4.jar:/usr/lib/hadoop-hdfs/lib/commons-daemon-1.0.13.jar:/usr/lib/hadoop-hdfs/lib/commons-el-1.0.jar:/usr/lib/hadoop-hdfs/lib/commons-io-2.4.jar:/usr/lib/hadoop-hdfs/lib/commons-lang-2.6.jar:/usr/lib/hadoop-hdfs/lib/commons-logging-1.1.3.jar:/usr/lib/hadoop-hdfs/lib/guava-11.0.2.jar:/usr/lib/hadoop-hdfs/lib/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop-hdfs/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-hdfs/lib/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-hdfs/lib/jersey-core-1.9.jar:/usr/lib/hadoop-hdfs/lib/jersey-server-1.9.jar:/usr/lib/hadoop-hdfs/lib/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-hdfs/lib/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-hdfs/lib/jsp-api-2.1.jar:/usr/lib/hadoop-hdfs/lib/jsr305-3.0.0.jar:/usr/lib/hadoop-hdfs/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-hdfs/lib/log4j-1.2.17.jar:/usr/lib/hadoop-hdfs/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop-hdfs/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-hdfs/lib/servlet-api-2.5.jar:/usr/lib/hadoop-hdfs/lib/xercesImpl-2.9.1.jar:/usr/lib/hadoop-hdfs/lib/xml-apis-1.3.04.jar:/usr/lib/hadoop-hdfs/lib/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/jets3t-0.9.0.jar:/usr/lib/hadoop-mapreduce/jettison-1.1.jar:/usr/lib/hadoop-mapreduce/jackson-core-2.2.3.jar:/usr/lib/hadoop-mapreduce/activation-1.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar:/usr/lib/hadoop-mapreduce/apacheds-i18n-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp.jar:/usr/lib/hadoop-mapreduce/jsch-0.1.42.jar:/usr/lib/hadoop-mapreduce/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/api-asn1-api-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen-2.6.0-cdh
5.11.0.jar:/usr/lib/hadoop-mapreduce/api-util-1.0.0-M20.jar:/usr/lib/hadoop-mapreduce/jsp-api-2.1.jar:/usr/lib/hadoop-mapreduce/asm-3.2.jar:/usr/lib/hadoop-mapreduce/jsr305-3.0.0.jar:/usr/lib/hadoop-mapreduce/avro.jar:/usr/lib/hadoop-mapreduce/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-1.9.2.jar:/usr/lib/hadoop-mapreduce/hadoop-extras.jar:/usr/lib/hadoop-mapreduce/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop-mapreduce/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-cli-1.2.jar:/usr/lib/hadoop-mapreduce/hadoop-rumen.jar:/usr/lib/hadoop-mapreduce/commons-codec-1.4.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-collections-3.2.2.jar:/usr/lib/hadoop-mapreduce/hadoop-sls-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-core.jar:/usr/lib/hadoop-mapreduce/commons-configuration-1.6.jar:/usr/lib/hadoop-mapreduce/hadoop-sls.jar:/usr/lib/hadoop-mapreduce/commons-digester-1.8.jar:/usr/lib/hadoop-mapreduce/jackson-databind-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-el-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/commons-httpclient-3.1.jar:/usr/lib/hadoop-mapreduce/jackson-jaxrs-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-lang-2.6.jar:/usr/lib/hadoop-mapreduce/hadoop-streaming.jar:/usr/lib/hadoop-mapreduce/commons-logging-1.1.3.jar:/usr/lib/hadoop-mapreduce/jackson-annotations-2.2.3.jar:/usr/lib/hadoop-mapreduce/commons-math3-3.1.1.jar:/usr/lib/hadoop-mapreduce/jackson-xc-1.8.8.jar:/usr/lib/hadoop-mapreduce/commons-net-3.1.jar:/usr/lib/hadoop-mapreduce/htrace-core4-4.0.1-incubating.jar:/usr/lib/hadoop-mapreduce/curator-client-2.7.1.jar:/usr/lib/hadoop-mapreduce/httpclient-4.2.5.jar:/usr/lib/hadoop-mapreduce/curator-framework-2.7.1.jar:/usr/lib/hadoop-mapreduce/httpco
re-4.2.5.jar:/usr/lib/hadoop-mapreduce/curator-recipes-2.7.1.jar:/usr/lib/hadoop-mapreduce/jasper-compiler-5.5.23.jar:/usr/lib/hadoop-mapreduce/gson-2.2.4.jar:/usr/lib/hadoop-mapreduce/jasper-runtime-5.5.23.jar:/usr/lib/hadoop-mapreduce/guava-11.0.2.jar:/usr/lib/hadoop-mapreduce/hadoop-gridmix.jar:/usr/lib/hadoop-mapreduce/xz-1.0.jar:/usr/lib/hadoop-mapreduce/hadoop-ant-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/java-xmlbuilder-0.4.jar:/usr/lib/hadoop-mapreduce/hadoop-ant.jar:/usr/lib/hadoop-mapreduce/hadoop-extras-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-archive-logs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jetty-util-6.1.26.cloudera.4.jar:/usr/lib/hadoop-mapreduce/hadoop-archive-logs.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins.jar:/usr/lib/hadoop-mapreduce/hadoop-archives-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jaxb-api-2.2.2.jar:/usr/lib/hadoop-mapreduce/hadoop-archives.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs.jar:/usr/lib/hadoop-mapreduce/hadoop-auth-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jaxb-impl-2.2.3-1.jar:/usr/lib/hadoop-mapreduce/hadoop-auth.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-tests.jar:/usr/lib/hadoop-mapreduce/hadoop-azure-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-azure.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-json-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-datajoin.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask.jar:/usr/lib/hadoop-mapreduce/hadoop-distcp-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-app.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce
-client-core-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/hadoop-openstack.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-hs-plugins-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.11.0-tests.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-nativetask-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-shuffle-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples-2.6.0-cdh5.11.0.jar:/usr/lib/hadoop-mapreduce/jetty-6.1.26.cloudera.4.jar:/usr/lib/hadoop-mapreduce/junit-4.11.jar:/usr/lib/hadoop-mapreduce/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/metrics-core-3.0.2.jar:/usr/lib/hadoop-mapreduce/microsoft-windowsazure-storage-sdk-0.6.0.jar:/usr/lib/hadoop-mapreduce/mockito-all-1.8.5.jar:/usr/lib/hadoop-mapreduce/okhttp-2.4.0.jar:/usr/lib/hadoop-mapreduce/okio-1.4.0.jar:/usr/lib/hadoop-mapreduce/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/servlet-api-2.5.jar:/usr/lib/hadoop-mapreduce/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/stax-api-1.0-2.jar:/usr/lib/hadoop-mapreduce/xmlenc-0.52.jar:/usr/lib/hadoop-mapreduce/zookeeper.jar:/usr/lib/hadoop-mapreduce/lib/aopalliance-1.0.jar:/usr/lib/hadoop-mapreduce/lib/asm-3.2.jar:/usr/lib/hadoop-mapreduce/lib/avro.jar:/usr/lib/hadoop-mapreduce/lib/commons-compress-1.4.1.jar:/usr/lib/hadoop-mapreduce/lib/commons-io-2.4.jar:/usr/lib/hadoop-mapreduce/lib/guice-3.0.jar:/usr/lib/hadoop-mapreduce/lib/guice-servlet-3.0.jar:/usr/lib/hadoop-mapreduce/lib/hamcrest-core-1.3.jar:/usr/lib/hadoop-mapreduce/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop-mapreduce/lib/javax.
inject-1.jar:/usr/lib/hadoop-mapreduce/lib/jersey-core-1.9.jar:/usr/lib/hadoop-mapreduce/lib/jersey-guice-1.9.jar:/usr/lib/hadoop-mapreduce/lib/jersey-server-1.9.jar:/usr/lib/hadoop-mapreduce/lib/junit-4.11.jar:/usr/lib/hadoop-mapreduce/lib/leveldbjni-all-1.8.jar:/usr/lib/hadoop-mapreduce/lib/log4j-1.2.17.jar:/usr/lib/hadoop-mapreduce/lib/netty-3.10.5.Final.jar:/usr/lib/hadoop-mapreduce/lib/paranamer-2.3.jar:/usr/lib/hadoop-mapreduce/lib/protobuf-java-2.5.0.jar:/usr/lib/hadoop-mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/lib/hadoop-mapreduce/lib/xz-1.0.jar:/*:/lib/*:/usr/lib/hadoop-mapreduce/share/hadoop/mapreduce/*:/usr/lib/hadoop-mapreduce/share/hadoop/mapreduce/lib/*:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/criteo-datadisco-hive-17394-uber.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/criteo-hadoop-hiveutils-hive2-17394-uber.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/job.jar:/hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/glup-schemas-17394-uber.jar
 java.class.version = 52.0
 java.endorsed.dirs = /usr/lib/jvm/jdk1.8.0_144/jre/lib/endorsed
 java.ext.dirs = /usr/lib/jvm/jdk1.8.0_144/jre/lib/ext:/usr/java/packages/lib/ext
 java.home = /usr/lib/jvm/jdk1.8.0_144/jre
 java.io.tmpdir = /hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034/tmp
 java.library.path = /hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034:/usr/lib/hadoop/lib/native:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
 java.net.preferIPv4Stack = true
 java.runtime.name = Java(TM) SE Runtime Environment
 java.runtime.version = 1.8.0_144-b01
 java.specification.name = Java Platform API Specification
 java.specification.vendor = Oracle Corporation
 java.specification.version = 1.8
 java.vendor = Oracle Corporation
 java.vendor.url = http://java.oracle.com/
 java.vendor.url.bug = http://bugreport.sun.com/bugreport/
 java.version = 1.8.0_144
 java.vm.info = mixed mode
 java.vm.name = Java HotSpot(TM) 64-Bit Server VM
 java.vm.specification.name = Java Virtual Machine Specification
 java.vm.specification.vendor = Oracle Corporation
 java.vm.specification.version = 1.8
 java.vm.vendor = Oracle Corporation
 java.vm.version = 25.144-b01
 line.separator =
 log4j.configuration = container-log4j.properties
 os.arch = amd64
 os.name = Linux
 os.version = 4.4.21-1.el7.elrepo.x86_64
 path.separator = :
 sun.arch.data.model = 64
 sun.boot.class.path = /usr/lib/jvm/jdk1.8.0_144/jre/lib/resources.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/rt.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/sunrsasign.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/jsse.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/jce.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/charsets.jar:/usr/lib/jvm/jdk1.8.0_144/jre/lib/jfr.jar:/usr/lib/jvm/jdk1.8.0_144/jre/classes
 sun.boot.library.path = /usr/lib/jvm/jdk1.8.0_144/jre/lib/amd64
 sun.cpu.endian = little
 sun.cpu.isalist =
 sun.io.unicode.encoding = UnicodeLittle
 sun.java.command = org.apache.hadoop.mapred.YarnChild 10.224.22.18 46104 attempt_1531301354100_57311_m_001303_1 285873023226794
 sun.java.launcher = SUN_STANDARD
 sun.jnu.encoding = UTF-8
 sun.management.compiler = HotSpot 64-Bit Tiered Compilers
 sun.os.patch.level = unknown
 user.country = US
 user.dir = /hdfs/uuid/55cb3c9b-4314-4fa5-b75e-73f503daa156/yarn/data/usercache/sz.ho/appcache/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034
 user.home = /home/sz.ho
 user.language = en
 user.name = sz.ho
 user.timezone = Etc/UTC
 yarn.app.container.log.dir = /hdfs/uuid/f444a50c-5326-4c38-869b-17715b806ac7/yarn/logs/application_1531301354100_57311/container_e260_1531301354100_57311_01_005034
 yarn.app.container.log.filesize = 0
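
A property table like the one above can be reproduced at runtime for cross-checking against a live container, without taking a heap dump. The sketch below is not part of the report tooling; the class name `DumpProps` is a hypothetical name chosen for illustration, and it uses only the standard `java.lang.System` and `java.util` APIs.

```java
import java.util.Map;
import java.util.TreeMap;

public class DumpProps {
    // Render all JVM system properties sorted by key, one "key = value"
    // per line, mirroring the layout of the property table above.
    public static String render() {
        Map<String, String> sorted = new TreeMap<>();
        for (String name : System.getProperties().stringPropertyNames()) {
            sorted.put(name, System.getProperty(name));
        }
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> e : sorted.entrySet()) {
            sb.append(e.getKey()).append(" = ").append(e.getValue()).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(render());
    }
}
```

Running this inside the same YARN container (e.g. via a small diagnostic task) lets you confirm that settings such as `java.io.tmpdir` and `java.library.path` match what the heap dump recorded.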