Summary
Queries against Hive 2 tables in Dremio v24 and earlier require a Java 8 runtime for full functionality. If Dremio runs on Java 11, some queries may fail with "java.lang.NoSuchMethodError: 'sun.misc.Cleaner sun.nio.ch.DirectBuffer.cleaner()'".
Reported Issue
When querying a table in a Hive 2 data source from Dremio, the job fails, reporting an error stack trace whose leading message is:
java.lang.NoSuchMethodError: 'sun.misc.Cleaner sun.nio.ch.DirectBuffer.cleaner()'
There are variants of the error stack trace because the exception can be thrown from different parts of Dremio code. Here is an error stack trace from Dremio 24.2.2 when trying to read a Hive 2 table where the data is in HDFS:
(java.lang.RuntimeException) java.lang.NoSuchMethodError: 'sun.misc.Cleaner sun.nio.ch.DirectBuffer.cleaner()'
com.dremio.common.DeferredException.addThrowable():136
com.dremio.sabot.exec.fragment.FragmentExecutor.transitionToFailed():760
com.dremio.sabot.exec.fragment.FragmentExecutor.run():517
com.dremio.sabot.exec.fragment.FragmentExecutor.access$1700():108
com.dremio.sabot.exec.fragment.FragmentExecutor$AsyncTaskImpl.run():1007
com.dremio.sabot.task.AsyncTaskWrapper.run():122
com.dremio.sabot.task.slicing.SlicingThread.mainExecutionLoop():249
com.dremio.sabot.task.slicing.SlicingThread.run():171
Caused By (java.lang.NoSuchMethodError) 'sun.misc.Cleaner sun.nio.ch.DirectBuffer.cleaner()'
org.apache.hadoop.io.nativeio.NativeIO$POSIX.munmap():330
org.apache.hadoop.hdfs.shortcircuit.ShortCircuitReplica.munmap():223
org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.munmap():562
org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.demoteOldEvictableMmaped():517
org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.trimEvictionMaps():529
org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.unref():471
org.apache.hadoop.hdfs.shortcircuit.ShortCircuitReplica.unref():143
org.apache.hadoop.hdfs.client.impl.BlockReaderLocal.close():627
org.apache.hadoop.hdfs.DFSInputStream.actualGetFromOneDataNode():1285
org.apache.hadoop.hdfs.DFSInputStream.actualGetFromOneDataNode():1204
org.apache.hadoop.hdfs.DFSInputStream.fetchBlockByteRange():1164
org.apache.hadoop.hdfs.DFSInputStream.pread():1541
org.apache.hadoop.hdfs.DFSInputStream.read():1507
org.apache.hadoop.fs.FSInputStream.readFully():121
org.apache.hadoop.fs.FSDataInputStream.readFully():111
com.dremio.exec.store.hive.exec.apache.FSDataInputStreamWrapper.readFully():83
com.dremio.exec.store.hive.exec.apache.FSDataInputStreamWithStatsWrapper.readFully():66
org.apache.orc.impl.ReaderImpl.extractFileTail():555
org.apache.orc.impl.ReaderImpl.<init>():370
org.apache.hadoop.hive.ql.io.orc.ReaderImpl.<init>():60
org.apache.hadoop.hive.ql.io.orc.OrcFile.createReader():90
com.dremio.exec.store.hive.exec.HiveORCVectorizedReader.internalInit():383
com.dremio.exec.store.hive.exec.HiveAbstractReader.lambda$setup$0():224
java.security.AccessController.doPrivileged():-2
javax.security.auth.Subject.doAs():423
org.apache.hadoop.security.UserGroupInformation.doAs():1844
com.dremio.exec.store.hive.exec.HiveAbstractReader.setup():227
com.dremio.exec.store.parquet.ScanTableFunction.setupNextReader():188
com.dremio.exec.store.parquet.ScanTableFunction.startRow():176
com.dremio.sabot.op.tablefunction.TableFunctionOperator.outputData():109
com.dremio.sabot.driver.SmartOp$SmartSingleInput.outputData():212
com.dremio.sabot.driver.StraightPipe.pump():56
com.dremio.sabot.driver.Pipeline.doPump():124
com.dremio.sabot.driver.Pipeline.pumpOnce():114
com.dremio.sabot.exec.fragment.FragmentExecutor$DoAsPumper.run():561
com.dremio.sabot.exec.fragment.FragmentExecutor.run():479
com.dremio.sabot.exec.fragment.FragmentExecutor.access$1700():108
com.dremio.sabot.exec.fragment.FragmentExecutor$AsyncTaskImpl.run():1007
com.dremio.sabot.task.AsyncTaskWrapper.run():122
com.dremio.sabot.task.slicing.SlicingThread.mainExecutionLoop():249
com.dremio.sabot.task.slicing.SlicingThread.run():171
Relevant Versions
Dremio 24.x. Versions 23.x and earlier run exclusively on Java 8; Dremio 25+ runs on Java 11+.
Troubleshooting Steps
If you are using the Dremio UI query editor, the leading line of the error stack trace is reported to you when the query fails; for the full stack trace, open the raw query profile and look under the "Error" section.
Determine which Dremio version you are using. In the UI this is shown under "About Dremio".
Determine which version of Java is running the Dremio executors. If you or your Dremio administrator do not know, you can log in to a VM, physical node, or Kubernetes pod running an executor. The error message in the raw query profile usually identifies the specific node reporting the error, for example:
SYSTEM ERROR: NoSuchMethodError: sun.nio.ch.DirectBuffer.cleaner()Lsun/misc/Cleaner; Fragment 2:0 [Error Id: 27df4c44-9e04-433f-8e5a-fed5642078ac on dremio.prod.exec.example.com:0]
...
From a shell on that node, run ps -ef | grep -i dremio. The process listing shows the path of the JVM binary running the application; executing that binary with -version confirms whether it is Java 8 or Java 11+, as in the example below.
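For example, assuming the process listing reveals the java binary at /usr/lib/jvm/java-1.8.0-openjdk/bin/java (an illustrative path; yours will differ per installation):
ps -ef | grep -i dremio
/usr/lib/jvm/java-1.8.0-openjdk/bin/java -version
A version string starting with "1.8.0" in the -version output indicates Java 8, while one starting with "11" indicates Java 11.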
Cause
Dremio versions 24.x and earlier rely on Hadoop 2 libraries in the implementation of the Hive 2 data source plugin. The Hadoop 2 libraries require Java 8 and support neither compilation on nor running under Java 11.
Meanwhile, Dremio 24.x can run on both Java 8 and Java 11. Java 11 no longer provides the internal APIs the Hadoop 2 libraries rely on: the sun.misc.Cleaner class was removed in Java 9, and sun.nio.ch.DirectBuffer.cleaner() now returns a different type, hence the "NoSuchMethodError".
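A minimal sketch of the failing pattern, outside Dremio (the class name CleanerRepro is hypothetical; the call mirrors org.apache.hadoop.io.nativeio.NativeIO$POSIX.munmap from the stack trace above). Compile it with JDK 8, then run the same class file on Java 11:
import java.nio.ByteBuffer;
import sun.nio.ch.DirectBuffer;

public class CleanerRepro {
    public static void main(String[] args) {
        // Direct buffers are backed by native memory; Hadoop 2 frees that
        // memory eagerly via the buffer's Cleaner when unmapping
        // short-circuit-read segments.
        ByteBuffer buffer = ByteBuffer.allocateDirect(1024);

        // Compiled by javac 8, this call site records the descriptor
        // sun.nio.ch.DirectBuffer.cleaner()Lsun/misc/Cleaner;
        // On Java 9+, cleaner() returns jdk.internal.ref.Cleaner instead,
        // so a Java 11 JVM fails method resolution here and throws
        // java.lang.NoSuchMethodError.
        sun.misc.Cleaner cleaner = ((DirectBuffer) buffer).cleaner();
        cleaner.clean(); // On Java 8, releases the native memory immediately.
    }
}
On Java 8 this compiles (with an internal-API warning) and runs cleanly; on Java 11 it fails with the same java.lang.NoSuchMethodError: 'sun.misc.Cleaner sun.nio.ch.DirectBuffer.cleaner()' seen in the query profile.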
Steps to Resolve
The best solution is to upgrade to Dremio 25+, which uses the more secure Hive 3 libraries to implement the Hive 2 data source plugin.
In some instances of the problem, where the underlying storage for the Hive 2 table data is HDFS, you may be able to work around it by disabling Dremio's local caching for the source. In the Hive 2 data source settings in Dremio, go to "Advanced Options" → "Caching Options" and uncheck the box for "Enable local caching for HDFS".