Caused by: java.lang.ClassNotFoundException: org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch when querying Hive from Flink

Question

1. Software versions:
   * Flink 1.11
   * Hive 1.2.1
   * Hadoop 2.7.1
2. Submitting the program with `flink run` fails with the following exception:
```text
org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
    at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:116)
    at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:78)
    at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:192)
    at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:185)
    at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:179)
    at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:503)
    at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:386)
    at sun.reflect.GeneratedMethodAccessor70.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:284)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:199)
    at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor$$Lambda$98/618592213.apply(Unknown Source)
    at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
    at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
    at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
    at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
    at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
    at akka.actor.ActorCell.invoke(ActorCell.scala:561)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
    at akka.dispatch.Mailbox.run(Mailbox.scala:225)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
    at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: org/apache/orc/storage/ql/exec/vector/VectorizedRowBatch
    at org.apache.flink.orc.nohive.OrcNoHiveSplitReaderUtil.genPartColumnarRowReader(OrcNoHiveSplitReaderUtil.java:67)
    at org.apache.flink.connectors.hive.read.HiveVectorizedOrcSplitReader.<init>(HiveVectorizedOrcSplitReader.java:67)
    at org.apache.flink.connectors.hive.read.HiveTableInputFormat.open(HiveTableInputFormat.java:137)
    at org.apache.flink.connectors.hive.read.HiveTableInputFormat.open(HiveTableInputFormat.java:66)
    at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:85)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
    at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:213)
Caused by: java.lang.ClassNotFoundException: org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at org.apache.flink.util.FlinkUserCodeClassLoader.loadClassWithoutExceptionHandling(FlinkUserCodeClassLoader.java:61)
    at org.apache.flink.util.ChildFirstClassLoader.loadClassWithoutExceptionHandling(ChildFirstClassLoader.java:74)
    at org.apache.flink.util.FlinkUserCodeClassLoader.loadClass(FlinkUserCodeClassLoader.java:48)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 8 more
```
I added the following dependency in Maven:
```xml
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-storage-api</artifactId>
    <version>2.0.0</version>
</dependency>
```

The program still reports the error, and I am not sure what is causing it.

My POM configuration is as follows:
```xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.credit.analyze</groupId>
    <artifactId>fast-analyze</artifactId>
    <version>0.1</version>
    <packaging>jar</packaging>
    <name>Fast Analyze Job</name>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <flink.version>1.11.0</flink.version>
        <java.version>1.8</java.version>
        <scala.binary.version>2.11</scala.binary.version>
        <maven.compiler.source>${java.version}</maven.compiler.source>
        <maven.compiler.target>${java.version}</maven.compiler.target>
        <log4j.version>2.12.1</log4j.version>
        <hive.version>1.2.1</hive.version>
        <hadoop.version>2.7.1</hadoop.version>
        <mysql.version>5.1.16</mysql.version>
        <junit.version>4.12</junit.version>
    </properties>
    <repositories>
        <repository>
            <id>apache.snapshots</id>
            <name>Apache Development Snapshot Repository</name>
            <url>https://repository.apache.org/content/repositories/snapshots/</url>
            <releases>
                <enabled>false</enabled>
            </releases>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
        </repository>
    </repositories>
    <dependencies>
        <!-- Apache Flink dependencies -->
        <!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>${flink.version}</version>
            <!--<scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <!--<scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <!--<scope>provided</scope>-->
        </dependency>
        <!-- Add connector dependencies here. They must be in the default scope (compile). -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <!--<scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <!-- or.. (for the new Blink planner) -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <!--<scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <!--<scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <!--<scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>${hive.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-jdbc_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>${mysql.version}</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>${junit.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.orc</groupId>
            <artifactId>orc-core</artifactId>
            <version>1.6.5</version>
        </dependency>
        <!-- Add logging framework, to produce console output when running in the IDE. -->
        <!-- These dependencies are excluded from the application JAR by default. -->
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-slf4j-impl</artifactId>
            <version>${log4j.version}</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-api</artifactId>
            <version>${log4j.version}</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-core</artifactId>
            <version>${log4j.version}</version>
            <scope>runtime</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <!-- Java Compiler -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>${java.version}</source>
                    <target>${java.version}</target>
                </configuration>
            </plugin>
            <!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
            <!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.1.1</version>
                <executions>
                    <!-- Run shade goal on package phase -->
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <artifactSet>
                                <excludes>
                                    <exclude>org.apache.flink:force-shading</exclude>
                                    <exclude>com.google.code.findbugs:jsr305</exclude>
                                    <exclude>org.slf4j:*</exclude>
                                    <exclude>org.apache.logging.log4j:*</exclude>
                                </excludes>
                            </artifactSet>
                            <filters>
                                <filter>
                                    <!-- Do not copy the signatures in the META-INF folder.
                                         Otherwise, this might cause SecurityExceptions when using the JAR. -->
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>com.credit.analyze.job.batch.market.channel.MarketMonitorHiveJob</mainClass>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
        <pluginManagement>
            <plugins>
                <!-- This improves the out-of-the-box experience in Eclipse by resolving some warnings. -->
                <plugin>
                    <groupId>org.eclipse.m2e</groupId>
                    <artifactId>lifecycle-mapping</artifactId>
                    <version>1.0.0</version>
                    <configuration>
                        <lifecycleMappingMetadata>
                            <pluginExecutions>
                                <pluginExecution>
                                    <pluginExecutionFilter>
                                        <groupId>org.apache.maven.plugins</groupId>
                                        <artifactId>maven-shade-plugin</artifactId>
                                        <versionRange>[3.1.1,)</versionRange>
                                        <goals>
                                            <goal>shade</goal>
                                        </goals>
                                    </pluginExecutionFilter>
                                    <action>
                                        <ignore/>
                                    </action>
                                </pluginExecution>
                                <pluginExecution>
                                    <pluginExecutionFilter>
                                        <groupId>org.apache.maven.plugins</groupId>
                                        <artifactId>maven-compiler-plugin</artifactId>
                                        <versionRange>[3.1,)</versionRange>
                                        <goals>
                                            <goal>testCompile</goal>
                                            <goal>compile</goal>
                                        </goals>
                                    </pluginExecutionFilter>
                                    <action>
                                        <ignore/>
                                    </action>
                                </pluginExecution>
                            </pluginExecutions>
                        </lifecycleMappingMetadata>
                    </configuration>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>
</project>
```
I have added the following jars to Flink's `lib` directory:
```text
flink-sql-connector-hive-1.2.2
hive-metastore-1.2.1.jar
hive-exec-1.2.1.jar
libfb303-0.9.2.jar
orc-core-1.4.3-nohive.jar
aircompressor-0.8.jar
```
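Since this is a classloading problem, it can help to check at runtime whether the missing class is visible at all, and if so which jar it is loaded from. A minimal diagnostic sketch (the class name is taken from the stack trace above; `WhichJar` is a hypothetical helper, not part of any Flink API):

```java
public class WhichJar {
    // Report whether a class is loadable and, if so, which jar it came from.
    static String locationOf(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            // JDK bootstrap classes have no code source.
            return src == null ? "bootstrap/JDK" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "NOT FOUND";
        }
    }

    public static void main(String[] args) {
        // The relocated class from the stack trace; it resolves only when
        // an ORC "nohive" build is on the classpath.
        System.out.println(locationOf("org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch"));
        System.out.println(locationOf("java.lang.String")); // JDK class, for comparison
    }
}
```

Running this from inside the job (or the same classpath) shows immediately whether the class resolves, which is more direct than guessing from POM coordinates.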

Answer 1

Score: 1

I have solved the problem. The following dependency needs to be added to pom.xml (the `nohive` builds of the ORC artifacts relocate the Hive storage-api classes into the `org.apache.orc.storage` package, which is where the missing `VectorizedRowBatch` lives):

```xml
<!-- flink orc nohive -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-orc-nohive_2.12</artifactId>
    <version>${flink.version}</version>
</dependency>
```
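To double-check that a fix like this actually puts the relocated class on the classpath, one can scan a directory of jars (e.g. Flink's `lib` directory, or the directory holding the fat jar) for the class file. A small sketch; the default `lib` path and the class name `FindClassInJars` are assumptions for illustration, and you can pass your own directory as the first argument:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.jar.JarFile;

public class FindClassInJars {
    // Return the names of all jars in `dir` that contain the given class entry.
    static List<String> jarsContaining(File dir, String entry) throws Exception {
        List<String> hits = new ArrayList<>();
        File[] jars = dir.listFiles((d, n) -> n.endsWith(".jar"));
        if (jars == null) return hits;
        for (File f : jars) {
            try (JarFile jar = new JarFile(f)) {
                if (jar.getEntry(entry) != null) hits.add(f.getName());
            }
        }
        return hits;
    }

    public static void main(String[] args) throws Exception {
        File lib = new File(args.length > 0 ? args[0] : "lib");
        // The relocated class from the stack trace, as a jar entry path.
        String entry = "org/apache/orc/storage/ql/exec/vector/VectorizedRowBatch.class";
        System.out.println(jarsContaining(lib, entry));
    }
}
```

If the scan prints an empty list for both the job jar's dependencies and Flink's `lib` directory, no artifact on the classpath ships the relocated class, which matches the `ClassNotFoundException` above.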

Answer 2

Score: 0

Your error clearly indicates the missing class; make sure you have the proper orc-core jar imported as a dependency:

```text
Caused by: java.lang.ClassNotFoundException: org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch
```

You should add the following artifact:

```xml
<dependency>
    <groupId>org.apache.orc</groupId>
    <artifactId>orc-core</artifactId>
    <version>1.6.5</version>
</dependency>
```

  • Posted on 2020-10-13 19:52:26
  • Source: https://go.coder-hub.com/64334718.html