Caused by: java.lang.ClassNotFoundException: org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch when querying Hive from Flink


Question


1. Software versions:
* Flink 1.11
* Hive 1.2.1
* Hadoop 2.7.1

2. Submitting the job with `flink run` produces the following exception:

```text
org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
	at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:116)
	at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:78)
	at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:192)
	at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:185)
	at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:179)
	at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:503)
	at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:386)
	at sun.reflect.GeneratedMethodAccessor70.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:284)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:199)
	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor$$Lambda$98/618592213.apply(Unknown Source)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: org/apache/orc/storage/ql/exec/vector/VectorizedRowBatch
	at org.apache.flink.orc.nohive.OrcNoHiveSplitReaderUtil.genPartColumnarRowReader(OrcNoHiveSplitReaderUtil.java:67)
	at org.apache.flink.connectors.hive.read.HiveVectorizedOrcSplitReader.&lt;init&gt;(HiveVectorizedOrcSplitReader.java:67)
	at org.apache.flink.connectors.hive.read.HiveTableInputFormat.open(HiveTableInputFormat.java:137)
	at org.apache.flink.connectors.hive.read.HiveTableInputFormat.open(HiveTableInputFormat.java:66)
	at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:85)
	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
	at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
	at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:213)
Caused by: java.lang.ClassNotFoundException: org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at org.apache.flink.util.FlinkUserCodeClassLoader.loadClassWithoutExceptionHandling(FlinkUserCodeClassLoader.java:61)
	at org.apache.flink.util.ChildFirstClassLoader.loadClassWithoutExceptionHandling(ChildFirstClassLoader.java:74)
	at org.apache.flink.util.FlinkUserCodeClassLoader.loadClass(FlinkUserCodeClassLoader.java:48)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 8 more
```

3. I added the following dependency in Maven:

```xml
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-storage-api</artifactId>
  <version>2.0.0</version>
</dependency>
```

The program still reports the error, and I am not clear on what causes it.

4. My POM configuration is as follows:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>

	<groupId>com.credit.analyze</groupId>
	<artifactId>fast-analyze</artifactId>
	<version>0.1</version>
	<packaging>jar</packaging>

	<name>Fast Analyze Job</name>

	<properties>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
		<flink.version>1.11.0</flink.version>
		<java.version>1.8</java.version>
		<scala.binary.version>2.11</scala.binary.version>
		<maven.compiler.source>${java.version}</maven.compiler.source>
		<maven.compiler.target>${java.version}</maven.compiler.target>
		<log4j.version>2.12.1</log4j.version>
		<hive.version>1.2.1</hive.version>
		<hadoop.version>2.7.1</hadoop.version>
		<mysql.version>5.1.16</mysql.version>
		<junit.version>4.12</junit.version>
	</properties>

	<repositories>
		<repository>
			<id>apache.snapshots</id>
			<name>Apache Development Snapshot Repository</name>
			<url>https://repository.apache.org/content/repositories/snapshots/</url>
			<releases>
				<enabled>false</enabled>
			</releases>
			<snapshots>
				<enabled>true</enabled>
			</snapshots>
		</repository>
	</repositories>

	<dependencies>
		<!-- Apache Flink dependencies -->
		<!-- These dependencies are provided, because they should not be packaged into the JAR file. -->
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-java</artifactId>
			<version>${flink.version}</version>
			<!--<scope>provided</scope>-->
		</dependency>
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<!--<scope>provided</scope>-->
		</dependency>
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-clients_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<!--<scope>provided</scope>-->
		</dependency>

		<!-- Add connector dependencies here. They must be in the default scope (compile). -->

		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<!--<scope>provided</scope>-->
		</dependency>

		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-table-planner_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<scope>provided</scope>
		</dependency>
		<!-- or.. (for the new Blink planner) -->
		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<scope>provided</scope>
		</dependency>

		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<!--<scope>provided</scope>-->
		</dependency>

		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<!--<scope>provided</scope>-->
		</dependency>

		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-table-api-java-bridge_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
			<!--<scope>provided</scope>-->
		</dependency>

		<dependency>
			<groupId>org.apache.hive</groupId>
			<artifactId>hive-exec</artifactId>
			<version>${hive.version}</version>
			<scope>provided</scope>
		</dependency>

		<dependency>
			<groupId>org.apache.hadoop</groupId>
			<artifactId>hadoop-client</artifactId>
			<version>${hadoop.version}</version>
		</dependency>

		<dependency>
			<groupId>org.apache.flink</groupId>
			<artifactId>flink-connector-jdbc_${scala.binary.version}</artifactId>
			<version>${flink.version}</version>
		</dependency>

		<dependency>
			<groupId>mysql</groupId>
			<artifactId>mysql-connector-java</artifactId>
			<version>${mysql.version}</version>
		</dependency>

		<dependency>
			<groupId>junit</groupId>
			<artifactId>junit</artifactId>
			<version>${junit.version}</version>
		</dependency>

		<dependency>
			<groupId>org.apache.orc</groupId>
			<artifactId>orc-core</artifactId>
			<version>1.6.5</version>
		</dependency>

		<!-- Add logging framework, to produce console output when running in the IDE. -->
		<!-- These dependencies are excluded from the application JAR by default. -->
		<dependency>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-slf4j-impl</artifactId>
			<version>${log4j.version}</version>
			<scope>runtime</scope>
		</dependency>
		<dependency>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-api</artifactId>
			<version>${log4j.version}</version>
			<scope>runtime</scope>
		</dependency>
		<dependency>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-core</artifactId>
			<version>${log4j.version}</version>
			<scope>runtime</scope>
		</dependency>
	</dependencies>

	<build>
		<plugins>

			<!-- Java Compiler -->
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-compiler-plugin</artifactId>
				<version>3.1</version>
				<configuration>
					<source>${java.version}</source>
					<target>${java.version}</target>
				</configuration>
			</plugin>

			<!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
			<!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-shade-plugin</artifactId>
				<version>3.1.1</version>
				<executions>
					<!-- Run shade goal on package phase -->
					<execution>
						<phase>package</phase>
						<goals>
							<goal>shade</goal>
						</goals>
						<configuration>
							<artifactSet>
								<excludes>
									<exclude>org.apache.flink:force-shading</exclude>
									<exclude>com.google.code.findbugs:jsr305</exclude>
									<exclude>org.slf4j:*</exclude>
									<exclude>org.apache.logging.log4j:*</exclude>
								</excludes>
							</artifactSet>
							<filters>
								<filter>
									<!-- Do not copy the signatures in the META-INF folder.
									Otherwise, this might cause SecurityExceptions when using the JAR. -->
									<artifact>*:*</artifact>
									<excludes>
										<exclude>META-INF/*.SF</exclude>
										<exclude>META-INF/*.DSA</exclude>
										<exclude>META-INF/*.RSA</exclude>
									</excludes>
								</filter>
							</filters>
							<transformers>
								<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
									<mainClass>com.credit.analyze.job.batch.market.channel.MarketMonitorHiveJob</mainClass>
								</transformer>
							</transformers>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>

		<pluginManagement>
			<plugins>

				<!-- This improves the out-of-the-box experience in Eclipse by resolving some warnings. -->
				<plugin>
					<groupId>org.eclipse.m2e</groupId>
					<artifactId>lifecycle-mapping</artifactId>
					<version>1.0.0</version>
					<configuration>
						<lifecycleMappingMetadata>
							<pluginExecutions>
								<pluginExecution>
									<pluginExecutionFilter>
										<groupId>org.apache.maven.plugins</groupId>
										<artifactId>maven-shade-plugin</artifactId>
										<versionRange>[3.1.1,)</versionRange>
										<goals>
											<goal>shade</goal>
										</goals>
									</pluginExecutionFilter>
									<action>
										<ignore/>
									</action>
								</pluginExecution>
								<pluginExecution>
									<pluginExecutionFilter>
										<groupId>org.apache.maven.plugins</groupId>
										<artifactId>maven-compiler-plugin</artifactId>
										<versionRange>[3.1,)</versionRange>
										<goals>
											<goal>testCompile</goal>
											<goal>compile</goal>
										</goals>
									</pluginExecutionFilter>
									<action>
										<ignore/>
									</action>
								</pluginExecution>
							</pluginExecutions>
						</lifecycleMappingMetadata>
					</configuration>
				</plugin>
			</plugins>
		</pluginManagement>
	</build>
</project>
```
5. I have added the following jars to Flink's lib directory:

```text
flink-sql-connector-hive-1.2.2
hive-metastore-1.2.1.jar
hive-exec-1.2.1.jar
libfb303-0.9.2.jar
orc-core-1.4.3-nohive.jar
aircompressor-0.8.jar
```

Answer 1

Score: 1

I have solved the problem. The following dependency needs to be added to pom.xml:

```xml
<!-- flink orc nohive -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-orc-nohive_2.12</artifactId>
    <version>${flink.version}</version>
</dependency>
```
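The relocated package name in the stack trace explains why adding `hive-storage-api` did not help: the nohive ORC reader asks for `org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch`, the shaded copy of `org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch` that ships inside the `-nohive` artifacts, not the unrelocated class `hive-storage-api` provides. A small probe (an illustrative sketch, not from the original post; the `ClasspathProbe` name is made up) can show which variant is actually loadable on your classpath:

```java
// Illustrative sketch: probe which VectorizedRowBatch variant the classpath can load.
public class ClasspathProbe {

    // Returns true if the named class can be loaded by the current classloader.
    static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The class the nohive ORC split reader needs (relocated package):
        System.out.println("nohive variant:      "
                + isLoadable("org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch"));
        // The class hive-storage-api provides (unrelocated package):
        System.out.println("storage-api variant: "
                + isLoadable("org.apache.hadoop.hive.ql.exec.vector.VectorizedRowBatch"));
    }
}
```

Run with the same classpath as the Flink user code; if the nohive line prints `false`, the relocated classes (from `flink-orc-nohive` or the `orc-core` nohive build) are missing from the user jar and from Flink's lib directory.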

Answer 2

Score: 0

Your error clearly indicates the missing class:

```text
Caused by: java.lang.ClassNotFoundException: org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch
```

Make sure you have the proper orc-core jar imported as a dependency; you should add the following artifact:

```xml
<dependency>
    <groupId>org.apache.orc</groupId>
    <artifactId>orc-core</artifactId>
    <version>1.6.5</version>
</dependency>
```

huangapple
  • Posted on 2020-10-13 19:52:26
  • Please retain this link when reposting: https://go.coder-hub.com/64334718.html