Flink reading an S3 file causing a Jackson dependency issue

# Question

I am reading a config YAML file in my Flink application. I want to keep this config file on the S3 filesystem, but when I add the aws-sdk to my pom and try to read the file, I get the error below. I know this is due to a conflicting Jackson dependency, but I am not able to resolve it. Please help me resolve this.

```
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.enable([Lcom/fasterxml/jackson/core/JsonParser$Feature;)Lcom/fasterxml/jackson/databind/ObjectMapper;
    at com.amazonaws.partitions.PartitionsLoader.<clinit>(PartitionsLoader.java:54)
    at com.amazonaws.regions.RegionMetadataFactory.create(RegionMetadataFactory.java:30)
    at com.amazonaws.regions.RegionUtils.initialize(RegionUtils.java:65)
    at com.amazonaws.regions.RegionUtils.getRegionMetadata(RegionUtils.java:53)
    at com.amazonaws.regions.RegionUtils.getRegion(RegionUtils.java:107)
    at com.amazonaws.services.s3.AmazonS3Client.createSigner(AmazonS3Client.java:4016)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4913)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4872)
    at com.amazonaws.services.s3.AmazonS3Client.getObject(AmazonS3Client.java:1472)
    at com.bounce.processor.EventProcessor.main(EventProcessor.java:71)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:576)
```

This is the code I am using to read the file:

```java
AWSCredentials credentials = new BasicAWSCredentials(ACCESS_KEY, SECRET_KEY);
AmazonS3 amazonS3Client = new AmazonS3Client(credentials);

S3Object object = amazonS3Client.getObject(new GetObjectRequest(S3_PROD_BUCKET, para.get("topology")));
InputStream awsinputStream = object.getObjectContent();
```

This is my pom.xml:

```xml
<!-- Flink dependencies -->
<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <version>5.3.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-filesystem_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
    <exclusions>
        <exclusion>
            <groupId>commons-httpclient</groupId>
            <artifactId>commons-httpclient</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpcore</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-avro</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.parquet</groupId>
    <artifactId>parquet-avro</artifactId>
    <version>${flink.format.parquet.version}</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.7</version>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-parquet_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>joda-time</groupId>
    <artifactId>joda-time</artifactId>
    <version>2.10.5</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.dataformat</groupId>
    <artifactId>jackson-dataformat-yaml</artifactId>
    <version>2.10.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.esotericsoftware.yamlbeans/yamlbeans -->
<dependency>
    <groupId>com.esotericsoftware.yamlbeans</groupId>
    <artifactId>yamlbeans</artifactId>
    <version>1.13</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.uber/h3 -->
<dependency>
    <groupId>com.uber</groupId>
    <artifactId>h3</artifactId>
    <version>3.6.3</version>
</dependency>
<dependency>
    <groupId>com.github.davidmoten</groupId>
    <artifactId>geo</artifactId>
    <version>0.7.7</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-elasticsearch6 -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-elasticsearch7_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>tech.allegro.schema.json2avro</groupId>
    <artifactId>converter</artifactId>
    <version>0.2.9</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-statebackend-rocksdb_2.12</artifactId>
    <version>1.10.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.13</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-avro-confluent-registry</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-elasticsearch7_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-bundle</artifactId>
    <version>1.11.756</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.10.0</version>
</dependency>
```
# Answer 1
**Score**: 1
You are very unlikely to figure out the conflict just by looking at the POM. Instead, you should use the Maven dependency plugin by invoking the following command:

```bash
mvn dependency:tree
```

This will print all the dependencies, and the dependencies of those dependencies, so you will be able to locate which of the libraries you are importing has a transitive dependency on Jackson and mark it as excluded.
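As a hedged illustration of what such an exclusion could look like: suppose (purely as an example) the tree showed that `kafka-avro-serializer` was dragging in an older `jackson-databind`. You could then exclude it as below; which artifact actually needs the exclusion must be read from your own `mvn dependency:tree` output (you can narrow that output to Jackson with `mvn dependency:tree -Dincludes=com.fasterxml.jackson.core`).

```xml
<!-- Hypothetical example: exclude the transitive Jackson pulled in by one of the
     declared libraries, so that only the explicitly declared jackson-databind
     ends up on the classpath. Replace the artifact below with whatever your own
     dependency:tree output points at. -->
<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <version>5.3.0</version>
    <exclusions>
        <exclusion>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```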

Note: what you are really looking for in this dependency tree are Jackson dependencies with differing versions, so you don't need to exclude them all.
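If the tree does show several Jackson versions, an alternative to excluding each offender is to pin a single Jackson version for the whole build with `<dependencyManagement>`, so every transitive Jackson artifact resolves to the same version. A minimal sketch, assuming 2.10.2 (the jackson-dataformat-yaml version already declared in the POM) is the version to standardize on:

```xml
<!-- Sketch: force one Jackson version across all transitive dependencies.
     The version used here is an assumption; pick one consistent version
     that all the libraries in your build can work with. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
            <version>2.10.2</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.10.2</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-annotations</artifactId>
            <version>2.10.2</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```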
