Flink reading an S3 file causing a Jackson dependency issue

Question
I am reading a config YAML file in my Flink application. I want to keep this config file on the S3 filesystem, but when I use the aws-sdk in my pom and try to read the file, I get the following error. I know this is due to a conflicting Jackson dependency, but I have not been able to resolve it. Please help me resolve it.
```
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.enable([Lcom/fasterxml/jackson/core/JsonParser$Feature;)Lcom/fasterxml/jackson/databind/ObjectMapper;
at com.amazonaws.partitions.PartitionsLoader.<clinit>(PartitionsLoader.java:54)
at com.amazonaws.regions.RegionMetadataFactory.create(RegionMetadataFactory.java:30)
at com.amazonaws.regions.RegionUtils.initialize(RegionUtils.java:65)
at com.amazonaws.regions.RegionUtils.getRegionMetadata(RegionUtils.java:53)
at com.amazonaws.regions.RegionUtils.getRegion(RegionUtils.java:107)
at com.amazonaws.services.s3.AmazonS3Client.createSigner(AmazonS3Client.java:4016)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4913)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4872)
at com.amazonaws.services.s3.AmazonS3Client.getObject(AmazonS3Client.java:1472)
at com.bounce.processor.EventProcessor.main(EventProcessor.java:71)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:576)
```
This is the code that I am using to read the file:

```java
AWSCredentials credentials = new BasicAWSCredentials(ACCESS_KEY, SECRET_KEY);
AmazonS3 amazonS3Client = new AmazonS3Client(credentials);
S3Object object = amazonS3Client.getObject(new GetObjectRequest(S3_PROD_BUCKET, para.get("topology")));
InputStream awsinputStream = object.getObjectContent();
```
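For context, this is roughly how the downloaded stream could then be turned into a config object with the jackson-dataformat-yaml module that is already declared in the pom below. This is only a sketch, not code from the original post; `TopologyConfig` in the usage line is a hypothetical POJO standing in for whatever class the YAML maps to.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;

import java.io.IOException;
import java.io.InputStream;

public final class YamlConfigLoader {

    /**
     * Reads a YAML document from the given stream into the given config type.
     * ObjectMapper comes from jackson-databind, the same library whose version
     * mismatch triggers the NoSuchMethodError in the stack trace above.
     */
    public static <T> T load(InputStream yamlStream, Class<T> configType) throws IOException {
        ObjectMapper yamlMapper = new ObjectMapper(new YAMLFactory());
        try (InputStream in = yamlStream) {
            return yamlMapper.readValue(in, configType);
        }
    }
}
```

Usage would then look like `TopologyConfig config = YamlConfigLoader.load(awsinputStream, TopologyConfig.class);`.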
This is my pom.xml:

```xml
<!-- Flink dependencies -->
<dependency>
<groupId>io.confluent</groupId>
<artifactId>kafka-avro-serializer</artifactId>
<version>5.3.0</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-filesystem_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>${hadoop.version}</version>
<exclusions>
<exclusion>
<groupId>commons-httpclient</groupId>
<artifactId>commons-httpclient</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpcore</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-avro</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.parquet</groupId>
<artifactId>parquet-avro</artifactId>
<version>${flink.format.parquet.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.7</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-parquet_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.10.5</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-yaml</artifactId>
<version>2.10.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.esotericsoftware.yamlbeans/yamlbeans -->
<dependency>
<groupId>com.esotericsoftware.yamlbeans</groupId>
<artifactId>yamlbeans</artifactId>
<version>1.13</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.uber/h3 -->
<dependency>
<groupId>com.uber</groupId>
<artifactId>h3</artifactId>
<version>3.6.3</version>
</dependency>
<dependency>
<groupId>com.github.davidmoten</groupId>
<artifactId>geo</artifactId>
<version>0.7.7</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-elasticsearch6 -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-elasticsearch7_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>tech.allegro.schema.json2avro</groupId>
<artifactId>converter</artifactId>
<version>0.2.9</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-statebackend-rocksdb_2.12</artifactId>
<version>1.10.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.13</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-avro-confluent-registry</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-elasticsearch7_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-bundle</artifactId>
<version>1.11.756</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.10.0</version>
</dependency>
```
# Answer 1

**Score**: 1
Getting to the root of the conflict from the POM alone is highly unlikely. Instead, you should use the Maven dependency plugin by invoking the following command:

```bash
mvn dependency:tree
```

This will print all the dependencies, and the dependencies of those dependencies, so you will be able to locate which of the libraries you are importing has a transitive dependency on Jackson and mark it as excluded.

Note: what you are really looking for in this dependency tree are Jackson dependencies with different versions, so you do not need to exclude all of them.
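To illustrate the kind of fix the dependency tree usually points to, below is a minimal sketch of the two common remedies: pinning a single Jackson version for the whole build, or excluding the stale transitive copy. None of this is from the original answer; the 2.10.2 version and the choice of `hadoop-common` as the offending dependency are assumptions, and the real artifact to pin or exclude should be taken from the `mvn dependency:tree` output.

```xml
<!-- Sketch only (not from the original answer): force one Jackson version for
     the whole build so an older transitive copy cannot win at runtime.
     The 2.10.2 version is an assumption. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.10.2</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.10.2</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
      <version>2.10.2</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- Or exclude the conflicting transitive copy from whichever dependency the
     tree identifies; hadoop-common here only illustrates the pattern. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>${hadoop.version}</version>
  <exclusions>
    <exclusion>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

The tree output can also be narrowed to the relevant artifacts with `mvn dependency:tree -Dincludes=com.fasterxml.jackson.core`, which makes mismatched Jackson versions easier to spot.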