Getting an error when configuring the state backend to HDFS

I am trying to set the state backend to HDFS:
val stateUri = "hdfs/path_to_dir"
val backend: RocksDBStateBackend = new RocksDBStateBackend(stateUri, true)
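
For context, the backend is registered on the execution environment along these lines (a sketch of my setup; the URI, host, and job name here are placeholders, and it needs the Flink runtime on the classpath):

```scala
import org.apache.flink.contrib.streaming.state.RocksDBStateBackend
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment

object Job {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Placeholder checkpoint URI; the real one points at our HDFS directory
    val stateUri = "hdfs://namenode:8020/path_to_dir"

    // Second argument enables incremental checkpoints
    val backend: RocksDBStateBackend = new RocksDBStateBackend(stateUri, true)
    env.setStateBackend(backend)

    env.execute("job")
  }
}
```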

I am running Flink 1.7.0 with the following dependencies (I have tried them in different combinations):
"org.apache.flink"    %% "flink-connector-filesystem"         % flinkV
"org.apache.flink"    % "flink-hadoop-fs"                     % flinkV
"org.apache.hadoop"   % "hadoop-hdfs"                         % hadoopVersion
"org.apache.hadoop"   % "hadoop-common"                       % hadoopVersion
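
In build.sbt these are declared roughly as follows (a sketch; the version values are placeholders for what we actually use):

```scala
// build.sbt fragment (sketch; version values are placeholders)
val flinkV = "1.7.0"
val hadoopVersion = "2.8.3" // assumption: our actual Hadoop version may differ

libraryDependencies ++= Seq(
  "org.apache.flink"  %% "flink-connector-filesystem" % flinkV,
  "org.apache.flink"  %  "flink-hadoop-fs"            % flinkV,
  "org.apache.hadoop" %  "hadoop-hdfs"                % hadoopVersion,
  "org.apache.hadoop" %  "hadoop-common"              % hadoopVersion
)
```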

However, when running the jar I get this error:

Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(
at org.apache.flink.core.fs.FileSystem.get(
at org.apache.flink.core.fs.Path.getFileSystem(
at org.apache.flink.runtime.state.filesystem.FsCheckpointStorage.<init>(
at org.apache.flink.runtime.state.filesystem.FsStateBackend.createCheckpointStorage(
at org.apache.flink.contrib.streaming.state.RocksDBStateBackend.createCheckpointStorage(
at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.<init>(
... 17 more
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop is not in the classpath/dependencies.
at org.apache.flink.core.fs.UnsupportedSchemeFactory.create(
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(
... 23 more

Any help would be greatly appreciated.