1. For a Java program to connect to the Hadoop file system on your virtual machine, the file system must be reachable and writable.
(1) In /usr/local/hadoop/etc/hadoop/core-site.xml, set fs.defaultFS to the machine's actual IP (not localhost), so that remote clients can reach the NameNode:
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://ip:9000</value>
  </property>
</configuration>
(2) File permissions (opening the whole tree to 777 is fine for a test environment, but insecure in production):
hdfs dfs -chmod -R 777 /
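Instead of loosening permissions cluster-wide, another common approach is to have the client identify itself as the user that owns the HDFS paths, by setting the HADOOP_USER_NAME system property before the first FileSystem call. A minimal sketch, assuming the HDFS owner is named "hadoop" (a placeholder; adjust to your cluster):

```java
public class HdfsUserDemo {
    public static void main(String[] args) {
        // Must be set before FileSystem.get() is first invoked;
        // "hadoop" is a placeholder for the user that owns the HDFS paths.
        System.setProperty("HADOOP_USER_NAME", "hadoop");
        System.out.println(System.getProperty("HADOOP_USER_NAME"));
    }
}
```

This only affects simple (non-Kerberos) authentication, which is the default on a single-node test setup like the one described here.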
2. Maven dependencies
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.9.2</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.9.2</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>2.9.2</version>
</dependency>
3. Operations
(1) Create a directory
// Point the client at the NameNode; replace "ip" with its actual address.
Configuration configuration = new Configuration();
configuration.set("fs.defaultFS", "hdfs://ip:9000");
FileSystem fileSystem = FileSystem.get(configuration);
// mkdirs() also creates any missing parent directories, like `mkdir -p`.
boolean created = fileSystem.mkdirs(new Path("/test"));
System.out.println(created);
fileSystem.close();
(2) Create a file
Configuration configuration = new Configuration();
configuration.set("fs.defaultFS", "hdfs://ip:9000");
FileSystem fileSystem = FileSystem.get(configuration);
Path path = new Path("/demo/test1.txt");
// create() overwrites an existing file and creates missing parent directories.
FSDataOutputStream out = fileSystem.create(path);
out.write("hfajdhfkafa".getBytes());
out.flush();
out.close();
fileSystem.close();
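To confirm the write succeeded, the file can be read back through FSDataInputStream. A sketch under the same assumptions as above (a running cluster at hdfs://ip:9000 and the /demo/test1.txt file created in step (2)); it requires a live cluster, so it is shown without expected output:

```java
import java.io.ByteArrayOutputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsReadDemo {
    public static void main(String[] args) throws Exception {
        Configuration configuration = new Configuration();
        configuration.set("fs.defaultFS", "hdfs://ip:9000");
        FileSystem fileSystem = FileSystem.get(configuration);
        // Open the file written in step (2) and copy its bytes into memory.
        try (FSDataInputStream in = fileSystem.open(new Path("/demo/test1.txt"));
             ByteArrayOutputStream buffer = new ByteArrayOutputStream()) {
            IOUtils.copyBytes(in, buffer, 4096, false);
            System.out.println(buffer.toString());
        }
        fileSystem.close();
    }
}
```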