When I restarted Hadoop today I reformatted the NameNode, and afterwards uploading files to HDFS failed with the error below. A quick search turned up suggestions such as disabling the firewall or deleting files under /tmp/. Looking more closely, running jps on the slave nodes showed no DataNode process, so the DataNodes had not started. Further searching showed this is a leftover from the previous format: reformatting gives the NameNode a new clusterID that no longer matches the one the DataNodes recorded. The fix is to delete the Hadoop/dfs/name/current directory on the master, and correspondingly delete hadoop/dfs/name/current and hadoop/dfs/data/current on the slave nodes.
hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): xxxxxxxxxx._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
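The root cause is a clusterID mismatch: the reformatted NameNode writes a fresh clusterID into dfs/name/current/VERSION, while each DataNode still holds the old clusterID in dfs/data/current/VERSION, so the DataNodes refuse to register. Instead of deleting the current directories (which also wipes block data), you can copy the NameNode's new clusterID into each DataNode's VERSION file. A minimal sketch of that edit, assuming the VERSION files have the usual key=value layout (the file contents and IDs below are hypothetical examples):

```python
import re

def sync_cluster_id(namenode_version: str, datanode_version: str) -> str:
    """Return the DataNode VERSION text with its clusterID replaced by
    the NameNode's clusterID, fixing the post-reformat mismatch."""
    m = re.search(r"^clusterID=(\S+)", namenode_version, re.MULTILINE)
    if not m:
        raise ValueError("no clusterID found in NameNode VERSION file")
    return re.sub(r"^clusterID=\S+", f"clusterID={m.group(1)}",
                  datanode_version, flags=re.MULTILINE)

# Abbreviated, hypothetical VERSION file contents:
nn = "namespaceID=123\nclusterID=CID-new-0001\nstorageType=NAME_NODE\n"
dn = "storageID=DS-1\nclusterID=CID-old-9999\nstorageType=DATA_NODE\n"
print(sync_cluster_id(nn, dn))
```

Stop HDFS before editing the files, apply the change on every DataNode, then restart and confirm with jps that the DataNode process appears.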