createBlockOutputStream
http://getsatisfaction.com/cloudera/topics/exception_in_createblockoutputstream_java_io_ioexception_bad_connect_ack_with_firstbadlink
When I try to copy a file from the local filesystem to HDFS, it shows the error below, but the file still seems to copy. What is the problem, and how do I solve it?
hadoopadmin@DcpMaster:/$ hadoop dfs -put /home/hadoopadmin/Desktop/hadoop-0.20/ /cinema
10/06/17 13:18:12 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 10.0.0.2:50010
10/06/17 13:18:12 INFO hdfs.DFSClient: Abandoning block blk_8926776881229359620_1133
10/06/17 13:18:12 INFO hdfs.DFSClient: Waiting to find target node: 10.0.0.1:50010
10/06/17 13:19:18 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 10.0.0.2:50010
10/06/17 13:19:18 INFO hdfs.DFSClient: Abandoning block blk_-7122249881716391441_1134
10/06/17 13:19:18 INFO hdfs.DFSClient: Waiting to find target node: 10.0.0.1:50010
10/06/17 13:20:24 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 10.0.0.2:50010
10/06/17 13:20:24 INFO hdfs.DFSClient: Abandoning block blk_4279698506667722666_1135
10/06/17 13:20:24 INFO hdfs.DFSClient: Waiting to find target node: 10.0.0.1:50010
Patrick Angeles (Solutions Architect) 1 year ago
Hey there.
These warnings can be normal. They are usually caused by a DataNode (DN) timing out (due to network hiccups or load) or going down; as the log shows, the client abandons the block and retries against another node, so Hadoop can normally recover from these errors on its own.
Is the copy failing?
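One way to check whether the put actually completed is to list the target directory and run fsck over it. This is just a quick sanity check; the /cinema path is taken from your command above, and the exact output will vary:
hadoop dfs -ls /cinema
hadoop fsck /cinema -files -blocks -locations
If fsck reports the files as HEALTHY with the expected number of blocks, the copy succeeded despite the warnings.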
If the problem occurs regularly with the same target DataNode (it looks like 10.0.0.2 is always the one failing here), then I would look into that node.
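A few things worth checking on 10.0.0.2, as a rough sketch. The port and log location below are the usual defaults for a tarball install and may differ on your setup:
# Is the DataNode up and registered with the NameNode?
hadoop dfsadmin -report
# Can the client host reach the DataNode's data transfer port (50010 by default)?
telnet 10.0.0.2 50010
# On 10.0.0.2 itself, look for errors in the DataNode log (assumed default location)
tail -n 100 $HADOOP_HOME/logs/hadoop-*-datanode-*.log
If the port is unreachable, a firewall rule or a wrong /etc/hosts entry on that node is a common cause.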
Regards,
- Patrick