2018-06-19 06:28:21,840 INFO [IPC Server handler 9 on 48365] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1529328969159_0002_m_000002_3 is : 0.0
2018-06-19 06:28:21,853 FATAL [IPC Server handler 0 on 48365] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1529328969159_0002_m_000002_3 - exited : java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:533)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org

Solutions »

  1.   

    Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Application was streaming results when the connection failed. Consider raising value of 'net_write_timeout' on the server.
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:404)
    at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:988)
    at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:3552)
    at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:3452)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3893)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:875)
    at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:1991)
    at 
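    The key hint is in the exception message itself: raise net_write_timeout on the MySQL server. A minimal sketch, assuming you have admin access to the server (the 600-second value is an arbitrary starting point, not a number from this thread):

        # Raise the streaming write timeout on the running server.
        # SET GLOBAL only affects sessions opened after the change,
        # so rerun the Sqoop job afterwards; the value resets on server
        # restart unless it is also written into my.cnf.
        mysql -u root -p -e "SET GLOBAL net_write_timeout = 600;"

        # Permanent form, under [mysqld] in my.cnf:
        #   net_write_timeout = 600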
      

  2.   

    I set net_write_timeout to 600 and it still fails. Is it really this hard to import MySQL data into Hive with Sqoop?
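    One possible reason a server-side change does not help: when Connector/J streams a result set (and the "Application was streaming results" message above shows streaming is in use), the driver resets the session's net_write_timeout to its own netTimeoutForStreamingResults connection property, which defaults to 600 seconds, silently overriding the global value. A sketch of raising it through the JDBC URL instead; host, database, table, and user here are placeholders:

        sqoop import \
          --connect "jdbc:mysql://dbhost:3306/mydb?netTimeoutForStreamingResults=3600" \
          --username myuser -P \
          --table mytable \
          --hive-import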
      

  3.   

    Split the job in two for testing: first dump the MySQL data to files, then load the files into Hive (see the sketch below).
    How large is the data set, and which MySQL and JDBC driver versions are you using? It also looks like your network may be part of the problem.
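    A sketch of that two-step test; the connection details, table name, staging path, and delimiter are all placeholders to adapt:

        # Step 1: pull the MySQL table into plain text files on HDFS,
        # leaving Hive out of the picture. A single mapper (-m 1) keeps
        # the test simple and avoids split-column issues.
        sqoop import \
          --connect "jdbc:mysql://dbhost:3306/mydb" \
          --username myuser -P \
          --table mytable \
          --target-dir /tmp/mytable_staging \
          --fields-terminated-by '\t' \
          -m 1

        # Step 2: if step 1 succeeds, move the files into an existing
        # Hive table whose field delimiter matches the export above.
        hive -e "LOAD DATA INPATH '/tmp/mytable_staging' INTO TABLE mytable;"

    If step 1 fails with the same CommunicationsException, the problem is between the mappers and MySQL, not in the Hive load.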