The import command I used is:
sqoop import --connect jdbc:mysql://localhost:3306/report --username root --password root --table GOO_GOODS --hive-import -m 1 --hive-overwrite --hive-table report.GOO_GOODS --fields-terminated-by ','
The data was imported into the Hive table, but two records were actually split across multiple lines (the ones containing 小孔空心(出租车专用)), as shown below:
1403101618482510,2014-03-10 16:18:48.0,b310A10008,510085,只,5,控制臂衬套,kzbct,20,11.0,1K0407183A,20,0,2015-09-01 17:23:16.0,1302041111443250,138030000000000,1170,1402171639454103,138030000000000
1403101618482502,2014-03-10 16:18:47.0,b310A10007,510074,只,5,下控制臂胶套,xkzbjt,20,5.0,811407181A,20,0,2015-09-01 17:23:16.0,1302041111443250,138030000000000,1170,1402171639454103,138030000000000
1403101618472494,2014-03-10 16:18:47.0,b310A10006,510080,只,5,控制臂衬套,kzbct,20,4.3,357407182,20,0,2015-09-01 17:23:16.0,1302041111443250,138030000000000,1170,1402171639454103,138030000000000
1403101618472486,2014-03-10 16:18:47.0,b310A10005,510067,只,5,下摆臂胶套\大,xbbjt\d,20,9.3,191407181D ,20,0,2015-09-01 17:15:21.0,1302041111443250,138030000000000,1170,1402171639454103,138030000000000
xkkx(czczy),20,8.8,191407181BG,20,0,2015-09-01 17:23:16.0,1302041111443250,138030000000000,1170,1402171639454103,138030000000000
小孔空心(出租车专用),kzbct
1403101618472478,2014-03-10 16:18:47.0,b310A10004,510232,只,5,控制臂衬套
xksx(czczy),20,8.8,191407181EG,20,0,2015-09-01 17:23:16.0,1302041111443250,138030000000000,1170,1402171639454103,138030000000000
小孔实心(出租车专用),kzbct
1403101618472470,2014-03-10 16:18:47.0,b310A10003,510231,只,5,控制臂衬套
1403101618472462,2014-03-10 16:18:47.0,b310A10002,510066,只,5,下摆臂胶套\大\有边,xbbjt\d\yb,20,8.8,191407181A,20,0,2015-09-01 17:15:21.0,1302041111443250,138030000000000,1170,1402171639454103,138030000000000
1403101618472454,2014-03-10 16:18:46.0,b310810030,730240,个,7,控制臂,kzb,20,156.0,1K0407152M,20,0,2016-01-11 17:08:23.0,1302041111443250,138028000000000,1168,1402171639454103,138028000000000
1403101618462446,2014-03-10 16:18:46.0,b310810029,730239,个,7,控制臂,kzb,20,156.0,1K0407151M,20,0,2016-01-11 17:08:23.0,1302041111443250,138028000000000,1168,1402171639454103,138028000000000
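The records got split because Hive's default text storage treats '\n' as the row terminator, so a string field that itself contains a newline is read back as two rows. A minimal sketch of the effect (the sample record below is illustrative, not from the actual table):

```python
# Hive's default text format uses '\n' as the row delimiter, so a
# field value containing a newline is read back as two separate rows.
record = ["1403101618472478", "控制臂衬套\n小孔空心(出租车专用)", "kzbct"]

# Serialize the record the way Sqoop writes text files: fields joined
# by ',' and the row terminated by '\n'.
line = ",".join(record) + "\n"

# Reading it back by splitting on newlines yields two "rows".
rows = line.rstrip("\n").split("\n")
print(len(rows))  # 2 -- the single record was split in two

# What --hive-drop-import-delims does: strip \n, \r, and \01
# from every string field before writing it out.
def drop_import_delims(field: str) -> str:
    return field.replace("\n", "").replace("\r", "").replace("\x01", "")

clean_line = ",".join(drop_import_delims(f) for f in record) + "\n"
rows = clean_line.rstrip("\n").split("\n")
print(len(rows))  # 1 -- one record per line again
```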
Solution »
I added:
--hive-drop-import-delims --lines-terminated-by '\n'

--hive-drop-import-delims means: drops \n, \r, and \01 from string fields when importing to Hive.
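For reference, the full corrected command would look like this (a sketch using the same connection parameters as above, not verified against every Sqoop version):

```shell
sqoop import \
  --connect jdbc:mysql://localhost:3306/report \
  --username root --password root \
  --table GOO_GOODS \
  --hive-import -m 1 \
  --hive-overwrite \
  --hive-table report.GOO_GOODS \
  --fields-terminated-by ',' \
  --hive-drop-import-delims \
  --lines-terminated-by '\n'
```

If you need to preserve some trace of the embedded line breaks instead of dropping them, Sqoop also offers --hive-delims-replacement, which replaces \n, \r, and \01 with a string you supply (e.g. a space) rather than removing them.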