Hive data loads fine, but a SELECT statement hangs
Can any expert online help take a look?

hive> LOAD DATA LOCAL INPATH '/home/hadoop/upload/SogouQ2.txt' INTO TABLE SOGOUQ2;
Copying data from file:/home/hadoop/upload/SogouQ2.txt
Copying file: file:/home/hadoop/upload/SogouQ2.txt
Loading data to table hive.sogouq2
Table hive.sogouq2 stats: [numFiles=2, numRows=0, totalSize=326191991, rawDataSize=0]
OK
Time taken: 9.264 seconds
hive> select count(*) from SOGOUQ2;
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapreduce.job.reduces=<number>
Solutions »
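When LOAD DATA succeeds but `select count(*)` hangs after printing the reducer hints, the usual suspect is that the MapReduce job was submitted to YARN but is stuck in the ACCEPTED state waiting for resources (no live NodeManager, or container memory limits too low), rather than anything wrong with Hive or the table. A hedged diagnostic sketch, assuming a YARN-backed Hive install and that the table name `SOGOUQ2` from the question is correct:

```shell
# Check whether the count(*) job is stuck in ACCEPTED state
# (an ACCEPTED application that never moves to RUNNING means
# the cluster has no resources to allocate)
yarn application -list

# Verify that NodeManagers are registered and in RUNNING state;
# an empty list means map/reduce containers can never be launched
yarn node -list -all

# As a cross-check, let Hive run small queries in local mode,
# bypassing the cluster entirely; if this returns a count quickly,
# the hang is in YARN resource allocation, not in Hive
hive -e "SET hive.exec.mode.local.auto=true; SELECT COUNT(*) FROM SOGOUQ2;"
```

If the application sits in ACCEPTED, check `yarn.nodemanager.resource.memory-mb` and the scheduler queue capacity in `yarn-site.xml`, and confirm the NodeManager daemon is actually running on the worker nodes.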