hadoop - Hive Dynamic Partition for HDFS file
I want to create an external table with a partition column. The partition column is dynamic, based on the application ID.
For example, the HDFS folder locations are:
/local/test/log/appid=app1
/local/test/log/appid=app2
/local/test/log/appid=app3
/local/test/log/appid=app4
Inside the above folders there are multiple log files in CSV format.
I want to create a Hive external table partitioned by appid, so that if I run the query below:
select * from app_log where appid='app2';
it should return the matching rows.
So far I have created a Hive table, but it does not seem to work.
Here is the SHOW CREATE TABLE output:
CREATE EXTERNAL TABLE `app_log`(
  `log_time` string COMMENT 'from deserializer',
  `log_src` string COMMENT 'from deserializer',
  `log_msg` string COMMENT 'from deserializer')
PARTITIONED BY (
  `appid` string)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
  'escapeChar'='\\',
  'quoteChar'='\"',
  'separatorChar'=',')
STORED AS INPUTFORMAT
  'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://local/test/log'
TBLPROPERTIES (
  'transient_lastDdlTime'='1503197360')
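(As a diagnostic, and assuming the issue is that no partitions are registered in the metastore for this table, the standard Hive command below lists the partitions Hive knows about; an empty result would explain why queries on appid return nothing.)

-- List the partitions the metastore has registered for the table;
-- if this prints nothing, Hive will not read the appid=appX folders.
SHOW PARTITIONS app_log;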
Please help me create a dynamically partitioned Hive external table.
PS: I don't want to load the table using an INSERT query.
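(For reference, a minimal sketch of how partitions whose directories already exist on HDFS are usually registered with the metastore, without any INSERT; that this is the missing step here is an assumption, but both statements are standard HiveQL.)

-- Scan the table LOCATION and register every appid=... directory
-- found there as a partition of the external table.
MSCK REPAIR TABLE app_log;

-- Alternatively, register a single partition explicitly.
ALTER TABLE app_log ADD IF NOT EXISTS PARTITION (appid='app2')
  LOCATION '/local/test/log/appid=app2';

After either statement, select * from app_log where appid='app2'; should read only the files under that folder.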