Youdao Note - Reading a File Line by Line in Shell
I have recently been consuming Kafka data with Spark Streaming and landing it on HDFS, which produces one small file per minute. Yesterday a colleague on the architecture team asked me to clean up the historical files, but there are too many directories to delete by hand. So the idea is to collect all the directory paths into a text file, path_to_clean.txt, read the paths in a loop in a shell script, and delete each one. The file looks like this:
hdfs://nameservice1/user/hadoop/dw_realtime/dw_real_for_path_list/mb_pageinfo_hash2/date=2016-01-09
hdfs://nameservice1/user/hadoop/dw_realtime/dw_real_for_path_list/mb_pageinfo_hash2/date=2016-07-05
hdfs://nameservice1/user/hadoop/dw_realtime/dw_real_for_path_list/mb_pageinfo_hash2/date=2016-09-05
hdfs://nameservice1/user/hadoop/dw_realtime/dw_real_for_path_list/mb_pageinfo_hash2/date=2016-10-20
hdfs://nameservice1/user/hadoop/dw_realtime/dw_real_for_path_list/mb_pageinfo_hash2/date=2016-11-06
hdfs://nameservice1/user/hadoop/dw_realtime/dw_real_for_path_list/mb_pageinfo_hash2/date=2016-11-07
...
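The original note does not say how path_to_clean.txt was produced, so the following is only a sketch of one way such a list might be built: list the partition directories with hadoop fs -ls, take the path column, and filter it. The grep pattern here is just an example; which partitions count as "historical" depends on your retention policy.

#!/bin/bash
# Sketch: build path_to_clean.txt from the partition directories under mb_pageinfo_hash2.
# The grep filter below is only an example; adjust it to the partitions you actually want gone.
hadoop fs -ls hdfs://nameservice1/user/hadoop/dw_realtime/dw_real_for_path_list/mb_pageinfo_hash2 \
  | awk '{print $NF}' \
  | grep 'date=2016' \
  > path_to_clean.txt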
The cleanup scripts are below.
#!/bin/bash
# Method 1: for loop over the output of `cat`, echoing and deleting each path
for line in `cat path_to_clean.txt`
do
  echo $line
  hadoop fs -rm -r $line
done
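The for-over-`cat` form splits the file on whitespace rather than on newlines, which is fine here because HDFS paths contain no spaces. A quick demonstration of that splitting behaviour, using a throwaway file (/tmp/demo_paths.txt is just an illustration, not part of the cleanup):

#!/bin/bash
# Demonstration only: a path containing a space gets broken into several loop iterations.
printf '%s\n' '/tmp/dir with space' > /tmp/demo_paths.txt
for line in `cat /tmp/demo_paths.txt`
do
  echo "got: $line"
done
# Output:
#   got: /tmp/dir
#   got: with
#   got: space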
#!/bin/bash
# Method 2: while read, feeding the file through input redirection
while read line
do
  echo $line
done < path_to_clean.txt
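The second form only echoes each line; to actually perform the cleanup with it, the hadoop fs -rm -r call from the first script can be dropped into the loop body, roughly as sketched below (read -r and the quoted variable are just defensive shell habits, not something these paths require):

#!/bin/bash
# Sketch: the while-read form of the cleanup, deleting each path as it is read.
while read -r line
do
  echo "deleting $line"
  hadoop fs -rm -r "$line"
done < path_to_clean.txt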
#!/bin/bash
# Method 3: pipe cat into a while read loop
cat path_to_clean.txt | while read line
do
  echo $line
done
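The redirect form and the pipe form print the same lines, but they are not fully equivalent: in bash the piped while loop runs in a subshell, so variables modified inside it are lost once the loop ends. A small demonstration of the difference (count is just a throwaway counter, not part of the cleanup):

#!/bin/bash
count=0
cat path_to_clean.txt | while read line; do count=$((count+1)); done
echo "after pipe:     $count"    # prints 0 - the loop ran in a subshell

count=0
while read line; do count=$((count+1)); done < path_to_clean.txt
echo "after redirect: $count"    # prints the number of lines in the file

For this cleanup either form works, since nothing set inside the loop needs to survive it.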