github.com/zq2599/blog_demos
Contents: a categorized index of all my original articles with companion source code, covering Java, Docker, Kubernetes, DevOps, and more;
Sqoop is an Apache open-source project for efficiently transferring bulk data between Hadoop and relational databases. In this article we will work through the following hands-on steps together:
wget https://mirror.bit.edu.cn/apache/sqoop/1.4.7/sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
tar -zxvf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz
mv sqoop-env-template.sh sqoop-env.sh
export HADOOP_COMMON_HOME=/home/hadoop/hadoop-2.7.7
export HADOOP_MAPRED_HOME=/home/hadoop/hadoop-2.7.7
export HIVE_HOME=/home/hadoop/apache-hive-1.2.2-bin
[hadoop@node0 bin]$ ./sqoop version
Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
20/11/02 12:02:58 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Sqoop 1.4.7
git commit id 2328971411f57f0cb683dfb79d19d4d19d185dd8
Compiled by maugli on Thu Dec 21 15:59:58 STD 2017
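To talk to MySQL, Sqoop also needs the MySQL JDBC driver on its classpath, otherwise the import/export commands below fail with a missing-driver error. A minimal sketch, assuming the driver jar is mysql-connector-java-5.1.47.jar in the hadoop user's home directory (the jar name and path are assumptions, adjust to your download):

# copy the MySQL JDBC driver into Sqoop's lib directory (jar name/path assumed)
cp /home/hadoop/mysql-connector-java-5.1.47.jar /home/hadoop/sqoop-1.4.7.bin__hadoop-2.6.0/lib/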
For the hands-on steps that follow, a MySQL instance needs to be ready; the MySQL setup I used is given here for your reference:
As for deploying MySQL, to keep things simple I ran it in Docker.
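For reference, here is a minimal sketch of such a deployment, assuming Docker is available on the MySQL host. The root password 123456 and port 3306 match the Sqoop commands below; the container name, image tag, and the address table schema (inferred from the Hive table used later) are my own assumptions:

# start a MySQL 5.7 container; password and port match the Sqoop commands below
docker run -d \
  --name mysql-sqoop \
  -p 3306:3306 \
  -e MYSQL_ROOT_PASSWORD=123456 \
  mysql:5.7

# once MySQL finishes initializing, create the database and table used in this article
# (schema inferred from the Hive address table: addressid/province/city)
docker exec -i mysql-sqoop mysql -uroot -p123456 -e "
CREATE DATABASE IF NOT EXISTS sqoop;
CREATE TABLE IF NOT EXISTS sqoop.address (
  addressid INT PRIMARY KEY,
  province  VARCHAR(32),
  city      VARCHAR(32)
);"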
./sqoop export \
--connect jdbc:mysql://192.168.50.43:3306/sqoop \
--table address \
--username root \
--password 123456 \
--export-dir '/user/hive/warehouse/address' \
--fields-terminated-by ','
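Once the export job finishes, a quick check on the MySQL side confirms the Hive rows arrived (the container name comes from the Docker sketch above and is an assumption):

# query the target table from inside the assumed MySQL container
docker exec -i mysql-sqoop mysql -uroot -p123456 -e "SELECT * FROM sqoop.address;"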
create table address2 (addressid int, province string, city string) row format delimited fields terminated by ',';
./sqoop import \
--connect jdbc:mysql://192.168.50.43:3306/sqoop \
--table address \
--username root \
--password 123456 \
--target-dir '/user/hive/warehouse/address2' \
-m 2
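Because --target-dir points at the warehouse directory of the address2 table created above, the imported files land exactly where Hive expects them. Before querying Hive, you can inspect what the two map tasks (-m 2) wrote (paths taken from the command above):

# list and dump the part files produced by the import job
hdfs dfs -ls /user/hive/warehouse/address2
hdfs dfs -cat /user/hive/warehouse/address2/part-m-*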
Virtual memory (bytes) snapshot=4169867264
Total committed heap usage (bytes)=121765888
	File Input Format Counters
		Bytes Read=0
	File Output Format Counters
		Bytes Written=94
20/11/02 16:09:22 INFO mapreduce.ImportJobBase: Transferred 94 bytes in 16.8683 seconds (5.5726 bytes/sec)
20/11/02 16:09:22 INFO mapreduce.ImportJobBase: Retrieved 5 records.
hive> select * from address2;
OK
1	guangdong	guangzhou
2	guangdong	shenzhen
3	shanxi	xian
4	shanxi	hanzhong
6	jiangshu	nanjing
Time taken: 0.049 seconds, Fetched: 5 row(s)
I'm Xinchen (欣宸), and I look forward to exploring the Java world with you…