Contents
  1. Sqoop | Troubleshooting Sqoop Errors
    1.1. Job failed as tasks failed.
    1.2. Hive-to-MySQL export lands all data in one column
Sqoop | Troubleshooting Sqoop Errors

Job failed as tasks failed.

19/12/20 06:43:56 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1576794672412_0001
19/12/20 06:43:58 INFO impl.YarnClientImpl: Submitted application application_1576794672412_0001
19/12/20 06:43:58 INFO mapreduce.Job: The url to track the job: http://bp01:8088/proxy/application_1576794672412_0001/
19/12/20 06:43:58 INFO mapreduce.Job: Running job: job_1576794672412_0001
19/12/20 06:44:17 INFO mapreduce.Job: Job job_1576794672412_0001 running in uber mode : false
19/12/20 06:44:17 INFO mapreduce.Job:  map 0% reduce 0%
19/12/20 06:44:25 INFO mapreduce.Job:  map 100% reduce 0%
19/12/20 06:44:27 INFO mapreduce.Job: Job job_1576794672412_0001 failed with state FAILED due to: Task failed   task_1576794672412_0001_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

Solution:
Check the container log: /opt/hadoop/hadoop-2.7.6/logs/userlogs/application_1576794672412_0006/container_1576794672412_0006_01_000002

2019-12-20 07:12:27,425 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.io.IOException: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'gender' at row 1
Caused by: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'gender' at row 1

The root cause: Data too long for column 'gender' at row 1. The target MySQL column is too narrow for the values being exported.

Reference: "Sqoop export from Hive to MySQL fails: failed with state FAILED due to: Task failed"
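A common fix is to widen the MySQL column so the exported values fit. A sketch of the idea (the table and column names come from the error above, but the chosen type and size are assumptions; adapt them to your schema):

```sql
-- Inspect the current definition of the target column first
SHOW COLUMNS FROM gender;

-- Widen the column; VARCHAR(20) is an assumed size, pick one that fits your data
ALTER TABLE gender MODIFY COLUMN gender VARCHAR(20);
```

Alternatively, clean or shorten the offending values on the Hive side before exporting.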


Hive-to-MySQL export lands all data in one column

Hive data:
+------------------------+------------------------+--+
| interim_gender.gender  | interim_gender.amount  |
+------------------------+------------------------+--+
| 0                      | 33232                  |
| 1                      | 33215                  |
| 2                      | 33553                  |
+------------------------+------------------------+--+

Export command: sqoop export --connect jdbc:mysql://192.168.43.122:3306/mall --username root --password *** --table gender --input-fields-terminated-by '\001' --export-dir '/user/hive/warehouse/mall.db/interim_gender' -m 1;

MySQL data after the export:
+----------+---------+--+
|  gender  | amount  |
+----------+---------+--+
| 1    33215  | NULL    |
| 2    33553  | NULL    |
| 0    33232  | NULL    |
+----------+---------+--+

Solution:
Check the Hive data format:

$ hadoop fs -cat /user/hive/warehouse/mall.db/interim_gender/000000_0
0 33232
1 33215
2 33553

The fields are separated by Tab, not '\001', so change the delimiter option to --input-fields-terminated-by '\t'.
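To confirm which delimiter a file actually uses, one option is to render the bytes visibly with `cat -A` (GNU coreutils), which prints a Tab as `^I` and end-of-line as `$`. The sketch below recreates a sample record locally; the `/tmp` path is just for illustration, and on a real cluster you would pipe `hadoop fs -cat` into `cat -A` instead:

```shell
# Write a sample record like the ones in 000000_0 (Tab-separated)
printf '0\t33232\n1\t33215\n' > /tmp/interim_gender_sample

# A Tab shows up as ^I, so '\t'-delimited data is easy to tell apart
# from Hive's default '\001' delimiter (which cat -A renders as ^A)
cat -A /tmp/interim_gender_sample

# Against the real export directory:
#   hadoop fs -cat /user/hive/warehouse/mall.db/interim_gender/000000_0 | cat -A
```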

Author: Ben
Link: https://smallbenxiong.github.io/2019/12/20/20191220-Sqoop%E6%8A%A5%E9%94%99%E8%A7%A3%E5%86%B3/
Copyright: Unless otherwise noted, all posts on this blog are licensed under CC BY-NC-SA 4.0. Please credit Ben Blog when reposting.