Kylo + HDP 3.1 - valid and invalid tables in Hive are not populated


Jose Humberto Cruvinel

Jul 30, 2019, 3:08:23 PM
to Kylo Community
I installed Kylo version 0.10.1 with HDP version 3.1.0.0-78.

I had to change NiFi to enable Spark2 (the version available in HDP 3.1).

Using the Kylo standard-ingest template, the process runs to the end without errors.

But at the end, only the _feed and _profile tables contain records. The main table and the _valid and _invalid tables are empty.

I think the "Validate And Split Records" step (which uses ExecuteSparkJob) is not working, but I cannot identify any error in the log, only warnings.

I opened a Hive session and confirmed that _profile and _feed are OK, but there is no data in _valid, _invalid, or the main table.
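
Roughly, the checks were like this (the category and feed names below are placeholders for illustration, not the actual names from my feed):

    -- placeholder names; substitute your own category/feed
    SELECT COUNT(*) FROM mycategory.myfeed_feed;     -- returns rows
    SELECT COUNT(*) FROM mycategory.myfeed_profile;  -- returns rows
    SELECT COUNT(*) FROM mycategory.myfeed_valid;    -- returns 0
    SELECT COUNT(*) FROM mycategory.myfeed_invalid;  -- returns 0
    SELECT COUNT(*) FROM mycategory.myfeed;          -- returns 0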

Looking at HDFS, the valid files were created under the _valid table's location.

Attached are the Kylo and NiFi logs of a complete execution, plus screenshots of the execution and profile.

Can you help me?
Ambari_override-hive_site-configuration.PNG
nifi-app.log
HDFS_valid_files.PNG
HIVE_queries.log
Kylo_Job_Complete.PNG
Kylo_job_profile_no_valid_records.PNG
Kylo_job_profile_statistics.PNG
Kylo_Jobs_Results.PNG
kylo-services.log
Nifi_Configuration_for_Validate_and_split_records.PNG

ha bach Duong

May 22, 2021, 1:33:51 PM
to Kylo Community

I have the same problem. How do I fix this?
At 02:08:23 UTC+7 on Wednesday, July 31, 2019, Jose Humberto Cruvinel wrote:

ha bach Duong

May 23, 2021, 11:30:52 PM
to Kylo Community
Just fixed it by setting:

mapred.input.dir.recursive=true

The external table created by Kylo has subdirectories, so Hive needs to be configured to load data from subdirectories.
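
A minimal sketch of the fix, applied per session (it can also be set globally via an Ambari hive-site override, as in the screenshot in the first post); hive.mapred.supports.subdirectories is a related Hive property commonly set alongside it, added here as an assumption rather than something from the original fix:

    -- tell the input format to descend into subdirectories of the table location
    SET mapred.input.dir.recursive=true;
    -- related property often set together with the above (assumption, not from this thread)
    SET hive.mapred.supports.subdirectories=true;
    -- placeholder table name; the _valid table should now return the ingested rows
    SELECT COUNT(*) FROM mycategory.myfeed_valid;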

At 00:33:51 UTC+7 on Sunday, May 23, 2021, ha bach Duong wrote: