I am trying to consume Kafka messages into a secure (Kerberized) Hortonworks HDFS cluster using the standalone.sh script with launcher.type=MAPREDUCE. Because the cluster is secure, HDFS access requires keytab authentication, but I cannot find any way to supply keytab details in the Gobblin job or standalone configuration files.
I tried adding several properties to gobblin-standalone.properties, but the script does not appear to recognize them. Could anyone suggest how to pass keytab information for a standalone run with the MAPREDUCE launch type? (Note that the stack trace below shows a LocalJobLauncher being constructed, so the MAPREDUCE launcher type may not even be taking effect.) The log output follows:
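For reference, one approach that works when a client tool exposes no keytab properties of its own is to obtain a Kerberos ticket with kinit before launching, so the Hadoop client libraries pick up the credentials from the user's ticket cache. A minimal sketch; the keytab path and principal are placeholders, not values from my cluster:

```shell
# Acquire a Kerberos ticket for the service principal before starting Gobblin.
# The keytab path and principal below are placeholders; substitute your own.
kinit -kt /etc/security/keytabs/gobblin.service.keytab gobblin@EXAMPLE.COM

# Confirm a valid ticket is in the cache.
klist

# Launch Gobblin standalone; the Hadoop client in the JVM should now
# authenticate to HDFS using the ticket cache instead of SIMPLE auth.
bin/gobblin-standalone.sh start
```

This only helps if the HDFS client in the Gobblin JVM reads the default ticket cache; I would still prefer a proper keytab property in the Gobblin configuration if one exists.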
2016-11-18 00:03:06 CST INFO [main] gobblin.runtime.app.ServiceBasedAppLauncher 158 - Starting the Gobblin application and all its associated Services
2016-11-18 00:03:06 CST INFO [JobScheduler STARTING] gobblin.scheduler.JobScheduler 164 - Starting the job scheduler
2016-11-18 00:03:06 CST INFO [SchedulerService STARTING] org.quartz.impl.StdSchedulerFactory 1172 - Using default implementation for ThreadExecutor
2016-11-18 00:03:06 CST INFO [SchedulerService STARTING] org.quartz.core.SchedulerSignalerImpl 61 - Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl
2016-11-18 00:03:06 CST INFO [SchedulerService STARTING] org.quartz.core.QuartzScheduler 240 - Quartz Scheduler v.2.2.3 created.
2016-11-18 00:03:06 CST INFO [SchedulerService STARTING] org.quartz.simpl.RAMJobStore 155 - RAMJobStore initialized.
2016-11-18 00:03:06 CST INFO [SchedulerService STARTING] org.quartz.core.QuartzScheduler 305 - Scheduler meta-data: Quartz Scheduler (v2.2.3) 'LocalJobScheduler' with instanceId 'NON_CLUSTERED'
Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
NOT STARTED.
Currently in standby mode.
Number of jobs executed: 0
Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 3 threads.
Using job-store 'org.quartz.simpl.RAMJobStore' - which does not support persistence. and is not clustered.
2016-11-18 00:03:06 CST INFO [SchedulerService STARTING] org.quartz.impl.StdSchedulerFactory 1327 - Quartz scheduler 'LocalJobScheduler' initialized from specified file: '/xxxx/xxxx/xxx/gobblin-dist/conf/quartz.properties'
2016-11-18 00:03:06 CST INFO [SchedulerService STARTING] org.quartz.impl.StdSchedulerFactory 1331 - Quartz scheduler version: 2.2.3
2016-11-18 00:03:06 CST INFO [SchedulerService STARTING] org.quartz.core.QuartzScheduler 575 - Scheduler LocalJobScheduler_$_NON_CLUSTERED started.
2016-11-18 00:03:06 CST WARN [JobScheduler STARTING] org.apache.hadoop.util.NativeCodeLoader 62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-11-18 00:03:06 CST INFO [JobScheduler STARTING] gobblin.scheduler.JobScheduler 401 - Scheduling configured jobs
2016-11-18 00:03:06 CST INFO [JobScheduler STARTING] gobblin.scheduler.JobScheduler 415 - Loaded 1 job configurations
2016-11-18 00:03:07 CST ERROR [JobScheduler-0] gobblin.scheduler.JobScheduler$NonScheduledJobRunner 501 - Failed to run job GobblinKafkaQuickStart
gobblin.runtime.JobException: Failed to run job GobblinKafkaQuickStart
    at gobblin.scheduler.JobScheduler.runJob(JobScheduler.java:337)
    at gobblin.scheduler.JobScheduler$NonScheduledJobRunner.run(JobScheduler.java:499)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Failed to create job launcher: org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
    at gobblin.runtime.JobLauncherFactory.newJobLauncher(JobLauncherFactory.java:94)
    at gobblin.runtime.JobLauncherFactory.newJobLauncher(JobLauncherFactory.java:59)
    at gobblin.scheduler.JobScheduler.runJob(JobScheduler.java:335)
    ... 4 more
Caused by: org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1748)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1112)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1108)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1108)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1399)
    at gobblin.runtime.FsDatasetStateStore.getLatestDatasetStatesByUrns(FsDatasetStateStore.java:156)
    at gobblin.runtime.JobContext.<init>(JobContext.java:136)
    at gobblin.runtime.AbstractJobLauncher.<init>(AbstractJobLauncher.java:131)
    at gobblin.runtime.local.LocalJobLauncher.<init>(LocalJobLauncher.java:62)
    at gobblin.runtime.JobLauncherFactory.newJobLauncher(JobLauncherFactory.java:80)
    ... 6 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Client.call(Client.java:1406)
    at org.apache.hadoop.ipc.Client.call(Client.java:1359)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy8.getFileInfo(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy8.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:671)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1746)
    ... 16 more