java - Save Spark Launcher Output to File
I'm submitting jobs to a Spark cluster (running on YARN) programmatically from a Java app, using SparkLauncher (starting the job with startApplication(), not launch()). I would like the log output that the launcher produces on stdout and stderr while executing the job to end up in a file that my Java app can access. I don't want to change the global Spark log config; I want a dynamic solution that I can control from the Java app, depending on variables that change on every single execution.
According to the documentation, this should be possible using the CHILD_PROCESS_LOGGER_NAME option. So I defined a java.util.logging.Logger and added this code to my job launcher:
launcher.setConf(SparkLauncher.CHILD_PROCESS_LOGGER_NAME, "MyLog");
But it doesn't work; the log file stays empty. I also tried other approaches such as addSparkArg(...), but without success. What did I do wrong? Or should I rather use log4j, build a custom configuration, and pass it to the launcher in some way? If yes, how do I do that from my Java app?
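To make my intent concrete, here is a minimal sketch of what I would expect to work. My understanding is that the launcher routes the child process output through a java.util.logging Logger with the configured name, so attaching a FileHandler to that logger before startApplication() should land the output in a file. The app resource, main class, and file path below are placeholders, not my real values:

import java.util.logging.FileHandler;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LauncherLogToFile {
    public static void main(String[] args) throws Exception {
        // pick a fresh log file for every execution (placeholder path)
        String logFile = "/tmp/spark-job-" + System.currentTimeMillis() + ".log";

        // attach a FileHandler to the java.util.logging logger the launcher is told to use
        Logger launcherLog = Logger.getLogger("MyLog");
        FileHandler fileHandler = new FileHandler(logFile);
        fileHandler.setFormatter(new SimpleFormatter());
        launcherLog.addHandler(fileHandler);
        launcherLog.setUseParentHandlers(false); // optional: keep the output off the console

        SparkLauncher launcher = new SparkLauncher()
                .setAppResource("/path/to/my-spark-job.jar") // placeholder
                .setMainClass("com.example.MyJob")           // placeholder
                .setMaster("yarn")
                .setConf(SparkLauncher.CHILD_PROCESS_LOGGER_NAME, "MyLog");

        SparkAppHandle handle = launcher.startApplication();
        // ... poll handle.getState() until it is final ...
    }
}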
Below is the code snippet I have been using to print the SparkLauncher logs through SLF4J/Log4j:
private static final Logger logger = LoggerFactory.getLogger(JobSubmitter.class);

SparkLauncher launcher = new SparkLauncher()...; // prepare the launcher
launcher.redirectToLog(JobSubmitter.class.getName());
SparkAppHandle handle = launcher.startApplication();
while (handle.getState() == null || !handle.getState().isFinal()) {
    if (handle.getState() != null) {
        logger.info("Job state: {}", handle.getState());
        if (handle.getAppId() != null) {
            logger.info("App id: {} :: state: {}", handle.getAppId(), handle.getState());
        }
    }
    // pause between checks to reduce the job status polling frequency
    Thread.sleep(jobStatusCheckInterval == 0 ? DEFAULT_JOB_STATUS_CHECK_INTERVAL : jobStatusCheckInterval);
}
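If you just need the raw launcher output in a file rather than going through a logging framework, a minimal alternative sketch (paths and class names below are placeholders) is to redirect the child process output directly with redirectOutput(File) and merge stderr into it with redirectError():

import java.io.File;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class FileRedirectExample {
    public static void main(String[] args) throws Exception {
        // one output file per execution (placeholder path)
        File outFile = new File("/tmp/spark-job-" + System.currentTimeMillis() + ".out");

        SparkLauncher launcher = new SparkLauncher()
                .setAppResource("/path/to/my-spark-job.jar") // placeholder
                .setMainClass("com.example.MyJob")           // placeholder
                .setMaster("yarn")
                .redirectError()          // merge stderr into stdout
                .redirectOutput(outFile); // write the merged output to the file

        SparkAppHandle handle = launcher.startApplication();
        while (!handle.getState().isFinal()) {
            Thread.sleep(5000); // simple polling until the job reaches a final state
        }
    }
}

As far as I know, the launcher does not allow combining redirectToLog with redirectOutput/redirectError in the same launch, so pick one redirection style per job.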
Add a comment in case you have any queries.