Apache Spark driver logs should be directed to a file in a known directory in both yarn-cluster and yarn-client mode, because application users often log useful information from the driver class. There are two cases to handle:

1) Spark application run in yarn-client mode
2) Spark application run in yarn-cluster mode

Spark application run in yarn-client mode

When a job runs in yarn-client mode, the driver logs are printed to the console. For long-running jobs this is not very useful, because the logs are lost once the terminal session ends, so it is a good practice to write the driver output to a definite location. The approach below for yarn-client mode follows the Hortonworks community article: https://community.hortonworks.com/articles/138849/how-to-capture-spark-driver-and-executor-logs-in-y.html

Here are the steps:

1. Place a driver_log4j.properties file in a certain location (say /tmp) on the machine from which you will submit the job in yarn-client mode.

Contents of driver_log4j.properties: ...
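The exact contents are given in the linked article; as a rough sketch, a log4j 1.x configuration that routes the driver output to a rolling file could look like the following. The appender name FILE, the log path /tmp/spark-driver.log, and the size limits are illustrative assumptions, not values taken from the article:

# send everything at INFO and above to the FILE appender
log4j.rootLogger=INFO, FILE

# rolling file appender for the driver log (path is an example)
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=/tmp/spark-driver.log
log4j.appender.FILE.MaxFileSize=10MB
log4j.appender.FILE.MaxBackupIndex=10
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

The file is then passed to the driver JVM at submit time; in yarn-client mode this is typically done through spark-submit's --driver-java-options. The class name and jar below are placeholders:

spark-submit \
  --master yarn \
  --deploy-mode client \
  --driver-java-options "-Dlog4j.configuration=file:/tmp/driver_log4j.properties" \
  --class com.example.MyApp \
  myapp.jar

In yarn-cluster mode the driver runs on a cluster node rather than the submitting machine, so the properties file typically has to be shipped with --files and referenced through spark.driver.extraJavaOptions instead.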
I found that Eclipse Oxygen was not starting; it was throwing the error below. Renaming workbench.xmi to workbenchOLD.xmi inside <ECLIPSE_WORKSPACE>/.metadata/.plugins/org.eclipse.e4.workbench fixed it:

cd <ECLIPSE_WORKSPACE>/.metadata/.plugins/org.eclipse.e4.workbench
mv workbench.xmi workbenchOLD.xmi

Eclipse regenerates workbench.xmi on the next launch, so the workspace opens again, although the saved window layout is reset.

The error was:

!ENTRY org.eclipse.e4.ui.workbench.swt 4 2 2018-05-14 13:02:12.435
!MESSAGE Problems occurred when invoking code from plug-in: "org.eclipse.e4.ui.workbench.swt".
!STACK 0
java.lang.AssertionError: assertion failed
at scala.Predef$.assert(Predef.scala:204)
at scala.tools.scalap.Classfile.<init>(Classfile.scala:17)
at org.scalaide.core.internal.jdt.model.ScalaClassFileDescriber$.isScala(ScalaClassFileDescriber.scala:16)
at org.scalaide.core.internal.jdt.model.ScalaClassFileDescriber.describe(ScalaClassFileDescriber.scala:38)
at org.eclipse.core.internal.content.ContentTypeCatalo...