apache kafka - Spark Streaming jobs exception handling


In my Spark Streaming (Kafka consumer) code an error occurs, but the exception handling I implemented is never triggered. I have tried catching both checked and unchecked exceptions.

My code runs on the executors. How can I catch an exception that is thrown on an executor rather than on the driver?

Any suggestions are appreciated.

Spark is a parallel framework: the driver and the executors run in separate JVMs, usually on different machines. A try/catch placed in driver code cannot catch an exception thrown inside a task running on an executor thread. The catch block has to live inside the function that Spark ships to the executors, for example inside the closure you pass to foreachRDD/foreachPartition.
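The pattern above can be sketched as follows. This is a minimal illustration, not the poster's actual code: `processRecord` and `handlePartition` are hypothetical names, and a plain list stands in for one Kafka partition so the example runs without a Spark dependency. The point is that the try/catch sits inside the function you would pass to `foreachPartition`, so it executes on the executor where the exception is actually thrown.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ExecutorSideCatch {

    // Hypothetical per-record logic that may throw at runtime on the executor.
    static String processRecord(String record) {
        if (record.isEmpty()) {
            throw new IllegalArgumentException("empty record");
        }
        return record.toUpperCase();
    }

    // The body you would pass to rdd.foreachPartition(...): the try/catch
    // lives INSIDE it, so it runs on the executor, not on the driver.
    static List<String> handlePartition(Iterable<String> partition) {
        List<String> results = new ArrayList<>();
        for (String record : partition) {
            try {
                results.add(processRecord(record));
            } catch (RuntimeException e) {
                // Handle or log on the executor; the task keeps running
                // instead of failing and being retried by Spark.
                results.add("ERROR: " + e.getMessage());
            }
        }
        return results;
    }

    public static void main(String[] args) {
        // Plain list standing in for one Kafka partition's records.
        List<String> partition = Arrays.asList("a", "", "b");
        System.out.println(handlePartition(partition));
    }
}
```

A try/catch wrapped around the driver-side call that sets up the stream would never see the `IllegalArgumentException` here, because it is thrown while the task runs remotely; catching inside the shipped closure is what makes the handler fire.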

