
How to kill a running Spark application?

I have a running Spark application that occupies all the cores, so my other applications won't be allocated any resources.

I did some quick research, and people suggested using YARN kill or /bin/spark-class to kill the application. However, I am using a CDH version where /bin/spark-class doesn't exist at all, and YARN kill doesn't work either.

https://i.stack.imgur.com/b3f8j.jpg

Can anyone help me with this?

If you are in a test env: ps aux | grep spark -> get the PID of the Spark process and kill it from the command line (see the sketch below these comments).
@eliasah "test env"? To me the job is already distributed..
You want to kill a job in production????
@eliasah Yeah... a job in production got hung due to a failure on one host.
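
A minimal sketch of the ps-based approach from the comments, assuming the driver was launched on this machine and shows up as SparkSubmit in the process list (the grep pattern is an assumption; adjust it to your setup):

# Find the driver PID; the [S] bracket trick stops grep from matching itself.
pid=$(ps aux | grep '[S]parkSubmit' | awk '{print $2}')

# Send SIGTERM first; escalate to kill -9 only if the process hangs.
kill "$pid"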

yatu

Copy-paste the application ID from the Spark scheduler, for instance application_1428487296152_25597.

Connect to the server that launched the job.

yarn application -kill application_1428487296152_25597


How do you get to the Spark scheduler?
Is it the same as the web UI?
@Hunle You can get the ID from Spark History UI or YARN RUNNING apps UI (yarn-host:8088/cluster/apps/RUNNING) or from Spark Job Web UI URL (yarn-host:8088/proxy/application_<timestamp>_<id>)
Can one kill several at once: yarn application -kill application_1428487296152_25597 application_1428487296152_25598 ... ??
Darth Hunterix

It may be time-consuming to get all the application IDs from YARN and kill them one by one. You can use a Bash for loop to accomplish this repetitive task quickly and more efficiently, as shown below:

Kill all applications on YARN which are in ACCEPTED state:

for x in $(yarn application -list -appStates ACCEPTED | awk 'NR > 2 { print $1 }'); do yarn application -kill $x; done

Kill all applications on YARN which are in RUNNING state:

for x in $(yarn application -list -appStates RUNNING | awk 'NR > 2 { print $1 }'); do yarn application -kill $x; done
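
Both loops can also be collapsed into one, since -appStates accepts a comma-separated list of states (a minimal variant of the loops above; quoting "$x" is just a small hardening touch):

for x in $(yarn application -list -appStates ACCEPTED,RUNNING | awk 'NR > 2 { print $1 }'); do yarn application -kill "$x"; done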


Ankit Anand

First use:

yarn application -list

Note down the application ID. Then, to kill it, use:

yarn application -kill application_id

For future me, one command to combine both of these:

yarn application -list | cut -f 1 | grep "application_" | xargs -I {} -P 1 -n 1 yarn application -kill {}
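
A slightly narrower sketch of that one-liner, assuming you only want to target apps currently in the RUNNING state (uses GNU xargs; -r skips the kill step entirely when the list is empty):

yarn application -list -appStates RUNNING | awk 'NR > 2 { print $1 }' | xargs -r -n 1 yarn application -kill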
Starwalker

https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html#Cluster_Application_State_API

PUT http://{rm http address:port}/ws/v1/cluster/apps/{appid}/state

{
  "state":"KILLED"
}
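
For example, with curl (rm-host:8088 is a placeholder for your ResourceManager address; on a Kerberized cluster you would also need --negotiate -u :):

curl -X PUT -H "Content-Type: application/json" \
     -d '{"state":"KILLED"}' \
     http://rm-host:8088/ws/v1/cluster/apps/application_1428487296152_25597/state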

Sachin Gaikwad

This might not be the ideal or preferred solution, but it helps in environments where you can't access the console to kill the job using the yarn application command.

Steps are

Go to the application master page of the Spark job. Click on the Jobs section. Click on the active job's active stage. You will see a "kill" button right next to the active stage.

This works if the succeeding stages depend on the currently running stage, though it marks the job as "Killed By User".