IBM Cloud Databases - Structured Ideas

This Ideas portal is being closed. Please enter new ideas at http://ibm.biz/IBMAnalyticsIdeasPortal

Enable the job-kill feature in the Spark UI so that admins of a project can kill their Spark jobs

I scheduled a Jupyter Spark job in DSX, and something went wrong, so I decided to stop the job. I deleted the schedule in DSX but noticed from the Spark UI that the underlying job did not stop. It kept running and consuming all the Spark resources, and I could do nothing but wait for it to finish.
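
For context, what this idea asks to surface in the UI already exists at the API level: Spark lets code that still holds the SparkContext tag and cancel running jobs. The following is a minimal sketch using standard PySpark calls (setJobGroup, cancelJobGroup, cancelAllJobs); the group name "scheduled-notebook-run" is just an illustrative placeholder, and this only helps if the submitting kernel is still alive, which was not my situation.

    # Minimal sketch, assuming the SparkContext that submitted the work
    # is still reachable from the notebook kernel.
    from pyspark import SparkContext

    sc = SparkContext.getOrCreate()

    # Tag everything submitted from here on with a named job group.
    sc.setJobGroup("scheduled-notebook-run", "scheduled notebook run",
                   interruptOnCancel=True)

    # ... long-running work ...

    # Cancel just the tagged jobs instead of waiting for them to finish.
    sc.cancelJobGroup("scheduled-notebook-run")

    # Or cancel everything this context has scheduled.
    sc.cancelAllJobs()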

The Spark UI normally has a feature that allows users to kill jobs, but I have noticed that this feature is disabled in the Spark UI on Bluemix. I think it should be enabled for admins of DSX projects so that they can kill their Spark jobs from the Spark UI when needed. In terms of implementation, a DSX project already has different types of users, including editors, viewers, and admins, which could be leveraged for access control over the job-kill feature in the Spark UI; see the sketch below for how this might map onto existing Spark settings.
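
For reference, stock Apache Spark already exposes configuration knobs that roughly match this request. The sketch below assumes the hosted Bluemix deployment would honor them, which may not currently be the case. The property names (spark.ui.killEnabled, spark.acls.enable, spark.modify.acls, spark.ui.view.acls) are standard Spark settings; the project_admins list is a hypothetical stand-in for however DSX would resolve project roles.

    # Sketch only: standard Spark properties that would gate the kill
    # button by user, assuming the hosted deployment honors them.
    from pyspark.sql import SparkSession

    project_admins = "alice,bob"  # hypothetical DSX project admins

    spark = (
        SparkSession.builder
        .appName("dsx-scheduled-job")
        # Re-enable the kill links in the Spark UI.
        .config("spark.ui.killEnabled", "true")
        # Turn on ACL checks so UI actions are permission-gated.
        .config("spark.acls.enable", "true")
        # Only these users may modify (e.g. kill) running jobs.
        .config("spark.modify.acls", project_admins)
        # Everyone on the project can still view the UI read-only.
        .config("spark.ui.view.acls", "*")
        .getOrCreate()
    )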

  • Guest
  • Dec 14 2018
  • Needs review
Why is it useful? Who would benefit from this IDEA?
It would benefit all users of Data Science Experience.
Idea Priority: High