IBM Watson Data & AI - Structured Ideas

Welcome to the idea forum for IBM Watson Data & AI — our team welcomes any feedback, requests, and suggestions you have for improving our products! 

This forum allows us to connect your product improvement ideas with IBM product and engineering teams. 

Enable the job-kill feature in the Spark UI so that project admins can kill their Spark jobs

I scheduled a Jupyter Spark job in DSX, and when something went wrong I decided to stop it. I deleted the schedule in DSX, but noticed from the Spark UI that the underlying job did not stop. It kept running and consuming all the Spark resources, and I could do nothing but wait for it to finish.

The Spark UI normally lets users kill jobs, but I have noticed that this feature has been disabled in the Spark UI on Bluemix. I think the feature should be enabled for admins of DSX projects, so that admins can kill their Spark jobs from the Spark UI when they need to. In terms of implementation, a DSX project already has different user types — editors, viewers, and admins — which could be leveraged for access control over the job-kill feature in the Spark UI.
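For context, the kill links in the Spark UI are governed by a standard Spark configuration property, `spark.ui.killEnabled`, which is presumably what the Bluemix deployment has turned off. A minimal sketch of how it is normally set (assuming access to the Spark configuration, which DSX users on Bluemix likely do not have — hence this request for IBM to gate it per role instead):

```
# spark-defaults.conf — spark.ui.killEnabled controls whether the
# "kill" links for jobs and stages appear in the Spark web UI
spark.ui.killEnabled  true
```

The same property can also be passed per application via `spark-submit --conf spark.ui.killEnabled=true`. Since this flag is all-or-nothing, honoring DSX project roles (admin vs. editor vs. viewer) would need the additional access-control layer described above rather than the flag alone.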

  • Guest
  • Nov 21 2017
  • Needs review