IBM Watson Data & AI - Structured Ideas

Welcome to the idea forum for IBM Watson Data & AI. We welcome any feedback, requests, and suggestions you have for improving our products!

This forum allows us to connect your product improvement ideas with IBM product and engineering teams. 

Add the ability to add/remove JARs for an entire cluster for Python notebooks.

When working with Spark from Python, it is common to install PySpark libraries that require additional JARs, which are used by the JVM side of a system that exposes the Python API the user interacts with. Currently, the only ways to do this in DSX are to (1) open a Scala notebook and add the JAR via %AddJar, or (2) copy the JARs directly into a folder path that is really an internal implementation detail we are exploiting. It would be ideal to instead be able to install JARs at the project level via an official file path in a terminal, and/or via a GUI.
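For context, here is a minimal sketch of how JAR dependencies are normally attached to a Spark session from Python when the user controls session creation, using the standard spark.jars and spark.jars.packages configuration options. The file path and Maven coordinate are placeholders, not real artifacts:

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("jar-dependency-example")
        # Local JAR files to ship to the driver and executors (placeholder path).
        .config("spark.jars", "/path/to/extra-library.jar")
        # Maven coordinates resolved and downloaded at session startup (placeholder coordinate).
        .config("spark.jars.packages", "org.example:example-lib:1.0.0")
        .getOrCreate()
    )

In a hosted notebook where the Spark session is already provisioned, these options cannot be applied by the user, which is why a cluster- or project-level way to add and remove JARs would help.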

  • Mike Dusenberry
  • Aug 17 2017
  • Needs review