When working with Spark from Python, it is common to install PySpark libraries that require additional JARs, which are used by the JVM side of a system that exposes a Python API to the user. Currently, the only ways to do this in DSX are either to (1) open a Scala notebook and add the JAR via %AddJar, or (2) add the JARs directly to a folder path that is generally an internal implementation detail we are exploiting. It would be ideal to instead be able to install JARs at the project level via an official file path in a terminal, and/or via a GUI.
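For context, a sketch of what the two current workarounds look like in practice. The JAR name, URL, and folder path below are hypothetical placeholders, since the real internal folder is an undocumented implementation detail:

```shell
# Workaround (1): in a Scala notebook cell, fetch and attach a JAR with
# the %AddJar magic (URL is a hypothetical example):
#
#   %AddJar https://example.com/libs/my-library-1.0.jar

# Workaround (2): from a terminal, copy the JAR into the internal libs
# folder that the Spark kernel scans. LIBS_DIR is a hypothetical
# stand-in for the real path, which is not officially documented.
LIBS_DIR=/tmp/example-internal-libs
mkdir -p "$LIBS_DIR"
touch my-library-1.0.jar        # placeholder JAR for illustration
cp my-library-1.0.jar "$LIBS_DIR/"
ls "$LIBS_DIR"                  # the JAR now sits on the kernel's scan path
```

A project-level install command or GUI would replace both of these: workaround (1) only works from a Scala notebook, and workaround (2) depends on a path that could change in any release.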
Why is it useful?
Who would benefit from this IDEA?
How should it work?