Adding modules/packages to DSX / Spark as a Service currently feels quite hacky: I have to juggle pip, PixieDust, kernel restarts, and so on, all of it very manual.
Could we specify a Python requirements.txt, a set of Maven packages, and similarly a list of R packages in the DSX project settings? DSX should then ensure the environment is set up and ready to use. If kernels are running on notebooks when the module/package list is saved, there should be a prompt asking whether it is OK to restart those kernels.
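For the Python side this could simply reuse the standard pip requirements.txt format. The sketch below shows what a combined project-settings file might look like; every key here is a hypothetical illustration of the idea, not an existing DSX schema:

```yaml
# Hypothetical project-settings sketch (illustrative keys, not a real DSX feature)
python:
  requirements: requirements.txt        # standard pip format, e.g. "pandas>=0.20"
maven:
  packages:
    - org.apache.spark:spark-avro_2.11:2.4.0   # groupId:artifactId:version
r:
  cran:
    - data.table
    - ggplot2
on_save:
  running_kernels: prompt               # ask before restarting affected kernels
```

The point is that all three dependency lists live in one versionable place, and DSX owns the install/restart lifecycle instead of the user.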
Setting the modules/packages should also be possible via an API, e.g. https://datascix.uservoice.com/forums/387207-general/suggestions/17837173-provide-an-api-for-dsx
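As a sketch of what such an API request body might contain — the payload shape and field names below are purely hypothetical, chosen only to illustrate the idea of one call that declares all project dependencies:

```python
import json

# Hypothetical environment payload for a project-settings API.
# The keys ("pip", "maven", "cran", "restart_kernels") are illustrative
# assumptions, not a documented DSX API.
payload = {
    "pip": ["pandas>=0.20", "requests==2.18.4"],
    "maven": ["org.apache.spark:spark-avro_2.11:2.4.0"],
    "cran": ["data.table", "ggplot2"],
    "restart_kernels": "prompt",  # ask before restarting running kernels
}

# Serialize for an HTTP PUT/POST to the (hypothetical) project endpoint.
body = json.dumps(payload, indent=2)
print(body)
```

Such a payload would presumably be sent to a project-scoped endpoint with the usual bearer-token authentication; the exact route would be up to the DSX team.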
Thank you for this idea! We are working to support exactly this functionality: you will be able to manage environments and dependencies for your notebooks, and to share those environments both within a project and across projects.