IBM Watson Data & AI - Structured Ideas

Welcome to the idea forum for IBM Watson Data & AI — our team welcomes any feedback, requests, and suggestions you have for improving our products! 

This forum allows us to connect your product improvement ideas with IBM product and engineering teams. 

Provide a seamless workflow from prototyping deep learning networks in notebooks to the experiments builder

This idea depends on: https://ibmwatsondataplatform.ideas.aha.io/ideas/IBMWDP-I-12

I would like to be able to prototype neural networks in notebooks.  Realistically this will require GPUs for performance reasons, hence the dependency on IBMWDP-I-12.  However, just having GPUs will not be enough.

The Deep Learning coding guideline documentation lists additional steps a deep learning program must follow to work with the Watson Machine Learning backend GPU cluster.  Not all of these guidelines work in notebooks without hacky code, such as setting environment variables, which in turn requires separate branching logic for each environment the code runs in.
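To illustrate the kind of branching this forces, here is a minimal sketch; the environment variable names (DATA_DIR, RESULT_DIR) and the fallback paths are assumptions for illustration, not the documented guideline:

```python
import os

def detect_backend():
    """Guess whether we are on the WML training cluster or in a notebook.

    NOTE: DATA_DIR / RESULT_DIR are hypothetical variable names chosen
    for this sketch; check the Deep Learning coding guideline docs for
    what the cluster actually sets.
    """
    if os.environ.get("DATA_DIR") and os.environ.get("RESULT_DIR"):
        return "wml"
    return "notebook"

def resolve_paths():
    """Return (data_dir, result_dir) for the current environment."""
    if detect_backend() == "wml":
        return os.environ["DATA_DIR"], os.environ["RESULT_DIR"]
    # Local notebook fallbacks (hypothetical)
    return "./data", "./results"
```

Every training script ends up carrying this kind of boilerplate just to run in both places, which is exactly the overhead the idea asks to remove.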

Additionally, the Watson Machine Learning service expects the deep learning code to be located in object storage.  This means exporting the notebook and then uploading the code manually.
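As a sketch of what that manual step looks like today, assuming an S3-compatible client such as boto3 (ibm_boto3 for IBM Cloud Object Storage); the bucket layout and names here are assumptions, not a documented WML convention:

```python
import os

def object_key(model_name, filename):
    # Where the exported training code lands in the bucket;
    # this layout is an illustrative assumption, not a WML convention.
    return f"{model_name}/code/{os.path.basename(filename)}"

def upload_training_code(bucket, model_name, script_path):
    # Step 1 (outside this function): export the notebook to a script,
    # e.g.  jupyter nbconvert --to script my_model.ipynb
    # Step 2: upload the resulting .py (or a .zip of it) to object storage.
    import boto3  # ibm_boto3 exposes the same S3-style client API
    client = boto3.client("s3")  # endpoint URL and credentials omitted
    client.upload_file(script_path, bucket,
                       object_key(model_name, script_path))
```

A seamless integration would hide both steps behind a single "submit to experiment" action from the notebook.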

Ideally, the process should be made much more seamless so that deep learning engineers can go from rapid prototyping in notebooks to experiments without much overhead.

 

  • Chris Snow
  • Dec 14 2018
  • Accepted
  • CHRISTOPHER SNOW commented
    14 Dec 17:58

    Update: after some deeper thought, this ticket does not need to depend on the ticket for GPU support.  If we could have an integrated workflow from prototyping neural networks in notebooks to submitting them to the DLaaS, I think that would also be a huge improvement.

  • Chris Snow commented
    14 Dec 17:58

    ‘seemless’ should have been spelt ‘seamless’ :)

  • JAUME MIRALLES SOLE commented
    14 Dec 17:58

    Yes, there are many notebook samples that define and train a NN, for example using Keras. You are right there in the notebook and just need to train faster... but now you have to export the code, create an experiment, run it, ... too complex!! I just wanted to train it faster!!
    Take into consideration that many laptops have a GPU available. I do not want to have to compete with a data scientist's own basic infrastructure.

  • JAUME MIRALLES SOLE commented
    14 Dec 17:58

     FACTS: training an MNIST model with Keras

    In Watson Studio, with the largest environment (16 CPUs), it takes 36 secs/epoch

    With my home laptop, which has an NVIDIA GeForce 940MX, it takes 11 secs/epoch

     

    Do we really expect to be taken seriously if we do not offer insane computing power?

  • adam massachi commented
    14 Dec 17:58

    @jaume how can we be taken seriously if we're insane??!!!

  • Admin
    ANTHONY STEVENS commented
    14 Dec 18:00

    Typo corrected :-)