
[BUG] (WIP) Binder visualization errors #193

Draft · wants to merge 4 commits into legacy/main from fix_lib_error
Conversation

@kaitj (Collaborator) commented Nov 1, 2021

Added libgl1-mesa-glx to the apt.txt for binder. This fixes the visualization error with fury, but increases the build time for the binder notebook.

Still causing issues with the visualization, so need to look into this more for binder.

Can test by spinning up a Binder notebook using the fix_lib_error branch.
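
For reference, apt.txt only needs the single line libgl1-mesa-glx. A rough smoke test for the FURY side once the instance is up (a sketch; assumes FURY is installed in the Binder environment and offscreen rendering is available):

from fury import actor, window

# Build a trivial scene; if libGL is missing, this is where the OpenGL error appears.
scene = window.Scene()
scene.add(actor.axes())

# Render offscreen to a PNG instead of opening a window.
window.record(scene, out_path="fury_check.png", size=(300, 300))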

@netlify (bot) commented Nov 1, 2021

✔️ Deploy Preview for carpentries-dmri ready!

🔨 Explore the source changes: 254d9ca

🔍 Inspect the deploy log: https://app.netlify.com/sites/carpentries-dmri/deploys/61843d0a49becb000889cefe

😎 Browse the preview: https://deploy-preview-193--carpentries-dmri.netlify.app

@kaitj changed the title from "[BUG] Add libgl1-mesa-glx for binder visualization" to "[BUG] (WIP) Binder visualization errors" on Nov 1, 2021
@jhlegarreta (Contributor)

Thanks for doing this Jason. I'm fine even if it increases the loading time; I prefer the Binder instance to be able to plot the results rather than returning an error.

On a slightly related note, regarding the error we experienced with the local utils.visualization_utils::generate_anatomical_volume_figure method, I'm wondering whether the Binder instance will need the path to the code to be added to the Python path, as done in the CI, so that the method can be imported.

@kaitj (Collaborator, Author) commented Nov 2, 2021

I can give it a shot at adding the PYTHONPATH, though I am not 100% sure I'll be able to get to it this week, so if someone else wants to take a stab at it, go ahead! For the FURY issue, it should resolve the error Davide experienced today, but I noticed the notebook was still dying when I tried to visualize the one cell he was trying to run. Need to look into this further.

@kaitj (Collaborator, Author) commented Nov 4, 2021

The PYTHONPATH issue is still not resolved. Adding it to PYTHONPATH allows the import from the terminal within Binder, but Jupyter does not recognize it from the notebook. Will need to dig further!
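
For anyone picking this up, a quick sanity check from a notebook cell shows what the kernel actually sees (just a diagnostic, not a fix):

import os
import sys

print(os.environ.get("PYTHONPATH"))  # what the kernel process inherited at launch
print(sys.path)                      # the directories imports will actually search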

@josephmje (Contributor)

> The PYTHONPATH issue is still not resolved. Adding it to PYTHONPATH allows the import from the terminal within Binder, but Jupyter does not recognize it from the notebook. Will need to dig further!

What about adding it to PATH as well?

If not, we may need to explicitly add it in the Jupyter notebook as well?
Something like:

import os

# Append the utils directory to the PATH seen by the kernel process
os.environ["PATH"] += os.pathsep + "/home/jovyan/utils"

@kaitj (Collaborator, Author) commented Nov 4, 2021

> If not, we may need to explicitly add it in the Jupyter notebook as well?

Yeah, one thing we could do is add a sys.path.append if all else fails. Anyway, for tomorrow's lesson the screenshots are available on the rendered website, but we would definitely want to fix this long term for Binder.
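
A minimal sketch of that fallback (the /home/jovyan location is an assumption, matching the path used in the snippet above):

import sys

# Assumption: the lesson repo is cloned at /home/jovyan, with the helpers under utils/.
sys.path.append("/home/jovyan")

from utils.visualization_utils import generate_anatomical_volume_figure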

@kaitj (Collaborator, Author) commented Nov 4, 2021

Adding it to PATH also does not work from within the Jupyter notebook.

@jhlegarreta (Contributor) commented Nov 6, 2021

Thanks for trying @kaitj. Maybe now that the workshop is over we could transition to JupyterLab and see if that helps fix the issues we are experiencing. At least we would be more likely to get some help/support.

@jhlegarreta (Contributor) commented Nov 8, 2021

I just tried to run the lesson using Binder and I've noticed that Binder is limited to 2 GB of memory. Not sure if that can be changed, or if it is the same across all runs, but it looks like the reason behind the kernel restarting over and over again and the notebook being unable to run past the simplest cells (e.g. the deterministic tractography episode is unable to complete the Otsu brain masking in the 2nd cell).

This makes me wonder how other lessons have run on so little memory, or how this one ran at the very first workshop.

If that cannot be changed, we may need to start thinking about the CONP hosting that was discussed at some point @kaitj @josephmje @jerdra .
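
For what it's worth, a quick way to confirm the limit from inside a running session (a sketch assuming the container exposes the cgroup v1 memory files):

from pathlib import Path

# cgroup v1 accounting files; /proc/meminfo would report the host node, not the pod limit.
limit = int(Path("/sys/fs/cgroup/memory/memory.limit_in_bytes").read_text())
usage = int(Path("/sys/fs/cgroup/memory/memory.usage_in_bytes").read_text())
print(f"limit: {limit / 2**30:.2f} GiB, usage: {usage / 2**30:.2f} GiB")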

@jerdra commented Nov 8, 2021

@jhlegarreta does this occur when you try to run the notebook in isolation or when you progress through lessons?

In the latter case, manually shutting down notebooks frees up memory from previous episodes; closing the notebook tab isn't enough to free memory.

Nonetheless, you bring up a good point re: better hosting solutions. I'll reach out to Loic from CONP again about any progress he's made with using our lessons w/ Neurolibre.

@jhlegarreta (Contributor)

> @jhlegarreta does this occur when you try to run the notebook in isolation or when you progress through lessons?
> In the latter case, manually shutting down notebooks frees up memory from previous episodes; closing the notebook tab isn't enough to free memory.

Ah, I did not know that. Thanks @jerdra. I have tried running the deterministic tractography notebook in isolation again, and I've been able to run it once; trying to free resources and re-run it made the notebook stall again at a later point (where memory usage was 1.85/2.00 GB) 😟.

So it looks like 2 GB is still too low as a memory resource.

Either way, I would add a note to warn users about this memory management. I have not been able to find any relevant mention in the official docs, though: @jerdra do you happen to know where this is stated? Thanks.

@josephmje (Contributor)

> Either way, I would add a note to warn users about this memory management. I have not been able to find any relevant mention in the official docs, though: @jerdra do you happen to know where this is stated? Thanks.

https://mybinder.readthedocs.io/en/latest/about/about.html#how-much-memory-am-i-given-when-using-binder

@jhlegarreta (Contributor)

> https://mybinder.readthedocs.io/en/latest/about/about.html#how-much-memory-am-i-given-when-using-binder

Thanks @josephmje. So that's about binder. Was wondering about the Jupyter side too.

@josephmje (Contributor)

> https://mybinder.readthedocs.io/en/latest/about/about.html#how-much-memory-am-i-given-when-using-binder
>
> Thanks @josephmje. So that's about binder. Was wondering about the Jupyter side too.

I think the 2 GB limit is only on Binder's side. If you were to run the notebooks locally, you'd be limited by your own computer's resources.

@jhlegarreta (Contributor)

> I think the 2 GB limit is only on Binder's side. If you were to run the notebooks locally, you'd be limited by your own computer's resources.

Yes, but I would assume that, for the notebooks that are not manually shut down, memory is not freed, as Jerry said. I was wondering where that is stated in the Jupyter documentation.

@jerdra commented Nov 8, 2021

@jhlegarreta I'm not sure where in Jupyter it's mentioned, tbh. I noticed this in the fMRI lessons, where in JupyterHub, upon closing a tab for a given lesson, you may notice that there is still a small black circle beside the notebook - this indicates that the kernel is still running.

The closest I can find is this discussion:

jupyterlab/jupyterlab#5241

And this pull request on JupyterLab:

jupyterlab/jupyterlab#6275

It looks like you can update a Jupyter setting, kernelShutdown: true, to automate this.
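
If we ever wanted to turn that on for Binder, a sketch of writing the override at build time (only the kernelShutdown key comes from the linked PR; the plugin id and the destination of the file are my assumptions):

import json
from pathlib import Path

# Hypothetical overrides.json for JupyterLab; it would need to end up in the lab settings directory.
overrides = {"@jupyterlab/notebook-extension:tracker": {"kernelShutdown": True}}
Path("overrides.json").write_text(json.dumps(overrides, indent=2))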

@jhlegarreta (Contributor)

Thanks for the explanation @jerdra 💯. I added the related notes in PR #200. I did not add the links to the issues/PR since I thought that might look somewhat odd and would require going through all of the discussions, but that can be changed at a later time or if documentation becomes available. Also, it looks like there is a File -> Close and Shutdown Notebook option in the menu according to the discussion (I have not checked). I prefer not to configure shutting down kernels programmatically, since users would not be warned.

@jerdra commented Nov 9, 2021

@jhlegarreta np! That's the approach that we're taking with the fMRI notebooks as well! You can also right-click notebooks with a black circle to shut them down :)
