Create a Custom Jupyter Lab¶
Overview¶
In this tutorial, you'll create a custom Jupyter Lab on the RosettaHub Supercloud platform. Jupyter Labs are Docker-based formations (Docker Labs) that provide a browser-accessible JupyterLab environment. You'll learn how to create an object storage for your data, attach it as a working volume, launch and customize the lab, snapshot it, and share it with others.
A key concept in this tutorial is the working volume. When you attach an object storage as a working volume, it is mounted as JupyterLab's working directory. Files in the working volume persist independently of the machine image -- even if you delete and recreate the machine, your data remains in the object storage.
Prerequisites¶
- [ ] RosettaHub account with active subscription
- [ ] At least one cloud account connected (see Cloud Keys)
- [ ] Access to the JupyterLab public Docker Lab formation
- [ ] Notebooks or datasets to upload (optional)
Steps¶
Step 1: Open the Container Apps Perspective¶
From the RosettaHub dashboard, select the Container Apps perspective. This displays the Docker Labs view where container-based formations are listed.
Step 2: Create an Object Storage¶
In the Object Storages view, create a new storage to hold your data files.
- Right-click in the Object Storages panel
- Select Create
- Name the storage my-new-storage
- Click OK
See Object Storages Guide for more details on storage management.
Step 3: Upload Files to Your Storage¶
Click on the newly created my-new-storage entry. The AWS console opens, showing your S3 bucket.
- Click Upload in the S3 console
- Drag and drop your notebooks, datasets, or other files
- Click Upload to confirm
Note
The console-based upload is available for S3 (AWS) storage. For Azure Blob or GCP Storage, use the respective cloud provider consoles or CLI tools.
Step 4: Clone the JupyterLab Formation¶
In the Docker Labs view, locate the JupyterLab formation.
- Right-click JupyterLab
- Select Clone
- Name the new formation my-new-lab
Step 5: Configure the Working Volume¶
Attach your object storage as the working volume so JupyterLab can access your files.
- Right-click the my-new-lab formation
- Select Configure
- Navigate to the Volumes tab
- Set Working Volume to my-new-storage
- Click Save
Tip
The working volume is mounted as JupyterLab's working directory. Any files you save in JupyterLab's file browser are automatically persisted to the object storage.
Step 6: Launch Your Jupyter Lab¶
Click on my-new-lab to launch it. Click Yes in the confirmation dialog.
Your session appears under the Sessions panel. Wait approximately 2 minutes for the session to show a green tick, indicating it is ready.
Click the session to connect. JupyterLab opens in a new browser tab.
In JupyterLab, you can:
- Access the files you uploaded to the object storage in the file browser
- Open and run notebooks
- Install missing Python packages using pip or conda
- Create new notebooks and save them to the working directory
Note
Files saved in the working directory are persisted to the object storage. Files saved elsewhere on the filesystem exist only in the container image.
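The distinction in the note above can be sketched in a notebook cell. The paths are illustrative assumptions; the working directory is wherever JupyterLab's file browser opens, which depends on the image:

```python
from pathlib import Path

# Anything written under the working directory goes to the object storage
# and survives relaunches and image updates.
work_dir = Path.cwd()  # JupyterLab starts here; the working volume mount
(work_dir / "results.csv").write_text("a,b\n1,2\n")

# Anything written elsewhere lives only in the container filesystem and
# disappears when the session is recreated (unless you snapshot an image).
scratch = Path("/tmp/scratch.txt")
scratch.write_text("gone after relaunch")
```

Both writes succeed while the session runs; only the first survives a relaunch.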
Step 7: Create a Machine Image¶
After installing packages or making other environment customizations, snapshot your session to preserve them.
- Return to the RosettaHub dashboard
- Right-click your running session in the Sessions panel
- Select Create Machine Image
- Keep Update Originator Formation On Success checked
RosettaHub will:
- Snapshot the session, with all newly installed packages, into a new machine image
- Register the image under the Images panel (see Images Guide)
- Automatically update the my-new-lab formation to use the new image
Tip
You do not need to snapshot just to save data files. Files in the working directory are automatically persisted to the object storage. Snapshot only when you install new packages or modify the system environment.
Step 8: Share Your Lab¶
Share your customized Jupyter Lab with others:
- Right-click the my-new-lab formation
- Select Share
- Choose to share with a specific user, your organization, or a group
When recipients launch the shared formation, they get an identical JupyterLab environment with all your installed packages. If you shared the object storage as well, they also have access to the same data files.
Key Concepts¶
| Concept | Description |
|---|---|
| Docker Lab | A container-based formation running a web application (Jupyter, RStudio, etc.) |
| Working Volume | An object storage mounted as the application's working directory |
| Machine Image | A snapshot of the container with installed packages and system changes |
- Working volume files persist in object storage -- they survive machine deletion and image updates
- Machine image changes (installed packages, system configuration) require a snapshot to persist
- The working volume and machine image are independent -- update one without affecting the other
Next Steps¶
- Create a Custom RStudio Lab - Similar workflow for R development
- Create a Custom Big Data Lab - Hadoop/Spark cluster environments
- Object Storages Guide - Managing object storages
- Formations User Guide - Complete formations documentation
- Images Guide - Managing machine images
- Cloud Operations - Governance, budgets, and policy enforcement
Troubleshooting¶
JupyterLab does not show my uploaded files
Ensure that:
- You configured the correct object storage as the Working Volume (Step 5)
- You uploaded files to the correct S3 bucket (Step 3)
- The session was launched after configuring the volume
If you changed the volume configuration after launching, you need to shut down and relaunch the session.
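A quick sanity check from a notebook cell lists what is actually in the working directory (standard library only; assumes it is run inside the lab):

```python
from pathlib import Path

# List the contents of JupyterLab's working directory -- this is what the
# working volume exposes. An empty listing usually means the wrong storage
# was attached, or the session predates the volume change.
for entry in sorted(Path.cwd().iterdir()):
    print(entry.name)
```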
Installed packages are missing after relaunch
Packages installed via pip or conda inside JupyterLab are stored in the container filesystem, not the working volume. To preserve them:
- Install the packages you need
- Use Create Machine Image (Step 7) to snapshot the session
- Ensure Update Originator Formation On Success is checked
Future launches will include the installed packages.
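You can confirm from a notebook where a package actually lives; the reported path points into the container filesystem, not the working volume. The example below uses a standard-library module so it runs in any image; the same call works for any pip- or conda-installed package name:

```python
import importlib.util

# find_spec reports where Python would load the module from. For pip- or
# conda-installed packages this is a site-packages directory inside the
# container image, which is why a snapshot is needed to keep them.
spec = importlib.util.find_spec("json")
print(spec.origin)
```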
Session takes too long to start
Docker Lab sessions typically start within 2 minutes. If it takes longer:
- Check your cloud account status and quotas
- Verify your budget allocation has not been exceeded
- Try a different cloud region
Cannot access the object storage console
The direct console link opens the AWS S3 console. Ensure that:
- Your cloud key has the necessary S3 permissions
- You are logged into the correct AWS account
- Pop-up blockers are not preventing the new tab from opening