Secure your Jupyter Notebooks

NB Defense, brought to you by Protect AI, is a JupyterLab Extension and a CLI tool that encourages you to think about security at every step of your machine learning development process. NB Defense is now open source; begin securing your notebooks today!

Detect various vulnerabilities

Secrets

API keys, private keys, authentication tokens, and other security credentials.

PII

Sensitive data and personally identifiable information.

CVE

Known vulnerabilities (CVEs) in ML OSS frameworks, libraries, and packages.

Third-Party Licenses

Non-permissive licenses in ML OSS frameworks, libraries, and packages.
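To make the first two categories concrete, here is a minimal, purely illustrative sketch of pattern-based detection. The regexes below are assumptions for demonstration only; NB Defense's real detectors are far more sophisticated, but this shows the kind of findings (secrets and PII) a notebook scan surfaces.

```python
import re

# Illustrative patterns only -- these are NOT NB Defense's actual rules.
PATTERNS = {
    "AWS access key ID": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "Generic API key assignment": re.compile(
        r"(?i)\b(api[_-]?key|token|secret)\s*=\s*['\"][^'\"]{8,}['\"]"
    ),
    "Email address (PII)": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_source(source: str) -> list[tuple[str, str]]:
    """Return (finding_type, matched_text) pairs for one code cell."""
    findings = []
    for name, pattern in PATTERNS.items():
        for match in pattern.finditer(source):
            findings.append((name, match.group(0)))
    return findings

cell = 'api_key = "sk-abcdefgh12345678"\ncontact = "jane.doe@example.com"'
for kind, text in scan_source(cell):
    print(f"{kind}: {text}")
```

A real scanner additionally uses entropy checks and named-entity recognition to cut down on false positives, which simple regexes alone cannot do.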

NB Defense Features

Contextual Guidance within JupyterLab

With the NB Defense JupyterLab Extension, you can leverage rich contextual help to identify problem areas within your Notebook. This feature streamlines the security review process by pinpointing areas of concern directly in-line within the notebook.

Learn more

Advanced Repository Scanning

The NB Defense CLI is designed to scan an entire Git repository or folders containing notebooks, enabling you to use NB Defense's full security capabilities outside of a Jupyter environment.
The CLI can also run as a pre-commit hook or be integrated into Continuous Integration (CI) systems, keeping security checks in your everyday development workflow.
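As a sketch of the pre-commit integration, a local hook entry in `.pre-commit-config.yaml` might look like the following. The `nbdefense scan` invocation is an assumption based on the CLI's scan command; check the NB Defense documentation for the exact configuration.

```yaml
repos:
  - repo: local
    hooks:
      - id: nbdefense
        name: NB Defense notebook scan
        entry: nbdefense scan   # assumes the `nbdefense` CLI is on PATH
        language: system
        files: \.ipynb$
```

Using a `repo: local` hook keeps the configuration self-contained: it runs whatever `nbdefense` version is installed in your environment rather than pinning a remote hook repository.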

Learn more

CVE Identification

The JupyterLab Extension includes a first-to-market CVE scanner that inspects the code dependencies imported by your notebook and installed in your Python kernel. This gives you a clear view of the security risks your notebook's dependencies may introduce.
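The first step of any such scan is knowing which packages a notebook actually imports. As an illustrative sketch (not NB Defense's implementation), the standard-library `ast` module can extract top-level module names from a notebook's code cells; mapping those names to installed versions and known CVEs is the scanner's job.

```python
import ast
import json

def imported_modules(notebook_json: str) -> set[str]:
    """Collect top-level module names imported by a notebook's code cells."""
    nb = json.loads(notebook_json)
    modules: set[str] = set()
    for cell in nb.get("cells", []):
        if cell.get("cell_type") != "code":
            continue
        source = "".join(cell.get("source", []))
        try:
            tree = ast.parse(source)
        except SyntaxError:  # skip cells with magics or partial code
            continue
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                modules.update(alias.name.split(".")[0] for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module:
                modules.add(node.module.split(".")[0])
    return modules

nb = json.dumps({
    "cells": [
        {"cell_type": "code",
         "source": ["import numpy as np\n",
                    "from sklearn.linear_model import LinearRegression\n"]},
        {"cell_type": "markdown", "source": ["# notes\n"]},
    ]
})
print(sorted(imported_modules(nb)))  # ['numpy', 'sklearn']
```

Scanning imports from the notebook itself, rather than only a `requirements.txt`, catches dependencies that reach the kernel through other routes.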

Learn more

Customizable Scanning Options

The JupyterLab Extension and CLI can both be easily configured to scan for specific types of secrets, PII, and third-party licenses. This allows you to set the appropriate sensitivity of the scan and tailor the security review process to your specific needs.
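As a purely illustrative sketch of what such configuration could look like, the fragment below enables or disables individual scan types. Every key name here is hypothetical; consult the NB Defense documentation for the real settings schema.

```toml
# Hypothetical settings -- key names are illustrative only.
[secrets]
enabled = true

[pii]
enabled = true
confidence_threshold = "medium"

[licenses]
enabled = true
accepted = ["Apache-2.0", "MIT", "BSD-3-Clause"]
```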

Learn more

Protect AI and the MLSecOps Community

Protect AI is committed to developing and promoting tools that help the ML community adopt security best practices. As part of this commitment, we have open sourced NB Defense under the Apache 2.0 license. We believe that by working together, we can create a safer and more secure environment for all users of ML systems.

We welcome contributions to NB Defense as Pull Requests and GitHub Issues.
We invite you to join our MLSecOps Slack community, where you can hear from experts in the field, share ideas, and collaborate.