Welcome
Welcome to the documentation for Datalayer, your platform for running Remote Jupyter Kernels with GPU or CPU power, directly from your favorite local Data Science IDEs such as JupyterLab and VS Code, or even from the CLI.
Datalayer helps you scale your Data Science workflows without the need to change your existing code. Our platform is designed to seamlessly integrate into your workflows and supercharge your computations with the processing power you need.
Whether you're a Data Scientist, AI Engineer, or Researcher, Datalayer enhances your productivity and performance by providing powerful Remote Kernel capabilities.
Go to the Join Page and request an invitation for Free Credits! 🚀
This documentation will guide you through the platform's features and help you get started with your first Remote Kernel.
📄️ Environments
In the context of Datalayer, an Environment refers to the collection of Python libraries and resource requirements (hardware such as CPU and memory) that your Jupyter Kernels use to execute your code. We offer a range of predefined environments to help you quickly leverage powerful GPU and CPU capabilities, enabling you to focus on your Data Science projects without unnecessary setup time.
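For example, once a Kernel is running in a given Environment, you can inspect what it provides from a notebook cell. The snippet below is a minimal sketch using only standard Python introspection; the `torch` check is an assumption and only applies to Environments that happen to ship PyTorch.

```python
import os
import platform
import sys

# Python version and platform of the Remote Kernel
print("Python:", sys.version.split()[0], "on", platform.platform())

# CPU cores as seen by the kernel process
print("CPU cores:", os.cpu_count())

# GPU check: assumes the Environment ships PyTorch (e.g. a GPU Environment);
# adapt to your framework of choice if it does not.
try:
    import torch
    print("CUDA available:", torch.cuda.is_available())
except ImportError:
    print("PyTorch is not installed in this Environment")
```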
📄️ Kernels
Datalayer introduces Remote Kernels to seamlessly scale your Data Science and AI workflows.
📄️ Content
Content is where you store and manage your data.
📄️ Use from JupyterLab
Datalayer integrates seamlessly as a JupyterLab extension, allowing you to manage and interact with Remote Kernels directly from your familiar JupyterLab interface. The extension mirrors the functionality available on Datalayer once you log in.
📄️ Use from VS Code
Go to the Kernel's action menu and select the Copy link for IDE action. This copies a link to your clipboard that you can then use in VS Code.
📄️ Use from CLI
Datalayer provides a Command Line Interface (CLI) that allows you to list, create, and terminate Remote Kernels, open a console, and run Notebooks and Python files on a Remote Kernel.
📄️ Credits
The usage of Remote Kernels is based on a credit consumption model. When you open an account, you begin with a set of free credits to get you started. Credits are consumed while Remote Kernels are running.
📄️ Secrets
You can securely store and manage secrets for your Kernels in the "Secrets" tab in the settings section. The secrets will be injected into all Remote Kernels as environment variables. The environment variable name will be the secret name.
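For instance, a secret saved in the "Secrets" tab can be read inside a Remote Kernel like any other environment variable. The snippet below is a minimal sketch; `OPENAI_API_KEY` is a hypothetical secret name used only for illustration.

```python
import os

# A secret named "OPENAI_API_KEY" (hypothetical name) in the Secrets tab
# is exposed to the Remote Kernel as an environment variable of the same name.
api_key = os.environ.get("OPENAI_API_KEY")

if api_key is None:
    print("Secret not found: check the name in the Secrets tab")
else:
    print("Secret loaded, length:", len(api_key))
```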
📄️ Examples
Enjoy pre-baked examples in this GitHub repository.