Quick Start Guide
This guide will walk you through installing the Neuronum SDK, deploying your first Agent with Neuronum Server, and making your first API call.
Protocol Note: The Neuronum SDK is powered by an end-to-end encrypted communication protocol based on public/private key pairs derived from a randomly generated 12-word mnemonic. All data is relayed through neuronum.net, providing secure communication without the need to set up public web servers or expose your infrastructure to the public internet.
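To illustrate the idea of deriving keys from a mnemonic, here is a conceptual sketch of a BIP-39-style seed derivation using only the Python standard library. This is NOT Neuronum's actual key scheme (the SDK handles key derivation internally); it only shows how a 12-word phrase can deterministically yield cryptographic key material:

import hashlib
import unicodedata

def mnemonic_to_seed(mnemonic: str, passphrase: str = "") -> bytes:
    """Derive a 64-byte seed from a mnemonic phrase (BIP-39-style PBKDF2).

    Illustrative only: Neuronum's real derivation may differ.
    """
    norm = unicodedata.normalize("NFKD", mnemonic)
    salt = "mnemonic" + unicodedata.normalize("NFKD", passphrase)
    # 2048 rounds of PBKDF2-HMAC-SHA512, as in BIP-39
    return hashlib.pbkdf2_hmac("sha512", norm.encode(), salt.encode(), 2048)

seed = mnemonic_to_seed("legal winner thank year wave sausage worth useful legal winner thank yellow")
print(len(seed))  # 64 bytes of deterministic key material

Because the derivation is deterministic, the same 12 words always reproduce the same keys, which is why the mnemonic must be kept secret and backed up.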
⚠️ Development Status: The Neuronum SDK is currently in early stages of development and is not production-ready. It is intended for development, testing, and experimental purposes only. Do not use in production environments or for critical applications.
Requirements
- Python >= 3.8
- CUDA-compatible GPU (for Neuronum Server)
- CUDA Toolkit (for Neuronum Server)
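Before installing, you can sanity-check the requirements above with a small script. This is an informal sketch, not an official Neuronum tool; checking for nvidia-smi on the PATH is only a rough proxy for a usable CUDA GPU:

import shutil
import sys

def check_environment() -> list:
    """Return a list of problems with the local environment (illustrative)."""
    problems = []
    if sys.version_info < (3, 8):
        problems.append("Python >= 3.8 required, found %s" % sys.version.split()[0])
    # nvidia-smi on PATH loosely indicates a CUDA-capable GPU (Neuronum Server only)
    if shutil.which("nvidia-smi") is None:
        problems.append("nvidia-smi not found: CUDA GPU required for Neuronum Server")
    return problems

for issue in check_environment():
    print("WARNING:", issue)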
Step 1: Installation
Create and activate a virtual environment
python3 -m venv ~/neuronum-venv
source ~/neuronum-venv/bin/activate
Install the Neuronum SDK
pip install neuronum==2026.01.0.dev1
Note: Always activate this virtual environment (source ~/neuronum-venv/bin/activate) before running any neuronum commands.
Create a Neuronum Cell (your secure identity)
neuronum create-cell
Step 2: Deploy with Neuronum Server
Start the Neuronum Server to deploy your AI model as an agentic backend:
neuronum start-server
This command will:
- Clone the neuronum-server repository (if not already present)
- Create a Python virtual environment
- Install all dependencies (vLLM, PyTorch, etc.)
- Start the vLLM server in the background
- Launch the Neuronum Server
Note: The initial setup may take some time as it downloads and installs the required dependencies.
Step 3: Call Your Agent
Now that your server is running, you can interact with your Agent using kybercell (the official Neuronum Client) or build a custom client with the Neuronum Client API.
Python API
For programmatic access and integration into your applications:
import asyncio
from neuronum import Cell

async def main():
    async with Cell() as cell:
        cell_id = "id::cell"  # Target cell to communicate with

        # Send a prompt to your Agent
        prompt_data = {
            "type": "prompt",
            "prompt": "Explain what a black hole is in one sentence"
        }
        tx_response = await cell.activate_tx(cell_id, prompt_data)
        print(tx_response)

if __name__ == '__main__':
    asyncio.run(main())
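Since calls are relayed over the network, transient failures are possible. A generic retry wrapper like the one below can guard any async call, including activate_tx. Note that with_retries is a hypothetical helper written for this guide, not part of the Neuronum API; the flaky coroutine is a stand-in for a real network call:

import asyncio

async def with_retries(coro_factory, attempts: int = 3, base_delay: float = 0.5):
    """Retry an async call with exponential backoff.

    coro_factory is a zero-argument callable returning a fresh coroutine,
    e.g. lambda: cell.activate_tx(cell_id, prompt_data).
    """
    for attempt in range(attempts):
        try:
            return await coro_factory()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: propagate the last error
            await asyncio.sleep(base_delay * 2 ** attempt)

# Demo with a stand-in coroutine that fails once, then succeeds:
async def demo():
    calls = {"n": 0}
    async def flaky():
        calls["n"] += 1
        if calls["n"] < 2:
            raise ConnectionError("transient failure")
        return "ok"
    return await with_retries(flaky)

print(asyncio.run(demo()))  # → ok

A fresh coroutine must be created per attempt (hence the factory) because a coroutine object can only be awaited once.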
Next Steps
Now that you have your Agent up and running, explore more capabilities:
- Client API - Learn about knowledge management, tool management, and task scheduling
- Tools CLI - Create custom MCP-compliant tools to extend your Agent's functionality
- Server Configuration - Customize your server settings and model parameters
- E2EE Protocol - Learn how Neuronum keeps your data secure
Need Help? For more information, visit the GitHub repository or contact us.