Modular solutions for every MLOps challenge
From feature engineering to training, serving, and monitoring, you can embed some or all of the modules, depending on the needs of your AI/ML product, while sharing a common metadata core across them.
DKube modules can be made consistent with your product brand and color palette if needed. This gives you control over your MLOps offering to your customer, including field monitoring of AI/ML models and retraining. DKube runs wherever your data or environment is: on-prem, in the cloud, or at the edge.
Integrate DKube into your existing product
Can be made consistent with your brand's look and feel
Runs on-prem, in the cloud, or at the edge
Features can be customized
Any combination from the following modules:
Basic Workflow
Feature Engineering
IDEs
Training
Pipelines
Serving
Monitoring
Integrated with existing infrastructure
API-based interface to control and extend functionality
Plugins used to add further capabilities
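To illustrate how an API-driven platform can be extended through plugins, here is a minimal sketch of a plugin registry in Python. All names here are hypothetical for illustration only; they are not DKube's actual API.

```python
# Minimal sketch of a plugin registry, a common pattern for extending an
# API-driven platform. All names are hypothetical, not DKube's real API.

class PluginRegistry:
    """Maps capability names to callables registered at runtime."""

    def __init__(self):
        self._plugins = {}

    def register(self, name):
        """Decorator that registers a function under a capability name."""
        def decorator(fn):
            self._plugins[name] = fn
            return fn
        return decorator

    def run(self, name, *args, **kwargs):
        """Invoke a registered plugin by name."""
        if name not in self._plugins:
            raise KeyError(f"no plugin registered for '{name}'")
        return self._plugins[name](*args, **kwargs)


registry = PluginRegistry()

@registry.register("drift-check")
def drift_check(metric):
    # Placeholder logic: flag drift when the metric exceeds a threshold.
    return metric > 0.2

print(registry.run("drift-check", 0.35))
```

A product integration would register its own capabilities (monitoring checks, retraining triggers, and so on) against the platform's API in the same spirit, without modifying the core.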
Can operate in multiple environments
On-prem
Cloud
Edge
Simple migration between platforms
Look, feel, and workflow identical in all environments
Execute workloads based on required characteristics
Execution based on where the data is
Control execution for best cost/performance trade-off
K8s
Slurm
LSF
Spark
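A workload-placement request of this kind is often expressed declaratively. The fragment below is a sketch under stated assumptions: the field names are illustrative only and do not reflect DKube's actual schema.

```yaml
# Hypothetical placement spec; field names are illustrative only.
workload:
  name: train-model
  backend: slurm           # one of: k8s, slurm, lsf, spark
  placement:
    data_locality: on-prem # run where the data lives
    cost_priority: high    # prefer cheaper nodes over faster ones
  resources:
    gpus: 4
```

The scheduler would then match these characteristics (data locality, cost/performance priority, resource needs) against the available backends.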
Based on best-in-class open standards: Kubeflow, MLflow, and Kubernetes
Large development community
Rapid integration of new features
Reduced learning curve for users
DKube licensing options based on your business model
User counts
Hardware configuration
Capabilities/features
Number of models