DKube : Embeddable MLOps Engine for Your AI/ML Platform

Incorporate an MLOps workflow through DKube modules, from feature engineering to training to serving to monitoring. You can embed some or all of the modules, depending on the needs of your AI/ML product, while sharing a common metadata core across them.

DKube modules can be made consistent with your product brand and color palette if needed. This gives you control over your MLOps offering at your customer's site, including field monitoring and retraining of AI/ML models. DKube runs wherever your data and environment are: on-prem, in the cloud, or at the edge.

Integrate DKube into your existing product

  • Can be made consistent with your brand, look & feel
  • Runs on-prem, in the cloud, or at the edge
  • Features can be customized

Any combination of the following modules can be embedded (a minimal pipeline sketch follows the list):

  • Basic workflow
  • Feature Engineering
  • IDEs
  • Training
  • Pipelines
  • Serving
  • Monitoring
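
Because DKube is built on Kubeflow, embedded modules such as Training, Pipelines, and Serving can be chained together as pipeline steps. The sketch below uses the open-source Kubeflow Pipelines SDK (kfp v2) to show the pattern; the component names and bodies are illustrative placeholders, not DKube's actual components.

    # Minimal Kubeflow Pipelines (kfp v2) sketch: a training step feeding a
    # serving step. Component names and bodies are placeholders for illustration.
    from kfp import dsl, compiler

    @dsl.component
    def train_model(epochs: int) -> str:
        # Placeholder training logic; a real component would pull features,
        # train, and register the resulting model version.
        print(f"training for {epochs} epochs")
        return "model-v1"

    @dsl.component
    def deploy_model(model_name: str):
        # Placeholder serving logic; a real component would publish the model
        # to an inference endpoint.
        print(f"deploying {model_name}")

    @dsl.pipeline(name="train-and-serve")
    def train_and_serve(epochs: int = 10):
        trained = train_model(epochs=epochs)
        deploy_model(model_name=trained.output)

    if __name__ == "__main__":
        # Compile to a pipeline spec that any Kubeflow Pipelines backend can run.
        compiler.Compiler().compile(train_and_serve, "train_and_serve.yaml")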

Integrated with existing infrastructure

  • Schedule jobs on external clusters
    • Slurm, LSF, Spark, Remote Kubernetes
  • Multiple data sources
    • Oracle, MySQL, Redshift, Microsoft SQL Server
  • API-based interface to control and extend functionality (see the sketch after this list)
  • Plugins can be used to add further capabilities
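
The API-based interface can be driven from any language with an HTTP client. The snippet below is only a hedged sketch of the pattern: the base URL, endpoint path, payload fields, and token are assumptions made for illustration, not DKube's documented API.

    # Hypothetical sketch of controlling an MLOps engine over its REST API.
    # The URL, endpoint path, payload fields, and auth header are illustrative
    # assumptions, not DKube's documented interface.
    import requests

    DKUBE_URL = "https://dkube.example.com"   # assumed deployment URL
    API_TOKEN = "<access-token>"              # assumed access token

    def submit_training_job(name: str, image: str, script: str) -> dict:
        """Submit a training job and return the server's JSON response."""
        response = requests.post(
            f"{DKUBE_URL}/api/v1/training-jobs",              # assumed endpoint
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            json={"name": name, "image": image, "script": script},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        print(submit_training_job("demo-train",
                                  "tensorflow/tensorflow:2.15.0",
                                  "python train.py"))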

Can operate in multiple environments

  • On-prem
  • Cloud
  • Simple migration between platforms
  • Look, feel, & workflow identical in all environments

Execute workloads based on required characteristics

  • Execution based on where the data is
  • Control execution for the best cost/performance trade-off
  • Supported execution backends (a minimal dispatch sketch follows this list)
    • K8s, Slurm, LSF, Spark
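
The idea of routing a workload to the most suitable backend can be sketched as a simple dispatch: keep compute near the data and pick the scheduler that gives the best cost/performance for the job. The selection rules and file names below are illustrative assumptions, not DKube's internal logic; sbatch and kubectl are the standard Slurm and Kubernetes submission tools.

    # Hypothetical sketch: pick an execution backend based on data locality and
    # resource needs. Rules and spec file names are illustrative assumptions.
    import subprocess

    def choose_backend(data_location: str, needs_gpu: bool) -> str:
        # Keep compute next to the data; prefer the HPC scheduler when the data
        # and GPUs sit beside an on-prem Slurm cluster.
        if data_location == "on-prem-hpc" and needs_gpu:
            return "slurm"
        return "kubernetes"

    def submit(backend: str) -> None:
        if backend == "slurm":
            # Standard Slurm submission of a batch script.
            subprocess.run(["sbatch", "train_job.sbatch"], check=True)
        else:
            # Standard Kubernetes submission of a Job manifest.
            subprocess.run(["kubectl", "apply", "-f", "train_job.yaml"], check=True)

    if __name__ == "__main__":
        submit(choose_backend(data_location="on-prem-hpc", needs_gpu=True))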

Based on best-in-class open standards: Kubeflow, MLflow, Kubernetes (see the example after this list)

  • Large development community
  • Rapid integration of new features
  • Reduced learning curve for users
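
Because the stack is built on open standards, familiar open-source APIs carry over directly. For example, experiment parameters and metrics can be logged with the standard MLflow tracking API; the tracking URI and experiment name below are placeholder assumptions.

    # Standard MLflow tracking API: log parameters and metrics for a run.
    # The tracking URI and experiment name are placeholder assumptions.
    import mlflow

    mlflow.set_tracking_uri("http://mlflow.example.com:5000")  # assumed server
    mlflow.set_experiment("demo-experiment")

    with mlflow.start_run(run_name="baseline"):
        mlflow.log_param("learning_rate", 0.001)
        mlflow.log_param("epochs", 10)
        mlflow.log_metric("val_accuracy", 0.91)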

DKube licensing options based on your business model

  • User counts
  • Hardware configuration
  • Capabilities/features
  • Number of models