Reliable migration
Migrate your data to another event hub, make multiple copies of it, and restore as often as you need without adding load to your production environment:
Move data from open-source Apache Kafka to Confluent Cloud or another event hub.
Create a decoupled source for your dataset; Kannika Armory keeps your data safe.
Create the dataset once and restore it as many times as you need, without affecting the load on production.
Validate the replicated data and migrate the dataset to test environments.
Migrate from open-source Apache Kafka to Confluent, including schemas.
Migrate from your on-premises Confluent Platform to Confluent, including schemas.
Migrate from Azure Event Hubs to Redpanda, including schemas.
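The decoupled-source idea above can be sketched as: consume production once into a durable archive, then restore from that archive any number of times without ever touching the source again. This is a minimal illustration only; the record shape and the in-memory stand-in for a production topic are hypothetical, and a real setup would consume from Kafka via a client library instead.

```python
import json
import tempfile
from pathlib import Path

# Hypothetical stand-in for a production topic; a real pipeline would
# consume these records from Kafka rather than a Python list.
PRODUCTION_TOPIC = [
    {"key": "order-1", "value": {"amount": 42}},
    {"key": "order-2", "value": {"amount": 7}},
]

def backup(records, archive: Path) -> int:
    """Read the source once and write every record to a durable archive."""
    with archive.open("w") as f:
        for record in records:
            f.write(json.dumps(record) + "\n")
    return len(records)

def restore(archive: Path) -> list:
    """Rebuild the dataset from the archive; the source is never touched."""
    with archive.open() as f:
        return [json.loads(line) for line in f]

archive = Path(tempfile.mkdtemp()) / "orders.jsonl"
backup(PRODUCTION_TOPIC, archive)   # one read of production
first = restore(archive)            # restore into a test cluster
second = restore(archive)           # restore again; zero production load
assert first == second == PRODUCTION_TOPIC
```

Because every restore reads from the archive, production sees exactly one consumer pass no matter how many environments you populate.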
Features
Intuitive interface
Whether you prefer a GUI, a REST API, or Kubernetes CRDs, Kannika Armory has you covered.
All major event hubs supported
Whether you use Apache Kafka, Confluent, Amazon MSK, Apache Pulsar, Redpanda, or Azure Event Hubs, Kannika Armory supports them all.
Control and filter your restore process
Restore your data with full control: filtering lets you define exactly which subset of data to restore.
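The filtering idea can be sketched as a predicate applied to archived records before they are restored. The record shape and the function below are hypothetical and do not reflect Kannika Armory's actual filter syntax; they only show how a topic and timestamp filter narrows the restored set.

```python
from dataclasses import dataclass

# Hypothetical archived-record shape for illustration purposes.
@dataclass
class Record:
    topic: str
    key: str
    timestamp_ms: int

def filtered_restore(records, *, topic: str, since_ms: int) -> list:
    """Select only the archived records that match the restore filter."""
    return [r for r in records
            if r.topic == topic and r.timestamp_ms >= since_ms]

archive = [
    Record("orders", "a", 1_000),
    Record("orders", "b", 5_000),
    Record("payments", "c", 6_000),
]

# Restore only "orders" records produced after timestamp 2000 ms.
subset = filtered_restore(archive, topic="orders", since_ms=2_000)
assert [r.key for r in subset] == ["b"]
```

The same archive can back many filtered restores, each targeting a different slice of the data.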
Environment cloning with schema mapping
When cloning your dataset to a new cluster or event hub, schema IDs are mapped to match the target schema registry.
Deploy anyplace, anytime
Kannika Armory is Kubernetes-native, so you can deploy it anywhere Kubernetes runs: any cloud (AWS, Google Cloud, or Azure), on-premises, or even on your own PC.