What is Kafka?
Kafka is an open-source, distributed message streaming platform that uses a publish-subscribe mechanism to stream records. Confluent offers Apache Kafka as a fully managed service on major cloud platforms such as Microsoft Azure, AWS, and Google Cloud.
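To make the publish-subscribe model concrete, here is a minimal sketch using the confluent-kafka Python client. The broker address `localhost:9092` and the topic `demo-topic` are placeholder assumptions, not values from this walkthrough.

```python
# Minimal publish/subscribe sketch with the confluent-kafka client
# (pip install confluent-kafka). Broker and topic names are placeholders.
from confluent_kafka import Producer, Consumer

BOOTSTRAP = "localhost:9092"   # assumption: a reachable Kafka broker
TOPIC = "demo-topic"           # assumption: the topic already exists

# Publish: send one record to the topic.
producer = Producer({"bootstrap.servers": BOOTSTRAP})
producer.produce(TOPIC, key="user-1", value="hello kafka")
producer.flush()  # block until the record is delivered

# Subscribe: read records back from the same topic.
consumer = Consumer({
    "bootstrap.servers": BOOTSTRAP,
    "group.id": "demo-group",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
msg = consumer.poll(timeout=10.0)   # returns None if nothing arrives in time
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())   # keys and values arrive as bytes
consumer.close()
```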
How to Register for Confluent Kafka
You can register for Confluent Kafka on confluent.io.
If you're new to Confluent Kafka, sign up through the Get Started page at https://www.confluent.io/get-started/.
Confluent offers Kafka in two forms: Cloud and Software.
- Cloud: A fully managed Apache Kafka® service available on major cloud platforms such as Microsoft Azure, AWS, and Google Cloud
- Software: A self-managed, enterprise-grade distribution of Apache Kafka®
New signups receive a 30-day trial with a $400 credit limit for exploring Confluent Apache Kafka (CAK).
The login process is shown in the screenshot steps below.
The Get Started page redirects you to a login URL with multiple sign-in options, as shown in the screenshot below.
Once the login credentials are validated, the home screen is displayed as shown below.
For a free trial account, a default environment is created by the CAK service at the time of account creation.
Here are the important menu items to know before diving into CAK services.
| S. No. | Menu Name | Description |
| --- | --- | --- |
| 1 | Environments | Kafka clusters and related components such as Connect, ksqlDB, and Schema Registry live inside an environment. In Confluent Cloud, you can create multiple environments for a single organization, and adding more environments is free. Different divisions or teams can use separate environments so they don't interfere with one another. |
| 2 | Accounts & Access | Manage user accounts and service accounts in this section. |
| 3 | Bill Payments | Confluent Cloud charges are based on the resources your cloud organization consumes. This includes the total amount of data transferred into and out of your cluster, as well as the request overhead incurred by the Kafka protocol. |
| 4 | Cloud API Keys | Access to Confluent Cloud components and services is controlled by Confluent Cloud API keys. Each API key consists of a key and a secret, and can be owned by a user account or a service account; a client configuration sketch using an API key and secret appears after this table. |
| 5 | Metrics | The Confluent Metrics Reporter collects multiple metrics from an Apache Kafka® cluster. |
| 6 | Single sign-on | Confluent Cloud supports single sign-on (SSO) through your existing SAML-based identity provider. |
SSO provides access control across multiple independent software platforms. It lets enterprise users log in to numerous, unrelated systems with a single user ID and password, so businesses no longer need to store and manage passwords in company databases. SSO also enhances security while reducing support and troubleshooting issues related to individual logins.
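To illustrate how an API key is used in practice, the sketch below wires a key and secret into a Kafka client configuration for a Confluent Cloud cluster. The bootstrap endpoint, API key, and secret are placeholders; substitute the values issued for your own cluster.

```python
# Hedged example: authenticate a producer to a Confluent Cloud cluster
# with an API key and secret. All values below are placeholders.
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # placeholder endpoint
    "security.protocol": "SASL_SSL",      # Confluent Cloud requires TLS + SASL
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<CLUSTER_API_KEY>",     # the "key" half of the API key
    "sasl.password": "<CLUSTER_API_SECRET>",  # the "secret" half
}

producer = Producer(conf)
producer.produce("demo-topic", value="authenticated hello")  # assumes the topic exists
producer.flush()
```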
We can easily onboard a new environment in the CAK portal, as shown in the screenshot below.
Selecting a Stream Governance package is mandatory; it governs the data streams moving throughout your cloud environments.
Select the cloud provider and region where you want the environment's Schema Registry and Stream Catalog to run and its metadata to be stored.
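Once the environment's Schema Registry is provisioned, clients reach it over HTTPS with a Schema Registry API key and secret. The sketch below is a minimal, hedged example using the confluent-kafka Python package; the endpoint URL, credentials, and subject name are placeholder assumptions.

```python
# Minimal sketch: list subjects and register an Avro schema in the
# environment's Schema Registry. URL and credentials are placeholders.
from confluent_kafka.schema_registry import SchemaRegistryClient, Schema

sr = SchemaRegistryClient({
    "url": "https://psrc-xxxxx.us-east-1.aws.confluent.cloud",   # placeholder endpoint
    "basic.auth.user.info": "<SR_API_KEY>:<SR_API_SECRET>",      # Schema Registry API key:secret
})

print(sr.get_subjects())   # subjects already registered in this environment

avro_schema = Schema(
    '{"type": "record", "name": "Click", "fields": [{"name": "url", "type": "string"}]}',
    schema_type="AVRO",
)
schema_id = sr.register_schema("clicks-value", avro_schema)  # hypothetical subject name
print("registered schema id:", schema_id)
```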
Once the development environment is created, its detailed information is displayed as shown in the screenshot below.
Now click on the 'Create cluster on my own' button.
Limits and features by cluster type are shown in the screenshot below and documented at:
https://docs.confluent.io/cloud/current/clusters/cluster-types.html#ccloud-features-and-limits-by-cluster-type
While creating a cluster, we need to choose a cloud provider to deploy our Kafka services, along with the region and availability zones shown below.
Once the configuration is complete, the full cluster configuration details are shown before launch; see the screenshot below for reference.
Configuration cost, usage limits, and uptime SLA are shown before launching the new cluster.
Confluent Kafka guarantees a 99.5% uptime SLA for the Basic cluster, and we can upgrade from a Basic cluster to a Standard cluster for better performance.
We have now configured the CAK cluster through the steps above; click the Launch button to onboard the new cluster.
With this basic configuration in place, we are ready to dive into CAK.
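As a quick sanity check that the newly launched cluster is reachable, the sketch below uses the AdminClient from the confluent-kafka Python package to list brokers and topics. The bootstrap endpoint and API key/secret are placeholders for the values shown on the cluster's settings page.

```python
# Hypothetical connectivity check against the newly launched cluster.
# Replace the endpoint and credentials with your cluster's actual values.
from confluent_kafka.admin import AdminClient

admin = AdminClient({
    "bootstrap.servers": "pkc-xxxxx.us-east-1.aws.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<CLUSTER_API_KEY>",
    "sasl.password": "<CLUSTER_API_SECRET>",
})

metadata = admin.list_topics(timeout=10)  # raises if the cluster is unreachable
print("brokers:", [b.id for b in metadata.brokers.values()])
print("topics:", list(metadata.topics.keys()))
```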
More promo codes are available for developers.