10 November 2023

Kafka: a real-world use case

Let's consider a real-world use case for Kafka involving a simple e-commerce platform. In this scenario, Kafka is employed to enable communication and data flow between different components of the system.

Real-World Use Case: E-commerce Order Processing

Components:

  1. Order Service (Producer):
    • The Order Service is responsible for receiving and processing customer orders. After processing an order, it produces order events to a Kafka topic called orders.
  2. Inventory Service (Consumer):
    • The Inventory Service is responsible for managing the inventory of products. It consumes order events from the orders topic to update the available quantities of products in the inventory.
  3. Shipping Service (Consumer):
    • The Shipping Service is responsible for handling the shipment of orders. It consumes order events from the orders topic to initiate the shipping process.

Workflow:

  1. Customer Places an Order:
    • A customer places an order on the e-commerce platform. The Order Service receives the order details, processes the order, and generates an order event.
  2. Order Event Production (Producer - Order Service):
    • The Order Service produces an order event to the Kafka orders topic. The order event includes information such as the order ID, customer details, and items in the order.
  3. Inventory Service Updates Inventory (Consumer - Inventory Service):
    • The Inventory Service is subscribed to the orders topic. It consumes the order event and updates the inventory by reducing the quantities of the ordered items.
  4. Shipping Service Initiates Shipment (Consumer - Shipping Service):
    • The Shipping Service is also subscribed to the orders topic. It consumes the order event and initiates the shipment process, preparing the order for delivery.
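The order event exchanged in steps 2 through 4 can be sketched as a small JSON payload. The field names here (orderId, customerName, items) are illustrative assumptions that match the code examples later in this note, not a fixed schema:

```python
import json

# Hypothetical order event; field names are illustrative, not a fixed schema.
order_event = {
    "orderId": 123,
    "customerName": "John Doe",
    "items": ["ProductA", "ProductB"],
}

# The producer serializes the event to bytes before publishing it to the
# 'orders' topic; each consumer deserializes it back into a dict.
encoded = json.dumps(order_event).encode("utf-8")
decoded = json.loads(encoded.decode("utf-8"))
assert decoded == order_event
```

Because both consumers read the same bytes, the producer and consumers only need to agree on this payload shape, not on each other's internals.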

Benefits of Kafka in this Use Case:

  1. Asynchronous Communication:
    • Kafka enables asynchronous communication between different services. The Order Service doesn't need to wait for the Inventory and Shipping services to complete their tasks. It can produce the order event to Kafka and continue processing other orders.
  2. Fault Tolerance and Scalability:
    • Kafka provides fault tolerance and scalability. If a consumer service (Inventory or Shipping) is temporarily unavailable, Kafka retains the order events for the topic's configured retention period, so they can still be processed when the service comes back online.
  3. Real-time Updates:
    • The Inventory and Shipping services receive updates about new orders in near real time. This keeps the inventory current and lets the shipping process start promptly.
  4. Loose Coupling:
    • The services are loosely coupled through Kafka. The Order Service doesn't need to be aware of the internals of the Inventory or Shipping services, and vice versa. They communicate through the shared orders topic.
  5. Scalable Architecture:
    • As the order processing load increases, each service can be independently scaled. For example, if the order volume is high, more instances of the Inventory and Shipping services can be added to handle the load.
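The delivery semantics behind these benefits can be illustrated without a broker: every consumer group receives every event on a topic, while instances within one group split the work between them. The following is a minimal in-memory sketch of those semantics only, not of Kafka itself (round-robin stands in for Kafka's actual partition assignment):

```python
from collections import defaultdict

def deliver(events, groups):
    """Sketch of Kafka fan-out: groups maps a group name to its instance count.
    Every group sees every event; within a group, each event goes to one instance."""
    received = {g: defaultdict(list) for g in groups}
    for i, event in enumerate(events):
        for group, instances in groups.items():
            # Round-robin is a stand-in for Kafka's partition assignment.
            received[group][i % instances].append(event)
    return received

events = ["order-1", "order-2", "order-3", "order-4"]
out = deliver(events, {"inventory-group": 2, "shipping-group": 1})

# Each group, taken as a whole, saw all four events.
for group in out:
    seen = [e for msgs in out[group].values() for e in msgs]
    assert sorted(seen) == sorted(events)
```

This is why the Inventory and Shipping services can use the same `orders` topic without stealing events from each other: they subscribe under different group IDs.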

This is a simplified example, but it illustrates how Kafka can be used in a real-world scenario to enable efficient communication and data flow between different components of a distributed system, such as those found in an e-commerce platform.

Below are simplified examples in Node.js and Python demonstrating the order processing use case with Kafka (using the kafka-node and kafka-python client libraries). The examples cover the Order Service (Producer), Inventory Service (Consumer), and Shipping Service (Consumer).

Node.js Example:

Order Service (Producer):

const kafka = require('kafka-node');
const Producer = kafka.Producer;
const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });
const producer = new Producer(client);

producer.on('ready', () => {
    // Simulate order placement
    const order = {
        orderId: 123,
        customerName: 'John Doe',
        items: ['ProductA', 'ProductB'],
    };

    const payloads = [
        { topic: 'orders', messages: JSON.stringify(order) },
    ];

    producer.send(payloads, (err, data) => {
        if (err) {
            console.error(err);
        } else {
            console.log('Order event sent:', data);
        }
    });
});

producer.on('error', (err) => {
    console.error('Error in producer:', err);
});

// Close producer on process termination (Producer.close takes only a callback)
process.on('SIGINT', () => {
    producer.close(() => {
        process.exit();
    });
});

Inventory Service (Consumer):

const kafka = require('kafka-node');
const Consumer = kafka.Consumer;
const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });
// Note: kafka-node's plain Consumer reads the given partition directly; for
// broker-coordinated group balancing across instances, use kafka.ConsumerGroup instead.
const consumer = new Consumer(client, [{ topic: 'orders', partition: 0 }], { groupId: 'inventory-group' });

consumer.on('message', (message) => {
    const order = JSON.parse(message.value);

    // Simulate updating inventory
    console.log(`Updating inventory for order ${order.orderId}: ${order.items}`);
});

// Close consumer on process termination
process.on('SIGINT', () => {
    consumer.close(true, () => {
        process.exit();
    });
});

Shipping Service (Consumer):

const kafka = require('kafka-node');
const Consumer = kafka.Consumer;
const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });
const consumer = new Consumer(client, [{ topic: 'orders', partition: 0 }], { groupId: 'shipping-group' });

consumer.on('message', (message) => {
    const order = JSON.parse(message.value);

    // Simulate initiating shipment
    console.log(`Initiating shipment for order ${order.orderId}`);
});

// Close consumer on process termination
process.on('SIGINT', () => {
    consumer.close(true, () => {
        process.exit();
    });
});

Python Example:

Order Service (Producer):

from kafka import KafkaProducer
import json

producer = KafkaProducer(bootstrap_servers='localhost:9092')

# Simulate order placement
order = {
    'orderId': 123,
    'customerName': 'John Doe',
    'items': ['ProductA', 'ProductB'],
}

producer.send('orders', value=json.dumps(order).encode('utf-8'))

# Flush buffered messages before closing; send() alone only enqueues them
producer.flush()
producer.close()

Inventory Service (Consumer):

from kafka import KafkaConsumer
import json

consumer = KafkaConsumer('orders', group_id='inventory-group', bootstrap_servers='localhost:9092')

# Consume order events and simulate updating inventory
try:
    for message in consumer:
        order = json.loads(message.value.decode('utf-8'))
        print(f"Updating inventory for order {order['orderId']}: {order['items']}")
finally:
    # The loop above blocks indefinitely; close the consumer on exit
    consumer.close()

Shipping Service (Consumer):

from kafka import KafkaConsumer
import json

consumer = KafkaConsumer('orders', group_id='shipping-group', bootstrap_servers='localhost:9092')

# Consume order events and simulate initiating shipment
try:
    for message in consumer:
        order = json.loads(message.value.decode('utf-8'))
        print(f"Initiating shipment for order {order['orderId']}")
finally:
    # The loop above blocks indefinitely; close the consumer on exit
    consumer.close()

In these examples, the Order Service produces an order event to the orders topic, and both the Inventory Service and Shipping Service consume events from the same topic to perform their respective tasks. Please note that these examples are simplified for illustration purposes, and in a real-world scenario, you would likely have more sophisticated error handling, logging, and business logic.
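One detail these examples gloss over: Kafka only guarantees ordering within a partition, so in practice events for the same order should share a message key (for example the order ID). The default partitioner maps a key to a partition by hashing it; real Kafka uses murmur2, and the md5 hash below is only a stand-in to show the idea:

```python
import hashlib

NUM_PARTITIONS = 3  # assumed partition count for the 'orders' topic

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Deterministically map a message key to a partition.
    Kafka's default partitioner uses murmur2; md5 here is only a stand-in."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# All events for the same order land on the same partition,
# so their relative order is preserved for consumers.
assert partition_for("order-123") == partition_for("order-123")
assert 0 <= partition_for("order-456") < NUM_PARTITIONS
```

With kafka-python this would mean passing `key=str(order['orderId']).encode('utf-8')` to `producer.send()`, so that, say, an "order placed" and a later "order cancelled" event for the same order are consumed in the order they were produced.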