Streaming
Reliable Message Persistence and Processing at Scale
FlowMQ's streaming feature provides a robust, scalable solution for message persistence and stream processing. Built on an append-only log architecture with cloud-native storage, FlowMQ streams offer the reliability of Kafka-style messaging while seamlessly integrating with MQTT and other protocols through unified topics.
Whether you're building real-time analytics pipelines, event sourcing systems, or need to persist IoT sensor data for later analysis, FlowMQ streams provide the foundation for reliable, scalable message storage and processing.
What is a FlowMQ Stream?
A FlowMQ stream is an append-only log designed to store messages reliably and efficiently. Unlike traditional message queues that delete messages after consumption, streams preserve all messages, creating a durable record of events that can be replayed, analyzed, and processed multiple times by different consumers.
Key Characteristics
- Append-Only Architecture: Messages are only added to the end of the log, never modified or deleted
- Immutable Storage: Once written, messages remain unchanged, ensuring data integrity
- Ordered Delivery: Messages maintain their original order within each partition
- Persistent Storage: Messages are stored durably on object storage systems like AWS S3
- Scalable Design: Streams can handle high-throughput scenarios with horizontal partitioning
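The append-only model above can be sketched in a few lines. The following is a minimal in-memory illustration (not FlowMQ's actual storage engine): each appended message receives a monotonically increasing offset, nothing is ever modified or deleted, and any consumer can replay from any offset.

```python
# Minimal in-memory sketch of an append-only log.
# Illustration only -- not FlowMQ's actual storage engine.
class AppendOnlyLog:
    def __init__(self):
        self._entries = []  # entries are never modified or deleted

    def append(self, message):
        """Add a message to the end of the log; return its offset."""
        self._entries.append(message)
        return len(self._entries) - 1

    def read_from(self, offset):
        """Replay all messages starting at the given offset."""
        return self._entries[offset:]

log = AppendOnlyLog()
log.append("temp=25.5")
log.append("temp=26.1")

# Two independent consumers can replay the same data:
print(log.read_from(0))  # both messages
print(log.read_from(1))  # only the second
```

Because reads never remove data, many consumers with different offsets can process the same stream concurrently.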
Cloud-Native Storage Architecture
FlowMQ streams leverage object storage systems like AWS S3, Google Cloud Storage, and Azure Blob Storage for message persistence. This design provides several advantages:
Storage Benefits
- Cost-Effective: Object storage is significantly cheaper than traditional block storage
- Unlimited Scalability: No storage limits, scales automatically with your data volume
- High Durability: 99.999999999% (11 9's) durability with automatic replication
- Global Accessibility: Messages accessible from any region or availability zone
- Built-in Backup: Natural disaster recovery with cross-region replication
Kafka Compatibility
For Kafka users, a FlowMQ stream functions exactly like a Kafka topic. This compatibility means you can use your existing Kafka tools, libraries, and applications with FlowMQ streams without any modifications.
Complete Kafka API Compatibility
Kafka Features Supported
- Producers and Consumers: Use standard Kafka client libraries
- Consumer Groups: Multiple consumers can process messages in parallel
- Offset Management: Track consumption progress with Kafka-style offsets
- Partitioning: Messages distributed across multiple partitions for parallelism
- Replication: Use object storage like AWS S3 with high durability and fault tolerance
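Kafka-style offset management amounts to a committed-offset map per consumer group and partition. The sketch below is a hypothetical illustration of that bookkeeping (FlowMQ handles it server-side through the Kafka protocol; the class and method names here are ours):

```python
# Sketch of Kafka-style offset tracking per consumer group.
# Illustration only -- real commits go through the Kafka protocol.
class OffsetStore:
    def __init__(self):
        self._committed = {}  # (group, partition) -> next offset to read

    def commit(self, group, partition, offset):
        self._committed[(group, partition)] = offset

    def position(self, group, partition):
        # Unknown groups start from the beginning,
        # like auto_offset_reset='earliest'.
        return self._committed.get((group, partition), 0)

store = OffsetStore()
store.commit("analytics", 0, 42)
print(store.position("analytics", 0))  # resumes at 42
print(store.position("billing", 0))    # independent group starts at 0
```

Because offsets are tracked per group, each consumer group processes the full stream at its own pace without affecting the others.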
Stream Partitioning
Like Kafka topics, FlowMQ streams support partitioning to enable parallel processing and horizontal scaling:
Partition Benefits
- Parallel Processing: Multiple consumers can process different partitions simultaneously
- Ordered Delivery: Messages within a partition maintain strict ordering
- Scalable Throughput: Add partitions to increase processing capacity
- Load Distribution: Messages distributed evenly across partitions
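Key-based partition assignment can be sketched as a stable hash over the message key. The sketch below uses CRC32 purely for illustration (Kafka's default partitioner actually uses murmur2), but the property is the same: equal keys always land in the same partition, which is what preserves per-key ordering.

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    """Stable key -> partition mapping. CRC32 is used here for
    illustration; Kafka's default partitioner uses murmur2."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# The same sensor always maps to the same partition, so its
# readings stay ordered even as partitions are processed in parallel:
print(partition_for("sensor-1", 8) == partition_for("sensor-1", 8))  # True
```

Note that changing the partition count changes the mapping, which is why partition counts are usually chosen with headroom up front.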
Using Kafka Clients
You can use FlowMQ streams with any Kafka client library:
```java
// Java Kafka Producer Example
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

Properties props = new Properties();
props.put("bootstrap.servers", "your-project.flowmq.cloud:9093");
props.put("security.protocol", "SASL_SSL");
props.put("sasl.mechanism", "PLAIN");
props.put("sasl.jaas.config",
    "org.apache.kafka.common.security.plain.PlainLoginModule required " +
    "username=\"your-username\" password=\"your-password\";");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

KafkaProducer<String, String> producer = new KafkaProducer<>(props);
producer.send(new ProducerRecord<>("my-stream", "sensor-1", "temperature: 25.5°C"));
producer.close();
```

```python
# Python Kafka Consumer Example
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'my-stream',
    bootstrap_servers='your-project.flowmq.cloud:9093',
    security_protocol='SASL_SSL',
    sasl_mechanism='PLAIN',
    sasl_plain_username='your-username',
    sasl_plain_password='your-password',
    auto_offset_reset='earliest'
)

for message in consumer:
    print(f"Received: {message.value.decode('utf-8')}")
```
Topic Filter Binding
One of FlowMQ's most powerful features is the ability to bind streams to topic filters. This allows you to automatically capture and persist messages that match specific topic patterns, making it incredibly useful for MQTT message persistence and later analysis.
How Topic Filter Binding Works
When creating a stream, you specify one or more topic filters. Any message published to a topic that matches one of those filters is automatically captured and appended to the stream.
MQTT Message Persistence
Topic filter binding is particularly valuable for MQTT message persistence. MQTT is designed for real-time communication, but often you need to store messages for historical analysis, compliance, or offline processing.
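Which messages a filter captures follows standard MQTT wildcard semantics: `+` matches exactly one topic level, while `#` matches all remaining levels. A small matcher sketch (the matching rules come from the MQTT specification; the function name here is ours):

```python
def topic_matches(filter_: str, topic: str) -> bool:
    """Standard MQTT topic-filter matching: '+' matches exactly one
    level, '#' matches all remaining levels (and must come last)."""
    f_levels = filter_.split("/")
    t_levels = topic.split("/")
    for i, f in enumerate(f_levels):
        if f == "#":
            return True          # multi-level wildcard: match everything below
        if i >= len(t_levels):
            return False         # topic is shorter than the filter
        if f != "+" and f != t_levels[i]:
            return False         # literal level mismatch
    return len(f_levels) == len(t_levels)

print(topic_matches("sensors/+/temperature", "sensors/device1/temperature"))  # True
print(topic_matches("sensors/#", "sensors/device1/humidity"))                 # True
print(topic_matches("sensors/+/temperature", "sensors/device1/humidity"))     # False
```

For example, binding a stream to `sensors/#` would persist every message published under the `sensors/` hierarchy, regardless of device or measurement type.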
MQTT to Stream Pipeline
Messages published over MQTT flow through the broker, are checked against each stream's topic filters, and every match is appended to the bound stream, where it is durably stored for later consumption.
Use Cases for MQTT Persistence
1. IoT Data Analytics
- Collect temperature, humidity, motion data
- Perform batch analytics on historical data
- Generate insights and trends
- Create predictive models
2. Compliance and Auditing
- Track device control commands
- Maintain regulatory compliance
- Provide audit trails for security
- Enable forensic analysis
3. Offline Processing
- Process high-volume sensor data
- Generate daily/weekly reports
- Perform complex aggregations
- Handle spike loads efficiently
Benefits of FlowMQ Streaming
1. Unified Protocol Support
- Store messages from MQTT, Kafka, AMQP, and NATS in the same stream
- Process multi-protocol data with unified tooling
- Eliminate protocol-specific storage solutions
2. Cost-Effective Storage
- Object storage pricing (pennies per GB per month)
- No expensive SAN or NFS storage required
- Automatic compression and optimization
3. Infinite Scalability
- No storage limits or capacity planning needed
- Automatic scaling based on data volume
- Global replication and disaster recovery
4. Developer Productivity
- Use familiar Kafka tools and libraries
- Automatic MQTT message persistence
- No complex ETL pipelines required
5. Operational Simplicity
- Managed storage and replication
- Built-in monitoring and alerting
- Zero-downtime scaling and maintenance
FlowMQ streams provide the foundation for building robust, scalable event-driven architectures that seamlessly bridge real-time and batch processing workflows across multiple messaging protocols.