Hi,
Hope you are doing great!
We have an urgent requirement with one of our clients. Please review the job description below and let me know if you are interested.
Role: Data (Apache Flink) Engineer
Location: Chicago, IL
Local to Illinois only
Experience: 7 years
We are seeking an onsite resource who can lead Kafka-based initiatives from architecture and
design through delivery and production rollout; onsite presence is required to ensure strong
collaboration.
This individual will serve as the technical lead for streaming projects and must bring deep,
practical expertise in Confluent Flink, in addition to the broader Kafka ecosystem, with the
following expectations:
• Expert-level experience architecting and implementing Flink applications on the
Confluent Platform, specifically for high-volume, low-latency stream processing.
• Extensive experience architecting, implementing, and administering the Confluent
Cloud Kafka and Flink platform in production environments.
• Advanced proficiency in core Flink concepts including state management
(Keyed/Operator State, RocksDB), Exactly-Once semantics, and configuring
checkpointing and savepoints for fault tolerance.
• Deep knowledge of Event Time processing, Watermarks (Bounded Out-of-Orderness),
and complex Windowing (Tumbling, Sliding, Session) for accurate
stream analytics.
• Advanced knowledge of ksqlDB and Kafka Streams for rapid development of real-time
stream processing/analytics alongside Flink.
• Proven proficiency in Kafka Connectors (including Change Data Capture/CDC) from
configuration to end-to-end integration in cloud environments.
• Demonstrated experience applying Flink and Kafka in the Retail Industry for use
cases such as real-time inventory management, dynamic pricing, fraud detection,
and personalized customer experience (e.g., clickstream analysis).
• Strong background in platform governance: schema registry, RBAC, audit logging,
retention, and compliance.
• Deep expertise with Terraform and the Confluent Terraform provider; adherence to
Infrastructure-as-Code (IaC) methodology and automation.
• Practical experience designing and managing Harness CI/CD pipelines (or similar
tools) for automated deployment and configuration management of Flink jobs.
• Advanced knowledge of GCP networking, including Private Service Connect (PSC),
DNS, Firewalls, and enterprise security.
• Track record in implementing cloud-native monitoring and observability solutions;
troubleshooting, Flink performance tuning, and incident response.
• Thorough experience with Disaster Recovery (DR), High Availability (HA) strategies,
backup/restore, and multi-region design.
• Practical experience with cost optimization, resource monitoring, and right-sizing
specifically for Flink and Kafka resources in Confluent Cloud.
• Strong abilities in schema management, version compatibility, and data governance.
• Demonstrated capability in capacity planning, partitioning, and scaling
high-throughput streaming architectures.
• Experience with Agile/DevOps methodologies.
• Experience providing hands-on production support for mission-critical streaming
platforms.
Thanks & Regards,
Vysakh Menon
Talent Acquisition Manager
Aditi LLC