The future is here

"There has been a revolution; organizations must become real time; to become real time, they must become event driven." Journey to event driven - Part 1

Our Consulting Packages

Architecture Workshops (2 days)

Streaming architectures are different. Whether you are building data pipelines or event-driven microservices, they all require a dataflow model constructed on sound principles that separate the core flow from the observability, operations and control planes. This two-day session covers: technology strategy & selection, solution architecture, Kafka architecture, high-level data flow modelling, DevOps and automation. (tailor using services below)

Tooling & Implementation Accelerator (4-8 weeks)

Tooling of streaming platforms is different. We can be engaged to plan, adopt and implement your technology strategy, including cloud, automation, DevOps, build, application frameworks (Quarkus, Micronaut), distributed logging and CI/CD pipelines. (tailor using services below)

Residency (6-8 weeks)

PoC implementation and planning sessions covering event storming, domain-driven design, data flow, operations and modelling. Additionally, technical design for data-centric functions such as data governance, data lineage, data classification, events-as-an-API and more. (tailor using services below)


Our Services

Architecture workshops

Covering event-driven design, Kafka, the central nervous system pattern, event modelling, operations, DevOps, CI/CD and multi-cloud, these workshops build a comprehensive development plan and blueprint. We will work with you to tailor the content.

Confluent Platform

We are experts with comprehensive knowledge of the Confluent streaming platform. This includes deployment planning, technology selection and scaling. Some activities may include Confluent Professional Services consultants as part of our partnership arrangement with Confluent.

Open Source Apache Kafka

We also provide design and architecture services for open-source Apache Kafka variants, including pure Apache Kafka, Confluent Community and others. This covers Kafka core, Kafka Connect and Kafka Streams, as well as open-source monitoring with Grafana and other tools.

Apache Kafka Build versus Buy

We work with you to evaluate open-source versus vendor-based alternatives, covering the core technology and surrounding tooling, mapping them to the solution architecture while capturing development and operational concerns.

Data modelling

Event storming and domain-driven design to capture the data flows of domain and technical events. We define events as APIs and address operational concerns such as data evolution and observability.

Proof-of-Concept

We scope, design and agree the overall plan while delivering on-site or remotely. This accelerator helps with common scenarios such as data pipelines, event-driven microservices or a central nervous system. More advanced PoCs include stream-processing shoot-outs, technology selection (e.g. Quarkus, GraalVM) and tooling.

Operational architecture

We capture and design operational concerns including deployment, upgrade and recovery. Implementation can include dead-letter queues, log aggregation, filtering and alerting using Kafka Streams and other tooling.
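As an illustration of the dead-letter pattern mentioned above, here is a minimal, framework-free Java sketch: records that fail processing are diverted to a dead-letter sink instead of halting the flow. The class and method names are hypothetical; in a real Kafka deployment the sink would be a dedicated dead-letter topic, typically wired in via a branch in the Kafka Streams topology or an exception handler.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Hypothetical sketch of the dead-letter pattern (names are illustrative,
// not from any specific library): failed records are captured, not dropped.
public class DeadLetterExample {

    record FailedRecord(String payload, String error) {}

    static List<FailedRecord> deadLetters = new ArrayList<>();
    static List<Integer> processed = new ArrayList<>();

    // Attempt to process each record; route failures to the dead-letter sink
    // so one bad record does not stop the rest of the stream.
    static void consume(List<String> records, Function<String, Integer> handler) {
        for (String record : records) {
            try {
                processed.add(handler.apply(record));
            } catch (Exception e) {
                deadLetters.add(new FailedRecord(record, e.getMessage()));
            }
        }
    }

    public static void main(String[] args) {
        consume(List.of("1", "2", "oops", "4"), Integer::parseInt);
        System.out.println("processed=" + processed);          // [1, 2, 4]
        System.out.println("deadLetters=" + deadLetters.size()); // 1
    }
}
```

The key design choice is that the failure path is modelled as data: the dead-letter sink can then feed the alerting and filtering flows described above.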

Instrumentation and Control plane

Develop streaming applications for observability of business metrics and KPIs, providing complete end-to-end visibility, including tooling selection.

CI / CD pipeline design

Learn how to use the latest tools and techniques with Kafka: replaying production incidents for regression scenario analysis, feature-flag patterns and zero-downtime deployment.

Snippets from our latest work


A startup specializing in Apache Kafka and Cloud

Our goal is to make your real-time project successful. We fill the gaps where expertise is hard to find, helping you realise your vision and get it done.

Nice to hear from you

Feel free to arrange a local or remote meeting, or to ask about pricing, availability or anything else!

Our Code Journal

Hooking up DynamoDB using Quarkus

Quarkus is proving very practical for building things fast. It's a shame the current support only uses the latest experimental AWS drivers.

S3 uploads now working with logscape-ng

Uploading to S3 is now working; the next step is to wire in DynamoDB for the metastore.
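For wiring in DynamoDB, a Quarkus application typically just needs client configuration in application.properties. The fragment below is a sketch based on the Quarkus Amazon DynamoDB extension's documented property prefix; exact property names vary by extension version, so treat these as assumptions to verify against your version's guide.

```properties
# Sketch: Quarkus DynamoDB client configuration (verify property names
# against your quarkus-amazon-dynamodb extension version).
quarkus.dynamodb.aws.region=eu-west-1
quarkus.dynamodb.aws.credentials.type=default

# For local development against a DynamoDB Local container:
# quarkus.dynamodb.endpoint-override=http://localhost:8000
```

With the client configured, the SDK's DynamoDbClient can be injected into the metastore service rather than constructed by hand.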

Quarkus support for CDI and RestEasy make development seem too easy

Java development has become metadata-driven. The amount of annotation-based magic is overwhelming, yet terrifyingly simple.