Work locations
Berlin (+200 km)
Germany: possible
Projects
1 year, 1 month
2023-11 - present
Architecture and design for an IoT and data platform in e-mobility
Architecture and Design
Rust, TypeScript, Python
- Refactor and redesign the edge computing platform for fleet charging of electric vehicles, covering metering, monitoring and smart charging
- Implement real-time telematics ingestion with Kafka (see the sketch after this entry)
- Redesign data processing workflows, migrating batch ETL to stream processing
- Design a data platform for stream processing with a dynamic querying model for metering, telematics and charge-point data
- Analyse requirements for a high-frequency logging and metering platform
- Conceptualise and design a data processing platform based on Kafka, Kubernetes and Elasticsearch, running on Azure services
Elasticsearch, Kafka, Kubernetes, Azure
Utilities/GreenTech
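A minimal sketch of the telematics ingestion step described in this entry, assuming a kafka-python producer; the broker address, topic name and payload fields are illustrative assumptions, not details from the project.

```python
# Sketch: publish telematics readings to Kafka with kafka-python.
# Broker address, topic name and payload fields are assumptions for illustration.
import json
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",           # assumed broker address
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    acks="all",                                   # wait for full acknowledgement
)

def publish_telemetry(vehicle_id: str, soc: float, power_kw: float) -> None:
    """Publish one reading, keyed by vehicle so readings stay ordered per vehicle."""
    event = {
        "vehicle_id": vehicle_id,
        "state_of_charge": soc,
        "charging_power_kw": power_kw,
        "ts": int(time.time() * 1000),
    }
    producer.send("vehicle-telematics", key=vehicle_id, value=event)

if __name__ == "__main__":
    publish_telemetry("EV-001", soc=0.72, power_kw=11.0)
    producer.flush()
```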
6 months
2022-04 - 2022-09
Analyse requirements for a high-frequency trading platform
Architect / Tech Lead
Conceptualise and design a data processing platform based on Kafka, Spark and Elasticsearch, running on AWS services
Implement a high-frequency time-series MVP with QuestDB for market data analysis (see the sketch after this entry)
Lemon markets, Berlin
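A sketch of how market ticks could be pushed into QuestDB over the InfluxDB Line Protocol (TCP, default port 9009). The table and column names are illustrative assumptions; the actual MVP may well have used the official client library instead of a raw socket.

```python
# Sketch: write market ticks into QuestDB via InfluxDB Line Protocol over TCP.
# Table and column names are assumptions for illustration.
import socket
import time

def send_tick(sock: socket.socket, symbol: str, price: float, size: int) -> None:
    # ILP row format: table,tag-columns field-columns timestamp-in-nanoseconds
    ts_ns = time.time_ns()
    line = f"trades,symbol={symbol} price={price},size={size}i {ts_ns}\n"
    sock.sendall(line.encode("utf-8"))

with socket.create_connection(("localhost", 9009)) as sock:
    send_tick(sock, "AAPL", 189.42, 100)
    send_tick(sock, "AAPL", 189.44, 250)
```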
1 year, 4 months
2021-01 - 2022-04
Analyse requirements for a high-frequency payment platform
Architect / Tech Lead
Implement a data processing platform based on Kafka, Spark, Elasticsearch, Postgres and Redshift Spectrum, running on AWS services
Evaluate distributed tracing with Jaeger in the existing ELK stack (see the sketch after this entry)
PAYONE, Ratingen
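A minimal sketch of the kind of instrumentation such a Jaeger evaluation might start with, using the OpenTelemetry Python SDK with the (now legacy) Jaeger Thrift exporter. The service name, span names, agent host/port and the payment-processing functions are assumptions for illustration only.

```python
# Sketch: export spans to a Jaeger agent with the OpenTelemetry Python SDK
# (packages: opentelemetry-sdk, opentelemetry-exporter-jaeger).
# Service name, agent host/port and span names are illustrative assumptions.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.jaeger.thrift import JaegerExporter

provider = TracerProvider(resource=Resource.create({"service.name": "payment-gateway"}))
provider.add_span_processor(
    BatchSpanProcessor(JaegerExporter(agent_host_name="localhost", agent_port=6831))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)

def process_payment(payment_id: str) -> None:
    # One parent span per payment, one child span for the downstream authorisation call.
    with tracer.start_as_current_span("process-payment") as span:
        span.set_attribute("payment.id", payment_id)
        with tracer.start_as_current_span("authorise"):
            pass  # downstream call would go here

process_payment("pay-42")
```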
9 months
2021-03 - 2021-11
Analyse requirements for a financial transaction processing platform
Architect / Tech Lead
Conceptualise and design a secure processing platform based on Kafka and Kubernetes on AWS
Implement ingestion with Kafka and Spark
Implement an ELK setup in AWS with Logstash and CloudWatch integration (a simplified indexing sketch follows this entry)
Assess implementation effort and cost
PPRO, München
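A deliberately simplified sketch of getting Kafka events into Elasticsearch for an ELK setup like the one above, using a plain Python consumer rather than the Logstash/Spark path named in the entry. Topic, index, consumer group and field names are assumptions.

```python
# Sketch: consume transaction events from Kafka and index them into Elasticsearch
# (packages: kafka-python, elasticsearch v8). All names are illustrative assumptions.
import json

from kafka import KafkaConsumer
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")
consumer = KafkaConsumer(
    "transactions",
    bootstrap_servers="localhost:9092",
    group_id="es-indexer",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    enable_auto_commit=False,
)

for message in consumer:
    # Use partition/offset as a deterministic document id so re-processing stays idempotent.
    doc_id = f"{message.partition}-{message.offset}"
    es.index(index="transactions", id=doc_id, document=message.value)
    consumer.commit()
```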
9 months
2020-09 - 2021-05
Conception of cloud migration strategies
Architect / Management Consulting
Conceptualise cloud migration strategies for a group IT organisation with 300 FTEs and 800+ applications
Develop workflows and processes along DevOps and fast-flow principles to increase implementation speed and shorten time to market
Evaluate cloud platforms (GCP, AWS, Azure) as CI/CD and deployment targets for applications
RWEST GmbH, Essen/Swindon
1 year, 7 months
2018-10 - 2020-04
Architecture, system design, development
Tech Lead / Design Authority
Implement an event-driven stream processor for high-frequency events on Kafka/Spark, writing to S3 (see the sketch after this entry)
Implement a Redshift Spectrum/Glue Crawler based data lake for events with continuous schema evolution
Assess requirements and design cloud-based data processing Docker components with AWS ECS
Implement distributed logging with the ELK stack, integrating Prometheus, CloudWatch and log metrics in Kibana
Design and implement service deployment automation with Terraform
Implement Python-based applications with AWS Lambda
Implement data pipelines with AWS Kinesis and AWS SQS
Migrate existing data infrastructures from relational databases and classic Redshift to the new data lake
ClearScore Technologies Ltd., London
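A sketch of a Kafka-to-S3 stream processor of the kind described in the first bullet, written here with Spark Structured Streaming in PySpark. The topic, bucket, schema and trigger interval are illustrative assumptions; the original pipeline may have been structured differently.

```python
# Sketch: read high-frequency events from Kafka and land them on S3 as partitioned Parquet
# with Spark Structured Streaming. Topic, bucket and schema are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, to_date
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-to-s3").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .withColumn("dt", to_date(col("occurred_at")))   # daily partition column
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-data-lake/events/")                 # assumed bucket
    .option("checkpointLocation", "s3a://example-data-lake/checkpoints/events/")
    .partitionBy("dt")
    .trigger(processingTime="1 minute")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```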
1 month
2019-09 - 2019-09
Analyse requirements for a high-frequency trading platform
Architect / Tech Lead
Conceptualise and design a data processing platform based on Kafka, Spark, Elasticsearch, Postgres and Redshift Spectrum, running on AWS services; assess implementation effort and cost
RWE AG, Essen/Swindon
1 year, 4 months
2017-06 - 2018-09
Architecture, system design, development
Tech Lead / Design Authority
Implement a microservice-centric, event-sourced platform for a paper publication system
Assess requirements and design cloud-based data processing Docker components with AWS ECR
Design and implement service deployment automation with Terraform
Develop a concept for migrating the monolith to microservices using the strangler pattern
Implement components with TDD on Java 8 with Kinesis, Elasticsearch, PostgreSQL and AWS Lambda
Implement log distribution with Filebeat and Logstash using grok patterns
Implement log and metrics dashboards in Kibana with Elasticsearch
Implement Python-based applications with AWS Lambda
Implement data pipelines with AWS Kinesis and AWS SQS (see the sketch after this entry)
Refactor and migrate microservices to Docker-centric deployments
Elsevier, Oxford
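A small sketch of a Python AWS Lambda handler for a Kinesis event source, matching the Python/Lambda and Kinesis pipeline bullets above. The payload contents and the downstream action are assumptions; only the Kinesis-to-Lambda event shape is standard.

```python
# Sketch: AWS Lambda handler for a Kinesis event source.
# Kinesis delivers records base64-encoded inside the Lambda event; payload fields are assumptions.
import base64
import json

def handler(event, context):
    """Decode Kinesis records delivered to Lambda and process them one by one."""
    processed = 0
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        process(payload)
        processed += 1
    return {"processed": processed}

def process(payload: dict) -> None:
    # Placeholder for the real work, e.g. writing to PostgreSQL or Elasticsearch.
    print(payload)
```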
7 months
2017-04 - 2017-10
Architecture, system design, development
Lead Data Architect
Assess requirements and design a cloud-based data processing platform
Design and implement an integrated data platform for adidas digital services based on AWS components such as EMR (Hadoop, Spark) and Elasticsearch
Introduce and implement Apache Kafka as a message platform (see the sketch after this entry)
Design and implement a Terraform- and Ansible-based immutable architecture
Adidas, Herzogenaurach
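A sketch of one routine step when introducing Kafka as a message platform: provisioning topics programmatically, here with kafka-python's admin client. Topic names, partition counts and the replication factor are illustrative assumptions.

```python
# Sketch: provision Kafka topics with kafka-python's admin client.
# Topic names, partition counts and replication factor are assumptions for illustration.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers="localhost:9092", client_id="platform-setup")

topics = [
    NewTopic(name="consumer-events", num_partitions=12, replication_factor=3),
    NewTopic(name="order-events", num_partitions=6, replication_factor=3),
]
admin.create_topics(new_topics=topics, validate_only=False)
admin.close()
```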
8 months
2016-10 - 2017-05
Architecture, system design, development
Lead Data Engineer
Assess requirements and design a cloud-based data processing platform
Design and implement service deployment automation with AWS CloudFormation and Ansible
Introduce and implement Apache Kafka as a message platform
Implement a high-volume, near-real-time data processing pipeline from RDBMS binlog, JMS and Kafka data sources into Redshift and AWS S3 using Apache Spark Streaming on Scala
Implement real-time geo-tracking applications for vehicles with Elasticsearch and Spark (see the sketch after this entry)
Flixbus, Berlin
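A sketch of the query side of a geo-tracking application on Elasticsearch, using the Python client's v8-style API: find the latest vehicle positions within a radius of a point. The index name, field names and coordinates are assumptions.

```python
# Sketch: geo_distance query for vehicle positions with the Elasticsearch Python client (v8 API).
# Index name, field names and coordinates are illustrative assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Latest positions of vehicles within 5 km of a given point (e.g. a bus stop in Berlin).
response = es.search(
    index="vehicle-positions",
    query={
        "bool": {
            "filter": [
                {"geo_distance": {
                    "distance": "5km",
                    "location": {"lat": 52.5200, "lon": 13.4050},
                }}
            ]
        }
    },
    sort=[{"timestamp": {"order": "desc"}}],
    size=50,
)
for hit in response["hits"]["hits"]:
    print(hit["_source"]["vehicle_id"], hit["_source"]["location"])
```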
5 months
2016-06 - 2016-10
Architecture, system design, development
Lead Data Engineer
Assess and optimise the processing infrastructure
Implement a real-time anomaly detection mechanism for current and voltage meters with Spark Streaming and Apache Kafka on Scala
Implement a Cassandra/Spark based aggregation platform for voltage and current meters for a domestic utility provider
Implement a high-volume (1M records/sec) time-series archive on top of Cassandra
Optimise Cassandra access and keyspace design (see the sketch after this entry)
Implement a Play Framework based REST API for high-volume queries
Siemens Energy AG, Nuremberg
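A sketch of one common Cassandra keyspace-design choice for meter time series: bucketing the partition key by (meter, day) so partitions stay bounded while a day of readings for one meter remains a single-partition read. Keyspace, table and column names, replication settings and the sample data are assumptions, shown here with the Python cassandra-driver.

```python
# Sketch: time-bucketed Cassandra schema for high-volume meter readings (cassandra-driver).
# Keyspace/table names, replication settings and sample values are assumptions.
from datetime import datetime, timezone

from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS metering
    WITH replication = {'class': 'NetworkTopologyStrategy', 'datacenter1': 3}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS metering.readings (
        meter_id text,
        day date,
        ts timestamp,
        voltage double,
        current double,
        PRIMARY KEY ((meter_id, day), ts)
    ) WITH CLUSTERING ORDER BY (ts DESC)
""")

insert = session.prepare(
    "INSERT INTO metering.readings (meter_id, day, ts, voltage, current) VALUES (?, ?, ?, ?, ?)"
)
now = datetime.now(timezone.utc)
session.execute(insert, ("meter-001", now.date(), now, 229.8, 15.2))
```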
11 months
2015-12 - 2016-10
System design, development
Data Engineer
Implement probabilistic identification mechanisms for app and web users in profiling/tracking records with Spark on Scala
Implement a large-scale Cassandra/Spark based aggregation platform for web and app user reporting and analytics (see the sketch after this entry)
Implement a Play Framework based REST API for high-volume queries
Optimise Cassandra access and keyspace design
Implement high-volume SSTable-based bulk loading for large Cassandra clusters
Implement automated pipelines with AWS Elastic MapReduce (EMR) and Spark
Implement session and cookie restoration mechanisms for user tracking records
GfK, Berlin, Nuremberg
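A simplified sketch of the aggregation side of such a reporting platform, as a PySpark batch job of the kind run on EMR: a daily roll-up of tracking records per site. The input path, schema and output location are assumptions; the original platform was Scala/Cassandra based.

```python
# Sketch: daily aggregation of tracking records for user reporting (PySpark batch job on EMR).
# Input path, schema and output location are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, countDistinct, to_date

spark = SparkSession.builder.appName("tracking-aggregation").getOrCreate()

events = spark.read.parquet("s3://example-tracking/events/")   # assumed input location

daily = (
    events.withColumn("dt", to_date(col("event_ts")))
    .groupBy("dt", "site_id")
    .agg(
        countDistinct("user_id").alias("unique_users"),
        count("*").alias("events"),
    )
)
daily.write.mode("overwrite").partitionBy("dt").parquet("s3://example-reporting/daily/")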
1 month
2016-06 - 2016-06
Analyse requirements for a data processing platform
Architect / Tech Lead
Conceptualise and design a data processing platform based on Hadoop, Spark and Elasticsearch; assess implementation effort and cost
Develop a concept for multi-cloud deployments on GCP and AWS
Baumarktdirekt/Otto Group, Hamburg