Freelancer / self-employed
Remote work
Available from: 29.10.2024
Availability: 100%
of which on-site: 10%
Top Skills
Software development
Software architecture
(GCP) Google Cloud Platform
Java
Spring Boot
AWS (Amazon Web Services)
Microservices
Python
Kubernetes
Big Data
Data-Engineering
Kafka
Grafana & Prometheus
Terraform
Ansible
Hadoop
OpenShift
Docker
GitLab & GitHub CI/CD
Jenkins & Artifactory
HashiCorp Vault & Packer & Consul
Linux / RHEL / Debian / Ubuntu
ETL
Work locations
Cities
Frankfurt am Main (+50km)
Köln (+50km)
Stuttgart (+50km)
Mannheim (+50km)
München (+10km)
Hamburg (+10km)
Düsseldorf (+20km)
Nürnberg (+20km)
Essen, Ruhr (+20km)
Mainz (+20km)
Karlsruhe (Baden) (+50km)
Würzburg (+50km)
Duisburg (+20km)
Wien (+10km)
Salzburg (Austria) (+10km)
Fulda (+20km)
Kassel, Hessen (+20km)
Countries
Germany, Austria
Projects
Role
Senior Developer & Architect
Project description
Further design and development of a custom cloud-based, multisource data ingestion system. Implementation of a monitoring and alerting solution.
Tech-Stack:
Google Cloud (BigQuery, Cloud Functions / Run / Build / Storage, Compute Engine, PubSub, Airflow, Dataflow, GKE / K8s)
Java, Spring Boot, Python, shell scripting
Oracle, SAP HANA, MSSQL, MySQL, PostgreSQL, Exasol
Kafka, ActiveMQ, Apache Beam
REST, SFTP, SMB
Kubernetes (K8s), Docker
Grafana, Prometheus, Ansible, Vagrant
FastAPI
Terraform, GitHub Enterprise, HashiCorp Vault
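Illustrative sketch (not project code; topic, project, and table names are placeholders): a minimal Apache Beam streaming step that reads JSON messages from Pub/Sub and appends them to a BigQuery table, the kind of building block such an ingestion system is composed of.

# Illustrative sketch only, not project code; all resource names are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read raw bytes from a Pub/Sub topic (placeholder name).
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/ingest-events")
            # Decode and parse each message as JSON.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append the parsed records to an existing BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:staging.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()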
Products
Google Cloud Platform
SAP HANA
Confluent Kafka
Kubernetes
Grafana
Prometheus
HashiCorp Vault
Exasol
BigQuery
Dataflow
Skills
Java
Spring Boot
Python
Google Cloud
Kafka
ActiveMQ
K8s
Docker
Grafana
Prometheus
Ansible
Terraform
Role
Senior Cloud Data-Lake Developer & Data Engineer
Project description
Development of a custom cloud-based, multisource data ingestion system.
Migration of large-scale data processing pipelines from an on-premises Hadoop cluster to a cloud-based data lake.
Tech-Stack:
Google Cloud (BigQuery, Dataflow, PubSub, Cloud Functions / Build / Storage, Logging, GKE, App Engine, KMS etc.)
Hadoop, Exasol, MSSQL, Oracle
Kafka / Kafka-Connect, ActiveMQ, Apache Beam
AWS (SNS, SQS)
OpenShift
Java, Spring Boot, Python
Terraform, GitHub Enterprise, Bitbucket
JSON, Protobuf, Avro, XML
Shell scripting, SQL, Power BI
DataVault
Control-M
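Illustrative sketch (not project code; bucket, dataset, and table names are placeholders): loading an Avro export, for example data staged from an on-premises Hadoop cluster into Cloud Storage, into a BigQuery table, a typical step in such a migration.

# Illustrative sketch only, not project code; all resource names are placeholders.
from google.cloud import bigquery


def load_avro_to_bigquery(uri: str, table_id: str) -> None:
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.AVRO,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    # Kick off a load job from Cloud Storage and wait for it to finish.
    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()
    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}")


if __name__ == "__main__":
    load_avro_to_bigquery(
        "gs://example-staging-bucket/exports/customers/*.avro",
        "example-project.staging.customers",
    )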
Products
Terraform
BMC Control-M
EXASolution
GCP
Google App Engine
Skills
Google Cloud
BigQuery
Hadoop
AWS
SNS
SQS
Kafka
Kafka-Connect
OpenShift
Java
SQL
Python
Spring-Framework
Shell-Script
MS SQL Server
Oracle
ActiveMQ
Protobuf
Avro
Terraform
Dataflow
PubSub
Cloud Functions
Cloud Build
Cloud Storage
GKE
Cloud Logging
KMS
Project description
Design and implementation of OpenShift-based automation solutions for microservices, Kafka, and monitoring.
Tech stack:
- OpenShift
- Docker
- Helm
- Kafka
- GitLab CI/CD
- Prometheus, Grafana
- Microservices, REST
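Illustrative sketch (not project code; release, chart, and namespace names are placeholders): a small Python wrapper around the Helm CLI that rolls out a chart, for example a microservice or a Kafka component, into an OpenShift/Kubernetes namespace.

# Illustrative sketch only, not project code; all names are placeholders.
import subprocess


def helm_upgrade_install(release: str, chart: str, namespace: str,
                         values_file: str) -> None:
    # Install or upgrade the release and wait until the rollout completes.
    subprocess.run(
        [
            "helm", "upgrade", "--install", release, chart,
            "--namespace", namespace,
            "--values", values_file,
            "--wait",
        ],
        check=True,  # raise if the rollout fails
    )


if __name__ == "__main__":
    helm_upgrade_install("orders-service", "charts/orders-service",
                         "team-namespace", "values/prod.yaml")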
Skills
OpenShift
Docker
Helm
Kafka
GitLab CI/CD
REST
Monitoring
Microservices
Prometheus
Grafana
Role
Senior Consultant - AWS Cloud & DevOps
Project description
Automation of large-scale, AWS-based cloud infrastructure for the energy sector.
Tech Stack:
- AWS (EC2, RDS, ECR, Lambda, CloudFormation, SSM, S3, SNS)
- Ansible, AWX
- GitLab CI/CD
- Docker, Kubernetes
- Python, Shell, PowerShell
- Monitoring (ELK, Observium)
- Linux / Windows
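Illustrative sketch (not project code; the tag key and value are placeholders): finding EC2 instances by tag with boto3 and stopping them, the kind of small building block such automation is composed of.

# Illustrative sketch only, not project code; tag key/value are placeholders.
import boto3


def stop_instances_by_tag(tag_key: str, tag_value: str) -> None:
    ec2 = boto3.client("ec2")
    # Look up running instances that carry the given tag.
    response = ec2.describe_instances(
        Filters=[
            {"Name": f"tag:{tag_key}", "Values": [tag_value]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )
    instance_ids = [
        instance["InstanceId"]
        for reservation in response["Reservations"]
        for instance in reservation["Instances"]
    ]
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
        print(f"Stopping: {instance_ids}")


if __name__ == "__main__":
    stop_instances_by_tag("Environment", "dev")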
Products
AWX
Linux
Windows
Skills
AWS
Amazon EC2
AWS RDS
AWS Lambda
AWS SSM
Ansible
GitLab CI/CD
Automation
Docker
Role
Senior Consultant - Infrastructure Automation
Project description
Supporting the operations team with the automation of infrastructure configuration for a RHEL landscape and the automated deployment of Splunk. Automation of VMware-based infrastructure management.
Tech stack:
RHEL, Splunk, VMware, Docker, Ansible, PowerShell / PowerCLI scripting, Python, Git
(project via accredia GmbH)
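Illustrative sketch (not project code; inventory, playbook, and group names are placeholders): triggering an Ansible playbook run against the RHEL hosts of an inventory, the kind of step used for automated configuration rollouts.

# Illustrative sketch only, not project code; file and group names are placeholders.
import subprocess


def run_playbook(playbook: str, inventory: str, limit: str) -> None:
    # Run the playbook against the selected host group and fail loudly on error.
    subprocess.run(
        ["ansible-playbook", "-i", inventory, playbook, "--limit", limit],
        check=True,
    )


if __name__ == "__main__":
    run_playbook("site.yml", "inventory/production.ini", "rhel_servers")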
Products
RHEL
Splunk
VMware
Docker
OpenShift
Skills
Ansible
PowerShell
Python
Location
Frankfurt am Main
Project description
DevOps Architect in a large project with internationally distributed teams at the Digital Factory Division:
Solution design and development of continuous integration infrastructure, Linux customization, containerization, and automation.
Tech stack:
- Docker
- Jenkins
- Python
- Ansible
- Kubernetes
- Artifactory
- HashiCorp Vault
- HashiCorp Packer
- Vagrant
- SonarQube
- Grafana
- LXC / LXD
- systemd-nspawn
- GitLab, TFS
- Shell and PowerShell scripting
- VMware ESXi
- PowerCLI automation
- Debian
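Illustrative sketch (not project code; the Vault address, mount point, and secret path are placeholders): reading a secret from HashiCorp Vault (KV v2) so CI jobs do not need credentials baked into their configuration.

# Illustrative sketch only, not project code; address and paths are placeholders.
import os

import hvac


def read_ci_secret(path: str) -> dict:
    client = hvac.Client(
        url=os.environ.get("VAULT_ADDR", "https://vault.example.internal:8200"),
        token=os.environ["VAULT_TOKEN"],
    )
    # Fetch the latest version of the secret from the KV v2 mount.
    response = client.secrets.kv.v2.read_secret_version(
        path=path, mount_point="secret"
    )
    return response["data"]["data"]


if __name__ == "__main__":
    credentials = read_ci_secret("ci/artifactory")
    print(sorted(credentials.keys()))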
Products
Docker
Jenkins
Python
Ansible
HashiCorp Vault
Packer
Vagrant
Artifactory
Unix Shell Scripting
PowerShell
SonarQube
TFS
Kubernetes
systemd-nspawn
Git
Debian
GitLab
Grafana
VMware ESXi
PowerCLI Automation
LXC / LXD
Client
Siemens AG - Digital Factory Division
Location
Frankfurt am Main
Role
Solution Architect & Big Data Developer
Project description
Solution design and development of a high-performance Big Data streaming and analytics system.
Tech-Stack:
- Storm
- Kafka
- Hive
- Hadoop
- Oozie
- Pig
- ZooKeeper
- HDP (Hortonworks Data Platform)
- Python
- Java
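Illustrative sketch (not project code; broker address and topic name are placeholders): publishing JSON events to a Kafka topic with kafka-python, the entry point of a streaming analytics pipeline such as the one described above.

# Illustrative sketch only, not project code; broker and topic are placeholders.
import json

from kafka import KafkaProducer


def main() -> None:
    producer = KafkaProducer(
        bootstrap_servers="broker.example.internal:9092",
        # Serialize each event as UTF-8 encoded JSON.
        value_serializer=lambda value: json.dumps(value).encode("utf-8"),
    )
    producer.send("measurements", {"sensor_id": "placeholder-001", "value": 42.0})
    producer.flush()  # make sure the message is actually delivered
    producer.close()


if __name__ == "__main__":
    main()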
Products
Apache Storm
Apache Kafka
Apache Hive
Apache Hadoop
Apache Oozie
Apache Pig
ZooKeeper
Hortonworks Data Platform
Client
Siemens AG - Energy Management Division
Location
Frankfurt am Main
Education and training
Institution, location
Frankfurt University of Applied Sciences
Institution, location
The Academic College of Tel-Aviv-Yaffo
Competencies
Products / standards / experience / methods
Amazon EC2
Amazon Web Services
Apache Hadoop
Apache Hive
Apache Kafka
Apache Oozie
Apache Pig
Apache Storm
Artifactory
Automation
AWS
AWS SSM
AWX
BigQuery
BMC Control-M
Continuous Integration
Control-M
Debian
Docker
ESX VMware
Git
GitLab
GitLab CI/CD
Google Cloud
Grafana
Hadoop
HashiCorp Consul
HashiCorp Packer
HashiCorp Vault
Helm
Hortonworks Data Platform
InfluxDB
Jenkins
Kafka
Kafka-Connect
Kubernetes
Linux
LXC / LXD
Microservices
Monitoring
MySQL
nginx
OpenShift
Packer
PostgreSQL
PowerCLI
PowerCLI Automation
Prometheus
Red Hat Enterprise Linux
REST
RHEL 7
SNS
SonarQube
Splunk
Spring
SQS
systemd-nspawn
Terraform
TFS
Ubuntu
Vagrant
Virtualization
VMware
VMware ESXi
ZooKeeper
Operating systems
Debian
Linux
RHEL
Ubuntu
Windows
Programming languages
Ansible
Bash Shell
Java
Perl
PHP
PL-SQL
PowerShell
Python
Unix Shell Scripting
VBA
Databases
AWS RDS
MongoDB
MySQL
Oracle
PostgreSQL
SQL
Design / development / engineering