Remote only
Programming and DevOps roles in several of the client's sub-projects: POSSIBLE, HEALTH-X and Marispace. Design and implementation of data space technologies based on GAIA-X and International Data Spaces (IDS). Creation of documentation, examples and demos for internal and external customers.
Tasks:
Migration of data space software components to IONOS cloud
Dockerization of GAIA-X services - Federated Catalog
Automation of infrastructure landscape with Terraform - IONOS Kubernetes
Automation of deployment process - CI/CD
Packaging/Deployment/Integration of EDC Connector
Packaging/Deployment/Integration of IDS DAPS
Packaging/Deployment/Integration of IoT OpenTwins platform based on Eclipse DITTO
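The Dockerization work above can be sketched as follows — a minimal, illustrative generator for a Java service Dockerfile such as the one used for the Federated Catalog. The base image, paths and port are assumptions for illustration, not the actual project files:

```python
def render_dockerfile(jar_name: str, port: int = 8080) -> str:
    """Render a minimal Dockerfile for a Java service (e.g. the Federated
    Catalog). Base image and port are illustrative assumptions."""
    return "\n".join([
        "FROM eclipse-temurin:17-jre",        # assumed base image
        "WORKDIR /app",
        f"COPY target/{jar_name} /app/app.jar",
        f"EXPOSE {port}",
        'ENTRYPOINT ["java", "-jar", "/app/app.jar"]',
    ])

print(render_dockerfile("federated-catalog.jar"))
```

Generating the Dockerfile from a template like this keeps the image definitions for several services consistent and easy to feed into a CI/CD pipeline.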
GAIA-X Federation Services covers the development of basic services for a connected, open data infrastructure based on European standards and values. These services are distributed and decentralized; connected to each other, they form the basis for a homogeneous and user-friendly system for secure data transfer. The project also entails implementing those services and provisioning the relevant IT infrastructure.
Core components:
Identity & Trust
Federated Catalogue
Sovereign Data Exchange
Compliance
Federation service layer:
Authentication methods
GAIA-X Service UI and Workflow Engine
Orchestration methods (Infrastructure, Interconnection, Dataspaces)
Interfaces for external integration
Tasks:
Deploying multiple services on Kubernetes using CI/CD and Helm
Automation of service integrations
Creating Docker images for existing services
Implementation of new services: Principal Creation Service, Invitation Service, DID Management Service, Claim Mapper Service
Setting up of various Kubernetes jobs
Bug Fixing and support for existing deployments
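Setting up Kubernetes jobs (one of the tasks above) can be done programmatically; a hedged sketch of a Job manifest builder follows. Kubernetes accepts JSON as well as YAML, so the manifest is a plain dict; the job name and registry URL are hypothetical placeholders:

```python
import json

def k8s_job(name: str, image: str, args: list[str]) -> dict:
    """Build a minimal Kubernetes batch/v1 Job manifest.
    Names and image references are illustrative, not from the project."""
    return {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": name},
        "spec": {
            "backoffLimit": 3,                     # retry a failed pod up to 3 times
            "template": {
                "spec": {
                    "restartPolicy": "Never",      # required for Jobs
                    "containers": [
                        {"name": name, "image": image, "args": args}
                    ],
                }
            },
        },
    }

manifest = k8s_job("did-bootstrap", "registry.example.com/did-mgmt:1.0", ["--init"])
print(json.dumps(manifest, indent=2))
```

A builder like this makes it straightforward to stamp out one-off bootstrap jobs per service from CI/CD instead of hand-maintaining many nearly identical manifests.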
Daiteap simplifies the management of complex multi-cloud infrastructure by exposing an easy-to-use UI and API to end users. Daiteap can manage Kubernetes clusters, virtual machines and S3 storage across a variety of cloud service providers, on-premise installations and IoT devices. Users can create integrated multi-cloud environments connected by VPN, allowing Kubernetes nodes, virtual servers and devices to communicate directly without crossing edge network boundaries or relying on public IPs, gateways and proxies.
Technologies:
Cloud - AWS, GCP, Azure, AliCloud
Terraform
Ansible
Docker and Kubernetes, Cluster-API, RabbitMQ
IoT RaspberryPi
Helm
Vue.js
Django and Python
CI/CD
PKI, OAuth 2.0/OIDC, SSO, Keycloak, VPN (IPsec/WireGuard)
Tasks:
Programming backend and frontend
Development of CI/CD pipelines for continuous delivery
Integration of public cloud APIs: GCP, AWS, Azure, AliCloud
Adding support for private cloud - OpenStack
Adding support for bare metal on premise hardware
Adding support for IoT ARM devices - RaspberryPi
Integration of Yaook life cycle management
Integration of Cluster-API based life cycle management
Realization of a VPN-based peer-to-peer connectivity solution using IPsec and WireGuard
Integration of SSO using Keycloak and OIDC
Realization of multi-cloud AI/ML predictive analytics use case using Kubeflow
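The VPN-based peer-to-peer connectivity from the task list can be illustrated with a small config generator — a sketch of how a WireGuard interface section connecting multi-cloud nodes might be rendered. All keys, endpoints and addresses below are placeholders, not real credentials or the project's actual topology:

```python
def wg_config(private_key: str, address: str, peers: list[dict]) -> str:
    """Render a WireGuard config for one node of a multi-cloud mesh.
    Keys/IPs are illustrative placeholders."""
    lines = [
        "[Interface]",
        f"PrivateKey = {private_key}",
        f"Address = {address}",
        "ListenPort = 51820",
    ]
    for p in peers:
        lines += [
            "",
            "[Peer]",
            f"PublicKey = {p['public_key']}",
            f"Endpoint = {p['endpoint']}",
            f"AllowedIPs = {p['allowed_ips']}",
            "PersistentKeepalive = 25",   # keeps NAT mappings open across edge boundaries
        ]
    return "\n".join(lines)
```

Generating per-node configs this way is what allows nodes behind NAT in different clouds to form a flat overlay network without public IPs on every machine.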
Development of a data analytics platform (https://www.octave.io/) based on Hortonworks Hadoop and hosted in OVH. Development of REST APIs for data ingest and access. Setting up SSO, IDM, Logging, Monitoring infrastructure. Support of production servers, bug fixing and further developments.
Technologies:
OVH cloud
Hortonworks, NIFI, HDFS
ELK stack
Kubernetes and Docker
Python/Django
Google Cloud Platform
Tasks:
Programming the Python/Django functionalities according to the specifications
Packaging and deployment (CI/CD)
Maintenance and bug fixing
FootgolfScore is a web platform used by players, tournament organizers and national federations to manage tournaments, player rankings, profiles and events. The platform also tracks real-time results during tournaments and provides a live scorecard updated by the stewards during competitions.
The user-facing part lets the general public and players review and register for tournaments, see results and participate in footgolf events. The administrative part is used by the national federations and clubs, which create the tournaments, rules and teams.
Technologies:
Python/Django
Bootstrap
HTML, JavaScript, CSS
Docker and Kubernetes
Google Cloud Platform
Tasks:
Programming the Python/Django functionalities according to the specifications from the client
Admin module for managing tournaments, registrations and result lists
Programming a result calculation engine for result lists and scorecards using dynamic tournament rules
Packaging and deployment (CI/CD)
Maintenance of the live system used by thousands of users in Italy
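The result calculation engine with dynamic tournament rules can be sketched as a pure function over raw scorecards. The specific rule names (`drop_worst_round`) are hypothetical examples, not the platform's actual rule set:

```python
def apply_rules(scorecards: dict[str, list[int]], rules: dict) -> list[tuple[str, int]]:
    """Rank players from raw round scores under dynamic tournament rules.
    Rule keys here ('drop_worst_round') are illustrative assumptions."""
    results = []
    for player, rounds in scorecards.items():
        rounds = sorted(rounds)
        if rules.get("drop_worst_round") and len(rounds) > 1:
            rounds = rounds[:-1]          # discard the highest (worst) round
        results.append((player, sum(rounds)))
    # Footgolf, like golf, ranks the lowest total first
    return sorted(results, key=lambda r: r[1])
```

Keeping the rules as data rather than code is what lets each federation configure its own tournament formats without redeploying the engine.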
Architecture and implementation of a blockchain platform, based on Hyperledger Fabric, for signing roaming and mobile data contracts between MNOs worldwide. Adaptation of the Hyperledger project to the NOMAD use case, including migration to the Kubernetes orchestrator, setting up core and operator network components, and creating development and installation packages and documentation. Support of client and partner MNO technical teams by installing the NOMAD artifacts on public and private infrastructure and helping them run operator NOMAD components. Active role in technical discussions with the GSMA, assisting the governing body and partners with technical decisions pertaining to the realization of the network and blockchain technologies.
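The core property the ledger provides — that a signed contract cannot be altered after the fact without detection — can be shown with a toy hash chain. This is a deliberately simplified sketch; Hyperledger Fabric's actual block, channel and endorsement model is far richer:

```python
import hashlib
import json

def chain_contracts(contracts: list[dict]) -> list[dict]:
    """Toy hash chain over contract records: each block commits to the
    previous block's hash, so tampering breaks the chain. Illustrative only."""
    blocks, prev = [], "0" * 64
    for c in contracts:
        payload = json.dumps(c, sort_keys=True)
        digest = hashlib.sha256((prev + payload).encode()).hexdigest()
        blocks.append({"prev": prev, "contract": c, "hash": digest})
        prev = digest
    return blocks

def verify(blocks: list[dict]) -> bool:
    """Recompute every hash; any modified contract invalidates the chain."""
    prev = "0" * 64
    for b in blocks:
        payload = json.dumps(b["contract"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if b["prev"] != prev or b["hash"] != expected:
            return False
        prev = b["hash"]
    return True
```

In Fabric the same tamper evidence comes from peer-validated blocks plus endorsement signatures under the network's PKI, rather than from a single linear hash chain.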
Technologies:
Docker and Kubernetes
Public cloud: Open Telekom Cloud, AWS, GCP
Private cloud
Bash scripting
NGINX
MySQL
HLF CA
OpenSSL
Golang, NodeJS
PKI
Tasks:
Implementation of a Backend Proxy component as an intermediary between frontend and backend systems: 10 separate microservices created using Python/Django, Docker and NodeJS. Created specifications, unit tests and deployment automation scripts. Supported the project through the production phase, fixed bugs and maintained customer-serving systems. Created versioning, logging and release management documents and procedures used by the whole team.
Technologies:
Python/Django
Docker and Kubernetes
NodeJS
Swagger
Bash
GIT
The ELIZA project is an AI customer support module that automates parts of the customer support function by leveraging machine learning. It integrates existing data sources such as whitepapers, user activity, forums and FAQ pages. Collected data is stored in a big data/Hadoop cluster and processed, and the results are used to implement various AI use cases. The system learns continuously from user-generated data and activity and is able to improve the quality of its answers.
The technical solution was implemented on a Hortonworks big data cluster. The data ingest layer was built with Apache NIFI and Kafka; the data access layer uses Java, Tomcat and HBase. Perimeter security and user management were implemented with Kerberos, LDAP, Ranger and KNOX.
Technologies:
Hortonworks Hadoop - Apache NIFI, HDFS, HBase, KNOX, Ranger, Kafka
Kerberos & LDAP
Java, Tomcat
Tasks:
Implementation of the customer AI assistant ELIZA
Design and implementation of data ingest and data access paths based on Apache NIFI and HBase
Development of the user management and perimeter security layer using Kerberos, LDAP, Ranger and KNOX
Implementation of AI functions in the platform
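A key design decision in an HBase-backed ingest/access path like the one above is the row-key schema. The sketch below shows a common pattern — a short hash prefix to spread writes across regions plus a reversed timestamp so the newest activity sorts first; the exact schema is an illustrative assumption, not ELIZA's actual one:

```python
import hashlib

def row_key(user_id: str, timestamp_ms: int) -> bytes:
    """Compose an HBase row key for user-activity records.
    - hash salt prefix: spreads sequential writes, avoiding region hot-spotting
    - reversed timestamp: makes a scan return newest records first
    Illustrative schema, not the production layout."""
    salt = hashlib.md5(user_id.encode()).hexdigest()[:4]
    reversed_ts = (2**63 - 1) - timestamp_ms   # larger timestamp -> smaller key
    return f"{salt}|{user_id}|{reversed_ts:019d}".encode()
```

With keys like this, "latest activity for a user" becomes a cheap prefix scan instead of a full-table sort — which matters when the assistant continuously learns from fresh user data.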
The DataLab project provides Big Data infrastructure as a service to clients. The platform provides multi-tenant environments to end users with pre-installed and configured infrastructure including Apache Hadoop (Cloudera Enterprise and Hortonworks), various big data tools, frameworks and development environments.
Our role was to design and implement the software-as-a-service layer running on top of the OpenStack infrastructure, including management of the Hadoop distributions, user rights and access, integration of analytics tools with the big data clusters, and resolving user issues within the platform.
Technologies:
Cloudera Enterprise, Hortonworks
HBase, HDFS, Hive, Hue, Impala, Oozie, Solr, Spark, Sqoop, ZooKeeper, Pig, Kafka, Storm, Flume, Cloudera Manager
Ansible
Open Telekom Cloud
LDAP, Kerberos
OpenStack, KVM, Virtualization
Ubuntu, CentOS, Windows Server 2012
Gitlab, Jenkins
Tasks:
Installation and configuration of Hadoop distributions: Cloudera Enterprise and Hortonworks
Management of Hadoop services: HDFS, Hive, Spark, Impala, etc.
Resource Management with OpenStack, including performance analysis, resource utilization and backup
Remote access using VPN and HTTPS
Communications with partners, clients regarding technical requirements
Automation of user management
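The user-management automation can be sketched as a generator for the two artifacts each new tenant user needs: a Kerberos principal and an LDAP entry. The realm, DN layout and object classes below are illustrative assumptions, not the platform's actual directory schema:

```python
def provision_user(username: str, realm: str = "DATALAB.EXAMPLE") -> dict:
    """Generate Kerberos and LDAP provisioning artifacts for a new tenant
    user - a sketch of automated user management. Realm and DN layout are
    illustrative placeholders."""
    principal = f"{username}@{realm}"
    ldif = "\n".join([
        f"dn: uid={username},ou=people,dc=datalab,dc=example",
        "objectClass: inetOrgPerson",
        "objectClass: posixAccount",
        f"uid: {username}",
        f"cn: {username}",
    ])
    # kadmin command creating the principal with a random key
    kadmin = f"addprinc -randkey {principal}"
    return {"principal": principal, "ldif": ldif, "kadmin": kadmin}
```

Generating both artifacts from one source of truth keeps the Kerberos realm and the LDAP directory consistent — the usual failure mode when users are created by hand in multi-tenant Hadoop environments.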
DevOps:
Programming:
Cloud:
MLOps:
Hadoop and big data: