A heterogeneous, outgrown IAM infrastructure
needed to be migrated into a uniform, automated environment.
In addition, a modern approach to infrastructure provisioning and
monitoring had to be established. The main goal was therefore
to design and implement a fully automated, reliable, modern
cloud environment.
The client also required know-how in the field of SSO systems (SAML2 / OIDC) as well
as consulting on and implementation of a role-based
approach to access and rights management.
Tasks include:
Overall design, implementation and
introduction of an automated infrastructure approach
Introduction of a staging/pipelining concept for the infrastructure
Introduction of centralized logging with decentralized log listener components
Dockerizing applications
Consulting on SSO (best practices, troubleshooting, implementation)
Python development of production-ready tools (web- and command-line based)
Introduction and evaluation of a test Kubernetes cluster
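One of the tasks above was Python development of production-ready command-line tools. A minimal sketch of such a tool's skeleton is shown below; the tool name, subcommands and options are illustrative, not the project's real interface.

```python
import argparse
import json


def build_parser() -> argparse.ArgumentParser:
    """CLI skeleton; 'infra-tool' and its subcommands are hypothetical."""
    parser = argparse.ArgumentParser(prog="infra-tool",
                                     description="Example infrastructure helper")
    sub = parser.add_subparsers(dest="command", required=True)

    status = sub.add_parser("status", help="show host status")
    status.add_argument("--host", required=True)
    status.add_argument("--json", action="store_true",
                        help="emit machine-readable output")
    return parser


def main(argv=None) -> int:
    args = build_parser().parse_args(argv)
    if args.command == "status":
        result = {"host": args.host, "state": "unknown"}  # placeholder payload
        if args.json:
            print(json.dumps(result))
        else:
            print(f"{result['host']}: {result['state']}")
    return 0
```

In a real tool, `main` would be wired to `sys.exit(main())` under an `if __name__ == "__main__":` guard; the `--json` switch makes the output consumable by other automation.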
Technologies include:
Ansible for infrastructure automation as well as provisioning
Python for custom tools and Ansible modules
GitLab CI/CD for automated deployments
Docker for application encapsulation, reliable deployments and system independence
Prometheus + cAdvisor + Node Exporter + Grafana for monitoring and alerting
fluentd + Graylog for centralized logging
CentOS as base system
Kubernetes for automatic container provisioning and orchestration (PoC)
Achievements include:
Establishment of a modern cloud environment
A future-proof concept for infrastructure and application rollout/staging
Knowledge transfer to the client to enable them to run the new infrastructure
A service platform for citizens was about to be ported to a microservice architecture. The goal was to prepare the CI/CD pipelines as well as standardized methods for a smooth transition into a Kubernetes cluster. A second goal was to develop an overall pipelining/operations concept for the future Kubernetes environment.
Tasks include:
Establishment of new automation concepts
Establishment of robust delivery pipelines
Preparation of a reliable resource monitoring and alerting infrastructure for the future Kubernetes cluster
Consulting on future operations standards
Establishment of performance monitoring for the application
Technologies include:
Ansible for systems provisioning
GitLab + GitLab CI/CD for automatic provisioning/deployment
Prometheus + cAdvisor + Node Exporter for resource monitoring / alerting / graphing
Selenium for performance monitoring
Docker for delivery of microservices
Python for static checks / testing / data analysis
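The Selenium-based performance monitoring listed above ultimately produces raw page-load timings that need to be aggregated for graphing and alerting. A minimal sketch of that aggregation step is shown below; the function name and the chosen statistics are illustrative, and the raw samples would come from timed Selenium runs in the real setup.

```python
import statistics
from typing import Dict, List


def summarize_timings(samples_ms: List[float]) -> Dict[str, float]:
    """Condense raw page-load timings (e.g. collected by repeated
    Selenium test runs) into the figures a dashboard would graph.

    Uses a simple nearest-rank percentile, which is good enough
    for trend graphs and threshold alerting.
    """
    ordered = sorted(samples_ms)
    return {
        "min": ordered[0],
        "median": statistics.median(ordered),
        "p95": ordered[max(0, round(0.95 * len(ordered)) - 1)],
        "max": ordered[-1],
    }
```

Feeding `median` and `p95` rather than averages into the alerting rules keeps single outlier page loads from masking a general slowdown.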
Achievements include:
Installation/configuration of a functional Prometheus cluster
Update concepts
Concept + implementation for seamless application delivery with GitLab CI + Ansible
A large IAM project required an intuitive interface for role-based access and rights management. At the same time, new workflows for role-based access, lifetime and monitoring had to be established.
Tasks include:
Creation of a web portal for role- and rights management
Establishing a connection to the existing MicroFocus role solution
Development of new role-based workflows and processes, as well as training and support
Maintenance of existing infrastructure
Technologies include:
Django for the role- and rights-management tool backend (the backend is a REST interface)
Angular for easy frontend interaction
HTML5/CSS3/Bootstrap3 for the frontend
Ansible + Docker for 1-click deployments
Achievements include:
A web based tool for an intuitive role assignment and administration
Online overview of company structures, projects, etc.
Tools for role review, reporting and troubleshooting
Responsible for infrastructure architecture with regard
to future development and tool selection in the field of
identity and access management
Tasks include:
Selection of future-proof tools for a large infrastructure
Infrastructure migration into a CloudStack cloud
Quality assurance in terms of documentation
Tool development for infrastructure overview
Development of Ansible modules for the client
Migration from SLES11 to SLES12
Technologies include:
Python for custom tool development
Ansible for infrastructure migration and cloud configuration
SLES12
Django for visualization
Achievements include:
Ansible module for SLES12 system + package registration
Python tool for ACL administration in the cloud
Fully automated migration of old systems into the cloud with Ansible playbooks/roles
Django tool for LDAP schema review
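Custom Ansible modules such as the SLES12 registration module above are plain Python programs. The sketch below shows the general shape of such a module in the old-style "WANT_JSON" format (args arrive as a JSON file, the result is printed as JSON); the parameter names and the SUSEConnect flags shown are illustrative assumptions, not the project's real module.

```python
# WANT_JSON  <- marker telling Ansible to pass arguments as a JSON file
import json
import sys
from typing import Dict, List


def build_suseconnect_command(params: Dict[str, str]) -> List[str]:
    """Translate (hypothetical) module parameters into the SUSEConnect
    call that would register the system or an extension module."""
    cmd = ["SUSEConnect", "--regcode", params["regcode"]]
    if params.get("product"):   # e.g. an SLES extension/module identifier
        cmd += ["--product", params["product"]]
    if params.get("url"):       # local SMT/RMT server instead of SCC
        cmd += ["--url", params["url"]]
    return cmd


def main() -> None:
    # Ansible passes the path of a JSON args file as argv[1].
    with open(sys.argv[1]) as fh:
        params = json.load(fh)
    cmd = build_suseconnect_command(params)
    # A real module would execute cmd here (e.g. subprocess.run) and
    # derive 'changed' from the current registration state; this sketch
    # only reports what it would do.
    print(json.dumps({"changed": True, "cmd": cmd}))


if __name__ == "__main__" and len(sys.argv) > 1:
    main()
```

Separating command construction from execution keeps the registration logic unit-testable without a SUSE system at hand.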
Responsible for infrastructure architecture/automation as well
as custom tool development on a project for directory services
and identity/access management in a large heterogeneous environment.
Tasks include:
Design and implementation of update and deployment process automation.
Quality assurance by design of infrastructure monitoring,
centralized logging solutions and documentation
Customized tool creation for LDAP operations
Installation and maintenance of single sign-on solutions
Customer support on LDAP/infrastructure/programming concerns
Technologies include:
Ansible for infrastructure automation and configuration
management
Python for custom tool development
NXLog + rsyslog + Graylog for the logging infrastructure
Novell eDirectory
Shibboleth as identity provider combined with LDAP
SLES 11
Git for configuration and documentation versioning
Achievements include:
Drastically accelerated (~20x faster) the update
process and improved its reliability by introducing
centralized configuration management.
Extended the python-ldap library
with interfaces for simplified access to
and modification of LDAP objects and searches.
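A small sketch of the kind of convenience layer meant here: helpers that build correctly escaped LDAP search filters (per RFC 4515) so callers do not hand-assemble filter strings. The function names are illustrative; in the real extension these would sit on top of python-ldap's `search_s`/`modify_s` calls.

```python
from typing import Dict


def escape_filter_value(value: str) -> str:
    """Escape the characters RFC 4515 reserves inside filter values,
    preventing broken searches and LDAP filter injection."""
    for char, repl in (("\\", r"\5c"), ("*", r"\2a"),
                       ("(", r"\28"), (")", r"\29"), ("\0", r"\00")):
        value = value.replace(char, repl)
    return value


def and_filter(attrs: Dict[str, str]) -> str:
    """Build an AND search filter from attribute/value pairs, e.g. to
    pass to conn.search_s(base, ldap.SCOPE_SUBTREE, and_filter({...}))."""
    parts = "".join(f"({attr}={escape_filter_value(val)})"
                    for attr, val in sorted(attrs.items()))
    return f"(&{parts})" if len(attrs) > 1 else parts
```

Centralizing the escaping in one place is the main win: every search built through the wrapper is safe against stray `*`, parentheses or backslashes in user-supplied values.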
Introduction of a complete and reliable centralized
logging solution, including log filtering and alerting,
for both Windows and Linux systems.
Responsible for data transformation and tool
customization on a migration project (legacy C++ code from
Solaris to Linux systems).
Tasks include:
Tool development for identification of
critical spots in the code
Legacy code analysis
Quality assurance
Department-wide training in Python
Consulting on migration to Git
Technologies include:
Python
Git
Linux (Debian)
Achievements include:
Implementation of code coverage and a dynamic code
checker for legacy C++ source code based on gcov
Introduction of Python + environment into the project
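The gcov-based coverage tooling mentioned above boils down to parsing gcov's annotated source listings, where each line is prefixed with its execution count, `#####` for never-executed lines, or `-` for non-executable lines. A minimal sketch of that parsing step (the function name and result layout are illustrative):

```python
from typing import Dict


def parse_gcov(gcov_text: str) -> Dict[str, float]:
    """Compute line coverage from a gcov-annotated source listing.

    gcov prefixes each source line with 'count:lineno:', where count is
    the hit count, '#####' for executable-but-never-run lines, '====='
    for unexecuted exceptional paths, and '-' for non-executable lines.
    """
    executable = missed = 0
    for line in gcov_text.splitlines():
        count, _, _rest = line.partition(":")
        count = count.strip()
        if count == "-" or not count:
            continue
        executable += 1
        if count in ("#####", "====="):
            missed += 1
    covered = executable - missed
    return {
        "executable": executable,
        "covered": covered,
        "percent": 100.0 * covered / executable if executable else 0.0,
    }
```

Run over every `.gcov` file of a build, this yields the per-file figures that a coverage report or a CI quality gate can aggregate.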
Responsible for the infrastructure including performance
and quality assurance on a large social media project.
Tasks include:
Organization and maintenance of a cloud
network (Debian systems)
Support for software developers
Configuration of open source tools for
code quality and documentation
Implementation of performance checks
Implementation of automated reports
on code quality and performance
Technologies include:
Python + Django for tool development
Sonar for static and dynamic code analysis
OpenLDAP for user rights management
JIRA as the issue tracking/SCRUM tool
Achievements include:
Customized wiki and documentation
application for developers with Jenkins and Git integration
Django web application to test the product performance
using Selenium tests in the background, with customizable tests/test
environments and graphical evaluation using the jqPlot
library (also JavaScript/jQuery)
Improvement of the overall code quality by
raising test coverage (~+30%) and identifying + eliminating
potential code flaws
Large public service project with the goal of establishing
a platform for handling finance processes with a very
high number of transactions. The main focus was the migration
of legacy data, ensuring data quality and transformation
into various formats.
Tasks include:
Customer consulting with regard to
loading/unloading interfaces
Definition of requirements for transformation
of legacy data
Implementation of algorithms for data
transformation
Tool development for secure data transport
Tool development for tests of data
quality/interface implementation
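The transformation and data-quality tasks above follow a common pattern: read a legacy export, normalize each record, and collect findings for records that fail validation instead of aborting the whole run. A minimal sketch with stdlib `csv`; the column names (`KONTO`, `BETRAG`), the semicolon delimiter and the validation rules are illustrative assumptions, not the project's real schema.

```python
import csv
import io
from typing import List, Tuple


def transform_records(legacy_csv: str) -> Tuple[str, List[str]]:
    """Convert a (hypothetical) semicolon-separated legacy export with a
    German-style decimal comma into a normalized comma-separated format,
    collecting data-quality findings along the way."""
    errors: List[str] = []
    reader = csv.DictReader(io.StringIO(legacy_csv), delimiter=";")
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["account", "amount_cents"])
    writer.writeheader()
    for lineno, row in enumerate(reader, start=2):   # line 1 is the header
        account = (row.get("KONTO") or "").strip()
        raw_amount = (row.get("BETRAG") or "").replace(",", ".")
        if not account:
            errors.append(f"line {lineno}: missing account")
            continue
        try:
            amount_cents = round(float(raw_amount) * 100)
        except ValueError:
            errors.append(f"line {lineno}: bad amount {raw_amount!r}")
            continue
        writer.writerow({"account": account, "amount_cents": amount_cents})
    return out.getvalue(), errors
```

Keeping the findings list alongside the transformed output lets the same tool serve both the migration itself and the data-quality reporting.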
Technologies include:
Standard Linux tools, such as awk, sed, grep, …
Python for in-depth data analysis
Java for transport layers
IBM DataStage
Achievements include:
Definition of uniform standards
Introduction of the standard Linux
stack as the global toolset for data analysis in the project