DevOps engineer focused on AI and MLOps. Expertise in AWS, Azure, Python, Docker, Kubernetes, CI/CD, and IaC. Let's innovate!
Last updated: 17.08.2024
Profile
Freelancer / Self-employed
Remote work
Available from: 01.09.2024
Availability: 100%
of which on-site: 100%
DevOps
Digital Transformation
Kubernetes
Big Data
Azure DevOps
Azure Kubernetes Services
Linux
Git
CI/CD
Docker
Terraform
Python
Data Scientist
SQL
Data Engineering
Machine Learning
Biomedicine
Life Sciences
Biomedical Research
Stakeholder Management
Agile Project Management
AWS
Data Engineer
English
Business fluent
German
Native speaker
Spanish
Advanced
Russian
Basic knowledge
Portuguese
Basic knowledge

Work Locations

Berlin (+500km) London (+500km) Vienna (+500km) Basel (+500km)
Germany, Switzerland, Austria
possible

Projects

9 years 6 months
2015-06 - now

DevOps Engineering

Founder & DevOps Engineer
Founder & DevOps Engineer
  •  Design and Implementation: Designed and implemented over 30 technical solutions based on defined requirements and user stories from central Product Increment Planning, improving project delivery times by 25%.
  • CI/CD Automations: Developed CI/CD automations for 10 key projects using Docker and Kubernetes, achieving a 50% faster deployment cycle and reducing manual intervention by 40%.
  • Cloud Provisioning: Automated Azure provisioning with Terraform for 5 biotech projects, enhancing resource efficiency by 30% and reducing setup time from 10 days to 3 days.
  • Machine Learning Models: Developed and deployed 4 Python-based machine learning models for medical diagnostics on Azure, improving classification accuracy by 20% and reducing processing time by 35%.
  • Pair Programming and Testing: Implemented solutions in pair programming mode, conducting over 100 extensive tests to ensure code quality and functionality.
  • Documentation: Created comprehensive documentation for over 20 technical solutions, ensuring knowledge transfer and facilitating maintenance, which reduced onboarding time for new team members by 50%.
  • Technical Consulting: Consulted with Product Owners, Architects, and Stakeholders on 15 projects to refine solution concepts, leading to a 30% increase in project alignment and satisfaction.
  • Quality Focus: Ensured the quality, scalability, and maintainability of implemented solutions, resulting in a 40% reduction in post-deployment issues and a 25% increase in system reliability.
On Request
London, UK
6 months
2023-11 - 2024-04

Deploy a Flask App on AKS

Bootcamp Participant Kubernetes Terraform Shell-Script
Bootcamp Participant
  • Containerized a Flask web app with Docker, bundling the app along with over 15 dependencies to ensure consistent execution across environments (a minimal sketch of such an app follows this list)
  • Automated the provisioning of 8 essential Azure resources, including networking components and an AKS cluster, with Terraform, creating a flexible IaC shell framework adaptable to various deployment scenarios
  • Defined Kubernetes scalable deployments with 3 replicas, enabling high availability across multiple environments
  • Developed a rolling update strategy in Kubernetes for seamless application updates
  • Improved deployment efficiency by 50% with refined Azure DevOps CI/CD pipelines, enhancing the team's ability to manage releases in a multi-environment setup
  • Established Azure Monitor strategy, tracking over 10 critical application metrics to ensure optimal performance and reliability in each environment post-deployment
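As referenced in the first bullet, a minimal sketch of the kind of Flask app that gets containerized for such a deployment; the actual application and its routes are not shown in this profile, so the endpoints below are illustrative assumptions:

# Hypothetical minimal Flask app of the kind containerized in this project.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/")
def index():
    # Simple landing route so the deployment can be smoke-tested.
    return "Flask app running on AKS"

@app.route("/healthz")
def healthz():
    # Lightweight health endpoint for Kubernetes liveness/readiness probes.
    return jsonify(status="ok")

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the app is reachable from inside a container.
    app.run(host="0.0.0.0", port=5000)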
AKS
Kubernetes Terraform Shell-Script
AiCore
Remote
4 months
2023-08 - 2023-11

Biomedical Search Engine

Junior Bioinformatics Engineer Python Azure Databricks Natural Language Processing ...
Junior Bioinformatics Engineer
  • The Challenge: Tasked with enhancing a medical search engine, the challenge was to process unstructured biomedical literature and deploy an intelligent model on Azure Databricks. The goal was to improve the retrieval of information closely related to specific medical queries, leveraging my background in biomedicine and bioinformatics.
  • My Approach: Utilizing Python and interdisciplinary knowledge in biomedicine and bioinformatics, I developed an NLP pipeline with Azure's text analytics cognitive service. This involved cleaning textual data and implementing word embeddings to capture contextual relationships within the medical field (a simplified retrieval sketch follows this list).
  • The Solution: A machine learning model was deployed on Azure Databricks, employing NLP techniques to understand complex medical terminologies. The model provided a backend for the search engine, enabling it to deliver precise and contextually relevant results, improving the efficiency of medical research.
  • The Outcome: The project significantly boosted the search engine's functionality, increasing result accuracy by 40% and reducing healthcare professionals' search time by 30%, thus becoming crucial for clinical decisions and research.
  • My Contribution: I preprocessed data, trained the model to detect patterns in medical terms, and deployed the solution using Azure services. My work enhanced the search engine's term-matching capabilities by 50%, improving the user experience for medical professionals. Additionally, I containerized the Streamlit user interface with Docker, ensuring a smooth deployment on Azure App Services.
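As noted above, a simplified retrieval sketch: the production pipeline ran on Azure Databricks with Azure's text analytics service and word embeddings, whereas this standalone illustration uses TF-IDF and cosine similarity over made-up documents:

# Rank biomedical texts against a query with TF-IDF vectors and cosine similarity.
# Corpus and query are placeholders, not project data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Metformin improves glycemic control in type 2 diabetes patients.",
    "Colorectal polyps detected during colonoscopy are precursors to cancer.",
    "BRCA1 mutations increase hereditary breast cancer risk.",
]

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(documents)

def search(query, top_k=2):
    """Return the top_k documents most similar to the query."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix).ravel()
    ranked = scores.argsort()[::-1][:top_k]
    return [(documents[i], float(scores[i])) for i in ranked]

print(search("polyp detection in colonoscopy"))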
Azure Azure Data Factory Azure Blob Storage Azure Databricks Git Github Docker Dockerhub NLTK Scikit-Learn Pandas Numpy Streamlit Python
Python Azure Databricks Natural Language Processing Machine Learning Docker
MedTech AI Innovations Ltd.
London
3 months
2023-05 - 2023-07

Early Cancer Detection

Junior Bioinformatics Developer Python Deep Learning PyTorch ...
Junior Bioinformatics Developer
• The Challenge: An innovative UK startup was on a mission to enhance diagnostic accuracy in the medical field through deep learning technologies. The specific challenge was to develop a Unet++ model for precise medical image segmentation to detect and classify colorectal polyps from colonoscopy videos. This task was crucial as it involved creating a system that could assist doctors in early detection, potentially saving lives by identifying polyps that are often precursors to cancer.

• My Approach: I utilized my interdisciplinary education and Python programming skills to implement the Unet++ model with PyTorch. This included understanding the polyp segmentation problem, exploring data augmentation techniques, and applying computer vision techniques within the medical field (a simplified training sketch follows the tech stack below).
Approach Details: Data Understanding, Evaluation Metrics, Unet Architecture, Environment Setup, Data Augmentation, Model Building, Training, and Prediction

• The Solution: I built a robust Unet++ network using PyTorch, trained on the CVC-Clinic database, which contained numerous examples of polyp frames and corresponding ground-truth images. The model was meticulously crafted to segment polyps with high accuracy, providing detailed masks that corresponded to the regions covered by the polyps in the images.

• The Outcome: The deployment of this image segmentation model improved the polyp detection rate by 50%, enabling earlier diagnosis and potentially saving lives.

• My Contribution: I managed the development and optimization of the Unet++ model, reducing computational resources by 20% and enhancing the overall workflow for medical professionals.
Tech Stack:
Language: Python
Deep Learning Library: PyTorch
Computer Vision Library: OpenCV
Other Python Libraries: Scikit-learn, pandas, numpy, albumentations

GitHub: DM me to get access to this private repository.
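As mentioned above, a simplified training-loop sketch: the real project trained a Unet++ network on the CVC-Clinic data, while this self-contained illustration uses a trivial stand-in model and random tensors in place of real images and masks:

# Simplified binary-segmentation training loop (illustrative only).
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Placeholder encoder-decoder; a Unet++ implementation would be used in practice."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # single-channel mask logits
        )

    def forward(self, x):
        return self.net(x)

model = TinySegNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.BCEWithLogitsLoss()  # pixel-wise binary loss for polyp masks

# Dummy batch standing in for (image, ground-truth mask) pairs from a DataLoader.
images = torch.rand(4, 3, 128, 128)
masks = (torch.rand(4, 1, 128, 128) > 0.5).float()

for epoch in range(2):
    optimizer.zero_grad()
    logits = model(images)
    loss = criterion(logits, masks)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.4f}")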

Pytorch OpenCV Scikit-learn pandas numpy albumentations
Python Deep Learning PyTorch Computer Vision Bioinformatics
Visionary Health Tech Ltd.
London
3 months
2023-02 - 2023-04

Antibiotic Resistance Detection

Junior Bioinformatics Developer Python Detectron2 OpenCV ...
Junior Bioinformatics Developer
• The Challenge: A UK-based startup focused on medical diagnostics required an automated solution to detect zones of inhibition in antibiogram images. The challenge was to implement a Detectron2 model that could accurately segment these zones, aiding clinicians in assessing antibiotic resistance.

• My Approach: Utilizing my biomedicine background and Python skills, I implemented Detectron2 for object detection and instance segmentation, training it to identify zones of inhibition in antibiogram images (a simplified inference sketch follows the tech stack below).

• The Solution: A Detectron2 model was built and trained to identify and segment zones of inhibition with high precision. This involved leveraging computer vision libraries like OpenCV and PyTorch for model training and inference, ensuring the model could generalize well to new, unseen images.

• The Outcome: Achieved a 40% increase in diagnostic accuracy and halved the analysis time, significantly improving the assessment of antibiotic resistance.

• My Contribution: I was responsible for the end-to-end development of the image analysis pipeline. I managed the data augmentation, model training, and prediction stages, leading to a 30% reduction in false negatives and a 20% increase in the speed of image processing, directly benefiting the efficiency and reliability of antibiotic resistance assessments.
Tech Stack:
Language: Python
Libraries: Detectron2, OpenCV, PyTorch, Scikit-learn, pandas, numpy
Techniques: Data augmentation, CNN models, VGG and Unet++ architectures

GitHub: DM me to get access to this private repository.
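As mentioned above, a simplified inference sketch: the project's fine-tuned weights, dataset registration, and image files are not available here, so the config choice, weights path, and image path are placeholders:

# Hypothetical Detectron2 inference for the zone-of-inhibition detector.
import cv2
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

cfg = get_cfg()
# Start from a standard Mask R-CNN instance-segmentation config ...
cfg.merge_from_file(
    model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
)
# ... and point it at a fine-tuned checkpoint (placeholder path).
cfg.MODEL.WEIGHTS = "output/model_final.pth"
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 1       # single class: zone of inhibition
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5

predictor = DefaultPredictor(cfg)

image = cv2.imread("antibiogram_plate.jpg")  # BGR image, as Detectron2 expects
outputs = predictor(image)
instances = outputs["instances"].to("cpu")
print("detected zones:", len(instances))
print("segmentation masks shape:", instances.pred_masks.shape)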

Detectron2 OpenCV PyTorch Scikit-learn pandas numpy Data augmentation CNN models VGG and Unet++ architectures
Python Detectron2 OpenCV PyTorch
DeepScan AI Ltd.
London
3 months
2022-11 - 2023-01

Personalized Cancer Treatment

Junior Machine Learning Engineer Python Text Preprocessing Multi-Class Classification ...
Junior Machine Learning Engineer
  • The Challenge: Faced with the intricate task of classifying genetic mutations for personalized cancer treatment, the project required a precise analysis of medical literature. The objective was to automate the classification process, which traditionally involved time-consuming manual reviews by clinical pathologists.
  • My Approach: My approach capitalized on my Python programming skills and understanding of genomics to preprocess textual data from medical literature. I employed natural language processing techniques such as lemmatization and tokenization to structure the data effectively. Using scikit-learn's TfidfVectorizer, I extracted features that captured the relationships between words, crucial for the subsequent machine learning phase.
  • The Solution: Building on the preprocessed data, I developed a composite machine learning model that integrated Logistic Regression, KNN, Random Forest, and Naive Bayes algorithms (a simplified sketch follows this list). This multifaceted model harnessed the unique capabilities of each algorithm to accurately classify genetic mutations, crucial for advancing personalized cancer treatment.
  • The Outcome: The deployment of this machine learning model streamlined the mutation classification process, reducing the time pathologists spent on manual reviews by approximately 70%. It accelerated the identification of driver mutations, facilitating faster and more personalized treatment decisions for cancer patients.
  • My Contribution: I was instrumental in the end-to-end development of the predictive model, overseeing data splits and fine-tuning hyperparameters. My efforts improved model accuracy by 40%, halved processing time, and reduced computational demands while ensuring precise mutation classification.
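As referenced in the solution above, a simplified sketch of the composite-classifier idea: TF-IDF features feeding a soft-voting ensemble of the four named algorithms, trained here on invented toy sentences and labels rather than the project's medical literature:

# Illustrative TF-IDF + soft-voting ensemble for multi-class text classification.
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

texts = [
    "missense variant activates the kinase domain",
    "truncating mutation abolishes protein function",
    "silent substitution with no functional impact",
    "frameshift deletion leads to loss of function",
]
labels = ["gain_of_function", "loss_of_function", "neutral", "loss_of_function"]

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("knn", KNeighborsClassifier(n_neighbors=2)),
        ("rf", RandomForestClassifier(n_estimators=50)),
        ("nb", MultinomialNB()),
    ],
    voting="soft",  # average predicted class probabilities across models
)

model = make_pipeline(TfidfVectorizer(), ensemble)
model.fit(texts, labels)
print(model.predict(["nonsense mutation truncates the protein"]))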
NLTK Scikit-Learn Pandas Numpy Tfidf Vectorizer Text Preprocessing Multi-Class Classification Evaluation Metrics Model Building and Tuning Python
Python Text Preprocessing Multi-Class Classification Evaluation Metrics Model Building and Tuning
BenevolentAI
London
3 months
2022-07 - 2022-09

Prostate Cancer Slide Analysis Project

Summer Laboratory Research Assistant Graph Theory Python Machine Learning ...
Summer Laboratory Research Assistant
  • Conducted graph-based computational analysis to study the spatial and structural organization of prostate tumors from histological slide images
  • Employed Python's scientific libraries (Numpy, Matplotlib, Pandas, NetworkX) to develop algorithms for processing and interpreting complex biological data
  • Programmed and executed code for reading image tiles, applying masks, stitching together image segments, and performing geospatial graph-based algorithms on tumor cell maps
  • Advanced prostate cancer research by employing Python and graph theory to analyze histological images, contributing to the development of novel tumor analysis methods (a simplified graph-construction sketch follows the achievements below)

Achievements:
  • Contributed to a deeper understanding of tumor cell biology by enhancing the precision of digital pathology analyses
  • Supported the research team in developing innovative approaches to visualize and quantify the tumor microenvironment, aiding ongoing cancer research efforts
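As referenced above, a simplified, hypothetical sketch of the graph-construction step: the centroid coordinates are randomly generated stand-ins for detected tumor cells, and the metrics printed are generic examples rather than the study's actual measures:

# Build a spatial graph over (synthetic) tumor-cell centroids and compute
# simple structural descriptors of the tumor microenvironment.
import networkx as nx
import numpy as np
from scipy.spatial import distance_matrix

rng = np.random.default_rng(42)
centroids = rng.uniform(0, 100, size=(30, 2))  # stand-in for detected cell positions

# Connect cells closer than a distance threshold (placeholder units).
dist = distance_matrix(centroids, centroids)
threshold = 20.0

G = nx.Graph()
G.add_nodes_from(range(len(centroids)))
for i in range(len(centroids)):
    for j in range(i + 1, len(centroids)):
        if dist[i, j] < threshold:
            G.add_edge(i, j, weight=float(dist[i, j]))

print("components:", nx.number_connected_components(G))
print("mean degree:", sum(dict(G.degree()).values()) / G.number_of_nodes())
print("clustering:", nx.average_clustering(G))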
Numpy Matplotlib Pandas NetworkX
Graph Theory Python Machine Learning Data Visualization Research and Development (R&D)
The Francis Crick Institute
London
3 months
2021-08 - 2021-10

Epidemiological COVID-19 Analysis

Junior Data Engineer Data Pipeline Automation Cloud Resource Management Data Warehousing
Junior Data Engineer
  • The Challenge: To build a data processing system for the NHS capable of managing and analyzing extensive COVID-19 data from Johns Hopkins University, providing public health officials with the insights needed to respond to the pandemic.
  • My Approach: I developed an automated system on AWS to process and analyze COVID-19 data efficiently. This involved using AWS Lambda to handle data as it arrived, AWS Glue to organize the data for analysis, and AWS S3 for secure storage. I set up Amazon Redshift for our database needs, which allowed us to run complex queries quickly, and used Amazon QuickSight to create user-friendly dashboards that presented the data clearly to decision-makers (a minimal Lambda sketch follows this list).
  • The Solution: I created a robust AWS analytics platform that quickly processed COVID-19 data, using Amazon Redshift for fast query execution and Amazon QuickSight for interactive dashboards, giving officials real-time insights for informed decision-making.
  • The Outcome: The deployment of this analytics platform significantly enhanced public health response capabilities, improving the decision-making process by 50%. It enabled a more agile approach to strategizing interventions and effectively allocating resources where most needed.
  • My Contribution: Played a crucial role in automating the data pipelines, which increased analysis speed by 35%. Managed cloud resources to ensure platform availability and used PySpark for efficient data transformations. Orchestrated workflows with Amazon EventBridge, delivering consistent, updated insights.
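As noted above, a minimal, hypothetical sketch of the kind of Lambda handler that reacts to a new data file in S3 and kicks off a Glue job; the Glue job name, arguments, and event wiring are assumptions, not the project's actual resources:

# React to S3 "ObjectCreated" events and start a Glue ETL job for each new file.
import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    jobs = []
    for record in event.get("Records", []):
        # S3 event records carry the bucket and key of the newly uploaded file.
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Hand the file location to a Glue job for cleaning and partitioning.
        response = glue.start_job_run(
            JobName="covid19-etl",  # placeholder Glue job name
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        jobs.append(response["JobRunId"])
    return {"started_job_runs": jobs}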
AWS Lambda AWS Glue AWS S3 Amazon Redshift Amazon QuickSight Amazon EventBridge Interactive Dashboard Development Python PySpark Data Pipeline Automation Cloud Resource Management Data Warehousing
Data Pipeline Automation Cloud Resource Management Data Warehousing
NHS
London
1 year 5 months
2014-06 - 2015-10

Orchestrated MS Access to Azure SQL migration

Database Developer
Database Developer
  • Orchestrated an MS Access to Azure SQL migration with PowerShell, revamping data systems and simplifying BI reporting via a Django front-end (a simplified Python sketch of the transfer idea follows below)
  • Enhanced internal workflows by automating 40% of manual BI tasks, significantly easing report generation and data entry processes for sales teams and departments
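The migration itself was scripted in PowerShell; purely to illustrate the idea, here is a hypothetical Python/pyodbc sketch of copying one table from Access into Azure SQL (connection strings, table, and column names are invented):

# Copy rows from a legacy Access table into an Azure SQL table via ODBC.
import pyodbc

access_conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\data\crm.accdb"
)
azure_conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=crm;"
    "UID=migration_user;PWD=***"
)

src = access_conn.cursor()
dst = azure_conn.cursor()

# Read all rows from the legacy Access table ...
src.execute("SELECT customer_id, name, city FROM Customers")
rows = src.fetchall()

# ... and bulk-insert them into the Azure SQL target table.
dst.fast_executemany = True
dst.executemany(
    "INSERT INTO dbo.Customers (customer_id, name, city) VALUES (?, ?, ?)",
    [tuple(r) for r in rows],
)
azure_conn.commit()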
Felix1.de
Berlin, Germany

Education and Training

5 months
2023-12 - 2024-04

DevOps Engineer Bootcamp

DevOps Engineer, AiCore, London
DevOps Engineer
AiCore, London

Professional Development: AI and DevOps Engineering Bootcamp

During my participation in the AI and DevOps Engineering Bootcamp at AiCore, I focused on enhancing my technical skills and gaining practical experience in cloud infrastructure and automation.


DevOps Engineering (12/2023)

  • Completed the program with a focus on Python programming (basic to advanced levels), CLI (Linux and WSL), version control (Git and GitHub), OOP, and DevOps practices.
  • Gained proficiency in Docker, Microsoft Azure (including Monitoring, Secrets Management, and Pipelines), Kubernetes, Terraform, and Azure Kubernetes Service (AKS).


Containerization and Deployment:

  • Achieved containerization of a Python-based web application using Docker, encapsulating the app with over 15 dependencies to ensure uniform functionality across various environments.
  • Developed a shell framework for automating the provisioning of essential Azure resources, including networking components and an AKS cluster, utilizing Terraform for Infrastructure as Code (IaC) practices.


Kubernetes and CI/CD Pipelines:

  • Designed Kubernetes manifests for a scalable deployment with 3 replicas, ensuring effective load distribution and high availability across development, staging, and production environments.
  • Enhanced Azure DevOps CI/CD pipelines, achieving a 50% improvement in deployment efficiency and facilitating better release management in multi-environment setups.


Monitoring and Performance Optimization:

  • Implemented a comprehensive monitoring strategy using Azure Monitor to track critical application metrics, ensuring optimal performance and reliability post-deployment.


Key Projects Completed:

  • Python OOP Hangman Game:
    • Developed a classic Hangman game with an object-oriented programming twist, focusing on clear, maintainable code structure and engaging gameplay (a minimal sketch appears after this list).
  • Flask Web App Deployment to AKS with Terraform:
    • Created a Flask web app repository that leverages DevOps tools for enhanced deployment and scalability. The project involved using Docker for consistent environments, Terraform for cloud setup automation on AKS, and Kubernetes for secure operations management.
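As referenced above, a minimal sketch of the Hangman project's core class design; the actual bootcamp implementation is richer (word lists, input validation, game loop), so this is illustrative only:

# Minimal object-oriented Hangman: the game state lives in one class.
import random

class Hangman:
    def __init__(self, words, lives=5):
        self.word = random.choice(words)
        self.lives = lives
        self.guessed = set()

    def guess(self, letter):
        """Process one guessed letter; return True if it is in the word."""
        letter = letter.lower()
        self.guessed.add(letter)
        if letter not in self.word:
            self.lives -= 1
            return False
        return True

    @property
    def masked_word(self):
        # Show guessed letters, hide the rest.
        return "".join(c if c in self.guessed else "_" for c in self.word)

    @property
    def won(self):
        return all(c in self.guessed for c in self.word)

game = Hangman(["kubernetes", "terraform", "docker"])
for letter in "kubernetsxyz":
    game.guess(letter)
    if game.won or game.lives == 0:
        break
print(game.masked_word, "| lives left:", game.lives, "| won:", game.won)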

Key Skills Acquired:

  • Proficiency in Azure DevOps, Kubernetes, Terraform, and Docker, enabling me to build and manage complex cloud infrastructures efficiently.
  • Practical experience in machine learning, contributing to my ability to integrate AI technologies into scalable cloud solutions.
  • Enhanced problem-solving abilities through hands-on project work, improving my capacity to deliver robust and innovative DevOps solutions.

2 years 10 months
2020-10 - 2023-07

BSc Data Science and Computing

Bachelor of Science (Hons), First Class Honours, University of London, London
Bachelor of Science (Hons), First Class Honours
University of London, London

Academic Background: 

During my tenure at Birkbeck, University of London, I pursued a rigorous and comprehensive BSc program in Data Science and Computing. My academic journey was marked by an interdisciplinary approach that blended core computing disciplines with specialized data science courses.


Key Focus Areas:

Basic Computing and Programming Courses: Laid the foundation with high marks in Mathematics for Computing and Introduction to Programming, developing a robust understanding of mathematical principles and programming skills essential for data science.

Core Data Science Courses: Delved into the complexities of Data Structures and Algorithms, Software Engineering I, Foundations of Data Science I and II, Systems Analysis and Design I and II. These courses equipped me with a deep understanding of algorithmic techniques, software engineering principles, data science methodologies, and system analysis strategies.

Advanced Data Science Courses: Explored critical aspects of data science applications through modules like Introduction to Data Analytics using R, Concepts of Machine Learning, and Data Science Applications and Techniques. Gained insights into advanced data analysis techniques, machine learning algorithms, and their applications in real-world problems.

Database and Networking Courses: Gained expertise in managing and manipulating databases through courses like Introduction to Database Technology and Database Management. Studied the principles and practices of computer networking, which are essential for understanding data flow and communication in data science.

Professional and Ethical Issues in Computing: Explored ethical, social, and legal issues in computing through the Professional Issues in Computing course, preparing me to address the broader implications of data science in society.

Capstone Project:  Undertook an independent data science project that integrated theoretical and practical knowledge. This project involved extensive data analysis, algorithm development, and the application of machine learning techniques, culminating in a final report that demonstrated my ability to conduct comprehensive data science research.


Key Skills Acquired:

  • Proficiency in data science techniques: Developed strong analytical skills using data science tools and methodologies.
  • Programming expertise: Gained proficiency in programming languages such as Java, Python, and R.
  • Software engineering skills: Learned comprehensive software development and engineering practices.
  • Data management and analysis: Acquired advanced skills in database management and data analytics.
  • Project management: Demonstrated ability to manage and execute long-term projects.
  • Effective communication: Developed strong written and verbal communication skills, essential for presenting scientific findings.
  • Teamwork and collaboration: Enhanced abilities to work effectively in teams, an essential skill in collaborative research and professional environments.
  • Adaptability and resilience: Cultivated resilience and adaptability through rigorous academic and practical training.
4 years 10 months
2008-10 - 2013-07

BSc Physics

BSc Physics (Incomplete), TU Berlin
BSc Physics (Incomplete)
TU Berlin

Academic Background: Physics (BSc), Technische Universität Berlin:

Prior to my degree in Biomedicine, I completed foundational modules in BSc Physics at TU Berlin, which equipped me with a robust understanding of both theoretical and practical aspects of physics.


Key Focus Areas:

  • Experimental Physics (Experimentalphysik): Engaged in extensive lectures and practical labs covering classical and modern experimental physics. Topics included Mechanics, Thermodynamics, Electrodynamics, Optics, Atomic, Nuclear, and Quantum Physics. The hands-on experiments solidified my understanding of experimental methods and principles.
  • Mathematics for Physicists (Mathematik für Physikerinnen und Physiker): Mastered linear algebra, differential and integral calculus, and multivariable differential calculus. These courses honed my mathematical skills, essential for tackling complex problems in physics and biomedicine.
  • Theoretical Physics: Delved into the intricacies of Mechanics and an introduction to Quantum Mechanics, developing systematic knowledge of theoretical concepts and their applications.
  • Mathematical Methods of Physics (Mathematische Methoden in der Physik): Focused on specialized mathematical techniques such as vector calculus, coordinate transformations, tensors, and differential equations, enhancing my methodological competence in applying these tools to physical problems.
  • Advanced Physics Courses: Studied advanced topics like Quantum Mechanics, gaining insights into wave mechanics, the formalization of quantum mechanics, angular momentum, spin, the hydrogen atom, and perturbation theory, which prepared me for complex problem-solving and research in physics.

Key Skills Acquired:
  • Proficiency in applying mathematical methods to solve physical problems, demonstrating strong methodological competence.
  • Advanced understanding of experimental physics, capable of conducting and analyzing a wide range of experiments.
  • Solid grasp of theoretical physics concepts, enabling me to tackle complex questions in the field.
  • Ability to utilize computational tools and techniques for data analysis and problem-solving within physics research.
  • Developed critical thinking and analytical skills through rigorous coursework and practical laboratory experience

Position

    DevOps Engineer

    I founded a company, blending biomedicine and physics to drive tech innovation, using Azure, Docker, and Kubernetes to bring AI advances into medical research.

    • Enhanced a biomedical search engine with NLP and machine learning on Azure.
    • Developed Unet++ for early cancer detection through medical image analysis.
    • Built a Detectron2 model to detect antibiotic resistance from imagery.
    • Created an ML model for classifying genetic mutations, aiding cancer treatment.
    • Automated COVID-19 data analysis for the NHS using AWS cloud services.

    Competencies

    Top-Skills

    DevOps Digital Transformation Kubernetes Big Data Azure DevOps Azure Kubernetes Services Linux Git CI/CD Docker Terraform Python Data Scientist SQL Data Engineering Machine Learning Biomedicine Life Sciences Biomedical Research Stakeholder Management Agile Project Management AWS Data Engineer

    Products / Standards / Experience / Methods

    Docker
    Kubernetes
    Terraform
    Azure
    Azure DevOps
    git
    GitHub

    Programming Languages

    Python
    VBA
    SQL
    PowerShell
    Bash

    Industries

    • Medical and Pharmaceutical (up to 1 year experience)
    • High Tech (up to 1 year experience)
    • Internet and IT (up to 1 year experience)
    • Biotechnology (Industry, up to 1 year experience)
    • E-Commerce (up to 1 year experience)
    • Health Service (Industry, up to 1 year experience)
    • Research (Industry, up to 1 year experience)
    • Food (Industry, up to 1 year experience)
    • Medical Technology (up to 1 year experience)
    • Wholesaling (Industry, up to 1 year experience)
    • Science (Research, up to 1 year experience)
    • Banking and Financial Services (Industry, up to 1 year experience)

