Abdoulaye DIALLO

Tech Lead Data | AWS (2x) and Snowflake Certified

About Me

I am a Tech Lead Data with extensive experience in data engineering, cloud technologies, and team leadership. AWS (2x) and Snowflake certified, I specialize in data integration, migration, and building scalable data architectures. With a strong background in data engineering, DevOps, and data analysis, I lead technical teams in modernizing data infrastructures, deploying robust solutions, and ensuring best practices across the data lifecycle. I thrive in collaborative environments and have a proven track record of delivering efficient, high-quality solutions for leading companies across various industries.

Experience

CANAL+ TECH - Tech Lead Data (10/2025 - Present)

Team: Data Delivery - Engineering & AI

The Data Delivery team centralizes all data from set-top boxes, applications, and partners such as Orange and Free. It processes logs (content sessions), navigation data, and makes this information available for various internal uses including user navigation, content management, and video handling.

  • Lead the team's technological direction through tool and architecture selection, ensuring alignment with business needs and global strategy.
  • Maintain continuous monitoring of new technologies and industry best practices to anticipate evolution.
  • Support technical teams in skill development and facilitate collaboration between developers and stakeholders (PO/PM).
  • Guarantee the quality of delivered solutions.

Technologies: Snowflake, AWS, Git, Airflow, CI/CD Tools, dbt

KEYRUS - Cloud Data Engineer (03/2025 - 10/2025)

Client: Generali France – Team Data Analytics & Cognitive Factory

The Data Analytics & Cognitive Factory team at Generali France works across all group data (France), focusing on 360° customer view, infrastructure, and data governance to meet business needs across different divisions. This mission is part of Generali's data strategy aimed at business automation, data access standardization, time-to-market acceleration, innovation, and data quality improvement.

  • Designed and implemented data products on Snowflake and Azure.
  • Developed the data catalog and lineage.
  • Managed metadata (strategic for Gen-AI).
  • Deployed tools promoting business autonomy.
  • Designed and deployed the data domain architecture (data mesh), addressing regulatory, operational, and performance challenges.

Technologies: Snowflake, Python, SQL, Azure, Git, dbt, Airflow, Internal Tools

ACCENTURE - Data Engineer (09/2024 - 03/2025)

Client: Allianz Technology France – Team Data Lakehouse

The Data Lakehouse team manages Allianz France's global data platform and provides data access to all company departments. I helped integrate data exchanged between Allianz and its partners in an Azure Storage environment, focusing on data integration, transformation, application modernization, and monitoring.

  • Parsed and transformed JSON/CSV/Tab data to Delta Lake format for optimized analytics.
  • Developed comprehensive unit and functional tests to ensure data quality and pipeline reliability.
  • Deployed applications using Helm on ArgoCD for automated and consistent deployments.
  • Modernized legacy projects: refactored code for better maintainability and migrated PostgreSQL authentication to a passwordless approach for enhanced security.
  • Implemented workflow supervision and monitoring using Grafana dashboards.
  • Collaborated in an agile environment with continuous integration via GitHub Actions.
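
As a stdlib-only illustration of the parse/normalize step, the sketch below turns JSON-lines and CSV inputs into a common list-of-records shape; on the project the normalized records were then written to Delta Lake via Spark, which is not shown here, and the column names are made-up examples.

```python
import csv
import io
import json

# Illustrative normalization of heterogeneous inputs (JSON lines, CSV)
# into one common record shape; field names are hypothetical.
def parse_json_lines(text):
    """One JSON object per line -> list of dicts."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

def parse_csv(text, delimiter=","):
    """Header-first CSV/Tab text -> list of dicts."""
    return list(csv.DictReader(io.StringIO(text), delimiter=delimiter))

json_input = '{"id": "1", "amount": "10"}\n{"id": "2", "amount": "5"}'
csv_input = "id,amount\n3,7"

# Both sources end up in the same tabular shape, ready for a Delta write.
records = parse_json_lines(json_input) + parse_csv(csv_input)
```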

Technologies: Python, SQL, Azure, ArgoCD, Docker, Git, GitHub Actions, Grafana

ACCENTURE - Cloud Engineer (01/2024 - 09/2024)

Client: Canal+ France – Team Infrastructure & Technology

The Data Architecture and Technologies team is responsible for developing and maintaining architecture projects for the Data department. As a cloud engineer and operational DevOps support on several strategic projects, I helped provide work environments for the Data teams and worked on deploying the Dataiku platform.

  • Automated deployments and ensured environment reproducibility (EC2, VPC, IAM, Roles, Backup).
  • Deployed Fleet Manager on AWS.
  • Set up 4 Dataiku nodes (Design, Deployer, Automation, Govern) from Fleet via Python API.
  • Performed a POC for Okta SSO and Snowflake SSO authentication integration.
  • Monitored and decommissioned Redshift.

Technologies: Terraform, Python, AWS, Git, Jenkins, Airflow, Dataiku, Snowflake

ACCENTURE - Data Engineer (10/2021 - 12/2023)

Client: Canal+ France – Team Data Factory

The Data Factory unit manages most of the customer data for Canal+ France and Canal+ International. I worked on modernizing their data warehouse, including the migration from Teradata data warehouses to the Snowflake cloud platform and the migration and implementation of new Informatica ETL/ELT pipelines.

  • Converted Teradata DDL to Snowflake DDL using Python + SQL.
  • Implemented Teradata to Snowflake migration project.
  • Created new pipelines and migrated existing ones in the Informatica/Airflow/Snowflake environment.
  • Optimized data architecture (Informatica, Snowflake).
  • Mentored junior developers on data warehousing best practices.
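
As an illustration of the DDL conversion step, the sketch below shows one way a Teradata-to-Snowflake translation can be scripted in Python; the type mappings and clause rules here are simplified examples, not the full mapping used on the project.

```python
import re

# Hypothetical, minimal Teradata -> Snowflake type mappings (not exhaustive;
# e.g. VARBYTE length is dropped here for brevity).
TYPE_MAP = {
    r"\bBYTEINT\b": "TINYINT",
    r"\bVARBYTE\s*\(\d+\)": "BINARY",
    r"\bCLOB\b": "VARCHAR",
    r"\bBLOB\b": "BINARY",
}

def convert_ddl(teradata_ddl: str) -> str:
    """Translate a Teradata CREATE TABLE statement into Snowflake syntax."""
    ddl = teradata_ddl
    # Drop Teradata-specific clauses that have no Snowflake equivalent.
    ddl = re.sub(r"CREATE\s+MULTISET\s+TABLE", "CREATE TABLE", ddl, flags=re.I)
    ddl = re.sub(r",?\s*NO\s+FALLBACK", "", ddl, flags=re.I)
    ddl = re.sub(r"PRIMARY\s+INDEX\s*\([^)]*\)", "", ddl, flags=re.I)
    # Map data types.
    for pattern, replacement in TYPE_MAP.items():
        ddl = re.sub(pattern, replacement, ddl, flags=re.I)
    return ddl.strip()

# Example on a made-up table definition.
snowflake_ddl = convert_ddl(
    "CREATE MULTISET TABLE sales.orders ,NO FALLBACK "
    "(order_id BYTEINT, note CLOB) PRIMARY INDEX (order_id)"
)
```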

Technologies: SQL, Python, Shell, Snowflake, Informatica, AWS, Git, Jenkins, Airflow

Allianz France – Data Scientist (09/2019 - 09/2021)

Unit: Distribution, Performance and Commercial Productivity

The Performance and Commercial Productivity department provides distribution networks with tools for managing sales and responds to ad hoc requests for studies and data from various departments.

  • Developed web scraping scripts in Python, created data processing and analysis pipelines (statistical exploration).
  • Calculated performance indicators and studied commercial performance.
  • Applied modeling and machine learning to prediction and clustering tasks.
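
As a toy illustration of the scraping-and-processing step, the stdlib-only sketch below extracts numeric fields from an HTML fragment; the real sites, selectors, and pipeline are not shown here, and the `price` class is a made-up example.

```python
from html.parser import HTMLParser

# Illustrative scraper: collect the text of elements tagged class="price"
# and compute a simple statistic over them.
class PriceExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self._capture = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if ("class", "price") in attrs:
            self._capture = True

    def handle_data(self, data):
        if self._capture:
            # French decimal comma -> float.
            self.prices.append(float(data.replace(",", ".")))
            self._capture = False

# Hypothetical page fragment standing in for a fetched page.
page = '<div><span class="price">12,50</span><span class="price">9,99</span></div>'
parser = PriceExtractor()
parser.feed(page)
mean_price = sum(parser.prices) / len(parser.prices)
```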

Technologies: Python, PySpark, MicroStrategy, Jira

Orange Finance Mobile, Dakar – Data Analyst (03/2018 - 08/2018)

Mission: Statistical analysis and predictive modeling

  • Studied the Orange Money customer portfolio: analyzed customer usage of the Orange Money service and produced typological segments.
  • Built, prepared, and analyzed customer data; developed a scoring model (regression).
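
As a minimal illustration of a regression-based scoring model, the sketch below fits ordinary least squares on a single made-up feature using only the standard library; the production model used real customer variables and a proper ML stack.

```python
from statistics import mean

def fit_ols(xs, ys):
    """Return (slope, intercept) minimizing squared error for one feature."""
    x_bar, y_bar = mean(xs), mean(ys)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
            sum((x - x_bar) ** 2 for x in xs)
    return slope, y_bar - slope * x_bar

# Hypothetical data: usage frequency vs. customer score.
usage = [1, 2, 3, 4, 5]       # e.g. transactions per month
score = [10, 20, 30, 40, 50]  # made-up scores for illustration

slope, intercept = fit_ols(usage, score)
predicted = slope * 6 + intercept  # score a new customer
```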

Education

École Nationale de la Statistique et de l’Analyse de l’Information (ENSAI)

Specialized Master's in Data Science for Customer Insight (2020 - 2021)

Université Paris 8, Île-de-France

Master's in Big Data & Data Mining (2018 - 2020)

Université de Bambey, Sénégal

Bachelor's in Statistics and Business Intelligence (2014 - 2017)

Skills

Certifications & Trainings

AWS Cloud Practitioner - CLF-C01

Entry-level certification validating AWS Cloud foundational knowledge.

AWS Solutions Architect - SAA-C03

Certification for designing distributed systems on AWS.

AWS Partner Technical Accreditation

Accreditation for AWS partner technical enablement.

Snowflake: SnowPro Core Certification

Professional certification for Snowflake data platform expertise.

Dataiku Advanced Designer Certification

Advanced certification for Dataiku platform proficiency.

Dataiku Developer Certification

Developer certification for building solutions on Dataiku.

Informatica IDMC

5+ foundational certifications and implementation training (see LinkedIn for details).

DevOps & Cloud Tools Training

Training in Helm, Docker, Terraform, and AWS Data Analytics tools.

Other Experiences

Active involvement in DataAgainstCovid19, an initiative organizing consolidated data on the epidemic.

Contact

Location: Île-de-France, France
Email: abl(!)aye0m@gmail.com
Phone: 0(sept)54491234

Find me on GitHub.