David T.

Vetted by YouTeam

Mexico, UTC-06:00 (America/Mexico_City)

English: Advanced

Summary

As a Data Scientist and Data Engineer with over six years of experience, I am deeply passionate about my work and determined to reach every goal set before me. I consistently prioritize quality and demonstrate unwavering dedication to achieving optimal outcomes. Continuously improving my skills and staying up to date with emerging technologies is fundamental to my approach, and I take every opportunity to challenge myself with complex tasks. I commit my entire focus, effort, and energy to ensuring that every project I undertake is delivered with exceptional quality. This dedication has allowed me to contribute to various stages of data analysis and engineering at top-tier companies, including Dealership Performance 360, OLX Autos, Total Play, and TV Azteca.

Expertise

Years of commercial development experience: 8 years

Core technologies

Python 7 years
SQL 4 years
Tableau 4 years
Business Intelligence 6 years
pandas 2 years
PostgreSQL 2 years

Project Highlights

Data Scientist

TCS

Aug '23 - Apr '24

8 months

Tableau Dashboard Development - Gen AI Project - USAA Bank

Responsibilities & achievements

Tableau Dashboard Development:
• Developing interactive dashboards in Snowflake to visualize key performance indicators (KPIs) for a bank, such as transaction volumes, account balances, loan portfolios, and customer demographics.
• Reading and querying the Snowflake Data Lake to retrieve relevant data.
• Creating and optimizing views in Snowflake to streamline data access and improve query performance.
• Implementing security measures for access and content in Snowflake to ensure data integrity and confidentiality.

Project: Developing an AI-powered Fraud Detection System
• Researching and implementing state-of-the-art machine learning models using Python and TensorFlow/PyTorch to detect fraudulent activities in banking transactions, and deploying the models to generate realistic and novel outputs.

Project: USAA Bank
Responsibilities:
• Develop and maintain data pipelines and ETL processes using Python and Snowflake to ensure data quality, integrity, and availability.
• Collaborate with data scientists and analysts to understand data requirements and implement solutions to support their analytical needs.
• Design and implement data models and schemas in Snowflake to support business requirements and optimize query performance.
• Monitor and optimize the performance of data pipelines and ETL processes to ensure timely and efficient data delivery.
• Work with cross-functional teams to troubleshoot and resolve data-related issues in a timely manner.
• Stay up to date with emerging technologies and best practices in data engineering to continuously improve data processes and infrastructure.

Agile
Azure
ETL
Python
TensorFlow
PyTorch
Snowflake
Lead Data / SR BI Analyst

TV Azteca

Jan '22 - Aug '23

2 years

TV Azteca, TotalPlay, OLX

Project Description:
▪ Web scraping from competing companies.
▪ Utilizing Databricks and PySpark for data analysis and processing in e-commerce projects.
▪ Working with Oracle Database, a multi-model database management system, as well as in-memory, NoSQL, and MySQL databases.
▪ Name matching algorithm in Python.
▪ Data ingestion and processing.
▪ Creating Python scripts to automate data ingestion from different sources, such as Oracle Database, PostgreSQL, and NoSQL stores.
▪ Implementing large-scale data processing using Apache Spark on OCI.
▪ Developed and implemented ETL processes using SQL Server Integration Services (SSIS) to integrate data from multiple sources into a centralized data warehouse, ensuring data quality and consistency, and optimizing query performance by configuring and managing indexes and partitioning in on-premises MSSQL databases.
▪ Developed and implemented robust ETL pipelines using AWS services, particularly AWS Glue, to ingest, process, and consume data, ensuring data integration and validation processes are thoroughly executed to maintain data quality and reliability across the data infrastructure.
▪ Utilized C# for end-to-end data processing, including ingestion, cleaning, and transformation; applied statistical analysis and machine learning algorithms for enhanced insights.
▪ Developing dashboards with Power BI.
▪ Developing dashboards in Tableau.
▪ Agility Content Management System.
▪ Automating web scraping on Amazon AWS using cron and Firebase.
▪ Containerized web scraper with a CI/CD pipeline using AWS, Terraform, Boto3, and GitHub Actions.

Responsibilities & achievements

Project description:
• ETL, dashboards, and machine learning classification algorithms to classify user data.
• Completed tasks tracked in JIRA.
• Web app using Streamlit to analyze geographical data of Mexico City.
• Strong mathematical skills, including proficiency in calculus, probability theory, and statistical analysis.
• Solid knowledge of actuarial principles and techniques, including pricing, underwriting, and reserving.
• Experience in developing and implementing pricing and underwriting models for personal and commercial lines of business.
• Experience in analyzing data and making recommendations on product design and pricing.
• Proficiency in programming languages commonly used in actuarial work, such as SAS, R, or Python.
• Excellent analytical and problem-solving skills.
• Strong attention to detail and accuracy.
• Ability to manage multiple projects simultaneously and meet deadlines.
• Excellent communication and interpersonal skills.
• Experience in managing a team of data analysts.
• Ability to work collaboratively with a technology lead to implement ML models.

Project description:
▪ Collaborated in the maintenance of dashboards that allow the rest of the company's employees to orchestrate and retrieve reports.
▪ Migrated all dashboards from Power BI to Tableau.
▪ Kept a clean repository by creating a branch per feature and reviewing team code.
▪ Attended to data issue reports from the dashboards' users.

Agile
AWS
C#
Django
Flask
JIRA
Kubernetes
Oracle
PostgreSQL
Python
R
SSIS
Jenkins
NoSQL
Ansible
Bash
Docker
Snowflake
Databricks
Numpy
CI/CD
Data Analyst

Dealership Performance 360

Aug '21 - Jan '22

5 months

Data Analysis of Customer Sales

Responsibilities & achievements

▪ Reporting and analyzing sales data.
▪ Developing dashboards with Power BI.
▪ Completing tasks tracked in Trello.
▪ Keeping a clean repository by creating a branch per feature and reviewing team code.
▪ Analyzing use cases to ensure high quality and the best approach for the solution.
▪ Coding in multiple projects within a mono-repository.
▪ Test-driven development.

JIRA
Python
Trello
Data Analyst

Construcciones Viales y Urbanas

Nov '14 - Dec '18

4 years

CONVIURSA

Construction Company

Responsibilities & achievements

▪ Reporting and analyzing inventory data.
▪ Completing tasks tracked in Trello.
▪ Financial analysis of the company's income.
▪ Developing dashboards with Power BI.
▪ Forecasting income and revenue.
▪ Creating dashboards in Tableau to report the company's financial status.

Excel
Python
Tableau
pandas

Education

Higher education in Computer Science

Agency

Agency #2271

100-400

GMT-6

Mexico City, Mexico; Austin, United States; Cuernavaca, Mexico

Core Expertise

AngularJS
Django
Java
JavaScript
Kotlin
MongoDB
.NET
Node.js
Python
React.js
React Native
TypeScript
Vue.js

Industries

Banking & Finance, Internet & Telecom, Beauty & Personal Care, Big Data
