English:
Advanced
Mexico
UTC -06:00
America/Mexico_City
Senior Mexican engineer with 22+ years of experience as a Data Architect and ETL, DB, Oracle PL/SQL, and Snowflake developer.
Senior engineer with 22+ years of experience as a Data Architect and ETL, DB, Oracle PL/SQL, and Snowflake developer. Has developed data models, data pipelines, tasks, data/profile security matrices, and DWH and query performance tuning, with expertise in the brokerage, health, reimbursement, retail, automotive, insurance, food, billing, and finance industries. Enthusiastic and willing to do whatever it takes to complete tasks on time and to the highest standard. Ready to work with new teams, to share and gain knowledge, and to act on feedback from leads and teammates.
Want to hire this engineer?
Check if Edgar is available
Expertise
Years of commercial development experience
15 years of experience
Core technologies
Other technologies
Project Highlights
International Retail Brands
Analyzed the business with stakeholders and the technical team to identify an efficient way to move deltas and improve the performance of current insights; designed and developed solutions extracting data from Redshift, transforming it, and updating Redshift.
Responsibilities & achievements
• Python, AWS, and Redshift support.
• Analyzed multiple brands to normalize data sets aligned with the business.
• Designed, supported, and developed Airflow DAGs (a delta-load sketch follows below).
• Extracted brand data and managed its optimization and quality.
• Query optimization.
• Configured and tested DAGs.
• Designed and enhanced data models.
Skills:
• Strong analytical skills
• SQL (22 years of experience)
• Big data tools (Cloudera/Hadoop, Hive, Sqoop)
• AWS Redshift and Snowflake
• Development: Python, Airflow DAGs, PySpark
• Amazon Managed Workflows for Apache Airflow
• AWS S3
• Power BI reporting
Environment & tools: AWS platform, Office, S3, Redshift, Agile, Jira, SCRUM methodologies, Power BI, Python.
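For illustration, a minimal sketch of such a delta-load DAG — the connection ID, schedule, and table/column names here are hypothetical, not taken from the actual project:

```python
# Hypothetical sketch: connection ID, schedule, and table names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

# Incremental (delta) load: insert only rows newer than the target's
# high-water mark instead of reloading the full table.
DELTA_SQL = """
INSERT INTO analytics.brand_events
SELECT *
FROM staging.brand_events s
WHERE s.updated_at > (
    SELECT COALESCE(MAX(updated_at), '1970-01-01'::timestamp)
    FROM analytics.brand_events
);
"""

with DAG(
    dag_id="brand_events_delta_load",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    # Redshift speaks the PostgreSQL wire protocol, so the Postgres
    # operator works against a Redshift connection.
    load_deltas = PostgresOperator(
        task_id="load_deltas",
        postgres_conn_id="redshift_default",
        sql=DELTA_SQL,
    )
```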
DWH and Pipeline Support
International Beer company
Worked as a support ops engineer for the main application across all environments. Daily activities included reviewing operations and services on AWS EC2, RDS, and Elasticsearch servers, checking for issues and solving them proactively. Planned weekly activities to be accomplished by the end of the week whenever a maintenance window was scheduled, performing code deploys or RDS database patches as needed by the development team. Reviewed and solved application bugs, implemented fixes, and passed them to the QA team to test and include in the next production deploy sprint.
Responsibilities & achievements
• Checked health status on EC2 instances (CPU, memory, and storage usage), RDS databases, target groups on EC2, and load balancer response times.
• Claimed tickets related to bugs, diagnosed them, and implemented fixes as requested by the client.
• Migrated RDS to the latest version to resolve open bugs and issues from older versions.
• Improved the import data validation system so new application users could import information into the system.
• Implemented AWS CLI and Lambda scripts to automate daily reactive and proactive tasks (a sketch follows below).
• Created shell scripts to perform database migrations across all enterprise environments for specific clients.
• Created CloudWatch dashboards to keep monitoring the health status of EC2, Elasticsearch, RDS, and load balancer services.
• Implemented roles for new clients to use their own resources on AWS.
Environment & tools: Toad, Office, AWS (EC2, CloudWatch, RDS, Elasticsearch, Lambda, Load Balancer, target groups, auto scaling groups, AWS CLI, IAM, VPC, peering connections), MySQL, DataGrip, Java, PHP, Apache, Linux, shell scripting.
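A sketch of the kind of proactive check and CloudWatch alarm this implies, using boto3 — the threshold and alarm name are illustrative assumptions:

```python
# Hypothetical sketch: thresholds and the alarm name are illustrative.
import boto3

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

# Flag any instance whose status checks are not passing.
statuses = ec2.describe_instance_status(IncludeAllInstances=True)
for s in statuses["InstanceStatuses"]:
    if s["InstanceStatus"]["Status"] != "ok":
        print(f"Instance {s['InstanceId']} needs attention: {s['InstanceStatus']['Status']}")

# Alarm on sustained high CPU, mirroring the dashboards kept in CloudWatch.
cloudwatch.put_metric_alarm(
    AlarmName="ec2-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Statistic="Average",
    Period=300,
    EvaluationPeriods=3,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
)
```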
Tableau reports automation
Japan Car company
Created the full pipelines to extract data from internal systems feeding Tableau reports. Automated on-premise and manual data sources into the cloud on AWS. Designed and deployed pipelines for 15 domains, automating 40 Excel-based Tableau reports.
Responsibilities & achievements
• Data discovery, wrangling, analysis, cleaning, quality, and layout design.
• Configured and deployed S3 connections in IICS.
• Designed, developed, and deployed workflows/mappings in IICS, delivering into S3.
• Designed and developed the Snowflake data model covering 15 domains.
• Designed and developed data marts.
• Developed Snowpipe and data-model import procedures (a Snowpipe sketch follows below).
• Designed the ETL using IICS (Informatica Cloud) and modeled the DB in Snowflake.
Skills:
• Analytical skills
• SQL and NoSQL databases
• Tableau reporting
• Development: SQL and NoSQL, Snowflake, IICS, Tableau
Environment & tools: Informatica Intelligent Cloud Services (IICS), Snowflake, AWS S3, Tableau, ServiceNow, Agile, SCRUM methodology.
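A minimal sketch of the Snowpipe setup this implies, via the Snowflake Python connector — the stage, pipe, and table names are hypothetical:

```python
# Hypothetical sketch: credentials, stage, pipe, and table names are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    user="ETL_USER", password="...", account="my_account",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)

# Auto-ingest pipe: Snowflake loads new files as they land in the S3 stage
# that the IICS workflows deliver into.
conn.cursor().execute("""
    CREATE PIPE IF NOT EXISTS raw.domain_pipe
    AUTO_INGEST = TRUE
    AS
    COPY INTO raw.domain_events
    FROM @raw.s3_iics_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")
```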
Oracle SAP to Snowflake migration
IT Germany company
Worked as a DWH developer designing data models in Snowflake, creating tasks, scheduling jobs, and working with micro-partitions and virtual DWHs. In this first step of the SAP migration, we created the data pipeline to the BI tool (Tableau) and deployed dbt models for data streaming from Oracle.
Responsibilities & achievements
• Designed, planned, and deployed data models for 15 domains.
• Designed, tested, monitored, and deployed data marts.
• Developed Tableau reports for unit testing, UAT, and QA.
• Designed, developed, tested, and deployed dbt models (a minimal model is sketched below).
• Supported data models and dbt models in the ETL.
• Supported all SAP teams using AWS servers, along with AWS Premium Support in Americas time zones, for customers using demo servers for the SAP company.
Skills:
• Strong analytical skills on data sources
• Analyzing, designing, developing, and deploying dbt models
• Analyzing, designing, developing, and deploying Snowflake data models
• Development: Python, dbt, Snowflake, Tableau
Environment: AWS S3, Office, Jira, Snowflake, dbt, Oracle, PL/SQL, Tableau, ServiceNow, GitHub.
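For illustration, a minimal dbt Python model of the kind deployed here (dbt-snowflake runs these via Snowpark) — the upstream model name and column are hypothetical, not the project's real names:

```python
# Hypothetical sketch of a dbt Python model; names are illustrative.
def model(dbt, session):
    dbt.config(materialized="table")

    # Pull the staged Oracle extract and drop soft-deleted rows before it
    # feeds the Tableau-facing marts.
    orders = dbt.ref("stg_oracle_orders")
    return orders.filter(orders["status"] != "DELETED")
```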
Cohort Atlas
Health insights company, US remote
Designed, developed, and created data models.
Responsibilities & achievements
Activities:
• Developed the DynamoDB model.
• Created ETL (Glue jobs) to extract data from Snowflake.
• Designed models exposing data for consumption by a Lambda API (the access pattern is sketched below).
• Developed AWS Data Pipeline crawlers/jobs.
• Developed Elasticsearch jobs to extract data from Snowflake.
Environment: AWS, AWS S3, DynamoDB, Data Pipeline, Glue, CloudWatch, Python, PySpark, Snowflake, Lambda, Elasticsearch, Agile, SCRUM methodology.
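A sketch of the access pattern behind such a Lambda API — the table name, key schema, and query parameter are hypothetical:

```python
# Hypothetical sketch: table name, key schema, and event shape are illustrative.
import json

import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("cohort_atlas")

def handler(event, context):
    # The Glue jobs land one item per (cohort_id, member_id); the API reads
    # a whole cohort with a single key-condition query.
    cohort_id = event["queryStringParameters"]["cohort_id"]
    resp = table.query(KeyConditionExpression=Key("cohort_id").eq(cohort_id))
    return {"statusCode": 200, "body": json.dumps(resp["Items"], default=str)}
```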
Maximo specialist support
PC Tech US Company, remote
Tested and deployed solutions hand in hand with users.
Responsibilities & achievements
• Refined JIRA cards.
• Analyzed and developed scripts in Oracle PL/SQL (running such a script is sketched below).
• Deployed fixes for JIRA cards via GitHub.
Environment: Oracle, PL/SQL, Jira, Maximo.
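For illustration, a minimal sketch of running such a PL/SQL fix from Python — the DSN, procedure name, and ticket ID are hypothetical, not Maximo's real API:

```python
# Hypothetical sketch: DSN, procedure, and ticket ID are illustrative.
import cx_Oracle

conn = cx_Oracle.connect(user="maximo", password="...", dsn="host/service")
cur = conn.cursor()

# Run the fix attached to a JIRA card as an anonymous PL/SQL block,
# binding the ticket ID as a parameter.
cur.execute("""
    BEGIN
        maximo_fixes.apply_fix(p_ticket => :ticket);
    END;
""", ticket="MAX-123")
conn.commit()
```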
Product type
Meal US Company, remote
Pipeline migration from on-premise to a cloud architecture.
Responsibilities & achievements
Activities:
• Data analysis.
• Analyzed and developed the pipeline (the on-premise-to-cloud hop is sketched below).
• Designed and developed ETL jobs/processes.
• Designed the Snowflake DWH model.
• Designed the data mart.
Environment: SQL Server, Scrum, Jira, Snowflake, AWS, Lambda, Python, PySpark, Airflow.
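A sketch of the on-premise-to-cloud hop in PySpark — the JDBC URL, credentials, and bucket/table names are hypothetical, and it assumes the SQL Server JDBC driver is on the Spark classpath:

```python
# Hypothetical sketch: JDBC URL, credentials, and paths are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("onprem_to_cloud").getOrCreate()

# Read the on-premise SQL Server source over JDBC...
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://onprem-host:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "...")
    .load()
)

# ...and land it in S3 as Parquet, the staging area the Snowflake DWH
# model loads from.
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://my-bucket/staging/orders/"
)
```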
Education
Higher education in Computer Science
Agency
10-50
GMT-5
Lima, Peru
Core Expertise
Industries
Architecture & Design, E-Commerce & Retail, Information services & Technologies, Construction & Real estate, Data Science & Machine Learning, Branding, design, web development