
José Y.

Vetted by YouTeam

Colombia

UTC-05:00 (America/Bogota)

English: Advanced

Advanced Python Data Engineer

Computer Engineer with more than 5 years of experience working with large consulting companies in Chile. He has always been fascinated by how data works and has consequently focused his professional path on Data Engineering roles, with Python and SQL Server, among other technologies, at the core of his tech stack. He has also taken on leadership roles, training and guiding junior engineers interested in data and making tasks easier for his co-workers to understand and execute.


Expertise

Years of commercial development experience

5 years

Core technologies

Python: 3 years
AWS: 2 years
Data Modeling: 4 years
PostgreSQL: 3 years
pandas: 2 years
SQL: 3 years

Other technologies

AWS
ETL
Oracle
Python
SAP
SQL Server
Pentaho
Postman
Google Cloud Platform
AWS Lambda
pandas
Apache Airflow
Power BI
AWS Glue
PySpark

Project Highlights

Advanced Data Engineer

Zenta Group

Sep '21 - Present

3 years

Zenta is an IT consulting company specializing in software development, with 15 years of innovation and experience in the digital market and more than 300 highly qualified professionals.

Responsibilities & achievements

● Development of ETL processes in AWS Glue, AWS Lambda, and GCP Cloud Functions, using the pandas and PySpark libraries (see the sketch below)
● Development and scheduling of big data workloads in Apache Airflow
● Development of streaming data flows in AWS Kinesis
● Management of data repositories in AWS S3 and GCP Cloud Storage
● Creation, optimization, and tuning of the company's data warehouse using AWS Redshift and GCP BigQuery
● Development of data pipelines using Databricks (Python/Scala) with Azure Data Factory
● Development of data pipelines with Scala in on-premise data environments with Hadoop (Hortonworks), using Parquet files, Hive, YARN, and Spark
● Participation in a data migration project from on-premise Teradata to AWS Redshift
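For illustration only, below is a minimal PySpark sketch of the kind of ETL job described above: read raw files from an S3 landing zone, apply light transformations, and write partitioned Parquet to a curated zone. It uses plain PySpark rather than the Glue-specific awsglue wrappers, and the bucket names, paths, and column names are hypothetical placeholders, not taken from the actual project.

```python
# Illustrative sketch only; buckets, paths, and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-etl-example").getOrCreate()

# Extract: read raw CSV files landed in an S3 "raw" zone.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-raw-zone/sales/")
)

# Transform: normalize column names, cast types, derive a partition column.
cleaned = (
    raw.withColumnRenamed("Sale Amount", "sale_amount")
       .withColumn("sale_amount", F.col("sale_amount").cast("double"))
       .withColumn("sale_date", F.to_date("sale_date", "yyyy-MM-dd"))
       .withColumn("sale_year", F.year("sale_date"))
       .dropna(subset=["sale_amount", "sale_date"])
)

# Load: write partitioned Parquet to a curated zone that a warehouse
# such as Redshift or BigQuery can read from.
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("sale_year")
    .parquet("s3://example-curated-zone/sales/")
)
```

The same transform logic could be wrapped in a Glue job script or scheduled as an Airflow task, in line with the workloads listed above.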

AWS
ETL
Google Cloud Platform
pandas
Lambda
Apache Airflow
AWS Glue
PySpark

Data Engineer

Forum

Oct '20 - Aug '21

10 months

Responsibilities & achievements

● Construction of ETL processes and workloads using Pentaho Data Integration 9.1
● Design, creation, tuning, and administration of SQL Server 2019 data marts in an on-premise environment
● Creation and publication of Microsoft Analysis Services (SSAS) analytical cubes with more than 10 million records
● Development of data-consuming APIs in Python 3.9, using Postman as the testing tool (see the sketch below)
● Creation of financial Power BI reports using dataflows
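As a rough illustration of the data-consuming APIs mentioned above, here is a minimal sketch assuming Flask and pyodbc against a SQL Server data mart; Flask, the connection string, table, and endpoint are hypothetical choices made for the example and are not confirmed by the profile.

```python
# Illustrative sketch only; connection details, table, and route are hypothetical.
import pyodbc
from flask import Flask, jsonify

app = Flask(__name__)

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server;DATABASE=FinanceDatamart;"
    "UID=api_user;PWD=example_password"
)

@app.route("/sales/<int:year>")
def sales_by_year(year):
    # Query the data mart and return the rows as JSON.
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT product, SUM(amount) AS total "
            "FROM dbo.FactSales WHERE sale_year = ? "
            "GROUP BY product",
            year,
        )
        rows = [
            {"product": r.product, "total": float(r.total)}
            for r in cursor.fetchall()
        ]
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=5000)
```

An endpoint like this can be exercised from Postman with a plain GET request, e.g. GET http://localhost:5000/sales/2021.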

ETL
Python
SQL Server
Pentaho
Postman

Senior Data Engineer

Ernst & Young (EY)

Nov '19 - Jun '20

7 months

EY provides consulting, assurance, tax, and transaction services that help solve its clients' toughest challenges and build a better working world for all.

Responsibilities & achievements

● Creation and scheduling of data extraction and calculation workloads using Alteryx Designer
● Development of stored procedures, functions, and indexes in on-premise SQL Server 2014 databases
● Presentations to the critical business team, introducing the new processes and tools of the company's data team
● Management and training of a team of junior Data Engineers

SQL Server

Business Intelligence Engineer

AFP Habitat

May '19 - Nov '20

2 years

AFP Habitat is a Chilean pension fund manager, created in 1981 and one of the two largest in the country by number of contributors.

Responsibilities & achievements

● Creation and analysis of new data marts for the company's Finance and Risk departments using the Kimball star-schema methodology
● Programming of stored procedures, functions, triggers, and analytical queries in Sybase (SAP IQ) databases
● Development of ETL processes in Pentaho Enterprise Edition
● Creation of QlikView, Qlik Sense, and Power BI reports

ETL
QlikView
SAP
Power BI
Qlik Sense

Business Intelligence Consultant

Banco Itaú

Feb '17 - May '19

2 years

Banco Itaú S.A. is a Brazilian bank headquartered in São Paulo.

Responsibilities & achievements

● Development of the bank's main data warehouse using Oracle 12c
● Creation of ETL processes and packages in Oracle Data Integrator (ODI)
● Automation of workloads using Control-M
● Participation in analysis and project requirements with the company's main Data Governance team

ETL
Oracle

Education

Higher education in Computer Science

Agency

Software development agency #2003

50-100

GMT-7

United States

Core Expertise

Apache
AWS
Azure
Cloud Engineer
Data Scientists
Java
JavaScript
Kubernetes
Linux
MySQL
Node.js
PHP
PostgreSQL
Python
React.js
React Native
Ruby
Ruby on Rails
TypeScript
DevOps
Go
Golang
Redis
Test Automation
Test Case Design
RESTful API
Docker
JavaSE
Vue.js
GraphQL
Google Cloud Platform
JavaScript MVC
GoLand
Next.js
Terraform
CloudFormation
Data Analyst
