Fernando T.

Vetted by YouTeam

Colombia

UTC -05:00

America/Bogota

English:

Upper Intermediate

Systems engineer and public accountant with 8 years of experience in software development and related domains, including 5 years in roles spanning data engineering, data architecture, project management, and Agile processes.

Expertise

Years of commercial development experience

9 years of experience

Core technologies

Python 6 years
pandas 6 years
SQL 6 years
Docker 8 years
AWS 3 years

Other technologies

Apache
Azure
ETL
Spark
Jenkins
Hadoop

Project Highlights

Data Engineer

LHA COMMERCIALIZATION – LHA MIGRATION

Mar `23 - May `24

1 year

SoftServe

SoftServe is a premier IT consulting and digital services provider. We expand the horizon of new technologies to solve today's complex business challenges and achieve meaningful outcomes for our clients. Our boundless curiosity drives us to explore and reimagine the art of the possible. Clients confidently rely on SoftServe to architect and execute mature and innovative capabilities, such as digital engineering, data and analytics, cloud, and AI/ML.

Responsibilities & achievements

▪ Data architecture design: crafted and refined data architecture plans to ensure effective and efficient data handling. ▪ Implemented frontend/backend solutions using Docker, Dockerfiles, and Docker Compose. ▪ Cloud-to-cloud migrations: executed migrations of data pipelines between cloud platforms, using containers and Python for seamless transitions. ▪ Data architecture implementation: put the defined data architecture plans into action, including design and specification. ▪ Data pipeline migration: transitioned data pipelines to GCP and Azure cloud environments, using services such as BigQuery, Data Fusion, Composer, Cloud Functions, Power BI, GKE, and Python. ▪ Azure services: implemented solutions with Azure Databricks, Azure Data Factory, Azure Kubernetes Service, Azure Data Lake Gen2, PySpark, Spark, SQL, PostgreSQL, and Azure DevOps (ADO). ▪ Frontend development with technologies such as React and Poetry. ▪ Project management: planned and tracked progress with tools such as Jira. ▪ Data mapping and modeling layers. ▪ Data extraction, loading, and transformation using PySpark, including streaming jobs. ▪ Incremental and bulk data updates. ▪ Set up monitoring and alerting mechanisms. ▪ Developed build and deployment scripts. ▪ Product support and documentation maintenance.

Azure
PostgreSQL
Python
React.js
SQL
SQL Server
Databricks
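The incremental and bulk data updates mentioned above follow a standard upsert-vs-replace pattern. A minimal pandas sketch of that pattern (the table contents and key column here are hypothetical, and the project itself used PySpark, not pandas):

```python
import pandas as pd

def bulk_load(target: pd.DataFrame, source: pd.DataFrame) -> pd.DataFrame:
    """Bulk update: replace the target table wholesale with the source extract."""
    return source.copy()

def incremental_load(target: pd.DataFrame, source: pd.DataFrame, key: str) -> pd.DataFrame:
    """Incremental update (upsert): overwrite rows whose key appears in the
    source, append rows that are new, and keep untouched target rows."""
    untouched = target[~target[key].isin(source[key])]
    return pd.concat([untouched, source], ignore_index=True)

target = pd.DataFrame({"id": [1, 2, 3], "value": ["a", "b", "c"]})
source = pd.DataFrame({"id": [2, 4], "value": ["B", "d"]})

result = incremental_load(target, source, key="id")  # keeps 1 and 3, updates 2, adds 4
```

The same split applies at Spark scale: a bulk load overwrites a table or partition, while an incremental load merges on a business key.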
Data Architect

Stratio Datafabric

May `22 - Mar `23

10 months

SoftServe

Responsibilities & achievements

▪ Technical management of the teams. ▪ Increment plan definition: created a comprehensive plan for incremental project development and implementation. ▪ Risk identification and resolution. ▪ Negotiation and agreements: negotiated with the client and team members to establish project requirements and secure mutually agreed-upon terms. ▪ Team management: managed cross-functional teams of data scientists, data engineers, full-stack developers, data governance specialists, and business intelligence (BI) experts. ▪ Conflict resolution: addressed and resolved conflicts and issues arising during the project to maintain a harmonious working environment. ▪ Architecture component definition: identified and specified the essential architectural components for the project's success. ▪ Monitoring and logging of pipeline activities using Rocket's built-in monitoring features. ▪ Implementation of general software and data fabric solutions on the Stratio Data Fabric platform.

Azure
Java
Kubernetes
Python
Data Architect

Data Architect

Mar `21 - May `22

1 year

Globant

Projects in the telecommunications industry, in the AWS cloud, building various software solutions, and in GCP, performing on-premises migrations to the cloud using BigQuery as a data warehouse and Airflow to orchestrate the workflows.

Responsibilities & achievements

▪ DWH migration: transitioned the existing Data Warehouse (DWH) solution to new GCP (Google Cloud Platform) services, with a particular focus on BigQuery. ▪ DWH modeling and implementation: applied data-warehouse modeling best practices while implementing the DWH solution in BigQuery. ▪ Reverse engineering: analyzed the structure of the current DWH solution, including the business ETL (Extract, Transform, Load) processes, using tools such as Informatica PowerCenter and Netezza, and replicated the model in the new solution. ▪ ETL implementation: used Google Data Fusion for data ingestion and transformation, with Pub/Sub for ingestion and Python for transformations and metadata extraction. ▪ Workflow orchestration: used Google Composer to orchestrate and manage workflows within the new DWH solution, implementing the DAGs in Airflow. ▪ Design and modeling of the data solution in BigQuery (data warehousing). ▪ Migrated data from on-premises (Netezza) to GCP BigQuery.

API
ETL
Python
SQL
DynamoDB
Lambda
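The Airflow DAGs mentioned above boil down to running tasks in dependency order. A plain-Python toy that mimics that scheduling behavior with the standard library (this is an illustration, not actual Airflow code, and the task names are hypothetical):

```python
# Toy orchestrator: run tasks in topological (dependency) order, as an
# Airflow scheduler would for a DAG. Requires Python 3.9+ for graphlib.
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """tasks: name -> callable; deps: name -> set of upstream task names."""
    order, results = [], {}
    for name in TopologicalSorter(deps).static_order():  # upstream first
        results[name] = tasks[name]()
        order.append(name)
    return order, results

executed = []
tasks = {
    "extract":   lambda: executed.append("extract") or [1, 2, 3],
    "transform": lambda: executed.append("transform") or [2, 4, 6],
    "load":      lambda: executed.append("load") or "done",
}
deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}

order, results = run_pipeline(tasks, deps)  # extract -> transform -> load
```

In a real Airflow DAG the same dependency graph is declared with operators and `>>` chaining, and the scheduler, not your code, decides when each task runs.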
Data Engineer

Phoenix DWH 2.0 project

Mar `20 - Mar `21

1 year

Globant

Responsibilities & achievements

▪ Implemented the core infrastructure to migrate business processes from on-premises to AWS Cloud: processing jobs on AWS computing solutions, orchestrated with Step Functions, alongside components such as S3, Redshift, and Segment. ▪ Implemented software solutions in AWS for the analytics team using various AWS services. ▪ Support for data solutions, analytics, and reporting: assisted the telecommunications client's IT team by implementing data solutions with AWS analytics services, including Lambda, S3, CloudFormation, EMR (Elastic MapReduce), AWS Glue, the Glue Data Catalog, Lake Formation, Kinesis Firehose, RDS (Relational Database Service), SQS (Simple Queue Service), SNS (Simple Notification Service), Step Functions for workflow orchestration, Redshift, Athena, API Gateway, QuickSight, CloudWatch, DynamoDB, and other data tools such as Talend and Segment. ▪ ETL migration: migrated the current ETL (Extract, Transform, Load) solution from Talend to AWS, ensuring a seamless transition of data processing tasks.

API
AWS
ETL
DynamoDB
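A typical glue step in the Step Functions workflows described above is a small Lambda handler. A minimal sketch following the Lambda Python runtime's handler contract (the bucket and key names are hypothetical; real code would go on to read the object with boto3):

```python
import json
import urllib.parse

def handler(event, context=None):
    """Extract bucket/key pairs from an S3 put event and return a summary,
    the kind of step a Step Functions state machine chains together."""
    records = []
    for rec in event.get("Records", []):
        s3 = rec["s3"]
        # S3 event keys are URL-encoded, so decode before use.
        key = urllib.parse.unquote_plus(s3["object"]["key"])
        records.append({"bucket": s3["bucket"]["name"], "key": key})
    return {"statusCode": 200, "body": json.dumps(records)}

event = {"Records": [{"s3": {"bucket": {"name": "raw-zone"},
                             "object": {"key": "2021/03/data%20file.csv"}}}]}
out = handler(event)
```

Because the handler is a plain function of an event dict, it can be unit-tested locally without any AWS infrastructure.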
Big Data Developer

Big Data Developer

Dec `18 - Mar `20

1 year

Health insurance policies migration / Retirement system

At this company I worked in the accounting area and played several roles: backend, frontend, and big data developer.

Responsibilities & achievements

▪ Developed business-process optimization strategies for big data platforms, with a focus on the Cloudera ecosystem. ▪ As a Big Data Developer, implemented business processes on the Hadoop ecosystem (Hive, Hue, Impala, Spark, ZooKeeper, Sqoop, Oozie, HDFS, and more), running on a Cloudera distribution. ▪ Used Spark for distributed batch processing, primarily in Python and Scala.

Oracle
Python
Spark
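The distributed batch processing described above follows the partition-wise map/reduce pattern that Spark executes across a cluster. A plain-Python sketch of that pattern (illustrative only, with made-up data; the actual work used PySpark/Scala on Cloudera):

```python
# Each "partition" is aggregated independently (the map side), then the
# partial results are combined (the reduce side) -- the shape Spark
# distributes across executors.
from collections import Counter
from functools import reduce

def map_partition(rows):
    """Aggregate one partition locally: count records per status."""
    return Counter(status for _, status in rows)

def reduce_counters(a, b):
    """Merge two partial aggregates (Counter.update sums counts)."""
    a.update(b)
    return a

partitions = [
    [(1, "ok"), (2, "fail")],
    [(3, "ok"), (4, "ok")],
]
totals = reduce(reduce_counters, map(map_partition, partitions))
```

In PySpark the same logic would be an `rdd.mapPartitions(...)` followed by a reduce, or simply a `groupBy(...).count()` on a DataFrame.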
Full stack Developer

Full stack Developer

Dec `18 - Mar `20

1 year

Health insurance policies migration / Retirement system

Responsibilities & achievements

▪ Software development for retirement (jubilación) projects: developed retirement-system software for Colombian companies, actively participating in the design and coding of solutions tailored to client needs. ▪ Built a decision-rules engine using Drools, a rules engine for Java applications, to enable intelligent decision-making within the software. ▪ Implemented new functionality across the backend and frontend of an application built on a microservices architecture: backend in Java 8 with the Play Framework exposing web service methods, Oracle as the database, and RabbitMQ as the message broker; frontend in React.js with JavaScript, following the Redux pattern to manage user actions and flow; the entire stack deployed on AWS. ▪ Delivered solutions on projects in different areas. ▪ Upgraded the project to Play Framework 2.7 and added tracing libraries. ▪ Developed reports following the standard defined by Colombian government institutions.

AWS
JavaScript
React.js
Oracle database
Testing Framework
RabbitMQ
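The decision-rules engine mentioned above pairs a condition ("when") with an action ("then") for each rule. A toy Python illustration of that condition/action shape (this is not Drools, and the discount rule is a made-up example; Drools expresses the same idea in its own DRL syntax against Java facts):

```python
# Each rule has a condition over a "fact" (here a dict) and an action that
# mutates the fact when the condition holds -- the core when/then shape of
# a Drools rule.
rules = [
    {"name": "senior_discount",
     "when": lambda f: f["age"] >= 65,
     "then": lambda f: f.update(discount=0.2)},
    {"name": "default_discount",
     "when": lambda f: "discount" not in f,
     "then": lambda f: f.update(discount=0.0)},
]

def fire_rules(fact):
    """Evaluate each rule in order, firing its action when it matches."""
    for rule in rules:
        if rule["when"](fact):
            rule["then"](fact)
    return fact

fact = fire_rules({"age": 70})  # senior rule fires, default does not
```

Externalizing decisions into rules like this is what lets business logic change without touching the application code around it.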
Quality Control

Quality Control

Dec `17 - May `18

5 months

Boxalud

This is a company that develops software for Colombian health-oriented institutions. In this role, I worked as an automated test engineer.

Responsibilities & achievements

▪ Analyzed test scenario requirements, comprehensively assessing the expectations for the testing process. ▪ Implemented functional test automation with TestCafe and unit test automation with the xUnit framework. ▪ Developed unit tests for a web application built on the ASP.NET framework, with C++ for specific components, validating the application's functionality, reliability, and performance.

C#
Automation
JUnit
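The xUnit-family frameworks named above all share the same structure: a test case class whose methods each exercise one behavior of a unit under test. A minimal illustration using Python's built-in unittest (the billing helper is a hypothetical unit under test; the actual project used xUnit on .NET):

```python
import unittest

def apply_copay(total: float, rate: float) -> float:
    """Hypothetical billing helper standing in for the unit under test."""
    if not 0 <= rate <= 1:
        raise ValueError("rate must be between 0 and 1")
    return round(total * rate, 2)

class CopayTests(unittest.TestCase):
    """xUnit-style test case: each test method is independent."""
    def test_normal_rate(self):
        self.assertEqual(apply_copay(100.0, 0.15), 15.0)

    def test_invalid_rate_raises(self):
        with self.assertRaises(ValueError):
            apply_copay(100.0, 1.5)

# Run the suite programmatically and capture the result.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(CopayTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same two tests in C# xUnit would be `[Fact]` methods with `Assert.Equal` and `Assert.Throws<ArgumentException>`.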
Software Developer

SIEP Project

Mar `15 - Dec `17

3 years

OPE Open Process Empresarial

A company that specializes in consulting and implementing solutions for business process management. They focus on helping organizations improve efficiency, transparency, and collaboration through the optimization of their business processes.

Responsibilities & achievements

▪ Full-stack developer: implemented new functionality in the backend and frontend layers of a monolithic Java application. ▪ Developed ERP (Enterprise Resource Planning) solutions for projects in banking, the public sector, city halls, and government entities. ▪ Used the Vaadin framework for client interaction. ▪ Used the Spring framework for the backend, following the MVC (Model-View-Controller) pattern via spring-webmvc; spring-web to implement and expose web services; mybatis-spring with XML mappers to write complex SQL queries; and spring-jdbc and spring-tx to define the persistence and data-access layer, facilitating the creation of low-level services. All components were configured as bean definitions in declarative XML using Spring Core and Context. ▪ Used MyBatis to implement the ORM entities. ▪ Led a small development team.

Java
PostgreSQL
Vaadin
Spring Framework
MyBatis
Maven
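The MVC (Model-View-Controller) pattern used in the stack above separates data access, presentation, and request handling. A language-agnostic toy in Python showing the three roles and their wiring (illustrative only; the project itself used Spring MVC in Java, and the account data here is invented):

```python
class Model:
    """Owns the data and the rules for accessing it."""
    def __init__(self):
        self.accounts = {"ACC-1": 150.0}

    def balance(self, account_id):
        return self.accounts[account_id]

class View:
    """Owns the presentation: turns data into output."""
    @staticmethod
    def render(account_id, balance):
        return f"{account_id}: {balance:.2f}"

class Controller:
    """Handles a request by querying the model and delegating to the view."""
    def __init__(self, model, view):
        self.model, self.view = model, view

    def show_balance(self, account_id):
        return self.view.render(account_id, self.model.balance(account_id))

page = Controller(Model(), View()).show_balance("ACC-1")
```

In Spring MVC the controller is a `@Controller`-annotated class, the model is backed by the MyBatis mappers, and the view is a template the DispatcherServlet resolves; the division of responsibility is the same.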

Education

Higher education in Computer Science

Agency

Software development agency #3757

10-50

GMT-5

Lima, Peru

Core Expertise

Agile
Amazon EC2
Amazon S3
AngularJS
AWS
Azure
C#
Django
Elixir
ETL
Express.js
Flask
Google Analytics
Groovy
Hibernate
HTML5
Ionic
Java
JavaScript
jQuery
Kotlin
Kubernetes
Microsoft
Microsoft Dynamics CRM
MongoDB
.NET
Node.js
PHP
PostgreSQL
Python
QlikView
React.js
React Native
Ruby on Rails
Scala
Selenium
Spark
Spring
SQL
SQL Server
SSIS
Tableau
TypeScript
WordPress
Xamarin
Apache Tomcat
Bootstrap
CSS3
Git
Go
Golang
HTML
iOS
Mocha
Oracle database
Pentaho
Project Scheduling
Scrum
SQL Azure
SQL Programming
Unit Testing
Web Services
Sketch
User Experience Design
Angular 2x
Postman
Project management
Docker
DynamoDB
MariaDB
SQL query
InVision
Redux
Project Manager
Scrum Master
Maven
Spring Boot
Illustrator
Photoshop
Jest
Enzyme
Hadoop
Flutter
.NET Core
Figma
AWS Lambda
Firebase
Next.js
SEO
Power BI
AWS Glue
Pyspark
.NET Framework
Snowflake
SAP HANA

Industries

Architecture & Design, E-Commerce & Retail, Information services & Technologies, Construction & Real estate, Data Science & Machine Learning, Branding, design, web development
