English:
Upper Intermediate
Guatemala
UTC -06:00
America/Guatemala
Analytics and Cloud Architect with over 10 years of experience
I am an Analytics and Cloud Architect with over 10 years of experience, specializing in data fields such as Predictive Analysis, Data Engineering, Business Intelligence, Storytelling, Data Architectures, Cloud Infrastructure Design & Administration, Software Development, Software Deployments, and Forensic Analysis.
Expertise
Years of commercial development experience
10 years of experience
Core technologies
Other technologies
Project Highlights
FPT Latin America
FPT Latin America is part of FPT Corporation, a leading global technology and IT services & solutions provider headquartered in Vietnam, with nearly US$1.6 billion in revenue and 41,000 employees in 29 countries.
Responsibilities & achievements
Create Infrastructure as Code with Terraform across several AWS accounts, replicating modules with Terraformer. Use Puppet to define and automate infrastructure provisioning and configuration. Manage configuration files and templates to ensure consistent changes across different environments. Integrate automated CI/CD pipelines to ensure that the infrastructure is deployed automatically.
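As a rough illustration of the multi-account consistency checks described above (not the actual Terraform or Puppet code), the Python sketch below assumes a read-only cross-account role in each environment and compares the tags of one Terraform-managed bucket against production. The account IDs, role name and bucket name are hypothetical placeholders.

```python
# Illustrative drift check, assuming a shared read-only role exists in each account.
import boto3

ACCOUNTS = {"dev": "111111111111", "qa": "222222222222", "prod": "333333333333"}
ROLE_NAME = "terraform-readonly"        # hypothetical cross-account role
BUCKET = "shared-config-bucket"         # hypothetical Terraform-managed bucket

def session_for(account_id: str) -> boto3.Session:
    """Return a boto3 session with temporary credentials for one account."""
    creds = boto3.client("sts").assume_role(
        RoleArn=f"arn:aws:iam::{account_id}:role/{ROLE_NAME}",
        RoleSessionName="iac-drift-check",
    )["Credentials"]
    return boto3.Session(
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

def bucket_tags(session: boto3.Session) -> dict:
    """Fetch the bucket's tag set so environments can be compared."""
    tags = session.client("s3").get_bucket_tagging(Bucket=BUCKET)["TagSet"]
    return {t["Key"]: t["Value"] for t in tags}

if __name__ == "__main__":
    seen = {env: bucket_tags(session_for(acct)) for env, acct in ACCOUNTS.items()}
    baseline = seen["prod"]
    for env, tags in seen.items():
        drift = {k: v for k, v in tags.items() if baseline.get(k) != v}
        status = "in sync" if not drift else f"drift detected: {drift}"
        print(f"{env}: {status}")
```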
NexaBanco
The first neobank in Guatemala. Our purpose is to make your life easier: with us you can manage your money without visiting a branch. Supervised by the Superintendencia de Bancos de Guatemala (SIB), Nexa complies with the same regulations as traditional banks but with state-of-the-art technology.
Responsibilities & achievements
• Principal Consultant in Artificial Intelligence and Data Engineering for the first 100% digital bank in Guatemala, on projects related to Data Engineering and Machine Learning.
• Design, development and implementation of the bank's Lake House on AWS Databricks.
• Creation and maintenance of various data pipelines from multiple data sources.
• Development and implementation of a Machine Learning model to detect account holders who transfer money between their Nexa accounts and to or from other banks.
Key Technologies: AWS (Lambda, RDS, RedShift, Glue, Athena & S3), Python/PySpark (NumPy, Pandas, SciKit-Learn, Keras & Matplotlib) & SQL.
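The following is a minimal, self-contained sketch of the kind of transfer-pattern model mentioned above. It runs on synthetic data with illustrative column names and a placeholder labeling rule; it is not the bank's production model.

```python
# Minimal sketch: per-account transfer features feeding a simple classifier.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5_000
transfers = pd.DataFrame({
    "account_id": rng.integers(0, 500, n),
    "amount": rng.exponential(300.0, n),
    "is_interbank": rng.integers(0, 2, n),      # 1 = to/from another bank
    "is_own_account": rng.integers(0, 2, n),    # 1 = between own Nexa accounts
})

# Aggregate raw transfers into simple per-account behaviour features.
features = transfers.groupby("account_id").agg(
    n_transfers=("amount", "size"),
    total_amount=("amount", "sum"),
    share_interbank=("is_interbank", "mean"),
    share_own_accounts=("is_own_account", "mean"),
)

# Placeholder label standing in for real investigation outcomes.
labels = ((features["share_interbank"] > 0.5) &
          (features["share_own_accounts"] > 0.5)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=42)
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```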
Synergia.ai
A boutique Spark, Scala, Python and Databricks shop specializing in Data Engineering and Data Science.
Responsibilities & achievements
• Principal Data Engineer as an external consultant specialized in Data Engineering, for a 100% digital bank in Guatemala City.
• Generate and maintain datasets that align with business needs.
• Design and implement the Data Lake Architecture.
• Build and maintain Big Data (Lake House on AWS) ETL procedures.
• Develop algorithms to transform data into useful, actionable information.
• Build, test, and maintain database pipeline architectures.
• Collaborate with management to understand company objectives.
• Create new data validation methods and data analysis tools.
• Ensure compliance with data governance and security policies.
Key Technologies: AWS (S3, Glue, RedShift & DMS), Databricks, Python, PySpark, SQL & Scala.
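A hedged sketch of one Lake House ETL step of the sort listed above (raw zone to curated zone) in PySpark; the S3 paths, schema and column names are assumptions made for the example, not the bank's actual data.

```python
# Illustrative bronze -> silver cleaning step for a transactions feed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver_transactions").getOrCreate()

raw = spark.read.json("s3://example-lake/bronze/transactions/")  # hypothetical path

clean = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("amount").isNotNull())
)

# Partition by date so downstream Athena / Redshift Spectrum queries prune cheaply.
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-lake/silver/transactions/"))
```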
Millicom (Tigo)
Millicom (NASDAQ U.S.: TIGO, Nasdaq Stockholm: TIGO_SDB) is a leading provider of fixed and mobile telecommunications services in Latin America.
Responsibilities & achievements
• Lead the migration of the company's regional operation to AWS, primarily the Data Warehouses.
• Lead the implementation of the Data Mesh in AWS for 50 AWS accounts across 8 countries.
• Administrate the different cloud environments.
• Collaborate with teams in other parts of the world to complete analytical models.
• Propose and supervise POCs for various projects.
• Developed and implemented a data backup and recovery system, increasing data security and reducing downtime in the event of a system failure.
• Developed a data ingestion process to automate the transfer of data from multiple sources into a single database, resulting in a X% reduction in manual data entry.
• Developed a system for large-scale data processing able to process petabytes of data in a few hours.
• Guide development teams through cloud migration with good practices and technical examples.
• Regional Director of the Data Modeling Department for Millicom Latin America (Central America, Colombia, Bolivia, and Paraguay).
• Create Data Modeling strategies, especially regarding product development and decision-making.
• Monitor teams and manage data models to develop, implement, and complete projects while ensuring a high level of data quality based on regulatory standards.
• Create visual representations of information systems to communicate connections between data points and structures.
• Define and lead Conceptual, Logical and Physical Data Models.
• Identify entities and map attributes to them completely.
• Assign keys as needed, and decide on a degree of normalization that balances the need to reduce redundancy with performance requirements.
• Implement and validate data models.
Key Technologies: AWS (EC2, Lambda, VPC, Control Tower, SageMaker Studio, RDS, RedShift, Glue, Athena, S3, AWS Lake Formation, Amazon Kinesis & CloudWatch), Linux, Python, PySpark, SQL & Terraform.
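One small piece of a multi-account data mesh like the one described above can be sketched with Lake Formation cross-account grants: the producer account grants a consumer account read access to a governed Glue table. The account ID, region, database and table names below are placeholders, not Millicom resources.

```python
# Hedged sketch of a producer-side Lake Formation grant for data-mesh sharing.
import boto3

lf = boto3.client("lakeformation", region_name="us-east-1")  # region is a placeholder

CONSUMER_ACCOUNT = "999999999999"            # hypothetical consumer account
DATABASE, TABLE = "sales_domain", "daily_revenue"

lf.grant_permissions(
    Principal={"DataLakePrincipalIdentifier": CONSUMER_ACCOUNT},
    Resource={"Table": {"DatabaseName": DATABASE, "Name": TABLE}},
    Permissions=["SELECT", "DESCRIBE"],
    PermissionsWithGrantOption=["SELECT"],   # lets the consumer re-share internally
)
print(f"Granted SELECT on {DATABASE}.{TABLE} to account {CONSUMER_ACCOUNT}")
```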
3Pillar Global
3Pillar Global builds breakthrough software products that power digital businesses. 3Pillar is an innovative product development partner whose solutions drive rapid revenue, market share, and customer growth for industry leaders in Software and SaaS, Media and Publishing and Information Services. Leveraging a lean and agile approach, 3Pillar delivers value-generating, digital solutions with specialized product strategy and management, user experience design, as well as software and data engineering expertise across mobile, cloud, and disruptive technologies.
Responsibilities & achievements
Technical leader for two data teams, one for a retail project and the other for an IoT-related project. Creation of various Python scripts to extract data from multiple sources (APIs, non-relational databases and relational databases). Creation and maintenance of various data pipelines. Define task complexity during Scrum sessions. Assign tasks to other engineers and, if necessary, support and guide them to complete these tasks. Supported engineering projects by maintaining accurate records of all project data. Engage in non-technical sessions with clients to explain risks and new proposals. Key Technologies: AWS (RDS, DynamoDB, Lambda, S3 & SageMaker Studio), Docker, Python, SQL, Hadoop, GCP (BigQuery & Buckets), Talend & Power BI.
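A brief sketch of the multi-source extraction work mentioned above, combining a REST API pull with a DynamoDB scan into pandas DataFrames; the endpoint URL and table name are hypothetical stand-ins, not the client's actual systems.

```python
# Illustrative multi-source extractor: one REST API and one DynamoDB table.
import boto3
import pandas as pd
import requests

def pull_api(url: str) -> pd.DataFrame:
    """Fetch a JSON list from a REST endpoint into a DataFrame."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return pd.DataFrame(resp.json())

def pull_dynamo(table_name: str) -> pd.DataFrame:
    """Scan a (small) DynamoDB table, following pagination."""
    table = boto3.resource("dynamodb").Table(table_name)
    items, kwargs = [], {}
    while True:
        page = table.scan(**kwargs)
        items.extend(page["Items"])
        if "LastEvaluatedKey" not in page:
            return pd.DataFrame(items)
        kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]

if __name__ == "__main__":
    orders = pull_api("https://api.example.com/v1/orders")   # hypothetical API
    devices = pull_dynamo("iot-device-readings")              # hypothetical table
    print(len(orders), "orders,", len(devices), "device readings extracted")
```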
NEXA Banco
Nexa is the modern solution for effortlessly managing your day-to-day financial life with products and solutions that were designed to give you the best experience.
Responsibilities & achievements
• Principal Data Engineer as an external consultant specialized in Data Engineering, for a 100% digital bank in Guatemala City.
• Generate and maintain datasets that align with business needs.
• Design and implement the Data Lake Architecture.
• Build and maintain Big Data (Lake House on AWS) ETL procedures.
• Develop algorithms to transform data into useful, actionable information.
• Build, test, and maintain database pipeline architectures.
• Collaborate with management to understand company objectives.
• Create new data validation methods and data analysis tools.
• Ensure compliance with data governance and security policies.
Key Technologies: AWS (S3, Glue, RedShift & DMS), Databricks, Python, PySpark, SQL & Scala.
EY Project
EY is a global leader in audit, tax, transaction advisory and consulting services. The quality analysis and services we provide help build confidence in capital markets and economies around the world.
Responsibilities & achievements
Technical Lead for the EY Crypto Wallet Data project. Design and implementation of the Data Lake Architecture. Design and implementation of a Lake House in Azure Databricks for transactions carried out with three different families of cryptocurrencies. Enhancement of the data collection process. Processing, cleansing & verification of data. Creation and maintenance of various data pipelines. Key Technologies: Azure (Databricks, Data Factory, SQL Server, Storage Account & Azure Functions), Azure DevOps, Python, Spark/PySpark & SQL.
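To illustrate the kind of transformation such a Lake House might run, here is a hedged PySpark sketch that aggregates daily volumes per cryptocurrency family; the storage paths and column names are assumptions for the example, not the actual EY datasets, and the Delta write assumes a Databricks/Delta runtime.

```python
# Illustrative silver -> gold aggregation: daily volume per cryptocurrency family.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("crypto_daily_volume").getOrCreate()

tx = spark.read.parquet(
    "abfss://lake@examplestorage.dfs.core.windows.net/silver/transactions/"
)

daily = (
    tx.withColumn("tx_date", F.to_date("block_timestamp"))
      .groupBy("currency_family", "tx_date")        # e.g. BTC, ETH, stablecoins
      .agg(F.count(F.lit(1)).alias("tx_count"),
           F.sum("amount_usd").alias("volume_usd"))
)

(daily.write
      .mode("overwrite")
      .partitionBy("currency_family")
      .format("delta")
      .save("abfss://lake@examplestorage.dfs.core.windows.net/gold/daily_volume/"))
```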
Moveapps
IT Staffing and Software Development Services. Achieve your business goals in less time, with our digital solutions and expert technology profiles.
Responsibilities & achievements
• Freelance DevOps for a Chilean retailer.
• Create and maintain CI/CD pipelines with Azure DevOps for different microservices running in EKS, and maintain all IaC with Terraform across several AWS accounts.
• Manage configuration files and templates to ensure consistent changes across different environments.
• Make use of microservices to build flexible and scalable applications.
• Deploy and maintain build environments.
• Create and deploy cloud resources.
• Administrate application and IaC CI/CD pipelines.
Key Technologies: AWS, Kubernetes, Terraform and Azure DevOps.
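A possible post-deployment gate for pipelines like these, sketched with the official Kubernetes Python client: it checks that each microservice's Deployment reports all replicas ready before the pipeline stage passes. The namespace and service names are placeholders, not the retailer's real ones.

```python
# Hedged rollout check a CI/CD stage might run against the EKS cluster.
from kubernetes import client, config

NAMESPACE = "retail-services"                    # hypothetical namespace
SERVICES = ["catalog-api", "cart-api", "checkout-api"]

config.load_kube_config()                        # or load_incluster_config() in CI
apps = client.AppsV1Api()

failed = []
for name in SERVICES:
    dep = apps.read_namespaced_deployment(name=name, namespace=NAMESPACE)
    desired = dep.spec.replicas or 0
    ready = dep.status.ready_replicas or 0
    print(f"{name}: {ready}/{desired} replicas ready")
    if ready < desired:
        failed.append(name)

if failed:
    raise SystemExit(f"Rollout incomplete for: {', '.join(failed)}")
```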
Caylent
Caylent is a cloud native services company that helps organizations bring the best out of their people and technology using AWS. We are living in a software-defined world where technology is at the core of every business.
Responsibilities & achievements
• Cloud Engineer Contractor for a medical company in the US.
• Design and deploy applications to the cloud.
• Identify and use appropriate cloud services to support applications in the cloud.
• Create monitors, metrics, dashboards, and other useful forms of monitoring.
• Troubleshoot pipeline runs and errors during deployments.
• Monitor the usage of cloud services and implement cost-saving strategies.
Key Technologies: AWS (EKS, Fargate, Lambda, Glue & CloudWatch Logs), Kubernetes, Grafana, Terraform, Datadog & Serverless Framework.
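In the spirit of the cost-saving work above, the sketch below uses AWS Cost Explorer to list last month's top-spending services so they can be reviewed first. It refers to no specific client account and assumes Cost Explorer is enabled.

```python
# Illustrative cost-visibility helper: last month's spend per AWS service.
from datetime import date, timedelta

import boto3

today = date.today()
first_of_month = today.replace(day=1)
prev_month_start = (first_of_month - timedelta(days=1)).replace(day=1)

ce = boto3.client("ce")
resp = ce.get_cost_and_usage(
    TimePeriod={"Start": prev_month_start.isoformat(),
                "End": first_of_month.isoformat()},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

groups = resp["ResultsByTime"][0]["Groups"]
groups.sort(key=lambda g: float(g["Metrics"]["UnblendedCost"]["Amount"]), reverse=True)
for group in groups[:10]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    print(f"{service:<40} ${amount:,.2f}")
```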
Conduent
Conduent delivers digital business solutions and services spanning the commercial, government and transportation spectrum, creating exceptional outcomes for its clients and the millions of people who count on them. The company leverages cloud computing, artificial intelligence, machine learning, automation and advanced analytics to deliver mission-critical services.
Responsibilities & achievements
DevOps Engineer for UPS. Create and maintain all CI/CD pipelines for an entirely serverless backend written in .NET Core and C#, with over 400 Azure Functions and 200 Queues. Administrate cloud components with IaC. Propose POCs for further architectures in modernization projects. Design and test different cloud architectures for different applications. Create and deliver development environments to developers. Troubleshoot production problems and mitigate them. Administrate cloud environments. Troubleshoot and fix code bugs. Key Technologies: Azure (Azure Functions, Queues, App Service, Cosmos DB, API Gateway, Azure SQL & Azure Databricks), Terraform, .NET Core & SQL Server.
ITS
ITS InfoCom is a world-class multinational company that designs and delivers integrated Information and Communication Technology solutions to enhance our clients' activities and business.
Responsibilities & achievements
• Security Analyst designated for the BAM/Bancolombia project.
• Analyze, mitigate, transfer and document software vulnerabilities.
• Apply software patches and updates to maintain up-to-date security standards.
• Investigate data and security leaks.
Key Technologies: Nessus, WSUS, Linux, Windows Server, SQL Server & Nagios.
Seguros Universales, S.A.
Third largest company in the Guatemalan insurance market. Leader in service innovation, ISO 9001 certified in all its value processes, and recently recognised as one of the TOP BRANDS in Guatemala, the only company in the insurance market to receive this recognition.
Responsibilities & achievements
• Database Administrator for the Guatemalan insurance company Seguros Universales.
• Assist with schema design, code review and SQL query tuning.
• Install, tune, implement and upgrade DBMS installations.
• Write and deploy SQL patches.
• Upgrade and improve application schemas and data.
• Proactively and regularly make recommendations for system improvements.
• Collate, prepare, and present statistical information for internal and external use.
Key Technologies: Oracle 11g/12c, Linux, Hadoop, Bash Scripting, Java, SQL, PL/SQL & Python.
Education
Higher education in Computer Science
Agency
10-50
GMT-5
Lima, Peru
Core Expertise
Industries
Architecture & Design, E-Commerce & Retail, Information Services & Technologies, Construction & Real Estate, Data Science & Machine Learning, Branding, Design & Web Development