English:
Upper Intermediate
Ulyana L.
Vetted by YouTeam
Belarus
UTC +03:00
Europe/Minsk
Software developer with 4 years of experience in the implementation of business applications
• Skilled in writing complex SQL queries and query performance tuning
• Experienced in writing stored procedures and functions
• Experience with the Azure and AWS cloud platforms
• Strong skills in Data Lake and Data Warehouse design and implementation
• Proficient in BI, ETL, data modelling, and data analysis
• Skilled in front-end development using the Angular framework
• Experienced in cloud environment configuration and monitoring
Expertise
Years of commercial development experience
4 years of experience
Core technologies
Other technologies
Project Highlights
Department of Big Data technologies
A DWH storing information about procurement of materials and equipment by various departments in different regions, used for reporting in Power BI. Data is integrated from various sources (Azure Data Lake, SAP ERP, Excel sheets, CSV files) using SSIS and stored in SQL Server.
Responsibilities & achievements
• Creation and configuration of ADF pipelines for data storage and processing
• Writing complex SQL queries and optimizing existing ones
• Transformation of data using stored procedures and functions
• Development of SSAS models
• Analysis of data sources, design and development of ETL jobs using SSIS
• Creation of Python scripts for processing data in Azure Blob Storage and Azure Data Lake
• Automation of data processing using Azure Data Factory
• Application production support, troubleshooting, fix-pack deployment, and customer support and training
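A minimal sketch of the kind of Python cleanup step such pipeline scripts might apply to raw files before loading them into SQL Server. All field names here are illustrative, and the Azure Blob Storage download/upload calls are omitted so the transformation logic stands alone:

```python
# Hypothetical transformation step for a procurement CSV export.
# The actual scripts read from Azure Blob Storage / Data Lake; here the
# input is an in-memory string so the logic is self-contained.
import csv
import io

def normalize_procurement_rows(raw_csv: str) -> list:
    """Parse a raw CSV export and normalize region and amount fields."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        rows.append({
            "region": row["region"].strip().upper(),   # e.g. " eu " -> "EU"
            "material": row["material"].strip(),
            "amount": round(float(row["amount"]), 2),  # normalize precision
        })
    return rows

sample = "region,material,amount\n eu ,steel,1200.456\nus,copper,300.1\n"
print(normalize_procurement_rows(sample))
```

In a real pipeline a step like this would typically run inside an ADF activity, with the cleaned rows handed to SSIS or a staging table.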
Solution for company trends analysis
Constructing pipelines for pulling and analyzing companies' data from different sources and writing it into a knowledge base. The knowledge base is then used for analytical queries and for building visualizations of company history, which help make business-critical decisions.
Responsibilities & achievements
• Development of a web application for data visualization using Angular, HTML, CSS, and d3.js
• Creation of data visualization prototypes using Power BI and QlikSense
• Demonstrating applications and prototypes to business users and the product manager
• Defect analysis and resolution, data issue troubleshooting
• Working with Docker and Kubernetes
• Manipulating data in an Elasticsearch cluster and writing queries
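To illustrate the Elasticsearch side of this work, here is the shape of an aggregation query one might send to the cluster to bucket a company's events by year for a history visualization. The field names (`company_id`, `event_date`) and the index layout are hypothetical, and the actual client call (`es.search(...)`) is omitted:

```python
# Sketch of an Elasticsearch query body for a company-history rollup.
# Only the query DSL structure is shown; no cluster connection is made.
def company_history_query(company_id: str) -> dict:
    return {
        "query": {"term": {"company_id": company_id}},  # filter to one company
        "aggs": {
            "events_per_year": {
                "date_histogram": {
                    "field": "event_date",
                    "calendar_interval": "year",  # one bucket per year
                }
            }
        },
        "size": 0,  # only aggregation buckets are needed, not hits
    }

print(company_history_query("acme-001"))
```

A front end such as the Angular/d3.js application described above would then render the returned yearly buckets as a timeline.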
Cloud DWH for a pharmaceutical company
A DWH storing information about medicines and pharmacy products across stores in different countries, used in the analytical platform Vendavo. Data is loaded from SAP ERP and Excel sheets using Pentaho DIS and stored in an Oracle DB.
Responsibilities & achievements
• Analysis of data sources
• Data modelling and DB development
• Writing complex SQL queries and optimizing existing ones
• Development of various data storage modules
• Designing data loading using Pentaho DIS
• Testing data building processes
• Providing deployment documentation for production maintenance engineers
Barcode report for a car tire manufacturer
A report of scanned barcodes aggregated by date for further use by the analytics department. All development was conducted on a cloud platform, Amazon Web Services. Data is extracted from DynamoDB, transformed into a report, and sent by a Lambda function using Simple Email Service (SES) and the Serverless framework. Lambda functions are written in Python 3.6.
Responsibilities & achievements
• Analysis of the data source
• Creating functions in AWS Lambda using the Serverless framework to upload data from DynamoDB and send the generated report via SES
• Creating a schedule to run Lambda functions through CloudWatch
• Log monitoring in CloudWatch
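The flow described above can be sketched as a small Lambda handler. This is an illustrative reconstruction, not the actual project code: the table schema and field names are hypothetical, and the boto3 DynamoDB and SES clients are passed in as plain callables so the aggregation logic is testable without AWS access:

```python
# Hypothetical sketch of a barcode-report Lambda.
# In production, dynamodb_scan would wrap table.scan()["Items"] and
# ses_send would wrap ses.send_email(...); here they are injected.
import collections

def build_barcode_report(items):
    """Aggregate scanned-barcode items into a per-date count report."""
    counts = collections.Counter(item["scan_date"] for item in items)
    return "\n".join(f"{date}: {count}" for date, count in sorted(counts.items()))

def handler(event, context, dynamodb_scan, ses_send):
    items = dynamodb_scan()                      # stand-in for a DynamoDB scan
    report = build_barcode_report(items)
    ses_send(subject="Daily barcode report", body=report)
    return {"statusCode": 200}

demo = [{"scan_date": "2024-05-01"}, {"scan_date": "2024-05-01"},
        {"scan_date": "2024-05-02"}]
print(build_barcode_report(demo))
```

A CloudWatch schedule rule, as listed above, would invoke such a handler daily.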
DWH for a financial company
Development of a DWH. Data is loaded from various sources (Oracle DB, MSSQL DB, Excel documents, web services) using Talend Data Integration; the final data is stored in an Oracle DB.
Responsibilities & achievements
• Analysis of customer requirements
• Participation in business analysis and technical design to develop requirements for ETL documents and specifications
• Data architecture design
• Writing complex SQL queries for data transformation
• Optimization of existing SQL queries
• Design and development of ETL processes using Talend Data Integration
• Product support and maintenance
• Data testing
• Working with documentation
DWH for a commercial bank
Environment: Greenplum DB, SAS Data Integration Studio, SAS Enterprise Guide, Aginity Workbench, PostgreSQL, Hadoop, Python, Linux
The DWH contains information about potential and existing clients of the bank, all transactions and cash flows on user accounts, and marketing and advertising campaigns. Data comes from databases such as Oracle and Hadoop, as well as Excel documents and CSV and XML files; it is processed using SAS Data Integration Studio and loaded into the Greenplum MPP DB.
Responsibilities & achievements
• Design and development of data extraction, transformation, and loading processes using SAS Data Integration Studio
• Participation in the implementation of corporate information and analytical systems
• Design and creation of data models for data warehouses and data marts
• Working with Big Data solutions (Hadoop) and MPP databases
• Writing complex SQL queries and optimizing existing ones
• Testing the processes of extracting and loading data into the storage
• Support for data building processes after release to production
• Development of new project documentation and user manuals
• Participation in daily meetings with the development and systems analysis teams
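As a flavor of the mart-level SQL involved in such a DWH, here is a rollup of monthly cash flow per client with a running total. The production system ran on Greenplum; sqlite3 is used here purely so the example is self-contained, and the table and column names are hypothetical:

```python
# Illustrative data-mart query: monthly totals plus a per-client running
# total via a window function. Not the actual bank schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (client_id INTEGER, amount REAL, tx_month TEXT);
    INSERT INTO transactions VALUES
        (1, 100.0, '2021-01'), (1, 50.0, '2021-01'),
        (1, 70.0,  '2021-02'), (2, 200.0, '2021-01');
""")

rows = conn.execute("""
    SELECT client_id, tx_month, monthly_total,
           SUM(monthly_total) OVER (
               PARTITION BY client_id ORDER BY tx_month
           ) AS running_total
    FROM (
        SELECT client_id, tx_month, SUM(amount) AS monthly_total
        FROM transactions
        GROUP BY client_id, tx_month
    )
    ORDER BY client_id, tx_month
""").fetchall()
print(rows)
```

On Greenplum the same shape of query benefits from the MPP layout, since the `PARTITION BY client_id` work distributes across segments.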
Education
Higher education in Computer Science
Agency
50-100
GMT+3
Minsk, Belarus
Core Expertise
Industries
Construction & Real estate, Internet & Telecom, Big Data, ERP