
Deepak K.

  • India
  • 9 years of experience

Experienced data professional with 8 years of expertise in Data Vault 2.0, data warehousing, data integration, data visualization, SOA architecture, and BPM workflows. Proficient in leveraging Airflow for pipeline setup integrated with dbt and Snowflake. Successfully led migration projects from Sabre to Amadeus for airline systems and facilitated Snowflake cloud setup, including Oracle-to-Snowflake stored procedure migration and ODI integration. Skilled in implementing Data Vault 2.0 for agile data modeling, notably contributing to critical Investor Relationship reports for Airbnb's IPO initiative. Demonstrated proficiency in ETL pipeline development using Airflow and in creating entity-relationship diagrams and multidimensional data models for the business. Experienced in star-schema modeling, snowflake-schema modeling, and fact and dimension table implementation. Adept in SCD Type 2 and Type 3, CDC using Oracle GoldenGate, and ETL load balancing using Java EE agents on a clustered WebLogic server. Successfully implemented an ODI-BPM-OBIEE workflow utilizing SOA architecture. Proficient with issue and bug tracking tools such as JIRA and LeanKit, and with version control tools SVN and GitHub.

Skills and experiences

Data Engineer @ HawaiianAir

Project: IT Commercial Analytics Dec 2022

• Set up an Airflow and dbt pipeline for end-to-end processing (see the illustrative sketch below).

• Set up dbt with Snowflake for ETL processing to reduce processing cost, improve documentation, and improve data warehouse performance.

• Migrated data from the Sabre system to the Amadeus system.

• Implemented a complete Airflow pipeline connecting Hive and AWS S3.
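
A minimal, hypothetical sketch of the kind of Airflow-plus-dbt pipeline described above. The DAG id, script path, and dbt project/profile locations are placeholders, not the actual production configuration.

```python
# Minimal Airflow 2.x DAG sketch: land raw files, then run and test dbt models
# against Snowflake. All names and paths below are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="commercial_analytics_dbt",   # hypothetical DAG name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage raw extracts (e.g. from S3) before transformation.
    land_raw = BashOperator(
        task_id="land_raw_files",
        bash_command="python /opt/pipelines/land_raw.py",  # placeholder script
    )

    # dbt performs the ELT transformations inside Snowflake.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/commercial_analytics "
                     "--profiles-dir /opt/dbt/profiles",
    )

    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/commercial_analytics "
                     "--profiles-dir /opt/dbt/profiles",
    )

    land_raw >> dbt_run >> dbt_test
```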

Data Engineer @ WWT (World Wide Technology)

Project: IT Data Insights and Analytics Jul 2021 To Dec 2022

• Implementation of Data Vault 2.0 in Snowflake and Oracle DB.

• Designed the ODI framework for the data vault.

• Implemented data pipelines into Snowflake from different source systems using Airflow.

• Worked on different artifacts of Data Vault 2.0.

• Set up Snowpipe and implemented various Snowflake components (see the illustrative sketch below).

• Designed Tableau reports.

• Migrated the Data Vault from Oracle to Snowflake.
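
A hedged sketch of the Snowpipe setup mentioned above, using the Snowflake Python connector. The account, stage, pipe, table, and storage-integration names are hypothetical placeholders.

```python
# Sketch of a Snowpipe setup via the Snowflake Python connector.
# All identifiers and the S3 URL below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_user",           # placeholder
    password="***",            # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="RAW",
    schema="STAGING",
)
cur = conn.cursor()

# External stage pointing at the S3 landing location.
cur.execute("""
    CREATE STAGE IF NOT EXISTS raw_s3_stage
      URL = 's3://example-bucket/landing/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = 'JSON')
""")

# Auto-ingest pipe: Snowpipe copies new files into the raw table as they arrive.
cur.execute("""
    CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
      COPY INTO RAW.STAGING.RAW_EVENTS
      FROM @raw_s3_stage
      FILE_FORMAT = (TYPE = 'JSON')
""")

cur.close()
conn.close()
```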

Data Engineer @ Airbnb

Project: Investor Relationship Reporting, Financial Accounting Hub Apr 2020 To Jul 2021

• Implementation of Data Vault in Snowflake (see the hub-load sketch below).

• Airflow ETL pipeline design and development.

• Built critical Tableau reporting, such as the Investor Relationship reports for the IPO initiative.

• Maintained the production pipeline.

• Handled L1-L3 changes.

• AWS S3 to Oracle FAH transformations.

• Ingested data from multiple sources into the Financial Accounting Hub.

• Developed Tableau reports.
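
A minimal sketch of a Data Vault hub load in Snowflake like the one referenced above. The schema, table, and column names (HUB_INVESTOR, STG_INVESTOR, investor_id) are hypothetical, and the insert-only hashed-key pattern follows the general Data Vault 2.0 convention rather than the project's actual code.

```python
# Sketch of a Data Vault 2.0 hub load into Snowflake.
# Table and column names are hypothetical placeholders.
import snowflake.connector

HUB_LOAD_SQL = """
MERGE INTO DV.RAW_VAULT.HUB_INVESTOR AS hub
USING (
    SELECT DISTINCT
        MD5(UPPER(TRIM(investor_id)))  AS hub_investor_hk,  -- hashed business key
        investor_id                    AS investor_bk,
        CURRENT_TIMESTAMP()            AS load_dts,
        'FAH_FEED'                     AS record_source
    FROM DV.STAGING.STG_INVESTOR
) AS stg
ON hub.hub_investor_hk = stg.hub_investor_hk
WHEN NOT MATCHED THEN
    INSERT (hub_investor_hk, investor_bk, load_dts, record_source)
    VALUES (stg.hub_investor_hk, stg.investor_bk, stg.load_dts, stg.record_source)
"""

def load_hub(conn):
    """Insert only business keys not yet present in the hub (insert-only pattern)."""
    cur = conn.cursor()
    cur.execute(HUB_LOAD_SQL)
    cur.close()
```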

Consultant @ Capgemini

Project: Financial Publication Layer Apr 2019 To Apr 2020

• Snowflake cloud warehouse setup.

• Executed and scheduled Snowflake Tasks and stored procedures (see the illustrative sketch below).

• Worked on Snowflake features such as unstructured data, Time Travel, data sampling, and Streams.

• Participated in business discussions with users and handled data warehouse creation and publication layer setup.

• Decommissioned the legacy Waker system.

• Created requirement documents, based on business discussions with users, from which the OBIEE/Tableau dashboards were built.

• Developed various ODI artifacts and generated feeds using OdiSqlUnload, OdiFileMove, and other ODI utilities.

• Collaborated with BI teams to create reporting data structures.

• Implementation of data pulls from Impala data source.

• Implemented SQL*Loader to improve file load performance for large data files.

• Worked on BRS, FDD, and TDD design.
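
A sketch of how a Snowflake stored procedure can be scheduled with a Task, as mentioned in the bullets above. The task, warehouse, and procedure names are placeholders.

```python
# Sketch of scheduling a Snowflake stored procedure with a Task.
# Task, warehouse, schema, and procedure names are placeholders.
import snowflake.connector

def schedule_publication_refresh(conn):
    """Create (or replace) a daily Task that calls a publication-layer procedure."""
    cur = conn.cursor()
    # The Task runs on a CRON schedule and simply calls the procedure.
    cur.execute("""
        CREATE OR REPLACE TASK PUBLICATION.REFRESH_TASK
          WAREHOUSE = TRANSFORM_WH
          SCHEDULE = 'USING CRON 0 2 * * * UTC'   -- 02:00 UTC daily
        AS
          CALL PUBLICATION.REFRESH_PUBLICATION_LAYER()
    """)
    # Tasks are created in a suspended state and must be resumed explicitly.
    cur.execute("ALTER TASK PUBLICATION.REFRESH_TASK RESUME")
    cur.close()
```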

System Engineer @ Tata Consultancy Services

Project: RGL (Ledger Details), Airtel Zebra DTR Automation Project (GPA (Global Partner Accounting) Project) Aug 2015 To Apr 2019

Project: RGL (Ledger Details)

Tools Used: ODI 12c Developer, SQL Developer

• Participated in business discussions with users and created the data warehouse for Ledger Detail tables and files.

• Collaborated with data architects on data model management and version control.

• Worked on Load Balancing using Java EE Agents on clustered WebLogic Servers.

• Enforced standards and best practices around data modeling efforts.

• Implemented similar feed generation and created a generic source for other systems.

• Worked on parallel processing of packages as well as interfaces.

• Implemented SCDs and CDC (see the SCD Type 2 sketch below).

• Implemented various facts and dimensions from the BRS.

• Worked on the BRS and FDD.
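
A generic sketch of the SCD Type 2 (expire-and-insert) pattern referenced above. In the project this logic was produced through ODI mappings; the table and column names below (dim_partner, stg_partner) are hypothetical.

```python
# Generic SCD Type 2 pattern, sketched as two SQL steps executed through any
# DB-API connection. Tables, columns, and the is_current/effective_to scheme
# are illustrative placeholders, not the project's actual model.

EXPIRE_CHANGED_ROWS = """
UPDATE dim_partner d
   SET d.effective_to = CURRENT_TIMESTAMP,
       d.is_current   = 0
 WHERE d.is_current = 1
   AND EXISTS (
        SELECT 1
          FROM stg_partner s
         WHERE s.partner_id = d.partner_id
           AND (s.partner_name <> d.partner_name
                OR s.partner_status <> d.partner_status)
       )
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_partner
    (partner_id, partner_name, partner_status, effective_from, effective_to, is_current)
SELECT s.partner_id, s.partner_name, s.partner_status,
       CURRENT_TIMESTAMP, NULL, 1
  FROM stg_partner s
  LEFT JOIN dim_partner d
         ON d.partner_id = s.partner_id AND d.is_current = 1
 WHERE d.partner_id IS NULL   -- brand-new keys, plus keys whose current row was just expired
"""

def apply_scd2(conn):
    """Run the two-step SCD Type 2 load: expire changed rows, then insert new versions."""
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED_ROWS)
    cur.execute(INSERT_NEW_VERSIONS)
    conn.commit()
    cur.close()
```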
 

Project: Airtel Zebra DTR Automation Project (GPA (Global Partner Accounting) Project)

Tools Used: ODI 12c Developer, JDeveloper, SQL Developer, SOA-BPM

• Created star and snowflake schemas in OBIEE.

• Worked on Load Balancing using Java EE Agents on clustered WebLogic Servers.

• Worked within the Enterprise BI team, supporting the creation of data pipeline processes for ingesting data at large scale.

• Worked directly with data modelers, enterprise architects, and analysts, ensuring that business requirements were met.

• Implemented various facts and dimensions from the BRS.

SOA-BPM Development:

• Implemented ODI to OBIEE report verification using BPM.

• Worked on end-to-end configuration of BPM and LDAP (Lightweight Directory Access Protocol).

Education and Certifications

Bachelor of Technology

Information Technology 2011 To 2015

Dronacharya College of Engg., Gurugram, MDU University, Rohtak, India

Tableau Certification for Data Steward

2020

Oracle Data Integrator 12c Certified Implementation Specialist

2019

Oracle Certified Java Programmer

2014