About
• 21+ years of experience in Enterprise Data Warehouse (EDW), Hadoop data ecosystem, and AWS Data Lake application implementations; evaluated emerging data modeling technologies to continuously improve the organization's capability to manage vast amounts of information effectively.
• Expertise in end-to-end implementation of various projects; collaborated with business stakeholders to translate requirements into actionable data architecture plans and developed database architectural strategies at the modeling, design, and implementation stages to address business and industry requirements in support of Data & Analytics and machine learning models.
• Profound experience in EDW, Big Data, and cloud data lake applications, spanning data modeling, data architecture, data governance, data acquisition, data migration, data conversion, and ELT/ETL — using Informatica PowerCenter and scripting with relational databases such as Teradata, Oracle, and SQL Server; Java/Spark with Big Data stores such as HDFS and Cassandra; and AWS Glue/EMR with an AWS S3 data lake.
• Extensive experience with large databases (500+ TB), Teradata data warehousing, and active data warehousing, with thorough execution of data warehousing approaches such as Kimball modeling (Star and Snowflake schemas), Bill Inmon's 3NF/ER modeling, and BI methodologies such as ROLAP and OLAP cubes.
• Extensive domain expertise in Finance, Retail, Manufacturing (inventory, supply chain, and shipping), Aviation, and Insurance, implementing large-scale data warehousing and data ecosystem applications using the TOGAF framework and Lambda Architecture.
• Strong experience with regulatory reporting such as Basel Committee on Banking Supervision (BCBS), Matters Requiring Attention (MRA), Home Mortgage Disclosure Act (HMDA), and sustainable compliance.
• Extensive experience with the data modeling tools Erwin and PowerDesigner, converting functional requirements into conceptual, logical, and physical data models.
• Expertise in industry-standard data models such as the Teradata Manufacturing Logical Data Model (mLDM), Oracle Financial Services Analytical Applications (OFSAA), and IBM Banking Data Warehouse (BDW).
Skills & Expertise
HDFS
CASSANDRA
ATSCALE
Research Interests
Chemical Engineering
Experience
Data Solution Architect Lead/ Engineering Lead
- Led, designed, and architected Home Lending product teams (Decision Sciences; Chase My Home — Advice, Explore, Buy & Manage) and Home Lending Origination applications to enable business discovery, self-service, and consumption by ML models that send marketing solicitations to customers and support Home Lending Advisors in converting leads into loan originations as part of a Lead Optimization program.
- Defined information models supporting data assets for complex data structures represented through NoSQL databases and domain modeling.
- Collaborated with business stakeholders, cloud architects, and information architects to translate requirements into actionable data architecture plans; developed database architectural strategies at the modeling, design, and implementation stages to address business requirements and challenges.
- Interfaced directly with customers, stakeholders, and end users regarding capability architectures.
- Streamlined data storage and retrieval by designing efficient conceptual, logical, and physical data models to build a normalized HL loan origination shared semantic model.
- Supported business intelligence initiatives by constructing dimensional models optimized for reporting on the AWS S3 data lake, sourced from the external vendor ICE Encompass and consumed by multiple stakeholders, including Operations Analytics, regulatory reporting (e.g., HMDA), Capital Markets, and Lead Optimization ML models.
- Enabled real-time analytics capabilities by designing streaming data pipelines using MSK on the AWS Data Lake with the internal CCB streaming solution.
- Migrated numerous legacy systems to newer technologies, aligning initiatives with broader organizational goals and priorities (e.g., an internal data pipeline framework using Java/Spark), reducing costs and enhancing the efficiency of computing tasks.
- Worked with teams of talented software engineers to define, build, and maintain cloud infrastructure; developed custom ETL processes for efficient data ingestion and transformation.
- Used critical thinking to break down problems, evaluate solutions, and make decisions within tight deadlines and fast-paced environments.
- Implemented security best practices within the AWS environment, safeguarding sensitive data and ensuring compliance with industry regulations.
- Defined cloud architecture for both hybrid and non-hybrid cloud solutions; provided architectural leadership and guidance to technical teams to deliver robust, highly scalable, and cost-effective solutions.
- Designed and deployed scalable cloud architectures for improved application performance and reliability; integrated disparate data sources into a cohesive, well-structured AWS Data Lake for advanced analytics and ML model consumption.
- Developed and maintained CI/CD pipelines using Jenkins, increasing deployment speed and reliability.
- Accelerated time-to-market for new products by applying agile methodologies to end-to-end data engineering solutions.
Technologies: AWS Data Lake, Terraform, AWS S3, VPC, Glue, Athena, MSK, IAM, KMS, Lambda, EventBridge, SNS, CloudWatch, CLI, Snowflake, PySpark, Teradata 14.x, HDFS, Java/Spark, Oracle 11g, Erwin 2021 R1, Microsoft Office 2010/13, Teradata SQL Assistant, Hive, Bitbucket, JIRA, Jenkins, JFrog, IntelliJ, Maven, Splunk, Grafana, PuTTY, WinSCP