Niranjan Reddy Rachamala
About
Decades of experience in data warehousing, data modeling/dimensional modeling, BI/DW architecture, and database development projects for large Banking, Financial Services and Insurance (BFSI) clients, with extensive use of Microsoft SQL Server, T-SQL, Teradata, Hadoop, Python, PySpark, AWS, DevOps and UNIX technologies.
•Have worked on various types of projects, such as Anti-Money Laundering (AML), financial analytics, regulatory reporting, data analytics, compliance, campaign data models and data migration.
•Have worked on all phases of the project lifecycle, including architecture, design, development, integration, scheduling and support/maintenance.
•Strong database, SQL, ETL and data analysis skills.
•Expertise in complex query analysis, performance tuning and testing.
•Experience in data modeling using erwin and SAP PowerDesigner to find, visualize, design, deploy and standardize high-quality enterprise data assets.
•Stay current with industry trends, making recommendations as needed to help clients excel.
•Expertise in building ETL/ELT processes using SSIS, Azure Data Factory, Teradata SQL, AWS Cloud services, Python and PySpark.
•Expertise in Tableau & PowerBI to generate reports as per client requirements.
•Worked on job scheduling and orchestration using Autosys, Airflow, BMC Control-M and Tivoli (a minimal Airflow sketch follows this list).
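To make the orchestration point concrete, here is a minimal Airflow DAG sketch; the dag_id, task callables and schedule are hypothetical placeholders, not taken from any client project.

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> load chain.
# dag_id, task names and schedule are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extracting source data")   # stub: pull from source systems


def transform():
    print("transforming data")        # stub: apply business rules


def load():
    print("loading target tables")    # stub: write to the warehouse


with DAG(
    dag_id="daily_etl_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3  # linear dependency chain
```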
Skills & Expertise
Quality Control
Troubleshooting
Generative AI
Serverless Computing
Enterprise Solution Design
Design Documents
Research Interests
Banking Services and Operations
Banking & Finance
Data Engineering
Azure Data Factory
ETL Development
Data Warehousing
Azure Databricks
Business Intelligence Tools
Experience
Technology and Data Software Engineer
• Requirement gathering and product/application technical design.
• Work with Product Owners, application architects and Scrum Masters to deliver database artifacts.
• Define the sprint backlogs for the DB/ETL/reporting teams and oversee delivery.
• Build DB servers, batch servers and report servers (SSRS and PowerBI) for hosting application data, and define DB-level security.
• Configure database project and SSIS project code repositories in TFS and GitHub.
• Configured enterprise-level build and release pipelines using TFS, Jenkins, UDeploy and Liquibase: database code build and release for EAP is configured in TFS, and DB code for the PMT application is configured using GitHub, Jenkins, UDeploy and Liquibase.
• Defined the process for DB code deployments from Dev environments to Production.
• Implemented conceptual, logical and physical data models for the EAP and PMT applications using SAP PowerDesigner and published the DDL changes to Dev environments and DB repos.
• Created SSIS projects and ETL templates.
• Oversee the development of DB code (stored procedures, views, UDFs, etc.).
• Performance tuning of highly complex SQL code, improving overall application response time.
• Enabled app and DB logging in AppDynamics, Splunk, SQL Query Store, etc. to monitor application health and database performance.
• Design semantic data models and develop ODS (operational data stores) for reports and downstream system consumption.
• Design and develop dashboards and reports using Tableau and PowerBI.
• Data migration from legacy systems to newly built target systems.
• Work with the service/UI team to create DB objects that return output in JSON format; worked on API integration and testing (a hedged sketch follows this list).
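For illustration only, a minimal Python sketch of consuming a JSON-returning SQL Server result of the kind described above, using FOR JSON PATH via pyodbc; the connection string, table and column names are hypothetical placeholders, not the actual application schema.

```python
# Sketch: reading FOR JSON PATH output from SQL Server via pyodbc.
# Server, database, table and column names are illustrative placeholders.
import json

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# FOR JSON PATH returns the result set as one JSON document, streamed
# as several short rows, so the fragments are concatenated before parsing.
cursor.execute("SELECT PolicyId, PolicyName, Status FROM dbo.Policy FOR JSON PATH")
payload = "".join(row[0] for row in cursor.fetchall())
policies = json.loads(payload) if payload else []

print(f"fetched {len(policies)} policies")
conn.close()
```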
Data Engineer
• Participate in the design, development and implementation of complex financial crimes applications.
• Involved in AML modeling, alert generation and processing, case investigation and search analytics.
• Analyze, design, develop, enhance and support applications that integrate with existing business components, utilizing established processes, frameworks and technologies.
• Develop and support existing and new project requirements, including coordinating with the business to understand requirements and analyzing the potential impact of change requests.
• Comprehensively analyze application and process issues, including problem identification, communication and coordination with customer-facing support teams.
• Work with the business to understand new scenario requirements or changes to existing scenarios, and implement them in SAS.
• Participate in Agile ceremonies and coordinate AML scenario development.
• Project planning, effort estimation, resource allocation, development and release management.
• Develop, support and maintain ETL processing using Teradata, SAS, UNIX, PySpark, Impala and Hive.
• Work with the Enterprise Risk Analytics team to translate scenario requirements into AML model logic implemented in SAS/Teradata/Hadoop.
• Implement new AML models on Hadoop source data, building scenarios using PySpark and HDFS (a hedged PySpark sketch follows this list).
• Performance tuning of Teradata SQL queries, Hive queries and PySpark scripts.
• Create complex SQL queries using OLAP/aggregate functions for business analysis requests and data issues.
• Implement data integrity checks in the load process and between database distribution layers.
• Release management and production support.
• Promote code using DevOps pipelines.
• Worked on DAG development for Autosys scheduling.
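A minimal PySpark sketch of the kind of AML scenario logic described above: flagging customers whose cash deposits within a rolling window exceed a threshold. The table, column names and threshold are hypothetical, not the bank's actual scenario parameters.

```python
# Sketch of an AML-style scenario in PySpark: flag customers whose total
# cash deposits over a 7-day rolling window exceed a threshold.
# Source table, columns and threshold are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("aml_scenario_sketch").getOrCreate()

txns = spark.table("staging.cash_transactions")  # hypothetical Hive table

# 7-day rolling sum per customer; rangeBetween operates on the
# epoch seconds of the event timestamp.
win = (
    Window.partitionBy("customer_id")
    .orderBy(F.col("txn_ts").cast("long"))
    .rangeBetween(-7 * 24 * 3600, 0)
)

alerts = (
    txns.filter(F.col("txn_type") == "CASH_DEPOSIT")
    .withColumn("rolling_amount", F.sum("amount").over(win))
    .filter(F.col("rolling_amount") > 10000)  # illustrative threshold
    .select("customer_id", "txn_id", "txn_ts", "rolling_amount")
)

alerts.write.mode("overwrite").saveAsTable("alerts.cash_structuring_candidates")
```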
Senior System Engineer
• Data profiling and migration from Teradata to a big data cloud platform.
• Data modeling using erwin for the new model on the platform.
• Develop data ingestion processes for JSON files using Python dictionaries and Pandas.
• Conversion of JSON files to Parquet files using the Python Pandas package (a minimal sketch follows this list).
• Developed a Python wrapper to run the application alongside other applications.
• Develop Python scripts for Glue ETL jobs.
• Develop ELT flows to migrate on-premises data to the AWS cloud data warehouse (Redshift) using AWS S3, Glue crawlers and Redshift.
• Create external tables from S3 files in Athena and the Redshift database for data validations.
• Flatten semi-structured JSON files into relational rows and columns using Redshift Spectrum.
• Design and develop jobs with extensive Hive scripting, Python and the Airflow scheduling tool.
• Design and develop the data extraction process from source systems to AWS S3 and then to the target system (writeback/backflow).
• Promote code using DevOps pipelines.
• Worked on DAG development for Airflow scheduling.
• Monitor the Airflow DAGs through Grafana dashboards.
• Generate execution statistics such as row count, vCore and memory utilization reports from Kibana.
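A hedged sketch of the JSON-to-Parquet conversion mentioned above, using Pandas; the file paths and record layout are hypothetical placeholders.

```python
# Sketch: flatten semi-structured JSON into tabular form with Pandas and
# write it out as Parquet. Paths and record structure are placeholders.
import json

import pandas as pd

with open("input/events.json") as fh:   # hypothetical source file
    records = json.load(fh)             # a list of nested dicts

# json_normalize flattens nested objects into dotted column names,
# e.g. {"customer": {"id": 1}} becomes column "customer.id".
df = pd.json_normalize(records)

# Writing Parquet requires pyarrow or fastparquet to be installed.
df.to_parquet("output/events.parquet", index=False)
print(f"wrote {len(df)} rows, {len(df.columns)} columns")
```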
Data Engineer Lead
Data Engineer Lead, Aug 2021 to Apr 2022 (9 mos)
Skills: Technical Design · Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · PySpark · Troubleshooting · Documentation · Design Documents · Avro · Data Processing · Enterprise Solution Design · Critical Thinking · Architecture Development · Datasets
Data Technical Engineer, Feb 2020 to Jul 2021 (1 yr 6 mos)
Skills: Technical Design · Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · PySpark · Troubleshooting · Documentation · Design Documents · Avro · Data Processing · Enterprise Solution Design · Critical Thinking · Architecture Development · Datasets
Sr. Application Consultant
Project Name: SAIL - Strategic AML Industry Lead
Client: Standard Chartered Bank
Organization: Optimum Solutions
Duration: Jan 2016 to date
Software: Teradata, Big Data Hadoop
Languages: Python, Hive, PySpark, Teradata SQL and Teradata utilities
Database: Teradata 15, Hive
Tools: Teradata SQL Assistant, Apache Hive, Sqoop, SQuirreL, DbVisualizer, Oozie, HDFS commands, Rundeck, DevOps, Tableau, JIRA
O/S: Windows XP Professional and UNIX
Skills: Technical Design · Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · PySpark · Troubleshooting · Documentation · Design Documents · Avro · Data Processing · Enterprise Solution Design · Critical Thinking · Datasets
Technical Lead at DBS
Project Name: Contextual Marketing
Client: DBS Singapore
Organization: APAR Technologies PTE LTD
Duration: Sep 2015 to Jan 2016
Software: Teradata, Hadoop
Languages: Hive, Teradata SQL and Teradata utilities
Database: Teradata 14, HDFS
Tools: Teradata SQL Assistant, Apache Hive, Hue
O/S: Windows XP Professional and UNIX
Skills: Technical Design · Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · PySpark · Troubleshooting · Documentation · Design Documents · Avro · Data Processing · Enterprise Solution Design · Critical Thinking · Datasets
Technology Business Analyst
Project Name: Financial Analytics
Client: Standard Chartered Bank, Kuala Lumpur, Malaysia
Technology used: Teradata SQL and Teradata utilities
Tools: Teradata Decision Expert (TDE), UNIX, Control-M Scheduler, Teradata SQL Assistant, HP Application Lifecycle Management
Skills: Technical Design · Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · PySpark · Troubleshooting · Documentation · Design Documents · Avro · Data Processing · Enterprise Solution Design · Critical Thinking · Datasets
Technical Lead
Project Name: OCBC data marts technical support; SIBS (Silverlake Integrated Banking System)
Client: OCBC Bank, Singapore
Technology used: Teradata SQL and Teradata utilities
Tools: UNIX, ODI, Control-M Scheduler, Teradata SQL Assistant, HP OpenView Service Desk
Skills: Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · Troubleshooting · Documentation · Design Documents · Data Processing · Enterprise Solution Design · Critical Thinking · Datasets
Senior System Analyst
• Ensured smooth running of all IPG-related jobs and provided abend resolutions.
• Prepared Application Information Documents (AID) in the knowledge transition sessions.
• Plan, execute, communicate and meet project commitments consistently for developments and enhancements.
• Worked on SQL performance improvement through query tuning.
• Ensured client requirements were followed through the CAB (Change Advisory Board).
• Preparation of the Unit Test Plan and Unit Test Report.
• Moving code to ChangeMan.
• Setting up job dependencies in the CA7 scheduler.
Skills: Data Standards · Extract, Transform, Load (ETL) · Data Pipelines · Troubleshooting · Documentation · Design Documents · Data Processing · Critical Thinking · Datasets
Senior Software Engineer
Project Name: BI development and enhancements
Client: Commonwealth Bank, Australia
Technology used: Teradata SQL and Teradata utilities
Tools: Mainframes, Teradata SQL Assistant
Skills: Extract, Transform, Load (ETL) · Data Pipelines · Documentation · Design Documents · Data Processing · Critical Thinking · Datasets
Education
Sri Venkateswara University, Tirupati (SVU)
Projects
Policy management platform
Policy Management Technology (PMT) is a Java-based web application developed for the Policy Office (PO) team. The PO team uses PMT to initiate, author and publish enterprise- and business-level policies, procedures and guidelines. Published policies/procedures are available in the Policy Library for team members' reference. PMT is built with Angular, Java and SQL Server 2019.
Certificates & Licenses (3)
Certified ScrumMaster (CSM)
The Open Group Certified: TOGAF® 9 Certified
Azure Fundamentals
Microsoft Certified: Azure Fundamentals
Awards & Achievements (1)
🏆 Editorial Board member
Professional Memberships (1)
Association of Enterprise Architects
Country: United States
Publications (6)
This research study focuses on how to implement an agile delivery model for data-driven user interface (UI) applications within a regulated industry. We explore the issues, approaches, and effective practice...
During the past decade, the rapid growth of data, rising user requirements and the need for stable applications have caused digital innovation to accelerate. Businesses that want to succeed in today’s...
Today, cloud-native data pipelines are a fundamental asset in data structures of present-day data-powered businesses, however, they present a major security risk through their full lifecycle. This res...
Today, financial institutions choose to move their financial data to the cloud so they can grow, gain access to analytics instantaneously and control costs. I analyze the adoption of Amazon Web Servic...
Financial organizations deal with large amounts of information on transactions, markets and risks that must be sorted through rapidly and correctly. This research aims to discover how Teradata, Hive S...