Niranjan Reddy Rachamala

Technology and Data Software Engineer at Artech L.L.C.
📚 Data Engineer Lead | Huntersville, North Carolina, United States
6 Publications

👤 About

Skills & Expertise

Quality Control · Troubleshooting · Generative AI · Serverless Computing · Enterprise Solution Design · Design Documents

Research Interests

Banking Service and Operation · Banking & Finance · Data Engineering · Azure Data Factory · ETL Development · Data Warehousing · Azure Databricks · Business Intelligence Tools


💼 Experience

Technology and Data Software Engineer

Artech L.L.C. · October 2024 - Present
  • Requirement gathering and product/application technical design.
  • Work with Product Owners, application architects, and Scrum Masters to deliver database artifacts.
  • Define the sprint backlogs for the DB/ETL/reporting teams and oversee delivery.
  • Build DB servers, batch servers, and report servers (SSRS and Power BI) to host application data, and define DB-level security.
  • Configure database and SSIS project code repositories in TFS and GitHub.
  • Configured enterprise-level build and release pipelines using TFS, Jenkins, UDeploy, and Liquibase: database code build and release for EAP is configured in TFS, while DB code for the PMT application is configured using GitHub, Jenkins, UDeploy, and Liquibase.
  • Defined the process for DB code deployments from development environments to production.
  • Implemented conceptual, logical, and physical data models for the EAP and PMT applications using SAP PowerDesigner, and published DDL changes to development environments and DB repositories.
  • Created SSIS projects and ETL templates.
  • Oversee the development of DB code (stored procedures, views, UDFs, etc.).
  • Performance-tuned highly complex SQL code, improving overall application response time.
  • Enabled application and DB logging in AppDynamics, Splunk, SQL Query Store, etc., to monitor application health and database performance.
  • Designed semantic data models and developed ODSs (operational data stores) for reports and downstream system consumption.
  • Design and develop dashboards and reports using Tableau and Power BI.
  • Migrated data from the legacy system to the newly built target systems.
  • Work with the service/UI team to create DB objects that return output in JSON format; worked on API integration and testing.
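As a minimal illustration of shaping relational query results into JSON for a service/UI layer (the production work used SQL Server objects; the column and table names below are invented for this sketch, not taken from the EAP/PMT applications):

```python
import json

def rows_to_json(rows, columns):
    """Convert row tuples plus column names into a JSON array of objects,
    the shape a service layer typically consumes."""
    return json.dumps([dict(zip(columns, row)) for row in rows])

# Hypothetical result set from a policy query (names are illustrative only).
payload = rows_to_json(
    [(1, "Enterprise Policy", "Published"), (2, "Draft Guideline", "Draft")],
    ["policy_id", "title", "status"],
)
# payload → '[{"policy_id": 1, "title": "Enterprise Policy", "status": "Published"}, ...]'
```

In SQL Server itself this kind of shaping is commonly done with `FOR JSON PATH` inside a view or stored procedure, so the API layer receives ready-made JSON.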

Data Engineer

Tata Consultancy Services(TCS), Texas · August 2022 - October 2024
  • Participate in the design, development, and implementation of complex financial-crimes applications.
  • Involved in AML modeling, alert generation and processing, case investigation, and search analytics.
  • Analyze, design, develop, enhance, and support applications that integrate with existing business components using established processes, frameworks, and technologies.
  • Develop and support existing and new project requirements, including coordinating with the business to understand requirements and analyzing the potential impact of change requests.
  • Comprehensively analyze application and process issues, including problem identification, communication, and coordination with customer-facing support teams.
  • Work with the business to understand new scenario requirements or changes to existing scenarios and implement them in SAS.
  • Participate in Agile ceremonies and coordinate AML scenario development.
  • Project planning, effort estimation, resource allocation, development, and release management.
  • Develop, support, and maintain ETL processing using Teradata, SAS, UNIX, PySpark, Impala, and Hive.
  • Work with the Enterprise Risk Analytics team to translate scenario requirements into AML model logic implemented in SAS/Teradata/Hadoop.
  • Implement new AML models on Hadoop source data, building scenarios with PySpark and HDFS.
  • Performance-tune Teradata SQL queries, Hive queries, and PySpark scripts.
  • Create complex SQL queries using OLAP/aggregate functions for business analysis requests and data issues.
  • Implement data-integrity checks in the load process and between database distribution layers.
  • Release management and production support; promote code using DevOps pipelines.
  • Worked on DAG development for Autosys scheduling.
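The AML scenarios described above ran on SAS/Teradata/PySpark; as a pure-Python sketch of the general shape of an aggregation-threshold rule (flag an account when its cash transactions in a review window exceed a limit), with the threshold and field names as assumptions rather than any bank's real parameters:

```python
from collections import defaultdict

def flag_accounts(transactions, threshold=10_000):
    """Sum CASH transactions per account and return accounts whose
    total exceeds the threshold (a toy stand-in for an AML scenario)."""
    totals = defaultdict(float)
    for txn in transactions:
        if txn["type"] == "CASH":
            totals[txn["account"]] += txn["amount"]
    return sorted(acct for acct, total in totals.items() if total > threshold)

txns = [
    {"account": "A1", "type": "CASH", "amount": 6_000},
    {"account": "A1", "type": "CASH", "amount": 5_500},
    {"account": "A2", "type": "WIRE", "amount": 50_000},
]
flag_accounts(txns)  # → ["A1"]
```

In PySpark the same logic would be a `groupBy("account").agg(sum("amount"))` over the filtered transaction frame, followed by a threshold filter that feeds alert generation.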

Senior System Engineer

Beyondsoft Consulting Inc · April 2020 - August 2022
  • Data profiling and migration from Teradata to a big data cloud platform.
  • Data modeling using Erwin for the new model on the platform.
  • Developed a data ingestion process for JSON files using Python dictionaries and pandas.
  • Converted JSON files to Parquet files using the pandas package.
  • Developed a Python wrapper to run the application alongside other applications.
  • Developed Python scripts for Glue ETL jobs.
  • Developed ELT flows to migrate on-premises data to the AWS cloud data warehouse (Redshift) using AWS S3, Glue crawlers, and Redshift.
  • Created external tables from S3 files in Athena and the Redshift database for data validation.
  • Flattened semi-structured JSON files into relational rows and columns using Redshift Spectrum.
  • Designed and developed jobs with extensive Hive scripting, Python, and the Airflow scheduling tool.
  • Designed and developed the data extraction process from source systems to AWS S3 and then to the target system (writeback/backflow).
  • Promote code using DevOps pipelines.
  • Worked on DAG development for Airflow scheduling.
  • Monitor Airflow DAGs through Grafana dashboards.
  • Generate execution statistics (row counts, vCore and memory utilization reports) from Kibana.
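The JSON-flattening step above was done with Redshift Spectrum and pandas; as a minimal standard-library sketch of the same idea (nested JSON records flattened into single-level relational columns, with the key names invented for illustration):

```python
def flatten(record, parent_key="", sep="_"):
    """Recursively flatten nested dicts into one flat row whose keys
    are underscore-joined paths, i.e. relational column names."""
    row = {}
    for key, value in record.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            row.update(flatten(value, full_key, sep))
        else:
            row[full_key] = value
    return row

row = flatten({"id": 1, "customer": {"name": "Acme", "region": {"code": "NA"}}})
# → {"id": 1, "customer_name": "Acme", "customer_region_code": "NA"}
```

pandas offers the same behavior built in via `pandas.json_normalize`, which is the usual choice when the flattened rows are headed for Parquet.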

Data Engineer Lead

Beyondsoft Consulting Inc · February 2020 - April 2022
  • Data Engineer Lead · Aug 2021 - Apr 2022 (9 mos)
  • Data Technical Engineer · Feb 2020 - Jul 2021 (1 yr 6 mos)
  Skills: Technical Design · Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · PySpark · Troubleshooting · Documentation · Design Documents · Avro · Data Processing · Enterprise Solution Design · Critical Thinking · Architecture Development · Datasets

Sr. Application consultant

Standard Chartered Bank · January 2016 - February 2020
  • Project: SAIL - Strategic AML Industry Lead
    Client: Standard Chartered Bank · Organization: Optimum Solutions · Duration: Jan 2016 - Feb 2020
    Software: Teradata, Big Data Hadoop
    Languages: Python, Hive, PySpark, Teradata SQL, and Teradata utilities
    Databases: Teradata 15, Hive
    Tools: Teradata SQL Assistant, Apache Hive, Sqoop, SQuirreL, DbVisualizer, Oozie, HDFS commands, Rundeck, DevOps, Tableau, JIRA
    OS: Windows XP Professional and UNIX
    Skills: Technical Design · Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · PySpark · Troubleshooting · Documentation · Design Documents · Avro · Data Processing · Enterprise Solution Design · Critical Thinking · Datasets

Technical Lead at DBS

DBS Bank · September 2015 - January 2016
  • Project: Contextual Marketing
    Client: DBS Singapore · Organization: APAR Technologies PTE LTD · Duration: Sep 2015 - Jan 2016
    Software: Teradata, Hadoop
    Languages: Hive, Teradata SQL, and Teradata utilities
    Databases: Teradata 14, HDFS
    Tools: Teradata SQL Assistant, Apache Hive, Hue
    OS: Windows XP Professional and UNIX
    Skills: Technical Design · Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · PySpark · Troubleshooting · Documentation · Design Documents · Avro · Data Processing · Enterprise Solution Design · Critical Thinking · Datasets

Technology Business Analyst

Standard Chartered Bank · March 2014 - August 2015
  • Project: Financial Analytics
    Client: Standard Chartered Bank, Kuala Lumpur, Malaysia
    Technology: Teradata SQL and Teradata utilities
    Tools: Teradata Decision Expert (TDE), UNIX, Control-M scheduler, Teradata SQL Assistant, HP Application Lifecycle Management
    Skills: Technical Design · Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · PySpark · Troubleshooting · Documentation · Design Documents · Avro · Data Processing · Enterprise Solution Design · Critical Thinking · Datasets

Technical Lead

HCL Technologies Ltd · June 2011 - February 2014
  • Project: OCBC data marts technical support; SIBS (Silverlake Integrated Banking System)
    Client: OCBC Bank, Singapore
    Technology: Teradata SQL and Teradata utilities
    Tools: UNIX, ODI, Control-M scheduler, Teradata SQL Assistant, HP OpenView Service Desk
    Skills: Data Standards · Extract, Transform, Load (ETL) · Quality Control · Data Pipelines · Troubleshooting · Documentation · Design Documents · Data Processing · Enterprise Solution Design · Critical Thinking · Datasets

Senior system analyst

IBM India Private Limited · March 2010 - June 2011
  • Ensured smooth running of all IPG-related jobs and provided abend resolutions.
  • Prepared Application Information Documents (AIDs) in knowledge transition sessions.
  • Planned, executed, communicated, and consistently met project commitments for developments and enhancements.
  • Improved SQL performance through query tuning.
  • Ensured client requirements were followed through the CAB (Change Advisory Board).
  • Prepared the unit test plan and unit test report.
  • Moved code to Changeman.
  • Set up job dependencies in the CA7 scheduler.
  Skills: Data Standards · Extract, Transform, Load (ETL) · Data Pipelines · Troubleshooting · Documentation · Design Documents · Data Processing · Critical Thinking · Datasets

Senior Software Engineer

HCL Technologies Ltd · November 2008 - March 2010
  • Project: BI development and enhancements
    Client: Commonwealth Bank, Australia
    Technology: Teradata SQL and Teradata utilities
    Tools: Mainframes, Teradata SQL Assistant
    Skills: Extract, Transform, Load (ETL) · Data Pipelines · Documentation · Design Documents · Data Processing · Critical Thinking · Datasets

🎓 Education

Sri Venkateswara University, Tirupati (SVU)

B.Tech in Computer Science and Engineering · 2005

🚀 Projects

Policy management platform
Policy Management Technology (PMT) is a Java-based web application developed for the Policy Office (PO) team. The PO team uses PMT to initiate, author, and publish enterprise- and business-level policies, procedures, and guidelines. Published policies and procedures are made available in the Policy Library for team members' reference. PMT is built with Angular, Java, and SQL Server 2019.

🏅 Certificates & Licenses (3)

Certified ScrumMaster (CSM)
Scrum Alliance · Issued March 2019
The Open Group Certified: TOGAF® 9
The Open Group · Issued November 2021
Microsoft Certified: Azure Fundamentals
Microsoft · Issued December 2023

🏆 Awards & Achievements (1)

🏆 Editorial Board member
Awarded by: International Journal of Scientific Research in Computer Science, Engineering and Information Technology || Year: 2024

Professional Memberships (1)

Association of Enterprise Architects
Role: Member || Join date: -
Country: United States

📚 Publications (6)

Journal: Analysis and Metaphysics • October 2022
This research study focuses on how to implement agile delivery model of data-driven user interface (UI) application within regulated industry. We explore the issues, approaches, and effective practice...
Keywords: Agile methodology · Regulated industries · Data-driven applications · Compliance · UI development · Software delivery models
Journal: International Journal of Communication Networks and Information Security • November 2021
During the past decade, the rapid growth of data, rising user requirements and the need for stable applications have caused digital innovation to accelerate. Businesses that want to succeed in today’s...
Keywords: Microservices architecture · Data-driven applications · Cloud-native development · Scalable systems · Composable microservices
Journal: Computer Fraud & Security Journal • May 2024
Today, cloud-native data pipelines are a fundamental asset in data structures of present-day data-powered businesses, however, they present a major security risk through their full lifecycle. This res...
Keywords: Cloud-native security · Data pipelines · Zero trust · DevSecOps · Container security · Data protection
Journal: International Journal of Open Publication and Exploration • January 2023
Today, financial institutions choose to move their financial data to the cloud so they can grow, gain access to analytics instantaneously and control costs. I analyze the adoption of Amazon Web Servic...
Keywords: AWS services · Amazon Redshift · Amazon Athena · Financial data migration · Cloud analytics · Data warehousing · ETL
Journal: Journal of Computational Analysis and Applications • February 2022
Financial organizations deal with large amounts of information on transactions, markets and risks that must be sorted through rapidly and correctly. This research aims to discover how Teradata, Hive S...
Keywords: Distributed computing · Parallel processing · Teradata · Hive SQL · PySpark · Financial workloads · Big data optimization · Enterprise analytics