ASWINI PURUSHOTHAMAN

Solution Architect
Sentul, Kuala Lumpur

Summary

Experienced professional specializing in solution design, data engineering and migration, big data management, and data warehouse implementation. Skilled in driving innovative solutions and optimizing data processes to improve business performance. Proven track record of leading cross-functional teams and delivering projects on time and within budget. Committed to staying current on industry trends and expanding skills to drive organizational success.

Overview

16 years of professional experience

Work History

Solutions Architect

UOB
12.2023 - Current
  • Worked closely with product teams to define and prioritize partner feature requests.
  • Managed end-to-end software development life-cycle, from initial requirements gathering to post-implementation support and maintenance.
  • Presented roadmap and technology infrastructure to customers, demonstrating deep familiarity with APIs, platform infrastructure, security and integration capabilities.
  • Leveraged cloud technologies to optimize costs, improve scalability, and facilitate seamless integration across platforms.
  • Served as Solution Architect for digital corporate and business banking initiatives, owning end-to-end solutioning across backend services, API integration, and cloud-aware architectural design.
  • Architected and reviewed microservices deployed via Docker containers, with real-time messaging handled by Kafka, and routed securely through cloud-native API Gateway (AWS API Gateway).
  • Designed secure backend integration with private cloud-hosted databases (within VPC-isolated networks), ensuring data compliance with EDAG, IAM, and data masking requirements.
  • Developed lightweight data transformation flows using Talend and Python scripts, converting JSON configuration files into consumable formats for internal APIs and processing pipelines.
  • Owned the CI/CD architecture and governance, integrating pipelines via GitLab CI/CD, Jenkins, and Terraform for infrastructure-as-code deployments, including unit testing, container scanning, and controlled promotion to UAT and production environments.
  • Collaborated with security and DevOps teams to enforce pipeline quality gates, environment-specific configuration management, and automated deployment strategies in regulated environments.
  • Authored and maintained Confluence-based solution design documents, TAGC decks, and API specifications, driving consistent architecture practices across squads.
  • Actively engaged in Agile ceremonies, leading architecture walkthroughs with developers and product teams to translate EPICs into technically executable microservice blueprints.

Solution Architect

Birlasoft
03.2023 - 11.2023
  • Led the end-to-end data architecture design for a global pharmaceutical client, integrating clinical, manufacturing, and compliance datasets into a cloud-based analytics platform.
  • Architected a cloud-native data processing pipeline using Azure Blob Storage, Python scripting, and Snowflake, ensuring scalability, performance, and compliance.
  • Developed and deployed custom Python scripts to parse and convert semi-structured JSON files into CSV, handling schema validation, data flattening, and error logging.
  • Utilized Azure Storage Explorer for secure management of ingestion zones and manual data validation in Blob Storage.
  • Designed Snowflake data models (staging, curated, consumption) with secure views and masking policies to support GxP-compliant reporting.
  • Integrated Azure Synapse Analytics as the analytical layer for ad-hoc querying and BI consumption, enabling faster insights for R&D and compliance teams.
  • Authored solution design documents, architecture diagrams, and provided cloud deployment guidelines for production rollout.
  • Collaborated cross-functionally with DevOps, QA, and Data Governance teams to ensure alignment with data security, lineage, and regulatory standards (e.g., GDPR, GxP).

Lead Data Engineer

Thomson Reuters Pvt. Ltd
08.2022 - 03.2023
  • Driving day-to-day responsibilities, including developing ETL pipelines into and out of the data warehouse and building major regulatory and financial reports using advanced SQL queries in Snowflake
  • Building workflows using Alteryx for extracting data from legacy servers to Snowflake warehouse
  • Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh Architecture
  • Developing end-to-end ETL/ELT pipelines in partnership with business-function Data Analysts; troubleshooting and resolving technical issues as they arise
  • Directing end-to-end data architecture activities inclusive of delivery of target architecture & decision documents
  • Ensuring the successful delivery of emerging data warehouse & business intelligence solutions
  • Defining solution architecture, data strategy, principles, vision, and standards; providing advisory services around the data frameworks, governance, and quality
  • Analyzing & reviewing business, functional and high-level technical requirements; designing detailed technical components for complex applications utilizing high-level architecture, design patterns and reusable code
  • Simulating, designing, developing, and deploying computationally complex and practical data solutions; building and delivering comprehensive data strategy roadmaps; ensuring final deliverables are of the highest quality
  • Identifying ways to integrate data from the client's multiple source systems to provide consolidated KPI reports
  • Performing analytical interpretation of highly specialized technical and analytical information used to compose detailed documentation and reports
  • Designing data quality processes and resolving all resource efficiency issues
  • Implementing project plans within preset budgets and deadlines; monitoring and reporting on project progress
  • Creating business requirements, source to target mappings, data models, and universe designs to assist in the development of highly efficient and highly scalable project solutions
  • Ensuring the team structure is adequate; enforcing compliance, best practices, approach, and direction for the technical aspects of the organization's practice; providing technical leadership to fellow team members

Lead Data Engineer

Indegene Private Limited
07.2020 - 06.2022
  • Built a data pipeline for Integrated Data Platform (IDP) wherein data flows from various sources (FTP, SFTP, CRM Apps, API) to AWS S3 bucket (storage) via Airflow
  • Constructed the AWS data pipeline using VPC, EC2, S3, IAM, Glue (PySpark), and Redshift; developed AWS Lambda functions in Python to automate ETL jobs
  • Managed metadata alongside the data for visibility into where data comes from and its lineage, identifying data for customer projects using AWS Data Lake together with AWS Lambda and Glue
  • Migrated PostgreSQL data into Redshift; worked with AWS DMS replication instances between different databases
  • Created a 360° data model on top of the raw data flowing from various sources into S3, applying business rules in the Redshift layer
  • Devised AWS Glue (PySpark) jobs to automate the flow with transformations based on business requirements; created crawlers to maintain the Data Catalog
  • Scheduled data loads from various sources using Airflow DAGs, with each DAG grouping files according to their schedule
  • Developed views based on the user selection (front-end requirement)
  • Leveraged Talend for sourcing data from Google Analytics to AWS S3 bucket

Senior Software Engineer (Technical Lead)

Integra Micro Software Services (IMSS)
08.2019 - 07.2020
  • Set up and ran a six-node Apache Hadoop cluster; administered the cluster and coordinated with the team on cluster setup
  • Migrated ETL processes from the database server to Hive for easier data manipulation; used Sqoop to transfer data from relational databases to HDFS for business reporting
  • Developed Hive tables, loaded them with data, and prepared Hive queries that run internally as MapReduce jobs; analyzed partitioned and bucketed data and calculated various metrics for reporting
  • Extracted processed data by preparing Hive scripts; used Spark SQL to load JSON data, create schema RDDs, load data into Hive tables, and manage structured data
  • Transferred the JSON data to the BOC server (an internal client server) by writing Kafka streams in Python

Hadoop Developer

Tata Consultancy Services (Payroll Company - MEC Concepts India Pvt. Ltd.)
08.2018 - 06.2019
  • Worked on Hadoop Architecture and various components such as HDFS, Application Master, Node Manager, Resource Manager, Name Node, Data Node, and Hive with Spark concepts
  • Moved ETL processes from Oracle to Hive for easier data manipulation; used Sqoop to export data back to relational databases for business reporting and imported required tables from RDBMS to HDFS
  • Executed the workflows using the Apache Oozie framework to automate tasks

Technical Specialist

DXC Technology (Previously CSC India Private Limited)
11.2009 - 03.2018
  • Maintained and controlled the 8-node cluster and held discussions with the team for cluster setup
  • Analyzed data by writing HQL queries for users; developed tactical designs and requirements for Advanced Analytical modules and rectified application defects
  • Deployed HQL queries based on client requirements and analyzed data using HiveQL
  • Implemented data processing using Hive, filtering data based on query factors; designed and developed multiple nodes in the cluster

Education

Bachelor of Science - Computer Science

SRM University
Chennai
04.2001 -

Skills

Improving processes
