
Pradeep Sridharan

Penang

Summary

  • AI/ML engineer with technical expertise in Data Extraction, Data Wrangling, Statistical Modelling, Data Mining, Machine Learning, Deep Learning, NLP, and Data Visualization.


  • Data engineer with extensive hands-on experience and high proficiency with structured, semi-structured, and unstructured data using big data tools including Hadoop, Kafka, PySpark, Azure Data Factory, Azure Data Lake, Databricks, Snowflake, RDBMS, and NoSQL databases.


  • Highly skilled in data modelling, data analysis, data validation, data cleansing and ETL architecture.


Overview

13 years of professional experience

Work History

AI/ML and Data Engineer

Intel Microelectronics SDN Bhd
Penang
04.2017 - Current
  • Involved in reviewing and understanding the business requirements.
  • Involved in the entire project lifecycle, including design, development, testing, deployment, implementation, and support.
  • Developed Argo workflows with cron and DAG steps defined through YAML.
  • Developed various Python scripts to find SQL query vulnerabilities through SQL injection testing, permission checks, and performance analysis, and developed scripts to migrate data from a proprietary database to MariaDB.
  • Worked with version control systems such as Subversion and Git, and used source code management tools including GitHub, GitLab, and Bitbucket, including their command-line applications.
  • Created data tables using Flask-SQLAlchemy to display well data and policy information and to add, delete, and update well records.
  • Worked on Apache Parquet files through Microsoft Azure Blob Storage and Kubernetes.
  • Developed Spark Python modules for machine learning & predictive analytics in Hadoop.
  • Responsible for handling the integration of database systems.
  • Worked in a team of data scientists to build and deploy applications.
  • Generated detailed studies on potential third-party data handling solutions, verifying compliance with internal needs and stakeholder requirements.
  • Designed compliance frameworks for multi-site data warehousing efforts to verify conformity with state and federal data security guidelines.

Data Engineer

Hewlett Packard
Cyberjaya
04.2016 - 04.2017
  • Developed frontend and backend modules in Python using the Django and Tastypie web frameworks, with Git for version control.
  • Developed merge jobs in Python to extract and load data into a MySQL database, and worked on a MySQL data lineage process to map source-to-target database, table, and column mappings using object-oriented Python.
  • Migrated the Django database from SQLite to MySQL and then to PostgreSQL with complete data integrity, and designed, developed, and deployed CSV parsing using a big data approach on AWS EC2.
  • Used Sqoop to import data from RDBMS into the Hadoop Distributed File System (HDFS) and later analysed the imported data using Hadoop components; created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Maintained a high degree of competency across the Microsoft application platform, focusing on the .NET Framework, WCF, Microsoft Azure, and SQL Azure.
  • Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and used the AWS CLI to control various AWS services from shell/Bash.
  • Developed a single-page application that switches between components using Angular 4 services to retrieve data.
  • Interacted with the Yelp/AWS API using Git Bash and the s3cmd command-line tool to push and extract data.
  • Used Azure DevOps to create boards and CI/CD pipelines for build and release.
  • Extensively involved in infrastructure as code, execution plans, resource graphs, and change automation using Terraform; managed AWS infrastructure as code with Terraform.
  • Profiled Thunder Token server applications using Golang, YAML, Ethereum, Prometheus, Grafana, Ansible, AWS EC2, S3, and Terraform.
  • Implemented and developed UI components using AngularJS features such as dependency injection, models, data binding, and controllers.
  • Automated deployment of the Conductor application on AWS Lambda using high-end AWS architectural components, and developed AWS Lambda scripts to build EC2 instances on demand.
  • Implemented view model patterns to create and manage views, partial views, view models, and Web APIs using ASP.NET MVC.

Data Engineer

Zod Infosystem Pvt Ltd
Chennai
05.2012 - 04.2016
  • Involved in the analysis, specification, design, implementation, and testing phases of the Software Development Life Cycle (SDLC), and used agile methodology to develop the application.
  • Interacted with the client end-users during requirements gathering sessions.
  • Worked as an application developer experienced with controllers, views, and models in Django.
  • Implemented business logic, worked on data exchange, and processed XML and HTML using Python 2.7 and its Django framework.
  • Worked on Python OpenStack APIs and used NumPy for numerical analysis.
  • Worked on real-time, in-memory distributed systems.
  • Designed and developed transaction and persistence layers to save, retrieve, and modify data for application functionality using Django and PostgreSQL.
  • Developed web-based applications using Python 2.7/2.6, Django 1.4/1.3, PHP, Flask, Webapp2, Angular.js, VB, C++, XML, CSS, HTML, DHTML, JavaScript and jQuery.
  • Developed applications to access JSON and XML from RESTful web services on the consumer side using JavaScript and AngularJS.
  • Used an object-relational mapping (ORM) solution, a technique for mapping the data representation from the MVC model to the Oracle relational data model with an SQL-based schema.
  • Wrote MapReduce code to convert unstructured data into semi-structured data and loaded it into Hive tables.

Database Administrator

Pageone Technologies
Chennai
11.2010 - 05.2012

Project #1

Client: CEVA Logistics

Environment: Oracle and MS SQL databases


  • Created schemas and maintained quotas.
  • Created users, roles, and profiles and granted privileges.
  • Created and maintained tablespaces and objects.
  • Checked the alert log file frequently.
  • Performed RMAN and offline backups.
  • Cloned the database using a cold backup.
  • Used ANALYZE TABLE and ANALYZE INDEX to tune queries and increase performance.
  • Gathered memory statistics using STATSPACK reports.
  • Imported and exported schemas and databases across various stages of the project.
  • Produced weekly database maintenance reports.

Education

Post Graduate Diploma - Software Technology With Networking Management

Robert Gordon University
Aberdeen, Scotland
02.2010

Bachelor of Engineering - Electronics & Communication Engineering

Anjalai Ammal Mahalingam Engineering College
Tamil Nadu
04.2007

Additional Information

13+ years of experience in the data domain, including 4+ years as an ML engineer with proven success in building algorithms and predictive models for clustering and classification, data analysis, and visualization. Passionate engineer with the ability to apply ML techniques and algorithms to real-world problems.



Timeline

AI/ML and Data Engineer

Intel Microelectronics SDN Bhd
04.2017 - Current

Data Engineer

Hewlett Packard
04.2016 - 04.2017

Data Engineer

Zod Infosystem Pvt Ltd
05.2012 - 04.2016

Database Administrator

Pageone Technologies
11.2010 - 05.2012

Post Graduate Diploma - Software Technology With Networking Management

Robert Gordon University

Bachelor of Engineering - Electronics & Communication Engineering

Anjalai Ammal Mahalingam Engineering College