Gaurav

Chicago, Illinois, United States

Work experience

  • Architect, Cloud Platform Engineering

    TransUnion
    2022.02 - Current (4 years)
    Passionate about architecting robust, scalable, and innovative cloud-native solutions, I have led the design and implementation of cutting-edge Kubernetes platforms on AWS EKS. With a focus on CNCF-based technologies, I've orchestrated solutions encompassing a diverse tech stack:
    - Networking stack: deployed Kong Service Mesh, AWS VPC CNI, Calico CNI, and the AWS Load Balancer Controller for seamless communication and management within complex infrastructures.
    - Security stack: implemented security measures using OPA Gatekeeper, External Secrets Operator, cert-manager, Vault Agent Controller, Wiz, and Sysdig to fortify system integrity and data protection.
    - Continuous delivery stack: used Flux and Flagger to automate and streamline continuous delivery pipelines, ensuring efficient application deployment.
    - Observability stack: implemented Prometheus, Grafana, and the Splunk OTel Collector for comprehensive observability and actionable insights into system performance.
    Key achievements:
    - Designed and executed strategies for transforming legacy monolithic applications into a microservices architecture, enabling their smooth transition into cloud-native containerized environments.
    - Leveraged extensive expertise in application containerization, infrastructure automation, app migrations, and seamless integrations into various cloud platforms, optimizing operations and enhancing scalability.
    My approach: As a dedicated subject matter expert in cloud technologies, I stay abreast of industry trends and advancements, which allows me to recommend and implement innovative solutions that drive efficiency, security, and scalability within organizations.
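The Flagger-style progressive delivery mentioned above works by shifting traffic to a canary in small increments, gated on metric checks. A minimal, purely illustrative sketch of that weight-stepping logic (step size and promotion threshold are assumptions for the example, not actual platform configuration):

```python
# Hypothetical sketch of the stepwise traffic-shifting a progressive-delivery
# tool such as Flagger automates; values here are illustrative only.

def canary_weights(step: int = 10, max_weight: int = 50):
    """Yield successive canary traffic weights until the promotion threshold."""
    weight = 0
    while weight < max_weight:
        weight += step
        yield min(weight, max_weight)

# In a real rollout, each increment would be gated on metric checks
# (e.g. Prometheus success-rate queries) before raising the weight.
print(list(canary_weights()))  # [10, 20, 30, 40, 50]
```

Once the canary holds the threshold weight without failing its checks, the tool promotes it and returns traffic to the primary.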
  • Senior Consultant, Cloud Platform Services

    TransUnion
    2017.10 - 2022.02 (4 years)
    - Architecture design: designing and architecting cloud-native solutions that align with business objectives while adhering to DevOps principles, producing scalable, resilient, and efficient architectures built on cloud-native technologies and best practices.
    - Cloud infrastructure and automation: creating, managing, and optimizing cloud infrastructure using Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation) to automate provisioning, configuration, and orchestration of cloud resources.
    - CI/CD pipeline development: building and enhancing Continuous Integration and Continuous Deployment (CI/CD) pipelines for efficient, automated software delivery, including integration of code repositories, automated testing, and deployment automation.
    - Tooling and technology selection: identifying, evaluating, and recommending tools, technologies, and frameworks aligned with DevOps principles to improve software delivery, deployment, and monitoring in the cloud environment.
    - Collaboration and communication: working closely with cross-functional teams, including developers, operations, and other stakeholders, to ensure seamless integration of cloud solutions and streamline DevSecOps processes.
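A core property of the CI/CD pipelines described above is stage ordering: builds must precede tests, and tests must precede deployment. A small illustrative check of that invariant (the stage names are assumptions for the example, not a specific pipeline definition):

```python
# Illustrative sketch: verify that the required CI/CD stages appear,
# in a safe order, in a pipeline definition. Stage names are assumed.

REQUIRED_ORDER = ["build", "test", "deploy"]

def valid_pipeline(stages):
    """Return True if every required stage is present and in order."""
    positions = [stages.index(s) for s in REQUIRED_ORDER if s in stages]
    return (len(positions) == len(REQUIRED_ORDER)
            and positions == sorted(positions))

print(valid_pipeline(["lint", "build", "test", "deploy"]))  # True
print(valid_pipeline(["build", "deploy", "test"]))          # False
```

Checks like this are typically enforced as pipeline-as-code validation before a definition is accepted.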
  • Lead Software Engineer

    Wipro Limited
    2017.06 - 2017.09 (4 months)
    End client: Capital One, Richmond, Virginia
    Project description: KYC (Know Your Customer) identity-management project centered on building a streaming data analytics platform using Kafka and RabbitMQ data streaming solutions and syncing the data into an S3 data lake. The platform helps analysts work with huge amounts of real-time data collected from financial institutions to learn about customers by obtaining information on their identity and address.
    Technologies: RabbitMQ, Scala, Apache Spark, Amazon S3
    Role:
    - Worked extensively on creating Spark/Scala frameworks that run on both the enterprise data hub and the Amazon cloud.
    - Partnered with the enterprise architecture team to create a scalable enterprise streaming data platform that ingests heterogeneous data from different data sources into the Amazon S3 lake data area.
    - As part of the KYC-Chordiant project, created a data virtualization layer using Cerebro to connect datasets sitting across different AWS accounts.
    Significant achievement: Capital One Cloud Front Award winner for significant contribution in the cloud engineering space.
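Streaming ingestion into a data lake, as described above, typically buffers records and flushes them in fixed-size batches so each lake object has a useful size. A minimal sketch of that micro-batching pattern, with an in-memory list standing in for the S3 sink (names and batch size are illustrative assumptions):

```python
# Minimal sketch of stream micro-batching: buffer records, flush a
# batch to the sink when it fills. The list `sink` stands in for S3.

def batch_ingest(records, sink, batch_size=3):
    """Group incoming records into fixed-size batches and write each batch."""
    buffer = []
    for record in records:
        buffer.append(record)
        if len(buffer) == batch_size:
            sink.append(list(buffer))  # one "object" per full batch
            buffer.clear()
    if buffer:                         # flush the final partial batch
        sink.append(list(buffer))

sink = []
batch_ingest(range(7), sink)
print(sink)  # [[0, 1, 2], [3, 4, 5], [6]]
```

In a real pipeline the flush would also be triggered by a time window, so slow streams do not hold records indefinitely.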
  • Lead Software Engineer

    Wipro Limited
    2016.10 - 2017.06 (9 months)
    End client: Capital One, Vienna, Virginia
    Project description: Migration of a legacy Teradata-based EDW system to a cloud AWS-based Redshift data warehouse, with S3 data-lake ingestion using AWS services. The project comprised the design and development of new generic frameworks used across the platform for data load, data quality checks, data security, and data validation.
    Technologies: Amazon Redshift, AWS S3, AWS EC2, Ab Initio, Apache Spark, Scala, Linux shell scripting, Teradata SQL, Teradata Parallel Transporter, Control-M, Python, JSON, Java, PostgreSQL
    Responsibilities:
    - Designed, developed, unit-tested, implemented, and supported generic frameworks using Apache Spark, Ab Initio, Scala, shell, Python, and other technologies.
    - Wrote generic drag-and-drop reusable Ab Initio components that stream data from on-premises servers to AWS S3 buckets.
    - Saved months of manual data-migration effort by writing a generic solution for history loads from Teradata to Redshift using Teradata Parallel Transporter, Redshift SQL, and Linux shell scripting.
    - Implemented the code (SQL, Linux, Ab Initio) for data migration from Teradata to Redshift.
    - Worked with the team to design and develop the lake data ingestion framework. Technologies used: Ab Initio, shell, Spark, Python, Scala, Parquet, Avro, web services.
    - The team was also responsible for migrating 17k Teradata tables to Redshift, with incremental runs implemented via batch and one-time history data loads. Data analysis, processing of huge data, and handling large data sets are part of my day-to-day job.
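A generic history-load solution like the one described above usually templates one load statement per table rather than hand-writing thousands. A hedged, illustrative sketch of generating per-table Redshift COPY statements (bucket name, schema, and IAM role are placeholder assumptions, not actual project values):

```python
# Illustrative sketch: template a Redshift COPY statement per table for a
# one-time history load. Bucket, schema, and IAM role are placeholders.

def copy_statement(table,
                   bucket="history-load-bucket",
                   iam_role="arn:aws:iam::123456789012:role/redshift-load"):
    """Return a COPY statement loading one table's S3 prefix into Redshift."""
    return (f"COPY analytics.{table} "
            f"FROM 's3://{bucket}/{table}/' "
            f"IAM_ROLE '{iam_role}' FORMAT AS PARQUET;")

for stmt in map(copy_statement, ["customers", "accounts"]):
    print(stmt)
```

Driving this from the table catalog is what makes a 17k-table migration tractable: one template, one loop, instead of per-table scripts.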
  • Automation Consultant- Infrastructure Automation

    Wipro Limited
    2016.01 - 2016.07 (7 months)
    End client: Apple Inc., Sunnyvale, California
    Project description: GBI Ops (Global Business Intelligence Operations) is a managed-service engagement covering maintenance and support of various platforms, such as the GBI Informatica, GBI ETL, GBI Storm, GBI Portal, GBI BODS, GBI Hadoop, GBI GDT, and other GBI platforms across all environments, to enable smooth functioning of the Business Intelligence enterprise of Apple Inc.
    Technologies: Informatica 9.x, SAP Data Services 4.2 SP7, Storm 1.0, core ETL (shell/Perl/Python), Oracle 11.x/12.x, Teradata 14.10, MySQL, Ansible playbooks, SQL*Plus, BTEQ, awk, monit, Java, Linux/AIX, HornetQ, Kafka, Oozie
    Responsibilities:
    - Drove improvement, standardization, and innovation initiatives for infrastructure stabilization.
    - Automated infrastructure using tools and technologies including Unix shell, Perl, Python, Java, SQL, and PL/SQL, plus GUI tools such as Teradata SQL Assistant, Oracle SQL Developer, and the Ansible Tower dashboard.
    - Automated status ticket reports using Python, shell, HTML, JSON data processing, and Central Station APIs.
    - Wrote Linux shell scripts and Python snippets for the Framework Diagnostic Tool, which validates environment standards before code is migrated from one environment to another and eases diagnosis by highlighting environment issues.
    - Prepared documentation and provided training to team members.
    Significant achievement: Awarded the Wipro Innovation Award for developing and automating reports using Python, shell, HTML, JSON data processing, and Central Station APIs, reducing manual effort for GBI Ops support shift leads. The real-time data helps the GBI Ops 24x7 support team handle tickets effectively and quickly.
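The report automation described above boils down to parsing ticket data from JSON and summarizing it for a status report. A minimal illustrative sketch of that step (field names and status values are assumptions for the example, not the Central Station schema):

```python
# Illustrative sketch: summarize ticket records parsed from JSON into
# per-status counts, the kind of rollup a status report would render.
# Field names ("status") and values are assumed for the example.

import json
from collections import Counter

def ticket_summary(raw_json: str) -> dict:
    """Return a mapping of ticket status to count."""
    tickets = json.loads(raw_json)
    return dict(Counter(t["status"] for t in tickets))

raw = ('[{"id": 1, "status": "open"}, '
       '{"id": 2, "status": "closed"}, '
       '{"id": 3, "status": "open"}]')
print(ticket_summary(raw))  # {'open': 2, 'closed': 1}
```

A report generator would then render counts like these into the HTML shift-lead view.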
  • Senior Project Engineer

    Wipro Limited
    2014.12 - 2015.12 (1 year)
    Project description: PCI DSS DDE Consolidation: design and development of a generic framework to deliver a consolidated Data Distribution Environment (DDE) for the UK business in a PCI-compliant domain. The project aimed to secure all card-number data so that it is stored encrypted in the data distribution environment. Data flowing to and from external vendors follows a secure path via the Data Encryption Gateway: all data sent to external vendors is encrypted on the gateway server, all files received from external vendors are decrypted there, and card numbers stored in the EDW are tokenized via the Voltage tool.
    Technologies: Ab Initio 3.0.5, Unix shell scripting, Oracle 11.x, Teradata 13, PKWARE SecureZIP, Symantec PGP, Voltage tokenization
    Responsibilities:
    - Led a 5-6 member team of developers.
    - Developed customized generic Ab Initio components and a generic framework for automatic encryption/decryption of data across the platform; these drag-and-drop components encrypt/decrypt files using PGP and tokenize data via the Voltage web service.
    - Attended continual meetings focused on keeping development on track, defect control, coordinating with other systems, and responding quickly to changing business requirements.
    - Worked in an Agile Scrum model with active participation in daily scrums, sprint planning, retrospectives, and other ceremonies.
    - Analyzed system architecture to break complex technical requirements into an approach for logical implementation.
    - Served as the focal point for sound decisions related to complex business requirements, and provided technical inputs to architects.
    - Mentored team members on technical issues and drove the project as a lead.
    - Documented technical designs and developed critical and complex interfaces.
    - Supported the integration test phase, fixing defects and ensuring smooth testing.
    - Documented technical deployment plans and production implementation plans.
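The tokenization idea above replaces a stored card number with a surrogate that preserves its shape, while a secure vault retains the mapping back to the original. The project used Voltage tokenization via a web service; the sketch below is a deliberately simplified stand-in (a random digit token plus an in-memory vault), not production cryptography:

```python
# Simplified tokenization sketch (NOT the Voltage service, NOT secure):
# replace all but the last four digits with random digits of the same
# length, keeping a vault mapping so the original PAN is recoverable.

import secrets

_vault = {}  # token -> original PAN; a real vault is a hardened service

def tokenize(pan: str) -> str:
    """Return a same-length token preserving the last four digits."""
    token = "".join(secrets.choice("0123456789") for _ in pan[:-4]) + pan[-4:]
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    """Look up the original PAN for a previously issued token."""
    return _vault[token]

token = tokenize("4111111111111111")
assert len(token) == 16 and token[-4:] == "1111"
assert detokenize(token) == "4111111111111111"
```

Keeping the last four digits is what lets downstream reports and support flows identify a card without ever seeing the full number.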
  • Project Engineer

    Wipro Limited
    2011.04 - 2014.11 (4 years)

Educational experience

  • Birla Institute of Technology & Science, Pilani

    Master of Science, Computer Software Engineering
  • Hemwati Nandan Bahuguna Garhwal University

    Bachelor's degree, Computer Science