- Built client-facing, enterprise-grade, cloud-native/hybrid big data processing, reporting and analytics platforms three times.
- Built web-scale graph-processing solutions implementing graph algorithms.
- Developed key frameworks, tools & modules implementing architectural patterns for several products.
- Contributed directly to several successful client implementations.
- Led 5-10 member teams; provided design guidance to large teams (~100 members).
- Master's & Bachelor's degrees in Computer Science and Engineering from Osmania and J.N.T. University, respectively.
- Data : Hadoop (MR/Tez, Hive, HBase, Kafka), Redshift, Netezza, VoltDB, PostgreSQL & Oracle.
- Cloud : AWS (EC2/ECS, S3, API, Lambda, Batch and others), Dyn DNS
- Frameworks : Proprietary (FEBA, VIQ), Spring Boot, J2EE and RoR.
- Tools : IntelliJ IDEA, Linux CLI tools, Docker, awless, aws-cli, DbVisualizer, Erwin, Git/SVN, Maven, Jenkins, Jira, Confluence, SonarQube, Puppet.
Senior Data Architect at ERT (Oct 31 2016 - Present - Greater Boston Area)
- Built enterprise-grade, client-facing, cloud-native clinical data processing, reporting and analytics platforms.
- Built a Master Data Management platform.
- Provide cloud architecture and tools/frameworks for all data platforms, systems and modules.
- Continuously build prototypes for data, cloud, integration and messaging problems.
- Provide thought leadership: propose, prototype, iterate, deliver.
Big Data Architect at Zeta Global Inc. (July 21 2016 - Oct 30 2016 - Greater Boston Area)
- Proposed and contributed directly to a multi-tenant data lake built on HBase, Hive, Kafka, HAWQ, Spark, Atlas and Spring Boot, prototyping and benchmarking batch/real-time data processes and APIs to enable interactive use cases at a scale of ~100 million rows per table per client.
Software Architect at Visual IQ Inc. (May 2012 - July 20 2016 - Greater Boston Area & December 2010 - January 2012 - Cochin, India)
- Built web-scale graph-processing solutions, with 1.3+ billion nodes and 10+ billion edges, for identifying ‘Connected Components’. Built three solutions as requirements, scale (100M to 1.3B+ nodes) and constraints changed:
- Enhanced a classic algorithm for percolation/dynamic-connectivity problems, ‘Weighted Quick Union with Path Compression’, into a distributed variant embedded in a Hive UDAF.
- Enhanced implementations of the naive, block and diagonal GIMV (Generalized Iterative Matrix-Vector multiplication) algorithm in MapReduce.
- An iterative HBase-backed implementation of a GIMV variant in Hive.
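The single-machine core of ‘Weighted Quick Union with Path Compression’ can be sketched as follows. This is an illustrative Python sketch of the textbook union-find algorithm only; the distributed Hive-UDAF variant described above is not reproduced here.

```python
class WeightedQuickUnionPC:
    """Union-find with weighting and path compression (textbook,
    single-machine sketch; names here are illustrative)."""

    def __init__(self, n):
        self.parent = list(range(n))  # parent[i] = parent of node i
        self.size = [1] * n           # size of the tree rooted at i

    def find(self, p):
        # First pass: locate the root. Second pass: compress the path.
        root = p
        while root != self.parent[root]:
            root = self.parent[root]
        while p != root:
            self.parent[p], p = root, self.parent[p]
        return root

    def union(self, p, q):
        # Weighting: attach the smaller tree under the larger root.
        rp, rq = self.find(p), self.find(q)
        if rp == rq:
            return
        if self.size[rp] < self.size[rq]:
            rp, rq = rq, rp
        self.parent[rq] = rp
        self.size[rp] += self.size[rq]

# Usage: connected components of a small edge list.
uf = WeightedQuickUnionPC(6)
for a, b in [(0, 1), (1, 2), (4, 5)]:
    uf.union(a, b)
components = {uf.find(i) for i in range(6)}
print(len(components))  # 3 components: {0,1,2}, {3}, {4,5}
```

With both optimizations, find/union are effectively constant-time amortized, which is what makes the approach attractive at billion-node scale once distributed.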
- Led several successful proofs of concept with feature-complete prototypes; later migrated product suites (including data processes) to the new big data platforms:
- POCs on Netezza, Vertica and Hadoop; later migrated all products and processes from Oracle BI to Netezza.
- POCs on in-memory databases (VoltDB, MemSQL, Cassandra); later migrated the backend of the web interfaces from Netezza to VoltDB.
- POCs on scalable, cost-effective data processing platforms with AWS Redshift and Hadoop; migrated key batch processes from Netezza to Hadoop.
- A Sqoop-like data ingestion tool from any JDBC-accessible database to VoltDB. A multi-threaded, producer-consumer implementation that maps tables from source to destination; fine-grained control is enabled through SQL queries on the source database and asynchronously invoked VoltDB stored procedures. It moved hundreds of millions of records from Netezza to VoltDB at ~16,000 rows/sec.
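The producer-consumer pattern behind such a tool can be sketched as follows. This is a minimal Python sketch with in-memory stand-ins: the real tool read from a JDBC result set and wrote via asynchronous VoltDB stored-procedure calls, and all names and parameters here are illustrative.

```python
import queue
import threading

def ingest(source_rows, sink, num_consumers=4, batch_size=2):
    """Producer-consumer sketch: one producer batches rows from the
    source (stand-in for a JDBC result set), consumers drain a bounded
    queue and write to the sink (stand-in for async VoltDB calls)."""
    q = queue.Queue(maxsize=100)  # bounded queue applies backpressure
    SENTINEL = object()

    def producer():
        batch = []
        for row in source_rows:
            batch.append(row)
            if len(batch) == batch_size:
                q.put(batch)
                batch = []
        if batch:
            q.put(batch)
        for _ in range(num_consumers):
            q.put(SENTINEL)  # one stop marker per consumer

    def consumer():
        while True:
            item = q.get()
            if item is SENTINEL:
                break
            sink.extend(item)  # real tool: async stored-procedure call

    threads = [threading.Thread(target=consumer) for _ in range(num_consumers)]
    for t in threads:
        t.start()
    producer()
    for t in threads:
        t.join()

out = []
ingest(range(10), out)
print(sorted(out))  # all 10 rows delivered; arrival order not guaranteed
```

The bounded queue is the key design choice: it keeps a fast source from overrunning a slower sink, while multiple consumers keep the destination's write path saturated.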
- Successfully refactored the production SaaS product suite to reduce technical debt, build frameworks and enable multi-tenancy.
Software Engineer at Altius Education, Inc (January 2012 - May 2012 - San Francisco)
- Developed some of the key components of the course-designer module for Helix, a next-generation online higher-education LMS platform.
- Set up a CI platform with Jenkins.
Product Technical Lead at Infosys (April 2007 - December 2010 - Bengaluru, India)
- An ORM framework for key-based operations on a given table, implementing several patterns including Table Data Gateway, Optimistic Locking, Memento and DAO. It is the de facto interface for table operations in the Infosys Finacle e-banking and mobile banking platforms, used by some of the largest banks with the highest online transaction volumes, e.g. ICICI and ANZ.
- A DSL type-system framework for primitives and collections, including a HashList. Operations include sorting, deep cloning and data-format transformations, e.g. to JSON. It makes heavy use of Java generics and key interfaces such as Comparable, and is leveraged as a building block by several other frameworks in the Infosys Finacle e-banking and mobile banking platforms.
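The optimistic-locking pattern named above can be sketched as follows. This is a minimal, in-memory Python sketch of the general pattern only; the actual Finacle framework is proprietary Java, and every name here is hypothetical.

```python
class StaleRecordError(Exception):
    """Raised when a concurrent update bumped the version first."""

class VersionedTable:
    """Optimistic locking sketch: each row carries a version number,
    and an update succeeds only when the caller's version matches
    the stored one (no locks are held between read and write)."""

    def __init__(self):
        self.rows = {}  # key -> (version, data)

    def insert(self, key, data):
        self.rows[key] = (1, data)

    def read(self, key):
        return self.rows[key]  # returns (version, data)

    def update(self, key, expected_version, data):
        version, _ = self.rows[key]
        if version != expected_version:
            raise StaleRecordError(key)   # someone updated first
        self.rows[key] = (version + 1, data)

# Usage: a second writer holding a stale version is rejected.
table = VersionedTable()
table.insert("acct-42", {"balance": 100})
v, row = table.read("acct-42")
table.update("acct-42", v, {"balance": 90})      # succeeds, version -> 2
try:
    table.update("acct-42", v, {"balance": 80})  # stale version 1
except StaleRecordError:
    print("stale update rejected")
```

Optimistic locking suits high-volume online banking workloads because conflicting writes on the same key are rare, so paying a retry on conflict is cheaper than holding row locks.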
Consultant at Sierra Atlantic (November 2003 - April 2007 - Hyderabad, India)
- Developed the multi-threaded engine of an integration product between Agile PLM and JD Edwards ERP using Java, XML and XSLT; later reimplemented it on Oracle SOA.
- Implement techniques from technical papers and build refinements on them.
- Problem modelling and data modelling
- Develop fully functional prototypes
- Provide technical design, guidance and mentoring.
- Consistent top ratings in annual performance reviews across all organizations, including ERT, Infosys and Visual IQ.
- At Infosys - "Gem of the Quarter" award, for individual excellence.
- At Sierra Atlantic - contributed substantially to three team awards.
- At school - five-year scholarship based on a state-level talent search examination.