Joined Expedia on a contract basis to build the next-generation reporting tool for internal users; the contract was then extended to cover the anomaly detection platform. Responsibilities included software development in multiple languages, design and architecture discussions, DevOps, the deployment pipeline, and interviewing candidates.
Projects:
1. Reporting Platform
Built a reporting platform using a micro-services architecture in a Kanban environment, with a React.js front end and a Node.js back end.
● Developed user identity management and implemented an authorization scheme using JSON Web Tokens (JWT) and OAuth.
● Designed and implemented various REST API endpoints in Node.js with Restify; the HATEOAS (Hypermedia As The Engine Of Application State) pattern was used for linking between resources.
● Wrote a framework Chef cookbook for deploying Node.js-based services, plus Chef cookbooks for various services, including RSpec tests and integration tests with Test Kitchen and Vagrant. Acted as the de facto DevOps member within the team.
● Implemented an API Gateway using nginx and Chef, and implemented service discovery using HashiCorp Consul; built the next iteration of the API Gateway with nginx, Consul, and Chef.
● Deployed to the AWS Cloud; set up nodes using Scalr and fixed various deployment issues.
● Delivered a well-received brown-bag session on HashiCorp Consul and service discovery.
● Worked closely with the Operations team to deploy services to the production environment.
Technologies Used: Node.js, React.js, Redux, Restify, Chef, RSpec, Serverspec, ChefSpec, Ruby, RuboCop, AWS Cloud, Teradata, Jenkins, Scalr, Bamboo, Stash, GitLab.
2. Anomaly Detection Platform (Machine Learning)
Big data analytics project for detecting anomalies in various time-series data using machine learning techniques.
● Worked closely with data scientists to develop the models and the platform.
● Wrote models in Scala to run on Apache Spark, using Spark to parallelize forecasting across splits of the data.
● Wrote a Scala-based tool for triggering AWS Simple Notification Service (SNS).
● Automated Airflow DAG creation using Python, driven by configurations stored in AWS S3.
● Wrote AWS Lambda functions in Python.
● Set up an Elasticsearch cluster on the AWS Cloud and wrote a script to automate the process.
● Optimized data ingestion into Elasticsearch from Hive on AWS Elastic MapReduce (EMR).
● Used Spark to move data from Teradata to S3 in Parquet format.
Technologies Used: Apache Spark, Scala, Python, Hive, Airflow, Elasticsearch, AWS Cloud, AWS Lambda, AWS SNS, AWS Elastic MapReduce.
The project was to build a new module into an existing social care product developed by Corelogic. Worked on a Java (Spring/iBATIS), Backbone/Marionette, Microsoft SQL Server, and Oracle stack. Suggested improvements to standards and practices, and introduced JSR-303 server-side validation. Designed RESTful APIs and structured controllers. Responsibilities included attending sprint meetings, code review, and merging changes from other developers. Technologies Used: Java, Spring, iBATIS, Backbone.js, Marionette, Handlebars, Underscore.js, MSSQL, Oracle, WebLogic Server, GitLab, JIRA, TestNG, Mockito.
This was a paid project given by the University as part of the Newcastle Work Experience programme: building an academic assignment feedback generation application that makes it possible to prepare constructive, comprehensive feedback efficiently and effectively. ● Created PDF reports from data using JasperReports. ● Implemented user authentication and authorisation using Spring Security. ● UAT deployment on Heroku with PostgreSQL. ● Production deployment on a Tomcat server with a MySQL database on a remote University server, accessed via SSH key authentication. Technologies Used: Java, Spring Roo, Hibernate, JasperReports, Dojo framework, Spring Security, Heroku, PostgreSQL, Tomcat, MySQL.
Happiest is a start-up in Newcastle upon Tyne focused on rewarding customers based on engagement. Being part of an exciting start-up in the UK was thrilling; the most exciting part was the prospect of building something that would delight people. The internship led to an offer to do my Master's dissertation with the company. Data Engineer (June - September) ● Dissertation project on Social Network Analysis and centrality metrics, considered the initial phase of a larger data analysis project. ● Created a friend recommendation system using the Neo4j graph database. ● Analysed Happiest's social network data using R and igraph, assigning centrality scores to each node in the network. ● Visualised graph data using R and Gephi. API Developer (March - June) ● Developed various API endpoints in PHP. ● Database programming in PostgreSQL. ● Responsible for GitHub commits and merges. ● Worked within a Scrum methodology, with daily stand-up sessions and discussions.
This is the gateway module through which users log in to all other web applications: an implementation of single sign-on. It provides facilities to log in, log out, add a new application, and remove an application, with transparent session handover between applications. Lead Developer; single point of contact at the client location. Decided how the application works (architecture) and prepared the design document. Technologies Used: Java, JSP, jQuery, JasperReports, NetBeans, CSS; production deployment on WebSphere.
JMR Infotech, previously Trasset India, is a technology consulting company in the Banking and Financial Services Industry domain in India, where I served for three years covering both offshore support and client-site development projects. ● Deployed at the client site and acted as the single point of contact. ● Led the team in design and development; trained team members and handled knowledge transfer to new members. ● Designed and developed a Java-based web framework for development. ● Trained in FLEXCUBE® (now OFSS), PL/SQL, and Oracle Forms. ● Provided offshore support for FLEXCUBE®. ● Responsible for code maintenance and version control. ● Responsible for UAT deployment. ● Received a certificate of appreciation for quality and dedication three times from JMR Infotech. ● Received a direct email from the CEO appreciating my performance and skills.
PDF reports were generated using JasperReports based on formats specified by the bank; several reports were delivered as separate applications. Lead Developer and single point of contact at the client location. Technologies Used: Java, JSP, jQuery, Hibernate, JasperReports, NetBeans, CSS; production deployment on IBM WebSphere.
Cheque Truncation Automation Interface Jul 2010 - Sept 2010 Dhanlaxmi Bank Developed a Java Swing-based desktop GUI application on the NetBeans Platform that generates the XML files required to automate cheque truncation. The application interfaces between FLEXCUBE® and the cheque truncation automation vendor; a bank employee uses it manually to generate the XML files and then upload them to FLEXCUBE®. Lead Developer; single point of contact at the client location.
Hungarian Banks FLEXCUBE® is an end-to-end product suite for consumer, corporate, investment, and Internet banking, asset management, and investor servicing. Provided technical assistance on various FLEXCUBE® installations, including application server setup, web services configuration, and database setup. Also provided offshore support for various maintenance tasks and issues raised in FLEXCUBE®. Technologies Used: Oracle Forms 6i/10g, Oracle Application Server 10g, Oracle Database 10g, SQL/PL/SQL.
8. XML Interface for FLEXCUBE® Feb 2009 - May 2009 A leading Hungarian Bank FLEXML is the interface layer of FLEXCUBE® used to connect FLEXCUBE® with third-party external systems. Created Java-based front-end screens to capture details for the FT and SI modules, and provided end-to-end support for testing the front-end application against the FLEXML interface. The work included coding, unit testing, integration testing, and resolving UAT issues. Technologies Used: Java web services, Oracle 10g Database, Oracle PL/SQL.
Search Engine Optimisation (SEO) is the process of improving the volume or quality of traffic to a web site from search engines. As an Internet marketing strategy, SEO considers how search engines work and what people search for. The project involved editing meta content and HTML to increase the site's relevance to specific keywords and to remove barriers to search engines' indexing activities. New sitemaps were generated and submitted to major search engines for indexing, and reports were created to analyse the effect of the SEO work. Technologies Used: HTML, Google Analytics, Google Webmaster Central tools.