Experience in SAP BI/HANA design and development, including data modeling (design and creation of complex ETL components), data extraction and loading (developing and monitoring manual as well as automated data load processes using process chains), reporting (design and creation of reports), unit testing, cut-over activities, UAT (user acceptance testing), and post-go-live support.

TECHNICAL SKILLS:

Business Warehouse (BW)
• Sound knowledge of BW modeling, BW extractions using different transformations, BEx reporting, and Integrated Planning.
• Sound knowledge of SAP BW/BI components and the ETL process: creating InfoAreas, application components, DataSources, different types of InfoObjects, DataStore Objects (DSOs) for staging as well as for analysis/reporting, InfoCubes based on the extended star schema, InfoObjects (characteristic as InfoProvider), MultiProviders, InfoSets, VirtualProviders (virtual cubes), HybridProviders for Real-time Data Acquisition (RDA), Semantically Partitioned Objects (SPOs), InfoSources, InfoPackages, transformations, Data Transfer Processes (DTPs), update rules, routines, transfer rules, Open Hub Destinations, and process chains.
• Good knowledge of creating aggregation levels, planning filters, and characteristic relationships using Integrated Planning.
• Good knowledge of OLAP reporting and implementation of BEx Query Designer and BEx Analyzer for developing reports and analyses, and of web reporting using WAD (Web Application Designer).
• Good knowledge of queries, calculated and restricted key figures, structures, variables (including customer exits), exceptions, conditions, BEx Analyzer, and queries on physical as well as virtual providers.
• Good understanding of data loading and monitoring, analyzing data packets, manually editing records in the PSA, and reloading to data targets.
• Data extraction from the source system using the LO Cockpit, and generic extraction for logistics data using tables/views and function modules, as well as CO-PA extraction.
• Good knowledge of slice-and-dice techniques and InfoCube maintenance: aggregation, compression, roll-up for aggregates, dimensions with high cardinality, and line-item dimensions.
• Good knowledge of load automation: BI daily, weekly, and monthly loads (scheduling) using process chains.
• Sound knowledge of TRM (Transport Request Management).

ABAP
Good knowledge of (and rigorously practiced):
• Data Dictionary (tables, structures, indexes, search helps, views)
• Interfaces between systems (IDoc/EDI/RFC) and BAPIs
• Enhancements: implicit and explicit enhancements, user exits/customer exits, BAdIs, BTEs, and the Enhancement Framework
• Reports: classical reports, interactive reports, ALV reports (class-based ALV), and BSPs
• Usage of hooked routines in BI:
  • Characteristic InfoObject: transfer routine
  • Transformation: field-level routine, start routine, end routine, expert routine
  • InfoPackage: conversion routine
  • BEx query: customer exit variables

High-Performance Analytic Appliance (HANA)
• Sound knowledge of data replication to the HANA DB, designing flows using LSA++, schema creation, and the design and creation of the different information models: attribute views, analytic views, and calculation views.
• Sound knowledge of performance tuning to improve the performance of all HANA flows
• Sound knowledge of data extraction from different source systems using BODS and SLT
• Use of different transformations to deliver consistent data to the target system
• Creation and transport of delivery units and packages
• Creation of BOBJ reports on HANA information models
• Design and creation of KPIs using Design Studio and BusinessObjects

BusinessObjects (BOBJ)
• Good understanding of the design and creation of Crystal Reports, Web Intelligence (WebI), and Xcelsius dashboards
• Good understanding of BODS (BusinessObjects Data Services) and its transforms for extracting data from different source systems into different target systems
• Sound understanding of schema design and creation, and of planning and designing universes in the IDT (Information Design Tool) as well as the UDT (Universe Design Tool), including objects and classes with different join functionalities

FUNCTIONAL SKILLS:
• Excellent client interaction skills
• Familiar with logistics modules such as MM, SD, FI-CO (enterprise structure, master data, and transaction data), EWM, QM, and Vistex
• Strong presentation, communication, and facilitation skills (oral and written)
• Can work effectively in a team environment as well as independently

PROFESSIONAL TRAINING:
Trained by SAP in data warehousing: SAP BI (3.5, 7.x), BEx Query Designer, BOBJ (BusinessObjects) 4.0, in-memory data management (HANA database techniques), and software development on SAP HANA and ABAP.
Trained by CMC in programming languages: C, C++, VB.NET, ASP.NET, ADO.NET, J2EE, SQL, and PL/SQL (Oracle 9i).
1. Understand and analyze user requirements and determine the feasibility of the flow/KPI.
2. Create functional as well as technical specifications.
3. Design and build ETL flows, and modify existing flows, using LSA (Layered Scalable Architecture).
4. Create and modify the following objects during flow generation:
   - MultiProviders, HybridProviders, Analysis Process Designer (APD) processes, standard and real-time cubes, and Semantically Partitioned Objects.
   - Underlying objects such as DataStore Objects, InfoSources, transformations, different types of routines to obtain customized results, DTPs, and InfoPackages.
   - Generic (customized) DataSources and extract structures, including ABAP code in the ECC system to map the relevant table fields to DataSource fields; extractors are created on views/tables as well as on function modules.
   - DataSource enhancements: creating the enhanced (append) structure and writing ABAP code in ECC to map table fields to the enhanced DataSource fields.
5. Create and modify local process chains as well as meta chains using different types of variants, and automate data loads by scheduling process chains.
6. Perform unit testing for each object, followed by testing of the integrated objects.
7. Create BEx queries on top of the InfoProviders using restricted key figures, calculated key figures, conditions, and variables.
8. Migrate KPIs created with different tools (BEx Analyzer, BEx queries, Excel, etc.) to HANA.
9. Design and build HANA flows, and tune already-built flows for performance.
10. Design and build the data foundation and business layer in the IDT (Information Design Tool).
11. Design and create KPIs and dashboards using BOBJ as well as Design Studio.
12. Help with user acceptance testing and the subsequent transport of changes to the production environment.
13. Hand-hold users and help them use the KPIs.
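As an illustration of the transformation routines mentioned above, here is a minimal start routine sketch for a BW 7.x transformation; the field name PLANT and the filter value 'TST1' are hypothetical placeholders, not taken from any specific project flow:

```abap
*----------------------------------------------------------------------*
* Start routine sketch (BW 7.x transformation) - illustrative only.
* The field name PLANT and the value 'TST1' are hypothetical.
*----------------------------------------------------------------------*
METHOD start_routine.
  " Remove records for a test plant before the field mappings run,
  " so downstream DTP packages carry only productive data.
  DELETE source_package WHERE plant = 'TST1'.
ENDMETHOD.
```

A start routine runs once per data package before the individual field rules, which makes it the natural place for package-wide filtering like this.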
It is a leading midsized pharmaceutical company whose requirement was to add a BW layer to store and utilize data optimally, and to generate reports for data analysis. It was a greenfield, end-to-end implementation project in which I performed the following activities:
1. Understand user requirements and suggest module-wise, scenario-wise reports.
2. Create functional and technical specifications according to the business requirements.
3. Design ETL flows using LSA (Layered Scalable Architecture).
4. Generate the different flows (logistics: Materials Management, Sales and Distribution, Financial Accounting, Quality Management, Vistex, and Extended Warehouse Management).
5. Create the following objects during flow generation:
   - MultiProviders, HybridProviders, Analysis Process Designer (APD) processes, standard and real-time cubes, and Semantically Partitioned Objects.
   - Underlying objects such as DataStore Objects, InfoSources, transformations, different types of routines to obtain customized results, DTPs, and InfoPackages.
   - Generic (customized) DataSources and extract structures, including ABAP code in the ECC system to map the relevant table fields to DataSource fields; extractors are created on views/tables as well as on function modules.
   - DataSource enhancements: creating the enhanced (append) structure and writing ABAP code in ECC to map table fields to the enhanced DataSource fields.
6. Create local process chains as well as meta chains using different types of variants, and automate data loads by scheduling process chains.
7. Perform unit testing for each object, followed by testing of the integrated objects.
8. Create BEx queries on top of the InfoProviders using restricted key figures, calculated key figures, conditions, and variables.
9. Create workbooks in BEx Analyzer using design toolbox elements such as user forms, checkboxes, and macros.
10. Create APDs (Analysis Process Designer processes) to transform data into the desired format.
11. Perform cutover activities by creating transport requests (customizing as well as workbench) and transporting them to the other systems (test, quality, and production); load master data as well as transaction data in the production environment.
12. Mentor team members and guide them in building complex logic.
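The DataSource enhancement step described above (append structure plus ECC-side ABAP) is conventionally coded in the transaction-data customer exit include ZXRSAU01 (function EXIT_SAPLRSAP_001). The DataSource name, extract structure, and appended field below are hypothetical, included only to show the pattern:

```abap
*----------------------------------------------------------------------*
* Sketch of a transaction-data extractor enhancement in include
* ZXRSAU01 (customer exit EXIT_SAPLRSAP_001). The DataSource name
* ZSD_ORDERS, the extract structure ZOXEXT0001, and the appended
* field ZZREGIO are hypothetical placeholders.
*----------------------------------------------------------------------*
DATA: ls_data TYPE zoxext0001.   " extract structure incl. append field

CASE i_datasource.
  WHEN 'ZSD_ORDERS'.
    LOOP AT c_t_data INTO ls_data.
      " Fill the appended region field from the customer master
      SELECT SINGLE regio FROM kna1
        INTO ls_data-zzregio
        WHERE kunnr = ls_data-kunnr.
      MODIFY c_t_data FROM ls_data.
    ENDLOOP.
ENDCASE.
```

In practice, single SELECTs inside the loop are often replaced by a buffered lookup table to keep extraction performance acceptable for large delta loads.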
Business requirements dictated that “Article Hierarchy” (an SAP-delivered standard hierarchy) be used instead of the Merchandise Category Hierarchy, which the standard ERMA business content comes with. “Article Hierarchy” was required as the default hierarchy in Target, so it was added to the newly proposed ERMA cubes, while the merchandise category from the business content cubes was removed from the new custom cubes. The “Article Hierarchy” element was added as a characteristic in a dimension called Article Hierarchy.

The solution used a mix of BW content, configuration, and ABAP coding. The BW content was modeled after the delivered ERMA content. The coding was implemented as enhancements, BAdIs, and routines in the BW transformations, which mapped and calculated the new key figures. The data for the transformations was sourced from goods movements in ECC, cost and retail revaluations in ECC, and the POS transactions fed to BI’s POSDM DataSource. New queries using “Article Hierarchy” were created on a newly built BW MultiProvider, which abstracted two new cubes. The new cubes were based on the standard ERMA content with additional characteristics, including the new “Article Hierarchy” characteristic; the default Merchandise Category hierarchy was removed.

I performed activities such as designing and creating generic DataSources, InfoObjects, different types of DSOs, Semantically Partitioned Objects (SPOs), transformations, and different types of routines within those transformations to obtain the desired data in the target; building virtual providers such as MultiProviders; and designing and creating BEx queries, including calculated key figures, restricted key figures, formulas, conditions, exceptions, and variables with customer exits to provide the required report output.
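The customer exit variables mentioned above are typically implemented in include ZXRSRU01 (enhancement RSR00001, function EXIT_SAPLRRS0_001). The variable name ZCURMON below is a hypothetical example that defaults a calendar-month variable to the current period:

```abap
*----------------------------------------------------------------------*
* Sketch of a BEx customer exit variable in include ZXRSRU01
* (enhancement RSR00001). The variable name ZCURMON is hypothetical.
*----------------------------------------------------------------------*
DATA: ls_range TYPE rrrangesid.

CASE i_vnam.
  WHEN 'ZCURMON'.
    IF i_step = 1.                   " processed before the variable popup
      CLEAR ls_range.
      ls_range-sign = 'I'.
      ls_range-opt  = 'EQ'.
      ls_range-low  = sy-datum(6).   " current calendar month, YYYYMM
      APPEND ls_range TO e_t_range.
    ENDIF.
ENDCASE.
```

The I_STEP check matters: step 1 fills defaults before user input, step 2 runs after the selection screen, and step 3 allows cross-variable validation.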
I also provided project support by monitoring process chains and resolving technical issues.
SAP Public Budget Formulation (PBF) is an add-on component that provides a dynamic and collaborative environment to accommodate the requirements of the end-to-end public-sector budget formulation process. It is a standalone, comprehensive, web-based solution that supports budgeting in public-sector organizations, with unique processes for each department. The system landscape consists of an OLTP system (ECC) that transfers data to SAP BW in the process configuration layer. Leveraging the SAP NetWeaver platform, we used BI Integrated Planning functionality (with aggregation levels and input-ready queries) alongside other BW functionality such as staging, data loading, transformation, and reporting, and objects such as DSOs, cubes, and MultiProviders. SAP Visual Composer was used as the business process and user interface modeler. The application layer (which resides above the process configuration layer) provides functionality such as master data maintenance, budget formulation/expenditure projection, and forecasting. Integration with SAP BusinessObjects is available, and various categories of reports can be used. The presentation layer consists of the SAP Enterprise Portal, with an integrated single sign-on web access point for budgeting, reporting, and other legacy applications. I was involved in the SAP BW/IP processes while supporting the CBMS project.
Westpac is a large bank in Australia. I was part of the SAP team that developed the corporate-to-bank connectivity custom software for the bank. The software creates payment instructions in ISO 20022 format and sends them to the bank via SAP XI, and also receives status messages and displays them in a custom monitor. I was involved in all project activities from a development/testing perspective: requirements gathering, the design phase (including creation of the technical specification), development and unit testing, and UAT/customer acceptance.
I was part of the SAP team that extended the previously developed corporate-to-bank connectivity custom software for American Express. The solution originally delivered to AMEX could not handle cross-company-code transactions, and it was later decided that cross-company-code functionality (payment by one company code on behalf of another) needed to be added. I was involved in the development, testing, and documentation of the complete, detailed prototype for the project (we wrote all the code required to enhance the CDP).
It is a solution-providing company for which I performed activities such as data extraction from heterogeneous systems; building flows by creating different BW objects (InfoObjects, InfoProviders, transformations, InfoPackages, and DTPs) and, on top of those, process chains and meta chains; and generating reports using different tools. I also monitored data loads, familiarized users with the system, and resolved technical issues.
I worked with CMC as an Assistant Consultant on the DFS project (Directorate of Forensic Science Services), Gandhinagar. I was involved in building, implementing, and supporting the software, and I familiarized more than 300 users with the system and helped them resolve their system-related issues.