
Vijay Kocherla

SAP BW BO HANA Certified Consultant

partly available
  • Ashburn, VA (freelancer)
  • Graduation: not provided
  • Hourly/daily rate: 25 $/hour
  • Languages: English (limited professional proficiency)
  • Last update: 10.03.2015
Executive Summary
  1. SAP Certified Application Associate – SAP HANA 1.0
  2. SAP Certified Business Objects Web Intelligence XI 3.0 professional
  3. 12 years of experience in SAP BI. Possesses strong development skills in BO and experience using BODS ETL for data extraction into BW.
  4. Architect experience includes integrating ECC data, transformed BW data, APO Live Cache data, and non-ECC data for reporting needs.
  5. Solution Architect experience includes cleansing data from ECC, R/3, and legacy (non-SAP) systems, then transforming and consolidating the data for the BPC (Business Planning and Consolidation) application.
  6. Business Objects experience includes building universes on top of OLAP and non-OLAP databases. Strong conceptual knowledge of implementing row- and column-level security in BOBJ.
  7. BODS ETL skills include automating the BODS ETL data loads into BW and making the loads available on demand.
  8. SAP HANA implementation skills in building Analytic Views and Calculation Views for Business Objects reports.
  9. In-depth working experience on end-to-end implementations as well as support and maintenance projects.
  10. Efficient in ETL processes, including custom extractor design and development using views and function modules in the R/3 and ECC systems.
  11. Very good experience in performance tuning and in automating data loads using process chains.
  12. Project management activities such as project planning, staffing, client liaison, monthly billing, and SOW preparation.
  13. Business development activities such as writing proposals and SOWs, conducting pilot projects, taking part in due diligence, and performing assessment studies at client sites.
  14. Working knowledge of SAP modules SD, FI, APO, BPC, HR and CRM.
  15. SAP R/3 experience includes the ASAP methodology, planning and scoping, end-user training, and preparation of work instruction documents.
Key SAP Skills
  1. SAP HANA 1.0
  2. SAP Business Objects 3.X and 4.0
  3. SAP-BI Versions 7.01, 7.0, 3.5, 3.1C, 3.0B and 2.X
  4. SAP-ABAP versions ECC 6.0, 4.7 Enterprise , 4.6C, 4.6B, 4.0B and 3.1H
  5. Working knowledge of SAP modules SD, FI, APO, BPC, HR and CRM.
Education / Membership of Professional institutes
  1. M.Tech (IT), Master's degree in Information Technology, Punjabi Technical University, Patiala, India.
  2. M.C.A., Master of Computer Applications, Jamal Mohamed College, Trichy, India.
Professional Experience

Cognizant Technology Solutions, Senior Consulting Manager (June '06 - till date)

Tata Consultancy Services, Assistant System Engineer (April '04 - May '06)

Samtech Infonet Ltd, Technical Consultant

SAP Project Experience
Gate Gourmet, Reston, VA  
Duration: 12/2013 – Till date
Role: BI Lead and Architect
Defining enterprise-level best practices for naming standards, transport mechanisms, development standards, and security methodology.
Requirements gathering for building BLTX, ZFISI, and BLTX vs. GL reconciliation reports for billing and NA tax purposes.
Design the data flow for the BLTX requirement by leveraging the billing items data flow.
Design the data flow for the ZFISI requirement by leveraging the GL items data flow.
Design staging, transformation, and reporting layers to maintain enterprise architecture standards.
Combine SD and FI GL data flows for reporting purposes and design a report with SD and FI values, with RRI to reconcile values between the FI and SD applications at plant and profit center level.
Combine flight information data with COPA data and generate the report at flight level.
Define and develop the process to transform GL accounts to AR accounts in BW.
Define a process to pull payment terms and billing cycles to calculate days outstanding.
Created Analytic Views for RDA5 upload data, which is loaded on a monthly basis.
Created Analytic Views for RDA1 COPA data on a monthly basis.
Created Analytic Views for flight information data.
Created Calculation Views combining COPA and flight information data.
Created BO reports for PAX status at flight level.
Designing post-production activities such as data loads, user interactions, SLA preparations, etc.
CapitalOne, Richmond, VA  
Duration: 11/2012 – 11/2013
Role: BI Lead
Design a data warehouse for QM, Risk, and Issue data from OpenPages, Oracle, and Teradata databases.
Build a POC to load data into SAP HANA and integrate QM, Risk, and Issue data by building different views in SAP HANA.
Define naming standards, development standards, user groups, and the security mechanism.
Define the transformation logic for QM, Risk, and Issue related views as per the reporting requirements.
Design multiple ETLs to load from the source systems and define the periodicity of the data loads from each source system.
Build BOBJ Webi reports for the Issue, Risk, and QM application areas.
Work with the OpenPages, Teradata, and Oracle database teams to prepare transformation logic that meets the warehousing requirements.
Coordinate with end users and BSAs as part of requirements gathering.
Define post-production activities and prepare a plan to extend the warehouse to other verticals such as Bank, Card, Auto, etc.
Client: MeadWestVaco (MWV), Richmond, VA, USA     
Environment: BI 7.01, ECC6.0 and APO                                                                                                               
Duration: 02/2011 – 10/2012
Role: BI Architect
Project SCORE II:
  1. Requirement gathering from the FTM, CRM, APO and VMI reporting users.
  2. Prepare the Functional Specifications & Technical specification for FTM, CRM and VMI reports.
  3. Designed the BW architecture for SCORE II in alignment with the SCORE I architecture, without any impact on the SCORE I design.
  4. Built the custom extractors for extracting the BOM (Bill of Material) data from ECC system. This extractor is based on the function module which pulls the granular level of BOM data.
  5. Activated the Classification data sources for material Grade and Caliper for APO production planning.
  6. Built the Custom extractors in the APO system for pulling the data from the APO live cache.
  7. Developed the multiprovider to combine BOM data from ECC, General Ledger data (actual & Plan), COPA data (actual & Plan) and APO Semi finished goods data for meeting the Raw Material Cost & Usage reports.
  8. Developed the multiprovider to combine General Ledger & COPA data (Actual & Plan) from ECC and finished goods data from APO system for Cost of Goods Manufacture (COGM) reports.
  9. Built function-module-based extractors in the APO system for extracting data from the APO planning area (finished goods data) and APO Live Cache (semi-finished goods data).
  10. Wrote the start and end routines to convert MSF data into tons, because in the APO system quantity is always maintained in MSF at MWV.
  11. Built the Company Code level authorizations by creating the Company code as an authorization relevant object.
  12. Built the universes in BO environment and built the Webi reports for operational reporting purposes.
  13. Built the Process chains to automate the data loads.
  14. Coordinated with the offshore BW team on a daily basis.
  15. Organized weekly calls with the project management team to track the progress on the development and issues.
  16. As a part of SME activity, organized the SME calls twice a month. This is to ensure the best practices are followed in all the MWV projects.
Project BPC:
  1. As an architect provided the EDW layer in the BW system for cleansing & consolidating the data from 38 legacy systems to provide the data for BPC application.
  2. Built the data load strategy for the data loads from SAP and Non SAP systems by using the BODS ETL.
  3. Built the separate data flows and the warehouse in BW for data cleansing and consolidations.
  4. Designed the global master data configuration for critical master data objects (BU, LE, LOC, BULELOC, GL Account, Trading Partner).
  5. Designed the master data maintenance for the local master data in SAP BW.
  6. Business coordination for cleansing the data and integrating the local master data with the global master data.
  7. Designed the process to support the On Demand requests from the user.
  8. Designed the process chains to load the data at every month end.
  9. Built the Drill back reports in BW and connected them to the BPC application for the drill back functionality.
  10. Built complex routines to load the global master data values for all local master data by looking up the X-REF data.
  11. Built the multiproviders for the data cleansing and consolidation activities.
  12. Built separate data flows for groups of systems based on region, data volume, and the number and location of users.
  13. Built the Staging layer, transformation layer and reporting layers in BW as per the EDW standards.
Project MCOP / Split:
  1. Identified the business scenarios for the MCOP (Mead Corp Office Products) products.
  2. Prepared the list and processed the extraction jobs for loading the MCOP data from R/3 into BW.
  3. Identified the jobs in APO system which loads the data from APO to BW and deleted the jobs.
  4. Modified the jobs that load data for both MCOP and non-MCOP products so that they load only MCOP products in the ACCO system and only non-MCOP products in the MWV system.
  5. Modified the process chains in MWV BW system just to load the Non MCOP data.
  6. Modified the process chains in the ACCO BW system just to load the MCOP data.
  7. Deleted the MCOP jobs in the MWV R/3 system and deleted the Non MCOP R/3 jobs in the ACCO R/3 system.
  8. Deleted the MCOP jobs in the MWV APO system and deleted the Non MCOP jobs in the ACCO APO system.
  9. System was copied from MWV landscape to build the ACCO BW landscape.
  10. Prepared the Documentation and provided KT to the support team.
Client: Kennametal, Latrobe, PA, USA     
Environment: BO3.1, ECC6.0, BI7.01                                                                                                              
Duration: 04/2010 – 01/2011
Role: BI OTC process tech lead
  1. Understand and Analyze the OTC requirements. Gather the requirements from the business teams.
  2. Coordinate with Business and support business in completing the Functional Requirement documents.
  3. Conduct workshops on the requirements and freeze the scope of work. Modify the requirements as per the output of the workshop.
  4. Coordinate with BO technical team and tune OTC BO reports presentation and functionality.
  5. Built hierarchies for Assignment Group (Cust Sales) in BW to avoid the hierarchy limitations in BOBJ reports.
  6. Developed region and country info objects to overcome the hierarchy limitation in BOBJ.
  7. Prepare Design Documents and get approval from the business team
  8. Business Content installation for CRM and OTC Data flows
  9. Activate the Web templates for CRM reports, which are executed from the CRM user interface.
  10. Configure the report templates in the CRM system and execute the reports from the user interface.
  11. Activate all the standard data sources in the source system for CRM and OTC.
  12. Develop custom info providers for OTC Black Book reporting requirements.
  13. Develop custom data flow for the Black Book reporting requirements. Create Transformation, DTP and reports to meet the same requirements.
  14. Prepare the test cases for SIT2 testing and execute the test cases.
  15. Prepare and maintain issue log for all the issues during the development and resolve the issues.
  16. Activate the business roles for CRM and transport the same to the QA and PRD system.
  17. Transport all the developments to the QA and Productions systems.
  18. Automate all the data loads by creating the process chains.
  19. Create BEx variables and reports to meet the user requirements.
  20. Create logical partitioning to improve the performance.
  21. Run BIA for all critical cubes on which there are BEx and BO reports.
  22. Support the project plan preparation activity and make sure the deliverables are within the timeline.
  23. Support the UAT1 test case preparation and resolve all issues that occur during User Acceptance Testing.
Client: Kimberly Clark, WI, USA     
Environment: BO 3.1, BI7.0 and ECC 6.0                                                                                                               
Duration: 09/2009 – 03/2010
Role: BI Technical Expert
  1. Design/analyze/modify the SAP BW BEx reports as per the user requirements in the North America, Europe, and Global systems. Analyze/modify the variable screens, formulas, aggregations, exception aggregations, restricted key figures, and calculated key figures as per the functional requirements of the Kimberly Clark business team.
  2. Analyze/modify all info providers (InfoCubes, DSOs, and MultiProviders) to meet the users' reporting requirements. Check that all key figures are set to the correct update mode and are working correctly. Monitor that all activities in the administrative workbench run correctly in the BW system.
  3. Monitor the process chains to make sure the automated loads run within the timeframe so that users can execute their reports on a daily basis.
  4. Build/monitor the BIA server to make sure all BIA indexes are properly built and reorganized. Delete and rebuild the BI Accelerator indexes whenever required.
  5. Monitor HPSD and resolve the tickets accordingly. Issues can be in the front end (BEx) or in the back end (Business Information Warehouse). Monitor that all loads complete successfully, because incompletely updated data causes report data issues.
  6. Analyze and monitor the extractors and how data is extracted into the BI system. Check that the delta loads pull the correct number of records with the correct status. Fix the data loads if a delta does not bring the correct data or fails on a particular day.
  7. Analyze all the Finance info providers (FI-GL, FI-AP, FI-AR, and FI-CO) and load the data daily to meet the reporting requirements. If there are load failures, coordinate with the back-end team and make sure the loads complete successfully.
  8. Load the CRM data into the BI system daily. If there are discrepancies, analyze the data and resolve them as required; the solution could be deleting the existing delta loads and reloading the full data, or doing selective deletions in the info provider.
  9. For Sev#1 tickets, bring the system up and make sure users can access it. For Sev#2 and Sev#3 tickets, provide a temporary and a long-term fix and make sure the issue does not recur.
  10. For Sev#2 tickets, analyze the issue and check whether the info providers contain correct data. If they do, check the report filters and restrictions, fix the issue at the report level, and make sure the report is available for execution.
  11. Perform special activities during quarter and year close, such as watching the loads closely to avoid delays in report availability. If there are failures, make sure they are fixed in time and available for reporting.
  12. If data is not available for reporting, check that the BIA server is working and the indexes are properly organized; address any issues and make sure the reports have data for the KC business.
Client: Abraxis Biosciences, CA, USA     
Environment: BO 3.1, BI 7.0 and ECC6.0                                                                                                               
Duration: 04/2009 – 08/2009
Role: BI Architect
  1. Review the standard naming conventions documents for BI objects in both BI and ECC.
  2. Define the standard content installation procedure document.
  3. Identify the standard reports on the financial standard infoproviders.
  4. Review and filter the financial reports from the check list.
  5. Identify the standard data sources required for activating the canned reports selected by the functional owners.
  6. Identify the master data datasources for fulfilling the financial reports.
  7. Replicate the FI master and transactional data datasources in the BI system and migrate the data sources from 3.x to BI7.0.
  8. Activate the business content for all the info providers and the standard queries
  9. Create transformations from the data sources to the info providers.
  10. Modify the standard queries by adding the info objects which are already available in the infoprovider.
  11. Conduct the workshop and gather the user feedback and requirements.
  12. Migrate the BI requests in both ECC and BI landscapes.
  13. Prepare the Functional Specification document during the project preparation.
  14. Prepare the logical data models during the business blueprint.
  15. Prepare the technical documents during the realization.
  16. Go-Live preparation and post Go-Live support.
Client: Nortel Networks, NC, USA     
Environment: BO 4.0, BI 7.0 and ECC6.0                                                                                                                
Duration: 01/2009 – 03/2009
Role: BI Technical Lead
  1. Understand the System Requirements Documents on the Logistics Warehouse metrics.
  2. Analyze the requirements for the percentage of inbound orders where POD to GR recorded is less than 4 hours.
  3. Understand the requirement for the metric measuring the percentage of inbound orders where GR Recorded to Put Away time is less than 2 hours.
  4. Analyze the logic for the metrics measuring the % of DOs where (Release Complete - DO Create) <= 1 hour, (Pick Complete - Pick Release) <= 3 hours, (Pack Complete - Pick Complete) <= 3 hours, and (Goods Issue - Pack Complete) <= 4 hours.
  5. Analyze and design the data flow for the metrics measuring on-time GI recorded to pickup status from the carrier <= 1 hour, % of past-due shipments (only those going out from a Nortel warehouse, not a vendor), and % of in-transit overdue shipments.
  6. Design new custom extractors to pull data from the LTAP table for the metrics measured on the percentage of inbound orders based on GR recorded and POD.
  7. Design a new custom extractor to pull data from the MKPF and MSEG tables for some of the metrics requirements.
  8. Design 5 new DSOs to meet the phase 1 metrics requirements.
  9. Design 10 new reports on the phase 1 metrics.
  10. Monitor the WBS structure daily to make sure the team delivers on time.
  11. Coordinate with the offshore team on the development activities required to meet the phase 1 metrics.
Client: Merck, NJ, USA     
Environment: BO 3.0, BI7.0 and ECC 6.0                                                                                                               
Duration: 08/2008 – 12/2008
Role: BI-HR (PA & OM) Technical Lead
  1. Understand and analyze the HR Functional Requirements from the stakeholders meeting.
  2. Prepare the technical and functional documents for the Personnel Administration and Organizational Management reporting data flows.
  3. Design the architecture of the BI system, including the levels in the BW data flow and Webi reporting in the BO environment.
  4. Review and finalize the naming conventions for the development objects and reports.
  5. Designed column-level security in the BO environment on the DOB and SALARY columns.
  6. Created the user groups for Personnel Administration and Organizational Management.
  7. Assigned the column-level security to the PA and OM user groups.
  8. Created an alert on the appraisal column to notify the manager if an employee receives a BE (Below Expectations) appraisal twice consecutively.
  9. Created a validation on the report user to check whether the user belongs to the HR or non-HR user group.
  10. Used the INCLUDE universe concept to reuse EMPLOYEE master data for all employee validations.
  11. Implemented slowly changing dimension concepts in BW based on time dependency.
  12. Enhanced the data source 0HR_PA_1 for the Post and Nopost data.
  13. Created the data source for the Post, Nopost, and Position attributes.
  14. Physically partitioned the info providers.
  15. Developed the Webi reports for External Entrants, Exits, and Actions for Personnel Administration data.
  16. Developed the Webi report for Position Headcount for occupied, unoccupied, and vacant positions.
  17. Create process chains for the data loads and schedule them periodically
Client: The Toro Company, MN, USA     
Environment: BO 3.1, BI 7.0 and ECC6.0                                                                                                                
Duration: 02/2008 – 07/2008
Role: BI Architect
  1. Develop the technical specifications as per the requirements for the inventory management.
  2. Prepared design documents for universes / Webi reports.
  3. Performed fit-gap analysis on the AS-IS and TO-BE data flow setups.
  4. Developed the BEx queries in BW and made them available for Webi reporting and universe building.
  5. Defined database-delegated key figures in the universe to meet the Inventory Management reports.
  6. Used context techniques to avoid loops at the universe level in the BO environment.
  7. Accommodated ZIC_C01 and 0IC_C01 in the 0IC_C03 InfoCube and avoided loops.
  8. Replace the 2LIS_03_S196 and 2LIS_03_S198 with 2LIS_03_BF data source.
  9. Replace the existing reports with the BO Webi Reports.
  10. Perform the LO cockpit set up for the inventory management.
  11. Migrate the 3.x content data sources to the BI7.0 data sources
  12. Create transformations between the data source and the data targets
  13. Create process chains for the data loads and schedule them periodically
Client: Case New Holland, WI, USA     
Environment: ECC6.0, BI7.0                                                                                                                
Duration: 11/2006 – 01/2008
Role: BI Architect
  1. Requirement collection from the business users and analyze the requirements
  2. Design the CNH Finance Capital BI system as per the Business requirement
  3. Design BI architecture that includes the Levels in the data flow and the staging in the BI system.
  4. Planned the PSA backup and designed alternative staging in the BI system itself.
  5. Plan for the data availability in the final and middle level data targets because of the data volume.
  6. Design the different data flows for the Batch, online and interface postings.
  7. Design the separate process chain for the Batch, online and interface data flows
  8. Create jobs for each process chain and submit them to the external scheduling system, which triggers the jobs on a daily basis.
  9. Design different info providers depending on the field requirements in the reports.
  10. Create formulas in the query to channel the Split documents values for batch, online and AP postings.
  11. Involved in the High level design of the architecture from Legacy till ECC.
  12. Plan the data storage in the ECC system according to the data volume.
  13. Create indexes on the Custom table in the ECC to improve the performance of the extraction process from ECC to BI
  14. Analyze the split process in ECC and design the custom extractor accordingly to pull the corresponding values from the appropriate table, i.e., from FAGLFLEXA using the field XSPLIMOD rather than the BSEG field XMOD.
  15. Analyze the tables which are involved in these financial applications and design the extraction process.
  16. Custom extractor design to meet the CNH Capital Finance requirements by using the Function Module
  17. Design the flow of the extraction process in the function module.
  18. Improve the performance of the extractor by creating the indexes on the source table and by using the appropriate internal tables in the function module.
  19. Define the data packet size for loading data from ECC to BI; this is defined in the data transfer section of the InfoPackage and also in the SBIW transaction.
  20. Design a reconciliation process to make sure the data has been pulled into the BI system.
  21. Created transformation between the source and the target objects.
  22. Coordinated with offshore and built the offshore team to handle queries and production issues.
Client: Tata Tele Services Limited, India    
Environment: BW 3.5, SAP R/3 4.6 C (FI, SD and IM)
Duration: 11/2005 – 05/2006
Role: SAP BI module lead
  1. Involved in gathering business requirements and preparation of Information Template used for identifying data elements for future reporting needs.
  2. Analysis of data requirements and translation of these requirements into dimensional data models using extended star schema data modeling techniques, with an emphasis on actual implementation in BW.
  3. Worked with the Data management group during the process of Data Analysis and Data Mapping exercise.
  4. Installed and extended standard Business Content based on client requirements in SD, MM and FI modules for enhancing DataSources for transaction data, master data.
  5. Mapping all the required fields with the info objects in the BW system.
  6. Replicating the required data sources from R/3 system to the BW system.
  7. Applying the transformations as per the business requirements in the InfoSources.
  8. Creating the update rules as per the reporting requirements.
  9. Exporting the data sources and creating the data mart set up to meet the business functionality.
  10. Aggregates and the index creation for improving the query performance.
  11. Created variables, restricted key figures, and calculated key figures for the queries.
Client: Cadbury Schweppes, India      
Environment: BW 2.1C, 3.0B, SAP R/3 4.6 C (FI, SD)
Duration: 04/2005 – 10/2005
Role: SAP Technical Team Lead
  1. Involved in gathering business requirements from users and business analysts for sizing and critical area of the business and the system.
  2. Analyzing the Fiscal Year variant usage in the business and in the reports.
  3. Nature of the Bex reports used in the business and the expected behavior of the same reports after the project.
  4. Changed all periodic reports into monthly reports.
  5. Changed the technical design as per the monthly report requirements wherever the design did not match the current requirements.
  6. Performed initial analysis on the BW system wherever the fiscal year variant and fiscal year period have been used.
  7. Transferred the global settings from the R/3 system to the BW system and rebuilt the respective tables.
  8. Found the impact of the FYV change in the customer exit program.
  9. Prepared impact assessment documents wherever the system was impacted.
  10. Provided the solution after the IA documents were approved by the client.
  11. Analysis on all the customer exit variables defined in the include program ZRXRU01.
  12. Wrote the solution for all the impacted variables in the customer exit include program.
Client: GE Silicones, WI, US      
Environment: BW 3.0B, SAP R/3 4.6 C (FI, SD).
Duration: 06/03 – 03/05
Role: Offshore coordinator and module lead
  1. Extract the data from various R/3 and Non R/3 source systems.
  2. Create and maintain the process chains extensively to make the loading strategy efficient.
  3. Upload the data from BW to the third party using open hub services.
  4. Maintaining the info spokes and uploading the data to the browser.
  5. Quarterly, half-yearly and year end activities as per the customer requirements.
  6. Extract the data from OSi R/3 system and upload the data to the OSi FOTC Server to make the FOTC reports available to the user.
  7. Load all the sales data to the data targets using the Full/Delta uploads so that the SPAN reports would be available to the Global users.
  8. FTP the files from BW to the browser by running the EURCOCKPIT and GESCOCKPIT jobs.
  9. Applying the transformation logic for calculating the Position of the business as of that day in the year.
  10. Created reports on the position values so that QTD and YTD reports could be compared with the previous quarter and year values.
  11. Enhance the data source to bring the negotiable delivery data from the requested delivery date and the possible delivery date.
  12. Apply the logic to bring the NDD value.
  13. Performed the initialization operation whenever the extract structure changed.
Client: Cadbury Schweppes, India      
Environment: BW3.0B, SAP R/3 4.6 C (SD)
Duration: 04/02 – 05/03
Role: Developer
  1. Business content installation as per the business requirements.
  2. Assign Data Sources to the Info Source, Define Transfer Structure and apply the transformation logic as per requirements.
  3. Replication of the data sources from the source system to the target system.
  4. Filled and deleted the setup tables as the business needed.
  5. Analyzed and fixed problems related to LIS structures and extractors into ODS/InfoCubes, corrected query definitions, and carried out other support tasks.
  6. Designed two InfoCubes with daily and monthly granularity, with extraction from the PSA.
  7. Responsible for query performance using aggregates and indexes.
  8. Configured restricted key figures, variables, exceptions, and calculated key figures in queries.
Complete Skills Summary
  1. SAP HANA 1.0
  2. SAP Business Objects 3.X and 4.0
  3. SAP-BI versions BI7.0, 3.5, 3.1C, 3.0B, 2.1
  4. Working knowledge of SAP modules SD, FI, APO, BPC, HR and CRM.
  5. Office tools: MS Office, MS Project, Lotus SmartSuite
  6. Mail software: MS Outlook, Lotus Notes
  7. Operating systems: DOS, Unix, Windows 9X/NT/2000/XP
  8. Languages: ABAP, C++, Java, HTML, Assembly (8088)
  9. Databases: MS SQL Server, Oracle, MS Access
  10. Hardware: Intel Pentium-based PCs
Krish Pydikondala - Phone number will be given as required
Nikhil Thakoor - Phone number will be given as required
Available for working from remote