Channel: SCN : Document List - SAP for Automotive

DBM Order Tables


All the DBM-order-specific tables are listed below:

  1. /DBM/VBAK_DB Database Table - Order Header
  2. /DBM/SPLHDR_DB Database Table for Split Header Data
  3. /DBM/VBPA Order-Related Partner Functions
  4. PNWTYH Warranty Claim Header (PVS Node)
  5. /DBM/WTYD_ORDER Warranty-Relevant Data in DBM Order
  6. /DBM/JOB Job Administration
  7. /DBM/TASK Task Administration
  8. /DBM/TASKCH Task Scheduling Line
  9. /DBM/VBAP Main Sales Document: Item Data
  10. /DBM/SPLIT Main Sales Document: Split Data
  11. /DBM/VBEP Main Sales Document: Schedule Line Data
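As a rough illustration of how these tables hang together, the following Python sketch joins hypothetical header and item rows via a shared document number. The field names VBELN (document number) and POSNR (item number) are assumed from the standard SAP sales tables; verify them against the DDIC definitions of the DBM tables in your system.

```python
# Illustrative sketch (not DBM code): order header rows (as in /DBM/VBAK_DB)
# relate to item rows (as in /DBM/VBAP) via a shared document number.
# Field names VBELN and POSNR are assumptions based on standard SAP tables.

header_rows = [
    {"VBELN": "0000001001", "AUART": "DBM1"},   # one DBM order header
]
item_rows = [
    {"VBELN": "0000001001", "POSNR": "000010", "MATNR": "OIL-FILTER"},
    {"VBELN": "0000001001", "POSNR": "000020", "MATNR": "BRAKE-PAD"},
]

def items_for_order(vbeln, items):
    """Return all item rows belonging to one order document."""
    return [row for row in items if row["VBELN"] == vbeln]

print(len(items_for_order("0000001001", item_rows)))  # 2
```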

 

All the order-related information comes from these order tables.

 

Hope this helps,

Mathan R.


Automotive Consignment Process


Overview:


Consignment is the process of delivering a vehicle by an automotive manufacturer to a dealer without creating an outbound invoice. This is usually required if the dealer does not have an end customer but wants the vehicle for convincing prospective customers, or at times when the manufacturer wishes to support its dealer financially.

A vehicle that falls under the consignment category, although stored at the dealer premises, is owned by the manufacturer, and the dealer is not obliged to pay for it as long as it is in consignment stock. The dealer has the option either to consume vehicles from the consignment stock when there is an end customer, or to return them to the manufacturer once the consignment period is over.

 

Business Process Highlights:


• The decision making and determination of consignment vehicles is done by the manufacturer based on the requests sent by the dealer.

• As long as a vehicle is marked as a consigned vehicle, no outbound invoice can be created for it. However, the physical delivery is not affected.

• During the consignment period, all vehicles remain financially in the manufacturer's stock.

• The earliest a vehicle can be put on consignment is when the vehicle is ordered. The latest is before the vehicle leaves the dealer premises en route to the end customer premises.

• The actual consignment period starts only when the vehicle reaches the dealer premises.

• While a vehicle is on consignment, amendments to the vehicle order or dealer trades cannot be performed.

• Vehicles on consignment are marked as special stock, which allows them to be tracked separately.

• Consignment stock is managed separately within the manufacturer stock, so that information is available on exactly what stock is stored at the dealer location.


Process Flow:


Vehicle Consignment Fill Up:

 

This is the process by which a manufacturer stores the vehicle at the dealer premises while the manufacturer remains the owner of the vehicle. The sales order type for consignment fill-up is KB.

 

In vehicle consignment fill-up, only the order and the vehicle delivery take place.

The steps to create the vehicle consignment fill-up within SAP are as follows:

 

Step 1: Enter the Transaction Code VA01 in the Command Field

Step 2: The order type to be entered is KB (Consignment Fill Up)

Step 3: The Sales Organization/Distribution Channel/Division has to be entered.

 

 

Step 4: Once all fields are filled hit the “Enter” key to move to the next screen

Step 5: Enter Sold to Party / Ship to Party / PO Number/ PO date / Material /Order Quantity.

 

Step 6: Click on the button

Step 7: The message as shown below is displayed on the screen

 

 

Vehicle Consignment Issue:


The dealer can choose to take a vehicle from the consignment stock and use it as normal dealer stock, wherein generally the vehicle is sold to an end customer. If this happens, the manufacturer raises an invoice to the dealer, who then raises an outbound invoice to the end customer to whom the vehicle is sold. The outbound invoice also triggers the delivery of the vehicle from the dealer premises to the end customer premises.

This process is the vehicle consignment issue, and the sales order type is KE.

For vehicle consignment issue, an order, a delivery, and an invoice take place.

 

Step 1: Enter the Transaction Code VA01 in the Command Field

Step 2: The order type to be entered is KE (Consignment Issue)

Step 3: The Sales Organization/Distribution Channel/Division has to be entered.

 


Step 4: Once all fields are filled hit the “Enter” key to move to the next screen

Step 5: Enter Sold to Party / Ship to Party / PO Number/ PO date / Material /Order Quantity.

 

Step 6: Click on the button

Step 7: The message as shown below is displayed on the screen

 

Vehicle Consignment Return:

 

Once a vehicle has been issued to a dealer from the consignment stock, the vehicle can be returned by the dealer to the manufacturer for reasons such as a damaged vehicle, failure in quality checks, etc.

This process is the vehicle consignment return, and the sales order type is KR. For vehicle consignment return, an order, a delivery, an invoice, and a credit memo can take place.

 

 

Step 1: Enter the Transaction Code VA01 in the Command Field

Step 2: The order type to be entered is KR (Consignment returns)

Step 3: The Sales Organization/Distribution Channel/Division has to be entered.

Step 4: Once all fields are filled hit the “Enter” key to move to the next screen

Step 5: Enter Sold to Party / Ship to Party /Material /Order Quantity.

Step 6: Enter Order Reason

 

Step 7: Click on the button

Step 8: The message as shown below is displayed on the screen

 

 

Vehicle Consignment Pickup:

 

At the end of the consignment period the manufacturer might take back the vehicle into its own stock. The vehicle is moved from the dealer premises to the manufacturer premises. This process is the vehicle consignment pickup or consignment cancellation.

The sales order type used here is KA. In this case, an order and a return delivery take place for the vehicle.

 

Step 1: Enter the Transaction Code VA01 in the Command Field

Step 2: The order type to be entered is KA (Consignment pickup)

Step 3: The Sales Organization/Distribution Channel/Division has to be entered.

 

 

Step 4: Once all fields are filled hit the “Enter” key to move to the next screen

Step 5: Enter Sold to Party / Ship to Party /Material /Order Quantity.

Step 6: Click on the button

Step 7: The message as shown below is displayed on the screen
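The four consignment order types described above can be summarized as a simple lookup table. The following Python sketch is purely illustrative and mirrors the text: order types KB, KE, KR, and KA, and the documents involved in each process.

```python
# Summary of the four consignment sales order types described above,
# as a simple lookup table (illustrative only, mirrors the text).
CONSIGNMENT_ORDER_TYPES = {
    "KB": ("Consignment Fill-Up", ["order", "delivery"]),
    "KE": ("Consignment Issue", ["order", "delivery", "invoice"]),
    "KR": ("Consignment Return", ["order", "delivery", "invoice", "credit memo"]),
    "KA": ("Consignment Pickup", ["order", "return delivery"]),
}

def documents_for(order_type):
    """Return a readable summary of the documents for one order type."""
    name, docs = CONSIGNMENT_ORDER_TYPES[order_type]
    return f"{order_type} ({name}): " + ", ".join(docs)

print(documents_for("KE"))  # KE (Consignment Issue): order, delivery, invoice
```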

Disabling the ATP check in the F4 search for parts in a DBM order (user parameter)

$
0
0

Hi,

In a DBM order, when the user searches for parts (item category P002) with the standard F4 search /DBM/SH_PART_MARA, an ATP check for up to 500 materials is executed.

This can lead to performance problems.

 

The DBM function module /DBM/P_MAT_AVAILABILITY triggers the BAPI BAPI_MATERIAL_AVAILABILITY.

 

After implementing note 1403245, you can set the user parameter /DBM/F4_PART_NOATP to disable the ATP check.

 

The F4 search for parts will then be executed without an ATP check for each material.
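The following Python sketch illustrates the resulting control flow only; it is not the actual ABAP of /DBM/P_MAT_AVAILABILITY. The assumption, per the note, is that when the user parameter /DBM/F4_PART_NOATP is set, the search returns materials without calling the availability check.

```python
# Illustrative control flow (not the real ABAP): if the user parameter
# /DBM/F4_PART_NOATP is set, the F4 part search returns materials
# without running an availability (ATP) check per material.

def f4_part_search(materials, user_parameters, atp_check):
    """Return (material, availability) pairs; skip the ATP check when
    the /DBM/F4_PART_NOATP parameter is set (assumed behavior)."""
    if user_parameters.get("/DBM/F4_PART_NOATP"):
        return [(m, None) for m in materials]        # no ATP check at all
    return [(m, atp_check(m)) for m in materials]    # one ATP check per material

# Dummy ATP check standing in for BAPI_MATERIAL_AVAILABILITY
calls = []
def dummy_atp(material):
    calls.append(material)
    return 10  # pretend available quantity

hits = f4_part_search(["MAT1", "MAT2"], {"/DBM/F4_PART_NOATP": "X"}, dummy_atp)
print(len(calls))  # 0 -- the ATP check was skipped
```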

 

Regards

Joachim

Populating field LABVAL in table /DBM/VBAP


In table /DBM/VBAP (Item Data DBM Order), field LABVAL (Labor Value Number) is populated with the labor value number for the following item categories:

 

P001     Labor Value

P020     Replacement Car

P701     Subordered Labor Value

 

For item categories P008 (Manual Labor Value) and P708 (Subordered Manual Labor Value), field LABVAL is not populated in /DBM/VBAP by the DBM standard settings.

 

Please take this into account.

 

Regards

Joachim

Automotive Use Cases for SAP HANA



This space informs you about current use cases. Furthermore, we created a new group (SAP HANA Use Cases) on SCN where you can discuss SAP HANA use cases and give feedback.

 

Warranty Management

The BearingPoint Advanced Warranty Containment System (AWaCS), in combination with HANA, can support the warranty process. As a result, fewer products will be built and shipped with quality issues.

 

Improving the make-to-order manufacturing planning process

Optessa and HANA can be combined to best support processes that are based on big data, such as demand management, planning, and scheduling. For example, this allows better control when unpredictable events occur.

 

View more use cases in Automotive or discuss at SCN SAP HANA Use Cases.

Customer Hierarchy in Automotive Industry


Introduction

In the automotive industry, the major players are as below:

  • OEM
  • Importer/National Sales Center (NSC)/Distributor/OEM Sales Organization
  • Dealers
  • End Customer

The OEM is based out of the countries where the manufacturing plants are present. For a country specific sale, the importer/NSC imports the vehicles from the OEM and then sells them to the dealers, who in turn sell the vehicles to the end customers. This implies there is a multi-level buying group involved in the sale of an automobile. In large countries, there might be a huge number of dealers involved in the sales of vehicles to the end customers. There might be a very big dealer, under whom branches exist, or a logical grouping of dealers under a particular region might exist in that country. This grouping of dealers might have an impact on the sales of the vehicles, pricing of the vehicles or the mode of sales. Hence, a hierarchy in the customer structure exists involving the OEM / NSC / Dealers. In order to achieve this hierarchy, the SAP functionality of ‘Customer Hierarchy’ can be used.

 

Why Customer Hierarchy

 

There can be many business reasons for an OEM to opt for customer hierarchy. A few of them are mentioned below:

 

• Vehicle Sales

  – For an automotive company to get a better idea of the market demand for its products, it is useful if it has a clear demarcation of the market space. For this, it can opt for customer hierarchy, wherein it can ensure that its dealers operate in hierarchies belonging to different verticals. Through this, it can keep track of individual node/entity-wise demand, ensure better planning, and manage its stock in a better fashion.

  – When operating in markets with huge geographies, in order to have a structured sales format in place, companies can opt for customer hierarchy. This ensures ease of operation for the OEM in huge countries.

  – A company may want a clear demarcation of regions in a market (north, south, east, west, or area-wise) in order to empower the individual regions and ensure they operate as independent entities for optimal results. This can be achieved through customer hierarchy, wherein the head of every node in the hierarchy is entirely responsible for that particular node. This also gives the flexibility of incorporating different sales and pricing strategies for different regions, based on the varying dynamics.

  – A company can opt for a hierarchy structure if it needs to keep a tab on individual regions or nodes in a hierarchy for their targets. Hence, in order to better monitor sales targets, customer hierarchy can be used.

  – There might be cases where a very big dealer has branches under him performing the sales functions, which is a common scenario applicable to almost all automotive OEMs' setups worldwide. Customer hierarchy can be used to map the same in SAP.

• Pricing/Discount Handling

  – In a particular market, a company's pricing might vary based not only on the product line, but also on the region in which it is selling. This can be handled in the form of discounts. For this, customer hierarchy can be used to ensure that products sold in one region are priced differently, or that certain discounts are applicable only for certain regions.

  – An auto company might want certain incentives to apply only to certain dealers (best-selling dealers, etc.) based on certain criteria. Customer hierarchy can be used to achieve the same.

 

• Default business partner determination – There might be a requirement for certain automotive OEMs to have default partners for a particular order – say a payer, etc. – in order to address the requirement of specific pricing or any other sales-related process driven by the partner functions. If we assign a partner procedure to a partner function, we can obtain the desired business partner in the orders for each partner function.

 

• Process-specific requirements, like order cancellation and returns, where the customer determination of the vehicle might differ based on the customer hierarchy – In specific cases, there might be requirements that certain processes like order cancellation, returns, etc. should determine a specific customer in the hierarchy. This can be achieved by modifying the function modules in order to determine the desired customer. This can address the need to manage regional stocks effectively and avoid unwanted goods movement between regions, especially in bigger geographies.

 

• For huge automobile companies operating in big countries, it is typical to have a region-wise grouping of dealers. This has an impact on their sales- and delivery-related processes. Customer hierarchy can be used to achieve the same.

 

 

1. Customer Hierarchy Creation in SAP

 

a. Setting up the basics for customer hierarchy creation

 

[Screenshot 1]

 

In order to achieve the customer hierarchy structure, the following configuration settings are needed as prerequisites. They are all present in the path Sales & Distribution → Master Data → Business Partners → Customers → Customer Hierarchy.

 

• Define Hierarchy Types – OVH1

 

[Screenshot 2]

There are some default hierarchy types already provided by SAP. However, in case we need different hierarchies for different business processes, or if we need our own behavior of the hierarchy, we can define new hierarchy types. Here, we assign a customer hierarchy type to a partner function. There are 4 default customer hierarchy type partner functions available in SAP.

 

• Set Partner Determination for Hierarchy Categories

 

[Screenshot 3]

In this, we can set up the partner determination at various levels of partner determination procedures – customer master, sales/billing document header/item or the delivery level, for the partner functions defined for the customer hierarchy types. Once the partner functions relevant for hierarchy types are assigned, we need to assign the same to the desired partner determination procedure in order to ensure that the required maintenance is possible.

 

• Assign account groups – OVH2

 

[Screenshot 4]

In order to be able to create the customer hierarchy, we need to map the lower level customers’ account group to that of the higher level customers’ account group. For example, a particular sold-to-party customer can exist under both ship-to-party and sold-to-party. Once this is maintained, creation of hierarchy with customers will be possible only according to this mapping.

 

• Assign Sales areas – OVH3

 

[Screenshot 5]

For a particular account group, it is possible to restrict extending a customer to a higher level customer at a sales area level. This is to achieve the flexibility of the customer hierarchy related functionalities to apply only for certain sales areas. For example, a customer should fall under another region customer and be eligible for a discount only in case of normal business, and not when he is involved in lease sales.

 

• Assign Hierarchy Type for Pricing by Sales Document Type – OVH4

 

[Screenshot 6]

For a particular sales document type, only one customer hierarchy type can be assigned for pricing relevance. Hence, once a new customer hierarchy type is defined, and if it needs to reflect upon the pricing in the quotation or the sales order document, it needs to be assigned to both the document types.

 

b. Creation of customer hierarchy – VDH1N

 

Once all the set up for the customer hierarchy creation is done, the hierarchy can be created by clicking on the shown path in SAP menu or using the transaction VDH1N. For the maintained account groups and sales areas, we can give the higher level customer and the lower level customers’ sales areas and save. If the number of records for maintenance is too high, LSMW recording can be done, after which a file upload can be done in order to maintain the same in mass. Creation of records using VDH1N results in the table entries of KNVH.

Any process-specific function module (for order creation, order cancellation, returns, etc.) can be modified in order to fetch the higher- or lower-level customers from the table KNVH.
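As a rough illustration, the following Python sketch walks a hierarchy stored in KNVH-like rows to collect the chain of higher-level customers. The field names KUNNR (customer) and HKUNNR (higher-level customer) are assumptions based on the standard KNVH table; verify them against the table definition in your system.

```python
# Illustrative sketch of walking a customer hierarchy stored like table
# KNVH (assumed fields: KUNNR = customer, HKUNNR = higher-level customer).

knvh = [
    {"KUNNR": "DEALER_BRANCH", "HKUNNR": "DEALER_HQ"},
    {"KUNNR": "DEALER_HQ", "HKUNNR": "REGION_NORTH"},
    {"KUNNR": "REGION_NORTH", "HKUNNR": "NSC"},
]

def hierarchy_path(customer, rows):
    """Return the chain from a customer up to the top-level node."""
    by_child = {r["KUNNR"]: r["HKUNNR"] for r in rows}
    path = [customer]
    while path[-1] in by_child:
        path.append(by_child[path[-1]])
    return path

print(hierarchy_path("DEALER_BRANCH", knvh))
# ['DEALER_BRANCH', 'DEALER_HQ', 'REGION_NORTH', 'NSC']
```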

 


Vehicle Take-Back in Auto Industry


Introduction

 

In the usual SAP automotive space, vehicle buy-back or trade-in applies to used-car scenarios, where the selling entity (OEM Sales Organization/Importer/National Sales Center – NSC henceforth) buys the vehicle back from the buying entity (dealer). There are many occasions where the dealer returns the vehicle to the OEM/Importer/NSC. In such cases, the only possible way is the 'Returns' process, by issuing a 100% or returns credit memo. However, in certain countries it is not legal to return a vehicle once the selling activities have been completed, unless it is damaged or a wrong delivery. Hence, a vehicle 'take-back' process applies in such scenarios, wherein the buyer wants to return or sell the new vehicle invoiced to him back to the seller.

 

Why Vehicle Take-Back?

 

1. There might be scenarios where a dealer has ordered a vehicle from the importer/NSC and the vehicle is invoiced to him, but he later wants to give the vehicle back to the seller, perhaps because of the loss of a customer.

 

2. This may be needed as part of legal compliance in the country-specific business scenario.

 

Vehicle Creation and Invoicing Process in VMS

 

In a typical vehicle lifecycle, the following business activities take place:

  • Create a quotation and order
  • Create the vehicle with the required configuration
  • Build the vehicle and mark it for delivery to the customer
  • If an NSC is involved in the selling of the vehicles, a purchase order and invoice will be created
  • Creation of sales order, delivery, goods issue, and outbound invoice

 

A pictorial representation of the vehicle ordering & invoicing process (in a typical make-to-stock or order-to-stock scenario) is as below:

 

[Figure 1]

Following screenshots represent the creation of process documents for a vehicle:

 

Purchase Order Creation:

[Screenshot 2]

Goods Receipt for above Purchase Order:

 

[Screenshot 3]

Incoming Invoice:

 

[Screenshot 4]

Once a vehicle is created and configured, a purchase order is created, along with the goods receipt and inbound invoice. Then, a sales order is created, along with the delivery, goods issue, and outbound invoice.

 

The following figure represents the take-back process for a vehicle which is invoiced to a dealer:

 

Vehicle Take-Back Process

 

Below figure pictorially depicts the processes involved in vehicle take-back

 

[Figure 5]

1. Create the vehicle take-back purchase order, using transaction ME21N, for the customer (which is defined as a vendor), based on the values of the outbound invoice.

 

2. Create the vehicle take-back goods receipt document with reference to the take-back purchase order, using transaction MIGO. This nullifies the original goods issue which was posted during the original sale of the vehicle.

 

3. Create the vehicle take-back invoice document with reference to the take-back purchase order, using transaction MIRO. This nullifies the original outbound invoice which was created during the original sale of the vehicle.

 

4. Once the vehicle is made available for resale, a new sales order document can be created for any other dealer/customer.
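The take-back document chain above can be summarized as an ordered step-to-transaction mapping. This small Python sketch is purely illustrative and mirrors the steps in the text.

```python
# The take-back document chain described above, as an ordered mapping of
# process step to SAP transaction (illustrative summary of the text).
TAKE_BACK_STEPS = [
    ("take-back purchase order", "ME21N"),
    ("take-back goods receipt", "MIGO"),   # nullifies the original goods issue
    ("take-back invoice", "MIRO"),         # nullifies the original outbound invoice
]

for step, tcode in TAKE_BACK_STEPS:
    print(f"{tcode}: {step}")
```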

 

Pre-Conditions:

 

1. The customers are all defined as vendors as well, in order to ensure that the creation of the purchase order for that particular vendor by the OEM/NSC goes through.

 

2. The vendor master (XK01) is created with the corresponding data for that particular customer as in the customer master data (XD03). The customer–vendor link can be created by using the vendor field in the customer master of the dealer.

 

3. The standard function modules used to create the purchase order / goods receipt / inbound invoice need to be modified in order to ensure that the flow of document creation is allowed, and also to decide whether the automatic population of values from the outbound invoice into the take-back purchase order is to be allowed.

 

4. If custom developments are involved, the corresponding custom VELO actions have to be enhanced to incorporate the above prerequisites.

 


 

About the Author

Vinod Sripada is an SAP SD Consultant in Infosys Ltd. He has 2 years of domain experience in sales and marketing vertical of an Indian automotive major, and 6 years of SAP SD experience in the automotive vertical. He has worked extensively on industry specific SAP solution of IS-Auto. He holds a B.Tech degree in Computer Science from JNTU, Hyderabad followed by MBA from Nirma University, Ahmedabad.


PS: Please note that the 3 SAP screen shots used in the document were provided by Prasanna Mukdapu, who is an MM Consultant in Infosys Ltd.

Data Migration for an Auto OEM in a complex IT landscape


Introduction

 

In any implementation or roll-out of an SAP template, one of the first and most critical activities is data migration. Data migration is the process of transferring the business-critical data from a legacy system to the ERP system (an SAP system is taken as the case in point in this document), which is imperative for the continuation of the business once the legacy system is down and the SAP system starts getting used in day-to-day business activities. The data needs to be migrated to the SAP system in such a way that it is exactly in the state it was in the legacy system at the time of migration, and can be processed later once the SAP system goes live.

This document covers one of the migration approaches that could be followed for an automotive OEM at the time of an SAP implementation or roll-out in a particular country or NSC (national sales center). The basic assumption is that, typically in the automotive space, the landscape is quite complex. The automotive industry was among the early adopters of IT, so there could be many IT systems in the landscape which are not as integrated as desired, which means that when transitioning to a more integrated system like SAP, the data sources are disparate. Hence, the challenge at the time of migration is to ensure that the data is effectively fetched from the external systems into the SAP system and that data sanctity is maintained.

 

IT Landscape

 

In the automotive industry, the IT landscape might typically look as below:

 

 

[Figure 1]

 

 

  • Legacy system, which is the main source of the vehicle data. This is being replaced by the SAP system. It is assumed that the legacy system is an ordering system which depends on the factory systems to book the orders in the plant.
  • Financial system, which is assumed to communicate with the SAP system once the legacy system is replaced. It is assumed that the financial system is the source of the customer and vendor master data, in addition to the vehicle accounts data. Other external systems could also act as master data sources.
  • Master data sources, which provide the material master data or configuration-related data. These can exist in case there is a centralized dedicated system to supply the master data related to the vehicle models for use across the IT landscape.
  • Factory systems (the manufacturing systems), which are the source of the latest production-status data. In the automotive space, the factory systems typically register the data from the ordering system and provide real-time production status data back to it. The ordering system communicates with the factory systems on a regular basis to book the vehicles for production and receives order updates back from the manufacturing system.
  • Local systems, which exchange data with the legacy system to address country-specific needs.

 

During the implementation or roll-out of the SAP template, the legacy system is replaced by the SAP system. The legacy system is the source of all the order data which needs to be transferred to the SAP system. In addition, data needs to be fetched from the other sources shown, to ensure that sanctity is maintained. The complexity lies in the fact that data needs to be fetched from the different sources into the SAP system, staged, validated, and then processed to ensure it is in sync with the different sources.

 

Data Exchange between systems in the landscape

 

 

[Figure 2]

 

 

 

 

As part of migration, there can be 2 types of data that need to be migrated or maintained in the SAP system – Master Data and Transaction Data. The sources for these could be different. There might be a need for the master data to be migrated before the transaction data.

 

  • Legacy system to SAP system: Typically, the legacy systems provide the data in the form of external files which are then manually uploaded into the SAP system using custom programs or LSMW. The files could be in any desired format, such as text files, Excel files, etc. Alternatively, a communication channel between the legacy system and the SAP system could be established in order to transmit the data. The uploaded or transmitted data in SAP is first stored in intermediary staging tables, checked for sanctity using validation programs, and subsequently used in the data processing.
  • Factory system to SAP system: The factory system, which could be a non-SAP system, typically interacts with the ordering system via PI. In order to receive the latest information from the factory, the messages sent by the factory are converted into IDocs and then transmitted to SAP. This data could be stored in the staging tables to be accessed by the migration programs during data processing, or the migration programs could directly retrieve the data from the IDocs and use it in the data processing.
  • Master data source to SAP system: In case there are multiple master data sources in the landscape, communication with the SAP system could be through web service calls for instant exchange, and also IDocs via PI, in order to exchange the master data. In certain cases, the vehicle master data is received from these systems, against which the material master is created in SAP. This is part of the master data setup before the migration run.
  • Financial system to SAP system: It is possible in certain scenarios that the debtor and creditor master data is sent from the finance system to the SAP system. This is part of the master data setup. This could be done using the CREMAS/DEBMAS structures, and any additional information for customers or vendors could be uploaded using LSMW objects.

 

 

Data Migration Process

 

 

[Figure 3]

 

 

 

The whole data migration process can be split into the 4 mentioned stages:

 

  • Pre-migration: As part of this stage, the data migration checklist is verified and the readiness is gauged, both from the legacy system and the SAP system point of view.
  • Preparation: All the necessary data from the different sources is loaded into the staging tables in SAP and validated against a pre-defined set of rules.
  • Execution: This is when the staged data is actually processed and the order data is taken to a point in SAP where it is in sync with that of the legacy system.
  • Post run: The post-data-migration reporting is done as part of this stage.

 

 

Steps in Data Migration

 

 

[Figure 4]

 

 

 

The above figure shows the modularized phases in the data migration activity, both from the legacy system data preparation point of view and the SAP system migration activity point of view.

 

Legacy System

  • Data Extraction: All the data that is in scope for migration is finalized and the data is extracted in the desired format. This could be in the form of text files, excel files or any other data format that is agreed.
  • Basic Validations: Once the data is extracted, there is a sanity check that is done in order to ensure that the data being provided to the SAP system is correct. This could be done by certain rule based programs in the legacy system.

 

SAP System

  • Data Staging: As part of this step, data from multiple sources (legacy system, factory systems, etc) is fed into the SAP system. This is just the staging of data before migration where in the data is stored in intermediary staging tables. This data will include all the latest order and distribution related information (including documents related data), which is subsequently used for the next steps.
  • Data Validation: Once the complete data which is in scope of migration is staged in the SAP system, it needs to be validated to ensure that the data meets the criteria for the migration process to happen smoothly, and also post-migration, the data could be processed further. The validation could be in the form of certain custom programs with pre-defined rule sets. This is a very critical step in the migration process as data sanctity is dependent on this step.
  • Error Handling after Validation: After data validation, errors in the data provided by the legacy system could be reported back and correct data could be re-provided after fresh data extraction.
  • Migration Run: Once the necessary data is all staged and sanity checks done, the migration activity could be initiated. The creation of all the data in SAP could be done by using custom programs or by using LSMW objects.
  • Results and Error Reporting: Once the migration activity is complete, the results and the error list could be checked using custom reports or job spools.
  • Post Migration activities: This step involves all the post migration activities, like output type processing, idoc processing, etc.
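The staging and validation steps above can be sketched as a small rule-based check: staged records are validated against a pre-defined rule set before the migration run, and failures are reported back for re-extraction. The following Python sketch is illustrative only; all record fields and rule names are hypothetical.

```python
# Illustrative sketch of the rule-based data validation step: staged
# records are split into valid records and (record, errors) pairs.
# All field names and rules below are hypothetical examples.

def validate_staged_records(records, rules):
    """Apply each (check, message) rule; collect failures per record."""
    valid, errors = [], []
    for rec in records:
        failed = [msg for check, msg in rules if not check(rec)]
        if failed:
            errors.append((rec, failed))
        else:
            valid.append(rec)
    return valid, errors

# Hypothetical rule set: every staged order needs an id and a customer.
rules = [
    (lambda r: bool(r.get("order_id")), "missing order id"),
    (lambda r: bool(r.get("customer")), "missing customer"),
]

staged = [
    {"order_id": "A1", "customer": "DEALER_1"},
    {"order_id": "", "customer": "DEALER_2"},   # fails the first rule
]

valid, errors = validate_staged_records(staged, rules)
print(len(valid), len(errors))  # 1 1
```

Erroneous records would then be reported back to the legacy side for correction and re-extraction, as described in the error-handling step above.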

 

It is imperative to follow a structured approach with distinct process steps, as the volume of data to be migrated could be huge. A checklist could be prepared and monitored to ensure sequential execution of the steps and a hassle-free data migration process. Also, an optimal number of test-migration runs, involving all the systems in the landscape, need to be conducted in order to ensure the below:

  • Data which is being provided by the different systems in the landscape is correct.
  • Sequence of steps which are to be executed are finalized for the main production activity.
  • All the custom programs and objects which are being used at various stages for data validation, data creation and data processing are correct and the final data which is being stored in the SAP system is as desired.

 

Challenges

 

As part of the main migration cut-over activity, some of the challenges that could crop up are:

 

  • The first challenge could be finalizing the scope of the migration. The scope needs to be large enough to ensure business continuity, yet not so large that it jeopardizes a smooth cut-over.
  • With many systems involved in the landscape during the cut-over/migration activity, it needs to be ensured that the interfacing with these systems is properly set up and that the data provided by the external systems is as desired.
  • One of the main challenges could be the system downtime. The production migration activity begins once the existing legacy system is brought down (with no further ordering possible) and the data is extracted and provided to the SAP system. The time frame between bringing down the legacy system and bringing up the SAP system for business needs to be kept as short as possible.

 

One way to address the challenges mentioned above is to have a sufficient number of pre-production migration runs (based on the project timelines). This ensures that the sanctity of all the necessary data provided by the external systems is maintained and that the interfacing with all the external systems is tested. Also, after every pre-production run, the migrated data needs to be processed and taken further forward in the life cycle. This confirms that the data is being migrated correctly and ensures business continuity once the SAP system goes live.

 

 

About the Author

 

Vinod Sripada is an SAP SD Consultant at Infosys Ltd. He has 2 years of domain experience in the sales and marketing vertical of an Indian automotive major, and 6 years of SAP SD experience in the automotive vertical. He has worked extensively on the industry-specific SAP solution IS-Auto. He holds a B.Tech degree in Computer Science from JNTU, Hyderabad, followed by an MBA from Nirma University, Ahmedabad.


Data Archival in IS-Auto


INTRODUCTION

When enterprises use an ERP for their business transactions, they generate a lot of data daily. Over the years, this data accumulates on the ERP servers and gradually degrades system performance. SAP offers its customers a standard data archival solution, with which data can be moved from the SAP server to an alternate location (a separate server) that can be accessed on a need basis.

In this document, the archival process is explained at a high level for the automotive solution from SAP (IS-Auto).

PREPARATIONS FOR DATA ARCHIVAL

Archival of data from the servers requires detailed planning, careful analysis and meticulous execution. Thorough testing of the archival process in test systems and establishing the approach for archiving the data in the production environment are key to a successful project. The following sections list the general preparations required for an archival project.

Identification of data for archival

The following topics need to be considered as part of the data identification process.

1. Scope of data to be archived

Identifying the data to be archived is an important aspect of the archival process. The scope definition can stem either from a business decision (to reduce the database size) or from an organization policy (e.g. all data older than 5 years should be archived). The cut-off is decided accordingly.

There can be three different types of data in any business database. These are:

  • Master data
  • Transaction data
  • System Admin data

The archival objects, grouped into the three types of data, are listed below:

Transactional Data

  • VEHICLE: Vehicle
  • FI_DOCUMNT: FI Documents
  • MM_EKKO: Purchasing documents
  • SD_VBAK: Sales documents
  • MM_MATBEL: Material documents
  • MM_REBEL: MM Invoice documents
  • RV_LIKP: Delivery documents
  • SD_VBRK: Billing documents

System Admin. Data

  • BC_DBLOGS: DB Tab Logs
  • IDOC: IDocs
  • BC_SBAL: Application Logs
  • CATPROARCH: CATT-Log/Procedure

Master Data

  • MC_S600: Evaluation Structure (S600)
  • MM_SPSTOCK: Batches and Special Stock
  • MC_S033: Evaluation Structure (S033)
  • CA_BUPA: Business Partners
  • FI_ACCRECV: Customer Master
  • MM_MATNR: Material Master
  • MM_HDEL: Material Valuation: History
  • MC_S034: Evaluation Structure (S034)
  • MC_S035: Evaluation Structure (S035)
  • MC_S012: Evaluation Structure (S012)
  • MC_S014: Evaluation Structure (S014)
  • FI_ACCPAYB: Vendor Master
  • MM_EINA: Purchasing Info records

 

Among these, master data consumes only a minute amount of space in the database. The following diagram pictorially represents the database usage of the different types of data:

Figure1_update.png

Different filter criteria should be applied according to the type of data, for example:

  1. All tables (standard or custom) whose size exceeds ‘X’ megabytes (MB) should be considered for archiving.
  2. All vehicle records older than ‘Y’ years should be considered for archiving.
  3. All non-SAP documents (such as PDFs) older than ‘Y’ years, or pertaining to vehicles selected in step 2, should be considered for archiving.
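As a rough illustration, the scope filters above could look like this; a Python sketch with made-up cut-off values, not the project's actual selection programs:

```python
from datetime import date

# Sketch of the scope filters. The cut-offs 'X' and 'Y' are project
# decisions; the values here are purely illustrative.
SIZE_CUTOFF_MB = 100      # 'X'
AGE_CUTOFF_YEARS = 5      # 'Y'

def tables_in_scope(tables):
    """tables: list of (name, size_mb) pairs; keep tables above the cut-off."""
    return [name for name, size_mb in tables if size_mb > SIZE_CUTOFF_MB]

def vehicles_in_scope(vehicles, today):
    """vehicles: list of (vin, created_on) pairs; keep vehicles older than 'Y' years."""
    cutoff = date(today.year - AGE_CUTOFF_YEARS, today.month, today.day)
    return [vin for vin, created_on in vehicles if created_on < cutoff]

big = tables_in_scope([("VLCVEHICLE", 250), ("ZSMALL_TAB", 10)])
old = vehicles_in_scope([("VIN1", date(2008, 6, 1)),
                         ("VIN2", date(2013, 6, 1))], today=date(2015, 1, 1))
```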

2. Residence and Retention Period

Residence time is the minimum length of time that data must spend in the database before it is archived.

Retention period is the total length of time for which the data is kept before it is completely deleted. This is based on the organization’s policy.

While archiving the data, organizations need to decide on the residence time and the retention period.

Figure2.png
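The two periods can be illustrated with a small timeline check for a single record; the durations below are example values, not an organizational policy:

```python
from datetime import date, timedelta

# Illustrative timeline: data stays in the database for the residence time,
# then becomes archivable, and may be deleted only after the retention
# period. Both periods are example values.
RESIDENCE = timedelta(days=365 * 2)
RETENTION = timedelta(days=365 * 10)

def archival_status(created_on, today):
    if today < created_on + RESIDENCE:
        return "keep in database"    # residence time not yet over
    if today < created_on + RETENTION:
        return "archivable"          # may be moved to the archive server
    return "may be deleted"          # retention period over
```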

3. Exclusion rule sets

 

Certain information, however old, can remain very important to the organization and should not be archived. While archiving, it is important to identify such data sets and formulate rules to exclude them from archiving. Custom ABAP programming will be necessary to frame these rule sets.

4. Strategy for non-standard SAP data   

Non-standard SAP data constitutes the bulk of the data in the VMS archival process. It can be classified into two categories.

  • PDF Documents: Organizations normally keep PDFs for customer-relevant documents such as customer orders, customer offers, sales invoices and credit or debit notes. Such documents can be archived, stored separately and accessed independently of the vehicle data. Therefore, a separate archival strategy has to be adopted for this type of data.
  • Custom Tables: Non-standard SAP tables are another source of huge data volumes and need to be considered along with the standard SAP tables. A list of the custom tables used in the SAP environment needs to be prepared, and from this list, all tables above the pre-decided cut-off size need to be considered for archival.

5. Creation of DART Files

The Data Retention Tool (DART) is functionality provided by SAP to meet the requirements of tax authorities. This tool extracts company-code- and period-dependent data, along with the corresponding master data, from the database and prepares ASCII files that can be read using external tools.

Hence, if tax authorities or local laws require such documents to be created, DART files need to be created as part of the project.

6. Identification of server to store the archived data

Normally, archived data is stored on a separate server that can be accessed with restricted user permissions. The organization needs to identify the retention period for the data, after which it can be discarded. Since it is expensive to manage the data in-house, service providers (e.g. SAPERION) often offer this service for a fee. Based on the organization’s data retention and audit policies, an agreement can be made with a service provider to retain the data.

7. Security of the Data

Once the data has been stored in an external database, there may be occasions where users need to access it. Access has to be restricted to only the information requested. In a multi-country/market environment, access to the data of one market should be restricted for the users of another market.

8. Rule sets to access the archived data

Once the data has been archived and stored (either internally or externally), a policy has to be formulated for accessing the stored data. The way archived data is accessed also differs from the normal way of displaying data in the system, so a short training session can be organized together with the service provider offering the data storage facility.

Even though the data is available for display, service providers grant limited access, and repeated access comes at a price. It is advisable to access this data only for legal and auditing purposes.

9. Identification of the archival sequence

It is not possible to archive objects/documents while dependent objects/documents are still present in the system. As a first step, all the dependent objects need to be archived, and then the remaining objects can be archived. The following figure shows how the sequence can be organized for the VMS archival procedure. PDF content can be archived separately before the archival of SAP documents begins.

Archival Sequence.jpg
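The dependency-driven sequencing can be sketched as a topological sort; the dependency map below is a hypothetical example, not the actual VMS object dependencies, which come from the project's own analysis:

```python
from graphlib import TopologicalSorter

# Hypothetical map: each archiving object lists the objects that must be
# archived before it (dependent documents first).
archive_before = {
    "VEHICLE":    {"SD_VBAK", "RV_LIKP", "SD_VBRK", "FI_DOCUMNT"},
    "SD_VBAK":    {"RV_LIKP"},
    "RV_LIKP":    {"SD_VBRK"},
    "SD_VBRK":    {"FI_DOCUMNT"},
    "FI_DOCUMNT": set(),
}

# static_order() yields each object only after all its prerequisites,
# giving a valid archival sequence.
sequence = list(TopologicalSorter(archive_before).static_order())
```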

10. Planning for Volume Testing

The success of an archival project relies on effective implementation and realization. In order to have a defect-free application, it is essential to test the customizations and changes thoroughly. Volume testing is also recommended, since it gives a better picture of the system behavior and an estimate of the total time required for the archival process.

ARCHIVAL PROCEDURE

1. Standard customizations

In standard customization, the user sets up the system to archive the standard SAP objects such as billing documents, sales documents etc. The system is customized to select the files for archiving based on pre-determined conditions.

In VMS, users can select vehicles based on the following criteria:

  • Specific primary status of the vehicle
  • Specific secondary status of the vehicle
  • Specific action on the vehicle
  • Residence period

The following figure shows the duration from vehicle creation to the end of the residence period.

figure4.png

Below is the screenshot of the standard SAP transaction (OBR8) where the residence period in days can be customized.

figure5.png

2. Data Backup

As in all critical projects, where a database backup is taken before the go-live of new software, it is essential to take a backup of the complete data before the data archival is started. This helps if a data restore is required, either because data that was not supposed to be deleted/archived was inadvertently deleted, or because issues during archival corrupted the data.

Hence this step needs to be considered during project planning as a mandatory step before the archival step.

3. Archival

The Archive Development Kit (ADK) is a technical framework provided by SAP for a secure archival process. The framework provides a cluster of tools enabling the complete archival procedure: the interfaces, function modules, example programs and documentation needed to develop one's own archiving programs. Through the ADK, SAP ensures that data archiving is independent of hardware and release changes.

ADK includes:

  • Archive Administration (SARA) functions
    • Pre processing
    • Write
    • Delete
    • Post processing
    • Read
    • Index
    • Storage System
    • Management
    • Database Tables
    • Info System
    • Statistics
  • An API to develop customer-specific archiving programs
  • Archiving-specific function modules
    • Automatic Job scheduling
    • File management using SAP Content Management Infrastructure
    • Connection to the storage system
    • Compression during archiving
    • Possibility of archiving while the system is running
    • Network graphic for showing object dependencies
    • Access to individual data objects in the archive
  • Sample programs and documentation

 

Once the customizations related to archival are completed, the archival process can be initiated. The archival process is managed through transaction SARA. The following functions are possible using this tool:

  • Archiving Customizing
  • Overview of the archiving jobs
  • Assigning tables to archiving objects (DB15)
  • Data archiving statistics
  • Archive Information System


figure6.JPG

If there are custom tables in the VMS system, they also need to be considered for archival based on their size, and they need to be included while customizing.

4. Reconciliation

After the archival procedure is completed, it is advisable to reconcile the data before and after archival (i.e., the data archived plus the errors during the archival process should equal the original data set). This helps to identify whether any data was missed or any error occurred in the process, so that issues can be found, fixed and their impact minimized.
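The reconciliation rule can be expressed as a simple count check; the counts below are illustrative:

```python
# Reconciliation sketch: the archived count plus the errors reported during
# the run must account for every record in the original data set.

def reconcile(original_count, archived_count, error_count):
    """Return (ok, unaccounted): ok when every original record is accounted for."""
    unaccounted = original_count - (archived_count + error_count)
    return unaccounted == 0, unaccounted

ok, unaccounted = reconcile(original_count=10_000,
                            archived_count=9_950,
                            error_count=50)
```

A non-zero `unaccounted` value flags records that were neither archived nor reported as errors, which is exactly the situation the reconciliation step is meant to catch.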

OVERSIGHT OF THE PROCEDURE AND DOCUMENTATION

During data archiving, it is important to consider the legal and other compliance aspects.

1. Legal Compliance

While archiving the data from the business database, it is important to consult the legal and auditing teams. All local laws pertaining to the retention of documents need to be complied with, and all information related to the archiving process needs to be documented and preserved.

It is advisable to involve an auditing team in the project and take sign-off from them. This also ensures transparency and helps to analyse issues that may come up at a later point in time.

2. Documentation

From the project management and auditing point of view, it is important to keep the documentation of each and every project milestone such as requirement gathering, solution design etc. in the archival project. Necessary approvals (sign off) have to be obtained from all the concerned stake holders. This process has to be established early in the project life cycle so as to ensure the compliance from all the team members.

CHANGE REQUEST HANDLING

After the execution of the initial archival process, there could be requirements for further improvements and changes to the archival programs and processes. These changes should be handled diligently and carefully, because the archival process works on the live database: any issue arising from changes to the archiving programs can cause problems in the live database. The following points need to be considered when changes to archiving programs are taken up:

  • All the concerned stake holders should be consulted about the changes
  • Thorough regression testing should be taken up in order to rule out any issues due to the changes taken up

PROJECT RISKS

Since operational data is taken out of the database, it is necessary to tread with caution. The following are possible risks during project execution that need to be managed:

  • Project execution delays
  • Wrong data being archived
  • Inefficient co-ordination between stake holders
  • Legal complications

Warehouse Activity Management : Overview


Warehouse Activity Management : Overview


Authors : Abhishek Banerjee , Prasad Bathala


Introduction:

 

Normally, the business expects a Transfer Requirement (TR) to be converted into a Transfer Order (TO), and the TO to be subsequently confirmed to make the stock available to user departments such as Production and Sales, within a stipulated time based on business requirements and priorities. In such scenarios it becomes difficult to manually identify delays caused by errors in a document or process as more and more such documents are generated.

 

Hence, the Warehouse Management system provides functionality mainly intended to assist warehouse personnel in overseeing, planning and optimizing work processes in the warehouse, and to notify them of delays or errors in the overall system.

 

Purpose:

 

This document outlines the configuration required for the Warehouse Activity Monitor and how to implement it in a warehouse.

 

Initial Configuration:

 

To set up Warehouse Activity Monitoring, the required configuration is outlined in the following steps:

 

Step 1: Activate the warehouse activity monitor objects for the given warehouse

 

 

Each warehouse activity monitor object is assigned the warehouse number (e.g. 005) as shown below.

 

 

Step 2: Now define the critical parameters for each object

 

 

For each object, define the critical time period as per the business requirement. Here I have defined the critical period as 10 minutes, which means that transfer orders created more than 10 minutes ago and still awaiting confirmation will be considered critical.
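The monitor's rule for unconfirmed transfer orders can be sketched as follows; illustrative Python with made-up TO numbers and timestamps, not the standard monitor report:

```python
from datetime import datetime, timedelta

# Sketch of the rule: a transfer order is critical when it was created more
# than the critical period ago and is still unconfirmed.
CRITICAL_PERIOD = timedelta(minutes=10)

def critical_transfer_orders(transfer_orders, now):
    """transfer_orders: list of (to_number, created_at, confirmed) tuples."""
    return [to for to, created_at, confirmed in transfer_orders
            if not confirmed and now - created_at > CRITICAL_PERIOD]

now = datetime(2015, 1, 1, 12, 0)
critical = critical_transfer_orders(
    [("TO-1", datetime(2015, 1, 1, 11, 30), False),  # 30 min old, unconfirmed
     ("TO-2", datetime(2015, 1, 1, 11, 55), False),  # only 5 min old
     ("TO-3", datetime(2015, 1, 1, 11, 0), True)],   # already confirmed
    now)
```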

 

 

 

Step 3: For each object, a report is defined by standard SAP as shown below

 

 

 

For each of the reports shown above, you can define a variant and the values for the variant as shown below.

 

 

Warehouse Activity Monitoring: Basic Scenario

 

The following outlines a basic scenario for Warehouse Activity Monitoring:

 

Step 1: Create a background job for warehouse activity monitoring, executing each report program in sequence, using transaction SM36. For example, the report programs RLLL01SE and RLLL02SE are assigned to the background job “WAREHOUSE ACTIVITY MONITOR” as shown below.

 

 

 

Step 2: Schedule and execute the job

 

 

Step 3: After execution of the background job, execute transaction LL01

 

 

Here you can either maintain your selection entries and save them as a variant, or execute directly without any variant or input values for the selected warehouse 005, as shown below.

 

 

On execution you will get the following output as shown below

 

 

Double-clicking at each level displays more detailed data, as shown below.

 

 

 

Usage:

 

The warehouse activity monitor provides:

 

  • Automatic monitoring of warehouse processes
  • Automatic recognition and display of errors in the warehouse
  • Support in the analysis of processes in which errors have occurred
  • Support for error correction

 

The warehouse activity monitor is useful for several reasons:

 

  • Not all warehousing processes are carried out in the system without errors.
  • Errors are often not recognized until some time after they have occurred.
  • The search for the cause of an error and correcting it can be time consuming.

 

Conclusion:

 

In Warehouse Management, the warehouse activity monitor displays objects with critical processes. For each of the monitoring functions, the warehouse activity monitor offers additional functions that help you to analyze and correct errors. The additional functions that are available depend on the object concerned.

The individual objects that you can control in this manner include the following:

 

  • Unconfirmed Transfer Orders
  • Open Transfer Requirements
  • Open Posting Change Notices
  • Open Deliveries
  • Stock in Interim Storage Areas
  • Negative Stock
  • Inconsistencies in Stock Figures for Production Supply

SAP FICO integrated with BI


hi,

 

can anyone send me the accounting entries and configuration steps for SAP FICO integrated with BI?

SAP DBM Labour Value Concept


Labour Values are the operation codes for the repair/replacement work performed during vehicle servicing or repair. Generally, every OEM provides its dealers/service centres with flat-rate manuals, which include the list of operation codes, a hierarchy to group them based on aggregate, and encapsulations in which one operation code covers multiple sub-operations.

 

In SAP DBM this hierarchy can be set up, and the operation codes are maintained as master data:

 

Transaction /DBM/LBROP- Maintain operation codes:

 

 

Select the catalog; multiple catalogs can be set up in the system.

 

 

Select the operation code and click on LV Target Time, the target time that gets updated in the order. The LV main type is the model service code, sometimes also called the Variant Service code.

 

 

The same LV main type is also maintained in the vehicle master and is derived in the vehicle from the model master.

 

 

This LV main type is then captured in the service order at order creation. While adding a labour operation code for item category P001, the system automatically filters the LV target times based on the LV main type set in the order.

 

You can change the LV main type in the order to an asterisk (*) to get the LV target times for all the other LV main types as well, or general labour times.

SAP DBM 8.1 - Key features

  • SAP DBM 8.1 provides the option of replacing the usage of TREX-based searches with SAP HANA based searches for SAP HANA customers. The SAP HANA-based search option provides reduction in total cost of ownership (TCO) by avoiding the need for TREX server installation and maintenance.

 

  • SAP DBM 8.1 also supports the blocking of business partner data and the blocking of all master data or transactional data that contains personal data through the use of SAP Information Lifecycle Management (SAP ILM).

 

Refer DBM 8.1 release note for more details: https://websmp208.sap-ag.de/~sapidb/012002523100014173662015E/Release_Note_EN_DBM810.pdf

Vehicle Data Upload - Migration of Vehicle Orders


INTRODUCTION


The purpose of this document is to identify the key elements of vehicle data load during cut-over and to analyse possible strategies with their merits and demerits. The document will also cover various technical and functional aspects of the process.

 

In any VMS project, vehicle data upload is an important activity. A large volume of vehicles needs to be created in SAP as part of the data migration/cut-over activity.

 

Created (or migrated) vehicles need to be updated with several parameters, e.g. stock, vehicle location, registration data, vehicle configuration, primary and secondary statuses etc. For an effective transition from a non-SAP system to VMS, the vehicles must reflect the legacy world on the go-live date, or at least contain all the required data, so that the business can start transacting without losing any functionality and/or required information.

 

First, let us discuss what we need to load into SAP from a vehicle perspective, including various technical and functional aspects. Once we know what we are uploading, we shall discuss various mechanisms/strategies for how best to go through the migration process.

 

Vehicle data upload can broadly be split into two parts: vehicle creation and vehicle data update.

 

VEHICLE CREATION:


To be able to upload vehicle data, first the vehicle needs to be created in VMS. SAP has provided a BAPI to create vehicles - BAPI_VEHICLE_CREATE.

The pre-requisite for the BAPI to run is that Material Code (MATNR) and PLANT are available in SAP.

 

The action to be exported to the BAPI is CREA (or any other vehicle creation action as per the business requirement). The table GT_VEHICLE needs to be populated for MATERIALNUMBER (MATNR) and PLANT as bare minimum requirement for the BAPI to be able to create a vehicle. Upon successful run of the BAPI, table GT_VEHICLE will have the VGUID for the created vehicle, which can be used in onward process. The table RETURN will contain the success / failure messages.

 

As VGUID is an internal SAP field, I suggest populating the VIN (VHVIN) and/or the External Vehicle Number (VHCEX) while creating vehicles, as these can be used to search for the vehicle in the VMS database (table VLCVEHICLE) for updates or any other onward process.

 

The current release of the BAPI (as of 01.01.2012) is capable of creating only one vehicle per call, as per the parameter values passed within the standard code.

 

 

 

VEHICLE DATA UPDATE: VEHICLE CONFIGURATION


Once the vehicle is created, the vehicle data has to be enriched with its configuration, i.e. the characteristic values have to be loaded. Depending upon the business scenario this may not be needed. Usually, vehicles are configurable: paint colour, tyres, trims, entertainment, satellite navigation systems, safety features, climate control mechanisms etc. can be selected by the customer on top of a standard vehicle. There are numerous areas where a vehicle’s configuration can be used, depending on business requirements. The chosen ‘options’ can be used for pricing, e.g. a metallic colour, a 6-CD changer or an in-built satellite navigation system can attract additional costs on the same vehicle model. The configuration can also be used for various analytical purposes, e.g. to analyse the market demand for a newly introduced feature or which colours are more in demand.

 

Standard SAP provides this functionality via batch classes and characteristics. Therefore, for such vehicles, the configuration (characteristic values) needs to be loaded onto the vehicle.

 

In VMS, the action CMOD is used to configure the vehicle. The action calls function module VELO10_REAL_EXECUTE. This function serves two purposes: converting a planned vehicle to a real vehicle (creating the valuation class and batch for the vehicle) and changing the configuration of the vehicle. Action CMOD skips the creation of the batch and valuation class and simply changes the configuration.

 

To run the CMOD action, BAPI_VEHICLE_CHANGE_MULTIPLE can be used. The BAPI can work with multiple vehicles, so populate the BAPI fields/tables according to the requirement. A vehicle identifier needs to be passed to the BAPI to find the vehicle; this can be the VGUID, External Vehicle Number, VIN or Internal Vehicle Number.

 

The BAPI does not append to the vehicle's configuration; it loads exactly the data passed to its fields/tables. So if the action is executed on a vehicle where partial configuration is already loaded, the existing configuration has to be read and copied to the BAPI fields along with the remaining values while firing the BAPI, so that the vehicle ends up with the full configuration.

 

Also, the BAPI will return an error message and will not load the configuration if the master data has not been created beforehand. The full set of passed characteristic values needs to be present in SAP for the BAPI to load the configuration; none of the values will be loaded if even one value is missing in SAP.
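The read-merge-validate step can be sketched as follows; the characteristic names and master-data values are hypothetical, and the all-or-nothing check mirrors the BAPI behaviour described above:

```python
# Sketch: the partial configuration already on the vehicle is merged with the
# new values, and the full set is checked against the characteristic master
# data before the BAPI call. Names and values are hypothetical examples.

MASTER_DATA = {"COLOUR": {"RED", "BLUE"}, "NAV": {"YES", "NO"}}

def build_full_config(existing, new_values):
    merged = {**existing, **new_values}    # new values win on overlap
    unknown = [(c, v) for c, v in merged.items()
               if v not in MASTER_DATA.get(c, set())]
    if unknown:
        # Mirrors the all-or-nothing behaviour: nothing is loaded if any
        # value is missing in the master data.
        raise ValueError(f"values not in master data: {unknown}")
    return merged

full = build_full_config({"COLOUR": "RED"}, {"NAV": "YES"})
```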

 

VEHICLE DATA UPDATE: ADDITIONAL DATA


There are two types of additional data for a vehicle: vehicle attributes and vehicle qualifiers.

Attributes are pre-defined in the system, e.g. External Vehicle Number, VIN, Usage, Sharing Level etc. The possible values for any attribute are defined in customizing.

 

Qualifiers are additional data for vehicles. The values could be anything, e.g. the carbon emission of a vehicle, a customs clearance document number, the engine number, import value, customer name etc. Which values are to be stored depends on the business requirement. Qualifiers can be defined in customizing as required.

 

Attributes and qualifiers are assigned to the desired actions in customizing; the mappings are stored in tables CVLC16 (Qualifier – Action) and CVLC17 (Attribute – Action). All actions, upon execution, fire the function VELO09_WRITE_VEHICLE_DATA, which calls VELO09_WRITE_VLCADDVDATA for updating attributes and VELO09_WRITE_VLCADDDATA for updating qualifiers. The attributes and/or qualifiers updated by an action are driven by the mapping done in customizing.

 

The vehicle data for attributes is stored in table VLCVEHICLE, and for qualifiers in table VLCADDDATA. One thing to note regarding the qualifier update: if any qualifier is duplicated in the data supplied to the BAPI, the BAPI returns an error and no qualifier update is carried out. Therefore, before firing the BAPI, make sure to check the supplied data for duplicate entries.
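The duplicate check can be sketched as a simple pre-flight pass over the qualifier rows; the qualifier names are examples:

```python
from collections import Counter

# Pre-flight check: the whole qualifier update fails if any qualifier
# appears twice in the data supplied to the BAPI, so duplicates are
# detected first and the record can be cleaned before the call.

def duplicate_qualifiers(qualifier_rows):
    """qualifier_rows: list of (qualifier, value) pairs for one vehicle."""
    counts = Counter(q for q, _ in qualifier_rows)
    return [q for q, n in counts.items() if n > 1]

dups = duplicate_qualifiers([("ENGINE_NO", "E123"),
                             ("CO2", "120"),
                             ("ENGINE_NO", "E999")])   # duplicated qualifier
```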

 

Once the relevant configuration is in place, the migration mechanism can upload the additional data to the respective tables using BAPI_VEHICLE_CHANGE_MULTIPLE, passing the relevant action, attributes/qualifiers and a vehicle identifier.

 

 

VEHICLE DATA UPDATE: PRIMARY AND SECONDARY STATUSES


The vehicle life-cycle is controlled and regulated by Primary and Secondary action matrices. These matrices are configured in customizing. They define the lifecycle point a vehicle attains once an action is executed.

 

The migrated vehicles need to be brought to the point where they are in the legacy system at the time of cut-over, so that the next business action/transaction/step/process can be carried out in SAP instead of the legacy system.

 

To correct the vehicle statuses, transaction VELOE is used. Action CORR is used to denote the status change in the vehicle history and can be executed using transaction VELOE. The transaction overrides the CVLC04 mapping (action matrices) and updates the status for the vehicles selected via the input field.

 

As per note 445860 (which matches my current understanding), there is no program/screen attached to the action in customizing by default; therefore a custom development will be required to be able to run the CORR action from the VELO transaction. BAPI_VEHICLE_CHANGE_MULTIPLE is not useful in this case either.

 

 

INITIAL STOCK UPDATE:


Stock balances have to be loaded into SAP during cut-over to reconcile the FI books. IEID is the action to be used to load the stock balances. The action triggers a 561 movement on the vehicle using the batch number (Internal Vehicle Number) and loads the inventory. The action can be run manually from VELO (with the necessary configuration in place).

 

Usually during the cut-over, the posting date for the initial stock is not the date on which the action is performed. Using BAPI_VEHICLE_CHANGE_MULTIPLE may therefore not help, as the posting date cannot be manipulated while executing the BAPI. So far I have not been able to execute the IEID action successfully using the BAPI. Instead, I used VELO17_IEID_EXECUTE (caution: this FM is not released) along with a couple more individual functions to load the vehicle inventory.

 

 

MORE OPERATIONS ON VEHICLE:


A variety of data and assignments could be required for the vehicle as per the business requirements. For example, a percentage of the migrated vehicles will already have dealers (customers) assigned to them in the legacy system, i.e. the NSC/Importer/Distributor will know which dealer they will be sold to. This has to be reflected in the migrated data.

 

VMS has an action to link a vehicle to a dealer, i.e. RSVN. The action can be executed using the BAPI mentioned in earlier steps, passing an appropriate vehicle identifier and RSVN as the action. The action updates the VLCRESERVATION table.

 

Reservation can be carried out only on unreserved vehicles. A vehicle for which a quote, sales order, reservation etc. already exists will return an error. The full list of exceptions can be checked in function VELO11_RSVN_PREPARE.

 

Similarly, more actions could be required on the vehicle during the cut-over, e.g. assignment to a vendor or end customer, location update, sales order creation etc., depending on business requirements.

 

The BAPI provided can be used to enrich vehicle data by using different actions as required.

 

However, to be able to execute any action, the action matrices should be in place, and any action fired via the BAPI to update vehicles should be allowed as per the action matrix configuration (table CVLC04). The material master, characteristics master, customer master, vendor master etc. should be created in SAP before the migration exercise.

 

All the above actions can be wrapped up into a single program. The program can then receive data and instructions from any feeding mechanism to migrate data to SAP. The various feeding mechanisms are discussed in the next section of the document.

 

 

STRATEGIES TO LOAD DATA


So far we have discussed what is to be loaded into SAP VMS and the various technical and functional aspects of using the BAPIs and other function modules. Now we need to determine how the data is to be supplied to the BAPIs, programs etc. There are various ways to do this; the most commonly used and effective ones are discussed below:

 

EXCEL UPLOAD:

 

The data can be loaded via excel files. The files follow a pre-determined format (template) as governed by the load program / function. Legacy systems may not provide the data in the exact required format, so it would need to be converted into excel files matching the templates.

 

These excel files give more clarity and control over the data: one can physically inspect the data before the load. However, this method is recommended only if the volume of data is not too large. The above templates can be combined to execute more than one action from a single file, depending on the data, volume, and requirement.

 

Another reason excel upload is effective is that the templates can be kept dynamic, if required, for updates like SMOD and CMOD. In my experience, a couple of times I have needed to update just one qualifier on a number of vehicles to complete their information, or add just a couple more options to correct the price, even after the data migration (or Data Migration Agreement). In such cases, dynamic templates help: the load program reads the column headers to determine which qualifier or characteristic is to be populated with the corresponding value from the record. This gives the unique advantage of ploughing through only the needed parts of the file; with a fixed format, all the qualifiers would be read even when blank, increasing the run time. A dynamic template also lets one work with the full data set to prepare the most effective load files, keeping the number of columns at an optimum.
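The dynamic-template idea can be sketched as follows; the column names and the key field are illustrative, not the real template layout:

```python
import csv
import io

def parse_dynamic_template(csv_text, key_column="VIN"):
    """Read a dynamic upload template (here as CSV text).

    The header row names the qualifiers / characteristics to update, and
    blank cells are skipped entirely, so only the populated values are
    processed. Column names such as 'VIN' are assumptions for illustration.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    updates = []
    for row in reader:
        key = row[key_column]
        # Keep only non-blank qualifier values; blank cells are not touched.
        values = {col: val for col, val in row.items()
                  if col != key_column and val.strip()}
        updates.append((key, values))
    return updates
```

Because the mapping is driven by the header row, adding one more qualifier to a load is a matter of adding one column to the file, with no change to the load program.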

 

In this case, the program picks up the files from the local machine and churns through the data as per the design. However, there is a downside to this approach: if the data set is big, the user's session has to stay online until the process finishes.

 

To overcome this, the program can be made to run in background using files from the application server. The files, in CSV format, are placed on the application server, and the program reads and processes them in background. This adds the effort of putting files on the application server, but it frees the user's machine from being engaged throughout the load.

 

The load program can generate log files indicating success, failure, information, etc. for every record it processes, in both foreground and background modes. These are a great help in fixing issues and re-loading the error records. When running the program in background, one would need to download the log files and then work through them. I preferred generating two log files for every file processed: a summary log indicating Pass or Fail for each vehicle record, and a detailed log capturing, along with the vehicle identifier, every message in the BAPI's return table whenever there is an error or warning. The logs can be generated in a format that is easy to work with in excel.
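A minimal sketch of that two-log scheme, assuming (for illustration) that the BAPI return messages are reduced to TYPE / MESSAGE pairs:

```python
def build_logs(results):
    """Produce the two logs described above.

    `results` is an iterable of (vehicle_id, messages) pairs, where each
    message is a dict with 'TYPE' ('E' error, 'W' warning, ...) and
    'MESSAGE' keys; this structure is an assumption for the sketch.
    Returns (summary_log, detail_log) as semicolon-separated text that
    opens cleanly in excel.
    """
    summary, detail = [], []
    for vehicle_id, messages in results:
        noteworthy = [m for m in messages if m["TYPE"] in ("E", "W")]
        # Only errors fail the record; warnings are logged but still pass.
        status = "FAIL" if any(m["TYPE"] == "E" for m in noteworthy) else "PASS"
        summary.append(f"{vehicle_id};{status}")
        for m in noteworthy:
            detail.append(f"{vehicle_id};{m['TYPE']};{m['MESSAGE']}")
    return "\n".join(summary), "\n".join(detail)
```

The summary log gives a quick pass/fail count for reconciliation, while the detail log carries enough context to fix and re-load only the failed records.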

 

Using excel, I have managed to load a volume of 400,000 vehicles in 3-4 days' time, including error fixing and re-processing of the failed records.

 

UPLOAD VIA INTERFACE:

 

In my opinion this is one of the easiest and most effective methods to load vehicles into SAP. Usually, SAP VMS is not used as a front-end system to create vehicles. In a normal business scenario, vehicle orders are received at the dealers' end, which results in vehicle creation in VMS. Since dealer front-end systems are usually not SAP, most implementations involve an interface between the ordering system and SAP to run day-to-day activities and the vehicle's life-cycle. A vehicle ordering / life-cycle interface would therefore be developed as part of the implementation.

 

So, instead of creating a separate program to load the vehicles, I found it easier to get the legacy system (if it has the capability) to re-trigger all the orders from its database to SAP over the interface as part of the migration exercise. Generally, an indicator reset within legacy does the trick.

 

Once the order bank from legacy is published, it behaves like the normal vehicle life-cycle: if an order is not present in SAP, a vehicle is created; otherwise the vehicle is updated to the latest life-cycle point as directed by the incoming data.

 

This approach may not work in all circumstances. My point is: if the legacy system can be used to migrate vehicles, the possibility is worth exploring, as it saves time, the data can be migrated in chunks as and when suited, reconciliation becomes easy (the interface undergoes testing during development), and it saves development effort in SAP.

 

In my experience, I have enabled the vehicle ordering interface well before the actual go-live by transporting the required configuration ahead of the go-live date. This loads the vehicles, continues the life-cycle by replicating the legacy actions in SAP, and gives enough time for data reconciliation and for fixing any bugs in the interface.

 

MIDDLEWARE:

 

As discussed above, there would usually be an interface through which SAP receives vehicle creation / life-cycle updates. If the legacy system cannot push data to SAP, it might be worth looking at the middleware involved. The data can be injected at the middleware so that it is received in SAP as a normal interface message, if one is built as discussed above. Alternatively, the middleware can call the BAPI through RFC and transfer the data to SAP.

This also saves the additional development of a data-upload mechanism in SAP. However, not all middleware systems are flexible enough to cater to this requirement, so it could be a long shot, but it is still worth exploring depending upon project governance and other constraints.

 

MANUAL:

 

Of course, vehicles can be created directly in the VELO transaction. But this is not feasible even for a data volume of a few hundred vehicles. Loading configurations or qualifiers manually (there can be many; some of my projects had more than 500 qualifiers) is a tedious task and prone to human error.

 

Apart from the above, there could be more methods to load vehicles into SAP, e.g. LSMW. However, I have never explored these options because, in my experience, the business requirements are often quite spread out and achieving them requires custom development. So I have always used either the excel upload or the interface load to migrate vehicle data to SAP.

 

I am sure there are many more ways to do the above. Comments and suggestions are very welcome. I hope you have enjoyed the document and that it helps. For any questions, please drop a message.

Incompletion log in DBM Job Order


Incompletion rules can be defined within the DBM job order as per the business requirement. Based on the defined rule, any field within the DBM order can be made mandatory during any action.

 

1) Define the incompletion rule. You can define the incompletion rule in the node below: /DBM/SPRO – ORDER – ORDER CONTROL – INCOMPLETION LOG – Define Incompletion rule

 

2) Assign the incompletion rule to an action in the node below. You can also control the reaction for a specific entry, i.e. information, warning, or error message.

 

/DBM/SPRO – ORDER – ORDER CONTROL – INCOMPLETION LOG – Define Incompletion rule

 

New custom criteria tables can be created in the node below to control the level at which incompletion rules are linked with actions.

 

/DBM/SPRO – ORDER – ORDER CONTROL – INCOMPLETION LOG – Define table of criteria

 

[Screenshot: incompletion log customizing]


Minimum Margin rule


A minimum margin rule can be defined within the DBM job order to ensure that the sale price exceeds the cost by at least the margin value defined in the rule.

 

The rule must contain the following information:

• Key of the rule

• Description of the rule

• Planned Cost field (a field of structure /DBM/SPLIT_COM)

• Minimum margin

• Unit of the margin (percentage or currency)

 

The rule defines which field of structure /DBM/SPLIT_COM is used as the planned cost field to calculate the margin of the order.
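As a sketch of that check (assuming, for illustration, that a percentage margin is taken against the sale price; the actual base used by DBM is not specified here):

```python
def margin_ok(sale_price, planned_cost, minimum_margin, unit):
    """Check a minimum-margin rule for one order split.

    `planned_cost` stands in for the /DBM/SPLIT_COM field named in the rule;
    `unit` is '%' for a percentage margin or anything else for an absolute
    currency amount. The percentage base (sale price) is an assumption made
    for this sketch, not the documented DBM calculation.
    """
    margin = sale_price - planned_cost
    if unit == "%":
        return margin >= sale_price * minimum_margin / 100.0
    return margin >= minimum_margin  # absolute currency amount
```

The rule key, description, and determination via the criteria table would sit around this core calculation.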

 

1) Define the minimum margin rule. You can define the minimum margin rule in the following customizing node: /DBM/SPRO – ORDER – BASIC FUNCTIONS – MINIMUM MARGIN – Define Minimum Margin rule

 

2) The minimum margin rule is determined as per the definition in the criteria table: /DBM/SPRO – ORDER – BASIC FUNCTIONS – MINIMUM MARGIN – Determination of Minimum Margin rule

 

[Screenshot: minimum margin customizing]

Exchange Part in DBM Service Order


The DBM material master allows a spare part material to be defined as an exchange part, used part, or normal part, which can then be used in the job order as per the business scenario.

 

Exchange part – The new part which is installed in customer’s vehicle.

 

Used part – The defective old part which is taken back from the customer’s vehicle in replacement of the exchange part.

 

In some business scenarios, the defective old part is taken back from the customer's vehicle in exchange for the new part.

 

This defective old part is added automatically to the service job order as soon as the user enters the exchange part.

 

Here, the system shows a popup asking whether the corresponding used part should be added to the job order. If the user selects "Yes", the corresponding used part is automatically added to the job order.
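This behavior can be sketched as follows; the data shapes and the popup callback are illustrative stand-ins for the real DBM structures and dialog:

```python
def add_part(order_items, material, master_data, confirm=lambda used: True):
    """Add a part to a job order, mimicking the exchange-part behavior above.

    `master_data` maps material numbers to dicts with 'part_type' and
    'used_part' keys (a stand-in for the DBM material master fields shown
    below). `confirm` plays the role of the popup: it receives the linked
    used part and returns True if the user selects "Yes".
    """
    order_items.append(material)
    info = master_data.get(material, {})
    used = info.get("used_part")
    # Only an exchange part with a linked used part triggers the popup.
    if info.get("part_type") == "EXCHANGE" and used and confirm(used):
        order_items.append(used)
    return order_items
```

A normal part is simply added; only an exchange part pulls its linked used part into the order, and only after confirmation.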


Definition

 

Tcode - /DBM/MM01

 

In the screenshot below, the field "Type of part" defines whether the material is an exchange part, used part, or normal part.

 

The old defective material that is to be taken back is specified in the field "Used Part".

[Screenshot: DBM material master (/DBM/MM01)]

Some useful T-Codes for Warranty Processing


Processing

1. WTY - Warranty Claim

2. WTYAUT – Authorization

3. WTYRCL - Maintain Recall

4. WTYOQ - Worklist Warranty

5. WTYSE - Search Claim

6. WTYRP - Warranty: Part to Be Returned

7. WTYCL - Create Credit Memo Lists

8. WTYOR - Execute Report

9. WTYMP - Mass Change Warranty Claim

10. WTYMP_A - Mass Change Warranty Claim Admin. Tools

11. WTY_UPROF - Assign User Profiles (Warranty)

12. WTYSC_WWB - Warranty workbench

13. WTYDBSHOW - Warranty Claim: Table Display

14. WTY_SARA - Archive Warranty Claim

15. WTY_ARCHIV - Display Archived Warranty Claim

16. WTYNK - Number Range Warranty Claim

 

Pricing

17. WYP1 - Create Condition

18. WYP2 - Change Condition

19. WYP3 - Display Condition

 

Message

20. WYN1 - Create Message: Warranty

21. WYN2 - Change Message: Warranty

22. WYN3 - Display Message: Warranty

 

Customizing

23. OWTY - Customizing Warranty

 

Processing Control

24. WTY_VSR_ACTIVE - Activate Validation / Substitution
