Copyright 2024 - BV TallVision IT

SAP protects its data by ensuring all data is checked and approved before it is stored. Data integrity is the assurance that the data in your system is correct, which is something that should be protected. What if it's your report that damages the data on your system? Any ideas on how that should be fixed? Prevention is the better option here...

  1. Direct table updates on standard SAP tables damage the integrity of SAP's own data. If data needs to be updated from a process other than SAP standard, make sure the object you are updating is locked (in exactly the same way SAP locks it). Also make sure the update is done through SAP standard tools - which should be possible. If no BAPI is available for the update, consider the ancient BDC session... (a BAPI-based update sketch follows after this list).
  2. Locking changes - each and every database insert, update or delete needs to be locked. Make sure the very program you are developing can be run in two sessions simultaneously, working on the exact same set of data; your program should perform just one of the updates. What's the point in locking a delete? Or an insert? Another program could be working on the exact same insert (which would produce a dump), or another program could be preparing an update on the very object you are about to delete. All database updates (including inserts and deletes) should be locked (a lock/unlock sketch follows after this list).
  3. Authorize changes - when you build a transaction which allows database updates, make sure the authorization check for that transaction is in place. There is also functional data integrity to take into account (an authorization check sketch follows after this list).
  4. Single point of entry - there should be no need to maintain (the same) data on a table in different ways. When the table maintenance screens are generated for a new transparent table, the SM31 transaction should be the only transaction which updates the table.
  5. Single point of entry (2) - when checks are added to make sure data is entered correctly, consider where the checks should be added. There should be no work-around for added checks if you deem the check relevant for data integrity. For example:
    Case: An inbound Idoc for Goods Receipt is checked: for certain vendors, the goods receipt document must have a few custom fields filled in. The check is added to the inbound processing module of the Idoc, leaving the Idoc in an error status when the additional fields are not available.
    Effect: When the vendor calls about Idocs that were not processed, the Goods Receipt documents can be entered in the system manually. Even though the added checks still need to be done, they are bypassed because the Idoc inbound process is not used.
    Alternative: So what would have been a better approach? Put the check in place in a user exit (or BADI) for the Goods Receipt. If the Goods Receipt is entered manually, the check will be executed. When it is processed via inbound Idoc processing, the same check will be executed by the BAPI that saves the Goods Receipt document. Without a "back door" or work-around for the check, your data integrity is secured... (a BADI sketch follows after this list).
    Case: A standard SAP application is called from a "shell" application, built to make the SAP application easier to use. This shell is typically an interactive report or a full-screen module. BDC (batch input) technology is often used here.
    Effect: Checks can be added to the shell which are not available when the SAP application is called without the shell. The single point of entry concept is out the door...
    Alternative: Avoid field checks in the shell or, if that's not possible, duplicate the field checks in a user exit or BADI of the standard SAP application.
    Case: For the initial data migration, a migration program was used to upload information into some tables.
    Effect: Someone has a brilliant idea: he or she has to do quite a number of updates and decides to use the upload program to do them. If all goes well, it will look like the respective user performed hundreds of updates in just minutes... Because data migration programs are assumed to run at take-on only, they are often not designed for any other use.
    Alternative: Make sure migration programs are no longer available after take-on, e.g. by authorization.
  6. Debugging powers - one thing to remember: a developer will need access to a production system to pinpoint problems, and debugging is often a must as well. HOWEVER: within the debugger a developer can change variables - when he or she is authorized to do so. Changing values in the debugger effectively makes everything possible: the result of an authorization check could be turned around, and data could be replaced with unchecked data just before a table record is saved. Giving developers (or anyone else) authorization to change variables on a production system can result in data integrity problems that cannot be traced. By the way: if a value is changed in the debugger on a correctly set up production system, the change is logged in the system log...
  7. Migration tools are often forgotten about. A powerful tool like LSMW or any other custom-built migration program is there to load the system initially. Such tools often remain where they are because they could be of good use later. They can also be used as a second point of entry... (the authorization check sketch below applies here as well).
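
For point 1: a minimal sketch of an update done through a standard SAP tool rather than a direct table UPDATE. The BAPI shown (BAPI_GOODSMVT_CREATE, posting a goods receipt) is standard, but all field values below are illustrative assumptions - the point is that SAP's own checks, locks and follow-on updates run for you.

```abap
* Post a goods receipt via the standard BAPI instead of updating
* MKPF/MSEG (or any other SAP table) directly. All values are
* illustrative only.
DATA: ls_header TYPE bapi2017_gm_head_01,
      ls_code   TYPE bapi2017_gm_code,
      lt_items  TYPE STANDARD TABLE OF bapi2017_gm_item_create,
      ls_item   TYPE bapi2017_gm_item_create,
      lt_return TYPE STANDARD TABLE OF bapiret2.

ls_header-pstng_date = sy-datum.
ls_header-doc_date   = sy-datum.
ls_code-gm_code      = '01'.            "01 = goods receipt for purchase order

ls_item-material  = 'ZMATERIAL01'.      "assumed material
ls_item-plant     = '1000'.             "assumed plant
ls_item-move_type = '101'.
ls_item-mvt_ind   = 'B'.                "goods movement for purchase order
ls_item-po_number = '4500000001'.       "assumed purchase order
ls_item-po_item   = '00010'.
ls_item-entry_qnt = 10.
APPEND ls_item TO lt_items.

CALL FUNCTION 'BAPI_GOODSMVT_CREATE'
  EXPORTING
    goodsmvt_header = ls_header
    goodsmvt_code   = ls_code
  TABLES
    goodsmvt_item   = lt_items
    return          = lt_return.

* Commit only when the BAPI did not report an error.
READ TABLE lt_return TRANSPORTING NO FIELDS WITH KEY type = 'E'.
IF sy-subrc = 0.
  CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
ELSE.
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING
      wait = 'X'.
ENDIF.
```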
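For point 2: a minimal lock/unlock sketch. It assumes a custom lock object EZ_ZGOODS was created in SE11 for a custom table ZGOODS with key field DOCNR - SE11 then generates the ENQUEUE_EZ_ZGOODS and DEQUEUE_EZ_ZGOODS function modules used below. Table, lock object and field names are assumptions for the example.

```abap
DATA lv_docnr TYPE char10 VALUE '0000000001'.   "assumed document key

* Request an exclusive lock before changing the record. If another
* session already holds the lock, stop right here.
CALL FUNCTION 'ENQUEUE_EZ_ZGOODS'
  EXPORTING
    mode_zgoods    = 'E'                        "E = exclusive lock
    docnr          = lv_docnr
  EXCEPTIONS
    foreign_lock   = 1
    system_failure = 2
    OTHERS         = 3.
IF sy-subrc <> 0.
  MESSAGE 'Record is locked by another user' TYPE 'E'.
ENDIF.

* ... perform the INSERT / UPDATE / DELETE on ZGOODS and COMMIT WORK ...

* Release the lock again once the change is done.
CALL FUNCTION 'DEQUEUE_EZ_ZGOODS'
  EXPORTING
    mode_zgoods = 'E'
    docnr       = lv_docnr.
```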
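For points 3 and 7: a minimal authorization check sketch. ZGOODS_UPD is an assumed custom authorization object with the standard activity field ACTVT; the same kind of check, placed at START-OF-SELECTION, is a simple way to lock a migration program down after take-on.

```abap
* Refuse to change anything unless the user holds change authorization
* (activity 02) on the assumed custom object ZGOODS_UPD.
AUTHORITY-CHECK OBJECT 'ZGOODS_UPD'
  ID 'ACTVT' FIELD '02'.
IF sy-subrc <> 0.
  MESSAGE 'You are not authorized to change this data' TYPE 'E'.
ENDIF.
```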
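For point 5 (first case): a minimal sketch of the single, central check, assuming an implementation of BADI MB_DOCUMENT_BADI for material documents. Method and parameter names follow that BADI definition but should be verified in SE18 on your release; the vendor range, the appended field ZZ_CUSTOM1 and message class ZGR are assumptions. Because the same BADI is reached by manual entry, the BAPI and inbound Idoc processing, there is no back door around the check.

```abap
METHOD if_ex_mb_document_badi~mb_document_before_update.

  DATA: ls_mseg    TYPE mseg,
        lr_vendors TYPE RANGE OF lifnr,
        ls_vendor  LIKE LINE OF lr_vendors.

* Vendors subject to the check - assumed to come from a custom
* customizing table, hard-coded here for the sketch only.
  ls_vendor-sign = 'I'. ls_vendor-option = 'EQ'. ls_vendor-low = '0000100001'.
  APPEND ls_vendor TO lr_vendors.

  LOOP AT xmseg INTO ls_mseg WHERE bwart = '101'.  "goods receipts only
    IF ls_mseg-lifnr IN lr_vendors
       AND ls_mseg-zz_custom1 IS INITIAL.          "assumed appended field
*     How errors may be raised at this point should be checked against
*     the BADI documentation; shown here only to illustrate the idea.
      MESSAGE e001(zgr) WITH ls_mseg-lifnr.
    ENDIF.
  ENDLOOP.

ENDMETHOD.
```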