
Data integrity is a very important factor, and not just for computer systems. A system that controls the processes of a large company (typically an ERP system) should handle its data integrity with a lot of love and devotion. Imagine a system that performs its task very well, but every now and then a small hitch presents itself. That should automatically make you wonder how many hitches actually took place: the fact that you noticed this one does not imply it was the only one around. With good data integrity it should be quite easy to pinpoint what happened, and how many of the same hitches are around. Without data integrity, there's no telling what your ever-so-important system will do next.

It's quite easy to think lightly of data integrity issues - why worry about the small-scale solutions in full detail? Shouldn't the system look after itself? The whole concept behind striving for total data integrity is quite simple: damage your system's integrity just a little, and the actual errors will go unnoticed and be very hard to trace. Damage it a bit more and the system becomes unreliable.

There may be ways to calculate the risk of integrity losses. So what if a given table update fails every 150th time? There is, however, no way to make sure this "calculated risk" remains calculated. In SAP terms: it's much easier to achieve data integrity than to calculate the risk of integrity losses. SAP and its development toolset are designed to achieve this. And even when the toolset is used properly - all updates are locked and there are no back doors (work-arounds) for an update - there's still human error that can lead to data integrity issues. Just make sure integrity is never compromised by design.
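To make that concrete, below is a minimal ABAP sketch of what "using the toolset" means in practice. The table ZORDERS and its lock object EZORDERS are hypothetical names, as is the key field ORDER_ID; the point is the pattern itself: acquire the SAP enqueue lock, perform the update, check sy-subrc after every step, and commit or roll back explicitly.

* Minimal sketch - table ZORDERS, lock object EZORDERS and field
* ORDER_ID are hypothetical. Pattern: lock, update, check, commit
* or roll back. Never assume a database update succeeded.
DATA ls_order TYPE zorders.

* (assume ls_order has been filled with the new values here)

* Acquire the SAP enqueue lock before touching the record
CALL FUNCTION 'ENQUEUE_EZORDERS'
  EXPORTING
    order_id       = ls_order-order_id
  EXCEPTIONS
    foreign_lock   = 1
    system_failure = 2
    OTHERS         = 3.
IF sy-subrc <> 0.
  MESSAGE 'Record is locked elsewhere - update refused' TYPE 'E'.
ENDIF.

* Perform the update and check the result explicitly - a failed
* UPDATE must never be silently ignored
UPDATE zorders FROM ls_order.
IF sy-subrc = 0.
  COMMIT WORK.
ELSE.
  ROLLBACK WORK.
  MESSAGE 'Update failed - changes rolled back' TYPE 'E'.
ENDIF.

* Release the lock (locks are also released automatically when the
* program or update task ends)
CALL FUNCTION 'DEQUEUE_EZORDERS'
  EXPORTING
    order_id = ls_order-order_id.

Note that the MESSAGE statements of type 'E' end processing, so neither a failed lock nor a failed update can slip through unnoticed - which is exactly the kind of discipline that keeps the "calculated risk" from becoming an uncalculated one.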