
IT Operational Excellence: Real Time Data Repository

This is part of a continuing series with Bob Anderson, IT Operational Excellence, presented by Anne Grybowski. You can read the entire series here.

In today’s world of technology, we have all come to expect information in real time (instantaneously, as data are created). It stands to reason, when we are discussing operational excellence, that we should expect nothing less from our data repository. An organization should have constant access to specific deliverables that are easily viewable in real time. As Bob Anderson notes, the data repository must be something physical, not something anecdotal, because, in the business world, things are not real until they are transformed into useful data that can be retrieved, reviewed, and acted upon.

Things only become real when you can measure them or when you can look at an artifact that was created to satisfy a specific purpose or business reason. Simply having a discussion has nothing to do with operational excellence. Only hard facts and hard measurements allow you to know whether you are improving or backsliding. At the core of that is some kind of data repository, whether it be for knowledge, numbers and metrics, resources, or performance. You must have an electronic repository that people can get to and take action based upon what they see.

Critical Success Factor: An Electronic Data Repository

Although this was not done purposefully from a design standpoint, the real-time data repository fits well as the centerpiece of our puzzle because, according to Anderson, it is “the core piece upon which operational excellence depends.” This is because a real-time data repository holds functional knowledge as well as operational and performance data. Having an electronic data repository is crucial in Anderson’s mind. Process templates, completed deliverables, metrics and service goals, time spent, and organizational structure must be easily and quickly accessible to keep an organization running smoothly and efficiently.

Each part of IT has its own unique data needs. Many of these needs overlap, but Anderson reminds us that different IT functions will have data that are unique to them. When you are designing this data repository, you must understand exactly what you are trying to achieve and in which IT functional areas you are trying to achieve it. There are many types of data that will need to be stored: static data, dynamic data (changing frequently), descriptive data, metrics in different units of measure, and actuals. Care must be taken in designing the data repository: keep it simple. The more complex the data relationships, the more complicated the data are to retrieve and use.
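To make those distinctions concrete, here is a minimal sketch, in Python, of how such records might be modeled. Every name in it (DataKind, Metric, RepositoryItem) is an illustrative assumption, not part of Tracer or any tool discussed in this series.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class DataKind(Enum):
    STATIC = "static"            # rarely changes, e.g. organizational structure
    DYNAMIC = "dynamic"          # changes frequently, e.g. ticket status
    DESCRIPTIVE = "descriptive"  # free-form context, e.g. problem notes

@dataclass
class Metric:
    name: str      # e.g. "mean_time_to_restore"
    unit: str      # record the unit of measure explicitly: "hours", "count", ...
    goal: float    # the service goal
    actual: float  # the actual, measured value
    captured_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class RepositoryItem:
    functional_area: str  # which IT function owns this data
    kind: DataKind
    payload: dict         # keep data relationships simple and flat
```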

You may also be storing process templates or deliverable templates, which goes beyond working with discrete data elements. Each of these components must have its own place within the data repository. Anderson suggests thinking of it as different neighborhoods in a city: the neighborhoods together compose the entire city, but each individual neighborhood is slightly or vastly different from the next.
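One simple way to picture the neighborhood idea is as separate top-level namespaces, one per kind of content. The layout below is purely illustrative, not a prescribed structure.

```python
# Each kind of content gets its own "neighborhood" (namespace); together
# the neighborhoods compose the whole repository. All names are illustrative.
REPOSITORY_LAYOUT = {
    "templates":    ["process", "deliverable"],    # documents, not discrete data
    "metrics":      ["service_goals", "actuals"],  # quantitative data
    "knowledge":    ["procedures", "lessons_learned"],
    "organization": ["structure", "resources"],
}
```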

Your electronic repository must also have real-time updates in order to be successful. You want to capture data at the moment they are created so that, as the outside world changes, the data can change with it. If you delay logging data, their value will degrade in direct proportion to the time elapsed since they were created. The longer the delay between data creation and usage, the more confusion is introduced, because the data no longer reflect the current environment.
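One simple way to make “degrades in direct proportion” concrete is a linear freshness weight that falls from one to zero as a data point ages. The horizon parameter below is an illustrative assumption, not something Anderson prescribes.

```python
from datetime import datetime, timezone
from typing import Optional

def freshness_weight(captured_at: datetime, horizon_hours: float,
                     now: Optional[datetime] = None) -> float:
    """Remaining value of a data point: falls in direct proportion to the
    time elapsed since capture, reaching zero at horizon_hours."""
    now = now or datetime.now(timezone.utc)
    age_hours = (now - captured_at).total_seconds() / 3600.0
    return max(0.0, 1.0 - age_hours / horizon_hours)
```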

However, the definition of “real-time” can vary from one kind of data to another. Sooner is always better, Anderson notes, but some data are less time-sensitive than others. With a process template, for example, time is not critical: the template could hit an approval cycle that takes weeks, yet there will be no degradation of the data. With something like metrics, on the other hand, data degrade very quickly, so quick capture is a must. Quantifiable data, such as numbers and times, tend to be the most critical. It is important to log information as often as possible while keeping people updated through increased communication. In other words, make data collection part of doing the work that is associated with that data. These quick updates are quite important for customer satisfaction as well. When fast-moving transactional data are collected in real time, operational excellence is more easily achieved.
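In the spirit of making data collection part of doing the work, here is one way that might look in code: a decorator that records timing metrics as a side effect of performing a task, so collection never lags behind creation. All names here (record_metrics, resolve_ticket, metrics_log) are hypothetical.

```python
import functools
import time

metrics_log: list = []  # stand-in for the real-time data repository

def record_metrics(func):
    """Log timing data as a side effect of doing the work itself."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        metrics_log.append({
            "task": func.__name__,
            "duration_seconds": time.time() - start,
            "captured_at": start,  # timestamped when the work began
        })
        return result
    return wrapper

@record_metrics
def resolve_ticket(ticket_id: str) -> str:
    # Placeholder task; in practice this would be real support work.
    return f"ticket {ticket_id} resolved"

resolve_ticket("INC-1001")
print(metrics_log)  # the metric exists the instant the work completes
```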

The next article in our series will discuss Knowledge Management, the seventh piece of the IT Operational Excellence Puzzle.

Robert Anderson’s Bio:

Bob Anderson, Director of Product Development & Quality Assurance for CAI, has been with Computer Aid, Inc. since 1988. The majority of his 38 years of IT experience have been spent as a senior executive in large IT organizations. In addition to being published, Anderson has been the principal architect of CAI’s process management, event tracking, and resource management tool, “Tracer,” and most recently created a free assessment survey to help organizations recognize where and how they can improve their operations. He also built CAI’s Production Support & Training department. Anderson is a decorated US Marine. He and his wife have two grown daughters and reside in Boiling Springs, PA.

