Answers to 8 essential questions about assets that should be in your CMDB

Your Configuration Management Database (CMDB) is continuously increasing in size and complexity, driven by an endless list of components that need to be improved or new data sets that someone wants to add. You understand that more data doesn’t necessarily translate into more value. You wish someone could answer the question, “What data do I actually need in my CMDB?” We can answer that question, and do it fairly precisely. At the core of any CMDB are the Asset/Configuration Item (CI) records. Here are 8 essential questions about assets whose answers are important for managing the IT ecosystem and should be captured in your CMDB.

1. What are they? An accurate inventory of what assets and configuration items exist in your IT ecosystem is the foundation of your CMDB. Your asset/CI Records may come from discovery tools, physical inventories, supplier reports, change records, or even spreadsheets, but whatever their origin, you must know what assets you have in your environment.
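To make this concrete, here is a minimal sketch (in Python, with illustrative field names rather than any vendor’s schema) of what a consolidated asset/CI record might look like, and how records arriving from several feeds could be merged while preferring the most recently verified entry:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

# Illustrative asset/CI record; the field names are hypothetical, not a vendor schema.
@dataclass
class ConfigurationItem:
    ci_id: str                                # unique identifier in the CMDB
    name: str                                 # hostname, application name, etc.
    ci_type: str                              # e.g. "server", "application", "license"
    source: str                               # discovery tool, supplier report, spreadsheet...
    location: Optional[str] = None            # physical/site location (see question 2)
    last_verified: Optional[datetime] = None  # when the record was last confirmed

def merge_feeds(*feeds: List[ConfigurationItem]) -> List[ConfigurationItem]:
    """Union CI records from several feeds, keyed on ci_id, keeping the most
    recently verified record when the same CI appears more than once."""
    merged = {}
    for feed in feeds:
        for ci in feed:
            current = merged.get(ci.ci_id)
            if current is None or (ci.last_verified or datetime.min) > (current.last_verified or datetime.min):
                merged[ci.ci_id] = ci
    return list(merged.values())
```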

2. Where are they? Asset location may not seem relevant at first, but the physical location of hardware, software and supporting infrastructure affects the types of SLAs you can offer users, the cost of service contracts with suppliers and, in some jurisdictions, the amount of taxes you are required to pay. In many organizations, the physical location of assets is only captured as the “ship-to address” on the original purchase order; good practice, however, dictates updating this information frequently. Options include GPS/RFID tracking, change records, physical inventories and triangulation from known fixed points on the network.

3. Why do we have them? Understanding the purpose of an asset is the key to unlocking the value it provides to the organization. Keep in mind that an asset’s purpose may change over time as the business evolves: the intended purpose when the asset was purchased may not be the purpose it is serving today. Periodic review of dependencies, requests for change and usage/activity logs can provide insight into an asset’s actual purpose.

4. To what are they connected? Dependency information is critical for impact assessment, portfolio management, incident diagnosis and coordination of changes. Often, however, asset dependency data is incomplete, inaccurate or obsolete, providing only a partial picture to those who use it for decision-making. When capturing and managing dependency data, keep in mind that the business/IT ecosystem is constantly evolving (particularly with the proliferation of cloud services), which means dependency records need time attributes that show when each relationship was actually in effect.
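As a sketch of what those time attributes might look like in practice (the field names below are assumptions for illustration, not a standard model), a dependency record can carry when the relationship was discovered, when it was last observed and when it was retired, so that impact assessments can be evaluated “as of” a point in time:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

# Hypothetical dependency record with time attributes, so a relationship can be
# evaluated "as of" a point in time rather than assumed to be permanent.
@dataclass
class Dependency:
    upstream_ci: str            # e.g. "app-billing"
    downstream_ci: str          # e.g. "db-cluster-03"
    discovered_at: datetime     # when the relationship was first observed
    last_seen: datetime         # most recent observation
    retired_at: Optional[datetime] = None  # set once the link no longer exists

def active_dependencies(deps: List[Dependency], as_of: datetime) -> List[Dependency]:
    """Return only the dependencies that were valid at the given time."""
    return [d for d in deps
            if d.discovered_at <= as_of and (d.retired_at is None or as_of < d.retired_at)]
```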

5. Who uses them? User activities and business processes should both be represented in the CMDB as CIs (they are part of your business/IT ecosystem). If not, then you are missing a tremendous opportunity to leverage the power of your ITSM system to understand how technology enables business performance. If you already have users, activities and processes in your CMDB, then the dependency relationships should frequently be updated from system transaction and access logs to show actual (not just intended) usage.
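One hedged illustration of deriving actual (rather than intended) usage: if your transaction or access logs can be exported as (timestamp, user, CI) tuples, even a simple aggregation like the sketch below reveals who is really touching which assets. The log format here is an assumption; real exports will differ by tool.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical log format: (timestamp, user_id, ci_id) tuples.
def actual_usage(access_log, window_days=30):
    """Count which users actually touched which CIs in the recent window,
    so "who uses it" reflects observed behaviour rather than intent."""
    cutoff = datetime.utcnow() - timedelta(days=window_days)
    usage = Counter()
    for ts, user_id, ci_id in access_log:
        if ts >= cutoff:
            usage[(user_id, ci_id)] += 1
    return usage

# Example with two synthetic log entries; only the recent one is counted.
log = [(datetime.utcnow() - timedelta(days=2), "jsmith", "crm-prod"),
       (datetime.utcnow() - timedelta(days=90), "adoe", "legacy-erp")]
print(actual_usage(log))   # Counter({('jsmith', 'crm-prod'): 1})
```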

6. How much are they costing? Assets incur both direct and indirect costs for your organization. Examples include support contracts, licensing, infrastructure capacity, maintenance and upgrades, service desk costs, taxes and management/overhead by IT staff. The total cost of each asset may not be easy to calculate, but it becomes the component cost used to determine the total cost of providing services to users.
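The arithmetic itself is simple once the cost lines are captured; the hard part is gathering them. A minimal sketch, assuming cost lines can be expressed as (asset, category, amount) and that you maintain a mapping of services to the assets they consume (both assumptions for illustration):

```python
from collections import defaultdict

def asset_costs(cost_lines):
    """Sum direct and indirect cost lines (support, licensing, maintenance,
    service desk, taxes, overhead...) per asset."""
    totals = defaultdict(float)
    for asset_id, _category, amount in cost_lines:
        totals[asset_id] += amount
    return totals

def service_cost(service_to_assets, totals):
    """Roll asset-level costs up into the total cost of delivering each service."""
    return {svc: sum(totals.get(a, 0.0) for a in assets)
            for svc, assets in service_to_assets.items()}

# Example with made-up figures.
lines = [("srv-01", "support", 1200.0), ("srv-01", "licensing", 800.0),
         ("srv-02", "maintenance", 450.0)]
print(service_cost({"email-service": ["srv-01", "srv-02"]}, asset_costs(lines)))
```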

7. How old are they? Nothing is intended to be in your environment forever. Understanding the age and the expected useful life of each of your assets helps you understand past and future costs (TCO) and informs decisions about when to upgrade versus when to replace an asset. Asset age information should include not only when the asset was acquired, but also when significant upgrades or replacements occurred that might extend its expected useful life to the organization.
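The calculation can stay simple. A sketch, under the purely illustrative assumption that each significant upgrade extends expected useful life by a fixed number of years:

```python
from datetime import date

def remaining_life_years(acquired, expected_life_years, upgrades=(), extension_per_upgrade=2):
    """Estimate remaining useful life from the acquisition date plus any
    significant upgrades/replacements recorded against the asset."""
    end_of_life_year = acquired.year + expected_life_years + extension_per_upgrade * len(upgrades)
    return max(0, end_of_life_year - date.today().year)

# Example: a server bought in 2014 with a 5-year life and one major upgrade.
print(remaining_life_years(date(2014, 6, 1), 5, upgrades=[date(2018, 3, 1)]))
```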

8. How often are they changing? Change requests, feature backlogs and change management records provide valuable insights into the fitness of the asset for use (both intended and incidental use). This information should be available from other parts of your ITSM system (change management, problem management, portfolio management and others), but it is critical that current and accurate information about change be considered part of your asset records.

You should be able to find the answers to these 8 essential questions about assets in your CMDB. If you can’t, then you may have problems with either your data integration or asset data quality. If that is your situation, Blazent can help. As an industry leader in data quality management solutions, Blazent can help you gather data from a broad set of sources and, through integration, correlation, reconciliation and contextualization, improve the quality of the core asset records in your CMDB, so you can maximize the decision-making value of your ITSM investments.

The future is closer than you think. Data is coming (and fast). How will you manage it?

What will you do when your job and the future of your company hinge on your ability to analyze almost every piece of data your company has ever created against everything known about your markets, competitors and customers – and when the impact of your decision will determine success or failure? That future is closer than you think. Data on an entirely different level is coming, and much faster than anyone realizes. Are you prepared for this new paradigm?

  • Technologists have been talking about “big data” as a trend for more than a decade, warning that it is coming “soon.” “Soon” is now in your rear-view mirror.
  • Companies have been capturing and storing operational and business process data for more than 20 years (sometimes longer), providing a deep vault of historical data, assuming you can access it.
  • IoT is leading to the creation of a massive stream of new operational data at an unprecedented rate. If you think volumes are high now, you’ve seen nothing yet.
  • The free flow of user-generated (un-curated) information across social media has enabled greater contextual insights than ever before, but at the same time the signal-to-noise ratio has never been worse.

What does all this mean? It means big data is already driving everything we do. The analytics capabilities of IT systems are becoming more sophisticated and easier for business leaders to use to analyze and tune their businesses. For them to be successful and make good decisions, however, the data on which they rely must be trustworthy, complete, accurate and inclusive of all available data sets.

Delivering the underlying quality data that leaders need is no small feat for the IT department. The problem has transformed from “not enough data” to “too much of a good thing.” The challenge facing most organizations is filtering through the noise in the data and amplifying the signal of information that is relevant and actionable for decision-making.

Inside your organization, historical data may be available in data warehouses and archives, but is likely fragmented and incomplete, depending on the source systems in use and the business processes being executed when the data was created. As IT systems have matured and more business and manufacturing processes are automated, the amount of operational (transactional) data created has increased. As part of stitching together business processes and managing changes across IT systems, data is often copied (and sometimes translated) multiple times, leading to large-scale data redundancy. IoT and OT are enabling instrumentation and the collection of an unprecedented volume and diversity of new operational data (often available in real-time) for operational monitoring and control, but the data from these collectors may have a limited useful life.

Outside your organization lies a wealth of contextual data about your customers, competitors and markets. In the past, experts and journalists curated published information for accuracy and objectivity, providing a baseline expectation of data quality. In the age of social media, a large volume of subjective user-generated content has replaced this curated information. Although the new content lacks the objective rigor of its curated predecessors, what it gives up in quality it makes up for in an exponential increase in the quantity and diversity of data available for consumption. The challenge for business leaders is filtering through the noise of opinion and conjecture to identify the real trends and insights that exist within publicly available data sets.

For you to make the big decisions on which the future of your company depends, you must be able to gather all of the available data – past, present, internal and external – and refine it into a trustworthy data set that can yield actionable insights, while being continuously updated.

Machine Learning is re-inventing Business Process Optimization

Machine Learning is a game changer for business process optimization – enabling organizations to achieve levels of cost and quality efficiency never previously imagined. For the past 30 years, business process optimization was a tedious, time-consuming manual effort. Those tasked with it had to examine process output quality and review a very limited set of operational data to identify optimization opportunities based on historical process performance. Process changes required re-measurement and comparison against pre-change data to evaluate their effectiveness. Often, improvement impacts were either un-measurable or failed to satisfy the expectations of management.

With modern machine-learning capabilities, process management professionals are able to integrate a broad array of sensors and monitoring mechanisms to capture large volumes of operational data from their business processes. This data can be ingested, correlated and analyzed in real-time to provide a comprehensive view of process performance. Before machine learning, managing the signals from instrumented processes was limited to either pre-defined scenarios or the review of past performance. These limitations have now been removed.

Machine learning enables the instrumentation of a greater number of activities because of its capability to manage large volumes of data. In the past, process managers had to limit the monitors they set up to avoid information overload when processing the data being collected. Cloud-scale services combined with machine learning give process managers greater flexibility. They are able to collect data for “what-if” scenario modeling, and to train the machine-learning system to identify relationships and events within operational processes far more quickly than users could identify them manually.
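As a hedged illustration of this kind of “learning” from instrumented process data, one common off-the-shelf approach is an isolation forest, which flags observations that deviate from the patterns it has seen without any pre-defined rules or thresholds. The metrics and data below are invented for the example:

```python
import numpy as np
from sklearn.ensemble import IsolationForest  # one common, generic anomaly detector

# Illustrative only: rows are process measurements (e.g. cycle time, queue
# depth, error count). Most rows follow a "normal" pattern; a few do not.
rng = np.random.default_rng(0)
normal = rng.normal(loc=[30.0, 5.0, 1.0], scale=[3.0, 1.0, 0.5], size=(500, 3))
spikes = rng.normal(loc=[55.0, 12.0, 6.0], scale=[3.0, 1.0, 0.5], size=(5, 3))
observations = np.vstack([normal, spikes])

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(observations)   # -1 = anomalous, 1 = normal
print("flagged rows:", np.where(labels == -1)[0])
```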

One of the most promising potential benefits of machine learning is the “learning” aspect. Systems are not constrained to pre-defined rules and relationships – enabling them to adapt dynamically to changes in the data set from the business process and make inferences about problems in the process. These inferences can then be translated into events and incidents – potentially leading to automated corrective action and/or performance optimization of the process.

Even if companies are not ready to fully embrace machine-learning systems making decisions and taking actions without human intervention, there is tremendous near-term value in using machine-learning capabilities for correlation analysis and data validation to increase confidence and quality of data being used to drive operational insights. Manual scrubbing of data can be very costly and, in many cases, can offset (and negate) the potential benefits that data insights can provide to business process optimization. Machine learning is enabling higher quality insights to be obtained at a much lower cost than was previously achievable.

In business process optimization, there is an important distinction to be made between “change” and “improvement.” Machine-learning systems can correlate a large diversity of data sources – even without pre-defined relationships. They provide the ability to qualify operational (process) data with contextual (cost/value) data to help process managers quantify the impacts of inefficiencies and the potential benefits of changes. This is particularly important when developing a business justification for process optimization investments.

Machine learning is a true game changer for process optimization and process management professionals. Process analysis can now draw on an exponentially larger volume of data inputs, process the data faster and at a much lower price point, and generate near-real-time insights with quantifiable impact assessments. This enables businesses to achieve higher levels of process optimization and to be more agile in making changes when they are needed.

Optimizing Business Performance with People, Process, and Data

People are the heart and mind of your business. Processes form the backbone of your operations. Data is the lifeblood that feeds everything you do. For your business to operate at peak performance and deliver the results you seek, people, processes and data must be healthy individually, as well as work in harmony. Technology has always been important to bringing people, process and data together; however technology’s importance is evolving. As it does, the relationships among people, processes and technology are also changing.

People are the source of the ideas and the engine of critical thinking that enables you to turn customer needs and market forces into competitive (and profitable) opportunities for your business. The human brain is uniquely wired to interpret a large volume of information from the environment, analyze it and make decisions about how to respond. The human intellect, combined with passion and creativity, serves as the source for your company’s innovative ideas – both to create new and to improve existing products and operations. Ironically, most companies have historically viewed human resources as the “brawn” of their organization (workers), not the brains (thinkers).

Business and manufacturing processes provide the structure of your company’s operations – aligning the activities and efforts of your people into efficient and predictable workflows. Processes are critical to enable the effective allocation of the organization’s resources and ensure consistent and repeatable outcomes in both products and business functions. As companies mature, they develop the capabilities to improve operational performance by observing processes in action.

Operational data enables the people and process elements of your company to work together, providing both real-time and historical indications of what activities are taking place and how well they are performing. Data is also the key enabler of scalability – it allows multiple people to perform related activities independently and communicate with one another. Without data, separation of responsibilities and specialization of job roles would be almost impossible.

The relationships among people, process and data are changing. Since the first industrial revolution, processes were seen as the primary focus of businesses, with people serving as resources to execute those processes and data being created as a by-product of the work taking place. Technology adoption and the introduction of IT and manufacturing automation functions have primarily centered on the concept of business process automation – retaining the process focus and seeking to increase output and reduce costs through the elimination or streamlining of human activities. A new generation of business-process data enabled better monitoring and controlling of the processes to identify further opportunities for automation.

With the maturation of the information age, the benefits of investing in business-process automation are reaching a point of diminishing returns. Enterprises have been addressing those activities that could be easily and cost-effectively automated, and the majority of recapturable human resource costs involved in executing business processes have been harvested.

Optimizing business performance in the current business environment requires companies to re-think the relationship between people, process, data, and the technology that enables them. Forward-looking companies are transitioning to a data-centric perspective, viewing data as the strategic asset of the organization and framing people, process and technology as enablers to the creation, management and consumption of data. Re-framing the relationship in this way unlocks a new set of business optimization opportunities.

People are no longer viewed as workers to execute processes, but as interpreters of environmental and operational data – making critical, time-sensitive decisions and continually adjusting activities to improve business performance. Processes are no longer viewed as the rigid backbone to which all other parts of the organization must be attached, but, instead, become the source of operational data and the mechanisms for implementing change. In the modern paradigm, technology becomes more data-centric, capturing larger volumes and diversity of data elements and assisting humans to correlate them to drive large-scale, real-time operational insights.

The ability of companies to fine-tune their organizations effectively for optimal business performance will depend largely on the quality and trustworthiness of the data assets they have at their disposal. Business processes have become more data-centric, and technology adoption has expanded the possibilities for new and diverse instrumentation. Bringing all of the operational, environmental and strategic data sources together to enable decision making has become critical to business success.

Blazent’s service intelligently unifies disparate sources of IT and operational data into a single source that supports decision making and process refinement. While people and process are critical, it is not just the enabling data, but the quality of the data, that determines whether a company accelerates or stalls when pressure is applied. Blazent’s core role in the management of data quality has always served as a catalyst for growth and innovation.

IT under attack: Data Quality Technology helps companies assess security vulnerabilities

In the wake of the most recent (May 2017) malware attack impacting computer systems around the world, company executives are in urgent discussions with IT leaders, asking them to provide assessments of risks and vulnerabilities and recommendations to safeguard the company’s information and operations. CIOs and IT leaders strongly depend on the accuracy, completeness and trustworthiness of the data at their disposal to make informed decisions. How confident are you of the data being used to protect your organization from harm?

One of the biggest challenges for IT leaders is creating a dependable “big picture” view of their organization’s technology ecosystem and dependencies, because pieces of their operational data infrastructure are spread across a wide variety of technology management tools, ITSM systems, asset inventories, maintenance/patching logs and fragmented architecture models. While all of the pieces of the puzzle may be available, they often don’t fit together well (or easily), and data issues frequently appear: gaps, duplications, overlaps and conflicts, as well as inaccurate and out-of-date records. The result is a confusing picture for IT leaders, and one that cannot be shared with company executives without undermining confidence in IT leaders’ recommendations and decisions.

If the quality of a decision is only as good as the data that is the basis for making that decision, then the solution to this problem seems simple: “Improve the quality of the data.” For most organizations, “simple” is not a term that applies to data management. Integrating IT data from multiple systems into a unified enterprise picture involves a series of complex steps; integration, validation, reconciliation, correlation and contextualization must all be performed to ensure the quality and trustworthiness of the information consumed. Unfortunately, most companies’ ITSM, Data Warehouse, Operations Management and even reporting systems lack the capabilities to effectively perform the unification steps necessary to ensure the required levels of data quality. This is where specialized Data Quality management technology is needed.
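To show how those unification steps fit together (and only that; a dedicated data quality platform performs them at far greater scale and sophistication), here is a minimal sketch in which each step is reduced to a placeholder. The record fields and the lookup_owner helper are hypothetical:

```python
# A minimal sketch of the unification steps named above, applied to asset
# records pulled from several source systems (each a list of dicts).
def unify(records_by_source):
    integrated = [r for source in records_by_source.values() for r in source]       # integration
    validated = [r for r in integrated if r.get("serial") and r.get("hostname")]     # validation
    reconciled = {}                                                                  # reconciliation
    for r in validated:
        key = r["serial"].strip().upper()          # normalize the matching key
        reconciled.setdefault(key, {}).update(r)   # later sources fill gaps or overwrite
    correlated = list(reconciled.values())         # correlation: one record per real device
    for r in correlated:                           # contextualization: attach business context
        r["business_owner"] = lookup_owner(r.get("hostname"))
    return correlated

def lookup_owner(hostname):
    # Hypothetical helper: in practice this would query HR or access-management data.
    return None if hostname is None else "unassigned"
```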

Consider for a moment where the IT operational data related to the latest malware attack resides – focusing on identifying the areas of vulnerability and assessing the potential extent of impact on the business. This latest attack exploited a known security issue with certain versions of operating systems in use both on end-user computer systems and on some systems in data centers.

The asset records that identify the potentially impacted devices are typically found in an Asset Management system, Finance/Purchasing system, in network access logs or as configuration items in the CMDB of the ITSM system. Patching records that indicate what version of the operating system the devices are running may be found in a change management or deployment system (used by IT to distribute patches to users); asset management system (if devices are frequently inventoried); infrastructure management system (if the devices are in a data center); or in the ITSM system (if records are maintained when manual patching is done).

Once potentially vulnerable devices have been identified, IT staff and decision makers must understand where the devices are being used within the organization to assess the impact on business operations. For end-user devices, assigned-user/owner data is typically contained in asset inventory records, IT access management/account management systems and server access logs. The user can be associated with a business function or department through HR records. For devices installed in data centers and other common locations, the ITSM system, purchasing records, asset inventories and/or architecture models can often be used to identify the relationships between a device and a business process or responsible department/function.

There are commonly at least 5 independent sources of data that must be combined to identify which devices are potentially vulnerable and which business functions depend on them. When these data sets are gathered, there will undoubtedly be a large number of duplicates, partial records, records for devices that have been retired or replaced, conflicting data about the same device and records that are simply out of date. According to Gartner, at any moment, as much as 40% of enterprise data is inaccurate, missing or incomplete. Data quality technology can help integrate the data, resolve the issues, alert data management staff to areas that need attention and help decision makers understand the accuracy and completeness of the data on which they depend.
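As a simplified illustration of the kind of join this requires, the sketch below matches asset records against the most recent patch record per device and flags anything below a hypothetical patched build, or with no patch record at all. The field names and threshold are assumptions for the example:

```python
PATCHED_BUILD = 14393   # hypothetical minimum safe OS build number

def vulnerable_devices(assets, patch_records):
    """Join asset records (dicts with device_id, assigned_user, ...) against
    patch records (dicts with device_id, os_build, applied_at)."""
    latest_patch = {}
    for rec in patch_records:                         # keep only the newest record per device
        dev = rec["device_id"]
        if dev not in latest_patch or rec["applied_at"] > latest_patch[dev]["applied_at"]:
            latest_patch[dev] = rec
    flagged = []
    for asset in assets:
        patch = latest_patch.get(asset["device_id"])
        build = patch["os_build"] if patch else None
        if build is None or build < PATCHED_BUILD:    # unpatched, or patch status unknown
            flagged.append({**asset, "os_build": build})
    return flagged
```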

Blazent has been a leader in providing Data Quality solutions for more than 10 years and is an expert in integrating the types of IT operational data needed to help CIOs and IT leaders assemble an accurate and unified big picture view of their technology ecosystem. With data quality and trustworthiness enabled by Blazent’s technology, your leaders and decision makers can be confident that the information they are using to assess vulnerabilities and risks will lead to solid recommendations and decisions that protect your organization from harm.

Machine Learning and the rise of the Dynamic Enterprise

The term “dynamic enterprise” was introduced in 2008 as an enterprise architecture concept. Rather than striving for stability, predictability and maturity, dynamic enterprises began focusing on continuous and transformational growth – embracing change as the only constant. This shift began with the proliferation of social media and user-generated (Web 2.0) content, which started to replace the curated information previously available. Business and IT leaders in established enterprises did not fully embrace this trend, however; in fact, they resisted it for many years, fearful of losing control of their organization’s information and technology assets.

Outside the direct oversight of the IT organization, and fueled by the mindset of a younger generation of employees, the shift toward leveraging less accurate, subjective (often crowd-sourced) information to make business decisions has continued. As organizations embraced larger volumes of less-accurate data, information began flowing more openly, changing the underlying nature of what was driving the dynamic enterprise. The question was no longer where information was sourced, but how large volumes of often-conflicting data could be organized, categorized and consumed. Big data was constraining the vision of the dynamic enterprise.

As the data consumption trends evolved within the business environment, technologists (including Tim Berners-Lee, the inventor of the World Wide Web) were working behind the scenes on standards for a Semantic Web (Web 3.0), where computers could consume and analyze all of the content and information available. This enabling technology would bridge the divide between humans and their information and solve the big-data problem.

Making the data readable by computers was only part of the challenge. Most companies still lacked the technology capabilities and know-how to take advantage of the information at their disposal. Advancements in machine learning and cloud infrastructure during the past 3 years have finally unlocked the potential of big data to the masses. A few large cloud service providers have invested in computing infrastructure and developed the capabilities to ingest and process vast quantities of data. They analyze and correlate that data and make it available to users in the form of cloud services that require neither the technical expertise nor the capital investment that were formerly barriers to adoption.

As more enterprises and individuals leverage machine learning to draw insights from data, those insights become part of the “learned knowledge” of the system itself, helping the computer understand context and consumption behavior patterns that further improve its capability to bridge the human-information divide. Computers are also able to detect changes in information consumption as an indicator that the underlying information may have changed and that the data may no longer be accurate or current.

The dynamic enterprise is focused on continuous and transformative change. Until recently, the ability of humans to process information has limited the rate at which that change could take place. The maturation of machine learning and its accessibility to the mainstream business community has enabled enterprises to overcome this barrier and embrace what it means to be a truly dynamic enterprise. The challenge going forward will be to determine what questions to ask the computers and how to translate the newly available insights into business opportunities.

Blazent has helped our customers dramatically improve the effectiveness of their IT Service and Operations processes by improving the quality and integrity of the data upon which they depend. This provides a more stable basis for decision making, as well as providing insight into costs associated with both Service and Asset management.

How ITSM tools miss the boat on operational data management

For most companies, IT Service Management (ITSM) tools are the core hub for monitoring and managing the systems used to support day-to-day operations and business processes across the organization. They are also the centralized repository of information related to the status and performance of individual systems, and a critical source of information to support operational and strategic decision making. There is no argument that ITSM systems are an important part of supporting the company’s underlying technology and associated IT functions. However, ITSM tools often miss the boat on operational data management: they fall short when it comes to directly supporting business operations and providing the insights needed to integrate IT and OT data.

The challenges ITSM systems face with operational data management are threefold. First is the audience for which these systems were designed. ITSM systems originated from IT helpdesk and ticketing systems intended to enable semi-skilled IT employees to capture and manage system events and user requests. Over time, as ITSM systems evolved, incremental capabilities for monitoring technology component availability and performance were added in order to provide a unified monitoring console for infrastructure operators. With applications becoming the core focus of IT over the past 15 years, ITSM tooling added capabilities for configuration, dependency and change management to give application developers insight into how their software was interacting with the infrastructure environment. Where ITSM systems continue to be weak is in business process awareness. They are still focused on being a toolset for the technologists, not the business and operations employees who use the technology. ITSM systems simply have not been designed for the audience that can benefit most from operational data management.

The second challenge of ITSM systems is a lack of workflow awareness. ITSM tools do a good job of monitoring and managing components; however, they struggle to put component data into business context: where it is being used, how it affects the business and how components connect to each other in the form of business processes and operational workflows. Where workflow awareness is built into ITSM systems, it is typically semi-static in nature and fails to support the accelerating rate of change taking place in most organizations. This situation is further exacerbated by the proliferation of cloud services and IoT applications to support business processes and operations. As it becomes easier for business and operations employees to introduce technology changes without involving technologists, it becomes increasingly difficult for the IT organization to understand what technology is being used to support a business workflow.

The third challenge of ITSM systems in supporting operational data management lies in the context and timeliness of the data ingested into the ITSM system. Most ITSM systems cannot handle the volume of real-time transactional data needed to generate operational insights quickly enough to be useful to business and operations employees supporting live operations and processes. Operational data management requires detailed transaction data to be made available as close to real time as possible. For most organizations, this means millions (and in some cases billions) of transactional data points must be ingested, processed, analyzed and presented to users on a continuous basis. Once that data is ingested into the ITSM system, data management and retention become a challenge; most ITSM systems are not architected to support the scale required for this type of ongoing processing.

Just because ITSM systems are missing the boat on operational data management doesn’t mean that your organization can’t begin getting operational value out of the data you have today. Individual operational systems are already designed to handle the scale of (business process, manufacturing, financial) transactions they are performing. Your ITSM system already has some insight into configurations and dependencies. By bringing this information together, resolving duplication and quality issues, and making the information easily accessible to business and operations users, you can begin taking steps towards operational data management.

Operational systems and ITSM processes depend on high-quality data, which has been Blazent’s core focus for years, a focus now expanding rapidly as OT growth accelerates the range of items being tracked.

Why CMDBs are sexier than you think

Sexy may not be the first word that comes to mind when you think about your CMDB and the operational data of your company… but (seriously) maybe it should be!  After all, it has a number of attractive qualities and (with some care and feeding) could be the ideal partner for a lasting long-term relationship. There are lots of potential reasons this can work, but let’s focus on the top three:

Substance: Your Configuration Management Database is not shallow and fickle; it is strong and deep, with a history as long as your company’s. It is built on a core of your master data and pulls together all of the facets of operational data your company creates every day. It contains the complex web of connective tissue that can help you understand how your company works. Those insights then become part of the CMDB itself – enabling the strength of your data to be balanced by the wisdom that comes from analytics and self-awareness.

Long-term potential:  You may lust after the latest new tool or trend, but your CMDB will stand by your company’s side through thick and thin, long into the future.  It will grow and evolve with you, always be honest about what’s going on, and work with you to provide insights to get your company through troubled times.  As your company changes with new markets, products, customers, and competitors or becomes a part of something bigger through acquisition or partnership, your CMDB is there to help you navigate the changes and achieve success.

Air of mystery:  You may never fully understand all of the secrets that your CMDB holds about your company.  As you unlock one insight, the potential for others seems to appear magically.  What would you expect from something that brings together all parts of your company data and the complex interrelationships in one place for you to explore?  The question isn’t “what are the limits of the CMDB?” but rather “what are the limits of your curiosity?”

Deep substance, long-term potential and an air of mystery. Maybe your CMDB is sexier than you think!  But (just like human relationships) simply because it has the attractive qualities that you desire does not necessarily mean that there is “happily ever after” in your future. You must learn to work with your CMDB, to understand its qualities and quirks. Continual care and feeding (data quality and maintenance) will help keep it healthy and enable it to grow with you.

The time has never been better to take the next steps with your CMDB. If you are feeling nervous or don’t know where to start, Blazent can be your relationship coach – bringing you the tools and techniques to help you understand your CMDB better and teaching you how to bring out the best in it, so the CMDB can bring out the best in you!

Data Integrity: the key to operational insights or the elephant in the room?

Throughout history, business has struggled with the challenge of data accuracy and integrity. While there is clear operational and strategic value in accurate and dependable data for decision-making, the operational cost of achieving and maintaining data integrity can be a substantial barrier to success for many organizations. As IT and OT (Operational Technology) systems evolve, legacy data is continuously migrated to new systems. Examples include off-the-shelf software and SaaS solutions, which come with their own taxonomies or data models that must be merged with those of home-grown systems. As the needs of the business change, maintaining any level of data integrity can easily become cumbersome and costly.

Executives constantly ask their IT leaders how they can improve the quality and integrity of data in order to obtain the insights needed to guide their company effectively. While it sounds reasonable, it may well be the wrong question. Rather than focusing on the quality of raw data, a better approach is to focus on the quality of insights available and the speed/cost to obtain them by asking, “How can we better leverage the data we already have to cost effectively obtain the insights we need?”

Advances in machine learning, data science and correlation analysis during the past decade have enabled a broader range of capabilities to analyze data from disparate operational processes and information systems. This has been accomplished without developing some of the structured relationships and incurring data-model-integration costs associated with traditional data warehousing and reporting approaches. Keep in mind that modern analysis methods are most appropriately suited to gaining operational insights and do not (presently) replace the structured data required for financial reporting and regulatory compliance.

Through assessment of the trends and relationships between different data elements, modern data analysis systems are able to “discover” a variety of insights that may not have been available in the past. Examples include undocumented dependencies within operational processes, sources of data inaccuracy and the evolution of operational processes over time. Instead of focusing on what is “known” about operational data, modern methods focus on understanding what is “unknown” about it; it is the variances, exceptions and changes in the data that yield true operational insights. This type of analysis not only yields higher-value decisions, but can also be accomplished at a lower cost to the IT function, because it does not require the “scrubbing” and “integrating” of operational data that was once necessary. In many cases, even data of imperfect quality and integrity can yield higher-quality insights, surfacing operational challenges and opportunities that traditional approaches masked.

The key to operational insights is actionability, where you can answer questions like:

  • “Where are inefficiencies and time delays occurring in my operational process?”
  • “Is there variation in performance across operational functions/locations?”
  • “Where are inconsistencies being introduced into my operations?”

None of these questions focuses on “Where are my operations performing as designed?” but rather asks, “Where are my operations doing something unexpected?” By focusing on the unknowns and unexpected behavior, decision makers can identify actionable areas either to mitigate cost/risk or to exploit opportunities for value creation.
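A deliberately simple sketch of that mindset: rather than checking whether each site meets its design target, compare sites against one another and surface the one behaving unexpectedly. The data and the tolerance threshold below are made up for illustration:

```python
from statistics import mean, median

def unusual_locations(cycle_times_by_location, tolerance=1.5):
    """Flag locations whose average cycle time is more than `tolerance` times
    the median across all locations, i.e. doing something unexpected."""
    site_means = {loc: mean(times) for loc, times in cycle_times_by_location.items()}
    baseline = median(site_means.values())
    return {loc: m for loc, m in site_means.items() if m > tolerance * baseline}

# Example: one site is noticeably slower than its peers and gets flagged.
print(unusual_locations({"Austin": [12, 13, 11], "Lyon": [12, 14, 13], "Pune": [24, 26, 25]}))
```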

Is data integrity the key to operational insights or is it the elephant in the room? That depends on how organizations want to view the situation. Modern data analysis methods and improvements in technology enable what once was viewed as a challenge (or liability) now to be used as an asset to drive opportunity. Data Integrity at both the informational and operational level is a core requirement of any modern business, and has been an area of focus for Blazent since the early days of Big Data.

Why is IT Operational data important for IT?

Each day, with every customer transaction, employee task and business process, companies generate vast amounts of operational data that provides leaders and managers with insight into what is working well and what requires attention. Operational data is particularly important to those responsible for stewarding the information and technology assets of their organization.

In this context, operational data is particularly important to IT, which is why it is so critical to understand the three different types of operational data on which IT leaders rely: business operational data, IT operational data and their combination: Integrated Business-IT operational data. Together, this combined set of data provides insight into both the content and context of an IT organization’s activities – enabling IT leaders to have informed discussions with their business peers and make decisions about where to best invest organizational resources.

Business operational data is all about the business processes and user experiences, which IT enables with the technology and services it provides. The reason organizations invest in technology is to improve the productivity and effectiveness of business operations. Process and user-related data evaluated over time provides a contextual picture into how effectively the technology is achieving that goal. IT offers this data to business leaders in the form of information insights. With it, the decision makers who control the organization’s checkbook will better understand the value of technology to justify continued IT investment.

IT operational data is concerned with the content of “what” technology components are operating and being used. IT operational data is important as a part of the IT planning process to understand capacity utilization and determine where scalability constraints exist, as well as to understand the cost of services provided to users and to assess security and risk considerations of the business-technology ecosystem. Within IT service management processes, operational data is critical to ensure performance and availability Service Level Agreements (SLAs) are honored, and to drive technology cost reduction through infrastructure optimization.
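Some of this is straightforward arithmetic once the operational data can be trusted. For example, a minimal sketch of checking an availability SLA from outage records; the 99.9% target and the downtime figures are illustrative assumptions:

```python
# Compute achieved availability for a period from recorded outage minutes.
def availability(minutes_in_period, outage_minutes):
    return 1.0 - (sum(outage_minutes) / minutes_in_period)

MONTH_MINUTES = 30 * 24 * 60        # a 30-day reporting period
outages = [12, 35]                  # two incidents, in minutes of downtime
achieved = availability(MONTH_MINUTES, outages)
print(f"availability: {achieved:.4%}, SLA met: {achieved >= 0.999}")
```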

When the first two types of operational data – business and IT – are combined, they provide a set of powerful insights into the complex relationships that exist between the operations of the business and the technology on which those operations depend. Utilization of IT services and manual process performance data help leaders identify where additional automation investment may help the organization scale. IT cost data, in the context of the business processes, provides the inputs for the elusive ROI calculation on which both business and IT leaders depend to make investment trade-off decisions. Increases in business transactional data volume, combined with IT capacity enablement trends, reinforce the business case for technology upgrades or expansions.

Operational data provides IT with the critical picture it needs to understand and optimize the role it plays in the context of the company, and identifies where opportunities and risks exist that require attention. For IT (and business) organizations to receive the most value from their operational data assets, they must trust the data’s reliability and quality.

Data integrity, correlation and validation tools from Blazent can help your organization integrate operational data from a variety of systems, so your leaders and decision makers can trust the dependability of the information.