The term “dynamic enterprise” was introduced in 2008 as an enterprise architecture concept. Rather than striving for stability, predictability and maturity, dynamic enterprises focus on continuous, transformational growth, embracing change as the only constant. The shift began with the proliferation of social media and user-generated (Web 2.0) content, which started to replace the curated information previously available. Business and IT leaders in established enterprises, however, did not embrace this trend; fearful of losing control of their organizations’ information and technology assets, they resisted it for many years.
Outside the direct oversight of the IT organization, and fueled by the mindset of a younger generation of employees, the shift toward leveraging less accurate, subjective (often crowd-sourced) information to make business decisions continued. As organizations embraced ever-larger volumes of this less-accurate data, information began flowing more openly, and the question driving the dynamic enterprise changed: it was no longer where information was sourced, but how large volumes of often-conflicting data could be organized, categorized and consumed. Big data was constraining the vision of the dynamic enterprise.
As data-consumption trends evolved within the business environment, technologists (including Tim Berners-Lee, the inventor of the World Wide Web) were working behind the scenes on standards for a Semantic Web (Web 3.0), in which computers could consume and analyze all of the content and information available. This enabling technology promised to bridge the divide between humans and their information and to solve the big-data problem.
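At the heart of the Semantic Web standards is a simple idea: facts are expressed as (subject, predicate, object) triples that a machine can store and query without human interpretation. The following is a minimal sketch of that model in Python; the entities and predicates are invented purely for illustration, and real deployments would use standards such as RDF and SPARQL rather than this toy pattern-matcher.

```python
# Toy illustration of the Semantic Web's core data model:
# facts stored as (subject, predicate, object) triples.
# All names below are hypothetical examples.

triples = [
    ("widget-42", "madeBy", "AcmeCorp"),
    ("widget-42", "category", "hardware"),
    ("AcmeCorp", "locatedIn", "Boston"),
]

def query(store, subject=None, predicate=None, obj=None):
    """Return every triple matching the pattern (None acts as a wildcard)."""
    return [
        (s, p, o)
        for (s, p, o) in store
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# A computer can now answer questions directly from the data:
# everything known about widget-42 ...
print(query(triples, subject="widget-42"))
# ... or every "madeBy" relationship in the store.
print(query(triples, predicate="madeBy"))
```

Because every fact shares the same uniform shape, data from many independent sources can be merged into one graph and queried together, which is precisely what makes content consumable by machines rather than only by people.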
Making the data readable by computers was only part of the challenge. Most companies still lacked the technical capabilities and know-how to take advantage of the information at their disposal. Advances in machine learning and cloud infrastructure over the past three years have finally unlocked the potential of big data for the masses. A few large cloud service providers have invested in computing infrastructure and developed the capability to ingest and process vast quantities of data. They analyze and correlate that data and make the results available to users as cloud services that require neither the technical expertise nor the capital investment that were once barriers to adoption.
As more enterprises and individuals leverage machine learning to draw insights from data, those insights become part of the “learned knowledge” of the system itself, helping the computer understand context and consumption-behavior patterns and further improving its ability to bridge the human-information divide. Computers can also detect changes in information consumption as an early indicator of change in the underlying information, reducing the risk of acting on inaccurate or obsolete data.
The dynamic enterprise is focused on continuous and transformative change. Until recently, the rate at which that change could take place was limited by humans’ ability to process information. The maturation of machine learning and its accessibility to the mainstream business community have enabled enterprises to overcome this barrier and embrace what it means to be a truly dynamic enterprise. The challenge going forward will be determining which questions to ask the computers and how to translate the newly available insights into business opportunities.
Blazent has helped its customers dramatically improve the effectiveness of their IT Service and Operations processes by improving the quality and integrity of the data on which those processes depend. This provides a more stable basis for decision-making, as well as insight into the costs associated with both Service and Asset management.