Over the last two decades, businesses have witnessed tremendous changes in the field of information science. Exponential gains in the processing power of computer systems, coupled with widespread adoption of cloud technologies across industries, have unfettered a previous era’s conservative treatment of knowledge and given rise to both the problems and the potential of Big Data. Seeking to maximize the utility of data and to improve cross-functional decision-making, a fleet of business intelligence tools has flooded the emerging data-science market with (almost) out-of-the-box analytic capability. With these tools, data can now not only be queried efficiently, but also combined effectively across systems to produce innovative advantage.
Business intelligence (BI) tools help organizations treat data as an asset, supporting increasingly direct feedback channels linking customers to products, services to customers, and performance to strategic decisions. Capable of processing large volumes of data through columnar indexes rather than row by row, BI tools such as Tableau, Cognos, and Watson deliver timely insights from raw data directly to the business user. Indeed, because this software supports more direct access to data, the roles of the data analyst and of the functional-area manager within the organization are less and less distinct: functional-area managers are now empowered to be analysts themselves. Further, within the organization, data itself is increasingly understood as an intangible asset that needs not only management, but cultivation. Data analysts benefit as well.
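The columnar advantage mentioned above can be sketched in a few lines. This is an illustrative toy, not any vendor's implementation; the field names are assumptions chosen for the example.

```python
# Illustrative sketch: the same data stored row-by-row versus column-by-column.
# A columnar layout lets an aggregate scan only the one field it needs,
# which is why columnar BI engines answer summary queries quickly.

# Row-oriented: each record carries every field.
rows = [
    {"region": "East", "units": 120, "revenue": 1800.0},
    {"region": "West", "units": 95,  "revenue": 1425.0},
    {"region": "East", "units": 40,  "revenue": 600.0},
]

# Column-oriented: one contiguous list per field.
columns = {
    "region":  ["East", "West", "East"],
    "units":   [120, 95, 40],
    "revenue": [1800.0, 1425.0, 600.0],
}

# Row-wise aggregation must visit every record and pick out the field.
row_total = sum(r["revenue"] for r in rows)

# Columnar aggregation scans a single array, ignoring the other fields.
col_total = sum(columns["revenue"])

assert row_total == col_total == 3825.0
```

The two layouts hold identical data; only the access pattern differs, and at scale that difference is what separates interactive dashboards from overnight batch reports.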
Whereas most business decision-makers simply want to see the clear implications of the choices before them, data analysts delve deep into the predictive potential of data. The reduced demand for cursory statistical analysis, which can be handled easily within BI dashboards, liberates the data scientist to mine, investigate, model, and even automate in-depth operations-performance analytics. Further, the quick wins achieved without bothering the analyst raise the organization’s perception of the value of data, which in turn increases its incentive to collect valuable KPIs. What data scientist has ever complained about too much data?
Unfortunately, as countless organizations have discovered in the wake of the Big Data hype, the old adage “garbage in, garbage out” applies not only to computer programming, but also to business intelligence: quality analytics depend upon quality data. For both the data scientist and the business-level decision-maker, data pre-processing very often consumes the large majority of the analyst’s time. Further, system connectivity lingers as a challenge, shading the promise of “cognitive” BI as an integrated capability. As business intelligence tools take on increasing machine-learning capabilities, organizations must transform not only every functional leader into an analyst, but also every employee into a data steward. Empowered through data governance, people, processes, data, and BI technologies cohere as an adaptive learning capability within the organization.
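The pre-processing burden described above can be made concrete with a small sketch. The records, field names, and cleaning rules here are illustrative assumptions, not a prescribed pipeline; the point is only that normalizing, de-duplicating, and filtering must happen before any analysis is trustworthy.

```python
# A minimal sketch of typical pre-processing: whitespace and casing are
# normalized, incomplete rows are dropped, and duplicates are removed.
# Field names ("customer", "spend") are hypothetical.

raw = [
    {"customer": " Acme Corp ", "spend": "1200"},
    {"customer": "acme corp",   "spend": "1200"},  # duplicate, different casing
    {"customer": "Globex",      "spend": None},    # missing value
    {"customer": "Initech",     "spend": "870"},
]

def clean(records):
    seen, out = set(), []
    for r in records:
        name = r["customer"].strip().title()  # normalize whitespace and case
        if r["spend"] is None:                # drop incomplete rows
            continue
        if name in seen:                      # drop duplicate customers
            continue
        seen.add(name)
        out.append({"customer": name, "spend": float(r["spend"])})
    return out

cleaned = clean(raw)
assert [r["customer"] for r in cleaned] == ["Acme Corp", "Initech"]
```

Even this toy shows why "garbage in, garbage out" matters: without the cleaning step, the duplicate and the null record would silently distort any total or average computed downstream.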