Tag Archives: analytics

Digital Business: Not Your Father’s Data-driven Business

Every business is becoming a digital business, and the use of big data, advanced analytics, and new digital business platforms is radically reshaping what it means to be a data-driven business. No longer are data-driven businesses simply those that use data to manage finance and operations. Rather, a data-driven business embeds data and analytics in digital business platforms to help customers and suppliers make their own near real-time business decisions.

Digital business footprints are obvious from the early successes of Amazon, Google and Netflix. The same is true of the digital transformations now underway in every industrial sector including those occurring at Argos, Atom Bank, Burberry, GE, Hulu, Microsoft, Siemens, Starbucks, Target, Tesco, T-Mobile USA, Uber and Walmart among many other enterprises. Digital business occurs through organic growth, spinouts, and new startups that at once challenge and support existing lines of industrial era businesses.

Digital businesses deliver competitive advantages as industry boundaries blur, coalesce, and merge, and as traditional approaches to data-driven business are augmented, complemented, and assaulted. However, making the transition to digital business means the capabilities of data-driven business must be expanded beyond those of traditional data-driven business. Continue reading Digital Business: Not Your Father’s Data-driven Business

Risky Business: Incorporating Analytics in the Engine of Risk

As Enterprise Risk Management (ERM) continues to advance as a concept, linking financial, operational, and GRC risk management across the enterprise, new opportunities are emerging from the application of Big Data. Business is about risk, and managing an enterprise is about managing risk. Risk and opportunity are inextricably linked, per the famous, and somewhat mistaken, reading of the Chinese character for “crisis” that has been a business management meme since the 1960s.

Application of Big Data and Advanced Analytics to risk management creates enormous potential for change in how firms are run. While the current application of analytics to this area tends to remain relatively small and limited to niche areas such as fraud detection, immediate market changes, regulation and bug forecasts, and the like, the capabilities are growing exponentially. Applying Advanced Analytics directly to business processes can create a mechanism of advanced performance in which individual processes are modified in accordance with an immediate analysis of risk. Such modification would enable an intelligent form of agility, permitting companies to respond to events wherever they occur within markets, supply chains, business conditions, and other areas. This will have different effects in different industries, initially impacting financial services and other professional services that can be immediately tailored to meet changing conditions. Yet we can see the potential for integration with manufacturing, software development, and the like. Continue reading Risky Business: Incorporating Analytics in the Engine of Risk
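To make the idea of risk-modified processes more concrete, here is a minimal, purely hypothetical sketch (in Python) of a single process step that consults an immediate risk score before choosing how to proceed. The order fields, thresholds, and scoring logic are all invented for illustration and do not represent any particular vendor's product.

```python
# Hypothetical sketch: a business process step that is modified in real time
# according to a risk analysis. All names, fields, and thresholds are invented.
from dataclasses import dataclass

@dataclass
class Order:
    order_id: str
    amount: float
    customer_tenure_days: int

def risk_score(order: Order) -> float:
    """Toy stand-in for a predictive model that returns a 0-1 risk score."""
    score = min(order.amount / 100_000, 1.0) * 0.7
    score += 0.3 if order.customer_tenure_days < 90 else 0.0
    return min(score, 1.0)

def process_order(order: Order) -> str:
    """Route the process step based on the immediate risk analysis."""
    score = risk_score(order)
    if score > 0.8:
        return "escalate_to_manual_review"   # high risk: pause the automated flow
    if score > 0.5:
        return "require_prepayment"          # medium risk: modify the process terms
    return "auto_approve"                    # low risk: straight-through processing

print(process_order(Order("A-1001", 42_000.0, 30)))  # -> require_prepayment
```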

The Self-Aided Analytics Solution

There has long been a problem in making analytics solutions available and accessible to the user. This has driven repeated waves of simplification, and left many users with inadequate solutions. Large numbers of businesses and enterprise departments still rely upon spreadsheets to fill a significant portion of their Business Intelligence and data analytics requirements. However, the advent of Big Data and the advantages of Advanced Analytics are making simple solutions less viable and are raising the bar for analytics across all types and sizes of business.

Unfortunately, Advanced Analytics requires expertise that is not always available. While processing capabilities are becoming available on a SaaS basis from the cloud, experts are still required to formulate questions and produce understandable results. But what if we could apply Analytics to the querying process itself, and to the production of usable results?
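As a toy illustration of what applying analytics to the querying process might look like at the very simplest level, the sketch below maps a natural-language question onto a SQL query over a hypothetical sales table. Real products use far more sophisticated Natural Language Processing; the table, column names, and keyword rules here are invented purely for illustration.

```python
# Toy sketch: a keyword-driven natural-language front-end that turns a
# question into a SQL query over a hypothetical "sales" table.
import re

METRICS = {"revenue": "SUM(revenue)", "units": "SUM(units_sold)"}
DIMENSIONS = {"region": "region", "product": "product_name", "month": "sale_month"}

def question_to_sql(question: str) -> str:
    q = question.lower()
    # Pick the first metric mentioned in the question, defaulting to a row count.
    metric = next((sql for word, sql in METRICS.items() if word in q), "COUNT(*)")
    # Group by any dimension introduced with "by <dimension>".
    dims = [col for word, col in DIMENSIONS.items() if re.search(rf"\bby {word}\b", q)]
    group_by = f" GROUP BY {', '.join(dims)}" if dims else ""
    select_dims = f"{', '.join(dims)}, " if dims else ""
    return f"SELECT {select_dims}{metric} FROM sales{group_by};"

print(question_to_sql("What was revenue by region last quarter?"))
# -> SELECT region, SUM(revenue) FROM sales GROUP BY region;
```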

New programs are beginning to do just that, with IBM’s Watson Analytics leading the way in providing a Natural Language Processing front-end for its Cognos analytics solution, and using analytics to provide immediately accessible visualizations to the user. All major analytics vendors are now moving in the direction of Natural Language Processing for queries, and a number of vendors are also moving into the kind of easy visualizations defined by market leader Tableau, which permit a naïve user to more fully grasp the implications of available data. Continue reading The Self-Aided Analytics Solution

Catching Up With Alteryx

Earlier this week, Saugatuck had the opportunity to talk to analytics provider Alteryx. Alteryx offers a product that combines ETL and Advanced Analytics in a single tool, helping its primary customers – LOB analysts – get their analysis done faster.

Over the last two years, Alteryx has gained significant traction with a “land and expand” go-to-market strategy that targets LOB users initially and then expands internally, often with the support of IT. This strategy has helped the company grow from 150 customers two years ago to just over 1,000 today (including EMC, Home Depot, Verizon, and Cardinal Health – see the Saugatuck Lens360 blog post Alteryx Inspire – The Importance of Analytic Context, published 01 July 2014). Alteryx has also been succeeding in deploying its Alteryx Server solution, which enables end users to schedule analytics jobs, publish results, and provide reports on a public or private cloud rather than just on a local machine.

Alteryx came to the advanced analytics market just as companies were first considering Big Data solutions like Hadoop and MapReduce, but took a different tack. Initially its product aimed to … Continue reading Catching Up With Alteryx

In the Valley of the Blind, Autonomy is King

As Big Data continues to develop as a force shaping the enterprise, we can expect to see changes in business processes. Saugatuck has been monitoring the intersection between Big Data and intelligent business processes for the past several years. Developments in Big Data and advanced analytics have already had a substantial impact on concepts of Business Process Management (BPM), including linkages with the Internet of Things (IoT) and the Industrial Internet, and the application of advanced analytics to analytic processes themselves. This contributes to a Big Data/Process convergence that is likely to have a continuing effect within the enterprise environment.

The application of Big Data directly to business processes such as manufacturing, finance, and supply chain processes, combined with autonomous operations enabled by real time evaluation and prediction, creates a new fabric for business operations. While much of this area has been focused upon manufacturing and is now being studied by governments and businesses, the digital business surround means that processes created within one domain are easily transferred to others. So, as rapid advances occur in linking processes to the Internet of Things through the Industrial Internet, we can expect rapid application of these ideas to other business areas such as human resources, healthcare, professional services, and the like. Continue reading In the Valley of the Blind, Autonomy is King

API with Analytics Yet?

The importance of the API economy has been apparent for several years, and API availability and use are growing exponentially. To date, this growth has been fueled by mobility, with APIs providing a mechanism for enabling tiny apps on mobile devices to perform important tasks by invoking the capabilities of hosted applications. At the same time, Big Data and Advanced Analytics have been developing steadily and moving toward direct real-time integration with business processes. Analytics APIs, particularly the new machine-learning-driven Predictive APIs (PAPIs), can provide the glue to bring Analytics and processes together.

Analytics APIs offer the possibility of real-time access to analytics inserted directly into composite applications. This offers great potential for enhancing business processes, and it also opens the possibility of combining multiple simultaneous streams of analysis on an ad hoc basis, creating a variable and scalable form of artificial intelligence.
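As a rough sketch of what invoking a predictive API from within a composite application might look like, the snippet below posts feature data to a hypothetical REST scoring endpoint and folds the returned prediction into a routing decision. The URL, payload shape, and response fields are assumptions made for illustration and do not describe any specific vendor's API.

```python
# Hypothetical sketch: calling a predictive (scoring) API from a composite
# application. Endpoint, payload, and response format are invented.
import json
import urllib.request

SCORING_URL = "https://api.example.com/v1/models/churn/predict"  # placeholder endpoint

def predict_churn(features: dict) -> float:
    """POST feature data to the scoring endpoint and return a predicted probability."""
    request = urllib.request.Request(
        SCORING_URL,
        data=json.dumps({"instances": [features]}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        body = json.load(response)
    return body["predictions"][0]["probability"]  # assumed response shape

def handle_support_ticket(ticket: dict) -> str:
    """Composite-application step: adjust routing based on the real-time prediction."""
    probability = predict_churn({"customer_id": ticket["customer_id"],
                                 "open_tickets": ticket["open_tickets"]})
    return "priority_queue" if probability > 0.7 else "standard_queue"
```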

Predictive APIs are already here and are being provided by major vendors and startups alike. These include Microsoft, with Azure Machine Learning, and Google, whose Prediction API is currently in beta. Emerging providers taking revolutionary steps in this area include BigML, Swift API, Datagami, GraphLab, Apigee Insights, Openscoring.io, Intuitics, Zementis, Predixion, PredictionIO, H2O, Yottamine, Lattice, Futurelytics, and Lumiata. As with many other technologies in the IT sector, a lot of the innovation is happening with startups. In this case, however, startups greatly expand the capabilities of the underlying technology by opening up a wider range of APIs for assembly to handle an ever-increasing range of data and outcomes. Continue reading API with Analytics Yet?

Time to Rethink the “Data Center” Concept – and the Role of “IT?”

On the one hand, the IT function and organization at most firms has never been more about “data.” Big Data, data mining, advanced analytics, predictive analytics, real-time analytics, and so on rule media reports, analyst insights, event titles, the blogosphere, and provider announcements. The growth of Digital Business both feeds, and feeds off of, data and more data. Where the focus used to be primarily on software and infrastructure, many firms now see data and content providing their greatest growth.

On the other hand, data and its processing are every day less and less centered. It’s easy to see the proliferation of Cloud-based function-, application-, and group-specific analytics that accompanies the parallel proliferation of Cloud-based applications and attendant data stores. Meanwhile, CIOs all over the globe are already telling us that, because they can leverage Cloud capabilities, they never want to build another data center.

This has engendered some interesting discussions with the Saugatuck team and with our clients as well. If we as business and IT leaders are more and more about “data” every day, while we actively pursue non-centered uses, locations, and processing for that data, what do we do with the concept of the “data center?”

The data center concept grew because of traditional, centralized IT organizations, infrastructures, and policies that insisted that data was a valuable resource that therefore must be centrally held, controlled, processed, and secured. This version of reality became moot after the onslaught of desktop and portable computing, and has become even less meaningful to many in a Cloud-first IT and … Continue reading Time to Rethink the “Data Center” Concept – and the Role of “IT?”

SIIA in Boston – Deciphering Data and Analytics for ISV Business

What is Happening?

ISVs can, should, and do profit from the use of advanced data analytics – not only by integrating them within software and services offerings, but more importantly, by integrating an increasing range and scope of data (including Big Data) and analytics into their own business operations and decision making. Data regarding user behavior, operational efficiencies, and relationship management can and should be analyzed to help determine and take advantage of customer / buyer desires and needs, as well as competitive abilities, solution improvements, development strategy, upsell / cross-sell opportunities, pricing, business models, and hiring / retaining the most useful employees.

These were among the lessons reported by Saugatuck Research Fellow Bruce Guptill, who had the pleasure of attending and participating in this week’s “Deciphering the Data Storm” event, presented in Boston by the Software & Services division of the Software and Information Industry Association (SIIA).

Key lessons learned and reported by ISVs regarding the analysis and application of a wide range of business data (including Big Data) include the following:

  • Data needs “gravity” in order to be useful; i.e., data needs varying combinations of human business context, situational relevance, and environmental semantics (“the voice of the author”) in order to be qualified, let alone useful, in analysis.
  • Don’t always focus on reducing / limiting the “bigness” of data. Adding to / augmenting data with similar, complementary, and relevant data can provide and improve the “gravity” of that data. The key information sought may not be found completely in your own data. That being said, don’t be afraid to apply a variety of filters to screen Big Data; just be willing to accept failure and move on quickly when the filtering doesn’t work as expected.
  • Share data in common to improve collaboration. “Success” is defined differently everywhere, even within small ISVs. Utilizing common sets of data has more beneficial impact, and enables more and better business collaboration, than trying to develop and focus on a “single version of the truth.” Different groups will always have different perspectives, and use data in different ways; ensuring that the data used is common rather than simply absolute will enable better understanding, and foster more (and more useful) interaction.
  • Know what the next step is. In other words, set realistic business goals beyond simply analyzing data. Once deeply into the analysis, it’s easy to lose sight of business reasons behind the analysis. And as more data becomes more readily available from more sources, it becomes more and more easy to become overwhelmed.

Continue reading SIIA in Boston – Deciphering Data and Analytics for ISV Business

Security’s Next Era: Big Data Security Analytics

Strap yourself into your seat for the big data security analytics show, for it’s coming to a town near you. Carnival barkers from every walk of life will want you to come into their tents to see the latest and greatest show on earth: the big data security analytics show.

You will want to understand why using evolution charts, Venn diagrams, Pareto charts, and Pivot tables can or will help. You’ll want to see what association rules, clustering, decision trees, and forecasting can do for you. And you will want to understand the difference between analysis and knowledge, as it’s applied to security.

You will also want to decide whether you need to hire a data scientist, and whether doing so will solve your immediate problems. You will also want to consider which approaches you could take that will produce the most value in the short, medium, and long term for your company and career.

To be useful, security analytics must take the large volume of data that can be collected and do three things with it (a brief sketch follows the list):

  • Reduce voluminous data and identify the patterns that matter,
  • Use the information to enable a timely and appropriate in-situ response, and
  • Use the data to make adjustments after the fact.
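As a minimal, hypothetical sketch of these three actions, the snippet below reduces a stream of login events to per-source failure counts, responds to the sources that cross a threshold, and keeps the aggregates for after-the-fact tuning. The event format, threshold, and response are invented for illustration.

```python
# Hypothetical sketch of the three actions above: reduce, respond, adjust.
from collections import Counter

events = [
    {"src_ip": "203.0.113.7", "action": "login", "result": "fail"},
    {"src_ip": "203.0.113.7", "action": "login", "result": "fail"},
    {"src_ip": "198.51.100.2", "action": "login", "result": "ok"},
    {"src_ip": "203.0.113.7", "action": "login", "result": "fail"},
]

# 1. Reduce voluminous data: collapse raw events into failure counts per source.
failures = Counter(e["src_ip"] for e in events
                   if e["action"] == "login" and e["result"] == "fail")

# 2. Respond in situ: act on the pattern that matters right now.
THRESHOLD = 3
for src_ip, count in failures.items():
    if count >= THRESHOLD:
        print(f"blocking {src_ip}: {count} failed logins")  # stand-in for a real response

# 3. Adjust after the fact: retain the aggregates to retune thresholds and filters later.
baseline = dict(failures)
```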

Continue reading Security’s Next Era: Big Data Security Analytics

Tidemark: Mobile-First, Ease of Use Analytics

As part of Saugatuck’s ongoing “Finance in the Cloud” Series, Mike West and Bill McNee recently spoke with executives from Tidemark to learn more about the Tidemark solutions, their market reception, future plans for enhancement, and how Tidemark views “Finance in the Cloud.”

Formerly known as Proferi, Tidemark is a privately-held firm founded in 2010 that provides Cloud-based analytics solutions built for a mobile device-enabled platform. Tidemark has grown rapidly, and now has nearly 200 employees and over 50 customers.

Tidemark’s differentiation in the crowded financial planning and analysis (FP&A) space is a process-focused elastic grid platform that delivers a high ease-of-use experience designed to move financial analysis beyond the CFO suite. Tidemark targets the emerging urgency – not just in the CFO suite, but across the lines of business – for real-time analysis of multi-dimensional, complex … Continue reading Tidemark: Mobile-First, Ease of Use Analytics