As Big Data continues to develop as a force shaping the enterprise, we can expect to see changes in business processes. Saugatuck has been monitoring the intersection between Big Data and intelligent business processes for the past several years. Developments in Big Data and advanced analytics have already had a substantial impact on concepts of Business Process Management (BPM), including linkages with the Internet of Things (IoT) and the industrial Internet, and the application of advanced analytics to analytic processes themselves. This contributes to a Big Data/Process convergence that is likely to have a continuing effect within the enterprise environment.
The application of Big Data directly to business processes such as manufacturing, finance, and supply chain processes, combined with autonomous operations enabled by real-time evaluation and prediction, creates a new fabric for business operations. While much of this work has focused on manufacturing and is now being studied by governments and businesses, the surrounding digital business environment means that processes created within one domain are easily transferred to others. So, as rapid advances occur in linking processes to the Internet of Things through the Industrial Internet, we can expect rapid application of these ideas to other business areas such as human resources, healthcare, professional services, and the like. Continue reading In the Valley of the Blind, Autonomy is King
The importance of the API economy has been apparent for several years, and API availability and use are growing exponentially. To date, this growth has been fueled by mobility, with APIs providing a mechanism for enabling tiny apps on mobile devices to perform important tasks by invoking the capability of hosted applications. At the same time, Big Data and Advanced Analytics have been developing steadily and moving toward direct real-time integration with business processes. Analytics APIs, particularly the new machine-learning-driven Predictive APIs (PAPIs), can provide the glue to bring Analytics and processes together.
Analytics APIs offer the possibility of real-time access to analytics inserted directly into composite applications. This offers great possibilities for enhancement of business processes, but it also opens the possibility of combining multiple simultaneous streams of analysis on an ad hoc basis, creating a variable and scalable artificial intelligence.
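As an illustrative sketch of this idea of composing multiple prediction streams inside a business process: the scoring functions below are invented stand-ins for hosted predictive-API calls (real PAPIs expose comparable request/response patterns over HTTPS), and the field names and thresholds are assumptions, not from any particular vendor's API.

```python
# Hypothetical sketch: combining several predictive "streams" ad hoc
# to drive the next step of a business process. Each scoring function
# stands in for a call to a hosted prediction API.

def churn_score(customer: dict) -> float:
    """Stand-in for a hosted churn-prediction API call."""
    return 0.8 if customer["support_tickets"] > 5 else 0.2

def upsell_score(customer: dict) -> float:
    """Stand-in for a hosted upsell-propensity API call."""
    return 0.7 if customer["monthly_spend"] > 100 else 0.3

def next_action(customer: dict) -> str:
    """Combine the prediction streams to choose a process step."""
    if churn_score(customer) > 0.5:
        return "retention_offer"
    if upsell_score(customer) > 0.5:
        return "upsell_campaign"
    return "no_action"

customer = {"support_tickets": 7, "monthly_spend": 50}
print(next_action(customer))  # retention_offer
```

The point of the composite-application pattern is that each scoring call could be swapped for a different hosted API, or new streams added, without restructuring the process logic around them.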
Predictive APIs are already here and are being provided by major vendors and startups alike. This includes Microsoft, which provides Azure Machine Learning, and Google, which provides its Prediction API, currently in beta. Emerging providers taking revolutionary steps in this area include BigML, Swift API, Datagami, GraphLab, Apigee Insights, Openscoring.io, Intuitics, Zementis, Predixion, PredictionIO, H2O, Yottamine, Lattice, Futurelytics, and Lumiata. As with many other technologies in the IT sector, a lot of the innovation is happening with startups. In this case, however, startups greatly expand the capabilities of the underlying technology by opening up a wider range of APIs for assembly to handle an ever-increasing range of data and outcomes. Continue reading API with Analytics Yet?
On the one hand, the IT function and organization for most firms has never been more about “data.” Big Data, data mining, advanced analytics, predictive analytics, real-time analytics, and so on rule media reports, analyst insights, event titles, the blogosphere, and provider announcements. The growth of Digital Business both feeds, and feeds off of, data and more data. Where the focus used to be primarily on software and infrastructure, many firms now see data and content providing their greatest growth.
On the other hand, data and its processing are becoming less and less centralized every day. It’s easy to see the proliferation of Cloud-based function-, application-, and group-specific analytics that accompany the parallel proliferation of Cloud-based applications and attendant data stores. And meanwhile, CIOs all over the globe are already telling us that, because they can leverage Cloud capabilities, they never want to build another data center.
This has engendered some interesting discussions with the Saugatuck team and with our clients as well. If we as business and IT leaders are more and more about “data” every day, while we actively pursue non-centered uses, locations, and processing for that data, what do we do with the concept of the “data center?”
The data center concept grew because of traditional, centralized IT organizations, infrastructures, and policies that insisted that data was a valuable resource that therefore must be centrally held, controlled, processed, and secured. This version of reality became moot after the onslaught of desktop and portable computing, and has become even less meaningful to many in a Cloud-first IT and Continue reading Time to Rethink the “Data Center” Concept – and the Role of “IT?”
What is Happening?
ISVs can, should, and do profit from the use of advanced data analytics – not only by integrating them within software and services offerings, but more importantly, by integrating an increasing range and scope of data (including Big Data) and analytics into their own business operations and decision making. Data regarding user behavior, operational efficiencies, and relationship management can and should be analyzed to help determine and take advantage of customer / buyer desires and needs, as well as competitive abilities, solution improvements, development strategy, upsell / cross-sell opportunities, pricing, business models, and hiring / retaining the most useful employees.
These were among the lessons reported by Saugatuck Research Fellow Bruce Guptill, who had the pleasure of attending and participating in this week’s “Deciphering the Data Storm” event, presented in Boston by the Software & Services division of the Software and Information Industry Association (SIIA).
Key lessons learned and reported by ISVs regarding the analysis and application of a wide range of business data (including Big Data) include the following:
- Data needs “gravity” in order to be useful; i.e., data needs varying combinations of human business context, situational relevance, and environmental semantics (i.e., “the voice of the author”) in order to be qualified, let alone be useful in analysis.
- Don’t always focus on reducing / limiting the “bigness” of data. Adding to / augmenting data with similar, complementary, and relevant data can provide and improve the “gravity” of that data. The key information sought may not be found completely in your own data. That being said, don’t be afraid to apply a variety of filters to screen Big Data; just be willing to accept failure and move on quickly when the filtering doesn’t work as expected.
- Share data in common to improve collaboration. “Success” is defined differently everywhere, even within small ISVs. Utilizing common sets of data has more beneficial impact, and enables more and better business collaboration, than trying to develop and focus on a “single version of the truth.” Different groups will always have different perspectives, and use data in different ways; ensuring that the data used is common rather than simply absolute will enable better understanding, and foster more (and more useful) interaction.
- Know what the next step is. In other words, set realistic business goals beyond simply analyzing data. Once deeply into the analysis, it’s easy to lose sight of the business reasons behind the analysis. And as more data becomes more readily available from more sources, it becomes easier and easier to become overwhelmed.
Continue reading SIIA in Boston – Deciphering Data and Analytics for ISV Business
Strap yourself into your seat for the big data security analytics show, for it’s coming to a town near you. Carnival barkers from every walk of life will want you to come into their tents to see the latest and greatest show on earth: the big data security analytics show.
You will want to understand why using evolution charts, Venn diagrams, Pareto charts, and Pivot tables can or will help. You’ll want to see what association rules, clustering, decision trees, and forecasting can do for you. And you will want to understand the difference between analysis and knowledge, as it’s applied to security.
You will also want to decide whether you need to hire a data scientist, and whether doing so will solve your immediate problems. You will also want to consider which approaches you could take that will produce the most value in the short, medium, and long term for your company and career.
To be useful, security analytics must take the large volume of data that can be collected and apply three actions to it, as follows:
- Reduce voluminous data and identify the pattern that matters,
- Use the information to enable a timely and appropriate in-situ response, and,
- Use the data to make adjustments – after the fact.
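The three actions above can be sketched in miniature. This is an illustrative toy, not a real security product: the log events, the failed-login count as the "pattern that matters," and the thresholds are all invented for the example.

```python
# Toy sketch of the three security-analytics actions: reduce the data,
# respond in situ, and adjust after the fact.
from collections import Counter

events = [
    {"src": "10.0.0.5", "action": "login_fail"},
    {"src": "10.0.0.5", "action": "login_fail"},
    {"src": "10.0.0.5", "action": "login_fail"},
    {"src": "10.0.0.9", "action": "login_ok"},
]

# 1. Reduce voluminous data: count login failures per source address.
fails = Counter(e["src"] for e in events if e["action"] == "login_fail")

# 2. Timely in-situ response: block any source at or above the threshold.
BLOCK_THRESHOLD = 3
blocked = {src for src, n in fails.items() if n >= BLOCK_THRESHOLD}

# 3. After-the-fact adjustment: tighten the threshold if blocks occurred.
next_threshold = 2 if blocked else BLOCK_THRESHOLD

print(blocked)         # {'10.0.0.5'}
print(next_threshold)  # 2
```

Real deployments replace the frequency count with the association rules, clustering, and forecasting techniques mentioned above, but the reduce / respond / adjust loop is the same.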
Continue reading Security’s Next Era: Big Data Security Analytics
As part of Saugatuck’s ongoing “Finance in the Cloud” Series, Mike West and Bill McNee recently spoke with executives from Tidemark to learn more about the Tidemark solutions, their market reception, future plans for enhancement, and how Tidemark views “Finance in the Cloud.”
Formerly known as Proferi, Tidemark is a privately-held firm founded in 2010 that provides Cloud-based analytics solutions built for a mobile device-enabled platform. Tidemark has grown rapidly, and now has nearly 200 employees and over 50 customers.
Tidemark’s differentiation in the financial planning and analysis (FP&A) space, which has many competing providers, is a process-focused elastic grid platform that delivers a highly usable experience designed to move financial analysis beyond the CFO suite. Tidemark targets the emerging urgency – not just in the CFO suite, but across the lines of business – for real-time analysis of multi-dimensional, complex Continue reading Tidemark: Mobile-First, Ease of Use Analytics
With Clouds come storms, and big storms tend to blow things around. Back in 2012, we began building our “Boundary-free Enterprise™” business concept to illustrate how much Cloud and its related technologies will “blow away” many, if not most, of our traditional business and technological boundaries.
A new Strategic Perspective for Saugatuck Technology’s subscription research clients looks at the four types of boundaries most likely to be buffeted by these storms, as follows: Continue reading Boundary-free Enterprises and the Big Storm of 2015
What is Happening?
Recent software analyst and IT media reports, including insights from a recent SAP Americas User Group (ASUG) survey, suggest that SAP’s HANA Big Data service / platform is not yet seen by a majority of ASUG members as benefiting their business (relative to the cost of implementation), or driving enough revenue growth for SAP. SAP has, very smartly, issued a careful rebuttal explaining how, where, and why customers see value in HANA – and more importantly, offering to work with any customer to help them understand and realize business benefits from the offering and its associated apps.
We believe that, through at least 2016, this type of approach is the most effective way of getting user enterprises to understand the value of any Big Data analytics capability; i.e., develop company-specific and operationally-specific business cases in order to enable and develop business value. This is because, in most companies, Big Data analytics just can’t be widely used to deliver broad-based business benefits across the full portfolio – because user enterprises have huge challenges finding and managing their own data, let alone analyzing it. Continue reading The Business Problem with Big Data Analytics
Recently, Saugatuck attended the 2014 Alteryx Inspire event in San Diego as part of our ongoing Analytics, BI, and Big Data research. The event showcased not just Alteryx offerings and customers, but also did a good job of presenting and encouraging discussion around Analytics trends, partner relationships, and challenges for users of analytics – including Big Data. We came away from the event with three key insights, as follows:
1) ETL & Access to the Data. One of the primary differentiators of Alteryx is built-in ETL. Even though the application features significant Advanced Analytics capabilities (built around the R language), the Alteryx Designer centers on an ETL-driven workflow. These capabilities make Alteryx adept at combining multiple data sources, and performing complex Joins and Transformations that would normally be prohibitively difficult for end-user business analysts. These capabilities also featured centrally for the customers that we talked to, as most Continue reading Alteryx Inspire – The Importance of Analytic Context
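The kind of extract-join-transform workflow described above can be sketched in plain SQL. The tables, columns, and data below are invented for illustration; tools like Alteryx put a visual workflow in front of this class of operation so business analysts need not write it by hand.

```python
# Illustrative ETL-style sketch: extract from two sources, join them,
# and transform (aggregate) the result. Uses an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
conn.execute("CREATE TABLE customers (id INTEGER, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 101, 250.0), (2, 102, 80.0), (3, 101, 120.0)])
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(101, "West"), (102, "East")])

# Join the two sources and aggregate: total order amount per region.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount)
    FROM orders o JOIN customers c ON o.customer_id = c.id
    GROUP BY c.region ORDER BY c.region
""").fetchall()
print(rows)  # [('East', 80.0), ('West', 370.0)]
```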
What is Happening?
Digital delivery models are impacting traditional businesses. Driven by consumer demand for convenience and new consumption models, including subscriptions and usage-based consumption, enterprises are moving to Digital Business models and offerings, and finding huge upside from predictable revenue streams and long-term, recurring-revenue customer relationships.
There’s no simple means of accomplishing this, however – there’s no “silver bullet.” Saugatuck’s research in the constantly-changing, constantly-innovating world of Digital Business continues to indicate that success requires not only significant investment in business strategy, modeling, and organization, but also a flexible, platform-based approach built on three cornerstones: data analytics, dPaaS and other Cloud development platforms, and DevOps.
A new, 36-page Saugatuck Strategic Research Report – Cornerstones for Digital Business: Big Data, dPaaS, and DevOps – summarizes and explains this platform-based approach, using key concepts of Digital Business, in-depth explanation of the required platform architecture, and re-examinations of foundational Saugatuck research to illustrate and guide readers through Digital Business transitions. Continue reading Keys to Digital Transformation, Cornerstones for Digital Business