Tag Archives: Cloud IT

From Technology Primacy to Business Primacy – Mapping IT Capabilities Across Eras

A recent Strategic Perspective published for subscribers of Saugatuck Technology’s Continuous Research Services laid out a simple vision of three eras of enterprise IT – Centralized, De-centralizing, and Boundary-free Business – and explained why the shift from the second to the third has been so traumatic, generating so much fear, uncertainty, and doubt among enterprise IT organizations and the external providers serving them. A new Strategic Perspective uses input from CIOs and CTOs to create a heat map model that illustrates the key shifts from one era to the next, in the context of the change from Technology-first to Business-first IT capabilities, and weighs the effects of those shifts on internal enterprise IT organizations and external IT providers.

The net of it all: the core nature of enterprise IT has shifted from Technology primacy to Business primacy, and that shift is what has changed the fundamental nature and value of enterprise IT – and of the external providers serving it. Most of us are aware of this, but we have not been able to visualize it easily, and so have lacked an effective means of qualifying the change and identifying where investments and planning need to shift – especially regarding the capabilities needed to make IT work for the enterprise. Saugatuck’s model makes this visualization adaptable to individual enterprises.

Because of the shift away from Technology primacy, many traditional external IT providers (e.g., VARs, SIs, outsourcers) are losing much of their influence over many aspects of enterprise IT and business. This is not to say that what they provide is unimportant or not valued; rather, more value is being attached to other, more Business-first types of capabilities (e.g., process-specific and market-specific knowledge and expertise). And yes, many providers have decades of process- and market-specific skills and knowledge, but in the past these tended to be applied in ways that emphasized the specific technological characteristics of the providers’ offerings. The value expected by enterprises is no longer in the offerings themselves, but in how the offerings improve their business. Continue reading From Technology Primacy to Business Primacy – Mapping IT Capabilities Across Eras

Oracle Financials – Tougher Than You Think

Oracle Corp. (ORCL) issued its quarterly and fiscal year-end financial report this week, and the blogosphere is abuzz with quotes, assertions, and counter-arguments regarding the company’s future and viability. On one side, we have company leadership asserting that all is well, and that the notable decline in traditional software, hardware, and services business will be more than offset by the continued increase in Cloud-based business. On the other, we have pundits proclaiming the beginning of the end for Oracle.

Here’s what Saugatuck sees: a very traditional, old-style IT Master Brand in the throes of re-inventing itself, and suffering financially as a result (just like all the other traditional IT Master Brands). That doesn’t mean that Oracle is failing, or will fail – just that, like HP, IBM, and other traditional old-line Master Brands, it is suffering as its business and its traditional customers and partners turn, almost groaningly slowly, toward Cloud-first approaches.

A radar-style chart helps to show not only how slowly this is happening, but how tough it is to accomplish. Figure 1 illustrates, using data aggregated from Oracle’s SEC filings for 2013 and 2015. I combined Oracle’s reported “New Software License” revenue and its “Software License Updates and Product Support” revenue into the “Traditional Software Business” category in Figure 1. Oracle’s reported “Cloud Software as a Service and Platform as a Service” revenue is combined with its “Cloud Infrastructure as a Service” revenue to create “Cloud-based business.” “Traditional Hardware Business” includes the company’s reported “Hardware Systems products” and “Hardware Systems Support” revenues. “Traditional Services Business” is simply Oracle’s reported “Services Revenues.”
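For readers who want to trace that roll-up themselves, the sketch below restates the category mapping just described in code form. TypeScript is used purely for illustration, the field names are paraphrases of the filing line items, and no dollar values from the filings are embedded – you would populate the object from the filings for each fiscal year.

```typescript
// Illustrative sketch only: rolling Oracle's reported revenue lines up into
// the four categories plotted in Figure 1. Field names paraphrase the filing
// line items; no actual dollar values from the filings appear here.

interface ReportedRevenue {
  newSoftwareLicense: number;         // "New Software License"
  softwareUpdatesAndSupport: number;  // "Software License Updates and Product Support"
  cloudSaaSAndPaaS: number;           // "Cloud Software as a Service and Platform as a Service"
  cloudIaaS: number;                  // "Cloud Infrastructure as a Service"
  hardwareSystemsProducts: number;    // "Hardware Systems products"
  hardwareSystemsSupport: number;     // "Hardware Systems Support"
  services: number;                   // "Services Revenues"
}

// Combine the reported lines into the Figure 1 categories
// (values in, and out, are US $ millions).
function toFigure1Categories(r: ReportedRevenue) {
  return {
    traditionalSoftwareBusiness: r.newSoftwareLicense + r.softwareUpdatesAndSupport,
    cloudBasedBusiness: r.cloudSaaSAndPaaS + r.cloudIaaS,
    traditionalHardwareBusiness: r.hardwareSystemsProducts + r.hardwareSystemsSupport,
    traditionalServicesBusiness: r.services,
  };
}
```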

Figure 1: Oracle Revenue Shifts (Almost Imperceptibly) Toward Cloud, 2013-2015


Source: Saugatuck Technology Inc.; figures are in millions of US $

See the little bit of pink/red to the left of center? That’s how much / how little Oracle’s Cloud-based revenue has grown in two years. Continue reading Oracle Financials – Tougher Than You Think

Tight Budgets and Impending Upgrades Drive Cloud Evaluation

Saugatuck recently released findings from our 2015 Cloud Infrastructure Survey. This global Web survey of 327 IT executives, spanning major geographies and business sizes, clearly shows that IT infrastructures are transitioning from traditional On-premises resources to Cloud-based alternatives. In a just-published Strategic Perspective, we look at one facet revealed by the survey: planned infrastructure upgrades for key workloads.

For most IT organizations, budgets are always tight. In the past few years, IT budgets have been stretched even thinner than in “normal” years by cuts driven by lingering sluggish economic conditions, and by an explosion of demand for IT resources to accommodate new requirements including mobility, analytics, and social.

Saugatuck projects that the demand for computing resources will continue to grow rapidly – and in increasingly dynamic and erratic increments. Continue reading Tight Budgets and Impending Upgrades Drive Cloud Evaluation

Scaling Beyond The Second Dimension: Re-visualizing Boundary-free Enterprise™

As a pioneer of the concept and theme of “free range” business IT and “boundary-free” business within and between enterprises, Saugatuck developed some of the earliest architectural, cost, and IT management models, which have since become more and more widely used.

But just as Clouds change shape, content, consistency, and output over time, so must the nature and visualization of Cloud-based business IT – especially given the accelerated adoption and expansion of more forms of Digital Business that make use of multiple systems, groups, functions, and data that previously did not intersect or interact. In short, we have to be able to accurately visualize what’s happening in order to have any chance of managing it and sustaining it as a business resource. Continue reading Scaling Beyond The Second Dimension: Re-visualizing Boundary-free Enterprise™

ISPs Act to Block Title II Reclassification in Net Neutrality Rules

On May 1, AT&T, CenturyLink, and U.S. telecom and cable industry groups petitioned the FCC to block parts of the new Net Neutrality rules, citing “crushing” compliance costs and threats to investment. The FCC ruling will go into effect June 12, 2015 unless the FCC or a court grants the motion to stay (i.e., delay) the ruling.

The request objects to subjecting broadband carriers to common carrier duties under Title II of the Communications Act of 1934. While the Title II changes are part of the FCC’s new Net Neutrality rules, the ISPs’ request specifically targets Title II reclassification. Petitioners did not seek a stay of the other key Net Neutrality rules – no blocking, no throttling, no paid prioritization – and that reveals an important element of their legal strategy: the Internet providers see the new FCC rules as a house of cards, with Title II classification as its foundation.

As we shared earlier this year (see Net Neutrality – Enjoy the Media Circus, Hurry Up and Wait for Real Change, 20Feb2015, 1530RA – see Lens360 blog post version), the consequences of the FCC reclassifying broadband under Title II are unclear. The agency would need to use forbearance to waive certain processes that don’t apply to the Internet. Opponents assert that the change would slow innovation on the Internet, resulting in a regulated “innovation by permission” situation. Providers assert that Title II carries a lot of baggage as a regulatory option, with a risk of forcing other forms of transmission to fall under its classification as well. In essence, opponents say, “if it’s not broken, don’t fix it.” Continue reading ISPs Act to Block Title II Reclassification in Net Neutrality Rules

AWS Summit 2015: Public Cloud and Microservices

What is Happening?

Yesterday, Saugatuck attended the AWS Summit 2015 in San Francisco, where Amazon gave an update on its business and released several new products to the nearly 10,000 attendees at the Moscone Center and the 7,000 who watched the livestream. Andy Jassy, SVP of Web Services at Amazon, kicked off the keynote by highlighting some key statistics about the business: from 4Q13 to 4Q14, Amazon saw 103 percent year-over-year growth in the amount of data transferred into and out of S3 (Simple Storage Service) and 93 percent growth in the use of its compute service, EC2. AWS now has over 1 million active customers – users of the service within the last month.

Jassy brought several companies on stage to discuss the value of AWS Cloud infrastructure. Jason Kilar, the founding CEO of Hulu and now CEO of Vessel, a startup focused on video sharing and consumption, highlighted the ability to keep his team small and focus on the business without having to build infrastructure. Wilf Russel, VP of Digital Technology Development at Nike, described how the Cloud has fundamentally changed Nike’s application architecture and outlined its shift toward DevOps and Microservices. Valentini Volonghi, CTO at AdRoll, discussed how the Cloud gave the business the reach to reduce latency by distributing its app around the globe. And Colin Bodell, CTO & EVP at Time Inc., who is migrating all of Time’s datacenters to AWS, noted that in the UK the move took Time’s monthly datacenter run rate from $70,000 to $17,000.

Finally, Jassy used the opportunity to make several product announcements:

  1. Amazon Elastic File System (EFS) – a fully managed filesystem that can span multiple EC2 instances, enabling multiple concurrent and scalable connections to a single file directory.
  2. Amazon Machine Learning Service – a one-size-fits-all service that enables non-experts to apply Machine Learning algorithms to their data sets, or within their applications. It offers a simple API for training and model calculations, which can then be called to perform specific machine learning tasks.
  3. Amazon WorkSpaces and AWS Marketplace for Desktop Apps – an Amazon VDI product that enables companies to create virtual desktops, as well as purchase, provision, and manage the software for those desktops.
  4. The GA release of the Amazon EC2 Container Service – the Amazon tool for deploying, managing, updating, and running Docker clusters.
  5. The GA release of AWS Lambda – a service designed to perform trigger/event-driven compute in small doses. It provides a way to perform small, scripted tasks in real time when triggers fire. Amazon highlighted its use in sending notifications, indexing, IoT, and as a serverless mobile backend. At present, Lambda only supports Node.js, with Java support coming in the next few weeks. (A minimal handler sketch follows this list.)
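To make the Lambda model more concrete, here is a minimal, hypothetical handler sketch, written in TypeScript that compiles down to the Node.js runtime Lambda supported at the time. The trigger shown is an S3 object-created notification; the indexing action and the event wiring are illustrative assumptions, not something Amazon demonstrated at the Summit.

```typescript
// Hypothetical sketch of a small, event-driven Lambda function. It runs when
// an S3 "object created" notification fires and does a small dose of work per
// record (here, just logging a would-be indexing step).

interface S3NotificationEvent {
  Records: Array<{
    s3: {
      bucket: { name: string };
      object: { key: string };
    };
  }>;
}

// The 2015-era Node.js runtime handed the handler a context object with
// succeed/fail callbacks to signal completion.
interface LambdaContext {
  succeed(result?: unknown): void;
  fail(error?: unknown): void;
}

// Compiles to `exports.handler`, the entry point Lambda invokes per event.
export const handler = (event: S3NotificationEvent, context: LambdaContext): void => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = record.s3.object.key;
    // Placeholder for the real task: send a notification, index the object, etc.
    console.log(`New object "${key}" in bucket "${bucket}" - queuing for indexing`);
  }
  context.succeed(`Processed ${event.Records.length} record(s)`);
};
```

In use, the function would be wired to an event source such as S3 or Kinesis and billed only for the brief compute consumed per invocation – which is the “small doses” point Amazon emphasized.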

Continue reading AWS Summit 2015: Public Cloud and Microservices

Cloud Infrastructure Survey: New Report Findings

What is Happening?

Earlier today, Saugatuck Technology released the findings from its just-completed Cloud Infrastructure Survey. The research and analysis clearly show that businesses are moving rapidly away from traditional On-premises systems toward a range of Cloud infrastructure alternatives – including Internal Private Cloud, Hosted Private Cloud, Public Cloud, and Hybrid (On-premises + Public Cloud). While CRM, HCM, and Marketing-based SaaS solutions have dominated early Cloud deployment decisions – and more recently Cloud-based Finance offerings have begun to gain traction – the growing migration of On-premises production workloads to the Cloud, along with the creation and deployment of Cloud-native production workloads, clearly shows that we are entering a new phase in the transition.

Across major infrastructure services, both in the Cloud and On-premises, companies indicate a broad desire to upgrade their capabilities over the next two years. Additionally, On-premises virtualization – the dominant platform for IT infrastructure today – will be supplanted by a combination of Internal and Hosted Private Clouds, often supported by next-gen containerization technologies. While very few companies expect to be running entirely on Public Cloud by the end of the decade, a combination of Private and Public Cloud infrastructures (supporting production workloads), along with publicly available SaaS solutions (across an array of functional domains), will increasingly become the norm.

These are just a few of the conclusions from Saugatuck’s 73-page Strategic Report, released earlier today (Next-gen IT – Cloud on the March, 1553SSR, 02Apr2015). The report leverages a global web survey of 327 senior IT executives across major geographic regions and business sizes. Continue reading Cloud Infrastructure Survey: New Report Findings

Journey to Cloud Starts with a Single Step

IT executives increasingly recognize that Cloud alternatives, unlike simple infrastructure technology refreshes, can be a key component of a new IT infrastructure that delivers both cost savings and improved service capabilities. However, to achieve these benefits, the IT organization must ensure that the new infrastructure addresses business requirements, rather than simply implementing the latest IT fad.

In a recently published Strategic Perspective, Saugatuck explains that implementing the right infrastructure upgrade depends on fully understanding future business requirements. Figure 1 helps visualize the linkage from business objectives, through business strategies, to business applications requirements, and ultimately to IT infrastructure requirements.

Figure 1: Linkage Between Business Objectives and IT Infrastructure Requirements

Source: Saugatuck Technology Inc. Continue reading Journey to Cloud Starts with a Single Step

SIIA in Boston – Deciphering Data and Analytics for ISV Business

What is Happening?

ISVs can, should, and do profit from the use of advanced data analytics – not only by integrating analytics within software and services offerings, but more importantly, by integrating an increasing range and scope of data (including Big Data) and analytics into their own business operations and decision making. Data regarding user behavior, operational efficiencies, and relationship management can and should be analyzed to help determine and take advantage of customer/buyer desires and needs, as well as competitive abilities, solution improvements, development strategy, upsell/cross-sell opportunities, pricing, business models, and hiring/retaining the most useful employees.

These were among the lessons reported by Saugatuck Research Fellow Bruce Guptill, who had the pleasure of attending and participating in this week’s “Deciphering the Data Storm” event, presented in Boston by the Software & Services division of the Software and Information Industry Association (SIIA).

Key lessons learned and reported by ISVs regarding the analysis and application of a wide range of business data (including Big Data) include the following:

  • Data needs “gravity” in order to be useful; i.e., data needs varying combinations of human business context, situational relevance, and environmental semantics (“the voice of the author”) in order to be qualified, let alone useful in analysis.
  • Don’t always focus on reducing or limiting the “bigness” of data. Adding to or augmenting data with similar, complementary, and relevant data can provide and improve the “gravity” of that data. The key information sought may not be found completely in your own data. That being said, don’t be afraid to apply a variety of filters to screen Big Data; just be willing to accept failure and move on quickly when the filtering doesn’t work as expected.
  • Share data in common to improve collaboration. “Success” is defined differently everywhere, even within small ISVs. Utilizing common sets of data has more beneficial impact, and enables more and better business collaboration, than trying to develop and focus on a “single version of the truth.” Different groups will always have different perspectives and use data in different ways; ensuring that the data used is common rather than simply absolute will enable better understanding and foster more (and more useful) interaction.
  • Know what the next step is. In other words, set realistic business goals beyond simply analyzing data. Once deeply into the analysis, it is easy to lose sight of the business reasons behind it. And as more data becomes readily available from more sources, it becomes easier and easier to become overwhelmed.

Continue reading SIIA in Boston – Deciphering Data and Analytics for ISV Business

Finance Systems, Cloud, and Uncertainty

Finance and IT organizations are the two user-enterprise power centers most likely to benefit from, and change because of, the increasing shift to Cloud-catalyzed digital business. A new Strategic Perspective published for Saugatuck subscription research clients continues our look into not only what is changing, but how, when, and why, as the Finance side of the house continues its journey through strategic and operational transformation – specifically, how Finance management systems are expected to be deployed, and when. The report builds on data and insight developed from our recent global Cloud Finance survey. The full survey findings – summarized in our Strategic Research Report entitled “Cloud Financials – The Third Wave Emerges” (1492SSR, 18Dec2014) – drew on 317 responses from senior Finance and IT executives, all from North American enterprises. For the purposes of this analysis, we focus on the 162 responses from senior Finance executives.

First, we asked the following question (bear with us, as this includes five sub-responses as bulleted below): Continue reading Finance Systems, Cloud, and Uncertainty