As a pioneer of the concept and theme of “free range” business IT and “boundary-free” business within and between enterprises, Saugatuck developed some of the early architectural, cost, and IT management models that have since become increasingly widely used.
But just as Clouds change shape, content, consistency and output over time, so must the nature and visualization of Cloud-based business IT – especially given the accelerated adoption and expansion of more forms of Digital Business that make use of multiple systems, groups, functions, and data that previously did not intersect or interact. In short, we have to be able to accurately visualize what’s happening in order to have any chance of managing it and sustaining it as a business resource.
On May 1, AT&T, CenturyLink, as well as U.S. telecom and cable industry groups petitioned the FCC to block parts of new Net Neutrality rules. They cited “crushing” compliance costs and threats to investment. The FCC ruling will go into effect June 12, 2015 unless the FCC or a court grants the motion to stay (or delay) the ruling.
The request objects to subjecting the broadband carriers to common carrier duties under Title II of the Communications Act of 1934. While the Title II changes are part of the new Net Neutrality rules from the FCC, the request from ISPs specifically targeted Title II reclassification. Petitioners did not seek a stay of the other key Net Neutrality rules: no blocking, no throttling, no paid prioritization. This reveals an important element of their legal strategy. The Internet providers see the new FCC rules as a house of cards with Title II classification as its foundation.
As we shared earlier this year (see Net Neutrality – Enjoy the Media Circus, Hurry Up and Wait for Real Change, 20Feb2015, 1530RA – see Lens360 blog post version), the consequences of the FCC reclassifying broadband under Title II are unclear. The agency would need to use forbearance to waive certain processes that don’t apply to the Internet. Opponents assert that the change would slow innovation on the Internet, resulting in a regulated “innovation by permission” regime. Providers assert that Title II carries a lot of regulatory baggage, with a risk of forcing other forms of transmission to also fall under its classification. In essence, opponents say, “if it’s not broken, don’t fix it.”
What is Happening?
Yesterday, Saugatuck attended the AWS Summit 2015 in San Francisco, where Amazon gave an update on its business and released several new products to the nearly 10,000 attendees at the Moscone Center and the 7,000 who watched the livestream. Andy Jassy, SVP of Web Services at Amazon, kicked off the keynote by highlighting some key statistics about the business: from 4Q13 to 4Q14, AWS experienced 103 percent year-over-year growth in the amount of data transferred into and out of S3 (Simple Storage Service) and 93 percent growth in the use of its compute service, EC2. AWS now has over 1 million active users who have used the service in the last month.
Jassy brought several companies out to discuss the value of the AWS Cloud infrastructure. Jason Kilar, the founding CEO of Hulu and now CEO of Vessel, a startup focused on video sharing and consumption, highlighted the ability to keep his team small and focus on the business without having to build infrastructure. Wilf Russel, VP of Digital Technology Development at Nike, described how the Cloud has fundamentally changed their application architecture and described their shift toward DevOps and Microservices. Valentini Volonghi, CTO at AdRoll, discussed how the Cloud gave their business the reach to reduce latency by distributing their app around the globe. And Colin Bodell, CTO & EVP at Time Inc., described migrating all of Time’s datacenters to AWS, noting that in the UK they cut their monthly datacenter run rate from $70,000 to $17,000.
Finally, Jassy used the opportunity to make several product announcements:
- Amazon Elastic File System (EFS) – a fully managed filesystem that can span multiple EC2 instances to enable multiple concurrent and scalable connections to a single file directory.
- Amazon Machine Learning Service – a one-size-fits-all service that enables non-experts to apply Machine Learning algorithms to their data sets or within their applications. It offers a simple API for training and modeling calculations, which can then be called to perform specific machine learning tasks.
- Amazon Workspaces and AWS Marketplace for Desktop Apps – an Amazon VDI product to enable companies to create virtual desktops, as well as purchase, manage and provision the software for those desktop applications.
- The GA release of the Amazon EC2 Container Service – The Amazon tool for deploying, managing, updating, and running Docker clusters.
- The GA release of AWS Lambda – a service designed to perform trigger / event-driven compute in small doses. It provides a way to perform small, scripted tasks in real time when triggers are initiated. Amazon highlighted its use in sending notifications, indexing, IoT, and as a serverless mobile backend. At present, Lambda only supports Node.js, but Amazon is adding support for Java in the coming weeks.
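The trigger / event-driven pattern behind Lambda can be sketched in a few lines: small handler functions are registered against event sources and invoked only when an event fires. This is a minimal, language-agnostic illustration of the pattern, not the AWS Lambda API itself (event names and handlers below are invented for the example; Lambda handlers at the time had to be written in Node.js).

```python
# A sketch of trigger / event-driven compute: handlers are registered
# per event type and run only when an event fires. Illustrative names --
# this shows the pattern, not the actual AWS Lambda programming model.
from collections import defaultdict

handlers = defaultdict(list)

def on(event_type):
    """Register a small, scripted task against an event source."""
    def register(fn):
        handlers[event_type].append(fn)
        return fn
    return register

def fire(event_type, payload):
    """Invoke every handler registered for the event, as a trigger would."""
    return [fn(payload) for fn in handlers[event_type]]

@on("storage:ObjectCreated")
def index_document(event):
    # e.g. update a search index when a new object lands
    return f"indexed {event['key']}"

@on("storage:ObjectCreated")
def notify(event):
    # e.g. send a notification about the new object
    return f"notified about {event['key']}"

results = fire("storage:ObjectCreated", {"key": "reports/q1.csv"})
print(results)  # ['indexed reports/q1.csv', 'notified about reports/q1.csv']
```

The appeal is that no server sits idle between events: compute is consumed only for the moments each handler actually runs.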
What is Happening?
Earlier today, Saugatuck Technology released the findings from its just-completed Cloud Infrastructure Survey. The research and analysis clearly show that businesses are moving rapidly away from traditional On-premises systems toward a range of Cloud infrastructure alternatives – including Internal Private Cloud, Hosted Private Cloud, Public Cloud, and Hybrid (On-premises + Public Cloud). While CRM, HCM, and Marketing-based SaaS solutions have dominated early Cloud decisions and deployments – and more recently, Cloud-based Finance offerings have begun to gain traction – the growing migration of On-premises production workloads to the Cloud, along with the creation and deployment of Cloud-native production workloads, clearly shows that we are entering a new phase in the transition.
Across major infrastructure services, both in the Cloud and On-premises, companies indicate a broad desire to upgrade their capabilities over the next two years. Additionally, On-premises virtualization – the dominant platform for IT infrastructure today – will be supplanted by a combination of Internal and Hosted Private Clouds, often supported by next-gen containerization technologies. While very few companies expect to be running entirely on Public Cloud by the end of the decade, a combination of Private and Public Cloud infrastructures (supporting production workloads), along with publicly available SaaS solutions (across an array of functional domains), will increasingly become the norm.
These are just a few of the conclusions from Saugatuck’s 73-page Strategic Report, released earlier today (Next-gen IT – Cloud on the March, 1553SSR, 02Apr2015). The report draws on a global web survey of 327 senior IT executives across major geographic regions and business sizes.
IT executives are increasingly recognizing that Cloud alternatives, unlike infrastructure technology refreshes, can be a key component of a new IT infrastructure that provides both cost savings and capabilities for improved service. However, to achieve these benefits, the IT organization must ensure the new infrastructure addresses business requirements rather than simply implementing the latest IT fad.
In a recently published Strategic Perspective, Saugatuck explains that implementing the right infrastructure upgrade depends on fully understanding future business requirements. Figure 1 helps visualize the linkage from business objectives, through business strategies, to business applications requirements, and ultimately to IT infrastructure requirements.
Figure 1: Linkage Between Business Objectives and IT Infrastructure Requirements
Source: Saugatuck Technology Inc.
What is Happening?
ISVs can, should, and do profit from the use of advanced data analytics – not only by integrating them within software and services offerings, but more importantly, by integrating an increasing range and scope of data (including Big Data) and analytics into their own business operations and decision making. Data regarding user behavior, operational efficiencies, and relationship management can and should be analyzed to help determine and take advantage of customer / buyer desires and needs, as well as competitive abilities, solution improvements, development strategy, upsell / cross-sell opportunities, pricing, business models, and hiring / retaining the most useful employees.
These were among the lessons reported by Saugatuck Research Fellow Bruce Guptill, who had the pleasure of attending and participating in this week’s “Deciphering the Data Storm” event, presented in Boston by the Software & Services division of the Software and Information Industry Association (SIIA).
Key lessons learned and reported by ISVs regarding the analysis and application of a wide range of business data (including Big Data) include the following:
- Data needs “gravity” in order to be useful; i.e., data needs varying combinations of human business context, situational relevance, and environmental semantics (i.e., “the voice of the author”) in order to be qualified, let alone be useful in analysis.
- Don’t always focus on reducing / limiting the “bigness” of data. Adding to / augmenting data with similar, complementary, and relevant data can provide and improve the “gravity” of that data. The key information sought may not be found completely in your own data. That being said, don’t be afraid to apply a variety of filters to screen Big Data; just be willing to accept failure and move on quickly when the filtering doesn’t work as expected.
- Share data in common to improve collaboration. “Success” is defined differently everywhere, even within small ISVs. Utilizing common sets of data has more beneficial impact, and enables more and better business collaboration, than trying to develop and focus on a “single version of the truth.” Different groups will always have different perspectives, and use data in different ways; ensuring that the data used is common rather than simply absolute will enable better understanding, and foster more (and more useful) interaction.
- Know what the next step is. In other words, set realistic business goals beyond simply analyzing data. Once deep into the analysis, it’s easy to lose sight of the business reasons behind it. And as more data becomes readily available from more sources, it becomes ever easier to be overwhelmed.
Finance and IT organizations are the two user enterprise power centers most likely to benefit from, and change because of, the increasing shift to Cloud-catalyzed digital business. A new Strategic Perspective published for Saugatuck subscription research clients continues our look into not only what is changing, but how, when, and why, as the Finance side of the house continues its journey through strategic and operational transformation – specifically, how Finance management systems are expected to be deployed, and when. The report builds on data and insight developed from our recent global Cloud Finance survey. The full survey findings – summarized in our Strategic Research Report entitled “Cloud Financials – The Third Wave Emerges” (1492SSR, 18Dec2014) – included 317 responses from senior Finance and IT executives – all from North American enterprises. For the purposes of this analysis, we focus on the 162 senior Finance executive responses.
First, we asked the following question (bear with us, as this includes five sub-responses as bulleted below): Continue reading
What is Happening?
Microservices is an emerging architectural style designed to operate well in Cloud environments. It is often contrasted with traditional monolithic architectures: instead of a single cohesive application, individual services are developed separately and connected through interfaces – often RESTful APIs.
Because these APIs effectively abstract the inner workings of each service, Microservices can be developed using whatever languages and technologies best suit each service’s performance characteristics and requirements. This abstraction also allows a service to be upgraded under continuous development and deployment practices without interruption, as long as the interface does not change.
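That abstraction can be made concrete with a toy example: a tiny “pricing” service that hides its implementation behind a small RESTful interface, so a consumer depends only on the URL and JSON contract. This is a minimal sketch using Python’s standard library; the service name, routes, and data are invented for illustration.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical "pricing" microservice. Its internal data store (here,
# a plain dict) could be swapped out freely -- callers see only the API.
PRICES = {"widget": 9.99, "gadget": 24.50}

class PricingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        sku = self.path.strip("/")                     # e.g. GET /widget
        if sku in PRICES:
            body = json.dumps({"sku": sku, "price": PRICES[sku]}).encode()
            self.send_response(200)
        else:
            body = json.dumps({"error": "unknown sku"}).encode()
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):                      # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), PricingHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# A consumer -- another service, or a legacy application -- sees only
# the interface, never the implementation behind it.
url = f"http://127.0.0.1:{server.server_port}/widget"
with urllib.request.urlopen(url) as resp:
    quote = json.loads(resp.read())
print(quote)  # {'sku': 'widget', 'price': 9.99}
server.shutdown()
```

As long as the `GET /<sku>` contract holds, the service behind it can be rewritten in another language or backed by a different database without touching any consumer.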
Microservices architectures tend to scale well horizontally. Unlike Monolithic architectures, as load increases on any one service, it is possible to scale that service independently of the others. This allows better usage of resources, and also caters well to the Cloud, where it is possible to purchase very granular amounts of infrastructure for highly elastic and responsive scaling.
There are downsides, though – primarily increased development and operational complexity from maintaining individual services, interfaces, and scaling resources. Microservices necessitates high levels of automation, from rolling out updates and deploying new services to auto-scaling, load balancing, clustering, and fault tolerance. Thought must also be given to how applications handle degraded service, since when individual services fail (and they will), the entire application doesn’t necessarily fail with them.
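One common way to cope with that partial-failure reality is a fallback wrapper: when a dependent service times out or errors, the caller degrades gracefully instead of crashing. A minimal sketch, with invented service names and a deliberately failing dependency:

```python
# Graceful degradation sketch: wrap a call to a dependent service in a
# fallback so one failing service doesn't take down the whole application.
# Names and values are illustrative.

def call_recommendations(user_id):
    # Stand-in for a remote call that is currently failing.
    raise TimeoutError("recommendation service unavailable")

def with_fallback(call, fallback, *args):
    try:
        return call(*args)
    except Exception:
        return fallback          # degrade, don't crash

DEFAULT_RECS = ["bestsellers"]   # a safe, static default
recs = with_fallback(call_recommendations, DEFAULT_RECS, "user-42")
print(recs)  # ['bestsellers'] -- the page still renders without live recs
```

Production systems typically layer timeouts, retries, and circuit breakers on top of this basic idea.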
Finally, the Microservices approach plays well with legacy applications. In many cases, this is where Saugatuck expects to see the greatest enterprise adoption of Microservices. Because the services communicate through interfaces and are not bound to existing technologies, languages, or databases, they are well suited to being added onto existing systems when additional features, functions, or performance are needed. Examples might include simple analytics services for regressions, small webpage applets, or asynchronous notification processes.
Adoption of Cloud-based solutions is enabling and catalysing IT organizations to transform to new modes of operation with associated higher value propositions. In a recently published Strategic Perspective, Saugatuck defines four primary IT Organization Profiles:
- Supportive: These organizations collect, document, and analyze IT-oriented performance metrics and strive to react in a timely manner to new requirements from the business units.
- Proactive: These ensure delivery of services and strive to adopt new technologies.
- Collaborative: These collaborate with business units (BUs) to support new business processes in a timely fashion by facilitating use of new technologies.
- Innovative: These work in close partnership with BUs to innovate new business processes, products, and services.
What is Happening?
In October, IBM announced surprising and disappointing revenue and profit for 3Q2014. The stock market reacted, and many trade and financial publications and pundits were quick to criticize IBM as slow to capitalize on Cloud computing and the resulting spending shifts in both software and infrastructure. Thus, it was rather predictable that IBM has recently announced several billion-dollar deals with very large customers (e.g., “IBM Signs Multi-Billion Dollar Cloud Computing Deal with a Dutch Bank”).
However, some press and industry pundits were quick to claim that these deals are merely further indication that IBM is not winning in the new Cloud-oriented market. Saugatuck sees things a little differently, and interprets recent IBM media and press activity as representative of how IBM and other traditional vendors are evolving with the market – and of how the key to significant deals is expertise in helping large customers innovate and transform their businesses.