MuleSoft Connect: Integration and the IoT

What is Happening?           

This week, Saugatuck attended MuleSoft Connect, the company’s annual customer event. Attendance was up from around 700 partner and customer attendees in 2014 to over 1,000 this year. The growth of the event also heralds the growth of the integration industry – from MuleSoft’s initial forays into the iPaaS market, connecting SaaS applications, to a broader platform that helps enable the API Economy.

A key component of the API Economy – one that was center-stage at this year’s event – is the rise of the Internet of Things (IoT). The IoT presents key architectural and integration challenges for companies trying to take advantage of new capabilities. Devices, services, and analytics systems all need to be carefully connected to create a multi-part value chain. MuleSoft had several demonstrations of complex, multi-device integrations into single applications or workflows. While these were clearly demonstrations, the underlying capabilities are becoming increasingly critical for many businesses (see IoT Platforms).

MuleSoft used its platform in several demonstrations: controlling small robots from web applications; complex electronic music integrations that combined touch, image, and sound sensors with Philips Hue lightbulbs and analog guitar input; and an on-stage demonstration by CTO Uri Sarid that combined flight arrival APIs with hotel check-in and Uber request APIs to show a real-life case of connected APIs and devices.
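The chaining logic behind that last demonstration is easier to see in code. The sketch below is illustrative only – Mule flows are normally assembled in MuleSoft’s platform rather than hand-coded, and every endpoint URL and response field here is a hypothetical stand-in:

```python
import requests

# Hypothetical endpoints -- stand-ins, not the APIs used in the demo.
FLIGHTS_API = "https://api.example-airline.com/arrivals"
HOTEL_API = "https://api.example-hotel.com/checkin"
RIDES_API = "https://api.example-rides.com/requests"

def on_flight_arrival(flight_number: str, traveler_id: str) -> None:
    """Chain three independent APIs into a single workflow."""
    # 1. Poll the flight-status API for the actual arrival details.
    arrival = requests.get(FLIGHTS_API, params={"flight": flight_number}).json()
    if arrival.get("status") != "landed":
        return  # nothing to do until the plane is on the ground

    # 2. Trigger hotel check-in as soon as the flight lands.
    requests.post(HOTEL_API, json={"traveler": traveler_id,
                                   "eta": arrival["gate_time"]})

    # 3. Request a ride from the airport to the hotel.
    requests.post(RIDES_API, json={"traveler": traveler_id,
                                   "pickup": arrival["airport"]})
```

The value is not in any single call but in the orchestration: each API is independent, and the integration layer supplies the sequencing and the data hand-offs between them.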

Why is it Happening?

IoT devices are proliferating at a rapid rate, and at the same time, the lines between devices, services, apps, and data are blurring. One important trend, which was highlighted at the event, is that the devices themselves are becoming smarter. While connectivity is a given for devices in the IoT, it is increasingly clear that devices will become more than connected sensors, growing more autonomous and intelligent. While the bulk of devices available now do little processing on-board, instead relying on the cloud for data processing, the possibilities for more robust on-device analytics, processing, and communications are imminent.
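As a rough sketch of what “smarter” implies architecturally – filtering and summarizing on the device so that only meaningful events reach the cloud – consider the following; the sensor window, threshold, and cloud endpoint are all illustrative assumptions:

```python
import statistics

import requests

CLOUD_ENDPOINT = "https://ingest.example-iot-cloud.com/events"  # hypothetical

def process_window(readings: list[float], threshold: float = 2.0) -> None:
    """On-device aggregation: send a summary, not the raw sensor stream."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    # Only anomalous readings are pushed to the cloud; normal windows are
    # reduced to local summaries, cutting bandwidth and cloud-side load.
    anomalies = [r for r in readings if abs(r - mean) > threshold * stdev]
    if anomalies:
        requests.post(CLOUD_ENDPOINT, json={"mean": mean,
                                            "stdev": stdev,
                                            "anomalies": anomalies})
```

Continue reading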

Billion-dollar EMC Virtustream Deal as a Tipping Point

It’s a compelling sign of the times when a $1.2B acquisition of a Cloud-based services provider gets little more than a de facto “well, that makes sense” from the IT marketplace. But those are the times in which we live, and that’s what’s happened so far as a result of storage Master Brand EMC buying Virtustream.

Not very long ago, Wall Street pundits and industry analysts would be scrutinizing the deal in very granular fashion: raising questions over the fact that Virtustream is a Cloud-based services provider while EMC is a very traditional storage and storage management vendor, asking where the disconnects will be, examining the cultures for fit and friction, and offering opinions on how this will or will not “disrupt” somebody’s business.

Today, it’s seen as a significant investment to be sure, and one that will greatly increase both EMC’s portfolio and its value to customers and partners. But the billion-dollar-plus valuation barely raises an eyebrow, the Cloud-first nature of Virtustream’s business raises none, and there’s scant reference to any disruption resulting from the deal.

There have been other, similar, and fairly recent acquisitions of Cloud-first providers by traditional IT Master Brands. Cisco, Dell, HP, IBM, Microsoft, Oracle, SAP and other major, influential, and mostly traditional IT brand names have all made similar or even larger investments/acquisitions since 2010. And several if not many of these deals have disrupted the acquirers and their customers and partners.

But we have passed a tipping point when the combination of billion-dollar deal size and the Cloud-first nature of the deal is nothing more than somewhat noteworthy for a major, traditional IT firm. Cloud is now an expected and natural aspect of 10-figure IT vendor deals.

To paraphrase Saugatuck founder and CEO Bill McNee, we are well past the “Cloud Experiment” stage, and well into integrating Cloud into everyday business – regardless of whether we’re using, developing, or selling Cloud-based business IT. We’re living, breathing, and working Cloud everywhere all the time, which means that it’s getting harder to consider the Cloud-first nature of any business IT or provider as a disruptor in and of itself.

Which, to me, raises another question: What’s the next major IT business disruptor?

What’s the Risk?

What’s the risk? Without a solid risk management program in place, the IT security function risks making itself even more irrelevant to the business of the enterprise as the organization transitions to digital business. The less the function assesses the potential for damage to the organization, gauges when such damage is most likely, and communicates this simply to business stakeholders, the more likely it will be relegated to the role of plumber: necessary when needed, but otherwise not called upon.
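A minimal way to make such an assessment communicable is the familiar likelihood-times-impact score, sketched below; the 1-5 scales and the band boundaries are illustrative assumptions, not a prescribed framework:

```python
def risk_summary(likelihood: int, impact: int) -> str:
    """Translate a 1-5 likelihood and a 1-5 business impact into plain language.

    The scales and band boundaries here are illustrative assumptions.
    """
    score = likelihood * impact  # the classic risk-matrix product
    if score >= 15:
        return f"HIGH ({score}/25): act now and brief executives"
    if score >= 8:
        return f"MEDIUM ({score}/25): plan mitigation this quarter"
    return f"LOW ({score}/25): monitor and revisit"

# Example: a fairly likely incident with moderate business damage.
print(risk_summary(likelihood=4, impact=3))  # MEDIUM (12/25): ...
```

The point is not the arithmetic but the translation: business stakeholders get a ranked, plainly worded answer rather than a raw vulnerability list. Continue reading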

Epicor Insights 2015 – Customers and Epicor Reboot for Future

What is Happening?

Three days of interaction with Epicor executives, customers, and partners at the Insights 2015 event indicate the company and customers are both in the midst of substantial business change and re-invention / realignment. The overall sense is that of customers and Epicor both rethinking how they do business, with each working to not outpace the other, and finding ways to build a solid future out of a somewhat-dispersed present.

For customers, this includes coming to grips with combined technological and generational changes – practically a “structural break,” defined by economists as “an unexpected shift in a macroeconomic time series.” In other words, business is changing faster than most feel they can manage effectively (kudos to Epicor EVP Craig McCollum for the conceptual mention in his Wednesday keynote).

For Epicor, the changes are more readily visible, including the following:

  • A massive business restructuring that includes consolidation of formerly disparate silos that resulted from a longtime acquisition-driven growth strategy;
  • A new, more focused and aligned senior management team, including a new CTO, CIO, Chief Product Executive, EVPs of Americas and Support, and more;
  • Spinning off Retail Systems Group – a significant legacy presence and revenue source that simply did not sit well on Epicor’s core ERP foundation;
  • Re-engineering to embed consistent technology, code, UI/UX, and approaches / methodologies within and across all remaining product lines; and
  • Implementing common methodologies across all business and product planning, sales, support, marketing, and communication.

Epicor in many ways is following the Saugatuck playbook in re-inventing a traditional ISV business for a Cloud-first future, while its customers work to rethink and in many cases reboot their own businesses to compete and grow in changing marketplaces, with all laying groundwork for an obviously different future with a still-indeterminate arrival timeframe. Continue reading

News from Notables at Internet of Things World

The second annual Internet of Things World 2015 conference was held last week in San Francisco. The conference hosted 4,000 attendees, 250 speakers, and 150 exhibitors. The focus for 2015 was on creating Internet of Things (IoT) partnerships and developing ecosystems to monetize the IoT service vision. Two notable providers made significant announcements touting their IoT offerings, successes, and strategies.

Samsung introduced ARTIK, a new chip technology for IoT devices. ARTIK features security, a small form factor, advanced software, and low power use, said the company. ARTIK includes an IoT software stack and tools for product development. ARTIK is also compatible with Arduino, the open source electronics platform. Samsung believes its manufacturing capabilities, combined with its IoT ecosystem, mean product developers can quickly get their ARTIK-based solutions to market.

Samsung also announced SmartThings Open Cloud, a new open software and data exchange cloud. Open Cloud is powered by SAMI, Samsung’s platform for open exchange of diverse data from any source. Open Cloud supports digital appliances and consumer electronics products. Samsung ships about 660 million devices each year, including electronics and appliances. The company wants to connect 90% of those devices to the Internet by 2017.

Qualcomm, via its wireless-focused subsidiary Qualcomm Atheros, debuted new and smarter Wi-Fi chip solutions to connect IoT devices and applications. The new products are compatible with the AllJoyn software framework from the AllSeen Alliance. The company also announced six new integrations affecting the IoT. The integrations address security, performance management, cloud connect services, code generation, and analytics.

Qualcomm sees the IoT as a critical growth market; its research shows that 5 billion connected devices that are not smartphones will ship by 2018. Its chips are already in 20 million automobiles and more than 120 million home devices. Qualcomm also supports several smart city projects. And the company claims more than $1 billion in revenue from chips for IoT devices in its last fiscal year that ended in September 2014.

The Samsung and Qualcomm IoT announcements reinforce Saugatuck’s expectation that IoT ecosystems and interoperability will be central to provider strategies (see 1564SSR, Making Sense of the Internet of Things: What’s a Leader to Do?, 23Apr2015). The growth of the IoT will depend on such strategies. Improved interoperability should follow the implementations of the new products and integrations from these two providers, as well as others.

NOTE: this Lens360 blog post was originally published online by Saugatuck at http://blog.saugatucktechnology.com.

Continue reading

Enterprise Architecture: Everything Old is New Again

Enterprise Architecture (EA) has existed since John Zachman identified the concept in 1987. It has become incorporated within the IT department and claims a significant role in the data processing mission. Yet EA has struggled to be understood, to be fully useful, and to be incorporated into the rest of the data processing environment.

As we enter a new age, it is clear that the principles underlying EA will become even more important. It is essential that we understand the data flows and how systems are interconnected, because this is the basis for every kind of evaluation – from security to audits of transparency and optimization of process performance across IT. But one of the keys to the current environment is the development of Big Data analytics and the possibility of applying this kind of analysis to create a more automated approach to modeling.
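One way such automated modeling could work is to mine integration or event logs for system-to-system calls and assemble them into a dependency graph that EA tooling can then analyze. The sketch below assumes a simplified “source -> target” log format; everything in it is an illustrative stand-in:

```python
from collections import defaultdict

# Assumed log format: one "source_system -> target_system" line per
# observed integration event (illustrative, not a real log schema).
log_lines = [
    "crm -> billing",
    "billing -> ledger",
    "crm -> ledger",
    "web -> crm",
]

def build_dependency_graph(lines: list[str]) -> dict[str, set[str]]:
    """Derive a system-interconnection model directly from observed traffic."""
    graph: dict[str, set[str]] = defaultdict(set)
    for line in lines:
        source, target = (part.strip() for part in line.split("->"))
        graph[source].add(target)
    return graph

# The resulting graph is raw material for EA views: impact analysis,
# security review, and process-performance optimization across IT.
for system, targets in sorted(build_dependency_graph(log_lines).items()):
    print(f"{system} feeds: {', '.join(sorted(targets))}")
```

Continue reading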

Infrastructure Selection Criteria: Increasing Importance of Managed Services

What is Happening?          

Saugatuck recently released the findings from its 2015 Cloud Infrastructure Survey. The data clearly shows that IT infrastructures are transitioning rapidly from traditional On-premises resources to a range of alternatives – including Internal Private Cloud, Hosted Private Cloud, Public Cloud and Hybrid (On-premises + Public Cloud). The infrastructure transition is both the result of and the enabler of increasing migration of conventional On-premises production workloads to the Cloud, and the deployment of new workloads designed for Cloud functionality.

The research also shows that as usage of Cloud-based offerings broadens and deepens, the evaluation and selection criteria used by IT organizations will evolve – including recognition that Managed Services will play an important role. The new criteria will be dictated by a combination of factors: organizational capabilities, workload characteristics, and workload criticality. These are a few of the findings from our global web survey of 327 IT executives spanning major geographies and business sizes. This Research Alert focuses on the evolving criteria for selection of Cloud offerings.

Why is it Happening?

Many IT executives increasingly recognize that the inefficiencies and the inertia inherent in their traditional infrastructure are inhibitors to the business. Thus, even as they strive to modernize applications with new capabilities such as mobility and analytics, they are adopting more agile infrastructure alternatives. In addition, the majority of IT executives recognize the value of transitioning existing support-oriented IT organizations toward Collaborative and Innovative organizations that are more focused on “serving the business” than on managing and operating IT assets.

Market Impact

In an earlier Research Alert (Evolving Infrastructure Profiles – The Shift to the Cloud Accelerates, 1543RA, 13March2015 – Click Here for Lens360 blog version), we highlighted the transition from mostly Virtualized Infrastructures toward Cloud alternatives. The survey also revealed that respondents expect the profiles of their IT organizations to evolve from mostly Supportive to mostly Innovative over the next four years. Specifically:

  • While the traditional IT organization profile (i.e., Supportive) is the most common today, by 2019 it shrinks from 41 percent to only 12 percent. The majority of this transition is due to interim shifts to Proactive and Collaborative, followed by a shift to Innovative.

Figure 1: Current and Projected IT Organization Profiles


Source: Saugatuck Technology Inc. Cloud Infrastructure Survey, April 2015, n=327 (global) Continue reading

Scaling Beyond The Second Dimension: Re-visualizing Boundary-free Enterprise™

As a pioneer of the concept and theme of “free range” business IT and “boundary-free” business within and between enterprises, Saugatuck developed some of the earliest architectural, cost, and IT management models, which have since become more and more widely used.

But just as Clouds change shape, content, consistency, and output over time, so must the nature and visualization of Cloud-based business IT – especially given the accelerated adoption and expansion of more forms of Digital Business that make use of multiple systems, groups, functions, and data that previously did not intersect or interact. In short, we have to be able to accurately visualize what’s happening in order to have any chance of managing it and sustaining it as a business resource. Continue reading

IoT Platforms

The Internet of Things (IoT) breeds scale and complexity. Large scale and complexity draw in enterprise providers. To solve such challenges, enterprise providers announce, build, and manage technology platforms. Yet the concept of “platform” is often confusing, meaning different things to different people. And IoT value to businesses is delivered in applications, not platforms.

No longer is it sufficient to provide a partial, hodgepodge solution. Enterprise providers feel pressure to give buyers the whole enchilada to build, deploy, and manage the IoT; the industry calls these offerings “platforms.” Every new IT initiative leads to new enterprise platforms; the emergence of the IoT is no different. The hype around the IoT is unrelenting, which means a low signal-to-noise ratio. The latest noise is from platform announcements. Some IoT platform initiatives appear to reflect actual application requirements, while others look like repackaging of existing solutions. Sorting through the hype is difficult not only for buyers but also for providers helping their customers plan IoT initiatives. No single platform completely answers the call; most buyers need an ecosystem of providers to match their requirements.

Providers are therefore hedging their bets – forming and joining IoT alliances while establishing their own platforms. We recently discussed market confusion involving the many enterprise providers addressing the IoT. Some of those, including Cisco, GE, HP, Intel, Oracle, Qualcomm, and others now espouse their own IoT platforms (1564SSR, Making Sense of the Internet of Things: What’s a Leader to Do?, 23Apr2015). Yet in the last several weeks, the elusive all-inclusive platform for development, deployment, and maintenance seems to be the focus of many IoT announcements. Continue reading

ISPs Act to Block Title II Reclassification in Net Neutrality Rules

On May 1, AT&T, CenturyLink, and U.S. telecom and cable industry groups petitioned the FCC to block parts of the new Net Neutrality rules, citing “crushing” compliance costs and threats to investment. The ruling will go into effect June 12, 2015 unless the FCC or a court grants the motion to stay (or delay) it.

The request objects to subjecting the broadband carriers to common carrier duties under Title II of the Communications Act of 1934. While the Title II changes are part of the new Net Neutrality rules from the FCC, the request from ISPs specifically targeted Title II reclassification. Petitioners did not seek a stay of the other key Net Neutrality rules: no blocking, no throttling, no paid prioritization. This reveals an important element of their legal strategy. The Internet providers see the new FCC rules as a house of cards with Title II classification as its foundation.

As we shared earlier this year (see Net Neutrality – Enjoy the Media Circus, Hurry Up and Wait for Real Change, 20Feb2015, 1530RA – see Lens360 blog post version), the consequences of the FCC reclassifying broadband under Title II are unclear. The agency would need to use forbearance and waive certain processes that don’t apply to the Internet. Opponents assert that the change would slow innovation on the Internet, resulting in a regulated “innovation by permission” situation. Providers assert that Title II carries a lot of baggage as a regulatory option, with a risk of forcing other forms of transmission to also fall under its classification. In essence, opponents say, “if it’s not broken, don’t fix it.” Continue reading