

@CloudExpo: Blog Post

Cloud and the Consumerization of IT

The cloud gives the consumerization of IT a whole new meaning

The cloud essentially "consumerizes" all of IT, not just relatively unimportant bits like procuring personal hardware and software. This requires a wholesale rethinking of corporate IT, because the idea of any master design becomes unattainable. How can IT as a species survive a trend that may render the education of a whole generation of IT professionals irrelevant? On the brighter side, it caters perfectly to the talents of today's teenagers: consumption as a lifestyle.

The idea of consumerization - users being allowed to freely procure their own personal hardware and software - has been around for a while. But few CIOs, and even fewer heads of IT Operations, have embraced it. Other than some token adoption - letting users choose between an iPhone and a BlackBerry, or giving them a personal budget to order from a company-supplied catalog of pre-approved hardware - we see little uptake of the concept. The real idea is that users can go to any consumer store or webshop, order any gadget they like - an iPad, a laptop, a printer, a smartphone - and configure it, basically while still in the store, to access their corporate mail, intranet and company applications. The idea originated when people wanted to use their 24-inch HD PC with four processors and mega memory - essential for enjoying modern home entertainment and video, and far superior to company standard-issue equipment - to also do some work.

Cloud computing now makes such a consumer approach possible at the departmental level as well. Departments selecting and using SaaS-based CRM applications that are not corporate-approved or endorsed are the most commonly cited example. But more interesting are the cases where departments - tired of waiting for their turn in the never-shrinking application backlog of corporate IT - turned to a system integrator to build a custom cloud application to meet their immediate needs. Several system integrators indicate that in more and more of their projects it is no longer IT but the business department that is their prime customer. Contracts, SLAs and even integrations are negotiated directly between the SI and the business department; in some cases IT is not even involved or aware.

Now this is not a new phenomenon. We saw the exact same thing when PCs and departmental servers were introduced. Departments went off on their own and bought "solutions" from vendors popping up like the proverbial mushrooms and often disappearing just as quickly afterwards (remember Datapoint, Wang, Digital? And those were the ones that lasted). Guess who the business expected to clean up (integrate) the mess they left behind? Yes, the same IT departments they bypassed in the first place. One may even argue that if IT had not been so busy cleaning up this mess over the last 15 years, it would have had a much better chance at building an integrated solution that actually met the business's needs. I am not of that opinion. With ERP we got this chance (and the associated billions) and still did not manage to keep up with the requirements. Some things are just too vast, too complex, or simply change too fast to be captured in any master design.

So back to consumerization. Although the trend has been far from wholeheartedly embraced by most corporate IT, it is continuing. In my direct environment I see several people who, instead of plugging their laptop into the corporate network at the office, take a 3G network stick to work. For around 20 euros a month this gives them better performance accessing the applications they care about, not to mention access to applications most corporate IT departments do not care for, like Facebook and Twitter. The question, of course, is: can they do their work like that? Don't they need all-day, full-time access to the aforementioned fully vertically integrated ERP system? The answer is no. First of all, the vertically integrated type of enterprise that ERP was intended for no longer exists. Most corporations have taken to outsourcing distribution to DHL or TNT, employee travel to the likes of American Express, HR payroll and expenses to XYZ, etc. The list goes on and on.

All these external service providers support these services with web-based systems that can be accessed from anywhere, inside and outside the company firewall. At the same time, the remaining processes that occur in the corporate ERP system are so integrated that they hardly require any manual intervention from employees. Consequently, employees don't need to spend their time doing data entry, or even data updates or analysis, on that system. Any remaining required interaction is handled by interfacing directly with the customer via the web shop or other web-based systems. One could say the world has moved from vertically integrated manufacturing corporations to supply-chain-connected extended enterprises.

The question I will address in my next post is how the cloud's enabling of consumerization for enterprise applications plays a role in this, and what it means for IT moving forward.

On the supply side of IT, it means applications are best delivered as easily consumable services to employees and others (partners, customers, suppliers). One large European multinational is already delivering all its new applications as internet (not intranet) applications, meaning any application can be accessed from anywhere by simply entering a URL and authenticating properly. Choosing which applications to provide internally comes down to whether outside parties are willing and able to provide these services, or whether the company can gain a distinct advantage by providing the service itself.
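As a hypothetical sketch of what "a URL plus proper authentication" might boil down to, the following assumes a shared-secret token scheme; the secret, names and scheme are illustrative, not the multinational's actual setup:

```python
import hashlib
import hmac

# Illustrative only: an internet-facing application admits any request
# carrying a valid signed token -- no VPN or intranet membership required.
SECRET = b"demo-shared-secret"  # assumption: real deployments use managed keys

def sign(user: str) -> str:
    """Issue a token for a user, e.g. after single sign-on."""
    return user + ":" + hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()

def authorized(token: str) -> bool:
    """Check a token presented alongside a URL request."""
    user, _, digest = token.partition(":")
    expected = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(digest, expected)
```

The point is that the perimeter moves from the network to the credential: any device from any store can consume the service, provided it can present a valid token.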

When speaking about consuming services, one should try to think broader than just IT services. The head of distribution may be looking for a parcel-tracking system, but the CEO or COO is more likely to think of a service in terms of something a DHL or TNT delivers: warehousing and distribution, but also complaint tracking, returns and repairs, or even accounting, marketing and reselling - all including the associated IT parts of those services. It is the idea of Everything as a Service, but on steroids (XaaSoS). Please note that even when an organization decides to provide one of these services internally, it can still source the underlying infrastructure and even applications "as a service" externally (strangely enough, this last scenario is what many an IT person seems to think of exclusively when discussing cloud computing).

On the demand side of IT, the issue is an altogether different one. How do we warrant continuity, efficiency and compliance in such a consumption-oriented IT world? If it is every man (or department) for themselves, how do we prevent suboptimization? In fact, how do we even know what is going on in the first place? How do we know what services are being consumed? This is the new challenge, and it is very similar to what companies faced when they decided to no longer manufacture everything themselves, abandoning vertical integration where it made sense and taking a "supply chain" approach. Cloud computing is in many aspects a similar movement, and here too a supply chain approach looks like the way to go.

Such a supply chain approach means thoroughly understanding both demand and supply, matching the two, and making sure that the goods - or in this case services - reach the right audience at the right time (on demand). IT has invested a fair amount of time and effort in better ways and methodologies to understand demand. On the supply side, IT until now assumed it was the supplier. In that role it used industry analysts to classify the components required, such as hardware and software. In this new world, IT needs to start thoroughly understanding the full services that are available on the market. An interesting effort worth mentioning here is the SMI (Service Measurement Index), an approach to classifying cloud services co-initiated by my employer, CA Technologies, and led by Carnegie Mellon University.
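To make the classification idea concrete, here is a minimal sketch of SMI-style scoring. The category names follow commonly cited SMI areas, but the 0-10 scale, the weights and the function itself are illustrative assumptions, not the official index:

```python
# Illustrative SMI-style weighted scoring; the categories follow commonly
# cited SMI areas, but the scale and weighting scheme are assumptions.
SMI_CATEGORIES = ("accountability", "agility", "assurance", "cost",
                  "performance", "security and privacy", "usability")

def smi_score(ratings: dict, weights: dict) -> float:
    """Weighted average of per-category ratings (0-10 scale assumed)."""
    total = sum(weights.get(c, 0) for c in SMI_CATEGORIES)
    if total == 0:
        return 0.0
    return sum(ratings.get(c, 0) * weights.get(c, 0)
               for c in SMI_CATEGORIES) / total
```

A buyer could then rank competing cloud services under its own weighting - a bank weighting assurance heavily, a startup weighting cost - which is exactly the kind of market-wide comparison IT has never had to make before.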

After having gained an understanding of both demand and supply, the remaining task is "connecting the dots". This sounds trivial, but it is an activity that analysts estimate will become a multi-billion-dollar industry within just a few years. It includes non-trivial tasks like identifying which users are allowed to do which tasks in this now open environment, and optimizing processes by picking the resources that have the lowest utilization and thus the lowest cost - because going forward, scarcity will determine price, especially in the new cloud world (which resembles Adam Smith's idea of a perfect open market a lot more closely than any internal IT department ever did or will).
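The two "connecting the dots" tasks named above - entitlements and cost-aware routing - could be sketched as follows. The role names, the data shapes, and the assumption that price tracks utilization are all hypothetical:

```python
# Hypothetical sketch: entitlement check plus cost-aware selection.
# Assumes price tracks scarcity, so lowest utilization means lowest cost.
ENTITLEMENTS = {"analyst": {"reporting", "crm"}}  # illustrative role map

def allowed(role: str, service: str) -> bool:
    """May users in this role consume this service?"""
    return service in ENTITLEMENTS.get(role, set())

def cheapest_provider(providers: list) -> dict:
    """Route work to the least-utilized (and so assumed cheapest) supplier."""
    return min(providers, key=lambda p: p["utilization"])
```

In a real open cloud market the second function would query live spot prices rather than infer them from utilization, but the shape of the problem - policy first, then price - stays the same.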

Now of course all of the above won't happen overnight. Many a reader (and with a little luck the author) will have retired by the time today's vertically integrated systems - many of which are several decades old and based on solid, reliable mainframes - have become services brokered in an open cloud market. A couple of high-profile outages may even prolong this by a generation or two. But long term I see no other way. Other markets (electricity, electronics, publishing and even healthcare) have taken or are taking the same path. It is the era of consumption.

PS Short term, however, the thing we (IT) probably need most is a new diagramming technique. Why? From the above it will be clear that - in such a consumerized world - static architecture diagrams are a thing of the past. And an IT person without a diagram is like a fish out of water. We need something that allows us to evolve our IT fins into feet and our IT gills into lungs, so we can transition from water to land and not become extinct in the process. One essential aspect: unlike pictures of clouds and very much like real clouds, the diagrams will need to be able to change dynamically, much like the pictures in a Harry Potter movie (it's magic). Who has a suggestion for such a technique?

[Editorial note: This blog was originally published on May 31, 2010]


