PaaS Deployment Models

The contained and referential approaches

Rapid deployment capability is table stakes for any PaaS solution. Every vendor touts it, and to be frank, every user simply expects it to be there. While it is interesting to talk about rapid deployment and perhaps compare the speed of one solution to that of another, I think it is far more interesting to talk about the mechanics of deployment for a particular solution. That is, the more interesting and important question is 'What deployment style does a particular solution take?'

At a very high level, I think two primary deployment styles permeate the PaaS landscape today: contained and referential. I want to compare the two approaches, but before that, let me use a few words to describe each style.

- Contained: In the contained deployment model, PaaS solutions deploy environments based on packages that contain most, if not all, of the desired configuration as well as the logic to apply that configuration. For instance, if a solution were to deploy a virtual image in the contained model, the virtual machine would have the necessary information and logic embedded to configure itself upon startup. It would not necessarily need to contact external systems or wait for instructions from other actors (see the sketch following this list).

- Referential: In the referential deployment model, PaaS solutions deploy environments using a minimal base package. At some point during the deployment process, the deployed environment communicates with a third party in some fashion to procure the necessary configuration information. Going back to the example above, if a virtual image were deployed in the referential model, the virtual machine would start up and then communicate with a third-party service (either by initiating a request or by waiting for instructions). This third-party service would send down the configuration information and instructions for the environment hosted within the virtual machine.
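
To make the contained style concrete, here is a minimal sketch of the kind of first-boot script a contained image might carry. The file paths, the service name, and the idea of a single embedded JSON file are assumptions made for illustration, not any particular vendor's implementation; the point is simply that everything the script needs ships inside the image, so it never contacts an external system.

    # first_boot.py - hypothetical self-configuration script baked into the image.
    # All inputs (embedded config values, pre-installed application) live inside
    # the image itself; nothing is fetched from an external service at startup.
    import json
    import subprocess
    from pathlib import Path

    # Configuration values embedded in the image at build time (assumed path).
    EMBEDDED_CONFIG = Path("/opt/paas/embedded/app_config.json")
    TARGET_CONFIG = Path("/opt/app/conf/app_config.json")

    def main():
        # Read the values that were baked in when the package was built.
        config = json.loads(EMBEDDED_CONFIG.read_text())

        # Render the application's configuration file from the embedded values.
        TARGET_CONFIG.parent.mkdir(parents=True, exist_ok=True)
        TARGET_CONFIG.write_text(json.dumps(config, indent=2))

        # Start the application server that is already installed in the image.
        subprocess.run(["systemctl", "start", config["service_name"]], check=True)

    if __name__ == "__main__":
        main()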

When comparing the two approaches, it is helpful to understand the advantages and drawbacks of each. A closer look at the contained model reveals an obvious benefit: speed. In this model, the deployment package contains most of what it will require in order to fully configure itself. It does not rely on contacting an external service and having to pull down the necessary binaries and configuration information.

This advantage comes with an obvious drawback: management burden. By building more and more into the deployment package, you increase the amount of content that must be maintained and updated in that package. That is not a huge concern if you only have a handful of discrete packages, but you may not be able to count on that luxury. Over time, the number of permutations you must support can force you to spend an inordinate amount of time updating deployment packages. If that happens, you can easily end up in a situation where the benefits of rapid deployment are negated by increased administrative costs.

The referential approach avoids the above pitfall. In this model, the deployment package remains fairly skeletal. Instead of packing in all of the content as in the contained model, the deployment packages in the referential model know just enough to integrate with an external system to get the right configuration information (think Chef and Puppet). This means that you only need to update and maintain configuration data and configuration actions in a single location instead of in each and every deployment package. As the number of different required environments increases, this approach can mean a significant reduction in management burden.
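
As a rough illustration of the referential style, the sketch below shows the sort of minimal bootstrap a skeletal package might carry: just the address of a configuration service and the node's role, with everything else pulled down at deploy time. The endpoint, the role name, and the response shape are invented for this example; in practice this job is typically done by a Chef or Puppet agent run against your configuration server.

    # bootstrap.py - hypothetical minimal bootstrap for a referential deployment.
    # The package ships only this script plus two values: where the configuration
    # service lives and what role this node plays. All real content is fetched.
    import json
    import subprocess
    import urllib.request

    CONFIG_SERVICE = "https://config.example.internal"  # assumed service address
    NODE_ROLE = "web-frontend"                           # assumed role name

    def fetch_instructions(role):
        # Ask the external service which packages and files this role needs.
        with urllib.request.urlopen(f"{CONFIG_SERVICE}/roles/{role}") as resp:
            return json.load(resp)

    def main():
        instructions = fetch_instructions(NODE_ROLE)

        # Install whatever the service says this role requires.
        for package in instructions["packages"]:
            subprocess.run(["apt-get", "install", "-y", package], check=True)

        # Write each configuration file the service handed back.
        for path, contents in instructions["files"].items():
            with open(path, "w") as f:
                f.write(contents)

    if __name__ == "__main__":
        main()

The key property is that changing a role's packages or files in that one external location changes what every subsequent deployment receives, which is the single-point-of-maintenance benefit described above.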

There is a flip side to this coin, of course. The referential approach typically results in longer deployment times, depending on how long it takes to install and configure content for your environments. Since the deployment packages contain very little content at deploy time, they must pull or otherwise receive that content at some point during the deployment. This may or may not be a big issue for your particular use case, but it is a potential drawback worth considering.

So which approach is better? It is my opinion, one derived from numerous user experiences, that there is no way to generalize the answer to that question. In cases where content is infrequently updated and the number of environmental permutations is fairly well constrained, the contained deployment model can be extremely effective and efficient. On the other hand, in cases where content is dynamic and ever-changing, the referential deployment model is a virtual requirement. From a user's standpoint, I strongly suggest pursuing solutions that support both kinds of deployment models. Tools should not dictate your approach. Your requirements for a particular usage scenario should!

More Stories By Dustin Amrhein

Dustin Amrhein joined IBM as a member of the development team for WebSphere Application Server. While in that position, he worked on the development of Web services infrastructure and Web services programming models. In his current role, Dustin is a technical specialist for cloud, mobile, and data grid technology in IBM's WebSphere portfolio. He blogs at http://dustinamrhein.ulitzer.com. You can follow him on Twitter at http://twitter.com/damrhein.
