
The Importance of Personalization in VDI

Gartner has been busy

In just the last two months Gartner have published three documents that discuss the importance of Persistent Personalization, their term for a subset of User Environment Management (UEM). They approach it from three different directions: one document looks at Persistent Personalization itself, another at hosted virtual desktops and a third at client virtual desktops. As you would expect from three documents in such a short period, they contain some very solid pieces of information alongside some long shots. In this article I want to pull together the principal themes and clarify some of the points they make.

Out with the old, in with the new
Organizations have been implementing VDI since around 2005, but the early implementations all relied on keeping a complete copy of the image for each user. This economically limited the use cases where delivering desktops in this way made sense. Today organizations implement VDI by componentizing the image and then automatically delivering those components on demand, allowing IT to standardize the components and hence reduce costs and improve service delivery for the broader user base. One of the critical components is the user environment, which represents all the user-related information in the image and which is delivered into the standard components to give the user a productive and familiar experience.
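To make the componentized model concrete, here is a minimal, purely hypothetical sketch — not any vendor's API — of how a desktop might be assembled on demand from shared, standardized components plus a per-user environment layer:

```python
from dataclasses import dataclass, field

# Hypothetical model of the componentized VDI approach: standardized OS and
# application components are shared across users; only the user environment
# layer is unique per user.

@dataclass(frozen=True)
class Component:
    name: str
    version: str

@dataclass
class UserEnvironment:
    user: str
    settings: dict = field(default_factory=dict)  # profile, personalization, policy data

def assemble_desktop(os_image: Component,
                     apps: list[Component],
                     user_env: UserEnvironment) -> dict:
    """Compose a desktop on demand from shared components plus the user layer."""
    return {
        "os": f"{os_image.name}-{os_image.version}",
        "apps": [f"{a.name}-{a.version}" for a in apps],
        "user": user_env.user,
        "settings": user_env.settings,
    }

# Two users draw on the same standardized OS and application components...
win = Component("windows", "xp-sp3")
office = Component("office", "2007")

alice = assemble_desktop(win, [office], UserEnvironment("alice", {"wallpaper": "blue"}))
bob = assemble_desktop(win, [office], UserEnvironment("bob", {"wallpaper": "green"}))

# ...so only the user environment differs between their desktops, which is
# what lets IT manage the shared components at scale.
assert alice["os"] == bob["os"] and alice["apps"] == bob["apps"]
assert alice["settings"] != bob["settings"]
```

The point of the sketch is the economics: because the OS and application components are identical across users, IT maintains them once, while the user environment travels separately to preserve a familiar experience.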

Until now there has been some debate over the critical dates for the adoption of this new componentized model, and hence of VDI in general. Gartner's take is that VDI is ready for task workers now and will be ready for deployment to a broad range of users in 2010, driven by vendors such as AppSense introducing functionality such as user-installable applications to allow greater user freedom. They also see the componentized model rendering the older single-image-per-user deployments obsolete by 2012.

Persistent Personalization (UEM)
Unsurprisingly, Gartner see Persistent Personalization as critical to the success of desktop virtualization, with some stages of its development contingent on developments in UEM. In particular, Gartner cite the capability for users to install their own applications; I would also point to improvements in the manageability of the platform.

In debates on technology it is critical to keep the overall objective in mind and not get diverted down unproductive side alleys. The objective of desktop virtualization is to improve the manageability of the desktop platform and hence deliver a better service to corporate users at lower cost. We will achieve this by standardizing the components of the image and managing those components across the business, thereby achieving economies of scale. Consequently, the key to reaching our goals is managing the components effectively. This means managing the delivery of the components, but also managing within the components, so that we have appropriate visibility into them and control over them. We must not slip back into the unique and impenetrable image problem we have with current PCs, generally referred to as 'the blob problem'. Manageability of the user environment is therefore key, so that delivery can be done efficiently and any user problems can be isolated quickly and effectively.

And a new category: Workspace Virtualization
Within Persistent Personalization, itself a subset of User Environment Management, Gartner have introduced a new category called Workspace Virtualization to cover vendors such as RingCube, MokaFive and UniDesk. Some of these vendors are completely new; some have had offerings in different markets before, but this is their first recognition as part of broader corporate desktop virtualization. Consequently, it is worth thinking about how their technology could contribute to the goals of desktop virtualization.

Amongst the vendors with shipping products, the common theme is that they split the image stack in a different way from typical desktop virtualization. Rather than dividing it into operating system, applications and user environment, they split it purely into the operating system and everything else.
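The difference between the two layering models can be shown schematically. This is an illustrative sketch only; the layer names are my own shorthand and neither structure reflects any specific vendor's implementation:

```python
# Typical componentized desktop virtualization: three independently
# managed layers, with the user environment as its own unit.
componentized_split = {
    "operating_system": ["base OS image"],
    "applications": ["standardized, separately delivered apps"],
    "user_environment": ["profile, settings, personalization"],
}

# Workspace Virtualization: a single division below the OS, bundling
# applications and user data into one container.
workspace_split = {
    "operating_system": ["base OS image"],
    "everything_else": ["apps", "profile, settings, personalization"],
}

# The componentized model exposes the user environment as a managed unit;
# the workspace model does not, which is the root of the manageability
# concern discussed below.
assert "user_environment" in componentized_split
assert "user_environment" not in workspace_split
```

The design question this raises is exactly the one in the next paragraph: a single "everything else" container is simple, but it risks becoming an opaque per-user unit rather than a set of standardized, individually manageable components.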

This has the benefit of simplicity, but does it solve the problem we set out to solve and hence achieve our goals? The solutions in this category are very new to market and will undoubtedly mature over time, but I have two concerns. First, can these solutions manage the components of desktop virtualization so that we achieve economies of scale, or are we going to find ourselves with an unmanageable blob per user, much as we have now? Second, can these solutions become more than a point solution for a small proportion of users, leading to yet another management tool that must itself be managed? It is too early to tell with the solutions we see now, and it will be interesting to see whether this subcategory can carve out a niche for itself.

In conclusion, it is great to see Gartner getting behind Persistent Personalization and recognizing its importance to desktop virtualization. Three papers in two months is recognition that Persistent Personalization, and the larger category of User Environment Management, is critical to the development of desktop virtualization. This will help the broader market understand the role of UEM in improving the overall management of corporate desktops and hence delivering better service for users at lower cost.

More Stories By Martin Ingram

Martin Ingram is vice president of strategy for AppSense, where he is responsible for understanding where the desktop computing market is going and deciding where AppSense should direct its products. He is recognized within the industry as an expert on application delivery. Martin has been with AppSense since 2005, having previously built companies around compliance and security including Kalypton, MIMEsweeper, Baltimore Technologies, Tektronix and Avid. He holds an electrical engineering degree from Sheffield University.

