Mashable Sees Double Rainbows as Google Goes Gaga for OAuth

Enterprise developers and architects beware: OAuth is not the double rainbow it is made out to be

Enterprise developers and architects beware: OAuth is not the double rainbow it is made out to be. It can be a foundational technology for your applications, but only if you’re aware of the risks.

OAuth has been silently growing as the favored mechanism for cross-site authentication in the Web 2.0 world. The ability to leverage a single set of credentials across a variety of sites reduces the number of username/password combinations a user must remember. It also inherently provides for a granular authorization scheme.

Google’s announcement that it now offers OAuth support for the Google Apps APIs was widely covered this week, including Mashable’s declaration that Google’s adoption means all applications must follow suit. Now. Stop reading, go do it. That’s how much of an imperative it was made out to be.


Google’s argument that OAuth is more secure than the ClientLogin model was a bit of a stretch considering the primary use of OAuth at this time is to integrate Web 2.0 and social networking sites – all of which rely upon a simple username/password model for authentication. True, it’s safer than sharing passwords across sites, but the security of data can still be compromised by a single username/password pair.
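To make the comparison concrete, here is a minimal Python sketch (mine, not Google’s code; the endpoints are hypothetical and the `requests` library is an assumption) contrasting a ClientLogin-style call, where the application handles the raw password, with an OAuth-style call, where it only ever handles a token:

```python
# Hypothetical endpoints for illustration only; these are not Google's actual
# ClientLogin or Google Apps API URLs.
import requests

# ClientLogin-style: the application collects and sends the raw password,
# so it must be trusted to store and protect it.
login = requests.post(
    "https://example.com/accounts/ClientLogin",
    data={"Email": "user@example.com", "Passwd": "s3cret"},
)
auth_token = login.text.strip()  # whoever holds the password holds the account

# OAuth-style: the application never sees the password; it presents a scoped,
# revocable, expirable token instead. (OAuth 1.0a actually signs each request
# rather than sending a bearer header; shown this way only for brevity.)
data = requests.get(
    "https://example.com/calendar/v1/events",
    headers={"Authorization": "Bearer EXAMPLE_ACCESS_TOKEN"},
)
print(login.status_code, data.status_code)
```

The difference is real: the password never leaves the authoritative source. But the token is still minted by a provider that, at the end of the chain, authenticated the user with a password.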

OAUTH on the WEB RELIES on a CLIENT-LOGIN MODEL

The premise of OAuth is that the credentials of another site are used for authentication but never shared or divulged. Authorization for actions and access to data is determined by the user on an app-by-app basis. Anyone familiar with SAML assertions or Kerberos will recognize strong similarities in the OAuth documentation and underlying protocol. Like SAML assertions or Kerberos tickets, OAuth tokens can be set to expire, which is one of the core reasons Google claims it is “more” secure than the ClientLogin model.
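For illustration, here is a minimal sketch of what “tokens can be set to expire” buys you. The TTL and token store below are assumed placeholders, not how Google or any particular OAuth provider implements expiration:

```python
import secrets
import time

# Assumed values for illustration; not an OAuth-mandated TTL.
TOKEN_TTL_SECONDS = 3600
_tokens = {}  # token -> expiry timestamp (stand-in for a real token store)

def issue_token() -> str:
    """Mint a random token that stops working after TOKEN_TTL_SECONDS."""
    token = secrets.token_urlsafe(32)
    _tokens[token] = time.time() + TOKEN_TTL_SECONDS
    return token

def token_is_valid(token: str) -> bool:
    """A leaked token is only useful until expiry; a leaked password is
    useful until the user notices and changes it."""
    expiry = _tokens.get(token)
    return expiry is not None and time.time() < expiry

t = issue_token()
print(token_is_valid(t))            # True while fresh
_tokens[t] -= 2 * TOKEN_TTL_SECONDS  # simulate the clock running past expiry
print(token_is_valid(t))            # False once expired
```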

OAuth has been a boon to Web 2.0 and has increased user interaction across sites because it makes authenticating to and managing access to a site simple for users. Users can choose a single site to be their authoritative source of identity across the web and leverage it at any site that supports both OAuth and the provider they’ve chosen (usually a major brand like Facebook, Google, or Twitter). That means every other site they use that requires authentication is likely relying on one central location to store their credentials and identifying data.

One central location that, ultimately, requires a username and a password for authentication.

Yes, ultimately, OAuth relies on the security of the same ClientLogin model it claims is less secure than OAuth. Which means OAuth is not more secure than the traditional model, because any security mechanism is only as strong as the weakest link in the chain. In other words, it is quite possible that a user’s entire Internet identity is riding on one username and password at one site. If that doesn’t scare you, it should. Especially if that user is your mom or your child and they haven’t quite gotten the hang of strong passwords or of changing their passwords on a regular basis.

When all your base are protected by a single username and password, all your base are eventually belong to someone else.

Now, as for the claim that the ability to expire tokens makes OAuth more secure, I’m not buying it. There is nothing in the ClientLogin model that disallows expiration of credentials. Nothing. It’s not inherently part of the model, but it’s not disallowed in any way. Developers can and probably should include credential expiration in any site, but they generally don’t unless they’re developing for the enterprise, where organizational security policies require it. Neither OAuth nor the ClientLogin model is more secure merely because the ability exists to expire credentials. Argue that the use of a nonce makes OAuth less susceptible to replay attacks and I’ll agree. Argue that OAuth’s use of tokens makes it less susceptible to eavesdropping-based attacks and I’ll agree. Argue that expiring tokens is the basis for OAuth’s superior security model and I’ll shake my head.
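The nonce-and-timestamp check is the part worth conceding, so here is a simplified sketch of what a provider does with it. The time window is an assumption, and a real OAuth 1.0a provider also verifies the request signature:

```python
import time

# Assumed 5-minute acceptance window; providers choose their own.
ALLOWED_SKEW_SECONDS = 300
_seen_nonces = set()  # in production: a store pruned outside the window

def accept_request(nonce: str, timestamp: float) -> bool:
    """Reject stale requests and reject any nonce seen before."""
    if abs(time.time() - timestamp) > ALLOWED_SKEW_SECONDS:
        return False                  # outside the accepted time window
    if nonce in _seen_nonces:
        return False                  # replayed request
    _seen_nonces.add(nonce)
    return True

now = time.time()
print(accept_request("n-1a2b3c", now))   # True: fresh nonce, fresh timestamp
print(accept_request("n-1a2b3c", now))   # False: same nonce replayed
```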

SECURITY through DISTRIBUTED OBSCURITY

The trick with OAuth is that the security is inherently contained within the distributed, hidden nature of the credentials. A single site might leverage two, three, or more external authoritative sources (OAuth providers) to authenticate users and authorize access. It is impossible to tell which site a user has designated as their primary authoritative source at any other site, and it is not necessarily the case (though it is likely; users are creatures of habit) that the user always designates the same authoritative source across external sites. Thus, a site relying on OAuth for authorization distributes the responsibility for securing credentials to the external, authorizing site. Neat trick, isn’t it? Exclusively using OAuth effectively surrounds a site with an SEP field. It’s Somebody Else’s Problem if a user’s credentials are compromised or permissions are changed, because the responsibility for securing that information lies with some other site. You are effectively abrogating responsibility and thus control, and you can’t enforce policy on data you don’t control.

The distributed, user-choice aspect of OAuth obscures the authoritative source of those credentials, making it more difficult to determine which site should be attacked to gain access. It’s security through distributed obscurity, hiding the credential store “somewhere” on the web.

CAVEAT AEDIFICATOR

That probably comes as a shock at this point, but honestly the point here is not to lambast OAuth as an authentication and authorization model. Google’s choice to allow OAuth and stop the sharing of passwords across sites is certainly laudable, given that most users tend to use the same username and password across every site they access. OAuth reduces the number of places in which that particular password is stored and therefore might be compromised. OAuth is scriptable, as Google pointed out, and while the username/password model is also scriptable, doing so almost certainly violates about a hundred and fifty basic security precepts. Google is doing something to stop that, and for that it should be applauded.

OAuth can be a good choice for application authentication and authorization. It naturally carries with it a more granular permission (authorization) model, which means developers don’t have to invent one from scratch. Leveraging granular permissions is certainly a good idea given the nature of APIs and applications today. OAuth is also a good way to integrate partners into your application ecosystem without requiring management of their identities on-site – allowing partners to designate their own OAuth provider as an authoritative source for your applications can reduce the complexity of supporting and maintaining their identities. But it isn’t more secure – at least not from an end-user perspective, especially when the authoritative source does not enforce any kind of strict username and password policies. It is that end-user perspective that must be considered when deciding to leverage OAuth for an application that might have access to sensitive data. An application with access to sensitive data should not rely on the security of a user’s Facebook/Twitter/<insert social site here> password. Users may create strong passwords internally because corporate policy forces them to; they don’t necessarily carry that practice into their “personal” world, so relying on them to do so would likely be a mistake.
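As a concrete example of that granular model, here is a hedged sketch of requesting a narrowly scoped authorization. The `requests_oauthlib` library, client ID, and redirect URI are assumptions and placeholders, not anything from Google’s announcement:

```python
# Sketch only: requests_oauthlib is an assumption (not mentioned in the post),
# and the client ID / redirect URI are placeholders. The point is the scope
# list: request read-only calendar access instead of the whole account.
from requests_oauthlib import OAuth2Session

CLIENT_ID = "your-app-client-id"                                  # placeholder
REDIRECT_URI = "https://yourapp.example.com/oauth/callback"       # placeholder
SCOPES = ["https://www.googleapis.com/auth/calendar.readonly"]    # least privilege

oauth = OAuth2Session(CLIENT_ID, scope=SCOPES, redirect_uri=REDIRECT_URI)
authorization_url, state = oauth.authorization_url(
    "https://accounts.google.com/o/oauth2/auth"
)
# Send the user to authorization_url; the consent screen lists exactly what
# was requested and nothing more, which is the granular model described above.
print(authorization_url)
```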

Certainly an enterprise architect can make good use of Google’s support for OAuth if s/he is cautious: the authoritative source should be (a) under the control of IT and (b) integrated into an identity store back end capable of enforcing organizational authentication policies, e.g. password expiration and strength requirements. When developing applications using Google Apps and deciding to leverage OAuth, it behooves the developer to be very cautious about which third-party sites/applications can be used for authentication and authorization, because the only real point of control over the sharing of that data is in the choices allowed, i.e. which OAuth providers are supported. Building out applications that leverage an OAuth foundation internally will enable single sign-on capabilities in a way that eliminates the need for more complex architecture. It also supports the notion of “as a service” that can be leveraged later when moving toward the “IT as a Service” goal.
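Since the only real point of control is which OAuth providers you support, that control is worth making explicit. A minimal, illustrative policy check might look like the following; the provider URLs are assumptions, and a real deployment would tie this to the IT-controlled identity store described above:

```python
# Illustrative policy sketch, not a real product feature: maintain an explicit,
# IT-controlled whitelist of issuers and check it before trusting any token.
ALLOWED_ISSUERS = {
    "https://sso.corp.example.com",     # internal IdP that enforces password policy
    "https://accounts.google.com",      # governed Google Apps domain (assumed)
}

def issuer_allowed(issuer: str) -> bool:
    """Reject tokens minted by any provider outside the approved set."""
    return issuer in ALLOWED_ISSUERS

print(issuer_allowed("https://sso.corp.example.com"))  # True
print(issuer_allowed("https://twitter.com"))           # False: consumer IdP
```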

OAuth is not necessarily “more” secure than the model upon which it ultimately relies, though it certainly is more elegant from an architectural and end-user perspective. The strength of an application’s security when using OAuth is relative to the strength of the authoritative source of authentication and authorization. Does OAuth have advantages over the traditional model, as put forth by Google in its blog? Yes. Absolutely. But security of any system is only as strong as the weakest link in the chain and with OAuth today that remains in most cases a username/password combination.

That ultimately means the strength or weakness of OAuth security remains firmly in the hands (literally) of the end-user. For enterprise developers that translates into “Caveat aedificator” or, in plain English, “Let the architect beware.”

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
