Optimizing Storage Architectures for SSD

Traditional arrays were designed to cope with the hard drive as the slowest component in the architecture

Last week I attended Hitachi's 2012 Blogger Day.  Aside from catching up with some old friends, I was presented with material under NDA that will see the light of day soon.  In the meantime, I want to talk about a press release Hitachi issued while I was still on holiday (and which I clearly missed as I returned, somewhat jet-lagged).

Previously I’ve discussed how solid-state arrays need to be optimized by design to get the best out of the technology.  Traditional arrays were designed to cope with the hard drive as the slowest component in the architecture; their intellectual property was built around squeezing the best performance out of spinning media.  Startups in the all-flash array market have taken exactly the same approach for the new medium: their products are designed to get the best out of solid-state media, playing to its strengths and working around its weaknesses.
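To make the spinning-media point concrete, here is a minimal sketch (illustrative numbers only, not measurements from any vendor) of why HDD-era array logic reorders I/O: a disk head pays a cost roughly proportional to how far it travels, so servicing requests in LBA order (the classic elevator algorithm) is far cheaper than servicing them in arrival order.  On an SSD the per-request cost is essentially flat, so that whole class of optimization buys nothing — which is exactly why flash wants different array intelligence.

```python
import random

def seek_travel(requests):
    """Total 'head travel' incurred servicing requests in the given order.

    A crude stand-in for HDD seek cost: distance moved between LBAs.
    """
    head, total = 0, 0
    for lba in requests:
        total += abs(lba - head)
        head = lba
    return total

random.seed(42)
# A hypothetical random workload of 1,000 requests across a 1,000,000-LBA disk.
workload = [random.randrange(1_000_000) for _ in range(1000)]

arrival_order = seek_travel(workload)
elevator_order = seek_travel(sorted(workload))  # one sweep across the platter

# Reordering cuts head travel by orders of magnitude on a seek-bound device;
# on flash, where access cost is uniform, the two orderings cost the same.
print(arrival_order / elevator_order)
```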

While I still believe that all-flash arrays built from the ground up will have the advantage (especially in delivering low latency rather than purely high IOPS), Hitachi’s announcement of their Flash Acceleration firmware release for VSP (made in true self-deprecating style) shows that current hardware can be tweaked to work more efficiently with SSD technology.  In fact, the improvements are significant, with an all-flash VSP producing a claimed 1,000,000 IOPS.

I questioned Patrick Allaire (Marketing VP for VSP, who possibly needs to tweet a little more) on this magical one million number, which, let’s face it, is purely a marketing figure.  He indicated that lab testing had pushed workloads to higher values (around 1.2m IOPS), but that hitting the one million mark sends the message Hitachi are looking to convey: their VSP architecture continues to deliver on performance.  The enhancements provide 3x the IOPS scalability of the current VSP, with a 65% reduction in I/O response time.  Incidentally, the enhancements also improve performance and throughput on arrays built from traditional disks.

The only downside to this new firmware release is that it comes as a chargeable item (albeit with a free trial first).  If Hitachi want competitive advantage in this market, they should release this firmware as a free upgrade, as it would show commitment to delivering the best possible products to their customers.
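Those two claims — roughly 3x IOPS and a 65% cut in response time — are actually consistent with each other via Little's Law (throughput = outstanding I/Os ÷ response time).  A quick sanity check, using an assumed queue depth and baseline latency that are purely illustrative, not Hitachi figures:

```python
def iops(outstanding_ios, response_time_s):
    """Throughput implied by Little's Law at a fixed level of concurrency."""
    return outstanding_ios / response_time_s

baseline_latency = 1.0e-3                         # assume 1 ms per I/O pre-firmware
improved_latency = baseline_latency * (1 - 0.65)  # the claimed 65% reduction

qd = 350  # assumed aggregate outstanding I/Os across the array

before = iops(qd, baseline_latency)   # 350,000 IOPS
after = iops(qd, improved_latency)    # 1,000,000 IOPS

# 1 / 0.35 ~= 2.86x — close to the claimed 3x scalability figure.
print(f"speed-up: {after / before:.2f}x")
```

In other words, at fixed concurrency, cutting latency 65% multiplies throughput by 1/0.35 ≈ 2.86x, so the two headline numbers are telling essentially the same story.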

Of course, Hitachi are not the first “top six” storage vendor to claim high performance; however, from what I can see, they are the first to put a number to their array’s capability.  Hitachi did present a slide indicating EMC had quoted the VMAX 40K at 810,000 IOPS during EMC World 2012.  As I didn’t attend that event, I can’t comment on the accuracy of that figure; I have tried to corroborate the number through EMC blogs, presentation material and so on, but without success.  In fact, Chad Sakac has a post on his EMC blog, Virtual Geek, highlighting the technical benefits of the 40K without a single quantifiable performance figure.  If anyone has a referenceable source then please let me know and I will update this post.

What’s Next?

Hitachi also announced some details of their new flash controller architecture.  This is due to offer greater sustained throughput, 5+ years’ endurance, zero-block compression/dedupe and security functionality.  Look out for more on that as the news becomes public.  I have a feeling this is only the start of an evolving strategy and that we will see many more announcements in the coming months.

The Architect’s View

The all-flash array market is maturing nicely.  We can see products at all levels: replacing traditional HDDs in arrays, as dedicated flash appliances and, of course, integrated into the host.  Hitachi have some way to go to catch up with this expanding marketplace, and the Flash Acceleration code is only a first step.  Other vendors are already moving to deliver converged flash solutions; that is, they are integrating the intelligence between host and array flash to provide added value but, possibly more importantly, to secure customer lock-in.  Hitachi need to make sure the future doesn’t lock them out.

Disclaimer: I recently attended the Hitachi Bloggers’ Day 2012.  My flights and accommodation were covered by Hitachi during the trip; however, there is no requirement for me to blog about any of the content presented, and I am not compensated in any way for my time attending the event.  Some materials were discussed under NDA and don’t form part of my blog posts, but they could influence future discussions.

Related Links

Comments are always welcome; please indicate if you work for a vendor as it’s only fair.  If you have any related links of interest, please feel free to add them as a comment for consideration.

