Lessons Learned from Real-World Big Data Implementations

The value of Big Data is in the insights that the data can provide

In the past few weeks I visited several Cloud and Big Data conferences that provided me with a lot of insight. Some people consider only the technology side of Big Data: Hadoop, Cassandra and the like. The real driver, however, is different. Business analysts have discovered Big Data technologies as a way to leverage the masses of existing data and ask questions about customer behavior and all sorts of relationships to drive business strategy. In doing so, they are pushing their IT departments to run ever-bigger Hadoop environments and ever-faster real-time systems.

What's interesting from a technical side is that ad-hoc analytics on existing data is allowed to take some time. However, ad-hoc implies people waiting for an answer, meaning we are talking about minutes, not hours. Another interesting insight is that Hadoop environments are never static or standalone. Most companies take in new data on a continuous basis via technologies like Flume. This means Hadoop MapReduce jobs need to be able to keep up with the data flow, either through more hardware or through optimization.
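On the optimization side, one of the cheapest levers a MapReduce job has is a combiner, which pre-aggregates map output on each node before it is shuffled across the network. Here is a minimal sketch using the standard Hadoop API; the class name and the simple counting use case are illustrative, not taken from any particular implementation:

    import java.io.IOException;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;

    // A combiner pre-aggregates map output locally, so far fewer
    // records have to cross the network during the shuffle phase.
    public class SumCombiner
            extends Reducer<Text, IntWritable, Text, IntWritable> {

        private final IntWritable sum = new IntWritable();

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values,
                              Context context)
                throws IOException, InterruptedException {
            int total = 0;
            for (IntWritable value : values) {
                total += value.get(); // partial sum per node
            }
            sum.set(total);
            context.write(key, sum);
        }
    }

Registering it is a single line in the job setup (job.setCombinerClass(SumCombiner.class);), and for aggregation-style jobs it often buys a noticeable speed-up before anyone has to add hardware.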

There are multiple drivers behind Big Data (actually there are a lot), but the two most important are these: analytics and the technical need for speed. Let's look at some of them and the resulting takeaways.

The Value Is in the Insight Not the Volume
The value of Big Data is in the insights that the data can provide, not in its sheer volume. The reason more and more companies are keeping all of their log and transaction data is that they want to gain those insights. For a long time the sheer size of that data was an obstacle to this goal rather than an asset; with Big Data technologies its value can finally be harnessed.

Don't Forget That Data Analysts Are People Too
Ad-hoc analytics doesn't have to be instant, but it must not take hours either. It was interesting to see that time-to-result for ad-hoc analytics is considered important. This is because people run those queries, and people don't like to wait for hours. Even more important, business analytics is often an iterative process: ask a question, check the answer, refine or change the question. Hours-long MapReduce jobs are prohibitive to this process.

New Data Is Coming in All the Time
Big Data environments are constantly fed new data. This is not big news in itself, but I was still surprised by the constant reiteration of this fact. The constant growth means that ad-hoc queries either get slower over time or have to work on samples. To remedy this, companies are writing scrubbing and categorizing MapReduce jobs. These jobs strip out all the unimportant stuff and put cleansed, streamlined, easy-to-access data into new files. Instead of executing analytics against raw files, the analyst works on a cleansed data set, as in the sketch below. The implication is that scrubbing jobs need to be maintained continuously (as the input data changes over time) and must be able to keep up with the velocity of the input. Such a MapReduce job cannot be allowed to run for hours; it needs to be quick and iterative.
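A scrubbing job of this kind is often just a map-only pass over the raw files. The sketch below assumes tab-separated log lines and a hypothetical field layout; the field names are invented for illustration, not taken from any real implementation:

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Map-only scrubbing job: drops malformed records and keeps only the
    // fields analysts actually query, producing a cleansed data set.
    public class ScrubbingMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {

        private final Text cleansed = new Text();

        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            // Hypothetical layout: timestamp, userId, action, payload, debugInfo
            String[] fields = line.toString().split("\t");
            if (fields.length < 4 || fields[1].isEmpty()) {
                context.getCounter("scrub", "dropped").increment(1);
                return; // strip out broken or unimportant records
            }
            // Emit only timestamp, userId and action; the rest is noise here.
            cleansed.set(fields[0] + '\t' + fields[1] + '\t' + fields[2]);
            context.write(NullWritable.get(), cleansed);
        }
    }

Setting the number of reduce tasks to zero (job.setNumReduceTasks(0);) makes this a pure pass-through with no shuffle at all, which helps it keep up with the input velocity.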

Big Data Is Not Cheap
While it sounds obvious, this is something vendors don't talk about unless specifically asked. Hadoop requires a lot of hardware and a lot of expertise, and the expertise in particular is still hard to come by. While the hardware might be cheap (you don't need expensive boxes for Hadoop), the bigger the environment, the higher the operational costs. That operational cost is the reason some Hadoop vendors exist on services alone, and also why customers are demanding better monitoring and management solutions.

Data Must Be Accessible at Low Latencies to Provide Value
One very interesting fact is that most early adopters who use Hadoop for analytics use it for ad-hoc analytics and not as a traditional warehouse. They use MapReduce to do the heavy lifting usually reserved for ETL jobs and put the resulting dimensions into existing data warehouses or into a NoSQL solution like HBase, Cassandra or MongoDB. These solutions provide low-latency access semantics and are then integrated into the transactional application world, e.g., to provide recommendations to end users.
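To make the serving side concrete, here is a minimal sketch assuming the classic HBase client API of that era and a hypothetical "recommendations" table keyed by user id; the table, column family and qualifier names are made up for the example:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.util.Bytes;

    // The transactional application reads recommendations that a batch
    // MapReduce job has already computed and written into HBase.
    public class RecommendationLookup {

        // Usage: RecommendationLookup <userId>
        public static void main(String[] args) throws IOException {
            Configuration conf = HBaseConfiguration.create();
            HTable table = new HTable(conf, "recommendations"); // hypothetical table
            try {
                Get get = new Get(Bytes.toBytes(args[0]));      // row key = user id
                Result result = table.get(get);                 // single low-latency read
                byte[] items = result.getValue(Bytes.toBytes("rec"),
                                               Bytes.toBytes("items"));
                System.out.println(items == null
                        ? "no recommendations"
                        : Bytes.toString(items));
            } finally {
                table.close();
            }
        }
    }

The point is that the end user never waits on MapReduce: the batch layer fills the table, and the application does a single keyed read.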

This does not absolve them from optimizing their Hadoop environment where they can, but it gives them the much-needed real-time access that Hadoop so far does not provide. It also adds complexity that needs to be maintained and monitored.

NoSQL Solutions Need Management and Monitoring as Well
NoSQL solutions are most often used to provide low-latency databases with failover and horizontal scaling characteristics. As expected, practitioners quickly run into new issues such as data distribution and wrong access patterns. Most NoSQL solutions lack sophisticated monitoring or performance analysis tools and require experts instead. Fortunately, several companies are working on providing those tools, and some APM vendors are working hard to support NoSQL databases the same way they support traditional databases. This is underscored by another interesting finding: with a fast and scalable data store, the application itself quickly becomes the response-time and scaling bottleneck.

Applications Using NoSQL Technologies Are More Complex
Most NoSQL solutions give up more complex logic such as joins in order to achieve horizontally scalable data distribution. That logic moves into the application; arguably this is where it should be anyway. NoSQL solutions require data to be stored in a query-access-optimized way: de-normalization is the key. The flip side of storing data multiple times, and of the need to keep those copies in sync on updates, is that the storage logic becomes more complex (see the sketch below). More application logic usually means less performance.
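As an illustration of that extra logic, using the same era HBase client API as above, the sketch below stores the same order twice, once keyed for "orders by customer" and once for "orders by product", so each query pattern becomes a single keyed lookup; all table and column names are invented for the example:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    // De-normalized storage: the same order is written twice, each copy keyed
    // for one query pattern. Keeping the copies in sync is application logic.
    public class OrderStore {

        private final HTable byCustomer;
        private final HTable byProduct;

        public OrderStore(Configuration conf) throws IOException {
            byCustomer = new HTable(conf, "orders_by_customer"); // hypothetical tables
            byProduct  = new HTable(conf, "orders_by_product");
        }

        public void saveOrder(String customerId, String productId, String orderJson)
                throws IOException {
            Put forCustomer = new Put(Bytes.toBytes(customerId + "#" + productId));
            forCustomer.add(Bytes.toBytes("o"), Bytes.toBytes("json"),
                            Bytes.toBytes(orderJson));
            byCustomer.put(forCustomer);

            Put forProduct = new Put(Bytes.toBytes(productId + "#" + customerId));
            forProduct.add(Bytes.toBytes("o"), Bytes.toBytes("json"),
                           Bytes.toBytes(orderJson));
            byProduct.put(forProduct); // second write: the price of join-free reads
        }
    }

The join at read time disappears, but every update now has to touch both tables, and a failure between the two writes leaves them inconsistent. That is exactly the kind of added complexity the application has to own.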

My conclusion as a performance engineer is relatively clear: Big Data requires performance management and monitoring tools to fulfill its promise in a cost-effective and timely manner. Here are some suggestions on what to think about when you start a Big Data project.

  1. Large Hadoop environments are hard to manage and operate. Without automation in terms of deployment, operations, monitoring and root-cause analysis, they quickly become unmanageable. Make sure to have a monitoring solution in place that proactively informs you of any infrastructure or software issues that could affect your operation, and that gives you an easy way to pinpoint the root cause.
  2. The easiest way to identify new performance issues is to detect and analyze change. Adopt a life-cycle and 24/7 production APM approach. It will enable you to notice changes in data and compute distribution over time. In addition, a life-cycle approach will allow you to immediately pinpoint any negative changes introduced by a new software release.
  3. Don't just throw more and more hardware at the problem. While you can use cheaper hardware for Hadoop, it still costs money, and beyond that you have to consider the operational drag. Every node you add makes traditional log-based analysis more complicated. Instead, ensure that you have an APM solution in place that lets you understand and optimize MapReduce jobs at their core and reduce both the time and the resources it takes to run them.
  4. Your Hadoop cluster is no island; it will always be connected in one form or another to a real-time or at least transactional system. Make sure that you have a monitoring solution in place that can support both.

NoSQL applications tend to have more complex logic. The very performance and scalability of the store depend on correct data access and data distribution. A good monitoring solution allows you to monitor and optimize that additional complexity with ease; it also enables you to understand how your application accesses the data and how that access is distributed across your NoSQL cluster in production. The best way to ensure a scalable and fast NoSQL store is to ensure optimal distribution and access patterns.

Conclusion
Big Data is still very much an emerging technology and its promises are huge. But to deliver on those promises, it must be cost- and time-effective for those who harness its value: the business, and not just technology experts.

More Stories By Michael Kopp

Michael Kopp has over 12 years of experience as an architect and developer in the Enterprise Java space. Before coming to CompuwareAPM dynaTrace he was the Chief Architect at GoldenSource, a major player in the EDM space. In 2009 he joined dynaTrace as a technology strategist in the center of excellence. He specializes in application performance management in large-scale production environments, with a special focus on virtualized and cloud environments. His current focus is how to effectively leverage Big Data solutions and how these technologies impact and change the application landscape.
