By Jon Shende
December 2, 2010 06:45 AM EST
As mentioned in Part 1 of this article, one of my functions is to research current and up-and-coming solutions within the technology realm, particularly distributed computing and cloud computing.
There is a strong possibility that malicious users will eventually identify and exploit flaws within the cloud computing model. CSPs, in their pursuit of market share, may have underestimated the possibility of attack and misuse of their cloud resources by malicious users.
The likelihood that the creation, storage, processing and distribution of illicit material will present major legal issues is also a grave reality.
Digital forensic examiners also know that any effective forensic system must have an effective means of monitoring and collecting a wide range of data, as there is no directive stating a priori what may be pertinent to any one case.
With regard to the possibility of insider attacks, collecting data only at the entry points of a network will not contribute to tracing them.
When our admin director signed me up a few weeks ago to attend the webinar The Case for Network Forensics from Solera Networks, I honestly thought it would be a variation of tools already in use by another start-up.
The synopsis of this webinar recalled a paper I read a while ago by a Gartner consultant, which stated: "Cloud services are especially difficult to investigate, because logging and data for multiple customers may be co-located and may also be spread across an ever-changing set of hosts and data centres." I figured it was only a matter of time before a start-up proved this statement wrong.
Enter Solera's discussion on network forensics. One takeaway was that the product is, at its core, like a security camera: it records everything.
OK, I thought: digital forensics examiners typically have vast amounts of data to sift through in a traditional system anyway. How will this company's tools expedite the sorting and analysis to output what is specific to an investigation within the cloud, in a form that will be accepted in a court of law?
Also, digital evidence by itself can be extremely fragile: it can be altered, damaged, or destroyed by improper handling or examination. As forensic examiners we know how critical it is to take precautions to document, collect, preserve and examine evidence. Any failure in this process can render a case inadmissible in court.
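To make that fragility concrete: a standard way to document that collected evidence has not been altered is to record a cryptographic digest at collection time and verify it before examination. The sketch below is my own illustration of that practice, not part of any vendor's workflow; the function names are hypothetical.

```python
import hashlib

def evidence_digest(data: bytes) -> str:
    """Return a SHA-256 digest to record in the chain-of-custody log."""
    return hashlib.sha256(data).hexdigest()

def verify_evidence(data: bytes, recorded_digest: str) -> bool:
    """Re-hash the evidence before examination; any alteration changes the digest."""
    return evidence_digest(data) == recorded_digest

# At collection time the examiner records the digest...
capture = b"\x00\x01example packet capture bytes"
digest = evidence_digest(capture)

# ...and later confirms the copy handed to the analyst is byte-identical.
assert verify_evidence(capture, digest)
assert not verify_evidence(capture + b"tampered", digest)
```

A single flipped byte produces a completely different digest, which is why even improper handling, not just deliberate tampering, is detectable.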
I took my questions to Peter Schlampp, VP of Marketing and Product Management, and Alan Hall, Director of Marketing, at Solera, who provided the following insight.
Within the cloud, Solera's tools do not use typical custom silicon; rather, they see packets as they would appear on a traditional system NIC. Integrated into a cloud service provider's environment, the system claims to ensure that each customer is the only one seeing aspects of their data.
Of course, I wondered about the VM managers at the cloud service provider (CSP) who manage the VMs at this point, since they can see customers' data.
The response I received was as follows: the customer view tracks who interacted with their system in the cloud and what types of connections came in to the system hosted there. In other words, it records traffic between virtual hosts on a physical host.
The system also integrates with Sourcefire's Defense Center. Although I haven't conducted a pen test in over a year, I still keep up to date on current processes and technologies in the IT security and pen-testing world; knowing that Snort is utilized was an immediate plus for me.
In the event of an incident, an investigator can drill down to the event level, which shows the frame of traffic; an alert from a Sourcefire event will go directly to a Solera Networks device.
Data from this can answer questions such as: How did the connection get initiated? How do you know what happened afterwards? And for a host that was compromised, one can potentially follow the attack paths.
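The kind of pivot described above can be sketched in a few lines: given flow records indexed by a full-capture store, follow everything a compromised host did after the alert fired. The record layout and field names here are my own illustration, not Solera's actual data model.

```python
from datetime import datetime

# Hypothetical flow records, as a full-capture appliance might index them.
flows = [
    {"time": datetime(2010, 11, 30, 9, 15), "src": "203.0.113.7", "dst": "10.0.0.5",     "dport": 445},
    {"time": datetime(2010, 11, 30, 9, 16), "src": "10.0.0.5",   "dst": "10.0.0.9",     "dport": 445},
    {"time": datetime(2010, 11, 30, 9, 20), "src": "10.0.0.9",   "dst": "198.51.100.2", "dport": 443},
]

def follow_host(flows, host, after):
    """Return every flow touching `host` from the alert time onward."""
    return [f for f in flows
            if f["time"] >= after and host in (f["src"], f["dst"])]

alert_time = datetime(2010, 11, 30, 9, 15)
trail = follow_host(flows, "10.0.0.5", alert_time)
# The trail shows both the inbound connection that triggered the alert
# and the compromised host's subsequent lateral move to 10.0.0.9.
```

Repeating the query on each newly implicated host (here, 10.0.0.9) is how an investigator walks the path outward from the initial compromise.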
Despite this, I still had some concerns, among others regarding levels of assurance for data held within the cloud. In order to get objective feedback, I approached one of my mentors, Mark Pollitt, for his sage input. Although he expressed concern regarding Solera's pitch of "network forensics for amateurs," he did state that "anything that will make analysis easier and capable of being done (even just as triage) by less skilled operators is very useful."
Whilst not an endorsement, it put my mind at ease in the sense that the company had a vision on track with a direction for virtualization, the cloud and forensic examination.
As a technologist, nothing satisfies my reservations like more data and case-study results, so I presented these concerns to Schlampp and Hall, who responded with the following food for thought:
Advanced Solera Networks network forensics technology now makes data understandable to a common individual. Packet detail is rendered as web pages, emails, IMs, MS Office docs, etc. That means we can utilize support staff who can interpret this "human readable" data and clearly recognize that it contains information we don't want leaked from our organization. With the advances Solera Networks makes, organizations have more front-line incident response personnel who can determine whether triage requires escalation to the limited personnel who possess in-depth skills. Those skills, combined with a complete forensic record from Solera Networks appliances, can uncover exactly what happened and, more importantly, help determine the proper course of action quickly, closing the gap between incident and remediation.
In a perfect world, effective network forensics requires the ability to "capture it all, all of the time." When we don't know what we don't know, capturing it all is the only way to ensure we have the complete data to interrogate and create the accurate story of what happened. However, what we end up with in practical use is usually something short of "everything."
We have to factor in things like amount of storage at our disposal, how fast our networks are running, what data or systems we have determined as most valuable in our organization, data protection regulations, etc. Accounting for these and other factors, Solera Networks has real-time network forensics technology that lets you make choices on what to capture - all data on every segment; selective segments of data based on port, specific applications, protocols, IP addresses, etc.; or, even get as granular as analyzing every packet for specific information like a hex pattern and only retaining those packets.
Selective capture requires a trade-off between creating more manageable "haystacks of data" and "missing the needle" altogether because it is in a different haystack that we didn't have the foresight to capture. Because of the Solera Networks approach, network forensics technology has evolved to the point where we can stick with one haystack and still have the tools to find the exact needle in near real time.
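The selective-capture choices described above (by port, by host, or by a byte pattern inside the payload) can be expressed as a simple retention policy. This is a minimal sketch under assumed packet and policy shapes of my own; it is not Solera's configuration interface.

```python
# Illustrative selective-capture policy: retain a packet only if it matches
# configured ports, configured hosts, or a hex byte pattern in its payload.
KEEP_PORTS = {80, 25}
KEEP_HOSTS = {"10.0.0.5"}
KEEP_PATTERN = bytes.fromhex("4d5a")  # "MZ", the header of a Windows executable

def should_retain(pkt: dict) -> bool:
    if pkt["dport"] in KEEP_PORTS or pkt["sport"] in KEEP_PORTS:
        return True
    if pkt["src"] in KEEP_HOSTS or pkt["dst"] in KEEP_HOSTS:
        return True
    return KEEP_PATTERN in pkt["payload"]

packets = [
    {"src": "10.0.0.8", "dst": "10.0.0.9", "sport": 51000, "dport": 8080, "payload": b"MZ\x90\x00"},
    {"src": "10.0.0.8", "dst": "10.0.0.9", "sport": 51001, "dport": 8080, "payload": b"hello"},
]
retained = [p for p in packets if should_retain(p)]
```

Note that the second packet is silently dropped: it matches no rule, which is exactly the "needle in a different haystack" risk the trade-off describes.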
With any new product, only time will tell the benefits it provides. With regard to digital forensics and the drive to adopt cloud computing systems, any tool that will improve results, reduce false positives and give an investigator data that is relevant, factual and which can be presented and accepted in a court of law will be valued. I believe that these tools, combined with a system such as ForNet, could chart a path for forensics investigations within the cloud ecosystem.
Accordingly, ForNet "helps with the postmortem of any security incident, including insider attacks. It can also store potential evidence for months, which is much longer than any existing solution. With an integration of its XML-based query routing protocols, coalescing of synopses, and a user interface, an analyst can locate evidence relating to an incident efficiently and transparently."
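The "synopses" ForNet coalesces are compact, queryable summaries of traffic rather than the raw packets themselves, which is what makes months of retention feasible. A minimal sketch of the idea, using a Bloom-filter-style structure of my own construction (not ForNet's actual implementation): membership queries never miss something that was recorded, at the cost of rare false positives.

```python
import hashlib

class Synopsis:
    """Tiny Bloom-filter-style traffic synopsis: compact and queryable long
    after capture, probabilistic (false positives possible, never false negatives)."""

    def __init__(self, nbits: int = 1024, nhashes: int = 3):
        self.nbits, self.nhashes = nbits, nhashes
        self.bits = bytearray(nbits // 8)  # 128 bytes summarizes many items

    def _positions(self, item: bytes):
        # Derive nhashes bit positions from salted SHA-256 digests of the item.
        for i in range(self.nhashes):
            h = hashlib.sha256(bytes([i]) + item).digest()
            yield int.from_bytes(h[:4], "big") % self.nbits

    def add(self, item: bytes):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def may_contain(self, item: bytes) -> bool:
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

syn = Synopsis()
syn.add(b"suspicious-payload-block")
assert syn.may_contain(b"suspicious-payload-block")  # always true once added
# Queries for items never added are almost certainly False at this fill level.
```

An analyst can thus ask "did this payload ever cross this link?" against a 128-byte summary instead of terabytes of raw capture, paying only a small false-positive rate for the compression.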
1. Pollitt MM. Six blind men from Indostan. Digital Forensics Research Workshop (DFRWS); 2004.
2. Nance, Hay, Bishop. Digital Forensics: Defining a Research Agenda. 2009; 978-0-7695-3450-3/09 IEEE.
4. Biggs, Vidalis. Cloud Computing Storms. IJICR Vol 1, Issue 1, March 2010.
5. Gartner. 2008. Tough questions: Gartner tallies up seven cloud-computing security risks.
6. Peter Schlampp, VP Marketing and Product Management, and Alan Hall, Director Marketing, Solera Networks.
7. Shanmugasundaram K. ForNet: A Distributed Forensic Network. Project ForNet, NYU Polytechnic University.