Cable giant Belden set to add Tripwire

January 22nd, 2015 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

Belden recently announced its plan to acquire security vendor Tripwire for $710m in cash from Thoma Bravo, giving the more-than-100-year-old company, known primarily for signal transmission and cabling, a foothold in the information security space. That $710m figure values Tripwire at nearly five times sales, a significant markup over the roughly $225m Thoma Bravo paid in 2011.


Tripwire is probably best known for its first major product, file integrity monitoring (FIM), a technology implemented by 32% of enterprises, according to the 2014 Information Security Study. Tripwire is the de facto leader in the space, at around 15% share, with 1% greenfield growth among new customers projected in the next six months. Six percent (6%) of existing customers spent more on FIM in 2014 than in 2013, and 2015 is projected to bring more of the same. Tripwire has also entered other security product areas, notably vulnerability management with the 2013 acquisition of nCircle.

Quotes from the 2014 Wave 17 Information Security Study on FIM included the following:

  • “[FIM is] only in the Unix environment for SOX controls. The Windows environment doesn’t have critical applications.” – LE, Other
  • “Not doing any of that [FIM] anymore. [Why stopped?] It was required for PCI, and we’re not doing that anymore. Partially because our IS guys couldn’t figure out [how] to implement. File integrity is very hard.” – LE, Telecom/Technology
  • “From PCI DSS, the file integrity monitoring and endpoint logging are beyond our environment. They also want everything encrypted.” – MSE, Other
  • “Functionality gaps. We need a better way to do file integrity monitoring.” – LE, Financial Services
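At its core, FIM works by hashing monitored files into a trusted baseline and periodically re-hashing them to detect drift. A minimal sketch of that mechanism follows (an illustration only, not Tripwire's implementation; production tools also watch permissions, ownership and, on Windows, registry state):

```python
import hashlib
import os

def hash_file(path, algo="sha256"):
    """Return the hex digest of a file's contents, read in chunks."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(paths):
    """Record a trusted snapshot: path -> content hash."""
    return {p: hash_file(p) for p in paths}

def check_integrity(baseline):
    """Re-hash each file and report anything missing or modified."""
    findings = []
    for path, expected in baseline.items():
        if not os.path.exists(path):
            findings.append((path, "missing"))
        elif hash_file(path) != expected:
            findings.append((path, "modified"))
    return findings
```

The hard part in practice, as the quotes above suggest, is not the hashing but deciding which files to baseline and triaging the resulting change alerts.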

Why efficiency is a storage project on its own: A mid-study peek at storage budgets

January 20th, 2015 by AnnMarie Jordan

Marco Coulter, Research Director for Storage

The average storage budget has been declining since 2011, driven by decreases in enterprises with the largest storage spending. Understanding the best practices that enterprises are applying to address tighter budgets is critical for enterprises themselves and for the vendors that sell to them.

The chart below shows that the percentage change since 2011 is greater for midsize enterprises, which may feel more pain from the larger comparative swing; however, the bigger budgets of large enterprises are overwhelmingly the source of the drop in average budgets here. Mid-study data shows this trend continuing in 2014.

Tighter budgets were first addressed with technology. Optimization tools like automated tiering, de-duplication and thin provisioning helped ease the burden. As their benefits begin to exhaust themselves, the search continues, increasing the focus on ‘efficiency and cost reduction’ as a standalone project: it grew from 1% to 6% of selections, making it ninth of the top 10 storage projects.


Two new entries in the top 10 pain points reflect different aspects of efficiency. Unsurprisingly, budget pressure arrives in the top 10 and receives about as many selections as meeting business provisioning expectations, reflecting that efficiency needs to be considered not only around budget, but also around service delivery. Staying with pain points: as budgets decrease, we also see the ‘high cost of storage’ fall 7 percentage points to 14%. This is a matter of perception. With incumbent architectures, products and services, storage professionals had a real sense in the past few years that they were paying a lot for storage capacity and not getting full value. Yet costs have come down as vendors realized the money was not in the budget.

Interviews reveal four best practices: seeking technologies that better enable optimization; seeking new maintenance models to reduce operational costs; seeking commodification of hardware especially for cold storage; and seeking integrated offerings that simplify the management and delivery of storage services.

  • “We expect a flat storage budget going forward because of the increased storage technology options like thin provisioning.” – LE, Healthcare/Pharmaceuticals
  • “We’d like to have all storage go to de-dupe with flash up front. We do have de-dupe backup storage and some flash. It’s a budget and timing thing.” – LE, Services: Business/Accounting/Engineering
  • “A lot of replacing of full-time staff with FT temps. Some support satellite offices. The goal was to reduce the budget costs of FTE.” – LE, Financial Services
  • “Budgets are decreasing; no one wants to spend money.” – LE, Healthcare/Pharmaceuticals
  • “The business wants us to have a flat IT budget.” – LE, Industrial/Manufacturing
  • “Over a year ago we replaced all our storage, from mid-tier to top-tier storage. We’re at a good foundation, but our three-year estimate is already blown through. Within a year, we’re back to re-evaluating and budget requirements for new purchases.” – LE, Telecom/Technology
  • “Seeing large demand without an expanding budget.” – LE, Financial Services

Server budgets spiked in 2014, but are expected to plateau in 2015

January 16th, 2015 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

Server and virtualization budgets had a standout year in 2014, with the median and average reaching their highest levels in the past four years, but spending is expected to flatten going into 2015. Our recently completed Wave 14 Servers and Virtualization Study shows that the average budget skyrocketed from $15.1m in 2013 to $36.6m in 2014, and the median climbed from $2m to $3.5m over the same period. The financial services vertical recorded the highest average and median budgets, at $85.8m and $14.5m, respectively.

Telecom/technology had the second-highest average of $70.8m, while industrial/manufacturing and transportation were the two runners-up, with the highest medians of $8.2m and $8m, respectively.

Although spending came off a strong base in 2014, following refresh projects and an economy starting to emerge from recession, it is expected to plateau in 2015. The percentage of respondents projecting budget increases fell from 49% in 2014 to 41% in 2015, and those with funding cuts planned increased from 17% to 21%. Cost/budget remains the most common pain point that large and midsize enterprises have to tackle. Flattening spending allocations show that 2015 will be quite different from 2014, putting additional pressure on vendors going forward.


Study respondents had the following commentary about budgets and spending at their organizations:

  • “Still waiting on the budget for 2015. Had a big spend [2014] in VMware and blades and chassis.” – MSE, Education
  • “We’re investing in a converged infrastructure so this is a big increase this [2014] year.” – LE, Telecom/Technology
  • “We lease all of our server infrastructure. One-third of budget is capital spending. Had to refresh our routers which are not leased.” – MSE, Financial Services
  • “We identified multiple ways of using the existing infrastructure to reduce spending.” – LE, Financial Services
  • “We are doing a massive datacenter consolidation from 12 to 3.” – LE, Financial Services

Riverbed is taken private at $3.6bn

January 14th, 2015 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

Riverbed Technology has revealed plans to go private in a sale to private equity firm Thoma Bravo for $3.6bn. The company had previously been under pressure from activist investment firm Elliott Management, which had attempted to acquire Riverbed in February. The move allows Riverbed to continue diversifying its product offering beyond WAN optimization (WANOp) without the quarter-to-quarter pressure of a publicly traded company.

Riverbed saw reasonably positive spending intentions among its current customers this year, according to the 2014 Networking Study – 45% increased spending from 2013 levels, against only 15% decreasing spending. Next year doesn’t project as positive but remains reasonable, as 29% are planning to increase spending, against 18% decreasing their spend. Meanwhile, Riverbed has seen increased vulnerability to customer loss over previous years, with 25% of customers participating in the latest study saying they ‘might’ switch away from the platform. By contrast, just 4% of respondents last year said they were definitely switching. Silver Peak and Cisco were named as the most likely alternatives for those customers looking elsewhere.


WANOp continues to be a near-even split between being implemented as a point solution (38% of respondent enterprises) for some specific application or link latency problem and being a network-wide rollout (36%). The technology works via techniques including caching, de-duplication, compression and traffic shaping. Riverbed is the clear enterprise leader in the technology, according to the 11th Networking Study, with a significant market-share lead over Cisco and Blue Coat. In the study, 57% of users report having a WANOp solution in place, but future growth is somewhat muted, since only about 3% of companies currently without WANOp have implementation plans in the next six months. Riverbed has also seen growth via acquisition in network management, for example, taking third place this year in network-based application monitoring behind SolarWinds and Cisco.
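Two of those techniques can be illustrated together in a toy sketch: chunk-level de-duplication (send a short fingerprint instead of data the far end has already seen) plus compression of any new chunks. This is an illustration only, not how any vendor's appliance is built; real WANOp products use content-defined chunking, synchronized byte caches on both ends of the link, and protocol-specific optimizations:

```python
import hashlib
import zlib

def dedupe_and_compress(data, chunk_size=4096, cache=None):
    """Encode data for transfer over a constrained WAN link.

    Chunks already in the cache (i.e., already seen by the far end)
    are replaced with a short fingerprint reference; new chunks are
    compressed and sent in full.
    """
    cache = cache if cache is not None else set()
    stream = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha1(chunk).digest()
        if digest in cache:
            # Duplicate chunk: send only its 20-byte fingerprint.
            stream.append(("ref", digest))
        else:
            cache.add(digest)
            stream.append(("data", zlib.compress(chunk)))
    return stream, cache
</imports>

For repetitive traffic (file shares, backups, software distribution), most chunks resolve to cheap references, which is where the bandwidth savings come from; traffic shaping and latency mitigation address the rest of the problem.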

  • Acceleration equipment – “Using Riverbed, and it’s painful because as they try to move up the stack, they become more visible to the application.” – LE, Services: Business/Accounting/Engineering
  • “Not sure about its [Riverbed’s] longevity. Because to get the part of it that will do what we really want it to do is another capital expenditure. The true visibility that we really need. We’ve got half the picture, but it can’t give us the stats and the granular control that we’d like.” – LE, Consumer Goods/Retail
  • “We removed our WAN optimization hardware several years ago – Riverbed. We had some problems with it, and it was not failing appropriately. We were losing connectivity to sites, having a bunch of support issues with the devices. In the end, it wasn’t worth it at that point.” – LE, Financial Services
  • “They [Riverbed] seem to be on a roll. [We had been] thinking of them just as a network performance acceleration WAN management tool, but they bought a number of tools we use – OPNET, Clarus software, [and now] rather than being prey for somebody like Cisco, they seem to be the pursuer and getting bigger.” – LE, Industrial/Manufacturing
  • “Riverbed delivers products very quickly. The tech support is second to none, and the products work as advertised. They do require more care and feeding, but once up, they are great. The product is expensive, and they are not willing to negotiate unless you want a huge deal. They are strong at their core, but their peripheral is problematic.” – LE, Services: Business/Accounting/Engineering


What provider type will host the majority of workloads in the future remains to be seen

January 12th, 2015 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

Enterprises have many options when it comes to infrastructure-oriented external cloud, and while online service providers dominate today, it remains to be seen which provider types will host the majority of workloads in the future. The reign of large online service providers, most frequently characterized by Amazon Web Services, is evident in infrastructure-oriented (non-SaaS) external cloud, with 44% of large and midsize enterprises using their services today and 52% expecting to do the same two years from now. However, a wide diversity of other provider types is vying for attention, with telecom providers, hosting providers, system integrators and datacenter operators each expected to be in use by 15-20% of enterprises in two years.

Given that only 8% of enterprise workloads will be in non-SaaS public cloud and 18% in any kind of public cloud by 2016, it is too early to pick which provider types will dominate in the future. The opportunity for hosting enterprise workloads is shifting to a diverse set of potential providers, setting the scene for increased competition and eventual vendor consolidation, mergers and acquisitions. Even when decision-makers are intrigued by the technology, the size of the vendor can be a major roadblock to adoption. At this point, only AWS, Microsoft, IBM and Verizon have crossed the 5% enterprise adoption level in infrastructure-oriented non-SaaS external cloud services.


Study respondents had the following commentary about infrastructure-oriented external cloud services:

  • “Some small companies are potentially interesting, but they’re companies we wouldn’t do business with, too small. Wouldn’t invest a large amount in.” – LE, Industrial/Manufacturing
  • “The big trusted companies are still the ones most likely to get traction in the cloud market. A small company with a cool idea; selling a cloud service is a way tougher sell. At least some distrust at decision-making about cloud, more likely to trust a larger company.” – LE, Telecom/Technology
  • “Things could change in two years, but it’s hard to know how and which of the vendors we might be doing business with.” – MSE, Consumer Goods/Retail
  • “In two years’ time frame, may be moving to a storage backup provider to eliminate tape.” – MSE, Financial Services

Existing clients rate HDS higher in 2014

January 9th, 2015 by AnnMarie Jordan

Marco Coulter, Research Director for Storage

The importance of roadmaps to storage professionals helps explain the gap between HDS’s leading customer-assessment scores and its lagging market share. The indexes on 451 Research’s Market Window come from enterprises with that vendor in production, which self-select to evaluate the company. In the chart below, HDS shows improvement on both the promise and fulfillment indexes since 2013. Movement is not the whole story, as position against the industry average must also be considered.

The fulfillment index tracks the experience of actually using a vendor’s product, based on average scores in a number of categories. HDS’s best 2014 average, 4.7, is for product reliability once implemented; the second highest, 4.6, is for out-of-the-box product quality. Despite falling slightly from last year, both scores remained higher than those of other array vendors. Other signs of a well-regarded product include technical support, which saw the greatest increase, rising to 4.1 from 3.7, along with slight improvements in delivery-as-promised and interoperability scores.

Good fulfillment experiences have made HDS less vulnerable: it received the fewest selections for replacement consideration since early 2010. The few considering switching away from HDS most commonly cited cost.


Being a client of HDS is not all sunshine and roses. Storage professionals want to invest in a long-term, innovative relationship with vendors, and describe HDS as ‘fuzzy’ when it comes to strategic vision. As one storage pro from a large telecom/technology enterprise said, “Their products are totally reliable and rock solid. Their downfall is their strategic vision and innovation…. EMC is showing us a roadmap that is making us more comfortable.” Despite this goodwill, enterprises remain hesitant with HDS.

The promise index tracks perceptions of the vendor’s future soundness and technology roadmap, and the ratings from enterprise pros position HDS below the storage-industry average. This index is driven by scores such as technical innovation, strategic vision and competitive positioning, where HDS received three of its lowest averages in 2014. Despite these concerns, 14% of those reviewing HDS had switched to the vendor in the last 12 months, the majority replacing EMC capacity.

  • “HDS makes a rock-solid, reliable product that works as advertised. I think that they may be trying to be too much to too many. I think their strategic vision is a little fuzzy.” – LE, Telecom/Technology
  • “Value [would lead me to switch]. I like HDS better, but NetApp E series is cost effective, especially for block storage.” – LE, Education
  • “HDS really has to do better with price performance.” – LE, Telecom/Technology
  • “HDS is a solid vendor and the front-end interface for commonality is excellent. Time to market is a weakness; it takes them much longer than they promise. I think this is due to their technology changing.” – LE, Industrial/Manufacturing
  • “I would like to see HDS do a better job on their tech support. There are too many hoops you have to jump through until you get to the right person.” – LE, Telecom/Technology
  • “They have a well-engineered product, but they can be slow to market and innovation. They need to do a better job getting their message out. I think it is their Japanese tradition.” – LE, Public Sector
  • “Improve on their marketing. I don’t see them marketing big time like EMC or IBM or HP type people. I don’t see a conference or anything like that, unlike other vendors.” – LE, Services: Business/Accounting/Engineering

Hard drives and laptops still dominate encryption usage

January 7th, 2015 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

Previous waves of the Information Security Study tracked the use of cryptography as part of enterprise security strategy under two categories: hard drive and laptop encryption. With laptop encryption in use at 80% of enterprises in 2013, three providers dominated the technology: Microsoft (BitLocker), Intel (via McAfee’s prior acquisition of SafeBoot) and Symantec (which acquired PGP and its whole-disk encryption product). This year’s study tracks encryption under a more general ‘encryption’ category, but the top vendors remain the same.


This is largely because the 82% of enterprises with encryption technologies in use are still mostly referring to encrypting laptops and hard drives, with email also garnering a significant percentage of responses. There is little surprise in this data, because much of the attention around drive encryption has centered on its role as a mechanism that lets organizations bypass many state laws requiring breach notification to affected customers. Lost laptops then require only the cost of replacing the computer, rather than the more expansive costs of notifying customers whose data may have been on the lost machine. Nineteen percent (19%) of interviewees in the study reported spending more on encryption products in 2014; 18% project spending more in 2015.

Quotes from security managers on their use of encryption taken from the 17th Information Security Study included the following:

  • “Encryption, trying to do, establishing data classifications; trying to figure out how to encrypt certain data classifications; that’s been real interesting.” – LE, Consumer Goods/Retail
  • “Whole disk encryption [is a top project].” – LE, Education
  • “Unstructured data file repository encryption.” – LE, Energy/Utilities
  • “Expanded use of data at rest encryption.” – LE, Industrial/Manufacturing
  • “Updating our POS systems, adding tokenization and encryption for card transactions.” – MSE, Consumer Goods/Retail

The Converged Infrastructure Ball: May I have the next dance?

January 5th, 2015 by AnnMarie Jordan

Peter ffoulkes, Research Director for Servers and Cloud Computing

In case you hadn’t noticed, the IT world is going through a sea change, arguably the largest to occur in 30 years or so. Intel architecture-based servers run the majority of enterprise workloads, virtualized architectures dominate, and cloud-based delivery models – private, public, hybrid – are considered to be the way of the future. Although some kind of ‘software-defined infrastructure’ is the architectural vision of the future, the underlying hardware platforms – servers, network and storage – need to be able to support the required integration and cooperate seamlessly.

Converged infrastructure gathers momentum

Cisco shook up the server-vendor market with the introduction of its UCS platform in 2009, a well-designed and well-accepted platform for virtualized workloads that integrated x86-based servers with networking capabilities. Perhaps Cisco’s best move at the time was to remain storage-agnostic and partner with the market’s leading enterprise storage vendors: EMC, NetApp and HDS. As a strategy, it served Cisco well. In particular, enterprise customers liked the multi-vendor best-of-breed approach over the single-vendor solutions perhaps best characterized by offerings from HP and IBM.

Over the intervening period, converged infrastructure offerings have gained traction and are cited as in use by 25% of respondents, with 18% planning first-time use in the next two-plus years and 21% planning increased spending. From the vendor perspective, the multi-vendor alliance faction (VCE, Cisco, NetApp, EMC, etc.) far outstrips the single-vendor approaches, which are led by HP, with Dell, IBM and Oracle a considerable distance behind.


A change of tune and a change of partners

A few years ago, during the financial meltdown, there was lots of talk about banks that were too big to fail. Now we have large IT vendors that are too big to succeed. HP is splitting itself asunder; IBM has sold its Intel-based server division to Lenovo. The CEOs of Cisco, EMC and Oracle are all involved in succession planning. We are witnessing the passing of the old order, and with that comes change and a shift to new alliances.

NetApp is introducing a new VMware-spec-based Integrated EVO:RAIL offering that muddies the positioning of its FlexPod offering. Cisco and IBM have added to the multi-vendor community with the introduction of the new converged ‘VersaStack Integrated Solution’ offering that combines Cisco UCS with IBM’s Storwize V7000 storage system. Is this the beginning of Lenovo’s ‘Winter of Discontent,’ or is Lenovo just planning a glorious summer with a whole new set of beach-party goers for future converged offerings that do not include IBM? One is left wondering whether PureSystems will be quite so pure going forward.

To add to the mix, a plethora of hyperconverged vendors is beginning to gain attention, including Nutanix, SimpliVity and other new contenders to the throne. Against this background of unrest, anecdotal commentary illustrates the risk-averse tendencies of TheInfoPro’s respondent community:

  • “Two years ago we put in IBM’s PureFlex, but backed off because of the selling of pieces of their business to Lenovo. Not so much that camp.” – MSE, Public Sector
  • “It’s hard to say where we’re headed in this category. We always watch the converged offerings, but aren’t sure when and how to move forward.” – LE, Consumer Goods/Retail
  • “Leaning toward FlexPod over Vblock because of cost and customizability.” – LE, Energy/Utilities
  • “Both primary vendors are fighting for market share (EMC and HP), and so there are compatibility issues.” – LE, Consumer Goods/Retail
  • “We think it is cool technology with a flawed business model. With current converged infrastructure, you are stuck with vendor lock-in.” – LE, Financial Services

Only half of enterprise workloads will be in any kind of cloud two years from now

January 2nd, 2015 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

In 2014, 75% of enterprise workloads resided in noncloud environments, and only 50% are expected to be in any kind of cloud in two years, with internal private cloud leading the way. Internal, on-premises private cloud is expected to account for 23% of all workloads in 2016, up from 12% in 2014. Over the same period, the use of public cloud is expected to double, but is not expected to exceed 20% of all large and midsize enterprise workloads and applications. Hybrid cloud, which we define as two or more distinct cloud infrastructures that actively interoperate to deliver seamless business functions, is harder to achieve than some of the other delivery mechanisms.

Non-IT roadblocks continue to account for the lion’s share of enterprise roadblocks in reaching the next phase of cloud computing initiatives, but there is light at the end of this tunnel. The percentages for both IT- and non-IT-related roadblocks went down between 2H 2013 and 1H 2014. Furthermore, the majority of large and midsize enterprises have moved past the virtualization phase in the evolution of their cloud-enabling internal infrastructure. Some applications, such as cloud-native and collaborative apps for public cloud, are already thought of as prime candidates for certain cloud-based execution venues.

Enterprise IT ecosystems are changing fast, and with 2015 around the corner, it is a good time for decision-makers to start budgeting and considering what deployment methods may be optimal for certain workloads and applications in the future. A good place to start is our Cloud Computing Metrics Wave 7 report, which provides a breakdown of the reasons behind large and midsize enterprises using specific deployment methods for certain categories of workloads and business functions in the next two years.


Study respondents had the following commentary about the use of different digital infrastructure deployment methods:

  • “Our developers want the flexibility to choose different options. We got to continue to push that point.” – LE, Financial Services
  • “I think it will evolve. A lot of it isn’t planned necessarily by us; it’s client-driven. When they see savings. We will be running Microsoft Office 365.” – LE, Healthcare/Pharmaceuticals
  • “We tend to push for on-prem vs. public cloud. That can be determined as time goes on and as cloud becomes more clear.” – MSE, Healthcare/Pharmaceuticals
  • “We have stuff in Azure and Amazon; people did their own thing. They’re in the [public] cloud because we suck.” – LE, Other

The OpenStack standard rises in importance for network managers

December 29th, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

OpenStack, in essence an open source infrastructure-as-a-service (IaaS) framework comprising some 13 subcomponents written in Python, has continued to rise in importance among network managers as a differentiator when selecting between cloud service providers.

The joint project of Rackspace and NASA got its start in 2010 as a mechanism for enabling cloud computing on existing standard hardware, and has grown to include more than 200 companies (although NASA did drop out). The nonprofit running the project, the OpenStack Foundation, includes directors from Cisco, HP, IBM, Red Hat and Intel.


No other cloud standard rose to the level of importance of OpenStack in evaluating cloud providers; only the Cloud Security Alliance guidelines (referred to as the Cloud Controls Matrix) garnered a mention, at 5% of responses. Among network managers interviewed for the 2014 study, 76% noted OpenStack usage as important for service providers to have – likely a reaction to the need for both transparency and portability when it comes to deploying applications in the cloud. This is up from 61% saying OpenStack was an important selection criterion among service providers last year, and those calling it unimportant this year are down to only about one-quarter of respondents.

The commentary on the cloud, SDN and the OpenStack standard that was provided by network managers ran the gamut:

  • “Keep simplifying. Looking at SDN, but it’s just a joke. Because it has massive physics and administrative problems. Who’s gonna program all this s—? And what dumbass thought it was a good idea to make it a framework and not a solution? You’ll need developers in the datacenters. And they’re not connecting the other services you need, like IPAM, all that s—. Are you gonna build that statically? I love the idea of SDN. It’s pie in the sky. It’s the ‘yeah, buts.’ Juniper’s SDN only runs on OpenStack, but OpenStack’s a piece of s—. Already bifurcating. Juniper stuff can’t work with VMware. VMware s— can only go 100 kilometers. That’s useless. It’s like all the fine print.” – LE, Other
  • “In use today. We have OpenStack internally – Cisco Insieme is friggin’ fantastic; they have it nailed, as long as it works, will be amazing, will drive capabilities more than anything else in the last 10 years. Insieme works, is wholly owned by Cisco. When you say mission critical, you need to be sure the app is built in a way that’s safe to run in Amazon. It depends on the app. I wouldn’t put SAP on Amazon. [The application] needs to be architected in a way that tolerates failures.” – LE, Telecom/Technology
  • “We want to go with the industry standards for flexibility. So we would look to OpenStack as one of those standards. We are still new to the arena.” – LE, Services: Business/Accounting/Engineering
  • “OpenStack is winning. Absolutely important for providers to adhere to it. In 2013, OpenStack is what Linux was in 1997. Different vendors are figuring out different parts of this ecosystem, how to make money from OpenStack. Not one product, one vendor, one thing, it’s a foundational technology that people are exploiting in different ways. It’s being sustained by a large number of companies with different and competing interests, out of which is distilled a good product.” – LE, Telecom/Technology