Buy-in/resistance to change is top non-IT roadblock to cloud adoption

August 1st, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

Non-IT roadblocks continue to dominate the large- and midsize-enterprise journey to a cloud-ready infrastructure, with buy-in/resistance to change becoming the most common inhibitor in 2H 2013. Non-IT roadblocks were reported by 71% of enterprises in 2H 2013, while IT-related roadblocks were common at only 19%. In addition, 18% of respondents experienced no roadblocks in reaching the next phase of cloud computing initiatives. There have been considerable changes in the composition of top non-IT roadblocks, catalyzing the need for change management initiatives.

Concerns over buy-in/resistance to change have ballooned to the top, replacing organization/budgetary inhibitors as the top non-IT roadblock, with citations increasing from 16% in 1H 2013 to 37% in 2H 2013. Organization/budget-related issues dropped to fourth place, falling from 37% to 15% of respondents between the studies, consistent with more enterprises reporting the shift of cloud-specific budgets into core enterprise IT spending. Issues associated with vendor selection/offerings/cost models gathered the second-most citations, appearing on the list for the first time at 19% and demonstrating the difficulty of navigating between different offerings. Fortunately for decision-makers, 451 Research has recently developed a Cloud Pricing Codex that guides readers through the different pricing models on the market. People/time-related challenges climbed to third place, growing from 10% of citations to 18%. Data management/control and internal organizational issues were notable new entrants to the list of top non-IT roadblocks to cloud adoption, with 6% and 5% of citations, respectively.


As buy-in/resistance to change climbs atop the list of the most common non-IT roadblocks at large and midsize enterprises, it is important to have adequate change management mechanisms in place. For example, establishing a proper communication channel to manage employee expectations and cultural change is often required for integrating a cross-functional team structure into a service-oriented IT model. Change management is important not only with internal stakeholders, but also with external ones. Monitoring non-IT-related roadblocks to cloud adoption is not easy, but doing so can prove to be a key step in the transition.

TheInfoPro’s respondents had the following commentary about non-IT roadblocks at their enterprises:

  • “Different vendors have different approaches and cost models – we need to sort through these.” – LE, Healthcare/Pharmaceuticals
  • “Client buy-in. Does the data belong in the cloud? Some data will never go in the cloud, such as patient information.” – LE, Healthcare/Pharmaceuticals
  • “Mostly around education. Business leaders need to be brought more closely to this. To explain how it works, what it does. Lip service paid to it, not really well understood yet.” – LE, Consumer Goods/Retail
  • “It’s a change – we went from traditional IT shop managing everything to cloud. We didn’t realize there’s not the talent pool out there for vendor management.” – LE, Consumer Goods/Retail
  • “Cultural, that’s a big one. Understanding how roles and what changes within organizational structures.” – LE, Financial Services

Network monitoring, or lack thereof, is a pain

July 30th, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

The top source of network managers’ pain in the recently completed Wave 11 Networking Study is network monitoring: 19% of network managers cited it as one of their top pain points. This is up from 13% mentioning it in the previous study. Measured largely on the availability levels they can maintain for the enterprise, network managers cited a need for tools that offer network transparency amid increasing complexity, quick identification of the source of problems, and a ‘single pane of glass’ view of their networks.


Aging hardware took second place as a top source of pain, up to 16% from 9% of citations last year, indicating that for some enterprises, networks may be in line for an equipment refresh. The greatest drop-offs in sources of pain occurred for security (down to 14% of citations as a top pain point, from 22% last year), capacity (at 13%, down from 19% last year) and wireless (dropping from 12% last year to 6% this year).

Network managers had the following to say about their top sources of pain:

  • “One of the challenges today is monitoring, visibility. Basically, today it’s not enough that you say it’s not the network [that's a problem]. It’s how do you fix the problem? It’s the end-to-end visibility; we’re struggling with coming up with a set of tools. We have tools, server guys have their own tools, [I have my tools], I say it’s not my problem. [We need] something that will stitch up whether the problem is the server or the network or something, a pane of glass that will guide you [to] what services to look at [to fix the problem].” – LE, Healthcare/Pharmaceuticals
  • “Complexity of the environment in general. I’ve never seen anything like it in the last 17 years. To support it we are well beyond, we can’t even hire people without 10 years’ experience in the support world. With the new technologies from Cisco, only a few people are really able to get their hands on it, not a bunch of dissemination, not like the project is handed off to support folks that have to support it. It’s pretty tough. Up until five years ago, the majority of technology was well known, understood, and widely used the same way. But virtualization, new MPLS technologies, they were more on the cutting edge back then, now becoming like mainstream. Not enough knowledge base, expertise in the engineers to support some of these decisions.” – LE, Financial Services
  • “Just using networking monitoring tools, getting the most out of it, something we wish we could do. It takes a little bit of time for us to really get a good grasp of all the things it can do. SNMP is new for us, slowly getting our way into learning as much as we can, finding without formal training it’s a bit of a chore.” – MSE, Transportation

Ping Identity breaks into top five for single-sign-on implementations

July 28th, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

Enterprises continue to streamline authentication procedures with single-sign-on (SSO) implementations, and Ping Identity is getting a piece of the action with a newly acquired place in the group of top five vendors for implementations. SSO technologies provide users with a single login experience to multiple resources, with the goal of reducing the cost of management, support and day-to-day operations. The majority of large and midsize enterprises have taken advantage of SSO technologies, as in-use implementations have climbed from 49% in 2H 2012 to 55% in 2H 2013.

In addition to the 55% of enterprises that have already implemented the technology, about 12% report in-pilot/evaluation, near-term, long-term and past-long-term projects – a pipeline Ping may use to continue its progress in enterprise implementations. Only a third of large and midsize enterprises are not considering SSO technology. Spending intentions look positive for SSO in 2014, with 24% of decision-makers planning to increase spending, compared to 18% in 2013, and only 2% planning to curb spending. In 2H 2013, Ping gathered the same number of citations for implementation as CA Tech, allowing Ping to move past IBM into the group of top five vendors for in-use cases. Ping continues to show more future project plans than CA Tech, which would allow it to continue rising within the group of top five in-use vendors upon successful conversion of those plans into implementations.

Ping is not the only vendor gaining traction in the SSO space – Microsoft has strengthened its lead between the studies, while Oracle moved up to second place in implementations. Microsoft increased its citations for in-use cases from 8% to 11% and widened the gap separating it from the runners-up. Oracle became the new runner-up to Microsoft in 2H 2013 as its implementations grew, unlike homegrown alternatives, which saw a slight decline in in-use cases, pushing them to third place. Ping has several options available to build on its established traction in the SSO space, including an IPO. Last March, the company hired a new CFO who comes from a public company, while its founder and CEO, Andre Durand, said Ping may go public as early as 2014. More details on Ping’s potential future moves are available in a recently published report by 451 Research titled ‘2014 M&A Outlook – Enterprise security’ (client login required).


The following anecdotes were provided by survey respondents about Ping Identity and other SSO technologies:

  • “[Exciting vendor:] Ping Identity – we’ve been trying to do all the IDM ourselves, but it’s been burning a lot of energy. We want to outsource all the identity federation with cloud providers to them. We want to provide SSO for cloud apps, and that’s where Ping is very strong.” – LE, Consumer Goods/Retail
  • “[Exciting vendor:] Ping Identity/PingFederate – extremely simple to use. I can have a federated link up and running in seven minutes. It’s a bit rudimentary, but so are the open tools, and this is even easier to use.” – LE, Other
  • “Bought it [SSO] a couple years ago, but not yet rolled out.” – LE, Consumer Goods/Retail
  • “SSO for corporate apps only.” – LE, Industrial/Manufacturing

Is it time for IBM to say ‘Goodbye to the Machine’?

July 25th, 2014 by AnnMarie Jordan

Peter ffoulkes, Research Director for Servers and Cloud Computing

IBM is scheduled to report its Q2 2014 earnings a week from now, on July 17. For the past several quarters, the company’s systems and technology revenue has been in a steep decline, ending at -23% year-over-year last quarter and representing a mere 11% of IBM’s overall revenue. As part of an attempted company turnaround, IBM is on track to sell its System x division to Lenovo, which, although hemorrhaging revenue at double-digit rates, is faring better than either the Power Systems or System z divisions. Current speculation is that IBM will also shed its microelectronics division and rely on the OpenPOWER initiative to develop the technology for its proprietary product lines. Historically, similar initiatives have met with limited success.

Unless IBM can achieve a dramatic turnaround in its hardware-related businesses, it might be appropriate for the company to consider a rebranding exercise to position itself as something that more closely relates to its future direction, such as International Business Services or International Business Cloud, although available and suitable three-letter acronyms are hard to come by these days.

IBM server customers plan less spending in 2014

In our most recent Servers and Virtualization Study, 36% of respondents who chose to evaluate IBM planned to spend less with the company in 2014, a 13-percentage-point increase over the prior year. This should be of particular concern to IBM, because the vast majority of workloads in modern datacenters and ‘the cloud’ run on x86 architectures, and IBM’s System x customers express significantly less loyalty than those who use IBM proprietary technologies. Looking forward, this does not bode well for Lenovo’s prospects of becoming a major x86 server vendor in North America. The Q2 2014 earnings results will indeed prove interesting.


IBM customers have a general tendency to be loyal, especially those who use IBM proprietary technologies. However, anecdotal commentary illustrates that TheInfoPro’s respondent community is showing increasing concern about support for the dominant server technologies that they use to run their critical business workloads:

  • “The main reason we are switching to HP is for their blade technology. Switching from IBM due to support problems.” – LE, Consumer Goods/Retail
  • “If they sell out to Lenovo [I’ll consider switching].” – LE, Industrial/Manufacturing
  • “Cost, support and loss of their competitive edge.” – LE, Telecom/Technology
  • “IBM’s X series response to service has been very poor. Production was down for long periods of time.” – LE, Consumer Goods/Retail
  • “I am concerned about them selling their portfolio to Lenovo.” – LE, Financial Services
  • “IBM really missed the boat with the next generation of blades. PureFlex was their answer, but it is a ‘Frankenstein’ system. They only looked at the hardware. They never considered the ease of use. Overall, IBM still has great recognition, but this can be a weakness. They rely on that a bit too much; they need to be hungrier when it comes to new products.” – LE, Telecom/Technology

Back to the future (of converged infrastructure)

July 23rd, 2014 by AnnMarie Jordan

Marco Coulter, Research Director for Storage

Running detailed research since 2002 allows us to identify recurring industry patterns and trends that newcomers miss, and also to reminisce a little. This post looks back at our prior storage research to better understand some aspects of today’s technology. (The original 2002 report is outside our paywall here.)

Sometimes, you just have to laugh. In 2002, we considered ‘Over 50TB’ the highest category needed for SAN capacity! We were right – the energy/utilities vertical had the highest count, with only 40% of respondents falling into this category. Industry veterans may recall some of the companies mentioned that have since disappeared or now exist only in stock tickers, including Compaq, Legato, Veritas, StoreAge, BlueArc and StorageTek. If you have a tchotchke with the brand of one of the departed companies, tweet a photo with a #TIPflashback tag so we can see it too.

The real point in looking back is to acknowledge the issues we solved, and spot those that may recur. Consider the chart below.


Twelve years ago, managing growth was already the leading pain point in storage, and familiar ones around scalability and migration were only just arriving. But in 2002, we were still taking the islands of DAS storage capacity attached to specific compute environments and merging them into sharable SAN and NAS environments. See the 4% selecting ‘internal selling of the SAN concept,’ the 5% ‘developing storage networking strategy’ and the 12% wanting to ‘consolidate existing capacity.’ We were well aware of the capacity lying fallow in silos across the enterprise, and of the amount of duplicated technology appearing in those silos. We were determined to merge those ‘islands’ into a capacity ‘continent.’ As an industry, we succeeded.

These sorts of pain points could return if storage is included in converged infrastructure procurement. Certified environments to help avoid interoperability issues make a lot of sense. But do we want to buy storage capacity at the same time, and in proportion to compute capacity? It is hard enough to predict future capacity needs. The idea of purchasing compute and storage capacity in equal proportions assumes we can successfully calculate a compute-capacity ratio for an application. Getting the ratio wrong means we end up with islands of free capacity in the wrong converged infrastructure racks … again.

Perhaps the potential waste doesn’t really matter. The cost of wasted capacity is not what it was in 2002. Few businesses would like to admit over-purchasing relative to need, but doing so is likely less expensive than reacting to exhausted capacity. Reactive additions of capacity are not as painful as they were then, either, as newer storage architectures permit disruption-free addition of capacity. There will still be waste when compute or storage grows faster than the other, requiring paired purchases when only one is really needed. (Did anyone have to buy a second Vblock when they really only needed a little more speed?) The connectivity and interoperability challenges between converged infrastructure stacks are also not what they were. Networking has standardized, and even where physical connectivity varies (e.g., InfiniBand), today’s stacks support standard protocols further up, allowing interoperation.

Still, we have spent a lot of time and effort since 2002 corralling those silos of proprietary storage into centrally managed storage plants. We know how complicated that task was. Storage pros will find it hard to let go of that concept in favor of a converged infrastructure model that includes storage capacity. Does it make sense to include storage in converged infrastructure? Convergence makes a lot of sense in transient parts of the stack like compute and network, but the permanent nature of storage is a significant difference. Try calculating the compute-capacity ratio for one of your longer-life applications and see whether compute and storage have trended equally over time.
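
That ratio check can be sketched in a few lines. This is a hypothetical illustration with invented figures, not data from the study; substitute your own application's compute and capacity history:

```python
# Hypothetical sketch: test whether compute and storage for one long-lived
# application have grown in a fixed proportion, as bundled converged-
# infrastructure purchasing assumes. All figures below are invented.

# (year, compute cores in use, storage TB in use)
history = [
    (2010, 16, 20),
    (2011, 20, 35),
    (2012, 24, 60),
    (2013, 28, 105),
]

# TB of storage per compute core, year by year
ratios = [tb / cores for _, cores, tb in history]

for (year, _, _), ratio in zip(history, ratios):
    print(f"{year}: {ratio:.2f} TB per core")

# If the ratio drifts (here storage outpaces compute), buying both in fixed
# bundles strands free capacity on one side of each converged rack.
```

In this invented example the ratio triples over four years, which is exactly the situation where proportional compute-plus-storage purchasing would recreate islands of stranded capacity.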

Yes, we have been keeping narratives to provide context around the data since 2002. Here are a few that capture that moment in time. Note that some ideas like ‘storage on demand’ have been around for a while.

  • “We have so many islands of SAN that we are looking at Director switches to combine them. We would like to meld the block level SAN access with NAS file access all behind a Fibre fabric.” – a 2002 Fortune 500 communications products company
  • “Our biggest issues are: We have so much DAS that we can’t provision where it’s needed accurately. Right now, we have islands of capacity until we get into SAN or NAS environment. We have difficulty getting vendor support in moving to a storage-on-demand model. We have no good mechanism to understand what we should charge back to users.” – a 2002 F100 consumer products company
  • “We are now consolidating on smaller group of servers, but we did a big expansion recently in the data warehouse area. We will wind up with less data overall. In North America alone, we have 450 NT servers. We should be able to get that down to 200 servers. Some servers are running at less than 1% of capacity. This happened because we weren’t monitoring servers. But now we are, as part of our overall architecture revamping. We’re now examining our cluster options and moving almost all of our capacity to SANs.” – a 2002 F500 chemical company
  • “Our biggest issue is integrating islands of DAS into a switched storage network. Also, we are trying to get economies of scale and improve our utilization rates.” – a 2002 F500 communications services company

Expect weaker network spending in 2015

July 21st, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

Network budgets continued an upward swing in 2014, according to TheInfoPro’s recently completed Wave 11 Networking Study, but the outlook for 2015 is bleaker for large enterprises. Forty-four percent of interviewed networking managers said their budget increased between 2013 and 2014. Only 29% believe they’ll see a similar increase in 2015, while those predicting a drop in budgets increased from 21% to 27%.


The average budget for the enterprises represented in the study was $18.6m, with the greatest increase going to larger enterprises, while midsize enterprises’ budgets remained largely flat. Healthcare and financial services lead the pack of industry verticals devoting the highest average dollar amount to network capital and operational expenses. For the most part, operational expenses continue to dominate all budgets; 61% on average is devoted to opex vs. 39% dedicated to capex.

Interviewees in the Wave 11 Networking Study had the following comments about their networking budgets:

  • “Switching and routing and the core. Need to update too much of it, rip and replace for a lot of it. And getting the budgeting for that stuff, oy, so many different things.” – LE, Consumer Goods/Retail
  • “Cost containment; our budgets are not getting any larger.” – LE, Education
  • “Identifying sources of certain network issues, for example finding bandwidth hogs. We are able to do it, albeit a fairly manual process to get that information. There are a lot of tools that can automate, but without much of a budget to speak of, I’m kind of forced into finding ways to do without or maybe learning how to script a lot of things we do as manual process to get the information we need.” – MSE, Transportation
  • “Finding the budget to keep up [is a pain point].” – MSE, Education

Global competition among telecom providers is emerging in cloud infrastructure services

July 17th, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

The after-effects of the NSA revelations and growing global competition are some of the factors telecom providers have to consider as they roll out and adjust their strategies in the expanding cloud infrastructure services space. Even when we exclude software as a service (SaaS), the potential size of the market opportunity is estimated to reach over $30bn by 2018, with telecom providers showing the second-highest enterprise adoption intentions after online service providers. The percentage of large and midsize enterprises using telecom providers for infrastructure-oriented cloud services is expected to increase from 40% now to 52% in two years. However, the events from last week demonstrate that some providers will experience headwinds as geopolitical factors come into play.

As a consequence of the NSA revelations, the German government announced its suspension of Verizon’s communications services for all agencies by 2015. We can expect a similar backlash for cloud infrastructure services. The political alignment will not favor US or British telecom providers, and they may need to adjust their strategies in light of recent events.

The results from the Wave 6 Cloud Computing Study show that Verizon and AT&T have the most in-use cases for cloud infrastructure services with 14% each, followed by British Telecom (BT), CenturyLink and Reliance Globalcom. When we asked our study respondents about telecom providers they plan to use for cloud infrastructure services two years from now, Verizon took the lead with 18% of selections, followed by AT&T with 15%. Picking winners is still premature at this stage as global competition is just starting to emerge in the cloud infrastructure services arena.


We expect telecom providers from different geographies and specialties to move into the cloud services space. Some telecom providers still have to determine their go-to-market strategies and models, the variables they will have to deal with, and the differentiation stories they want to pursue, while others, such as Entel from Chile and Swisscom from Switzerland, are already making considerable strides forward. Many of the telecom providers that were not ready for the move before are now jumping on the cloud infrastructure services bandwagon, as they realize that failure to do so may result in significant problems down the road. A recently published report by 451 Research, titled “To overcome cloud roadblocks, vendors need new ways to add value beyond core offerings,” discusses some specialization options telecom providers can pursue, unless they choose to join the race to the bottom against the economies of scale established by AWS and Google.

TheInfoPro’s respondents had the following commentary about telecom providers and cloud security problems:

  • “The whole NSA access into the cloud providers has had a great impact on our clients/customers and thus impacts us.” – LE, Consumer Goods/Retail
  • “[Reducing security concerns:] Third-party access is nearly impossible to mitigate. It is a pure ‘trust’ situation and cannot be technically overcome. AT&T’s vCloud hosting seems closest possible, but you still have to go through their setup/network.” – LE, Industrial/Manufacturing
  • “[Public cloud performance management/monitoring:] For our CenturyLink deployment, we do receive QoS information and metrics.” – LE, Consumer Goods/Retail

Security is the greatest pain in the cloud

July 9th, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

Cloud service cost, organizational culture, network connectivity, expertise and other pain points associated with the move to a cloud-based infrastructure all pale in comparison to the top pain point cited by those responsible for cloud-related deployments in their enterprises: security. More than a third (37%) of interviewees in the cloud study – those responsible for cloud deployments, not the security managers interviewed for the security study – cited security as their top source of pain in deployment to the cloud. While other concerns are eroding (pricing decreased from 16% to 14%; integration decreased from 18% to 12%; expertise decreased from 17% to 11%), security is growing as a cloud concern, increasing from 30% to 37% between the Wave 5 and 6 Cloud Studies.


The top cloud security issue cited by interviewees is the security and privacy of data, at 41%, with access control following at 35% and auditability rounding out the top three at 32%. Not surprisingly, ‘transparency’ and auditability are the top two ways respondents thought cloud providers could address security concerns, cited by 28% and 21% of respondents, respectively. Almost three-quarters (73%) consider the security traits of a cloud provider to be ‘extremely important,’ ahead of even regulatory compliance, cited by 63% as ‘extremely important.’ The top method of resolving compliance concerns for cloud providers, cited by 33% of respondents, was the ability to present proof of audit and compliance traits.

Commentators responsible for their enterprise’s cloud deployments had the following to say about security concerns:

  • “Perennially it’s security, how do we guarantee security. We operate with Rackspace, and they are remarkably diligent. But the last thing we’d ever want to see would be a breach in our infrastructure.” – LE, Consumer Goods/Retail
  • “Inadequate cloud transparency with too many cloud service providers (CSPs). For example, CSPs that ignore the CSA STAR (Security, Trust & Assurance Registry) and CAIQ (Consensus Assessments Initiative Questionnaire).” – LE, Consumer Goods/Retail
  • “The naive view that the cloud vendor is taking care of securing the data. It still requires our own security measures and encryption, but people assume once you buy into the cloud, it is by definition secure.” – LE, Education
  • “Security. Our security standards are fairly liberal. We use the work of the Cloud Security Alliance for evaluation. We find that the vendors don’t have the standard of care for my in-house data.” – LE, Services: Business/Accounting/Engineering

Dell’s road to the future is private and full of cloudy promise

July 7th, 2014 by AnnMarie Jordan

Peter ffoulkes, Research Director for Servers and Cloud Computing

Although we concluded our previous server study prior to Michael Dell’s successful privatization of the company he founded, the results of the survey were generally positive for Dell’s future prospects. The company maintains growing recognition as an enterprise-class server provider and is gaining recognition for its end-to-end datacenter strategy as it builds out its software offerings with well-thought-out acquisitions. On the server spending front, growth has been steadily slowing for Dell in the past few years, but it still remains stronger than for all other server vendors except Cisco. Dell servers have been increasingly gaining recognition with enterprise customers, with 10% of respondents who rated Dell reporting spending of more than $5m with the company in 2013.

Dell scores highly in the Market Window for server infrastructure

TheInfoPro’s Market Window is a graphic representation of vendors rated and actively in use by respondents compared for effectiveness at marketing (promise) and execution (fulfillment). Dell consistently scores well in the product categories, especially in value for money, coming second to Cisco overall among hardware vendors, noticeably ahead of HP and IBM. In the Market Window, Dell was rated average on the Promise Index and well above average for fulfillment, equaling Cisco’s score.


Six months after our survey was completed, Dell held its first analyst summit since its privatization, providing detailed insight into its strategy and product roadmaps, along with extensive access for 451 analysts to company executives, including Michael Dell himself.

The strategy Dell discussed is fundamentally unchanged from the one it has been pursuing for several years. Perhaps the biggest impact initially will be on its executive team – CEO Michael Dell says he got back 20% of his time through not having to deal with Wall Street, while board meetings can last as little as 10 minutes.

The 451 analyst consensus is that the company going private is not an issue for most customers, but the main difference may be in how employees and partners feel about the company. Related to this is the issue of how Dell presents itself to the outside world. Now more than ever, as a private concern, it needs a clear and strong voice, and developing this remains a work in progress. Part of the challenge remains to articulate a single vision that resonates with a broad target audience of consumers, SMEs and large enterprises. Nonetheless, the renewed energy and enthusiasm Dell talked about is in evidence among its senior executives; its challenge now is to ensure this percolates down through its 100,000-strong workforce. Dell still has to articulate exactly what it wants to be in the longer term, but the fact that it has put cloud firmly in its crosshairs suggests the company is developing an idea of where it needs to be.

Anecdotal commentary illustrates the sentiments expressed by TheInfoPro respondent community at the end of 2013:

  • “Dell has good brand and recognition. They stay in line with the Intel CPU roadmap. They maintain a business focus even though they sell retail computers. On the down side, Dell has reorganized their sales force too many times. Unless you spend a lot of money, you don’t receive a lot of personal attention.” – LE, Financial Services
  • “They’re making strides through some acquisitions that are making them more competitive.” – MSE, Public Sector
  • “Upgrades being forced on us – HP and Oracle don’t really get along, so we’re changing everything over to Dell.” – LE, Consumer Goods/Retail
  • “Migration of SAN infrastructure to Dell Compellent.” – LE, Energy/Utilities
  • “Dell has a great sales team that makes sure we are taken care of based on our needs. They do a great job specking what we need. The products, especially the x86, does a great job for us. Down side, Dell is not an innovator. They are a great provider of IT equipment. Good, solid commodity vendor. They need to continue to offer present levels of value. That will keep us happy.” – LE, Industrial/Manufacturing
  • “Dell offers incredible value for the money. The feature set is way above what we can get from any other vendor. The reliability is excellent, especially in the first five years! Weakness: their jumbled management tool philosophy. In the next two years they should have this worked out.” – LE, Consumer Goods/Retail
  • “They’re really hitting their stride.” – LE, Services: Business/Accounting/Engineering
  • “Dell’s direct sales model is compelling and a major reason we stay with them.” – LE, Consumer Goods/Retail
  • “Dell is a very cost-competitive vendor. They are well positioned, and they have a good overall vision. We stay with Dell because of their cost structure. There is no compelling reason to look elsewhere.” – LE, Materials/Chemicals

Overall enterprise IT spending shrinks faster than storage budgets

July 2nd, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

We expect 2014 to be a tough year for storage professionals and vendors as budgets continue to tighten. It is not just storage, though – overall IT budgets are shrinking even faster, contributing to a larger proportion of storage in average IT spending. Despite tightening storage budgets in the past two years, our study participants experienced capacity growth in excess of Moore’s Law as they tackled unprecedented data growth. The ability to do more with less is becoming an upper-management expectation that is affecting not only storage, but IT as a whole.

The recently completed Wave 18 Storage Study shows that storage-specific budgets grew as a proportion of average IT spending, from 9.5% in the prior study to 13.5% now. This growth in storage’s share of overall IT fund allocation comes at a time when the average storage budget at both large and midsize enterprises has declined between the studies; the share increased because the overall IT budgets reported by respondents shrank more than storage budgets did.
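
A quick arithmetic sketch makes the point concrete. The dollar figures below are invented for illustration (the study reports only the 9.5% and 13.5% shares, not absolute budgets); they show how a storage budget can fall in absolute terms while its share of IT spending rises, provided the overall IT budget shrinks faster:

```python
# Hedged arithmetic sketch with invented dollar figures: a storage budget can
# decline in absolute terms yet grow as a share of IT spending, when the
# overall IT budget shrinks faster. The study's 9.5% -> 13.5% shift is
# consistent with this pattern.

it_prior, it_now = 100.0, 65.0          # overall IT budget, $m (hypothetical)
share_prior, share_now = 0.095, 0.135   # storage shares reported by the study

storage_prior = it_prior * share_prior  # $9.5m
storage_now = it_now * share_now        # $8.775m

print(f"storage spend: ${storage_prior:.2f}m -> ${storage_now:.2f}m")
# Storage spend declined even though its share of the IT budget rose.
```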

There are many factors affecting the size of storage and overall IT budgets at large and midsize enterprises across different industry verticals. In our latest study, storage professionals from the energy/utilities vertical report the highest storage budget as a proportion of average IT spending, at 27%, while materials/chemicals report the lowest, at just 4%. The expectation of doing more with less is putting pressure on IT professionals and vendors, but at the same time it provides opportunities for technologies that enable more efficient operations, such as thin provisioning, automated tiering and cloud, among others.


Wave 18 Storage Study participants had the following commentary about budgets at their enterprises:

  • “We expect a flat storage budget going forward because of the increased storage technology options like thin provisioning.” – LE, Healthcare/Pharmaceuticals
  • “Seeing large demand without an expanding budget.” – LE, Financial Services
  • “Refreshing storage that is four years old with appropriate budget is difficult.” – LE, Healthcare/Pharmaceuticals
  • “Budgets are decreasing, no one wants to spend money.” – LE, Healthcare/Pharmaceuticals