Back to the future (of converged infrastructure)

July 23rd, 2014 by AnnMarie Jordan

Marco Coulter, Research Director for Storage

Running detailed research since 2002 allows us to identify recurring industry patterns and trends that newcomers miss, and also to reminisce a little. This post looks back at our prior storage research to better understand some aspects of today's technology. (The original 2002 report is outside our paywall here.)

Sometimes, you just have to laugh. In 2002, we considered 'Over 50TB' the highest category needed for SAN capacity! We were right – the energy/utilities vertical had the highest count, with only 40% of respondents falling into this category. Industry veterans may recall some of the companies mentioned there that have since disappeared or now exist only as stock codes, including Compaq, Legato, Veritas, StoreAge, BlueArc and StorageTek. If you have a tchotchke with the brand of one of the departed companies, tweet a photo with a #TIPflashback tag so we can see it too.

The real point in looking back is to acknowledge the issues we solved, and spot those that may recur. Consider the chart below.

[Chart: TIP-Thurs-Stor-071014-pic]

Twelve years ago, managing growth was already the leading pain point in storage, and familiar ones around scalability and migration were only just arriving. But in 2002, we were still taking the islands of DAS capacity attached to specific compute environments and merging them into sharable SAN and NAS environments. See the 4% selecting 'internal selling of the SAN concept,' the 5% 'developing storage networking strategy' and the 12% wanting to 'consolidate existing capacity.' We were well aware of the capacity lying fallow in silos across the enterprise, and of the duplicated technology appearing in those silos. We were determined to merge those 'islands' into a capacity 'continent.' As an industry, we succeeded.

These sorts of pain points could return if storage is included in converged infrastructure procurement. Certified environments that help avoid interoperability issues make a lot of sense. But do we want to buy storage capacity at the same time as, and in proportion to, compute capacity? It is hard enough to predict future capacity needs on their own. Purchasing compute and storage capacity in fixed proportions assumes we can successfully calculate a compute-capacity ratio for an application. Getting the ratio wrong means we end up with islands of free capacity in the wrong converged infrastructure racks … again.

Perhaps the potential waste doesn't really matter. The cost of wasted capacity is not what it was in 2002. Few businesses would like to admit over-purchasing relative to need, but doing so is likely less expensive than reacting to exhausted capacity. Reactive additions of capacity are not as painful as they were then, as newer storage architectures permit disruption-free addition of capacity. There will still be waste when compute or storage grows faster than the other, requiring paired purchases when only one is really needed. (Did anyone have to buy a second Vblock when they really only needed a little more speed?) The connectivity and interoperability challenges between converged infrastructure stacks are also not what they were. Networking has standardized, and even where physical connectivity varies (e.g., InfiniBand), today's stacks support standard protocols further up, allowing interoperation.

Still, we have spent a lot of time and effort since 2002 corralling those silos of proprietary storage into centrally managed storage plants. We know how complicated that task was. Storage pros will find it hard to let go of that concept in favor of a converged infrastructure model that includes storage capacity. Does it make sense to include storage in converged infrastructure? Convergence makes a lot of sense for transient parts of the stack like compute and network, but the permanent nature of storage is a significant difference. Try calculating the compute-capacity ratio for one of your longer-life applications and see whether the two trended equally over time.
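As a quick illustration of that exercise, here is a minimal sketch in Python. The application, core counts and capacities are all invented figures, not study data; the point is only that compute and storage demand rarely grow in lockstep:

```python
# Hypothetical yearly figures for one long-lived application, purely
# illustrative: compute demand and storage capacity rarely grow in step,
# which is what fixed-proportion converged purchases implicitly assume.
years         = [2010, 2011, 2012, 2013, 2014]
compute_cores = [16, 20, 24, 26, 28]   # provisioned cores
storage_tb    = [5, 9, 16, 27, 45]     # allocated capacity in TB

for year, cores, tb in zip(years, compute_cores, storage_tb):
    print(year, f"compute-capacity ratio = {cores / tb:.2f} cores/TB")

# The ratio drifts from 3.20 down to 0.62 cores/TB. Buying storage in a
# fixed proportion to compute would have stranded capacity in the early
# racks and starved the application later.
```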

Yes, we have been collecting narratives to provide context around the data since 2002. Here are a few that capture that moment in time. Note that some ideas, like 'storage on demand,' have been around for a while.

  • “We have so many islands of SAN that we are looking at Director switches to combine them. We would like to meld the block level SAN access with NAS file access all behind a Fibre fabric.” – a 2002 Fortune 500 communications products company
  • “Our biggest issues are: We have so much DAS that we can’t provision where it’s needed accurately. Right now, we have islands of capacity until we get into SAN or NAS environment. We have difficulty getting vendor support in moving to a storage-on-demand model. We have no good mechanism to understand what we should charge back to users.” – a 2002 F100 consumer products company
  • “We are now consolidating on smaller group of servers, but we did a big expansion recently in the data warehouse area. We will wind up with less data overall. In North America alone, we have 450 NT servers. We should be able to get that down to 200 servers. Some servers are running at less than 1% of capacity. This happened because we weren’t monitoring servers. But now we are, as part of our overall architecture revamping. We’re now examining our cluster options and moving almost all of our capacity to SANs.” – a 2002 F500 chemical company
  • “Our biggest issue is integrating islands of DAS into a switched storage network. Also, we are trying to get economies of scale and improve our utilization rates.” – a 2002 F500 communications services company

Expect weaker network spending in 2015

July 21st, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

Network budgets continued an upward swing in 2014, according to TheInfoPro's recently completed Wave 11 Networking Study, but the outlook for 2015 is bleaker for large enterprises. Forty-four percent (44%) of interviewed networking managers said their budget increased between 2013 and 2014. Only 29% believe they will see a similar increase in 2015, while the share predicting a drop in budgets rose from 21% to 27%.

[Chart: TIP-Thurs-Net-070314-pic]

The average budget for the enterprises represented in the study was $18.6m, with the greatest increase going to larger enterprises, while midsize enterprises' budgets remained largely flat. Healthcare and financial services lead the industry verticals in average dollars devoted to network capital and operational expenses. For the most part, operational expenses continue to dominate all budgets: 61% on average is devoted to opex vs. 39% to capex.

Interviewees in the Wave 11 Networking Study had the following comments about their networking budgets:

  • “Switching and routing and the core. Need to update too much of it, rip and replace for a lot of it. And getting the budgeting for that stuff, oy, so many different things.” – LE, Consumer Goods/Retail
  • “Cost containment; our budgets are not getting any larger.” – LE, Education
  • “Identifying sources of certain network issues, for example finding bandwidth hogs. We are able to do it, albeit a fairly manual process to get that information. There are a lot of tools that can automate, but without much of a budget to speak of, I’m kind of forced into finding ways to do without or maybe learning how to script a lot of things we do as manual process to get the information we need.” – MSE, Transportation
  • “Finding the budget to keep up [is a pain point].” – MSE, Education

Global competition among telecom providers is emerging in cloud infrastructure services

July 17th, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

The after-effects of the NSA revelations and growing global competition are some of the factors telecom providers have to consider as they roll out and adjust their strategies in the expanding cloud infrastructure services space. Even excluding software as a service (SaaS), the market opportunity is estimated to exceed $30bn by 2018, with telecom providers showing the second-highest enterprise adoption intentions after online service providers. The percentage of large and midsize enterprises using telecom providers for infrastructure-oriented cloud services is expected to increase from 40% now to 52% in two years. However, the events of last week demonstrate that some providers will experience headwinds as geopolitical factors come into play.

As a consequence of the NSA revelations, the German government announced it will end Verizon's communications services for all agencies by 2015. We can expect a similar backlash in cloud infrastructure services. The political climate will not favor US or British telecom providers, and they may need to adjust their strategies in light of recent events.

The results from the Wave 6 Cloud Computing Study show that Verizon and AT&T have the most in-use citations for cloud infrastructure services, with 14% each, followed by British Telecom (BT), CenturyLink and Reliance Globalcom. When we asked study respondents which telecom providers they plan to use for cloud infrastructure services two years from now, Verizon took the lead with 18% of selections, followed by AT&T with 15%. Picking winners is still premature at this stage, as global competition is just starting to emerge in the cloud infrastructure services arena.

[Chart: TIP-Thurs-Cloud-070314-pic]

We expect telecom providers from different geographies and specialties to move into the cloud services space. Some are still determining their go-to-market strategies, delivery models, the variables they will have to deal with and the differentiation stories they want to pursue, while others, such as Entel from Chile and Swisscom from Switzerland, are already making considerable strides. Many telecom providers that were not previously ready for the move are now jumping on the cloud infrastructure services bandwagon, realizing that failure to do so may cause significant problems down the road. A recently published report by 451 Research, titled 'To overcome cloud roadblocks, vendors need new ways to add value beyond core offerings,' discusses some specialization options telecom providers can pursue, unless they choose to join the race to the bottom against the economies of scale established by AWS and Google.

TheInfoPro’s respondents had the following commentary about telecom providers and cloud security problems:

  • “The whole NSA access into the cloud providers has had a great impact on our clients/customers and thus impacts us.” – LE, Consumer Goods/Retail
  • “[Reducing security concerns:] Third-party access is nearly impossible to mitigate. It is a pure ‘trust’ situation and cannot be technically overcome. AT&T’s vCloud hosting seems closest possible, but you still have to go through their setup/network.” – LE, Industrial/Manufacturing
  • “[Public cloud performance management/monitoring:] For our CenturyLink deployment, we do receive QoS information and metrics.” – LE, Consumer Goods/Retail

Security is the greatest pain in the cloud

July 9th, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

Cloud service cost, organizational culture, network connectivity, expertise and other pain points associated with the move to a cloud-based infrastructure all pale in comparison to the top pain point cited by those responsible for cloud-related deployments in their enterprises: security. More than a third (37%) of interviewees in the cloud study – cloud deployment managers, not the security managers interviewed for the security study – cited security as their top source of pain in deploying to the cloud. While other concerns are eroding (pricing decreased from 16% to 14%; integration from 18% to 12%; expertise from 17% to 11%), security is growing as a cloud concern, increasing from 30% to 37% between the Wave 5 and Wave 6 Cloud Studies.

[Chart: TIP-Thurs-Infosec-062614-pic]

The top cloud security issue cited by interviewees is the security and privacy of data at 41%, with access control following at 35% and auditability rounding out the top three at 32%. Not surprisingly, ‘transparency’ and auditability are the top two ways respondents thought cloud providers could address security concerns, cited by 28% and 21% of respondents respectively. Almost three-quarters (73%) consider the security traits of a cloud provider to be ‘extremely important,’ ahead of even regulatory compliance cited by 63% as ‘extremely important.’ The top method of resolving compliance concerns for cloud providers, cited by 33% of respondents, was the ability to present proof of audit and compliance traits.

Commentators responsible for their enterprise’s cloud deployments had the following to say about security concerns:

  • “Perennially it’s security, how do we guarantee security. We operate with Rackspace, and they are remarkably diligent. But the last thing we’d ever want to see would be a breach in our infrastructure.” – LE, Consumer Goods/Retail
  • “Inadequate cloud transparency with too many cloud service providers (CSPs). For example, CSPs that ignore the CSA STAR (Security, Trust & Assurance Registry) and CAIQ (Consensus Assessments Initiative Questionnaire).” – LE, Consumer Goods/Retail
  • “The naive view that the cloud vendor is taking care of securing the data. It still requires our own security measures and encryption, but people assume once you buy into the cloud, it is by definition secure.” – LE, Education
  • “Security. Our security standards are fairly liberal. We use the work of the Cloud Security Alliance for evaluation. We find that the vendors don’t have the standard of care for my in-house data.” – LE, Services: Business/Accounting/Engineering

Dell’s road to the future is private and full of cloudy promise

July 7th, 2014 by AnnMarie Jordan

Peter ffoulkes, Research Director for Servers and Cloud Computing

Although we concluded our previous server study prior to Michael Dell's successful privatization of the company he founded, the results of the survey were generally positive for Dell's future prospects. The company maintains growing recognition as an enterprise-class server provider and is gaining recognition for its end-to-end datacenter strategy as it builds out its software offerings with well-thought-out acquisitions. On the server spending front, growth has been steadily slowing for Dell in the past few years, but remains stronger than for all other server vendors except Cisco. Dell servers have been gaining recognition with enterprise customers, with 10% of respondents who rated Dell reporting 2013 spending of more than $5m.

Dell scores highly in the Market Window for server infrastructure

TheInfoPro's Market Window is a graphic representation of vendors rated and actively in use by respondents, comparing effectiveness at marketing (promise) with effectiveness at execution (fulfillment). Dell consistently scores well in the product categories, especially in value for money, coming second overall to Cisco among hardware vendors and noticeably ahead of HP and IBM. In the Market Window, Dell was rated average on the Promise Index and well above average for fulfillment, equaling Cisco's score.

[Chart: TIP-Thurs-Serv-062614-pic]

Six months after our survey was completed, Dell held its first analyst summit since its privatization, providing detailed insight into its strategy and product roadmaps, along with extensive access to company executives for 451 analysts, including Michael Dell himself.

The strategy Dell discussed is fundamentally unchanged from the one it has been pursuing for several years. Perhaps the biggest impact initially will be on its executive team – CEO Michael Dell says he got back 20% of his time through not having to deal with Wall Street, while board meetings can last as little as 10 minutes.

The 451 analyst consensus is that the company going private is not an issue for most customers, but the main difference may be in how employees and partners feel about the company. Related to this is the issue of how Dell presents itself to the outside world. Now more than ever as a private concern, it needs a clear and strong voice, and developing this remains a work in progress. Part of the challenge remains to articulate a single vision that resonates with a broad target audience of consumers, SMEs and large enterprises. Nonetheless, the renewed energy and enthusiasm Dell talked about is in evidence among its senior executives; its challenge now is to ensure this percolates down through its 100,000-strong workforce. Dell still has to articulate a strong idea of exactly what it wants to be in the longer term, but the fact that it has put cloud firmly in its crosshairs suggests the company is developing an idea of where it needs to be.

Anecdotal commentary illustrates the sentiments expressed by TheInfoPro respondent community at the end of 2013:

  • “Dell has good brand and recognition. They stay in line with the Intel CPU roadmap. They maintain a business focus even though they sell retail computers. On the down side, Dell has reorganized their sales force too many times. Unless you spend a lot of money, you don’t receive a lot of personal attention.” – LE, Financial Services
  • “They’re making strides through some acquisitions that are making them more competitive.” – MSE, Public Sector
  • “Upgrades being forced on us – HP and Oracle don’t really get along, so we’re changing everything over to Dell.” – LE, Consumer Goods/Retail
  • “Migration of SAN infrastructure to Dell Compellent.” – LE, Energy/Utilities
  • “Dell has a great sales team that makes sure we are taken care of based on our needs. They do a great job specking what we need. The products, especially the x86, does a great job for us. Down side, Dell is not an innovator. They are a great provider of IT equipment. Good, solid commodity vendor. They need to continue to offer present levels of value. That will keep us happy.” – LE, Industrial/Manufacturing
  • “Dell offers incredible value for the money. The feature set is way above what we can get from any other vendor. The reliability is excellent, especially in the first five years! Weakness: their jumbled management tool philosophy. In the next two years they should have this worked out.” – LE, Consumer Goods/Retail
  • “They’re really hitting their stride.” – LE, Services: Business/Accounting/Engineering
  • “Dell’s direct sales model is compelling and a major reason we stay with them.” – LE, Consumer Goods/Retail
  • “Dell is a very cost-competitive vendor. They are well positioned, and they have a good overall vision. We stay with Dell because of their cost structure. There is no compelling reason to look elsewhere.” – LE, Materials/Chemicals

Overall enterprise IT spending shrinks faster than storage budgets

July 2nd, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

We expect 2014 to be a tough year for storage professionals and vendors as budgets continue to tighten. It is not just storage, though – overall IT budgets are shrinking even faster, making storage a larger proportion of average IT spending. Despite tightening storage budgets in the past two years, our study participants experienced capacity growth in excess of Moore's Law as they tackled unprecedented data growth. The ability to do more with less is becoming an upper-management expectation, affecting not only storage but IT as a whole.

The recently completed Wave 18 Storage Study shows that storage-specific budgets grew as a proportion of average IT spending, from 9.5% in the prior study to 13.5% now. That growth comes even though the average storage budget at both large and midsize enterprises declined between the studies: the storage portion of IT spending rose because the overall IT budgets reported by respondents shrank more than storage budgets did.
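The arithmetic is worth spelling out, since a budget can shrink in absolute terms while growing as a share of spending. Here is a minimal illustration with invented round numbers (not figures from the study):

```python
# Invented round numbers, purely to show the mechanism: if overall IT
# spending falls faster than storage spending, the storage share rises
# even though the storage budget itself was cut.
it_before, storage_before = 100.0, 9.5   # $m: storage share = 9.5%
it_after,  storage_after  = 85.0,  9.0   # IT down 15%, storage down ~5%

print(f"before: {storage_before / it_before:.1%}")  # 9.5%
print(f"after:  {storage_after / it_after:.1%}")    # 10.6%, up despite the cut
```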

There are many factors affecting the size of storage and overall IT budgets at large and midsize enterprises from different industry verticals. In our latest study, storage professionals from the energy/utility industry vertical report the highest storage budget as a proportion of the average IT spending, amounting to 27%, while materials/chemicals have the least, accounting for just 4%. The expectation of being able to do more with less is putting pressure on IT professionals and vendors, but at the same time provides opportunities for technologies that enable more efficient operations, such as thin provisioning, automated tiering and cloud, among others.

[Chart: TIP-Thurs-Stor-062614-pic]

Wave 18 Storage Study participants had the following commentary about budgets at their enterprises:

  • “We expect a flat storage budget going forward because of the increased storage technology options like thin provisioning.” – LE, Healthcare/Pharmaceuticals
  • “Seeing large demand without an expanding budget.” – LE, Financial Services
  • “Refreshing storage that is four years old with appropriate budget is difficult.” – LE, Healthcare/Pharmaceuticals
  • “Budgets are decreasing, no one wants to spend money.” – LE, Healthcare/Pharmaceuticals

Anti-DDoS consolidation continues: F5 buys Defense.Net

June 30th, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

The 'Avenge Assange' phase of the Anonymous-led Operation Payback, which in 2010 directed DDoS attacks at financial websites that had withdrawn services from WikiLeaks, caught the attention of the media and many security professionals. Those professionals largely realized how unprepared they were for distributed denial-of-service (DDoS) attacks, and, with little surprise, anti-botnet tools have seen a steady increase in use from that period onward: such solutions rose from 20% to 40% in use among interviewed large enterprises in the past two years.

Responding to the need for market solutions to mitigate this risk, networking providers have looked to offer anti-DDoS services. In this vein, F5 Networks acquired Defense.Net on May 22, following last year's acquisition of Prolexic Technologies by Akamai.

[Chart: TIP-Thurs-Infosec-061214-pic]

There are a number of ways to mitigate the risk of DDoS attacks, ranging from what the Wave 16 Information Security Study covers – going after the locally installed bots themselves on endpoints (so that you are not part of an attack) – to on-premises appliances and upstream provider solutions that throttle and filter unwanted traffic away from enterprise networks. This second group is the anti-DDoS solution set proper, and better reflects what F5 Networks is acquiring with Defense.Net.
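For a concrete sense of the upstream approach, here is a minimal token-bucket throttle in Python. It is a generic sketch of the rate-limiting idea only, not a description of how Defense.Net or F5 actually implement their scrubbing services:

```python
import time

class TokenBucket:
    """Generic token-bucket rate limiter: each source earns tokens at a
    steady rate and spends one per request, capping sustained volume."""
    def __init__(self, rate: float, burst: float):
        self.rate = rate                # tokens replenished per second
        self.capacity = burst           # maximum burst allowance
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False                    # over budget: drop or divert

# One bucket per source address lets a scrubbing layer keep serving
# ordinary clients while starving sources that flood.
buckets: dict[str, TokenBucket] = {}

def admit(src_ip: str) -> bool:
    bucket = buckets.setdefault(src_ip, TokenBucket(rate=100, burst=200))
    return bucket.allow()
```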

Security respondents had the following to say about DDoS protection and F5 Networks:

  • “How will we deal with DDoS? We feel we will be a collateral target at some time in the future. We need to take a hard look at what we will do. We are taking this very seriously!” – LE, Financial Services
  • “Building a private cloud structure in our datacenter to be rolled out in the next couple of months. F5 has strong application protection.” – LE, Healthcare/Pharmaceuticals
  • “It’s one particular piece of the [F5] product. Other things, we’re happy with. But the SSL VPN has been extremely problematic.” – LE, Financial Services
  • “DDoS, it is starting to affect us indirectly.” – MSE, Energy/Utilities

Unified cloud management consoles inspire plenty of interest but little confidence

June 27th, 2014 by AnnMarie Jordan

Peter ffoulkes, Research Director for Servers and Cloud Computing

As enterprises steadily evolve their IT strategies to embrace cloud computing, it is clear that the domain of systems management also needs to evolve in parallel. Virtualization has moved the goal posts for traditional enterprise datacenter stalwarts including BMC Software, CA Technologies, HP and IBM, who are experiencing strong competition from Microsoft and VMware, as well as newer vendors such as ServiceNow and SolarWinds. For infrastructure management software vendors, cloud computing will change the game again.

The quest for ‘The One Ring’

As cloud computing gradually comes to dominate the enterprise IT market, it is clear that the majority of enterprises will deploy a combination of private on-premises, private-hosted, hybrid and public cloud delivery models. While the concept of a unified cloud management console – 'one ring to rule them all' – is highly desirable, TheInfoPro's Cloud Computing Study respondents express little confidence in achieving such comprehensive management capabilities anytime soon. Just 6% had some version of the technology in production use, with a further 20% considering the technology in the next two-plus years.

[Chart: TIP-Thurs-Cloud-062514-pic]
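To make the 'one ring' concept concrete, here is a minimal sketch of what such a console implies architecturally: a common interface wrapped around each delivery model. Every name below (the adapter classes, the stub figures) is hypothetical; the hard part respondents doubt is not this thin aggregation layer but the provider-specific depth hiding behind each stub:

```python
from abc import ABC, abstractmethod

class CloudAdapter(ABC):
    """Hypothetical adapter: each cloud delivery model is wrapped
    behind the same minimal management interface."""
    @abstractmethod
    def list_instances(self) -> list[str]: ...
    @abstractmethod
    def monthly_spend(self) -> float: ...

class PublicCloudAdapter(CloudAdapter):
    def list_instances(self) -> list[str]:
        return ["web-1", "web-2"]        # stub: would call provider APIs
    def monthly_spend(self) -> float:
        return 1240.0                    # stub

class OnPremAdapter(CloudAdapter):
    def list_instances(self) -> list[str]:
        return ["db-1"]                  # stub: would query the local estate
    def monthly_spend(self) -> float:
        return 310.0                     # stub

class UnifiedConsole:
    """The 'one ring': one view aggregated across every adapter."""
    def __init__(self, clouds: list[CloudAdapter]):
        self.clouds = clouds
    def inventory(self) -> list[str]:
        return [vm for c in self.clouds for vm in c.list_instances()]
    def total_spend(self) -> float:
        return sum(c.monthly_spend() for c in self.clouds)

console = UnifiedConsole([PublicCloudAdapter(), OnPremAdapter()])
print(console.inventory())    # ['web-1', 'web-2', 'db-1']
print(console.total_spend())  # 1550.0
```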

Anecdotal commentary illustrates the desire for such a capability within TheInfoPro’s respondent community, together with a very healthy skepticism about vendors’ ability to deliver on the promise in the foreseeable future:

  • “I wish! We would love it if we could find it.” – LE, Financial Services
  • “Boy, that would be nice.” – LE, Telecom/Technology
  • “I haven’t even seen such a thing! AWS will give us a whole holistic view of their services but not a view into somebody else’s cloud. Some sort of federated unified cloud console.” – LE, Consumer Goods/Retail
  • “The product we’re running a POC with will not be a good fit for moving forward. The POC is about seeing ‘if they will come,’ not to actually use moving forward.” – LE, Energy/Utilities
  • “VMware seems like the best bet here, but will it save me money to use it?” – LE, Telecom/Technology
  • “It’s a hope of mine, but it is very difficult to come up with a management console in an environment where we have so much country-specific regulation and restrictions.” – LE, Financial Services
  • “I don’t believe that anyone can deliver this.” – LE, Telecom/Technology

Technology refreshes dominate 2014

June 25th, 2014 by AnnMarie Jordan

Marco Coulter, Research Director for Storage

Welcome to the Wave 18 Storage Study results! There will be a lot to write about in coming Thursday TIPs, including all-flash arrays, on-premises cloud, the rise and fall of incumbent vendors, and the new storage architecture. Setting the scene are the top storage projects you and your peers identified for 2014.

The lives of storage professionals are dominated by technology refreshes. With disk sizes reaching multiple terabytes, the simple act of migrating data from old to new systems is time-consuming. (Watch for storage virtualization to get a rebirth from this driver.) Wrap the planning, change control, and vendor and equipment evaluation around a refresh migration, and a year feels a lot shorter. Backup and DR redesign fill out the top three as enterprises continue moving away from tape, making backup a snapshot-and-replication process and relegating software-driven backup to a legacy role for longer-term retention.
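Some rough arithmetic shows why refresh migrations eat the calendar once capacities reach tens of terabytes. Both figures below are invented for illustration:

```python
# Back-of-the-envelope only: invented capacity and copy-rate figures.
capacity_tb    = 100     # data to move to the new array
effective_gbps = 0.5     # sustained copy rate in GB/s, throttled so
                         # production workloads are not disrupted

seconds = capacity_tb * 1000 / effective_gbps
print(f"{seconds / 3600:.0f} hours of pure copy time")  # ~56 hours
```

Spread that across change windows, verification and cutover, and a single refresh can easily consume weeks.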

2014 is the year of the all-flash array (AFA), and flash implementation is a new arrival to the top 10 this wave, at sixth place. Incumbents are all pushing one or more offerings to market, and emerging vendors like Pure Storage have generated excitement around the technology. Databases will be the main workload moved to flash. AFA success may curtail growth plans for server-side flash vendors like Fusion-io.

[Chart: TIP-Thurs-Stor-061214x-pic]

Efficiency and cost reduction get more attention this year as budgets tighten, giving a boost to primary deduplication and compression. These projects also mention interest in 'cold' storage – the search for lower-cost archival capacity.

Cloud storage is also in the top 10. For large enterprises, these are on-premises projects, while midsize enterprises are more interested in external cloud storage, particularly for file sharing and mobile access.

Storage pros have a lot going on, and the industry wants to help by offering a range of capacity types from hot to cold and applying software to make management of the various hardware types easier. Narratives reveal that many storage professionals are still working out the best approach.

  • “Formal program on storage efficiency and performance management. Mostly toward the application efficiency of storage.” – LE, Financial Services
  • “Cheap and deep. Looking at high-capacity, low-cost arrays. Implementing, NAS migration.” – LE, Financial Services
  • “There’s a big cost review going on with regards to storage, but that will be a while down the road.” – LE, Education
  • “We’ve been looking at internal low-cost storage for archive that’s got reliability. Part of bringing that data back in from the field.” – LE, Consumer Goods/Retail
  • “We’re doing a storage refresh, retiring storage frames, new ones, and doing storage rationalization project to gain better utilization of the infrastructure.” – LE, Services: Business/Accounting/Engineering

Higher levels of virtualization present enterprises with new pain points

June 23rd, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

As large and midsize enterprises move toward higher levels of virtualization, pain points are starting to evolve for server professionals. The changes in pain points are catalyzed by the shift of heavy-duty production workloads into virtualized environments. In addition to existing pain points, some enterprises are starting to face new ones, such as VM monitoring and management, which appeared as one of the top pain points for the first time in our Wave 13 Servers and Virtualization Study.

The list of top pain points changed significantly between the studies as our participants leaped ahead in virtualizing their server environments. Cost/budget climbed to the top of the list, increasing from 8% of respondents in 2H 2012 to 16% in 2H 2013. The share of enterprises citing insufficient resources also increased, from 5% to 11%. Meanwhile, infrastructure management, which had the most selections in the prior study, and datacenter space/energy/consolidation concerns declined, dropping from 19% to 12% and from 10% to 6% respectively. Newer pain points include VM monitoring and management, cited by 12% of respondents, and application virtualization issues, which also made the top 10, increasing from just 1% in 1H 2012 to 8% of citations now. Storage issues, capacity planning, organizational issues and automation were the other pain points that saw an uptick between the studies.

As the journey toward cloud-ready infrastructure continues, server professionals should anticipate and plan for the pain points that may lie ahead as they shift new workloads into virtualized environments. It is important for vendors to communicate with existing and potential customers to understand these evolving pain points and become partners in alleviating them. Virtualization brings many benefits to the table, and addressing the pain points that come with it is essential to realizing its full potential and enabling the path to an agile, automated and adaptable infrastructure.

[Chart: TIP-Thurs-Serv-061214-pic]

TheInfoPro’s respondents had the following comments about top server and virtualization pain points:

  • “Some applications that were written many, many years ago, still not supported on virtualized servers. Addressing those, either updating those applications or using a newer application so they can work in virtualized environments.” – LE, Consumer Goods/Retail
  • “Monitoring, visibility into it, technically into what’s happening from an optimization perspective.” – LE, Services: Business/Accounting/Engineering
  • “Rapid growth. We can’t keep up, and we are now scrambling because there’s insufficient amount of capital funds to be able to take care of some of the infrastructure.” – LE, Industrial/Manufacturing
  • “Staffing – hard to find people with right set of skills.” – LE, Consumer Goods/Retail
  • “From a technology standpoint, we have leveraged VMware. We are using Microsoft tooling for automation and orchestration. Integrating the Microsoft tool with VMware is more difficult to do than just plug and play.” – LE, Financial Services