Mobile security pain overwhelms the enterprise

September 22nd, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

Mobile device security is the top source of pain for the enterprise security managers interviewed for our most recent information security study. Encompassing the general shift from BlackBerry devices to a panoply of devices owned by either the enterprise or its employees, this pain point spans IT consumerization, employee expectations, mobile device management, and the management challenges created by implementing such technologies.

Some pain points were consistent between last year and this, including hackers, the ineffectiveness of security awareness training, and regulatory/compliance requirements. Data security in general dropped as a concern from 15% to 7%, but with the major caveat that data-loss prevention emerged as a source of pain among 8% of respondents. Other concerns, such as third-party security and vulnerability management, experienced major upticks among interviewees.

Security respondents had the following to say about what causes them pain:

  • “Malware, it’s more targeted, more spear-phishing, going after people they know are not technical, like executives, like an email with ‘you need to reset your password, click here.’ And then that’s bad and they’ve given their password to someone from the Ukraine.” – LE, Consumer Goods/Retail
  • “Increasing frequency of regulatory inspection. Every country (140) we sell in has some kind of regulatory oversight, and they reserve the right to inspect any of our operations worldwide.” – LE, Education
  • “Mobile apps – the new way to steal information.” – LE, Financial Services
  • “XP remediation; 47,000 PCs with many running XP. Microsoft dropped support – 891 under my control.” – LE, Industrial/Manufacturing
  • “Our Web applications. Just the different vulnerabilities that are popping up left and right. You can only do so much, and you turn around and there’s a few that squeak by.” – MSE, Financial Services

The ‘software-defined’ label contorts better than a Cirque du Soleil gymnast

September 18th, 2014 by AnnMarie Jordan

Marco Coulter, Research Director for Storage

Cloud computing had a shaky start. In the early days, experts fiercely debated the term’s meaning. Vendors contorted the definition better than a Cirque du Soleil gymnast to cover almost any product for sale. Enterprise professionals initially said cloud computing was just hype that would soon pass. Yet cloud computing is now on almost every enterprise roadmap in some shape or form. The software-defined datacenter may be in the early stages of a similar lifecycle. It seems that almost every infrastructure news release these days refers at some point to ‘software-defined.’ In Wave 18, we wanted to understand whether this was marketing hype or an implementation reality. Narratives indicate opinions run in both directions.

Already, 41% agree to some extent that they are strategically planning a move to a software-defined datacenter. For the concept to succeed, that momentum must be maintained while debate rages over the term’s meaning. Many respondents clearly see the definition as still forming: reflecting that uncertainty, only 6% were prepared to completely agree with the statement “We are strategically planning to move to a software-defined datacenter.”

The majority (58%) disagree – they are not strategically moving to a software-defined datacenter. Some were quite vehement in dismissing the term as ‘marketing hooey.’ Others said they simply find it undesirable, concerned about the risk in introducing more lines of software code into infrastructure. Some love the idea, but see no reality in the offerings available today. It is early days.

That some IT pros already agree despite a vague definition indicates the term has reached a level of comfort for enterprise professionals. As frustrating as it may be to try to reconcile the various definitions from vendors, and even analyst firms, perhaps we should sit back and enjoy the show.

Expect to see the term distorted heavily until enterprise professionals decide on a definition.

  • “No one frickin’ knows what ‘software-defined’ anything means.” – LE, Education
  • “Unfortunately, the industry is going that way. Software layer introduces more and more bugs. Traditionally more bugs in software than hardware.” – LE, Services: Business/Accounting/Engineering
  • “I need to do some more reading about software-defined datacenters and what they are; I’ve never heard any talk at all about software-defined anything, really.” – LE, Telecom/Technology
  • “[Software-defined is] the buzzword that VMware is throwing around, and we’ll see where it goes. New thing they had, nothing else to talk about in their features. I don’t know if the buzzword will be here in three years. Could be one that came and went.” – LE, Industrial/Manufacturing
  • “Love the concept of software-defined, but it’s all vaporware right now.” – LE, Consumer Goods/Retail
  • “Strategically, we are becoming a more service-oriented architecture; we will be more flexible in response to customers. Is that software-defined datacenter? Then yes, pretty likely.” – LE, Healthcare/Pharmaceuticals

Majority of enterprises move past virtualization on their journey to the cloud

September 12th, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

Completed this August, the Wave 7 Cloud Computing Study shows that the majority of large and midsize enterprises have reached sufficient levels of virtualization and have moved on to the automation, orchestration and private cloud phases in the cloud journey. Almost a third of enterprises (32%) continue to work on virtualizing their environments, but 54% have entered the next phases in the evolution of cloud-ready internal infrastructure, compared to about 40% at the end of 2012. Automation took center stage for 26% of study respondents and is now the most common post-virtualization initiative, followed by private cloud implementations, with 17% of selections, and orchestration, with 11%. Only 8% and 6% are still working on standardization and consolidation respectively.

Each phase in the evolution of the cloud-ready internal infrastructure presents decision-makers with an evolving set of challenges that have to be addressed. Many enterprises own datacenters that are at different phases of this evolution, making closer collaboration with vendors integral to achieving their objectives. If in-use vendors fail to recognize the importance of closer collaboration in overcoming these evolving challenges, there will be opportunities for other vendors as enterprises reassess provider strategies and relationships during licensing renewals.

Our respondents had the following commentary about their journey to a cloud-ready internal infrastructure:

  • “Operations team is smaller now. We use VMware a lot, and we are fairly well down the route with that. At the moment, next stage in our evolution will be looking at cheaper alternatives that can do the same job as well if not better. We have kept an eye on the sort of Microsoft catchup that seems to be coming. I don’t think they’re quite there; I wouldn’t be surprised if it’s in the next two years.” – MSE, Healthcare/Pharmaceuticals
  • “We have many datacenters, that in some centers we are in one phase or another.” – LE, Financial Services
  • “We have worked on the final piece of it, getting our designs fixed this year, what type of clustered topologies are we putting in place, how we categorize. A lot has to do with performance. Like to like things in place to optimize our costs.” – LE, Telecom/Technology

Networking organizations face change in 2014

September 10th, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

The majority of enterprises (68%) employ between one and 10 full-time resources for network management. Forty percent (40%) employ the same range of third-party or outsourced resources, while 45% report having no third-party resources employed as part of their network management operation. Some manner of team divestiture or breakup was the greatest organizational change for networking teams, occurring at 28% of enterprises. In addition, 14% of networking teams experienced some manner of outsourcing. About the same percentage of respondents said the most significant organizational change was either an increase or a decrease in staffing numbers.

The networking team is still measured mainly on availability, with 60% of enterprises citing it as the primary gauge of network team performance. Issue resolution time (41%) and project completion (40%) rounded out the top three performance measures. Sixteen percent (16%) of enterprise representatives said their teams are not evaluated against any real metric.
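
Since availability dominates how these teams are measured, it is worth remembering how unforgiving the metric becomes once targets are expressed as ‘nines.’ The short Python sketch below converts an availability target into an annual downtime budget; the targets shown are illustrative, not figures from the study.

```python
# Convert an availability target (percent) into an annual downtime budget.
# Illustrative targets only; the study did not report specific SLA levels.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def allowed_downtime_minutes(availability_pct: float) -> float:
    """Minutes of downtime per year permitted at a given availability %."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for target in (99.0, 99.9, 99.99):
    print(f"{target}% availability -> {allowed_downtime_minutes(target):,.1f} min/year")
# 99.0%  -> 5,256.0 min/year (~3.7 days)
# 99.9%  ->   525.6 min/year (~8.8 hours)
# 99.99% ->    52.6 min/year (under an hour)
```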

Network managers provided the following commentary on their staffing and responsibility changes:

  • “With the uptick in the economy, there’s been a huge brain drain. Our numbers have dwindled.” – LE, Consumer Goods/Retail
  • “It depends. Switches, routers and load balancers are physical. Management tools to manage multiple devices (like central management consoles) we would prefer virtual. The whole team manages the devices – one person can access any network device.” – MSE, Other
  • “I could be on 200 devices next week, someone could be on one, depends on the project.” – LE, Financial Services

Spending increases on third-party services are most common in security

September 8th, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

Large and midsize enterprises are reporting significant spending increases on third-party services in 2014, with the security function most commonly experiencing budget spikes for these services, followed by networking, servers and virtualization, and storage. Almost half of security professionals (45%) report budget increases for third-party services this year, while another 45% are keeping spending flat, and only 10% have funding cuts. Infosec thus has both the largest portion of respondents reporting spending increases on third-party services and the smallest percentage reporting flat year-over-year spending, in comparison to networking, servers and virtualization, and storage respondents.

There are many factors that decision-makers should take into consideration when evaluating third-party services in information security. Many consider third-party services a more cost-effective approach that relieves enterprises of building and maintaining the infrastructure and employee skill set needed to tackle ever-evolving security threats. At the same time, decision-makers should be comfortable with transferring some control over their security needs to third parties. Some prefer to use third parties for specific applications – for example, event log management – while others transfer control of more than one security need. Decision-makers should evaluate their information security ecosystem to determine whether third-party services are an applicable option given the existing corporate culture, risk tolerance for certain needs, existing internal skill set and future training requirements.

Security respondents had the following comments about third-party services:

  • “We are increasing third party in order to get away from maintaining and building infrastructure.” – LE, Healthcare/Pharmaceuticals
  • “[Pain point:] Data leakage – protecting information vs. negligence internally or via a third party.” – LE, Healthcare/Pharmaceuticals
  • “Infosec is generally weaker than it should be – we’re weak in our infosec, outsourcing this and most of IT to a third party.” – LE, Telecom/Technology
  • “We are looking at certain niche things; the applications are offered at a lower cost if we go to a third party.” – MSE, Energy/Utilities
  • “Rudimentary log analysis from third party.” – LE, Other

Does Microsoft have a hunting license for hypervisors this year?

September 5th, 2014 by AnnMarie Jordan

Peter ffoulkes, Research Director for Servers and Cloud Computing

With VMworld San Francisco on the immediate horizon, it seems a reasonable time to question whether the foundations of VMware’s software-defined datacenter fortress are solid as a rock or potentially being eroded by incessant waves of competitive assault.

VMware’s increasing vulnerability

The apparent unassailability of VMware’s market position in server virtualization rests on many factors. For most enterprise IT organizations, these include the perceived technical superiority of VMware’s offerings, the investment in licenses and expertise, and the challenge of changing vendors in the middle of a multi-year server virtualization effort.

However, this scenario is steadily changing. In our recent cloud survey, just 32% of respondents were still at the virtualization stage, with 54% having moved on to automation, orchestration and private-cloud initiatives. When combined with the renewal timing of enterprise licensing agreements, it is natural for customers to reassess their provider strategies and relationships.

In our past several surveys, VMware’s vulnerability – as shown by the percentage of respondents that are definitely or possibly considering switching to another vendor – has increased steadily from 11% in 2009 to 30% in 2013, with Microsoft the most likely beneficiary.

With the market shifting beyond core virtualization capabilities, the timing is good for Microsoft. With Windows Server 2012, the company’s hypervisor technology – Hyper-V – is widely considered to have caught up sufficiently with VMware, and enterprises have a natural window in which to migrate from earlier versions of Windows Server. VMware faces stronger competition in the automation, orchestration and private cloud arenas, where it will have to fight on a more level playing field, and Microsoft is regarded as having strong offerings and a well-established user base. Both Microsoft and VMware have come from predominantly homogeneous, proprietary backgrounds, but with both companies under new leadership, more open and heterogeneous approaches are being promoted as they target the cloud computing market opportunity.

With change in the wind and the technical advantage narrowing, cost becomes an increasingly important factor, providing a potential opportunity for Microsoft to leverage its licensing strategies to its competitive advantage. Is it open season yet for hypervisors? Perhaps VMworld will yield some clues.

Anecdotal commentary illustrates the sentiments expressed by TheInfoPro’s respondent community:

  • “Microsoft is cheaper, and their product’s evolved to where’s it passable.” – LE, Consumer Goods/Retail
  • “We are always examining Hyper-V as an option, but no plans to switch at this time.” – LE, Financial Services
  • “Microsoft owns the licensing strategies, and we own a lot of Microsoft right now. So cost is a very big factor.” – MSE, Healthcare/Pharmaceuticals
  • “Last year I would have said 5 on the difficulty to switch. Now I see a light at the end of that tunnel. Next year I think it will be 2 or 3, and the light will be staring me in the face. And now you don’t have to force solutions into VMware, they’re all VM-friendly. Makes it easy to transition.” – LE, Services: Business/Accounting/Engineering

SanDisk buys a leader in server-side flash with Fusion-io acquisition

September 3rd, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

When SanDisk put down $1.3bn for Fusion-io this June, the NAND flash giant effectively bought a leader and acquired brand recognition in the server-side flash market, where the newly acquired PCIe flash pioneer gathered more than twice as many selections for implementations as the closest runner-up. Currently, Fusion-io rules the server-side flash space, whether directly through its own brand or through OEM deals with server vendors. The recently completed Wave 18 Storage Study shows that Fusion-io is the only vendor with more than 10% of selections for in-use implementations in the large and midsize enterprise space in 1H 2014. The company increased its lead over the closest runner-up from just 1.6 percentage points in 1H 2013 to 6 percentage points now. However, the closest runners-up, including HP, IBM, Dell and EMC, are also now getting more citations for future project plans than before.

One in four (25%) of our study participants report in-use cases of server-side flash in 1H 2014, and an additional 13% have future project plans for the technology. Selections for future project plans are spread more evenly among the leading vendors now than back in 1H 2013, when Fusion-io gathered considerably more citations for considerations, making it a tougher market for SanDisk to compete in. Spending intentions continue to look positive, with 15% of enterprises planning to increase spending in 2014 and 20% planning to do the same in 2015, vs. 3% and 2% planning budget cuts respectively.

Based on the data from our previous Wave 17 Storage Study, we indicated in a past Thursday’s TIP that Fusion-io was a potential acquisition target. Now that the acquisition has happened, SanDisk must engage in some product rationalization, and the market is tougher than the one Fusion-io started out in. A recently published 451 Research report, titled ‘2014 is the year of the all-flash array,’ discusses how all-flash arrays, which some enterprises are starting to view as a cost-effective alternative, may slow server-side flash implementations, which in the future may be appropriate only for latency-driven performance requirements.

Wave 18 study participants had the following commentary about server-side flash and Fusion-io:

  • “Considering all vendors at this time. Pilot would be this year.” – LE, Consumer Goods/Retail
  • “This is interesting – just started throwing local flash into servers. Caching software is in plan – have looked at two vendors, one is Fusion-io.” – LE, Energy/Utilities
  • “Just the way the servers are purchased, they come in with that config.” – LE, Transportation
  • “Fusion-io has multiple approaches with their next-gen purchase. You can have flash arrays or PCIE cards with solid state in the server itself.” – MSE, Financial Services
  • “Fusion-io, it is having a great impact on our VM environment.” – LE, Telecom/Technology

Competition between Cisco and VMware intensifies in network virtualization

August 29th, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

VMware is expanding its ecosystem of partnerships with its latest announcement of NSX integration with Arista, and it has moved closer to Cisco in network virtualization implementations between our studies. In 1H 2013, about 13 percentage points separated Cisco’s lead in the space from VMware, but in 1H 2014 the gap declined to just 2.4 percentage points. Announced earlier this month, the expanded strategic relationship with Arista may serve as an additional tailwind for VMware CEO Patrick Gelsinger to strengthen the company’s competitive stance against Cisco’s ACI platform. Cisco continues to display more future project plans than VMware for network virtualization, but if Cisco fails to convert those plans into in-use cases and VMware successfully makes the conversion, we may see a new king of the hill.

More decision-makers are taking advantage of virtual networks today than a year ago. In 1H 2013, 37% of large and midsize enterprises reported implementations and an additional 11% had future project plans, while in 1H 2014, in-use cases have climbed to 41% of respondents, with an additional 13% planning future implementations. Spending intentions also remain positive for the technology, with 16% citing budget increases in 2014 and 21% in 2015, vs. just 1% and 2% planning budget cuts respectively.

VMware and Cisco are the main beneficiaries of future adoption plans and budget increases for network virtualization. The two incumbents are taking different approaches to network virtualization, and both are goliaths in this standoff. It is too early to judge whether VMware’s lead in core server virtualization will extend into network virtualization or whether Cisco’s entrenched position in the space will hold. A recently published Technology and Business Insight from 451 Research, titled ‘Software-Defined Networking: Making Sense of a Complex Technology Inflection and Market Opportunity,’ explores network virtualization in detail and discusses the two approaches taken by VMware and Cisco.

Network managers had the following comments about network virtualization in their environments:

  • “We’ve become a VMware shop. We’ve done a little with Cisco virtual switches, but VMware’s big in that.” – LE, Services: Business/Accounting/Engineering
  • “We do a lot of that in the area of VMware.” – LE, Other
  • “We stand up virtual network switches that support the VMware servers.” – LE, Consumer Goods/Retail
  • “Discovery phase – Cisco Nexus offers some capability here.” – LE, Services: Business/Accounting/Engineering

Budgeting for the transition to cloud computing

August 27th, 2014 by AnnMarie Jordan

Peter ffoulkes, Research Director for Servers and Cloud Computing

From the budget perspective, enterprise IT organizations are increasingly focusing on the transformation to cloud computing in one form or another as the IT delivery model of the future. In our most recent survey, completed in August 2014, 37% of respondents had separate budgets for cloud computing, almost identical to the figure from a year earlier and 12 percentage points higher than the number in the survey completed at the end of 2013. For the group with separate cloud budgets, spending continues to trend toward a more regular distribution of budget levels, with a minimum budget of $5,000 and a maximum of $30m. Median and mean spending levels are $725,000 and $3.15m respectively, slightly down from the prior survey; the wide gap between the two indicates that a small number of very large cloud budgets pull the mean upward.

Cloud is still a small percentage of overall IT budgets

Of the remaining 63% of respondents, who did not have separate cloud computing budgets, two-thirds were able to estimate the percentage of their overall 2013 IT budget spent on cloud-related projects versus traditional non-cloud IT spending: 57% spent 5% or less on cloud; of the rest, 25% spent 6-20%, and 8% spent more than 50% of their 2013 IT budget on cloud-related projects.

In line with the previous two surveys, 81% saw an increase in cloud spending from 2013 levels, and 82% expected 2015 spending to be higher than 2014 levels. Twenty-seven percent (27%) of respondents expected 2015 spending to increase by up to 10%, 36% by 11-50%, and 19% expected increases of over 50%. As indicated by median budget levels, the top five sectors were telecom/technology, public sector, energy/utilities, healthcare/pharmaceuticals and financial services.

Anecdotal commentary illustrates the slow and steady pace of cloud adoption by TheInfoPro’s respondent community:

  • “Small dollars ($100k) for now as we test and build comfort with technology/process. We are trying to redirect $100k working with partners to build the case for a private cloud.” – LE, Consumer Goods/Retail
  • “The 5% of cloud spending above would be 25-30% if you include … exchanges around the world, reference data providers. Externally provided data capabilities. Growth, it won’t spike. It’s a steady thing. We have to change some legal stuff as an organization to enable stuff to happen. We’re hamstrung by what data we can put where. 2016, 2017 more interesting things can happen.” – LE, Financial Services
  • “Challenge on separating spending – we have so many businesses. I cannot give you any figures as to what we’re spending. I can give you relatives. It’s impossible to separate the cloud computing budget because of how many different players there are. I can tell you this – it’s all opex. We cannot capitalize software or infrastructure as a service.” – LE, Energy/Utilities
  • “The way the DoD works, there is money assigned for large capacity machines, then commodity-type machines with no cloud capabilities. Agencies must then provide funding for software, developed code, and any cloud-based features/functionality. Our HPC lab has total spending of $2-3m. One to 2% is for internal cloud-related spending. We have no plans for external cloud spending. We want spending on internal cloud to grow, but it will stay flat or decline due to budget pressures, but will increase in 2016. We have a large machine coming in this year which has taken much of our resources for this year. Disk-based archive will grow, however, in 2014.” – LE, Public Sector
  • “Difficult for us to justify getting funding right now. For FY 2015, trying to put in about a hundred to two or 300 thousand in the budget for OpenStack or cloud-related initiative. Currently less than 5% of our budget. If the new ERP goes into cloud, 15-20%.” – MSE, Healthcare/Pharmaceuticals
  • “Guestimate 10% cloud to 90% traditional.” – LE, Industrial/Manufacturing

The race in capacity planning tools continues

August 25th, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

VMware continues to lead the race in capacity planning tools, but as enterprises search for new efficiencies following IT’s transformation to a cloud-ready environment, change in the vendor landscape may follow. VMware’s lead in capacity planning tools stems from its dominant role in core virtualization, which many large and midsize enterprises have yet to complete. At first, the efficiencies that virtualization and cloud computing bring to the table often overshadow the benefits of capacity planning tools. As environments reach higher levels of virtualization, however, capacity planning remains a top pain point, cited by 9% of commentators in 2H 2013.

The results of the Wave 13 Servers and Virtualization Study show that 42% of large and midsize enterprises took advantage of capacity planning tools in 2H 2013, up from 38% in 2H 2012. Of the 11% of enterprises with future project plans for the technology, almost half (5% of the total) plan to implement capacity planning tools in the long term. We expect the technology’s enterprise proliferation to continue as server professionals seek additional cost savings and efficiencies following private cloud deployments.

In 2H 2013, VMware gathered almost three times as many selections for capacity planning tool implementations as the closest runners-up, including homegrown approaches and BMC Software, but placing bets on which will prevail in the end may be premature given the expected x86 commoditization. Microsoft jumped from seventh to fourth place for implementations between the studies and gathered the most selections for near-term project plans. Satya Nadella’s ship may become the new runner-up to VMware if it successfully converts future project plans into in-use cases. In addition, a number of smaller vendors are gaining traction in the space. A 451 Research report, titled ‘If your head is in the clouds, it helps to understand your capacity’ (client login required), discusses some of the technologies and vendors that have been getting increased awareness in capacity planning.
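
To make concrete what these tools automate at their simplest, here is a minimal, hypothetical sketch in Python: fit a linear trend to historical utilization and project when a resource runs out of headroom. The usage figures and capacity are invented for illustration; commercial tools (VMware’s vCOps, for instance, mentioned by a respondent below) do considerably more.

```python
# Minimal sketch of the projection at the heart of basic capacity planning:
# least-squares trend over daily utilization samples, extrapolated to the
# point where the resource hits capacity. All numbers are hypothetical.

def days_until_full(samples: list[float], capacity: float) -> float | None:
    """Projected days until `capacity` is reached; None if usage is flat/shrinking."""
    n = len(samples)
    xs = range(n)                      # day indices 0..n-1
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples)) / sum(
        (x - mean_x) ** 2 for x in xs
    )                                  # growth per day (least squares)
    if slope <= 0:
        return None                    # no growth: no exhaustion date
    return (capacity - samples[-1]) / slope

# Hypothetical example: ten daily readings of used TB on a 100 TB array.
usage = [62.0, 62.8, 63.1, 64.0, 64.9, 65.2, 66.1, 66.8, 67.5, 68.3]
print(f"Projected days until full: {days_until_full(usage, 100.0):.0f}")  # -> 46
```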

TheInfoPro’s respondents had the following comments about capacity planning tools:

  • “[Pain point:] Capacity planning, making sure we have capacity in place for needs of the business. We’re using custom homegrown tools and just starting to deploy a new tool, vCOps from VMware, to help with that. But the tool doesn’t exactly do everything – no tool does – so we’ll continue the homegrown pieces.” – LE, Other
  • “Capacity planning is definitely one [pain point], across any of our storage platforms. A lot of our projects are bursty, so we have busy seasons and low seasons. Nothing’s linear, it’s kind of hit and miss.” – LE, Education
  • “It’s hit or miss right now, maybe some spreadsheets and manual processes.” – LE, Financial Services
  • “Not consolidated – several tools used ad hoc.” – LE, Healthcare/Pharmaceuticals
  • “The tools are in place, they are running, but not used today. The processes don’t use the data that the system provided.” – LE, Industrial/Manufacturing