Multi-year shift in DAST approaches an apex

October 1st, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

A perceptible multi-year shift among vendors offering Web application security testing solutions, sometimes referred to as dynamic application security testing (DAST), has emerged across our quantitative information security studies. The shift suggests that, absent an acquisition, the smaller pure-play WhiteHat Security is on a trajectory to overtake much larger competitors in enterprise usage. As recently as the 2013 study, IBM and HP led the list of DAST vendors on the strength of their prior acquisitions of Watchfire and SPI Dynamics, respectively, part of application security strategies that also included code security analysis acquisitions.

[Figure: TIP-Thurs-Infosec-091814]

In 2011, IBM captured nearly 6% of 'in use' responses among interviewees' enterprises, followed by HP at 4%. Cenzic (more recently acquired by Trustwave) was a distant third, at around 1%. Fast-forward to 2014, and Qualys is now the most-cited vendor as traditional vulnerability assessment providers push further into application security. HP sits at 6%, as does IBM. WhiteHat Security, which first appeared in the study in 2012, is at 5% 'in use,' with a chance to grow two percentage points over the next year and a half based on the reported plans of information security managers.

Quotes from security managers using WhiteHat Security from the latest Information Security Study included the following:

  • “WhiteHat has the ability to execute and the quality of the service they provide. Weakness is market penetration and source code analysis and being late to the game. I would like to see them more strategic into the overall security environment.” – LE, Financial Services
  • “Their [WhiteHat Security's] product implementation had a few hiccups; we’re still struggling to implement it.” – LE, Consumer Goods/Retail
  • “It [WhiteHat Security] works as advertised, does exactly what they say it will do. Tech support is weak; there is lack of availability. Getting the human can be a real challenge.” – MSE, Financial Services
  • “We’re using WhiteHat for a few apps and will expand usage. They’re the leader in their space.” – LE, Other

The pendulum swings in internal cloud’s favor for cost comparisons

September 29th, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

More of the enterprises with a grasp on the resources needed to provide internal cloud services consider them a cheaper alternative to equivalent services from external suppliers. Our latest Cloud Computing Study shows that 57% of large and midsize enterprises have developed an understanding of internal IT costs sufficient to make a realistic comparison with similar external services. Almost half (46%) believe that an internal, on-premises cloud service is the cheaper option, while 31% consider it more expensive and 10% think it costs the same. The pendulum seems to have swung in internal cloud's favor when compared with equivalent external cloud services, but the factors that go into this evaluation make it anything but a straightforward process for enterprises.

It is best to compare the applicability and cost-effectiveness of internal and external cloud venues on a case-by-case basis, where regulations, workload types and high-availability (HA) requirements, among other factors, can make a difference. Even time has to be taken into consideration: internal cloud services that appear unattractive due to high upfront costs may turn out to be cost-effective in the long run, as sketched below. Understanding external cloud pricing models is not easy, either, but fortunately for decision-makers, 451 Research has published a Cloud Pricing Codex that can help bring clarity to the different pricing methods.
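To make the time dimension concrete, here is a minimal break-even sketch in Python. The cost figures are entirely hypothetical placeholders, not study data, and a real comparison would also weigh regulations, workload types and HA tiers.

```python
# Minimal break-even sketch: internal cloud (large upfront capex, lower
# run rate) vs. external cloud (no capex, continuous monthly spend).
# All figures are hypothetical placeholders, not benchmark data.

def cumulative_internal(months: int, upfront: float = 500_000,
                        monthly_run_rate: float = 20_000) -> float:
    """Total internal-cloud cost after a given number of months."""
    return upfront + monthly_run_rate * months

def cumulative_external(months: int, monthly_fee: float = 35_000) -> float:
    """Total external-cloud cost after the same number of months."""
    return monthly_fee * months

# Find the first month at which the internal option becomes cheaper overall.
for month in range(1, 121):
    if cumulative_internal(month) < cumulative_external(month):
        print(f"Internal cloud breaks even after ~{month} months")
        break
else:
    print("Internal cloud does not break even within 10 years")
```

With these made-up numbers, the internal option pulls ahead around month 34; change any input and the answer moves, which is precisely why a case-by-case evaluation matters.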

Enterprises should continue reviewing their environment and external ecosystem, even after initial cloud deployment, to account for the evolving risks and opportunities that the brave new cloud world brings.

[Figure: TIP-Thurs-Cloud-091114]

Study respondents had the following commentary about their considerations of costs for internal and external cloud services:

  • “Yeah, we benchmark on a biweekly basis against Amazon, adding Microsoft Azure to that. Our storage is more expensive and always will be; our compute is about equal. As costs go down, Amazon you save more buying reserve instances, but they don’t drop the price when you prepaid, unlike Google; when they do, their price drops. Buying a one-year reserve doesn’t make sense. They’ll milk that as long as they can.” – LE, Telecom/Technology
  • “[Re: costs:] It’s a factor of time as well; internal it’s a big up-front cost and then dies down. External is continuous or increasing.” – LE, Healthcare/Pharmaceuticals
  • “Cost depends. Things that required immediate HA can be more expensive in-house. Tier 2 apps would definitely be less expensive.” – LE, Consumer Goods/Retail
  • “We’re less expensive but trending to be the same; they’re [external providers] getting an economy of scale.” – LE, Financial Services
  • “[Re: costs:] In some cases we understand it pretty well, but it’s easier to explain back to business in the external model, not internal. It’s more quantifiable. Internal we have some costs we haven’t figured out; we have a hard time qualifying for a particular workload, maybe people, back-end infrastructure. In an internal cost you’re getting charged for everything.” – LE, Consumer Goods/Retail

Microsoft may have an edge in unified communications

September 26th, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

Part of the year-over-year Networking Study analysis is looking at places where Cisco may be vulnerable, given the vendor's historic dominance of the entire networking space. 2014 is not much different; Cisco is the lead 'in-plan' vendor for nine out of 10 of the hottest networking technologies as measured by the study's Heat Index. The only exception is unified communications (UC), offerings that use presence and preference information alongside a variety of media (voice, video, message management) to deliver communications as efficiently as possible, where Microsoft has gained an edge in future installations. A toy sketch of that routing idea follows.
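As a loose illustration only (a sketch of the concept, not any vendor's actual implementation), the following Python fragment picks a delivery medium from hypothetical presence states and preference tables:

```python
# Toy illustration of UC-style delivery: choose a medium from the
# recipient's presence state and preference order. Hypothetical data only.

PREFERENCES = {
    # Ordered fallbacks per presence state (illustrative, not a standard).
    "available":  ["voice", "video", "instant_message"],
    "in_meeting": ["instant_message", "email"],
    "offline":    ["email", "voicemail"],
}

def choose_medium(presence: str, supported: set) -> str:
    """Return the most-preferred medium the recipient can actually receive."""
    for medium in PREFERENCES.get(presence, ["email"]):
        if medium in supported:
            return medium
    return "email"  # last-resort fallback

print(choose_medium("in_meeting", {"voice", "instant_message"}))  # instant_message
print(choose_medium("offline",    {"voice", "video"}))            # email
```

Commercial UC suites layer far more onto this (federation, escalation from chat to voice, device handoff), but the core decision is the same: let presence and preference, rather than the sender's habit, choose the medium.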

[Figure: TIP-Thurs-Net-091114]

Despite rapid adoption growth, from 49% in use in 2012 to 72% in 2014, prospects for new UC installations remain reasonably strong. Four percent (4%) report near-term UC projects under way, with a further 8% having implementation plans penciled in over the next year and a half and beyond.

These growth numbers show why UC remains the hottest technology in the voice and video category, both in terms of new installations and continued spending increases. Thirty-one percent (31%) of those interviewed noted increased spending intentions this year, with 13% indicating plans to install such solutions. 2015 looks similarly positive, with 31% of networking managers planning to increase spending.

Network managers participating in the 2014 study had the following comments about UC:

  • “They [Cisco] are losing market share in the unified communications space, and I think the trend will begin to eat away at their other lines and technologies.” – LE, Healthcare/Pharmaceuticals
  • “It’s in production, but we aren’t really taking advantage of it, not leveraging it. Because we haven’t proven the business value for it. Without some sort of executive sponsorship it’s never gonna go anywhere. UCS really, we have the capability but we haven’t turned anything on. The PC is separate from the phone, separate from the mobile device, even though that platform can support that capability.” – LE, Consumer Goods/Retail

Is VMware CEO Pat Gelsinger really a Time Lord?

September 24th, 2014 by AnnMarie Jordan

Peter ffoulkes, Research Director for Servers and Cloud Computing

This year’s VMworld, held in San Francisco from August 24-28, was built around the themes of “No Limits” and “Break Through.” It began with an impressive audio-visual display that might have been more appropriate had many of us not been awakened around 3:20am that Sunday morning with our houses “Rockin’ and Rollin’.” The earth moved for us! I’ll say no more about the opening theme for VMworld except that I felt rather sorry for the marketing team and the production companies. Not much could be done at that stage.

Pat Gelsinger: the new (spin) ‘Doctor’

VMware’s current CEO, Pat Gelsinger, ‘regenerated’ from the previous incumbent, Paul Maritz, two years ago. Maritz had a good run as VMware’s CEO, but the ‘vRAM’ debacle required heads to roll, and so a new (spin) ‘Doctor’ was called for. Gelsinger has put his stamp on the position over the past couple of years, but the story arc hasn’t changed much. Hypervisors are still hypervisors, infrastructure management is still infrastructure management, cloud platforms are still cloud platforms, and VMware is still VMware.

The names may have changed, which makes it quite difficult to track both historical usage and forward-looking plans, but at the end of the day marketing departments like to change names to protect the guilty. Whatever the products are called today, or may be called in the future, it is clear that the hypervisor-level technologies that are the basis of VMware’s current market dominance are commoditizing. This provides leverage but no guarantee of future market share for VMware in adjacent markets (management and cloud platforms), which have notable established incumbents and a set of engagement rules that are not necessarily aligned with VMware’s historical success factors.

[Figure: TIP-Thurs-Serv-090414]

Looking forward, VMware clearly has a solid platform from which to launch its next ventures, as expressed by the ‘software-defined datacenter’ meme, but changing identity so frequently and claiming dominance based on past successes is no guarantee of future success. In the sci-fi world, ‘The Doctor’ has a well-established following despite frequent changes of identity. In the IT world, it remains to be seen whether VMware can establish the same level of loyalty or longevity enjoyed by ‘The Doctor.’

Anecdotal commentary from VMware customers illustrates reactions from TheInfoPro’s respondent community:

  • “Open to change and ideas. They are very aware of their competition and are doing what they have to, to stay ahead. They know their core – vSphere – and build around that well. Their marketing is horrible.” – LE, Energy/Utilities
  • “Strength: Their stuff works; it works well. Very stable. Kinda lives up to the reputation. Weakness: I think their story, they’ve been buying products, but their future, how they’re gonna leverage those products in their overall portfolio, is confusing. Making sure large corporations can adjust to some of the changes they make in new releases is important; some of those really threw us for a loop. They could improve on their support, as well, taking more ownership of problems. Finger-pointing. Their story about cloud, important to be clear for large corporations. Interoperability has been challenging. And cut costs down.” – LE, Financial Services
  • “Strength: They’re a market leader, a solid base product – good job of putting that together. The hypervisor itself. Weakness: They’re still in the mindset of virtualization being a strategic advantage. Industry it’s a commodity. No longer a new thing. Moving past that to cloud virtualization provider better differentiation. They need to improve support for their acquired products. Value for the money.” – LE, Financial Services
  • “VMware in many ways is a cumbersome tool to use. It works well with VMware, but not much else. I don’t think it performs as well as they tell you it does. VMware, in my opinion, is more hype than reality. Many believed it was successful because they were told it was successful.” – LE, Healthcare/Pharmaceuticals
  • “VMware is a far superior product, but Microsoft is coming on fast. VMware is expensive, and the licensing is ridiculous. Microsoft will offer much more flexibility with their licensing. This is a management decision, but we have to pay for everything.” – LE, Transportation

Mobile security pain overwhelms the enterprise

September 22nd, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

Mobile device security is the top source of pain for enterprise security managers interviewed for the most recent Information Security Study. Encompassing the general shift from BlackBerry devices to a panoply of devices owned by either the enterprise or its employees, this pain point includes general IT consumerization, employee expectations, mobile device management, and the administrative challenges created by implementing such technologies.

[Figure: TIP-Thurs-Infosec-090414]

Some pain points were consistent between last year and this, including hackers, the ineffectiveness of security awareness training, and regulatory/compliance requirements. Data security in general dropped as a concern from 15% to 7%, but with the major caveat that data-loss prevention emerged as a source of pain among 8% of respondents. Other concerns, such as third-party security and vulnerability management, experienced major upticks among interviewees.

Security respondents had the following to say about what causes them pain:

  • “Malware, it’s more targeted, more spear-phishing, going after people they know are not technical, like executives, like an email with ‘you need to reset your password, click here.’ And then that’s bad and they’ve given their password to someone from the Ukraine.” – LE, Consumer Goods/Retail
  • “Increasing frequency of regulatory inspection. Every country (140) we sell in has some kind of regulatory oversight, and they reserve the right to inspect any of our operations worldwide.” – LE, Education
  • “Mobile apps – the new way to steal information.” – LE, Financial Services
  • “XP remediation; 47,000 PCs with many running XP. Microsoft dropped support – 891 under my control.” – LE, Industrial/Manufacturing
  • “Our Web applications. Just the different vulnerabilities that are popping up left and right. You can only do so much, and you turn around and there’s a few that squeak by.” – MSE, Financial Services

The ‘software-defined’ label contorts better than a Cirque du Soleil gymnast

September 18th, 2014 by AnnMarie Jordan

Marco Coulter, Research Director for Storage

Cloud computing had a shaky start. In the early days, experts fiercely debated the term's meaning, and vendors contorted the definition better than a Cirque du Soleil gymnast to cover almost any product for sale. Enterprise professionals initially dismissed cloud computing as hype that would soon pass; it is now on almost every enterprise roadmap in some shape or form. The software-defined datacenter may be in the early stages of a similar lifecycle. Almost every infrastructure news release these days refers at some point to 'software-defined.' In Wave 18, we wanted to understand whether this was marketing hype or an implementation reality. The narratives indicate opinions run in both directions.

Already, 41% agree to some extent that they are strategically planning to move to a software-defined datacenter. For the trend to succeed, this momentum needs to be maintained while debates rage over the term's meaning. Clearly many respondents see the definition as still forming: only 6% were prepared to completely agree with the statement "We are strategically planning to move to a software-defined datacenter."

[Figure: TIP-Thurs-Stor-090414]

The majority (58%) disagree – they are not strategically moving to a software-defined datacenter. Some were quite vehement in dismissing the term as 'marketing hooey.' Others simply find it undesirable, concerned about the risk of introducing more lines of software code into the infrastructure. Some love the idea but see no reality in the offerings available today. It is early days.

That some IT pros already agree with a vague definition indicates the term is developing a comfort level among enterprise professionals. As frustrating as it may be to try to comprehend all the various definitions from vendors, and even analyst firms, perhaps we should sit back and enjoy the show.

Expect to see the term distorted heavily until enterprise professionals decide on a definition.

  • “No one frickin’ knows what ‘software-defined’ anything means.” – LE, Education
  • “Unfortunately, the industry is going that way. Software layer introduces more and more bugs. Traditionally more bugs in software than hardware.” – LE, Services: Business/Accounting/Engineering
  • “I need to do some more reading about software-defined datacenters and what they are; I’ve never heard any talk at all about software-defined anything, really.” – LE, Telecom/Technology
  • “[Software-defined is] the buzzword that VMware is throwing around, and we’ll see where it goes. New thing they had, nothing else to talk about in their features. I don’t know if the buzzword will be here in three years. Could be one that came and went.” – LE, Industrial/Manufacturing
  • “Love the concept of software-defined, but it’s all vaporware right now.” – LE, Consumer Goods/Retail
  • “Strategically, we are becoming a more service-oriented architecture; we will be more flexible in response to customers. Is that software-defined datacenter? Then yes, pretty likely.” – LE, Healthcare/Pharmaceuticals

Majority of enterprises move past virtualization on their journey to the cloud

September 12th, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

Completed this August, the Wave 7 Cloud Computing Study shows that the majority of large and midsize enterprises have reached sufficient levels of virtualization and moved on to the automation, orchestration and private cloud phases of the cloud journey. Almost a third of enterprises (32%) continue to work on virtualizing their environments, but 54% have entered the next phases in the evolution of cloud-ready internal infrastructure, compared with about 40% at the end of 2012. Automation took center stage for 26% of study respondents and is now the most common post-virtualization initiative, followed by private cloud implementations with 17% of selections and orchestration with 11%. Only 8% and 6% are still working on standardization and consolidation, respectively.

Each phase in the evolution of cloud-ready internal infrastructure presents decision-makers with an evolving set of challenges. Many enterprises own datacenters that are at different phases of this evolution, making closer collaboration with vendors an integral component of achieving their objectives. If in-use vendors fail to recognize the importance of that collaboration, there will be opportunities for other vendors as enterprises reassess provider strategies and relationships during licensing renewals.

[Figure: TIP-Thurs-Cloud-082814x]

Our respondents had the following commentary about their journey to a cloud-ready internal infrastructure:

  • “Operations team is smaller now. We use VMware a lot, and we are fairly well down the route with that. At the moment, next stage in our evolution will be looking at cheaper alternatives that can do the same job as well if not better. We have kept an eye on the sort of Microsoft catchup that seems to be coming. I don’t think they’re quite there; I wouldn’t be surprised if it’s in the next two years.” – MSE, Healthcare/Pharmaceuticals
  • “We have many datacenters, that in some centers we are in one phase or another.” – LE, Financial Services
  • “We have worked on the final piece of it, getting our designs fixed this year, what type of clustered topologies are we putting in place, how we categorize. A lot has to do with performance. Like to like things in place to optimize our costs.” – LE, Telecom/Technology

Networking organizations face change in 2014

September 10th, 2014 by AnnMarie Jordan

Daniel Kennedy, Research Director for Information Security

The majority of enterprises (68%) employ between one and 10 full-time resources for network management. Forty percent (40%) employ the same range of third-party or outsourced resources, while 45% report having no third-party resources as part of their network management operation. Some manner of team divestiture or breakup was the most common organizational change for networking teams, occurring at 28% of enterprises. In addition, 14% of networking teams experienced some manner of outsourcing. About the same percentage of respondents said the most significant organizational change was either an increase or a decrease in staffing numbers.

[Figure: TIP-Thurs-Net-082814x]

The networking team is still measured mainly by availability, with 60% of enterprises citing it as the primary evaluator of network team performance. Issue resolution time (41%) and project completion (40%) round out the top three performance measures. Sixteen percent (16%) of enterprise representatives said their teams are evaluated by no real measurement at all.
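As a reminder of what those availability targets mean in practice, here is a small Python sketch converting annual downtime into the availability percentage a team might be judged on; the outage figures are hypothetical examples, not study data.

```python
# Convert recorded downtime into an annual availability percentage.
# The outage figures below are hypothetical examples, not study data.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def availability(outage_minutes: float) -> float:
    """Percentage of the year the network was up."""
    return 100.0 * (1 - outage_minutes / MINUTES_PER_YEAR)

# Roughly "four nines," "three nines," and "two nines" of downtime.
for outage in (53, 526, 5256):
    print(f"{outage:>5} min of downtime -> {availability(outage):.3f}% available")
```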

Network managers provided the following commentary on their staffing and responsibility changes:

  • “With the uptick in the economy, there’s been a huge brain drain. Our numbers have dwindled.” – LE, Consumer Goods/Retail
  • “It depends. Switches, routers and load balancers are physical. Management tools to manage multiple devices (like central management consoles) we would prefer virtual. The whole team manages the devices – one person can access any network device.” – MSE, Other
  • “I could be on 200 devices next week, someone could be on one, depends on the project.” – LE, Financial Services

Spending increases on third-party services are most common in security

September 8th, 2014 by AnnMarie Jordan

Nikolay Yamakawa, Analyst for TheInfoPro

Large and midsize enterprises are reporting significant spending increases on third-party services in 2014, with the security function experiencing budget spikes most commonly, followed by networking, servers and virtualization, and storage. Almost half of security professionals (45%) report budget increases for third-party services this year, while another 45% are keeping spending flat and only 10% face funding cuts. Infosec thus has both the largest share of respondents reporting spending increases on third-party services and the smallest share reporting flat year-over-year spending, compared with networking, servers and virtualization, and storage respondents.

There are many factors decision-makers should take into consideration when evaluating third-party services in information security. Many consider third-party services a more cost-effective approach that relieves enterprises of building and maintaining the infrastructure and employee skill set needed to tackle ever-evolving security threats. At the same time, decision-makers must be comfortable transferring some control over their security needs to third parties. Some prefer to use third parties for specific applications – for example, event log management – while others transfer control of more than one security need. Decision-makers should evaluate their information security ecosystem to determine whether third-party services are an applicable option given the existing corporate culture, risk tolerance for certain needs, existing internal skill set and future training requirements.

[Figure: TIP-Thurs-Infosec-082114]

Security respondents had the following comments about third-party services:

  • “We are increasing third party in order to get away from maintaining and building infrastructure.” – LE, Healthcare/Pharmaceuticals
  • “[Pain point:] Data leakage – protecting information vs. negligence internally or via a third party.” – LE, Healthcare/Pharmaceuticals
  • “Infosec is generally weaker than it should be – we’re weak in our infosec, outsourcing this and most of IT to a third party.” – LE, Telecom/Technology
  • “We are looking at certain niche things; the applications are offered at a lower cost if we go to a third party.” – MSE, Energy/Utilities
  • “Rudimentary log analysis from third party.” – LE, Other

Does Microsoft have a hunting license for hypervisors this year?

September 5th, 2014 by AnnMarie Jordan

Peter ffoulkes, Research Director for Servers and Cloud Computing

With VMworld San Francisco on the immediate horizon, it seems a reasonable time to question whether the foundations of VMware’s software-defined datacenter fortress are solid as a rock or potentially being eroded by incessant waves of competitive assault.

VMware’s increasing vulnerability

The apparent unassailability of VMware's market position in server virtualization is predicated upon many factors. For most enterprise IT organizations, these include the perceived technical superiority of VMware's offerings, the investment in licenses and expertise, and the challenge of changing vendors in the middle of a multi-year server virtualization effort.

However, this scenario is steadily changing. In our recent cloud survey, just 32% of respondents were still at the virtualization stage, with 54% having moved on to automation, orchestration and private-cloud initiatives. When combined with the renewal timing of enterprise licensing agreements, it is natural for customers to reassess their provider strategies and relationships.

In our past several surveys, VMware's vulnerability – as shown by the percentage of respondents that are definitely or possibly considering switching to another vendor – has steadily increased, from 11% in 2009 to 30% in 2013, with Microsoft the most likely beneficiary.

[Figure: TIP-Thurs-Serv-082114]

With the market shifting beyond core virtualization capabilities, the timing is good for Microsoft. With Windows Server 2012, the company's hypervisor technology, Hyper-V, is widely considered to have caught up sufficiently with VMware, just as enterprises are due to migrate from earlier versions of Windows Server. VMware faces stronger competition in the automation, orchestration and private cloud arenas, where it will have to fight on more level playing fields, and Microsoft is regarded as having strong offerings and a well-established user base. Both Microsoft and VMware come from predominantly homogeneous, proprietary backgrounds, but with both companies under new leadership, more open and heterogeneous approaches are being promoted as they target the cloud computing opportunity.

With change in the wind and the technical advantage narrowing, cost becomes an increasingly important factor, providing a potential opportunity for Microsoft to leverage its licensing strategies to its competitive advantage. Is it open season yet for hypervisors? Perhaps VMworld will yield some clues.

Anecdotal commentary illustrates the sentiments expressed by TheInfoPro's respondent community:

  • “Microsoft is cheaper, and their product’s evolved to where it’s passable.” – LE, Consumer Goods/Retail
  • “We are always examining Hyper-V as an option, but no plans to switch at this time.” – LE, Financial Services
  • “Microsoft owns the licensing strategies, and we own a lot of Microsoft right now. So cost is a very big factor.” – MSE, Healthcare/Pharmaceuticals
  • “Last year I would have said 5 on the difficulty to switch. Now I see a light at the end of that tunnel. Next year I think it will be 2 or 3, and the light will be staring me in the face. And now you don’t have to force solutions into VMware, they’re all VM-friendly. Makes it easy to transition.” – LE, Services: Business/Accounting/Engineering