Contrast Security Assess Overview

Contrast Security Assess is the #7-ranked solution in AST tools and the #9-ranked solution in application security tools. IT Central Station users give Contrast Security Assess an average rating of 8 out of 10. Contrast Security Assess is most commonly compared with Veracode. The top industry researching this solution is computer software, with professionals from computer software companies accounting for 32% of all views.
What is Contrast Security Assess?

Contrast Security is the world’s leading provider of security technology that enables software applications to protect themselves against cyberattacks, heralding the new era of self-protecting software. Contrast's patented deep security instrumentation is the breakthrough technology that enables highly accurate assessment and always-on protection of an entire application portfolio, without disruptive scanning or expensive security experts. Only Contrast has sensors that work actively inside applications to uncover vulnerabilities, prevent data breaches, and secure the entire enterprise from development, to operations, to production.

Contrast Security Assess is also known as Contrast Assess.

Contrast Security Assess Customers

Williams-Sonoma, Autodesk, HUAWEI, Chromeriver, RingCentral, Demandware.

Pricing Advice

What users are saying about Contrast Security Assess pricing:
  • "I like the per-application licensing model... We just license the app and we look at different vulnerabilities on that app and we remediate within the app. It's simpler."
  • "It's a tiered licensing model. The more you buy, as you cross certain quantity thresholds, the pricing changes. If you have a smaller environment, your licensing costs are going to be different than a larger environment... The licensing is primarily per application. An application can be as many agents as you need. If you've got 10 development servers and 20 production servers and 50 QA servers, all of those agents can be reporting as a single application that utilizes one license."
  • "For what it offers, it's a very reasonable cost. The way that it is priced is extremely straightforward. It works on the number of applications that you use, and you license a server. It is something that is extremely fair, because it doesn't take into consideration the number of requests, etc. It is only priced based on the number of onboarded applications. It suits our model as well, because we have huge traffic. Our number of applications is not that large, so the pricing works great for us."
  • "The good news is that the agent itself comes in two different forms: the unlicensed form and the licensed form. Unlicensed gives use of that software composition analysis for free. Thereafter, if you apply a license to that same agent, that's when the instrumentation takes hold. So one of my suggestions is to do what we're doing: Deploy the agent to as many applications as possible, with just the SCA feature turned on with no license applied, and then you can be more choosy and pick which teams will get the license applied."
  • "You only get one license for an application. Ours are very big, monolithic applications with millions of lines of code. We were able to apply one license to one monolithic application, which is great. We are happy with the licensing. Pricing-wise, they are industry-standard, which is fine."

Contrast Security Assess Reviews

Ramesh Raja
Senior Security Architect at a tech services company with 5,001-10,000 employees
Real User
Top 5
Continuously looks at application traffic, adding to the coverage of our manual pen testing

Pros and Cons

  • "We use the Contrast OSS feature that allows us to look at third-party, open-source software libraries, because it has a cool interface where you can look at all the different libraries. It has some really cool additional features where it gives us how many instances in which something has been used... It tells us it has been used 10 times out of 20 workloads, for example. Then we know for sure that OSS is being used."
  • "Contrast Security Assess covers a wide range of applications like .NET Framework, Java, PSP, Node.js, etc. But there are some like Ubuntu and the .NET Core which are not covered. They have it in their roadmap to have these agents. If they have that, we will have complete coverage."

What is our primary use case?

We use the solution for application vulnerability scanning and pen-testing. We have a workflow where we use a Contrast agent and deploy it to apps from our development team. Contrast continuously monitors the apps.

When any development team comes to us and asks, "Hey, can you take care of the Assess piece, run a pen test, and do vulnerability scanning for our application?" we follow that workflow and deploy a Contrast agent to their app. Because Contrast continuously monitors the app, notifications from Contrast go to the developers who are responsible for fixing that piece of the code. As soon as they see a notification, especially a high or critical one, they go back into Contrast, look at how to fix it, and make changes to their code. It's quite easy to then go back to Contrast and say, "Hey, just consider this as fixed, and if you see it come back again, report it to us." Since Contrast continuously looks at the app, if the finding doesn't come back in the next two days, we say, "Yeah, that's fixed." It's been working out well in our model so far.
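
To make the workflow concrete, here is a minimal sketch of how an agent of this kind is typically attached to a Java application at startup. It is an illustration, not this team's actual configuration: the jar path, application name, and property names are assumptions based on common Contrast agent conventions.

    # Minimal sketch: launch a Java app with the Contrast agent attached.
    # The paths, property names, and app jar below are illustrative assumptions.
    import subprocess

    subprocess.run([
        "java",
        "-javaagent:/opt/contrast/contrast.jar",       # instrument the JVM with the sensor
        "-Dcontrast.application.name=orders-service",  # hypothetical name; groups findings per app
        "-Dcontrast.server.environment=qa",            # hypothetical environment tag (dev/qa/prod)
        "-jar", "orders-service.jar",
    ], check=True)

Once the agent is in place, every request that QA or pen-test traffic generates is analyzed, which is what makes the "continuous" model described above possible.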

We have pre-production environments where dedicated developers look at it. We also have some of these solutions in production, so that way we can switch back.

It's hosted in their cloud and we just use it to aggregate all of our vulnerabilities there.

How has it helped my organization?

If an app team is going to deploy new features to prod, they put in a ticket saying, "We are including these features in our 2.0 release." The ticket comes to our team. We deploy Contrast Security and then we do a bunch of manual pen tests. During the time that we're doing manual pen tests, Contrast will have a bunch of additional findings because Contrast is sensor-based. It's an agent-based solution which continuously looks at traffic coming in and going out of the application. When my team does manual penetration tests, Contrast looks through those flows and that makes our coverage better. It goes hand-in-hand with our pen test team. When the manual pen-test team tests the application, Contrast is looking at that traffic. Another application, like a Qualys, doesn't go hand-in-hand with a manual pen test team. Contrast really helps us because it's more like another resource looking at traffic, and at logs. It's like a watchman looking at traffic going in and going out. I literally consider it as another resource looking at traffic, day in and day out.

Contrast has also reduced the number of false positives we have to deal with, by something like 10 to 20 percent over the 18-plus months that we've had it.

The solution is accurate 90 percent of the time. Most of the time, when Contrast has identified top vulnerabilities in the OWASP Top 10, our manual pen-test team has gone in and said, "Yes, for sure." There were times when, because of resourcing issues, we did not have people pen-testing and they would just say, "Okay, we'll see what Contrast says." And sure enough, Contrast would come back with 10 to 20 critical vulnerabilities. Then we would backtrack and have the manual pen-test team do some pen tests. They would come back and say, "Yes, it has literally identified most of them;" things like a SQL Injection, which is in the OWASP Top 10. So we've seen that happen in the past, and that's why I feel the accuracy of Contrast is pretty good.

The advantage of using Contrast is that it is continuous.

I've seen some of the development teams completely take up Contrast themselves and work in Contrast. For example, a developer will be notified of an issue and will fix the code. He will then go back to Contrast and mark it as remediated. Then, he will keep watching the portal. He will be notified if the same vulnerability is found. We have seen teams that completely like the information that Contrast provides and they work independently with Contrast, instead of having a security team guiding them and holding their hands. There are times when we do hold hands for some of the teams, but it really depends on the software developers' maturity and secure coding practices.

In addition, it definitely helps save us time and money by being able to fix software bugs earlier in the software development lifecycle. It really depends on where you put Contrast. If you put Contrast in your Dev environment, sure enough, as soon as the developer deploys his code and QA is testing it in that environment, it will immediately flag and say, for instance, "You're not using TLS 1.2." The developer will go back and make those changes. It really depends on what model you have and where you want to use Contrast to your advantage. A lot of teams put it in the development environment or a preparation environment and get to fixing vulnerabilities before something is released.

I've also seen the other side of the fence where people have deployed it in production. The vulnerabilities keep coming. Newer hacks develop over time. When teams put it in prod and an exploit happens, they can use Contrast Protect and block it on the other side. You can use it as you need to use it.

The time it saves us is on the order of one US-based FTE, a security person at an average pay level. At a bare minimum, Contrast is like having that additional resource. It's like having a CISSP guy, in the US, on our payroll. That's how we quantify it in our team and how we did so in our project proposal.

What is most valuable?

Contrast has a feature called Protect. When a real exploit comes through, we can look at it and say, "Hey, yeah, this is a Cross-Site Scripting or SQL Injection," and then we can block it.

Another especially valuable feature is the stack trace. I've been in the application security space for about 15-plus years now. I saw it when it was a baby or when people thought of it as the "icing on the cake." That was especially true when they had money. Then they would say, "Yeah, we can now look at security." Now, security is a part of the SDLC. So when Contrast identifies a vulnerability, it provides very important information, like stack trace and variables.

It also has another feature called IAST, interactive application security testing. When I started out I was actually an embedded developer, and now I'm managing an OWASP team. I've seen both ends of the spectrum and I feel that the information for every vulnerability that Contrast provides is really cool and amazing, enabling us to go and fix the vulnerabilities.

It also has features so you can tweak a policy. You can make a rule saying, "Hey, if this vulnerability comes back, it is not an issue." Or you can go and change some code in a module and tell Contrast, "This is per-design." Contrast will cleverly identify and recognize that it was marked as per-design. It will not come back and say that's a vulnerability.
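
For teams that script this triage instead of clicking through the UI, a status change like "per design" can also be made over the product's REST API. The endpoint path, payload fields, and header names below are hypothetical, for illustration only; the real API may differ.

    # Hypothetical sketch: mark a finding as "Not a Problem / By Design" via REST.
    # Endpoint path, payload shape, and headers are assumptions, not the documented API.
    import requests

    BASE = "https://example.contrastsecurity.com/Contrast/api"   # placeholder host
    HEADERS = {"Authorization": "<redacted>", "API-Key": "<redacted>"}

    def mark_per_design(org_id: str, trace_id: str, note: str) -> None:
        payload = {"status": "Not a Problem", "substatus": "By Design", "note": note}
        resp = requests.put(f"{BASE}/ng/{org_id}/traces/{trace_id}/status",
                            json=payload, headers=HEADERS, timeout=30)
        resp.raise_for_status()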

We use the Contrast OSS feature that allows us to look at third-party, open-source software libraries, because it has a cool interface where you can look at all the different libraries. It has some really cool additional features where it shows us in how many instances something has been used. For example, out of a total of, say, 500 calls, how many times has the OSS library actually been used? It tells us it has been used 10 times out of 20 workloads, for example. Then we know for sure that OSS is being used. There are tools that will tell you something is being used, but sometimes developers include libraries that are never used. Contrast goes one step further and tells you how many times something has been used.

I can't quantify the effect of the OSS feature on our software development, but it gives us a grading from A to F. In this evolving security world, customers come back to us and say, "Hey, do you guys have a pen test report?" We can go back to Contrast, pull all this stuff, and provide it to customers.

What needs improvement?

Contrast Security Assess covers a wide range of applications like .NET Framework, Java, PHP, Node.js, etc. But there are some, like Ubuntu and .NET Core, which are not covered. They have it in their roadmap to have these agents. If they have that, we will have complete coverage.

Let's say you have .NET Core in an Ubuntu setup. You probably don't have an agent that you could install at all. If Contrast gets those built up and provides wide coverage, that will make it a masterpiece. So they should explore more of the technologies that they don't support. It should also include some of the newer and future technologies. For example, Google is coming up with its own OS. If they can support agent-based or sensor-based technology there, that would really help a lot.

For how long have I used the solution?

I have been using Contrast Security Assess for a year and a half.

What do I think about the stability of the solution?

There isn't much to quantify about the stability. It runs on autopilot, like the agents themselves. It's more like a process monitor that keeps looking at traffic; it's quite similar to that. Once you put it on there, it just hangs in there until the infrastructure team decides to move the old apps from PCF to another environment. Once it has been deployed, it's done. It's all auto-maintained.

What do I think about the scalability of the solution?

It depends on how many apps a company or organization has. But whatever the different apps are that you have, you can scale it to those apps. It has wide coverage. Once you install it on an app server, even if the app is very convoluted and has too many workflows, that is no problem. Contrast is per app. It's not like when you install source-code tools, where they charge by lines of code, per KLOC. Here, it's per app. You can pick 50 apps or 100 apps and then scale it. If the app is complex, that's still no problem, because it's all per app.

We have continuously increased our license count with Contrast, because of the ease of deployment and the ease of remediating vulnerabilities. We had a fixed set for one year. When we updated about six months ago, we did purchase extra licenses and we intend to ramp up and keep going. It will be based on the business cases and the business apps that come out of our organization.

Once we get a license for an app, folks who are project managers and scrum masters, who also have access to Contrast, get emails directly. They know they can push defects right from Contrast into JIRA. We also have other tools that we use for integration, like ThreadFix, and risk, compliance, and governance tools. We take the results and upload them to those tools for the audit team to look at.
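
Contrast ships its own JIRA integration, which is what is described here; for teams that roll their own, the equivalent is a small bridge that turns a finding into a JIRA issue through JIRA's standard REST API (POST /rest/api/2/issue). The finding fields and project key below are assumptions for illustration.

    # Sketch of a custom bridge: file a JIRA defect for a Contrast finding.
    # The finding dict shape and the "APPSEC" project key are hypothetical.
    import requests

    def file_defect(finding: dict) -> str:
        issue = {
            "fields": {
                "project": {"key": "APPSEC"},
                "summary": f"[Contrast] {finding['title']}",
                "description": finding["advice"],
                "issuetype": {"name": "Bug"},
            }
        }
        resp = requests.post("https://jira.example.com/rest/api/2/issue",
                             json=issue, auth=("svc-user", "<token>"), timeout=30)
        resp.raise_for_status()
        return resp.json()["key"]   # e.g. "APPSEC-123"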

How are customer service and technical support?

They have a cool, amazing support team that really helps us. I've seen a bunch of other vendors where you put in tickets and they get back to you after a few days. But Contrast responds really fast. From the word "go," Contrast support has been really awesome.

That's their standard support; they don't have premium support. I've worked with different vendors, doing evaluations, and Contrast is top-of-the-line there.

Which solution did I use previously and why did I switch?

Before Contrast we were using regular manual pen-testing tools like Burp and other common tools. We switched to Contrast because the way it scans is different. Back in those days, security would do a pen test on Friday or Saturday — over the weekend when the traffic is less. We used to set aside time. Contrast doesn't work that way. It's continuous scanning. We install an agent and it continuously does it. Continuous is way better than having a separate time where you say, "We're going to scan at this time." The Dev-SecOps model is continuous and Contrast fits well there. That's why we made the switch.

Contrast is above par compared with the different application security tools that I've used in the past, like Veracode. I saw false positives and false negatives with all those tools. But Contrast is better than all the other tools that I've used.

How was the initial setup?

The initial setup was straightforward. At the time, I was doing a proof of concept of Contrast Security to see how it works. It was fairly simple. Our company has a bunch of apps in various environments. Initially, we wanted to make sure that it works for .NET, Java, and PCF before we procured it. It was easy.

Our implementation strategy was coverage for a complete .NET application and then coverage for a complete Java application, in and out, where you find all the vulnerabilities and you have all the different remediation steps. Then we set up meetings with the app teams to go over some of it and explain things. And then, we had a bunch of apps in PCF. These were the three that we wanted: .NET, Java, and PCF. They are our bread and butter. We did all three in 45 days.

From our side, it was just me and another infrastructure guy involved.

What about the implementation team?

We only worked with Contrast. There were times when Contrast worked with Pivotal, internally, for PCF. But they pulled it off because they have a fairly good agreement with Pivotal and its support team. Initially, we had a few issues with deploying a Contrast tile to Pivotal. But Contrast worked things out with Pivotal and got all of it up for us. It was easy for us to just deploy the tile and bind the application. Once the application is bound, it's all about the vulnerabilities and remediation.

What was our ROI?

We expect to see ROI with the architecture team, the infrastructure team, and the development teams, especially when it comes to how early in our development cycle vulnerabilities are found and remediated. That plays a big part because the longer it takes to find a software vulnerability, the higher your cost to market will be.

What's my experience with pricing, setup cost, and licensing?

I like the per-application licensing model, but there are reasons why some solutions want to do per KLOC. For us, especially because it's per app, it's really easy. We just license the app and we look at different vulnerabilities on that app and we remediate within the app. It's simpler.

If you have to go to somebody, like a Dev manager and ask him, "Hey, how many thousands of lines of code does your application have?" he will be taken aback. He'll probably say, "I don't know." It's difficult to cost-segregate and price things in that kind of model. But if, like with Contrast, they say, "Hey, your entire application — however big it is, we don't care. We're just going to use one license," that is simpler. This type of license model works better for us.

Which other solutions did I evaluate?

Before choosing Contrast Assess, we looked at Veracode and Checkmarx. 

Contrast does things continuously, so it's more of an IAST. Checkmarx didn't do that. Using it, you would have to upload a .war file and then it would do its analysis. You would then go back to the portal and see the vulnerabilities there.

It was the same with Veracode. When you take the SAST piece or the DAST piece, you have to have specific timing in some workflows, and then you upload all of the stuff to their portal and wait for results. The results would only come after three days or after five days, depending on how long it takes to scan that specific workflow.

The way the scanning is done is fundamentally different in Contrast compared to how the other solutions do it. You just install Contrast on the app server and voilà. Within five minutes you might see some vulnerabilities when you use that application workflow.

What other advice do I have?

If you are thinking about Contrast, you should evaluate it for your specific needs. Companies are different. The way they work is different. I know a bunch of companies that still have the Waterfall model. So evaluate and see how it fits in your model. It's very easy to go and buy a tool, but if it does not fit very well in your processes and in your software development lifecycle, it will be wasted money. My strongest advice is: See how well it fits in your model and in your environment. For example, are developers using more of pre-production? Are they using a Dev sandbox? How is QA working and where do they work? It should work in your process and it should work in your business model.

"Change" is the lesson I have taken away by using Contrast. The security world evolves and hackers get smarter, more sophisticated, and more technology-driven. Back in the day when security was very new, people would say a four-letter or six-letter password was more than enough. But now, there is distributed computing, where they can have a bunch of computers trying to compute permutations and combinations of your passwords. As things change, Contrast has adapted well to all the changes. Even five years ago, people would sit in a war room and deploy on weekends. Now, with the DevOps and Dev-SecOps models, Contrast is set up well for all the changes. And Contrast is pretty good in providing solutions.

Contrast is not like other, traditional tools that immediately tell you there is a security issue as you write the code. But when you have the plugin and something is deployed and somebody is using the application, that's when it's going to tell you there's an issue. I don't think it has an on-desktop tool that tells the developer about an issue as he writes the code, like a Veracode Greenlight. It is more of an IAST.

We don't have specific people for maintenance. We have more of a Dev-SecOps model. Our AppSec team has four people, so we distribute the tasks and share them with the developers. We set up a Teams integration with them, or a notification. That way, as soon as Contrast finds something, they get notified. We try to integrate teams and integrate notifications. Our concern is more about when a vulnerability is found, how long it takes for the developer to fix it. We have worked all that out with Power BI, so it actually shows us, when a vulnerability is found, how long it takes to remediate it. It's more like autopilot, not a maintenance type of thing.
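
The Power BI view described here boils down to a time-to-remediate metric per finding. Below is a minimal sketch of the same calculation, assuming a CSV export with one row per finding; the column names are assumptions, not a documented export format.

    # Sketch: mean days-to-remediate by severity from an exported findings CSV.
    # Column names ("first_detected", "closed", "severity") are assumptions.
    import pandas as pd

    df = pd.read_csv("findings.csv", parse_dates=["first_detected", "closed"])
    df["days_to_fix"] = (df["closed"] - df["first_detected"]).dt.days
    print(df.groupby("severity")["days_to_fix"].mean().round(1))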

I would rate Contrast at nine out of 10. I would never give anything a 10, but Contrast is right up there.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
ML
Director of Threat and Vulnerability Management at a consultancy with 10,001+ employees
MSP
Top 20
We're gathering vulnerability data from multiple environments in real time, fundamentally changing how we identify issues in applications

Pros and Cons

  • "The solution is very accurate in identifying vulnerabilities. In cases where we are performing application assessment using Contrast Assess, and also using legacy application security testing tools, Contrast successfully identifies the same vulnerabilities that the other tools have identified but it also identifies significantly more. In addition, it has visibility into application components that other testing methodologies are unaware of."
  • "To instrument an agent, it has to be running on a type of application technology that the agent recognizes and understands. It's excellent when it works. If we're using an application that is using an unsupported technology, then we can't instrument it at all. We do use PHP and Contrast presently doesn't support that, although it's on their roadmap. My primary hurdle is that it doesn't support all of the technologies that we use."

What is our primary use case?

The primary use case is application security testing, where we try to identify vulnerabilities within applications developed by our company.

Contrast is a cloud-hosted solution. That's where most of the data and analysis takes place. It's also how most users interact with that data. Data is collected by agents that are deployed to servers within our environment. The agent component is internal to our organization, gathering data that is sent back to the cloud.
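
In a hosted setup like this, each agent needs only a handful of settings to report back to the platform. A minimal sketch of generating that configuration is below; the key names follow the commonly used contrast_security.yaml layout but should be treated as assumptions, and real credentials belong in a secrets store, not in source.

    # Sketch: write a minimal agent configuration file (requires PyYAML).
    # Key names are assumed from the common contrast_security.yaml layout.
    import yaml

    config = {
        "api": {
            "url": "https://app.contrastsecurity.com/Contrast",  # hosted platform
            "api_key": "<redacted>",
            "service_key": "<redacted>",
            "user_name": "agent_user@ExampleOrg",                # placeholder identity
        }
    }
    with open("contrast_security.yaml", "w") as f:
        yaml.safe_dump(config, f)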

How has it helped my organization?

The way that it has improved our application security process is that we are no longer performing scans of specific environments to provide point-in-time vulnerability data. Instead, we're gathering vulnerability data from multiple environments in real time. That's a fundamental change in terms of how our program operates and how we identify vulnerabilities in applications. It gives us greater visibility and it gives us visibility much faster, while allowing us to identify issues throughout the environment, and not in just a single location.

Assess has also reduced the number of false positives we encounter. Because it is observing application traffic and it's not dependent on a response from a web server or other information, it tends to be more accurate.

Assess can identify vulnerabilities associated with application libraries where we would otherwise be dependent on other third-party solutions. It provides us visibility that we didn't have before, which is very helpful. This tends to be an area where our application owners are less focused. They're generally interested in whether or not their application has a vulnerability that is the result of code that they've written. They tend to ignore whether or not they've inherited a vulnerability from a library that they're using. Our ability to point out to them that they are using a vulnerable library is information they didn't have before.

It helps us save time and money by fixing software bugs earlier in the software development cycle, although that's difficult to quantify unless you have a metric for the resource impact of a vulnerable application, or an incident that occurs because an application was vulnerable. But we are certainly identifying vulnerabilities earlier in the process and feel that we are identifying vulnerabilities more accurately.

What is most valuable?

The solution is very accurate in identifying vulnerabilities. In cases where we are performing application assessment using Contrast Assess, and also using legacy application security testing tools, Contrast successfully identifies the same vulnerabilities that the other tools have identified but it also identifies significantly more. In addition, it has visibility into application components that other testing methodologies are unaware of.

Assess also provides the option of helping developers incorporate security elements while they're writing code. It depends on whether individual developers decide to utilize the information that's provided to them from the solution, but it definitely gives them visibility into more environments. It gives them an opportunity to remediate vulnerabilities well before production deployments.

What needs improvement?

The automation via its instrumentation methodology is very effective when the underlying application technology is supported. To instrument an agent, it has to be running on an application technology that the agent recognizes and understands. It's excellent when it works. If we're developing an application that is using an unsupported technology, then we can't instrument it at all. We use PHP and Contrast presently doesn't support that, although it's on their roadmap. My primary hurdle is that it doesn't support all of the technologies that we use. 

For how long have I used the solution?

I've been using Contrast Security Assess for three years.

What do I think about the stability of the solution?

The stability of Assess is very good. We've never had any issues where we were unable to reach the platform. We've had very few issues with the agent. And where we did have issues with the agent, it tended to be something in our environment and not with the agent itself.

What do I think about the scalability of the solution?

It scales extremely well. One of the selling points in our organization, internally, is that I'm able to tell my application owners that we can deploy Contrast for them anywhere. If they want to have their web services in the cloud, we can deploy the agent in the cloud. If they want to have web servers on-premises, we can deploy it on-premises. We can do a hybrid approach and we deploy globally. We're able to provide the same service to development teams in other parts of the world.

We're planning to use it for roughly 50 percent of our environment. We certainly intend to increase our footprint. Our objective is to do all of our application security testing through Contrast. One of our primary hurdles right now, in that regard, is that we're using technology that they don't support. If they supported all of our application technologies, our objective would be to migrate all of our applications into Contrast.

In terms of how much of the solution we're using, I put us at around 75 percent. We could get more out of the product. We could utilize the product better. A lot of that is dependent on adoption by developers. They're really not used to interactive application security testing solutions. They're used to legacy solutions like DAST or SAST. This is a change in process for them and a change in technology. We need to get further along with the developers before we can really maximize our utilization of the product.

How are customer service and technical support?

I don't think I've ever called their tech support. I've always either opened tickets through the web UI or by email. Overall, my experience with them is positive. I can't think of an occasion when they've failed to either provide me resolution or understand my issue well enough to have it escalated to other product teams that were able to resolve my issues.

Which solution did I use previously and why did I switch?

We did not use any other interactive application security testing solutions. There are very few on the market. We did use legacy technologies like DAST and SAST. We still use those technologies in our environment mostly to supplement Contrast or to assess environments that Contrast is not able to assess.

How was the initial setup?

The initial setup was both straightforward and complex. Getting the agent deployed to environments can be complex when people don't understand how it works. But once that agent is deployed, it's very simple. The agent starts gathering data immediately and the data is presented in a UI in a way that is easily understood. You pretty much have vulnerability data right away. The only hurdle is making sure that you've got the agent deployed correctly. After that, everything is very simple.

Deployment for us is ongoing, as we continue to add applications. If I were to just choose one application and look at how long it takes to deploy to that environment, if the application owner has the resources and the ability to deploy the agent, it could be done in a few hours.

In our case, because deploying the agent is a change to the environment, sometimes that impacts larger processes like change management or making sure that the appropriate resources are assigned to do that work. If you have a large environment with many servers that need to have the agent deployed, it could take days or weeks if you don't have the resources to do it. That's not really a weakness of Contrast, but I think it's important to be aware of that if an organization is going to deploy this. A security team like mine might have external dependencies. When it comes to a legacy scan, we might not need anybody's input for us to run it. But with Contrast, we definitely need other teams to help us deploy the agents. Those teams include application owners, cloud services, server management. Whoever is responsible for installing software on a server in your environment would have to participate in this process. It's not something that the security team can do alone.

A good implementation strategy would be

  • having an application inventory
  • knowing where you're going to deploy this
  • ensuring that your applications are using technologies that are supported by Contrast. 

One of the things that we've done internally to try to simplify the agent deployment process is that we give the development teams a package that includes the agent, instructions for deploying the agent, and a couple of other properties that are included in the agent to help us with overall organization. At that point, it really is just a matter of getting the agents installed.
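
As a rough illustration of that starter package, the sketch below bundles the agent with a pre-filled properties file so the receiving team only has to install it. The file names and property keys are assumptions, not this organization's actual package layout.

    # Sketch: build a per-team "starter package" (agent + properties + instructions).
    # The property keys and file layout are illustrative assumptions.
    import zipfile
    from pathlib import Path

    def build_package(team: str, app_name: str, out_dir: str = ".") -> Path:
        pkg = Path(out_dir) / f"contrast-starter-{team}.zip"
        with zipfile.ZipFile(pkg, "w") as z:
            z.write("agents/contrast.jar", "contrast.jar")        # the agent itself
            z.writestr("app.properties",
                       f"contrast.application.name={app_name}\n"
                       f"contrast.application.group={team}\n")    # org-level grouping
            z.write("docs/INSTALL.md", "INSTALL.md")              # deployment instructions
        return pkg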

Once you're gathering data, you want to work with development teams to make sure that they have access to the data. Once you're gathering data, that's when you can start working with integration points, because Contrast does allow you to create tickets in bug-tracking systems or to send alerts to communications platforms. Gathering the data is just the beginning of the process. There's also the dissemination of that data. That part is really dependent on how your organization utilizes and communicates vulnerability data.

We have under 50 users of the solution and about 80 percent are developers, while 10 percent are program management and the other 10 percent are in security. Aside from security, they're all consumers of data. The security users operate the platform, make sure that everything is in order, that applications are being added correctly, and that integration is being added correctly. All of the other users are people who are logging in to view vulnerabilities or to review the state of their applications or to gather reporting data for some deliverable. They don't actually operate or manage the platform. I'm the primary operator.

In the security department, our role in deployment and maintenance is creating those packages that I referred to earlier, packages that tell the developers or the application owners how to deploy the agents. It's the application owners who are responsible for a lot of the maintenance. They're the ones that have to make sure that the agent is part of their build process, they have to make sure that the agent is reporting correctly, and they have to make sure the agent is deployed to servers that are associated with their application. It's the agent that feeds the platform, so a lot of the maintenance is associated with maintaining the agent.

What about the implementation team?

We did not use an integrator. We got some support directly from Contrast, but it was really self-implemented. We got information about the product and made sure that we understood how it works. We had some initial hurdles where we didn't understand how the agent was supposed to work in some environments. But once we had that information, we did everything ourselves.

What was our ROI?

Anecdotally, we have seen ROI in mean time to remediation.

What's my experience with pricing, setup cost, and licensing?

It's a tiered licensing model. The more you buy, as you cross certain quantity thresholds, the pricing changes. If you have a smaller environment, your licensing costs are going to be different than in a larger environment. While the licensing is tiered, there are no mandatory minimums. With some of our other products, you have to buy at least 50 licenses in a block, or you have to buy 100 in a block. With Contrast, you can buy a single license.

The licensing is primarily per application. An application can be as many agents as you need. If you've got 10 development servers and 20 production servers and 50 QA servers, all of those agents can be reporting as a single application that utilizes one license. That's really outstanding if you want to cover a large environment, because you get a holistic view of an application under a single license.
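
The roll-up works because every agent reports under the same application name while keeping its own server identity. Here is a sketch of what that looks like at launch time, using assumed property names: each command instruments a different server, but all three report as one application and consume a single license.

    # Sketch: three environments, one licensed application.
    # Property names are assumptions based on common Contrast agent settings.
    for env, host in [("development", "dev-01"), ("qa", "qa-01"), ("production", "prod-01")]:
        print(
            "java "
            "-javaagent:/opt/contrast/contrast.jar "
            "-Dcontrast.application.name=payments-portal "  # same app name => one license
            f"-Dcontrast.server.name={host} "               # distinct server identity
            f"-Dcontrast.server.environment={env} "
            "-jar payments-portal.jar"
        )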

Licensing is done annually, for us at least, although they might have some flexibility on that. The licensing that I'm talking about is specifically Assess. There's also developer licensing where the developers can have a plugin for their development platforms. That's separate, but the structure is the same. It's also tiered. Depending on how many developers you have, you'll ultimately pay less based on quantity.

I don't believe there are any costs in addition to the standard licensing. However, they do have a part of their licensing model where they assume a certain number of developers are going to be present when you have an application. I don't know if this has changed recently, but if you buy licensing for a number of applications, they're going to assume that there are also a set ratio of developers per application, and therefore you must also buy the developer licensing. One of the challenges we've had with them is explaining to them that that's not how our developers work. In our environment, we have developers who are responsible for multiple applications. If we're buying licensing for our applications, we're somewhat forced into buying developer licensing that we don't need or can't use.

Which other solutions did I evaluate?

We evaluated all of the IAST products that were on the market at the time. Contrast was the most mature product in the space. One vendor had an IAST solution, but it wasn't a fully developed solution; they may not have even had any customers. There was another that had a fairly mature IAST product, but they hadn't done a lot of development in terms of the look and feel. Contrast was a very complete solution. It met all of our technical requirements and it was really the only IAST product that felt like a real product.

What other advice do I have?

Be prepared for the cultural change, more than the technology change. Most of the benefits that I have from the solution are the time savings where we're not scanning things and analyzing things. I now spend a lot of my time explaining to people how Contrast works, explaining to people how it changes our program, and explaining to people how Contrast fits into their development life cycle. If you're approaching it from a purely technical perspective, you're missing a big piece of what you're going to be spending your time on.

I don't have any major complaints. Most of our challenges with Contrast have been how it changes our program and how it impacts the internal culture around development. Those are not really issues with the product itself. If we have had any kind of technical hurdle, it would be that a lot of our application owners might not understand the process for deploying the agent, which is when they instrument their environment. So we spend quite a bit of time supporting that part of the process, technically, which is not necessarily a good fit for a security program, having to tell people how to install an agent.

It gathers data in real time, from agents. It doesn't perform scans; rather, it observes traffic, and that's fundamentally different from the other tools and from how those tools are used in our existing processes. We spend a lot of time on culture and process, explaining how the technology is different.

I find it very intuitive, but our users do not. We have developers who have spent the past 20 years thinking of application security in terms of a scan, and they're passive in that activity. The scan is something that's done for them, it's done to their environment, and then they're given data. Contrast is passive, it's an agent that's just gathering information, but it gives it to them directly, and that means they have to participate. They have to ingest that information, they have to be prepared for what they're going to do with it. They're not used to having that role. They're used to being the recipients of information or they're used to other people performing the service of scanning their environment, and Contrast doesn't do that.

The biggest lesson I've learned is around how our developers think about security. When they're passive in that process, when somebody else is running scans for them and telling them what to fix, the way that they operate is different than when you give them an agent in their environment and you start giving them data from multiple environments and you start automatically sending that information to a bug tracker that they use. It's the automation and visibility that they've been asking for. But now that they're getting it, they are not exactly sure what to do with it.

I was not prepared for having to have conversations about culture and process. Now that I have a better understanding of how our developers operate, what their metrics are, and how they're evaluated, as well as what constitutes success and what constitutes security on their part, it gives me a much better idea of how to interact with them. Before, we would talk about how we're seeing a certain type of security issue in our environment, and then we would try to figure out why our developers were continuing to make that mistake. Now, it's more about how developers utilize security data in their process and how we can improve that.

Right now, the visibility the solution gives us is probably a little bit painful, because this is data that the developers didn't have before. We're identifying more vulnerabilities, and that is something they were not expecting. They were used to results that originated from our previous tools and they only had a handful of vulnerabilities to address. Contrast is now finding more issues with their applications as well as finding issues that are associated with libraries. That's a lot more data than they're used to receiving. And potentially, they're surprised by how vulnerable their applications are.

The initial impact of having additional vulnerabilities that you were previously unaware of seems like a significant resource impact. A developer who normally only had to deal with a handful of findings may now have 10 or 20 or 100 findings to deal with. That may feel like a resource burden because you now have more things to fix, but ultimately that's going to be less expensive than the cost of a breach or loss of contract or anything else that might affect the business in the larger sense.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
Aggelos Karonis
Technical Information Security Team Lead at Kaizen Gaming
Real User
Top 5
An easy, fast way to improve your code security and health

Pros and Cons

  • "In our most critical applications, we have a deep dive in the code evaluation, which was something we usually did with periodic vulnerability assessments, code reviews, etc. Now, we have real time access to it. It's something that has greatly enhanced our code's quality. We have actually embedded a KPI in regards to the improvement of our code shell. For example, Contrast provides a baseline where libraries and the usability of the code are evaluated, and they produce a score. We always aim to improve that score. On a quarterly basis, we have added this to our KPIs."
  • "Personalization of the board and how to make it appealing to an organization is something that could be done on their end. The reports could be adaptable to the customer's preferences."

What is our primary use case?

Up to this point, as an information security team, we had very limited visibility over the testing of the code. We have 25 Scrum teams working, but we were only included in very specific projects where information security feedback was required and our presence was mandatory. With the use of Contrast, including the evaluation we did and the applications we have included in the system, we now have clear visibility of the code.

How has it helped my organization?

In our most critical applications, we have a deep dive in the code evaluation, which was something we usually did with periodic vulnerability assessments, code reviews, etc. Now, we have real-time access to it. It's something that has greatly enhanced our code's quality. We have actually embedded a KPI in regards to the improvement of our code health. For example, Contrast provides a baseline where libraries and the usability of the code are evaluated, and they produce a score. We always aim to improve that score. On a quarterly basis, we have added this to our KPIs.

We have a site that serves many different products. We have a sportsbook and a casino, where a lot of the casino side uses the providers' code. Our false positives are mainly due to missing data points, since we have not integrated the application on the provider's side. A request that is not checked on our side is checked on their side, leading to gaps in knowledge, which causes the false positives.

In regards to the applications that have been onboarded fully, we have had very effective results. Everything that it has identified has given us value, either in fixing it or knowing what's there and avoiding doing it again on other parts of our code. It's been very effective and straightforward.

What is most valuable?

The real-time evaluation and library vulnerability checks are the most valuable features, because we have a code that has been inherited from the past and are trying to optimize it, improve it, and remove what's not needed. In this aspect, we have had many unused libraries. That's one of the key things that we are striving to carve out at this point.

An additional feature that we appreciate is the report associated with PCI. We are Merchant Level 1 due to the number of our transactions, so we use it to test application compliance. We also use the OWASP Top 10 type of reports, since they are used by our regulators in some of the markets that we operate in, such as Portugal and Germany.

The solution's automation via its instrumentation methodology is very effective, and it was a very easy integration. It does not get affected by how many reviews we perform in the way that we have designed the release methodologies. So, it has clear visibility over every release that we do, because it is the production code which is being evaluated.

The solution has absolutely helped developers incorporate security elements while they are writing code. The great part about the fixes is that they provide a lot of guidance, such as what you should avoid doing in order to prevent future occurrences in your code. Even though the initial assessment is done by senior, more experienced engineers in our organization, we provide the fixes to more junior staff so they have a clear marker for what they shouldn't do in the future; they are receiving a good education from the tool as well.

What needs improvement?

During the period that we have been using it, we haven't identified any major issues. Personalization of the board and how to make it appealing to an organization is something that could be done on their end. The reports could be adaptable to the customer's preferences, but this isn't a big issue, as it's something that the customer can do as he builds his experience with the tool.

On the initial approaches during the PoC and the preparation of the solution, it would have been more efficient if we had been presented with a wider variety of scenarios aimed towards our main concern, which is system availability. However, once we fine-tuned things using the scenarios they provided later on in our discussions, we fixed it and went ahead.

For how long have I used the solution?

We evaluated the product twice: once in PoC and once in a 30-day trial. Then, we proceeded with using it in production, where it's been for four months. Our initial approach was almost nine months ago. So, we had a fair bit of experience with them.

What do I think about the stability of the solution?

The application is very stable because it is on-premise. So, we have had no issues with it. The stability of the solution is at a level where we just have a health check run on it and nothing more is needed. We don't have issues with capacity, with very high levels of requests, or with delays. It is very smooth at this point. We fine-tuned it during the testing week. After that, nothing changed. It handles the traffic in a very easy way. We just configure it through the Contrast tool, if needed, which is very straightforward.

The maintenance is very simple. We have had two patches applied. Therefore, I have only needed to involve our systems team two times during these four months for one hour of time. The health check of the system has been added to our monitoring team's task, therefore there is no overhead for us.
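
A health check like the one handed to the monitoring team can be as small as an HTTP probe against the on-premises instance. The base URL and path below are placeholders for illustration; the actual endpoint depends on the installation.

    # Sketch: trivial availability probe for an on-premises instance.
    # URL and path are placeholders, not a documented health endpoint.
    import requests

    def contrast_is_healthy(base_url: str = "https://contrast.internal.example") -> bool:
        try:
            return requests.get(f"{base_url}/Contrast", timeout=10).status_code == 200
        except requests.RequestException:
            return False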

What do I think about the scalability of the solution?

At this point, we have provided access to 20 people in the Contrast platform. However, it is being used by more people than that, because once a vulnerability is identified and marked as something that we should fix, it's handled by a person who may not have access to Contrast and is only presented with the specific vulnerability in order to fix it. Top management receives the reports that we give them, as well as the KPIs. So, it's used across the organization. It's not really limited to just the teams who have actual access to it.

At this point, we see great value for the applications that we have it on. We want to spread it across lower-criticality applications. This is a positive thing, because if we want to have it on a larger scale, we'll just add another web node and filter different apps onto it. It's very scalable and easy to manage. We are more than sure that it will cover the needs that we'll have in the future as well. We have weekly releases with no issues so far.

How are customer service and technical support?

Every time that we approach them with a request, we have had an immediate response, including the solution, with the exact point in the documentation. Therefore, they have been very helpful.

It was a very smooth completion of the paperwork with the sales team. That's a positive as well because we are always scared by the contract, but they monitor it on a very efficient level.

I really want to highlight how enthusiastic everyone is in Contrast, from day one of the evaluation up until the release. If we think that we should change something and improve upon it, then they have been open to listening and helping. That is something that greatly suits our mentality as an organization. 

Which solution did I use previously and why did I switch?

Prior to this, we did not have such a solution and relied on other controls.

Our initial thought was that we needed a SAST tool. So, we proceeded with approaching some vendors. What sparked the interest for Contrast is its real-time evaluation of requests from our users and identification of real-time vulnerabilities.

We have now established specific web nodes serving those requests. We get all the feedback from there along with all the vulnerabilities identified. Then, we have a clear dashboard managed by our information security team, which is the first step of evaluation. After that, we proceed with adding those pieces of the vulnerabilities to our software development life cycle.

Prior to using Contrast, we didn't have any visibility. There were no false positives; we had just the emptiness where even false positives would be a good thing. Then, within the first week of having the tool, 80 or 90 vulnerabilities had been identified, which gave us lots to do with minor false positives.

How was the initial setup?

The setup is very straightforward. Something that has worked greatly in their favor is the documentation, which, although extensive, was not very time-consuming for us to work through. We have a great team and had a very easy integration. The only problem that we stumbled onto was when we didn't know which solution would work better for our production. Once we found that out, everything went very smoothly and the operation was a success.

The final deployment, once the solution was complete, took us less than a day. However, in order to decide which solution we would go with, we had a discussion that lasted two or three working days but was split up over a week or so to get feedback from all the teams. The deployment was very fast. It took one day tops.

What about the implementation team?

Their support was one of the best I have seen. They were always very responsive, which is something that we appreciate. When you assign a person and time to work the project, you want it to be as effective as can be and not have to wait for responses from the provider.

Their sales team gave us feedback from the solution architects. They wanted to be involved in order to help us with some specific issues that we were dealing with since we were using two different technologies. We wanted some clarifications there, but this was not customer support. Instead, it was more at a solution level.

The integration of the solution's automation via its instrumentation methodology was very simple. We had excellent help from the solution architects from the Security Assess team. We had the opportunity to engage many teams within our organization: our enterprise architects, DevOps team, systems team, and information security team members. Therefore, we had a clear picture of how we should implement it, not only systems-wise, but also its organization-wide effect. At this point, we have embedded it in our software development life cycle (SDLC), and we feel that it brings value on a day-to-day basis.

We prepared a solution with the solution architect that we agreed upon. We had a clear picture of what we wanted to do. Once we put the pieces together, the deployment was super easy. We have a dedicated web node for that. So, it only runs that. We have clear applications installed on that node setup, so it's very straightforward and easy to set up. That's one of the key strengths of Contrast: It is a very easy setup once you decide what you want to do.

On our end, we had one person from the systems team; the enterprise architect, who consulted on which applications we should include; myself from information security; and DevOps, who was there just to provide information about the technologies we use on the CI/CD front. However, the actual involvement in the project through to implementation was the systems team, along with me.

From their end, they had their solution architect, and sales acted as a project manager, which helped tremendously with response times. There were just two people.

What was our ROI?

The solution has helped save us time and money by fixing software bugs earlier in the SDLC. Code health and quality improve by removing missed links and libraries, as well as chunks of extensive code that aren't needed. From many aspects, it has a good return on investment, because we have to maintain less code and a smaller number of libraries, things that otherwise greatly increase the cost of our software development.

What it saves is that when a developer writes something, he can feel free to post it for review, then release it. We are sure that if something comes up, then it will be raised by the automated tool and we will be ready to assess and resolve it. We are saving time on extensive code reviews that were happening in the past.

What's my experience with pricing, setup cost, and licensing?

For what it offers, it's a very reasonable cost. The way that it is priced is extremely straightforward. It works on the number of applications that you use, and you license a server. It is something that is extremely fair, because it doesn't take into consideration the number of requests, etc. It is only priced based on the number of applications. It suits our model as well, because we have huge traffic. Our number of onboarded applications is not that large, so the pricing works great for us.

There is a very small fee for the additional web node we have in place; it's a negligible cost. And if you decide to deploy it on existing web nodes, even that cost is eliminated. It's just something that suits our solution.

Which other solutions did I evaluate?

We had an extensive list that we examined, and we dove into a number of solutions. The excellent competitors helped because they gave us a clearer indication of what we wanted to do. We examined SonarQube and Veracode, which presented us with great products but were not a great fit for us at the time. These solutions gave us the idea of going with something much larger and broader than just a tool that produces findings. So, many competitors were examined, and we selected the one that best fit our way of doing things.

The main thing to note is that the key differentiator between Contrast and everything else we evaluated is the production value, since we had the chance to examine actual requests to our site as they passed through our code. Contrast eliminated the competition with its ability to show the live aspects of a request as it is taken. That was something we weren't able to find in other solutions.

Some of the other competitive solutions were more expensive.

What other advice do I have?

I would recommend trying and buying it. This solution is something that everyone should try in order to enhance their security. It's a very easy, fast way to improve your code security and health.

We do not use the solution’s OSS feature (through which you can look at third-party open-source software libraries) yet. We have not discussed that with our solutions architect, but it's something we may use in the future when we have more applications onboard. At this point, we have a very specific path to raise the volume of those critical apps; then we will proceed to more features.

During the renewal, or maybe even earlier than that, we will go with more apps, not just three.

One of the key takeaways is that, in order to have a secure application, you cannot rely on just pentests, vulnerability assessments, and periodic reviews. You need real-time feedback, and Contrast Assess offers that.

We were amazed to see how much easier it is to be PCI-compliant once you have the correct solution applied. We were humbled to see that we had vulnerabilities that were so easy to fix, but that we wouldn't have noticed if we didn't have this tool in place.

It is a great product. I would rate it a nine out of 10. 

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
TS
Manager at a consultancy with 10,001+ employees
Real User
Top 10
Because they're not waiting on security to complete scans for them, Dev teams are not seeing delays in deployment

Pros and Cons

  • "The most valuable feature is the continuous monitoring aspect: the fact that we don't have to wait for scans to complete for the tool to identify vulnerabilities. They're automatically identified through developers' business-as-usual processes."
  • "This has changed the way that developers are looking at usage of third-party libraries, upfront. It's changing our model of development and our culture of development to ensure that there is more thought being put into the usage of third-party libraries."
  • "Regarding the solution's OSS feature, the one drawback that we do have is that it does not have client-side support. We'll be missing identification of libraries like jQuery or JavaScript, and such, that are client-side."
  • "The setup of the solution is different for each application. That's the one thing that has been a challenge for us. The deployment itself is simple, but it's tough to automate because each application is different, so each installation process for Contrast is different."

What is our primary use case?

We've been using Contrast Security Assess for our applications that are under more of an Agile development methodology, those that need to deliver on faster timelines.

The solution is inherently cloud-based. The TeamServer aspect, the consolidated portal, is hosted by the vendor, and we have the actual Assess agent deployed in our own application environments on-prem.

How has it helped my organization?

We've historically run dynamic and static scans for all of our applications, but for the teams that need to deploy on a much faster basis, we prefer using Contrast because there are no point-in-time scans required. There isn't a lot of triage required when it comes to reviewing the results. Everything is instant and introduces few bottlenecks on the security-team side, and the developers can continue on with their development and testing without us.

We have a very large backlog at the moment for DAST scan requests from our application teams. That backlog has grown so much that some teams have missed their initial deployment timelines because they're waiting on us to become available to run dynamic scans. Now, teams that have Contrast are not seeing any delays in their deployment process because they're not waiting on us to complete scans on their behalf. The vulnerabilities are automatically identified by the tool.

What is most valuable?

The most valuable feature is the continuous monitoring aspect: the fact that we don't have to wait for scans to complete for the tool to identify vulnerabilities. They're automatically identified through developers' business-as-usual processes.

The automation of the actual vulnerability identification is great. I would give it a very high rating, given that it requires little of the security team or developers to understand and start reviewing the results that are identified.

The false positive rate is another good feature; it is very low. That means my team, the security team, has to spend less time looking at results and findings compared to historical static and dynamic scans, where the false positive rate is much higher. From a percentage perspective, somewhere around 90 percent of the time we used to spend has been given back to our team, because the false positive rate with Contrast is less than 5 percent.

In terms of the accuracy of vulnerability identification, so far we've had tens of thousands of issues identified in applications that have historically been scanned by dynamic and static scanning. So far, the large majority of those findings have been true positive. I may have seen just a handful, five or 10, false positives so far, in the scope of tens of thousands. That's a very low rate.

We also use the solution's OSS feature through which we can look at third-party open source software libraries. It is a great tool. We've never had a solution for software composition analysis. It has affected our software development greatly. Since we've never really had a solution for doing software composition, nor have we required fixes for vulnerable third-party libraries, this has changed the way that developers are looking at usage of third-party libraries, upfront. It's changing our model of development and our culture of development to ensure that there is more thought being put into the usage of third-party libraries.

The solution is definitely helping developers incorporate security elements while they are writing code. Since we're able to install Assess in Development and QA and all the pre-production environments, developers can start making use of the tool as soon as they have a deployed version of their products. As they code new features and test those out in their development environment, Contrast is already going to be automatically identifying things at that point. We are identifying issues much earlier in the software development life cycle, which makes it much less costly for developers to fix those findings.

We're saving time and money by fixing software bugs earlier in the software development life cycle. We're saving time on the developers' side, as well as on the security auditors' side.

What needs improvement?

Regarding the solution's OSS feature, the one drawback that we do have is that it does not have client-side support. We'll be missing identification of libraries like jQuery or JavaScript, and such, that are client-side.

The same thing is true on the custom code side: the client-side technology support. Although client-side technologies are inherently less risky than server-side technologies, which is where Contrast focuses testing, it would definitely help for this tool to identify both the server-side and client-side findings in libraries, as well as custom code. This would help us move away from using multiple tools. For example, if we have Contrast for our server-side testing, we still need to use some sort of static scanning sensor for the client-side. In a perfect world, it would just be Contrast Assess doing both of those.

For how long have I used the solution?

I have been using Contrast Security Assess for five months.

What do I think about the stability of the solution?

So far, the stability has been good. We've only had two applications where performance was affected by the agent. For the hundreds of other agents we've deployed thus far, there's been no impact.

What do I think about the scalability of the solution?

Scalability ties back to automation. It's very tough to scale this from an automated perspective, so we've just been doing manual installs from the beginning. An easier way to automate the deployment of the solution would be one of our hopes for the product roadmap.

How are customer service and technical support?

On a scale from one to five, Contrast technical support is about a four. I haven't had too many support issues just yet, but in each one that I have had, they have been very quick to respond; within hours as opposed to days. I haven't rated it a five just because I haven't had enough support requests to see if they are any different than other software vendors out there.

Which solution did I use previously and why did I switch?

We did not use something else specifically for interactive app testing or software composition. We've only had tools for static and dynamic testing.

Our decision to go with Contrast dates back to the whole issue of our application teams that need faster results and fewer bottlenecks. We use Fortify for static and dynamic scanning, and that creates a lot of time delays, either waiting for a scan or waiting for review of the scan results to be completed. Whereas with Contrast, there are no delays. The teams that are more Agile and deploying much more often require that feature.

How was the initial setup?

The setup of the solution is different for each application. That's the one thing that has been a challenge for us. The deployment itself is simple, but it's tough to automate because each application is different, so each installation process for Contrast is different. But manually installing the tool or deploying it is very simple.

The setup of the Contrast Assess agent is quite simple. Not much time is needed upfront to get this working and, thereafter, ongoing maintenance is very trivial for Assess.

We're still deploying. We have thousands of applications and thousands of teams around the world that we're deploying to. But if we're talking about just one application, at most it would take one to two hours.

The implementation strategy is that we are deploying it firm-wide within our organization to at least make use of the software composition analysis, because that part of the agent is a free feature. Once we have the agent deployed, that's when we start working with application teams to give them an understanding of the findings being identified, just for software composition analysis. In the meantime, the interactive application security testing feature of the same agent is working in the background. So as teams are seeing custom code vulnerabilities being identified as well, we're working with those teams to apply licenses as needed.

From the deployment perspective, we're focusing holistically on deploying the agent for software composition, and then thereafter, making more risk-based decisions on which teams or applications would use a license for interactive testing.

The adoption rate will be 100 percent because we're deploying all of these agents to all of our application servers out there. For now, we're at about 30 percent. We have a little over 100 users currently. They range from application security testers and managers, like myself, to product managers who are worried about the business side of getting the application deployed. And then there are the development teams and build engineers who comprise those teams. Each application team maintains its own instance.

What about the implementation team?

We're working with Contrast. They've provided a very helpful technical solution architect who has been helping with the deployment.

What was our ROI?

From a security team perspective, we're able to free up a lot more time. We spend less time reviewing results and can spend our time elsewhere. Developers have the same thing. They can spend more of their time working on actionable results rather than looking at false positives and waiting for the security team to complete testing on their behalf.

What's my experience with pricing, setup cost, and licensing?

The good news is that the agent itself comes in two different forms: the unlicensed form and the licensed form. 

Unlicensed gives use of that software composition analysis for free. Thereafter, if you apply a license to that same agent, that's when the instrumentation takes hold. So one of my suggestions is to do what we're doing: Deploy the agent to as many applications as possible, with just the SCA feature turned on with no license applied, and then you can be more choosy and pick which teams will get the license applied. Thankfully, it's always going to be working. You just won't be able to see the IAST results without applying that license.

There are no fees apart from the licensing fee. Some teams might run into issues where they need to spend more money on their servers and increase memory to support the Contrast Assess agent running while the application is running, but that is a small amount.

Which other solutions did I evaluate?

We did not evaluate other options. We met with Contrast and they were the leader in the space for instrumentation, so we went forward with them.

What other advice do I have?

Make sure you understand your environment before deploying. Try to get an idea of what technologies are in use by applications so you can group them and group the deployment and the implementation. That way you can focus on automating .NET deployments, for example, first, and then move on to Java, etc.

The biggest lesson I have learned from using this solution is that there is a tool out there that is really changing the way that we are running security testing. In the security realm we're used to the static and dynamic testing approaches. Contrast Assess, as well as some other tools out there, has this new feature of interactive application security testing that really is the future for developer-driven security, rather than injecting security auditors as bottlenecks into the software development life cycle.

I would rate Contrast Security an eight out of 10, and that is because of the lack of client-side support and the difficulties in automating the deployment holistically across an organization.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
TM
Director of Innovation at a tech services company with 1-10 employees
Real User
Top 10
OSS feature gives us better visibility and valuable insight into third-party open-source software libraries

Pros and Cons

  • "The accuracy of the solution in identifying vulnerabilities is better than any other product we've used, far and away. In our internal comparisons among different tools, Contrast consistently finds more impactful vulnerabilities, and also identifies vulnerabilities that are nearly guaranteed to be there, meaning that the chance of false positives is very low."
  • "Contrast's ability to support upgrades on the actual agents that get deployed is limited. Our environment is pretty much entirely Java. There are no updates associated with that. You have to actually download a new version of the .jar file and push that out to your servers where your app is hosted. That can be quite cumbersome from a change-management perspective."

What is our primary use case?

It is used primarily to help put a layer of security around some of our legacy applications that were built quite some time ago. It's also used to provide better quality assessments on the vulnerabilities of some of these applications, compared to some of the other tools that we've been using.

We're using the SaaS platform.

How has it helped my organization?

The solution’s OSS feature, through which we can look at third-party open-source software libraries, gives us better visibility into such libraries than any other tool on the market, because this is the only tool I'm aware of that offers that capability. It's not affecting our software development a whole lot, because we're not holding developers accountable to that level of metrics, but it's valuable insight to have.

In a way, Assess helps developers incorporate security elements while they are writing code. Not while they're actually writing it, but certainly while they're fixing it, because it provides really impactful feedback on how to go back and fix that code, and the best practices on how to fix it.

It also saves time and money by helping us fix software bugs earlier in the software development life cycle. The enterprise that I'm with has not, historically, prioritized any kind of security remediation at all. It considers all of it to be in a context they call "technical debt." This solution allows the organization to prioritize how to best use the labor hours allocated for technical debt. The savings are an intuitive inference to make in this case. I'm personally seeing that it's easier to get things remediated, versus where they weren't being remediated at all because the quality of the results from those other tools was just terrible. Now that I'm seeing that action being taken on them, it's very rewarding. I can nearly guarantee that we've saved time and money. I just don't know exactly how much.

What is most valuable?

The most valuable feature is the IAST part. Institutionally, we're not quite at the point of using Contrast for the Protect functionality, because we have other tools that overlap with the web application firewall component of it. But for the Assess component, there's a direct correlation to other tools we've used and the failures of those tools. In terms of providing that vulnerability assessment, Contrast provides an immediate benefit.

The effectiveness of the solution’s automation via its instrumentation methodology is a solid eight out of 10.

The accuracy of the solution in identifying vulnerabilities is better than any other product we've used, far and away. In our internal comparisons among different tools, Contrast consistently finds more impactful vulnerabilities, and also identifies vulnerabilities that are nearly guaranteed to be there, meaning that the chance of false positives is very low. The number of false positives from this product is much lower compared to competing tools that we use right now: WebInspect and AppScan. It reduces the number of false positives we encounter by more than 50 percent.

What needs improvement?

The effectiveness of the solution’s automation via its instrumentation methodology is good, although it still has a lot of room for growth. The documentation, for example, is not quite up to snuff. There are still a lot of plugins and integrations that are coming out from Contrast to help it along the way. It's really geared more for smaller companies, whereas I'm contracting for a very large organization. Any application's ability to be turnkey is probably the one thing that will set it apart, and Contrast isn't quite to the point where it's turnkey.

Also, Contrast's ability to support upgrades on the actual agents that get deployed is limited. Our environment is pretty much entirely Java, and there are no automatic updates for that. You have to actually download a new version of the .jar file and push that out to the servers where your app is hosted. That can be quite cumbersome from a change-management perspective.
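
As a rough illustration of that manual step only, here is a minimal Java sketch of staging and swapping the agent .jar on one server. The paths and file names are placeholder assumptions, not anything Contrast prescribes, and a real rollout would script this across the fleet through whatever change-management tooling the organization uses:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

/**
 * Hypothetical sketch of the manual agent-update step described above:
 * because the Java agent is just a .jar on each app server, upgrading
 * means staging the newly downloaded file and replacing the old one,
 * then restarting the JVM so the agent flag picks up the new version.
 */
public class UpdateAgentJar {
    public static void main(String[] args) throws Exception {
        Path staged = Paths.get("/tmp/contrast-new.jar");      // placeholder: newly downloaded version
        Path live   = Paths.get("/opt/contrast/contrast.jar"); // placeholder: path the JVM loads
        Files.copy(staged, live, StandardCopyOption.REPLACE_EXISTING);
        System.out.println("Agent jar replaced; restart the app server JVM to load it.");
    }
}
```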

For how long have I used the solution?

I've been using Contrast Security Assess since October of last year, making it about nine months.

What do I think about the stability of the solution?

Overall, the stability is quite good. 

We've had a couple of support-related problems. Contrast is funny because there are many aspects of it that they don't support. For instance, we have ColdFusion applications and, on paper, Contrast did not support ColdFusion. However, it will still work with ColdFusion, kind of. But that has caused some problems when it comes to isolating and troubleshooting issues that occur. It has left us in a position where we have to make generalized assumptions about what can and can't be supported. So, out-of-the-box, we've made the decision not to try to support ColdFusion because of the issues it can pose for us.

What do I think about the scalability of the solution?

The scalability ties back to something I said before about change management. So far, we haven't seen anything that would prevent us from scaling upwards significantly. However, it requires the organization to have a pretty robust way of handling the changes for Contrast: for instance, the updates of the application itself. Because those updates aren't bundled into Contrast, it behooves the organization that's deploying Contrast to ensure it has a very robust change-management strategy to work with the product.

Out of our perimeter applications, we've got about 20 apps onboarded. Those applications that it has been deployed to are key applications, including key revenue-driving applications, but it's still being used only in a minority of our applications at the moment. Our adoption rate is around 10 percent. We have plans to increase usage of Contrast Security. We have hundreds of applications. Out of our customer-focused applications that are on the perimeter — we have over 200 of them — Contrast is deployed to about 20 of them.

We have about 130 users registered to use the product. The majority, about 80 percent, are developers, while about 10 percent are security personnel, and 10 percent are managers. We have a dedicated staff for maintaining the solution. That's the staff that I'm part of right now.

How are customer service and technical support?

Their level of support and troubleshooting for the product is limited because of how they handle troubleshooting. It's done through a log file that's very cumbersome to work with.

Their technical support staff is very responsive. Personally, I've put in about 60 support tickets with Contrast. Some of the support tickets have ended up being actual changes to the product itself. Overall, I'm pretty pleased with that. But they're definitely still growing. They're a small company that is on the verge of growing into a very big company. I can tell from the quality of support I'm getting that they're struggling to keep up with that demand.

Which solution did I use previously and why did I switch?

We use WebInspect and AppScan. We're evaluating the possibility of switching from them to Contrast, but right now Contrast is still in trial. We're not quite at that point in making a decision to drop one of those other tools yet.

How was the initial setup?

The initial setup is straightforward. The version we're using is built for Java, and the setup procedure involves associating the Contrast .jar file with the JVM arguments of the app server itself. The instructions on that are relatively clear, and they've broken those instructions out per container platform that the JVM can run in. It's as clear as it can be for that product.
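
For context, attaching a Java agent is done through the JVM's standard -javaagent flag; on a real app server you would normally add it to the startup options (for example, JAVA_OPTS or CATALINA_OPTS) rather than launch the JVM yourself, and the agent still needs to be pointed at TeamServer, typically via a configuration file or environment variables. As a self-contained illustration only, with placeholder paths and a placeholder application jar, a launch might look like this minimal sketch:

```java
import java.io.IOException;

/**
 * Minimal sketch: launching an application JVM with an agent attached
 * via the standard -javaagent flag. All paths are placeholders; a real
 * app server would carry this flag in its own startup options instead
 * of being launched by hand like this.
 */
public class LaunchWithAgent {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "java",
                // Standard JVM instrumentation hook; the agent jar registers
                // itself here and instruments classes as they are loaded.
                "-javaagent:/opt/contrast/contrast.jar",
                // Placeholder application entry point.
                "-jar", "/opt/myapp/myapp.jar"
        );
        pb.inheritIO();              // stream the app's output to this console
        Process app = pb.start();
        System.exit(app.waitFor()); // propagate the app's exit code
    }
}
```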

We're still deploying. We have many apps and there's an onboarding process associated with it. But on a per-app basis, it can take us less than an hour. For a larger app, in a clustered environment, it might take closer to a week.

Because we have a very large organization, we have a different team per application. We have an onboarding process where we work with an application team to onboard the Contrast product into their workflow, and then follow up with them to ensure that they're using it correctly. It's a multi-stage approach on a per-app basis.

What about the implementation team?

We've mostly done it ourselves, although we have Contrast Security Professional Services on staff to assist with harder problems, and to follow up directly with our development teams. We've been happy with Professional Services.

What was our ROI?

We have seen ROI, but I can't get into specific numbers because those are sensitive to the organization. But some of these applications are key revenue drivers. Contrast's ability to help secure them, even if it is just those applications, gives us a little confidence that they are being looked at in terms of security. That is always going to be a significant return on investment, compared to the other tools that, frankly, weren't driving the progress necessary to secure those applications.

What's my experience with pricing, setup cost, and licensing?

If you know your needs upfront, and if you're more concerned about vulnerabilities and you already have a web application firewall that you're happy with, then focus on the Assess component of it, because the Assess component has a very straightforward licensing strategy.

If you need the web application firewall and you have a highly clustered environment, then you will be paying that license cost per server. Unfortunately, that does not scale as well for us. It helps to understand what your use case is upfront and apply that with Contrast, knowing whether or not you need it per application or per server.

Which other solutions did I evaluate?

We have not evaluated other IAST platforms.

What other advice do I have?

Make sure that you have a very good change-management strategy in place ahead of time. 

Also, it's not enough to have the solution itself. It still requires proactive management on behalf of your developers to make sure they understand what the product is offering and that they are using the product in a way that will benefit them.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
HK
Product Security Engineer at a tech services company with 10,001+ employees
Real User
Top 10
Finds high-priority issues that static scanning tools have not found

Pros and Cons

  • "No other tool does the runtime scanning like Contrast does. Other static analysis tools do static scanning, but Contrast is runtime analysis, when the routes are exercised. That's when the scan happens. This is a tool that has a very unique capability compared to other tools. That's what I like most about Contrast, that it's runtime."
  • "I would like to see them come up with more scanning rules."

What is our primary use case?

The product scans at runtime, and that is our main use case. We have deployed it for one application in our testing environment, and for the other one in our Dev environment. Whatever routes are exercised in those environments are being scanned by Contrast.

How has it helped my organization?

It has helped us to improve the overall security posture of the company. We are able to address the findings before they are reported by a third party. It helps to identify things before someone else reports them or before they have been widely exposed. It definitely improves the security posture of our applications as a whole. It also improves our own security processes within the company, the way we catch findings and resolve them. And it has helped us gain our customers' trust.

Contrast helps save time and money by fixing software bugs earlier in the software development life cycle. We have installed the app in our Dev environment, so it's way before anything goes into production. It helps us shift left in our SDLC and it definitely helps us fix findings before the code is pushed to production.

What is most valuable?

The tool has good, strong findings. We have other static analysis tools, but Contrast has found high-priority issues which other tools have not found. The capability of the tool to scan and throw errors that other tools don't catch is important.

No other tool does the runtime scanning like Contrast does. Other static analysis tools do static scanning, but Contrast is runtime analysis, when the routes are exercised. That's when the scan happens. This is a tool that has a very unique capability compared to other tools. That's what I like most about Contrast, that it's runtime.
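
Because a runtime (IAST) agent only analyzes code paths that actually execute, coverage depends on which routes get exercised. The sketch below is a hypothetical test driver, with made-up localhost endpoints, that hits a few routes of an instrumented app so the agent running inside the JVM can observe them; it is not part of Contrast itself, just an illustration of "exercising the routes":

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

/**
 * Hypothetical route-exercising driver for an instrumented test
 * environment. Each request executes server-side code, which is what
 * gives a runtime agent the chance to analyze that route.
 */
public class RouteExerciser {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // Placeholder routes of the application under test.
        List<String> routes = List.of(
                "http://localhost:8080/login",
                "http://localhost:8080/search?q=test",
                "http://localhost:8080/profile"
        );
        for (String route : routes) {
            HttpRequest request = HttpRequest.newBuilder(URI.create(route)).GET().build();
            HttpResponse<Void> response =
                    client.send(request, HttpResponse.BodyHandlers.discarding());
            System.out.println(route + " -> HTTP " + response.statusCode());
        }
    }
}
```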

There is also a feature in the tool where you can actually specify that this or that is not a problem and mark it as false positive, and it doesn't show up again on your dashboard. It's pretty easy. You can filter out your false positives and be good to go. We have seen a reduction in the number of false positives because, once you mark something as a false positive, that particular one doesn't show up.

What needs improvement?

I would like to see them come up with more scanning rules. I don't know how it was done within the tool, but there is always room for improvement.

We recently had a call with the vendor. We were talking about a finding where the tool combined all of the instances of the finding into one. Whenever a new instance shows up, that finding is reported again. We want it to work so that, once we mark it as "not a problem," a new instance will be reported as a new finding, rather than the old finding popping up as a new instance.

For how long have I used the solution?

I have been using Contrast Security Assess for about eight or nine months. I joined my current company last September and have been using it since then. In our company, we have applications to work on as subject matter experts for security. I have onboarded my applications into Contrast. After onboarding, I scan, tune the scan, and then separate the true positives from the false positives. I work with the governing team to fix the issues.

What do I think about the stability of the solution?

It's been stable. It hasn't gone down since we installed it on our cloud. The scans run every day. We have great support from the Contrast team, so they would be able to help us if we were stuck anywhere.

What do I think about the scalability of the solution?

It's easily scalable. We are planning to spread it to other teams and we are planning on one more application from within our team. It's just a matter of installing it on the proper cloud and it's good to go. It's easy to configure and you just have to decide which environment you want it on and make a few configuration changes.

In our company, it's mainly security who maintains and uses the tool. We haven't onboarded any of the developers or security champions within the company because we just started with it and we want to get to know the tool entirely. Then we can pass it on to other people in the company. For now, we, as the security team, are using it. Our team has 10 to 11 people. There are a few people from the DevOps team who have access to it to do the configuration stuff, and that team is another four or five people.

How are customer service and technical support?

Contrast's tech support is very helpful. They answer our questions and address our concerns. It's been easy and smooth with them.

Which solution did I use previously and why did I switch?

We did not have a previous solution. Contrast is a one-of-a-kind tool. It does runtime scanning so this is the only runtime scanning tool we have had.

Before me, one of my teammates was working on a different application, and he was the first person to use Contrast. Then we bought three licenses. There is one more person who used it before me, for a different application, and we have had good findings there as well. I have put the second license to use, and we have one more license available. We have identified an application to onboard, and we have also spread the word to different teams within the company; they're working closely with the Contrast team to use it in a different way. We are using the cloud version and they're still deciding on how to use it. We are just starting with Contrast, but use of it is expanding within our company.

By "application" I mean monolithic, big applications. We currently have two such applications in Contrast and we will be working on the third one. We are looking to do more.

How was the initial setup?

The setup wasn't complex. It was pretty simple. We worked with an internal team that deals with the firewalls, because that's how it has to be configured. Because it was new to us, it took time for us to understand. But otherwise, it was smooth and we were able to configure it pretty quickly. Everything together took under three months. It might have taken less time but it was during the December/January time frame so we weren't available and people from other teams weren't available.

We have an internal process where we connect with other stakeholders to come up with a plan. We worked with a different team to be able to configure it and run a scan. We also worked closely with them on key rotation and other maintenance connected to the tool. We have a lot of internal processes and our own strategy for managing and maintaining the tool, making sure it's running scans continuously and that key rotation is done.

There is also regular maintenance from Contrast, making sure that it doesn't go down.

What was our ROI?

We have definitely seen ROI. We have been able to onboard our applications and scan them. The scan is happening continuously, every day, and it does report new findings. We have been able to triage them and fix them, address the defects of the software, even before they were posted to Prod. This will help reduce our attack surface and make our products more secure.

What's my experience with pricing, setup cost, and licensing?

You only get one license for an application. Ours are very big, monolithic applications with millions of lines of code. We were able to apply one license to one monolithic application, which is great. We are happy with the licensing. Pricing-wise, they are industry-standard, which is fine.

Which other solutions did I evaluate?

There were other companies that the people involved in evaluations were looking at, but I was not involved in that process.

What other advice do I have?

It depends on the company, but if you want to manage, maintain, and onboard it, I would recommend having Contrast as part of your toolkit. It is definitely helpful. My advice would be to install it in the environment in which more routes are exercised, whether that is the testing environment or Dev, to get the most out of the tool.

In terms of configuration, we have Contrast on one application in our testing environment and on the other in the Dev environment. Deciding on that took us some time because we didn't have access to all the environments of a single application.

Findings-wise, Contrast is pretty good. It's up to the app engineer to identify whether a finding is due to the functionality of the application or whether it really is a finding.

Contrast does report some false positives, but there are some useful findings as well. It cannot give you only true positives, so it's up to humans to determine which ones are true and which are false. Applications behave in different ways, and the tool might not understand that. But there have definitely been findings that were helpful. It's a good tool. Every other tool also has false positives, and Contrast is better than some of them.

We are not actively using the solution's OSS feature, through which you can look at third-party open source software libraries, because we have other tools internally for third-party library scanning.

It's been a good journey so far.

Disclosure: IT Central Station contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
SW
Senior Customer Success Manager at a tech company with 201-500 employees
Real User
Top 5
Infuses software with vulnerability assessment capabilities for automatic flaw detection

Pros and Cons

  • "By far, the thing that was able to provide value was the immediate response while testing ahead of release, in real-time."
  • "I think there was activity underway to support the centralized configuration control. There are ways to do it, but I think they were productizing more of that."

What is our primary use case?

A good use case is a development team with an established DevOps process. The Assess product natively integrates into developer workflows to deliver immediate results. Highly accurate vulnerability findings are available at the same time as functional/regression testing results. There is no wait for time-consuming static scans.

Assess works with several languages, including Java and .NET, which are common in enterprise environments, as well as Node.js, Ruby, and Python.

What is most valuable?

Assess is valuable for several reasons, but time-saving factors are high on the list. Compared to a typical development environment with a SAST tool, Assess saves developer time and reduces the time-to-market. With Assess there is no waiting for a slow static scan to complete. Vulnerability findings are reported during testing and the reported findings are highly accurate, with very few false positives. Other SAST tools often emit a great number of false positives that must be investigated and resolved before the code can be released, consuming the time of developers and the security team chasing invalid vulnerability reports. Assess also provides clear and actionable guidance on how to fix each vulnerability, saving more time. 

Assess integrates with many common tools to generate notifications and tickets, such as JIRA tickets. The result is that application security vulnerabilities can be handled by developers as just another type of bug found during testing. Application security becomes part of the development process rather than a step that is done “after” development. The temptation to skip the security testing step to meet a release deadline is eliminated.

The combination of real-time analysis and accurate vulnerability reports can really accelerate time-to-market. One large customer was even able to eliminate the human signoff before release to production. This customer had a solid DevOps process with automated application testing, but still had the security testing and review process delaying releases. With Assess in their pipeline they were able to automate the release decision. Apps that passed functional tests and reported only vulnerabilities below a certain criticality threshold would be automatically released directly to production.
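
To make that kind of gate concrete, below is a minimal sketch of a CI step that queries a security platform for open findings and fails the build above a severity threshold. Everything platform-specific here is an invented assumption: the endpoint behind VULN_API_URL, its bearer-token auth, and its plain-integer response body are hypothetical placeholders, not the actual Contrast API.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/**
 * Hypothetical CI release gate: after automated tests run against the
 * instrumented app, ask the security platform how many open findings
 * sit at or above the configured severity threshold, and fail the
 * pipeline (non-zero exit) if any exist.
 */
public class ReleaseGate {
    public static void main(String[] args) throws Exception {
        String apiUrl = System.getenv("VULN_API_URL"); // hypothetical endpoint, e.g. .../open-vulns?minSeverity=HIGH
        String apiKey = System.getenv("VULN_API_KEY"); // hypothetical credential

        HttpRequest request = HttpRequest.newBuilder(URI.create(apiUrl))
                .header("Authorization", "Bearer " + apiKey)
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // Assume the hypothetical endpoint returns a plain integer count
        // of open findings at or above the threshold.
        int blockingFindings = Integer.parseInt(response.body().trim());
        if (blockingFindings > 0) {
            System.err.println("Release blocked: " + blockingFindings
                    + " open high-severity findings.");
            System.exit(1); // non-zero exit fails the CI stage
        }
        System.out.println("No blocking findings; proceeding with release.");
    }
}
```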

What needs improvement?

Contrast is good at listening to its customers and setting product directions based on their feedback. Contrast continues to improve along multiple axes. One axis is languages and platforms. Support for Python was recently added and Go is in beta.

Another axis is the deployment and configuration of agents. Contrast offers a lot of flexibility in agent management but is working on enhancements to improve centralized control.

For how long have I used the solution?

I've used this product for about three years.

What do I think about the stability of the solution?

Operational stability of the platform has been excellent.

The Assess agent is designed to run with the app in a preproduction environment. The agent monitors the operation of the application to which it is bound. This monitoring of course uses some processing resources and time, but the impact is usually not detectable by a human user of a web app. The additional processing might impact a loaded production system, so Contrast recommends that the Assess agent not be used in production.

However, some customers deploy Assess in production occasionally because they view the live production traffic as a source of additional test activity.

What do I think about the scalability of the solution?

Contrast is a well-designed SaaS platform and scales well. There are no practical limits on the number of users or apps. 

How are customer service and technical support?

The technical support is excellent, with a knowledgeable team and access to the necessary resources. 

How was the initial setup?

The agent installation is straightforward. Typically, for an initial user (developer) and application, Customer Success or Professional Services can just walk them through the setup over the phone. The dashboard requires no installation (SaaS), so the developer can exercise the app + agent and see vulnerabilities immediately.

Some deployments are more complex, but deployment complexity generally reflects the complexity of the customer and their overall situation. A large customer may have many business units, app teams, apps, and languages, requiring some planning. 

What other advice do I have?

Start with a small app team initially, before scheduling a larger rollout. Teams that have been using SAST tools find that using Assess changes how they think about appSec in their development workflow and helps them identify process modifications that maximize the value of the tool.

Overall, on a scale from one to ten, I would give this solution a rating of ten. The product is strong and improving, support is responsive and effective, and supported integrations work for many customers.

Disclosure: I am a real user, and this review is based on my own experience and opinions.