
Do you need Container Vulnerability Scanning and SCA Scanning?

Exploring the convergence of Container and SCA scanning

This week I met with Kodem, which has joined Oligo in the narrow but exciting world of granular runtime vulnerability detection, offering a function-level view of dependencies based on eBPF logs. A lot of the content here is also covered in the latest YouTube video.

This led me to ask a question I’ve had for a while:

Do you need Container Vulnerability Scanning and SCA Scanning?



Something that’s nagged at me for a while is that container vulnerability scanning has always had some degree of SCA detection. Aqua, Prisma, Wiz, Sysdig, and others can all detect things like Java binaries and some of their dependencies; however, this functionality has always felt limited and very language-specific - and I’ve never fully tested whether that feeling is justified.

Oligo and Kodem have forced me to revisit this, as both tools fully embrace the idea that at the container level, combined with execution data, you really can recover all of the third-party dependencies, and even which functions are called. Let’s test this with some examples.

JavaScript Detection Example

In my insecure-app repo, I added a create-react-app (JavaScript) that’s also built into a Docker image. A Snyk CLI test reveals vulnerabilities:

These packages need updating:

"react": "15.4.2", "react-dom": "15.4.2", "react-admin": "3.19.10"
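Reproducing this locally is a single command from the directory containing package.json - a sketch, assuming the Snyk CLI is installed and authenticated:

```shell
# Run from the app directory with package.json and its lockfile
# (assumes `snyk` is on PATH and `snyk auth` has been run)
snyk test
```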

Snyk confirms that we should do some updates:

Issues to fix by upgrading:

  Upgrade [email protected] to [email protected] to fix
  ✗ Information Exposure [Medium Severity][https://security.snyk.io/vuln/SNYK-JS-NODEFETCH-2342118] in [email protected]
    introduced by [email protected] > [email protected] > [email protected] > [email protected] and 1 other path(s)
  ✗ Denial of Service [Medium Severity][https://security.snyk.io/vuln/SNYK-JS-NODEFETCH-674311] in [email protected]
    introduced by [email protected] > [email protected] > [email protected] > [email protected] and 1 other path(s)

  Upgrade [email protected] to [email protected] to fix
  ✗ Cross-site Scripting (XSS) [Medium Severity][https://security.snyk.io/vuln/SNYK-JS-REACTADMIN-3319447] in [email protected]
    introduced by [email protected]

  Upgrade [email protected] to [email protected] to fix
  ✗ Information Exposure [Medium Severity][https://security.snyk.io/vuln/SNYK-JS-NODEFETCH-2342118] in [email protected]
    introduced by [email protected] > [email protected] > [email protected] > [email protected] and 1 other path(s)
  ✗ Denial of Service [Medium Severity][https://security.snyk.io/vuln/SNYK-JS-NODEFETCH-674311] in [email protected]
    introduced by [email protected] > [email protected] > [email protected] > [email protected] and 1 other path(s)

Let’s focus on CVE-2023-25572 and CVE-2020-15168: the first is a cross-site scripting vulnerability in [email protected]; the second is a transitive vulnerability in node-fetch, which comes in via react > fbjs > isomorphic-fetch > node-fetch.

Now I run the docker build and scan the image with Trivy, and the vulnerabilities show up here as well:

│ Library                    │ Vulnerability  │ Severity │ Installed │ Fixed Version        │
│ react-admin (package.json) │ CVE-2023-25572 │ MEDIUM   │ 3.19.10   │ 3.19.12, 4.7.6       │
│ node-fetch (package.json)  │ CVE-2020-15168 │ LOW      │           │ 2.6.1, 3.0.0-beta.9  │
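For reference, the build-and-scan step is just two commands - a sketch, assuming Docker and Trivy are installed; the image tag is my own choice:

```shell
# Build the image from the repo's Dockerfile (tag is arbitrary)
docker build -t insecure-app:latest .

# Scan the built image; Trivy reports OS packages and app dependencies alike
trivy image insecure-app:latest
```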

Let’s see if this holds true for a Python example:

Python Dependency Example


In my requirements.txt, I pin an outdated version of requests:

requests==2.19.1
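The Snyk CLI can be pointed at the Python manifest the same way - a sketch, again assuming the CLI is installed and authenticated:

```shell
# Scan the Python manifest directly (path relative to repo root)
snyk test --file=requirements.txt
```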

Snyk finds CVE-2018-18074, as well as transitive vulnerabilities in urllib3 such as CVE-2019-11236:

  Upgrade [email protected] to [email protected] to fix
  ✗ Information Exposure [Medium Severity][https://security.snyk.io/vuln/SNYK-PYTHON-REQUESTS-5595532] in [email protected]
    introduced by [email protected]
  ✗ Information Exposure [Critical Severity][https://security.snyk.io/vuln/SNYK-PYTHON-REQUESTS-72435] in [email protected]
    introduced by [email protected]

  Pin [email protected] to [email protected] to fix
  ✗ Regular Expression Denial of Service (ReDoS) [Medium Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-1533435] in [email protected]
    introduced by [email protected] > [email protected]
  ✗ Information Exposure Through Sent Data [Medium Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-5926907] in [email protected]
    introduced by [email protected] > [email protected]
  ✗ Information Exposure Through Sent Data [Medium Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-6002459] in [email protected]
    introduced by [email protected] > [email protected]
  ✗ HTTP Header Injection [High Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-1014645] in [email protected]
    introduced by [email protected] > [email protected]
  ✗ CRLF injection [High Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-174323] in [email protected]
    introduced by [email protected] > [email protected]
  ✗ Improper Certificate Validation [High Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-174464] in [email protected]
    introduced by [email protected] > [email protected]
  ✗ Information Exposure Through Sent Data [High Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-5969479] in [email protected]
    introduced by [email protected] > [email protected]

Interestingly enough, when I first ran Trivy, the CVE wasn’t there - I realized I hadn’t updated my Dockerfile correctly and was overwriting the version specified in my requirements.txt. Once I corrected that issue, however, Trivy found both vulnerabilities:
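The failure mode looks roughly like this - a reconstruction for illustration, not the actual Dockerfile from the repo:

```dockerfile
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt .

# Broken: a later upgrade step silently replaces the pinned requests==2.19.1,
# so the built image no longer matches what the manifest declares
# RUN pip install -r requirements.txt && pip install --upgrade requests

# Fixed: install exactly what requirements.txt pins
RUN pip install -r requirements.txt
```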

│ Library  │ Vulnerability  │ Severity │ Status │ Installed │ Fixed  │
│ urllib3  │ CVE-2019-11236 │          │        │           │ 1.24.3 │
│ requests │ CVE-2018-18074 │ HIGH     │ fixed  │ 2.19.1    │        │

SCA and Container scanning both found all vulnerabilities and provided fix versions. Your mileage will likely vary depending on language support.

SCA Scanning

- Gives the fix version in line so developers know exactly what to change

- Doesn’t require building the image to complete a scan

- Requires use of a build file that declares dependencies

- No visibility into the container that’s built

Container Scanning

- Scans the dependencies that are really there

- Also finds OS vulnerabilities

- Requires the image to be built first (to detect app vulns)

- Can’t link findings back to the package manager

The key question: if you have container vulnerability scanning, do you need SCA scanning? No, but you might want it because:

  1. Docker images are not always built as part of PRs before they’re merged to master.

  2. Docker images often include vulnerabilities developers can’t do anything about, such as upstream APT vulnerabilities.

  3. SCA can be implemented in repos without a CLI in the pipeline, and without waiting for the image to exist in a container registry.

In sum, SCA really provides value because it has a flexible implementation, not because it’s telling you something you couldn’t find via containers.

Why it matters

This brings us to Kodem (as an example - and a reminder, this isn’t sponsored). They’ve recognized the coming convergence of this space and are unifying SCA and container scanning to give people the best of both worlds. Here are a few reasons it matters to have SCA and container scanning in the same place.

Kodem Dashboard

  1. Runtime insights with function-level support provide the single greatest false-positive reduction potential - you’re only seeing things that are actually called and running, across OS and package vulnerabilities.

  2. Sometimes you’ll want to show developers their results via SCA that runs before container builds, or use it in the pipeline to fail builds; but for real visibility and remediation-prioritization workflows, you need to be able to check both boxes.

  3. Creating the code-to-cloud picture, as well as providing meaningful remediation guidance, requires both SCA and container scanning to be running.

  4. Most ASPM-type scanners treat container and SCA scanning as separate products, meaning they run their container scanning (usually Trivy) in OS-only mode to avoid duplicate-issue confusion. But this creates a different kind of confusion: different results depending on when the scan is run (pre-compile, post-compile, and on the image).

Next-generation tools like Kodem benefit from not segmenting their SCA and container scanning into separate tools, instead letting them complement one another for full visibility and fixability.

So, do you need Container Vuln Scanning and SCA?

  1. If you’re just checking the box of vulnerability scanning to say you do it, container scanning is all you need.

  2. If you want to give developers the best experience, SCA scanning is a better approach.

  3. In the near future, there won’t be a difference!

Other Updates

Latio List 1.8

  • Added Kodem to SCA and updated container vulnerability description - Kodem has really expanded their runtime insights to be a competitor in the Oligo and Deepfactor space

  • Updated Backslash's descriptions to better describe what makes their SCA and SAST unique

  • Added Apona to SCA, SAST, and DAST

  • Added Apono to Cloud Identity - creating JIT workflows to workloads and cloud providers

  • Added Aembit to Cloud Identity - the most secure way I've seen for injecting machine to machine auth credentials

  • Added Snyk to ASPM - To be frank, what Snyk's done with this module doesn't make a lot of sense
