Do you need Container Vulnerability Scanning and SCA Scanning?
This week I met with Kodem, which has entered the narrow but exciting world of granular runtime vulnerability detection alongside Oligo, offering a function-level view of dependencies based on eBPF logs. A lot of the content here is also covered in the latest YouTube video.
This led me to ask a question I’ve had for a while:
Do you need Container Vulnerability Scanning and SCA Scanning?
Something that’s nagged at me for a while is that container vulnerability scanning has always had some degree of SCA detection. Aqua, Prisma, Wiz, Sysdig, etc. can all detect things like Java binaries and some of their dependencies; however, this functionality has always felt limited and very language-specific - but I’ve never fully tested whether that feeling is justified.
Oligo and Kodem have forced me to revisit this, as both tools fully embrace the idea that at the container level, combined with execution data, you can see all of the third-party dependencies, and even which functions are called. Let’s test this with some examples.
JavaScript Detection Example
In my insecure-app repo, I added a create-react-app (JavaScript) that’s also built into a Docker container. A Snyk CLI test reveals vulnerabilities:
These packages need updating:
"react": "15.4.2", "react-dom": "15.4.2", "react-admin": "3.19.10"
Snyk confirms that we should do some updates:
Issues to fix by upgrading:
Upgrade react@15.4.2 to react@16.5.0 to fix
✗ Information Exposure [Medium Severity][https://security.snyk.io/vuln/SNYK-JS-NODEFETCH-2342118] in node-fetch@1.7.3
introduced by react@15.4.2 > fbjs@0.8.18 > isomorphic-fetch@2.2.1 > node-fetch@1.7.3 and 1 other path(s)
✗ Denial of Service [Medium Severity][https://security.snyk.io/vuln/SNYK-JS-NODEFETCH-674311] in node-fetch@1.7.3
introduced by react@15.4.2 > fbjs@0.8.18 > isomorphic-fetch@2.2.1 > node-fetch@1.7.3 and 1 other path(s)
Upgrade react-admin@3.19.10 to react-admin@3.19.12 to fix
✗ Cross-site Scripting (XSS) [Medium Severity][https://security.snyk.io/vuln/SNYK-JS-REACTADMIN-3319447] in react-admin@3.19.10
introduced by react-admin@3.19.10
Upgrade react-dom@15.4.2 to react-dom@16.5.0 to fix
✗ Information Exposure [Medium Severity][https://security.snyk.io/vuln/SNYK-JS-NODEFETCH-2342118] in node-fetch@1.7.3
introduced by react@15.4.2 > fbjs@0.8.18 > isomorphic-fetch@2.2.1 > node-fetch@1.7.3 and 1 other path(s)
✗ Denial of Service [Medium Severity][https://security.snyk.io/vuln/SNYK-JS-NODEFETCH-674311] in node-fetch@1.7.3
introduced by react@15.4.2 > fbjs@0.8.18 > isomorphic-fetch@2.2.1 > node-fetch@1.7.3 and 1 other path(s)
Let’s focus on CVE-2023-25572 and CVE-2020-15168: the first is a cross-site scripting vulnerability in react-admin@3.19.10; the second is a transitive vulnerability in node-fetch, which comes in via react > fbjs > isomorphic-fetch > node-fetch.
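Those “introduced by” chains are what SCA tools compute by walking the resolved dependency graph from your direct dependencies down to the vulnerable package. Here’s a minimal sketch of the idea - the graph below is hand-written for illustration, not pulled from a real lockfile:

```python
# Hand-written dependency graph mirroring the chain above:
# react -> fbjs -> isomorphic-fetch -> node-fetch.
DEPS = {
    "app": ["react@15.4.2", "react-admin@3.19.10"],
    "react@15.4.2": ["fbjs@0.8.18"],
    "fbjs@0.8.18": ["isomorphic-fetch@2.2.1"],
    "isomorphic-fetch@2.2.1": ["node-fetch@1.7.3"],
    "react-admin@3.19.10": [],
}

def paths_to(target, node="app", path=None):
    """Yield every dependency path from `node` down to `target`."""
    path = (path or []) + [node]
    if node == target:
        yield path[1:]  # drop the synthetic "app" root
        return
    for dep in DEPS.get(node, []):
        yield from paths_to(target, dep, path)

for p in paths_to("node-fetch@1.7.3"):
    print(" > ".join(p))
# prints: react@15.4.2 > fbjs@0.8.18 > isomorphic-fetch@2.2.1 > node-fetch@1.7.3
```

This is also why a fix for a transitive vulnerability often means upgrading the *direct* dependency (react here), since that’s the only version you actually control.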
Now I run the docker build and scan it with Trivy, and the vulnerabilities show here as well:
│ Library                    │ Vulnerability  │ Severity │ Installed │ Fixed               │
│ react-admin (package.json) │ CVE-2023-25572 │ MEDIUM   │ 3.19.10   │ 3.19.12, 4.7.6      │
│ node-fetch (package.json)  │ CVE-2020-15168 │ LOW      │ 1.7.3     │ 2.6.1, 3.0.0-beta.9 │
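Trivy can also emit JSON (`trivy image -f json -o results.json <image>`), which makes comparing its findings with SCA output scriptable. Here’s a sketch that filters fixable findings by severity - the embedded JSON is a hand-trimmed, hypothetical version of Trivy’s `Results[].Vulnerabilities[]` schema, not actual scanner output:

```python
import json

# Hypothetical, trimmed Trivy JSON for the image above; a real file
# would come from `trivy image -f json -o results.json <image>`.
raw = """
{"Results": [{"Target": "app/package.json",
  "Vulnerabilities": [
    {"VulnerabilityID": "CVE-2023-25572", "PkgName": "react-admin",
     "InstalledVersion": "3.19.10", "FixedVersion": "3.19.12, 4.7.6", "Severity": "MEDIUM"},
    {"VulnerabilityID": "CVE-2020-15168", "PkgName": "node-fetch",
     "InstalledVersion": "1.7.3", "FixedVersion": "2.6.1, 3.0.0-beta.9", "Severity": "LOW"}]}]}
"""

def fixable(report, min_severity="MEDIUM"):
    """Return (package, CVE, fixed-version) tuples at or above min_severity."""
    order = ["UNKNOWN", "LOW", "MEDIUM", "HIGH", "CRITICAL"]
    floor = order.index(min_severity)
    out = []
    for result in report.get("Results", []):
        for v in result.get("Vulnerabilities", []) or []:
            if v.get("FixedVersion") and order.index(v["Severity"]) >= floor:
                out.append((v["PkgName"], v["VulnerabilityID"], v["FixedVersion"]))
    return out

print(fixable(json.loads(raw)))
# [('react-admin', 'CVE-2023-25572', '3.19.12, 4.7.6')]
```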
Let’s see if this holds true for a Python example:
Python Dependency Example
Dependency:
requests == 2.19.1
Snyk finds CVE-2018-18074, as well as transitive vulnerabilities in urllib3 such as CVE-2019-11236:
Upgrade requests@2.19.1 to requests@2.31.0 to fix
✗ Information Exposure [Medium Severity][https://security.snyk.io/vuln/SNYK-PYTHON-REQUESTS-5595532] in requests@2.19.1
introduced by requests@2.19.1
✗ Information Exposure [Critical Severity][https://security.snyk.io/vuln/SNYK-PYTHON-REQUESTS-72435] in requests@2.19.1
introduced by requests@2.19.1
Pin urllib3@1.23 to urllib3@1.26.18 to fix
✗ Regular Expression Denial of Service (ReDoS) [Medium Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-1533435] in urllib3@1.23
introduced by requests@2.19.1 > urllib3@1.23
✗ Information Exposure Through Sent Data [Medium Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-5926907] in urllib3@1.23
introduced by requests@2.19.1 > urllib3@1.23
✗ Information Exposure Through Sent Data [Medium Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-6002459] in urllib3@1.23
introduced by requests@2.19.1 > urllib3@1.23
✗ HTTP Header Injection [High Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-1014645] in urllib3@1.23
introduced by requests@2.19.1 > urllib3@1.23
✗ CRLF injection [High Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-174323] in urllib3@1.23
introduced by requests@2.19.1 > urllib3@1.23
✗ Improper Certificate Validation [High Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-174464] in urllib3@1.23
introduced by requests@2.19.1 > urllib3@1.23
✗ Information Exposure Through Sent Data [High Severity][https://security.snyk.io/vuln/SNYK-PYTHON-URLLIB3-5969479] in urllib3@1.23
introduced by requests@2.19.1 > urllib3@1.23
Interestingly enough, when I first ran Trivy, the CVE wasn’t there - I realized I hadn’t updated my Dockerfile correctly and was overwriting the version specified in my requirements.txt. Once I corrected that issue, however, Trivy found both vulnerabilities:
│ Library  │ Vulnerability  │ Severity │ Status │ Installed │ Fixed  │
│ urllib3  │ CVE-2019-11236 │          │        │ 1.23      │ 1.24.3 │
│ requests │ CVE-2018-18074 │ HIGH     │ fixed  │ 2.19.1    │        │
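For anyone curious, the Dockerfile mistake looked roughly like this - a reconstruction for illustration, not the exact file - where a later unpinned install silently upgrades the package that requirements.txt had pinned:

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # installs requests==2.19.1 as pinned
# The bug: a later unpinned install pulls the latest requests,
# silently replacing the vulnerable pinned version the scan was meant to catch.
RUN pip install requests
COPY . .
CMD ["python", "app.py"]
```

It’s a good reminder that image scanners report what’s actually installed in the final layer, not what your manifest says should be there.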
Results:
SCA and Container scanning both found all vulnerabilities and provided fix versions. Your mileage will likely vary depending on language support.
The key question: if you have container vulnerability scanning, do you need SCA scanning? No, but you might want it because:
Docker images are not always built as part of PRs before they’re merged to master
Docker images often include vulnerabilities developers can’t do anything about, such as upstream APT vulnerabilities
SCA can be implemented in repos without a CLI step in the pipeline, or without waiting for the image to exist in a container registry
In sum, SCA really provides value because it has a flexible implementation, not because it’s telling you something you couldn’t find via containers.
Why it matters
This brings us to Kodem (as an example; a reminder, this isn’t sponsored) - they’ve recognized the future convergence of this space and are unifying SCA and container scanning to give people the best of both worlds. Here are a few things that matter about getting SCA and container scanning in the same place.
Kodem Dashboard
Runtime insights with function-level support provide the single greatest potential for eliminating false positives - you’re only seeing things that are actually called and running, across OS and package vulnerabilities.
Sometimes you’re going to want to show developers their results via SCA that runs before container builds, or use it in the pipeline to fail builds, but for real visibility and remediation prioritization workflows, you need to be able to check both boxes.
Creating the code to cloud picture, as well as providing meaningful remediation guidance requires both SCA and container scanning to be running.
Most ASPM-type scanners treat container and SCA scanning as separate products, meaning they run their container scanning (usually Trivy) in OS-only mode to avoid duplicate-issue confusion. But this creates a different kind of confusion: getting different results depending on when the scan is run (pre-compile, post-compile, and on the image).
Next generation tools like Kodem are benefitting from not segmenting their SCA and container scanning into separate tools, but allowing them to complement one another for full visibility and fixability potential.
So, do you need Container Vuln Scanning and SCA?
If you’re just checking the box of vulnerability scanning to say you do it, container scanning is all you need
If you want to give developers the best experience, SCA scanning is a better approach
In the near future, there won’t be a difference!
Other Updates
Latio List 1.8
Added Kodem to SCA and updated container vulnerability description - Kodem has really expanded their runtime insights to be a competitor in the Oligo and Deepfactor space
Updated Backslash's descriptions to better describe what makes their SCA and SAST unique
Added Apona to SCA, SAST, and DAST
Added Apono to Cloud Identity - creating JIT workflows to workloads and cloud providers
Added Aembit to Cloud Identity - the most secure way I've seen for injecting machine to machine auth credentials
Added Snyk to ASPM - To be frank, what Snyk's done with this module doesn't make a lot of sense