
Commit 285d5db

Merge branch 'bugfix' into zip-handling-consolidation

2 parents: 989d1c8 + 74b6b87

17 files changed: 458 additions & 142 deletions


docs/content/admin/user_management/user_permission_chart.md

Lines changed: 2 additions & 0 deletions

@@ -63,10 +63,12 @@ The majority of Configuration Permissions give users access to certain pages in
 | Login Banner | n/a | n/a | Edit the login banner, located under **⚙️Configuration \> Login Banner** | n/a |
 | Announcements | n/a | n/a | Configure Announcements, located under **⚙️Configuration \> Announcements** | n/a |
 | Note Types | Access the ⚙️Configuration \> Note Types page | Add a Note Type | Edit a Note Type | Delete a Note Type |
+| Prioritization Engines | Access the Prioritization Engine configuration page | Add a new Prioritization Engine | Edit an existing Prioritization Engine | Delete a Prioritization Engine |
 | Product Types | n/a | Add a new Product Type (under Products \> Product Type) | n/a | n/a |
 | Questionnaires | Access the **Questionnaires \> All Questionnaires** page | Add a new Questionnaire | Edit an existing Questionnaire | Delete a Questionnaire |
 | Questions | Access the **Questionnaires \> Questions** page | Add a new Question | Edit an existing Question | n/a |
 | Regulations | n/a | Add a Regulation to the **⚙️Configuration \> Regulations** page | Edit an existing Regulation | Delete a Regulation |
+| Scheduling Service Schedule | Access the **Scheduling** page | Superuser only | Edit an existing Schedule (change trigger, enable/disable) | Delete a Schedule |
 | SLA Configuration | Access the **⚙️Configuration \> SLA Configuration** page | Add a new SLA Configuration | Edit an existing SLA Configuration | Delete an SLA Configuration |
 | Test Types | n/a | Add a new Test Type (under **Engagements \> Test Types**) | Edit an existing Test Type | n/a |
 | Tool Configuration | Access the **⚙️Configuration \> Tool Configuration** page | Add a new Tool Configuration | Edit an existing Tool Configuration | Delete a Tool Configuration |

docs/content/automation/rules_engine/about.md

Lines changed: 18 additions & 7 deletions

@@ -14,17 +14,28 @@ Rules Engine can only be accessed through the [Pro UI](/get_started/about/ui_pro
 Currently, Rules can only be created for Findings, however more object types will be supported in the future.
 
-Rules always need to be manually triggered from the **All Rules** page. When a rule is triggered, it will be applied to all existing Findings that match the filter conditions set.
+Rules can be triggered manually from the **All Rules** page, or scheduled to run automatically on a recurring schedule. When a rule is triggered, it will be applied to all existing Findings that match the filter conditions set.
 
 ## Possible Rule Actions
 Each Rule can apply one or more of these changes to a Finding when it is triggered successfully (i.e. matches the set Filter conditions).
 
-* Modify or append one or more informational fields on a Finding, including Title, Description, Severity, CVSSv3 Vector, Active, Verified, Risk Accepted, False Positive, Mitigated
-* Set a User to Review a Finding
-* Assign a Group as Owners for a Finding
-* Add Tags to a Finding
-* Add a Note to a Finding
-* Create an Alert in DefectDojo with custom text
+### Field Modifications
+* **Set a field** on a Finding, including Title, Description, Severity, CVSSv3 Vector, Active, Verified, Risk Accepted, False Positive, Mitigated
+* **Append or Prepend text** to a Finding's Title or Description
+* **Set Priority** — override the automatically calculated Priority value on a Finding
+* **Set Risk** — override the automatically calculated Risk level on a Finding
+* **Add, Subtract, Multiply, or Divide** the Priority value on a Finding by a given number
+
+### Assignments & Ownership
+* **Set a User to Review** a Finding
+* **Assign a Group as Owners** for a Finding
+* **Set a Mitigation Policy** on a Finding — assigns a pre-configured Mitigation Policy to the Finding
+* **Add to Risk Acceptance** — adds a Finding to an existing Risk Acceptance record (sets risk_accepted=True, active=False, and handles Jira integration and endpoint statuses)
+
+### Tags, Notes & Alerts
+* **Add Tags** to a Finding
+* **Add a Note** to a Finding
+* **Create an Alert** in DefectDojo with custom text
 
 ### Filter conditions
 Rules are automatically triggered when a Finding meets specific Filter conditions. For more information on Filters that can be used to create Rule Actions, see the [Filter Index](/navigation/pro__filter_index) page.
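The arithmetic Priority actions above can be sketched in a few lines. This is an illustrative helper only, not DefectDojo's implementation: the operator names, the `Finding` stand-in class, and the 0–100 clamp range are assumptions for the example.

```python
from dataclasses import dataclass


@dataclass
class Finding:
    """Minimal stand-in for a DefectDojo Finding (illustrative only)."""
    title: str
    priority: float


def apply_priority_action(finding: Finding, operator: str, operand: float) -> float:
    """Apply an Add/Subtract/Multiply/Divide action to a finding's priority."""
    ops = {
        "add": lambda p: p + operand,
        "subtract": lambda p: p - operand,
        "multiply": lambda p: p * operand,
        "divide": lambda p: p / operand,
    }
    if operator not in ops:
        raise ValueError(f"unknown operator: {operator}")
    if operator == "divide" and operand == 0:
        raise ValueError("cannot divide priority by zero")
    # Clamp to an assumed 0-100 range so an action cannot push the
    # priority out of bounds; the real bounds may differ.
    finding.priority = max(0.0, min(100.0, ops[operator](finding.priority)))
    return finding.priority


f = Finding(title="SQLi in /login", priority=40.0)
print(apply_priority_action(f, "multiply", 2))  # 80.0
print(apply_priority_action(f, "add", 50))      # clamped to 100.0
```

Clamping after every operation keeps chained actions (e.g. multiply then add) from drifting outside the valid range.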
Lines changed: 51 additions & 0 deletions

@@ -0,0 +1,51 @@
+---
+title: "Scheduling Rules"
+description: "Automatically run Rules Engine rules on a recurring or one-time schedule"
+weight: 2
+audience: pro
+---
+<span style="background-color:rgba(242, 86, 29, 0.3)">Note: Rules Engine Scheduling is a DefectDojo Pro-only feature.</span>
+
+Rules can be scheduled to run automatically rather than triggered manually each time. A scheduled rule will execute against all Findings that match its filter conditions at the configured time.
+
+The user setting up the schedule must have the **Change Scheduling Service Schedule** configuration permission.
+
+## Schedule Types
+
+### Single Run
+
+A Single Run schedule executes the rule once at a specific date and time. After the run completes, the schedule is not repeated.
+
+### Repeated Run
+
+A Repeated Run schedule allows you to trigger a rule on a recurring basis — for example, every day at 9:00 AM, or every Monday at 15:00.
+
+**Note:** Rules Engine schedules are limited to quarter-hour marks. The minute field of a cron schedule must be one of **0, 15, 30, or 45**. Other minute values are not permitted.
+
+Examples of valid schedules:
+- Every hour on the hour: `0 * * * *`
+- Every day at 9:15 AM: `15 9 * * *`
+- Every Monday at 3:00 PM: `0 15 * * 1`
+- Every 15 minutes: `0,15,30,45 * * * *`
+
+## Creating a Schedule for a Rule
+
+1. Navigate to the **All Rules** page from the **Rules Engine** menu in the sidebar.
+2. Find the rule you want to schedule, and open its action menu.
+3. Click **Schedule Rule**. This option is only visible if the Scheduling Service is enabled and you have the required permission.
+4. In the **Schedule Rule** modal, fill in the following fields:
+
+| Field | Description |
+|---|---|
+| **Name** | A unique name for this schedule (required, max 100 characters). |
+| **Description** | Optional description of the schedule's purpose. |
+| **Trigger Type** | Choose **Single Run** for a one-time execution, or **Repeated Run** for a recurring cron schedule. |
+| **Frequency** | For Repeated Run: use the cron builder to select the period (hourly, daily, weekly, etc.) and the specific minute, hour, and day values. For Single Run: select a date and time using the date picker. |
+| **Enable Schedule** | Toggle to enable or disable the schedule. A disabled schedule will not run until re-enabled. |
+
+5. Click **Submit** to save the schedule. The rule will run automatically at the next scheduled time.
+
+## Permissions
+
+Access to scheduling within Rules Engine requires Superuser permissions or the appropriate Configuration Permission. See [User Permission Chart](/admin/user_management/user_permission_chart) for details.
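The quarter-hour restriction described in the new page can be sketched as a small validator. The function name is illustrative, not part of DefectDojo, and this sketch only handles literal and comma-separated minute values (not `*/15` step syntax):

```python
# Minutes permitted by the Rules Engine scheduling restriction.
ALLOWED_MINUTES = {"0", "15", "30", "45"}


def is_valid_rule_schedule(cron_expr: str) -> bool:
    """Accept a five-field cron expression only if every minute value
    is on a quarter-hour mark (0, 15, 30, or 45)."""
    fields = cron_expr.split()
    if len(fields) != 5:
        return False
    minute_field = fields[0]
    # A "*" minute field would fire every minute, which is not permitted.
    if minute_field == "*":
        return False
    return all(m in ALLOWED_MINUTES for m in minute_field.split(","))


print(is_valid_rule_schedule("0 * * * *"))           # True
print(is_valid_rule_schedule("0,15,30,45 * * * *"))  # True
print(is_valid_rule_schedule("7 9 * * 1"))           # False: minute 7 is off the quarter-hour
```

All four example schedules from the page pass this check, while any expression with a stray minute value is rejected.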

docs/content/releases/pro/changelog.md

Lines changed: 7 additions & 0 deletions

@@ -12,11 +12,17 @@ For Open Source release notes, please see the [Releases page on GitHub](https://
 ## Feb 2026: v2.55
 
+### Feb 26, 2026: v2.55.5
+
+* **(Rules Engine)** Rules Engine now automatically retries when encountering database lock contention or serialization conflicts, reducing the likelihood of a rule run failing due to temporary load on the system.
+
 ### Feb 24, 2026: v2.55.4
 
 * **(Connectors)** Added Akamai API Security, JFrog Xray to Connectors.
 * **(Surveys)** Anonymous surveys: users can now access surveys without logging in when anonymous surveys are enabled.
 * **(Pro UI)** The Pro UI editor now uses Markdown-based editing for text fields. This resolves issues with HTML-string encoding, especially when Findings were manually entered or edited.
+* **(Rules Engine)** Added **Set Mitigation Policy** action type: Rules can now assign a pre-configured Mitigation Policy to matching Findings.
+* **(Rules Engine)** Added **Add to Risk Acceptance** action type: Rules can now add matching Findings to an existing Risk Acceptance record, automatically setting them as risk-accepted and inactive, and handling Jira integration and endpoint statuses.
 
 ### Feb 17, 2026: v2.55.3
 
@@ -126,6 +132,7 @@ No significant UX changes.
 #### Oct 20, 2025: v2.51.2
 
 * **(Connectors)** Added Anchore Enterprise Connector.
+* **(Rules Engine)** Rules can now be scheduled to run automatically on a recurring or one-time basis. From the Rules list, use the action menu on any rule to open the **Schedule Rule** form.
 
 #### Oct 14, 2025: v2.51.1

dojo/api_v2/serializers.py

Lines changed: 33 additions & 24 deletions

@@ -2711,8 +2711,11 @@ def process_scan(
         # Attempt to create an engagement
         logger.debug("reimport for non-existing test, using import to create new test")
         context["engagement"] = auto_create_manager.get_or_create_engagement(**context)
+        # Do not close old findings when creating a brand new test: there are no
+        # existing findings to compare against, and close_old_findings would
+        # incorrectly close findings from other tests in the same scope.
         context["test"], _, _, _, _, _, _ = self.get_importer(
-            **context,
+            **{**context, "close_old_findings": False},
         ).process_scan(
            context.pop("scan", None),
        )

@@ -2856,30 +2859,36 @@ def save(self):
            msg = "Invalid format"
            raise Exception(msg)
 
+        # Filter out ignored keys
+        language_names = [name for name in deserialized if name not in {"header", "SUM"}]
+        # Prepopulate existing Language_Type objects
+        existing_types = {
+            lt.language: lt
+            for lt in Language_Type.objects.filter(language__in=language_names)
+        }
+        # Determine which Language_Type objects need to be created
+        new_language_names = [name for name in language_names if name not in existing_types]
+        new_types = [Language_Type(language=name) for name in new_language_names]
+        Language_Type.objects.bulk_create(new_types)
+        # Add newly created Language_Type objects to the cache
+        for lt in Language_Type.objects.filter(language__in=new_language_names):
+            existing_types[lt.language] = lt
+        # Delete all Languages for this product
         Languages.objects.filter(product=product).delete()
-
-        for name in deserialized:
-            if name not in {"header", "SUM"}:
-                element = deserialized[name]
-
-                try:
-                    (
-                        language_type,
-                        _created,
-                    ) = Language_Type.objects.get_or_create(language=name)
-                except Language_Type.MultipleObjectsReturned:
-                    language_type = Language_Type.objects.filter(
-                        language=name,
-                    ).first()
-
-                language = Languages()
-                language.product = product
-                language.language = language_type
-                language.files = element.get("nFiles", 0)
-                language.blank = element.get("blank", 0)
-                language.comment = element.get("comment", 0)
-                language.code = element.get("code", 0)
-                language.save()
+        # Prepare Languages objects for bulk insert
+        languages_to_create = [
+            Languages(
+                product=product,
+                language=existing_types[name],
+                files=deserialized[name].get("nFiles", 0),
+                blank=deserialized[name].get("blank", 0),
+                comment=deserialized[name].get("comment", 0),
+                code=deserialized[name].get("code", 0),
+            )
+            for name in language_names
+        ]
+        # Bulk insert all Languages in one query
+        Languages.objects.bulk_create(languages_to_create)
 
     def validate(self, data):
         if is_scan_file_too_large(data["file"]):
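The refactor above replaces a per-row `get_or_create` (one or two queries per language) with two batched queries: prefetch existing rows into a cache, bulk-create only the missing ones, then build all child rows in memory for a single bulk insert. A framework-free sketch of that pattern, where `FakeTable` is an illustrative stand-in for the ORM:

```python
class FakeTable:
    """Minimal stand-in for a DB table keyed by a unique column (illustrative)."""

    def __init__(self):
        self.rows = {}

    def filter_in(self, keys):
        # Mirrors Language_Type.objects.filter(language__in=keys)
        return [self.rows[k] for k in keys if k in self.rows]

    def bulk_create(self, keys):
        # Mirrors Language_Type.objects.bulk_create(...)
        for k in keys:
            self.rows.setdefault(k, {"language": k})


def sync_languages(deserialized: dict, language_types: FakeTable) -> list[dict]:
    # Filter out ignored keys, mirroring {"header", "SUM"} in the real code
    names = [n for n in deserialized if n not in {"header", "SUM"}]
    # Prepopulate the cache with existing rows (one "query")
    cache = {row["language"]: row for row in language_types.filter_in(names)}
    # Bulk-create only the missing rows (one "query")
    missing = [n for n in names if n not in cache]
    language_types.bulk_create(missing)
    cache.update({row["language"]: row for row in language_types.filter_in(missing)})
    # Build all child rows in memory, ready for one bulk insert
    return [
        {"language": cache[n], "code": deserialized[n].get("code", 0)}
        for n in names
    ]


table = FakeTable()
table.bulk_create(["Python"])  # pre-existing row
rows = sync_languages({"header": {}, "Python": {"code": 120}, "Go": {"code": 40}}, table)
print([r["language"]["language"] for r in rows])  # ['Python', 'Go']
```

The query count becomes constant regardless of how many languages the uploaded report contains, which is the point of the change.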

dojo/finding/deduplication.py

Lines changed: 13 additions & 6 deletions

@@ -8,7 +8,7 @@
 from django.db.models.query_utils import Q
 
 from dojo.celery import app
-from dojo.models import Finding, System_Settings
+from dojo.models import Endpoint_Status, Finding, System_Settings
 
 logger = logging.getLogger(__name__)
 deduplicationLogger = logging.getLogger("dojo.specific-loggers.deduplication")

@@ -295,12 +295,19 @@ def build_candidate_scope_queryset(test, mode="deduplication", service=None):
     # Base prefetches for both modes
     prefetch_list = ["endpoints", "vulnerability_id_set", "found_by"]
 
-    # Additional prefetches for reimport mode
+    # Additional prefetches for reimport mode: fetch only non-special endpoint statuses with their
+    # endpoint joined in, so endpoint_manager can read status_finding_non_special directly without
+    # any extra DB queries
     if mode == "reimport":
-        prefetch_list.extend([
-            "status_finding",
-            "status_finding__endpoint",
-        ])
+        prefetch_list.append(
+            Prefetch(
+                "status_finding",
+                queryset=Endpoint_Status.objects.exclude(
+                    Q(false_positive=True) | Q(out_of_scope=True) | Q(risk_accepted=True),
+                ).select_related("endpoint"),
+                to_attr="status_finding_non_special",
+            ),
+        )
 
     return (
         queryset
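The `Prefetch` above precomputes, per finding, the list of "non-special" endpoint statuses: those that are not false positive, out of scope, or risk accepted. A plain-Python sketch of that exclusion, with an illustrative `EndpointStatus` class standing in for the Django model:

```python
from dataclasses import dataclass


@dataclass
class EndpointStatus:
    """Illustrative stand-in for dojo.models.Endpoint_Status."""
    endpoint: str
    false_positive: bool = False
    out_of_scope: bool = False
    risk_accepted: bool = False


def non_special_statuses(statuses: list) -> list:
    """Mirror of Endpoint_Status.objects.exclude(
    Q(false_positive=True) | Q(out_of_scope=True) | Q(risk_accepted=True))."""
    return [
        s for s in statuses
        if not (s.false_positive or s.out_of_scope or s.risk_accepted)
    ]


statuses = [
    EndpointStatus("https://a.example"),
    EndpointStatus("https://b.example", false_positive=True),
    EndpointStatus("https://c.example", risk_accepted=True),
]
print([s.endpoint for s in non_special_statuses(statuses)])  # ['https://a.example']
```

Doing this once at prefetch time, with the endpoint `select_related` in, means the reimport code paths that follow never issue a per-finding `exclude()` query.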

dojo/importers/default_reimporter.py

Lines changed: 2 additions & 6 deletions

@@ -766,12 +766,8 @@ def process_matched_mitigated_finding(
         else:
             # TODO: Delete this after the move to Locations
             # Reactivate mitigated endpoints that are not false positives, out of scope, or risk accepted
-            endpoint_statuses = existing_finding.status_finding.exclude(
-                Q(false_positive=True)
-                | Q(out_of_scope=True)
-                | Q(risk_accepted=True),
-            )
-            self.endpoint_manager.chunk_endpoints_and_reactivate(endpoint_statuses)
+            # status_finding_non_special is prefetched by build_candidate_scope_queryset
+            self.endpoint_manager.chunk_endpoints_and_reactivate(existing_finding.status_finding_non_special)
             existing_finding.notes.add(note)
             self.reactivated_items.append(existing_finding)
         # The new finding is active while the existing one is mitigated. The existing finding needs to

dojo/importers/endpoint_manager.py

Lines changed: 3 additions & 2 deletions

@@ -157,8 +157,9 @@ def update_endpoint_status(
         """Update the list of endpoints from the new finding with the list that is in the old finding"""
         # New endpoints are already added in serializers.py / views.py (see comment "# for existing findings: make sure endpoints are present or created")
         # So we only need to mitigate endpoints that are no longer present
-        # using `.all()` will mark as mitigated also `endpoint_status` with flags `false_positive`, `out_of_scope` and `risk_accepted`. This is a known issue. This is not a bug. This is a future.
-        existing_finding_endpoint_status_list = existing_finding.status_finding.all()
+        # status_finding_non_special is prefetched by build_candidate_scope_queryset with the
+        # special-status exclusion and endpoint select_related already applied at the DB level
+        existing_finding_endpoint_status_list = existing_finding.status_finding_non_special
         new_finding_endpoints_list = new_finding.unsaved_endpoints
         if new_finding.is_mitigated:
             # New finding is mitigated, so mitigate all old endpoints

dojo/importers/location_manager.py

Lines changed: 9 additions & 7 deletions

@@ -127,17 +127,19 @@ def update_location_status(
         """Update the list of locations from the new finding with the list that is in the old finding"""
         # New endpoints are already added in serializers.py / views.py (see comment "# for existing findings: make sure endpoints are present or created")
         # So we only need to mitigate endpoints that are no longer present
-        # using `.all()` will mark as mitigated also `endpoint_status` with flags `false_positive`, `out_of_scope` and `risk_accepted`. This is a known issue. This is not a bug. This is a future.
-
+        existing_location_refs: QuerySet[LocationFindingReference] = existing_finding.locations.exclude(
+            status__in=[
+                FindingLocationStatus.FalsePositive,
+                FindingLocationStatus.RiskAccepted,
+                FindingLocationStatus.OutOfScope,
+            ],
+        )
         if new_finding.is_mitigated:
             # New finding is mitigated, so mitigate all existing location refs
-            self.chunk_locations_and_mitigate(existing_finding.locations.all(), user)
+            self.chunk_locations_and_mitigate(existing_location_refs, user)
         else:
             # New finding not mitigated; so, reactivate all refs
-            existing_location_refs: QuerySet[LocationFindingReference] = existing_finding.locations.all()
-
-            new_locations_values = [str(location) for location in new_finding.unsaved_locations]
-
+            new_locations_values = {str(location) for location in new_finding.unsaved_locations}
             # Reactivate endpoints in the old finding that are in the new finding
             location_refs_to_reactivate = existing_location_refs.filter(location__location_value__in=new_locations_values)
             # Mitigate endpoints in the existing finding not in the new finding
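The branch above partitions the existing (non-special) location refs into those still present in the new finding (reactivate) and those no longer present (mitigate); the diff also switches `new_locations_values` from a list to a set, making each membership test O(1). A sketch of the partition, with plain strings standing in for location refs:

```python
def partition_locations(existing_refs: list, new_locations: set) -> tuple:
    """Return (refs_to_reactivate, refs_to_mitigate).

    Refs present in the new finding are reactivated; refs that have
    disappeared from the new scan results are mitigated.
    """
    reactivate = [ref for ref in existing_refs if ref in new_locations]
    mitigate = [ref for ref in existing_refs if ref not in new_locations]
    return reactivate, mitigate


existing = ["https://a.example/login", "https://a.example/admin"]
new = {"https://a.example/login"}
reactivate, mitigate = partition_locations(existing, new)
print(reactivate)  # ['https://a.example/login']
print(mitigate)    # ['https://a.example/admin']
```

In the real code the two halves are computed as queryset `filter`/`exclude` calls against the same pre-filtered `existing_location_refs`, so the database does the partitioning.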

dojo/location/models.py

Lines changed: 23 additions & 6 deletions

@@ -111,9 +111,19 @@ def associate_with_finding(
         audit_time: datetime | None = None,
     ) -> LocationFindingReference:
         """
-        Get or create a LocationFindingReference for this location and finding,
-        updating the status each time. Also associates the related product.
+        Get or create a LocationFindingReference for this location and finding.
+        Also associates the related product.
         """
+        # Return any existing reference for this location and finding
+        # unchanged, rather than recomputing and overwriting its status
+        if LocationFindingReference.objects.filter(
+            location=self,
+            finding=finding,
+        ).exists():
+            return LocationFindingReference.objects.get(
+                location=self,
+                finding=finding,
+            )
         # Determine the status
         if status is None:
             status = self.status_from_finding(finding)

@@ -144,10 +154,17 @@ def associate_with_product(
         product: Product,
         status: ProductLocationStatus | None = None,
     ) -> LocationProductReference:
-        """
-        Get or create a LocationProductReference for this location and product,
-        updating the status each time.
-        """
+        """Get or create a LocationProductReference for this location and product"""
+        # Return any existing reference for this location and product
+        # unchanged, rather than recomputing and overwriting its status
+        if LocationProductReference.objects.filter(
+            location=self,
+            product=product,
+        ).exists():
+            return LocationProductReference.objects.get(
+                location=self,
+                product=product,
+            )
         if status is None:
             status = self.status_from_product(product)
         # Use a transaction for safety in concurrent scenarios
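The behavioural change in both methods above is the same: once a reference exists for a pair, later calls return it unchanged instead of recomputing and overwriting its status. A dict-backed sketch of that get-or-create semantics (the `ReferenceStore` class is illustrative, not DefectDojo's API):

```python
class ReferenceStore:
    """Illustrative stand-in for the LocationFindingReference table."""

    def __init__(self):
        self._refs = {}

    def associate(self, location: str, finding: str, status: str) -> dict:
        key = (location, finding)
        # Existing reference wins: do not update the status on repeat calls
        if key in self._refs:
            return self._refs[key]
        ref = {"location": location, "finding": finding, "status": status}
        self._refs[key] = ref
        return ref


store = ReferenceStore()
first = store.associate("https://a.example", "finding-1", "Active")
second = store.associate("https://a.example", "finding-1", "Mitigated")
print(second["status"])  # 'Active' -- the first write is preserved
print(first is second)   # True
```

Note that the `exists()`/`get()` pair in the real code is two separate queries; under concurrent writers the transaction mentioned in the surrounding code is what keeps the check-then-create sequence safe.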
