What You'll Learn
- Explain why tuning is the most important phase of the detection lifecycle and how untuned rules cause alert fatigue
- Apply five systematic tuning techniques: field narrowing, value exclusion lists, path-based filtering, user/account filtering, and time-based suppression
- Reduce a noisy rule from 200+ alerts per day to fewer than 5 without losing real detections
- Distinguish between true positives, false positives, benign true positives, and true negatives in the context of tuning
- Write Sigma filter sections that exclude known-good activity while preserving detection of real threats
- Apply tuning methodology in Lab 8.5 to tune a pre-deployed noisy rule in a live Wazuh environment
The Tuning Imperative
You deployed a Sigma rule to your SIEM. It fires. A lot. 200 times a day. Every alert looks like this:
```
ALERT: Suspicious Scheduled Task Creation
Host: SRV-DEPLOY-01
User: svc_deployment
Image: C:\Windows\System32\schtasks.exe
CommandLine: schtasks /create /tn "AppDeploy_2026_Q1" /tr "C:\Deploy\run.bat" /sc daily /st 02:00
```
Is this malicious? No. It is the deployment team's automation creating scheduled tasks for software rollouts. But your rule fires on every single one because schtasks.exe /create matches the detection logic perfectly.
This rule is now useless. Not because the detection logic is wrong — it correctly identifies scheduled task creation. The problem is that it cannot distinguish between the deployment team's legitimate use and an attacker's malicious use. After a few days of 200 alerts, your analysts start ignoring it entirely. And the day an attacker actually uses schtasks.exe for persistence, the alert gets buried.
This is alert fatigue, and it is the number one killer of SOC effectiveness. Tuning is how you fix it.
Understanding Alert Categories
Before tuning, you need a framework for categorizing what your rule catches:
| Category | Definition | Example | Action |
|---|---|---|---|
| True Positive (TP) | Rule fires on a real attack | Attacker creates persistence task via schtasks.exe | Investigate and respond |
| False Positive (FP) | Rule fires on activity that does not match the intended detection | Rule for schtasks.exe also fires on at.exe due to broad CommandLine match | Fix the rule logic |
| Benign True Positive (BTP) | Rule fires correctly — the activity matches — but it is expected and approved | Deployment team's automated schtasks.exe usage | Add a filter exclusion |
| True Negative (TN) | Rule correctly does NOT fire on benign activity | Normal file operations that do not involve schtasks.exe | No action needed |
The most dangerous category is the Benign True Positive. The rule is technically correct — schtasks.exe IS being used. But the activity is legitimate. If you do not filter these out, they drown your real detections. Tuning is primarily about handling BTPs.
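The four categories can be captured in a small decision helper. This is an illustrative sketch in Python (the function name and boolean flags are our own, not part of any Sigma tooling):

```python
def categorize_alert(fired: bool, matches_intent: bool, approved: bool) -> str:
    """Classify a detection outcome per the TP/FP/BTP/TN framework.

    fired          -- did the rule generate an alert?
    matches_intent -- does the activity match what the rule was written to catch?
    approved       -- is the activity known-good (expected and authorized)?
    """
    if not fired:
        return "TN"                      # rule correctly stayed quiet
    if not matches_intent:
        return "FP"                      # wrong match: fix the rule logic
    return "BTP" if approved else "TP"   # correct match: filter it or investigate

# The deployment team's schtasks.exe automation: fired, matches, approved
print(categorize_alert(True, True, True))    # BTP -> add a filter exclusion
# An attacker's persistence task: fired, matches, not approved
print(categorize_alert(True, True, False))   # TP -> investigate and respond
```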
The Five Tuning Techniques
Technique 1: Field Narrowing
Make your selections more specific by matching on additional fields or using more precise modifiers.
Before (broad):

```yaml
detection:
  selection:
    Image|endswith: '\\schtasks.exe'
    CommandLine|contains: '/create'
  condition: selection
```
After (narrow):

```yaml
detection:
  selection:
    Image|endswith: '\\schtasks.exe'
    CommandLine|contains: '/create'
  selection_suspicious:
    CommandLine|contains:
      - '/sc onlogon'
      - '/sc onstart'
      - 'cmd.exe'
      - 'powershell'
      - 'http://'
      - 'https://'
      - '\\AppData\\'
      - '\\Temp\\'
  condition: selection and selection_suspicious
```
Now the rule only fires when the scheduled task creation includes suspicious indicators — persistence triggers, command interpreters, URLs, or unusual paths.
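To see why the extra block matters, here is a minimal Python sketch that mimics the `selection and selection_suspicious` logic. This is a toy matcher, not a real Sigma engine; the sample events are invented to mirror the scenario above:

```python
SUSPICIOUS = ['/sc onlogon', '/sc onstart', 'cmd.exe', 'powershell',
              'http://', 'https://', '\\AppData\\', '\\Temp\\']

def broad_rule(event: dict) -> bool:
    # Original logic: any schtasks.exe /create fires
    return (event['Image'].endswith('\\schtasks.exe')
            and '/create' in event['CommandLine'])

def narrowed_rule(event: dict) -> bool:
    # Tuned logic: also require at least one suspicious indicator
    return broad_rule(event) and any(s in event['CommandLine'] for s in SUSPICIOUS)

deploy = {'Image': 'C:\\Windows\\System32\\schtasks.exe',
          'CommandLine': 'schtasks /create /tn "AppDeploy_2026_Q1" '
                         '/tr "C:\\Deploy\\run.bat" /sc daily /st 02:00'}
attack = {'Image': 'C:\\Windows\\System32\\schtasks.exe',
          'CommandLine': 'schtasks /create /tn "Updater" '
                         '/tr "powershell -w hidden C:\\Temp\\x.ps1" /sc onlogon'}

print(broad_rule(deploy), broad_rule(attack))        # True True  (both fire)
print(narrowed_rule(deploy), narrowed_rule(attack))  # False True (only the attack fires)
```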
Technique 2: Value Exclusion Lists
Add a filter section that explicitly excludes known-good values:
```yaml
detection:
  selection:
    Image|endswith: '\\schtasks.exe'
    CommandLine|contains: '/create'
  filter_deployment:
    User:
      - 'svc_deployment'
      - 'svc_sccm'
      - 'SYSTEM'
    CommandLine|contains:
      - 'AppDeploy_'
      - 'WindowsUpdate'
      - 'GoogleUpdate'
  condition: selection and not filter_deployment
```
This filter removes alerts from known deployment accounts and recognized task names.
Technique 3: Path-Based Filtering
Legitimate software runs from predictable paths. Attackers stage tools in unusual locations:
```yaml
detection:
  selection:
    Image|endswith: '\\schtasks.exe'
    CommandLine|contains: '/create'
  filter_legitimate_paths:
    CommandLine|contains:
      - 'C:\\Program Files\\'
      - 'C:\\Program Files (x86)\\'
      - 'C:\\Windows\\System32\\'
  condition: selection and not filter_legitimate_paths
```
Tasks created with executables in standard paths are filtered out, focusing the rule on tasks referencing unusual locations.
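The filter's direction is worth noting: it excludes standard paths, so anything outside them still fires. A toy Python sketch of that logic (simplified to the CommandLine check alone; the real rule also requires the schtasks.exe Image match):

```python
STANDARD_PATHS = ['C:\\Program Files\\', 'C:\\Program Files (x86)\\',
                  'C:\\Windows\\System32\\']

def fires(command_line: str) -> bool:
    """Mimic: selection and not filter_legitimate_paths (toy version)."""
    if '/create' not in command_line:
        return False
    # Filtered out only when the task references a standard install path
    return not any(p in command_line for p in STANDARD_PATHS)

# Task pointing at a standard install path: filtered out
print(fires('schtasks /create /tr "C:\\Program Files\\Vendor\\agent.exe"'))  # False
# Task staged in a user-writable directory: still fires
print(fires('schtasks /create /tr "C:\\Users\\Public\\update.exe"'))         # True
```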
Technique 4: User and Account Filtering
Different users carry different risk levels:
```yaml
detection:
  selection:
    EventID: 4625
  filter_service_accounts:
    TargetUserName:
      - 'svc_monitoring'
      - 'svc_backup'
      - 'healthcheck'
  filter_machine_accounts:
    TargetUserName|endswith: '$'
  condition: selection and not filter_service_accounts and not filter_machine_accounts
```
Known service accounts and machine accounts are excluded from brute force detection because their authentication failures are expected.
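The machine-account exclusion relies on the Windows convention that computer accounts end in `$`. A quick Python sketch of the combined filter logic (toy code, not a Sigma engine):

```python
SERVICE_ACCOUNTS = {'svc_monitoring', 'svc_backup', 'healthcheck'}

def should_alert(event: dict) -> bool:
    """Mimic: selection and not filter_service_accounts and not filter_machine_accounts."""
    if event['EventID'] != 4625:       # selection: failed logon events only
        return False
    user = event['TargetUserName']
    if user in SERVICE_ACCOUNTS:       # known-noisy service accounts
        return False
    if user.endswith('$'):             # machine accounts (e.g. WORKSTATION01$)
        return False
    return True

print(should_alert({'EventID': 4625, 'TargetUserName': 'WORKSTATION01$'}))  # False
print(should_alert({'EventID': 4625, 'TargetUserName': 'jsmith'}))          # True
```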
Technique 5: Time-Based Suppression
Some legitimate activity happens on predictable schedules:
```yaml
# Configure in SIEM post-conversion:
# Suppress rule between 01:00-03:00 daily (maintenance window)
# Suppress on first Saturday of month (patch weekend)
```
Time-based suppression is not a Sigma feature — it is configured in the SIEM's alerting system after conversion. Use it sparingly and always document the suppression window. An attacker who knows your maintenance window can time their attack to coincide with suppressed detections.
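Under the hood, the suppression is just a time-window comparison in the SIEM's alerting layer. A Python sketch of the idea, using the window values documented above (the function names are our own illustration):

```python
from datetime import datetime, time

MAINTENANCE_START = time(1, 0)   # 01:00, nightly maintenance window opens
MAINTENANCE_END = time(3, 0)     # 03:00, window closes

def in_maintenance_window(ts: datetime) -> bool:
    """True if an alert timestamp falls inside the documented nightly window."""
    return MAINTENANCE_START <= ts.time() < MAINTENANCE_END

def on_patch_weekend(ts: datetime) -> bool:
    """First Saturday of the month: weekday 5, day-of-month 1-7."""
    return ts.weekday() == 5 and ts.day <= 7

print(in_maintenance_window(datetime(2026, 1, 15, 2, 0)))   # True
print(on_patch_weekend(datetime(2026, 1, 3)))               # True (first Saturday)
print(on_patch_weekend(datetime(2026, 1, 10)))              # False (second Saturday)
```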
The Tuning Workflow
When a rule is generating too many alerts, follow this systematic process:
Step 1: Collect a Sample
Pull the last 24-48 hours of alerts from the noisy rule. Export them for analysis.
Step 2: Categorize Each Alert
For each alert, mark it as TP (real attack), FP (wrong match), or BTP (correct match, legitimate activity).
Step 3: Identify Patterns in BTPs
Look for commonalities across the benign true positives:
| Pattern | Example |
|---|---|
| Same user account | All BTPs come from svc_deployment |
| Same parent process | All BTPs have ParentImage: sccm.exe |
| Same path pattern | All BTPs reference C:\Deploy\ |
| Same time window | All BTPs occur between 02:00-04:00 |
| Same hostname | All BTPs from SRV-DEPLOY-01 and SRV-DEPLOY-02 |
Step 4: Write Filters
Create Sigma filter sections that match the BTP patterns without excluding real attacks.
Step 5: Test the Tuned Rule
Re-evaluate the alert set with the new filters. Verify:
- All BTPs are now excluded
- No real TPs are accidentally filtered
- The remaining alerts are actionable
Step 6: Deploy and Monitor
Deploy the tuned rule and monitor alert volume for 48 hours. If new BTPs appear, repeat the process.
Tuning Anti-Patterns
| Anti-Pattern | Why It's Wrong | Better Approach |
|---|---|---|
| Disabling the rule entirely | Eliminates detection for real attacks along with the noise | Tune instead of disable |
| Raising the severity threshold | Hides the alert but does not reduce noise — analysts still see it in queries | Filter the specific BTP pattern |
| Filtering too broadly | `User\|contains: 'svc'` might exclude a malicious service account named `svc_backdoor` | Use exact values (e.g. `User: 'svc_deployment'`), not wide substring patterns |
| Not documenting filters | Six months later, nobody knows why svc_deployment is excluded or if it's still valid | Add comments in the filter and update the falsepositives section |
| Tuning once and forgetting | New legitimate software, new accounts, new paths create new BTPs over time | Schedule quarterly tuning reviews |
A Real-World Tuning Example
Here is a rule before and after tuning, with metrics:
Before tuning:

```yaml
title: Service Installation
detection:
  selection:
    EventID: 7045
    ServiceType: 'user mode service'
  condition: selection
level: medium
```
Alerts per day: 187 (3 TPs, 12 FPs, 172 BTPs)
After tuning:

```yaml
title: Suspicious Service Installation — Tuned
detection:
  selection:
    EventID: 7045
    ServiceType: 'user mode service'
  filter_known_software:
    ServiceName:
      - 'GoogleChromeElevationService'
      - 'MozillaMaintenance'
      - 'AdobeUpdateService'
      - 'MicrosoftEdgeUpdate'
  filter_standard_paths:
    ImagePath|startswith:
      - 'C:\\Program Files\\'
      - 'C:\\Program Files (x86)\\'
      - 'C:\\Windows\\'
  filter_deployment:
    ImagePath|contains: '\\Deploy\\'
    AccountName: 'svc_deployment'
  condition: selection and not filter_known_software and not filter_standard_paths and not filter_deployment
level: medium
```
Alerts per day: 4 (3 TPs, 1 BTP) — a 97.9% noise reduction
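The noise-reduction figure is simple arithmetic on the daily alert counts:

```python
before, after = 187, 4
reduction = (before - after) / before * 100
print(f"{reduction:.1f}% noise reduction")  # 97.9% noise reduction
```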
Key Takeaways
- Tuning is not optional — an untuned rule causes alert fatigue and is worse than no rule at all
- Benign True Positives (BTPs) are the primary target of tuning: the rule is technically correct, but the activity is legitimate
- The five tuning techniques are: field narrowing, value exclusion lists, path-based filtering, user/account filtering, and time-based suppression
- Follow a systematic workflow: collect → categorize → find patterns → write filters → test → deploy → monitor
- Never filter too broadly — use exact values, not wide patterns that could accidentally exclude real attacks
- Document every filter in the Sigma rule's falsepositives section and as comments in the filter
- Schedule quarterly tuning reviews because environments change: new software, new accounts, new paths
What's Next
You know how to write, convert, deploy, and tune Sigma rules. In Lesson 8.6, you will explore the SigmaHQ repository — 3,000+ community-contributed rules covering every major ATT&CK technique. You will learn to navigate the repository, evaluate rule quality, select high-value rules for your environment, and batch-deploy them. Lab 8.6 has you pick 10 rules, convert them all, and verify they are active in Wazuh.
Knowledge Check: Tuning & False Positive Reduction
10 questions · 70% to pass
1. What is a 'Benign True Positive' (BTP) in the context of detection tuning?
2. Which tuning technique adds a filter that explicitly lists known-good users or task names to exclude?
3. In Lab 8.5, you tune a rule firing 200 times per day. What is the first step in the tuning workflow?
4. Why is filtering with 'User|contains: svc' considered a tuning anti-pattern?
5. In Lab 8.5, the pre-deployed noisy rule fires on svc_deployment's automated tasks. After tuning, alerts drop from 200 to fewer than 5 per day. What percentage noise reduction is this?
6. What is the danger of disabling a noisy rule entirely instead of tuning it?
7. The field narrowing technique adds a 'selection_suspicious' block to the rule. What does this accomplish?
8. Why should time-based suppression be used sparingly in tuning?
9. In the tuning workflow, after collecting a sample and categorizing alerts, what is the next step?
10. Why should tuning reviews be scheduled quarterly rather than done once?