This onboarding guide walks you through setting up Splunk Core Alerts (via Collector) with Expel Workbench.
Prerequisites
- You must have a Splunk Collector onboarded as a Security Device in Expel Workbench to add this integration.
- You must have provided console access to Expel. This enables our Detection & Response team to evaluate your custom rules and our SOC analysts to properly investigate the alerts they generate.
Quick Start
Setup includes the following steps (select any step for detailed instructions):
- Create a New Splunk Index to Log Alerts
- Add Splunk Core Alerts as a Security Device in Workbench
- Additional Criteria for Support
Step 1: Create a New Splunk Index to Log Alerts
This step is required in order for Expel to accurately review your environment. In this section, you will create a new Splunk index to hold the results of your alerts.
Note
Alerts may also be configured in apps that have been added to Splunk. Please contact your Splunk administrator to make sure you don't miss any alerts when editing them in this step.
- Log in to Splunk.
- Navigate to Settings > Indexes.
- Select New Index in the upper right corner.
- Index Name - enter a name for the new index.
- Select Save.
- At this point, you can choose one of two ways to configure alerts to log to the index: 1) edit each alert to use the new index, or 2) create a macro for the new index and add it to each of your alerts. To edit each alert, follow Option 1 below. To create a macro, skip to Option 2.
- Note: For easier maintenance, consider creating a macro. Editing the macro applies the change to every search that uses it, so you don't have to edit each search individually.
- Option 1: Add the index to each alert (see the example after these steps):
- For each alert in the list, select Open In Search and add the following line to the bottom of the query, replacing <name_of_new_index> with the name of the index you created above:
| tojson | collect index=<name_of_new_index>
- Select Save As > Alert.
- Configure the alert as desired.
- Select Save.
- When finished editing all alerts, proceed to Add Splunk Core Alerts as a Security Device in Workbench.
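For illustration, a hypothetical failed-login alert might look like the following after the edit. The base search and the index name expel_alerts are placeholders, not values from your environment:
index=wineventlog EventCode=4625
| stats count by user, src
| where count > 10
| tojson | collect index=expel_alerts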
- Option 2: Create a macro and add it to each alert (see the example after these steps):
- Navigate to Settings > Advanced Search.
- To the right of Search macros, select Add new.
- Name - enter "send_to_expel".
- Definition - enter the following string, replacing <name_of_index> with the name of your index:
tojson | collect index=<name_of_index>
- Note: Do not include a leading pipe ( | ).
- Select Save.
- Select Permissions.
- Next to Everyone, select Read to give everyone Read permissions from the Search app.
- Select Save.
- Select Alerts.
- Choose an alert from the list and select Open In Search.
- Add the macro to the last line of the query:
| `send_to_expel`
- Select Save As > Alert.
- Configure the alert as desired. We strongly recommend providing an informative Description.
- Select Save.
- Repeat the previous five steps (from Choose an alert through Select Save) for each alert in your list.
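Putting Option 2 together, here is a sketch assuming the index is named expel_alerts and using a hypothetical failed-login alert; both are placeholders for illustration.
Macro definition (no leading pipe):
tojson | collect index=expel_alerts
Alert search with the macro appended as the last line:
index=wineventlog EventCode=4625
| stats count by user, src
| where count > 10
| `send_to_expel`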
Step 2: Add Splunk Core Alerts as a Security Device in Workbench
Please ensure you have onboarded a Splunk Collector before proceeding with this step.
Now that you have configured Splunk to log alerts, you will add Splunk Core Alerts as a security device in Workbench.
- Log in to Workbench.
- In the side menu, navigate to Organization Settings > Security Devices.
- Select Add Security Device.
- In the search box, type “Splunk” and then select the Splunk Core Alerts (via Collector) integration.
- A configuration pane displays. Complete the fields as follows:
- SIEM - select Splunk Collector from the dropdown.
- Name - enter a name that will help you easily identify this integration, such as “CompanyName Splunk Core Alerts”; this name displays in Workbench under the Name column and is a text string that you can filter on.
- Location - enter the location of your integration, for example “cloud.” This is also a text string that you can filter on, so we recommend being consistent with location naming across your Expel integrations.
- Collector query - enter the following query, replacing <name_of_index> with the name of the index you configured in Step 1 (a sanity-check search you can run in Splunk appears at the end of this step):
index=<name_of_index>
- Select Save.
- Your device should be created successfully within a few seconds. A few reminders:
- After your connection is healthy, it will take some time for your device to begin receiving data.
- To check on the status, select the downward arrow for your device in the first column and choose View details. You can then scroll to the Connection section to see if your device is fully connected.
- Polling will happen first; data will be received after that. You must refresh the page to see updates.
- If your device does not begin polling within 15 minutes, and does not begin receiving data within 30 minutes, check out the Troubleshooting section below. Beyond that, contact Expel Support for help.
- Once you have successfully onboarded the device, contact Expel Support to request a rule review by Expel’s Detection & Response (D&R) team. The D&R team will review the alerts you have enabled and assess whether and how Expel can support them.
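As a quick sanity check, you can run a search like the following in your Splunk console to confirm alert results are reaching the index Expel polls. The index name expel_alerts is a placeholder; adjust the name and time range for your environment:
index=expel_alerts earliest=-24h
| stats count by source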
Additional Criteria for Support
The three actions below improve our ability to normalize custom alerts so they are properly presented in Workbench. They also equip our analysts with necessary context so they can make informed decisions.
CIM Compliance
To properly display alert evidence in Workbench, fields must be normalized according to Splunk’s Common Information Model. Proper normalization also enables Ruxie to perform automated actions on the alert.
Please review the fields being returned from Splunk searches to ensure they map to what is shown in Splunk's Common Information Model Add-on Manual.
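For example, here is a minimal sketch of normalizing a hypothetical Windows failed-logon alert to CIM Authentication fields before sending it to the index. The raw field names and the index name are placeholders and will differ in your environment:
index=wineventlog EventCode=4625
| rename Account_Name as user, Source_Network_Address as src
| eval action="failure"
| tojson | collect index=expel_alerts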
Rich Evidence
Our analysts need enough context to make a decision when triaging custom rules. While Splunk Core does not include the Drilldown Search capability that Splunk Enterprise Security offers, Expel can automatically run investigative queries to add more context to the alert. Please share the queries you would like run with your EM, and the Detection & Response team will configure them.
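For example, a hypothetical investigative query of this kind might pivot on the user from an authentication alert to summarize related activity. The index, field names, placeholder value, and time window are illustrative only:
index=wineventlog Account_Name=<user_from_alert> earliest=-24h
| stats count by EventCode, ComputerName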
Additional Splunk Core capabilities may also help further enrich alerts.
Use the Description Field
Any additional information about the custom rule that may be useful for our analysts should be included in the Description. Examples include the detection’s intent and suggested steps to triage.
Troubleshooting
What if a long time has passed and Expel isn’t ingesting any events from my device?
- Run the Collector Query you specified in the device configuration (Step 2) in your Splunk console and confirm there are results with timestamps after the device was onboarded (see the example search below).
- If you expect results but there are none, the Collector Query may need to be revised, in which case, contact Expel Support.
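For example, assuming the index is named expel_alerts, a search like the following shows whether any events have arrived and when; adjust the time range so it covers the period since the device was onboarded:
index=expel_alerts earliest=-7d
| stats count, min(_time) as first_event, max(_time) as last_event
| convert ctime(first_event) ctime(last_event)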
What if a long time has passed and Expel hasn’t surfaced any alerts to Workbench?
- First confirm that Expel is properly ingesting events by viewing the Alert Analysis dashboard.
- Has Expel’s Detection & Response (D&R) team performed a Rule Review? If not, contact Expel Support to request one on your behalf.
- Run the Collector Query you specified in the device configuration (Step 2) in your Splunk console and confirm the results include alerts that D&R projected Expel could support by sending to the SOC. Also ensure they occurred after your device was onboarded.
- Contact Expel Support to request assistance from D&R on your behalf.