For this mapping to work, you must already be storing your logs in Elasticsearch using ECS field names.
ECS (the Elastic Common Schema) is essentially a field naming convention adopted by Elastic for use across its products. As the engineers at Elastic develop new dashboards (like those for the new SIEM app in 7.2) and new machine learning jobs (a SIEM app improvement in 7.3), they do so using ECS-defined fields. All of the 7.x Beats agents already use ECS to name their fields (and provide a good reference for those just starting with ECS). What this means for the end user is that if you map your incoming logs to ECS field names before the documents are stored in Elasticsearch, many of the dashboards and machine learning jobs will work without any further effort on your part. If you decide instead to keep your current field names, you'll have to re-map all the existing visualizations and ML jobs to your custom fields.
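One common way to do that re-mapping at index time is an Elasticsearch ingest pipeline built from `rename` processors. The sketch below is illustrative only; the source field names (`src_ip`, `dst_ip`, `act`) are hypothetical stand-ins for whatever your log shipper currently emits:

```json
{
  "description": "Rename custom log fields to their ECS equivalents (illustrative)",
  "processors": [
    { "rename": { "field": "src_ip", "target_field": "source.ip", "ignore_missing": true } },
    { "rename": { "field": "dst_ip", "target_field": "destination.ip", "ignore_missing": true } },
    { "rename": { "field": "act", "target_field": "event.action", "ignore_missing": true } }
  ]
}
```

With `ignore_missing` set, documents that lack a given source field pass through the processor unchanged rather than failing the pipeline.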
So now that you're storing log fields in Elasticsearch using ECS, we need to map those fields to the Siemplify schema. What follows is my suggested mapping, subject to change.
You'll notice that some Siemplify fields reference the same ECS field, such as DestinationDnsDomain and DestinationNtDomain. In these cases it may be necessary to use the Siemplify transform function EXTRACT_BY_REGEX to capture just the relevant subset of the ECS field's value.
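I won't reproduce Siemplify's exact transform syntax here, but the idea behind EXTRACT_BY_REGEX is straightforward regex capture. This is a minimal sketch of the concept, assuming a hypothetical `destination.domain` value; the helper name and the NT-domain convention (first DNS label) are my own illustration, not Siemplify's implementation:

```python
import re

def extract_by_regex(value, pattern):
    """Illustrate the idea behind Siemplify's EXTRACT_BY_REGEX:
    return the first capture group of the pattern, or None if no match."""
    match = re.search(pattern, value)
    return match.group(1) if match else None

# Hypothetical ECS destination.domain value
fqdn = "dc01.corp.example.com"

# The full FQDN could feed DestinationDnsDomain as-is, while a regex
# captures only the first label for a DestinationNtDomain-style value:
nt_name = extract_by_regex(fqdn, r"^([^.]+)\.")
print(nt_name)  # dc01
```

The same pattern applies to any pair of Siemplify fields that draw on a single ECS field: store the full value once, then carve out the subset each field needs.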
I also have a spreadsheet available for download.
Update Aug 27: Having just discovered that ECS version 1.1 was recently released I’ve made some modifications to the process and file hash field mappings.
| Siemplify Connector Setting | ECS Field |
| --- | --- |
| Product Field Name | event.module |
| Event Field Name | event.category |
| Alert Name Field | event.action |

| Siemplify Entity Mapping | ECS Extracted Field | ECS Alternative Field | ECS Alternative Field |
| --- | --- | --- | --- |

| Siemplify System Field | ECS Extracted Field | ECS Alternative Field | ECS Alternative Field |
| --- | --- | --- | --- |