Cynet 360 AutoXDR Custom Response Integration

This article shows how to leverage the Lumu Defender API and Cynet API to mitigate security risks.


Response integration between Cynet 360 AutoXDR and Lumu

Requirements

  • A Cynet 360 AutoXDR Elite or above subscription
    • The Cynet console must have the Endpoint Detection & Response (EDR)
      module enabled.
  • Lumu Defender API key
    • To retrieve an API token, please refer to the Defender API document.
  • Script host.
    • A scripting host is required to deploy the integration. This host must have Internet visibility over Lumu Defender API endpoints and Cynet Cloud. According to the deployment model you select, you will need a host with:
      • Python 3.12+
        or
      • A Docker-enabled host.
  • Script package
    • Contact the Lumu support team to request the package we created to deploy the required files.

Contacted hosts

Allow all traffic to the following hosts. These are required for the operation of this integration:
  • Lumu services:
    • defender.lumu.io (HTTPS and WSS traffic)
  • Cynet services
    • YOUR_INSTANCE.api.cynet.com (HTTPS traffic)
Where YOUR_INSTANCE is your instance name. You can obtain it from your Cynet XDR Web console URL: extract the part that precedes .api.cynet.com.
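To quickly confirm that the script host can reach these services, you can issue a simple HTTPS request to each host (replace YOUR_INSTANCE with your own instance name); any HTTP response, even an error status, indicates the host is reachable:

# Check reachability of the Lumu Defender API
curl -sI https://defender.lumu.io
# Check reachability of your Cynet API endpoint
curl -sI https://YOUR_INSTANCE.api.cynet.com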

Integration overview

The integration leverages the Threat Hunting feature in the Endpoint Detection & Response (EDR) module to manage SHA256 file hashes related to Lumu-detected adversarial contacts.
With this information, Cynet searches for these IOCs on all managed endpoints and triggers alerts when they are found. To respond to these findings, you must create a Remediation Playbook and Auto Remediation actions. For further reference, please refer to the Cynet documentation.
Notes
The Threat hunting feature is supported on Windows devices only.

Preliminary Setup - Cynet 360 AutoXDR

Notes
While several of the steps mentioned are labeled as “Optional”, Lumu strongly recommends executing them to maximize the integration’s capabilities.
To set up the integration, you must prepare Cynet 360 AutoXDR to communicate with the Lumu integration. To do this, you need the following:
  • Create a dedicated integration role with the least privileges (optional, but recommended).
  • Create an API user.
  • Enable the Cynet Threat Hunting feature.
  • Collect your Cynet console base URL.
  • Obtain your Cynet client ID
Next, we will guide you through the process to fulfill these requirements.
Notes
We recommend creating a dedicated integration role to track usage if needed and uphold the principle of least privilege. If you prefer not to create a new role, you can use one with the privileges outlined in this section. If you choose to use an existing user, ensure you have the necessary credentials available for a future step.
If you are using MSP access, do this first:
1. Make sure you are located in the account you want to integrate with Lumu. Head to the left navigation bar, then click on your MSP organization name. Finally, locate the account you want to configure and click on it.

Create an integration role

Follow these steps inside your Cynet console to create the integration role.
1. In your account window, head to the left navigation bar, expand the Settings section, and click on the Users & Roles menu.

2. In the User & Roles window, click on the Roles tab, then click on the Add Role button.

3. On the panel that opens, enter the following information:
a. Under Role Name(1), provide a distinctive name for the role.
b. In the Permissions list, unfold the API(2) dropdown, and select the API - full access(3) permission
c. Once you’re done, click on the Add(4) button to save your new role.

Create an API user

After creating the integration role, you must create an API user and assign the role created above to it. This will be the integration user. Follow these steps in your Cynet XDR console.
1. In your account window, head to the left navigation bar, expand the Settings section, and click on the Users & Roles menu.

2. In the User & Roles window, click on the API Users tab, then click on the New button.

3. In the panel that opens, fill in the data following these guidelines:
a. Under Display Name(1), provide a distinctive name for the API user.
b. Under User Role(2), enter the name of the user role created in the Create an integration role section.
c. When finished, click on the Add(3) button to save your integration user.

4. The API Access-Key & Secret-Key window will appear. Take note of the Access-Key and Secret-Key field data and keep them on hand; they will be required later to configure the integration.
Notes
This is the only time these values will be shown.

Enable the Cynet Threat Hunting feature

The Threat Hunting feature must be enabled to make the integration work. To enable this feature, follow these steps inside your Cynet console:
Notes
You must perform this task for every Windows group.
1. Head to the left navigation bar, expand the Settings section and click on the Groups menu.

2. In the Groups window, look for the required endpoint group and click on the Pencil icon to edit it. In this example, the icon is displayed on “Manually Installed Endpoints”.

3. In the Settings tab, click on the Advanced settings link under the Endpoint Detection & Response (EDR) section.

4. Enable the Threat Hunting toggle

5. Save your changes by clicking on the Ok button
After enabling Threat Hunting for the selected groups, you must define the Alert Severity for the threat hunting-related alerts.
1. In the Settings section, click on the Threat Hunting menu.

2. In the Threat Hunting window, look for the Alert Severity field. Select the most suitable value for your security operations. Click on the Save Changes button when done.

Collect your Cynet base URL

The Cynet base URL is required for setting up the integration in later steps. The URL follows this structure:

https://CYNET-URL-INSTANCE.api.cynet.com/

The CYNET-URL-INSTANCE value is needed to configure the integration.

Obtain your Cynet client ID

You need to obtain your client ID. This value is required to configure the integration. Depending on your Cynet access, you have these options:
  • If you are a single tenant: Contact Cynet to receive your client ID
  • If you are an MSSP: In the Cynet 360 console, navigate to Global Settings > Client Site Manager > Sites Status. Collect the ID for the required site. This process is outlined in this document.

Preliminary setup - Lumu Portal

The integration setup process requires you to collect the following information from the Lumu Portal:
  • Lumu Defender API key
  • Company UUID
Log in to your Lumu Portal and follow the procedures below to collect this data.

Collect the Lumu Defender API key

To collect the Lumu Defender API key, please refer to the Defender API document.

Collect your Lumu company UUID

To collect your Lumu company UUID, log in to your Lumu portal. Once you are in the main window, copy the string below your company name.

Preliminary Setup - Choose your integration environment

There are two environment options to deploy the script. Select the one that best fits your current infrastructure:
  • Run it as a Python script by executing the install.sh bash file, which:
    • Creates a Python virtual environment and installs its dependencies for you
    • Installs the crontab entries on the host
  • Run it as a Docker container.
Whichever alternative you select, you first need to unpack the integration package shared by our Support team.
Unpack the deployment package provided by Lumu in your preferred path/folder. Keep in mind this location, as it will be required for further configurations. From now on, we will refer to this folder as <app_lumu_root>.
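As a reference, here is a minimal sketch of the unpacking step, assuming the package arrives as a zip archive named lumu-cynet-response.zip (the actual file name and format may differ from what the Support team provides):

# Hypothetical archive name; use the file shared by Lumu support
unzip lumu-cynet-response.zip -d /opt/lumu-cynet-response
# The destination folder becomes <app_lumu_root> for the rest of this article
cd /opt/lumu-cynet-response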
Notes
If you use the install script, use the uninstall.sh bash file to remove the integration from the host.

Set up the configuration files

To set up the integration, you need to add and edit two configuration files:
  • companies.yml: this file contains the information collected from the Lumu portal
  • integrations.yml: this file contains the information collected from your Cynet instance
Notes
Inside the integration package, you will find sample files you can use to build your configuration files. These files are companies_template.yml and integration_template.yml.

Complete the companies file

The companies file defines how the integration connects to Lumu and extracts the information about incidents and their related indicators of compromise.

Notes
All placeholder parameters must be replaced with the real data required for your integration deployment. For example, the COMPANY_UUID parameter should end up looking similar to “aa11bb22bb33-123a-456b-789c-11aa22bb33cc”. Follow these indications for all similar parameters.

- lumu:
    uuid: "COMPANY_UUID"
    defender_key: "DEFENDER_API_KEY"
    hash_type: "sha256"
    ioc_types: # list of ioc types, option one, many or all
      - hash
    adversary: # list of adversary types, option one, many or all
      - C2C
      - Malware
      - Mining
      - Spam
      - Phishing
      - Anonymizer
    days: 3 # MIN 1, MAX 3

Within this file, the COMPANY_UUID and DEFENDER_API_KEY fields are mandatory. Please use the values captured in the previous steps. The ioc_types values must match the IOC types required by the integration, in this case, hash.
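Optionally, you can confirm that the edited file is valid YAML before moving on. The following is a quick check, assuming Python 3 and the PyYAML package are available on the script host and that you run it from the <app_lumu_root> folder:

# Parse the companies file; a traceback indicates a YAML syntax error
python3 -c "import yaml; print(yaml.safe_load(open('companies.yml')))"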

Complete the integrations file

The integrations file contains the information required for the integration to connect and interact with your Cynet deployment:

- lumu:
    uuid: "COMPANY_UUID"
    adversaryTypes: ["C2C", "Malware", "Mining", "Spam", "Phishing", "Anonymizer"] # ["C2C", "Malware", "Mining", "Spam", "Phishing", "Anonymizer"]
    days: 3 # INTEGER (get incidents from the last X days of the IOC manager local db)
  app:
    name: "UNIQUE-NAME"
    clean: false # true | false
    ioc:
      - hash
    hash_type: sha256 # sha256 | sha1 | md5
    api:
      url: "https://CYNET-URL-INSTANCE.api.cynet.com/"
      client_id: "CYNET-CLIENT-ID" # Request it from Cynet support or your MSP admin
      username: "CYNET-API-USER-ACCESS-KEY"
      password: "CYNET-API-USER-SECRET-KEY"

Keep in mind that:
  • COMPANY_UUID is the ID found in the Collect your Lumu company UUID step.
  • DEFENDER_API_KEY is the key found in the Collect the Lumu Defender API key step.
  • CYNET-CLIENT-ID is the client ID collected in the Obtain your Cynet client ID step.
  • CYNET-API-USER-ACCESS-KEY is the access key found in Step 4 of the Create an API user section, under Access-Key.
  • CYNET-API-USER-SECRET-KEY is the secret key found in Step 4 of the Create an API user section, under Secret-Key.

Notes
You must fill in the configuration data carefully. If there are any mistakes or missing data, you’ll receive errors. Please refer to the Troubleshooting section at the end of this article for further reference.
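Before running the integration, a quick way to catch unreplaced placeholders in either configuration file is to search for the template values from the <app_lumu_root> folder, for example:

# Any matching line means a placeholder was left unedited
grep -nE 'COMPANY_UUID|DEFENDER_API_KEY|CYNET-URL-INSTANCE|CYNET-CLIENT-ID|CYNET-API-USER' companies.yml integrations.yml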

Prepare Python on your environment

Notes
If Docker is your chosen deployment method, you may skip this step.
If Python is your chosen deployment method, you will need to create a Virtual environment for each integration to avoid conflicts between them and your operating system tools. Make sure you follow the steps in our Preparing Environment for Custom Integrations article.
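As a minimal sketch, assuming Python 3.12+ is already installed on the script host, a dedicated virtual environment can be created and activated as follows (the folder name is only an example):

# Create a virtual environment dedicated to this integration
python3 -m venv lumu-cynet-venv
# Activate it before installing dependencies or running the scripts
source lumu-cynet-venv/bin/activate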

Deploy Integration as Python script

To deploy the integration as a script, you need to run the install.sh script included in the integration package.
Notes
Make sure the install.sh script has the execution permission before running it.
To run the installation script, locate yourself in the <app_lumu_root> folder, then execute this line through the CLI:

./install.sh all

The installation script will set up the Python environment and two different cron jobs.
Notes
If you want to modify the default running interval set by the installation script, you can modify the cron job entries it created based on your environment requirements.
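If you need to review or adjust that schedule, you can inspect and edit the crontab of the user that executed install.sh, for example:

# List the cron entries added by the installer
crontab -l
# Edit the entries to change the running interval
crontab -e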
Notes
If you want to restart or uninstall the integration, run ./restart.sh all or ./uninstall.sh all, respectively.

Script details

To use the script, you must locate yourself on the path selected for deployment (<app_lumu_root>). Use the following command to show all options available for the package:

python cynet_lumu.py -h
Usage: cynet_lumu [-h] [--config CONFIG] [--ioc-manager-db-path IOC_MANAGER_DB_PATH] [-v] [-l {screen,file}]
[--hours HOURS]

Options:
  -h, --help                                  Show this help message and exit.
  --config CONFIG                             Configuration file path (default: integrations.yml). Follow the provided YML template.
  --ioc-manager-db-path IOC_MANAGER_DB_PATH   Path where the integration reads the Lumu incidents from (default: ./ioc.db).
  --logging {screen,file}, -l {screen,file}   Logging option (default: screen).
  --verbose, -v                               Verbosity level.
  --hours HOURS                               Keep local database log records from the last X hours (automatic local database maintenance).

Usage Examples

  • Task: Query hashes related to Lumu incidents with default options

To query all the hashes related to Lumu incidents triggered in the last 30 days, run the following command.

python3 run.py

  • Task: Query hashes related to specific parameters

By default, the integration script queries incidents related to all adversary types. You can explicitly pass the configuration file and the IOC manager database path:

python3 run.py --config integrations.yml --ioc-manager-db-path /<ioc-manager-path>/ioc.db

  • Task: Clean records

To clean the existing records in Cynet, set the clean flag in the integrations.yml file to true.

clean: true

Then, run the integration script as follows:

python3 run.py [--config CONFIG] [--ioc-manager-db-path IOC_MANAGER_DB_PATH]

Notes
The records not manipulated by the integration will be preserved.

  • Other tasks

Depending on your needs, you can combine the examples shown above. Adding the --logging {file,screen} and --verbose arguments can also help you better understand what might be going wrong.

Deploy as a Docker container (Optional)

If you have a Docker environment, you can select this option to run the integration as a Docker process. To deploy and run your integration as a docker container, locate yourself at the <app_lumu_root> folder, and follow these instructions:

1. Build the container by running the following command.
docker build \
[--build-arg IOC_MAN_CONFIG='companies.yml'] \
[--build-arg APP_CONFIG='integrations.yml'] \
--tag img-cynet-response \
--file DockerfileAllInOne .
Notes
Do not forget the dot "."

2. Run the container by using the following command.

docker run -d \
--restart unless-stopped \
--log-driver json-file \
--log-opt max-size=30m \
--log-opt max-file=3 \
--name lumu-cynet-response \
img-cynet-response

With this mode, your integration will run every 5 minutes.
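To confirm the container is up and running with the configured restart policy, you can check its status:

# Verify the integration container is running
docker ps --filter name=lumu-cynet-response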

Troubleshooting

For troubleshooting purposes, you can run the following commands:
To log in to your container using an interactive shell:

docker exec -it lumu-cynet-response bash

To collect integration logs:

docker logs -f lumu-cynet-response

Expected results

After running the integration, you will see newly detected items in the Threat Hunting section, with hash records pushed by the integration.

If the Threat Hunting module finds matches, it will trigger alerts. These alerts can be managed according to your security protocols.

Troubleshooting and known issues

To identify failures in the script execution, use the -v flag. The script execution log will show more detailed information.
The application logs will be redirected to the lumu.log file. The file errors.log stores only the errors to make them easier to find and aid the troubleshooting process.
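For example, assuming the log files are written to the <app_lumu_root> folder, you can follow the application log in real time or review only the captured errors:

# Follow the integration log as it runs
tail -f lumu.log
# Review only the errors captured by the integration
cat errors.log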

Another instance is running

If you receive the following error:

Stopping the integration 703286, it might have another older instance running, check if is feasible or not
older pid: 562292 - cwd: /home/lumu/Documents/repos/cynet-response - since: 2023-11-08 21:58:34.530000
cmdline: /home/lumu/.local/share/virtualenvs/cynet-response-gVOo7dCJ/bin/python /home/lumu/Documents/repos/cynet-response/lumu_db_ioc_management.py

There could be another instance running. To check this, open the pid.pid file in the integration folder. This file stores the process ID if it’s running.
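To verify whether that older process is still alive, you can read the stored PID and check it against the process table, for example:

# Show the process ID stored by the integration
cat pid.pid
# Check whether that process is still running (no matching line means it already exited)
ps -p "$(cat pid.pid)"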

Further steps

You can define remediation playbooks for your Lumu alerts and create actions to remediate them automatically. Next, you will find examples for a remediation playbook and an auto remediation rule.

Create a remediation playbook

Log in to your Cynet console and follow these steps:
1. Expand the left navigation menu. Click on the Remediation menu under the Settings section

2. In the Remediation window, move to the Playbooks tab. Click on the Create button

3. In the CREATE PLAYBOOK window, fill in the required information. Follow these guidelines:
a. Give a Name(1) to the playbook
b. Set the Playbook Execution Time(2) to Parallel
c. Under the Playbook Actions section, select the Quarantine (For Playbook)(3) and Kill Process (For Playbook)(4) custom remediations and move them to the Custom remediations playbook section
d. Click on the Save(5) button

Notes
We suggest using the previously selected custom remediations. You can select different custom remediations for your playbook according to your operating environment.

Create an auto remediation rule

After creating a remediation playbook, you must create a remediation rule and use the previously created playbook. Go back to your Cynet console and follow these steps:
1. Expand the left navigation menu and then click on the Actions menu

2. In the Actions window, move to the Auto remediation tab. Click on the Create Rule button

3. Fill in the required information. Follow these guidelines
a. Under Name(1) enter a descriptive name.
b. Under the Matching > Alert Name(2) field, type Threat Hunting Detected By File SHA256
c. Select the device groups(3) you want to apply the remediation rule to.
d. Select the Alerts Severity(4) based on the configuration made in the Threat Hunting module
e. In the Action section, select PlayBook Actions(5). Select the playbook created in the previous step
f. Click on the Save button when done.


When a Threat Hunting alert arises, the auto remediation rule will run.

For further reference on these features, refer to the Cynet documentation.