Akamai SIA Custom Data Collection Integration

In this article, you will learn how to configure your Akamai Secure Internet Access Enterprise (SIA) subscription and the Lumu Custom Data Collection integration to pull, transform, and inject the DNS query and proxy logs recorded by Akamai into Lumu, enhancing your organization's detection and response capabilities.

Requirements

  • An Akamai SIA subscription.
    • Access to the Akamai Control Center is required to set up the integration and collect the Akamai configuration details.
  • Two Lumu Custom Collectors for DNS queries and Proxy logs.
    • The Lumu custom collector IDs and your Lumu client key are required to set up the collection process. Information on how to create a custom collector in your Lumu portal can be found in Manage Custom Collectors.
  • Script host.
    • A scripting host is required to deploy the integration. This host must have Internet visibility of the Lumu Custom Collector API and the Akamai ETP Reporting API endpoints. Depending on the deployment model you select, you will need a host with:
      • Python 3.10+, or
      • A Docker-enabled host
  • Script package.
    • Contact the Lumu support team to request the package we created to deploy the required files.

Setup Akamai Secure Internet Access Enterprise (SIA)

Create an API client

An Akamai API client is required to collect network metadata from its reporting API. To create an API client, follow these steps in the Akamai Control Center:

1. Open the hamburger menu at the top left side of the screen. Click on the Identity & access menu under the ACCOUNT ADMIN section.
2. In the Identity and Access Management screen, click on the Users and API Clients tab. Then, click on the Create API client button.
3. In the Create API client window, do the following:
a. Click on the Service Account tab. Then, click on the Set API client options button.
b. Fill in the required data.
c. Click on the Select APIs button. In the API selection window, look for the ETP report API. Set the Access level to READ-WRITE. Click on the Submit button.
d. Click on the Select groups button. In the Group selection window, assign the Viewer role. Click on the Submit button.

e. Click on the I reviewed and acknowledge any escalation to the authorized users' permissions checkbox. Finally, click on the Create API client button.
4. Now, you will see the details for the created API client. You need to create a new credential. To do so, click on the Create credential button under the Credentials section. 

Copy the details shown: client_secret, host, access_token, and client_token. These will be required later to configure the integration.
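One convenient way to keep these four values at hand on the script host is to export them as environment variables so the commands in the later sections can reference them. The values below are placeholders, not real credentials:

```shell
# Placeholder values - replace each one with the credential details
# copied from the Akamai Control Center.
export AKA_HOST='akab-xxxxxxxxxxxxxxxx.luna.akamaiapis.net'
export AKA_CLIENT_SECRET='xxxxxxxxxxxxxxxx'
export AKA_ACCESS_TOKEN='akab-xxxxxxxxxxxxxxxx'
export AKA_CLIENT_TOKEN='akab-xxxxxxxxxxxxxxxx'
```

With the variables exported, later commands can use, for example, "$AKA_HOST" instead of pasting the literal value each time.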

Extract SIA configuration ID

The configuration ID is required to configure the integration. To extract this data, please follow these steps in the Akamai Control Center.

1. Open the hamburger menu at the top left side of the screen. Click on the Enterprise Center menu under the ENTERPRISE SECURITY section.
2. In the Enterprise Center window, click on the Policies menu under the Threat Protection > Policies section.


3. Look at your browser's address bar and find the number that follows the /etp/ string.



Take note of this number. This will be required for setting up the integration script.
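If you prefer, the configuration ID can also be pulled out of the copied URL programmatically. The sketch below assumes a hypothetical URL; the exact path your browser shows may differ, but the ID always follows the /etp/ segment:

```python
import re

def extract_config_id(url: str) -> str:
    """Return the numeric configuration ID that follows the /etp/
    segment of an Akamai Control Center URL."""
    match = re.search(r"/etp/(\d+)", url)
    if match is None:
        raise ValueError("no /etp/<id> segment found in URL")
    return match.group(1)

# Hypothetical URL copied from the browser bar:
url = "https://control.akamai.com/apps/etp-config/#/etp/12345/policies/list"
print(extract_config_id(url))  # -> 12345
```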

Deploy the integration

There are two environment options to deploy the script; select the one that best fits your current infrastructure. Whichever alternative you select, first unpack the integration package shared by our Support team in your preferred path/folder. Keep this location in mind, as it will be required for further configurations. From now on, we will refer to this folder as <akamai_etp_lumu_root>.

The integration works with Python 3.10. If your environment runs an earlier version, we recommend deploying the integration as a Docker container.

Deploy as script

In the package, you will find the script required to run the integration. To use the script, change to the path selected for deployment (<akamai_etp_lumu_root>). Specific directions are included in the next sections.

Install requirements

If you are running other Python scripts on the selected host, it’s recommended to create a virtual environment to preserve the integrity of those tools. To do so, follow these steps:

1. Using a command line tool, change to the <akamai_etp_lumu_root> folder.

2. Run the following command to create the virtual environment:

python3 -m venv <venv_folder>

3. Activate the virtual environment by running the following:

source <venv_folder>/bin/activate

The file requirements.txt contains the list of requirements for this integration. After deploying the package locally, run the following command from the deployment folder:

pip install -r ./requirements.txt

Script details

To use the script, change to the path selected for deployment (<akamai_etp_lumu_root>). Use the following command to show all options available for the package:

python3 akamai_etp_lumu.py -h

usage: akamai_etp_lumu [-h] --aka_host AKA_HOST --aka_client_secret AKA_CLIENT_SECRET --aka_access_token AKA_ACCESS_TOKEN --aka_client_token AKA_CLIENT_TOKEN --aka_etp_config_id AKA_ETP_CONFIG_ID --lumu_client_key LUMU_CLIENT_KEY --lumu_dns_collector_id LUMU_DNS_COLLECTOR_ID --lumu_proxy_collector_id LUMU_PROXY_COLLECTOR_ID [-v] [-l {screen,file}]

Options and descriptions:

-h, --help
    Show this help message and exit.
-ah AKA_HOST, --aka_host AKA_HOST
    Akamai ETP aka_host.
-cs AKA_CLIENT_SECRET, --aka_client_secret AKA_CLIENT_SECRET
    Akamai ETP aka_client_secret.
-at AKA_ACCESS_TOKEN, --aka_access_token AKA_ACCESS_TOKEN
    Akamai ETP aka_access_token.
-ct AKA_CLIENT_TOKEN, --aka_client_token AKA_CLIENT_TOKEN
    Akamai ETP aka_client_token.
-cid AKA_ETP_CONFIG_ID, --aka_etp_config_id AKA_ETP_CONFIG_ID
    Akamai ETP aka_etp_config_id.
-key LUMU_CLIENT_KEY, --lumu_client_key LUMU_CLIENT_KEY
    Lumu client key for the collectors.
-dnscid LUMU_DNS_COLLECTOR_ID, --lumu_dns_collector_id LUMU_DNS_COLLECTOR_ID
    Lumu DNS collector ID.
-proxycid LUMU_PROXY_COLLECTOR_ID, --lumu_proxy_collector_id LUMU_PROXY_COLLECTOR_ID
    Lumu Proxy collector ID.
-l {screen,file}, --logging {screen,file}
    Logging option (default: screen).
-v, --verbose
    Verbosity level.
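For reference, the option surface above corresponds roughly to the argparse definition sketched below. This is an illustration of the CLI shape, not the actual integration code, and the sample values are placeholders:

```python
import argparse

# Sketch of the CLI surface shown by `akamai_etp_lumu.py -h`;
# the real script may define these options differently.
parser = argparse.ArgumentParser(prog="akamai_etp_lumu")
parser.add_argument("-ah", "--aka_host", required=True)
parser.add_argument("-cs", "--aka_client_secret", required=True)
parser.add_argument("-at", "--aka_access_token", required=True)
parser.add_argument("-ct", "--aka_client_token", required=True)
parser.add_argument("-cid", "--aka_etp_config_id", required=True)
parser.add_argument("-key", "--lumu_client_key", required=True)
parser.add_argument("-dnscid", "--lumu_dns_collector_id", required=True)
parser.add_argument("-proxycid", "--lumu_proxy_collector_id", required=True)
parser.add_argument("-l", "--logging", choices=["screen", "file"], default="screen")
parser.add_argument("-v", "--verbose", action="count", default=0)

# Parse a set of placeholder arguments to show the resulting namespace:
args = parser.parse_args([
    "--aka_host", "akab-example.luna.akamaiapis.net",
    "--aka_client_secret", "s3cret",
    "--aka_access_token", "tok",
    "--aka_client_token", "ctok",
    "--aka_etp_config_id", "12345",
    "--lumu_client_key", "key",
    "--lumu_dns_collector_id", "dns-id",
    "--lumu_proxy_collector_id", "proxy-id",
])
print(args.aka_etp_config_id)  # -> 12345
```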

Usage Examples

Task: poll and inject Akamai ETP logs into Lumu

Run the following command to poll all the Akamai ETP logs and push them into the Lumu custom data collectors. The polling process triggers every minute.

python akamai_etp_lumu.py --aka_host AKA_HOST --aka_client_secret AKA_CLIENT_SECRET --aka_access_token AKA_ACCESS_TOKEN --aka_client_token AKA_CLIENT_TOKEN --aka_etp_config_id AKA_ETP_CONFIG_ID --lumu_client_key LUMU_CLIENT_KEY --lumu_dns_collector_id LUMU_DNS_COLLECTOR_ID --lumu_proxy_collector_id LUMU_PROXY_COLLECTOR_ID

The script starts polling from the current time minus a delay set in the script. By default, this offset is 3 minutes.
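In other words, the first polling window starts behind real time by the configured offset. A minimal sketch of that computation, assuming the 3-minute default described above:

```python
from datetime import datetime, timedelta, timezone

DEFAULT_OFFSET = timedelta(minutes=3)  # default delay described above

def initial_poll_start(now: datetime, offset: timedelta = DEFAULT_OFFSET) -> datetime:
    """The script begins collecting logs from 'now' minus the offset,
    so records Akamai is still writing are not missed."""
    return now - offset

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
print(initial_poll_start(now).isoformat())  # -> 2024-01-01T11:57:00+00:00
```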

Task: store execution records in a file

To redirect all the output from the execution process to a file, use the --logging file argument. The integration output will be stored in a file called lumu.log.

python akamai_etp_lumu.py --aka_host AKA_HOST --aka_client_secret AKA_CLIENT_SECRET --aka_access_token AKA_ACCESS_TOKEN --aka_client_token AKA_CLIENT_TOKEN --aka_etp_config_id AKA_ETP_CONFIG_ID --lumu_client_key LUMU_CLIENT_KEY --lumu_dns_collector_id LUMU_DNS_COLLECTOR_ID --lumu_proxy_collector_id LUMU_PROXY_COLLECTOR_ID --logging file

Setting this flag is recommended, as the script runs as a daemon process and the information stored in lumu.log is useful for tracing progress or troubleshooting.

Further considerations

The script is intended to be used as a daemon process. It is recommended to run it with complementary tools like nohup. Use one of the following lines as an example:

If you are using a Python virtual environment

nohup <venv_path>/bin/python <akamai_etp_lumu_root>/akamai_etp_lumu.py <flags and arguments> &

If you are NOT using a Python virtual environment

nohup python3 <akamai_etp_lumu_root>/akamai_etp_lumu.py <flags and arguments> &
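As an alternative to nohup, hosts running systemd can supervise the script as a service so it restarts on failure and starts at boot. The unit below is a sketch assuming a virtual-environment deployment; adjust the paths, user, and arguments to your installation:

```ini
[Unit]
Description=Lumu Akamai SIA (ETP) custom data collection integration
After=network-online.target

[Service]
WorkingDirectory=<akamai_etp_lumu_root>
ExecStart=<venv_path>/bin/python <akamai_etp_lumu_root>/akamai_etp_lumu.py <flags and arguments>
Restart=on-failure

[Install]
WantedBy=multi-user.target
```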

Troubleshooting

To identify failures in the script execution, use the -v flag to increase the verbosity of the output.

Deploy as a Docker container (Optional)

If you have a Docker environment, you can select this option to run the integration as a Docker process. To deploy and run the integration as a Docker container, change to the <akamai_etp_lumu_root> folder, and follow these instructions:

1. To build the container, run the following command. Set all the build arguments based on the reference given in the Script details section above.

docker build --build-arg aka_client_secret='xxx' --build-arg aka_access_token='xxx' --build-arg aka_client_token='xxx' --build-arg aka_etp_config_id='xxx' --build-arg aka_host='xxx' --build-arg lumu_client_key='xxx' --build-arg lumu_dns_collector_id='xxx' --build-arg lumu_proxy_collector_id='xxx' --tag python-lumu-akamai-etp .

Do not forget the dot "." at the end of the line.

2. To run the container, run the following command:

docker run -d --name lumu-akamai-etp python-lumu-akamai-etp

Troubleshooting

For troubleshooting purposes, you can run the following commands:

To log in to your container using an interactive shell:

docker exec -it lumu-akamai-etp bash

To collect integration logs:

docker logs -f lumu-akamai-etp

Expected results

After setting up the integration, all the traffic records collected by Akamai’s ETP agent will be illuminated by Lumu. Both collectors will start to show collected data.


Related detections will show additional data, according to the data reported by Akamai.



Related Articles

  • Akamai SIA Custom Response Integration
  • DNSFilter Custom Data Collection Integration
  • Cato Networks Custom Data Collection Integration
  • Microsoft Entra ID NSG Flow Logs Custom Data Collection Integration
  • Cloudflare - S3 Compatible Storage Custom Data Collection Integration