Netskope Out-of-the-Box Data Collection Integration

This article describes the procedure required to integrate Netskope Next Gen SWG with Lumu for automated data collection. This is one of our featured Out-of-the-Box Data Collection Integrations.

Note: This integration previously used the Event Streaming Service, which is being retired on June 30, 2026. If you have an integration based on Event Streaming, we recommend deleting it and creating a new one using the Log Streaming service as described in this article.

Requirements

  • A Netskope Next Gen Secure Web Gateway subscription with a valid Log Streaming license.
  • An active Lumu Defender Subscription.
  • Authentication credentials for your cloud storage provider.

Preliminary Setup - Netskope Log Streaming

To set up the integration, you must prepare your Netskope Log Streaming instance to communicate with the Lumu integration. To do this, you need to create a transaction log stream.

Note: This guide outlines the steps for setting up credentials to store transaction logs across three major cloud providers: AWS, Azure, and GCP. Each provider uses a unique identity and access management (IAM) system, so the process for generating credentials differs.

Create a Transaction Log Stream

1. Log in to your Netskope UI.

2. Navigate to Settings > Tools > Log Streaming and click on Create Stream.

3. Enter a descriptive Name for the stream and select Transaction Events in the Data Collections section.

Note: Keep in mind that you can only create one Transaction Stream. If your company already has one configured, simply identify and use it.

4. Click Manage Fields and make sure the following fields are enabled:

date, time, time-taken, cs-bytes, sc-bytes, c-ip, s-ip, cs-username, cs-uri-scheme, cs-uri-query, cs-user-agent, sc-status, cs-host, cs-uri, cs-uri-port, x-other-category, x-policy-action, x-policy-name

Then, click Save.
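The fields enabled in step 4 become the columns of each streamed CSV record. As an illustration, the following sketch parses a hypothetical transaction log line using those field names (the sample values are invented, and the column order shown is illustrative; check the stream's actual header row in practice):

```python
import csv
import io

# The fields enabled in step 4 (order is illustrative).
FIELDS = [
    "date", "time", "time-taken", "cs-bytes",
    "sc-bytes", "c-ip", "s-ip", "cs-username",
    "cs-uri-scheme", "cs-uri-query", "cs-user-agent", "sc-status",
    "cs-host", "cs-uri", "cs-uri-port",
    "x-other-category", "x-policy-action", "x-policy-name",
]

# A hypothetical CSV record for illustration only.
sample = (
    "2025-01-15,10:32:05,120,512,2048,10.0.0.5,203.0.113.7,jdoe,"
    "https,-,Mozilla/5.0,200,example.com,https://example.com/,443,"
    "Technology,allow,Default"
)

record = next(csv.DictReader(io.StringIO(sample), fieldnames=FIELDS))
print(record["cs-host"], record["sc-status"], record["x-policy-action"])
# example.com 200 allow
```

Fields such as cs-host, sc-status, and x-policy-action are what give Lumu visibility into which hosts were contacted and what policy action Netskope took.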

5. Click Select Destination and choose the cloud storage service the logs will be streamed to: Azure Blob Storage, Google Cloud Storage, or AWS S3.

Note: Ensure your cloud storage object is already created and you have the corresponding authentication credentials before setting up the Transaction Stream Destination.

For Amazon S3

Fill in the following fields:

  • Bucket: The name of your Amazon S3 bucket (e.g., netskopeforlumubucket). Keep this value at hand; it will be used during the Integration Setup.
  • Folder Path: The path to the folder within the bucket where the logs will be stored. This is required for the integration with Lumu. Keep this value at hand; it will be used during the Integration Setup.
  • Access: Select your preferred authentication method and provide the required information.
  • Region: The AWS region where your S3 bucket is hosted (e.g., us-east-1).
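The Bucket and Folder Path values together determine the S3 location the logs will land in. A minimal sketch of how those two form values combine into an object prefix (the bucket name and folder path below are hypothetical):

```python
def s3_log_prefix(bucket: str, folder_path: str) -> str:
    """Build the s3:// prefix where streamed logs will land.

    Illustration only: bucket and folder_path are the values entered
    in the Netskope Destination form, not values read from AWS.
    """
    folder = folder_path.strip("/")
    return f"s3://{bucket}/{folder}/" if folder else f"s3://{bucket}/"

print(s3_log_prefix("netskopeforlumubucket", "/netskope-logs/"))
# s3://netskopeforlumubucket/netskope-logs/
```

Keeping the prefix consistent matters because the same bucket and path values are entered again on the Lumu side during the Integration Setup.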

For Azure Blob

Fill in the following fields:

  • Storage Account Name: The name of the storage account.
  • Container Name: The name of the container within the storage account where you want to store logs. Keep this value at hand; it will be used during the Integration Setup.
  • Path: The path to the folder within the container where the logs will be stored. This is required for the integration with Lumu. Keep this value at hand; it will be used during the Integration Setup.
  • Access Key: The access key associated with the selected Azure account.

For GCP Cloud Storage

Fill in the following fields:

  • Bucket: The name of the storage bucket you created in your Google Cloud account. Keep this value at hand; it will be used during the Integration Setup.
  • Path: The path to the folder within your Google Cloud bucket where the logs will be stored. This is required for the integration with Lumu. Keep this value at hand; it will be used during the Integration Setup.
  • Private Key: The private_key value from the JSON key file downloaded for your Google Cloud service account.
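The Private Key value lives inside the service-account JSON key file. A small sketch, using an entirely hypothetical key file, of where that value sits and how to check the file has what this form needs (real keys are downloaded as JSON from the Google Cloud console):

```python
import json

# Hypothetical service-account key for illustration only.
key_json = """{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "abc123",
  "private_key": "-----BEGIN PRIVATE KEY-----\\n...\\n-----END PRIVATE KEY-----\\n",
  "client_email": "lumu-logs@my-project.iam.gserviceaccount.com"
}"""

key = json.loads(key_json)

# The Netskope form asks specifically for the private_key value.
missing = [f for f in ("private_key", "client_email") if f not in key]
assert not missing, f"key file is missing: {missing}"
print(key["private_key"].splitlines()[0])
# -----BEGIN PRIVATE KEY-----
```

Copy the full private_key value, including the BEGIN/END markers, into the Netskope form.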

6. Once you have configured your selected cloud storage, click Save.

7. For Log Format, select CSV. Then, select GZIP as the Compression type.

8. Lastly, click Save to confirm the configuration of your Transaction stream.
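With the choices made in step 7, each object written to your bucket is a GZIP-compressed CSV file. The following sketch simulates one such object in memory and reads it back, the way a collector would after downloading it (the field names and values are illustrative):

```python
import csv
import gzip
import io

# Simulate a streamed object: CSV rows, GZIP-compressed, matching the
# Log Format and Compression chosen in step 7. Values are illustrative.
rows = "cs-host,sc-status,x-policy-action\nexample.com,200,allow\n"
blob = gzip.compress(rows.encode("utf-8"))

# Read it back as a collector would after downloading the object.
with gzip.open(io.BytesIO(blob), mode="rt", encoding="utf-8") as fh:
    records = list(csv.DictReader(fh))

print(records[0]["cs-host"])
# example.com
```

The CSV + GZIP combination is what the Lumu integration expects to find in the configured bucket, so do not change these values after activation.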

Integration Setup - Lumu Portal

This section of the article describes the steps that must be completed on the Lumu Portal to properly set up the Netskope integration. To start, log into your Lumu account through the Lumu Portal.

Note: Integrations are also available for Lumu MSP accounts. To access them, log into the Lumu MSP Portal.

1. In the Lumu Portal, head to the left panel and select Integrations > Apps. Then, click on the Data Collection tab.

2. Locate the Netskope Next Gen SWG integration and click Add.

3. Familiarize yourself with the integration details available in the app description and click Activate to start the integration setup process.

4. In the following window, add a descriptive Name. By default, this integration will be tagged as unlabeled activity; however, you can select a label of your preference for additional visibility. Click Next when finished.

5. In the next window, fill out the following:

  • Container Name: The name of the container/bucket where the logs are stored.
  • Stream ID: The Path of the folder within the container/bucket where the logs are saved, as configured previously.
  • Cloud Provider: Select the cloud provider that was configured as the Destination.

Click Next when finished.

6. Enter the credentials for the selected cloud provider.

  • For Amazon S3, enter the Access Key, Secret Access Key, and Region of your AWS S3 bucket. Then, click Activate.
  • For Azure Blob, enter the Storage Account Connection String, which includes your account name, key, and endpoint details for authentication. Then, click Activate.
  • For GCP Cloud Storage, enter the Service Account Key containing the private key details and credentials for authentication. Then, click Activate.
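For the Azure case, the connection string is a single semicolon-separated value that bundles the account name, key, and endpoint. A sketch of its general shape, using a hypothetical account name and key (the real string is copied from the storage account's Access keys page in the Azure portal):

```python
# Hypothetical Azure Storage connection string for illustration only.
conn = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=mystorageaccount;"
    "AccountKey=base64keyvalue==;"
    "EndpointSuffix=core.windows.net"
)

# Split into key=value pairs; split("=", 1) keeps the "==" padding
# at the end of the base64-encoded account key intact.
parts = dict(p.split("=", 1) for p in conn.split(";") if p)
print(parts["AccountName"], parts["EndpointSuffix"])
# mystorageaccount core.windows.net
```

Paste the whole string into the Storage Account Connection String field; the Lumu integration extracts the individual parts itself.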

7. Once you activate the integration, the Lumu Portal will display the details of the newly created integration.

