
Integrating with third-party platforms: Webhook

Splunk

Set up Splunk to receive logs, including time-stamped events, through the HTTP Event Collector (HEC) using the Splunk platform JSON event protocol.
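
Under the JSON event protocol, each record is wrapped in a small envelope (a time stamp, a sourcetype, and the event payload) and posted to HEC's /services/collector/event endpoint with a "Splunk <token>" Authorization header. The TypeScript sketch below illustrates the shape of such a request; the host, port, token, and event fields are placeholder assumptions, not values supplied by this integration.

```typescript
// A minimal sketch, assuming a local Splunk instance with HEC enabled on
// its default port 8088. The URL and token are placeholders, not values
// supplied by this integration.
const SPLUNK_HEC_URL = "https://localhost:8088/services/collector/event"; // assumed host/port
const SPLUNK_HEC_TOKEN = "00000000-0000-0000-0000-000000000000"; // placeholder token

async function sendEvent(event: Record<string, unknown>): Promise<void> {
  const response = await fetch(SPLUNK_HEC_URL, {
    method: "POST",
    headers: {
      Authorization: `Splunk ${SPLUNK_HEC_TOKEN}`, // HEC token auth scheme
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      time: Math.floor(Date.now() / 1000), // time-stamped event (epoch seconds)
      sourcetype: "_json",
      event, // the log payload itself
    }),
  });
  if (!response.ok) {
    throw new Error(`HEC rejected the event: ${response.status}`);
  }
}

// Example: forward a simplified Jira-style issue event.
sendEvent({ webhookEvent: "jira:issue_updated", issueKey: "DEMO-1" }).catch(console.error);
```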

Setting Up Splunk Integration

  1. Navigate to Settings > Apps > Forge Observability - Webhook Relay.

  2. Add your Webhook URL to the Webhook Relay application configuration page.

  3. Create a Webhook Listener in Jira:

    1. Copy the Webhook URL from Webhook Relay.

    2. Navigate to Jira settings > System > Advanced > Webhooks.

    3. Create a new Webhook Listener by naming it and pasting the Webhook URL.

    4. Optionally, add a JQL filter so that only matching issue-related events are delivered (see the registration sketch after this procedure).

    5. Click Create to finalize.

  4. Create an HTTP Event Collector Token in Splunk Enterprise:

    1. Refer to the Splunk Documentation for the latest instructions on configuring the event collector for your environment.

    2. Below is a basic example of how one of our engineers successfully set up the event collector.

This is based on a test setup of an event collector from August 2024.
  1. Ensure you have your Splunk instance ready.

  2. Navigate to the Settings menu, then select Add Data to begin configuring the data you wish to send to Splunk.

  3. Enable Data Flow:

    1. Choose Monitor to allow data to flow through your Splunk instance.

  4. Set Up HTTP Event Collector:

    1. Navigate to the HTTP Event Collector section.

    2. Fill in the Name field for the new token.

    3. Click Next, Review, and then click Submit to generate the token.

  5. Configure the Application:

    1. Paste the token into your application configuration page's Splunk API Token field.

    2. Enter your Splunk instance URL.

    3. Click Connect to establish the connection between your application and Splunk.

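If you prefer to script step 3 rather than use the admin UI, Jira also exposes a webhook REST resource. The following sketch registers an equivalent listener, with the body shape following Atlassian's webhook documentation; the site URL, credentials, relay URL, and JQL filter are placeholder assumptions, and availability of this endpoint can vary by Jira deployment.

```typescript
// Hedged sketch: registering the listener from step 3 via Jira's webhook
// REST resource instead of the admin UI. Base URL, credentials, relay URL,
// and the JQL filter are placeholder assumptions.
const JIRA_BASE_URL = "https://your-domain.atlassian.net"; // assumed site
const BASIC_AUTH = Buffer.from("admin@example.com:api-token").toString("base64"); // placeholder credentials

async function registerWebhook(relayUrl: string): Promise<void> {
  const response = await fetch(`${JIRA_BASE_URL}/rest/webhooks/1.0/webhook`, {
    method: "POST",
    headers: {
      Authorization: `Basic ${BASIC_AUTH}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      name: "Forge Observability - Webhook Relay",
      url: relayUrl, // the Webhook URL copied from Webhook Relay
      events: ["jira:issue_created", "jira:issue_updated"],
      // Deliver only issue-related events matching this JQL (placeholder project).
      filters: { "issue-related-events-section": "project = DEMO" },
      excludeBody: false,
    }),
  });
  if (!response.ok) {
    throw new Error(`Webhook registration failed: ${response.status}`);
  }
}

registerWebhook("https://example.com/relay").catch(console.error);
```

Once the token from step 4 exists, recent Splunk versions also serve a lightweight health check at /services/collector/health, which you can query to confirm HEC is reachable before clicking Connect in step 5.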

Testing the Integration

  1. Trigger an Event:

    1. Update an issue by changing its workflow status.

  2. Verify in Splunk:

    1. Go to Splunk and check that the log has been captured as a record, confirming successful integration.

    2. A captured record also confirms that the issue update matched the JQL filter configured in the previous steps (see the verification sketch below).
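
If you prefer to verify from the command line rather than the Splunk UI, the search REST API can return recently indexed events. The sketch below is a minimal example using the export endpoint on the management port; the host, credentials, index, and sourcetype are placeholder assumptions.

```typescript
// Hedged sketch: checking for recently indexed events through Splunk's
// search REST API on the management port (8089 by default). The export
// endpoint streams results without job polling. Host, credentials, and
// the search string are placeholders for illustration.
const SPLUNK_API = "https://localhost:8089"; // assumed management endpoint
const BASIC_AUTH = Buffer.from("admin:changeme").toString("base64"); // placeholder credentials

async function findRecentEvents(): Promise<string> {
  const params = new URLSearchParams({
    search: 'search index=main sourcetype="_json" earliest=-15m', // placeholder query
    output_mode: "json",
  });
  const response = await fetch(`${SPLUNK_API}/services/search/jobs/export`, {
    method: "POST",
    headers: {
      Authorization: `Basic ${BASIC_AUTH}`,
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: params.toString(),
  });
  if (!response.ok) {
    throw new Error(`Search request failed: ${response.status}`);
  }
  return response.text(); // newline-delimited JSON, one result per line
}

findRecentEvents().then(console.log).catch(console.error);
```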

Logstash

Logstash is an open-source data processing pipeline that collects, processes and forwards logs or other types of data to various outputs. It's often used to aggregate and transform logs from different sources before sending them to systems like Elasticsearch, Splunk, or other databases for storage and analysis. Logstash allows for real-time data processing and supports a wide range of input, filter, and output plugins, making it highly versatile for managing log data.

Setting Up Logstash Integration

  1. Navigate to the Jira top menu ⚙️ Settings > Apps > Forge Observability - API Proxy.

  2. Click ⚙️ Settings and find the Logstash section.

  3. Enter the endpoint where your Logstash instance is configured to receive data.

  4. Select the appropriate authentication method (e.g., None, Basic, Bearer) and provide the necessary credentials if applicable (a test sketch follows this procedure).

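Before triggering real Jira events, you can confirm the endpoint and credentials by posting a test event directly to Logstash. The sketch below assumes a pipeline using the http input plugin with Basic authentication; the URL, credentials, and payload are placeholders.

```typescript
// Hedged sketch: posting a test event to a Logstash HTTP input, assuming
// the pipeline uses the http input plugin with Basic authentication. The
// endpoint URL and credentials are placeholders; use the values you
// entered in the steps above.
const LOGSTASH_URL = "https://logstash.example.com:8080"; // assumed http input endpoint
const BASIC_AUTH = Buffer.from("relay-user:relay-secret").toString("base64"); // placeholder credentials

async function sendTestEvent(): Promise<void> {
  const response = await fetch(LOGSTASH_URL, {
    method: "POST",
    headers: {
      Authorization: `Basic ${BASIC_AUTH}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      webhookEvent: "jira:issue_updated", // simplified Jira-style payload
      issueKey: "DEMO-1",
      timestamp: new Date().toISOString(),
    }),
  });
  if (!response.ok) {
    throw new Error(`Logstash rejected the event: ${response.status}`);
  }
}

sendTestEvent().catch(console.error);
```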

Testing the Integration

  1. Trigger an event.

  2. For example, update an issue by changing its workflow status.

  3. Further log processing and visualization: you can use tools like Elasticsearch and Kibana to further process, store, and visualize your data.
