# Integrating Azure Log Analytics

The following steps outline how to send application logs to Azure Log Analytics using FluentD.

# Create a Log Analytics Workspace

  1. In the Azure portal, click All services. In the list of resources, select Log Analytics workspaces.

  2. Click Add, and then configure the following items:

    • Type a Name that is globally unique across all workspaces.
    • Select a Subscription.
    • Choose an existing Resource Group or create a new one.
    • Select an available Location. We recommend using the same location as your virtual machine.
    • Click Review + Create, then click Create.
  3. In the workspace, go to Advanced settings > Connected Sources > Linux. Copy the WORKSPACE ID and PRIMARY KEY values. You will need these at a later step.
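If you prefer the command line, the workspace can also be created and its credentials retrieved with the Azure CLI. This is an optional sketch; the resource group, workspace name, and location shown here are placeholder values you should replace with your own.

az monitor log-analytics workspace create --resource-group my-resource-group --workspace-name my-filemage-workspace --location eastus

The WORKSPACE ID (customer_id) and PRIMARY KEY (shared_key) can then be read back with:

az monitor log-analytics workspace show --resource-group my-resource-group --workspace-name my-filemage-workspace --query customerId --output tsv
az monitor log-analytics workspace get-shared-keys --resource-group my-resource-group --workspace-name my-filemage-workspace --query primarySharedKey --output tsv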

# Installing FluentD on Linux

  1. SSH into your virtual machine using the credentials you specified when launching it.

  2. In your terminal, run the following commands to install FluentD and the Azure Log Analytics plugin:

curl -fsSL https://toolbelt.treasuredata.com/sh/install-ubuntu-jammy-fluent-package5-lts.sh | sh
sudo fluent-gem install fluent-plugin-azure-loganalytics
  3. Run the following command to give the FluentD user access to the FileMage log files.
sudo usermod -a -G adm _fluentd
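Optionally, you can confirm that the plugin installed correctly and that the _fluentd user picked up the adm group before moving on. The log directory below assumes the default FileMage location used throughout this guide.

id _fluentd
fluent-gem list | grep azure-loganalytics
sudo ls -l /var/log/filemage/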

# Configuring FluentD

## FileMage Application Logger

To send the application log to Azure Log Analytics, add the following entries to the FluentD configuration file at /etc/fluent/fluentd.conf. Note that you need to substitute the customer_id and shared_key placeholders with the WORKSPACE ID and PRIMARY KEY values you copied from your Log Analytics workspace.

<source>
  @type tail
  @id input_filemage_gateway
  path /var/log/filemage/gateway.log
  pos_file /var/log/fluent/tmp/filemage-gateway.log.pos
  tag filemage.gateway
  <parse>
    @type regexp
    expression /^(?<time>.*) - \[(?<level>.*)\] (?<msg>.*)/
    time_format %Y-%m-%dT%H:%M:%S.%L%z
  </parse>
</source>

<match filemage.gateway>
  @type azure-loganalytics
  @id azure_out_gateway
  customer_id <YOUR_WORKSPACE_ID>
  shared_key <YOUR_PRIMARY_KEY>
  log_type FileMageGateway
  time_format %Y-%m-%dT%H:%M:%S.%L%z
</match>
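FluentD must be able to write the pos_file tracking files used above. If the /var/log/fluent/tmp directory does not already exist on your system, you may need to create it and give the FluentD user ownership, for example:

sudo mkdir -p /var/log/fluent/tmp
sudo chown _fluentd:_fluentd /var/log/fluent/tmp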

## FileMage Special Loggers

To send the output of special logs such as the connection_log, set the format of that logger to json and use the following configuration. Note that the @id, tag, path, pos_file, match, and log_type values must be unique for each log file.

<source>
  @type tail
  @id input_filemage_connections
  path /var/log/filemage/connections.log
  pos_file /var/log/fluent/tmp/filemage-connections.log.pos
  tag filemage.connections
  <parse>
    @type json
    time_format %Y-%m-%dT%H:%M:%S.%L%z
  </parse>
</source>

<match filemage.connections>
  @type azure-loganalytics
  @id azure_out_connections
  customer_id <YOUR_WORKSPACE_ID>
  shared_key <YOUR_PRIMARY_KEY>
  log_type FileMageConnections
  time_format %Y-%m-%dT%H:%M:%S.%L%z
</match>
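Before restarting FluentD, it can be worth confirming that the special log is actually being written as one JSON object per line, since the json parser will reject plain-text entries. A quick spot check, assuming python3 is available on the machine:

sudo tail -n 1 /var/log/filemage/connections.log | python3 -m json.tool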

# Output Buffering

If there is very little activity on a log file, it may take several minutes for its entries to reach Azure Log Analytics, because the plugin only sends logs when its internal buffer is flushed. To control this behavior, you can modify the plugin's buffer settings.

For example, adding the following to a match section will cause the plugin to send logs approximately every 10 seconds.

<buffer time>
  timekey 10s
  timekey_wait 1s
</buffer>

# Apply Changes

Restart FluentD to apply the configuration changes.

sudo systemctl restart fluentd
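If FluentD fails to start or logs do not appear, you can check the configuration syntax and inspect the service status and log. The paths below assume the fluent-package layout, with the configuration at /etc/fluent/fluentd.conf and the service log at /var/log/fluent/fluentd.log.

sudo fluentd --dry-run -c /etc/fluent/fluentd.conf
sudo systemctl status fluentd --no-pager
sudo tail -n 50 /var/log/fluent/fluentd.log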

# Query Logs in Azure Log Analytics

  1. In the Azure Portal, go to the Log Analytics workspace you created.

  2. In the sidebar, click on Logs. Custom logs sent through the plugin appear as tables with a _CL suffix, so records sent with log_type FileMageGateway are stored in the FileMageGateway_CL table. Enter the following query in the query panel and click Run.

FileMageGateway_CL
| where TimeGenerated > ago(120m)
| order by TimeGenerated desc
| limit 100
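The same query can also be run from the Azure CLI (this may require the log-analytics extension), passing the WORKSPACE ID as the workspace argument:

az monitor log-analytics query --workspace <YOUR_WORKSPACE_ID> --analytics-query "FileMageGateway_CL | where TimeGenerated > ago(120m) | order by TimeGenerated desc | limit 100"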