
Instrumenting Java apps in Amazon EKS and non-EKS environments

 

In your organization, you use Kubernetes to automate the deployment, scaling, and management of containers. However, maintaining the system yourself can be difficult, so cloud platforms offer Kubernetes as a managed service, such as Amazon Elastic Kubernetes Service (EKS). Amazon EKS delivers highly available and secure Kubernetes clusters quickly and simply.

Instrumenting applications running in EKS is similar to instrumenting non-EKS applications, but a few changes are required. It helps to first understand how to instrument apps running in non-EKS environments so you can see exactly what changes are needed.

Deploy a Java application without Kubernetes

  1. From the left sidebar within Splunk Observability Cloud, click Data Management.
  2. Click + Add Integration.
  3. Locate the APM Instrumentation section and click Java - Provides Traces.
  4. At the bottom of the UI, click Next to reach the Configure Integration step. Provide the following details about the environment where the application is deployed:
    1. Service Name: The name of the service you are instrumenting.
    2. Collector Endpoint: The endpoint where the Splunk OpenTelemetry Collector is installed. By default, this is set to http://localhost:4317, but you can change it to point to a remote machine hosting the Splunk OpenTelemetry Collector.
    3. Environment: The name of the environment that you are instrumenting.
    4. AlwaysOn Profiling: Select one of the following:
      • Select CPU profiling if you want CPU profiling only.
      • Select CPU and memory profiling for both CPU and memory profiling.
    5. Application Metrics: If enabled, gathers additional JVM-specific metrics.
    6. Kubernetes: Whether the application is running within Kubernetes. Select No.
    7. Legacy Agent: Select whether you want to use the legacy SignalFx agent.

  5. At the bottom of the UI, click Next to reach the Install Integration step.
  6. Select the Linux or Windows tab and follow the instructions on the page, providing the following details about the environment where the application is deployed:
    1. If you have not already done so, click Install the Splunk OpenTelemetry Connector and follow the steps shown in the guided wizard.
    2. Install the integration by using the provided curl command to download the latest version of the Splunk Distribution of OpenTelemetry Java agent, which you will add to your runtime command. The appropriate curl command is generated based on your Windows or Linux selection.
    3. Set the environment variables. Copy these from the generated commands.
    4. Update your runtime command. Copy this from the generated code (a sketch of these generated commands appears after this list).
    5. Restart your application or service.
    6. Send requests to the service or generate activity in your application.

  7. Click Next at the bottom of the UI to review. You can now interact with your data by clicking Explore Service Spans. The environment name is visible in this area.
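
The exact commands are generated for you in the UI, but a rough sketch of what they typically look like on a Linux host is shown below. The service name, environment name, jar name, and profiling settings here are placeholders; use the values generated by the wizard.

    # Download the latest Splunk OpenTelemetry Java agent (the wizard generates the exact curl command)
    curl -L https://github.com/signalfx/splunk-otel-java/releases/latest/download/splunk-otel-javaagent.jar \
      -o splunk-otel-javaagent.jar

    # Environment variables copied from the generated commands
    export OTEL_SERVICE_NAME='my-java-service'
    export OTEL_EXPORTER_OTLP_ENDPOINT='http://localhost:4317'   # point at a remote Collector host if needed
    export OTEL_RESOURCE_ATTRIBUTES='deployment.environment=my-environment'
    export SPLUNK_PROFILER_ENABLED='true'                        # AlwaysOn CPU profiling
    export SPLUNK_PROFILER_MEMORY_ENABLED='true'                 # add memory profiling
    export SPLUNK_METRICS_ENABLED='true'                         # application (JVM) metrics

    # Updated runtime command: add the -javaagent flag, then restart the service
    java -javaagent:./splunk-otel-javaagent.jar -jar my-java-service.jar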

Deploy a Java application with Amazon EKS

The process to instrument your Java application on Amazon EKS changes only slightly from the non-Kubernetes process, because of how applications are deployed in Kubernetes.

These steps also apply when deploying your application to a Kubernetes cluster outside of EKS. In step 4.6 below, select Yes for Kubernetes.

  1. From the left sidebar within Splunk Observability Cloud, click Data Management.
  2. Click + Add Integration.
  3. Locate the APM Instrumentation section and click Java - Provides Traces.
  4. Click Next at the bottom of the UI to reach the Configure Integration step. Provide the following details about the environment where the application is deployed:
    1. Service Name: The name of the service you are instrumenting.
    2. Collector Endpoint: The endpoint where the Splunk OpenTelemetry Collector is installed. By default, this is set to http://localhost:4317, but you can change it to point to a remote machine hosting the Splunk OpenTelemetry Collector.
    3. Environment: The name of the environment that you are instrumenting.
    4. AlwaysOn Profiling: Select CPU profiling if you want CPU profiling only, or select CPU and memory profiling for both CPU and memory profiling.
    5. Application Metrics: If enabled, gathers additional JVM-specific metrics.
    6. Kubernetes: Whether the application is running within Kubernetes. Select Yes.
    7. Legacy Agent: Select whether you want to use the legacy SignalFx agent.

  5. Click Next at the bottom of the UI to reach the Install Integration step.
  6. Select the Linux or Windows tab and then follow the instructions on the page, providing the following details about the environment where the application is deployed:

    1. If you have not already done so, click Install the Splunk OpenTelemetry Connector and follow the steps shown in the guided wizard.
    2. Install the integration by using the provided curl command to download the latest version of the Splunk Distribution of OpenTelemetry Java agent, which you will add to your runtime command. The appropriate curl command is generated based on your Windows or Linux selection.
    3. Configure the Kubernetes Downward API to expose the environment variables to Kubernetes resources by injecting them in .spec.template.spec.containers.env of the YAML deployment file (see the example deployment fragment after this list).
    4. Update your runtime command. Copy this from the generated code.
    5. Restart your application or service.
    6. Send requests to the service or generate activity in your application.

  7. Click Next at the bottom of the UI to review. You can now interact with your data by clicking Explore Service Spans.
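
As an illustration of step 6.3, here is a minimal sketch of a Deployment fragment that injects the variables through .spec.template.spec.containers.env. The service name, image, and environment name are placeholders, and the sketch assumes the Splunk OpenTelemetry Collector runs as an agent on each node; the Kubernetes Downward API exposes the node IP so the application can reach that agent.

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: my-java-service
    spec:
      selector:
        matchLabels:
          app: my-java-service
      template:
        metadata:
          labels:
            app: my-java-service
        spec:
          containers:
            - name: my-java-service
              image: my-registry/my-java-service:latest
              env:
                - name: SPLUNK_OTEL_AGENT               # node IP exposed through the Downward API
                  valueFrom:
                    fieldRef:
                      fieldPath: status.hostIP
                - name: OTEL_EXPORTER_OTLP_ENDPOINT     # OTLP endpoint of the Collector agent on the node
                  value: "http://$(SPLUNK_OTEL_AGENT):4317"
                - name: OTEL_SERVICE_NAME
                  value: "my-java-service"
                - name: OTEL_RESOURCE_ATTRIBUTES
                  value: "deployment.environment=my-environment"

The runtime command inside the container image is updated the same way as in the non-Kubernetes sketch, by adding the -javaagent flag to the java command.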

Now that the service is instrumented, you can see RED metrics (rate, errors, duration) created automatically, and you can also view every transaction's full trace through your infrastructure.

Next steps

The content in this guide comes from a previously published blog, one of the thousands of Splunk resources available to help users succeed. In addition, these Splunk resources might help you understand and implement this use case: