Set up an Uptime test
An Uptime test lets you make a request to a specified URL or port address and monitor its response time and response code. Uptime tests record three metrics from each run: response time, DNS time, and time to first byte.
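To make these three metrics concrete, here is a minimal Python sketch that measures roughly the same timings against a hypothetical host (example.com is a placeholder). It only illustrates what the metrics represent; Splunk Synthetic Monitoring measures them for you from its test locations.

```python
# Minimal sketch of the three Uptime test metrics, measured with the Python
# standard library against a hypothetical host. Illustrative only.
import http.client
import socket
import time

host = "example.com"  # hypothetical placeholder target

t0 = time.perf_counter()
socket.getaddrinfo(host, 443)        # DNS lookup
dns_time = time.perf_counter() - t0

conn = http.client.HTTPSConnection(host, timeout=10)
t1 = time.perf_counter()
conn.request("GET", "/")             # connect and TLS handshake happen here
resp = conn.getresponse()            # returns once the status line and headers arrive
time_to_first_byte = time.perf_counter() - t1

resp.read()                          # drain the body
response_time = time.perf_counter() - t1
conn.close()

print(f"dns_time={dns_time:.3f}s "
      f"ttfb={time_to_first_byte:.3f}s "
      f"response_time={response_time:.3f}s "
      f"status={resp.status}")
```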
Decide whether you want to set up an HTTP Uptime test or Port Uptime test, and then follow the steps below to set up your test. To learn more about the types of Uptime tests, see Use an Uptime Test to test port or HTTP uptime.
Note
If the site or application you are monitoring uses allow lists or block lists for visitors or an analytics tool to measure traffic, check that it is configured to accommodate traffic from Splunk Synthetic Monitoring. See Configure your site to accommodate synthetic tests for instructions.
Configure an HTTP Uptime test
Follow these steps to set up an HTTP Uptime test:
From the landing page of Splunk Observability Cloud, navigate to Splunk Synthetic Monitoring.
In the Tests section, select Add New Test and select Uptime Test from the drop-down list. This opens the New Uptime Test page.
Make sure the HTTP tab is selected.
In the Name field, enter a name for your test. You will use this name to identify data from the test in your alerts and reports.
In the URL field, paste the URL for the page you want to test, including http or https.
As you build your test, you can use Try now to check that the configuration of your test is valid. Run results aren't stored. For more, see Validate your test configuration with try now.
Configure a Port Uptime test
Follow these steps to set up a Port Uptime test:
From the landing page of Splunk Observability Cloud, navigate to Splunk Synthetic Monitoring.
In the Tests section, select Add New Test and select Uptime Test from the drop-down list. This opens the New Uptime Test page.
Select the Port tab.
In the Name field, enter a name for your test. You will use this name to identify data from the test in your alerts and reports.
In the Address field, use the drop-down list to indicate whether the port you are monitoring follows TCP or UDP protocol. Enter the host and port addresses.
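As a rough illustration of what a TCP Port Uptime test checks, namely whether the host accepts a connection on the port and how quickly, here is a minimal Python sketch. The host and port are hypothetical placeholders; Splunk Synthetic Monitoring runs the actual check from the locations you select.

```python
# Minimal sketch of a TCP port check: attempt a connection and time it.
# The host and port are hypothetical placeholders.
import socket
import time

host, port = "example.com", 443

start = time.perf_counter()
try:
    with socket.create_connection((host, port), timeout=10):
        elapsed = time.perf_counter() - start
        print(f"TCP connect to {host}:{port} succeeded in {elapsed:.3f}s")
except OSError as err:
    print(f"TCP connect to {host}:{port} failed: {err}")
```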
Customize your test details
Use these steps to customize your test configuration and finish creating your test:
In the Locations field, enter the locations from which you want to test the URL. You can select one or multiple locations.
In the Device Type field, use the list to select the device from which you'd like to conduct the test.
In the Frequency field, select your desired test frequency from the list.
(Optional) Use the Round Robin selector to switch between options: enabling Round Robin means your test cycles through your selected locations one at a time, while disabling Round Robin runs the test from all selected locations concurrently at your selected frequency.
If you want to receive alerts from this test, select + Create detector to set up a detector on the test. Use the dialog box to customize your detector.
Select Submit. This redirects you to the Test History page for your new test. If you've just created the test, allow at least one test frequency interval for your test to begin collecting synthetic data.
(Optional) Select Edit test or the three-dot Actions menu in the row for your test to edit, pause, duplicate, or delete this test.
See also
See Public locations to learn more about the locations from which you can run your test.
See Set up detectors and alerts in Splunk Synthetic Monitoring to learn more about detector options.
View your Uptime test
Now that you've created and saved a test, check whether it's collecting data as expected:
From the Tests list, select the three-dot Actions menu and select the Play arrow icon to manually trigger a live run of the test, or wait at least one test frequency interval so that the test has time to run and collect data.
Select the test you're interested in to open the Test History view, where you can view visualizations of recent test results and metrics.
Interpret your Uptime test results
See Interpret Uptime test results for an overview of run-level Uptime test results.
Advanced settings for Uptime tests
There are many reasons why you might want to configure advanced settings for your synthetics tests. Here are a few:
Accessing a site with a modal that appears randomly and interrupts the flow of the test. For example, a marketing modal might prompt a user to sign up for a rewards program. To circumvent this issue, you can set a cookie to stop the popup modal from appearing and interfering with your test.
Running a test on a site that requires users to log in to access the site.
Specifying the type of device on which you want to run your test by setting the User-Agent header on requests.
Testing out a CDN. For example, you might want to load the HTML page in the browser, but rewrite the hosts for some or all requests to a new host.
Filtering out requests from analytics on the back end by sending a specific header in the requests.
Running a test on a pre-production site that has a self-signed certificate.
Select an HTTP method
Select an HTTP method and add a payload.
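For context, here is a minimal Python sketch of the kind of request the test sends when you select POST and add a JSON payload. The endpoint URL and payload are hypothetical; in Splunk Synthetic Monitoring you configure both in the test form.

```python
# Minimal sketch of a POST request with a JSON payload, similar to what an
# Uptime test sends when you select the POST method and add a payload.
# The URL and payload are hypothetical placeholders.
import json
import urllib.error
import urllib.request

url = "https://example.com/api/health"   # hypothetical endpoint
body = json.dumps({"check": "uptime"}).encode("utf-8")

req = urllib.request.Request(
    url,
    data=body,
    method="POST",
    headers={"Content-Type": "application/json"},
)
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print("status:", resp.status)
except urllib.error.HTTPError as err:
    # A non-2xx response still shows that the endpoint answered.
    print("status:", err.code)
```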
Set custom headers
Specify custom headers to send with each request. For example, you can send a specific header with every request so that your back end can filter synthetic requests out of its analytics. You can also use custom headers to set cookies.
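As an illustration, the sketch below sends a custom header and a cookie with a request. The header name X-Synthetic-Test and the cookie hide_promo_modal are hypothetical examples; your back end or site would need to recognize whatever names and values you choose.

```python
# Minimal sketch of sending a custom header and a cookie with a request.
# The header name and cookie value are hypothetical examples.
import urllib.request

req = urllib.request.Request(
    "https://example.com/",              # hypothetical URL
    headers={
        "X-Synthetic-Test": "true",      # lets the back end filter synthetic traffic
        "Cookie": "hide_promo_modal=1",  # example of suppressing a marketing modal
    },
)
with urllib.request.urlopen(req, timeout=10) as resp:
    print("status:", resp.status)
```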
Activate SSL/TLS validation
When activated, this setting enforces SSL/TLS certificate validation, so a run fails if the certificate is expired, has an invalid hostname, or comes from an untrusted issuer. When testing pre-production environments that have self-signed or invalid certificates, it's best to leave this setting deactivated.
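To illustrate what the toggle controls, the following Python sketch contrasts a request with certificate validation enforced against one with validation skipped. The self-signed host is a hypothetical placeholder.

```python
# Minimal sketch contrasting enforced and skipped SSL/TLS certificate validation.
# The self-signed host is a hypothetical placeholder.
import ssl
import urllib.error
import urllib.request

url = "https://self-signed.example.com/"   # hypothetical pre-production host

strict = ssl.create_default_context()      # checks issuer, hostname, and expiry
relaxed = ssl.create_default_context()
relaxed.check_hostname = False
relaxed.verify_mode = ssl.CERT_NONE        # accepts self-signed or invalid certificates

for label, ctx in (("enforced", strict), ("skipped", relaxed)):
    try:
        with urllib.request.urlopen(url, context=ctx, timeout=10) as resp:
            print(f"validation {label}: status {resp.status}")
    except urllib.error.URLError as err:
        print(f"validation {label}: request failed ({err.reason})")
```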
Add assertions
You can make an assertion on two values. Add two parameters along with the comparison that you would like to perform between the two. There are three types of comparisons: string, numeric, and regular expression. For string and numeric comparisons, values are coerced to the comparison type before the comparison is made. For a regular expression comparison, the first parameter is a string and the second parameter is a regular expression. An assertion step fails if the assertion is false when the step runs.
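The following minimal Python sketch shows the three comparison types with hypothetical values; in an Uptime test you enter the two parameters and choose the comparison in the assertion form instead.

```python
# Minimal sketch of the three assertion comparison types: string, numeric,
# and regular expression. The values are hypothetical.
import re

# String comparison: both parameters are coerced to strings before comparing.
assert str(200) == "200"

# Numeric comparison: both parameters are coerced to numbers before comparing.
assert float("1.5") < 2

# Regular expression comparison: the first parameter is a string and the
# second is a pattern that must match it.
assert re.search(r"^2\d\d$", "200") is not None

print("All assertions passed; a false assertion fails the step when it runs.")
```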
Example
For an example, see Scenario: Kai monitors the performance of a user-facing application.