Filters

Filters are built from a field, a comparison operator, and a value, and are added as escaped JSON strings formatted as {"key":"<field>","operator":"<comparison_operator>","value":"<value>"}.

  • Refer to the Log fields page for a list of fields related to each dataset.

  • Comparison operators define how values must relate to fields in the log line for an expression to return true.

  • Values represent the data associated with fields.
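
For example, the following expression matches requests whose ClientRequestHost field is example.com; the same expression appears inside the escaped filter of the API example later on this page:

{"key":"ClientRequestHost","operator":"eq","value":"example.com"}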

The filter field has a limit of approximately 30 operators and 1,000 bytes. Anything exceeding these limits will return an error.

Logical operators

  • Filters can be connected using the AND and OR logical operators.

  • Logical operators can be nested.

Here are some examples of how the logical operators can be combined, where X, Y, and Z represent filter criteria:

  • X AND Y AND Z - {"where":{"and":[{X},{Y},{Z}]}}

  • X OR Y OR Z - {"where":{"or":[{X},{Y},{Z}]}}

  • X AND (Y OR Z) - {"where":{"and":[{X}, {"or":[{Y},{Z}]}]}}

  • (X AND Y) OR Z - {"where":{"or":[{"and": [{X},{Y}]},{Z}]}}
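
As an illustration, here is X AND (Y OR Z) with concrete criteria filled in. The field names and values are examples only; this filter matches requests for the host example.com whose path contains either /static or /assets:

{
  "where": {
    "and": [
      {"key": "ClientRequestHost", "operator": "eq", "value": "example.com"},
      {
        "or": [
          {"key": "ClientRequestPath", "operator": "contains", "value": "/static"},
          {"key": "ClientRequestPath", "operator": "contains", "value": "/assets"}
        ]
      }
    ]
  }
}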

Set filters via API or dashboard

Filters can be set via the API or the Cloudflare dashboard. Using a filter is optional, but if you use one, it must contain the where key.

API

Here is an example API request using cURL:

curl -s -X POST https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/logpush/jobs \
-H 'X-Auth-Key: <KEY>' \
-H 'X-Auth-Email: <EMAIL>' \
-H 'Content-Type: application/json' \
-d '{
  "name": "static-assets",
  "output_options": {
    "field_names": ["ClientIP", "EdgeStartTimestamp", "RayID"],
    "sample_rate": 0.1,
    "timestamp_format": "rfc3339",
    "CVE-2021-44228": true
  },
  "dataset": "http_requests",
  "filter": "{\"where\":{\"and\":[{\"key\":\"ClientRequestPath\",\"operator\":\"contains\",\"value\":\"/static\"},{\"key\":\"ClientRequestHost\",\"operator\":\"eq\",\"value\":\"example.com\"}]}}",
  "destination_conf": "s3://<BUCKET_PATH>?region=us-west-2/"
}' | jq .
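
The escaped filter string in the request above decodes to the following JSON, which keeps only requests for example.com whose path contains /static:

{
  "where": {
    "and": [
      {"key": "ClientRequestPath", "operator": "contains", "value": "/static"},
      {"key": "ClientRequestHost", "operator": "eq", "value": "example.com"}
    ]
  }
}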

Dashboard

To set filters through the dashboard:

  1. Log in to the Cloudflare dashboard and select the domain you want to use.
  2. Go to Analytics & Logs > Logs.
  3. Select Add Logpush job. A modal window will open.
  4. Select the dataset you want to push to a storage service.
  5. Below Select data fields, use the Filter section to set up your filters.
  6. Select a Field, an Operator, and a Value.
  7. Connect additional filters using the AND and OR logical operators.
  8. Select Next to continue setting up your Logpush job.