# Export logs to a Google Cloud Storage bucket

This task is designed to send logs to a Google Cloud Storage bucket.
```yaml
type: "io.kestra.plugin.ee.gcp.gcs.LogExporter"
```

## Ship logs to GCP

```yaml
id: log_shipper
namespace: company.team

triggers:
  - id: daily
    type: io.kestra.plugin.core.trigger.Schedule
    cron: "@daily"

tasks:
  - id: log_export
    type: io.kestra.plugin.ee.core.log.LogShipper
    logLevelFilter: INFO
    lookbackPeriod: P1D
    logExporters:
      - id: GCPLogExporter
        type: io.kestra.plugin.ee.gcp.gcs.LogExporter
        projectId: myProjectId
        format: JSON
        maxLinesPerFile: 10000
        bucket: my-bucket
        logFilePrefix: kestra-log-file
        chunk: 1000
```
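The example above does not set any credentials explicitly. If you want to pass a service account key to the exporter directly, a minimal sketch of the `logExporters` block could look like the following; the secret name `GCP_SERVICE_ACCOUNT` is a placeholder, and the `serviceAccount` property is described under Properties below.

```yaml
# Sketch only: passing a service account key stored as a Kestra secret.
# The secret name GCP_SERVICE_ACCOUNT is a placeholder, not from the docs.
logExporters:
  - id: GCPLogExporter
    type: io.kestra.plugin.ee.gcp.gcs.LogExporter
    projectId: myProjectId
    bucket: my-bucket
    serviceAccount: "{{ secret('GCP_SERVICE_ACCOUNT') }}"
```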
## Properties

### bucket
* Validation RegExp: `^[a-zA-Z0-9][a-zA-Z0-9_-]*`
* Min length: 1

GCS bucket to upload log files to. The bucket to which the log files will be uploaded.

### chunk
* Default: `1000`

The chunk size for every bulk request.
### format
* Default: `JSON`
* Possible values: `ION`, `JSON`

The format of the exported files.
### impersonatedServiceAccount

The GCP service account to impersonate.
### logFilePrefix
* Default: `kestra-log-file`

The prefix of the log file names. The full file name will be `logFilePrefix-localDateTime.json` (or `.ion`, depending on the format).
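For illustration, the sketch below sets a custom prefix; the resulting object name in the comment follows the `logFilePrefix-localDateTime` pattern described above, but the exact timestamp formatting is an assumption rather than something stated in this documentation.

```yaml
# Sketch only: customizing the prefix inside a logExporters entry.
# With format: JSON, exported files follow the logFilePrefix-localDateTime.json
# pattern described above, e.g. roughly audit-logs-2025-01-01T00:00:00.json
# (the exact timestamp formatting may differ).
logFilePrefix: audit-logs
format: JSON
```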
### maxLinesPerFile
* Default: `100000`

The maximum number of lines per file.
### projectId

The GCP project ID.
### scopes
* SubType: string
* Default: `["https://www.googleapis.com/auth/cloud-platform"]`

The GCP scopes to be used.
### serviceAccount

The GCP service account key.
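As a sketch of how the authentication-related properties above could be combined, the snippet below uses service account impersonation instead of a key. The service account email is a placeholder, and the snippet is an illustration rather than an official example.

```yaml
# Sketch only: authenticating via service account impersonation.
# The email below is a placeholder for a real service account.
logExporters:
  - id: GCPLogExporter
    type: io.kestra.plugin.ee.gcp.gcs.LogExporter
    projectId: myProjectId
    bucket: my-bucket
    impersonatedServiceAccount: "log-exporter@myProjectId.iam.gserviceaccount.com"
    scopes:
      - "https://www.googleapis.com/auth/cloud-platform"
```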