
AWS S3

Follow the steps below to aggregate your applications' logs into an Amazon S3 bucket.


Create a Log Aggregation Endpoint Integration in Rafay

  • Log in to the Rafay Console as a Project Admin.
  • Click Integrations > Aggregation Endpoints.
  • Click "New Endpoint".
  • Provide a Name and select "AWS S3" from the Type dropdown.
  • Enter the "Bucket name".
  • (Optional) Enter the "Prefix".
  • Provide the AWS Access key and Secret used to access this S3 bucket. Ensure this credential has read and write permissions on the bucket (see the policy sketch below for one way to grant them).
  • Select the AWS Region from the dropdown list.

Create S3 Endpoints
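
If you need to create or verify the credential's access, the snippet below is a minimal sketch of a policy that grants read and write on the bucket, expressed as a CloudFormation YAML fragment. The bucket name (my-rafay-log-bucket), IAM user name (rafay-log-writer), and exact action list are assumptions; adapt them to however you normally manage IAM and to your bucket and prefix.

Resources:
  # Hypothetical managed policy attached to the IAM user whose access
  # key and secret are entered in the Rafay console (names are placeholders)
  RafayLogEndpointPolicy:
    Type: AWS::IAM::ManagedPolicy
    Properties:
      Users:
        - rafay-log-writer
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Action:
              - s3:ListBucket          # read: list objects in the bucket
            Resource: arn:aws:s3:::my-rafay-log-bucket
          - Effect: Allow
            Action:
              - s3:GetObject           # read objects
              - s3:PutObject           # write log objects
            Resource: arn:aws:s3:::my-rafay-log-bucket/*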


Use The Log Aggregation Endpoint in Rafay's Wizard Workload

To send logs from applications published via Rafay's Wizard Workload to the log endpoint created above, select the endpoint name from the "Logs endpoint" dropdown under Workloads > Containers > Logs Aggregation.

Select S3 Endpoint


Annotations for YAML or Helm

Workloads based on Helm or Kubernetes YAML can use the Rafay-supported annotation below to send application logs to the log endpoint created above:

  annotations:
    rafay.dev/logging: "<log_endpoint_name>"
Here is an example YAML manifest for a Deployment with the Rafay annotation for log integration to AWS S3.

apiVersion: apps/v1
kind: Deployment
metadata:
  name: nginx-s3-logging
  annotations:
    rafay.dev/logging: s3-logs
spec:
  selector:
    matchLabels:
      app: nginx-s3-logging
  replicas: 2
  template:
    metadata:
      labels:
        app: nginx-s3-logging
    spec:
      containers:
      - name: nginx-s3-logging
        image: nginx:latest
        ports:
        - containerPort: 80
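
For Helm-based workloads, the same annotation can usually be injected through the chart's values, provided the chart exposes a value for workload or pod annotations. The value key below (commonAnnotations) and the endpoint name (s3-logs) are assumptions; check your chart for its equivalent setting.

# Hypothetical values.yaml override for a chart that exposes a
# commonAnnotations value (the key name varies from chart to chart)
commonAnnotations:
  rafay.dev/logging: "s3-logs"   # name of the log aggregation endpoint created above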

View The Application Logs In AWS S3

Once the application logs have been shipped from the Rafay-managed Kubernetes clusters to the S3 bucket, you should be able to see them in the bucket.

Logs In S3 Bucket