Logging in AWS
Deploying your application or infrastructure in Amazon Web Services (AWS)? Stream your logs to highlight to see everything in one place. Most AWS services support streaming logs via Fluent Forward, though the exact configuration differs by service. See the AWS documentation to learn more.
Check out the following examples of setting up log streaming in these services:
AWS ECS Containers
To stream your container logs to highlight from an ECS Fargate task, we recommend running a fluent-bit agent alongside your application container; highlight accepts AWS FireLens logs via the [Fluent Forward](https://docs.fluentbit.io/manual/pipeline/outputs/forward/) protocol.
Here's a sample task definition (based on the AWS docs) containing a dummy app container with a fluent-bit log router configured alongside it.
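As a sketch of what that task definition might look like (the image name, highlight endpoint host/port, and tag format below are assumptions to confirm against the highlight and AWS FireLens docs; `YOUR_PROJECT_ID` is a placeholder):

```json
{
  "family": "app-with-firelens",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "256",
  "memory": "512",
  "containerDefinitions": [
    {
      "name": "log_router",
      "image": "amazon/aws-for-fluent-bit:stable",
      "essential": true,
      "firelensConfiguration": { "type": "fluentbit" }
    },
    {
      "name": "app",
      "image": "my-app:latest",
      "essential": true,
      "logConfiguration": {
        "logDriver": "awsfirelens",
        "options": {
          "Name": "forward",
          "Host": "otel.highlight.io",
          "Port": "24224",
          "tag": "highlight.project_id=YOUR_PROJECT_ID"
        }
      }
    }
  ]
}
```

With this setup, FireLens routes the app container's stdout/stderr through the fluent-bit sidecar, which forwards it to highlight over Fluent Forward.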
AWS Kinesis Firehose for logs from infrastructure or other services
Let's say you are running RDS (Postgres) or MSK (Kafka) services that are core infrastructure for your application, and you want to search and browse their logs. The best way to export such infrastructure logs is via AWS Kinesis Data Firehose shipping to our HTTP logs endpoint.
First, create a Kinesis Data Stream.
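As a minimal sketch using the AWS CLI (the stream name is a placeholder; a single shard is enough for modest log volume):

```shell
# Create a Kinesis Data Stream to receive CloudWatch log records.
aws kinesis create-stream \
  --stream-name highlight-logs \
  --shard-count 1
```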
Next, create a Kinesis Data Firehose with an HTTP destination to route data to highlight.
Configure your Kinesis Data Firehose to ship logs to the HTTP endpoint https://pub.highlight.io/v1/logs/firehose, enabling GZIP content encoding and passing the parameter
x-highlight-project with your highlight project ID.
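The Firehose step above can be sketched with the AWS CLI as follows (the account ID, region, role and bucket ARNs, and `YOUR_PROJECT_ID` are placeholders; the S3 configuration is the backup destination Firehose requires):

```shell
# Create a Firehose delivery stream that reads from the Kinesis Data Stream
# and delivers to highlight's HTTP endpoint with GZIP encoding.
aws firehose create-delivery-stream \
  --delivery-stream-name highlight-logs-firehose \
  --delivery-stream-type KinesisStreamAsSource \
  --kinesis-stream-source-configuration \
    "KinesisStreamARN=arn:aws:kinesis:us-east-1:123456789012:stream/highlight-logs,RoleARN=arn:aws:iam::123456789012:role/firehose-role" \
  --http-endpoint-destination-configuration '{
    "EndpointConfiguration": {
      "Url": "https://pub.highlight.io/v1/logs/firehose",
      "Name": "highlight"
    },
    "RequestConfiguration": {
      "ContentEncoding": "GZIP",
      "CommonAttributes": [
        {"AttributeName": "x-highlight-project", "AttributeValue": "YOUR_PROJECT_ID"}
      ]
    },
    "S3Configuration": {
      "RoleARN": "arn:aws:iam::123456789012:role/firehose-role",
      "BucketARN": "arn:aws:s3:::my-firehose-backup-bucket"
    }
  }'
```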
Finally, connect your AWS CloudWatch Log Stream to the Kinesis Data Stream via a Kinesis Subscription Filter.
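The subscription filter can be created like so (the log group name, ARNs, and role are placeholders; the role must allow CloudWatch Logs to write to the stream):

```shell
# Subscribe a CloudWatch log group (e.g. RDS Postgres logs) to the Kinesis stream.
aws logs put-subscription-filter \
  --log-group-name "/aws/rds/instance/my-db/postgresql" \
  --filter-name "highlight-logs" \
  --filter-pattern "" \
  --destination-arn "arn:aws:kinesis:us-east-1:123456789012:stream/highlight-logs" \
  --role-arn "arn:aws:iam::123456789012:role/cwl-to-kinesis-role"
```

An empty `--filter-pattern` forwards every log event in the group; set a pattern if you only want a subset.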
If you have any questions with your setup, don't hesitate to reach out!