
Serverless querying and evaluating of Logs using Athena

May 23rd 2022 · 4 min read

Logging and monitoring are important components of IT in any company. Without logging it's harder to troubleshoot technical issues and nearly impossible to gain operational insights and intelligence.

When using AWS it is best practice to store the logs, regardless of whether they are application or governance logs (such as config, firewall or management logs), centrally and securely in an account used only for this purpose.

But how do you cost-effectively search for and evaluate problems in all those logs? That's where Amazon Athena comes in! Athena is an interactive query service that facilitates the analysis of data in Amazon S3 using standard SQL. Athena does not require a server, so there is no infrastructure to manage and you only pay for the queries that are executed.

While implementing Athena with centralised logs in AWS Organizations you may stumble across some challenges:

  • Querying multiple accounts requires partitions to be created per account and day
  • When partitioning by date or time, new partitions must be added continuously - i.e. daily per account and region

To speed up query processing of highly partitioned tables and fully automate partition management you can use partition projection in Athena. With partition projection, partition values and locations are calculated from the configuration rather than read from a repository such as the AWS Glue Data Catalog. Because in-memory operations are often faster than remote operations, partition projection can reduce query runtime for highly partitioned tables. Depending on the specific characteristics of the query and the underlying data, partition projection can significantly reduce query runtime for queries that are limited to retrieving partition metadata.
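For orientation, the projection configuration in the tables below maps onto the S3 key layout that AWS uses when delivering logs centrally. Bucket name, organization ID, account ID, region and dates below are placeholders:

s3://S3BucketName/AWSLogs/111122223333/vpcflowlogs/eu-west-1/2022/05/01/
s3://S3BucketName/AWSLogs/o-exampleorgid/111122223333/CloudTrail/eu-west-1/2022/05/01/

Athena computes the account, region and date parts of these keys from the projection settings instead of looking them up in the Glue Data Catalog.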

Blog Content

In the following, I will show you how to create Athena tables for some of the most important AWS services.

VPC FlowLogs

VPC Flow Logs is a feature that enables you to capture information about the IP traffic going to and from network interfaces in your VPC. You can use these logs to examine network traffic patterns and identify threats and risks on your VPC network.

🚨 VPC FlowLogs are delivered with a non-Hive compatible S3 prefix scheme, so you cannot add partitions using the MSCK REPAIR TABLE command.

💡 If you want to know how to query the VPC FlowLogs take a look at this documentation.

Here you can see how to create an Athena table for your central VPC FlowLogs:

CREATE EXTERNAL TABLE IF NOT EXISTS vpc_flow_logs_pp (
  version int,
  accountid string,
  interfaceid string,
  sourceaddress string,
  destinationaddress string,
  sourceport int,
  destinationport int,
  protocol int,
  numpackets int,
  numbytes bigint,
  starttime int,
  endtime int,
  action string,
  logstatus string
)
PARTITIONED BY (
   `account` string,
   `region` string,
   `timestamp` string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ' '
LOCATION
  's3://S3BucketName/AWSLogs/'
TBLPROPERTIES (
  'projection.account.values'='AccountIdsCommaSeparated',
  'projection.account.type'='enum',
  'projection.enabled'='true',
  'projection.timestamp.format'='yyyy/MM/dd',
  'projection.timestamp.interval'='1',
  'projection.timestamp.interval.unit'='DAYS',
  'projection.timestamp.range'='2020/01/01,NOW',
  'projection.timestamp.type'='date',
  'projection.region.type'='enum',
  'projection.region.values'='RegionsCommaSeparated',
  'storage.location.template'='s3://S3BucketName/AWSLogs/${account}/vpcflowlogs/${region}/${timestamp}',
  'skip.header.line.count'='1');
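Once the table exists you can query it with standard SQL. A hypothetical example that lists the most frequently rejected connections for one account, region and day (all filter values are placeholders):

SELECT sourceaddress, destinationport, count(*) AS rejected
FROM vpc_flow_logs_pp
WHERE account = '111122223333'
  AND region = 'eu-west-1'
  AND timestamp = '2022/05/01'
  AND action = 'REJECT'
GROUP BY sourceaddress, destinationport
ORDER BY rejected DESC
LIMIT 10;

Filtering on the projected partition columns (account, region, timestamp) keeps the amount of scanned data - and therefore the query cost - low.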

AWS CloudTrail

AWS CloudTrail is a governance, compliance, operations and risk monitoring service. CloudTrail provides an event history of your AWS account activity, including actions performed via the AWS Management Console, AWS SDKs, command line tools and other AWS services. This event history simplifies security analysis, resource change tracking and troubleshooting. Moreover, you can use CloudTrail to detect anomalous activity in your AWS accounts.

💡 If you want to know how to query the AWS CloudTrail Logs take a look at this documentation.

Here you can see how to create an Athena table for your central CloudTrail Logs:

CREATE EXTERNAL TABLE cloudtrail_logs_pp(
    eventVersion STRING,
    userIdentity STRUCT<
        type: STRING,
        principalId: STRING,
        arn: STRING,
        accountId: STRING,
        invokedBy: STRING,
        accessKeyId: STRING,
        userName: STRING,
        sessionContext: STRUCT<
            attributes: STRUCT<
                mfaAuthenticated: STRING,
                creationDate: STRING>,
            sessionIssuer: STRUCT<
                type: STRING,
                principalId: STRING,
                arn: STRING,
                accountId: STRING,
                userName: STRING>>>,
    eventTime STRING,
    eventSource STRING,
    eventName STRING,
    awsRegion STRING,
    sourceIpAddress STRING,
    userAgent STRING,
    errorCode STRING,
    errorMessage STRING,
    requestParameters STRING,
    responseElements STRING,
    additionalEventData STRING,
    requestId STRING,
    eventId STRING,
    readOnly STRING,
    resources ARRAY<STRUCT<
        arn: STRING,
        accountId: STRING,
        type: STRING>>,
    eventType STRING,
    apiVersion STRING,
    recipientAccountId STRING,
    serviceEventDetails STRING,
    sharedEventID STRING,
    vpcEndpointId STRING
  )
PARTITIONED BY (
   `account` string,
   `region` string,
   `timestamp` string)
ROW FORMAT SERDE 'com.amazon.emr.hive.serde.CloudTrailSerde'
STORED AS INPUTFORMAT 'com.amazon.emr.cloudtrail.CloudTrailInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  's3://S3BucketName/AWSLogs/OrgIdIfOrgTrailIsUsed'
TBLPROPERTIES (
  'projection.account.values'='AccountIdsCommaSeparated',
  'projection.account.type'='enum',
  'projection.enabled'='true',
  'projection.timestamp.format'='yyyy/MM/dd',
  'projection.timestamp.interval'='1',
  'projection.timestamp.interval.unit'='DAYS',
  'projection.timestamp.range'='2020/01/01,NOW',
  'projection.timestamp.type'='date',
  'projection.region.type'='enum',
  'projection.region.values'='RegionsCommaSeparated',
  'storage.location.template'='s3://S3BucketName/AWSLogs/OrgIdIfOrgTrailIsUsed/${account}/CloudTrail/${region}/${timestamp}');
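As with the FlowLogs table, queries should restrict the projected partition columns. A hypothetical example that finds failed console logins on a given day (the date is a placeholder):

SELECT useridentity.arn, sourceipaddress, eventtime, errorcode
FROM cloudtrail_logs_pp
WHERE timestamp = '2022/05/01'
  AND eventname = 'ConsoleLogin'
  AND errorcode IS NOT NULL
LIMIT 100;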