
      Send Amazon ELB Classic Logs to Loggly

      You can push your Amazon Elastic Load Balancer (ELB) Classic logs to Loggly using an AWS Lambda script. It converts the ELB logs written to S3 into JSON format and then sends them to Loggly. ELB (Application) logs are not supported at this time.

      Alternatively, you may use our S3 ingestion service, which ingests ELB Classic logs into Loggly directly without requiring a Lambda function. Our app pack for ELB contains popular dashboards and saved searches; it currently supports only logs ingested through our S3 ingestion service, not this Lambda script.

      AWS Setup

      1. Get the Lambda Code

      Clone the git repo

      git clone https://github.com/cboscolo/elb2loggly.git
      cd elb2loggly

      Edit elb2loggly.js with your Loggly customer token and optional log tags. (You can also set these as tags on the S3 bucket that contains the logs.) Install the required npm packages.

      npm install

      Zip up your code

      zip -r elb2loggly.zip elb2loggly.js node_modules

      The resulting zip (elb2loggly.zip) is what you will upload to AWS in step 2 below.

      2. Configure the Lambda Function

      Go to the AWS Lambda Console and click the “Create a Lambda function” button (choose “Upload a .ZIP file”). Fill in the following details; an equivalent AWS CLI sketch follows the list.

      Name: elb2loggly
      Upload: the Lambda function zip file you made in Step 1 above
      Handler*: elb2loggly.handler
      Role*: In the drop-down, click “S3 execution role”. (This opens a new window to create the role; click Allow.)
      Memory: 128 MB
      Timeout: 10 seconds
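
      If you prefer to work from the command line, the same configuration can be applied with the AWS CLI. The sketch below assumes the AWS CLI is configured; the role ARN and Node.js runtime version are placeholders you should replace with your own values.

      # Minimal sketch: create the function from the CLI instead of the console.
      # The role ARN and runtime below are placeholders -- use the S3 execution
      # role you created and a Node.js runtime supported in your account.
      aws lambda create-function \
        --function-name elb2loggly \
        --runtime nodejs18.x \
        --handler elb2loggly.handler \
        --zip-file fileb://elb2loggly.zip \
        --role arn:aws:iam::123456789012:role/lambda-s3-execution-role \
        --memory-size 128 \
        --timeout 10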
      

      Next, configure an event source to call elb2loggly when logs are added to your S3 bucket. Go to the AWS Lambda Console, make sure the elb2loggly Lambda function is selected, then click ‘Actions -> Add event source’ and fill in the following details (an equivalent AWS CLI sketch follows the list):

      Event source type: S3
      Bucket: Choose the bucket that contains your ELB logs.
      Event type: ObjectCreated (All)
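
      If you would rather script this step, a rough AWS CLI equivalent is to grant S3 permission to invoke the function and then attach a bucket notification. The account ID, region, and bucket name below are placeholders.

      # Allow the log bucket to invoke elb2loggly, then wire up the notification.
      # Replace the account ID, region, and bucket name with your own values.
      aws lambda add-permission \
        --function-name elb2loggly \
        --statement-id elb-logs-s3-invoke \
        --action lambda:InvokeFunction \
        --principal s3.amazonaws.com \
        --source-arn arn:aws:s3:::my-elb-log-bucket

      aws s3api put-bucket-notification-configuration \
        --bucket my-elb-log-bucket \
        --notification-configuration '{
          "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:elb2loggly",
            "Events": ["s3:ObjectCreated:*"]
          }]
        }'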

      3. Configure ELB Logging

      Go to the EC2 Management Console, under ‘Load Balancers’. Choose your ELB, scroll down to Access Logs, and click Edit. Set the interval to 5 minutes and the S3 location to the bucket where you want to store your logs.
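
      The same access log settings can also be applied with the AWS CLI; the load balancer and bucket names below are placeholders.

      # Enable Classic ELB access logs with a 5-minute emit interval.
      # "my-classic-elb" and "my-elb-log-bucket" are placeholder names.
      aws elb modify-load-balancer-attributes \
        --load-balancer-name my-classic-elb \
        --load-balancer-attributes '{
          "AccessLog": {
            "Enabled": true,
            "S3BucketName": "my-elb-log-bucket",
            "EmitInterval": 5
          }
        }'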

      4. Tag Your Logs in S3

      The Lambda script looks for your customer token as an S3 tag, which it uses to send data to your account. It also adds tags for Loggly, which make the logs easier to find in search. Using the S3 Management Console, click the bucket that contains your ELB logs. Under Properties -> Tags, add the following tags:

      Key: loggly-customer-token
      Value: TOKEN
      
      Key: loggly-tag
      Value: aws-elb 
      

      Replace TOKEN with your Loggly customer token. (An equivalent AWS CLI command for tagging the bucket is sketched below.)
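
      If you prefer the command line, the tags can be set with the AWS CLI as well. Note that put-bucket-tagging replaces the bucket's entire tag set, so include any tags already on the bucket; the bucket name below is a placeholder.

      # Tag the log bucket from the CLI. put-bucket-tagging overwrites the
      # existing tag set, so include any other tags the bucket already has.
      aws s3api put-bucket-tagging \
        --bucket my-elb-log-bucket \
        --tagging 'TagSet=[{Key=loggly-customer-token,Value=TOKEN},{Key=loggly-tag,Value=aws-elb}]'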

      5. Verify Events

      Search your Loggly events with the tag aws-elb over the past 20 minutes (a command-line check is also sketched below). It may take a few minutes to index the events. If it doesn’t work, see the troubleshooting section below.

      tag:aws-elb
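
      Besides searching in the Loggly UI, you can check from the command line with Loggly's search API. The sketch below assumes basic authentication; SUBDOMAIN, USER, and PASSWORD are placeholders for your account, and the second call uses the rsid value returned by the first.

      # Run the search; the response contains an rsid.
      curl -s -u USER:PASSWORD \
        "https://SUBDOMAIN.loggly.com/apiv2/search?q=tag:aws-elb&from=-20m&until=now"

      # Fetch the matching events using the rsid.id value from the response above.
      curl -s -u USER:PASSWORD \
        "https://SUBDOMAIN.loggly.com/apiv2/events?rsid=RSID_FROM_ABOVE"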

      Troubleshooting

      If you don’t see any data show up in the verification step, then check for these common problems.

        • Wait a few minutes in case indexing needs to catch up.
        • Make sure you’ve included your own customer token.
        • Make sure you have configured the same roles as mentioned above.
        • Go to your Lambda function in the AWS Console and click “View logs in CloudWatch” in the Monitoring tab to view its logs (a CLI sketch for tailing these logs follows this list).
        • If you still do not see your logs, search for the “error” field with the tag aws-elb over the past 20 minutes.
        • Search or post your own Amazon ELB logging questions in the community forum.
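
      With AWS CLI version 2, you can also tail the function's CloudWatch logs directly from the terminal. The sketch below assumes the log group follows Lambda's standard /aws/lambda/<function-name> naming.

      # Tail the Lambda function's CloudWatch logs (requires AWS CLI v2).
      aws logs tail /aws/lambda/elb2loggly --since 20m --follow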

       

      Learn how Loggly can help with all your AWS Log Management.
