5 min
IT Ops
The Role of Log Files in Experiments
You have heard, no doubt, of the Lean Startup
[http://www.amazon.com/The-Lean-Startup-Entrepreneurs-Continuous/dp/0307887898/].
If you need a refresher to place the name, it’s a book, but it’s also a
business trend with such momentum as to have a website advertising it as a
“movement” [http://theleanstartup.com/]. And, frankly, that advertisement is
hardly a stretch. The title and the terms coined in it are on everyone’s lips
in the tech industry these days because people at companies of all s
5 min
IT Ops
TypeScript Language Primer
What is TypeScript?
TypeScript is an open source typed superset of JavaScript
[https://logentries.com/doc/javascript/] which compiles to plain JavaScript.
Many people consider JavaScript’s loose typing a problem, and TypeScript now
offers a solution. Specifically, TypeScript lets you code with
decorators/annotations, classes, interfaces, private properties and
compile-time type checking.
We might also say that TypeScript is ES6 with some extra options.
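To make those features concrete, here is a minimal sketch (not taken from any Logentries post) showing an interface, a class with a private property, and type annotations; compiling it with tsc emits plain JavaScript:

// A minimal sketch of the features listed above: an interface, a class with a
// private property, and type annotations.
interface Logger {
  log(message: string): void;
}

class ConsoleLogger implements Logger {
  // 'private' is enforced by the TypeScript compiler, not at runtime.
  private prefix: string;

  constructor(prefix: string) {
    this.prefix = prefix;
  }

  log(message: string): void {
    console.log(`${this.prefix} ${message}`);
  }
}

const logger: Logger = new ConsoleLogger("[app]");
logger.log("TypeScript compiles this to plain JavaScript");
// logger.log(42); // rejected by the compiler: argument is not a string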
What does TypeScript do?
8 min
IT Ops
IOT made real - Using TI SensorTag data with Logentries and NodeRed
Learn how to send IoT data from the TI CC2650 SensorTag to Logentries (using
Node-RED).
This is the first of a series of IoT blogs that show you how easy it can be to
integrate a range of real sensor devices with Logentries and how to use the data
from those devices once it is in Logentries. This follows the earlier blog
[/2014/12/end-to-end-iot-monitoring-with-log-data/] showing why a centralised
logging service would be useful for IoT developers and users. This series of
blogs will show ju
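The post itself builds the pipeline in Node-RED; purely as an illustration of the underlying idea (formatting a sensor reading as a log line and shipping it with a log token), here is a rough TypeScript sketch. The host, port, and token below are placeholders and assumptions rather than values from the post:

// Illustrative sketch only: the post uses Node-RED flows. The endpoint, port
// and token format are assumptions about Logentries' token-based TCP input;
// check the Logentries docs for the actual values.
import * as net from "net";

const LOGENTRIES_HOST = "data.logentries.com"; // assumed ingestion host
const LOGENTRIES_PORT = 10000;                 // assumed plain-TCP port
const LOG_TOKEN = "YOUR-LOG-TOKEN";            // placeholder token

interface SensorReading {
  device: string;
  temperature: number; // degrees Celsius
  humidity: number;    // percent
}

function sendReading(reading: SensorReading): void {
  const socket = net.createConnection(LOGENTRIES_PORT, LOGENTRIES_HOST, () => {
    // Token-based inputs expect one line per event, prefixed with the token.
    const line = `${LOG_TOKEN} ${JSON.stringify(reading)}\n`;
    socket.end(line);
  });
  socket.on("error", (err) => console.error("send failed:", err.message));
}

sendReading({ device: "cc2650-01", temperature: 21.4, humidity: 48.2 });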
5 min
IT Ops
Troubleshooting with Nexpose Logs
Nexpose [http://www.rapid7.com/products/nexpose/index.jsp] is the industry
standard in Vulnerability Management, giving you the confidence you need to
understand your ever-changing attack surface, focus on what matters, and create
better security outcomes.
Table of contents
* Where are the Nexpose logs located?
* Setting up the Logentries Agent
 * Analyzing the logs
   * mem.log
   * nsc.log
   * auth.log
* Get started
-------------------------------
2 min
IT Ops
Using Logs for Security & Compliance: Part 3
This 3-part series explores the critical role logs play in maintaining
regulatory compliance and provides specific examples of known events to look
for and how to evaluate different compliance tools.
--------------------------------------------------------------------------------
[http://info.logentries.com/using-logs-to-address-compliance-standards]
When it comes to PCI Compliance
[https://www.rapid7.com/solutions/compliance/pci-dss/], simply collecting and
storing your logs isn’t enough.
5 min
IT Ops
Considering the Explosive Growth of Log Analytics
You’d have to be living in a cave to not know that the practice of log analytics
in corporate IT has grown dramatically in the last 10 years. This explosion in
logging activity is due to two factors: the maturing of log technology and the
expanded application of logging to new information domains such as tracking
user behavior, page views, and API interactions, to name a few.
As logging technology matures, the price goes down. Getting a
2 min
IT Ops
Using Logs for Security & Compliance: Part 2
This 3-part series explores the critical role logs play in maintaining
regulatory compliance and provides specific examples of known events to look
for and how to evaluate different compliance tools.
--------------------------------------------------------------------------------
[http://info.logentries.com/using-logs-to-address-compliance-standards]
For organizations looking to achieve and maintain PCI compliance, requirements
related to the secure retention of log data are common.
The se
2 min
IT Ops
Using Logs for Security & Compliance: Part 1
This 3-part series explores the critical role logs play in maintaining
regulatory compliance and provides specific examples of known events to look
for and how to evaluate different compliance tools. To download the free 24-page
white paper, click here
[http://info.logentries.com/using-logs-to-address-compliance-standards].
--------------------------------------------------------------------------------
For organizations that need to remain compliant with specific regulatory
standards, requ
2 min
IT Ops
Analyzing ELB Log Data
Thanks to some slick work from our engineering team, we have recently released a
lightweight Python script that will allow you to pull your Elastic Load Balancer
logs from S3 into Logentries.
In this implementation, we use AWS Lambda and leverage the S3 trigger, so the
script only runs when needed.
The full documentation is available here:
https://logentries.com/doc/s3-ingestion-with-lambda/
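The script described above is written in Python; as a hedged sketch of the same pattern in TypeScript, a Lambda handler can react to the S3 event, read the new ELB log object, and forward its lines. The handler name, forwarding step, and error handling here are illustrative assumptions, not the released script:

// Sketch of the S3-trigger pattern: the function runs only when a new ELB log
// object lands in the bucket.
import { S3 } from "aws-sdk";
import { S3Event } from "aws-lambda";

const s3 = new S3();

export async function handler(event: S3Event): Promise<void> {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));

    // Fetch the ELB log object that triggered this invocation.
    const obj = await s3.getObject({ Bucket: bucket, Key: key }).promise();
    const lines = (obj.Body ? obj.Body.toString() : "").split("\n").filter(Boolean);

    // Placeholder: forward each line to Logentries (e.g. over the token-based
    // input); here we only report the count.
    console.log(`Would forward ${lines.length} ELB log lines from ${key}`);
  }
}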
1 min
IT Ops
Introducing a Buildbot status plugin for pushing status updates to Logentries
Buildbot is a framework for building continuous deployment and integration
systems; it is highly flexible and written in Python. It is also a mature
system used by a number of large projects, e.g. Mozilla, Chromium, and Python – see
trac.buildbot.net/wiki/SuccessStories
[http://trac.buildbot.net/wiki/SuccessStories]
To send build status information — specifically Start, Success and Failure
states from Buildbot to Logentries — start by generating a log token from
Logentries. [https://logentri
2 min
IT Ops
How to Log Messages from Slack
We recently added support for unedited HTTP logging in Logentries. This means
you can send us log data via an HTTPS drain (from Heroku), or via any webhook you
want.
One webhook that we’ve been looking to log for a while is Slack
[https://logentries.com/resources/#plug-ins].
People are always chatting away on Slack, and this data might be useful some
day. You can send the data into Logentries however you want, and then worry
about what to do with it when you actually need it!
First, you’ll need to
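As a rough illustration of the idea (not the exact setup from the post), a tiny endpoint can accept Slack's outgoing-webhook POST and turn it into a log line; the field names follow Slack's outgoing-webhook payload, and the forwarding step to Logentries is left as a placeholder:

// Illustrative sketch: receive a Slack outgoing-webhook POST and build a log
// line from it. Shipping the line to Logentries is left as a placeholder.
import * as http from "http";

http.createServer((req, res) => {
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    const params = new URLSearchParams(body); // Slack posts form-encoded data
    const line = `${params.get("channel_name")} ${params.get("user_name")}: ${params.get("text")}`;

    // Placeholder: ship 'line' to Logentries (token-based or webhook input).
    console.log("would log:", line);

    res.writeHead(200);
    res.end();
  });
}).listen(8080);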
3 min
IT Ops
Logentries recognized by Docker as Ecosystem Technology Partner for Logging
Since last year, we’ve anticipated the impact of Docker
[/2014/05/musings-on-the-future-of-docker/] and have been building integrations
– first as experiments
[/2014/03/how-to-run-rsyslog-in-a-docker-container-for-logging/] and later as
full-blown solutions
[/2015/07/an-all-inclusive-log-monitoring-container-for-docker/]. It’s therefore
with great pleasure that we’re announcing our recognition by Docker as an
Ecosystem Technology Partner for Logging.
Why Monitor Docker Logs?
Most teams that
5 min
IT Ops
Analysing Hystrix metrics with Logentries
We’ve been using Hystrix [http://techblog.netflix.com/2012/11/hystrix.html] in
production here at Logentries for over a year now [shameless plug: I briefly
talked about this [https://speakerdeck.com/m0wfo/clojure-ireland-talk-june-2015]
at a Clojure Ireland meetup recently :)] and have found it useful not only for
bulkheading [http://martinfowler.com/bliki/CircuitBreaker.html] requests, but
for getting fine-grained metrics for internal API calls.
Netflix has also open-sourced a funky dashbo
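As an aside, the circuit-breaker pattern linked above can be shown with a minimal sketch; this is not Hystrix's actual API, just the general idea of failing fast after repeated errors and retrying after a cool-down:

// Not Hystrix: a minimal illustration of a circuit breaker. After a threshold
// of consecutive failures the breaker opens and calls fail fast until the
// cool-down period has elapsed.
class CircuitBreaker {
  private failures = 0;
  private openedAt = 0;

  constructor(private threshold = 5, private cooldownMs = 10_000) {}

  async call<T>(fn: () => Promise<T>): Promise<T> {
    if (this.failures >= this.threshold &&
        Date.now() - this.openedAt < this.cooldownMs) {
      throw new Error("circuit open: failing fast");
    }
    try {
      const result = await fn();
      this.failures = 0; // a success closes the circuit again
      return result;
    } catch (err) {
      this.failures += 1;
      if (this.failures >= this.threshold) this.openedAt = Date.now();
      throw err;
    }
  }
}

// Usage: wrap an internal API call (global fetch assumes Node 18+).
const breaker = new CircuitBreaker();
breaker.call(() => fetch("https://example.com/internal-api")).catch(() => {});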
4 min
IT Ops
Introducing LEQL: percentile() & median
While analyzing data, it’s important to use a variety of calculations to ensure
you get the best insights. Today, we’re excited to announce the availability of
our two newest LEQL functions: percentile() and median.
percentile() allows you to calculate the number below which a given percentage
of your log entries fall. To use a real world example, what was the longest
response time for 95% of my application’s users? Similarly, median (or the 50th
Percentile) gives you the middle number in a s
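To make the calculation concrete, here is a rough sketch using the nearest-rank method over response times in milliseconds; LEQL's exact interpolation may differ, this only shows the idea:

// Rough illustration of percentile() and median using the nearest-rank method.
function percentile(values: number[], p: number): number {
  const sorted = [...values].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(rank - 1, 0)];
}

const responseTimes = [120, 95, 310, 150, 88, 240, 175, 130, 99, 2100];
console.log(percentile(responseTimes, 95)); // 95% of requests were at or below this value
console.log(percentile(responseTimes, 50)); // the median response time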
4 min
IT Ops
Unleash the power of node.js for Shell Scripting (Part 2)
Ready for our first proper node.js script!
In a previous post
[/2015/10/20/unleash-the-power-of-node-js-for-shell-scripting-part-1/], we
learned about some tools that helped us create a script in node.js. It is now
time to put this into practice by implementing a script that connects to a few
online newspapers, searches the news for specific keywords, and returns the
matching articles.
Our new script will need to accept the following parameters:
* A file with the list of newspapers (one URL per li
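As a rough sketch of the fetch-and-search idea described above (hypothetical helper names, minimal error handling, and Node 18+'s global fetch assumed), not the post's final script:

// Read newspaper URLs from a file (one per line), fetch each page, and print
// the URLs whose content mentions any of the given keywords.
import { readFileSync } from "fs";

async function searchNewspapers(listFile: string, keywords: string[]): Promise<void> {
  const urls = readFileSync(listFile, "utf8").split("\n").filter(Boolean);
  for (const url of urls) {
    try {
      const page = await (await fetch(url)).text();
      const hits = keywords.filter((k) => page.toLowerCase().includes(k.toLowerCase()));
      if (hits.length > 0) console.log(`${url}: ${hits.join(", ")}`);
    } catch (err) {
      console.error(`failed to fetch ${url}`);
    }
  }
}

// Example: node script.js newspapers.txt budget election
searchNewspapers(process.argv[2], process.argv.slice(3));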