Mike Lindegarde... Online

Things I'm likely to forget.

Using Serilog, Elasticsearch 5, and Kibana 5 for Effective Error Logging

Why use Serilog over NLog


For the longest time I didn't understand why everyone was so excited about Serilog.  I've used NLog for a long time and it seemed more than capable of doing what I needed:  logging messages to some data store (log files, databases, etc...).

Then I started using Elasticsearch.  Suddenly I saw the light.  Structured event data always struck me as one of those neat features that wasn't really needed.  However, once you start using something like Elasticsearch, the power of structured event data quickly becomes evident.

Adding the visualizations offered by Kibana takes your logging to the next level.  A quick glance at a dashboard and you instantly know if there has been an uptick in errors (or whatever you might be logging).  The interactive visualizations allow you to quickly filter out noise and identify the root cause of problems you or your users might be experiencing.

 

Installing the JDK

This part is frequently overlooked.  You need to have Java installed and running on the box you're planning to use as your Elasticsearch server.  For the purposes of this tutorial I'm going to assume that you're going to set things up on a Windows machine.  You'll also need to ensure that the JAVA_HOME environment variable is correctly set.

Step 1: Download the JDK

You can download the current JDK from Oracle.  You'll want to click on the "Java Platform (JDK) 8u111 / 8u112" link.  On the following page download the appropriate package (in my case it's jdk-8u111-windows-x64.exe).  Once the download completes, run the installer and let it do its thing.  In most cases the default options (install location, etc...) are just fine.

Step 2: Setting the JAVA_HOME Environment Variable

Elasticsearch won't run without the JAVA_HOME variable set.  To set it you'll need to access the Advanced System Settings in Windows.  You can do that by:

  1. Click on the Start Menu
  2. Right click on Computer
  3. Select Properties

In the window that appears, select Advanced System Settings in the upper left.  That will bring up a small dialog window with five tabs across the top:

  1. Select Advanced.
  2. Click on Environment Variables...
  3. In the System variables section click New...
  4. For the Variable Name use JAVA_HOME and for the value use the path to your JDK install directory (e.g. C:\Program Files\Java\jdk1.8.0_111), not its bin subdirectory (or use the command line alternative below)
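
If you'd rather skip the dialogs entirely, you can set the variable from an elevated command prompt instead.  This is just a sketch assuming a typical install path; substitute whatever directory your installer actually used:

rem Writes JAVA_HOME to the system-wide (machine) environment
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_111" /M

Keep in mind that setx only affects processes started after the change, so open a fresh console window before testing Elasticsearch.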

 

Setting up Elasticsearch and Kibana

I'm not going to teach you the ins and outs of Elasticsearch.  I'm going to give you just enough information to get things up and running.  Elasticsearch is incredibly powerful and I strongly encourage you to get a book or consult with someone more knowledgeable than me.

Step 1: Download and Install Elasticsearch 

Elastic does a great job walking you through the steps necessary to set up Elasticsearch on any platform.  Rather than repeating that information here, I'll simply point you in the right direction: https://www.elastic.co/downloads/elasticsearch.

Although the download page has some basic installation instructions, I found the instructions in the documentation to be much more helpful.  You can find that information here.

I would strongly recommend setting up Elasticsearch as a service.  However, you'll want to make sure you have Elasticsearch successfully running before setting up the service.  It's much easier to resolve problems when you can see the errors in the console window.
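
For reference, the Elasticsearch 5.x Windows distribution ships with a service wrapper script in its bin directory.  Once Elasticsearch runs cleanly from the console, the following commands (run from your Elasticsearch install directory) register and start the service:

bin\elasticsearch-service.bat install
bin\elasticsearch-service.bat start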

Step 2: Download and Install Kibana

Installing Kibana goes pretty much exactly like installing Elasticsearch.  Simply download the compressed file, decompress to wherever you like, then run the bat file.  You can download Kibana from here:  https://www.elastic.co/downloads/kibana.  You can find better installation documentation here.

If you want to set up Kibana to run as a service you can use the following command in the Windows Console or your preferred terminal (you can see my setup here):

sc create "ElasticSearch Kibana 4.0.1" binPath= "{path to batch file}" depend= "elasticsearch-service-x64"

That handy little line comes to you courtesy of Stack Overflow.

At this point you should be able to verify that Elasticsearch is running at http://localhost:9200 and that Kibana is running at http://localhost:5601 by visiting those URLs in your preferred browser.
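
If you prefer the command line, hitting the Elasticsearch endpoint with curl (or your favorite HTTP client) works just as well:

curl http://localhost:9200

A healthy node responds with a small JSON document listing the node name, cluster name, and version number, along with the "You Know, for Search" tagline.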

 

Using Serilog

As mentioned in the introduction, we'll be using Serilog instead of NLog.  This is so that we can take advantage of the structured data Serilog gives us in our Elasticsearch indexes.  Setting up Serilog with .NET Core is pretty straightforward.

Step 1: Add the Required Packages

Add the following packages to your project.json file:

{
  "dependencies": {
    "Microsoft.NETCore.App": {
      "version": "1.0.0",
      "type": "platform"
    },
    "Swashbuckle": "6.0.0-beta902",
    "Microsoft.ApplicationInsights.AspNetCore": "1.0.0",
    "Microsoft.AspNetCore.Mvc": "1.0.0",
    // removed for length
    "Microsoft.Extensions.Options.ConfigurationExtensions": "1.0.0",
    "Serilog": "2.3.0",
    "Serilog.Extensions.Logging": "1.3.1",
    "Serilog.Sinks.Literate": "2.0.0",
    "Serilog.Sinks.ElasticSearch": "4.1.1"
  },
  // truncated to save space
}

Save your project.json file and let Visual Studio restore the packages.

Step 2: Modify Your Startup.cs

You'll need to modify your Startup.cs in two places: the constructor and the Configure method.  First we'll look at the changes to the constructor:

ElasticsearchConfig esConfig = new ElasticsearchConfig();
Configuration.GetSection("Elasticsearch").Bind(esConfig);

LoggerConfiguration loggerConfig = new LoggerConfiguration()
	.Enrich.FromLogContext()
	.Enrich.WithProperty("Application","App Name")
	.WriteTo.LiterateConsole();

if(esConfig.Enabled)
{
	loggerConfig.WriteTo.Elasticsearch(new ElasticsearchSinkOptions(esConfig.Uri)
	{
		AutoRegisterTemplate = true,
		MinimumLogEventLevel = (LogEventLevel)esConfig.MinimumLogEventLevel,
		CustomFormatter = new ExceptionAsObjectJsonFormatter(renderMessage:true),
		IndexFormat = esConfig.IndexFormat
	});
}

Log.Logger = loggerConfig.CreateLogger();

Looking at the code, you should notice that I'm loading settings from my appsettings.json file.  If you need some help with that you can read my previous post.
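
One thing the code above doesn't show is the ElasticsearchConfig class itself.  Here's a minimal version consistent with how it's used; the property names are inferred from the code above, and the matching "Elasticsearch" section in appsettings.json is an assumption:

using System;

namespace Project.Api.Config
{
    public class ElasticsearchConfig
    {
        // Bound from the "Elasticsearch" section of appsettings.json
        public bool Enabled {get; set;}

        // Binds from a string like "http://localhost:9200"; if binding
        // straight to Uri gives you trouble, store a string and wrap it
        // in new Uri(...) where the sink is configured
        public Uri Uri {get; set;}

        // Cast to LogEventLevel in Startup.cs (0 = Verbose ... 5 = Fatal)
        public int MinimumLogEventLevel {get; set;}

        // For example: "api-logs-{0:yyyy.MM.dd}"
        public string IndexFormat {get; set;}
    }
}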

I've enriched my events in two ways.  First I've configured Serilog to use the LogContext.  For more information take a look at the Serilog documentation here.  The second enrichment simply puts the application name on every event generated by Serilog.

I always want Serilog to write to the console (at least while the application is being developed).  To accomplish that I'm using the LiterateConsole Sink.  If you want to know why I'm using the LiterateConsole over the ColoredConsole you can read more about it here.

Lastly, depending on the value in esConfig.Enabled I'm conditionally setting up the Elasticsearch sink.  You can find all the information about the various configuration options here.  Here is the short version:

  • AutoRegisterTemplate - Tells Serilog to automatically register a template for the indexes it creates using a template optimized for working with Serilog events.
  • MinimumLogEventLevel - Fairly self-explanatory: events below this level won't be written to Elasticsearch.
  • CustomFormatter - In order to avoid deeply nested objects Serilog writes inner exceptions as an array of exceptions.  This can be problematic for visualizations and some queries.  You can change this behavior using the ExceptionAsObjectJsonFormatter.
  • IndexFormat - This is the pattern Serilog will use to generate the indexes it creates.  Typically it's something like "api-logs-{0:yyyy.MM.dd}".  If you do not provide a format Serilog will use its default value.

Finally, modify your Configure method:

public void Configure(
	IApplicationBuilder app, 
	IHostingEnvironment env, 
	ILoggerFactory loggerFactory,
	IApplicationLifetime appLifetime)
{
	// Route events from the ASP.NET Core logging pipeline into Serilog
	loggerFactory.AddSerilog();

	app.UseMvc();
	app.UseSwagger();
	app.UseSwaggerUi();

	// Flush any buffered events before the process exits
	appLifetime.ApplicationStopped.Register(Log.CloseAndFlush);
}

That's it.  Now you're ready to start writing some events to Elasticsearch.

Step 3: Write some exceptions to Elasticsearch

You'll need to use Serilog's ILogger interface wherever you need to log an event.  I tend to use StructureMap as my IoC container instead of the default implementation Microsoft offers.  This means I need to register the interface in my StructureMap configuration:

_.For<Serilog.ILogger>().Use(Log.Logger);

Once that is done, I can easily inject Serilog into any object created via the IoC container (i.e. my controllers).  Writing an event with structured data to Elasticsearch is as simple as making the following call in your code wherever appropriate:

_logger.Error(ex, "Failed to create object requested by {@staff}", _staff);
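
To put that call in context, here's a hypothetical controller; StaffController, the Staff class, and the surrounding details are made up for illustration:

using System;
using Microsoft.AspNetCore.Mvc;

namespace Project.Api.Controllers
{
	public class Staff
	{
		public string Name {get; set;}
	}

	[Route("api/[controller]")]
	public class StaffController : Controller
	{
		private readonly Serilog.ILogger _logger;

		// Resolved through the StructureMap registration shown above
		public StaffController(Serilog.ILogger logger)
		{
			_logger = logger;
		}

		[HttpPost]
		public IActionResult Post([FromBody] Staff staff)
		{
			try
			{
				// ... do the actual work here ...
				return Ok();
			}
			catch (Exception ex)
			{
				// The @ prefix tells Serilog to destructure the object and
				// store its properties as structured data in Elasticsearch
				_logger.Error(ex, "Failed to create object requested by {@staff}", staff);
				return StatusCode(500);
			}
		}
	}
}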

For more information about the features Serilog offers please refer to their documentation.  I encourage you to take advantage of source contexts whenever possible.  Having the SourceContext property in your event data makes filtering a lot easier.

 

Using Kibana

It's taken a while, but you've finally got Elasticsearch set up, Kibana installed and running, and your source code writing events to an Elasticsearch index.  Great... now it's time to start seeing the effort pay off.

Step 1: Set Up Your Index Pattern

If this is the first time you've run Kibana you will most likely be looking at the screen where Kibana asks you to tell it about your index pattern:

The welcome screen

If you recall, back when we set up the Serilog Elasticsearch sink one of the properties we configured was the IndexFormat.  This is the value you'll want to use here, less the date format portion of the string.  If you used "api-logs-{0:yyyy.MM.dd}" for your IndexFormat, then the Index Pattern is "api-logs-*".

With the Index Pattern set you'll want to head over to the Discover tab.

Step 2: Save a Simple Query

Before you can discover anything you'll need to make sure you've logged at least a few events to Elasticsearch.  You'll also want to make sure that they occurred within the time frame you're currently viewing (look in the upper right corner of the Kibana window).  As long as you have some events stored in ES, clicking on Discover should display a window that looks something like this:

Discover tab with no filters

In order to create a visualization you're going to need to save a search.  You can find the full Discover documentation here.  For the purposes of moving forward, we'll save a simple search:

  1. In the Query bar type in: level:Error
  2. Click on the search button (magnifying glass)
  3. Click on Save in the upper right corner
  4. Give the search a slick name like All Errors
  5. Click Save

Step 3: Create a Simple Visualization

With the search saved it's time to move over to the Visualize section of Kibana.

There are several visualizations you can create.  In this example we'll create a simple Vertical Bar Chart using a Date Histogram to group the errors by date and time.  Creating this visualization is pretty straightforward:

  1. Select the Vertical Bar Chart option from the list of visualizations
  2. On the right you can select All Errors from the list of saved searches
  3. In the next window select X-Axis
  4. Under Aggregation choose Date Histogram
  5. Leave all of the default settings
  6. Click on the run button (it looks like a play button)
  7. Click Save in the upper right

Step 4: Build your dashboard

With the visualization saved you can easily add it to your dashboard.  You can find far more information about building dashboards in the official Kibana documentation than I could ever cover here.

 

Configuration using ASP.NET Core 1.0 and StructureMap

First, the bad news

Before .NET Core I used build-specific web.config transforms.  When building MVC apps I took advantage of XML transforms to get build-specific configurations (the obvious examples being Debug vs. Release).  If the project type didn't have transforms out of the box (WPF, for example), I used something like SlowCheetah to handle the XML transform.

While just about every tutorial out there tells you how to set up environment-specific appsettings.json files, I haven't found any information about build-specific application settings.  Hopefully I'm just missing something.  While this isn't a huge loss, it was convenient to be able to select "Debug - Mock API" as my build configuration and have a transform in place to adjust my web.config as necessary.

A Basic Example

Microsoft's new approach to configuration makes it incredibly easy to use strongly typed configurations via the IOptions<T> interface. Let's start with the following appsettings.json file:

{
  "Logging": {
    "UseElasticsearch":  true, 
    "IncludeScopes": false,
    "LogLevel": {
      "Default": "Debug",
      "System": "Information",
      "Microsoft": "Information"
    }
  },
  "IdentityServer": {
    "Authority": "http://localhost:5000",
    "Scope": "Some Scope"
  }
}

In order to take advantage of strongly typed configuration you'll also need a simple POCO (Plain Old CLR Object) that matches the JSON you've added to the appsettings.json file:

namespace Project.Api.Config
{
    public class IdentityServerConfig
    {
        public string Authority {get; set;}
        public string Scope {get; set;}
    }
}

With those two things in place, it's simply a matter of adding the appropriate code to your project's Startup.cs.  The following example code includes several things that are not necessary for this basic example.  My hope is that you might see something that answers a question you may have that I don't explicitly address in this post.

public Startup(IHostingEnvironment env)
{
	var builder = new ConfigurationBuilder()
		.SetBasePath(env.ContentRootPath)
		.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
		.AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
		.AddJsonFile("appsettings.local.json", optional: true);

	builder.AddEnvironmentVariables();
	Configuration = builder.Build();
	
	LoggingConfig loggingConfig = new LoggingConfig();
	Configuration.GetSection("Logging").Bind(loggingConfig);

	LoggerConfiguration loggerConfig = new LoggerConfiguration()
		.Enrich.FromLogContext()
		.Enrich.WithProperty("application","Application Name")
		.WriteTo.LiterateConsole();

	if(loggingConfig.UseElasticsearch)
	{
		loggerConfig.WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
		{
			AutoRegisterTemplate = true,
			CustomFormatter = new ExceptionAsObjectJsonFormatter(renderMessage:true),
			IndexFormat="logs-{0:yyyy.MM.dd}"
		});
	}

	Log.Logger = loggerConfig.CreateLogger();
}

public IServiceProvider ConfigureServices(IServiceCollection services)
{
	services.Configure<IdentityServerConfig>(Configuration.GetSection("IdentityServer"));

	services.AddMvc().AddMvcOptions(options => 
	{
		options.Filters.Add(new GlobalExceptionFilter(Log.Logger));
	});
	
	services.AddSwaggerGen();
	services.ConfigureSwaggerGen();
	services.AddMemoryCache();

	return services.AddStructureMap(Configuration);
}

public void Configure(
	IApplicationBuilder app, 
	IHostingEnvironment env, 
	ILoggerFactory loggerFactory,
	IApplicationLifetime appLifetime)
{
	IdentityServerConfig idSrvConfig = new IdentityServerConfig();
	Configuration.GetSection("IdentityServer").Bind(idSrvConfig);

	loggerFactory.AddSerilog();

	app.UseIdentityServerAuthentication(new IdentityServerAuthenticationOptions
	{
		Authority = idSrvConfig.Authority,
		ScopeName = idSrvConfig.Scope,
		RequireHttpsMetadata = false,
		AutomaticAuthenticate = true,
		AutomaticChallenge = true
	});

	app.UseMvc();
	app.UseSwagger();
	app.UseSwaggerUi();

	appLifetime.ApplicationStopped.Register(Log.CloseAndFlush);
}

Let's take a closer look at what that code is doing...

StructureMap

By default you get Microsoft's IoC container.  While it does the job for simple projects, I much prefer the power that StructureMap gives me.  However, I was having trouble getting IOptions<IdentityServerConfig> properly injected into my controllers. 

The solution to my problem ended up being pretty straightforward.  Just make sure that all of your calls to services.Configure<T> come before your call to container.Populate(services):

// do this:
services.Configure<IdentityServerConfig>(Configuration.GetSection("IdentityServer"));

// before this:
container.Populate(services);

In hindsight that's a pretty obvious thing to do.  StructureMap won't know anything about what you've added to the default IoC container after you call container.Populate(services).
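
For what it's worth, this post never shows the AddStructureMap extension method called from ConfigureServices.  A minimal sketch, assuming the StructureMap.Microsoft.DependencyInjection package, might look something like this:

using System;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using StructureMap;

public static class StructureMapExtensions
{
	public static IServiceProvider AddStructureMap(this IServiceCollection services, IConfigurationRoot configuration)
	{
		// configuration is passed in so your registries can use it if needed
		var container = new Container();

		container.Configure(config =>
		{
			// Pick up default conventions (IFoo -> Foo) from this assembly
			config.Scan(scan =>
			{
				scan.TheCallingAssembly();
				scan.WithDefaultConventions();
			});

			// The Serilog registration mentioned earlier
			config.For<Serilog.ILogger>().Use(Serilog.Log.Logger);
		});

		// Populate only sees what's in the collection right now, which is
		// why every services.Configure<T>() call has to come first
		container.Populate(services);

		return container.GetInstance<IServiceProvider>();
	}
}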

Using your settings

After the configuration has been loaded and StructureMap has been configured you can get access to the values from your appsettings.json file by injecting IOptions<T> (where T would be IdentityServerConfig in my example) into the controller (or whatever class you need).
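
As a quick sketch (the controller is hypothetical), consuming those options looks like this:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Options;
using Project.Api.Config;

[Route("api/[controller]")]
public class AccountController : Controller
{
	private readonly IdentityServerConfig _idSrvConfig;

	public AccountController(IOptions<IdentityServerConfig> options)
	{
		// Value exposes the POCO bound from the "IdentityServer" section
		_idSrvConfig = options.Value;
	}

	[HttpGet]
	public IActionResult Get()
	{
		return Ok(_idSrvConfig.Authority);
	}
}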

That's great, unless you need to access the values in Startup.cs for some reason.  The solution to that problem is to use the following code after the configuration has been loaded (via builder.Build()):

IdentityServerConfig idSrvConfig = new IdentityServerConfig();
Configuration.GetSection("IdentityServer").Bind(idSrvConfig);

While that's pretty simple code, I had some trouble finding that information.

Overriding settings

If you look at the "Logging" section in my appsettings.json you'll notice there is a Boolean value indicating whether or not Elasticsearch should be used.  I have Elasticsearch running locally, but not in the development environment.

{
  "Logging": {
    "UseElasticsearch":  false, 
    "IncludeScopes": false,
    "LogLevel": {
      "Default": "Debug",
      "System": "Information",
      "Microsoft": "Information"
    }
  },
  "IdentityServer": {
    "Authority": "http://localhost:5000",
    "Scope": "Some Scope"
  }
}

To get around that problem I added a Boolean value to my configuration that I can override with a settings file that only exists on my computer:

{
  "Logging": {
    "UseElasticsearch": true
  }
}

Notice that this file only needs to have the values you're overriding.  You can then configure the configuration builder to load the local settings file if it exists:

var builder = new ConfigurationBuilder()
	.SetBasePath(env.ContentRootPath)
	.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
	.AddJsonFile($"appsettings.{env.EnvironmentName}.json", optional: true)
	.AddJsonFile("appsettings.local.json", optional: true);

The order you add the JSON files does matter: values from later files override values from earlier ones.  I always add my local file last so that it overrides everything else.

 

Debugging ASP.NET Core Web APIs with Swagger

Debugging APIs

Debugging .NET based RESTful APIs isn't really that difficult.  Once you have your code base successfully passing all unit tests, it's just a matter of having the right tools and knowing the URLs for all of the endpoints you need to test.  Usually you're doing this to verify that your API works as expected (authentication / authorization, HTTP status codes, location headers, response bodies, etc...)

For a long time now I've been using an excellent Chrome App called Postman.  Postman offers a lot of great features:

  1. Slick user interface
  2. Ability to save API calls as Collections
  3. You can access your Collections from any computer (using Chrome)
  4. It supports Environments (which allow you to setup environment variables)
  5. You can share Collections and Environments
  6. Test automation

So why not just stick with Postman?  Simple: it doesn't lend itself well to exploring an API.  That's not a problem for the API developer (usually); however, it is a problem for third parties looking to leverage your API (be it another team or another company).  Swagger does an excellent job documenting your API and making it much easier for other users to explore and test.

Using Swagger with an ASP.NET Core 1.0 Web API

Like most things in the .NET world, adding Swagger boils down to adding a NuGet package to your project.  I would assume you could still use the NuGet Package Manager Console; however, we'll just add the required package to our project.json file:

dependencies": {
    "Microsoft.NETCore.App": {
      "version": "1.0.0",
      "type": "platform"
    },
    "Swashbuckle": "6.0.0-beta901"
  },

Next you'll need to add a few lines to your Startup.cs file:

public void ConfigureServices(IServiceCollection services)
{
    // Add framework services.
    services.AddApplicationInsightsTelemetry(Configuration);
    services.AddMvc();
    services.AddSwaggerGen();
}

and:

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
    loggerFactory.AddConsole(Configuration.GetSection("Logging"));
    loggerFactory.AddDebug();
 
    app.UseApplicationInsightsRequestTelemetry();
    app.UseApplicationInsightsExceptionTelemetry();
 
    app.UseMvc();
    app.UseSwagger();
    app.UseSwaggerUi();
}

Now you should be able to run your app and explore your API using Swagger by appending /swagger/ui to the Web API's base URL.  It would probably be a good idea to set your project's Launch URL to the Swagger UI's URL.  You can set that by right clicking on your project, selecting Properties, and navigating to the Debug tab.
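
Behind the scenes that setting just ends up in Properties/launchSettings.json.  The relevant piece looks something like this (the profile name will match your own setup):

{
  "profiles": {
    "IIS Express": {
      "commandName": "IISExpress",
      "launchBrowser": true,
      "launchUrl": "swagger/ui"
    }
  }
}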

Security via OperationFilters

In most situations you're going to need to add some sort of Authorization header to your API call.  Fortunately Swashbuckle provides a relatively easy way to add new fields to the Swagger UI.

The following class will take care of adding the Authorization field to the Swagger UI:

public class AuthorizationHeaderParameterOperationFilter : IOperationFilter
{
	public void Apply(Operation operation, OperationFilterContext context)
	{
		var filterPipeline = context.ApiDescription.ActionDescriptor.FilterDescriptors;
		var isAuthorized = filterPipeline.Select(filterInfo => filterInfo.Filter).Any(filter => filter is AuthorizeFilter);
		var allowAnonymous = filterPipeline.Select(filterInfo => filterInfo.Filter).Any(filter => filter is IAllowAnonymousFilter);

		if (isAuthorized && !allowAnonymous)
		{
			if (operation.Parameters == null)
				operation.Parameters = new List<IParameter>();

			operation.Parameters.Add(new NonBodyParameter
			{                    
				Name = "Authorization",
				In = "header",
				Description = "access token",
				Required = false,
				Type = "string"
			});
		}
	}
}

With that in place you simply need to tell Swashbuckle about it in your Startup.cs:

public void ConfigureServices(IServiceCollection services)
{
    // Add framework services.
    services.AddApplicationInsightsTelemetry(Configuration);
    services.AddMvc();
    services.AddSwaggerGen();
    services.ConfigureSwaggerGen(options =>
    {
        options.SingleApiVersion(new Info
        {
            Version = "v1",
            Title = "Sample API",
            Description = "This is a sample API",
            Contact = new Contact
            {
                Name = "Mike",
                Email = "email@example.com"
            }
        });
        options.OperationFilter<AuthorizationHeaderParameterOperationFilter>();
        options.IncludeXmlComments(GetXmlCommentsPath());
        options.DescribeAllEnumsAsStrings();
    });
}
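
Note that GetXmlCommentsPath() isn't part of Swashbuckle; it's a small helper you add to Startup.cs yourself.  A typical implementation (this one assumes you've enabled XML documentation output via "xmlDoc": true under buildOptions in project.json) looks roughly like this:

// Requires: using System.IO; and using Microsoft.Extensions.PlatformAbstractions;
private string GetXmlCommentsPath()
{
    // The XML documentation file is emitted alongside the application binaries
    var app = PlatformServices.Default.Application;
    return Path.Combine(app.ApplicationBasePath, $"{app.ApplicationName}.xml");
}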

If you run your API project you should now see the Authorization field added to the "Try it out!" section of the Swagger UI for the selected end point.

That's all there is to it.  You now have a self-documenting API that is both easy to explore and easy to test using the Swagger UI.  To add even more value to the Swagger UI you should look into using the attributes and XML documentation support that Swashbuckle offers.

Git Your Own Hub

The setup


Cloud-based version control has two major benefits: it makes version control readily accessible to everyone, and it provides an offsite backup.  Several cloud-based providers are available: GitHub, BitBucket, Visual Studio Team Services, Google Cloud Platform, and CodePlex, to name a few.  However, most (if not all) of them have their limitations.  For the most part, if you want to keep your repositories private, you'll end up paying at some point.

Update

GitHub has changed its plans so that all paid plans now include unlimited private repositories.  When I first started using GitLab, GitHub limited you to just a few private repositories on most of its reasonably priced plans.  However, if you are a large company with several developers, or you simply don't like the idea of your code being hosted on someone else's server, GitLab is still a solid alternative to GitHub.

Enter GitLab

GitLab is a pretty solid repository manager.  It offers most of the features that the better-known options offer, and you can host it locally or use GitLab.com.  If you have a Linux box available, use it.  Otherwise you can leverage VirtualBox to set up an Ubuntu VM.

Installation

GitLab's website does an excellent job of walking you through the installation process.  Rather than trying to reinvent the wheel, I'm just going to direct you to their website: https://about.gitlab.com/downloads/.

 

It's been a while...

You were probably beginning to think that I was one of those flash-in-the-pan bloggers.  Here one day, gone the next.  Nope.

It's been a while, but I'm back.  Some things beyond my control happened.  They sucked, but they're in the past now.  Time to keep moving forward.

My next post will focus on GitLab: an awesome alternative to GitHub if you have the resources to host it locally (and these days, just about everyone should have those resources available in some form or fashion).  After that I'm hoping to move onto some F# posts.  While I've been enamored with the language for some time, I haven't had the chance to really put it to use until now.

I'm also planning to continue my work with Antlr.  According to Google Analytics that's what brings most of you to my blog anyway.

Beyond that we'll just have to wait and see what catches my interest.  I'm doing my best to talk my current client into using a document database for caching.  Not only is it something I want to learn more about, it's something I honestly believe is too valuable to pass up.  Why hammer your relational DB when you can just grab the data from a cache or a secondary data store that has the data in the format you need without having to do four hundred joins?

Thanks for stopping by.  I'll have that GitLab post up soon (unless things start to go south again).