
I am new to Serilog and I am trying to work out how to send serialized JSON to the console with log level and date/time fields included. There doesn't seem to be any info on this in the docs under structured data.

Here is my code, which is called from Startup.cs:

private void LoggerLoop(ILogger<Startup> logger)
{
    RabbitModel rb = new RabbitModel
    {
        Id = 1,
        DeviceNum = 1,
        DeviceName = "Device 1",
        InputNum = 1,
        InputName = "Input 1",
        InputState = 1,
        OnPhrase = "On",
        OffPhrase = "Off",
        When = "2020-01-01T22:45:00.1124303+00:00"
    };

    while (true)
    {
        logger.LogInformation("{@rb}", rb);
        Thread.Sleep(1000);
    }
}

And here is my output:

[14:28:22 INF] {"Id": 1, "DeviceNum": 1, "DeviceName": "Device 1", "InputNum": 1, "InputName": "Input 1", "InputState": 1, "OnPhrase": "On", "OffPhrase": "Off", "When": "2020-01-01T22:45:00.1124303+00:00", "$type": "RabbitModel"}

I did notice that it has added a $type field, and I wondered whether it is possible for the [14:28:22 INF] part (the timestamp and log level) to be added to the JSON as well?
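
The closest I have got so far is swapping the plain console output for a JSON formatter on the sink itself. This is only a sketch of what I mean, assuming the Serilog.Sinks.Console package and the built-in Serilog.Formatting.Json.JsonFormatter; I haven't confirmed this is the intended approach:

using Serilog;
using Serilog.Formatting.Json;

// Each event is written as one JSON object that already contains
// "Timestamp" and "Level" fields alongside the destructured properties.
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Information()
    .WriteTo.Console(new JsonFormatter())
    .CreateLogger();

Log.Information("{@rb}", rb);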

  • Why do you need that data in JSON? Commented Feb 4, 2020 at 14:58
  • I especially want the log level and the rest in JSON because I want to be able to use it with Elasticsearch Commented Feb 4, 2020 at 15:11
  • Use the Elasticsearch sink for that. It will automatically convert all log data to JSON (Elastic format). You can write directly to Elastic, or write to the console with ElasticsearchJsonFormatter, but then you need something that will collect the logs and send them to Elastic. That can be Fluentd if you use containers Commented Feb 4, 2020 at 15:15
  • Thanks, yes I am running everything from containers. I was planning on sending JSON to Filebeat via Docker containers, which then sends that data to Elasticsearch. Commented Feb 4, 2020 at 15:56
  • I'll describe our flow: according to the twelve-factor app, all logs are written to stdout (console) in Elasticsearch format; then we have a Kubernetes cluster with Fluentd installed, which collects all logs from containers (by filter) and sends them to Elastic. All logs are structured, which improves searching in Kibana Commented Feb 4, 2020 at 17:07

1 Answer


According to the twelve-factor app methodology, an application should write all logs to stdout/stderr.

Then you need to collect all the logs together and route them to one or more final destinations for viewing (Elasticsearch). Open-source log routers (such as Fluent Bit, Fluentd and Logplex) are available for this purpose.

So the app never concerns itself with routing or storage of its logs. In a .NET app you can easily achieve this with Serilog.

Let's say we have the following logger settings in appsettings.json

"Logging": {
    "OutputFormat": "console",
    "MinimumLevel": "Information"
}
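
In Production the same section would presumably just switch the output format; only the "elasticsearch" value below comes from the answer, and the file name appsettings.Production.json is an assumption:

"Logging": {
    "OutputFormat": "elasticsearch",
    "MinimumLevel": "Information"
}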

We can create a UseLogging extension method and call it when building the web host:

private static IWebHostBuilder CreateWebHostBuilder() =>
    WebHost.CreateDefaultBuilder()
        .UseStartup<Startup>()
        .UseLogging();

This extension can write logs to the console in either plain text or Elasticsearch format. Plain-text logs are useful during development because they are more human-readable. In Production we enable the Elasticsearch format and view all logs only in Kibana.
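
For completeness, the entry point that would typically sit next to this builder might look like the following; the original answer does not show Program.cs, so this is just a sketch:

public static void Main(string[] args) =>
    CreateWebHostBuilder().Build().Run();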

The code of the extension method, with comments:

public static IWebHostBuilder UseLogging(this IWebHostBuilder webHostBuilder, string applicationName = null) =>
    webHostBuilder
        .UseSetting("suppressStatusMessages", "True") // disable startup logs
        .UseSerilog((context, loggerConfiguration) =>
        {
            var logLevel = context.Configuration.GetValue<string>("Logging:MinimumLevel"); // read level from appsettings.json
            if (!Enum.TryParse<LogEventLevel>(logLevel, true, out var level))
            {
                level = LogEventLevel.Information; // or set default value
            }

            // get application name from appsettings.json
            applicationName = string.IsNullOrWhiteSpace(applicationName) ? context.Configuration.GetValue<string>("App:Name") : applicationName;

            loggerConfiguration.Enrich
                .FromLogContext()
                .MinimumLevel.Is(level)
                .MinimumLevel.Override("Microsoft", LogEventLevel.Warning)
                .MinimumLevel.Override("System", LogEventLevel.Warning)
                .Enrich.WithProperty("Environment", context.HostingEnvironment.EnvironmentName)
                .Enrich.WithProperty("ApplicationName", applicationName);

            // read other Serilog configuration
            loggerConfiguration.ReadFrom.Configuration(context.Configuration);

            // get output format from appsettings.json. 
            var outputFormat = context.Configuration.GetValue<string>("Logging:OutputFormat");
            switch (outputFormat)
            {
                case "elasticsearch":
                    loggerConfiguration.WriteTo.Console(new ElasticsearchJsonFormatter());
                    break;
                default:
                    loggerConfiguration.WriteTo.Console(
                        theme: AnsiConsoleTheme.Code,
                        outputTemplate: "[{Timestamp:yy-MM-dd HH:mm:ss.sssZ} {Level:u3}] {Message:lj} <s:{Environment}{ApplicationName}/{SourceContext}>{NewLine}{Exception}");
                    break;
            }
        });
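
As a usage example, the kind of call that produces the sample entry below might look like this; the message template and controller name are taken from the output, but the surrounding code is a reconstruction:

// e.g. in CustomerController.Get, with an injected ILogger<CustomerController>
_logger.LogInformation("Get customer by id: {CustomerId}", id);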

When OutputFormat is elasticsearch, the log output will look like this:

{"@timestamp":"2020-02-07T16:02:03.4329033+02:00","level":"Information","messageTemplate":"Get customer by id: {CustomerId}","message":"Get customer by id: 20","fields":{"CustomerId":20,"SourceContext":"Customers.Api.Controllers.CustomerController","ActionId":"c9d77549-bb25-4f87-8ea8-576dc6aa1c57","ActionName":"Customers.Api.Controllers.CustomerController.Get (Customers.Api)","RequestId":"0HLTBQP5CQHLM:00000004","RequestPath":"/v1/customers","CorrelationId":"daef8849b662117e","ConnectionId":"0HLTBQP5CQHLM","Environment":"Development","ApplicationName":"API","Timestamp":"2020-02-07T14:02:03.4329033Z"}}

In the other case (plain text, useful only for local debugging):

[20-02-07 13:59:16.16Z INF] Get customer by id: 20

Then you should configure a log router to collect the logs from the container and send them to Elasticsearch.

If all logs are structured, searching and creating indexes in Kibana becomes much easier.


8 Comments

Thanks, you wouldn't happen to know the appsettings.json equivalent of ElasticsearchJsonFormatter()? At the moment I have "formatter": "Serilog.Formatting.Json.JsonFormatter, Serilog". I have installed Serilog.Formatting.Elasticsearch from NuGet.
What do you mean? Yes, you need to install the Serilog.Formatting.Elasticsearch package
I have just posted this on their github: github.com/serilog/serilog-sinks-elasticsearch/issues/314. I am configuring Serilog from the appsettings.json and not newing up ElasticsearchJsonFormatter() in code. What would this be? (See the sketch after these comments.)
I'm getting an error on this line applicationName = string.IsNullOrWhiteSpace(applicationName) ? context.Configuration.GetValue<string>("App:Name") : applicationName;
Just remove this line if you don't need it. See updated answer
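
Regarding the appsettings.json question in the comments above: with Serilog.Settings.Configuration a formatter is referenced by its assembly-qualified type name, so the Elasticsearch equivalent would presumably look something like this (an unverified sketch; the surrounding keys are just the usual console-sink shape, and only the formatter value is the point):

"Serilog": {
    "WriteTo": [
        {
            "Name": "Console",
            "Args": {
                "formatter": "Serilog.Formatting.Elasticsearch.ElasticsearchJsonFormatter, Serilog.Formatting.Elasticsearch"
            }
        }
    ]
}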
