Mamemaki.EventFlow.Outputs.BigQuery.Storage 1.0.1

.NET CLI:
    dotnet add package Mamemaki.EventFlow.Outputs.BigQuery.Storage --version 1.0.1

Package Manager (run inside the Visual Studio Package Manager Console, which provides the NuGet module's Install-Package):
    NuGet\Install-Package Mamemaki.EventFlow.Outputs.BigQuery.Storage -Version 1.0.1

PackageReference (for projects that support it, copy this XML node into the project file):
    <PackageReference Include="Mamemaki.EventFlow.Outputs.BigQuery.Storage" Version="1.0.1" />

Paket CLI:
    paket add Mamemaki.EventFlow.Outputs.BigQuery.Storage --version 1.0.1

Script & Interactive (the #r directive works in F# Interactive and Polyglot Notebooks):
    #r "nuget: Mamemaki.EventFlow.Outputs.BigQuery.Storage, 1.0.1"

Cake:
    // Install Mamemaki.EventFlow.Outputs.BigQuery.Storage as a Cake Addin
    #addin nuget:?package=Mamemaki.EventFlow.Outputs.BigQuery.Storage&version=1.0.1
    // Install Mamemaki.EventFlow.Outputs.BigQuery.Storage as a Cake Tool
    #tool nuget:?package=Mamemaki.EventFlow.Outputs.BigQuery.Storage&version=1.0.1

Mamemaki.EventFlow.Outputs.BigQuery.Storage

Introduction

This is a custom output for Microsoft.Diagnostics.EventFlow that inserts events into Google BigQuery using the BigQuery Storage Write API.

The general flow of the process:

  1. Convert Microsoft.Diagnostics.EventFlow events (EventData) into an array of Google BigQuery Storage protobuf messages (AppendRowsRequest) using your mapper class.
  2. If TableId contains date format strings, expand them.
  3. If AutoCreateTable is enabled, create the table if it does not exist.
  4. Call BigQueryWriteClient.AppendRowsStream.WriteAsync() to stream the rows into BigQuery (see the sketch below).
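
As a hedged sketch of what step 4 amounts to at the Google.Cloud.BigQuery.Storage.V1 API level (not the package's actual code; EventRecord and eventRecord refer to the message type generated in the walkthrough below, and the <projectId>/<datasetId>/<tableId> segments are placeholders):

    using Google.Cloud.BigQuery.Storage.V1;
    using Google.Protobuf;

    // Append serialized protobuf rows to the table's default write stream.
    var client = await BigQueryWriteClient.CreateAsync();
    var appendStream = client.AppendRows();
    await appendStream.WriteAsync(new AppendRowsRequest
    {
        WriteStream = "projects/<projectId>/datasets/<datasetId>/tables/<tableId>/_default",
        ProtoRows = new AppendRowsRequest.Types.ProtoData
        {
            // The writer schema comes from the generated message's descriptor.
            WriterSchema = new ProtoSchema { ProtoDescriptor = EventRecord.Descriptor.ToProto() },
            Rows = new ProtoRows { SerializedRows = { eventRecord.ToByteString() } },
        },
    });
    await appendStream.WriteCompleteAsync();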

Getting Started

To get started quickly, you can create a simple console application in Visual Studio as described below, or just download and run the PlayGround project.

0. Prepare your Google Cloud project to use BigQuery

1. Install NuGet packages

PM> Install-Package Microsoft.Diagnostics.EventFlow.Inputs.Trace
PM> Install-Package Mamemaki.EventFlow.Outputs.BigQuery.Storage
PM> Install-Package Microsoft.Diagnostics.EventFlow.Outputs.StdOutput

2. Add "eventFlowConfig.json"

Add a JSON file named "eventFlowConfig.json" to your project and set the file's Copy to Output Directory property to "Copy if newer". Set the content of the file to the following:

{
  "inputs": [
    {
      "type": "Trace",
      "traceLevel": "Warning"
    }
  ],
  "outputs": [
    {
      "type": "StdOutput"
    },
    {
      "type": "BigQueryStorage",
      "projectId": "xxxxxx-nnnn",
      "datasetId": "xxxxxxxx",
      "tableId": "from_eventflow_{yyyyMMdd}",
      "tableSchemaFile": ".\\tableSchema.json",
      "mapperQualifiedTypeName": "<YourMapperClassTypeName>, <AssemblyName>",
      "autoCreateTable": true
    }
  ],
  "schemaVersion": "2016-08-11",
  "extensions": [
    {
      "category": "outputFactory",
      "type": "BigQueryStorage",
      "qualifiedTypeName": "Mamemaki.EventFlow.Outputs.BigQuery.Storage.BigQueryStorageOutputFactory, Mamemaki.EventFlow.Outputs.BigQuery.Storage"
    }
  ]
}

Replace the projectId, datasetId, and mapperQualifiedTypeName values to match your environment.

3. Add "tableSchema.json"

Add a JSON file named "tableSchema.json" to your project and set the file's Copy to Output Directory property to "Copy if newer". Set the content of the file to the following:

[
  {
    "name": "Timestamp",
    "type": "TIMESTAMP",
    "mode": "REQUIRED"
  },
  {
    "name": "Message",
    "type": "STRING",
    "mode": "REQUIRED"
  }
]

4. Create a protobuf message .proto file

Add a file named "event_record.proto" to your project. Set the content of the file to the following:

syntax = "proto2";

message EventRecord {
  required int64 Timestamp = 1;
  required string Message = 2;
}
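
Note that the Storage Write API matches proto field names to the BigQuery column names defined in tableSchema.json, and a TIMESTAMP column is written as an int64 value in epoch microseconds, which is why EventRecord declares Timestamp as int64.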

5. Generate the protobuf message C# code

Generate the C# code for the protobuf message with the following command:

protoc -I=./ --csharp_out=./ --csharp_opt=file_extension=.pb.g.cs ./event_record.proto

The protoc command can be installed via the NuGet package Google.Protobuf.Tools.

6. Create an implementation of IProtobufMessageMapper

Create a class that implements IProtobufMessageMapper to map EventData to the protobuf message used in AppendRowsRequest. The class implementation is as follows:

    using Google.Protobuf;
    using Google.Protobuf.Reflection;
    using Mamemaki.EventFlow.Outputs.BigQuery.Storage;
    using Microsoft.Diagnostics.EventFlow;

    internal class EventRecordProtobufMessageMapper : IProtobufMessageMapper
    {
        public DescriptorProto GetDescriptorProto()
        {
            // Descriptor of the generated EventRecord message; it serves as
            // the writer schema for the Storage Write API.
            return EventRecord.Descriptor.ToProto();
        }

        public IMessage Map(EventData eventEntry)
        {
            var eventRecord = new EventRecord();
            eventRecord.Timestamp = eventEntry.Timestamp.ToLocalTime().ToUnixTimeMicroseconds();
            if (eventEntry.TryGetPropertyValue("Message", out var message))
                eventRecord.Message = message as string;
            else
                return null;    // Discard events that carry no message
            return eventRecord;
        }
    }

ToUnixTimeMicroseconds() is a custom extension method.
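
The package does not ship this helper, so here is a minimal sketch of what such an extension method could look like (the name matches the call above; the implementation is an assumption):

    using System;

    internal static class DateTimeOffsetExtensions
    {
        private static readonly DateTimeOffset UnixEpoch =
            new DateTimeOffset(1970, 1, 1, 0, 0, 0, TimeSpan.Zero);

        // Assumed implementation: microseconds since the Unix epoch.
        // One tick is 100 ns, so 10 ticks make one microsecond.
        public static long ToUnixTimeMicroseconds(this DateTimeOffset value)
        {
            return (value - UnixEpoch).Ticks / 10;
        }
    }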

Replace mapperQualifiedTypeName in eventFlowConfig.json with your class type name and assembly name, e.g. PlayGround.EventRecordProtobufMessageMapper, PlayGround

7. Create an EventFlow pipeline

Create an EventFlow pipeline in your application code using the code below. Run your application and see your traces in the console output and in Google BigQuery.

    using (var pipeline = DiagnosticPipelineFactory.CreatePipeline("eventFlowConfig.json"))
    {
        System.Diagnostics.Trace.TraceWarning("EventFlow is working!");
        Console.WriteLine("Press any key to exit...");
        Console.ReadKey(intercept: true);
    }

Query result in the Google BigQuery console:

[Screenshot: query result in the Google BigQuery console]

Configuration Details

Output parameters

  • projectId (required): Project id of Google BigQuery.
  • datasetId (required): Dataset id of Google BigQuery.
  • tableId (required): Table id of Google BigQuery. A string enclosed in braces is expanded through DateTime.ToString(), e.g. "accesslog_{yyyyMMdd}" ⇒ accesslog_20181231.
  • timeZoneId (optional, default UTC): Time zone id used when expanding the table id.
  • tableSchemaFile (required): JSON file that defines the Google BigQuery table schema.
  • mapperQualifiedTypeName (required): The mapper class type name, e.g. "PlayGround.EventRecordProtobufMessageMapper, PlayGround".
  • autoCreateTable (optional, default false): If set to true, check whether the table exists and create it dynamically. See Dynamic table creating.

Authentication

This library uses Application Default Credentials (ADC). You can set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to your service account key. See https://cloud.google.com/docs/authentication/getting-started
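
For local development, one illustrative (not required) approach is to point ADC at a key file before the pipeline is created; the path below is hypothetical:

    // Illustration only: point ADC at a service account key file for local runs.
    // In production, prefer attached service accounts or workload identity.
    Environment.SetEnvironmentVariable(
        "GOOGLE_APPLICATION_CREDENTIALS",
        @"C:\secrets\my-service-account.json");  // hypothetical path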

Your service account needs the following permissions:

  • bigquery.tables.updateData
  • bigquery.tables.create (if autoCreateTable is used)

Table id formatting

tableId accepts DateTime.ToString() format strings for constructing the table id. The table id is formatted at runtime using the time zone specified in the configuration.

For example, if tableId is set to accesslog_{yyyyMM01}, the expanded table ids become accesslog_20181101, accesslog_20181201, and so on.

Note that the timestamp of logs and the date in the table id do not always match, because there is a time lag between collection and transmission of logs.
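
Conceptually, the expansion behaves like the following sketch (illustrative only; the package's internal implementation may differ):

    using System;
    using System.Text.RegularExpressions;

    // Sketch: expand each "{...}" in the table id through DateTime.ToString(),
    // evaluated in the configured time zone.
    static string ExpandTableId(string tableId, TimeZoneInfo timeZone)
    {
        var now = TimeZoneInfo.ConvertTime(DateTimeOffset.UtcNow, timeZone);
        return Regex.Replace(tableId, @"\{([^}]+)\}",
            m => now.ToString(m.Groups[1].Value));
    }

    // ExpandTableId("accesslog_{yyyyMMdd}", TimeZoneInfo.Utc)
    //   => "accesslog_20181231" (when run on 2018-12-31 UTC)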

Table schema

There is one way to describe the schema of the target table:

  1. Load a schema file in JSON.

With this method, set tableSchemaFile to the path of the JSON-encoded schema file that you used to create the table on BigQuery. See table schema for detailed information.

Example:

[
  {
    "name": "Timestamp",
    "type": "TIMESTAMP",
    "mode": "REQUIRED"
  },
  {
    "name": "FormattedMessage",
    "type": "STRING",
    "mode": "REQUIRED"
  }
]

Dynamic table creating

When autoCreateTable is set to true, the output checks whether the table exists before insertion and creates it if it does not. When the table id changes, this sequence runs again. In other words, as long as the table id stays the same, no further existence check occurs.
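
In sketch form (the names are illustrative, not the package's internals), the check is cached per expanded table id:

    // Sketch: re-check table existence only when the expanded table id changes.
    private string _lastCheckedTableId;

    private async Task EnsureTableAsync(string expandedTableId)
    {
        if (expandedTableId == _lastCheckedTableId)
            return;                                    // same id: skip the check
        if (!await TableExistsAsync(expandedTableId))  // hypothetical helper
            await CreateTableAsync(expandedTableId);   // hypothetical helper
        _lastCheckedTableId = expandedTableId;
    }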


Product compatibility (computed target framework versions unless noted):

  • .NET: net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows
  • .NET Core: netcoreapp2.0, netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, netcoreapp3.1
  • .NET Standard: netstandard2.0 (compatible), netstandard2.1 (computed)
  • .NET Framework: net461, net462, net463, net47, net471, net472, net48, net481
  • MonoAndroid: monoandroid
  • MonoMac: monomac
  • MonoTouch: monotouch
  • Tizen: tizen40, tizen60
  • Xamarin.iOS: xamarinios
  • Xamarin.Mac: xamarinmac
  • Xamarin.TVOS: xamarintvos
  • Xamarin.WatchOS: xamarinwatchos

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version  Downloads  Last updated
1.0.1    42         1/21/2025
1.0.0    50         1/20/2025