<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Developing Dane]]></title><description><![CDATA[My software development related bloggings...mostly.]]></description><link>https://developingdane.azurewebsites.net/</link><generator>Ghost 0.9</generator><lastBuildDate>Sat, 04 Apr 2026 03:56:44 GMT</lastBuildDate><atom:link href="https://developingdane.azurewebsites.net/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA["Proof" for the period of a simple pendulum using only dimensional analysis]]></title><description><![CDATA[<p>This is not a mathematical proof for the period of a simple pendulum. For that you can find the basics in any first year physics book. This is just a thought experiment which I think illustrates the power of simple dimensional analysis.</p>

<p>Take the following Simple Pendulum. Its Period (<strong>t</strong></p>]]></description><link>https://developingdane.azurewebsites.net/proof-for-the-period-of-a-simple-pendulum/</link><guid isPermaLink="false">937d34f6-57d0-4d1b-95d8-ed80695d2497</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Sat, 24 Feb 2024 03:55:13 GMT</pubDate><content:encoded><![CDATA[<p>This is not a mathematical proof for the period of a simple pendulum. For that you can find the basics in any first year physics book. This is just a thought experiment which I think illustrates the power of simple dimensional analysis.</p>

<p>Take the following Simple Pendulum. Its Period (<strong>t</strong>) is the time it takes for the bob to sweep through one complete cycle. Let us make the assumption that the value of <strong>t</strong> has a deterministic relation with some arrangement of measurable factors. Under this assumption we list the measurable factors.</p>

<ul>
<li><strong>l</strong> = Length of the string, e.g. meters</li>
<li><strong>m</strong> = Mass of the bob, e.g. kilograms</li>
<li><strong>a</strong> = Angle to which the bob is initially lifted, e.g. radians</li>
<li><strong>g</strong> = Gravitational acceleration, length per time squared, e.g. meters per second squared</li>
</ul>

<p><img src="https://developingdane.azurewebsites.net/content/images/2021/07/simple-pendulum.jpg" alt=""></p>

<p>Only <strong>g</strong> has a dimensional component of time, and its only other component is distance. Of the remaining factors, only <strong>l</strong> carries a dimension of distance with which to cancel it, while <strong>m</strong> introduces a dimension of mass no other factor can cancel and <strong>a</strong> is dimensionless. Thus, dimensional analysis indicates <strong>m</strong> and <strong>a</strong> cannot be related to <strong>t</strong>.</p>

<p>Since we know the period must have the dimension of time, it's a trivial task to arrange the remaining factors <strong>l</strong> and <strong>g</strong> into a formula that produces a time result. If we use seconds (s) and meters (m) as our units of time and length respectively, we have:</p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2023/09/t1-1.png" alt=""></p>

<p>Removing unit labels and adding a constant of proportionality (<strong>k</strong>) we are left with:</p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2023/09/t2.png" alt=""></p>

<p>If we were to assume our new formula was valid we could experimentally measure the value of <strong>k</strong> and that value would resolve to approximately 2Pi. Thus, using only simple dimensional analysis we've generated the classical formula for the period of a Simple Pendulum.</p>
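<p>As a quick numeric sanity check (the values here are illustrative, not from the post): a one meter pendulum under Earth gravity of 9.81 meters per second squared should have a period of roughly two seconds.</p>

<pre><code class="language-csharp">double l = 1.0;                     // length of the string, meters
double g = 9.81;                    // gravitational acceleration, m/s^2
double t = 2 * Math.PI * Math.Sqrt(l / g);
Console.WriteLine($"t = {t:F3} s"); // t = 2.006 s
</code></pre>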

<p><img src="https://developingdane.azurewebsites.net/content/images/2023/09/t3.png" alt=""></p>]]></content:encoded></item><item><title><![CDATA[Exploring Stowage]]></title><description><![CDATA[<p>I originally came across <a href="https://github.com/aloneguid/stowage">Stowage</a> on the blog aggregator <a href="https://dotnet.libhunt.com/">Awesome .NET</a>. Its github repo has a description which reads "Stowage is a bloat-free .NET cloud storage kit that supports at minimum THE major cloud providers." That caught my attention because I'd been contemplating how best to modernize and standardize file</p>]]></description><link>https://developingdane.azurewebsites.net/exploring-stowage/</link><guid isPermaLink="false">890258e7-7cd7-470f-8402-51ef5d75db5a</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Sun, 27 Mar 2022 03:45:18 GMT</pubDate><content:encoded><![CDATA[<p>I originally came across <a href="https://github.com/aloneguid/stowage">Stowage</a> on the blog aggregator <a href="https://dotnet.libhunt.com/">Awesome .NET</a>. Its github repo has a description which reads "Stowage is a bloat-free .NET cloud storage kit that supports at minimum THE major cloud providers." That caught my attention because I'd been contemplating how best to modernize and standardize file storage operations in an older .NET application for some time.</p>

<h4 id="onward">Onward</h4>

<p>I created the <a href="https://github.com/DaneVinson/StowageExplorer">StowageExplorer</a> project to take Stowage for a spin. While adding the <a href="https://www.nuget.org/packages/Stowage/">NuGet package</a> I discovered Stowage is .NET 6 only. That ruled it out for my original intent but then I found it had ZERO dependencies. I absolutely loved that so I continued.</p>

<p>Stowage provides a simple abstraction around file/directory operations and built-in implementations of that abstraction for Local Disk, In-Memory, AWS S3, Azure Blob Storage, Google Cloud Storage and Databricks storage systems. Stowage's primary abstraction is the interface <code>IFileStorage</code> and implementations of that interface are available for all the previously mentioned storage systems. In general an instance of an <code>IFileStorage</code> implementation is obtained as follows:</p>

<pre><code class="language-csharp">using (IFileStorage fileStorage = Files.Of.{storage_provider}(/* provider-specific arguments */))  
{
   await fileStorage.{action_method};
}
</code></pre>

<p>Factory methods are available through <code>Files.Of</code> which return instances of each of the <code>IFileStorage</code> implementations previously listed. For example the Azure Blob Storage implementation might be generated as follows:</p>

<pre><code class="language-csharp">var fileStorage = Files.Of.AzureBlobStorage("{account}", "{key}");  
</code></pre>

<p>An instance of <code>IFileStorage</code> exposes methods for all common read/write operations against the underlying file storage type and all the methods are <code>async</code>. With this abstraction tasks such as copying files from one place to another are standardized and the developer has no need to concern himself with the details of the underlying implementations. Following are some example usages:</p>

<pre><code class="language-csharp">// Get a list of all files in a store recursively
var files = await fileStorage.Ls(path, true, cancellationToken);

// Write a text file
await fileStorage.WriteText("ThisIsMyFileName.txt", "This is the body of the file.");

// Copy a file from one file store to another
await using (var sourceStream = await fileStorage1.OpenRead(sourcePath, cancellationToken))
await using (var targetStream = await fileStorage2.OpenWrite(targetPath, writeMode, cancellationToken))
{
    await sourceStream.CopyToAsync(targetStream, cancellationToken);
}

// Delete a file
await fileStorage.Rm("ThisIsMyFileName.txt");  
</code></pre>

<p>The <code>IFileStorage</code> interface is a straightforward and elegant abstraction for working with files, but ideally the instantiation of <code>IFileStorage</code> objects would be decoupled from their specific implementations in code. For example, suppose a developer (Bilbo) wishes to get a list of files in the root folder of an application's primary cloud storage (Cloud1). To do so he'd need to know that the storage provider for Cloud1 is Azure Blob Storage so that he could call the method <code>Files.Of.AzureBlobStorage</code>. In this scenario Bilbo should not need to know which provider hosts Cloud1. He just needs a list of files, and ideally he should only need to know the name (or key) of the provider he wishes to access. Specific details regarding the type of storage host and its required options should be resolved by configuration and dependency injection. As an example of how this might be accomplished with Stowage I created the class library <a href="https://github.com/DaneVinson/StowageExplorer/tree/main/SE.Domain">SE.Domain</a>.</p>

<p>The use case for SE.Domain is to supply a fictitious application with the three different <code>IFileStorage</code> providers it needs to function correctly: Cloud1, LocalStorage and TempStorage. Additionally, these providers must target two different underlying storage hosts, i.e. Azure Blob Storage and local disk.</p>

<p>We can satisfy this use case with reasonable simplicity by defining the following interface.</p>

<pre><code class="language-csharp">public interface IStorageOptions  
{
    string Name { get; }
}
</code></pre>

<p>We then create implementations of <code>IStorageOptions</code> for each of our two required storage hosts.</p>

<pre><code class="language-csharp">public class AzureStorageOptions : IStorageOptions  
{
    public string AccountName { get; set; } = string.Empty;
    public string ContainerName { get; set; } = string.Empty;
    public string Key { get; set; } = string.Empty;
    public string Name { get; set; } = string.Empty;
}

public class LocalStorageOptions : IStorageOptions  
{
    public string Root { get; set; } = string.Empty;
    public string Name { get; set; } = string.Empty;
}
</code></pre>

<p>Each of these options classes can then be easily loaded by configuration during application start-up. An example of this can be seen in the <a href="https://github.com/DaneVinson/StowageExplorer/tree/main/SE.ConsoleApp">SE.ConsoleApp</a> application. The <code>appsettings.json</code> file in that project defines the required options as one <code>AzureStorageOptions</code> and two <code>LocalStorageOptions</code>:</p>

<pre><code class="language-json">"AzureStorageOptions": [
  {
    "Name": "Cloud1",
    "AccountName": "account1",
    "ContainerName": "container1",
    "Key": "key1"
  }
],
"LocalStorageOptions": [
  {
    "Name": "LocalStorage",
    "Root": "\\\\fileserver1\\files"
  },
  {
    "Name": "TempStorage",
    "Root": "c:\\temp"
  }
]
</code></pre>

<p>The last piece we need is something to instantiate the specific <code>IFileStorage</code> implementations based on the configuration options, i.e. call the correct <code>Files.Of</code> factory methods. This is the function of the <a href="https://github.com/DaneVinson/StowageExplorer/blob/main/SE.Domain/StorageManager.cs"><code>SE.Domain.StorageManager</code></a> class. <code>StorageManager</code> instantiates <code>IFileStorage</code> objects as defined by the configuration options and manages their lifetimes. It implements <code>IDisposable</code> and is intended to be registered as a singleton. Finally, it allows for access to those instances by name alone, e.g. Cloud1.</p>
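<p>A minimal sketch of how such a manager might be implemented follows. This is an illustration only, not the actual SE.Domain code; it assumes Stowage's <code>Files.Of.AzureBlobStorage</code> and <code>Files.Of.LocalDisk</code> factory methods and, for brevity, ignores the configured container name.</p>

<pre><code class="language-csharp">public class StorageManager : IDisposable  
{
    private readonly Dictionary&lt;string, IFileStorage&gt; _storages;

    public StorageManager(IEnumerable&lt;IStorageOptions&gt; options) =&gt;
        // Index an IFileStorage instance by name for each configured provider.
        _storages = options.ToDictionary(o =&gt; o.Name, CreateFileStorage);

    // Consumers resolve a provider by name alone, e.g. "Cloud1".
    public IFileStorage GetFileStorage(string name) =&gt; _storages[name];

    public void Dispose()
    {
        foreach (var storage in _storages.Values)
        {
            storage.Dispose();
        }
    }

    // Call the appropriate Files.Of factory method for the options type.
    private static IFileStorage CreateFileStorage(IStorageOptions options) =&gt; options switch
    {
        AzureStorageOptions azure =&gt; Files.Of.AzureBlobStorage(azure.AccountName, azure.Key),
        LocalStorageOptions local =&gt; Files.Of.LocalDisk(local.Root),
        _ =&gt; throw new NotSupportedException($"Unknown options type {options.GetType().Name}")
    };
}
</code></pre>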

<p>All of this is brought together during application start-up when configuring services (i.e. <code>IServiceCollection</code>).</p>

<pre><code class="language-csharp">services  
    .AddSingleton&lt;StorageManager&gt;()
    .AddSingleton(_ =&gt; {
        var options = new List&lt;IStorageOptions&gt;();
        options.AddRange(configuration.GetSection("AzureStorageOptions").Get&lt;AzureStorageOptions[]&gt;());
        options.AddRange(configuration.GetSection("LocalStorageOptions").Get&lt;LocalStorageOptions[]&gt;());
        return options;
    });
</code></pre>

<p>Now Bilbo can easily get his list of Cloud1 files provided the class he is working in has taken a dependency on <code>StorageManager</code>.</p>

<pre><code class="language-csharp">var files = await storageManager  
                .GetFileStorage("Cloud1")
                .Ls(path: null, recurse: true);
</code></pre>

<p><br>  </p>

<h4 id="conclusion">Conclusion</h4>

<p>Stowage is a beautiful and focused library which greatly simplifies working with multiple storage providers. With just a bit of extra effort abstractions can be created to further standardize such work in any application. I love the simplicity and uniformity throughout the library and, as previously noted, having no dependencies on other packages is tremendous. Stowage has found a home in my toolbox.</p>

<h4 id="references">References</h4>

<ul>
<li><a href="https://github.com/aloneguid/stowage">Stowage</a> github repo</li>
<li><a href="https://github.com/DaneVinson/StowageExplorer">StowageExplorer</a> github repo</li>
<li><a href="https://www.nuget.org/packages/Stowage/">Stowage NuGet</a></li>
<li><a href="https://github.com/aloneguid/storage">Storage.Net</a> github repo (Stowage's predecessor, .NET Standard 2.0/2.1)</li>
<li><a href="https://www.nuget.org/packages/Storage.Net/">Storage.Net NuGet</a></li>
</ul>]]></content:encoded></item><item><title><![CDATA[JsonEnvelopes .NET Standard Library]]></title><description><![CDATA[<h3 id="description">Description</h3>

<p>JsonEnvelopes is a simple .NET Standard library which utilizes a concrete implementation of <a href="https://docs.microsoft.com/en-us/dotnet/api/system.text.json.serialization.jsonconverter-1?view=net-5.0"><code>JsonConverter&lt;T&gt;</code></a> (<a href="https://www.nuget.org/packages/System.Text.Json/">System.Text.Json</a>) to serialize and deserialize objects in a way that allows message receivers to be agnostic with respect to message type.</p>

<h3 id="messaginginnet">Messaging in .NET</h3>

<p>Any software developer working in the</p>]]></description><link>https://developingdane.azurewebsites.net/jsonenvelopes/</link><guid isPermaLink="false">a587e113-2596-4476-8a14-a640aab6d445</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Mon, 25 Jan 2021 02:13:36 GMT</pubDate><content:encoded><![CDATA[<h3 id="description">Description</h3>

<p>JsonEnvelopes is a simple .NET Standard library which utilizes a concrete implementation of <a href="https://docs.microsoft.com/en-us/dotnet/api/system.text.json.serialization.jsonconverter-1?view=net-5.0"><code>JsonConverter&lt;T&gt;</code></a> (<a href="https://www.nuget.org/packages/System.Text.Json/">System.Text.Json</a>) to serialize and deserialize objects in a way that allows message receivers to be agnostic with respect to message type.</p>

<h3 id="messaginginnet">Messaging in .NET</h3>

<p>Any software developer working in the Enterprise space is likely acquainted with messaging. Modern Enterprise applications typically rely heavily on sending and receiving serialized messages. In .NET the serialization and sending of messages is often trivialized by packages such as <a href="https://www.nuget.org/packages/System.Text.Json/">System.Text.Json</a> and <a href="https://www.nuget.org/packages/Microsoft.Azure.ServiceBus/">Microsoft.Azure.ServiceBus</a> respectively. The more interesting decisions come in designing a receiving strategy for those messages.</p>

<p>With .NET the primary challenge in receiving messages is the need to resolve the <code>Type</code> of the message in advance of its deserialization. There are many strategies to solve this issue and they often depend on the function of the receiver. In the following examples we'll consider a <code>CastFireball</code> command being received as a serialized message.</p>

<h5 id="webapi">Web API</h5>

<p>In a Web API style application messages are received by controllers. In the following example a <code>SpellsController</code> resolves the body of an HTTP POST made to <em>{root}/spells/cast/fireball</em> as an instance of <code>CastFireball</code>. The route (and the HTTP method, i.e. POST) informs the service what <code>Type</code> to expect and thus how to deserialize the received message.</p>

<pre><code class="language-csharp">[ApiController]
[Route("[controller]")]
public class SpellsController : ControllerBase  
{
    [HttpPost]
    [Route("cast/fireball")]
    public async Task&lt;ActionResult&gt; ReceiveCastFireball([FromBody]CastFireball spell) 
    {
        await HandleCastFireballAsync(spell);
...
</code></pre>

<p><br></p>

<h5 id="queuepertypequeuereaders">Queue per Type Queue Readers</h5>

<p>With this strategy a message's <code>Type</code> is resolved by the name of its queue. In the following example an <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview">Azure Function</a> listens on an Azure Service Bus queue named <em>cast-fireball</em> and handles incoming messages as <code>CastFireball</code>.</p>

<pre><code class="language-csharp">public async Task Run([ServiceBusTrigger("cast-fireball")]string message)  
{
    var spell = JsonSerializer.Deserialize&lt;CastFireball&gt;(message);
    await HandleCastFireballAsync(spell);
...
</code></pre>

<p><br></p>

<h5 id="partitionkeypertypequeuereaders">Partition Key per Type Queue Readers</h5>

<p>If your bus supports partition keys another option might be to use the message's <code>Type</code> as its partition key. In the following example an <a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview">Azure Function</a> listens on an Azure Service Bus queue named <em>spells</em> and resolves the <code>Type</code> of the incoming messages with their partition keys. It should be noted that the <a href="https://docs.microsoft.com/en-us/dotnet/api/microsoft.azure.servicebus.message?view=azure-dotnet"><code>Message</code> class</a> used in this example also provides the property <code>ContentType</code> which could alternately be used for this purpose but libraries for some bus implementations may not expose such a property.</p>

<pre><code class="language-csharp">public async Task Run([ServiceBusTrigger("spells")]Message message)  
{
    var json = Encoding.UTF8.GetString(message.Body);
    switch (message.PartitionKey)
    {
        case "CastFireball":
            var spell = JsonSerializer.Deserialize&lt;CastFireball&gt;(json);
            await HandleCastFireballAsync(spell);
...
</code></pre>

<p>With each of these message receiving strategies (and many others) the code to deserialize a received message and "handle" it (e.g. calling <code>HandleCastFireballAsync</code>) will become highly repetitive as the number of message types increases. A preferable solution would be to perform message deserialization generically and use standard Dependency Injection (DI) or a library such as <a href="https://github.com/jbogard/MediatR">MediatR</a> to handle the deserialized object. Facilitating this is exactly the purpose of JsonEnvelopes.</p>

<h3 id="jsonenvelopes">JsonEnvelopes</h3>

<p>An envelope can be simply thought of as a content wrapper with a label. In JsonEnvelopes this idea is expressed as <a href="https://github.com/DaneVinson/JsonEnvelopes/blob/main/JsonEnvelopes/EnvelopeOfT.cs"><code>Envelope&lt;TContent&gt;</code></a> where the content is an instance of <code>TContent</code> and the label is <code>TContent</code>'s type name.</p>
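<p>Conceptually an envelope might look something like the following. This is a simplified sketch of the idea only; the library's actual implementation is the generic <code>Envelope&lt;TContent&gt;</code> paired with a custom <code>JsonConverter</code>, not this exact shape.</p>

<pre><code class="language-csharp">public class Envelope  
{
    // The label: the name of the content's Type, e.g. "MyApp.CastFireball".
    public string ContentType { get; set; }

    // The wrapped content, serialized as JSON.
    public string ContentJson { get; set; }

    // Resolve the runtime Type from its name, then deserialize the content.
    public object GetContent() =&gt;
        JsonSerializer.Deserialize(ContentJson, Type.GetType(ContentType));
}
</code></pre>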

<p>In the following example an instance of <code>CastFireball</code> is wrapped in an instance of <code>Envelope&lt;TContent&gt;</code> which is then serialized, ready to be sent. Note that the call to <code>Serialize&lt;T&gt;</code> specifies <code>T</code> as <code>Envelope</code>, not <code>Envelope&lt;CastFireball&gt;</code>.</p>

<pre><code class="language-csharp">var command = new CastFireball();  
var envelope = new Envelope&lt;CastFireball&gt;(command);  
string json = JsonSerializer.Serialize&lt;Envelope&gt;(envelope);  
</code></pre>

<p>Any json string created this way can be deserialized as follows.</p>

<pre><code class="language-csharp">var envelope = JsonSerializer.Deserialize&lt;Envelope&gt;(json);  
</code></pre>

<p>Again note the use of the type <code>Envelope</code>. Calling <code>Serialize&lt;Envelope&gt;</code> and <code>Deserialize&lt;Envelope&gt;</code> triggers the use of a custom <code>JsonConverter</code>. With this in hand we can now leverage JsonEnvelopes to deserialize and handle objects in a more generic manner. In the following example we'll consider using standard Dependency Injection. The use of a library like MediatR can simplify the code even further. Complete examples of each technique can be found in the <a href="https://github.com/DaneVinson/JsonEnvelopes/tree/main/JsonEnvelopes.Example">JsonEnvelopes.Example project</a>.</p>

<p>We first define two simple interfaces for handling commands.</p>

<pre><code class="language-csharp">public interface ICommandHandler  
{
    Task&lt;bool&gt; HandleAsync(object command);
}

public interface ICommandHandler&lt;TCommand&gt; : ICommandHandler  
{
    Task&lt;bool&gt; HandleAsync(TCommand command);
}
</code></pre>

<p>Next we define an implementation of <code>ICommandHandler&lt;CastFireball&gt;</code>.</p>

<pre><code class="language-csharp">public class CastFireballHandler : ICommandHandler&lt;CastFireball&gt;  
{
    public Task&lt;bool&gt; HandleAsync(CastFireball command)
    {
        // Handling code
    }

    public Task&lt;bool&gt; HandleAsync(object command) =&gt;
        HandleAsync((CastFireball)command);
}
</code></pre>

<p>At application startup (typically in the <code>ConfigureServices</code> method of Startup.cs) we wire up Dependency Injection for <code>ICommandHandler&lt;CastFireball&gt;</code> in the standard way. The following line would be repeated for each implementation of <code>ICommandHandler&lt;TCommand&gt;</code>, i.e. generally once per command type.</p>

<pre><code class="language-csharp">services.AddSingleton&lt;ICommandHandler&lt;CastFireball&gt;, CastFireballHandler&gt;();  
</code></pre>

<p>Finally, we'll implement our previous Web API and Queue Reader examples using JsonEnvelopes.</p>

<h5 id="webapi">Web API</h5>

<p>In the following example we first get the <code>Type</code> specified by the <code>envelope</code>'s <code>ContentType</code> string property and use it to get the <code>Type</code> of the specific <code>ICommandHandler&lt;TCommand&gt;</code> to be used. Next, we use the injected <code>IServiceProvider</code> to get an instance of that interface (as specified at application start-up) which we cast as <code>ICommandHandler</code>. Finally, we use the <code>handler</code> to handle the command.</p>

<pre><code class="language-csharp">[ApiController]
[Route("[controller]")]
public class CommandsController : ControllerBase  
{
    private readonly IServiceProvider _serviceProvider;

    public CommandsController(IServiceProvider provider) =&gt;
        _serviceProvider = provider;

    [HttpPost]
    public async Task&lt;ActionResult&gt; ReceiveCommand([FromBody]Envelope envelope) 
    {
        var contentType = Type.GetType(envelope.ContentType);
        var handlerType = typeof(ICommandHandler&lt;&gt;).MakeGenericType(contentType);
        var handler = _serviceProvider.GetService(handlerType) as ICommandHandler;
        await handler.HandleAsync(envelope.GetContent());
...
</code></pre>

<p>This code is obviously slightly more complicated than the previous Web API example. However, note that the new <code>ReceiveCommand</code> method can be used for <strong>any</strong> command. We no longer need a method for each command type. Additionally, rather than our Web API specifying a route per command type, e.g. <em>{root}/spells/cast/fireball</em>, all commands can be sent to the same route, i.e. <em>{root}/commands</em>. </p>

<h5 id="queuereader">Queue Reader</h5>

<p>As before we see an example of an Azure Function listening on an Azure Service Bus queue. However, in this case we listen to the <em>commands</em> queue and handle all commands generically in much the same way as the Web API example.</p>

<pre><code class="language-csharp">public async Task Run([ServiceBusTrigger("commands")]string message)  
{
    var envelope = JsonSerializer.Deserialize&lt;Envelope&gt;(message);
    var contentType = Type.GetType(envelope.ContentType);
    var handlerType = typeof(ICommandHandler&lt;&gt;).MakeGenericType(contentType);
    var handler = _serviceProvider.GetService(handlerType) as ICommandHandler;
    await handler.HandleAsync(envelope.GetContent());
...
</code></pre>

<p>This code is also slightly more complicated than the previous Queue Reader examples but again one implementation can be used to handle <strong>all</strong> command types. We no longer have need of strategies like Queue Per Type or Partition Key Per Type.</p>

<h3 id="wrapup">Wrap Up</h3>

<p>I originally created JsonEnvelopes months ago when I found myself needing to solve the previously described issues for perhaps the tenth time in the last five years. Rather than adding yet another new implementation to the project I'd just started I decided to create the stand-alone project JsonEnvelopes first. I created the GitHub <a href="https://github.com/DaneVinson/JsonEnvelopes">repo</a>, added code and set up an Azure DevOps Pipeline (<a href="https://github.com/DaneVinson/JsonEnvelopes/blob/main/azure-pipelines.yml">azure-pipelines.yml</a>) to push the <a href="https://www.nuget.org/packages/JsonEnvelopes/">JsonEnvelopes package</a> to nuget.org. I'm happy with the results of both the code and the CI/CD pipeline that resulted from this exploration.</p>

<h4 id="references">References</h4>

<ul>
<li><a href="https://github.com/DaneVinson/JsonEnvelopes">JsonEnvelopes</a> GitHub repo</li>
<li><a href="https://www.nuget.org/packages/JsonEnvelopes/">JsonEnvelopes</a> NuGet Package</li>
<li><a href="https://github.com/jbogard/MediatR">MediatR</a> GitHub repo</li>
<li><a href="https://www.nuget.org/packages/Microsoft.Azure.ServiceBus/">Microsoft.Azure.ServiceBus</a> NuGet Package</li>
<li><a href="https://www.nuget.org/packages/System.Text.Json/">System.Text.Json</a> NuGet Package</li>
<li><a href="https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview">Azure Functions Overview</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/?view=azure-devops">Azure Pipelines Documentation</a></li>
</ul>]]></content:encoded></item><item><title><![CDATA[Comparing Performance of gRPC, Web API and WCF Services]]></title><description><![CDATA[<p>I started paying attention to gRPC about a year ago. Almost every gRPC advocate and evangelist I came across touted its performance capabilities as a significant value-add over other available service technologies. </p>

<p>The release of .NET Core 3.0 came with a new gRPC project template so I dove right</p>]]></description><link>https://developingdane.azurewebsites.net/service-compare/</link><guid isPermaLink="false">cd76b0db-126b-40ab-9fa8-b8937c4ac332</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Wed, 13 May 2020 00:13:46 GMT</pubDate><media:content url="https://developingdane.com/content/images/2020/02/service-compare-2.png" medium="image"/><content:encoded><![CDATA[<img src="https://developingdane.com/content/images/2020/02/service-compare-2.png" alt="Comparing Performance of gRPC, Web API and WCF Services"><p>I started paying attention to gRPC about a year ago. Almost every gRPC advocate and evangelist I came across touted its performance capabilities as a significant value-add over other available service technologies. </p>

<p>The release of .NET Core 3.0 came with a new gRPC project template so I dove right in. During my explorations I found myself continually wondering how its performance stacked up against other service types available in .NET. A couple months ago I decided to circle back and satisfy this curiosity.</p>

<h3 id="methodology">Methodology</h3>

<p>After some consideration I decided to quantify performance characteristics for gRPC, Web API and WCF service implementations built with the latest (2020) .NET technologies. The best way to compare these services would have been to implement each targeting the same version of .NET (Core 3.1). Unfortunately, WCF has never been a well supported technology in .NET Core. Because of this I chose to create the WCF service implementation in .NET Framework 4.8. To help normalize service comparisons I decided to create separate .NET Core and .NET Framework Web API implementations.</p>

<p>In order to compare services I defined a set of operations to be exposed by each and consumed by a common client. Completion times for each of these operations across all implementations could then be analyzed. Operations selected were as follows.</p>

<ul>
<li>Get - Get a fixed collection of objects from the service</li>
<li>First - Get a single object from the service (the first in the fixed collection)</li>
<li>Send - Send a fixed collection of objects to the service</li>
<li>Send One - Send a single object to the service</li>
</ul>

<p>Additionally, I decided to examine streaming operations available only with gRPC.</p>

<ul>
<li>Get Streaming - Stream a fixed collection of objects from a service </li>
<li>Send Streaming - Stream a fixed collection of objects to a service</li>
</ul>

<h3 id="setup">Setup</h3>

<p>I started by creating the <a href="https://github.com/DaneVinson/StupidTodo/tree/service-compare">service-compare</a> branch of my <a href="https://github.com/DaneVinson/StupidTodo">Stupid Todo</a> exploratory application. I added projects as needed for a total of four service implementations and a single client.</p>

<ul>
<li>StupidTodo.Grpc - gRPC service project, .NET Core 3.1</li>
<li>StupidTodo.WebApi - ASP.NET Web API project, .NET Core 3.1</li>
<li>StupidTodo.Framework.WebApi - ASP.NET Web API project, .NET Framework 4.8</li>
<li>StupidTodo.Framework.Wcf - ASP.NET WCF project, .NET Framework 4.8</li>
<li>StupidTodo.AdminConsole - Console application project used to gather data from each service implementation, .NET Core 3.1</li>
</ul>

<p>Each service implementation exposes the previously discussed operations and shares the data provider <a href="https://github.com/DaneVinson/StupidTodo/blob/service-compare/StupidTodo.Domain/GenFuTodoDataProvider.cs"><code>GenFuTodoDataProvider</code></a>. That provider statically loads 5000 pre-generated <a href="https://github.com/DaneVinson/StupidTodo/blob/service-compare/StupidTodo.Domain/Todo.cs"><code>Todo</code></a> objects from a source shared by all implementations.</p>

<p>Each data set was gathered using an instance of the client backed by an instance of a specific service implementation. Both service and client instances were hosted side-by-side on the same physical machine.</p>

<p>Each data gathering cycle began in the client with a warm-up calling each available operation once. Afterwards, operations were executed one at a time, 1000 times each, in the following order.</p>

<ul>
<li>Get</li>
<li>Send</li>
<li>First</li>
<li>Send One</li>
<li>Get Streaming (gRPC only)</li>
<li>Send Streaming (gRPC only)</li>
</ul>

<p>Execution times for each operation were recorded in the client.</p>
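<p>Roughly sketched, the client's measurement loop looked like the following (illustrative only; <code>CallOperationAsync</code> is a hypothetical stand-in for whichever service operation is being timed):</p>

<pre><code class="language-csharp">// Hypothetical stand-in for the service operation being timed.
static Task CallOperationAsync() =&gt; Task.CompletedTask;

var durations = new List&lt;TimeSpan&gt;(1000);
var stopwatch = new Stopwatch();

for (int i = 0; i &lt; 1000; i++)
{
    stopwatch.Restart();
    await CallOperationAsync();     // e.g. Get, Send, First or Send One
    stopwatch.Stop();
    durations.Add(stopwatch.Elapsed);
}
</code></pre>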

<p>During my initial research into gRPC performance I came across the blog post <a href="https://www.yonego.com/nl/why-milliseconds-matter/">REST vs gRPC | Why Milliseconds Matter</a>. In it the author (unnamed) found that the performance difference between gRPC and REST technologies became even more pronounced as the number of clients accessing the services increased. After reading this I decided to perform additional test runs where each set of operations was run on ten separate client instances simultaneously.</p>

<h3 id="results">Results</h3>

<p><img src="https://developingdane.azurewebsites.net/content/images/2020/04/get.png" alt="Comparing Performance of gRPC, Web API and WCF Services"></p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2020/04/send.png" alt="Comparing Performance of gRPC, Web API and WCF Services"></p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2020/04/first.png" alt="Comparing Performance of gRPC, Web API and WCF Services"></p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2020/04/sendone.png" alt="Comparing Performance of gRPC, Web API and WCF Services"></p>

<h3 id="analysis">Analysis</h3>

<p>If we first compare the two primary platforms, .NET Core and .NET Framework, we find that .NET Core based services generally showed better performance. There were anomalies to this trend in the <code>First</code> and <code>SendOne</code> operations, but I think these can be attributed to the extremely short response times. For those operations the uncertainty in measurements was large, and thus the accuracy and utility of the data are questionable.</p>

<p>The results of the <code>Get</code> operation were largely as expected. gRPC performed the best, followed by .NET Core Web API, then .NET Framework Web API and finally WCF. Additionally, these trends held for both the single client and ten simultaneous clients cases. The outlier was gRPC's server streaming. This operation was considerably slower than other <code>Get</code> variants, and server streaming to ten simultaneous clients was especially slow. This is consistent with the fact that gRPC server streaming isn't designed as a high-performance data transfer mechanism. Its primary value proposition is providing a long-lived connection to a client.</p>
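<p>For context, a gRPC server-streaming operation in .NET is implemented against a stream writer rather than returning a complete response. This sketch shows the general shape (the message and provider types here are illustrative, not the repository's actual code).</p>

<pre><code class="language-csharp">// Illustrative sketch; GetRequest, TodoReply and _provider are stand-ins.
public override async Task GetStreaming(
    GetRequest request,
    IServerStreamWriter&lt;TodoReply&gt; responseStream,
    ServerCallContext context)
{
    foreach (var todo in _provider.GetTodos())
    {
        // Messages are written one at a time over a single long-lived
        // HTTP/2 stream rather than returned as a single payload.
        await responseStream.WriteAsync(todo.ToReply());
    }
}
</code></pre>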

<p>The results of the <code>Send</code> operations were somewhat surprising. Here the performance rankings were gRPC, .NET Core Web API, WCF and then .NET Framework Web API. The .NET Core based services behaved as might be expected, but the fact that the WCF service outperformed the .NET Framework Web API is fairly shocking. I have no good explanation for the data, but I replicated the experiment several times and consistently found similar results. Also, note that data for gRPC client streaming is absent. During testing, upwards of 10% of gRPC <code>Send</code> attempts using client streaming resulted in some form of exception, so these results were omitted. In the most common case the server reported that the connection had been closed before the client had finished sending all data.</p>

<p>Results from the <code>First</code> operations were again consistent with expectations, showing a performance ranking of gRPC, .NET Core Web API, .NET Framework Web API and WCF. Note how significant the variance is for the ten simultaneous clients tests in all cases. One interesting feature is that the WCF service shows very small performance differences between the single and ten simultaneous client cases. More testing would be needed here, but the results seem to indicate that WCF handles increasing numbers of client connections quite efficiently for small get operations.</p>

<p>The <code>SendOne</code> operations produced a number of unexpected results. Performance rankings were gRPC, WCF, .NET Core Web API and finally .NET Framework Web API. At first glance it seems very strange that WCF would outperform not only .NET Framework Web API (as it also did in the <code>Send</code> operation) but also .NET Core Web API. However, examination of the deviation shows the variance in the WCF data to be on the order of the mean values. This is a clear indicator that the mean values alone do not give an accurate picture of the results. An additional oddity is that the ten simultaneous WCF clients case showed better performance than the single client case. Here again, note the significant variance, especially in the single client data. If we refrain from drawing conclusions about the WCF data in this operation, the remaining data shows trends consistent with expectations.</p>

<h3 id="conclusions">Conclusions</h3>

<p>Through all testing gRPC performance was found to be superior to that of the other examined technologies. Most of the results confirmed the expectations that performance ranking would be gRPC, .NET Core Web API, .NET Framework Web API and finally WCF. The WCF data did exhibit a few unexpected and interesting results but nothing which convinced me that it wasn't the least performant overall.</p>

<p>Despite its impressive performance, gRPC in .NET Core (Grpc.Net) does have some limitations.</p>

<p>First, due to its heavy reliance upon HTTP/2 it is currently (2020) unsuitable as a service layer for most browser based applications. This limitation can be largely mitigated by the use of <a href="https://grpc.io/docs/tutorials/basic/web/">gRPC-Web</a> but this project is currently "experimental". Though I haven't tested it yet, I suspect there's also a performance cost associated with its use.</p>

<p>To me the most significant limitation of Grpc.Net is that it's entirely reliant upon code generation based on your protobuf (Protocol Buffer) service definitions. Generated code is almost never sufficient as a domain model, so you'll likely need to create and maintain a mapping layer between your domain model and the models generated by Grpc.Net. Such mapping code is often significant in size and can be brittle. The project <a href="https://protobuf-net.github.io/protobuf-net.Grpc/">protobuf-net.Grpc</a> attempts to address this issue by allowing you to generate protobuf definitions from C#. It relies upon attributes in the C# code and supports both its own attributes and those found in the <code>System.Runtime.Serialization</code> namespace (the same <code>DataContract</code>/<code>DataMember</code> attributes used by WCF). I'd like to test the performance of this framework, but my early explorations have been somewhat less than fruitful.</p>
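<p>To give a feel for the code-first approach, a protobuf-net.Grpc contract looks roughly like the following (a sketch based on the project's getting-started documentation; the type and member names are illustrative).</p>

<pre><code class="language-csharp">// Illustrative sketch of protobuf-net.Grpc's code-first contracts.
[DataContract]
public class TodoDto
{
    [DataMember(Order = 1)] public string Id { get; set; }
    [DataMember(Order = 2)] public string Description { get; set; }
}

[ServiceContract]
public interface ITodoService
{
    [OperationContract]
    ValueTask&lt;TodoDto&gt; GetFirstAsync();
}
</code></pre>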

<p>Grpc.Net definitely has potential. Today its use cases often involve service-to-service communication such as you might find in a microservices architecture. For such cases the superior performance can be of significant benefit. Over time I expect the technology to continue improving and its use cases to broaden. I'll be watching with interest.</p>

<h6 id="references">References</h6>

<ul>
<li><a href="https://github.com/DaneVinson/StupidTodo/tree/service-compare">Source Code</a> - The <code>service-compare</code> branch of my <a href="https://github.com/DaneVinson/StupidTodo">Stupid Todo</a> repository</li>
<li><a href="https://github.com/MisterJames/GenFu">GenFu</a> - Library used "to generate realistic test data" for testing</li>
<li><a href="https://docs.microsoft.com/en-us/aspnet/core/grpc/comparison?view=aspnetcore-3.1">Compare gRPC services with HTTP APIs</a> - Microsoft Docs article by James Newton-King</li>
<li><a href="https://medium.com/@EmperorRXF/evaluating-performance-of-rest-vs-grpc-1b8bdf0b22da">Evaluating Performance of REST vs. gRPC</a> - Blog post by Ruwan Fernando</li>
<li><a href="https://www.yonego.com/nl/why-milliseconds-matter/">REST vs gRPC | Why Milliseconds Matter</a> - Blog post by unnamed author</li>
<li><a href="https://dev.to/thangchung/performance-benchmark-grpc-vs-rest-in-net-core-3-preview-8-45ak">Performance benchmark: gRPC vs. REST in .NET Core 3 Preview 8</a> - Blog post by Thang Chung</li>
<li><a href="https://protobuf-net.github.io/protobuf-net.Grpc/gettingstarted.html">protobuf-net.Grpc</a> - "Simple gRPC access in .NET Core 3 - think WCF, but over gRPC"</li>
</ul>]]></content:encoded></item><item><title><![CDATA[Configuration in Blazor Client (WASM)]]></title><description><![CDATA[<p>I've been working with Blazor Client (WASM) a fair amount recently and a couple of days ago someone asked me about how to load configuration from a file. I'd never had the need for that type of configuration before but it got me curious about what was possible. </p>

<p>After a</p>]]></description><link>https://developingdane.azurewebsites.net/configuration-in-blazor-client/</link><guid isPermaLink="false">fa4f1e45-9b50-40ee-8c9b-460087db60ce</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Wed, 04 Mar 2020 15:28:00 GMT</pubDate><media:content url="https://developingdane.com/content/images/2020/03/blazor_wasm_config.png" medium="image"/><content:encoded><![CDATA[<img src="https://developingdane.com/content/images/2020/03/blazor_wasm_config.png" alt="Configuration in Blazor Client (WASM)"><p>I've been working with Blazor Client (WASM) a fair amount recently and a couple of days ago someone asked me about how to load configuration from a file. I'd never had the need for that type of configuration before but it got me curious about what was possible. </p>

<p>After a little searching I discovered that configuration for Blazor WASM is typically being loaded through an HTTP GET request to a remote resource. That's a completely valid and viable solution but I was specifically curious about loading configuration directly from a file in the Blazor client project, e.g. <code>appsettings.json</code>.</p>

<h4 id="goal">Goal</h4>

<p>Load a json configuration file directly from a Blazor WASM client using typical .NET Core practices for working with configuration. </p>

<h4 id="projectsetup">Project Setup</h4>

<p>I started by creating the Blazor WebAssembly App <code>BlazorWasmConfig</code> using the template in Visual Studio 2019 (note: I'd previously installed all required dependencies as documented in <a href="https://docs.microsoft.com/en-us/aspnet/core/blazor/get-started?view=aspnetcore-3.1&amp;tabs=visual-studio">Getting started in ASP.NET Core Blazor</a>). Next, I removed everything from the <code>BlazorWasmConfig</code> project I wouldn't need. This left me with the single page <code>Index.razor</code> which I'd use to display an object loaded from configuration. </p>

<p>My starting point.</p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2020/03/BlazorWasmConfig_base.png" alt="Configuration in Blazor Client (WASM)"></p>

<p>I added the file <code>appsettings.json</code>.</p>

<pre><code class="language-json">{
  "GalaxyStuff": {
    "GalaxyCluster": "Virgo Supercluster",
    "GalaxyName": "Milky Way",
    "StarCount": 300000000000
  }
}
</code></pre>

<p>I defined the class <code>GalaxyInfo</code> to map configuration.</p>

<pre><code class="language-csharp">public class GalaxyInfo  
{
    public string GalaxyCluster { get; set; }
    public string GalaxyName { get; set; }
    public long StarCount { get; set; }

    public override string ToString()
    {
        return $"Galaxy: {GalaxyName}, Cluster: {GalaxyCluster}, Stars: {StarCount}";
    }
}
</code></pre>

<p>Finally, I installed two NuGet packages for configuration.</p>

<pre><code class="language-powershell">Install-Package Microsoft.Extensions.Configuration.Binder  
Install-Package Microsoft.Extensions.Configuration.Json  
</code></pre>

<p><br>  </p>

<h4 id="implementation">Implementation</h4>

<p>Creation of a configuration object in a .NET Core application typically looks something like this.</p>

<pre><code class="language-csharp">var config = new ConfigurationBuilder()  
                .SetBasePath(Directory.GetCurrentDirectory())
                .AddJsonFile("appsettings.json")
                .Build();
</code></pre>

<p>This tells the runtime to look for the file <code>appsettings.json</code> in the current directory and use it as a source for configuration. This simple pattern works well in .NET Core applications everywhere but utterly breaks down in Blazor WASM. The previous code compiles and executes without exceptions but when running in the browser <code>Directory.GetCurrentDirectory()</code> simply returns "/" and there are no files. No matter how you include <code>appsettings.json</code> in the Blazor client project the above method will never find it. This type of file IO simply does not apply to life in the browser.</p>

<p>The secret sauce to reading a file included in the Blazor client project is to include that file as an <code>Embedded resource</code>. This can be done in Visual Studio by setting the <code>Build Action</code> property of the file to <code>Embedded resource</code> and setting the <code>Copy to Output Directory</code> property to <code>Copy always</code> (or <code>Copy if newer</code>). Alternately, you can specify both settings directly in the project file itself.</p>

<pre><code class="language-xml">&lt;ItemGroup&gt;  
    &lt;EmbeddedResource Include="appsettings.json"&gt;
        &lt;CopyToOutputDirectory&gt;Always&lt;/CopyToOutputDirectory&gt;
    &lt;/EmbeddedResource&gt;
&lt;/ItemGroup&gt;
</code></pre>

<p>Once this is done you can read the file's contents as a <code>Stream</code>. The extension method <code>AddJsonStream</code> makes it easy to add this stream to the configuration pipeline.</p>

<pre><code class="language-csharp">string fileName = "BlazorWasmConfig.appsettings.json";  
var stream = Assembly.GetExecutingAssembly()  
                     .GetManifestResourceStream(fileName);

var config = new ConfigurationBuilder()  
                    .AddJsonStream(stream)
                    .Build();
</code></pre>

<p>Note that this method is almost identical to the previous one, except that instead of loading <code>appsettings.json</code> from an on-disk file using <code>AddJsonFile</code> we are loading it through a stream from the embedded resource using <code>AddJsonStream</code>.</p>

<p>Using the previously defined <code>config</code> object we can then inject configuration into application components as needed. Illustrated here is a transient registration of the <code>GalaxyInfo</code> class, initialized with data from configuration.</p>

<pre><code class="language-csharp">builder.Services.AddTransient(_ =&gt;  
{ 
    return config.GetSection("GalaxyStuff")
                 .Get&lt;GalaxyInfo&gt;(); 
});
</code></pre>

<p>In the <code>Index.razor</code> page we receive an instance of <code>GalaxyInfo</code> through the dependency injection pipeline shown above and display the <code>ToString()</code> of that object.</p>

<pre><code class="language-html">@page "/"
@inject GalaxyInfo GalaxyInfo

&lt;h4&gt;@nameof(GalaxyInfo) loaded as config from appsettings.json embedded resource&lt;/h4&gt;  
@GalaxyInfo?.ToString()
</code></pre>

<p><br>  </p>

<h4 id="results">Results</h4>

<p><img src="https://developingdane.azurewebsites.net/content/images/2020/03/BlazorWasmConfig_final.png" alt="Configuration in Blazor Client (WASM)"></p>

<h4 id="conclusions">Conclusions</h4>

<p>I was fairly surprised how easy this was to implement. I'm not sure I see a strong need for including a specific configuration file in a Blazor client (or any other web client) but considering how smoothly it went I'll definitely put that in my tool bag.</p>

<h5 id="references">References</h5>

<ul>
<li><a href="https://github.com/DaneVinson/BlazorWasmConfig">BlazorWasmConfig</a> - Source code</li>
<li><a href="https://docs.microsoft.com/en-us/aspnet/core/blazor/get-started?view=aspnetcore-3.1&amp;tabs=visual-studio">Getting started in ASP.NET Core Blazor</a></li>
</ul>]]></content:encoded></item><item><title><![CDATA[Exploring LiteDB]]></title><description><![CDATA[<p>Last year I saw a reference to <a href="https://www.litedb.org/">LiteDB</a> on <a href="https://dotnet.libhunt.com/">Awesome .NET</a>. It described itself as "An embedded NoSQL database for .NET". That drew my attention. A short examination of their documents further peaked my interest.</p>

<p>LiteDB describes its target scenarios in the <a href="https://github.com/mbdavid/LiteDB#where-to-use">Where to use</a> section of the GitHub repo's</p>]]></description><link>https://developingdane.azurewebsites.net/exploring-litedb/</link><guid isPermaLink="false">e5487e3c-8b69-4c5b-8d2d-65aad6e9593c</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Sat, 05 Oct 2019 04:11:46 GMT</pubDate><media:content url="https://developingdane.com/content/images/2019/09/litedb.png" medium="image"/><content:encoded><![CDATA[<img src="https://developingdane.com/content/images/2019/09/litedb.png" alt="Exploring LiteDB"><p>Last year I saw a reference to <a href="https://www.litedb.org/">LiteDB</a> on <a href="https://dotnet.libhunt.com/">Awesome .NET</a>. It described itself as "An embedded NoSQL database for .NET". That drew my attention. A short examination of their documents further peaked my interest.</p>

<p>LiteDB describes its target scenarios in the <a href="https://github.com/mbdavid/LiteDB#where-to-use">Where to use</a> section of the GitHub repo's readme.</p>

<ul>
<li>Desktop/local small applications</li>
<li>Application file format</li>
<li>Small web applications</li>
<li>One database per account/user data store</li>
<li>Few concurrent write operations</li>
</ul>

<p>LiteDB will almost always be an inappropriate choice for persistence needs outside of these targets, but within them it can be a simple-to-implement and effective solution.</p>

<p>For my exploration I created the <a href="https://github.com/DaneVinson/StupidTodo/tree/litedb">litedb branch</a> of the <a href="https://github.com/DaneVinson/StupidTodo">Stupid Todo</a> reference application. In this branch I replaced the static data provider used in the <a href="https://github.com/DaneVinson/StupidTodo/tree/master">master branch</a> version of  <code>StupidTodo.WebApi.TodoController</code> with a LiteDB implementation. Code examples shown here are snippets from that branch.</p>

<h3 id="usinglitedb">Using LiteDB</h3>

<p>To get started with LiteDB just include the NuGet package. Note that the only requirement is .NET 4.5 or .NET Standard 2.0, so it's compatible with the vast majority of .NET and .NET Core projects.</p>

<pre><code class="language-powershell">Install-Package LiteDB  
</code></pre>

<p>Most interactions with LiteDB begin with an instance of <code>LiteDatabase</code>. Shown here is the simplest case, where LiteDB looks for a file named <em>stupid-todo.db</em> in the execution directory and creates a new one (&lt; 10 kb) if the file is not found.</p>

<pre><code class="language-csharp">using (var database = new LiteDatabase("stupid-todo.db"))  
{ ... }
</code></pre>

<p>CRUD operations with LiteDB are equally simple. Notice how <code>GetCollection</code> joins a specific .NET Type with a named collection in the database, i.e. the <code>Todo</code> Type joined to the <em>todos</em> collection. The resulting <code>LiteCollection</code> is then used to perform operations.</p>

<pre><code class="language-csharp">LiteCollection&lt;Todo&gt; todos = database.GetCollection&lt;Todo&gt;("todos");

// Get all Todo as an array
Todo[] allTodos = todos.FindAll().ToArray();

// Insert or update a Todo
bool upsertResult = todos.Upsert(todo);

// Delete Todo
bool deleteResult = todos.Delete(id);
</code></pre>

<p>Illustrated here are <code>LiteCollection</code> instance methods <code>FindAll</code>, <code>Upsert</code> and <code>Delete</code> but there are many others including <code>Find</code>, <code>Insert</code>, <code>Update</code> and <code>Exists</code> each with overloads. </p>
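<p>For example, <code>Find</code> and <code>Exists</code> accept expression predicates (this sketch assumes an illustrative <code>bool Done</code> property on <code>Todo</code>).</p>

<pre><code class="language-csharp">// Assumes a hypothetical Done property on Todo for illustration.
Todo[] pending = todos.Find(t =&gt; !t.Done).ToArray();
bool anyPending = todos.Exists(t =&gt; !t.Done);
</code></pre>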

<p>LiteDB also supports a simple indexing mechanism. Call an overload of <code>EnsureIndex</code> after insert/update/upsert and the index will be added if it doesn't exist (<em>Id</em> fields are automatically indexed).</p>

<pre><code class="language-csharp">// Ensure a unique index on the Todo's Description field
todos.EnsureIndex(t =&gt; t.Description, true);  
</code></pre>

<p><br>  </p>

<h3 id="summary">Summary</h3>

<p>LiteDB shows a focus on simplicity throughout. Its API is limited but sufficient for its target use cases. It's clearly not the right persistence solution for most scenarios but in its target range it works quite well.</p>

<p><br>  </p>

<h5 id="references">References</h5>

<ul>
<li><a href="https://www.litedb.org">LiteDB</a> - LiteDB Home</li>
<li><a href="https://github.com/mbdavid/LiteDB">LiteDB GitHub</a> - LiteDB GitHub repository</li>
<li><a href="https://github.com/DaneVinson/StupidTodo/tree/litedb">Source Code</a> - litedb branch of StupidTodo on GitHub</li>
<li><a href="https://github.com/DaneVinson/StupidTodo">Stupid Todo</a> - StupidTodo reference application on GitHub</li>
<li><a href="https://stupidtodo-litedb.azurewebsites.net/index.html">Stupid Todo / LiteDB Demo</a> - Live demo of Stupid Todo using LiteDB persistence</li>
<li><a href="https://dotnet.libhunt.com/">Awesome .NET</a> - Blog aggregator that alerted me to LiteDB</li>
</ul>]]></content:encoded></item><item><title><![CDATA[.NET Core configuration, a deeper dive]]></title><description><![CDATA[<p>In my previous post <a href="https://developingdane.azurewebsites.net/net-core-configuration-deeper-dive/../net-core-configuration-files/">.NET Core Configuration Files</a> I discussed a simple and resilient method of accessing configuration data with .NET Core. In this post I'll explore .NET Core configuration options in greater depth.</p>

<p>For this discussion let's start with the following <code>appsettings.json</code> configuration file.</p>

<pre><code class="language-json">{
  "favoriteBeerStyle": "IPA",
  "beerOfTheNow": {
    "name"</code></pre>]]></description><link>https://developingdane.azurewebsites.net/net-core-configuration-deeper-dive/</link><guid isPermaLink="false">fb7a9019-99d1-4070-b2a4-9989fa0b24c4</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Fri, 11 Jan 2019 02:50:33 GMT</pubDate><content:encoded><![CDATA[<p>In my previous post <a href="https://developingdane.azurewebsites.net/net-core-configuration-deeper-dive/../net-core-configuration-files/">.NET Core Configuration Files</a> I discussed a simple and resilient method of accessing configuration data with .NET Core. In this post I'll explore .NET Core configuration options in greater depth.</p>

<p>For this discussion let's start with the following <code>appsettings.json</code> configuration file.</p>

<pre><code class="language-json">{
  "favoriteBeerStyle": "IPA",
  "beerOfTheNow": {
    "name": "Blacksheep CDA",
    "abv": "6.7",
    "fnord": "nothing to see here",
    "brewery": {
      "name": "Lucky Labrador Brewing Company",
      "location": "Portland, OR",
      "rating": "9.5"
    }
  }
}
</code></pre>

<p><br> <br>
I previously discussed several ways to get information from such configuration. For example, consider the following where <code>configuration</code> is an instance of <code>Microsoft.Extensions.Configuration.IConfiguration</code>.</p>

<pre><code class="language-csharp">string fav1 = configuration.GetSection("favoriteBeerStyle")?.Value;  
string fav2 = configuration.GetSection("FAVORITEBEERSTYLE")?.Value;  
string fav3 = configuration["favoriteBeerStyle"];  
</code></pre>

<p><br> <br>
Each of the three <code>fav</code> variables will have the value "IPA". In my opinion the syntax used with <code>fav3</code> is the simplest for the vast majority of cases but opinions vary.</p>

<p>I further discussed how .NET Core's configuration system makes reading complex data from config files simple and extremely resilient. Take the following code.</p>

<pre><code class="language-csharp">string current = configuration["beerOfTheNow:name"];  
string currentBrewery = configuration["beerOfTheNow:brewery:name"];  
string badName = configuration["beerOfTheNow:not_a_name"];  
</code></pre>

<p><br> <br>
After execution <code>current</code> will have the value "Blacksheep CDA", <code>currentBrewery</code> will have the value "Lucky Labrador Brewing Company" and <code>badName</code> will be null.</p>

<p>Sometimes reading configurations in this way can be useful and even appropriate. However, for most applications reading values directly from an instance of <code>IConfiguration</code> is not ideal. A better solution is to provide required configuration data to object instances using dependency injection (DI).</p>

<p>As an example take the definitions of the following simple classes.</p>

<pre><code class="language-csharp">public class TheBeer  
{
    public string Name { get; set; }
    public Brewery Brewery { get; set; }
    public double Abv { get; set; }
    public double Ibu { get; set; }
}

public class Brewery  
{
    public string Name { get; set; }
    public string Location { get; set; }
    public double Rating { get; set; }
}
</code></pre>

<p><br> <br>
Note that <code>TheBeer</code> class has many properties which match those found in the <code>applicationsettings.json</code> config file's "beerOfTheNow" section. Further, note that the two are not perfectly aligned. Finally, note that one of these properties is of type <code>Brewery</code> and that the <code>Brewery</code> class has properties which match the "brewery" sub-section of the "beerOfTheNow" section. </p>

<p>Now, take an ASP.NET <code>Controller</code> with a constructor dependency on <code>TheBeer</code>.</p>

<pre><code class="language-csharp">public class TodoController : Controller  
{
    private readonly TheBeer Beer;

    public TodoController(TheBeer beer)
    {
        Beer = beer;
    }
}
</code></pre>

<p><br> <br>
Let us further assume that we'd like the <code>beer</code> argument to be an instance of <code>TheBeer</code> hydrated with appropriate values from the "beerOfTheNow" section of the config file. A common use case would be to use DI to provide all instances of <code>TodoController</code> with an instance of <code>TheBeer</code>. A typical example adding this dependency to the DI pipeline might look like this.</p>

<pre><code class="language-csharp">public void ConfigureServices(IServiceCollection services)  
{
    var beer = new TheBeer(); // then hydrate with configuration
    services.AddSingleton(beer)
            .AddMvc();
}
</code></pre>

<p><br> <br>
There are a number of ways to hydrate the <code>beer</code> variable with data from the "beerOfTheNow" configuration section. It should be fairly obvious that loading properties one at a time over the <code>TheBeer</code>/<code>Brewery</code> object graph is less than ideal. Better options might be the <code>IConfiguration</code> extension methods <code>Bind</code> or (the more elegant?) <code>Get</code>.</p>

<pre><code class="language-csharp">// NuGet: Microsoft.Extensions.Configuration.Binder
using Microsoft.Extensions.Configuration;

var beer1 = new TheBeer();  
configuration.GetSection("beerOfTheNow").Bind(beer1);

var beer2 = configuration.GetSection("beerOfTheNow").Get&lt;TheBeer&gt;();  
</code></pre>

<p><br> <br>
In both cases the <code>beer</code> variables (<code>beer1</code> and <code>beer2</code>) will be hydrated as expected based on available configuration. Properties with no matching configuration will have values set to their property type defaults, and configuration values that have no matching property will be ignored. Again, very resilient.</p>

<p>The last option I'll discuss utilizes features from the <a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/configuration/options?view=aspnetcore-2.2">Options pattern in ASP.NET Core</a>. This pattern covers quite a lot but I find the <code>IServiceCollection</code> extension method <code>Configure</code> particularly useful. The <code>Configure</code> method can be used instead of the <code>AddSingleton</code> to resolve configuration dependencies. The difference is that while <code>AddSingleton</code> resolves dependencies of type <code>T</code>, <code>Configure</code> resolves dependencies of <code>IOptions&lt;T&gt;</code>. For example, the constructor of the previously shown <code>TodoController</code> would need to be updated to accept an argument of type <code>IOptions&lt;TheBeer&gt;</code> instead of <code>TheBeer</code>.</p>

<pre><code class="language-csharp">public TodoController(IOptions&lt;TheBeer&gt; options)  
{
    Beer = options?.Value;
}
</code></pre>

<p><br> <br>
The <code>IOptions</code> interface is extremely simple and a default implementation is injected for you by the <code>Configure</code> method.</p>

<pre><code class="language-csharp">namespace Microsoft.Extensions.Options  
{
    public interface IOptions&lt;out TOptions&gt; where TOptions : class, new()
    {
        TOptions Value { get; }
    }
}
</code></pre>

<p><br> <br>
Both the <code>AddSingleton</code> and <code>Configure</code> extensions methods can be used to satisfy dependencies on configuration data.</p>

<pre><code class="language-csharp">// Resolve dependencies on TheBeer
services.AddSingleton(Configuration.GetSection("beerOfTheNow")?.Get&lt;TheBeer&gt;());

// Resolve dependencies on IOptions&lt;TheBeer&gt;
// NuGet: Microsoft.Extensions.Options.ConfigurationExtensions
using Microsoft.Extensions.DependencyInjection;

services.Configure&lt;TheBeer&gt;(Configuration.GetSection("beerOfTheNow"));  
</code></pre>

<p><br> <br>
I currently favor the <code>Configure</code> method to resolve configuration dependencies for the majority of .NET Core applications. However, there are MANY applications with complex configuration needs far outside the scope of the simple examples I've used here. For those, I'd say that the ASP.NET Core configuration system has an abundance of options for the developers who build and maintain such software. I'll leave those explorations to the reader...which likely means me.</p>

<p><strong>References</strong></p>

<ul>
<li><a href="https://developingdane.azurewebsites.net/net-core-configuration-deeper-dive/../net-core-configuration-files/">.NET Core Configuration Files</a> - Developing Dane, December, 2016</li>
<li><a href="https://docs.microsoft.com/en-us/aspnet/core/fundamentals/configuration/options?view=aspnetcore-2.2">Options pattern in ASP.NET Core 2.2</a>, Microsoft documentation </li>
<li><a href="https://github.com/DaneVinson/StupidTodo/tree/stupid-configs">Exploration source code</a> - stupid-configs branch of the <a href="https://github.com/DaneVinson/StupidTodo">StupidTodo</a> exploratory application's source code repository</li>
</ul>]]></content:encoded></item><item><title><![CDATA[An Early Exploration of Blazor]]></title><description><![CDATA[<p>My first exposure to Blazor was prior to it having a name. In  late 2017 I watched a recording of Steve Sanderson's NDC Oslo 2017 presentation <a href="https://www.youtube.com/watch?v=MiLAE6HMr10">Web Apps can’t really do that, can they?</a> He discussed four emerging web technologies being supported by the major browser manufacturers. One of</p>]]></description><link>https://developingdane.azurewebsites.net/blazor/</link><guid isPermaLink="false">66bc560b-9469-4c99-8eba-2945ab77f46a</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Thu, 08 Nov 2018 06:24:59 GMT</pubDate><media:content url="https://developingdane.com/content/images/2018/11/blazor.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://developingdane.com/content/images/2018/11/blazor.jpg" alt="An Early Exploration of Blazor"><p>My first exposure to Blazor was prior to it having a name. In  late 2017 I watched a recording of Steve Sanderson's NDC Oslo 2017 presentation <a href="https://www.youtube.com/watch?v=MiLAE6HMr10">Web Apps can’t really do that, can they?</a> He discussed four emerging web technologies being supported by the major browser manufacturers. One of those technologies was Web Assembly (WASM). In Steve's WASM demo he explored using ASP.NET Razor technology directly in the browser by compiling C# to Web Assembly using and open-source C# to C++ compiler. It was quite impressive and people's head's exploded, mine included. Shortly afterwards I discovered that Microsoft had initiated the experimental project "Blazor". Since then there's been a dearth of excitement. I finally decided to pull down bits and give Blazor as spin. For me that excitement became infectious.</p>

<h2 id="gettingstarted">Getting Started</h2>

<p>Blog posts and code snippets are plentiful for installing and setting up Blazor. The official <a href="https://blazor.net/docs/get-started.html">Get Started</a> page covers the basics so I won't duplicate that here. A note to Windows users who routinely make use of both Visual Studio (VS) and the dotnet CLI, be sure to install the latest packages for both. You'd think that the VS install would subsume the dotnet CLI but in my experience that has not been the case.</p>

<p>After you have the Blazor SDK installed you'll find new Blazor templates in both VS (new project Web -> ASP.NET Core Web Application) and the dotnet CLI (<code>dotnet new</code>). Newly available project types are:</p>

<ul>
<li><code>Blazor (hosted in ASP.NET server)</code> - ASP.NET project which provides a Web API to a dependent Blazor client application.</li>
<li><code>Blazor Library</code> - .NET Standard library project which aids in interfacing with JavaScript components.</li>
<li><code>Blazor (Server-side in ASP.NET Core)</code> - ASP.NET project which hosts both service and client applications providing real-time client-side HTML to the browser using <a href="https://www.asp.net/signalr">SignalR</a>.</li>
<li><code>Blazor (standalone)</code> - .NET Standard project which produces only the client-side Blazor, i.e. WASM.</li>
</ul>

<p>In this exploration of Blazor I utilized the <code>Blazor (standalone)</code> project template to create a browser client which had feature parity with my <a href="https://github.com/DaneVinson/StupidTodo/tree/vue">Vue.js Stupid Todo client project</a>. The result is the <a href="https://github.com/DaneVinson/StupidTodo/tree/blazor">Blazor Stupid Todo client project</a>.</p>

<h2 id="codingwithblazor">Coding with Blazor</h2>

<p>Anyone familiar with ASP.NET Razor syntax will feel relatively at home working with Blazor. I've heard Daniel Roth say "Blazor is Razor in the browser" and that's essentially correct. The syntax is almost identical, but with Blazor page rendering occurs directly in the browser, eliminating the need for a round-trip to the server. With Blazor it's a real possibility that .NET developers will soon be able to build browser applications rivaling those created with JavaScript frameworks such as Angular and React.</p>

<p>At this point development patterns for Blazor are in their infancy and Microsoft's best (only) sample application is <a href="https://github.com/aspnet/samples/tree/master/samples/aspnetcore/blazor">Flight Finder</a>. Flight Finder makes use of a singleton instance of the class <code>AppState</code> to handle various application concerns. To me their use of the <code>AppState</code> class is reminiscent of a ViewModel in the Model/View/ViewModel (MVVM) pattern. The MVVM pattern has historically been applied to XAML frameworks (WPF, Xamarin, Silverlight) but to me it seems like it may be a reasonable fit with Blazor as well. I decided to explore this idea further.</p>

<p>A partial (and simplified) implementation of the MVVM pattern in Blazor applied to the Stupid Todo application has a Model (Todo.cs), View (Todo.cshtml) and ViewModel (TodoViewModel.cs).</p>

<p>Todo.cs (Model)  </p>

<pre><code class="language-csharp">public class Todo  
{
    public string Description { get; set; }
    public bool Done { get; set; }
    public Guid Id { get; set; }
}
</code></pre>

<p><br> <br>
Todo.cshtml (View)  </p>

<pre><code class="language-csharp">@inject TodoViewModel ViewModel

&lt;div&gt;  
    &lt;button onclick=@(async () =&gt; await ViewModel.CheckDoneAsync())&gt;Complete&lt;/button&gt;
    &lt;input type="text" bind=@ViewModel.Todo.Description /&gt;
    &lt;button onclick=@(async () =&gt; await ViewModel.SaveAsync())&gt;Save&lt;/button&gt;
&lt;/div&gt;

@functions
{
    [Parameter]
    private Todo SourceTodo { get; set; }

    protected override void OnInit()
    {
        ViewModel.Load(SourceTodo);
    }
}
</code></pre>

<p><br> <br>
TodoViewModel.cs (ViewModel)  </p>

<pre><code class="language-csharp">public class TodoViewModel  
{    
    public TodoViewModel(HttpClient httpClient)
    {
        HttpClient = httpClient;
    }

    public async Task CheckDoneAsync()
    {
        Todo.Done = true;
        await SaveAsync();
    }

    public void Load(Todo todo)
    {
        Todo = todo;
    }

    public async Task SaveAsync()
    {
        await HttpClient.PutJsonAsync($"https://stupidtodo-api.azurewebsites.net/api/{Todo.Id}", Todo);
    }

    public Todo Todo { get; set; }  
    private readonly HttpClient HttpClient;
}
</code></pre>

<p><br> <br>
In the <code>ConfigureServices</code> method of Startup.cs we wire up the needed dependencies.  </p>

<pre><code class="language-csharp">services.AddSingleton&lt;TodoViewModel&gt;()  
        .AddSingleton&lt;HttpClient&gt;(new HttpClient());
</code></pre>

<p><br> <br>
The Todo component can then be leveraged in a parent component as a simple element in the .cshtml.  </p>

<pre><code class="language-csharp">&lt;Todo SourceTodo="@SelectedTodo" /&gt;  
</code></pre>

<p><br> <br>
In this way the parent component can create a rendered Todo (Model) by passing an instance of a Todo to the child component as a <code>Parameter</code>. The <code>OnInit</code> method of the child View component calls the ViewModel's <code>Load</code> method passing it the View's <code>Parameter</code> Todo. All bindings in the View reference the ViewModel and everything just works. Excellent!</p>

<p>One caveat: two-way bindings in Blazor already work fairly well, but binding updates for inputs fire only when the input loses focus, not on each change. Steve Sanderson has said the solution will be to utilize <code>bind-value-oninput</code> instead of <code>bind</code> in the .cshtml but as of 0.6.0 that path is non-functional.</p>
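<p>To illustrate (view markup only; the second form is the planned syntax and, again, does not yet work in 0.6.0):</p>

<pre><code class="language-csharp">&lt;!-- today: Description updates when the input loses focus --&gt;
&lt;input type="text" bind=@ViewModel.Todo.Description /&gt;

&lt;!-- planned: Description updates on every keystroke (non-functional in 0.6.0) --&gt;
&lt;input type="text" bind-value-oninput=@ViewModel.Todo.Description /&gt;
</code></pre>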

<h2 id="conclusions">Conclusions</h2>

<p>As of this post Blazor is roughly a year old and the latest version is 0.6.0. Already there are dozens of excellent articles, blog posts and videos on the technology. Enthusiasm in the .NET community is very high and Blazor looks to have the momentum to move towards production. At the moment Blazor is as rough as you might expect a pre-release framework to be. You'll encounter missing features and nagging tooling issues with any significant development effort you undertake. That said, what Blazor has accomplished in its relatively short existence is extremely impressive. Since these explorations I've found myself pondering whether or not I'll ever voluntarily create a JavaScript client again. That's fairly ironic considering my last post <a href="https://developingdane.azurewebsites.net/blazor/../vue-js/">Vue.js</a>, in which I sing that framework's praises.</p>

<h2 id="future">Future</h2>

<p>Several years ago I posted <a href="https://developingdane.azurewebsites.net/blazor/../how_html5_destroyed_my_csla_dream/">How HTML5 destroyed my CSLA dream</a>. For quite some time I've been contemplating a new post along the lines of "How Microsoft Rescued my CSLA dream". My plan had been to discuss how Xamarin and the future release of XAML Standard could potentially restore my dream of writing end-to-end, multi-layered and multi-tiered applications using a single set of semantics (language). Until now I've been under the assumption that XAML would serve as the client-side markup in this new landscape. Blazor has me considering a whole new set of possibilities and I'm pretty excited about the potential.</p>

<p><strong>References</strong></p>

<ul>
<li><a href="https://blazor.net/">Blazor</a>, Blazor home</li>
<li><a href="https://github.com/DaneVinson/StupidTodo/tree/blazor">Stupid Todo Blazor</a>, Blazor branch of the StupidTodo exploratory application's source code repository</li>
<li><a href="https://www.youtube.com/watch?v=MiLAE6HMr10">Web Apps can’t really do that, can they?</a>, Steve Sanderson, NDC Oslo 2017, video</li>
<li><a href="http://www.lhotka.net/weblog/CSLARunningOnBlazor.aspx">CSLA running on Blazor</a>, Rockford Lhotka, blog post</li>
<li><a href="https://channel9.msdn.com/Events/dotnetConf/2018/S207">Blazor: Modern Web development with .NET and WebAssembly</a>, Daniel Roth, .NET Conf 2018, video</li>
<li><a href="https://stupidtodo-blazor.azurewebsites.net/">Stupid Todo Client, Blazor</a>, live instance of the Blazor Stupid Todo client</li>
<li><a href="https://stupidtodo-client-vue.azurewebsites.net/">Stupid Todo Client, Vue.js</a>, live instance of the Vue.js Stupid Todo client</li>
</ul>]]></content:encoded></item><item><title><![CDATA[Vue.js]]></title><description><![CDATA[<p>Several months ago I found myself contemplating the creation of yet another HTML client, this time to help in prototyping a larger application concept I was working on. I wanted the client to be responsive and I didn't want to put a lot of work into it.</p>

<p>There's a wide</p>]]></description><link>https://developingdane.azurewebsites.net/vue-js/</link><guid isPermaLink="false">ea2a0ca3-fada-4f14-b8d9-96278fecfa81</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Sun, 09 Sep 2018 04:34:50 GMT</pubDate><media:content url="https://developingdane.com/content/images/2018/09/vuejs.png" medium="image"/><content:encoded><![CDATA[<img src="https://developingdane.com/content/images/2018/09/vuejs.png" alt="Vue.js"><p>Several months ago I found myself contemplating the creation of yet another HTML client, this time to help in prototyping a larger application concept I was working on. I wanted the client to be responsive and I didn't want to put a lot of work into it.</p>

<p>There's a wide variety of excellent frameworks to choose from when building HTML clients. Frameworks such as <a href="https://angular.io/">Angular</a>, <a href="https://reactjs.org/">React</a> and <a href="https://aurelia.io/">Aurelia</a> offer rich feature sets for building amazing web applications but all of them come at a cost, e.g. npm, webpack, development web servers, etc. I wanted something simpler.</p>

<p>In searching for simple HTML client application frameworks I quickly encountered <a href="https://vuejs.org/">Vue.js</a>. It was exactly what I was looking for. Vue requires only a single .js file reference for the base framework.  </p>

<pre><code class="language-html">&lt;script src="{path}/vue.js"&gt;&lt;/script&gt;  
</code></pre>

<p>The addition of a second file is needed for interactions with HTTP services.  </p>

<pre><code class="language-html">&lt;script src="{path}/vue-resource.js"&gt;&lt;/script&gt;  
</code></pre>

<p>With these two references in place, the ubiquitous <a href="https://getbootstrap.com/">Bootstrap</a> and application specific <code>app.js</code> and <code>index.html</code> files an impressively full-featured client application can be created with a few dozen lines of code. </p>
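<p>As an illustrative sketch (this is not code from the Stupid Todo app; it assumes Vue 2 loaded via the script reference above), a complete, working page can be this small:</p>

<pre><code class="language-html">&lt;div id="app"&gt;
  &lt;input v-model="description" placeholder="What needs doing?" /&gt;
  &lt;button v-on:click="add"&gt;Add&lt;/button&gt;
  &lt;ul&gt;
    &lt;li v-for="todo in todos"&gt;{{ todo }}&lt;/li&gt;
  &lt;/ul&gt;
&lt;/div&gt;
&lt;script&gt;
  // Vue takes over the #app element; v-model keeps 'description' in sync with the input.
  new Vue({
    el: "#app",
    data: { description: "", todos: [] },
    methods: {
      add: function () {
        if (this.description) {
          this.todos.push(this.description);
          this.description = "";
        }
      }
    }
  });
&lt;/script&gt;
</code></pre>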

<p>Vue can be used to create large-scale, componentized client applications rivaling anything you might see in other frameworks, but it also allows for ultra-rapid creation of basic HTML client applications. In this respect Vue excels over most of its rivals.</p>

<p>I used Vue to create a simple HTML client for my <a href="https://github.com/DaneVinson/StupidTodo">Stupid Todo</a> exploratory application (<a href="https://stupidtodo-client-vue.azurewebsites.net/">live demo</a>). For that application the combined line count of the <code>app.js</code> and <code>index.html</code> files is ~180, a fairly impressive illustration of what can be accomplished in Vue with so little written code.</p>

<p>I created a short video discussing Vue and demonstrating the Vue client for Stupid Todo.  </p>

<iframe width="560" height="315" src="https://www.youtube.com/embed/bRuPCYfYO8Y" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe>

<p>If you're like me and occasionally have the need to create simple HTML client applications I recommend you give Vue.js a look.</p>

<p><strong>References</strong></p>

<ul>
<li><a href="https://vuejs.org/">Vue.js</a>, Vue.js home</li>
<li><a href="https://github.com/DaneVinson/StupidTodo">Stupid Todo</a>, GitHub repo</li>
<li><a href="https://stupidtodo-client-vue.azurewebsites.net/">Stupid Todo Vue</a>, live Vue client for the Stupid Todo exploratory application (allow time to warm-up the free-tier Azure Web App)</li>
<li><a href="https://johnpapa.net/vue-typescript/">Vue.js with TypeScript</a>, blog entry by John Papa</li>
</ul>]]></content:encoded></item><item><title><![CDATA[My year with a MacBook]]></title><description><![CDATA[<p>I'm primarily a .NET developer but for a year I worked exclusively on a MacBook using Boot Camp to run Windows 10.</p>

<p>Like many software companies the one I work for provides its development staff with options for their working computers. My options were a MacBook Pro or a Dell.</p>]]></description><link>https://developingdane.azurewebsites.net/my-year-with-a-macbook/</link><guid isPermaLink="false">0f38e639-8a0d-42c6-a30f-bbcdeae1b8ff</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Fri, 20 Jul 2018 04:10:23 GMT</pubDate><media:content url="https://developingdane.com/content/images/2018/07/apple-windows.png" medium="image"/><content:encoded><![CDATA[<img src="https://developingdane.com/content/images/2018/07/apple-windows.png" alt="My year with a MacBook"><p>I'm primarily a .NET developer but for a year I worked exclusively on a MacBook using Boot Camp to run Windows 10.</p>

<p>Like many software companies the one I work for provides its development staff with options for their working computers. My options were a MacBook Pro or a Dell. I'd used three different Dell laptops in my career and I disliked every one of them so I went with the MacBook. I've used OSX in the past and I despised it. I had no interest in using it as my primary OS so I ran Windows 10 on the MacBook hardware using Apple's excellent Boot Camp. Here are the pros and cons I encountered after a year of dedicated use with a MacBook running Windows.</p>

<p>Pros</p>

<ul>
<li>Physically solid. The MacBook just feels well constructed. Every aspect of its design indicates precision and quality.</li>
<li>Amazing screen quality. The MacBook has a Retina Display screen. Its screen quality is well beyond anything I've ever experienced with a laptop.</li>
<li>Good hardware resources. Intel i7 (4 core), 16 GB RAM and 1 TB SSD</li>
<li>Attention to detail. Apple's design details are top notch. The MacBook has many elegant and functional features, e.g. the small bevel on the top edge of the screen frame which allows the user to easily open a closed laptop.</li>
</ul>

<p>Cons</p>

<ul>
<li>Hardware virtualization issues. The process of getting hardware virtualization working with Boot Camp Windows 10 is painful. Unfortunately, hardware virtualization is required to run Docker for Windows. This impediment to running Docker is what finally drove me to give up the MacBook.</li>
<li>Touchpad too "grab" happy. I routinely grabbed files/folders I'd only meant to select. This caused me many headaches with accidentally moved files. </li>
<li>Screen marring. I cleaned my screen with distilled water and a micro-fiber cloth and still it was permanently marred. I discovered this is not an uncommon issue with the model of MacBook I had.  </li>
<li>Power button placement. The MacBook power button is simply the top-right button on the keyboard. Every laptop I'd ever used had a power button that was separate from the keyboard proper. I can't count the number of times I meant to hit delete and instead shut down my machine. It still occasionally happened after a full year.</li>
<li>Battery life was not good. When new I'd get ~4 hours of battery life while coding. That dropped over the year to less than 3 hours.</li>
<li>Remote desktop issues run amok. RDP always had DPI issues and connection issues requiring reboot to solve.</li>
</ul>

<p>I enjoyed much about the MacBook in the year I used it. I think Apple did a fairly tremendous job with Boot Camp and if I'd had a more reliable experience with Docker this post might be something very different. In the end I always felt a little like I was paddling upstream trying to run Windows on a MacBook. I guess that's not really a surprising conclusion. Perhaps someday Boot Camp will provide a more complete experience for Windows on a MacBook. For now...I've accepted a Dell as my primary work computer. For me that about says it all.</p>]]></content:encoded></item><item><title><![CDATA[F# / C# Project Comparison]]></title><description><![CDATA[<p>My first look into F# was in 2011 when I read <a href="http://shop.oreilly.com/product/9780596153656.do">Programming F#</a>. I enjoyed working through that book and I realized that F# had opened the door to functional programming in .NET but at the time C# was vastly superior in both available information and developer experience. I set</p>]]></description><link>https://developingdane.azurewebsites.net/fsharp-csharp-compare/</link><guid isPermaLink="false">4284e6a4-3c55-472c-9b0b-20414741a5cd</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Sun, 18 Mar 2018 00:30:43 GMT</pubDate><media:content url="https://developingdane.com/content/images/2018/03/spy-vs-spy.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://developingdane.com/content/images/2018/03/spy-vs-spy.jpg" alt="F# / C# Project Comparison"><p>My first look into F# was in 2011 when I read <a href="http://shop.oreilly.com/product/9780596153656.do">Programming F#</a>. I enjoyed working through that book and I realized that F# had opened the door to functional programming in .NET but at the time C# was vastly superior in both available information and developer experience. I set F# aside.</p>

<p>Recently I helped interview candidates applying for a principal software engineering position. One candidate (now employee) was very enthusiastic about functional programming in general and F# in particular. His enthusiasm rekindled my interest. </p>

<p>I started looking into the current state of F# and quickly discovered the previously mentioned gaps had closed significantly. Microsoft has made a commitment to F# as a first-class language and its projects are now beautifully interoperable with the majority of other .NET project types. I decided I needed to re-explore F#.</p>

<h3 id="fexplorationproject">F# Exploration Project</h3>

<p>For my second dive into F# I decided to use <a href="https://github.com/dustinmoris/Giraffe">Giraffe</a> to recreate a web API I'd previously written in C#. Giraffe describes itself as "A native functional ASP.NET Core web framework for F# developers." My <a href="https://github.com/DaneVinson/TrackIt">TrackIt</a> exploratory application includes the ASP.NET Core Web API C# project <a href="https://github.com/DaneVinson/TrackIt/tree/master/TrackIt.Service.CoreWebApi">TrackIt.Services.CoreWebApi</a>. I used that project as the feature parity target for a new F# project. <a href="https://github.com/DaneVinson/TrackIt/tree/master/TrackIt.Service.GiraffeWebApi">TrackIt.Services.GiraffeWebApi</a> is the result.</p>

<h3 id="projectcomparison">Project Comparison</h3>

<p>When complete the Giraffe implementation was a feature match for the one I'd previously created in C#. The following gives a sense of the differences between C# and F# by showing the in-code handling of an HTTP DELETE to /datapoints/{id}.</p>

<p>C#  </p>

<pre><code class="language-csharp">public async Task&lt;IActionResult&gt; DeleteDataPointAsync(string id)  
{
    var result = await Manager.DeleteAsync(id);
    if (result.Success) { return Ok(result.Value); }
    else { return BadRequest(result); }
}
</code></pre>

<p>F#  </p>

<pre><code class="language-fsharp">let deleteDataPoint id =  
    fun (next : HttpFunc) (ctx : HttpContext) -&gt;
        task {
            use manager = getDataPointManager configuration
            let! result = manager.DeleteAsync(id)
            if (result.Success) then return! jsonCamelCase true next ctx
            else return! (setStatusCode 400 &gt;=&gt; jsonCamelCase result) next ctx
        }
</code></pre>

<p>When I was done with the Giraffe project I realized I had the opportunity to quantitatively compare feature parity code from two different languages. The metrics I decided upon for this comparison were file count, line count and significant character count.  </p>

<hr>  

<h5 id="method">Method</h5>

<p>In each project only source code files were considered, i.e. .cs/.fs for the C#/F# projects respectively. Files were parsed for line count and significant characters. All blank lines and comments were excluded in parsing and significant character counts were found by trimming lines of spaces and removing tabs.</p>
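<p>The parsing can be sketched in C# as follows. This is my illustrative reconstruction, not the code actually used for the comparison, and for brevity it only recognizes line comments.</p>

<pre><code class="language-csharp">// Illustrative sketch: counts non-blank, non-comment lines and their
// significant characters (lines trimmed of spaces, tabs removed).
public static void Measure(string[] sourceLines, out int lineCount, out int charCount)
{
    lineCount = 0;
    charCount = 0;
    foreach (var raw in sourceLines)
    {
        var line = raw.Trim().Replace("\t", "");
        // Skip blank lines and line comments (block comments omitted for brevity).
        if (line.Length == 0 || line.StartsWith("//")) { continue; }
        lineCount++;
        charCount += line.Length;
    }
}
</code></pre>

<p>For example, for the four lines <code>"  let x = 1"</code>, <code>""</code>, <code>"// comment"</code> and <code>"\tx + 1  "</code> this yields a line count of 2 and a significant character count of 14.</p>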

<h5 id="results">Results</h5>

<table style="width:100%">  
  <tr>
    <th>Metric</th>
    <th>F#</th>
    <th>C#</th> 
    <th>F#/C#</th>
  </tr>
  <tr>
    <td>Files</td>
    <td>6</td>
    <td>7</td> 
    <td>0.86</td>
  </tr>
  <tr>
    <td>Lines</td>
    <td>219</td>
    <td>312</td> 
    <td>0.70</td>
  </tr>
  <tr>
    <td>Significant Characters</td>
    <td>8376</td>
    <td>9123</td> 
    <td>0.92</td>
  </tr>
</table>

<h5 id="discussion">Discussion</h5>

<p>The Files metric shows 14% fewer files in the F# project. However, the difference is a single file; for so small a sample no significant conclusions should be drawn from this discrepancy.</p>

<p>The Lines metric shows 30% fewer lines in the F# project. That's a significant number largely explained by C#'s use of the curly brace coupled with my personal coding standards, i.e. C# curly braces get their own lines.</p>

<p>The Significant Characters metric shows 8% fewer significant characters in the F# project. This difference seems a bit underwhelming but I'd argue that 8% less total code in a large application is very significant.  </p>

<hr>  

<h3 id="finalthoughts">Final Thoughts</h3>

<p>During this process I've found myself again drawn to the terse and tightly constrained nature of the F# language. The support the language now enjoys, especially with .NET Core 2.0, opens the door to a staggering number of possibilities. I can say with 100% confidence it won't be another 6 years before I find myself working in an F# project.</p>

<h3 id="references">References</h3>

<ul>
<li><a href="https://github.com/DaneVinson/TrackIt">TrackIt source code</a> (GitHub project)</li>
<li><a href="https://github.com/giraffe-fsharp/Giraffe">Giraffe</a> (GitHub project)</li>
<li><a href="https://www.youtube.com/watch?v=HyRzsPZ0f0k">Getting Started with ASP.NET Core Giraffe</a>, Ody Mbegbu (Video)</li>
<li><a href="https://www.hanselman.com/blog/AFunctionalWebWithASPNETCoreAndFsGiraffe.aspx">A Functional Web with ASP.NET Core and F#'s Giraffe</a>, Scott Hanselman (Blog entry)</li>
<li><a href="https://developingdane.azurewebsites.net/fsharp-csharp-compare/../my-latest-exploratory-application-trackit/">My latest exploratory application, TrackIt</a>, Developing Dane (Blog entry)</li>
</ul>]]></content:encoded></item><item><title><![CDATA[My Latest Exploratory Application, TrackIt]]></title><description><![CDATA[<p>Every couple of years I settle on the general design for a full-stack application which I then use to explore various technologies and architectural patterns. I refer to these as my exploratory applications. In my 2014 post <a href="https://developingdane.com/recipe_box_is_live/">RecipeBox is Live</a> I discuss the public release of one such application. </p>

<p>A</p>]]></description><link>https://developingdane.azurewebsites.net/my-latest-exploratory-application-trackit/</link><guid isPermaLink="false">39d137ba-d287-43c3-a449-57ebe7a9e389</guid><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Sun, 05 Nov 2017 06:05:16 GMT</pubDate><content:encoded><![CDATA[<p>Every couple of years I settle on the general design for a full-stack application which I then use to explore various technologies and architectural patterns. I refer to these as my exploratory applications. In my 2014 post <a href="https://developingdane.com/recipe_box_is_live/">RecipeBox is Live</a> I discuss the public release of one such application. </p>

<p>A short time ago I made public my latest exploratory application, <a href="https://github.com/DaneVinson/TrackIt">TrackIt</a>. In conjunction with the code release I created an <a href="https://www.youtube.com/watch?v=2fLqZI1M-fY">8 minute walk-through video</a>. This is the first such video I've ever made. It's not perfect but I'm fairly satisfied with the results.</p>

<iframe width="560" height="315" src="https://www.youtube.com/embed/2fLqZI1M-fY" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe>

<p>I continue to evolve TrackIt. Currently I'm finishing up a service implementation written in F# using <a href="https://github.com/giraffe-fsharp/Giraffe">Giraffe</a>. Perhaps more on that later.</p>]]></content:encoded></item><item><title><![CDATA[Getting Squirrelly...again]]></title><description><![CDATA[<h2 id="previously">Previously</h2>

<p>I first investigated <a href="https://github.com/Squirrel/Squirrel.Windows">Squirrel.Windows</a> as a technology for Windows application deployment two years ago during a search for an alternative to <a href="https://msdn.microsoft.com/en-us/library/142dbbz4(v=vs.90).aspx">ClickOnce</a> (<a href="https://developingdane.azurewebsites.net/click_once_for_my_2_cents/">my 2 cents</a>). I documented some of those efforts in my post <a href="https://developingdane.com/click_once_squirrel_and_nuts/">ClickOnce, Squirrel and Nuts</a>. In that post I expressed a very favorable opinion of</p>]]></description><link>https://developingdane.azurewebsites.net/squirrel-and-nuts-the-clickonce-free-sequel-2/</link><guid isPermaLink="false">db068422-f5fa-4774-b0fb-22fcf4a0e93a</guid><category><![CDATA[Squirrel]]></category><category><![CDATA[Windows Application Deployment]]></category><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Sat, 05 Aug 2017 02:18:57 GMT</pubDate><media:content url="https://developingdane.com/content/images/2017/08/squirrel2.jpg" medium="image"/><content:encoded><![CDATA[<h2 id="previously">Previously</h2>

<img src="https://developingdane.com/content/images/2017/08/squirrel2.jpg" alt="Getting Squirrelly...again"><p>I first investigated <a href="https://github.com/Squirrel/Squirrel.Windows">Squirrel.Windows</a> as a technology for Windows application deployment two years ago during a search for an alternative to <a href="https://msdn.microsoft.com/en-us/library/142dbbz4(v=vs.90).aspx">ClickOnce</a> (<a href="https://developingdane.azurewebsites.net/click_once_for_my_2_cents/">my 2 cents</a>). I documented some of those efforts in my post <a href="https://developingdane.com/click_once_squirrel_and_nuts/">ClickOnce, Squirrel and Nuts</a>. In that post I expressed a very favorable opinion of Squirrel but I also noted some areas of dissatisfaction in the discussion and conclusions.</p>

<ul>
<li>Latest version of the deployed application was not always run</li>
<li>Cleanup of previous application versions wasn't working</li>
<li>Instability with <code>releasify</code> command</li>
<li>Reliance on historical <code>.nupkg</code> files from the <code>Releases</code> folder</li>
</ul>

<h2 id="lately">Lately</h2>

<p>Recently I again went in search of ClickOnce replacement options. It quickly became apparent that not much had changed. Squirrel still looked to me to be the best available alternative so I took another dive. I examined the state of the issues previously discussed then developed a basic strategy for controlling Squirrel deployed releases.</p>

<h6 id="latestversionissue">Latest Version Issue</h6>

<p>The first issue can now be resolved through the use of a simple static method on the class <code>Squirrel.UpdateManager</code>.  </p>

<pre><code class="language-csharp">UpdateManager.RestartApp("{app_exe_name}");  
</code></pre>

<p>This works quite well and if added to an application's start-up, after a Squirrel update but before the rendering of the view, users will always be presented with the latest version of the application. Very nice.</p>

<h6 id="cleanupissue">Cleanup Issue</h6>

<p>Application cleanup was still an issue when I started evaluating Squirrel the second time. In looking through GitHub issues for Squirrel.Windows I found others had the same problem so I attempted to debug it myself. I discovered that this <a href="https://github.com/Squirrel/Squirrel.Windows/issues/1024">issue</a> was a trivial defect. It has since been resolved. Squirrel now cleans older versions of the deployed application leaving only the current and one previous.</p>

<h6 id="releasifyinstabilityissue">Releasify Instability Issue</h6>

<p>This issue still exists, however, occurrences have GREATLY diminished. Previously the <code>releasify</code> command would stop working (requiring a Visual Studio restart) every 3-5 executions. During the second evaluation I encountered this issue exactly two times in roughly 100 executions.</p>

<h6 id="historicalnupkgfilesissue">Historical .nupkg Files Issue</h6>

<p>In my first post I noted the <code>releasify</code> command relies on the output of the previous execution in order to generate delta packages. My concern was this would necessitate source control for binary <code>.nupkg</code> files. That would mean source control of binary files created by compiling code files stored in the same source control project. That's far from ideal. </p>

<p>Over the course of the second evaluation I developed a simple strategy to build and manage Squirrel deployment releases. This strategy addressed my concerns related to historical <code>.nupkg</code> files.</p>

<h2 id="squirreldeploymentsreleasestrategy">Squirrel Deployments Release Strategy</h2>

<p>I developed the following strategy to provide controlled releases to Squirrel deployments. It assumes access to both a source control system and some form of shared file (binary) storage. Ideally both should be covered under some type of backup policy.</p>

<h5 id="setupandinitialrelease">Setup and Initial Release</h5>

<p>As described in Squirrel's <a href="https://github.com/Squirrel/Squirrel.Windows/tree/master/docs/getting-started">Getting Started Guide</a> the <code>Squirrel.Windows</code> NuGet package should be installed in the start-up project of your Windows application. </p>

<p>In the static start-up of your application add code to check for available updates, apply any updates found and restart the application if updates were applied. All this should be performed prior to rendering the display. This ensures the user is always presented with the latest version of the application. The following code sample illustrates this method for a Windows Forms Application.  </p>

<pre><code class="language-csharp">static class Program  
{
    [STAThread]
    static void Main()
    {
        Task.Run(() =&gt; CheckAndApplyUpdate()).GetAwaiter().GetResult();
        Application.EnableVisualStyles();       
        Application.SetCompatibleTextRenderingDefault(false);
        Application.Run(new Form1());
    }

    public static async Task CheckAndApplyUpdate()
    {
        bool updated = false;
        using (var updateManager = new UpdateManager(URI))
        {
            var updateInfo = await updateManager.CheckForUpdate();
            if (updateInfo.ReleasesToApply != null &amp;&amp; 
                updateInfo.ReleasesToApply.Count &gt; 0)
            {
                var releaseEntry = await updateManager.UpdateApp();
                updated = true;
            }
        }
        if (updated) { UpdateManager.RestartApp(EXE_NAME); }
    }

    private const string EXE_NAME = "WindowsFormsApp2.exe";
    private const string URI = "{URI to Releases folder}";
}
</code></pre>

<p>Add a <code>{project}.nuspec</code> file to the start-up project (be sure it's also added to source control). Include <code>file</code> entries only for the assemblies that are necessary to run the application through the Squirrel update and restart. Exclude all assemblies not required to get to that point. This ensures that the <code>setup.exe</code> file created by the initial <code>releasify</code> command will be as small as possible. Following is sample content for a <code>.nuspec</code> file's <code>package</code> element. Note: The <code>description</code> element is required and its value is used by Windows search features to find the application.  </p>

<pre><code class="language-markup">&lt;metadata&gt;  
  &lt;id&gt;WindowsFormsApp2&lt;/id&gt;
  &lt;version&gt;1.0.0&lt;/version&gt;
  &lt;authors&gt;Dane Vinson&lt;/authors&gt;
  &lt;requireLicenseAcceptance&gt;false&lt;/requireLicenseAcceptance&gt;
  &lt;description&gt;WindowsFormsApp2&lt;/description&gt;
&lt;/metadata&gt;  
&lt;files&gt;  
  &lt;file src=".\bin\Release\WindowsFormsApp2.exe" 
        target="lib\net45\WindowsFormsApp2.exe" /&gt;
  &lt;file src=".\bin\Release\WindowsFormsApp2.exe.config" 
        target="lib\net45\WindowsFormsApp2.exe.config" /&gt;
  &lt;file src=".\bin\Release\DeltaCompressionDotNet.dll" 
        target="lib\net45\DeltaCompressionDotNet.dll" /&gt;
  &lt;file src=".\bin\Release\DeltaCompressionDotNet.MsDelta.dll" 
        target="lib\net45\DeltaCompressionDotNet.MsDelta.dll" /&gt;
  &lt;file src=".\bin\Release\DeltaCompressionDotNet.PatchApi.dll" 
        target="lib\net45\DeltaCompressionDotNet.PatchApi.dll" /&gt;
  &lt;file src=".\bin\Release\Mono.Cecil.dll" 
        target="lib\net45\Mono.Cecil.dll" /&gt;
  &lt;file src=".\bin\Release\NuGet.Squirrel.dll" 
        target="lib\net45\NuGet.Squirrel.dll" /&gt;
  &lt;file src=".\bin\Release\SharpCompress.dll" 
        target="lib\net45\SharpCompress.dll" /&gt;
  &lt;file src=".\bin\Release\Splat.dll" 
        target="lib\net45\Splat.dll" /&gt;
  &lt;file src=".\bin\Releas\Squirrel.dll" 
        target="lib\net45\Squirrel.dll" /&gt;
&lt;/files&gt;  
</code></pre>

<p>Create the initial release</p>

<ol>
<li>Build the start-up project  </li>
<li>Create <code>.nupkg</code> from <code>.nuspec</code>  </li>
<li><code>releasify</code> the <code>.nupkg</code>  </li>
<li>Commit changes to <code>.nuspec</code> file</li>
</ol>
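<p>Concretely, steps 1 through 3 can be run from a developer command prompt. The following sketch is illustrative only: it assumes the NuGet CLI is on the path and that the Squirrel tools sit in the usual NuGet <code>packages</code> folder; the tool version and package version are placeholders.</p>

<pre><code class="language-bash">:: 1. Build the start-up project (Release configuration)
msbuild WindowsFormsApp2.csproj /p:Configuration=Release

:: 2. Create the .nupkg from the .nuspec
nuget pack WindowsFormsApp2.nuspec

:: 3. Releasify the package; this creates (or updates) the Releases folder
.\packages\squirrel.windows.1.7.8\tools\Squirrel.exe --releasify WindowsFormsApp2.1.0.0.nupkg
</code></pre>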

<p>The initial "minified" <code>setup.exe</code> file should be deployed to a location accessible to any potential application users. Designed correctly, this file can serve as the application's primary installer for its lifetime.</p>

<p>The first change to the application should be to add <code>file</code> entries to the <code>.nuspec</code> file for all remaining application dependencies. This should be considered a release. For this and all future releases proceed with the steps below.</p>

<h5 id="createnewrelease">Create New Release</h5>

<p>Creation of Squirrel deployment releases should follow these steps.</p>

<ol>
<li>Commit changes for the release  </li>
<li>Delete <code>Releases</code> from the build-time environment (if it exists)  </li>
<li>Copy <code>Releases</code> from the shared location to the build-time environment  </li>
<li>Increment the <code>version</code> element in the <code>.nuspec</code> file  </li>
<li>Build the start-up project  </li>
<li>Create <code>.nupkg</code> from <code>.nuspec</code>  </li>
<li><code>releasify</code> the <code>.nupkg</code>  </li>
<li>Commit changes to <code>.nuspec</code>  </li>
<li>Copy newest <code>*.nupkg</code> and <code>RELEASES</code> files from build-time <code>Releases</code> folder to the shared <code>Releases</code> location</li>
</ol>
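<p>These steps lend themselves to a simple batch script. The sketch below is illustrative only; the share path, project names and Squirrel tools location are assumptions, and the <code>version</code> bump (step 4) is still done by hand in the <code>.nuspec</code>.</p>

<pre><code class="language-bash">:: Refresh the build-time Releases folder from the shared location
rmdir /s /q Releases
robocopy \\share\MyApp\Releases .\Releases

:: Build, pack and releasify (after incrementing version in the .nuspec)
msbuild WindowsFormsApp2.csproj /p:Configuration=Release
nuget pack WindowsFormsApp2.nuspec
.\packages\squirrel.windows.1.7.8\tools\Squirrel.exe --releasify WindowsFormsApp2.1.0.1.nupkg

:: Publish the newest package and the RELEASES index back to the share
robocopy .\Releases \\share\MyApp\Releases RELEASES *.nupkg
</code></pre>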

<p>If for any reason the shared <code>Releases</code> repository is lost, simply re-create an empty <code>Releases</code> folder at the shared location and repeat the previous steps. The next time users run the application they'll have to download all files, but the application will continue to function correctly. Even a fresh install with the original "minified" <code>setup.exe</code> will correctly update to the latest version of the application. I've found Squirrel to be quite resilient with respect to historical <code>.nupkg</code> files.</p>

<h2 id="conclusionsthesequel">Conclusions...the Sequel</h2>

<p>Once again I've been impressed with Squirrel. As an application updater it's matched or surpassed ClickOnce in almost every way. The one remaining ClickOnce feature that Squirrel does not provide is the ability to run the application directly from a browser. I didn't discuss this in my previous post because I felt it fell beyond Squirrel's intent. That said, there are numerous benefits to that feature, not the least of which is that it eliminates the need for an actual "installer" (e.g. <code>setup.exe</code>).</p>

<h2 id="finally">Finally</h2>

<p>I'm currently working towards using Squirrel and the described release strategy to replace ClickOnce in one of my company's production applications. I'm also working on an idea to provide install/run capabilities directly from the browser, but as of now that's only partially formed.</p>]]></content:encoded></item><item><title><![CDATA[Comparing Data Stores]]></title><description><![CDATA[<h3 id="background">Background</h3>

<p>I've found myself trying to quantify differences between available data store options in .NET many times over the last decade. Historically I've performed such comparisons using a simple Console Application project in Visual Studio. I'd write test methods to provide comparable data for the data stores under evaluation, analyze</p>]]></description><link>https://developingdane.azurewebsites.net/comparing-data-stores/</link><guid isPermaLink="false">2b4ee92f-2f28-407d-929e-b4566099e0dc</guid><category><![CDATA[Architecture]]></category><category><![CDATA[.NET]]></category><category><![CDATA[Azure]]></category><category><![CDATA[Azure Sql]]></category><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Sun, 09 Apr 2017 05:20:19 GMT</pubDate><media:content url="https://developingdane.com/content/images/2017/05/colage.jpg" medium="image"/><content:encoded><![CDATA[<h3 id="background">Background</h3>

<img src="https://developingdane.com/content/images/2017/05/colage.jpg" alt="Comparing Data Stores"><p>I've found myself trying to quantify differences between available data store options in .NET many times over the last decade. Historically I've performed such comparisons using a simple Console Application project in Visual Studio. I'd write test methods to provide comparable data for the data stores under evaluation, analyze the data, select a technology for implementation and move forward. I've experienced success with this pattern for many years. </p>

<p>Recently, I found myself about to write code for yet another such comparison and realized that, once again, I was re-inventing testing methodologies. This time I decided to create (and document) an abstraction layer around the concept of a data store. My thinking was to provide a single testable abstraction that could be implemented in any .NET data storage technology. Not only would future tests be easier to write, but the empirical data they generated would be standardized and thus comparable.</p>

<h3 id="objective">Objective</h3>

<p>Design an abstraction layer in .NET which defines the concept of a "data store" then implement this abstraction using an array of .NET data storage technologies. Gather data. Compare and contrast results.</p>

<h3 id="sourcecode">Source Code</h3>

<p><a href="https://github.com/DaneVinson/DataRepositoryComparison">DataRepositoryComparison</a> is a Visual Studio solution targeting .NET 4.6.1. It contains the projects <code>Model</code> (Class Library), <code>ConsoleApp</code> (Console Application) and multiple separate data store implementation projects, e.g. <code>Sql.Dapper</code>. The <code>Model</code> project defines  the common domain model abstractions to be used by all concrete implementations. <code>ConsoleApp</code> provides a run-time environment for testing. All other projects are specific implementations of the <code>Model</code> project's abstractions using a particular .NET data store technology, e.g. <a href="https://github.com/StackExchange/dapper-dot-net">Dapper</a>.</p>

<p>The Repository pattern is employed and an abstraction of the "data store" concept is defined.  </p>

<pre><code class="language-csharp">public interface IRepository  
{
    bool Create(IEnumerable&lt;IThing&gt; things);
    bool Delete(IEnumerable&lt;string&gt; ids);
    IThing[] Get();
    IThing[] Get(IEnumerable&lt;string&gt; ids);
}
</code></pre>

<p>This design allows for the creation of <code>IThing</code> entities, deletion of <code>IThing</code> entities by identifier, retrieval of all available <code>IThing</code> entities and retrieval of <code>IThing</code> entities by identifier. Note, <code>IRepository</code> has no metaphor for query (excluding by identifier) or update.</p>
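<p>The entity contract itself is defined in the <code>Model</code> project. As a rough sketch (the property names here are illustrative, not the actual definition), <code>IThing</code> needs little more than an identifier:</p>

<pre><code class="language-csharp">public interface IThing  
{
    string Id { get; set; }       // unique identifier used by Get and Delete
    string Name { get; set; }     // illustrative payload properties
    DateTime Stamp { get; set; }
}
</code></pre>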

<h3 id="setup">Setup</h3>

<p>All testing was performed using the following Azure VM (Resource Manager deployment model).</p>

<ul>
<li>Template: SQL Server 2016 SP1 Standard on Windows Server 2016</li>
<li>Size: DS2_V2 (2 Core, 7 GB, $104.16/month)</li>
<li>Disk Type: SSD</li>
</ul>

<p>The following <code>IRepository</code> implementations and their associated configurations were used to gather data.</p>

<ul>
<li><code>Sql.Dapper</code>: Local (VM) SQL Server</li>
<li><code>Sql.Dapper</code>: Azure SQL, Basic (5 DTU), $5/month</li>
<li><code>Sql.Dapper</code>: Azure SQL, Premium (125 DTU), $465/month</li>
<li><code>DocumentDB</code>: 400 RU, $24/month/collection</li>
<li><code>DocumentDB</code>: 10,000 RU, $600/month/collection</li>
<li><code>AzureStorageTable</code>: Premium Account, LRS</li>
<li><code>AzureStorageBlob</code>: Premium Account, LRS</li>
</ul>

<h3 id="method">Method</h3>

<p>The following is a simplified version of the data gathering cycle applied to each <code>IRepository</code> implementation.  </p>

<pre><code class="language-csharp">IRepository repository = {concrete_implementation_instance};  
IThing[] things = {GenFu_IThing_data};  
repository.Create(things);  
ids = repository.Get().Select(t =&gt; t.Id).ToArray();  
repository.Get(ids);  
repository.Delete(ids);  
</code></pre>

<p>Timers were scoped around each <code>IRepository</code> method call and completion times were recorded.</p>
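<p>A minimal sketch of that instrumentation using <code>Stopwatch</code> (the actual code in the repository may differ):</p>

<pre><code class="language-csharp">// using System.Diagnostics;
var stopwatch = Stopwatch.StartNew();
repository.Create(things);
stopwatch.Stop();

// One recorded value per action per iteration
createMilliseconds.Add(stopwatch.ElapsedMilliseconds);
</code></pre>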

<p>For each <code>IRepository</code> implementation, data was gathered using 100 iterations of the cycle, each with 100 randomly generated <code>IThing</code> entities (thank you <a href="https://github.com/MisterJames/GenFu">GenFu</a>). Additionally, data gathering cycles were executed both synchronously and asynchronously.</p>

<h3 id="results">Results</h3>

<p><img src="https://developingdane.azurewebsites.net/content/images/2017/04/create100x100.jpg" alt="Comparing Data Stores"></p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2017/04/getall100x100.jpg" alt="Comparing Data Stores"></p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2017/04/get100x100.jpg" alt="Comparing Data Stores"></p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2017/04/delete100x100.jpg" alt="Comparing Data Stores"></p>

<h3 id="discussion">Discussion</h3>

<p>The "LocalSql" repository shown in the charts (<code>Sql.Dapper</code> implementation of <code>IRepository</code> on VM local resources) serves as an excellent baseline for comparisons. As expected "local" repository times for all actions are faster than those of other repositories. Most are significantly faster.</p>

<p>In almost every case asynchronous operations outperformed their synchronous counterparts. For <code>Sql.Dapper</code> against Azure SQL, however, the differences were dramatic for the operations keyed on unique Id, i.e. Create, Get and Delete. Note that for the remaining operation, GetAll, synchronous actually slightly outperformed asynchronous. The data suggests that Azure SQL is heavily optimized for asynchronous operations, which is consistent with expectations.</p>

<p>Excluding synchronous operations, <code>Sql.Dapper</code> on Azure SQL made a strong showing. Even the Basic tier database performs near the top in every operation. Also noteworthy is the performance of <code>AzureStorageTable</code>, again near the top for each operation. A surprise for me was <code>AzureStorageBlob</code>. This implementation literally just serializes each <code>IThing</code> to JSON and writes it to an individual file in Azure Blob Storage. That would seem to be a horribly inefficient way to store data but, with the exception of GetAll, the performance is admirable.</p>

<h3 id="conclusions">Conclusions</h3>

<p>The abstraction <code>IRepository</code> worked well with implementations of a widely varied group of data storage technologies. Implementations were simple to write and data gathered from exercising them was easily comparable. I'm satisfied with the results and will likely add additional implementations in the future.</p>

<h3 id="references">References</h3>

<ul>
<li><a href="https://github.com/DaneVinson/DataRepositoryComparison">DataRepositoryComparison</a> (source code)</li>
<li><a href="https://github.com/StackExchange/Dapper">Dapper</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/documentdb/">DocumentDB</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-tables">Azure Storage Tables</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-blobs">Azure Storage Blob</a></li>
</ul>]]></content:encoded></item><item><title><![CDATA[Scheduling Azure WebJobs]]></title><description><![CDATA[<h3 id="scheduledazurewebjoboptions">Scheduled Azure WebJob Options</h3>

<p>I had a need to understand the options for scheduling Azure WebJobs. I quickly found the Azure documentation article <a href="https://docs.microsoft.com/en-us/azure/app-service-web/web-sites-create-web-jobs">Run background tasks with WebJobs</a> which provides two possible methods for scheduling the execution of a WebJob, CRON expression and the use of Azure Scheduler. Those options</p>]]></description><link>https://developingdane.azurewebsites.net/scheduled-azure-web-jobs/</link><guid isPermaLink="false">18d3f07b-3152-42b6-ab5e-0dc96f4d04ff</guid><category><![CDATA[Azure]]></category><category><![CDATA[Azure App Service]]></category><category><![CDATA[Visual Studio]]></category><dc:creator><![CDATA[Dane Vinson]]></dc:creator><pubDate>Mon, 20 Mar 2017 04:23:29 GMT</pubDate><media:content url="https://developingdane.com/content/images/2017/03/azurewebjobicon.jpg" medium="image"/><content:encoded><![CDATA[<h3 id="scheduledazurewebjoboptions">Scheduled Azure WebJob Options</h3>

<img src="https://developingdane.com/content/images/2017/03/azurewebjobicon.jpg" alt="Scheduling Azure WebJobs"><p>I had a need to understand the options for scheduling Azure WebJobs. I quickly found the Azure documentation article <a href="https://docs.microsoft.com/en-us/azure/app-service-web/web-sites-create-web-jobs">Run background tasks with WebJobs</a> which provides two methods for scheduling the execution of a WebJob: CRON expressions and Azure Scheduler. Those options were a good start and I'd like to discuss my thoughts on them and present additional possibilities.</p>

<h6 id="option1">Option 1</h6>

<p>For CRON expressions the article describes the use of a <code>settings.job</code> file. What's not clear in that article is where this file originates. It turns out it's added to your WebJob when you create a Triggered, Scheduled WebJob through the Azure Portal.</p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2017/03/azurewebjobadd.jpg" alt="Scheduling Azure WebJobs"></p>

<h6 id="option2">Option 2</h6>

<p>Use Visual Studio to manually add <code>settings.job</code> to a console app (be sure the file's <code>Copy to output directory</code> property is set to <code>Copy always</code>) and then deploy that app as a <code>Run on Demand</code> WebJob using VS publishing.</p>

<p><img src="https://developingdane.azurewebsites.net/content/images/2017/03/vsazurewebjobpublish.jpg" alt="Scheduling Azure WebJobs"></p>
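<p>The <code>settings.job</code> file itself is a small JSON document. For example, a schedule that triggers the job every 15 minutes looks like the following; note the six-field CRON format, which includes seconds.</p>

<pre><code class="language-javascript">{
  "schedule": "0 */15 * * * *"
}
</code></pre>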

<h6 id="option3">Option 3</h6>

<p>The <a href="https://github.com/Azure/azure-webjobs-sdk-extensions">Azure WebJobs SDK</a> provides yet another method of creating CRON driven WebJobs. This method utilizes the SDK's <code>TimerTrigger</code> and requires a Continuous WebJob. The WebJob can be created using the Azure Portal or by VS publishing as long as it's Continuous. </p>

<p><code>TimerTrigger</code> is an attribute applied to a method argument of type <code>TimerInfo</code>. Its constructor takes a CRON expression string. By convention the method is implemented in the <code>Functions</code> class of the console app to be deployed.</p>

<pre><code class="language-csharp">public class Functions  
{
    public static void MyJob([TimerTrigger("0 * * * * *")]TimerInfo timer)
    {
        // Do stuff
    }
}
</code></pre>

<h6 id="options13restrictions">Options 1-3 Restrictions</h6>

<p>Each of the CRON expression methods previously described requires that the Azure App Service hosting the WebJob be set to <code>Always On</code>. This means the App Service's scaling tier must be Basic (currently $55/month) or higher. For many users' legitimate WebJob use cases the cost/benefit ratio simply doesn't make sense at that monthly cost. </p>

<h6 id="option4">Option 4</h6>

<p>Users who require a cheaper solution will likely need to employ the second option from the originally referenced article, i.e. using Azure Scheduler to trigger Azure WebJobs. In that scenario there's no restriction on App Service pricing tier, i.e. Free works, and the cost for Azure Scheduler is minimal. The downside is, of course, an increase in system complexity.</p>
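<p>Under the hood the Scheduler job simply POSTs to the WebJob's Kudu endpoint using the App Service's deployment credentials. The following sketch shows the equivalent call from C#; the site name, job name and credentials are placeholders.</p>

<pre><code class="language-csharp">// using System.Net.Http; using System.Net.Http.Headers; using System.Text;
using (var client = new HttpClient())
{
    var credentials = Convert.ToBase64String(
        Encoding.ASCII.GetBytes("{deployUser}:{deployPassword}"));
    client.DefaultRequestHeaders.Authorization =
        new AuthenticationHeaderValue("Basic", credentials);

    // Kudu's triggered WebJobs API
    var response = await client.PostAsync(
        "https://{site}.scm.azurewebsites.net/api/triggeredwebjobs/{job}/run",
        null);
    response.EnsureSuccessStatusCode();
}
</code></pre>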

<h3 id="references">References</h3>

<ul>
<li><a href="https://docs.microsoft.com/en-us/azure/app-service-web/web-sites-create-web-jobs">Run background tasks with WebJobs</a></li>
<li><a href="https://github.com/Azure/azure-webjobs-sdk-extensions">Azure WebJobs SDK Github</a></li>
<li><a href="http://blog.amitapple.com/post/2015/06/scheduling-azure-webjobs/#.WM9NQDsrIdU">Scheduling Azure WebJobs with cron expressions</a></li>
<li><a href="https://blogs.msdn.microsoft.com/tfssetup/2016/05/18/deploying-and-schedule-azure-webjobs-from-vsts-to-azure-web-app-with-bonus-cron-scheduling/">Deploying and Schedule Azure WebJobs from VSTS to Azure Web App (With bonus CRON Scheduling)</a></li>
</ul>]]></content:encoded></item></channel></rss>