# Friday, 30 November 2018

Given an Azure Function, you may wish to change the URL that points to this function. There are several reasons to do this:

  1. Make the URL simpler
  2. Make the URL more readable
  3. Make the URL conform to your organization's standards

To reassign a Function's URL, you will need to know the existing URL. To find this, select the Function and click the "Get function URL" link, as shown in Fig. 1.

FP01-GetFunctionUrl
Fig. 1

The Function URL dialog displays, as shown in Fig. 2.

FP02-FunctionUrl
Fig. 2

Click the [Copy] icon to copy this URL to your clipboard. You may wish to paste this into a text document or another safe place for later use.

Each Azure Function App contains a "Proxies" section, as shown in Fig. 3.

FP03-Proxies
Fig. 3

Click the [+] icon to display the "New proxy" blade, as shown in Fig. 4.

FP04-Proxy
Fig. 4

At the "Name" field, enter a name to identify this proxy. I like to include the name of the original function in this name, to make it easy to track to its source.

At the "Route template" field, enter a template for the new URL. This is everything after the "https://" and the domain name. If the function accepts parameters, you will need to add these and surround them with curly brackets: "{" and "}".

At the "Allowed HTTP methods" dropdown, select "All methods" or check only those methods you wish your new URL to support.

At the "Backend URL" field, enter the full original URL copied earlier to your clipboard. If the function accepts parameters, you will need to add these and surround them with curly brackets: "{" and "}". The parameter name here must match the parameter name in the "Route template" field.

An example

For example, if you create a Function with an HttpTrigger and accept all the defaults (as described here), you will have a function that accepts a querystring parameter named "name" and outputs "Hello, " followed by the value of that parameter.

My original function URL looked similar to the following:

https://dgtestfa.azurewebsites.net/api/HttpTrigger1?code=idLURPj58mZrDdkAh9LkTkkz2JZRmp6/ru/DQ5RbotDpCtg/WY/pRw==

So, I entered the following values into the "New Proxy" blade:

Name: HttpTrigger1Proxy
Route template: welcome/{name}
Allowed HTTP methods: All methods
Backend URL: https://dgtestfa.azurewebsites.net/api/HttpTrigger1?code=idLURPj58mZrDdkAh9LkTkkz2JZRmp6/ru/DQ5RbotDpCtg/WY/pRw==&name={name}

With these settings, I can send a GET or POST request to the following URL:

https://dgtestfa.azurewebsites.net/welcome/David

and receive the expected response:

Hello, David

This new URL is much simpler and easier to remember than the original one.
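
If you want to exercise the proxy from code rather than a browser, a quick console sketch like the following works. This is only an illustration of calling the example URL above (it is not part of the proxy configuration), and it assumes a simple console app:

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class Program
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                // Call the friendly proxy URL instead of the original function URL
                var response = await client.GetAsync("https://dgtestfa.azurewebsites.net/welcome/David");
                Console.WriteLine(await response.Content.ReadAsStringAsync());   // Expected: Hello, David
            }
        }
    }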

In this article, I showed you how to create a proxy that redirects from a new URL to an existing Azure Function.

Friday, 30 November 2018 09:43:00 (GMT Standard Time, UTC+00:00)
# Thursday, 29 November 2018

GCast 24:

Azure Function CosmosDB Binding

Using the CosmosDB binding in an Azure Function allows you to read and write documents in an Azure CosmosDB database without writing code.

Thursday, 29 November 2018 09:22:00 (GMT Standard Time, UTC+00:00)
# Wednesday, 28 November 2018

Setting up continuous deployment of an Azure Function from GitHub is straightforward.

In this article, I already had an Azure Function (created using Visual Studio) in a GitHub repository and an empty Azure Function App.

See this article for information on GitHub.

See this article to learn how to create an Azure Function App.

Open the Azure Function App in the Azure portal, as shown in Fig. 1.

DF01-FunctionApp-big
Fig. 1

Click the "Platform features" link (Fig. 2) to display the "Platform features" page, as shown in Fig. 3.

DF02-PlatformFeaturesLink
Fig. 2

DF03-PlatformFeatures
Fig. 3

Under "Code Deployment", click the "Deployment Center" link to open the "Deployment Center" page, as shown in Fig. 4.

DF04-DeploymentCenter
Fig. 4

On the "Deployment Center" page, select the "GitHub" tile and click the [Continue] button, as shown in Fig. 5.

DF05-DeploymentCenterContinue
Fig. 5

The wizard advances to the "Configure" page of the "Deployment Center" wizard, as shown in Fig. 6.

DF06-ConfigureDeployment
Fig. 6

At the "Organization" dropdown, select the GitHub account where your code resides. If you don't see the account, you may need to give your Azure account permission to view your GitHub repository.

At the "Repository" dropdown, select the code repository containing your Azure Functions.

At the "Branch" dropdown, select the code branch you wish to deploy whenever a change is pushed to the repository. I almost always select "master" for this.

Click the [Continue] button to advance to the "Summary" page of the "Deployment Center" wizard, as shown in Fig. 7.

DF07-Summary
Fig. 7

On the "Summary" page, review your choices and click the [Finish] button if they are correct. (If they are not correct, click the [Back] button and make the necessary corrections.)

In a few minutes, the function or functions in your repository will appear under your Function App in the Azure portal, as shown in Fig. 8.

DF08-Function
Fig. 8

Any future changes pushed to the repository will automatically be added to the Function App.

For example, I can open my Visual Studio project and add a second function, as shown in Fig. 9

DF09-AddNewFunction
Fig. 9
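
The exact code of the second function does not matter for the deployment; as a sketch, a minimal HTTP-triggered function along these lines is enough to demonstrate the workflow (the name "Function2" and its message are hypothetical, not the function shown in Fig. 9):

    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;
    using Microsoft.Extensions.Logging;

    public static class Function2
    {
        [FunctionName("Function2")]
        public static IActionResult Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("Function2 was called.");

            // Return a simple message so we can confirm the newly-deployed function responds
            return new OkObjectResult("Hello from the second function");
        }
    }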

After testing the change, I can push it to my GitHub repository with the following commands:

git add .
git commit -m "Added a new function"
git push origin master

Listing 1

Because a webhook was added to my GitHub repository, this change will be pushed to my Azure Function App. Fig. 10 shows the Function app a few minutes after I pushed my change to GitHub.

DF10-FunctionAppAfterPush
Fig. 10

In this article, you learned how to configure continuous deployment of your Azure Function App from a GitHub repository.

Wednesday, 28 November 2018 08:33:00 (GMT Standard Time, UTC+00:00)
# Tuesday, 27 November 2018

In a recent article, I showed how to create a Durable Azure Function. If you are unfamiliar with Durable Functions, I recommend you read that article first.

In that article, the Durable Function called 3 Activity Functions in sequence. No Function executed until the Function before it completed. Sometimes, it is important that Functions execute in a certain order. But sometimes it does not matter in which order the Functions execute - only that they each complete successfully before another Activity Function is called. In these cases, executing sequentially is a waste of time. It is more efficient to execute these Azure Functions in parallel.

In this article, I will show how to create a durable function that executes three Activity Functions in parallel; then waits for all 3 to complete before executing a fourth function.
 
Fig. 1 illustrates this pattern.

PD01-ParallelDurableFunctionFlow
Fig. 1
 
As we noted in the earlier article, a Durable function is triggered by a starter function, which is in turn triggered by an HTTP request, database change, timer, or any of the many triggers supported by Azure Functions, as shown in Fig. 2.

PD02-DurableFunctionTrigger
Fig. 2

I created 4 Activity Functions that do nothing more than write a couple of messages to the log (I use LogWarning, because it causes the text to display in yellow, making it easier to find); delay a few seconds (to simulate a long-running task); and return a string consisting of the input string, concatenated with the name of the current function. The functions are nearly identical: only the Function Name, the message, and the length of the delay are different.

The 4 functions are shown below:

    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<string> Run(
            [ActivityTrigger] string msg,
            ILogger log)
        {
            log.LogWarning("This is Function 1");
            await Task.Delay(15000);
            log.LogWarning("Function 1 completed");
            msg += "Function 1";
            return msg;
        }
    }
  

Listing 1

    public static class Function2
    {
        [FunctionName("Function2")]
        public static async Task<string> Run(
            [ActivityTrigger] string msg,
            ILogger log)
        {
            log.LogWarning("This is Function 2");
            await Task.Delay(10000);
            log.LogWarning("Function 2 completed");
            msg += "Function 2";
            return msg;
        }
    }
  

Listing 2

    public static class Function3
    {
        [FunctionName("Function3")]
        public static async Task<string> Run(
            [ActivityTrigger] string msg,
            ILogger log)
        {
            log.LogWarning("This is Function 3");
            await Task.Delay(5000);
            log.LogWarning("Function 3 completed");
            msg += "Function 3";
            return msg;
        }
    }
  

Listing 3

    public static class Function4
    {
        [FunctionName("Function4")]
        public static async Task<string> Run(
            [ActivityTrigger] string msg,
            ILogger log)
        {
            log.LogWarning("This is Function 4");
            int secondsDelay = new Random().Next(8, 12);
            await Task.Delay(secondsDelay * 1000);
            log.LogInformation("Function 4 completed");
            msg += "\n\rFunction 4";
            return msg;
        }
    }
  

Listing 4

We launch the first 3 Activity Functions without awaiting them, so they run in parallel; then we use Task.WhenAll to wait until all 3 complete before executing the 4th Activity Function.

Listing 5 shows this code in our Durable Orchestration function.

    public static class DurableFunction1
    {
        [FunctionName("DurableFunction1")]
        public static async Task<IActionResult> Run(
            [OrchestrationTrigger] DurableOrchestrationContext ctx,
            ILogger log)
        {
            var msg = "Durable Function: ";
            var parallelTasks = new List<Task<string>>();
            Task<string> task1 = ctx.CallActivityAsync<string>("Function1", msg);
            parallelTasks.Add(task1);
            Task<string> task2 = ctx.CallActivityAsync<string>("Function2", msg);
            parallelTasks.Add(task2);
            Task<string> task3 = ctx.CallActivityAsync<string>("Function3", msg);
            parallelTasks.Add(task3);

            await Task.WhenAll(parallelTasks);

            // All 3 Activity functions finished
            msg = task1.Result + "\n\r" + task2.Result + "\n\r" + task3.Result;

            // Use LogWarning, so it shows up in Yellow, making it easier to spot
            log.LogWarning($"All 3 Activity functions completed for orchestration {ctx.InstanceId}!");

            msg = await ctx.CallActivityAsync<string>("Function4", msg);
            log.LogWarning(msg);

            return new OkObjectResult(msg);
        }
    }
  

Listing 5

We create a new List of Tasks and add each activity to that list:

var msg = "Durable Function: ";
var parallelTasks = new List<Task<string>>();
Task<string> task1 = ctx.CallActivityAsync<string>("Function1", msg);
parallelTasks.Add(task1);
Task<string> task2 = ctx.CallActivityAsync<string>("Function2", msg);
parallelTasks.Add(task2);
Task<string> task3 = ctx.CallActivityAsync<string>("Function3", msg);
parallelTasks.Add(task3);

The following line tells the system to wait until all 3 tasks in that list are completed.

await Task.WhenAll(parallelTasks);

When all 3 tasks complete, we resume the program flow, calling the 4th Activity and logging the output:

log.LogWarning($"All 3 Activity functions completed for orchestration {ctx.InstanceId}!");
msg = await ctx.CallActivityAsync<string>("Function4", msg);
log.LogWarning(msg);
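
As a side note, if the number of activities grows, the same fan-out can be written more compactly with a loop. This is only a sketch of the pattern, using the same ctx and msg variables and the same three activity names as Listing 5:

    var parallelTasks = new List<Task<string>>();
    foreach (var activityName in new[] { "Function1", "Function2", "Function3" })
    {
        // Start each activity without awaiting it, so they all run in parallel
        parallelTasks.Add(ctx.CallActivityAsync<string>(activityName, msg));
    }

    // Resume only after every activity has completed
    await Task.WhenAll(parallelTasks);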

As in the previous article, we launch this Durable Orchestration Function with a starter function (in this case a function with an HTTP trigger), as shown in Listing 6 below.

    public static class StarterFunction1
    {
        [FunctionName("StarterFunction1")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]
            HttpRequestMessage req,
            [OrchestrationClient] DurableOrchestrationClient starter,
            TraceWriter log)
        {
            log.Info("About to start orchestration");

            var orchestrationId = await starter.StartNewAsync("DurableFunction1", log);
            return starter.CreateCheckStatusResponse(req, orchestrationId);
        }
    }
  

Testing the Orchestration

We can test this orchestration by running the solution, which displays the HTTP Trigger URL, as shown in Fig. 3.

PD003-StartFunction
Fig. 3

We can then open a browser, type the HTTP Trigger URL in the address bar, and press [ENTER] to trigger the function, as shown in Fig. 4.

PD004-TriggerFunction
Fig. 4

Switch back to the function output to view the messages as they scroll past. You should see output from each of the first 3 functions (although not necessarily in the order called), followed by a message indicating the first 3 are complete; then output from Function 4. This is shown in Fig. 5.

PD005-FinalOutput
Fig. 5

You can view this project under “Durable Functions” in this GitHub repository.

In this article, I showed how to create a Durable Orchestration Function that launches activity functions that run in parallel.

Tuesday, 27 November 2018 07:29:00 (GMT Standard Time, UTC+00:00)
# Monday, 26 November 2018

Episode 539

Brady Gaster on Marketing Azure

Brady Gaster helps to build and coordinate many of the Azure demos you see on stage at large technical conferences. He talks about how his team tells a story with tools and code.

Monday, 26 November 2018 07:22:00 (GMT Standard Time, UTC+00:00)
# Sunday, 25 November 2018

FearTrumpInTheWhiteHouse

I have read a lot of biographies in my life and many of those focused on the lives of U.S. presidents; but it's rare for me to read a book about a sitting president. I made an exception with Fear: Trump in the White House - partly because it was written by Pulitzer Prize winner Bob Woodward, whose books I've always admired; and partly because Donald Trump's presidency - on which this focuses - is different from any other presidency.

So less than two years into his term, I am reading about Trump's ascent to power.

Woodward describes some of the positives of the Trump presidency. For example, he spends a lot of time discussing a decisive and successful strike on a Syrian air base in response to Syrian President Assad's use of chemical weapons on children, in violation of international law.

But Trump supporters will probably not remember the positive coverage, because there are so many unfavourable descriptions of the President. After the first third of Fear, Woodward's descriptions of Mr. Trump are largely unflattering.

The title of the book comes from a Donald Trump quote: "Real power is fear," a sentiment he has expressed multiple times.

Entering the White House, Donald Trump had very little understanding of the most basic principles of economics or policy or politics or governing.

For example, he repeatedly suggested that printing more money was a viable solution to the deficit. 

Worse, Trump had no interest in correcting his misconceptions. After trying unsuccessfully to educate Trump on the fact that the U.S. had years ago moved from a manufacturing-based economy to a service based economy, a frustrated advisor asked the president "Why do you have these views?" to which Mr. Trump replied, "I just always have."

He displayed similar ignorance and stubbornness in other matters.

He repeatedly insisted to his advisors that the U.S. should not spend money defending other countries and resisted the argument that doing so was an investment in American security. For example, forward-positioned American troops in South Korea reduce the alert time of a potential North Korean nuclear launch from 15 minutes down to 7 seconds. His generals and economic advisors praised this as a good investment. Despite this, Trump would raise the issue every few months, arguing that the U.S. was wasting money in South Korea.

He repeatedly insisted on huge increases in tariffs, despite advice from his economic advisors that doing so would damage the economy.

Donald Trump takes pride in his decisive action, but he often acts without seeking advice or in defiance of expert advice.

He pushed hard for withdrawing from the Paris Climate Accord with little consultation about the legality and impact.

He declared via Twitter that transgenders would not be allowed in the military - a major policy decision that broke a campaign promise and defied existing laws. He justified it by grossly overestimating the cost and impact transgender soldiers had on the military. The military refused to enforce this ban and it ultimately failed after an expensive court battle.

In choosing his advisors and staff, Donald Trump values personal loyalty to Donald Trump over experience, intelligence, or other qualifications.

Once hired, he ruled his people by intimidation and bullying, often publicly insulting his staff. He took delight in setting one staff member against another. The result was an abundance of infighting within the White House, which made it difficult for everyone to act in a unified manner.

The advisors in the administration did everything in their power to mitigate Trump's worst impulses. Since reasoning often failed, they sometimes deliberately delayed executing a directive, or they stole papers from the president's desk to prevent him from signing an order or even thinking about it.

This worked because the president's attention span is short, and he has no list of things to accomplish - neither on paper nor in his head. When a paper was removed from his desk, he often did not miss it.

In one of the lowest points in the Trump presidency, the president refused to condemn Nazis and Klansmen marching in Charlottesville, VA, chanting racist slogans, such as "Jews will not replace us". Trump's initial response placed blame "on both sides". His advisors finally convinced him to deliver a speech a few days later, explicitly condemning the white supremacist groups; but Trump almost immediately regretted doing so, complaining that it made him look weak. A few days later, he reverted to his equivalency argument, stating there were "very fine people on both sides". His handling of the incident drew praise from Ku Klux Klan leader David Duke and almost universal criticism from non-racists in both political parties.

Fear largely paints a picture of a petty, impetuous, ill-tempered, easily distracted, stubborn president with little ability to listen or learn.

The president's pre-conceived notions (often based on ignorance), his poor listening skills, and his frequent refusal to consider opposing viewpoints often made life frustrating for his advisors, many of whom left shortly after taking the position.

Not surprisingly, Donald Trump himself declared the book "fake", even without reading it. And Trump supporters often close their minds to anything critical of their hero, labeling any criticism - fair or otherwise - as "fake news". But several things reinforce the credibility of this book. One is Woodward's reputation: he has won two Pulitzer Prizes and has written critically of public figures in both major parties. The other is that the picture Woodward paints of the president in private is consistent with the image Trump projects through Twitter and his rallies. He has never shied away from personal attacks or name-calling; he frequently overstates his own abilities in speeches, claiming to be the best in the world at multiple skills; and his closest associates have publicly attacked one another.

Woodward conducted hundreds of hours of interviews for this book. He does not identify many of his sources, but it's not difficult to guess some of them, particularly when he reports on a private conversation between Trump and one other person. Woodward requested an interview with the president, but never received a reply.

The author writes in the straightforward style of a professional journalist. Woodward seldom asserts his own opinion. Instead, he quotes the opinions of others in the administration.

Woodward closes the book with a disagreement between Trump and his lawyer about whether the President should testify as part of Special Prosecutor Robert Mueller's investigation into alleged collusion between the Trump campaign and Russia. Trump insists he would be an excellent witness. His lawyer tries to tactfully discourage Trump from testifying (he even threatens to resign as his lawyer) because he knows that Trump is a habitual liar and will almost certainly commit perjury during any sworn testimony.

As I write this review, my hope is that Donald Trump will notice it and label me an enemy, as he does with so many who disagree with him.

TrumpTweet

Sunday, 25 November 2018 07:02:00 (GMT Standard Time, UTC+00:00)
# Saturday, 24 November 2018

Padington Station

I was in a period of transition last week when the account rep asked if I could travel to London Monday to meet with a customer. A re-org had been announced a few days earlier and I had not yet met my new manager. Following numerous e-mails, meetings, and calls, it was decided I should make the trip. So, I found myself on Thursday afternoon purchasing an international ticket to leave for England Saturday afternoon.

Many who know me are surprised I had never visited the UK, given the amount of travel - both domestic and international - that I've done the past few years. But I was excited for my first visit.

I arrived at London's Heathrow Airport early Sunday morning, expecting to spend the day exploring the city. Unfortunately, a sleepless transatlantic flight meant that I spent much of the day in bed. I did manage to connect with a couple expats living in London, so I had lunch with James, who moved here from Ohio a few months ago; and dinner with Tobiasz, who moved to England from Poland a couple years ago.

On Monday, I also managed to connect with Andy, whom I had known for a few years from our participation in the IT Camp in Transylvania. He joined me for lunch, where I experienced the delight that is English meat pie.

Waterhouse Squre

It was good to see friendly faces in a foreign country. But I came here to work, so I met with the customer Monday, where we successfully brainstormed potential projects on which we could work together. It was good that I came, and I went to dinner with a couple of other Microsoft folks afterward to experience the delight that is fish and chips and English beer in an English pub.

St. James Palace

I had Tuesday to myself, so I opened the day with a guided walking trip around London. The tour was well worth my time and money. Sights included Buckingham Palace, St. James Palace, St. James Garden, Covent Market, and Trafalgar Square. Robin, our tour guide, did an excellent job of combining humor and history. The rain began to fall as we began the tour and steadily increased in intensity until we ended 3 hours later in a downpour at Westminster Abbey. A hot meal in a nearby pub took the chill off nicely.

National Gallery

With a full belly, I decided to visit the National Gallery, where I saw works by Van Gogh, da Vinci, Bruegel, Cezanne, and others. Like most museums in London, the National Gallery offers free admission, with a suggested (but not required) donation. I could easily spend an entire day at the Gallery, but a couple of hours sufficed on this day.

I walked around the city a bit more, passing through Leicester Square and Chinatown before heading back to my hotel.

My new manager is based in the UK, so he drove to London Tuesday evening and I had a chance to meet him face-to-face for the first time.

After we parted, I decided I needed to experience one more English beer in one more English pub before checking in for my final night and my flight home. I chose the White Hart, which bills itself as "The Oldest licensed premises in London", where I sat in a corner, sipped a lager, and read a couple chapters of "Brideshead Revisited".

It was a whirlwind trip, executed in a short time with a minimum of planning, but it was well worth the effort. I hope to return soon.

Saturday, 24 November 2018 08:26:00 (GMT Standard Time, UTC+00:00)
# Friday, 23 November 2018

Azure Functions provide a simple way to deploy code in a scalable, cost-effective way.

By default, Azure Functions are stateless, which makes it difficult to create complex workflows with basic Azure functions - particularly long-running workflows, such as those that require human interaction.

A Durable Azure Function maintains state for a long time, without having to stay in memory, making it ideal for orchestrations. Stateful information is stored in an Azure Storage Account when the process terminates. This saves you money, because the default pricing model for Azure Functions only charges you while the function is running.

A Durable Function is not triggered in the same way as other Azure Functions (via HTTP, queue, database changes, timer, etc.). Rather, it is called from a "starter" function, which can be triggered in the usual way.

Rather than placing all logic within a single Durable Function, it usually makes more sense to split tasks into individual Activity Functions and have the Durable Function manage these. The simplest Durable Function would simply call multiple activities in sequence. A diagram of this is shown in Fig. 1.

DF01-DurableFunctionFlow
Fig. 1

You can create a Function App for an Azure Durable function in Visual Studio in the same way you create any function - by selecting File | New Project from the menu and selecting "Azure Functions" from the Project Templates dialog, as shown in Fig. 2.

DF02-NewFunctionProject
Fig. 2

Select "Azure Functions v2" from the top dropdown and "HttpTrigger" from the list of templates, as shown in Fig. 3; then, click the [OK] button to create the solution and project.

DF03-FunctionTemplate
Fig. 3

The new project contains a function named "Function1". Right-click this function in the Solution Explorer and rename it to "StarterFunction", as shown in Fig. 4.

DF04-RenameFunction
Fig. 4

Open StarterFunction.cs and change the first line of the class from

[FunctionName("Function1")]

to

[FunctionName("StarterFunction")]

Now, you can add a Durable Function to the project. Right-click the project in the Solution Explorer and select Add | New Azure Function from the context menu, as shown in Fig. 5.

DF05-AddNewAzureFunction
Fig. 5

Name the new function "DurableFunction1", as shown in Fig. 6.

DF06-AddDurableFunction
Fig. 6

At the next dialog, select "Durable Function Orchestration" from the list of triggers and click the [OK] button to create the function, as shown in Fig. 7.

DF07-DurableFunctionsOrchestration
Fig. 7

This Durable Function will manage 3 functions, calling each one sequentially. To the project, add 3 new functions named "Function1", "Function2", and "Function3". It does not matter which trigger you choose, because we are going to overwrite the trigger. Paste the code below into each function:

    public static class Function1 
    { 
        [FunctionName("Function1")] 
        public static async Task<string> Run( 
            [ActivityTrigger] string msg, 
            ILogger log) 
        { 
            log.LogWarning("This is Function 1");

            await Task.Delay(10000); 
            msg += "Function1 done; "; 
            return msg; 
        } 
    }
  

Listing 1

    public static class Function2 
    { 
        [FunctionName("Function2")] 
        public static async Task<string> Run( 
             [ActivityTrigger] string msg, 
            ILogger log) 
        { 
            log.LogWarning("This is Function 2");

            await Task.Delay(10000); 
            msg += "Function2 done; "; 
            return msg; 
        } 
    }
  

Listing 2

    public static class Function3 
    { 
        [FunctionName("Function3")] 
        public static async Task<string> Run( 
            [ActivityTrigger] string msg, 
            ILogger log) 
        { 
            log.LogWarning("This is Function 3");

            await Task.Delay(10000); 
            msg += "Function3 done; "; 
            return msg; 
        } 
    }
  

Listing 3

As you can see, each function essentially does the same thing: log a brief message; wait 10 seconds; then, return a string consisting of the string passed in with a bit more appended to the end.

Notice also that the "msg" parameter in each function is decorated with the [ActivityTrigger] attribute, which is what makes each of these an Activity Function.

The Task.Delay() simulates a long-running activity. Imagine an activity that requires human input, such as a manager navigating to a web page and filling out a form. It might take days or weeks for this to happen. We certainly would not want the application to continue running during this time: this would be an inefficient use of resources and it would be expensive. Durable Functions handle this by storing state information in Azure Storage; then retrieving that state when the function needs to resume.
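
The Task.Delay() is only a stand-in. For a genuine human-in-the-loop step, Durable Functions also provide WaitForExternalEvent, which suspends the orchestration until some other process raises a named event. The sketch below is not part of this article's demo - the event name "ManagerApproval" is made up - but it shows the shape of that pattern inside an orchestrator (using the same ctx and msg variables as the listings that follow):

    // Sketch only: wait for a hypothetical "ManagerApproval" event, raised (for example)
    // by a web page the manager submits. The orchestration is unloaded while it waits.
    bool approved = await ctx.WaitForExternalEvent<bool>("ManagerApproval");
    if (approved)
    {
        msg = await ctx.CallActivityAsync<string>("Function1", msg);
    }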

Return to the DurableFunction1 class and replace the code with the following:

    public static class DurableFunction1 
    { 
        [FunctionName("DurableFunction1")] 
        public static async Task<IActionResult> Run( 
            [OrchestrationTrigger] DurableOrchestrationContext ctx, 
            ILogger log) 
        { 
            var msg = "Durable Function: "; 
             msg = await ctx.CallActivityAsync<string>("Function1", msg); 
            msg = await ctx.CallActivityAsync<string>("Function2", msg); 
            msg = await ctx.CallActivityAsync<string>("Function3", msg);

            // Use LogWarning, so it shows up in Yellow, making it easier to spot 
            log.LogWarning(msg);

            return new OkObjectResult(msg); 
        } 
    }
  

Listing 4

You will probably have to add the following to the top of the file in order for it to compile:

using Microsoft.AspNetCore.Mvc;

In Listing 4, we see that the Durable Function calls the 3 Activity Functions in order. It passes to each Activity Function the output of the previous function. At the end of the orchestration, we expect to see a concatenation of messages from each of the 3 Activity Functions - in this case, "Durable Function: Function1 done; Function2 done; Function3 done; ".

Notice also the parameter of type DurableOrchestrationContext, which is decorated with the [OrchestrationTrigger] attribute. This identifies this as a Durable Orchestration Function.

Finally, return to the StarterFunction class and replace the code with the following:

    public static class StarterFunction
    {
        [FunctionName("StarterFunction")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]
            HttpRequestMessage req,
            [OrchestrationClient] DurableOrchestrationClient starter,
            ILogger log)
        {
            log.LogInformation("About to start orchestration");

            var orchestrationId = await starter.StartNewAsync("DurableFunction1", log);
            return starter.CreateCheckStatusResponse(req, orchestrationId);
        }
    }
  

Listing 5

To see this in action, compile and run the project. A console window displays, similar to the one in Fig. 8.

DF08-RunFunction
Fig. 8.

You can trigger the StarterFunction by issuing an HTTP GET to the URL displayed in the console (in this case http://localhost:7071/api/StarterFunction). Open a browser, enter this URL into the address bar, and press [ENTER].

Watch the console. You should see the log statements in each of the functions display in turn. Finally, you will see the final value of the msg variable after it has been passed to all 3 Activity functions. The output should look something like Fig. 9.

DF09-FunctionComplete
Fig. 9

This illustrates the concepts of a Durable Orchestration Function. You can view the source code in the SequentialDurableFunctionDemo project at my Azure-Function-Demos GitHub repository.

Friday, 23 November 2018 09:23:00 (GMT Standard Time, UTC+00:00)
# Thursday, 22 November 2018

GCast 23:

Azure Logic Apps

Learn how to create a Logic App to deploy a workflow in the cloud.

Thursday, 22 November 2018 09:12:00 (GMT Standard Time, UTC+00:00)
# Wednesday, 21 November 2018

Azure Functions allow you to declaratively add bindings to external resources by decorating a C# function with binding attributes.

This means you need to write less code and the code you do write will focus more on your business logic than on updating resources.

In this article, I will show you how to add CosmosDB bindings to an Azure function in order to read from and write to a CosmosDB database.

Create and Configure a CosmosDB Database and Collection

See this article to learn how to create a new CosmosDB instance.

Next create a Database and Collection within your CosmosDB. This article describes how to create a CosmosDB Database and Collection; or you can quickly create a Database named "ToDoList" and a Collection named "Items" from the "Quick Start" tab of the CosmosDB database you created, as shown in Fig. 1.

CD01-QuickStart
Fig. 1

As you work with data in this database, you can view the documents on the "Data Explorer" tab, as shown in Fig. 2.

CD02-DataExplorer
Fig. 2

You will need the Connection String of your CosmosDB. You can find two connection strings on the "Keys" tab, as shown in Fig. 3. Copy either one and save it for later.

CD03-Keys
Fig. 3

Visual Studio project

Create a function in Visual Studio 2017. If you base it on the "Azure Functions" template (Fig. 4), it will have many of the necessary references.

CD04-NewAzureFunctionApp
Fig. 4

Open the local.settings.json file and add a key for "CosmosDBConnection", as shown in Fig. 5. Set its value to the connection string you copied from the "Keys" blade above.

CD05-localsettingsjson
Fig. 5

Delete the existing Function1.cs file from the project and add a new function by right-clicking the project in the Solution Explorer and selecting Add | New Function from the context menu, as shown in Fig. 6. Give the function a meaningful name.

CD06-AddFunction
Fig. 6

Repeat this for any function you wish to add.

Create a model of the expected data

CosmosDB is a schemaless document database, meaning that the database engine does not enforce the type of data it accepts. This is distinct from something like SQL Server, which requires you to define in advance the name, data type, and rules of each column you expect to store.

If you want to validate data, you must do so in your application. One way to do this is to create a Model class that matches the expected incoming data.

In my demo, I expect only to store data that looks like the following:

{
    "id" : "001",
    "description" : "Write blog post",
    "isComplete" : false
}
  

So I created the ToDoItem class shown in Listing 1

public class ToDoItem
{
    [JsonProperty("id")]
    public string Id { get; set; }

    [JsonProperty("description")]
    public string Description { get; set; }

    [JsonProperty("isComplete")]
    public bool IsComplete { get; set; }
}
  

Listing 1

Insert a document

The code below defines a function to insert a new document into a database. The function is triggered when you send an HTTP POST request to the function's URL (in this case, "api/InsertItem"). The inserted document will contain the values from the JSON in the request body.

[FunctionName("InsertItem")] 
public static HttpResponseMessage Run( 
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)]HttpRequestMessage req, 
    [CosmosDB( 
        databaseName: "ToDoList", 
        collectionName: "Items", 
        ConnectionStringSetting = "CosmosDBConnection")] 
    out ToDoItem document, 
    ILogger log) 
{ 
    var content = req.Content; 
    string jsonContent = content.ReadAsStringAsync().Result; 
    document = JsonConvert.DeserializeObject<ToDoItem>(jsonContent);

    log.LogInformation($"C# HTTP trigger function inserted one document");

    return new HttpResponseMessage(HttpStatusCode.Created); 
}
  

Let's walk through the function.

[FunctionName("InsertItem")]

The name of the function is InsertItem

public static HttpResponseMessage Run(

The Run method executes when the function is triggered. It returns an HTTP Response Message

[HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)]HttpRequestMessage req,

The first parameter is the incoming HTTP Request. It is decorated with HttpTrigger, indicating this is an HTTP trigger. Within this decorator's parameters, we indicate that the function can be called anonymously, that it can only be called with an HTTP POST (not GET or PUT or any other verb); and that we are not changing the default routing.

[CosmosDB(
      databaseName: "ToDoList",
      collectionName: "Items",
      ConnectionStringSetting = "CosmosDBConnection")]
     out ToDoItem document,        

The second parameter is an output parameter of type ToDoItem. We will populate this with the data in the Request body, so we type it as a ToDoItem. This parameter is decorated with the CosmosDB attribute, indicating that we will automatically insert this document into the CosmosDB. The databaseName, collectionName, and ConnectionStringSetting tell the function exactly where to store the document. The ConnectionStringSetting argument must match the name we added for the connection string in the local.settings.json file, as described above.

ILogger log)

The logger allows us to log information at points in the function, which can be helpful when troubleshooting and debugging.

var content = req.Content;
string jsonContent = content.ReadAsStringAsync().Result;
document = JsonConvert.DeserializeObject<ToDoItem>(jsonContent);

The 3 lines above retrieve the body in the HTTP POST request and convert it to a .NET object of type ToDoItem, which validates that it is the correct format.

log.LogInformation($"C# HTTP trigger function inserted one document");

This line is not necessary, but may help us to understand what part of the function executed when we are troubleshooting.

return new HttpResponseMessage(HttpStatusCode.Created);

When the document is successfully inserted, we return an HTTP 201 (Created) status to indicate success.
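
To exercise this function, send an HTTP POST with a JSON body that matches the ToDoItem shape. A quick client-side sketch is shown below; it assumes the Function App is running locally on the default port reported by the Functions host (the port and URL here are assumptions, not values taken from the portal):

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    class InsertItemTest
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                var json = "{ \"id\": \"001\", \"description\": \"Write blog post\", \"isComplete\": false }";
                var content = new StringContent(json, Encoding.UTF8, "application/json");

                // Assumes the Function App is running locally on the default port
                var response = await client.PostAsync("http://localhost:7071/api/InsertItem", content);
                Console.WriteLine(response.StatusCode);   // Expect: Created (201)
            }
        }
    }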

Retrieve all documents

The following function retrieves all the documents in a container.

    public static class GetItems
    {
        [FunctionName("GetItems")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,
            [CosmosDB(
                databaseName: "ToDoList",
                collectionName: "Items",
                ConnectionStringSetting = "CosmosDBConnection",
                SqlQuery = "select * from Items")
            ]IEnumerable<ToDoItem> toDoItems,
            ILogger log)
        {
            log.LogInformation($"Function triggered");

            if (toDoItems == null)
            {
                log.LogInformation($"No Todo items found");
            }
            else
            {
                var ltodoitems = (List<ToDoItem>)toDoItems;
                if (ltodoitems.Count == 0)
                {
                    log.LogInformation($"No Todo items found");
                }
                else
                {
                    log.LogInformation($"{ltodoitems.Count} Todo items found");
                }
            }

            return new OkObjectResult(toDoItems);
        }
    }
  

Breaking down this function:

[FunctionName("GetItems")]        

The name of the function is “GetItems”.

public static async Task<IActionResult> Run(

The Run method executes when the function is triggered. This method is asynchronous and will eventually return an ActionResult.

[HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,

The first parameter is the incoming HTTP Request. It is decorated with HttpTrigger, indicating this is an HTTP trigger. Within this decorator's parameters, we indicate that the function requires a function key (AuthorizationLevel.Function), that it can only be called with an HTTP GET; and that we are not changing the default routing.

[CosmosDB(
databaseName: "ToDoList",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
SqlQuery = "select * from Items") ]IEnumerable<ToDoItem> toDoItems,

This parameter is what will be returned by the function (eventually, because it runs asynchronously). It is a list of objects of type ToDoItem. When serialized, this will be transformed into an array of JSON objects. This parameter is decorated with the CosmosDB attribute, indicating that we will automatically retrieve the list from the CosmosDB. The databaseName, collectionName, and ConnectionStringSetting tell the function exactly where to find the documents. The SqlQuery tells it what query to run to retrieve the data (in this case, return all the documents).

ILogger log)

The logger allows us to log information at points in the function, which can be helpful when troubleshooting and debugging.

log.LogInformation($"Function triggered");
if (toDoItems == null)
{
    log.LogInformation($"No Todo items found");
}
else
{
    var ltodoitems = (List<ToDoItem>)toDoItems;
    if (ltodoitems.Count == 0)
    {
        log.LogInformation($"No Todo items found");
    }
    else
    {
        log.LogInformation($"{ltodoitems.Count} Todo items found");
    }
}

We did not need to write code to query the database. This happens automatically. The code above simply verifies that items were returned, casts them to a List<ToDoItem>, and stores the list in a local variable.

return new OkObjectResult(toDoItems);

We return a 200 (“OK”) HTTP response and the list of items.

Retrieve a single document by its ID

The following function retrieves a single document, given the ID.

    public static class GetItemById
    {
        [FunctionName("GetItemById")]
            public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", Route = "GetItem/{id}")] HttpRequestMessage req,
            [CosmosDB(
                databaseName: "ToDoList",
                collectionName: "Items",
                ConnectionStringSetting = "CosmosDBConnection",
                Id = "{id}")
            ]ToDoItem toDoItem,
            ILogger log)
        {
            log.LogInformation($"Function triggered");

            if (toDoItem == null)
            {
                log.LogInformation($"Item not found");
                return new NotFoundObjectResult("Id not found in collection");
            }
            else
            {
                log.LogInformation($"Found ToDo item {toDoItem.Description}");
                return new OkObjectResult(toDoItem);
            }

        }
    }
  

Here are the details of this function:

[FunctionName("GetItemById")]        

The name of the function is “GetItemById”

public static async Task<IActionResult> Run(

The Run method executes when the function is triggered. This method is asynchronous and will eventually return an ActionResult.

[HttpTrigger(AuthorizationLevel.Function, "get", Route = "GetItem/{id}")] HttpRequestMessage req,

The first parameter is the incoming HTTP Request. It is decorated with HttpTrigger, indicating this is an HTTP trigger. Within this decorator's parameters, we indicate that the function requires a function key, that it can only be called with an HTTP GET; and that it uses a custom route of "GetItem/{id}", so the document's ID is passed as part of the URL.

[CosmosDB(
databaseName: "ToDoList",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
Id = "{id}")
]ToDoItem toDoItem,
 

This parameter is what will be returned by the function (eventually, because it runs asynchronously). It will be an object of type ToDoItem. This parameter is decorated with the CosmosDB attribute, indicating that we will automatically retrieve the document from the CosmosDB. The databaseName, collectionName, and ConnectionStringSetting tell the function exactly where to find the document. The Id tells the binding which document to retrieve, using the {id} value from the route.

ILogger log)

The logger allows us to log information at points in the function, which can be helpful when troubleshooting and debugging.

log.LogInformation($"Function triggered");

Debugging information. Not necessary for the operation, but helpful when troubleshooting.  

if (toDoItem == null)
{
    log.LogInformation($"Item not found");
    return new NotFoundObjectResult("Id not found in collection");
}
else
{
    log.LogInformation($"Found ToDo item {toDoItem.Description}");
    return new OkObjectResult(toDoItem);
}

We did not need to write code to query the database. This happens automatically. The code above simply checks whether an item matching the ID was returned. If an item is found, we return a 200 (“OK”) HTTP response, along with the item. If no item is returned, we return a 404 (“Not Found”) HTTP response.


Retrieve a set of documents using a query

The following function retrieves a set of documents. A query tells the function how to filter, sort, and otherwise retrieve the documents. In this example, we only want to return documents for which isComplete = true.

    public static class GetCompleteItems
    {
        [FunctionName("GetCompleteItems")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,
            [CosmosDB(
                databaseName: "ToDoList",
                collectionName: "Items",
                ConnectionStringSetting = "CosmosDBConnection",
                SqlQuery = "select * from Items i where i.isComplete")
            ]IEnumerable<ToDoItem> toDoItems,
            ILogger log)
        {
            log.LogInformation($"Function triggered");

            if (toDoItems == null)
            {
                log.LogInformation($"No complete Todo items found");
            }
            else
            {
                var ltodoitems = (List<ToDoItem>)toDoItems;
                if (ltodoitems.Count == 0)
                {
                    log.LogInformation($"No complete Todo items found");
                }
                else
                {
                    log.LogInformation($"{ltodoitems.Count} Todo items found");
                }
            }

            return new OkObjectResult(toDoItems);
        }
    }
  

We will now explore this function in more detail:   

[FunctionName("GetCompleteItems")]        

The name of the function is “GetCompleteItems”.

public static async Task<IActionResult> Run(

The Run method executes when the function is triggered. This method is asynchronous and will eventually return an ActionResult.

[HttpTrigger(AuthorizationLevel.Function, "get", Route = null)] HttpRequest req,

The first parameter is the incoming HTTP Request. It is decorated with HttpTrigger, indicating this is an HTTP trigger. Within this decorator's parameters, we indicate that the function requires a function key, that it can only be called with an HTTP GET; and that we are not changing the default routing.

[CosmosDB(
databaseName: "ToDoList",
collectionName: "Items",
ConnectionStringSetting = "CosmosDBConnection",
SqlQuery = "select * from Items i where i.isComplete")
]IEnumerable<ToDoItem> toDoItems,

This parameter is what will be returned by the function (eventually, because it runs asynchronously). It is a list of objects of type ToDoItem. When serialized, this will be transformed into an array of JSON objects. This parameter is decorated with the CosmosDB attribute, indicating that we will automatically retrieve the list from the CosmosDB. The databaseName, collectionName, and ConnectionStringSetting tell the function exactly where to find the documents. The SqlQuery tells it what query to run to retrieve the data (in this case, return only documents with isComplete = true). It is important to note that I am using the JSON property (“isComplete”), rather than the .NET class property (“IsComplete”) in this query. Even though they differ only in their case, the query is case-sensitive.

ILogger log)

The logger allows us to log information at points in the function, which can be helpful when troubleshooting and debugging.

log.LogInformation($"Function triggered");
if (toDoItems == null)
{
    log.LogInformation($"No complete Todo items found");
}
else
{
    var ltodoitems = (List<ToDoItem>)toDoItems;
    if (ltodoitems.Count == 0)
    {
        log.LogInformation($"No complete Todo items found");
    }
    else
    {
        log.LogInformation($"{ltodoitems.Count} Todo items found");
    }
}

We did not need to write code to query the database. This happens automatically. The code above simply verifies that items were returned, casts them to a List<ToDoItem>, and stores the list in a local variable.

return new OkObjectResult(toDoItems);

We return a 200 (“OK”) HTTP response and the list of items.

Conclusion

Notice that in each of these functions, I did not need to write code to query or update the database. By decorating a parameter with the CosmosDb attribute, the function automatically took care of the database operations.

You can find this code in the CosmosDBBinding solution in my Azure Function demos on GitHub.

Wednesday, 21 November 2018 09:07:00 (GMT Standard Time, UTC+00:00)
# Tuesday, 20 November 2018

In previous articles, I showed how to create Azure Function Apps and Azure Functions directly in the Azure Portal. You can also create Function Apps and Functions in Visual Studio and then deploy them to Azure. I prefer to do this, because it makes it easier to get my code into source control.

Before working with and creating Azure artifacts in Visual Studio, you must install the Azure tools. To install these tools, launch the Visual Studio Installer and check "Azure Development", as shown in Fig. 1.

AF01-AzureDevTools
Fig. 1

Once the Azure tools are installed, launch Visual Studio and select File | New | Project  from the menu, as shown in Fig. 2.

AF02-FileNewProject
Fig. 2

In the "New Project" dialog, expand Visual C# | Cloud in the left tree and select "Azure Functions" from the list of templates; then enter a project name and location, as shown in Fig. 3.

AF03-AzureFunctionTemplate
Fig. 3

The next dialog (Fig. 4) presents a list of options for your Azure Function.

AF04-FunctionOptions
Fig. 4

In the top dropdown, select "Azure Functions v2".

Select "Http Trigger" to create a function that will be triggered by an HTTP GET or POST to a web service URL.

At the "Storage Account" dropdown, select "Storage Emulator". This works well for running and testing your function locally. You can change this to an Azure Storage Account when you deploy the Function to Azure.

At the "Access rights" dropdown, select "Function".

Click the [OK] button to create an Azure Function App with a single Azure Function.

A function is generated with the following code:

[FunctionName("Function1")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    string name = req.Query["name"];

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
}
  

Listing 1

The method is decorated with the "FunctionName" attribute, which provides the name of the function.

[FunctionName("Function1")]
  

Notice that the first parameter is decorated with

[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]
  

This tells the system that the Function is triggered by an HTTP request and that it will accept either a GET or a POST verb.

We also pass in an ILogger, so that we can output debugging information.

Let's walk through the code in this function

Log some information, so we can confirm the function was properly triggered.

log.LogInformation("C# HTTP trigger function processed a request.");
  

If a "name" parameter is passed in the querystring, capture the value of this parameter.

string name = req.Query["name"];
  

If this is a POST request, there may be information sent in the request body. Retrieve this information and deserialize the JSON into an object:

string requestBody = await new StreamReader(req.Body).ReadToEndAsync(); 
dynamic data = JsonConvert.DeserializeObject(requestBody);
  

If the "name" parameter was passed in the querystring, use that; if not, look for it in the JSON object from the request body.

name = name ?? data?.name;
  

If a "name" parameter was found, return an HTTP Response Code 200 (OK) with a body containing the text "Hello, " followed by the value of the name.

If no "name" parameter was passed, return an HTTP Response Code 400 (Bad Request) with a message in the body indicating a name is required.

return name != null 
    ? (ActionResult)new OkObjectResult($"Hello, {name}") 
    : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
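
Before publishing, you can check both code paths against the local Functions host. The sketch below is mine, not generated code; the port and route are assumptions based on what the local runtime typically prints when you run the project:

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    class LocalSmokeTest
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                // 1. Pass "name" on the querystring (the GET path)
                var get = await client.GetAsync("http://localhost:7071/api/Function1?name=Azure");
                Console.WriteLine(await get.Content.ReadAsStringAsync());    // Hello, Azure

                // 2. Pass "name" in the JSON body instead (the POST path)
                var body = new StringContent("{ \"name\": \"Azure\" }", Encoding.UTF8, "application/json");
                var post = await client.PostAsync("http://localhost:7071/api/Function1", body);
                Console.WriteLine(await post.Content.ReadAsStringAsync());   // Hello, Azure
            }
        }
    }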
  

Publish App

One quick way to publish a Function App to Azure is directly from Visual Studio. To do this, right-click the project in the Solution Explorer and select "Publish" from the context menu, as shown in Fig. 5.

AF05-RightClickPublish
Fig. 5

The "Pick a publish target" dialog displays, as shown in Fig. 6.

AF06-PickPublishTarget
Fig. 6

Check the "Run from ZIP" checkbox.

Select either the "Create New" or "Select Existing" radio button, depending on whether you wish to deploy to an existing or a newly-created Azure Function; then click the [Publish] button.

The follow-up dialog if you select "Create New" is shown in Fig. 7a and for "Select existing" in Fig. 7b.

Click the [OK] or [Create] button at the bottom of the follow-up dialog to deploy the Function.

This article showed how to create an Azure Function App in Visual Studio, making it easier to test locally and integrate your code with source control.

Tuesday, 20 November 2018 09:41:00 (GMT Standard Time, UTC+00:00)
# Monday, 19 November 2018

Episode 538

Jeff Fritz on Live Streaming Coding

Jeff Fritz uses twitch.tv to live stream while he codes with others. He talks about how, why, and when he does it.

Monday, 19 November 2018 07:16:00 (GMT Standard Time, UTC+00:00)
# Sunday, 18 November 2018

GANGConf (1)

Sometimes you can go home again.

I was a member of the Great Lakes Area .NET User Group (a.k.a. GANG) for years and spent some time on the board, including 2 years as President. But I've had much less interaction with them since joining Microsoft and moving to Chicago in 2014.

So, I was excited when my friend Ondrej called to tell me that GANG was hosting a conference and I could speak there if I wanted. I wanted to be a part of this event, so I made the trek back to Detroit.

The event was held at the Microsoft offices in downtown Detroit. About 70 people came to hear presentations on both technical topics and soft skills.

GANGConf (2)

Cassandra Faris opened the conference, telling people how to manage and promote their personal brand.

J Tower was next with a presentation on how to use .NET Standard to share code among different types of applications and platforms.

I wrote a presentation about Azure Functions and delivered it for the first time at this event.

Kevin Davis's presentation titled "Living your Best (Developer) Life" talked about how to choose and manage your career.

Aydin Akcasu had the best demos of the day, showing Bluetooth devices integrating with the Chrome web browser.

Finally, Daniel Davis described the benefits of clean code and how to achieve it.

The event reminded me of a similar Saturday event I hosted to celebrate GANG's 10-year anniversary back in 2011. This is the second year in a row that GANG has held GANGConf, and president Ryan Albertson promised to do it again next year. I hope to be there again for it.

GANGConf (3)

Sunday, 18 November 2018 08:33:00 (GMT Standard Time, UTC+00:00)
# Saturday, 17 November 2018

DoAndroidsDream

The nuclear fallout from World War Terminus has killed most of earth's animals, left a cloud of radioactive dust across the planet, and encouraged the people of Earth to emigrate to colonies on other planets. Those left on earth need to find a way to survive in a polluted and chaotic world.

The biggest advance that science has brought is the creation of androids - creatures that look exactly like humans and are designed to serve humans on the off-world colonies. But androids lack empathy and sometimes they escape their servitude, kill their human masters, and hide among the humans of Earth.

Rick Deckard is a bounty hunter, tasked with tracking down renegade androids. The police have developed a test to identify androids based on their lack of empathy.

In one 2-day period, Deckard tracks down 6 killer androids and struggles with his own purpose in life.

Do Androids Dream of Electric Sheep by Philip K. Dick tells the story of these two days and Deckard's pursuit of the androids. But mostly it tells of how Deckard and the rest of Earth's people have lost their own humanity. They spend their days using machines to alter their moods and collecting animals as status symbols (or lifelike mechanical animals, if they cannot find the real ones). Deckard himself is troubled by the empathy he feels towards the androids he is hired to destroy.

The story's title refers to Deckard's pet electric sheep that he keeps in order to impress his wife and neighbors.

Dick does a masterful job painting a dystopian society. The post-nuclear-war world is filthy and gray and empty, and people struggle to maintain a sense of normalcy. The world outside is so bland that they use a "Mood Organ" - a mood-altering machine - to dial their emotions up or down. They have latched onto a religion based on a VR recreation of a martyr experiencing a stoning. The most popular TV show features Buster Friendly, a goofy host who holds his audiences in near-religious control.

Dick focuses on Deckard's struggle to find meaning in life. He questions his job: destroying androids for whom he feels empathy. He does it for the money in hopes of making his wife happy with a live animal. But his wife spends her days distracted by the Mood Organ.

Society demonizes androids for their lack of empathy, but many humans lack this same quality: the radioactive fallout caused brain damage in some humans, and no one cares about them; and Deckard's years of bounty hunting take their toll on his ability to empathize.

It's worth noting that Ridley Scott's excellent 1982 movie "Blade Runner" is loosely based on this novel, which certainly boosted the book's popularity. But the book is far more cerebral than the movie, exploring themes of religion and human nature and humanity. 

"Do Androids Dream of Electric Sheep" is recommended to any fan of science fiction.

Saturday, 17 November 2018 09:30:00 (GMT Standard Time, UTC+00:00)
# Friday, 16 November 2018

In a previous article, I showed you how to create a new Azure Function with an HTTP trigger.

After you create an Azure Function, it is useful to be able to test it right in the Azure Portal.

To test an Azure function, log into the Azure Portal, open the Function App, and select your Function, as shown in Fig. 1.

TF01-Function
Fig. 1

Click the [Run] button (Fig. 2) above the Function to open a Log output window and a Testing dialog, as shown in Fig. 3.

TF02-RunButton
Fig. 2

TF03-TestDialog
Fig. 3

In the Test dialog on the right, you can change the HTTP verb by selecting either "POST" or "GET" in the "HTTP method" dropdown, as shown in Fig. 4.

TF04-HttpMethod
Fig. 4

If you select the "POST" HTTP method, the "Request body" section (Fig. 5) is enabled and you can modify the data you want to send in the HTTP Body of your request.

TF05-RequestBody
Fig. 5

You can add querystring parameters to your request by clicking the "+ Add parameter" link under "Query" (Fig. 6) and entering a name and value of the parameter, as shown in Fig. 7.

TF06-QueryParameters
Fig. 6

TF07-AddParameter
Fig. 7

Repeat this for as many querystring parameters as you need.

Similarly, you can add name/value pairs to the HTTP header of your  request by clicking the "+ Add header" link and entering the name and value of each header, as shown in Fig. 8.

TF08-AddHeader
Fig. 8

When everything is configured the way you want, click the [Run] button at the bottom (Fig. 9) to call the web service and trigger your function.

TF09-RunButton
Fig. 9

The "Output" section (Fig. 10) will display the HTTP response, as well as any text returned in the body of the response. Any response between 200 and 299 is good; any response of 400 and above indicates an error.

TF10-Output
Fig. 10

If your function outputs log information, you will see this in the Log output window, as shown in Fig. 11.

TF11-LogOutput
Fig. 11
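
Anything your function passes to its logger appears in this window. For example, in the C# sample function, a line similar to the following (a hedged illustration - in the v2 runtime the logger parameter is an ILogger; in v1 it is a TraceWriter with an Info method) produces an entry in the Log output:

log.LogInformation($"Processing a request for {name}");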

In this article, I showed how to test a function from within the Azure portal. You should create more sophisticated automated tests as part of your build/deploy process, but this serves as a good, simple way to make sure your function is behaving as expected after you create it.
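
As a starting point for such tests, a minimal sketch like the following - using HttpClient, with a made-up function URL and key that you would replace with your own - calls the same function from code, mirroring the "Query" and "Request body" sections of the Test dialog:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class FunctionSmokeTest
{
    // Hypothetical values: replace with your own function URL and key.
    private const string FunctionUrl =
        "https://myfunctionapp.azurewebsites.net/api/HttpTrigger1?code=YOUR_FUNCTION_KEY";

    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // GET with a querystring parameter, like the "Query" section of the Test dialog
            HttpResponseMessage getResponse = await client.GetAsync(FunctionUrl + "&name=David");
            Console.WriteLine($"GET: {(int)getResponse.StatusCode}");
            Console.WriteLine(await getResponse.Content.ReadAsStringAsync());

            // POST with a JSON body, like the "Request body" section of the Test dialog
            var body = new StringContent("{ \"name\": \"David\" }", Encoding.UTF8, "application/json");
            HttpResponseMessage postResponse = await client.PostAsync(FunctionUrl, body);
            Console.WriteLine($"POST: {(int)postResponse.StatusCode}");
            Console.WriteLine(await postResponse.Content.ReadAsStringAsync());
        }
    }
}

Running something like this from your build pipeline catches regressions that are easy to miss when testing by hand.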

Friday, 16 November 2018 19:06:00 (GMT Standard Time, UTC+00:00)
# Thursday, 15 November 2018

GCast 22:

Creating an Azure Function Proxy

Learn how to create a proxy URL using Azure Functions

Thursday, 15 November 2018 09:49:00 (GMT Standard Time, UTC+00:00)
# Wednesday, 14 November 2018

In the last article, I showed how to create an Azure Function App. A Function App is not useful by itself: it is just a container for functions, which perform the real work.

Once you have created an Azure Function App, you will want to add one or more Functions to it.

Navigate to the Azure Portal, log in, and open your Function app, as shown in Fig. 1.

Fu01-FunctionApp
Fig. 1

Click either the [+] icon next to the "Functions" section on the left (Fig. 2) or the [New function] button at the bottom (Fig. 3).

Fu02-NewFunctionIcon
Fig. 2

Fu03-NewFunctionButton
Fig. 3

NOTE: If this Function App already contains at least one function, the [New function] button does not display.

The "CHOOSE A DEVELOPMENT ENVIRONMENT" page of the "Azure Functions for .NET - getting started" dialog displays, as shown in Fig. 4

Fu04-ChooseDevEnv
Fig. 4

Select the [In-portal] tile and click the [Continue] button to advance to the "CREATE A FUNCTION" page, as shown in Fig. 5.

Fu05-CreateAFunction
Fig. 5

Two triggers are listed: "Webhook+API", which will cause your function to execute after a web service URL is hit; and "Timer", which allows you to schedule your function to run at regular intervals. You can see more triggers by clicking the "More templates…" tile; but, for this demo, select the [Webhook+API] tile and click the [Create] button. After a few seconds, a function is created with an HTTP trigger and some sample code, as shown in Fig. 6.

Fu06-NewFunction
Fig. 6

This sample function accepts a "name" parameter (either in the querystring or in the Body of a POST request) and returns an HTTP 200 (OK) response with the string "Hello, ", followed by the value of the name parameter. If no "name" parameter is supplied,  it returns a 400 (Bad Request) response with an error message.
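
For reference, the generated script (typically named run.csx) looks similar to the following sketch; the exact code may vary slightly between runtime versions:

#r "Newtonsoft.Json"

using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    // Look for "name" in the querystring...
    string name = req.Query["name"];

    // ...then fall back to a "name" property in the request body.
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    // Return 200 (OK) with a greeting, or 400 (Bad Request) if no name was supplied.
    return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
}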

You can now modify and save this code as you like.

In the next article, I will show you how to test this function within the portal.

Wednesday, 14 November 2018 09:59:00 (GMT Standard Time, UTC+00:00)
# Tuesday, 13 November 2018

An Azure Function allows you to deploy scalable code to the cloud without worrying about the server or other infrastructure issues.

Azure Functions are contained within a Function App, so you need to create a Function App first.  To create a Function App, navigate to the Azure Portal, sign in and click the [Create a resource] button, as shown in Fig. 1.

FA01-CreateAResource
Fig. 1

From the menu, select Compute | Function App, as shown in Fig. 2.

FA02-ComputeFunctionApp
Fig. 2

The "Create Function App" blade displays as shown in Fig. 3

FA03-CreateFunctionAppBlade
Fig. 3

At the "App Name" field, enter a unique name for your Function App.

At the "Subscription" field, select the Azure subscription with which to associate this Function App. Most people will have only one subscription.

At the "Resource Group" field, select "Create new" and enter the name of a Resource Group to create or select "Use existing" and select an existing resource group in which to store your Function App. A Resource Group is an organizational grouping of related assets in Azure.

At the "OS" radio button, select the operating system (Windows or Linux) on which you wish to host your Function App.

At the Hosting plan, select either "Consumption Plan" or "App Service Plan". With the Consumption Plan, you only pay for the time that your functions are running. Since most functions do not run 24 hours a day / 7 days a week, this can be a real cost savings. With the App Service Plan, you pay as long as your functions are available. This is appropriate if you expect clients to be constantly calling your functions.

At the "Location" field, enter a region in which you want your Functions to run. In order to minimize latency, you should select a region close to any resources with which the Functions will interact.

At the "Runtime Stack" dropdown, select one of the platforms. Select ".NET" if you plan to write your code in C# or F#. Select "JavaScript" if you plan to create a node function. Select "Java" if you plan to write your code in Java. As of this writing, Java is in Preview, so performance is not guaranteed.

If you selected "Consumption Plan" hosting plan, you will be prompted for a storage account. Function definitions will be stored in this account. Select an existing storage account or create a new one. I prefer to use a storage account for all my Function Apps in a given Resource Group.

For extra monitoring, turn on Application Insights and select the same region in which your Function App is located. If this region is not available, select a nearby region.

Click the [Create] button to create your Function App.

After your Function App is created, you will want to add a Function to it. I will show how to do this in the next article.

Tuesday, 13 November 2018 09:54:00 (GMT Standard Time, UTC+00:00)
# Monday, 12 November 2018

Episode 537

Robert Greene on DevOps

Robert Greene defines DevOps, discusses its advantages, and describes how to accomplish it with Microsoft tooling.

Monday, 12 November 2018 09:29:00 (GMT Standard Time, UTC+00:00)
# Sunday, 11 November 2018

Frankenstein
Frankenstein is the story of Victor Frankenstein, a scientist who discovers a way to re-animate dead tissue and uses this knowledge to build a giant, grotesque creature. Victor is repulsed by his creation and rejects it, which angers the creature and inspires him to seek revenge on his creator by destroying those closest to him.

The story of Frankenstein is familiar to almost everyone - primarily through the 1931 movie and the works that it inspired. But the original novel by Mary Wollstonecraft Shelley, published in 1818, offers much that later interpretations do not.

The novel differs from most later interpretations of the story in how much the creature displays complex feelings and motivations. Unlike Karloff's mute, shuffling monstrosity, Shelley's monster is self-educated, literate, and articulate. He feels the betrayal of his maker and the pain of rejection by the world. He takes out this pain by destroying everyone that Victor loves.

In this classic horror novel, Shelley explores the dangers of playing God, the power of loneliness, and the fear of losing all that we love. She even brings in the responsibilities that fathers have for their children - a topic at least as relevant today as it was 200 years ago. Victor Frankenstein is the ultimate deadbeat dad: he runs from his responsibilities, abandoning his offspring when he is needed most.

There are two evil creatures in this book. Victor shirks his responsibilities and abandons the creature he created, primarily because of its physical deformities. The monster dreamed of love and acceptance but turned to evil and violence and revenge when he was rejected by his creator and by the world.

Mary Shelley was a pioneer in the science fiction and horror genres, and the longevity of "Frankenstein" is a testament to this standing.

Sunday, 11 November 2018 09:29:00 (GMT Standard Time, UTC+00:00)
# Saturday, 10 November 2018

JeffLorber
It took 40 years of recording for Jeff Lorber to finally win a Grammy. After 6 times a bridesmaid, "Prototype" by The Jeff Lorber Fusion won the award for Best Contemporary Instrumental Album this January.

And it took just as long for me to finally see The Jeff Lorber Fusion in concert, which I did Wednesday night at The Promontory in Chicago's Hyde Park.

Reserved seating was listed as "Sold Out" but I managed to get a seat at a table, thanks to someone canceling and me showing up an hour before the doors opened.

The concert was a fundraiser for the Musical Arts Institute, so the evening began with performances by a number of members and students of the institute. It was an entertaining half hour of music, but the performers were definitely amateurs - many of them in their teens.

Not so with the main act. Jeff Lorber is as professional as they come, and he brought technical proficiency and high-energy jazz fusion throughout his set.

Lorber played with a trio that included two local Chicagoans, including bassist Michael Manson (director of the Musical Arts Institute), who nearly stole the show with his amazing playing and a face that morphed like a jellyfish.

But it was Jeff Lorber's show and he did not disappoint. The energy remained high throughout the night and every song was great. Highlights included "What's the Deal" and "Test Drive" from his Grammy-winning album; "Montserrat" from "Galaxy"; and "Rain Dance", a song that has been sampled by multiple artists, including Notorious B.I.G., Ariana Grande, Ja Rule, and Mariah Carey.  I don't recall any ballads during the show.

The trio did not play a long set - perhaps 90 minutes - but the audience left feeling we got our money's worth.

Photos

Saturday, 10 November 2018 09:26:00 (GMT Standard Time, UTC+00:00)
# Friday, 09 November 2018

Azure Functions provide a simple way to deploy code in a scalable, cost-effective way.

The beauty of Azure Functions is that the developer does not need to worry about where they are deployed. Azure takes care of spinning up a server and other resources at the appropriate time and scaling out a Function as demand increases. The infrastructure is abstracted away from the developer, allowing the developer to focus on the code and the business problem.

Azure functions can be written in C#, F#, JavaScript, or Java.

Each Function has a "Trigger" which, as the name implies is an event that causes the function code to run. This can be an HTTP request, a message on a queue or message bus, delivery of an email, data inserted into a blob storage container or CosmosDB database, or a timed interval.

Triggers are just one way that Azure functions can easily connect to other services. There are also bindings available to interact with databases, queues, and other services with a minimum of code.
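
As a hypothetical illustration - the function name, queue name, and route below are made up, and the [Queue] attribute assumes the Azure Storage binding extension is installed - a compiled C# function might combine an HTTP trigger with a queue output binding like this:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class EnqueueGreeting
{
    [FunctionName("EnqueueGreeting")]
    public static IActionResult Run(
        // Trigger: the function runs when this HTTP endpoint is called.
        [HttpTrigger(AuthorizationLevel.Function, "get", "post")] HttpRequest req,
        // Binding: whatever is assigned to this parameter is written to the "greetings" queue.
        [Queue("greetings")] out string queueMessage)
    {
        string name = req.Query["name"];
        queueMessage = $"Hello, {name}";
        return new OkObjectResult($"Queued a greeting for {name}");
    }
}

The attributes declare both the trigger and the binding, so the function body never has to construct a queue client itself.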

One nice feature of Azure Function Apps is the "Consumption Plan" pricing model. Selecting this plan means that you are only charged while your function is running, which can save plenty of money - particularly if your app is not running 24 hours a day every day. Of course, you can also choose to run your function as part of an App Service Plan, in which case you will pay for the entire time the function is available, whether or not it is running. This may be desirable if you already have an App Service Plan running and want to include your functions in that same plan.

You can create functions directly in the Azure Portal. Or you can create them locally using tools like Visual Studio and Visual Studio Code and deploy them either directly from the IDE or through your continuous integration processes.

The source code for the Azure Functions run-time is even open source! Check out the code at https://github.com/Azure/azure-functions-host.

You can get a free Azure account at http://azure.com. You can read more about Azure functions at https://docs.microsoft.com/en-us/azure/azure-functions.

In upcoming articles, I'll show you how to create, deploy, test, and manage Azure functions.

Friday, 09 November 2018 09:25:00 (GMT Standard Time, UTC+00:00)
# Thursday, 08 November 2018

GCast 21:

Azure Functions Continuous Deployment

Learn how to configure continuous deployment from GitHub to Azure functions. Each time you push code changes to GitHub, that code is automatically deployed to Azure.

Thursday, 08 November 2018 09:48:00 (GMT Standard Time, UTC+00:00)
# Wednesday, 07 November 2018

In the last article, I showed how to create a new Azure CosmosDB account. In this article, I will show how to add a database with containers to that account.

Navigate to the Azure portal and sign in; then open your Azure CosmosDB account. You may be directed to either the "Quick start" blade (Fig. 1) or the "Overview" blade (Fig. 2).

Co01-CosmosDBAccountQuickStart
Fig. 1

Co02-CosmosDBAccountOverview
Fig. 2

Open the "Data Explorer" blade, as shown in Fig. 3.

Co03-CosmosDBAccountDataExplorer
Fig. 3

For a newly-created account, no databases are listed.

To create a new database, click the [New Database] button (Fig. 4).

Co04-NewDatabaseButton
Fig. 4

The "New Database" blade displays, as shown in Fig. 5.

Co05-NewDatabaseBlade
Fig. 5

At the "Database id" field, enter a unique id for your database.

Click the [OK] button.

After a few seconds, a new database will display in the "Data Explorer" blade, as shown in Fig. 6.

Co06-Database
Fig. 6

In a CosmosDB database, documents are stored within collections, which help you to organize your data. To create a new collection, right-click the database and select "New Collection" from the context menu, as shown in Fig. 7.

Co07-NewCollectionMenu
Fig. 7

The "Add Collection" blade displays, as shown in Fig. 8

Co08-AddCollectionBlade
Fig. 8

At the "Collection Id" field, enter a name for your collection. Collection names must be unique within a database.

Select the appropriate Storage capacity. If you expect to store a small amount of data, select "Fixed"; for databases expected to grow beyond 10GB, select "Unlimited".

If you select unlimited, you can specify a path within each document to find the Partition Key. A Partition Key is used to determine which data to keep together when distributing data across multiple servers.

At the "Throughput" field, enter the number of 1K documents per second you need to process. You will pay more for higher throughput, so consider whether you need a higher throughput before increasing this value.

Click the [OK] button to create the collection.

The new collection will display beneath the database as shown in Fig. 9. You may need to expand the tree in order to see the collection.

Co09-DBandCollection
Fig. 9

You can write to and read from this collection programmatically, or you can upload documents within the portal by clicking the [Upload] button, as shown in Fig. 10.

Co10-UploadButton
Fig. 10
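
If you prefer to create the database and collection from code rather than in the portal, a sketch along the following lines - using the .NET SDK current at the time of writing (Microsoft.Azure.DocumentDB); the account URI, key, and names are placeholders - performs roughly the same steps:

using System;
using System.Collections.ObjectModel;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

class CosmosSetup
{
    static async Task Main()
    {
        // Placeholders: use your account's URI and primary key from the "Keys" blade.
        var client = new DocumentClient(new Uri("https://myaccount.documents.azure.com:443/"), "PRIMARY_KEY");

        // Create the database if it does not already exist.
        await client.CreateDatabaseIfNotExistsAsync(new Database { Id = "MyDatabase" });

        // Create an "Unlimited"-style collection with a partition key and 400 RU/s of throughput.
        var collection = new DocumentCollection
        {
            Id = "MyCollection",
            PartitionKey = new PartitionKeyDefinition { Paths = new Collection<string> { "/customerId" } }
        };
        await client.CreateDocumentCollectionIfNotExistsAsync(
            UriFactory.CreateDatabaseUri("MyDatabase"),
            collection,
            new RequestOptions { OfferThroughput = 400 });

        // Write a document to the new collection.
        await client.CreateDocumentAsync(
            UriFactory.CreateDocumentCollectionUri("MyDatabase", "MyCollection"),
            new { id = Guid.NewGuid().ToString(), customerId = "c1", name = "David" });
    }
}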

Data in a CosmosDB account is stored in databases and collections. This article showed how to create these.

Wednesday, 07 November 2018 06:36:00 (GMT Standard Time, UTC+00:00)
# Tuesday, 06 November 2018

Azure CosmosDB is a flexible, fast, reliable, scalable, geographically distributed NoSQL database.

You can create a CosmosDB account and database in the Azure portal.

Navigate to the Azure portal and log in.

Click the [Create a resource] button, as shown in Fig. 1.

CDB01-CreateResourceButton
Fig. 1

From the menu, select Database | Azure CosmosDB, as shown in Fig. 2.

CDB01-DatabaseCosmosDb
Fig. 2

The "Create Azure CosmosDB Account" blade displays, as shown in Fig. 3.

CDB03-CreateAzureCosmosDbAccount
Fig. 3

At the Subscription dropdown, select your Azure subscription. Most of you will have only one subscription.

At the Resource Group dropdown, select an existing resource group or click "Create new" to display the New Resource Group dialog, as shown in Fig. 4.

CDB04-CreateResourceGroup
Fig. 4

In the New Resource Group dialog, enter a unique name for your resource group and click the [OK] button.

At the "API" dropdown, select the API you want to use to access the databases in this account, as shown in Fig. 5.  Options are

  • Core (SQL)
  • MongoDB
  • Cassandra
  • Azure Table
  • Gremlin (graph)

CDB05-Api
Fig. 5

If you are migrating data from another database, you may want to choose the API that resembles your old database in order to minimize changes to the client code accessing the database. If this is a new database, you may wish to choose the API with which you and your team are most familiar.

At the "Location" dropdown, select a region in which to store your data. It is a good idea to keep your data near your users and/or near any services that will interact with your data.

The "Geo-Redundancy" and "Multi-region writes" options allow you to globally distribute your data. There is an extra charge for enabling these features.

You can enable Geo-Redundancy by clicking the [Enable] button next to "Geo-Redundancy". This creates a copy of your data in another nearby region and keeps that data in sync.

Click the [Enable] button next to "Multi-region writes" if you wish to allow data to be written in multiple regions. This will improve the performance when writing data to the database.

Notice the tabs at the top of the page (Fig. 6). The "Basics" tab displays first, but the "Network", "Tags", and "Summary" tabs are also available.

CDB06-Tabs
Fig. 6

The "Network" tab (Fig. 7) allows you to add your CosmosDB account to a specific Virtual Network and Subnet. This is not required.

CDB07-NetworkTab
Fig. 7

The "Tags" tab (Fig. 8) allows you to assign metadata to this CosmosDB account, which may help when grouping together related accounts on a report. This is not required.

CDB08-TagsTab
Fig. 8

The "Summary" tab (Fig. 9) displays all the options you have chosen and validates that you completed the required responses and that all responses are consistent. You can navigate to this tab by clicking the "Summary" tab link at the top or by clicking the [Review + create] button on any other tab.

CDB09-SummaryTab
Fig. 9

Click the [Create] button to begin creating your CosmosDB account. This will take a few minutes. A message displays as shown in Fig. 10 when the account is created and deployed.

CDB10-DeploymentComplete
Fig. 10

As you can see, there are a number of links to documentation and tutorials.

Click the [Go to resource] button to open the CosmosDB account. By default, the "Quick start" blade displays, as shown in Fig. 11.

CDB11-CosmosDBQuickStartPage
Fig. 11

In this article, I showed how to create a new Azure CosmosDB account. In the next article, I will show how to add a database with containers to that account.

Tuesday, 06 November 2018 06:28:00 (GMT Standard Time, UTC+00:00)
# Monday, 05 November 2018

Episode 536

Hao Luo on Rust

Hao Luo talks about the Rust programming language, how it works, and how he is using it.

Monday, 05 November 2018 07:24:00 (GMT Standard Time, UTC+00:00)
# Sunday, 04 November 2018

11/4
Today I am grateful for my first visit to Montreal in almost 12 years.

11/3
Today I am grateful to visit St. Ours, Quebec yesterday - home of some of my ancestors.

11/2
Today I am grateful for Montreal smoked meats and poutine.

11/1
Today I am grateful for dinner last night at a rotating Portuguese restaurant high above Montreal with Brent, LaBrina, and Sarah.

10/31
Today I am grateful to attend my first home Montreal Canadiens game yesterday.

10/30
Today I am grateful for dinner with Nick last night.

10/29
Today I am grateful to attend an exciting Red Wings victory at Little Caesars Arena last night.

10/28
Today I am grateful to be my nephew Sterling's sponsor for his confirmation last night.

10/27
Today I am grateful for dinner last night with Esteban.

10/26
Today I am grateful for a week in Dallas.

10/25
Today I am grateful for dinner last night with Tristan, Joe, Timothy, Sergii, Danny, Denis, and Ashley.

10/24
Today I am grateful to attend a Dallas Stars home game for the first time.

10/23
Today I am grateful for dinner last night with Amanda, Timothy, and Sergii.

10/22
Today I am grateful for Texas BBQ.

10/21
Today I am grateful for all the good restaurants in Chicago.

10/20
Today I am grateful to wake up to this view every morning.

10/19
Today I am grateful for free pound cake and ice cream at the Grand Opening of a local bakery yesterday.

10/18
Today I am grateful for the colors of autumn.

10/17
Today I am grateful for a hot bath last night.

10/16
Today I am grateful for dinner with Corey last night.

10/15
Today I am grateful for dinner with Ryan last night, where I was able to introduce him to his first Chicago-style hot dog.

10/14
Today I am grateful that, after 5 years, 3 roles, and 5 managers, I still enjoy working at Microsoft.

10/13
Today I am grateful for an empty seat next to me on my flights to and from Seattle.

10/12
Today I am grateful for the DJ playing so much non-American music at last night's party.

10/11
Today I am grateful to Brent for bringing homemade BBQ to Seattle to share with us.

10/10
Today I am grateful to learn from my teammates.

10/9
Today I am grateful for dinner last night in downtown Seattle with Brent, Carl, and Michael.

10/8
Today I am grateful to meet Richard for a drink last night.

10/7
Today I am grateful that, for the first time in my life, I have hired someone to clean my home twice a month.

Sunday, 04 November 2018 15:28:39 (GMT Standard Time, UTC+00:00)
# Saturday, 03 November 2018
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

These are the 3 laws of robotics. They are built into the core technology of a robot's positronic brain and no robot may violate them. All robotic technology is built on top of these laws, ensuring that robots will be safe.

In the 1940s, Isaac Asimov wrote a series of stories speculating on the future evolution of robots and, in 1950, he compiled them into a single volume titled I, Robot. Although only loosely connected, most of the stories include Dr. Susan Calvin, a  Robopsychologist at U.S. Robotics, the only company able to manufacture these machines.

IRobot
Many of the stories in this book revolve around the 3 laws - particularly exploring what happens when the laws come into conflict or when an ambiguous situation makes it difficult for a robot to interpret and apply the laws. The greater the conflict, the more stress placed on a positronic brain, which is why robots need a psychologist and humans need Susan Calvin to help them understand robots.

These stories launched a series of very good Robot novels for Asimov, who eventually tied the universe in which his robots existed into his Empire and Foundation series. Asimov's robots and the Robotics Laws influenced many other books and movies featuring mechanical men; and even influenced the real world field of robotics, as his three laws are often brought up when discussing the ethics of the technology. In fact, the term "robotics" was invented by Asimov and first appeared in a story in this collection.

I, Robot succeeds because it is based on plausible scientific principles and because it raises questions that science would be likely to encounter as it advances. There are no strong characters throughout the collection (Calvin is a minor character in most of the stories), but the plots and the ethical questions they raise carry the book along well.

My favourite story is "Liar", which is about a robot that gains the ability to read minds and uses this power to do what he believes will bring no harm to humans. It is a parable of the result of good intentions wrongly applied.

I, Robot is a reminder that advancing technology brings with it ethical choices and questions. Any fan of science fiction will enjoy it.

Saturday, 03 November 2018 08:21:00 (GMT Standard Time, UTC+00:00)
# Friday, 02 November 2018

Sometimes, you want to store quotation marks within a string, as in the following example:

"Alive", she cried!

In C#, there are at least 4 ways to embed a quote within a string:

  1. Escape quote with a backslash
  2. Precede string with @ and use double quotes
  3. Use the corresponding ASCII character
  4. Use the Hexadecimal Unicode character

Escape with backslash

You can precede a quotation mark with a backslash ("\") to escape it, preserving the character within the string.

For example:

var lyrics = "\"Alive\", she cried!"
  

Precede with @ and use double quotes

If you precede the string with the "@" character, you can use a double set of quotation marks to indicate a single set of quotation marks within a string.

For example:

var lyrics = @"""Alive"", she cried!"
  

Use the ASCII character

A double quote has the ASCII value 34, so you can cast this value to a char and concatenate it into your string.

For example:

var quote = (char)34 + "Alive" + (char)34 + ", she cried!";
  

Use the Hexadecimal Unicode character

You can escape Unicode characters by preceding the hexadecimal value with "\u". The hexadecimal value of a double quote is 0022, so you can include this in your string.

For example:

quote = "\u0022Alive\u0022, she cried!";
  

These techniques work for many other characters that are difficult to represent within quotation marks, such as line feeds, non-English characters, and non-printing characters.
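
For example, the same escape techniques handle a line feed or an accented character (the values below are made up for illustration):

var twoLines = "Line one\nLine two";   // "\n" embeds a line feed
var cafe = "Caf\u00E9";                // "\u00E9" is the Unicode escape for "é"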

There are other ways to include quotation marks within a C# string, but these should be enough to get you started.
C#
Friday, 02 November 2018 07:08:00 (GMT Standard Time, UTC+00:00)
# Thursday, 01 November 2018
Thursday, 01 November 2018 09:02:00 (GMT Standard Time, UTC+00:00)