# Saturday, June 29, 2019

Housekeeping by Marilynne Robinson is filled with water and filled with tragedy.

It opens with a train plunging into a lake, killing hundreds, including the grandfather of Ruthie and Lucille. Later, the girls' mother commits suicide by driving her car into the same lake. Abandoned years earlier by their father, the girls grow up under the care of their grandmother and aunts until eccentric Aunt Sylvie shows up and moves in.

Sylvie is a former transient who sometimes falls asleep on park benches. She is not cut out for motherhood, and the girls withdraw into one another, making no friends other than each other. They skip school, and the local authorities begin to question their situation, forcing everyone in this family to make a choice.

Housekeeping is a simple story, built on the strength of the characters. Robinson presents humor and tragedy in an eloquent style that keeps the reader engaged. For such a short novel, we see a full picture of the three main characters. It is worth the time to read.

Saturday, June 29, 2019 9:49:00 AM (GMT Daylight Time, UTC+01:00)
# Wednesday, June 26, 2019

Azure IoT Hub allows you to route incoming messages to specific endpoints without having to write any code.

Refer to previous articles (here, here, and here) to learn how to create an Azure IoT Hub and how to add a device to that hub.

To perform automatic routing, you must:

  1. Create an endpoint
  2. Create and configure a route that points to that endpoint
  3. Specify the criteria to invoke that route

Navigate to the Azure Portal and log in.

Open your IoT Hub, as shown in Fig. 1.

ir01-IotHubOverviewBlade
Fig. 1

Click the [Message routing] button (Fig. 2) under the "Messaging" section to open the "Routing" tab, as shown in Fig. 3.

ir02-RoutingButton
Fig. 2

ir03-RoutingBlade
Fig. 3

Click the [Add] button to open the "Add a route" blade, as shown in Fig. 4.

ir04-AddRouteBlade
Fig. 4

At the "Name" field, enter a name for your route. I like to use something descriptive, like "SendAllMessagesToBlobContainer".

At the "Endpoint" field, you can select an existing endpoint to which to send messages. An endpoint is a destination for any messages that meet the specified criteria. By default, only the "Events" endpoint exists, so for a new hub you will probably want to create a new endpoint. To do so, click the [Add] button. This displays the "Add Endpoint" dialog, as shown in Fig. 5.

ir05-AddEndpoint
Fig. 5

At the "Endpoint" dropdown, select the type of endpoint you want to create. Fig. 6 shows the "Add a storage endpoint" dialog that displays if you select "Blob Storage".

ir06-AddStorageEndpointBlade
Fig. 6

At the "Endpoint name" field, enter a descriptive name for the new endpoint.

Click the [Pick a container] button to display a list of Storage accounts, as shown in Fig. 7.

ir07-PickStorageAccount
Fig. 7

Select an existing storage account or click the [+ Storage account] button to create a new one. After you select a storage account, the "Containers" dialog displays, listing all blob containers in the selected storage account, as shown in Fig. 8.

ir08-PickContainer
Fig. 8

Select an existing container or click the [+Container] button to create a new container. Messages matching the specified criteria will be stored in this blob container.

Back at the "Add a storage endpoint" dialog (Fig. 6), you have options to set the Batch frequency, Chunk size window, and Blob file name format.

Multiple messages are bundled together into a single blob.

The Batch frequency determines how frequently messages get bundled together. Lowering this value decreases latency; but doing so creates more files and requires more compute resources.
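To see why a lower batch frequency means more files, consider a quick back-of-the-envelope calculation. This is only an illustrative sketch: `blobsPerDay` is an invented helper, and it assumes messages arrive continuously, so a blob is flushed at most once per interval per partition.

```javascript
// Rough upper bound on blobs written per day, per partition, if one blob
// is flushed every batch interval (invented helper for illustration).
function blobsPerDay(batchFrequencySeconds) {
    return (24 * 60 * 60) / batchFrequencySeconds;
}

blobsPerDay(60);  // 1440 blobs/day at a 60-second batch frequency
blobsPerDay(300); // 288 blobs/day at a 5-minute batch frequency
```

Five times the frequency means one-fifth the files, which is the latency-versus-file-count tradeoff described above.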

Chunk size window sets the maximum size of a blob. If a bundle of messages would exceed this value, the messages will be split into separate blobs.

The Blob file name format allows you to specify the name and folder structure of the blob. Each value within curly braces ({}) represents a variable. Each of the variables shown is required, but you can reorder them, remove slashes to change folders into file-name parts, or add more text to the name, such as a file extension.
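As an illustration of how such a format string expands, here is a sketch. The function `expandBlobName` and the sample values are invented for this example; the format shown ({iothub}/{partition}/{YYYY}/{MM}/{DD}/{HH}/{mm}) matches what the portal pre-fills, but the actual substitution is performed by IoT Hub itself.

```javascript
// Illustrative sketch: expand an IoT Hub blob file name format string
// by substituting each {token} with a supplied value.
function expandBlobName(format, values) {
    return format.replace(/\{(\w+)\}/g, function (match, token) {
        return values[token] !== undefined ? values[token] : match;
    });
}

var blobName = expandBlobName("{iothub}/{partition}/{YYYY}/{MM}/{DD}/{HH}/{mm}", {
    iothub: "myhub",
    partition: "0",
    YYYY: "2019",
    MM: "06",
    DD: "26",
    HH: "08",
    mm: "55"
});
// blobName is "myhub/0/2019/06/26/08/55"
```

Removing the slash between {HH} and {mm}, for example, would turn the minute into part of the file name rather than a folder level.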

Click the [Create] button to create the endpoint and return to the "Add a route" blade, as shown in Fig. 9.

ir09-SaveRoute
Fig. 9

At the "Endpoint" dropdown, select the endpoint you just created.

At the "Data source" dropdown, you can select exactly what data gets routed to the endpoint. Choices are "Device Telemetry Messages", "Device Twin Change Events", and "Device Lifecycle Events".

The "Routing query" field allows you to specify the conditions under which messages will be routed to this endpoint.

If you leave this value as 'true', all messages will be routed to the specified endpoint.

But you can filter which messages are routed by entering something else in the "Routing query" field. Query syntax is described here.
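To give a feel for what a routing query does, here is a toy JavaScript evaluator for the simplest case: a condition comparing one application property to a string value. This is only a sketch of the concept; `matchesRoute` is an invented function, and the real IoT Hub query language is far richer (it can reference system properties, the message body, and more operators), as described in Microsoft's documentation.

```javascript
// Toy illustration only: evaluate a simple 'property = "value"' routing
// condition against a message's application properties. The real IoT Hub
// query grammar supports much more than this sketch.
function matchesRoute(condition, message) {
    if (condition === "true") {
        return true; // the default query routes every message
    }
    var parts = condition.match(/^(\w+)\s*=\s*"([^"]*)"$/);
    if (!parts) {
        return false; // expression not supported by this toy evaluator
    }
    var property = parts[1];
    var expected = parts[2];
    return message.properties[property] === expected;
}

var message = { properties: { temperatureAlert: "true" } };
matchesRoute('true', message);                       // every message matches
matchesRoute('temperatureAlert = "true"', message);  // matches this message
matchesRoute('temperatureAlert = "false"', message); // does not match
```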

Click the [Save] button to create this route.

In this article, you learned how to perform automatic routing for an Azure IoT Hub.

IoT
Wednesday, June 26, 2019 8:55:00 AM (GMT Daylight Time, UTC+01:00)
# Tuesday, June 25, 2019

Data Lake storage is a type of Azure Storage that supports a hierarchical structure.

There are no pre-defined schemas in a Data Lake, so you have a lot of flexibility on the type of data you want to store. You can store structured data or unstructured data or both. In fact, you can store data of different data types and structures in the same Data Lake.

Typically a Data Lake is used for ingesting raw data in order to preserve that data in its original format. The low cost, lack of schema enforcement, and optimization for inserts make it ideal for this. From the Microsoft docs: "The idea with a data lake is to store everything in its original, untransformed state."

After saving the raw data, you can then use ETL tools, such as SSIS or Azure Data Factory, to copy and/or transform this data into a more usable format in another location.

Like most solutions in Azure, it is inherently highly scalable and highly reliable.

Data in Azure Data Lake is stored in a Data Lake Store.

Under the hood, a Data Lake Store is simply an Azure Storage account with some specific properties set.

To create a new Data Lake storage account, navigate to the Azure Portal, log in, and click the [Create a Resource] button (Fig.1).

dl01-CreateResource
Fig. 1

From the menu, select Storage | Storage Account, as shown in Fig. 2.

dl02-MenuStorageAccount
Fig. 2

The "Create Storage Account" dialog with the "Basic" tab selected displays, as shown in Fig. 3.

dl03-Basics
Fig. 3

At the "Subscription" dropdown, select the subscription with which you want to associate this account. Most of you will have only one subscription.

At the "Resource group" field, select a resource group in which to store your service or click "Create new" to store it in a newly-created resource group. A resource group is a logical container for Azure resources.

At the "Storage account name" field, enter a unique name for the storage account.

At the "Location" field, select the Azure Region in which to store this service. Consider where the users of this service will be, so you can reduce latency.

At the "Performance" field, select the "Standard" radio button. You can select the "Premium" performance button to achieve faster reads; however, there may be better ways to store your data if performance is your primary objective.

At the "Account kind" field, select "Storage V2".

At the "Replication" dropdown, select your preferred replication. Replication is explained here.

At the "Access tier" field, select the "Hot" radio button.

Click the [Next: Advanced>] button to advance to the "Advanced" tab, as shown in Fig. 4.

dl04-Advanced
Fig. 4

The important field on this tab is "Hierarchical namespace". Select the "Enabled" radio button at this field.
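If you script your deployments, the same setting appears in an ARM template as the isHnsEnabled property of the storage account resource. The fragment below is a minimal sketch; the account name, location, SKU, and API version are illustrative assumptions:

```json
{
  "type": "Microsoft.Storage/storageAccounts",
  "apiVersion": "2019-04-01",
  "name": "mydatalakeaccount",
  "location": "westus2",
  "sku": { "name": "Standard_LRS" },
  "kind": "StorageV2",
  "properties": {
    "accessTier": "Hot",
    "isHnsEnabled": true
  }
}
```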

Click the [Review + Create] button to advance to the "Review + Create" tab, as shown in Fig. 5.

dl05-Review
Fig. 5

Verify all the information on this tab; then click the [Create] button to begin creating the Data Lake Store.

After a minute or so, a storage account is created. Navigate to this storage account and click the [Data Lake Gen2 file systems] button, as shown in Fig. 6.

dl06-Services
Fig. 6

The "File Systems" blade displays, as shown in Fig. 7.

dl07-FileSystem
Fig. 7

Data Lake data is partitioned into file systems, so you must create at least one file system. Click the [+ File System] button and enter a name for the file system you wish to create, as shown in Fig. 8.

dl08-AddFileSystem
Fig. 8

Click the [OK] button to add this file system and close the dialog. The newly-created file system displays, as shown in Fig. 9.

dl09-FileSystem
Fig. 9

If you double-click the file system in the list, a page displays where you can set access control and read about how to manage the files in this Data Lake Storage, as shown in Fig. 10.

dl10-FileSystem
Fig. 10

In this article, you learned how to create a Data Lake Storage and a file system within it.

Tuesday, June 25, 2019 10:10:00 AM (GMT Daylight Time, UTC+01:00)
# Monday, June 24, 2019

Episode 569

John Alexander on ML.NET

John Alexander describes how .NET developers can use ML.NET to build and consume Machine Learning solutions.

Monday, June 24, 2019 9:01:00 AM (GMT Daylight Time, UTC+01:00)
# Sunday, June 23, 2019

Frank and April Wheeler were living the 1950s American dream. Frank had a steady - if unfulfilling - job in New York City, April was the attractive wife he always wanted, and they owned a large home in a quiet neighborhood in suburban Connecticut.

But, like nearly all their neighbors, the Wheelers were far from happy.

They were bored suburbanites, working dead-end jobs, in loveless marriages, talking about their dreams.

They talked of how they didn't belong - of how they were so much better than the rest of the sheep who surrendered to the conformity of the world. But they take no action to correct their circumstances. The fact is, they are not as superior as they believe.

April suggests that the Wheelers move to Paris and start a new life, so that Frank can explore his potential. But Frank is not interested in his potential or in self-exploration. He likes the low expectations that come with his job. And, when he is given an opportunity for a promotion, he leaps at the chance.

Frank and April are self-aware enough to believe they are superior to their neighbors and co-workers, but not self-aware enough to realize they are not. They either don't know themselves or they refuse to see themselves.

They are under the illusion that their problems are easily fixable - move to Paris; get a promotion; have an affair. News flash: they are not.

Instead they continue their pretentious life of drunken lunches and adultery and deluding themselves that they are destined for more. No one takes responsibility for his or her own actions, choosing instead to blame others or the expectations of society.

The only honest person in the book is John Givings, a son of the Wheelers' neighbors, who has been literally certified insane and institutionalized. But John is so shockingly rude that it's difficult for anyone to listen to him or to take him seriously.

Inevitably, the story ends in tragedy, with no lessons learned and everyone continuing to face their troubles alone.

Don't read Revolutionary Road by Richard Yates to feel good about yourself. Read it as a warning about buying too much into the American dream. The sad part is how relevant this warning feels today.

Sunday, June 23, 2019 7:29:00 AM (GMT Daylight Time, UTC+01:00)
# Saturday, June 22, 2019

It isn't obvious until well into Never Let Me Go by Kazuo Ishiguro that this is a story of a dystopian society. Ishiguro drops hints throughout the story, slowly revealing the situation in which the characters find themselves. Words like "donations", "Possible", and "Completion" are introduced, and we know they have some mysterious meaning, but we are not told that meaning until much later.

Kathy H is a 31-year-old “Carer” looking back on her life - particularly her time at Hailsham - a boarding school in rural England. Life is good at Hailsham, but the students are secluded and are given almost no knowledge of the outside world, other than being told they will someday have a special place in it.

Everyone has a name like "Kathy H" or "Tommy D". At first, I thought this was a literary device, with the author pretending to protect identities; but, on reflection, I think the students were not given last names as one more way to dehumanize them.

Never Let Me Go is a story of false hope; of what it means to be human and to have a soul; and of how much control each of us has over our destiny. It is told in a believable manner in a world not very different from ours and referencing technology that does not sound far-fetched.

It is a dystopian nightmare, disguised as a coming-of-age story.

Saturday, June 22, 2019 9:56:00 AM (GMT Daylight Time, UTC+01:00)
# Thursday, June 20, 2019

GCast 53:

Creating a Data Warehouse in Azure

Learn how to create a new SQL Server data warehouse in Microsoft Azure.

Thursday, June 20, 2019 9:24:00 AM (GMT Daylight Time, UTC+01:00)
# Tuesday, June 18, 2019

CTRL+V has been in Windows since the beginning: After copying something to the Windows clipboard (via CTRL+C or some other method), hold down the CTRL key and press V to insert that something at the current cursor location.

But I learned today about a new feature: WINDOWS + V.

Hold down the WINDOWS key (Fig. 1) and press V.

wv01-WindowsKey
Fig. 1

This will bring up a context menu, listing the last few items added to the clipboard, as shown in Fig. 2.

wv02-ContextMenu
Fig. 2

You can then select from this list which item to insert at the current cursor position.

The context menu even lists the time the item was added to the clipboard.

This is useful if you need to copy several items before pasting them. But the most useful case is when you accidentally overwrite a previous clipboard item by copying something new without thinking. Now you can still retrieve that previously overwritten item.

I'm unclear how long items stay in this clipboard list, but I like this feature.

Tuesday, June 18, 2019 2:11:00 AM (GMT Daylight Time, UTC+01:00)
# Monday, June 17, 2019

Episode 568

Heather Wilde on Anticipatory Design

Heather Wilde discusses how to combine machine learning with user interfaces and user experience to craft a more personalized experience between a person and the products and services they use.

https://twitter.com/heathriel

Monday, June 17, 2019 8:21:00 AM (GMT Daylight Time, UTC+01:00)
# Sunday, June 16, 2019

Arrow of God is part of Chinua Achebe's African Trilogy. Although it is Achebe's third novel, chronologically, it is second in the trilogy. The first book - Things Fall Apart - took place as the English colonizers were arriving in west Africa; book 2 - No Longer At Ease - takes place near the end of the colonization period; and Arrow of God's story happens in the 1920s - at the height of European colonization.

Unlike the first two books, this one does not focus on Okonkwo or his descendants.

Arrow of God tells the story of Ezeulu, a high priest of the god Ulu in colonial Nigeria. Ezeulu's tribe goes to war with another tribe - the result of a perceived insult and a dispute over land ownership. The British governor steps in, ends the conflict, and burns everyone's guns. It is this assertion of control by the British over the local population that is at the heart of the novel.

The British are in Africa to impose their legal system, their culture, and their religion on the native population, and they use any means to do so. They implement "Local Rule", installing an African in a position of authority but controlling that man so he works in their interests.

The British government's bureaucracy makes them terribly inefficient, making their job harder; but the infighting among individuals, villages, and tribes makes the Africans easier to manipulate. As part of England's efforts at influencing the local population, missionaries are trying to convert the natives to Christianity. Many resist this change in culture; but the missionaries are aided when a failure of the yam crop leads to famine and praying to the god Ulu does not help. The religious conversions are a microcosm of the colonizers' efforts to impose their culture on the Africans, subsuming the existing culture.

Ezeulu stands in the center of it all. He wants to better understand the British and sends his son to work with them to gain more information; but he rejects their offer to make him a Local Ruler. Still, his own people are suspicious of his motives, feeling he has given too much to the white invaders. There is conflict between Ezeulu and the British, and between Ezeulu and his own people.

Arrow of God is the story of conflict and influence; of people trying to hold onto their traditions in the face of a tidal wave of change; of the hopes on which people place their faith; and of how a culture can be supplanted in a short time. It is the story of the loss of Igbo cultural identity.

Achebe was probably the first English-language writer to portray events in Africa from the point of view of the Africans.

The book is told in an eloquent style, peppered with many African proverbs. This one sums it up best:

"When brothers fight to death, a stranger inherits their father's estate."

Sunday, June 16, 2019 9:06:00 AM (GMT Daylight Time, UTC+01:00)
# Saturday, June 15, 2019

Sometimes, I dream about traveling to new places with no responsibilities and no worries about money.

The Sheltering Sky by Paul Bowles may have cured me of that dream.

"The Sheltering Sky" tells the story of three idle rich people - husband and wife Port and Kit, along with their friend Tunner - who decide to explore north Africa after World War II. They don't have a plan, and so drift from city to city, seeing the sights and trying to overcome their boredom. They insist this makes them "travelers", rather than "tourists". They drift through a land of sweltering temperatures and dust storms and bedbugs and lice and biting flies.

The story turns tragic as the trio splits up, Port falls gravely ill, and Kit becomes lost in the desert. But this book isn't defined by plot twists or great character developments. It is an existential tale about people who appear to be drifting through life, with no purpose and no direction. What did this trio contribute to the world on their travels? What did they gain for themselves? What meaning did their lives have? They are searching for something, but even they don't know what that is. And none of them ever finds it.

Every character seems bent on self-destruction in this story. Almost everyone we meet is unlikable - from the three main characters, to the constantly-arguing-and-probably-incestuous young man and his mother they encounter, to a traveler who kidnaps and rapes Kit before adding her to his harem. Port and Kit are each unfaithful to one another during their trip, but only Kit feels any guilt about it.

Some will take offense at Kit's reaction to her rape (she accepts it and even comes to enjoy it), but I saw this as evidence of her decreasing mental stability.

If you are looking for a good travelogue of Algeria and Morocco, skip this one. If you want a commentary on the state of the human condition, this is a pretty good one.

Saturday, June 15, 2019 9:46:00 AM (GMT Daylight Time, UTC+01:00)
# Friday, June 14, 2019

In the last few articles, I introduced the OCR Service of the Cognitive Services Computer Vision API. The OCR service is a general-purpose tool for detecting text in an image. But this tool is only useful if you want to do something with that text. Often it is easier to figure out how to process recognized text if you know something about the image.

Enter receipt-api - an open source project that builds on the Cognitive Services API to recognize information in a store receipt.

You can download this project at https://github.com/nzregs/receipt-api.

Compile and run it in Visual Studio and you have a web service that you can call by submitting an HTTP POST to the following URL:

http://localhost:xxxxx/api/values

where xxxxx is the port number on which the web service is running. You can spot this port easily because a browser launches when the app runs and the port number is in the URL, as shown in Fig. 1.

ra01-Browser
Fig. 1
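If you prefer to grab the port programmatically, the URL class (available in browsers and in Node) can parse it out. The port number below is an invented example:

```javascript
// Parse the port out of the local web service URL (port is an example value).
var serviceUrl = new URL("http://localhost:54321/api/values");
serviceUrl.port;     // "54321"
serviceUrl.pathname; // "/api/values"
```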

Before Testing

In order to test the API, you will need an image of a receipt. You can take a photo with your phone and copy it to your computer.

You will also need to create a Computer Vision service in Azure, as described here.

Finally, you will also need to make a change to the receipt-api project. Open the solution and rename Sample-Secrets_cs.txt to Secrets.cs. The code in this file is shown in Listing 1.

Listing 1:

public class Secrets
{
    // rename this file to Secrets.cs
    // update the constants below with your API Key and API Endpoint
    public const string apikey = "28737opek;jwlbjksgui3y2[pik";
    public const string apiendpoint_ocr = @"https://australiaeast.api.cognitive.microsoft.com/vision/v1.0/ocr";
}

Replace the key and endpoint with those associated with your Computer Vision service.

Testing the Receipt API

A simple way to test any API is with Postman, a free tool available at https://www.getpostman.com/.

Download, install, and run Postman.

With the receipt-api service running, create a new request in Postman consisting of a POST to the receipt-api service URL, as shown in Fig. 2.

ra02-Postman
Fig. 2

On the "Headers" tab, add a header row with NAME = Content-Type and VALUE = application/octet-stream, as shown in Fig. 3.

ra03-Headers
Fig. 3

On the "Body" tab, click the [Select File] button and select the photo of the receipt from your computer, as shown in Fig. 4.

ra04-Body
Fig. 4

Click the [Send] button and wait for a response to appear. If all goes well, you will see something like Fig. 5.

ra05-Response
Fig. 5

If you have used the OCR service, you will notice that this response looks identical to the response from that service. But scroll to the bottom, as shown in Fig. 6, and you will see information specific to receipts.

ra06-Response
Fig. 6

This is from the receipt shown in Fig. 7.

ra07-Receipt
Fig. 7

How it works

The solution works by first calling the Cognitive Services OCR service, then looping through each line and word, looking for patterns. It uses regular expressions to find these patterns. Below is the code to find the date in the recognized text:

Listing 2:

static string ExtractDate(string line)
{
    string receiptdate = "";
    // match dates "01/05/2018" "01-05-2018" "01-05-18" "01 05 18" "01 05 2018"
    string pat = @"\s*((31([-/ .])((0?[13578])|(1[02]))\3(\d\d)?\d\d)|((([012]?[1-9])|([123]0))([-/ .])((0?[13-9])|(1[0-2]))\12(\d\d)?\d\d)|(((2[0-8])|(1[0-9])|(0?[1-9]))([-/ .])0?2\22(\d\d)?\d\d)|(29([-/ .])0?2\25(((\d\d)?(([2468][048])|([13579][26])|(0[48])))|((([02468][048])|([13579][26]))00))))\s*";
    foreach (Match match in Regex.Matches(line, pat))
    {
        receiptdate = match.Value.Trim();
        receiptdate = receiptdate.Replace("-", "/");
        receiptdate = receiptdate.Replace(".", "/");
        receiptdate = receiptdate.Replace(" ", "/");
    }

    // didn't find a date? now we'll try searching with month names: 03 OCT 2017, 03 October 2017, etc.
    if (receiptdate == "")
    {
        pat = @"((31(?![-/ .](Feb(ruary)?|Apr(il)?|June?|(Sep(?=\b|t)t?|Nov)(ember)?)))|((30|29)(?![-/ .]Feb(ruary)?))|(29(?=[-/ .]Feb(ruary)?[-/ .](((1[6-9]|[2-9]\d)(0[48]|[2468][048]|[13579][26])|((16|[2468][048]|[3579][26])00)))))|(0?[1-9])|1\d|2[0-8])[-/ .](Jan(uary)?|Feb(ruary)?|Ma(r(ch)?|y)|Apr(il)?|Ju((ly?)|(ne?))|Aug(ust)?|Oct(ober)?|(Sep(?=\b|t)t?|Nov|Dec)(ember)?)[-/ .]((1[6-9]|[2-9]\d)\d{2})";

        foreach (Match match in Regex.Matches(line, pat, RegexOptions.IgnoreCase))
        {
            receiptdate = match.Value.Trim();
            receiptdate = receiptdate.Replace("/", "-");
            receiptdate = receiptdate.Replace(".", "-");
            receiptdate = receiptdate.Replace(" ", "-");
        }
    }

    return receiptdate;
}

Limitations

This tool is not perfect.

It is incomplete. Although the model supports a Business Name and a Tax Total, it looks like the logic to extract this information has not yet been written.

Note that it is an Open Source project and you are welcome to contribute and submit a Pull Request. If this logic is important to your project, write it and share it with the world.

The solution is also limited by the capabilities of the OCR service it calls. However, my experience is that this service becomes more accurate as time goes on.

The results are best with a clear, high-contrast receipt. If your receipt is wrinkled or faded or has a watermark, the OCR will be degraded, affecting any analysis of the recognized text.

Strengths

The receipt-api project does provide several advantages:

  • It is simple to use.
  • It can scale when deployed to Azure.
  • It is an open source project, so other developers (including you) can improve it.
  • It is free.
  • It has an MIT license and can be used without restriction.

The receipt-api open source project provides a simple way to extract data from a receipt.

Friday, June 14, 2019 9:22:00 AM (GMT Daylight Time, UTC+01:00)
# Thursday, June 13, 2019

GCast 52:

Using hilite.me to Format Code as HTML

The online tool at hilite.me allows me to format code in just about any language and paste it into my blog.

GCast | HTML5 | Screencast | Video | Web
Thursday, June 13, 2019 9:03:00 AM (GMT Daylight Time, UTC+01:00)
# Wednesday, June 12, 2019

In a previous article, I showed how to use the Microsoft Cognitive Services Computer Vision API to perform Optical Character Recognition (OCR) on a document containing a picture of text. We did so by making an HTTP POST to a REST service.

If you are developing with .NET languages, such as C#, Visual Basic, or F#, a NuGet package makes this call easier. Classes in this package abstract the REST call, so you can write less and simpler code; and strongly-typed objects allow you to make the call and parse the results more easily.


To get started, you will first need to create a Computer Vision service in Azure and retrieve the endpoint and key, as described here.

Then, you can create a new C# project in Visual Studio. I created a WPF application, which can be found and downloaded at my GitHub account. Look for the project named "OCR-DOTNETDemo". Fig. 1 shows how to create a new WPF project in Visual Studio.

od01-FileNewProject
Fig. 1

In the Solution Explorer, right-click the project and select "Manage NuGet Packages", as shown in Fig. 2.

od02-ManageNuGet
Fig. 2

Search for and install the "Microsoft.Azure.CognitiveServices.Vision.ComputerVision" package, as shown in Fig. 3.

od03-NuGet
Fig. 3

The important classes in this package are:

  • OcrResult
    A class representing the object returned from the OCR service. It consists of an array of OcrRegions, each of which contains an array of OcrLines, each of which contains an array of OcrWords. Each OcrWord has a text property, representing the text that is recognized. You can reconstruct all the text in an image by looping through each array.
  • ComputerVisionClient
    This class contains the RecognizePrintedTextInStreamAsync method, which abstracts the HTTP REST call to the OCR service.
  • ApiKeyServiceClientCredentials
    This class constructs credentials that will be passed in the header of the HTTP REST call.
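Although the NuGet classes are C#, it can help to see the regions → lines → words hierarchy spelled out. Below is a sketch in JavaScript that walks an object of the same shape as OcrResult; `flattenOcr` and the sample data are invented for illustration (the sample text is made up, but the nesting mirrors what the OCR service returns):

```javascript
// Walk the regions -> lines -> words hierarchy of an OCR result,
// joining words with spaces and lines with newlines.
function flattenOcr(result) {
    var text = "";
    result.regions.forEach(function (region) {
        region.lines.forEach(function (line) {
            text += line.words.map(function (w) { return w.text; }).join(" ") + "\n";
        });
    });
    return text;
}

// Invented sample mimicking the service's JSON shape.
var sample = {
    regions: [
        {
            lines: [
                { words: [{ text: "Hello" }, { text: "world" }] },
                { words: [{ text: "OCR" }, { text: "demo" }] }
            ]
        }
    ]
};

flattenOcr(sample); // "Hello world\nOCR demo\n"
```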

Create a new class in the project named "OCRServices" and make its scope "internal" or "public".

Add the following "using" statements to the top of the class:

using Microsoft.Azure.CognitiveServices.Vision.ComputerVision;
using Microsoft.Azure.CognitiveServices.Vision.ComputerVision.Models;
using System.IO;
using System.Text;
using System.Threading.Tasks;


Add the following methods to this class:

Listing 1:

internal static async Task<OcrResult> UploadAndRecognizeImageAsync(string imageFilePath, OcrLanguages language)
{
    string key = "xxxxxxx";
    string endPoint = "https://xxxxx.api.cognitive.microsoft.com/";
    var credentials = new ApiKeyServiceClientCredentials(key);

    using (var client = new ComputerVisionClient(credentials) { Endpoint = endPoint })
    {
        using (Stream imageFileStream = File.OpenRead(imageFilePath))
        {
            OcrResult ocrResult = await client.RecognizePrintedTextInStreamAsync(false, imageFileStream, language);
            return ocrResult;
        }
    }
}

internal static async Task<string> FormatOcrResult(OcrResult ocrResult)
{
    var sb = new StringBuilder();
    foreach (OcrRegion region in ocrResult.Regions)
    {
        foreach (OcrLine line in region.Lines)
        {
            foreach (OcrWord word in line.Words)
            {
                sb.Append(word.Text);
                sb.Append(" ");
            }
            sb.Append("\r\n");
        }
        sb.Append("\r\n\r\n");
    }
    return sb.ToString();
}

The UploadAndRecognizeImageAsync method calls the HTTP REST OCR service (via the NuGet library extractions) and returns a strongly-typed object representing the results of that call. Replace the key and the endPoint in this method with those associated with your Computer Vision service.

The FormatOcrResult method loops through each region, line, and word of the service's return object. It concatenates the text of each word, separating words by spaces, lines by a carriage return and line feed, and regions by a double carriage return / line feed.

Add a Button and a TextBlock to the MainWindow.xaml form.

In the click event of that button add the following code.

Listing 2:

private async void GetText_Click(object sender, RoutedEventArgs e)
{
    string imagePath = @"xxxxxxx.jpg";
    OutputTextBlock.Text = "Thinking…";
    var language = OcrLanguages.En;
    OcrResult ocrResult = await OCRServices.UploadAndRecognizeImageAsync(imagePath, language);
    string resultText = await OCRServices.FormatOcrResult(ocrResult);
    OutputTextBlock.Text = resultText;
}


Replace xxxxxxx.jpg with the full path of an image file on disc that contains pictures of text.

You will need to add the following using statement to the top of MainWindow.xaml.cs.

using Microsoft.Azure.CognitiveServices.Vision.ComputerVision.Models;

If you like, you can add code to allow users to retrieve an image and display that image on your form. This code is in the sample application from my GitHub repository, if you want to view it.

Running the form should look something like Fig. 4.

od04-RunningApp
Fig. 4

Wednesday, June 12, 2019 9:46:00 AM (GMT Daylight Time, UTC+01:00)
# Tuesday, June 11, 2019

In a previous article, I described the details of the OCR Service, which is part of the Microsoft Cognitive Services Computer Vision API.

To make this API useful, you need to write some code and build an application that calls this service.

In this article, I will show an example of a JavaScript application that calls the OCR web service.

If you want to follow along, you can find all the code in the "OCRDemo" project, included in this set of demos.

To use this demo project, you will first need to create a Computer Vision API service, as described here.

Read the project's readme file, which explains the setup you need to do in order to run this with your account.

If you open index.html in the browser, you will see that it displays an image of a poem, along with some controls on the left:

  • A dropdown list to change the poem image
  • A dropdown list to select the language of the poem text
  • A [Get Text] button that calls the web service.

Fig. 1 shows index.html when it first loads:

oj01-WebPage
Fig. 1

    Let's look at the JavaScript that runs when you click the [Get Text] button. You can find it in script.js.

    $("#GetTextFromPictureButton").click(function () {
        var outputDiv = $("#OutputDiv");
        outputDiv.text("Thinking…");
        var url = $("#ImageUrlDropdown").val();
        var language = $("#LanguageDropdown").val();

        try {
            var computerVisionKey = getKey();
        }
        catch (err) {
            outputDiv.html(missingKeyErrorMsg);
            return;
        }

        var webSvcUrl = "https://westcentralus.api.cognitive.microsoft.com/vision/v2.0/ocr";
        webSvcUrl = webSvcUrl + "?language=" + language;
        $.ajax({
            type: "POST",
            url: webSvcUrl,
            headers: { "Ocp-Apim-Subscription-Key": computerVisionKey },
            contentType: "application/json",
            data: '{ "Url": "' + url + '" }'
        }).done(function (data) {
            outputDiv.text("");

            var regionsOfText = data.regions;
            for (var r = 0; r < regionsOfText.length; r++) {
                var linesOfText = data.regions[r].lines;
                for (var l = 0; l < linesOfText.length; l++) {
                    var output = "";

                    var thisLine = linesOfText[l];
                    var words = thisLine.words;
                    for (var w = 0; w < words.length; w++) {
                        var thisWord = words[w];
                        output += thisWord.text;
                        output += " ";
                    }
                    var newDiv = "<div>" + output + "</div>";
                    outputDiv.append(newDiv);
                }
                outputDiv.append("<hr>");
            }
        }).fail(function (err) {
            $("#OutputDiv").text("ERROR!" + err.responseText);
        });
    });

    This code uses jQuery to simplify selecting elements, but raw JavaScript would work just as well.

    On the page is an empty div with the id "OutputDiv".

    In the first two lines, we select this div and set its text to "Thinking…" while the web service is being called.

        var outputDiv = $("#OutputDiv");
        outputDiv.text("Thinking…");

    Next, we get the URL of the image containing the currently displayed poem and the selected language. These both come from the selected items of the two dropdowns.

        var url = $("#ImageUrlDropdown").val(); 
        var language = $("#LanguageDropdown").val();
      

    Then, we get the API key by calling the getKey() function, which is stored in the getkey.js file. You will need to update this file yourself, adding your own key, as described in the README.

        try { 
            var computerVisionKey = getKey(); 
        } 
        catch(err) { 
            outputDiv.html(missingKeyErrorMsg); 
            return; 
        }
      

    Now, it's time to call the web service. My Computer Vision API service was created in the West Central US region, so I've hard-coded the URL. You may need to change this, if you created your service in a different region.

    I add a querystring parameter to the URL to indicate the selected language.

    Then, I call the web service by submitting an HTTP POST request to the web service URL, passing in the appropriate headers and constructing a JSON document to pass in the request body.

        var webSvcUrl = "https://westcentralus.api.cognitive.microsoft.com/vision/v2.0/ocr";
        webSvcUrl = webSvcUrl + "?language=" + language;
        $.ajax({
            type: "POST",
            url: webSvcUrl,
            headers: { "Ocp-Apim-Subscription-Key": computerVisionKey },
            contentType: "application/json",
            data: '{ "Url": "' + url + '" }'
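Building the request body by string concatenation works here, but JSON.stringify is a slightly more robust alternative, since it handles any quoting or escaping the URL might need. This is a sketch with a made-up URL, not part of the original sample:

```javascript
// Hypothetical image URL, for illustration only
var url = "https://example.com/poem.png";
// JSON.stringify produces a correctly escaped JSON body
var body = JSON.stringify({ Url: url });
// body is now: {"Url":"https://example.com/poem.png"}
```

The resulting string can be passed as the `data` setting in the $.ajax call exactly as the concatenated version is.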
      

    Finally, I process the results when the HTTP response returns.

    JavaScript is a dynamic language, so I don't need to create any classes to identify the structure of the JSON that is returned; I just need to know the names of each property.

    The returned JSON contains an array of regions; each region contains an array of lines; and each line contains an array of words.

    In this simple example, I loop through each word in each line in each region, concatenating them together and adding some HTML to format line breaks.

    Then, I append this HTML to the outputDiv and follow it up with a horizontal rule to emphasize that it is the end.

        }).done(function (data) {
            outputDiv.text("");

            var regionsOfText = data.regions;
            for (var r = 0; r < regionsOfText.length; r++) {
                var linesOfText = data.regions[r].lines;
                for (var l = 0; l < linesOfText.length; l++) {
                    var output = "";

                    var thisLine = linesOfText[l];
                    var words = thisLine.words;
                    for (var w = 0; w < words.length; w++) {
                        var thisWord = words[w];
                        output += thisWord.text;
                        output += " ";
                    }
                    var newDiv = "<div>" + output + "</div>";
                    outputDiv.append(newDiv);
                }
                outputDiv.append("<hr>");
            }
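The same regions-to-lines-to-words walk can also be written as a small standalone helper, which makes it easy to test without a browser. This is a sketch, not part of the original sample; the response fragment below is hand-built to match the service's documented shape:

```javascript
// Flatten an OCR response (regions -> lines -> words) into an array of line strings
function extractLines(ocrResult) {
    var lines = [];
    ocrResult.regions.forEach(function (region) {
        region.lines.forEach(function (line) {
            // Join the words of each line with spaces
            lines.push(line.words.map(function (w) { return w.text; }).join(" "));
        });
    });
    return lines;
}

// Hypothetical response fragment in the documented shape
var sample = {
    regions: [
        { lines: [
            { words: [{ text: "Hey" }] },
            { words: [{ text: "Diddle" }, { text: "Diddle" }] }
        ] }
    ]
};
// extractLines(sample) → ["Hey", "Diddle Diddle"]
```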
      

    I also catch errors that might occur, displaying a generic message in the outputDiv, where the returned text would have been.

        }).fail(function (err) {
            $("#OutputDiv").text("ERROR!" + err.responseText);
        });
      

    Fig. 2 shows the results after a successful web service call.

    oj02-Results
    Fig. 2

    Try this yourself to see it in action. The process is very similar in other languages.

    Tuesday, June 11, 2019 9:11:00 AM (GMT Daylight Time, UTC+01:00)
    # Monday, June 10, 2019

    Episode 567

    Elton Stoneman on Docker

    Elton Stoneman describes how to manage containers using Docker on a local machine and in the cloud.

    Monday, June 10, 2019 9:52:00 AM (GMT Daylight Time, UTC+01:00)
    # Sunday, June 9, 2019

    ClaudiusTheGod"Claudius the God" continues Robert Graves's fictional translation of the Roman Emperor Claudius's autobiography that he began in "I, Claudius".

    The first novel ended with the assassination of Caligula and with Claudius being unexpectedly and reluctantly elevated to Emperor.

    Claudius was born handicapped, undersized, weak, and stuttering, so he remained mostly overlooked during his pre-Emperor years, avoiding the (often fatal) power struggles waged by his family. He seems completely unprepared and unqualified for his role as supreme leader. Despite his initial reluctance, Claudius embraces the role, implementing his policies to modernize the Empire and sometimes executing his enemies without trial. During his 13-year reign, he successfully undoes much of the damage caused by the mad Caligula. Overall, he performs better than almost anyone expected.

    At first glance, this seems like a good deal for Claudius.

    But his life is damaged by his relationships with women - particularly Messalina, his young and beautiful wife, who conspires against him and is unfaithful with literally hundreds of other men. After four marriages, Claudius remains unhappy.

    Once again, Graves does a good job of bringing to life the political intrigue and personal dramas of ancient Rome and her key players. Nearly everyone in this world with ambition is assassinated or plots assassinations or both. Claudius tries to stay above this and is mostly successful for most of his life.

    Previous Roman Emperors had embraced their role as god-king; but Claudius resisted this idea, insisting that he was not a god. In fact, he dreams of Rome eventually rejecting its recent incarnation as a dictatorship and returning to a Republic. His childhood friend the Hebrew king Herod Agrippa takes a different approach, declaring himself to be the prophesied Messiah and is ultimately killed for his arrogance.

    But, Claudius changes with time, as happens to so many with ultimate power. Near the end of his life, Claudius accepted his deification when he discovered the Britons were building temples to him. He begins plotting the succession of the throne - without the best interests of the Empire.

    The full title of this book is "Claudius, the God and his wife, Messalina" and Claudius's full name is Tiberius Claudius Drusus Nero Germanicus. Much like these names, Robert Graves shortens and simplifies the 13-year reign of Emperor Claudius.

    Sunday, June 9, 2019 9:11:00 AM (GMT Daylight Time, UTC+01:00)
    # Saturday, June 8, 2019

    NoLongerAtEaseIn Things Fall Apart, Chinua Achebe introduced us to Okonkwo, a west African tribesman trying to preserve his people's culture as European colonialists invaded his homeland in the late 19th century.

    Achebe's second novel - No Longer at Ease - follows the life and downfall of Okonkwo's grandson Obi decades later.

    Obi returns from a college education in England to begin a life of civil service in his homeland of Nigeria. He returns with high expectations and high ideals, but he finds himself conflicted in every part of his life.

    He wants to help his people, but he has seen the world and known many of its pleasures.

    He wants to marry Clara, but his family and the rest of society disapprove because she is an osu - a member of an inferior caste. And he is tempted by other beauties.

    He resists the temptations of bribery, but many of his friends and colleagues are ready with rationalizations why bribes are acceptable.

    He earns far more than most Nigerians, but he finds himself overwhelmed by his expenses and constrained by the salary of a civil servant.

    Even his education presented conflict. The elders who loaned him money for his education instructed him to study Law; but Obi decided he liked English better and switched majors during his studies.

    Worse, he discovers on his return that Lagos has changed since his earlier days - it is more modern and more corrupt. It is the late 1950s; the colonial era is ending, and countries like Nigeria are struggling to find their new, post-colonial identity.

    Achebe brings to life the time and his characters in a lyrical, yet straightforward style. He peppers the speech of his character with frequent African proverbs.

    I loved the reference back to the first novel: The rift between Obi's father and grandfather when his father converted to Christianity, rejecting the violence of the past and beginning the family's assimilation.

    No Longer at Ease is a story of Local Culture vs Colonialism; of Idealism vs Practicality; of Individualism vs Duty; of Acceptance of Modern Christianity vs Adherence to an old caste system; and of Tradition vs Progress.

    The reader struggles with these conflicts along with Obi.

    Saturday, June 8, 2019 9:43:00 AM (GMT Daylight Time, UTC+01:00)
    # Friday, June 7, 2019

    The Microsoft Cognitive Services Computer Vision API contains functionality to infer a lot of information about a given image. One capability is to convert pictures of text into text, a process known as "Optical Character Recognition" or "OCR".

    Performing OCR on an image is simple and inexpensive. It is done through a web service call; but first, you must set up the Computer Vision Service, as described in this article.

    In that article, you were told to save two pieces of information about the service: The API Key and the URL. Here is where you will use them.

    HTTP Endpoint

    The OCR service is a web service. To call it, you send an HTTP POST request to an HTTP endpoint. The endpoint consists of the URL copied above, followed by "vision/v2.0/ocr", followed by some optional querystring parameters (which we will discuss later).

    So, if you create your service in the EAST US Azure region, the copied URL will be

    https://eastus.api.cognitive.microsoft.com/

    and the HTTP endpoint for the OCR service will be

    https://eastus.api.cognitive.microsoft.com/vision/v2.0/ocr

    Querystring Parameters

    The optional querystring parameters are

    language:

    The 2-character language code of the text you are recognizing. This helps the service more accurately and quickly match pictures of words to the words they represent. If you omit this parameter, the system will analyze the text and guess an appropriate language. Currently, the service supports 26 languages. The 2-character code of each supported language is listed in Appendix 1 at the bottom of this article.

    detectOrientation

    "true", if you want the service to adjust the orientation of the image before performing OCR. If you pass "false" or omit this parameter, the service will assume the image is oriented correctly.

    If you have an image with English text and you want the service to detect and adjust the image's orientation, the above URL becomes:

    https://eastus.api.cognitive.microsoft.com/vision/v2.0/ocr?language=en&detectOrientation=true
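The endpoint construction described above (base URL + path + optional querystring parameters) can be sketched as a small helper. The function name is mine, not part of the service; it simply concatenates the pieces this article describes:

```javascript
// Hypothetical helper: build the OCR endpoint from the base URL you copied
function buildOcrUrl(baseUrl, language, detectOrientation) {
    return baseUrl + "vision/v2.0/ocr" +
        "?language=" + encodeURIComponent(language) +
        "&detectOrientation=" + (detectOrientation ? "true" : "false");
}

// buildOcrUrl("https://eastus.api.cognitive.microsoft.com/", "en", true)
// → "https://eastus.api.cognitive.microsoft.com/vision/v2.0/ocr?language=en&detectOrientation=true"
```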

    HTTP Headers

    In the header of the HTTP request, you must add the following name/value pairs:

    Ocp-Apim-Subscription-Key

    The API key you copied above

    Content-Type

    The media type of the image you are passing to the service in the body of the HTTP request

    Possible values are:

    • application/json
    • application/octet-stream
    • multipart/form-data

    The value you pass must be consistent with the data in the body.

    If you select "application/json", you must pass in the request body a URL pointing to the image on the public Internet.

    If you select "application/octet-stream" or "multipart/form-data", you must pass the actual binary image in the request body.

    Body

    In the body of the HTTP request, you pass the image you want the service to analyze.

    If you selected "application/json" as the Content-Type in the header, pass a URL within a JSON document, with the following format:

    {"url":"image_url"}

    where image_url is a URL pointing to the image you want to recognize.

    Here is an example:

    {"url":"https://www.themeasuredmom.com/wp-content/uploads/2016/03/Slide11.png"}

    If you selected "application/octet-stream" or "multipart/form-data" as the Content-Type in the header, pass the actual binary image in the body of the request.
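Putting the endpoint, headers, and body together, a request for the "application/json" (public URL) variant might be assembled like this. This is a sketch only: YOUR_API_KEY and the image URL are placeholders, and no call is actually made here.

```javascript
// Endpoint for a service created in the East US region, with the language parameter
var endpoint = "https://eastus.api.cognitive.microsoft.com/vision/v2.0/ocr?language=en";

// Assemble the POST request: key and content type in the headers, image URL in the body
var request = {
    method: "POST",
    headers: {
        "Ocp-Apim-Subscription-Key": "YOUR_API_KEY",
        "Content-Type": "application/json"
    },
    body: JSON.stringify({ url: "https://example.com/picture-of-text.png" })
};
// Passing endpoint and request to fetch(endpoint, request), or an equivalent
// HTTP client, invokes the service.
```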

    The service has some restrictions on the images it can analyze.

    It cannot analyze an image larger than 4MB.

    The width and height of the image must be between 50 and 4,200 pixels.

    The image must be one of the following formats: JPEG, PNG, GIF, BMP.

    Sample call with Curl

    Here is an example of a call to the service, using Curl:

    curl -v -X POST "https://eastus.api.cognitive.microsoft.com/vision/v2.0/ocr?language=en&detectOrientation=true" -H "Content-Type: application/json" -H "Ocp-Apim-Subscription-Key: f27c7436c3a64d91a177111a6b594537" --data-ascii "{'url' : 'https://www.themeasuredmom.com/wp-content/uploads/2016/03/Slide11.png'}"

    (NOTE: I modified the key, so it will not work. You will need to replace it with your own key if you want this to work.)

    Response

    If all goes well, you will receive an HTTP 200 (OK) response.

    In the body of that response will be the results of the OCR in JSON format.

    At the top level are the language, textAngle, and orientation.

    Below that is an array of 0 or more text regions. Each region represents a block of text within the image.

    Each region contains an array of 0 or more lines of text.

    Each line contains an array of 0 or more words.

    Each region, line, and word contains a bounding box, consisting of the left, top, width, and height of the word(s) within.
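Since each boundingBox arrives as a comma-separated string of "left,top,width,height", a small parsing helper is handy when you want the coordinates as numbers. The function name is mine, for illustration:

```javascript
// Parse a boundingBox string ("left,top,width,height") into numeric fields
function parseBoundingBox(bb) {
    var parts = bb.split(",").map(Number);
    return { left: parts[0], top: parts[1], width: parts[2], height: parts[3] };
}

// parseBoundingBox("408,96,102,56") → { left: 408, top: 96, width: 102, height: 56 }
```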

    Here is a partial example of the JSON returned from a successful web service call:

    {
        "language": "en",
        "textAngle": 0.0,
        "orientation": "Up",
        "regions": [
            {
                "boundingBox": "147,96,622,1095",
                "lines": [
                    {
                        "boundingBox": "408,96,102,56",
                        "words": [
                            {
                                "boundingBox": "408,96,102,56",
                                "text": "Hey"
                            }
                        ]
                    },
                    {
                        "boundingBox": "282,171,350,45",
                        "words": [
                            {
                                "boundingBox": "282,171,164,45",
                                "text": "Diddle"
                            },
                            {
                                "boundingBox": "468,171,164,45",
                                "text": "Diddle"
                            }
                        ]
                    },
                    etc...
                ]
            }
        ]
    }
      

    The full JSON can be found in Appendix 2 below.

    Errors

    If an error occurs, the response will not be HTTP 200. It will be an HTTP response code of 400 or greater. Additional error information will be in the body of the response.

    Common errors include:

    • Images too large or too small
    • Image not found (It might require a password or be behind a firewall)
    • Invalid image format
    • Incorrect API key
    • Incorrect URL (It must match the API key. If you have multiple services, it’s easy to mix them up)
    • Miscellaneous spelling errors (e.g., not entering a valid language code or misspelling a header parameter)
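When handling these errors in code, a simple status-code check before parsing the response body is enough. A minimal sketch (the function name is mine, not part of the API):

```javascript
// Turn a failed OCR response into a readable message; returns null on success
function describeOcrError(status, responseText) {
    if (status === 200) { return null; } // HTTP 200 means success
    return "OCR call failed (HTTP " + status + "): " + responseText;
}

// describeOcrError(401, "Access denied") → "OCR call failed (HTTP 401): Access denied"
```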

    In this article, I showed how to call the Cognitive Services OCR Computer Vision Service.

    Appendix 1: Supported languages

    zh-Hans (ChineseSimplified)
    zh-Hant (ChineseTraditional)
    cs (Czech)
    da (Danish)
    nl (Dutch)
    en (English)
    fi (Finnish)
    fr (French)
    de (German)
    el (Greek)
    hu (Hungarian)
    it (Italian)
    ja (Japanese)
    ko (Korean)
    nb (Norwegian)
    pl (Polish)
    pt (Portuguese)
    ru (Russian)
    es (Spanish)
    sv (Swedish)
    tr (Turkish)
    ar (Arabic)
    ro (Romanian)
    sr-Cyrl (SerbianCyrillic)
    sr-Latn (SerbianLatin)
    sk (Slovak)

    Appendix 2: JSON Response Example

    {
        "language": "en",
        "textAngle": 0.0,
        "orientation": "Up",
        "regions": [
            {
                "boundingBox": "147,96,622,1095",
                "lines": [
                    {
                        "boundingBox": "408,96,102,56",
                        "words": [
                            {
                                "boundingBox": "408,96,102,56",
                                "text": "Hey"
                            }
                        ]
                    },
                    {
                        "boundingBox": "282,171,350,45",
                        "words": [
                            {
                                "boundingBox": "282,171,164,45",
                                "text": "Diddle"
                            },
                            {
                                "boundingBox": "468,171,164,45",
                                "text": "Diddle"
                            }
                        ]
                    },
                    {
                        "boundingBox": "239,336,441,46",
                        "words": [
                            {
                                "boundingBox": "239,336,87,46",
                                "text": "Hey"
                            },
                            {
                                "boundingBox": "359,337,144,35",
                                "text": "diddle"
                            },
                            {
                                "boundingBox": "536,337,144,35",
                                "text": "diddle"
                            }
                        ]
                    },
                    {
                        "boundingBox": "169,394,576,35",
                        "words": [
                            {
                                "boundingBox": "169,394,79,35",
                                "text": "The"
                            },
                            {
                                "boundingBox": "279,402,73,27",
                                "text": "cat"
                            },
                            {
                                "boundingBox": "383,394,83,35",
                                "text": "and"
                            },
                            {
                                "boundingBox": "500,394,70,35",
                                "text": "the"
                            },
                            {
                                "boundingBox": "604,394,141,35",
                                "text": "fiddle"
                            }
                        ]
                    },
                    {
                        "boundingBox": "260,452,391,50",
                        "words": [
                            {
                                "boundingBox": "260,452,79,35",
                                "text": "The"
                            },
                            {
                                "boundingBox": "370,467,80,20",
                                "text": "cow"
                            },
                            {
                                "boundingBox": "473,452,178,50",
                                "text": "jumped"
                            }
                        ]
                    },
                    {
                        "boundingBox": "277,509,363,35",
                        "words": [
                            {
                                "boundingBox": "277,524,100,20",
                                "text": "over"
                            },
                            {
                                "boundingBox": "405,509,71,35",
                                "text": "the"
                            },
                            {
                                "boundingBox": "509,524,131,20",
                                "text": "moon."
                            }
                        ]
                    },
                    {
                        "boundingBox": "180,566,551,49",
                        "words": [
                            {
                                "boundingBox": "180,566,79,35",
                                "text": "The"
                            },
                            {
                                "boundingBox": "292,566,103,35",
                                "text": "little"
                            },
                            {
                                "boundingBox": "427,566,82,49",
                                "text": "dog"
                            },
                            {
                                "boundingBox": "546,566,185,49",
                                "text": "laughed"
                            }
                        ]
                    },
                    {
                        "boundingBox": "212,623,493,51",
                        "words": [
                            {
                                "boundingBox": "212,631,42,27",
                                "text": "to"
                            },
                            {
                                "boundingBox": "286,638,72,20",
                                "text": "see"
                            },
                            {
                                "boundingBox": "390,623,96,35",
                                "text": "such"
                            },
                            {
                                "boundingBox": "519,638,20,20",
                                "text": "a"
                            },
                            {
                                "boundingBox": "574,631,131,43",
                                "text": "sport."
                            }
                        ]
                    },
                    {
                        "boundingBox": "301,681,312,35",
                        "words": [
                            {
                                "boundingBox": "301,681,90,35",
                                "text": "And"
                            },
                            {
                                "boundingBox": "425,681,70,35",
                                "text": "the"
                            },
                            {
                                "boundingBox": "528,681,85,35",
                                "text": "dish"
                            }
                        ]
                    },
                    {
                        "boundingBox": "147,738,622,50",
                        "words": [
                            {
                                "boundingBox": "147,753,73,20",
                                "text": "ran"
                            },
                            {
                                "boundingBox": "255,753,114,30",
                                "text": "away"
                            },
                            {
                                "boundingBox": "401,738,86,35",
                                "text": "with"
                            },
                            {
                                "boundingBox": "519,738,71,35",
                                "text": "the"
                            },
                            {
                                "boundingBox": "622,753,147,35",
                                "text": "spoon."
                            }
                        ]
                    },
                    {
                        "boundingBox": "195,1179,364,12",
                        "words": [
                            {
                                "boundingBox": "195,1179,45,12",
                                "text": "Nursery"
                            },
                            {
                                "boundingBox": "242,1179,38,12",
                                "text": "Rhyme"
                            },
                            {
                                "boundingBox": "283,1179,36,9",
                                "text": "Charts"
                            },
                            {
                                "boundingBox": "322,1179,28,12",
                                "text": "from"
                            },
                            {
                                "boundingBox": "517,1179,11,10",
                                "text": "C"
                            },
                            {
                                "boundingBox": "531,1179,28,9",
                                "text": "2017"
                            }
                        ]
                    },
                    {
                        "boundingBox": "631,1179,90,12",
                        "words": [
                            {
                                "boundingBox": "631,1179,9,9",
                                "text": "P"
                            },
                            {
                                "boundingBox": "644,1182,6,6",
                                "text": "a"
                            },
                            {
                                "boundingBox": "655,1182,7,9",
                                "text": "g"
                            },
                            {
                                "boundingBox": "667,1182,7,6",
                                "text": "e"
                            },
                            {
                                "boundingBox": "690,1179,31,12",
                                "text": "7144"
                            }
                        ]
                    }
                ]
            }
        ]
    }
      
    Friday, June 7, 2019 9:09:00 AM (GMT Daylight Time, UTC+01:00)
    # Thursday, June 6, 2019

    GCast 51:

    Creating an Azure Container Instance

    Learn how to create an Azure Container instance from a container repository.

    Azure | GCast | IAAS | Screencast | Video
    Thursday, June 6, 2019 9:15:00 AM (GMT Daylight Time, UTC+01:00)
    # Wednesday, June 5, 2019

    The Microsoft Cognitive Services Computer Vision API contains functionality to infer a lot of information about a given image.

    As of this writing, the API is on version 2.0 and supports the following capabilities:

    Analyze an Image

    Get general information about an image, such as the objects found, what each object is and where it is located. It can even identify potentially pornographic images.

    Analyze Faces

    Find the location of each face in an image and determine information about each face, such as age, gender, and type of facial hair or glasses.

    Optical Character Recognition (OCR)

    Convert a picture of text into text

    Recognize Celebrities

    Recognize famous people from photos of their face

    Recognize Landmarks

    Recognize famous landmarks, such as the Statue of Liberty or Diamond Head Volcano.

    Analyze Video

    Retrieve keywords to describe a video at different points in time as it plays.

    Generate a Thumbnail

    Change the size and shape of an image, without cropping out the main subject.

    Getting Started

    To get started, you need to create a Computer Vision Service. To do this, navigate to the Azure Portal, log in, click the [Create a resource] button (Fig. 1) and enter "Computer Vision" in the Search box, as shown in Fig. 2.

    cv01-CreateResource
    Fig. 1

    cv02-SearchForComputerVision
    Fig. 2

    A dialog displays, with information about the Computer Vision Service, as shown in Fig. 3.

    cv03-ComputerVisionSplashPage
    Fig. 3

    Click the [Create] button to display the Create Computer Vision Service blade, as shown in Fig. 4.

    cv04-NewSvc
    Fig. 4

    At the "Name" field, enter a name by which you can easily identify this service. This name must be unique among your services, but need not be globally unique.

    At the "Subscription" field, select the Subscription with which you want to associate this service. Most of you will only have one subscription.

    At the "Location" field, select the Azure Region in which to store this service. Consider where the users of this service will be, so you can reduce latency.

    At the "Pricing tier" field, select "F0" to use this service for free or "S1" to incur a small charge for each call to the service. If you select the free service, you will be limited in the number and frequency of calls that can be made.

    At the "Resource group" field, select a resource group in which to store your service or click "Create new" to store it in a newly-created resource group. A resource group is a logical container for Azure resources.

    Click the [Create] button to create the Computer Vision service.

    Usually, it takes less than a minute to create a Computer Vision Service. When Azure has created this service, you can navigate to it by its name or the name of the resource group.

    Two pieces of information are critical when using the service: The Endpoint and the API keys.

    The Endpoint can be found on the service's Overview blade, as shown in Fig. 5.

    cv05-OverviewBlade
    Fig. 5

    The API Keys can be found on the service's "Keys" blade, as shown in Fig. 6. There are 2 keys, in case one key is compromised; you can use the other key, while the first is regenerated, in order to minimize downtime.

    cv06-KeysBlade
    Fig. 6

    Copy the URL and one of the API keys. You will need them to call the web services. We will describe how to make specific calls in future articles.
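As a preview of how the Endpoint and API key fit together, here is a minimal Python sketch that builds a call to the Computer Vision "Analyze Image" REST operation. The endpoint value and image URL are placeholder assumptions; substitute the values you copied above. The `vision/v2.0/analyze` path and the `Ocp-Apim-Subscription-Key` header are the conventions used by the Computer Vision REST API at the time of writing.

```python
import json
import urllib.request

# Placeholder values -- substitute the Endpoint and key you copied above.
endpoint = "https://westus.api.cognitive.microsoft.com"
api_key = "<your-api-key>"

# The Analyze Image operation lives under the vision/v2.0 path;
# the visualFeatures query parameter selects which analyses to run.
analyze_url = endpoint + "/vision/v2.0/analyze?visualFeatures=Description,Tags"

# The API key travels in the Ocp-Apim-Subscription-Key header;
# the image to analyze is passed as a URL in the JSON body.
headers = {
    "Ocp-Apim-Subscription-Key": api_key,
    "Content-Type": "application/json",
}
body = json.dumps({"url": "https://example.com/photo.jpg"}).encode("utf-8")

request = urllib.request.Request(
    analyze_url, data=body, headers=headers, method="POST"
)

# With a real endpoint and key, send the request and read the JSON result:
# with urllib.request.urlopen(request) as response:
#     result = json.loads(response.read())
#     print(result["description"]["captions"][0]["text"])
```

The request itself is commented out so the sketch can be read without live credentials; the structure (URL, header, JSON body) is what matters here.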

    Wednesday, June 5, 2019 4:46:00 PM (GMT Daylight Time, UTC+01:00)
    # Tuesday, June 4, 2019

    GeorgeClinton-1

    It's a good thing The Aragon Ballroom in Chicago's Uptown has such a large stage. They needed all of it to hold the roughly 20 singers and dancers and rappers and guitarists and horns and drummers that make up George Clinton's Parliament Funkadelic.

    But even with its large size, the Aragon stage could not contain the entire band for the entire concert Friday night. At least three musicians jumped over the front into the orchestra pit and ran out into the audience during the performance.

    Many years ago, Parliament and Funkadelic were two separate bands (although with largely the same members), but today Clinton has combined them into a single entity - Parliament Funkadelic.

    GeorgeClinton-2

    Clinton conducted... no, that's not the right word... Directed...? No... He more or less presided over the band's performance, stepping forward occasionally to acknowledge a performer, deliver a few lyrics, or lead the audience in handclapping. In fact, the 77-year-old spent much of the concert sitting on a chair in front of the drum set.

    The show started slowly, not helped by failing sound systems and the poor acoustics of the Aragon; but Clinton gained energy as the night went on.

    The crowd was most delighted when the band broke into their old songs: "Give Up the Funk", "Flashlight", "One Nation Under a Groove", and others.

    GeorgeClinton-3

    With no seats on the ground floor, the audience could not help but dance, clap, and wave their hands through much of the show. It wasn't perfect, but the crowded stage delivered enough to delight those who made it out for the night.

    His health is failing, and his energy is waning, and this will likely be his last tour. But he gave what he could to the crowded ballroom audience, who waited through three warmup acts to see the Rock and Roll Hall of Famer.

    Tuesday, June 4, 2019 9:31:00 AM (GMT Daylight Time, UTC+01:00)
    # Monday, June 3, 2019

    Episode 566

    Hattan Shobokshi on Terraform

    Hattan Shobokshi describes how to use Terraform to implement Infrastructure As Code.

    Monday, June 3, 2019 9:40:00 AM (GMT Daylight Time, UTC+01:00)
    # Sunday, June 2, 2019

    6/2
    Today I am grateful for the newly-renovated hallways in my building.

    6/1
    Today I am grateful to see George Clinton and Parliament Funkadelic in concert last night.

    5/31
    Today I am grateful to attend the AWS Summit yesterday.

    5/30
    Today I am grateful for a fresh tune-up on my new (to me) bicycle.

    5/29
    Today I am grateful to get my son moved into a new home last night.

    5/28
    Today I am grateful for good health insurance.

    5/27
    Today I am grateful to sleep in my own bed last night.

    5/26
    Today I am grateful for 4 days in Stockholm, one of Europe's great cities.

    5/25
    Today I am grateful to be invited to speak at the DevSum conference in Stockholm for the third time.

    5/24
    Today I am grateful for dinner at a Viking restaurant and fancy drinks with friends last night in Stockholm's Old City.

    5/23
    Today I am grateful for dinner last night with Jay and Mike in Stockholm.

    5/22
    Today I am grateful for 2 days in Copenhagen - my first visit to Denmark.

    5/21
    Today I am grateful for dinner and a long walk through Copenhagen yesterday with Joseph and Deidre.

    5/20
    Today I am grateful for Sunday brunch with Dave, Sue, Gary, and Debora.

    5/19
    Today I am grateful for a full day at home.

    5/18
    Today I am grateful for a week in Germany.

    5/17
    Today I am grateful for a German dinner with my team last night.

    5/16
    Today I am grateful to learn and teach DevOps with Lufthansa this week.

    5/15
    Today I am grateful for my first time staying in Germany in 31 years.

    5/14
    Today I am grateful to experience an Escape Room for the first time yesterday.

    5/13
    Today I am grateful to finally meet my team members in person.

    5/12
    Today I am grateful for:
    -the opportunity to speak at the Chicago Code Camp yesterday
    -an upgrade to Business Class on my transatlantic flight last night.

    5/11
    Today I am grateful to Hattan and Dave for answering my stupid questions the last few days.

    5/10
    Today I am grateful for dinner with Gary last night.

    5/9
    Today I am grateful for Korean burgers and Kimchi fries with Tim last night.

    5/8
    Today I am grateful to see Rev. Al Green in concert last night.

    5/7
    Today I am grateful for a bike ride last night through Chinatown, Pilsen, and the Lower West Side.

    5/6
    Today I am grateful for a visit to the Skokie Northshore Sculpture Park.

    Sunday, June 2, 2019 2:53:12 PM (GMT Daylight Time, UTC+01:00)
    # Saturday, June 1, 2019

    ThingsFallApart

    Things Fall Apart by Chinua Achebe tells the story of Okonkwo, a self-made man of the Umuofia clan in pre-colonial West Africa. Okonkwo worked hard to overcome the reputation of his lazy and cowardly father, who died with numerous debts. Okonkwo rose from poverty until he had acquired 3 wives, 10 children, a successful farm, and a position of respect and leadership among the clan.

    But Okonkwo was also hot-tempered and violent: he would beat his wives and children when they displeased him, and his great strength made him feel he could and should use violence to settle disputes.

    His rise to power was interrupted when he accidentally killed a fellow tribesman and was exiled for seven years for this crime. When he returned to the clan, everything had changed: British colonists had arrived, bringing with them their culture, their laws, and their religion. The society that Okonkwo knew began to disappear, subsumed by the colonists.

    Unlike many English novels of this period, Things Fall Apart tells the story of the colonization of Africa from the point of view of the Africans. We see the indigenous people's respect for their ancestors and their focus on doing what is best for the community as a whole; and we see the unpleasantness of their society, such as the killing of a youth to placate angry gods.

    We learn about Okonkwo - his strengths and his weaknesses. He is admired for his hard work and his sense of duty; but we also witness his uncontrolled rage and his intolerance.

    The point is that we see the complexity of their society - not the ignorant savages so often portrayed by Westerners to justify their methods of "education" and "liberation".

    Okonkwo was not a likeable man. But he had his principles and stuck to them, until his world was turned upside down by outsiders. And Achebe leads us toward his inevitable destruction.

    Saturday, June 1, 2019 9:58:00 AM (GMT Daylight Time, UTC+01:00)