# Thursday, 18 January 2018

ElfstonesOfShannara

The Elfstones of Shannara is the second book of Terry Brooks’s Shannara series.

It begins years after The Sword of Shannara. The magical Ellcrys tree - created by the elves millennia earlier to imprison the evil demons - has begun to die, allowing the demons to regain their strength and attack men and elves.

Wil Ohmsford - grandson of Shea Ohmsford, hero of the first novel - is assigned the task of re-planting the Ellcrys. He needs the help of the elven princess Amberle, who is chosen to protect the magical tree, but has run away from her king and family. To protect her on his quest, he uses the power of the mystical elfstones left to him by his grandfather.

It becomes a race for Wil and Amberle to find the hidden Ellcrys seed and restore the Ellcrys before the emboldened demons attack and destroy men and elves of the Four Lands.

Terry Brooks knows how to tell a story, but it troubles me that he borrows so much from J.R.R. Tolkien. It's not just that his world is populated with elves and dwarves. Major parts of the story are lifted directly from Tolkien. The young, reluctant hero goes on a quest to save the world from evil creatures, armed with a magical talisman that was left to him by an older relative, who went on a similar quest in an earlier book. We could easily substitute Bilbo, Frodo, and the Ring for Wil, Shea, and the Elfstones.

But the stories are entertaining, even if not told with Tolkien's magic. It's a good introduction to the world of fantasy for those who want something more accessible than The Lord of the Rings.

Thursday, 18 January 2018 05:06:43 (GMT Standard Time, UTC+00:00)
# Sunday, 07 January 2018

Azure provides several ways of managing resources through scripting. You can write scripts in either PowerShell (a popular Windows tool for managing servers and other IT resources) or the Azure CLI (a Bash-like command-line tool that runs on Windows, Linux, and macOS).

To use these tools, you need to have them installed locally, along with any supporting tools, such as the Azure PowerShell cmdlets.

Until now.

Recently, Microsoft released the Azure Cloud Shell - a browser-based command-line interface built into the Azure portal. By opening a Cloud Shell from the Azure Portal, you can execute PowerShell or CLI scripts from within your browser, without installing anything.

To open Cloud Shell, navigate to the Azure portal and click the [Cloud Shell] button (Fig. 1) on the top tool bar.

CS01CloudShellButton
Fig. 1

It may take a minute to retrieve and connect to a Cloud Shell environment (Fig. 2), but soon you will see a window with a command prompt, as shown in Fig. 3.

CS02-BashTerminal
Fig. 2

CS03-CLICloudShell
Fig. 3

The Cloud Shell in the image is configured to run CLI scripts, as you can tell by the dropdown in the window's top-left corner. You can also run PowerShell scripts in a Cloud Shell window. To change the script type, click the top-left dropdown and select your desired scripting language, as shown in Fig. 4.

CS04-ChangeScriptType
Fig. 4

Fig. 5 shows the Cloud Shell with PowerShell selected.

CS05-PowerShellCloudShell
Fig. 5

You don't need to log into the Cloud Shell environment; it assumes the identity of the account from which it was launched.

But you can view, create, and manage Azure resources. For example, from the Bash shell, type

az group list -o table

to see a list of all resource groups, or

az group create -l eastus -n myrg

to create a new resource group named "myrg" in the East US region.

You can even run other Bash commands, such as using ssh to connect to an Azure Linux VM.

Cloud Shell automatically creates a container within an Azure VM to host your session. Although this container is destroyed shortly after you disconnect, Cloud Shell also creates a storage account to persist the files and settings you create while using this interface, so they will be there when you return.

Azure Cloud Shell provides a convenient environment for executing automation scripts and other administrative functions without installing anything locally.

Sunday, 07 January 2018 10:43:00 (GMT Standard Time, UTC+00:00)
# Saturday, 06 January 2018

2017 was a year of change. I started a new job; I traveled for the first time to South America, the Czech Republic, Iowa, Oregon, and Los Angeles; I returned to Canada for the first time in 10+ years; my son Tim graduated and began his professional career; and my son Nick accepted a new job in Massachusetts.

I'll start with me.

New Job

Microsoft went through a major reorganization last year and it greatly affected my department and my job. I moved to a new team that is focused on helping professors at top computer science universities teach their students about cloud computing. This role involves even more travel than my last one. For the past few months, I've been visiting schools around North America and my calendar for the next 2 months is filled with campus visits for hackathons, guest lectures, workshops, and meetings with students and professors. I won't be home much in January and February.

Travel

I traveled more and farther in 2017 than I have in a long time (if ever). I visited 7 different countries (Romania, Sweden, The Czech Republic, Uruguay, Argentina, Canada, and the US) and 17 states. It was my first visit to the Czech Republic, Uruguay, and Argentina.

Prague has long been on my list of places to visit so I was thrilled to finally get there and I enjoyed the hospitality of Gael Fraiteur and Brit King. Gael and I drove down to Český Krumlov - a small village in southern Bohemia preserved as it was in the 18th century - where we spent a night and explored castles, museums, and restaurants.

I was happy to accept an invitation to speak at .NET Conf UY, primarily because it was my first trip to South America. After a few days in Montevideo, Uruguay, I took a ferry to Buenos Aires, where I spent an afternoon exploring the city on foot. In May, I scheduled a tour to speak at 4 different user groups in 3 days in Iowa. My friend Javier helped me plan the trip and I was excited for my first visit to the Hawkeye State.

In 2017, I got serious about my goal of seeing every home stadium and arena of the 4 major professional sports leagues. I visited 3 NFL stadiums, 4 NBA arenas, and 2 NHL arenas last year. With 55 places remaining, I will need to accelerate this process.

In January, I flew to San Francisco, where my friend Sara picked me up and together we drove 7 hours to a small town in southwest Oregon to attend the funeral of the wife of an old friend. The next day, we attended the funeral and a dinner and repeated the trip in reverse. We were fortunate to have flexible enough schedules to make this trip and I'm really glad we did. And I got to know Sara a lot better on the trip.

Live Music

I made it a point to see a lot of live music in 2017. Most of the shows (Stanley Clarke, Buddy Guy, Guy King, Ladysmith Black Mambazo, Booker T. Jones, Marcia Ball, Delbert McClinton, Kris Kristofferson, Al Stewart, Jean-Luc Ponty, Benny Golson, Paul Weller, and Roy Ayers) were at small clubs in Chicago (SPACE, Buddy Guy's Legends, Old Town School of Folk Music, SPACE, City Winery, Jazz Showcase, House of Blues, and The Promontory), but I also saw Eric Church and Tim McGraw / Faith Hill at the cavernous Allstate Arena.

My Two Sons

My two sons also had some major changes in their lives.

Shortly after graduating from Indiana University with a degree in Informatics, Tim accepted a job with Enkay Tech - an IT consulting company outside of Chicago. He lived with me for a few months before renting a house in Wrigleyville. Spending time with him was one of the highlights of my summer.

After 2 years serving as Director of Basketball Operations at Southern Illinois University - Edwardsville, Nick accepted a position as an assistant coach at Williams College in Williamstown, MA. He moved in the fall and his team has been ranked as high as #5 in Division III. In December, I was able to see Williams play 2 games at a tournament in Thousand Oaks, CA.

Mostly Good, Some Bad

Although most of 2017 was good to me, not everything was awesome. My mother passed away in June. A few weeks later, I was diagnosed with skin cancer, which was successfully removed. Not long after, a relationship ended after over a year of dating. Each incident was magnified because they came in quick succession, but I've recovered from them. My family and I were somewhat prepared for our mother's passing. She was 85 and the death of my father and sister in the past few years forced us to consider the inevitable loss of other loved ones. I am left with fond memories of her and of the girl I lost and this helps. And my follow-up appointment showed no sign of skin cancer.

Looking Ahead

2017 was an amazing year of growth for me personally. The changes are accelerating into 2018. My calendar is already full for the first 2 months and I am looking forward to the future with optimism.

Saturday, 06 January 2018 06:22:15 (GMT Standard Time, UTC+00:00)
# Sunday, 31 December 2017

12/31
Today I am grateful for lunch yesterday with my cousin Bob.

12/30
Today I am grateful to see Nick's Williams College basketball team play for the first time last night in California.

12/29
Today I am grateful for:
-Lunch yesterday with my cousin Barbara in San Juan Capistrano
-Watching a Spartan victory in the Holiday Bowl from the 50-yard line with my son Tim

12/28
Today I am grateful to see a Lakers-Grizzlies game last night on my first visit to the Staples Center.

12/27
Today I am grateful to see an excellent Roy Hargrove concert last night at the Jazz Showcase in the South Loop.

12/26
Today I am grateful to spend Christmas with my family.

12/25
Today I am grateful that we still celebrate the birth of Jesus Christ after all these years.

12/24
Today I am grateful for a Christmas Eve snowfall; and the fact that I am not driving in it.

12/23
Today I am grateful for 3 Personal Training sessions this week - the last 3 of 2017!

12/22
Today I am grateful to see Roy Ayers in concert last night on my first visit to The Promontory in Hyde Park.

12/21
Today I am grateful to everyone who helped me get to 500 episodes of #TechnologyAndFriends

12/20
Today I am grateful for my first visit to the Argonne National Laboratory to attend a reception for David Danielson - clean energy entrepreneur and former Assistant Secretary of Energy.

12/19
Today I am grateful for an unseasonably warm Chicago December.

12/18
Today I am grateful to take Nick and Tim to a Blackhawks game last night - their first visit to the United Center.

12/17
Today I am grateful to spend yesterday with my sons.

12/16
Today I am grateful to spend some time at home.

12/15
Today I am grateful for the holiday party hosted by my apartment building last night.

12/14
Today I am grateful to spend a few days in Texas meeting with folks at the University of Texas at Austin.

12/13
Today I am grateful to attend a home University of Texas basketball game for the first time.

12/12
Today I am grateful to see an exciting Pelicans-Rockets game last night - my first time at the Toyota Center!

12/11
Today I am grateful for:
-The hospitality and generosity of Paul
-Attending a home Texans game for the first time.

12/10
Today I am grateful for:
-The Uber driver who picked me up yesterday and took me to the airport when my Uber driver ran out of gas on the way.
-The "Lights in the Heights" festival last night in Houston.

12/09
Today I am grateful for a kind and completely unexpected email last night.

12/08
Today I am grateful to attend the Chicago User Group Holiday Party last night.

12/07
Today I am grateful for a meaningful and enjoyable offsite with my team in Atlanta this week.

12/06
Today I am grateful for an excellent dinner last night in midtown Atlanta with my team.

12/05
Today I am grateful for great seats at my second Atlanta Hawks home game in the past week.

12/04
Today I am grateful for temperatures in the 50s in Chicago in December.

Sunday, 31 December 2017 13:03:27 (GMT Standard Time, UTC+00:00)
# Saturday, 30 December 2017

As I discussed in a previous article, Microsoft Cognitive Services includes a set of APIs that allow your applications to take advantage of machine learning to analyze images, sound, video, and language. One of these APIs is a REST web service that can determine the words and punctuation contained in a picture. This is accomplished with a simple REST web service call.

The Cognitive Services Optical Character Recognition (OCR) service is part of the Computer Vision API. It takes as input a picture of text and returns the words found in the image.

To get started, you will need an Azure account and a Cognitive Services Vision API key.

If you don't have an Azure account, you can get a free one at https://azure.microsoft.com/free/.

Once you have an Azure account, follow the instructions in this article to generate a Cognitive Services Computer Vision key.

To use this API, you simply have to make a POST request to the following URL:
https://[location].api.cognitive.microsoft.com/vision/v1.0/ocr

where [location] is the Azure location where you created your API key (above).

Optionally, you can add the following 2 querystring parameters to the URL:

  • language: the 2-letter language abbreviation. Use “en” for English. Currently, 25 languages are supported. If omitted, the service will attempt to auto-detect the language.
  • detectOrientation: Set this to “true” if you want to support upside-down or rotated images.
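Putting the location and those two optional parameters together, the request URL can be assembled in script. Below is a minimal sketch; the buildOcrUrl helper is hypothetical (not part of the Cognitive Services API), and the location values are examples:

```javascript
// Sketch: build the OCR request URL from a location and optional
// querystring parameters. "buildOcrUrl" is a hypothetical helper,
// not part of the Cognitive Services API.
function buildOcrUrl(location, language, detectOrientation) {
    var url = "https://" + location + ".api.cognitive.microsoft.com/vision/v1.0/ocr";
    var params = [];
    if (language) {
        params.push("language=" + encodeURIComponent(language));
    }
    if (detectOrientation) {
        params.push("detectOrientation=true");
    }
    if (params.length > 0) {
        url += "?" + params.join("&");
    }
    return url;
}

console.log(buildOcrUrl("westus", "en", true));
// → https://westus.api.cognitive.microsoft.com/vision/v1.0/ocr?language=en&detectOrientation=true
```

With no optional parameters, the helper returns the bare endpoint for the given region.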

The HTTP header of the request should include the following:

Ocp-Apim-Subscription-Key

The Cognitive Services Computer Vision key you generated above.

Content-Type

This tells the service how you will send the image. The options are:

  • application/json
  • application/octet-stream
  • multipart/form-data

If the image is accessible via a public URL, set the Content-Type to application/json and send JSON in the body of the HTTP request in the following format:

{"url":"imageurl"}
where imageurl is a public URL pointing to the image. For example, to perform OCR on an image of an Edgar Allan Poe poem, submit the following JSON:

{"url": "http://media.tumblr.com/tumblr_lrbhs0RY2o1qaaiuh.png"}


If you plan to send the image itself to the web service, set the Content-Type to either "application/octet-stream" or "multipart/form-data" and submit the binary image in the body of the HTTP request.
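Either way, the pieces of the request can be assembled in script before sending. Here is a minimal sketch for the public-URL case; the makeOcrRequestOptions helper is hypothetical, and the key shown is a placeholder:

```javascript
// Sketch: assemble the headers and body for the public-URL case.
// "makeOcrRequestOptions" is a hypothetical helper; the key is a placeholder.
function makeOcrRequestOptions(subscriptionKey, imageUrl) {
    return {
        method: "POST",
        headers: {
            "Ocp-Apim-Subscription-Key": subscriptionKey,
            "Content-Type": "application/json"
        },
        body: JSON.stringify({ url: imageUrl })
    };
}

var options = makeOcrRequestOptions("YOUR-KEY-HERE",
    "http://media.tumblr.com/tumblr_lrbhs0RY2o1qaaiuh.png");
console.log(options.body);
// → {"url":"http://media.tumblr.com/tumblr_lrbhs0RY2o1qaaiuh.png"}
```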

The full request looks something like:  

POST https://westus.api.cognitive.microsoft.com/vision/v1.0/ocr HTTP/1.1
Content-Type: application/json
Host: westus.api.cognitive.microsoft.com
Content-Length: 62
Ocp-Apim-Subscription-Key: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
{ "url": "http://media.tumblr.com/tumblr_lrbhs0RY2o1qaaiuh.png" }

For example, passing a URL with the following picture:

DreamWithinADream
(found online at http://media.tumblr.com/tumblr_lrbhs0RY2o1qaaiuh.png)

returned the following data: 

{
  "textAngle": 0.0,
  "orientation": "NotDetected",
  "language": "en",
  "regions": [
    {
      "boundingBox": "31,6,435,478",
      "lines": [
        {
          "boundingBox": "114,6,352,23",
          "words": [
            {
              "boundingBox": "114,6,24,22",
              "text": "A"
            },
            {
              "boundingBox": "144,6,93,23",
               "text": "Dream"
            },
            {
               "boundingBox": "245,6,95,23",
              "text": "Within"
            },
            {
              "boundingBox": "350,12,14,16",
              "text": "a"
            },
            {
              "boundingBox": "373,6,93,23",
              "text": "Dream"
            }
          ]
        },
        {
           "boundingBox": "31,50,187,16",
          "words": [
             {
              "boundingBox": "31,50,31,12",
               "text": "Take"
            },
            {
              "boundingBox": "66,50,23,12",
              "text": "this"
             },
            {
              "boundingBox": "93,50,24,12",
              "text": "kiss"
            },
            {
               "boundingBox": "121,54,33,12",
              "text": "upon"
            },
            {
              "boundingBox": "158,50,19,12",
              "text": "the"
            },
             {
              "boundingBox": "181,50,37,12",
               "text": "brow!"
            }
          ]
        },
        {
          "boundingBox": "31,67,194,16",
          "words": [
             {
              "boundingBox": "31,67,31,15",
               "text": "And,"
            },
            {
              "boundingBox": "67,67,12,12",
              "text": "in"
             },
            {
              "boundingBox": "82,67,46,16",
              "text": "parting"
            },
            {
              "boundingBox": "132,67,31,12",
              "text": "from"
            },
            {
              "boundingBox": "167,71,25,12",
              "text": "you"
            },
             {
              "boundingBox": "195,71,30,11",
               "text": "now,"
            }
          ]
        },
         {
          "boundingBox": "31,85,159,12",
          "words": [
            {
              "boundingBox": "31,85,32,12",
               "text": "Thus"
            },
            {
               "boundingBox": "67,85,35,12",
              "text": "much"
            },
            {
              "boundingBox": "107,86,16,11",
              "text": "let"
            },
             {
              "boundingBox": "126,89,20,8",
              "text": "me"
            },
            {
              "boundingBox": "150,89,40,8",
              "text": "avow-"
            }
          ]
        },
        {
          "boundingBox": "31,102,193,16",
          "words": [
            {
              "boundingBox": "31,103,26,11",
              "text": "You"
             },
            {
              "boundingBox": "61,106,19,8",
              "text": "are"
            },
            {
               "boundingBox": "84,104,21,10",
              "text": "not"
            },
            {
              "boundingBox": "109,106,44,12",
              "text": "wrong,"
            },
             {
              "boundingBox": "158,102,27,12",
               "text": "who"
            },
            {
              "boundingBox": "189,102,35,12",
              "text": "deem"
             }
          ]
        },
        {
          "boundingBox": "31,120,214,16",
          "words": [
            {
               "boundingBox": "31,120,29,12",
              "text": "That"
            },
            {
              "boundingBox": "64,124,21,12",
              "text": "my"
            },
            {
              "boundingBox": "89,121,29,15",
              "text": "days"
            },
            {
              "boundingBox": "122,120,30,12",
              "text": "have"
            },
            {
              "boundingBox": "156,121,30,11",
              "text": "been"
            },
            {
               "boundingBox": "191,124,7,8",
              "text": "a"
            },
            {
              "boundingBox": "202,121,43,14",
              "text": "dream;"
            }
           ]
        },
        {
          "boundingBox": "31,138,175,16",
          "words": [
            {
              "boundingBox": "31,139,22,11",
              "text": "Yet"
            },
             {
              "boundingBox": "57,138,11,12",
               "text": "if"
            },
            {
              "boundingBox": "70,138,31,16",
              "text": "hope"
             },
            {
              "boundingBox": "105,138,21,12",
              "text": "has"
            },
            {
               "boundingBox": "131,138,37,12",
              "text": "flown"
            },
            {
              "boundingBox": "172,142,34,12",
              "text": "away"
            }
          ]
        },
        {
          "boundingBox": "31,155,140,16",
          "words": [
            {
              "boundingBox": "31,156,13,11",
              "text": "In"
             },
            {
              "boundingBox": "48,159,8,8",
               "text": "a"
            },
            {
               "boundingBox": "59,155,37,16",
              "text": "night,"
            },
            {
              "boundingBox": "100,159,14,8",
              "text": "or"
            },
             {
              "boundingBox": "118,155,12,12",
              "text": "in"
            },
            {
              "boundingBox": "134,159,7,8",
              "text": "a"
            },
             {
              "boundingBox": "145,155,26,16",
               "text": "day,"
            }
          ]
        },
         {
          "boundingBox": "31,173,144,15",
          "words": [
            {
              "boundingBox": "31,174,13,11",
              "text": "In"
            },
            {
               "boundingBox": "48,177,8,8",
              "text": "a"
             },
            {
              "boundingBox": "59,173,43,15",
              "text": "vision,"
            },
             {
              "boundingBox": "107,177,13,8",
              "text": "or"
            },
            {
              "boundingBox": "124,173,12,12",
              "text": "in"
            },
            {
              "boundingBox": "140,177,35,11",
               "text": "none,"
            }
          ]
        },
        {
          "boundingBox": "31,190,180,16",
          "words": [
            {
              "boundingBox": "31,191,11,11",
              "text": "Is"
            },
            {
               "boundingBox": "47,190,8,12",
              "text": "it"
            },
            {
              "boundingBox": "59,190,58,12",
              "text": "therefore"
            },
             {
              "boundingBox": "121,190,19,12",
               "text": "the"
            },
            {
               "boundingBox": "145,191,23,11",
              "text": "less"
             },
            {
              "boundingBox": "173,191,38,15",
              "text": "gone?"
            }
          ]
        },
        {
          "boundingBox": "31,208,150,12",
          "words": [
            {
              "boundingBox": "31,208,20,12",
              "text": "All"
            },
             {
              "boundingBox": "55,208,24,12",
               "text": "that"
            },
            {
              "boundingBox": "83,212,19,8",
              "text": "we"
             },
            {
              "boundingBox": "107,212,19,8",
              "text": "see"
            },
            {
               "boundingBox": "131,212,13,8",
              "text": "or"
            },
            {
              "boundingBox": "148,212,33,8",
              "text": "seem"
            }
           ]
        },
        {
          "boundingBox": "31,226,194,12",
          "words": [
            {
              "boundingBox": "31,227,11,11",
              "text": "Is"
            },
             {
              "boundingBox": "46,226,21,12",
               "text": "but"
            },
            {
              "boundingBox": "71,230,7,8",
              "text": "a"
             },
            {
              "boundingBox": "82,226,40,12",
              "text": "dream"
            },
            {
               "boundingBox": "126,226,41,12",
              "text": "within"
            },
            {
              "boundingBox": "171,230,7,8",
              "text": "a"
            },
             {
              "boundingBox": "182,226,43,12",
               "text": "dream."
            }
          ]
        },
         {
          "boundingBox": "31,261,133,12",
          "words": [
            {
              "boundingBox": "31,262,5,11",
               "text": "I"
            },
            {
               "boundingBox": "41,261,33,12",
              "text": "stand"
             },
            {
              "boundingBox": "78,261,32,12",
              "text": "amid"
            },
            {
              "boundingBox": "114,261,19,12",
              "text": "the"
            },
            {
              "boundingBox": "137,265,27,8",
              "text": "roar"
            }
          ]
        },
        {
          "boundingBox": "31,278,169,15",
          "words": [
            {
              "boundingBox": "31,278,18,12",
              "text": "Of"
             },
            {
              "boundingBox": "52,282,7,8",
              "text": "a"
            },
            {
               "boundingBox": "63,278,95,12",
              "text": "surf-tormented"
            },
            {
              "boundingBox": "162,278,38,15",
              "text": "shore,"
            }
          ]
        },
        {
          "boundingBox": "31,296,174,15",
          "words": [
            {
              "boundingBox": "31,296,28,12",
              "text": "And"
             },
            {
              "boundingBox": "63,297,4,11",
              "text": "I"
            },
            {
               "boundingBox": "72,296,28,12",
              "text": "hold"
            },
            {
              "boundingBox": "104,296,41,12",
              "text": "within"
            },
             {
              "boundingBox": "149,300,20,11",
               "text": "my"
            },
            {
              "boundingBox": "173,296,32,12",
              "text": "hand"
             }
          ]
        },
        {
          "boundingBox": "31,314,169,16",
          "words": [
            {
               "boundingBox": "31,314,42,12",
              "text": "Grains"
            },
            {
              "boundingBox": "78,314,15,12",
              "text": "of"
            },
             {
              "boundingBox": "95,314,19,12",
              "text": "the"
            },
            {
              "boundingBox": "119,315,43,15",
              "text": "golden"
             },
            {
              "boundingBox": "167,314,33,12",
              "text": "sand-"
            }
          ]
         },
        {
          "boundingBox": "31,331,189,16",
           "words": [
            {
              "boundingBox": "31,332,31,11",
              "text": "How"
            },
             {
              "boundingBox": "66,331,28,12",
              "text": "few!"
            },
            {
              "boundingBox": "99,333,20,14",
              "text": "yet"
            },
            {
              "boundingBox": "123,331,27,12",
               "text": "how"
            },
            {
               "boundingBox": "154,331,28,16",
              "text": "they"
            },
            {
              "boundingBox": "186,335,34,12",
              "text": "creep"
            }
           ]
        },
        {
          "boundingBox": "31,349,206,16",
          "words": [
            {
              "boundingBox": "31,349,55,16",
              "text": "Through"
            },
            {
              "boundingBox": "90,353,20,11",
               "text": "my"
            },
            {
               "boundingBox": "115,349,44,16",
              "text": "fingers"
            },
            {
              "boundingBox": "163,351,12,10",
              "text": "to"
            },
             {
              "boundingBox": "179,349,20,12",
               "text": "the"
            },
            {
              "boundingBox": "203,350,34,15",
              "text": "deep,"
             }
          ]
        },
        {
          "boundingBox": "31,366,182,16",
          "words": [
            {
               "boundingBox": "31,366,39,12",
              "text": "While"
            },
            {
              "boundingBox": "74,367,5,11",
              "text": "I"
            },
            {
              "boundingBox": "83,370,39,12",
              "text": "weep-"
            },
            {
              "boundingBox": "126,366,36,12",
              "text": "while"
             },
            {
              "boundingBox": "166,367,5,11",
              "text": "I"
            },
            {
               "boundingBox": "175,367,38,15",
              "text": "weep!"
            }
          ]
        },
        {
          "boundingBox": "31,384,147,16",
          "words": [
            {
               "boundingBox": "31,385,11,11",
              "text": "O"
            },
            {
              "boundingBox": "47,384,31,12",
              "text": "God!"
            },
             {
              "boundingBox": "84,388,21,8",
               "text": "can"
            },
            {
              "boundingBox": "110,385,4,11",
              "text": "I"
             },
            {
              "boundingBox": "119,386,20,10",
              "text": "not"
            },
            {
               "boundingBox": "144,388,34,12",
              "text": "grasp"
            }
          ]
        },
        {
          "boundingBox": "31,402,170,16",
          "words": [
            {
              "boundingBox": "31,402,37,12",
              "text": "Them"
            },
            {
              "boundingBox": "72,402,29,12",
              "text": "with"
            },
            {
              "boundingBox": "105,406,7,8",
               "text": "a"
            },
            {
              "boundingBox": "116,402,42,16",
              "text": "tighter"
            },
            {
              "boundingBox": "162,403,39,15",
              "text": "clasp?"
            }
           ]
        },
        {
          "boundingBox": "31,419,141,12",
          "words": [
            {
              "boundingBox": "31,420,11,11",
              "text": "O"
            },
            {
              "boundingBox": "47,419,31,12",
              "text": "God!"
            },
            {
              "boundingBox": "84,423,21,8",
              "text": "can"
            },
            {
              "boundingBox": "110,420,4,11",
              "text": "I"
            },
            {
              "boundingBox": "119,421,20,10",
              "text": "not"
            },
            {
              "boundingBox": "144,423,28,8",
              "text": "save"
            }
          ]
        },
        {
          "boundingBox": "31,437,179,16",
          "words": [
            {
              "boundingBox": "31,438,26,11",
              "text": "One"
            },
            {
              "boundingBox": "62,437,31,12",
              "text": "from"
            },
            {
              "boundingBox": "97,437,19,12",
              "text": "the"
            },
            {
              "boundingBox": "120,437,45,16",
              "text": "pitiless"
            },
            {
              "boundingBox": "169,438,41,11",
              "text": "wave?"
            }
          ]
        },
        {
          "boundingBox": "31,454,161,12",
          "words": [
            {
              "boundingBox": "31,455,11,11",
              "text": "Is"
            },
            {
              "boundingBox": "47,454,15,12",
              "text": "all"
            },
            {
              "boundingBox": "66,454,25,12",
              "text": "that"
            },
            {
              "boundingBox": "94,458,19,8",
              "text": "we"
            },
            {
              "boundingBox": "118,458,19,8",
              "text": "see"
            },
            {
              "boundingBox": "142,458,13,8",
              "text": "or"
            },
            {
              "boundingBox": "159,458,33,8",
              "text": "seem"
            }
          ]
        },
        {
          "boundingBox": "31,472,185,12",
          "words": [
            {
              "boundingBox": "31,473,23,11",
              "text": "But"
            },
            {
              "boundingBox": "58,476,7,8",
              "text": "a"
            },
            {
              "boundingBox": "69,472,40,12",
              "text": "dream"
            },
            {
              "boundingBox": "113,472,41,12",
              "text": "within"
            },
            {
              "boundingBox": "158,476,7,8",
              "text": "a"
            },
            {
              "boundingBox": "169,472,47,12",
              "text": "dream?"
            }
          ]
        }
      ]
    }
  ]
}
  

Note that the image is split into an array of regions, each region contains an array of lines, and each line contains an array of words. This structure allows you to replace or block out one or more specific words, lines, or regions.

Below is a jQuery code snippet making a request to this service to perform OCR on images of text. You can download the full application at https://github.com/DavidGiard/CognitiveSvcsDemos.

    // Assumes url holds the image URL and outputDiv is a jQuery-wrapped output element
    var language = $("#LanguageDropdown").val();
    var computerVisionKey = getKey() || "Copy your Subscription key here";
    var webSvcUrl = "https://westcentralus.api.cognitive.microsoft.com/vision/v1.0/ocr";
    webSvcUrl = webSvcUrl + "?language=" + language;

    $.ajax({
        type: "POST",
        url: webSvcUrl,
        headers: { "Ocp-Apim-Subscription-Key": computerVisionKey },
        contentType: "application/json",
        data: '{ "Url": "' + url + '" }'
    }).done(function (data) {
        outputDiv.text("");

        // Walk the regions -> lines -> words hierarchy returned by the service
        var regionsOfText = data.regions;
        for (var h = 0; h < regionsOfText.length; h++) {
            var linesOfText = regionsOfText[h].lines;
            for (var i = 0; i < linesOfText.length; i++) {
                var output = "";
                var words = linesOfText[i].words;
                for (var j = 0; j < words.length; j++) {
                    output += words[j].text + " ";
                }
                outputDiv.append("<div>" + output + "</div>");
            }
            outputDiv.append("<hr>");
        }
    }).fail(function (err) {
        $("#OutputDiv").text("ERROR! " + err.responseText);
    });

You can find the full documentation for this API, including an in-browser testing tool, here.

Sending requests to the Cognitive Services OCR API makes it simple to convert a picture of text into text.  

Saturday, 30 December 2017 10:31:00 (GMT Standard Time, UTC+00:00)
# Friday, 29 December 2017

It's difficult enough for humans to recognize emotions in the faces of other humans. Can a computer accomplish this task? It can if we train it to and if we give it enough examples of different faces with different emotions.

When we supply data to a computer with the objective of training that computer to recognize patterns and predict new data, we call that Machine Learning. And Microsoft has done a lot of Machine Learning with a lot of faces and a lot of data and they are exposing the results for you to use.

As I discussed in a previous article, Microsoft Cognitive Services includes a set of APIs that allow your applications to take advantage of Machine Learning in order to analyze images, sound, video, and language.

The Cognitive Services Emotion API looks at photographs of people and determines the emotion of each person in the photo. Supported emotions are anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. Each emotion is assigned a score between 0 and 1; higher numbers indicate higher confidence that this is the emotion expressed in the face. If a picture contains multiple faces, the emotion of each face is returned.

To get started, you will need an Azure account and a Cognitive Services Emotion API key.

If you don't have an Azure account, you can get a free one at https://azure.microsoft.com/free/.

Once you have an Azure account, follow the instructions in this article to generate a Cognitive Services Emotion API key.

To use this API, you simply have to make a POST request to the following URL:
https://[location].api.cognitive.microsoft.com/emotion/v1.0/recognize

where [location] is the Azure location where you created your API key (above).

The HTTP header of the request should include the following:

Ocp-Apim-Subscription-Key

This is the subscription key you generated above.

Content-Type

This tells the service how you will send the image. The options are:

  • application/json
  • application/octet-stream

If the image is accessible via a public URL, set the Content-Type to application/json and send JSON in the body of the HTTP request in the following format

{"url":"imageurl"}
where imageurl is a public URL pointing to the image. For example, to analyze the emotions in this picture of a happy face and a not-happy face,

TwoEmotions

submit the following JSON:

{"url":"http://davidgiard.com/content/binary/Open-Live-Writer/Using-the-Cognitive-Services-Emotion-API_14A56/TwoEmotions_2.jpg"}

If you plan to send the image itself to the web service, set the content type to "application/octet-stream" and submit the binary image in the body of the HTTP request.

A full request looks something like this:

POST https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize HTTP/1.1
Content-Type: application/json
Host: westus.api.cognitive.microsoft.com
Content-Length: 62
Ocp-Apim-Subscription-Key: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
{ "url": "http://xxxx.com/xxxx.jpg" }
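For illustration, the settings for such a request could be assembled in JavaScript, in the style of the OCR snippet earlier in this blog. The helper name below is my own, not part of any SDK; you supply your key, Azure location, and image URL:

```javascript
// Hypothetical helper: assemble the settings object for an Emotion API request.
// The returned object can be passed directly to $.ajax(...).
function buildEmotionRequest(subscriptionKey, location, imageUrl) {
    return {
        type: "POST",
        url: "https://" + location + ".api.cognitive.microsoft.com/emotion/v1.0/recognize",
        headers: { "Ocp-Apim-Subscription-Key": subscriptionKey },
        contentType: "application/json",
        data: JSON.stringify({ url: imageUrl })
    };
}
```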

For example, passing the URL of the picture below of three attractive, smiling people

BrianAnnaDavid   

(found online at https://giard.smugmug.com/Tech-Community/SpartaHack-2016/i-4FPV9bf/0/X2/SpartaHack-068-X2.jpg)

returned the following data: 

[
  {
    "faceRectangle": {
      "height": 113,
      "left": 285,
      "top": 156,
      "width": 113
    },
    "scores": {
      "anger": 1.97831262E-09,
      "contempt": 9.096525E-05,
      "disgust": 3.86221245E-07,
      "fear": 4.26409547E-10,
      "happiness": 0.998336,
      "neutral": 0.00156954059,
      "sadness": 8.370223E-09,
      "surprise": 3.06117772E-06
    }
  },
  {
    "faceRectangle": {
      "height": 108,
      "left": 831,
      "top": 169,
      "width": 108
    },
    "scores": {
      "anger": 2.63808062E-07,
      "contempt": 5.387114E-08,
      "disgust": 1.3360991E-06,
      "fear": 1.407629E-10,
      "happiness": 0.9999967,
      "neutral": 1.63170478E-06,
      "sadness": 2.52861843E-09,
      "surprise": 1.91028926E-09
    }
  },
  {
    "faceRectangle": {
      "height": 100,
      "left": 591,
      "top": 168,
      "width": 100
    },
    "scores": {
      "anger": 3.24157673E-10,
      "contempt": 4.90155344E-06,
      "disgust": 6.54665473E-06,
      "fear": 1.73284559E-06,
      "happiness": 0.9999156,
      "neutral": 6.42121E-05,
      "sadness": 7.02297257E-06,
      "surprise": 5.53670576E-09
    }
  }
]

The high values for the three happiness scores and the very low values for all the other scores suggest a very high degree of confidence that each person in this photo is happy.
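Given the scores object returned for each face, picking the dominant emotion is just a matter of finding the key with the highest value. A minimal JavaScript sketch (the function name is my own, not part of the API):

```javascript
// Return the name of the highest-scoring emotion in a "scores" object
// as returned by the Emotion API, e.g. { anger: ..., happiness: ..., ... }
function dominantEmotion(scores) {
    var best = null;
    for (var emotion in scores) {
        if (best === null || scores[emotion] > scores[best]) {
            best = emotion;
        }
    }
    return best;
}

// For the first face above:
// dominantEmotion({ anger: 1.97831262e-9, happiness: 0.998336, neutral: 0.00156954059 })
// returns "happiness"
```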

Here is the request in the popular HTTP analysis tool Fiddler [http://www.telerik.com/fiddler]:
Request

Em01-Fiddler-Request

Response:
Em02-Fiddler-Response 

Below is a C# code snippet making a request to this service to analyze the emotions of the people in an online photograph. You can download the full application at https://github.com/DavidGiard/CognitiveSvcsDemos.

// Replace with your Emotion API key; imageUrl holds the URL of the photo to analyze
string emotionApiKey = "XXXXXXXXXXXXXXXXXXXXXXX";
var client = new HttpClient();
client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", emotionApiKey);
string uri = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize";
HttpResponseMessage response;
var json = "{'url': '" + imageUrl + "'}";
byte[] byteData = Encoding.UTF8.GetBytes(json);
using (var content = new ByteArrayContent(byteData))
{
    content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
    response = await client.PostAsync(uri, content);
}

if (response.IsSuccessStatusCode)
{
    // data contains the JSON array of faceRectangle/scores objects shown above
    var data = await response.Content.ReadAsStringAsync();
}

You can find the full documentation for this API, including an in-browser testing tool, here.

Sending requests to the Cognitive Services Emotion API makes it simple to analyze the emotions of people in a photograph.  

Friday, 29 December 2017 10:43:00 (GMT Standard Time, UTC+00:00)
# Thursday, 28 December 2017

Generating a thumbnail image from a larger image sounds easy – just shrink the dimensions of the original, right? But it becomes more complicated if the thumbnail image is a different shape than the original. For example, the original image may be rectangular but we need the new image to be a square. Or we may need to generate a portrait-oriented thumbnail from a landscape-oriented original image. In these cases, we will need to crop or distort the original image when we create the thumbnail. Distorting the image tends to look very bad; and when we crop an image, we want to ensure that the primary subject of the image remains in the generated thumbnail. To do this, we need to identify the primary subject of the image. That's easy enough for a human observer, but it is a difficult thing for a computer. If we want to automate this process, we will have to ask the computer to do exactly that.

This is where machine learning can help. By analyzing many images, Machine Learning can figure out what parts of a picture are likely to be the main subject. Once this is known, it becomes a simpler matter to crop the picture in such a way that the main subject remains in the generated thumbnail.

As I discussed in a previous article, Microsoft Cognitive Services includes a set of APIs that allow your applications to take advantage of Machine Learning in order to analyze images, sound, video, and language.

The Cognitive Services Vision API uses Machine Learning so that you don't have to. It exposes a web service to return an intelligent thumbnail image from any picture.

You can see this in action here.

Scroll down to the section titled "Generate a thumbnail" to see the Thumbnail generator, as shown in Figure 1.

Th01
Figure 1

With this live, in-browser demo, you can either select an image from the gallery and view the generated thumbnails; or provide your own image - either from your local computer or from a public URL. The page uses the Thumbnail API to create thumbnails of 6 different dimensions.
 
For your own application, you can either call the REST Web Service directly or (for a .NET application) use a custom library. The library simplifies development by abstracting away HTTP calls via strongly-typed objects.

To get started, you will need an Azure account and a Cognitive Services Vision API key.

If you don't have an Azure account, you can get a free one at https://azure.microsoft.com/free/.

Once you have an Azure account, follow the instructions in this article to generate a Cognitive Services Computer Vision key.

To use this API, you simply have to make a POST request to the following URL:
https://[location].api.cognitive.microsoft.com/vision/v1.0/generateThumbnail?width=ww&height=hh&smartCropping=true

where [location] is the Azure location where you created your API key (above) and ww and hh are the desired width and height of the thumbnail to generate.

The “smartCropping” parameter tells the service to determine the main subject of the photo and to try to keep it in the thumbnail while cropping.
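To make the substitution of [location], ww, and hh concrete, here is a small JavaScript helper (my own illustration, not part of any SDK) that assembles the request URL:

```javascript
// Build the generateThumbnail request URL for a given Azure location
// and desired thumbnail dimensions, with smart cropping enabled.
function buildThumbnailUrl(location, width, height) {
    return "https://" + location + ".api.cognitive.microsoft.com" +
        "/vision/v1.0/generateThumbnail" +
        "?width=" + width + "&height=" + height + "&smartCropping=true";
}

// buildThumbnailUrl("westcentralus", 300, 300) yields
// "https://westcentralus.api.cognitive.microsoft.com/vision/v1.0/generateThumbnail?width=300&height=300&smartCropping=true"
```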

The HTTP header of the request should include the following:

Ocp-Apim-Subscription-Key

The Cognitive Services Computer Vision key you generated above.

Content-Type

This tells the service how you will send the image. The options are:   

  • application/json    
  • application/octet-stream    
  • multipart/form-data

If the image is accessible via a public URL, set the Content-Type to application/json and send JSON in the body of the HTTP request in the following format

{"url":"imageurl"}
where imageurl is a public URL pointing to the image. For example, to generate a thumbnail of this picture of a skier, submit the following JSON:

{"url":"http://mezzotint.de/wp-content/uploads/2014/12/2013-skier-edge-01-Kopie.jpg"}

Man skiing  alps

If you plan to send the image itself to the web service, set the content type to either "application/octet-stream" or "multipart/form-data" and submit the binary image in the body of the HTTP request.

Here is a sample console application that uses the service to generate a thumbnail from a file on disc. You can download the full source code at
https://github.com/DavidGiard/CognitiveSvcsDemos

Note: You will need to create the folder "c:\test" to store the generated thumbnail.

   

// TODO: Replace this value with your Computer Vision API Key
string computerVisionKey = "XXXXXXXXXXXXXXXX";

var client = new HttpClient();
var queryString = HttpUtility.ParseQueryString(string.Empty);

client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", computerVisionKey);

queryString["width"] = "300";
queryString["height"] = "300";
queryString["smartCropping"] = "true";
var uri = "https://westcentralus.api.cognitive.microsoft.com/vision/v1.0/generateThumbnail?" + queryString;

HttpResponseMessage response;

string originalPicture = "http://davidgiard.com/content/Giard/_DGInAppleton.png";
var jsonBody = "{'url': '" + originalPicture + "'}";
byte[] byteData = Encoding.UTF8.GetBytes(jsonBody);

using (var content = new ByteArrayContent(byteData))
{
    content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
    response = await client.PostAsync(uri, content);
}
if (response.StatusCode == System.Net.HttpStatusCode.OK)
{
    // Write thumbnail to file
    var responseContent = await response.Content.ReadAsByteArrayAsync();
    string folder = @"c:\test";
    string thumbnailFullPath = string.Format("{0}\\thumbnailResult_{1:yyyyMMddhhmmss}.jpg", folder, DateTime.Now);
    using (BinaryWriter binaryWriter = new BinaryWriter(new FileStream(thumbnailFullPath, FileMode.Create, FileAccess.Write)))
    {
        binaryWriter.Write(responseContent);
    }
    // Show BEFORE and AFTER to user
    Process.Start(thumbnailFullPath);
    Process.Start(originalPicture);
    Console.WriteLine("Done! Thumbnail is at {0}!", thumbnailFullPath);
}
else
{
    Console.WriteLine("Error occurred. Thumbnail not created");
}

The result is shown in Figure 2 below.
Th02Results
Figure 2

One thing to note: the Thumbnail API is part of the Computer Vision API. As of this writing, the free version of the Computer Vision API is limited to 5,000 transactions per month. If you want more than that, you will need to upgrade to the Standard version, which charges $1.50 per 1,000 transactions.

But this should be plenty for you to learn this API for free and build and test your applications until you need to put them into production.
The code above can be found on GitHub.

You can find the full documentation for this API, including an in-browser testing tool, here.

The Cognitive Services Computer Vision API provides a simple way to generate thumbnail images from pictures.

Thursday, 28 December 2017 10:31:00 (GMT Standard Time, UTC+00:00)