A Parable

John spent days writing a software component. He tested and double-checked his code, and he was satisfied that it worked properly, according to the requirements he was given, so he checked it into source control. A few weeks later, a new version of the software that included his code was released to production. A user discovered a bug caused by John's changes. The user tweeted about the bug, and this was retweeted thousands of times. Before long, word got back to John. An edge case that John had not considered was causing problems in production. He fixed the bug and checked his changes back into source control. And he waited. Hoping for the best.

June spent days writing a software component. She tested and double-checked her code, and she was satisfied that it worked properly, according to the requirements she was given, so she checked it into source control. June's team had a policy that required a code review prior to merging any code with the main branch. During the code review process, one of June's peers pointed out a bug in her code. It was an edge case that June had not considered. She fixed the bug and checked her changes back into source control. The code was reviewed again and merged with the main branch. June slept well that night.

The story of John and June illustrates some of the advantages of code reviews. Catching June's bug during a code review resulted in a faster, cheaper fix and less public embarrassment than catching John's bug in production. The two bugs were of equal severity, but one was far less costly to fix.

What is a Code Review?

Why do we do code reviews? They take up time that could be spent writing code, designing features, or otherwise directly driving forward a project, so there is a cost. The answer is that the benefits of a good code review far outweigh the costs.

When I think of a code review, I think of a formal process in which one person reviews code written by another and provides written or oral feedback to the author, approving that code only after they deem it acceptable.

There are two parties in a code review: The Code Author and the Code Reviewer.

The steps in a code review are:

  1. The Code Author makes changes to an application and checks those changes into a code repository
  2. The author sends a description of the changes to the Code Reviewer. The changes are known as a "Change Set", and the description of those changes is a "Change List". Many Application Lifecycle Management tools (e.g., GitHub and Azure DevOps) support a "Pull Request", which combines the two with the source code and serves as a formal entry point for the Reviewer to begin reviewing the code changes. I often use these three terms interchangeably because they are so closely related
  3. The Code Reviewer retrieves the code, examines it, and (if necessary) provides feedback on changes that the author must make before the code can be merged with the main branch
  4. If the code requires changes, the Reviewer sends the feedback to the Author
  5. The Author responds to the feedback and makes any necessary changes
  6. The Author re-sends to the Reviewer the code with these updates
  7. Steps 3-6 are repeated until no more changes are required
  8. When no more changes are required, the Reviewer approves the changeset
  9. The code is merged with the main branch in the repository
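The branch-and-merge portions of the steps above (steps 1 and 9) can be sketched as a local git session. This is a simulation under illustrative branch, file, and directory names; the review itself (steps 2-8) happens in your ALM tool, such as a GitHub or Azure DevOps Pull Request, and is marked with comments below:

```shell
# A local simulation of the commit, branch, and merge steps of a code
# review workflow. Names and paths here are illustrative only.
set -e
rm -rf /tmp/review-demo && mkdir /tmp/review-demo && cd /tmp/review-demo
git init -q -b main
git config user.email "author@example.com"
git config user.name "Author"
echo "v1" > app.txt
git add app.txt && git commit -qm "Initial commit"

# Step 1: the author makes changes on a feature branch
git checkout -q -b feature/edge-case-fix
echo "v2: handle empty input" > app.txt
git commit -qam "Handle empty-input edge case"

# Steps 2-8: the author opens a Pull Request; the reviewer examines the
# change set, requests fixes, and eventually approves - all in the ALM tool

# Step 9: the approved change set is merged into the main branch
git checkout -q main
git merge -q --no-ff -m "Merge reviewed change set" feature/edge-case-fix
cat app.txt   # prints "v2: handle empty input"
```

In tools like GitHub, pushing additional commits to the same feature branch automatically updates the open Pull Request, which is how steps 3-6 repeat until the Reviewer approves.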

Code Review Goals

A good code review will accomplish the following:

  • Validate code
  • Help make engineering decisions
  • Share knowledge
  • Increase code ownership

Let's discuss each of these goals.

Validate code

The most obvious reason to review code is to validate that it does what it is supposed to do. Generally, we look at this from an external point of view. For example, if we provide a given set of inputs to a function, we verify that the function returns the expected output. We can pull the code from source control and make sure it compiles and runs successfully. We can execute automated tests and validate that they all pass.
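This kind of external validation is easy to automate. Here is a minimal sketch in Python, using a hypothetical `apply_discount` function (not from any real codebase) to show input/output verification of the sort a reviewer can run:

```python
# External validation: given known inputs, assert that a function
# returns the expected outputs. apply_discount is a hypothetical
# function used only for illustration.

def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount():
    assert apply_discount(100.0, 10) == 90.0    # typical case
    assert apply_discount(100.0, 0) == 100.0    # edge case: no discount
    assert apply_discount(100.0, 100) == 0.0    # edge case: full discount

test_apply_discount()
print("all checks passed")
```

A reviewer who pulls the branch can run tests like these and confirm that the expected behavior, including edge cases, is covered.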

But we also want to validate the code from an internal point of view. If our team has coding standards, does the code adhere to them? While reviewing, the reviewer looks for and calls out potential problems. Even if the code works well, there may be areas for improvement: for example, the reviewer may suggest ways to make the code more efficient or more readable. The reviewer should point these out as well.

Help make engineering decisions

Sometimes, a code review can drive engineering decisions. If there is confusion or inconsistency about how the application is accessing data or dividing services or testing code, code reviews can raise these issues and prompt a discussion. If different developers have different coding standards, it may indicate a gap in the team's standards and drive discussion around this.

Effective teams have published a set of coding guidelines that may describe everything from naming conventions to required test coverage. Developers must be aware of these guidelines and make an effort to adhere to them, but non-compliant code often slips through. A code review is a good place to catch this before the code is committed to the main branch.

Share knowledge

Another benefit of code reviews is that they allow team members to share knowledge.

Both parties benefit from this exchange. The reviewer has a chance to improve the code itself and to address any weaknesses or knowledge gaps in the author. The reviewer can likewise shore up his or her own weaknesses by seeing someone else's approach to a coding challenge.

The reviewer gains knowledge about a part of the system that someone else wrote. By reading the code, they may also learn something about the language in which it was written; or about a framework or a design pattern or an algorithm implemented by the author; or about the business cases and requirements of the application.

In addition, the code author can learn by reading feedback from the reviewer, who may suggest improvements that the coder did not consider.

Increase code ownership

I have worked on too many systems in which one developer possessed all the knowledge about a part of that system. Confusion reigned when that developer left the team: no one understood how to maintain the orphaned code. Regular code reviews give team members a chance to understand parts of the system on which they are not actively working. This shared knowledge benefits the whole team, allowing flexibility in staffing and removing the danger of critical knowledge walking out the door when a team member departs.

Code Review Process

The process of a code review is simple: The author checks code changes into a repository and announces that it is available for review. A reviewer looks at and runs the code and provides feedback. This feedback can be either written or verbal. Most Application Lifecycle Management systems (e.g., GitHub and Azure DevOps) support this process through a Pull Request. In these systems, the code of a Pull Request does not get merged into the main branch until one or more reviewers have approved it. We can configure these systems, setting specific rules about who must approve code before it is merged.

This process works best when everyone involved believes in it and considers code review time well spent. Support from upper management helps, but public buy-in from the team's most respected developers is an even more effective way to bring others on board.

Conclusion

Code Reviews have become an important part of most of the projects on which I work, yet I remember a time before I even knew such a thing existed.

These days, code reviews are almost ubiquitous on my software projects. They help us address weaknesses among developers and reviewers, enforce compliance with coding standards, and improve the quality of our codebase.

If we can catch bugs before they go into production, we can save ourselves embarrassment, time, and money. A good code review process helps us achieve that.


GCast 124:

Customizing Spelling and Grammar Checks in Microsoft Word

Learn how to customize the spelling and grammar checker in Microsoft Word: enable and disable grammar checking, set the language and dialect, and have the spellchecker ignore parts of your document.


When I was a boy, my parents took me to see Doctor Dolittle - a charming musical film in which Rex Harrison played a globetrotting veterinarian who had the ability to talk with animals in their own language. It quickly became my favourite movie and I watched it every time it was on TV. I was vaguely aware that the title character was based on a series of novels, but I never read these books. Until now.

Hugh Lofting's 1920 novel "The Story of Doctor Dolittle" introduced the title character. He was an M.D. but he had so many pets in his home that his patients refused to visit, and his sister eventually moved out, leaving no one to care for him. His pet parrot Polynesia taught the Doctor the languages of other animals and soon he developed a reputation as the most effective veterinarian in England. His reputation spread to Africa, where he was asked to come and cure a colony of sick monkeys.

On his journey, he was kidnapped by an African king and hunted by pirates, and he rescued an old man from a cave. Most of his success was due to the help of the local animals.

It is worth noting that at least one scene does not age well. When we first encounter the African king, he is angry at white Europeans because of the exploitation he suffered from previous imperialist visitors. This seemed progressive for a book written a hundred years ago; but a few chapters later, the king's son asks the Doctor to fulfill his dream of becoming a "White Prince". That scene has been cut from some editions, but it was left in the one I read, and it will not sit well with most modern readers. It appears that some racial epithets were removed from this edition.

Despite that, the story is fun, even for a grown-up like me. Lofting leads us from adventure to adventure and it is Dolittle's kindness to animals that is his greatest strength.


Episode 705

Douglas Starnes on Python and Azure

Douglas Starnes discusses how Azure, Visual Studio Code, GitHub Codespaces, and other Microsoft tools support Python and Linux tools.


Robert Jordan's "The Dragon Reborn" continues the adventures of the heroes of "The Wheel of Time" series. As in the first two books, volume three has the characters split up and travel across the world Jordan has created.

It was not a well-kept secret, but the previous book ended with an astral projection of Rand al'Thor's battle with Ba'alzamon. Now virtually everyone in this world knows that Rand is The Dragon Reborn - the powerful male mystic who is destined to change the world in a way that will either save or destroy it; and he will likely be driven mad by the power he wields.

Rand sets out on a quest to find his destiny. Moiraine, Lan, Loial, and Perrin depart soon after to search for him. Egwene, Nynaeve, and Elayne move forward with their training to become full sisters of the mystical female Aes Sedai order; then go searching for the Black Ajah Liandran, who betrayed the Aes Sedai. Perrin wrestles with his growing connection with wolves, while Mat is finally cured of the evil that infected him when he stole a cursed blade.

In the end, the major players converge in the province of Tear and Rand again fights the dark lord Ba'alzamon, believing again this is the final battle between the two.

Oddly, although the title of this book refers to Rand, we hear far less of him in these pages than of his companions. This works, as the reader gets more development of the other characters - particularly Mat and Perrin. All these quests, characters, and subplots could confuse the reader. Fortunately, in this novel, Jordan stays longer with each group and place, allowing the reader to become comfortable with the plot thread he is exploring.

The Wheel of Time is a marathon and not a sprint and it is best to keep this in mind as Jordan builds his world and his characters. We learn more about his characters as they learn more about themselves.


Episode 704

Shannon Kuehn on Workload Identity Federation

Shannon Kuehn describes how to use Microsoft Workload Identity Federation to simplify authentication across Azure and other systems.

Links:
Shannon's blog

GitHub Federated Identity (Microsoft documentation)

GitHub Federated Identity (GitHub documentation)

Gallant GitHub Code (Az-CLI script)


"More About Paddington" is Michael Bond's second collection of stories about Paddington the Bear. As in the first collection, each story relates an incident or adventure in the life of the good-hearted anthropomorphic bear and the English family that has taken him in.

Each story follows a similar pattern: Paddington has an idea and sets out with the best of intentions; things go wrong and sometimes things get messy; everything works out well in the end. They are cute stories aimed at children, but they entertain enough to appeal to adults.

The stories in this collection are more cohesive than in the previous book. Each tale references the one before and they all take place over a period of a few months at the end of the year.

Paddington is quickly becoming one of my favourite characters in children's literature.


Tom Harrell was born before the founding of Jazz Showcase, the South Side club at which he performed Saturday night. Harrell looks every bit of his 75 years. He is old and frail and cannot sit erect in his chair. But he can still play, and his playing is still impressive.

Harrell has played with Stan Kenton, Woody Herman, Dizzy Gillespie, and a host of other jazz legends over the years; and he still loves the music enough to climb onto a stage and lead his quartet for two sets. The quartet featured a terrific set of talent, including Ugonna Okegwo on bass and Adam Cruz on drums. But it was Venezuelan pianist Luis Perdomo who stole the show with his outstanding technique and solos.

Highlights of the evening included "Sea", a beautiful melody, which Harrell dedicated to the late Joe Siegel, who founded Jazz Showcase three-quarters of a century ago; and his version of John Coltrane's classic "Moment's Notice".

Although there was not much visually to the performance, the music was enough to satisfy a full house.



Episode 703

Mary Grygleski on Event Streaming and Processing with Apache Pulsar

Mary Grygleski describes how to use the Apache Pulsar open source product and its connectors to build scalable applications in pieces.

Links:
https://pulsar.apache.org/
https://pulsar-neighborhood.github.io/


April 2022 Gratitudes

4/4
Today I am grateful for lunch in Little India with Nick yesterday.

4/5
Today I am grateful for a tune-up for my bicycle.

4/6
Today I am grateful to mentor Chicago high school students on their STEM project for the fourth year in a row.

4/7
Today I am grateful to see Sean Hayes in "Good Night, Oscar" at the Goodman Theatre last night.

4/8
Today I am grateful to get some legal protection against the person who has been threatening me for the past year.

4/9
Today I am grateful to see "King James" at the Steppenwolf Theatre last night with family and friends.

4/10
Today I am grateful that my sons took me to a Cubs game yesterday as a late birthday gift.

4/11
Today I am grateful for my first visit to the Field Museum in years.

4/12
Today I am grateful for a conversation with Glenn last night for the first time in too long.

4/13
Today I am grateful for 700 episodes of #TechnologyAndFriends and to all those who made it possible.

4/14
Today I am grateful for dinner last night with Nick and Tim before they fly out of town for Nick's bachelor party.

4/15
Today I am grateful for a long conversation yesterday with Jennifer and for the help, advice, and support she gave me.

4/16
Today I am grateful to celebrate Seder with friends last night for the beginning of Passover.

4/17
Today I am grateful for lunch with Debbie and her family yesterday.

4/18
Today I am grateful to celebrate a Virtual Easter with family and friends yesterday.

4/19
Today I am grateful
-to arrive safely in California
-that my abstinence from meat and alcohol is at an end
-to file my taxes on time

4/20
Today I am grateful to finally meet in person the people I have been working with for the past 6 months.

4/21
Today I am grateful for:
-a visit to 3 vineyards in central California
-coffee with Neha yesterday morning

4/22
Today I am grateful to successfully wrap up a customer project this week.

4/23
Today I am grateful for:
-Coffee with Christine yesterday morning
-Lunch with John yesterday
-My first visit to the newly-opened Microsoft office in Mountain View, CA

4/24
Today I am grateful to see the Tom Harrell Quartet in concert at Jazz Showcase last night.

4/25
Today I am grateful to bring out the cushions yesterday for the chairs on my balcony.

4/26
Today I am grateful for my fourth COVID vaccination shot.

4/27
Today I am grateful to those who offered support to me yesterday and for those who shared their own struggles.

4/28
Today I am grateful for co-workers who say and write nice things about me.

4/29
Today I am grateful to attend the ISTC STEM Challenge Showcase yesterday at the Chicago Cultural Center.

4/30
Today I am grateful that I no longer have to live paycheck to paycheck.

5/1
Today I am grateful to hear some excellent stories at The Moth Grand Slam last night.


American men are funny. So many of us like sports and talk about sports and even build friendships primarily on the basis of our shared love of a game or a team.

Rajiv Joseph's play "King James" follows two Cleveland Cavaliers basketball fans, who meet during LeBron James's rookie season. The city of Cleveland has not won a championship in over 50 years and sports fans are energized by the possibility of James - a generational talent drafted by the Cavaliers - changing that. Shawn (Glenn Davis) and Matt (Chris Perfetti) bond over their fandom and become best friends. So much of their friendship is based on their love of basketball and the Cavs. They debate, celebrate, and mourn as LeBron's career and decisions impact the team and its fans.

But the play is not really about sports or the expectations that fans have of their athletic heroes.

Ultimately, sports is a red herring, nothing more than a reason to bring the two friends together. "King James" is about relationships and friendships and how people understand one another and communicate. I saw myself in both Shawn and Matt and I felt their struggles. It was a bromantic comedy that even a bro would enjoy.

I left with a feeling of hope - hope that Cleveland sports fans must have felt following the 2016 NBA Finals.


Articles

A Sample JavaScript App Using the Bing Spell Check API

Calling the Bing Spell Check Service

Creating a Bing Spell Check Service

Calling the "Recognize Text" Cognitive Service from a .NET Application

Passing a binary file to a web service from a .NET app

Converting Images to Text with the "Recognize Text" API

Calling Cognitive Services OCR Service from a .NET Application

Calling Cognitive Service OCR service from JavaScript

Using the Cognitive Services OCR Service

Getting Started with the Cognitive Services Computer Vision API

Introducing Cognitive Services and Computer Vision

Natural Language Processing with LUIS

Cognitive Services Optical Character Recognition

Using the Cognitive Services Emotion API

Using Cognitive Services to Generate a Thumbnail Image

Generating a Cognitive Services API Key

Cognitive Services Make It Easy to Use Machine Learning in Your Application

Screencasts


Introducing Cognitive Services and Computer Vision

Creating Applications with the Analyze Image Cognitive Services API

OCR with Cognitive Services

Handwriting OCR with Cognitive Services

Cognitive Services Text Recognition service

Text Recognition Cognitive Service with Binary Images

Text Recognition C# Demo

Sentiment Analysis Cognitive Service

Sentiment Analysis JavaScript Demo

Using the Video Indexer AI Tool

Interviews

Sam Nasr on Cognitive Services

Martin Kearn on Document Recognition and Knowledge Extraction

Hamayal Choudhry and Samin Khan on SmartArm

Presentations

"Building and Training your own Custom Image Recognition AI" presentation at NDC-Oslo

Talking about Cognitive Services on the Eat Sleep Code podcast


GCast 123:

Ingesting Into an ADX Table From an Azure Storage Blob

In this video, you will learn to ingest data from an Azure Storage Blob into an ADX table. This is useful when you have a large amount of data to ingest.


After ten years working on Agile projects, I have discovered that just about everyone has their own spin on how to do it effectively.

Esther Derby and Diana Larsen emphasize frequent retrospectives, which they describe in their appropriately titled book "Agile Retrospectives: Making Good Teams Great".

I have been on many projects that waited until the end before doing any kind of retrospective - an event often referred to as a "post-mortem". I have found these to be of little value. We spend a lot of time analyzing what went right and what went wrong; we document it thoroughly; and then we file it away where no one reads it.

"Agile Retrospectives" suggests a more agile approach - conducting an analysis periodically throughout the project and using the information gathered to adjust the team's behavior and goals going forward.

Following a brief introduction describing the history and purpose of retrospectives, most of the book is devoted to a set of activities that one can organize and participate in during a retrospective. Each activity is broken down into the following:

  • Purpose
  • Time Needed
  • Description
  • Steps
  • Materials and Preparation
  • Examples

In addition, the authors sometimes list variations on the steps and description of an activity.

Each chapter reads like a set of recipes and this cookbook format makes it simple to select and follow each "recipe". While this book focuses primarily on retrospectives throughout a project - after each sprint, for example - many of the ideas and activities could also be executed at the end of a project or major deliverable.

One would not and should not attempt to involve their team in every activity described. There simply is not enough time and you will find that some activities are more relevant to your team than others. However, "Agile Retrospectives" contains enough good ideas to help you improve the next iteration by learning from the previous one. Read them all and pick those that will help your team.

I have always viewed Agile methodologies as a way to get and react to feedback as quickly as possible. The activities in this book will help your team do this.


Episode 702

Joe Kunk on Windows UI Testing

Joe Kunk describes how he uses the FlaUI open source tool to automate the testing of his WPF, Windows Forms, and modern apps.

https://github.com/FlaUI/FlaUI

I Will Remember Bogu

Until I purchased my current condominium four years ago, I had always been the one responsible for cleaning my own home. But shortly after I moved in, I decided to hire someone to clean my place regularly. Bogu was referred to me by a friend, so I hired her to come every two weeks. The only exceptions were the times I was traveling.

For the next four years, she worked for me. She was punctual, flexible, hard-working, efficient, and thorough. I told her what to do and she did it, spending about 2.5 hours on each visit and keeping my home in good shape.

We never became close personally. Bogu spoke very little English, and I tried to stay out of her way when she was working. But I appreciated her work and her attitude, and we always greeted one another with a smile.

After this past Christmas, she stopped coming. I received no phone call or text message, and she did not respond to any of my messages. This was unusual, as she had always been very responsive. I called my friends who had referred her, and they knew nothing. I did not know any of Bogu's friends or family, so I had no way of knowing her situation.

Earlier this month, my friend called with the news. In early January, Bogu had died in her sleep. One morning, she simply did not wake up.

Bogu was a Polish immigrant who worked hard her entire life and never had a chance to enjoy retirement. I never heard her complain about anything. There are many such people in our country and in our world and most of them are forgotten.

I will remember Bogu.


In 1958, Oscar Levant was well known as a pianist, actor, and humorist. It was his sarcastic wit that set him apart and the reason that Jack Paar invited him as a guest on The Tonight Show. Unknown to Paar, Levant's wife had had him committed to a hospital to treat his drug addictions and temper. Luckily (or unluckily), he was released for a few hours to perform on Paar's show.

"Good Night, Oscar" - a play by Doug Wright - tells the story of that night.

The story moves seamlessly from hilarious one-liners to poignant moments between husband and wife to the tragedy of a man battling his demons. It is a roller coaster ride of emotions.

Sean Hayes - famous as Jack in the long-running TV show "Will & Grace" - is brilliant as Levant. He captures a talented but tortured soul who craves an audience but deflects praise. He is the personification of imposter syndrome, forever comparing himself to his late friend George Gershwin, whose talent he feels he can never live up to. Portraying mental illness on stage or screen is always risky, but Hayes pulls it off successfully.

At the end, Hayes surprises us all with a vibrant rendition of Gershwin's "Rhapsody in Blue".

I have no idea how much of this story is true, but it felt true to me. I felt the pain of the characters, and I felt Levant's need to salve his inadequacies with his caustic wit.

I came away more than satisfied.


Episode 701

Mihai Tataran on the Microsoft Azure Well Architected Framework

Mihai Tataran describes the Microsoft Azure Well-Architected Framework and how you can use its guidance to build a better cloud application - whether you are starting from scratch or migrating an existing application.

https://docs.microsoft.com/azure/architecture/framework/

Somehow, I missed Paddington as I was growing up. I missed him again as my children were growing up. I did not want to wait around for grandchildren, so I picked up "A Bear Called Paddington" - the first of fourteen novels written by Englishman Michael Bond about an anthropomorphic bear discovered by the Brown family in London's Paddington Station.

It was delightful!

The Browns discover a bear in Paddington Station and learn that he has stowed away on a ship from "Darkest Peru". They take him home and name him after the station in which they found him. He quickly becomes part of the family. Although Paddington is honest, polite, and kind, his curiosity often gets him into trouble.

This book does not contain an overall plot; rather, each chapter is a self-contained short story, and they are all loosely tied together. Each story relates an incident in which the bear finds himself in an unexpected predicament because, as Paddington admits, "Things are always happening to me. I'm that sort of bear." It takes only a few pages for things to work themselves out - sometimes by luck, sometimes thanks to Paddington's positive attitude, and usually for the best.

Although marketed as a children's book, this novel will appeal to readers of all ages. It is honest and funny and fun and full of adventure. It's that sort of book!


When I create a new software project, I often create a folder structure to hold components that I intend to write later. Sometimes these folders are empty in early iterations - a reminder to me and others of where things should go.

This presents a problem when working with git source control. Empty folders are ignored by git, so they never make their way into the source control system.

The solution is to add a file to the folder, so it is no longer empty. Many people will add an empty text file named ".gitkeep". If a folder contains any file (even an empty one), git will not ignore it.

Technically this is a hack, but it is common enough that we can think of it as a pseudo-standard. Naming the file ".gitkeep" lets people know the purpose of the file and that it can be removed when useful files are added to the folder.
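The trick above is easy to see in a throwaway repository. In this sketch the directory and folder names are illustrative:

```shell
# Demonstrate that git ignores an empty folder until it contains a file.
# Directory and folder names here are illustrative only.
set -e
rm -rf /tmp/gitkeep-demo && mkdir /tmp/gitkeep-demo && cd /tmp/gitkeep-demo
git init -q -b main

# Create a placeholder folder for components we intend to write later
mkdir -p src/services

# The empty folder is invisible to git: this prints nothing
git status --porcelain

# Adding an empty .gitkeep file makes git track the folder
touch src/services/.gitkeep
git add src/services/.gitkeep

# Now the folder appears in git's index
git status --porcelain   # shows "A  src/services/.gitkeep"
```

Once real files land in `src/services`, the `.gitkeep` file can simply be deleted.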


GCast 122:

Azure Data Explorer Materialized Views

Materialized Views are a feature of Azure Data Explorer (ADX) that allows you to pre-aggregate data, making queries much faster. This video shows you how to create and use a Materialized View.


Overview

When designing a cloud application, there are many options. Even if you have settled on a cloud provider, the number of services from which to choose can be overwhelming. It helps to group cloud services into three categories: Software as a Service (SAAS), Platform as a Service (PAAS), and Infrastructure as a Service (IAAS). There may be some overlap between these categories, but thinking of them in this way can simplify your options.

Software as a Service (SAAS)

Software as a Service (or "SAAS" for short) describes services that are pre-built. You can sign up for them and start using them immediately with little or no configuration. Examples include email services, such as Gmail, and Customer Relationship Management systems, such as Microsoft CRM. Many of these systems allow you to customize them, but that is your choice. If they match your needs, you can sign up and start using them very quickly.

These are the simplest services to use, as they require the least amount of custom code.

Platform as a Service (PAAS)

Platform as a Service (or "PAAS" for short) describes services you can consume or build on within your applications. They provide some basic functionality, such as a website, a web service, a database, or machine learning models. But they require some custom code on your part to make them useful. These services free you from implementing part of the application, so you can focus on the business logic.

For example, if you want to implement custom vision recognition in your software, you could create and train your own model or you could simply call the Custom Vision API of Microsoft Cognitive Services. The latter is much simpler because much of the data collection and compute processing has been done for you to create an existing model. You can then take the results of that model, use it to identify objects in a photo or video, and build the business logic of your application with that identification information. 
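To make the PAAS idea concrete, here is a Python sketch that assembles (but does not send) a request to a hosted image-analysis API. The endpoint URL, API key, and the exact path and payload shape are illustrative assumptions, not the precise Cognitive Services contract; consult the service documentation before relying on them:

```python
# A sketch of consuming a PAAS image-analysis service instead of
# training your own model. The endpoint, key, URL path, and payload
# shape below are illustrative assumptions for this example.
import json

def build_analyze_request(endpoint: str, api_key: str, image_url: str) -> dict:
    """Assemble an HTTP request (as a dict) for an image-analysis service."""
    return {
        "method": "POST",
        "url": f"{endpoint}/vision/v3.2/analyze?visualFeatures=Objects",
        "headers": {
            # A subscription-key header of this style is common on Azure APIs
            "Ocp-Apim-Subscription-Key": api_key,
            "Content-Type": "application/json",
        },
        "body": json.dumps({"url": image_url}),
    }

request = build_analyze_request(
    "https://example.cognitiveservices.azure.com",  # hypothetical endpoint
    "YOUR-API-KEY",                                 # placeholder key
    "https://example.com/photo.jpg",                # image to analyze
)
print(request["url"])
```

Your application would send this request with an HTTP client, then apply its own business logic to the list of identified objects in the response, which is exactly the division of labor PAAS encourages.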

Infrastructure as a Service (IAAS)

Infrastructure as a Service (or "IAAS" for short) describes the deployment of your data and code to a virtual machine or a container in the cloud. With this category, you are responsible for all the code and data, but the networking, infrastructure, and hardware are abstracted away from you. Those things are handled by the cloud provider automatically.

On-Premises

The final category used to be our only option. In years past, we had to buy, install, configure, and maintain our own hardware. In addition, we were responsible for installing and patching the operating system and any packaged software we were using. Each of the cloud solutions described above takes that responsibility away from us.

How to decide

As a general rule, it is more cost-effective to buy or rent a solution that fits your needs than to build that system yourself. This may be offset by the amount of customization you need to do, so take this into account when you choose your services.

We can think of the categories above as existing on a continuum, as shown below:

SAAS -> PAAS -> IAAS -> On-Premises

As we move from left to right along this continuum, we gain more control over our system; but we are required to do more of the work ourselves. Doing the work ourselves tends to be expensive because we cannot share that development cost among many customers, as a cloud provider can. On the other hand, we may require flexibility that is not offered by a given service, so we may need to move to the right and absorb that cost to gain that flexibility.

My recommendation when creating an application or service is to begin on the left side of the continuum above and consider if a service in that category will meet your needs. If it does, then stop and consider using that service. If not, move to the right and consider if the next category meets your needs, and so on.

If you can find a SAAS service that you can use, you should strongly consider it. It may be that no such service exists or that all are prohibitively expensive. If that is the case, look at PAAS services and try to find one or more that will meet your needs. If no such service exists, you may need to write your own code; in this case, you can often deploy your solution to a PAAS hosting service (Azure App Service and Azure Cosmos DB, for example). However, if no PAAS service meets your needs, you may need to install and deploy everything yourself. The cloud can still help, as you can deploy to IAAS services using Virtual Machines and containers. You are still freed from maintaining the hardware and the underlying network infrastructure.

Finally, there may be reasons you cannot deploy your application or data to the cloud. Accessing data or services across the Internet may result in unacceptable latency for your application or local laws may require you to store data in a country in which no cloud data center exists. In these cases, you can purchase and maintain your own servers in your own data center (or use a non-cloud data center). This requires the most work on your part and tends to be the most expensive option, which is why so many companies are moving to the cloud.

As the discussion above shows, there are no answers that fit every scenario. You need to consider your needs, the available cloud services, and each category.

Keep in mind that you can split your application into multiple services and deploy some as SAAS, some as PAAS, and some as IAAS. Doing this requires that your application be divided into component services that can be deployed independently.

If that is not the case, be aware that you can change your cloud deployment over time. For example, you may choose initially to "lift and shift" an on-premises application - that is, move it from a local server to a Virtual Machine in the cloud. Over time, you can refactor your code, dividing your application into smaller services that can be deployed as PAAS services. This should reduce the amount of code you need to maintain and reduce your costs.

Generally speaking, you should avoid "reinventing the wheel". Take advantage of work that has already been done. And take advantage of services that do things not part of your core business. Every hour you spend patching an O/S, installing a software update, and replacing hardware is an hour that you are not spending enhancing and refining your core business logic.


Episode 700

A Celebration of Friends!

At the end of each Technology and Friends episode, I ask my guests to say something that includes the words "Technology" and "Friends". Here is a compilation of all their responses from the last 99 episodes!


Todd Rundgren, Daryl Hall, and their music were a big part of my life growing up. I saw Rundgren perform at a small club in Newport, KY almost 20 years ago; but I had never seen Hall in concert until Friday evening at the Auditorium Theatre.

Rundgren began the evening, performing for over an hour. He played his biggest hits - "Hello, It's Me", "I Saw the Light", "It Wouldn't Have Made Any Difference", and "We Gotta Get You a Woman", along with some deeper cuts from his five-decade recording career. He also delivered moving renditions of two Motown classics - Smokey Robinson's "Ooh Baby Baby" and Marvin Gaye's "I Want You".

After a brief intermission, Daryl Hall took the stage and brought his energy to a mix of originals from his solo career and songs made famous by other artists. He drew the biggest cheers while performing the music of "Hall & Oates" - a dynamic duo that cranked out hit after hit in the 70s and 80s.

For the encore, Rundgren joined Hall on stage, where they sang together for several songs.

Both acts shared the same backing band - an unusual arrangement for a headliner and opening act. The band was electric - especially the saxophone player, who excited the crowd with his solos. Hall is still a known commodity to mainstream audiences, thanks to his performances on his long-running online show "Daryl's House". He modeled the stage after the show's set and kept the atmosphere just as relaxed, with casual banter between songs. Rundgren still tours and occasionally records but has not maintained Hall's strong public presence.

Hall and Rundgren are now in their mid-70s, and it is surprising how well their voices have held up. Hall was most famous for his range while singing the soulful melodies as the lead vocalist of Hall & Oates, but Rundgren has always shown great vocal range as well. The years have done little to diminish these skills.

The relationship between these two men goes far back. They both grew up in Philadelphia, they have recorded and toured together before, and Rundgren produced one of H&O's albums. Their closeness was apparent as they harmonized on stage.


March 2022 Gratitudes


3/7
Today I am grateful to see "The Batman" at an IMAX theater yesterday.

3/8
Today I am grateful for a new shower curtain.

3/9
Today, I am grateful for an Indian lunch this week, courtesy of my employer.

3/10
Today I am grateful for dozens of new Twitter followers this week.

3/11
Today I am grateful for a decent sleep 3 nights in a row.

3/12
Today I am grateful to see a live performance of "West Side Story" in Lincolnshire last night.

3/13
Today I am grateful my flight arrived safely, even if it was 24 hours later than planned.

3/14
Today I am grateful for:
-Lunch yesterday with Stephen and Patrice
-My first visit to Staten Island
-An exciting Islanders-Ducks hockey game last night

3/15
Today I am grateful for an evening playing Top Golf in New Jersey

3/16
Today I am grateful to meet my teammates in person this week after working with them virtually for months.

3/17
Today I am grateful for 3 days in New York and New Jersey

3/18
Today I am grateful for an uneventful flight to Florida last night.

3/19
Today I am grateful for a day with new friends.

3/20
Today I am grateful for my first visit to the Everglades yesterday.

3/21
Today I am grateful for a visit to the Naples Botanical Garden yesterday.

3/22
Today I am grateful for the generosity and hospitality of Sean and Emilie

3/23
Today I am grateful for people who care about me.

3/24
Today I am grateful that my co-worker wasn't upset that I overslept and showed up 15 minutes late for our meeting yesterday morning.

3/25
Today I am grateful to finish preparing my taxes.

3/26
Today I am grateful to see Joseph in concert last night

3/27
Today I am grateful for brunch with friends in Wicker Park yesterday.

3/28
Today I am grateful for new slippers

3/29
Today I am grateful to work from the local Microsoft office for the first time in over 2 years.

3/30
Today I am grateful to receive a care package filled with nuts from my employer.

3/31
Today I am grateful for dinner with my sons last night

4/1
Today I am grateful that Americans are focused on the world's most important issues, like: who was the biggest jerk at the Oscars

4/2
Today I am grateful to see Daryl Hall and Todd Rundgren at the Auditorium Theatre last night - my first concert at that venue.

4/3
Today I am grateful for exciting basketball games.


Episode 699

Jean Lange on the History of JavaScript

Jean Lange details how JavaScript grew from a scripting language to the language of the web and how studying its history helped her to make better technology choices.


It is unusual for a singer to warm up for her own band; but that is what Natalie Schepman did Friday night at the Old Town School of Folk Music. Schepman is one-third of Joseph - a vocal group from Oregon consisting of her and her two sisters: twins Allison and Meegan Closner.

Natalie had a solo career prior to forming Joseph, so she performed songs from that era, accompanying herself on acoustic and electric guitar, before taking a short break and returning to the stage with her sisters.

Each of the three ladies took turns singing lead on their songs performed throughout their 90-minute set. Their voices are distinct from one another, but each of them possesses an impressive vocal range on their own and together their harmonies are amazing! The only accompaniment came from Natalie's guitar playing, but that was enough.

I learned when I arrived that the show was all-request: fans had been encouraged to suggest songs on the group's website, and the band performed only those suggestions. Although I was unfamiliar with Joseph before attending the concert, the crowd was not. Many came wearing shirts with the band's logo, and large numbers sang along to the songs. It was a fun atmosphere, and at least two couples were inspired enough to get engaged during the concert.

I arrived curious and I came away a fan of this trio.


In the universe of Robert Jordan's epic "The Wheel of Time" series, events repeat themselves with each turn of time's wheel - A turn that lasts thousands of years and spans many ages. As the Wheel turns, people are reborn to fulfill a destiny predetermined in a previous age.

One such person was The Dragon - a man who wielded the power of the universe to defeat the forces of darkness but was driven mad by that power.

It is now thousands of years later, and prophecies have foretold the coming of a new Dragon. Rand al'Thor, the protagonist of "The Eye of the World", has wielded this same power to temporarily defeat the villain Ba'alzamon and he now realizes that he is the Dragon Reborn. Rand is a shepherd from an isolated village who is unprepared for this responsibility and fearful of the madness that may consume him.

"The Great Hunt" is volume 2 of the tWoT series and takes us with Rand as he and his companions seek the Horn of Valere - a sacred artifact stolen by Darkfriends from those who follow the Light.

As in tEotW, we follow the travelers as they battle magic, demons, and other obstacles on the way to completing their quest. Along the way, we get a closer look into the characters that make up the series and the world they inhabit. They grow up a little, hone their skills a little, and evolve a little. The action comes swiftly, but the characters evolve slowly - presumably to make the arc last for over a dozen books in the series.

The pacing is better in this novel than in Book 1, which dragged at times. Topics introduced in the first novel are explored further in Book 2. We see rivalries between factions of the Aes Sedai - a cult of female warriors / sorceresses capable of channeling the universe's power. We see women betrayed and enslaved by other women; we see Perrin tapping into his power to communicate with wolves; and we see Mat bonding himself to more magical devices with tragic consequences.

Jordan continues to draw ideas from Tolkien (calling upon an army of ghosts to fight your battle is an idea lifted straight out of Lord of the Rings), but he includes more of his own ideas in this story.

This is a worthy sequel and an inspiration for me to continue to Book 3.


Episode 698

Godfrey Nolan on Drone SDKs

Godfrey Nolan has been building applications that access the power of drones. He explains the current and future practical uses of these applications and how to get started using SDKs from drone vendors.

https://riis.com
https://www.meetup.com/drone-software-meetup-group


So many of us try to do everything. When an opportunity presents itself, we take it. When someone asks us to do something, our first instinct is to say "yes".

Greg McKeown advises us against this. In his 2014 book "Essentialism: The Disciplined Pursuit of Less", he tells us to think before saying "yes" or taking on more than we can handle. We can simplify our lives by focusing our energy only on those things that bring the most rewards. By eliminating non-essential things from our lives, we can focus more on what we value most.

McKeown provides some practical advice on how to become more focused. He tells us to block off time for ourselves; to take time to consider what in our life is most important and what we can eliminate; to take care of our bodies by getting sufficient sleep; to recognize when things are going poorly and know when to step away; to build buffer time into our schedules; and to learn how to politely decline requests.

Essentialism seems like common sense, but we are all guilty of overextending ourselves at least some of the time. McKeown's advice is sound. When I sold my house and moved to a small apartment in Chicago 8 years ago, I was able to rid myself of a lot of physical clutter. But I am still burdened with mental clutter. I am among those who are easily distracted and stretched too thin. Prioritizing my life would add value to it; especially if I could drop those things at the bottom.


Episode 697

Russ Fustino on Blockchain

Algorand Technical Evangelist Russ Fustino teaches us about Blockchain, how it works, its components, its uses, and how you can get started developing for it.

https://developer.algorand.org


Can one be happy listening to sad songs all evening? John Paul White proved that one can when he performed Friday night at the Old Town School of Folk Music in Lincoln Square.

His career has spanned over a decade, and he has released solo albums, as well as those from The Civil Wars - his Grammy-winning collaboration with Joy Williams.

But on this night, he focused on his solo songs, which he performed accompanied only by himself on guitar. And, of course, he focused on his sad songs from "The Hurting Kind" to "The Once and Future Queen", and "My Dreams Have All Come True", a song about a man whose dream of a painful breakup turns into reality.

At one point, he reassured the audience: "In case you're wondering, I'm ok", explaining that he wrote his best sad songs when he was happy.

White's voice has great pitch and range, reminiscent of Roy Orbison and some of the best Country & Western singers. His songs challenge that voice and it rose to that challenge.

The opening act was also a delightful surprise. I was unfamiliar with Parker Millsap, but his excellent songwriting, voice, and guitar picking won me over.

White's show was appreciated by an audience familiar with his music. In addition to his own songs, he played a handful of covers, including Electric Light Orchestra's "Can't Get It Out of My Head", with which he closed the evening.

It was a sad evening. But it made us happy.


In a previous article, I showed how to ingest data inline, listing each individual row to ingest. For large datasets, this is not a practical way to initialize data in a table. Another option is to ingest from CSV data stored in an Azure Storage Blob.

The syntax is

.ingest into table tablename
(
     h'blob_url_and_sas_token'
)

where:

  • tablename is the name of the ADX table into which you want to ingest data
  • blob_url_and_sas_token is the URL of the blob, followed by the SAS token of the Azure Storage Account containing that blob. There are no spaces between the two strings.

Each row in the CSV file will create one row in the ADX table. The format of the CSV file must match the schema of the ADX table.
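One point worth noting: the ingest command can also accept an optional with clause of ingestion properties. For example, if your CSV file begins with a header row, you can tell ADX to skip that row rather than ingest it as data. The sketch below uses the same placeholders as above; format and ignoreFirstRecord are standard ADX ingestion property names:

```
.ingest into table tablename
(
     h'blob_url_and_sas_token'
)
with (format='csv', ignoreFirstRecord=true)
```

Without ignoreFirstRecord, a header row would be treated as a data row and would likely fail to match the table schema.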

Demo

For the demo below, we will use a table defined by the following ADX code:

.create table testdata(
     timeStamp: datetime,
     someNumber: long,
     someString: string,
     someJson: dynamic
     )

and we will create a blob containing the following text:

2022-02-08T21:25:04.811Z,1,foo1,{'foo':'bar1'}
2022-02-08T21:25:07.838Z,2,foo2,{'foo':'bar2'}
2022-02-08T21:25:10.912Z,3,foo3,{'foo':'bar3'}
2022-02-08T21:25:13.829Z,4,foo4,{'foo':'bar4'}
2022-02-08T21:25:15.415Z,5,foo5,{'foo':'bar5'}

Blob and Storage Account Configuration

In an Azure Storage Account, create a blob container and upload to the blob container a file containing the comma-delimited text above.

NOTE: For more information on working with Azure Storage Accounts and Blobs, see this article

After uploading the blob, you can open the Storage Browser blade by clicking the [Storage Browser] button (Fig. 1) in the left menu, then navigating to the container containing your import CSV file, as shown in Fig. 2.

storage browser button

Fig. 1

Blobs in Container

Fig. 2

Click your import file blob in the list to view the properties, as shown in Fig. 3.

Blob Properties

Fig. 3

Copy the URL of this file and save it for later.

Next, generate an SAS Token for the storage account. In the left menu, under the "Security + networking" section (Fig. 4), click the [Shared access signature] button (Fig. 5)

Security and Network menu

 

Fig. 4

SAS Button

Fig. 5

The SAS dialog displays, as shown in Fig. 6.

SAS blade

Fig. 6

Under "Allowed resource types", check the "Object" checkbox.

The valid start and end date times default to the current datetime through 8 hours from now. Adjust these if you want to access the blob beyond the end time or will not begin the import until significantly later.

Click the [Generate SAS and connection string] button to generate a SAS token. Generated data will display below the button, as shown in Fig. 7.

Generated SAS and connection string

Fig. 7

Copy the value in the SAS token field and save it for later.

Ingesting Data into ADX Table

Open the Azure Data Explorer interface, log in, and select the database containing your import table.

From here, execute the ingest command described above. For our sample, this is:

.ingest into table testdata
(
     h'https://dgteststorage.blob.core.windows.net/data-import/TestData.csv?sv=2020-08-04&ss=bfqt&srt=o&sp=rwdlacupitfx&se=2022-03-11T03:57:56Z&st=2022-03-10T19:57:56Z&spr=https&sig=x21QHp4nXaZp%2Bfc9h8cZD41grOOtd%2F73lY%2Fyk5hZsA4%3D'
)

You should now be able to query the table with its data via the following KQL command:

testdata

Troubleshooting

If data is not imported, error information can be retrieved via the following command:

.show ingestion failures 
| order by FailedOn

Look at the row matching the time of the import (most likely the most recent row). Common issues are an invalid or expired SAS token and data that does not match the table schema.
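On a busy cluster, the failure list can be long. Assuming the command's output includes a Table column (alongside the FailedOn column used above), you can narrow the results to the table you are ingesting; a sketch for our demo table:

```
.show ingestion failures
| where Table == 'testdata'
| order by FailedOn desc
| take 10
```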

Other Data Formats

ADX supports ingestion from file formats other than CSV. For example, Parquet, Avro, and JSON files can also be ingested. You can find a complete list here.


Episode 696

Matt Eland on How Humans Learn

Matt Eland is an instructor at Tech Elevator. He discusses the different ways that students learn and how to be a better teacher by understanding these ways.


February 2022 Gratitudes


2/7
Today I am grateful for those who trust me enough to tell me their troubles.

2/8
Today I am grateful to the two people who sent flattering emails about me yesterday.

2/9
Today I am grateful to receive a response to a message I wrote to a friend in prison last week.

2/10
Today I am grateful to deliver an internal presentation on Azure Data Explorer and to those who told me they enjoyed it.

2/11
Today I am grateful to see 10,000 Maniacs in concert last night.

2/13
Today I am grateful to see America in concert last night for the first time.

2/14
Today I am grateful that the Chicken Piccata I made for friends yesterday turned out well, even though it was the first time I made it.

2/15
Today I am grateful to catch up with William yesterday.

2/16
Today I am grateful for a conversation with Paul yesterday for the first time in a long time.

2/17
Today I am grateful to co-present on Blockchain with Russell Fustino at the ETHDenver conference yesterday

2/18
Today I am grateful that this year's NFL playoffs were so exciting.

2/19
Today I am grateful to those who reached out to me when they sensed I was feeling low.

2/20
Today I am grateful to see Billy Branch & The Sons of the Blues at Evanston SPACE last night.

2/21
Today I am grateful to see a production of "The Virginian" at the City Lit Theater yesterday.

2/22
Today I am grateful for my first bike ride of 2022.

2/23
Today I am grateful for the arrival of my home COVID tests

2/24
Today I am grateful:
-for an unexpected visit from Nick last night.
-to present at the Valley .NET User Groups last night

2/25
Today I am grateful that the back pain that plagued me for months is almost entirely gone.

2/26
Today I am grateful to see Enter the Haggis in concert last night.

2/27
Today I am grateful for Celtic music.

2/28
Today I am grateful to see the play "The Moors" at Red Orchid Theatre in Old Town yesterday.

3/1
Today I am grateful for:
- an enjoyable and informative conversation with Laurent last night
- another decade around the sun

3/2
Today I am grateful for a birthday dinner with family and friends last night.

3/3
Today I am grateful I have been able to maintain a consistent gym schedule this past month.

3/4
Today I am grateful to present at Code Camp Romania yesterday.

3/5
Today I am grateful to see John Paul White in concert last night in Lincoln Square.

3/6
Today I am grateful to Tim and Natale, who took me out for a birthday sushi dinner last night in Wicker Park.


Every band should include bagpipes. And two fiddlers. And a fife and a recorder.

It works for the Canadian Celtic Band Enter the Haggis, who delighted the crowd at City Winery Friday night.

For two hours, ETH played a mix of ballads and rock songs - but mostly of Celtic or Celtic-influenced arrangements. The band originated in Toronto, Ontario 25 years ago, but they made their name by writing and recording songs influenced by the music of Scotland and Ireland.

They delighted the audience with the melodic "Down With the Ship" and the rousing "One Last Drink" and closed their encore set with the upbeat "Shangri-La". 

I saw ETH years ago when they performed at the CodeMash conference in Sandusky, OH. After the show, I had a chance to speak with band members Craig Downie, Brian Buchanan, and Trevor Lewington. I was delighted to hear that each of them remembered CodeMash fondly.

Perhaps not every band would benefit from the addition of bagpipes and a fife; but they could all benefit from the enthusiasm shown by Enter the Haggis on a chilly winter night in Chicago.

More photos


In a previous article, I showed you how to create a user-defined ADX function that returns tabular data. In this article, I will show you how to create a user-defined ADX function that returns a single scalar value.

Setup

This article assumes that you have an Azure subscription, an ADX cluster, and an ADX database. See the previous articles in this series to learn how to create an ADX cluster and/or database.

For the examples in this article, we will use a table created with the following ADX commands:

.drop table customers

.create-merge table customers
(
FullName:string, 
LastOrderDate:datetime,
YtdSales:decimal,
YtdExpenses:decimal,
City:string,
PostalCode:string
)

.ingest inline into table customers <| 
'Bill Gates', datetime(2022-01-10 11:00:00), 1000000, 500000, 'Redmond', '98052'
'Steve Ballmer', datetime(2022-01-06 10:30:00), 150000, 50000, 'Los Angeles', '90305'
'Satya Nadella', datetime(2022-01-09 17:25:00), 100000, 50000, 'Redmond', '98052'
'Steve Jobs', datetime(2022-01-04 13:00:00), 100000, 60000, 'Cupertino', '95014'
'Larry Ellison', datetime(2022-01-04 13:00:00), 90000, 80000, 'Redwood Shores', '94065'
'Jeff Bezos', datetime(2022-01-05 08:00:00), 750000, 650000, 'Seattle', '98109'
'Tim Cook', datetime(2022-01-02 09:00:00), 40000, 10000, 'Cupertino', '95014'
'Steve Wozniak', datetime(2022-01-04 11:30:00), 81000, 55000, 'Cupertino', '95014'
'Scott Guthrie', datetime(2022-01-11 14:00:00), 2000000, 1000000, 'Redmond', '98052'
'David Giard', datetime(2022-01-02 09:01:00), 1.50, 1, 'Chicago', '60605'

Syntax

Use the .create-or-alter function command to create a new function or modify one that already exists. The syntax is:

.create-or-alter function with (docstring = description, folder = folder_name) name_of_function ( parameter_list ) { KQL_Script }

where:

  • description is a brief description of the function. This is optional, but it is useful to help others understand the purpose of your function.
  • folder_name is a logical folder in which to store the function. This is optional, but it can help to organize your functions if you have many of them.
  • parameter_list is a list of input parameters to the function.
  • KQL_Script is the KQL code to execute when the function is called.

Parameters

Parameters are passed to a function as a comma-separated list of name/data type pairs, for example:

startDateTime:datetime,
endDateTime:datetime,

You can make a parameter optional by adding a default value, as shown below:

timeBinLength:timespan = 1h

This creates an optional param named "timeBinLength" of type timespan. If this parameter is not passed to the function, it will default to 1 hour.

Sample

Here is an example of an ADX function that accepts revenue and expenses as input parameters and calculates the profit from these values:

.create-or-alter function
with (docstring = 'Calculates profit from revenue and expenses', folder='Samples')
Profit(
revenue:decimal,
expenses:decimal
)
{
revenue - expenses
}

Calling the Function

Calling your function requires only the function name, followed by parameters in parentheses.

The code below calls the Profit function for each row of the customers table, using the extend operator to add a calculated profit column.

customers
| extend profit = Profit(YtdSales, YtdExpenses)

The results of that call are in Fig. 1.

Results of calling Profit function

Fig. 1
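Because Profit returns a scalar value, it does not have to be called from within a tabular query. As a quick sanity check, you can also invoke it with the print operator, passing literal decimal values:

```
print profit = Profit(decimal(100000), decimal(60000))
```

This returns a single row containing the computed profit.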

Conclusion

In this article, you learned how to create and call a User-Defined Scalar Function in Azure Data Explorer. See this article for information on creating a user-defined ADX function that returns a data table.


Kusto Query Language (KQL) ships with dozens of functions that you can call from within your queries. However, you may also write your own functions and call those.

Setup

This article assumes that you have an Azure subscription, an ADX cluster, and an ADX database. See the previous articles in this series to learn how to create an ADX cluster and/or database.

For the examples in this article, we will use a table created with the following ADX commands:

.drop table droneData

.create-merge table droneData
(
DroneId:int,
TimeStamp:datetime,
Longitude:decimal,
Latitude:decimal,
BatteryLife:decimal
)

.ingest inline into table droneData <|
1, datetime(2022-01-10 10:45:00), -73.988888888888888, 40.742222222222222, 100.0
1, datetime(2022-01-10 11:00:00), -73.988888844444444, 40.742222215555222, 100.0
1, datetime(2022-01-10 11:15:00), -73.978820800000000, 40.741862453111111, 99.9
1, datetime(2022-01-10 11:30:00), -73.978800000000000, 40.741862453112222, 99.8
1, datetime(2022-01-10 12:45:00), -73.970000000000000, 40.741862453122222, 99.7
1, datetime(2022-01-10 12:00:00), -73.960555555555555, 40.741862453122222, 99.6
1, datetime(2022-01-10 12:15:00), -73.960000000000000, 40.741862453122222, 99.5
1, datetime(2022-01-10 12:30:00), -73.960000088888888, 40.741862453122222, 99.4
1, datetime(2022-01-10 12:45:00), -73.960000055555555, 40.741862453122222, 99.4
1, datetime(2022-01-10 13:00:00), -73.960000011111111, 40.741862453122222, 99.3
1, datetime(2022-01-10 13:15:00), -73.955555555555555, 40.741862453122222, 99.2
1, datetime(2022-01-10 13:30:00), -73.950000000555555, 40.741862453122222, 99.2
1, datetime(2022-01-10 13:45:00), -73.960000000000000, 40.741862453122222, 99.0
1, datetime(2022-01-10 14:00:00), -73.960555555555555, 40.741862453122222, 98.9
1, datetime(2022-01-10 14:15:00), -73.960555555555555, 40.741862453122222, 98.8

Syntax

Use the .create-or-alter function command to create a new function or modify one that already exists. The syntax is:

.create-or-alter function with (docstring = description, folder = folder_name) name_of_function ( parameter_list ) { KQL_Script }

where:

  • description is a brief description of the function. This is optional, but it is useful to help others understand the purpose of your function.
  • folder_name is a logical folder in which to store the function. This is optional, but it can help to organize your functions if you have many of them.
  • parameter_list is a list of input parameters to the function.
  • KQL_Script is the KQL code to execute when the function is called.

Parameters

Parameters are passed to a function as a comma-separated list of name/data type pairs, for example:

startDateTime:datetime,
endDateTime:datetime,

You can make a parameter optional by adding a default value, as shown below:

timeBinLength:timespan = 1h

This creates an optional param named "timeBinLength" of type timespan. If this parameter is not passed to the function, it will default to 1 hour.

Sample

Here is an example of an ADX function that accepts as input parameters a start and end time and (optionally) a timespan that defines a bin size. It returns a dataset of drone locations between the start and end time, but only returns one row per timespan defined by the timeBinLength parameter.

.create-or-alter function
with (docstring = 'Points drone passed through rolled up by time period', folder='Samples')
DroneRoute(
startDateTime:datetime,
endDateTime:datetime,
timeBinLength:timespan = 1h
)
{
droneData
| where TimeStamp between (startDateTime .. endDateTime)
| summarize arg_max(TimeStamp, Longitude, Latitude) by bin(TimeStamp, timeBinLength)
| order by TimeStamp asc 
| project TimeStamp, Longitude, Latitude
}

Calling the Function

Calling your function requires only the function name, followed by parameters in parentheses.

The code below calls the DroneRoute function, setting the timespan to 30 minutes.

DroneRoute('2022-01-10 11:00:00', '2022-01-10 13:00:00', 30m)

The results of that call are in Fig. 1.

Results of calling droneRoute function

Fig. 1

The code below also calls the DroneRoute function, but omits the optional timeBinLength parameter, so the default of 1 hour is used.

DroneRoute('2022-01-10 11:00:00', '2022-01-10 13:00:00')

The results of that call are in Fig. 2.

Results of calling droneRoute function and not passing optional parameter

Fig. 2

Conclusion

In this article, you learned how to create and call a User-Defined Tabular Data Function in Azure Data Explorer.


Episode 695

Chris Judd on Technical Skills That Are Most In Demand

We are currently in a good job market for those with technical skills. But some skills are more in demand than others. Chris Judd is CTO and Partner at Manifest Solutions, which puts him in an ideal situation to see which skills customers are seeking.


Harmonica player Billy Branch is a connection to the great Chicago blues artists of the past. He has played and recorded with Chicago legends Willie Dixon, Lou Rawls, Koko Taylor, Syl Johnson, and Junior Wells, as well as a host of others.

He founded and led The Sons of the Blues almost 50 years ago and brought the current incarnation of that group to Evanston's SPACE nightclub Saturday night. The venue was completely full, including some who watched while standing in the back.

Branch brought with him many friends - some of whom were descendants of great bluesmen like Willie Dixon and Little Walter. Little Walter had a special place in tonight's concert. Branch and the S.O.B.s recently released an album of Little Walter songs.
https://amzn.to/3h2mXQZ

Like Walter, Branch sings and plays the harmonica, and he delighted the crowd in doing so on this night. He drew heavily from this album, including "Nobody But You", "Mellow Down Easy", and "Juke".

The Sons' current lineup featured top-notch musicians Andrew "Blaze" Thomas (drums), Marvin Little (bass), Giles Corey (guitar), and Sumito Ariyoshi (keyboards), along with Branch. Little occasionally stepped to the front to engage the audience with humorous banter, but it was Ariyoshi's playing that really shone - especially on his solos. The sweetness of his melodies contrasted with the blistering runs of Branch's harmonica.

At 70 years young, Branch lacks the stamina of his youth. When he told a story, he joked that he did so for a chance to catch his breath; he also stepped away from the stage to allow his younger bandmates to play a couple of their own songs; and the performance lasted only about 100 minutes. But it was enough to remind us how much we love the blues and how much we appreciate those keeping alive the legacy of blues in Chicago.



GCast 121:

Analyzing Geographic Data with KQL

Kusto Query Language (KQL) is ideal for analyzing geographic location data stored in Azure Data Explorer (ADX). This video shows how to use some of the features of KQL to implement this analysis.


If you find that you often run the same aggregation query against ADX data, it may be useful to create a Materialized View. A Materialized View performs the aggregation in advance, as data is added to the table. We can then query the Materialized View, rather than the table, eliminating the need for our query to perform the aggregation.
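Conceptually, a Materialized View trades a little work at ingest time for cheap reads later. Here is a toy Python sketch of that idea (an illustration of the concept only, not how ADX implements it):

```python
# Toy illustration: a "materialized" aggregate is updated as each row
# arrives, so reading it never has to re-scan the underlying table.
class MaterializedMax:
    def __init__(self):
        self.max_by_key = {}  # the precomputed aggregate, e.g. keyed by PostalCode

    def ingest(self, key, value):
        # Maintain the aggregate at insert time (the small "performance hit" on ingest)
        cur = self.max_by_key.get(key)
        if cur is None or value > cur:
            self.max_by_key[key] = value

    def query(self, key):
        # Reading is a cheap lookup; no aggregation happens at query time
        return self.max_by_key.get(key)

view = MaterializedMax()
view.ingest("98052", 1000000)
view.ingest("98052", 100000)
view.ingest("60605", 1.50)
print(view.query("98052"))
```

The same trade-off applies to the ADX feature: inserts pay the aggregation cost incrementally so that queries against the view do not.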

Setup

For the examples in this article, we will use a table created with the following ADX commands:

.drop table customers

.create table customers
(
FullName:string,
LastOrderDate:datetime,
YtdSales:decimal,
City:string,
PostalCode:string 
)

.ingest inline into table customers <| 
'Bill Gates', datetime(2022-01-10 11:00:00), 1000000, 'Redmond', '98052'
'Steve Ballmer', datetime(2022-01-06 10:30:00), 150000, 'Los Angeles', '90305'
'Satya Nadella', datetime(2022-01-09 17:25:00), 100000, 'Redmond', '98052'
'Steve Jobs', datetime(2022-01-04 13:00:00), 100000, 'Cupertino', '95014'
'Larry Ellison', datetime(2022-01-04 13:00:00), 90000, 'Redwood Shores', '94065'
'Jeff Bezos', datetime(2022-01-05 08:00:00), 750000, 'Seattle', '98109'
'David Giard', datetime(2022-01-02 09:01:00), 1.50, 'Chicago', '60605'
See this article for information on managing tables with ADX commands.

You can run the examples in this article in either the ADX Data Explorer web page or in Kusto.Explorer - a rich client Windows application that you can download for free from here.

Creating a Materialized View

Here is some of the syntax for creating a Materialized View, using the features that we found most useful:

.create async materialized-view
with (backfill=backfill_status, docString='description' )
materialized_view_name
on table source {
aggregation_query
}

where:

  • backfill_status is true if you want to calculate the aggregation for all existing rows in the table. For large tables, this can take a long time. Set this to false if you only want to aggregate data inserted after the view is created.
  • description is a brief description of the view, making it easier for others to identify its purpose.
  • materialized_view_name is the name of the view to create.
  • source is the name of the source table (or another Materialized View).
  • aggregation_query is a KQL query that returns a dataset. This query must include aggregated data, such as avg, min, or max.

You must add the async keyword if you set backfill=true.

You can find the full syntax here.

Here is an example:

.create async materialized-view 
with (backfill=true, docString='Summary customer sales' ) 
myMaterializedView
on table customers { 
customers
| summarize numCustomers=count(), minSales=min(YtdSales), maxSales=max(YtdSales), avgSales=avg(YtdSales) by PostalCode
}
Once we have this materialized view, we can query it as we would query a table, as in the following example:

myMaterializedView
| where avgSales > 100000

The results of this query are shown in Fig. 1.

Fig. 1

Altering a Materialized View

Use the .alter materialized-view command to modify an existing Materialized View. The syntax is nearly identical to that of the .create materialized-view command.

For example, the following command will remove the minSales aggregated column in the view I created above.

.alter materialized-view
myMaterializedView
on table customers { 
customers
| summarize numCustomers=count(), maxSales=max(YtdSales), avgSales=avg(YtdSales) by PostalCode
}

Removing a Materialized View

You can remove a materialized view with the .drop materialized-view command, as in the following example:

.drop materialized-view myMaterializedView

Conclusion

Although there is an initial performance hit when rows are inserted, using a Materialized View can speed up your queries considerably.


Happy birthday to the .NET Framework, which turned 20 years old earlier this month.

My first introduction to .NET was a rocky one. I remember hearing about it in 2001, shortly before its release, and learning about it in 2002 shortly after. I was working as a consultant at a Microsoft partner in Cincinnati that staffed me at the local gas and electric utility company, where I spent my days writing Oracle stored procedures.

One Saturday, my company put on a full day of training to introduce everyone to .NET, but I found the training confusing and got little out of it. It was delivered remotely at a time when remote online training was suboptimal, and it was delivered by people with little training experience. To further complicate things, Microsoft had not yet released an IDE for the product, so we were typing into a command prompt and wondering about the results.

A few months later, we secured one .NET project, but I was needed at the utility, so I watched from afar with a bit of jealousy for my colleagues who were learning something new.
 
My first real project came after I rolled off the Oracle project and was assigned to work on an ERP system a customer was building entirely in .NET. The customer was 160 miles away, and their slow Internet connection made it very difficult to learn a new technology while working effectively on a project. Refreshing and building my source code each morning took over an hour; builds were long and often timed out; and it was difficult to tell whether an error was caused by my code or by a network error. It was not uncommon for a developer in the other city to break an interface by checking in only some of their code. The client grew frustrated with my slow performance and removed me from the project.

I asked some of my colleagues to suggest a good book to help me learn .NET and several of them recommended Jeffrey Richter's "Applied Microsoft® .NET Framework Programming". This turned out to be the wrong choice for a novice. In his book, Richter spent far more time explaining the inner workings of the Intermediate Language and Garbage collection than he did on building a web form or connecting to a database - the more practical skills I was seeking. This was a book for someone already familiar with the basics - not for a beginner like me.

Resolved to learn the alchemy that was .NET, I bought a domain name and began working on a personal website devoted to Michigan State University athletics. It was a good project because I was an MSU fan then and now, and it was a great opportunity to learn and apply ASP.NET and to create a site with templates and dynamic content. Soon, I had a site that automatically refreshed when I updated a database. I enhanced the site's features and content for 10 years until the domain name expired (I made the mistake of registering it with a work e-mail address: a rookie error).

This is how I learned .NET, and it was not long before I was passing certification exams, working on real-world projects, and successfully helping customers. Within a year, I was teaching classes on building .NET applications. In 2011, I even co-authored a book on the topic.

Fast forward 20 years. .NET is still around. Its latest incarnation can run on Linux and Mac as well as its original Windows home. I have worked on dozens (maybe hundreds) of projects using this platform. I have built a solid career around it. I don't know how long it would have taken me to get started if I had waited for a good project or good training from my employer, instead of building something on my own to force myself to learn.

Here are the lessons I learned from that experience:

- Learn to recognize which technologies have a chance at making an impact. To me, it was clear early on that .NET was the future of software development on Windows. Sometimes, it is not as obvious. Listen to others, read what is written in journals and influential blogs, and see who is adopting a new technology.
- Do not be discouraged if your first attempts are unsuccessful. I was able to rebound from a frustrating beginning.
- When something new comes along that you think will be relevant for years to come, find a way to learn it. Don't wait for an opportunity - create one!
- Most important: Maintain a passion for learning! Continual learning is the best and worst part of this career I have chosen.


Episode 694

Brent Ozar on What Every Developer Needs to Know About Databases

Database consultant Brent Ozar reveals the 5 things that every software developer needs to know about databases.


"The wheel weaves as the wheel wills"

This phrase, spoken frequently in Robert Jordan's "The Eye of the World", reflects the inevitability of destiny. But it also refers to the cyclical nature of time in the world of the novel. Every few thousand years, the world is destroyed and reborn, and the cycle begins again, with each turn of the wheel producing seven ages and with much of history predetermined. A new cycle is like the previous one, but the people have some opportunity to change history - particularly the heroes and villains.

TEOTW begins in a small, isolated village that is invaded by Trollocs - giant creatures that are part man and part beast, led by the eyeless, sadistic Myrddraal. The monsters came seeking The Dragon, who was prophesied to attain great power. Moiraine Damodred - a mystical woman of the mysterious and powerful Aes Sedai - and her companion al'Lan Mandragoran foresaw the attack and identified three young villagers - Rand al'Thor, Matrim Cauthon, and Perrin Aybara - as possibly being the Dragon Reborn. She leads the trio out of the village on a quest, joined by their friend Egwene al'Vere and Thom Merrilin, a minstrel known as a "gleeman". The group sets out across the world, visiting towns and inns, abandoned cities and lively ones. Along the way, they are pursued and confronted by the dark forces that want to destroy them or seduce them to their side.

This is an adventure story, a coming-of-age story, and a story of good vs. evil. The forces of good are referred to as "the Light", which could easily be another name for God; and the malevolent forces are known as "the Darkness" and clearly represent Satan. The universe of this novel explores the conflict between predestination and free will. It is never clear how much control the actors have over their own lives or over the fate of the world.

The book also explores gender roles. The One Power of the universe can be wielded by the forces of Light or Darkness, but it is divided into two parts - one for men and another for women. The Aes Sedai consists entirely of women, although that has not always been the case.

It is a good story with many interesting characters and a decent arc. The opening scenes set the stage well for the chaos our heroes will soon face; and the climax at the end is a cataclysmic battle between the Dark and Light forces, executed masterfully. But the middle of the story sometimes drags. Repeatedly, the characters travel to a new town, are found and attacked by servants of the Darkness, then escape to the next town, where something similar happens. The pace is also slowed by Jordan's penchant for providing minute details of every place the group visits.

Jordan was definitely inspired by J.R.R. Tolkien's classic "The Lord of the Rings", as some of the plot elements and characters seem to be "borrowed" from that trilogy. But Jordan eventually takes the story in a new direction and makes it his own.


I owned a vinyl copy of the album "History: America's Greatest Hits" when I was a teenager, and I played it until it wore out and I had to buy another. America's music stayed with me throughout the years - their catchy melodies and their tight harmonies resonated with my young musical tastes. But I never saw them in concert until Friday night in Waukegan, when they played to a full house at the Genesee Theatre.

America was formed in 1970 by the trio of Dewey Bunnell, Dan Peek, and Gerry Beckley - the sons of US Air Force personnel serving in London. Peek is now gone (he passed away in 2011), but Bunnell and Beckley continue the tradition, performing over 100 shows a year for the past 50 years. This is an impressive feat for anyone, but particularly for artists now turning 70.

The years have not diminished their vocal prowess nor their enthusiasm for performing. The duo delighted the crowd, playing their hits and mixing in a few deep tracks from earlier albums. They engaged the audience and the audience appreciated it.

The evening was improved by having the Buckinghams as a warmup act. This Chicago act quickly rose to fame in the late 1960s and just as quickly faded from the charts. On this night, they played their string of hits, such as “Hey Baby (They're Playing Our Song)”, “Don’t You Care”, and their #1 single “Kind of a Drag”.

America opened with "Tin Man" - a big hit from 1974 - which immediately fired up the audience. It seemed a foregone conclusion that they would close with their biggest hit, "A Horse with No Name". In between were the other classics, such as "Sister Golden Hair", "Lonely People", and "Ventura Highway" (my personal favourite). They even included a couple of covers: The Mamas and the Papas' "California Dreamin'" and The Beatles' "Eleanor Rigby", which they announced as a tribute to George Martin, who produced both The Beatles and America.

One came away with the feeling that the remaining duo still very much enjoys playing their music for an appreciative audience. I know I enjoyed it when they did.

Photos


GCast 120:

Analyzing Time Series Data with KQL

Kusto Query Language (KQL) is ideal for analyzing time series data stored in Azure Data Explorer (ADX). This video shows how to use some of the features to implement this analysis.


Geo Functions in KQL


Kusto Query Language (KQL) contains many built-in functions to work with Geographic data. In this article, I will describe some of those that I found most useful.

Setup

For the examples in this article, we will use a table created with the following ADX commands:

.drop table vehicleLocations

.create-merge table vehicleLocations
(
VehicleId:int,
TimeStamp:datetime,
Longitude:decimal,
Latitude:decimal
)

.ingest inline into table vehicleLocations <| 
1, datetime(2022-01-10 10:45:00), -73.99446487426758, 40.73857555787898
1, datetime(2022-01-10 11:00:00), -73.97476673126219, 40.73857555787898
1, datetime(2022-01-10 11:15:00), -73.97476673126219, 40.74091672247485
1, datetime(2022-01-10 11:30:00), -73.99446487426758, 40.74091672247485
2, datetime(2022-01-10 11:01:00), -73.92, 40.72
2, datetime(2022-01-10 11:02:00), -73.99, 40.61
2, datetime(2022-01-10 11:03:00), -73.90, 40.55
2, datetime(2022-01-10 11:04:00), -73.85, 40.49
2, datetime(2022-01-10 11:04:00), -73.79, 40.48

Useful Functions

geo_distance_2points()

This function calculates the distance in meters between 2 points, given the latitude and longitude of each. The syntax is:

geo_distance_2points(p1_longitude, p1_latitude, p2_longitude, p2_latitude)

where:

  • p1_longitude, p1_latitude are the longitude and latitude of the first point
  • p2_longitude, p2_latitude are the longitude and latitude of the second point

The distance is measured along the shortest path between the two points, so it may or may not match the path actually taken to move from one point to the other. For example, you may be tracking a car's movements, and that car is likely to be restricted to driving on roads rather than through buildings.

Here is a sample call:

vehicleLocations
| where VehicleId == 1
| order by VehicleId asc, TimeStamp asc
| project  VehicleId, TimeStamp, Longitude, Latitude,
distance=geo_distance_2points(Longitude, Latitude, prev(Longitude), prev(Latitude))

The query above yields the results shown in Fig. 1

Results of geo_distance_2points function
Fig. 1

geo_point_in_polygon()

This function returns true if a given point is inside a given polygon; otherwise, it returns false.

The syntax is:

geo_point_in_polygon(longitude, latitude, polygon)

where:

  • longitude is the longitude of the point to test
  • latitude is the latitude of the point to test
  • polygon is the polygon in question

There are two ways to create a polygon.

One is to assign an array of longitude/latitude pairs to an object of type polygon. The other is to use the pack function and pass in an object containing an array of longitude/latitude pairs.

Below are examples of each type:

let polygon = 
```
{
"type": "Polygon",
"coordinates": 
[[
[-73.96219253540039, 40.782816128657224],
[-73.96682739257812, 40.77631678827737],
[-73.96176338195801, 40.77202687527417],
[-73.95579814910889, 40.77053184050704],
[-73.95034790039062, 40.77254687948199],
[-73.94682884216309, 40.77732422768091],
[-73.94918918609619, 40.78125634496216],
[-73.96047592163086, 40.78470081841747],
[-73.96219253540039, 40.782816128657224]
]]
}
```;
let p1Lon = -73.99892807006836;
let p1Lat = 40.72924259684576;
let p2Lon = -73.98395061492919;
let p2Lat = 40.72924259684576;
let p3Lon = -73.98395061492919;
let p3Lat = 40.74976037842817;
let p4Lon = -73.99892807006836;
let p4Lat = 40.74976037842817;
let p1 = pack_array(p1Lon, p1Lat);
let p2 = pack_array(p2Lon, p2Lat);
let p3 = pack_array(p3Lon, p3Lat);
let p4 = pack_array(p4Lon, p4Lat);
let polygon = pack("type","Polygon","coordinates", pack_array(pack_array(p1, p2, p3, p4, p1)));

The polygons above each define an area in New York City, as shown in Fig. 2

Polygon shown on map of New York City
Fig. 2

Here is a sample call to geo_point_in_polygon, using either of the polygon objects created above:

vehicleLocations
| where geo_point_in_polygon(Longitude, Latitude, polygon)
| order by VehicleId asc, TimeStamp asc

The query above will return only those rows that are inside the polygon. The results are shown in Fig. 3.

Results of geo_point_in_polygon function
Fig. 3

geo_point_to_s2cell()

The S2 system divides the Earth into rectangles of approximately equal size. The rectangles take into account the curvature of the Earth's globe, eliminating gaps and distortions caused by models that attempt to flatten the Earth's surface. The size of each rectangle is dependent upon the level (0 through 30). Lower level values divide the earth into larger rectangles, so it takes fewer rectangles to cover the globe. For a given S2 level, if two points share the same S2 value, they exist in the same rectangle.

You can read more about S2 at s2geometry.io
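To get a feel for how level controls granularity: the S2 scheme starts from 6 face cells at level 0, and each cell subdivides into 4 children at the next level, so the cell count grows 4x per level while cell size shrinks. A quick Python sketch of that relationship:

```python
# Cell count per S2 level: 6 face cells at level 0, each cell splitting
# into 4 children per level (per the S2 geometry scheme described above).
def s2_cell_count(level):
    return 6 * 4 ** level

print(s2_cell_count(0))   # the 6 top-level face cells covering the globe
print(s2_cell_count(12))  # a level commonly fine enough for city-scale filtering
```

This is why a low level is cheap but coarse, and a high level is precise but produces many more cells.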

The syntax of geo_point_to_s2cell() is:

geo_point_to_s2cell(longitude, latitude, level)

where:

  • longitude and latitude represent a point on the globe
  • level is the level that defines the S2 rectangle in which to place this point

The following example calculates the S2 cell value for every data point of Vehicle 2:

vehicleLocations
| extend s2_13 = geo_point_to_s2cell(Longitude, Latitude, 13)
| where VehicleId == 2 

The query above yields the results shown in Fig. 4

Results of geo_point_to_s2cell function
Fig. 4

geo_polygon_to_s2cells()

This function returns an array of s2 cell values that cover a polygon. The syntax is:

geo_polygon_to_s2cells(polygon, S2_level)

where:

  • polygon is a polygon, as described above
  • S2_level is the level (0-30) defining the size of the S2 rectangles, as described above.

For example, the following code will return an array of level-15 S2 cells that completely covers a polygon named myPolygon. Every S2 cell in the returned array overlaps myPolygon.

geo_polygon_to_s2cells(myPolygon, 15)

You can then use this array to further refine searches, as in the example below:

let p1Lon = -73.99892807006836;
let p1Lat = 40.72924259684576;
let p2Lon = -73.98395061492919;
let p2Lat = 40.72924259684576;
let p3Lon = -73.98395061492919;
let p3Lat = 40.74976037842817;
let p4Lon = -73.99892807006836;
let p4Lat = 40.74976037842817;
let p1 = pack_array(p1Lon, p1Lat);
let p2 = pack_array(p2Lon, p2Lat);
let p3 = pack_array(p3Lon, p3Lat);
let p4 = pack_array(p4Lon, p4Lat);
let polygon = pack("type","Polygon","coordinates", pack_array(pack_array(p1, p2, p3, p4, p1)));
let s2Array = geo_polygon_to_s2cells(polygon, 12);
let devicesInS2Boxes = 
    vehicleLocations
    | extend s2_12 = geo_point_to_s2cell(Longitude, Latitude, 12) 
    | where s2_12 has_any (s2Array);
devicesInS2Boxes
| where geo_point_in_polygon(Longitude, Latitude, polygon)

This code returns the same results as in Fig. 3 above; however, it can be much faster if our table contains many points all over the world, far from the polygon. By first finding the overlapping S2 cells, we speed up our query by considering only points near the polygon. It will run faster still if we pre-compute and store each row's S2 cell value when the data is ingested.
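The pattern at work here is a coarse-then-exact filter. The following Python sketch illustrates it with a simple lat/lon grid standing in for S2 cells (1-degree cells are an assumption chosen for the illustration):

```python
# Coarse-then-exact filtering: a cheap cell lookup discards far-away points
# before any expensive exact test (here, the grid cell plays the role of an
# S2 cell; in ADX the exact test would be geo_point_in_polygon).
def cell(lon, lat, size=1.0):
    return (int(lon // size), int(lat // size))

points = [(-73.98, 40.74), (-73.97, 40.74), (2.35, 48.85)]  # last point is far away
region_cells = {cell(-73.99, 40.73), cell(-73.98, 40.74)}   # cells covering the region

# Stage 1: keep only points whose cell overlaps the region's cells
candidates = [p for p in points if cell(*p) in region_cells]

# Stage 2 would run the exact point-in-polygon test on candidates only
print(candidates)
```

The far-away point is eliminated by a dictionary lookup, never reaching the expensive geometric test - the same saving the s2Array prefilter provides in the KQL query above.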

Conclusion

In this article, I described the Geo features of KQL that we found most useful. You can view more KQL Geo functions here.


The Kusto Query Language (KQL) is ideal for analyzing time series data stored in Azure Data Explorer (ADX).

Setup

For the examples in this article, we will use a table created with the following ADX commands:

.drop table droneData

.create-merge table droneData
(
DroneId:int,
TimeStamp:datetime,
Longitude:decimal,
Latitude:decimal,
BatteryLife:decimal 
)

.ingest inline into table droneData <| 
1, datetime(2022-01-10 10:45:00), -73.988888888888888, 40.742222222222222, 100.0
1, datetime(2022-01-10 11:15:00), -73.978820800000000, 40.741862453111111, 99.9
1, datetime(2022-01-10 11:30:00), -73.978800000000000, 40.741862453112222, 99.8
1, datetime(2022-01-10 11:45:00), -73.970000000000000, 40.741862453122222, 99.7
1, datetime(2022-01-10 12:00:00), -73.960555555555555, 40.741862453122222, 99.6
1, datetime(2022-01-10 12:15:00), -73.960000000000000, 40.741862453122222, 99.5
1, datetime(2022-01-10 12:30:00), -73.960000088888888, 40.741862453122222, 99.4
1, datetime(2022-01-10 12:45:00), -73.960000055555555, 40.741862453122222, 99.4
1, datetime(2022-01-10 13:00:00), -73.960000011111111, 40.741862453122222, 99.3
1, datetime(2022-01-10 13:15:00), -73.955555555555555, 40.741862453122222, 99.2
1, datetime(2022-01-10 13:30:00), -73.950000000555555, 40.741862453122222, 99.2
1, datetime(2022-01-10 13:45:00), -73.960000000000000, 40.741862453122222, 99.0
1, datetime(2022-01-10 14:00:00), -73.960555555555555, 40.741862453122222, 98.9
1, datetime(2022-01-10 14:15:00), -73.960555555555555, 40.741862453122222, 98.8

Ordering

To use the time series functionality, it is important to

  • Have a column storing a datetime value
  • Sort your data by that datetime column

Getting the Previous value

After you have sorted the data, KQL provides the prev function, which allows you to retrieve the value of any column from the previous row in the sorted order. You can only use this function if your data has been sorted using the order by clause.

The syntax is

prev(column)

where column is the name of the column from which to retrieve the previous row's value.

The following example retrieves the battery life from the previous row, then calculates the delta between the current row and the previous row.

droneData
| order by TimeStamp asc
| project DroneId, TimeStamp, Longitude, Latitude, BatteryLife, PrevBatteryLife = prev(BatteryLife) 
| extend BatteryLifeChange = BatteryLife - PrevBatteryLife

The results of this query are shown in Fig. 1

Results of prev function
Fig. 1

By default, the prev() function returns the value one row prior to the current row. However, you can go back any number of rows by providing an optional offset argument. The syntax is:

prev(column, offset)

By default, the prev function returns null if there is no previous value (for example, for the first row in the dataset). However, you can provide a different default for these cases with the optional default_value parameter. The syntax is:

prev(column, offset, default_value)

As you may have guessed, there is also a next function that works exactly the same way, except that it returns a value from the next row in the series, rather than the previous one.
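The semantics of prev() over a sorted series can be sketched in a few lines of Python (an illustration only, with KQL's null modeled as None):

```python
# Model prev(column, offset, default_value) over a sorted list of values:
# look back `offset` rows, or return `default` when no such row exists.
def prev(rows, i, offset=1, default=None):
    return rows[i - offset] if i - offset >= 0 else default

battery = [100.0, 99.9, 99.8, 99.7]

# Delta vs. previous row, defaulting the first row's "previous" to itself
deltas = [b - prev(battery, i, default=b) for i, b in enumerate(battery)]
print(deltas)
```

next() is the mirror image: look forward instead of back, with the same offset and default behavior.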

Summarizing Data Into Bins

KQL provides the bin function to use when aggregating data. Typically, when you aggregate data, you use the by clause to group by a field or fields in the table. The bin() function allows you to group time series data by time increments. For example, if you have data points every 15 minutes, you can return aggregated results for each 1-hour interval. The syntax is:

bin(value,roundTo)

where:

  • value is a column containing datetime values
  • roundTo is a timespan indicating how far apart each grouping should occur

An example will help. The following query returns one row for each 1-hour interval, even though our sample data contains values every 15 minutes.

droneData
| summarize arg_max(TimeStamp, DroneId, Longitude, Latitude, BatteryLife) by bin(TimeStamp, 1h)

The results of this query are shown in Fig. 2

Results of aggregating by bin
Fig. 2

If you have multiple rows within your specified interval, it is reasonable to ask which row's values will be returned. The arg_max operator in the example above takes care of this. It tells Kusto to return the row with the maximum TimeStamp value in that interval. The first argument in arg_max specifies which column to consider when determining the maximum and the other arg_max arguments determine what other column values to return.
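The combination of bin and arg_max can be sketched in Python: bucket each row into its hour, then keep the row with the latest timestamp in each bucket (an illustration of the semantics, not ADX internals):

```python
from datetime import datetime

# Mimic: summarize arg_max(TimeStamp, BatteryLife) by bin(TimeStamp, 1h)
def arg_max_by_hour(rows):  # rows: list of (timestamp, battery_life)
    buckets = {}
    for ts, battery in rows:
        hour = ts.replace(minute=0, second=0, microsecond=0)  # bin(TimeStamp, 1h)
        # arg_max: within each bin, keep the row with the maximum TimeStamp;
        # that row's other column values come along with it
        if hour not in buckets or ts > buckets[hour][0]:
            buckets[hour] = (ts, battery)
    return dict(sorted(buckets.items()))

rows = [
    (datetime(2022, 1, 10, 12, 0), 99.6),
    (datetime(2022, 1, 10, 12, 15), 99.5),
    (datetime(2022, 1, 10, 12, 45), 99.4),
    (datetime(2022, 1, 10, 13, 0), 99.3),
]
result = arg_max_by_hour(rows)
print(result)
```

The three 12:xx readings collapse into a single row carrying the 12:45 values, just as the KQL query returns one row per 1-hour bin.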

The bin function can also be used to group numeric data - for example, grouping values into buckets of 100.

Conclusion

There are many other KQL features to help you work with Time Series data, but this article covered the ones that my team has found most useful.

