Nowadays, companies use different systems for different tasks that are usually performed by different departments. Although each of these separate systems solves a specific need, they usually act as independent actors in a bigger scenario. These independent actors manage information about the company that can potentially be duplicated by other independent actors.
When a company realizes that independent actors are managing its data, it usually tries to make all the independent actors or systems work together and share information between them. This situation is independent of using cloud services or on-premises services. To make the independent actors work together, you need to make connections between each actor or service that needs to communicate with the others.
You can use different services and techniques to achieve this interconnection. Azure provides several useful services that allow different systems to work together without requiring big changes to the interconnected systems.
Skill 6.1: Develop an App Service Logic App
Skill 6.2: Integrate Azure Search within solutions
Skill 6.3: Establish API Gateways
Skill 6.4: Develop event-based solutions
Skill 6.5: Develop message-based solutions
Exchanging information between different applications is a goal for most companies. Sharing the information enriches the internal processes and creates more insight into the information itself. By using App Service Logic Apps, you can create workflows that interconnect different systems based on conditions and rules, easing the process of sharing information between them. You can also take advantage of Logic Apps features to implement business process workflows.
This skill covers how to:
Before you can interconnect two separate services, you need to fully understand which information you need to share between the services. Sometimes the information needs to undergo some transformations before it can be consumed by a service. You could write code for making this interconnection, but this is a time-consuming and error-prone task.
Azure offers App Service Logic Apps, which allows the interconnection of two or more services that share information between them. This interconnection between different services is defined by a business process. Azure Logic Apps allows you to build complex interconnection scenarios by using some elements that ease the work:
Workflows Define the source and destination of the information. Workflows connect to different services by using connectors. A workflow defines the steps or actions that the information needs to take to deliver the information from the source to the correct destination. You use a graphical language to visualize, design, build, automate, and deploy a business process.
Managed Connectors A connector is an object that allows your workflow to access data, services, and systems. Microsoft provides some prebuilt connectors to the Microsoft services. These connectors are managed by Microsoft and provide the needed triggers and action objects to work with those services.
Triggers Triggers are events that fire when certain conditions are met. You use a trigger as the entry or starting point of a workflow. For example, when a new message arrives at your company’s purchases mailbox, it can start a workflow that can access information from the subject and body of the message and create a new entry in the ERP system.
Actions Actions are each of the steps that you configure in your workflow. Actions happen only when the workflow is executed. The workflow starts executing when a new trigger fires.
Enterprise Integration Pack If you need to perform more advanced integrations, the Enterprise Integration Pack provides you with BizTalk Server capabilities.
Note Azure Logic Apps, Azure Functions, Azure App Service WebJobs, and Microsoft Flow
If you need to implement workflows, Microsoft provides several products that you can use for that task. Although there is some overlap between the features provided by Logic Apps, Functions, WebJobs, and Flow, they are designed for different scenarios. You can review more details about the differences between these services at https://docs.microsoft.com/en-us/azure/azure-functions/functions-compare-logic-apps-ms-flow-webjobs.
You can use Azure Logic Apps for different purposes. The following procedure shows how to create an Azure Logic App workflow that writes a message in the Microsoft Teams app when a new build completes in Azure DevOps. For this procedure, you need an Azure DevOps account with a configured project that you can build. You also need a Microsoft Office 365 subscription with access to the Microsoft Teams application. Let’s start by creating and configuring the Azure Logic App:
At this point, the Azure DevOps agent is building the sample project. Once the build finishes, you will receive a new message in your Microsoft Teams channel, as shown in Figure 6-3.
Need More Review? Control Workflow
In a regular workflow, you need to run different actions based on the input values or based on certain conditions. You can run these different actions by using control workflow actions. To learn more about control flow and other connectors that you can use in Azure Logic Apps, see
https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-control-flow-conditional-statement.
Microsoft provides more than 200 built-in connectors that you can use in your Azure Logic Apps workflows. Despite this number of connectors, there may be occasions when you need some specific feature that is not provided by the built-in connectors, or when you want to create a connector for your company's application.
You can create custom connectors for Microsoft Flow, PowerApps, and Azure Logic Apps. Although you cannot share Azure Logic Apps connectors with Microsoft Flow and PowerApps connectors, the principle for creating custom connectors is the same for all three platforms. A custom connector is basically a wrapper for a REST or SOAP API. This wrapper allows Azure Logic Apps to interact with the API of the application. The application that you want to include in the custom connector can be a public application, such as Amazon Web Services, Google Calendar, or the API of your application published to the Internet. Using the on-premises data gateway, you can also connect the custom connector with an on-premises application deployed in your data center. Every custom connector has the following lifecycle:
Generic OAuth 2.0
OAuth 2.0 for specific services, like Azure Active Directory, Dropbox, GitHub, or Salesforce
Basic Authentication
API Key
The following steps show how to create a custom connector following the previous lifecycle. For the sake of brevity, we are using an Azure Cognitive Services API for creating this custom connector. For this example, you need to create an Azure Cognitive Services account that uses the Text Analytics API. You can sign up for a Text Analytics API at https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-signup.
Note: Base URL
In this example, the endpoint definition already contains the correct base URL in the endpoint definition. If you change the default Base URL property in the General Information section in your Azure Logic Apps Custom Connector, you will receive a 404 error every time that you try to use your custom connector in an Azure Logic Apps workflow.
At this point, you have successfully created your Azure Logic Apps Custom Connector. In the following steps, you are going to create a workflow for testing your new custom connector. This workflow gets the content from files in an Azure Blob Storage account, sends the content of the files to the Azure Cognitive Services, and sends the sentiment score of the content to a Microsoft Teams channel:
At this point, you should be ready to test your new Azure Logic Apps Custom Connector. Now you can simply create one text file containing a positive sentence, such as Today is a great day. Then upload that file to the Azure Blob Storage container that you configured previously. If everything went well, you should get a new message in the selected team’s General channel, and the output of the workflow should look similar to Figure 6-6.
Exam Tip
You can create custom connectors for Azure Logic Apps, Microsoft Flow, and Microsoft PowerApps. You cannot reuse a connector created for Azure Logic Apps in Microsoft Flow or PowerApps (or vice-versa). You can use the same OpenAPI definition to create a custom connector for these three services.
Need More Review? Custom Connector
You can learn more about custom connectors at https://docs.microsoft.com/en-us/connectors/custom-connectors/.
Once you have created an Azure Logic App, you can reuse it in other Azure subscriptions or share it with colleagues. You can create a template from your working Azure Logic App to automate deployment processes. When you create a template, you convert the definition of your Azure Logic App into an Azure Resource Manager (ARM) template. Using ARM templates allows you to take advantage of the flexibility of the ARM platform by separating the definition of the Azure Logic App from the values used in the logic app. When you deploy a new Azure Logic App from a template, you can provide a parameters file in the same way that you do with other ARM templates. Azure also provides some prebuilt Logic App templates that you can use as a base for creating your own templates.
You can download an Azure Logic Apps template using several mechanisms:
Azure Portal You can use the Export Template option of the Azure Logic App in the Azure Portal to download the ARM template.
Visual Studio You can use the Azure Logic Apps Tools extension for Visual Studio to connect to your Azure subscription and download a template from your Azure Logic Apps.
PowerShell You can use the LogicAppTemplate PowerShell module to download a template from your Azure Logic App.
A Logic App template is a JSON file that comprises three main areas:
Logic App resource This section contains basic information about the Logic App itself, such as the location of the resource, the pricing plan, and the workflow definition.
Workflow definition This section contains the description of the workflow, including the triggers and actions in your workflow. This section also describes how the Logic App runs these triggers and actions.
Connections This section stores the information about the connectors that you use in the workflow.
Use the following procedure to create a template from your Azure Logic App using Visual Studio:
At this point, you can edit and customize your template. Once you are done with the modifications to your template, you can create a parameters file for deploying this template.
Need More Review? Logic App Templates
You can learn more by reading the following articles about Logic App templates:
Create Logic App Templates https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-create-deploy-azure-resource-manager-templates
Deploy Logic App Templates https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-create-deploy-azure-resource-manager-templates
If you want to add the ability to search for information in your application, indexing that information can be difficult if your app manages a lot of data. Fortunately, Microsoft offers the Azure Search service. Azure Search is a Search-as-a-Service (SaaS) platform that allows you to index information from different data sources, and it integrates with your application using a REST API or a .NET SDK. Azure Search provides the infrastructure needed for storing the indexes and all the other components needed by the search engine.
This skill covers how to:
The Azure Search service provides an API that you can use to add search capabilities to your solution. You can access the Azure Search service by using a REST API or a .NET SDK. Using the API or the SDK, you can abstract your code from the details and complexity of retrieving information from the indexes.
You can think of an index as a database that contains the information that you want to be available for searching. That information is stored in the index as a document, which is a searchable entity or unit. For example, if your application manages data from devices, a document is the structure that represents each of the devices managed by your application. Another example of a document would be each of the articles of a news company. Indexes and documents are conceptually equivalent to tables and rows in a database.
Azure Search creates physical structures to store the indexes that you upload or add to the service. These physical structures depend on the index schema that you provide. Before you can create an Azure Search index, you need to create an Azure Search service instance. Use the following procedure to create an Azure Search service instance:
Once you have created an Azure Search service instance, you can create indexes and import data. You can create indexes by using the Azure Portal, or you can create them programmatically using C#, PowerShell, Postman, or Python. Creating an index is usually an iterative process that requires several iterations before you get the appropriate index definition for your application. Because of this iterative process and the fact that the index schema is tightly coupled with the physical storage, each time you need to make significant changes to an existing field definition, you need to rebuild the entire index. During the development stage of your index, you will use the following recommended workflow for creating a basic index:
Note: Size of the Dataset
During the development of an index, you should consider using only a subset of your entire dataset because you usually need to make modifications to the index definition. Every time that you modify the index definition, you need to rebuild the entire index. Using a subset of your dataset makes this rebuilding process faster because you aren’t loading as much data in your index.
Use the following procedure to create your index by using the Azure Portal. (Once you create the first definition of your index, we will review how to make modifications to an existing index using some code.)
Field Name This should be the name of the field; in this example, it should be surname.
Type Because this field stores customer surnames, the type should be Edm.String.
Retrievable Checked. You should include this field in the search results.
Filterable Not checked. When you search strings by using filters, you look for exact matches by using Boolean operations. Filters are case sensitive and more strict than using search queries.
Sortable Checked. You may want to get your results ordered by this field.
Facetable Not checked. If your application does not use navigation trees for constructing filters, you should not check this attribute.
Searchable Checked. You want to search for people by their surnames. Using this option, you get the full power of the full-text search engine.
[service name] This should be the name of the Azure Search service instance that you created at the beginning of this section.
[index-name] This should be the name of the index that you provided in step 5 of this procedure.
api-key Paste the value that you copied in step 12.
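If you prefer to script this download instead of using Postman, the following minimal sketch performs the same GET request from C#. This is an illustration only; the placeholder service name, index name, and api-key are values that you need to replace with your own.
// C# .NET
using System;
using System.Net.Http;
using System.Threading.Tasks;

class DownloadIndexDefinition
{
    static async Task Main()
    {
        // Placeholders: use your own service name, index name, and admin api-key.
        string serviceName = "<your_search_service_name>";
        string indexName = "<your_index_name>";
        string apiKey = "<your_admin_api_key>";

        using (var client = new HttpClient())
        {
            // The api-key header authenticates the request against the Azure Search REST API.
            client.DefaultRequestHeaders.Add("api-key", apiKey);
            string url = $"https://{serviceName}.search.windows.net/indexes/{indexName}?api-version=2019-05-06";

            // The response is the JSON definition of the index; you can save it to a file,
            // edit it, and upload the modified definition back with a PUT request to the same URL.
            string indexDefinition = await client.GetStringAsync(url);
            Console.WriteLine(indexDefinition);
        }
    }
}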
At this point, you can modify your index definition by changing the JSON file with your index definition that you just downloaded in the previous procedure. When you are creating and updating the definition of your index and its fields, there are some concepts that you should bear in mind:
Fields Your index is composed of a list of fields that define the data contained in the index. You can find the definition of the fields grouped as a collection in the JSON file. Each field also has some attributes that control its behavior in the index. For each field, you need to provide the following properties:
Name This is the name of the field.
Type This sets the kind of data that the field stores.
Attributes This is an attribute of a field that defines the behavior of the field in the index. For example, you can assign the searchable attribute to a field. That attribute marks the field as full-text searchable. Another important attribute that you need to consider is the Key attribute. Every index that you define must have one—and only one—field configured with the Key attribute. This field must be of type Edm.String. Table 6-1 shows a list of the available attributes that you can configure for your fields.
Suggesters You use this section to define which fields should be used by the autocomplete search feature. This feature presents suggestions for search terms as the user is typing.
Scoring Profiles By assigning scores to the items, you can influence which items should appear higher in the search results. A scoring profile is made up of functions and fields that are used to automatically assign the scores to the items in the search results. Scoring profiles are transparent to the users.
Analyzers Configure the language analyzer for a field. When you configure a field as searchable, you need to set the language that the field contains so the engine can perform the appropriate analysis.
CORS You configure which JavaScript client-side code can make API calls to the Search API. By default, CORS is disabled, so your solutions cannot query the Search API from the client side.
Encryption key You can configure your Azure Search service instance to encrypt the indexes using customer-managed keys instead of the default Microsoft-managed keys.
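Besides editing the JSON definition directly, you can also express the same schema in code with the .NET Azure Search SDK that is used later in this skill. The following is a minimal sketch rather than part of the official procedure; the Customer class, its fields, and the index name are hypothetical examples that mirror the surname field discussed earlier.
// C# .NET
using System;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

public class Customer
{
    // Every index needs exactly one Edm.String field marked with the Key attribute.
    [System.ComponentModel.DataAnnotations.Key]
    public string Id { get; set; }

    // Searchable enables full-text search on the field; Sortable allows ordering results by it.
    [IsSearchable, IsSortable]
    public string Surname { get; set; }
}

class CreateIndexSketch
{
    static void Main()
    {
        var serviceClient = new SearchServiceClient("<your_search_service_name>",
            new SearchCredentials("<your_admin_api_key>"));

        // FieldBuilder reads the property attributes and builds the field collection,
        // keeping the index schema in sync with the model class.
        var definition = new Microsoft.Azure.Search.Models.Index()
        {
            Name = "customers",
            Fields = FieldBuilder.BuildForType<Customer>()
        };

        // Creates the index if it doesn't exist, or updates its definition if it does.
        serviceClient.Indexes.CreateOrUpdate(definition);
        Console.WriteLine("Index created or updated.");
    }
}
The other sections described above, such as suggesters, scoring profiles, analyzers, and CORS, have corresponding properties on the same Index object.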
Once you are happy with the first definition of your index, you can upload data to your index and make queries to ensure that the index is performing as you need. You can repeat these steps as many times as you need to achieve the best results. In the following sections, we review how to upload data to your index and how to query your indexes.
Exam Tip
You cannot make modifications or edits to an existing index using the Azure Portal. Because the definition of an index is an iterative process, you should make the first index definition using the Azure Portal and then download the index definition using any web testing tool, such as Postman. Once you have the JSON definition, you can make updates to the index and upload it to your Azure Search service instance.
Need More Review? Creating an Index
Creating an Azure Search index is a complex task that you need to review carefully. You can review more details about creating an index, such as how to work with analyzers, scoring profiles, or suggesters, or the storage implications of the definition of your fields by reading the article at https://docs.microsoft.com/en-in/azure/search/search-what-is-an-index.
Azure Portal is not the only way to create new indexes. You can also use other methods, such as C#, Postman, Python, or PowerShell. Learn how to create an index using C# at https://docs.microsoft.com/en-in/azure/search/search-get-started-dotnet.
Once you have made a definition of your index, you need to import data to it before you can get results from the API. Importing data to the index depends on the type of data source that you are using for your data. You can define empty search indexes, but you cannot query those empty indexes until you fill them with some data.
There are two ways to import data into a search index:
Push Using this method, you actually upload data to your index. You need to provide JSON documents to the index, and you programmatically upload the content to the index. The advantage of this import method is that it is very flexible, and there are no restrictions regarding the type of data source that you use for your data, as long as you convert the data to JSON documents that you upload to the search index.
Pull The data is not stored in the index itself; instead, it is stored in one of the supported data sources. Using the indexers that you can configure for your search index, you can connect data stored in the following Azure services: Azure Blob storage, Azure Table storage, Azure Cosmos DB, Azure SQL Database, and SQL Server deployed on Azure VMs. The Azure Search service connects your search index with one of the supported indexers; then the indexer maps the fields defined in your index with the fields stored in the documents in the data source. The indexer automatically converts your data stored in the data source to a JSON document that can be used in the search index.
Because the push mechanism stores the information directly in the search index, it provides the lowest latency, making this method the most appropriate for applications that are sensitive to latency.
Note: Sample Dataset
The remaining examples that we review in this skill are based on the sample data provided by Microsoft. You can review and download this sample data at https://azure.microsoft.com/en-us/resources/samples/azure-search-sample-data/.
The following procedure shows how to upload content to your index by using Postman. Before you can start uploading data to the Azure Search service, you need to create the hotels index:
api-key You can find this value in the Keys blade of your Azure Search service instance.
At this point, you have created a new index called hotels in your Azure Search service instance. The next step is to import data into the new index. In this section, we review how to import data by using push and pull methods.
You upload (or push) data to your Azure Search service instance by using the REST API or by using the .NET SDK. Using the REST API is quite similar to the procedure that we previously used for creating an index. You need to use the POST HTTP method and use the URL https://[service name].search.windows.net/indexes/[index name]/docs/index?api-version=2019-05-06. For this example, the index name should be hotels. You need to use the content of the HotelsData_toAzureSearch.json file to upload the information to your Azure Search service instance.
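The following is a minimal sketch of that REST call, assuming the hotels index already exists. The inline JSON document is a trimmed, hypothetical example; in the procedure you would send the content of the HotelsData_toAzureSearch.json file instead.
// C# .NET
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class PushDocumentsSketch
{
    static async Task Main()
    {
        string serviceName = "<your_search_service_name>";
        string apiKey = "<your_admin_api_key>";
        string url = $"https://{serviceName}.search.windows.net/indexes/hotels/docs/index?api-version=2019-05-06";

        // Each document in the value array carries its own @search.action (described next).
        string body = @"{
            ""value"": [
                { ""@search.action"": ""upload"", ""HotelId"": ""1"", ""HotelName"": ""Fancy Stay"" }
            ]
        }";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("api-key", apiKey);
            HttpResponseMessage response = await client.PostAsync(url,
                new StringContent(body, Encoding.UTF8, "application/json"));
            Console.WriteLine($"Indexing response: {response.StatusCode}");
        }
    }
}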
When you are pushing data to your Azure Search service instance, you can control each action you want to perform on each document in the index. By setting the @search.action property in the JSON file, you can control the action performed on each document. The following list shows the available operations you can apply to the documents:
upload If the document doesn’t exist in the index, this action uploads the new document. If the document already exists, it updates the document. When updating a document, any existing field in the document that is not specified in the request is automatically set to null.
merge This action updates an existing document with the values of the fields specified in the request. The values set in the request replace the values of those fields in the existing document in the index. If the document doesn’t exist in the index, the request fails.
mergeOrUpload This action is similar to the merge action, but if the document doesn’t exist in the index, the action behaves as the upload action and creates a new document.
delete This action removes the specified document from the index.
For each action type, you at least need to provide a value for the key field to locate the correct document in the index. The other fields are optional, depending on the type of action that you want to perform on the document. Complete the following steps to push data to your index by using the .NET SDK:
Microsoft.Extensions.Configuration.Json NuGet Package.
Listing 6-1 appsetting.json file
{
"JSONDocumentsFile": "HotelsData_toAzureSearch.json",
"SearchIndexName": "hotels",
"SearchServiceAdminApiKey": "<Your_admin_key>",
"SearchServiceName": "<Your_search_service_name>"
}
Listing 6-2 Hotel.cs file
// C# .NET
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;
using System;
using System.Collections.Generic;
using System.Text;
using Newtonsoft.Json;
namespace <your_project_name>
{
public partial class Hotel
{
[System.ComponentModel.DataAnnotations.Key]
[IsFilterable]
public string HotelId { get; set; }
[IsSearchable, IsSortable]
public string HotelName { get; set; }
[IsSearchable]
[Analyzer(AnalyzerName.AsString.EnMicrosoft)]
public string Description { get; set; }
[IsSearchable]
[Analyzer(AnalyzerName.AsString.FrLucene)]
[JsonProperty("Description_fr")]
public string DescriptionFr { get; set; }
[IsSearchable, IsFilterable, IsSortable, IsFacetable]
public string Category { get; set; }
[IsSearchable, IsFilterable, IsFacetable]
public string[] Tags { get; set; }
[IsFilterable, IsSortable, IsFacetable]
public bool? ParkingIncluded { get; set; }
[IsFilterable, IsSortable, IsFacetable]
public DateTimeOffset? LastRenovationDate { get; set; }
[IsFilterable, IsSortable, IsFacetable]
public double? Rating { get; set; }
public Address Address { get; set; }
}
}
The Hotel class represents a document stored in the JSON file. As you can see in Listing 6-2, each of the properties has a property attribute assigned, such as IsFilterable, IsSortable, IsFacetable, and IsSearchable, which matches the attributes in the definition of a search index field. When you work with the .NET Azure Search SDK, you need to explicitly add the property attributes to the properties that define your document; you cannot depend only on the definition of the index fields. This is different from using the REST API, where you don’t need to add that explicit definition to the JSON document. Also, pay attention to the HotelId property. This property also has the System.ComponentModel.DataAnnotations.Key property attribute assigned, which marks it as the Key field for this document. You also use property attributes to configure the analyzer for those properties that have the IsSearchable attribute.
Listing 6-3 Hotel.Methods.cs file
// C# .NET
using System;
using System.Collections.Generic;
using System.Text;
namespace <your_project_name>
{
public partial class Hotel
{
public override string ToString()
{
var builder = new StringBuilder();
if (!String.IsNullOrEmpty(HotelId))
{
builder.AppendFormat("HotelId: {0}n", HotelId);
}
if (!String.IsNullOrEmpty(HotelName))
{
builder.AppendFormat("Name: {0}n", HotelName);
}
if (!String.IsNullOrEmpty(Description))
{
builder.AppendFormat("Description: {0}n", Description);
}
if (!String.IsNullOrEmpty(DescriptionFr))
{
builder.AppendFormat("Description (French): {0}n", DescriptionFr);
}
if (!String.IsNullOrEmpty(Category))
{
builder.AppendFormat("Category: {0}n", Category);
}
if (Tags != null && Tags.Length > 0)
{
builder.AppendFormat("Tags: [ {0} ]n", String.Join(", ", Tags));
}
if (ParkingIncluded.HasValue)
{
builder.AppendFormat("Parking included: {0}n", ParkingIncluded.Value ? "yes" : "no");
}
if (LastRenovationDate.HasValue)
{
builder.AppendFormat("Last renovated on: {0}n", LastRenovationDate);
}
if (Rating.HasValue)
{
builder.AppendFormat("Rating: {0}n", Rating);
}
if (Address != null && !Address.IsEmpty)
{
builder.AppendFormat("Address: n{0}n", Address.ToString());
}
return builder.ToString();
}
}
}
Listing 6-4 Address.cs file
// C# .NET
using System;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;
using Newtonsoft.Json;
namespace <your_project_name>
{
public partial class Address
{
[IsSearchable]
public string StreetAddress { get; set; }
[IsSearchable, IsFilterable, IsSortable, IsFacetable]
public string City { get; set; }
[IsSearchable, IsFilterable, IsSortable, IsFacetable]
public string StateProvince { get; set; }
[IsSearchable, IsFilterable, IsSortable, IsFacetable]
public string PostalCode { get; set; }
[IsSearchable, IsFilterable, IsSortable, IsFacetable]
public string Country { get; set; }
}
}
Listing 6-5 Address.Methods.cs file
// C# .NET
using System;
using System.Collections.Generic;
using System.Text;
using Newtonsoft.Json;
namespace <your_project_name>
{
public partial class Address
{
public override string ToString() =>
IsEmpty ?
string.Empty :
$"{StreetAddress}n{City}, {StateProvince} {PostalCode}n{Country}";
[JsonIgnore]
public bool IsEmpty => String.IsNullOrEmpty(StreetAddress) &&
String.IsNullOrEmpty(City) &&
String.IsNullOrEmpty(StateProvince) &&
String.IsNullOrEmpty(PostalCode) &&
String.IsNullOrEmpty(Country);
}
}
using System.Collections.Generic;
using System.IO;
using System.Linq;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;
using Microsoft.Extensions.Configuration;
using Newtonsoft.Json;
Listing 6-6 Program.cs Main method
// C# .NET
IConfigurationBuilder builder = new ConfigurationBuilder().AddJsonFile("appsettings.json");
IConfigurationRoot configuration = builder.Build();
string searchServiceName = configuration["SearchServiceName"];
string indexName = configuration["SearchIndexName"];
string adminApiKey = configuration["SearchServiceAdminApiKey"];
string jsonFilename = configuration["JSONDocumentsFile"];
SearchServiceClient serviceClient = new SearchServiceClient
(searchServiceName, new SearchCredentials(adminApiKey));
ISearchIndexClient indexClient = serviceClient.Indexes.GetClient(indexName);
//Batch documents import.
//Reading documents from the JSON file.
List<Hotel> actions;
using (StreamReader file = File.OpenText(jsonFilename))
{
string json = File.ReadAllText(jsonFilename);
actions = JsonConvert.DeserializeObject<List<Hotel>>(json);
}
//Create a batch object.
var batchActions = new List<IndexAction<Hotel>>();
foreach (var hotel in actions)
{
var indexAction = new IndexAction<Hotel>(hotel);
batchActions.Add(indexAction);
}
var batch = IndexBatch.New(batchActions.ToArray());
//Push the documents to the Azure Search service instance
try
{
indexClient.Documents.Index(batch);
}
catch (IndexBatchException ex)
{
Console.WriteLine($"Failed to index some documents: {String.Join(", ",
ex.IndexingResults.Where(r => !r.Succeeded).Select(r => r.Key))}");
}
Now press F5 to execute the code. You can check whether the documents have been correctly loaded into your index by reviewing the Overview blade in your Azure Search service instance, as shown in Figure 6-12. Depending on the size of your dataset, it can take several minutes to show the updated summary information in the Overview blade.
Pushing data to your search index is one way of importing data. The other way is to pull data from a supported data source. By using indexers, the Azure Search service can connect to Azure Blob storage, Azure Table storage, Azure Cosmos DB, Azure SQL Database, or SQL Server databases deployed on Azure VMs, and it can extract information from those data sources. The indexer connects to a table, view, or equivalent structure in the data source, and it maps the columns or fields in the data source structure with the fields defined in the search index. Then the indexer converts the rowset into a JSON document that is loaded into the index. You can schedule the indexer to check at regular intervals for changes in the data source. You can configure an indexer by using the Azure Portal, the REST API, or the .NET Azure Search SDK. The following procedure shows how to import data from a Cosmos DB database to an index using the Azure Portal:
When you are using the Import Data wizard in the Azure Portal, you can only associate data sources to new indexes created during the importing data process. If you need to associate a data source with an existing search index, you need to use the REST API or the .NET Azure Search SDK.
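As a reference, the following is a minimal sketch of that SDK approach using the Microsoft.Azure.Search package from the earlier listings. The data source name, indexer name, and connection string are hypothetical placeholders, and the exact factory method signature may vary slightly between SDK versions.
// C# .NET
using System;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

class AttachIndexerSketch
{
    static void Main()
    {
        var serviceClient = new SearchServiceClient("<your_search_service_name>",
            new SearchCredentials("<your_admin_api_key>"));

        // The Cosmos DB connection string must include the database name (Database=<your_database>).
        DataSource dataSource = DataSource.CosmosDb(
            "hotels-cosmosdb",
            "<your_cosmos_db_connection_string>",
            "<your_collection_name>");
        serviceClient.DataSources.CreateOrUpdate(dataSource);

        // The indexer pulls documents from the data source into the existing hotels index.
        var indexer = new Indexer()
        {
            Name = "hotels-cosmosdb-indexer",
            DataSourceName = dataSource.Name,
            TargetIndexName = "hotels"
        };
        serviceClient.Indexers.CreateOrUpdate(indexer);

        // Run the indexer on demand; you can also define a schedule for it.
        serviceClient.Indexers.Run(indexer.Name);
        Console.WriteLine("Indexer created and running.");
    }
}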
Exam Tip
Using the Azure Portal when working with Azure Search indexes offers a limited set of features. The Azure Portal is the best tool for creating the initial definition of fields, data sources, and indexes. If you need to import data into your existing index, or if you need to make modifications to the definition of existing indexes, you should use the REST API or the .NET Azure Search SDK.
Need More Review? Loading Data Into Azure Search
You learn more about complex scenarios by reviewing the following articles:
Load Data https://docs.microsoft.com/en-in/azure/search/search-what-is-data-import
Load Data with Indexers https://docs.microsoft.com/en-in/azure/search/search-indexer-overview
Indexer Operations Using the Azure Search Service REST API https://docs.microsoft.com/en-us/rest/api/searchservice/indexer-operations
Need More Review? Using Cognitive Services with Azure Search Service
You can take advantage of several Azure Cognitive Services by integrating your Azure Search service with Computer Vision (for image analysis) or Text Analytics (for entity recognition). This allows you to enrich the features available in your Azure Search service. You can review how to make these integrations at https://docs.microsoft.com/en-in/azure/search/cognitive-search-attach-cognitive-services.
Once you have defined your index and populated it with your dataset, you need to query the index to take advantage of the Azure Search service. You can make queries to your index by using different tools:
Search Explorer You can use the Search Explorer tool integrated into the Azure Portal for querying your index.
Web testing tools Use your favorite web testing tool, such as Fiddler or Postman, to make REST API calls.
.NET SDK Using the .NET Azure Search SDK, you can abstract your code from the details of implementing the REST API calls to the Azure Search service. You need to use the SearchIndexClient class for querying the index.
REST API You can make REST API calls using your favorite language and using the GET or POST methods on your index.
Whatever tool you decide to use for making queries to your index, you need to choose between two different query types—simple and full.
Simple The simple query type is used for typical queries. The behavior of Azure Search when you use the simple query type is similar to other search engines, such as Google or Bing. This query type is faster and more effective for free-form text queries. The simple query syntax used in the simple query type includes operators such as AND, OR, NOT, phrase, suffix, and precedence operators.
Full The full query type extends the features of the simple query type. While the simple query type uses the Simple Query Parser, the full query type uses the Lucene Query Parser. The Azure Search service is based on the Apache Lucene high-performance search engine developed by the Apache Software Foundation. Using the full query type, you can construct more complex queries by using regular expressions, proximity searches, or fuzzy and wildcard searches, among others. Using queryType=full in your request instructs Azure Search to use the Lucene Query Parser instead of the default Simple Query Parser.
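The following is a minimal sketch of a full (Lucene) query using the .NET SDK, assuming the hotels index and the Hotel class from Listings 6-2 to 6-6; the fuzzy search term is just a hypothetical example.
// C# .NET
using System;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

class FullQuerySketch
{
    static void Main()
    {
        // A query key is enough for searching; admin keys are only required for management operations.
        ISearchIndexClient indexClient = new SearchIndexClient(
            "<your_search_service_name>", "hotels", new SearchCredentials("<your_query_api_key>"));

        var parameters = new SearchParameters()
        {
            // QueryType.Full switches from the Simple Query Parser to the Lucene Query Parser,
            // enabling fuzzy (~), proximity, wildcard, and regular expression searches.
            QueryType = QueryType.Full,
            Select = new[] { "HotelName", "Category" }
        };

        // "botique~1" is a fuzzy search that still matches documents containing "boutique".
        DocumentSearchResult<Hotel> results =
            indexClient.Documents.Search<Hotel>("botique~1", parameters);

        foreach (SearchResult<Hotel> result in results.Results)
        {
            Console.WriteLine($"{result.Document.HotelName} ({result.Document.Category})");
        }
    }
}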
When constructing a query, you need to bear in mind the definition of your index. Depending on the attributes that you assigned to the fields of your index, you can use them for sorting and filtering the results based on a particular field, or you can use a specific field for searching. The definition of the field also affects whether the field is included in the results of the query. For example, in the previous hotels index example in which you get the information from an Azure Cosmos DB database, you cannot sort by any field because you didn’t configure any fields as sortable.
Another important distinction that you should bear in mind when querying your index is the difference between filtering and searching.
Searching Searching uses full-text search to look for a value in a string. During the lookup process in a full-text search, the Lucene engine removes stopwords (such as “the” or “and”), reduces the query term to the root of the word, lowercases the query term, and breaks composite words into their parts. Then the engine returns the list of matches.
Filtering On the other hand, filtering uses a Boolean expression that must evaluate to true or false. You can use filtering with strings, but in those situations, you only get those results that exactly match your filter expression, including word casing.
Need More Review? The Lucene Full Text Search Engine
In most situations, the Lucene search engine returns the expected results, but there could be some situations in which you get unexpected results. In those situations, it is helpful to understand how the full-text search engine works. You can learn about the Lucene engine in the Azure Search service at https://docs.microsoft.com/en-in/azure/search/search-lucene-query-architecture.
The following procedure shows how to perform a search using the .NET Azure Search SDK. The procedure below is based on the procedure that you followed in “Import Searchable Data,” earlier in this chapter, in which you learned how to push data into an Azure Search index. Please ensure that you have completed the procedure in Listings 6-1 to 6-6 before starting the following procedure:
Listing 6-7 Program.cs Main method
// C# .NET
//Querying the index
SearchParameters parameters;
DocumentSearchResult<Hotel> results;
//Looking for hotels in Atlanta.
Console.WriteLine("Query 1: Search for term 'Atlanta'");
parameters = new SearchParameters();
results = indexClient.Documents.Search<Hotel>("Atlanta", parameters);
WriteDocuments(results);
//Looking for hotels in Atlanta. Get only certain fields of the document.
Console.WriteLine("Query 2: Search for term 'Atlanta'");
Console.WriteLine("Get only properties: HotelName, Tags, Rating, and
Address:n");
parameters = new SearchParameters()
{
Select = new[] {"HotelName", "Tags", "Rating", "Address"},
};
results = indexClient.Documents.Search<Hotel>("Atlanta", parameters);
WriteDocuments(results);
//Looking for hotels with restaurants and wifi. Get only the HotelName,
//Description and Tags properties
Console.WriteLine("Query 3: Search for terms 'restaurant' and 'wifi'");
Console.WriteLine("Get only properties: HotelName, Description, and
Tags:n");
parameters = new SearchParameters()
{
Select = new[] { "HotelName", "Description", "Tags" },
};
results = indexClient.Documents.Search<Hotel>("Dallas", parameters);
WriteDocuments(results);
//Use filtering instead of full text searches
Console.WriteLine("Query 4: Filter on ratings greater than 4");
Console.WriteLine("Returning only these fields: HotelName, Rating:n");
parameters =
new SearchParameters()
{
Filter = "Rating gt 4",
Select = new[] { "HotelName", "Rating" }
};
results = indexClient.Documents.Search<Hotel>("*", parameters);
WriteDocuments(results);
//Getting the two best scored hotels.
Console.WriteLine("Query 5: Search on term 'boutique'");
Console.WriteLine("Sort by rating in descending order, taking the top two
results");
Console.WriteLine("Returning only these fields: HotelId, HotelName,
Category, Rating:n");
parameters =
new SearchParameters()
{
//If you try to use a field that is not configured with the
//IsSortable attribute, you will get an error
OrderBy = new[] { "Rating desc" },
Select = new[] { "HotelId", "HotelName", "Category", "Rating" },
Top = 2
};
results = indexClient.Documents.Search<Hotel>("boutique", parameters);
WriteDocuments(results);
Listing 6-8 Program.cs WriteDocuments helper method
// C# .NET
//Helper method for printing the results of a query.
private static void WriteDocuments(DocumentSearchResult<Hotel> searchResults)
{
foreach (SearchResult<Hotel> result in searchResults.Results)
{
Console.WriteLine(result.Document);
}
Console.WriteLine();
}
Need More Review? Querying Azure Search Index Using REST API
Querying the Azure Search REST API to get results is similar to the process that you used to upload documents or create an index. The following articles show how to use the REST API to get results from your Azure Search index, using the simple and full query types:
Query data examples simple syntax https://docs.microsoft.com/en-in/azure/search/search-query-simple-examples
Query data examples full syntax https://docs.microsoft.com/en-in/azure/search/search-query-lucene-examples
Most of the applications and solutions that you can find or develop nowadays offer an API for accessing the features available in the solution. In business environments, those solutions usually need to communicate with each other using their respective APIs. Sometimes, you need to expose your solutions to your clients to offer your services. In those situations, you need to ensure that you offer a consistent and secure API. It isn’t easy to implement the necessary mechanism to achieve an enterprise-grade level of security, consistency, and flexibility. If you also need to publish several of your services under a common API, this task is even harder.
Microsoft provides the Azure API Management (APIM) service. This service allows you to create an enterprise-grade API for your existing back-end services. Using APIM, you can securely publish your back-end applications, providing your customers with a platform protected by mechanisms such as DoS attack prevention or JWT token validation.
This skill covers how to:
The API Management service allows you to expose a portion (or all) of the APIs offered by your back-end systems. By using the APIM service, you can unify all your back-end APIs in a common interface that you can offer to external users, such as clients or partners and internal or external developers. In general, the APIM service is a façade of the APIs that you configure in your APIM instance. Thanks to this façade feature, you can customize the front-end API offered by the APIM instance without changing the back-end API.
When exposing your back-end systems, you are not limited to REST API back ends. You can use a back-end service that uses a SOAP API and then publish this SOAP API as a REST API. This means you can expose your older back-end systems without needing to modify their code while taking advantage of the greater level of integration of REST APIs. Use the following procedure to create a new APIM instance:
Note: Pricing Tiers
The Developer pricing tier is appropriate for testing and development environments, but you should not use it for production because the Developer tier does not offer high-availability features and can be affected by disconnections during the updates of the node. You can review the full offer and the features available on each tier at https://azure.microsoft.com/en-us/pricing/details/api-management/
Once you have created your APIM instance, you can start adding APIs to your instance. In the following procedure, you are going to add two different APIs, using different methods. The first method is the OpenAPI specification. For the second API, you are going to create a blank API definition and add only those methods that are appropriate for you.
At this point, you have added your first back-end API to the APIM instance by using the OpenAPI specification of your back-end API. In the following steps, you are going to add a back-end API without using any specification. Creating the front-end endpoints is useful if you need to connect only a few endpoints from your back-end API, or if you don’t have the OpenAPI or SOAP specification of your API in any format:
At this point, you have two back-end APIs connected to your APIM instance. As you can see in the previous example, you don’t need to expose the entire back-end API. By adding the appropriate operations, you can publish only those parts of the back-end API that are useful for you. Once you have created the APIs in your APIM instance, you can grant access to these APIs to your developers by using the Developer Portal. You can access the APIM Developer Portal at https://<your_APIM_name>.developer.azure-api.net/.
Bear in mind that you need to associate a Product with your API to publish it. Because you didn’t associate your APIs with any Product, your APIs won’t be available to the external world. Also bear in mind that you can associate an API with more than one Product. Use the following procedure to create a Product and associate it with your APIs:
By default, when you create a new Product, only members of the Administrators built-in group can access the Product. You can configure this by using the Access Control section in the Product.
You can customize the Developers Portal for your APIM instance by modifying the content of the pages, adding more pages, and so on. You can review the details about how to perform these customizations by consulting the article at https://docs.microsoft.com/en-us/azure/api-management/api-management-customize-styles.
Need More Review? Revisions and Versions
During the lifetime of your API, you may need to add to, update, or remove operations from your API. You can make these modifications without disrupting the usage of your API by using revisions and versions. You can review how to work with revisions and versions in your API by reading the article at https://azure.microsoft.com/es-es/blog/versions-revisions/.
Once you have imported your back-end APIs, you need to configure the authentication for accessing these APIs. When you configure the security options in the APIM instance, the back-end API delegates the security to the APIM instance. This means that even though your API implements its own authentication mechanism, that mechanism is never used when the API is accessed through the APIM instance.
This ability to hide the authentication of the back-end APIs is useful for unifying your security using a consistent and unique authentication mechanism. You can manage the authentication options associated with a Product or API by using Subscriptions. A Subscription manages the keys a developer can use to access your API. If an HTTP request made to an API protected by a Subscription does not provide a valid subscription key, the request is immediately rejected by the APIM gateway without reaching your back-end API. When you define a Subscription, you can use three different scopes to apply this Subscription (a short example of calling an API with a subscription key follows this list):
Product When a developer wants to use one of your Products, the developer connects to the Developers Portal of your APIM instance and submits a request to subscribe to the product he or she wants to use.
All APIs The developer can access all APIs in your APIM instance using the same subscription key.
Single API The developer can access a single API in your APIM instance using a subscription key. There is no need for the API to be part of a Product.
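As a quick illustration of how a client uses a subscription key, the following minimal sketch calls an API published through APIM; the operation path and placeholder values are hypothetical and depend on how you configured your API.
// C# .NET
using System;
using System.Net.Http;
using System.Threading.Tasks;

class CallApimSketch
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // Requests without a valid key are rejected by the APIM gateway
            // before they ever reach the back-end API.
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<your_subscription_key>");

            string url = "https://<your_APIM_name>.azure-api.net/conference/speakers";
            HttpResponseMessage response = await client.GetAsync(url);
            Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}");
        }
    }
}
By default, you can also pass the key in the subscription-key query string parameter instead of the header.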
If you use the All APIs or Single API scopes, you don’t need to associate the back-end API with a Product. A Subscription using either of these two scopes grants access directly to the API. You can use the following procedure to create a Subscription and associate it with a Product:
Need More Review? Other Authentication Methods
Using subscription and subscription keys is not the only mechanism for protecting access to your APIs. API Management allows you to use OAuth 2.0, client certificates, and IP whitelisting. You can use the following articles to review how to use other authentication mechanisms for protecting your APIs:
IP whitelisting https://docs.microsoft.com/en-us/azure/api-management/api-management-access-restriction-policies#RestrictCallerIPs
OAuth 2.0 authentication using Azure AD https://docs.microsoft.com/en-us/azure/api-management/api-management-howto-protect-backend-with-aad
Mutual authentication using client certificates https://docs.microsoft.com/en-us/azure/api-management/api-management-howto-mutual-certificates
When you publish a back-end API using the API Management service, all the requests made to your APIM instance are forwarded to the correct back-end API, and the response is sent back to the requestor. None of these requests or responses are altered or modified by default. However, there could be some situations where you need to modify some requests and/or responses. An example of these modification needs could be transforming the format of a response from XML to JSON. Another example could be throttling the number of incoming calls from a particular IP or user.
A policy is a mechanism that you can use to change the default behavior of the APIM gateway. Policies are XML documents that describe a sequence of inbound and outbound steps or statements. Each policy is made of four sections:
Inbound In this section, you can find any statement that applies to requests from the managed API clients.
Backend This section contains the steps that need to be applied to the request that is sent from the API gateway to the back-end API.
Outbound This section contains statements or modifications that you need to apply to the response before it’s sent to the requestor.
On-Error If there is an error in any of the other sections, the engine stops processing the remaining steps in the faulty section and jumps to this section.
When you are configuring or defining a policy, you need to bear in mind that you can apply it at different scope levels:
Global The policy applies to all APIs in your APIM instance. You can configure global policies by using the code editor in the All APIs policy editor on the APIs blade of your APIM instance.
Product The policy applies to all APIs associated with a Product. You can configure product policies on the Policies blade of the Product in your APIM instance.
API The policy applies to all operations configured in the API. You can configure API-scoped policies by using the code editor in the All Operations option on the Design tab of the API in your APIM instance.
Operation The policy applies only to a specific operation in your API. You can configure operation-scoped policies by using the code editor in the specific operation.
Policies are a powerful and very flexible mechanism that allows you to do a lot of useful work, such as applying caching to the HTTP requests, monitoring the requests and responses, authenticating with your back-end API using different authentication mechanisms, or even interacting with external services. Use the following procedure to apply some transformations to the Demo Conference API that you configured in previous sections:
Outbound section and add a new line by pressing the Enter key.
<set-header name="X-Powered-By" exists-action="delete" />
<set-header name="X-AspNet-Version" exists-actio
<find-and-replace from="://conferenceapi.azurewebsites.net" to="://
name>.azure-api.net/conference" />.
Need More Review? More About Policies
There are a lot of useful things you can do using policies, too many to cover in this section.
If you want to learn more about APIM policies, see the following articles:
Error handling in API Management https://docs.microsoft.com/en-us/azure/api-management/api-management-error-handling-policies
How to Set or Edit Azure API Management Policies https://docs.microsoft.com/en-us/azure/api-management/set-edit-policies
One of the main principles of code development is to reuse as much code as possible. To make it possible to reuse the code, you need to ensure that the code is as loosely coupled as possible, which reduces to a minimum the dependencies on other parts of the code or on other systems.
With this principle in mind, to make loosely coupled systems communicate, you need some kind of communication mechanism. Event-driven architectures allow communication between separate systems by sharing information through events.
In general, an event is a significant change of the system state that happens in the context of the system. An example of an event could be when a user adds an item to the shopping cart in an e-Commerce application or when an IoT device collects the information from its sensors.
Azure provides different services, such as Event Grid, Notification Hubs, or Event Hubs, to cover the different needs when implementing event-driven architectures.
This skill covers how to:
Azure Event Grid allows you to create applications using a serverless architecture by providing a reliable platform for managing events. You can use Azure Event Grid for connecting to several types of data sources, such as Azure Blob Storage, Azure subscriptions, Event Hubs, IoT Hubs, and others; Azure Event Grid also allows you to use different event handlers to manage these events. You can also create your custom events to integrate your application with Azure Event Grid. Before you can start using Azure Event Grid in your solution, there are some basic concepts that we should review:
Event This is a change of state in the source (for example, an event is raised when a new blob is added to an Azure Blob Storage account).
Event source This is the service or application in which an event occurs; there is an event source for every event type.
Event handler This is the app or service that reacts to the event.
Topics These are the endpoints where the event source can send the events. You can use topics for grouping several related events.
Event subscriptions When a new event is added to a topic, that event can be processed by one or more event handlers. The event subscription is an endpoint or built-in mechanism to distribute the events between the different event handlers. Also, you can use subscriptions to filter incoming events.
An important consideration that you need to bear in mind is that an event does not contain the full information about the change itself. The event only contains information relevant to the event, such as the source of the event, the time when the event took place, and a unique identifier. For example, when a new blob is added to an Azure Blob storage account, the new blob event doesn’t contain the blob. Instead, the event contains a reference to the blob in the Azure Blob storage account.
When you need to work with events, you configure an event source to send events to a topic. Any system or event handler that needs to process those events subscribes to that topic. When a new event is raised, the event source pushes the event to the topic configured in the Azure Event Grid service.
Any event handler subscribed to that topic reads the event and processes it according to its internal programming. There is no need for the event source to have event handlers subscribed to the topic; the event source pushes the event to the topic and forgets it. The following steps show how to create a custom topic. Then we will create console applications using C# to send events to the topic and process these events:
When the Azure Resource Manager finishes creating your new Event Grid Topic, you can subscribe to the topic to process the events. Also, you can send your custom events to this topic. Use the following steps to publish custom events to your newly created Event Grid Topic:
Microsoft.Extensions.Configuration.Json NuGet Package.
Listing 6-9 appsettings.json file
{
"EventGridAccessKey": "<Your_EventGridTopic_Access_Key>",
"EventGridTopicEndpoint": "https://<Your_EventGrid_Topic>.<region_name>-1.eventgrid.
azure.net/api/events"
}
Listing 6-10 NewItemCreatedEvent.cs
// C# .NET
using Newtonsoft.Json;
namespace <your_project_name>
{
class NewItemCreatedEvent
{
[JsonProperty(PropertyName = "name")]
public string itemName;
}
}
using Microsoft.Azure.EventGrid;
using Microsoft.Azure.EventGrid.Models;
using Microsoft.Extensions.Configuration;
using System.Collections.Generic;
Listing 6-11 Program.cs Main method
// C# .NET
IConfigurationBuilder builder = new ConfigurationBuilder().AddJsonFile("appsettings.json");
IConfigurationRoot configuration = builder.Build();
string topicEndpoint = configuration["EventGridTopicEndpoint"];
string apiKey = configuration["EventGridAccessKey"];
string topicHostname = new Uri(topicEndpoint).Host;
TopicCredentials topicCredentials = new TopicCredentials(apiKey);
EventGridClient client = new EventGridClient(topicCredentials);
List<EventGridEvent> events = new List<EventGridEvent>();
events.Add(new EventGridEvent()
{
Id = Guid.NewGuid().ToString(),
EventType = "MyCompany.Items.NewItemCreated",
Data = new NewItemCreatedEvent()
{
itemName = "Item 1"
},
EventTime = DateTime.Now,
Subject = "Store A",
DataVersion = "3.7"
});
client.PublishEventsAsync(topicHostname, events).GetAwaiter().GetResult();
Console.WriteLine("Events published to the Event Grid Topic");
Console.ReadLine();
At this point, your console application publishes events to the Event Grid topic that you previously created. Press F5 to run your console application to ensure that everything compiles and works correctly; you will not be able to see the published message. Use the following steps to create a subscriber Azure Function that connects to the Event Grid Topic and processes these events:
Listing 6-12 NewItemCreatedEvent.cs
// C# .NET
using Newtonsoft.Json;
namespace <your_project_name>
{
class NewItemCreatedEvent
{
[JsonProperty(PropertyName = "name")]
public string itemName;
}
}
Listing 6-13 Function1.cs
// C# .NET
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Microsoft.Azure.EventGrid;
using Microsoft.Azure.EventGrid.Models;

namespace <your_project_name>
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
            HttpRequestMessage req,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger handling EventGrid Events.");
            string response = string.Empty;

            // This event type must match the EventType sent by the publisher console application.
            const string CustomTopicEvent = "MyCompany.Items.NewItemCreated";

            string requestContent = await req.Content.ReadAsStringAsync();
            log.LogInformation($"Received events: {requestContent}");

            EventGridSubscriber eventGridSubscriber = new EventGridSubscriber();
            eventGridSubscriber.AddOrUpdateCustomEventMapping(CustomTopicEvent, typeof(NewItemCreatedEvent));

            EventGridEvent[] eventGridEvents = eventGridSubscriber.DeserializeEventGridEvents(requestContent);

            foreach (EventGridEvent eventGridEvent in eventGridEvents)
            {
                if (eventGridEvent.Data is SubscriptionValidationEventData)
                {
                    var eventData = (SubscriptionValidationEventData)eventGridEvent.Data;
                    log.LogInformation($"Got SubscriptionValidation event data, validationCode: {eventData.ValidationCode}, validationUrl: {eventData.ValidationUrl}, topic: {eventGridEvent.Topic}");

                    // Do any additional validation (as required) such as validating
                    // that the Azure resource ID of the topic matches
                    // the expected topic and then return back the below response
                    var responseData = new SubscriptionValidationResponse()
                    {
                        ValidationResponse = eventData.ValidationCode
                    };
                    return req.CreateResponse(HttpStatusCode.OK, responseData);
                }
                else if (eventGridEvent.Data is StorageBlobCreatedEventData)
                {
                    var eventData = (StorageBlobCreatedEventData)eventGridEvent.Data;
                    log.LogInformation($"Got BlobCreated event data, blob URI {eventData.Url}");
                }
                else if (eventGridEvent.Data is NewItemCreatedEvent)
                {
                    var eventData = (NewItemCreatedEvent)eventGridEvent.Data;
                    log.LogInformation($"Got NewItemCreated event data, item SKU {eventData.itemName}");
                }
            }

            return req.CreateResponse(HttpStatusCode.OK, response);
        }
    }
}
https://<your_azure_function_plan>.azurewebsites.net/api/<your_azure_function_name>
At this point, you should be able to publish and process events using the Event Grid Topic that you created previously. Use the following steps to ensure that everything works correctly:
Note: Azure Function Monitoring
You need to have Application Insights integration enabled to be able to see the log messages generated by the Azure Function. Review the article about how to monitor Azure Functions using Application Insights at https://docs.microsoft.com/en-us/azure/azure-functions/functions-monitoring.
Need More Review? Dead Letter and Retry Policies
When you work with event-driven architectures, there can be situations when the event cannot be delivered to the event handler. In those situations, it’s appropriate to set a retry strategy to try to recover the event before it expires. You can learn more about these retry policies and dead letter management at https://docs.microsoft.com/en-us/azure/event-grid/manage-event-delivery.
Implement solutions that use Azure Notification Hubs
Developing applications that can be accessed using mobile devices can be challenging because you usually need to allow access to your application from different mobile platforms. The challenge becomes even bigger because the different mobile platforms use different notification systems to send events. You need to deal with the Apple Push Notification Service (APNS), the Google Firebase Cloud Messaging (FCM), or the Windows Notification Service (WNS)—and these are just the main mobile platforms on the market.
The Azure Notification Hubs provide an abstraction layer that you can use for connecting to different push notification mobile platforms. Thanks to this abstraction, you can send the notification message to the Notification Hub, which manages the message and delivers it to the appropriate platform. You can also define and use cross-platform templates. Using these templates, you ensure that your solution sends consistent messages independently of the mobile platform that you are using.
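As a minimal sketch of how a back-end service could use this abstraction, the following console snippet sends a cross-platform template notification. It assumes the Microsoft.Azure.NotificationHubs NuGet package, a Notification Hub whose registered devices use templates that contain a messageParam placeholder, and placeholder values for the connection string and hub name; none of these come from the text.
// C# .NET
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.NotificationHubs;
namespace <your_project_name>
{
    class Program
    {
        static async Task Main(string[] args)
        {
            // Connect to the Notification Hub; the hub handles the platform-specific delivery.
            NotificationHubClient hub = NotificationHubClient.CreateClientFromConnectionString(
                "<your_notification_hub_connection_string>", "<your_notification_hub_name>");

            // The same property bag is rendered by each platform-specific template, so iOS,
            // Android, and Windows devices all receive a consistent message.
            var properties = new Dictionary<string, string>
            {
                { "messageParam", "A new item has been created in Store A" }
            };

            await hub.SendTemplateNotificationAsync(properties);
            Console.WriteLine("Template notification sent to the Notification Hub.");
        }
    }
}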
In Skill 2.2 (see “Add push notifications for mobile apps” in Chapter 2) we discussed how to create an Azure Mobile App that integrates with the Azure Notification Hub service. Based on the example that we reviewed in that section, we can extend the architecture of an enterprise solution.
When you need to add push notification support to your solution, you should think of the notification hub as a part of a bigger architecture. An example of this could be a solution that needs to connect your line-of-business applications with a mobile application. In such a scenario, a possible architecture could be to use Event Grid topics. The line-of-business applications would be the publishers of events to the appropriate topic, and then you could deploy one or more Azure Mobile Apps that are subscribed to these topics. When one of the line-of-business applications publishes an event in the Event Grid topic, your Azure Mobile App, which is acting as an event handler, can process the event and send a notification to your mobile users by using the Azure Notification Hub. Figure 6-22 shows a schema of this architecture. As you can see in that figure, the key component of the architecture is the Event Grid service and the implementation of an event-driven architecture.
Need More Review? Sample Architecture Implementation
You can review a sample architecture implementation using Service Bus messages instead of Event Grid at https://docs.microsoft.com/en-us/azure/notification-hubs/notification-hubs-enterprise-push-notification-architecture.
Azure Event Grid is a great service for implementing event-driven solutions, but it is only one piece of a more complex pipeline. Although Event Grid is appropriate for working with event-driven, reactive programming, it is not the best solution when you need to ingest millions of events per second with low latency.
Azure Event Hub is a more suitable solution when you require a service that can receive and process millions of events per second and provide low-latency event processing. Azure Event Hub is the front door of a big data pipeline that processes millions of events. Once the Azure Event Hub receives the data, it can deliver the event to Azure Event Grid, store the information in an Azure Blob Storage account, or store the data in Azure Data Lake Storage.
When you work with event hubs, you send events to the hub. The entity that sends events to the event hub is known as an Event Publisher. An Event Publisher can send events to the event hub by using any of these protocols: AMQP 1.0, Kafka 1.0 (or later), or HTTPS.
You can publish events to the Event Hub by sending a single event or by grouping several events in a batch operation. Whether you publish a single event or a batch, you are limited to a maximum of 1 MB of data per publication. When Azure Event Hub stores an event, it distributes the events across partitions based on the partition key provided as part of the event data. Using this pattern, Azure Event Hub ensures that all events sharing the same partition key are delivered in order and to the same partition.
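The following minimal sketch shows how a publisher could supply a partition key when sending an event. It uses the same Microsoft.Azure.EventHubs package as the listings later in this section; the connection string, hub name, and the "store-a" key are placeholders, not values from the text.
// C# .NET
using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;
namespace <your_project_name>
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var connectionStringBuilder = new EventHubsConnectionStringBuilder(
                "<your_event_hub_namespace_connection_string>")
            {
                EntityPath = "<your_event_hub_name>"
            };

            EventHubClient client = EventHubClient.CreateFromConnectionString(connectionStringBuilder.ToString());

            // Events that share the same partition key ("store-a" here) are stored in the
            // same partition and are delivered in order.
            await client.SendAsync(new EventData(Encoding.UTF8.GetBytes("Order created")), "store-a");

            await client.CloseAsync();
        }
    }
}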
A partition stores events as they arrive, which means newer events are added to the end of the partition. You cannot delete events from a partition; instead, you need to wait for the event to expire, at which point it is removed from the partition. Because each partition is independent of the other partitions in the Event Hub, growth rates differ from partition to partition. You define the number of partitions that your Event Hub contains when you create the Event Hub. You can create between 2 and 32 partitions, although you can extend the limit of 32 by contacting the Azure Event Hub team. Bear in mind that once you create the Event Hub and set the number of partitions, you cannot change this number later. When planning the number of partitions to assign to the Event Hub, consider the maximum number of parallel downstream consumers that need to connect to the Event Hub.
You can connect event receiver applications to an Event Hub by using consumer groups. A consumer group is equivalent to a downstream consumer in a stream-processing architecture. Using consumer groups, you can have different event receivers or consumers accessing different views (state, position, or offset) of the partitions in the Event Hub. Event consumers connect to the Event Hub by using the AMQP protocol, which sends the event to the client as soon as new data is available.
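As a minimal sketch of this concept, the following snippet reads a batch of events from a single partition through the default consumer group; in a real solution you would typically create one consumer group per downstream reader. It uses the same Microsoft.Azure.EventHubs package as the later listings, and the connection string, hub name, and partition identifier are placeholders.
// C# .NET
using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;
namespace <your_project_name>
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var connectionStringBuilder = new EventHubsConnectionStringBuilder(
                "<your_event_hub_namespace_connection_string>")
            {
                EntityPath = "<your_event_hub_name>"
            };

            EventHubClient client = EventHubClient.CreateFromConnectionString(connectionStringBuilder.ToString());

            // Each consumer group keeps its own view (offset) of the partition. Here the
            // receiver starts reading partition "0" from the beginning of the stream.
            PartitionReceiver receiver = client.CreateReceiver(
                PartitionReceiver.DefaultConsumerGroupName, "0", EventPosition.FromStart());

            var events = await receiver.ReceiveAsync(10);
            if (events != null)
            {
                foreach (EventData eventData in events)
                {
                    string data = Encoding.UTF8.GetString(
                        eventData.Body.Array, eventData.Body.Offset, eventData.Body.Count);
                    Console.WriteLine($"Event received: {data}");
                }
            }

            await receiver.CloseAsync();
            await client.CloseAsync();
        }
    }
}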
The following procedure shows how to create an Azure Event Hub:
Once you have created your Event Hubs namespace and your hub, you can start sending and consuming events from the hub. Use the following procedure to create two console applications—one for sending events and another for receiving events:
Listing 6-14 Program.cs
// C# .NET
using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;

namespace <your_project_name>
{
    class Program
    {
        private static EventHubClient eventHubClient;
        private const string EventHubConnectionString = "<Your_event_hub_namespace_connection_string>";
        private const string EventHubName = "<your_event_hub_name>";
        private const int numMessagesToSend = 100;

        static async Task Main(string[] args)
        {
            var connectionStringBuilder = new EventHubsConnectionStringBuilder(EventHubConnectionString)
            {
                EntityPath = EventHubName
            };

            eventHubClient = EventHubClient.CreateFromConnectionString(connectionStringBuilder.ToString());

            for (var i = 0; i < numMessagesToSend; i++)
            {
                try
                {
                    var message = $"Message {i}";
                    Console.WriteLine($"Sending message: {message}");
                    await eventHubClient.SendAsync(new EventData(Encoding.UTF8.GetBytes(message)));
                }
                catch (Exception exception)
                {
                    Console.WriteLine($"{DateTime.Now} > Exception: {exception.Message}");
                }

                await Task.Delay(10);
            }

            Console.WriteLine($"{numMessagesToSend} messages sent.");
            await eventHubClient.CloseAsync();
            Console.WriteLine("Press ENTER to exit.");
            Console.ReadLine();
        }
    }
}
At this point, you can press F5 to run the console application. This console application sends 100 messages to the Event Hub that you configured in the EventHubName constant. In the next procedure, you will create another console application that implements an Event Processor Host. The Event Processor Host is an agent that helps you receive events from the Event Hub; it automatically manages the persistent checkpoints and parallel event reception. The Event Processor Host requires an Azure Storage Account to store the persistent checkpoints.
Note: Example Requirements
You need to create an Azure Blob Storage container to run this example. You can review how to create a blob container and how to get the access key by reading the following articles:
Create a container https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-portal#create-a-container
Get access keys https://docs.microsoft.com/en-us/azure/storage/common/storage-account-manage#access-keys
Follow these steps to create the console application that implements the Event Processor Host:
Add the Microsoft.Azure.EventHubs and Microsoft.Azure.EventHubs.Processor NuGet packages to the console project.
Listing 6-15 SimpleEventProcessor.cs
// C# .NET
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.EventHubs.Processor;
using System;
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;

namespace <your_project_name>
{
    public class SimpleEventProcessor : IEventProcessor
    {
        public Task CloseAsync(PartitionContext context, CloseReason reason)
        {
            Console.WriteLine($"Processor Shutting Down. Partition '{context.PartitionId}', Reason: '{reason}'.");
            return Task.CompletedTask;
        }

        public Task OpenAsync(PartitionContext context)
        {
            Console.WriteLine($"SimpleEventProcessor initialized. Partition: '{context.PartitionId}'");
            return Task.CompletedTask;
        }

        public Task ProcessErrorAsync(PartitionContext context, Exception error)
        {
            Console.WriteLine($"Error on Partition: {context.PartitionId}, Error: {error.Message}");
            return Task.CompletedTask;
        }

        public Task ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
        {
            foreach (var eventData in messages)
            {
                var data = Encoding.UTF8.GetString(eventData.Body.Array, eventData.Body.Offset, eventData.Body.Count);
                Console.WriteLine($"Message received. Partition: '{context.PartitionId}', Data: '{data}'");
            }

            return context.CheckpointAsync();
        }
    }
}
Listing 6-16 Program.cs
// C# .NET
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.EventHubs.Processor;
using System;
using System.Threading.Tasks;

namespace <your_project_name>
{
    class Program
    {
        private const string EventHubConnectionString = "<your_event_hub_namespace_connection_string>";
        private const string EventHubName = "<your_event_hub_name>";
        private const string StorageContainerName = "<your_container_name>";
        private const string StorageAccountName = "<your_storage_account_name>";
        private const string StorageAccountKey = "<your_storage_account_access_key>";

        private static readonly string StorageConnectionString =
            $"DefaultEndpointsProtocol=https;AccountName={StorageAccountName};AccountKey={StorageAccountKey}";

        static async Task Main(string[] args)
        {
            Console.WriteLine("Registering EventProcessor...");

            var eventProcessorHost = new EventProcessorHost(
                EventHubName,
                PartitionReceiver.DefaultConsumerGroupName,
                EventHubConnectionString,
                StorageConnectionString,
                StorageContainerName);

            // Registers the Event Processor Host and starts receiving messages
            await eventProcessorHost.RegisterEventProcessorAsync<SimpleEventProcessor>();

            Console.WriteLine("Receiving. Press ENTER to stop worker.");
            Console.ReadLine();

            // Disposes of the Event Processor Host
            await eventProcessorHost.UnregisterEventProcessorAsync();
        }
    }
}
Now, you can press F5 and run your console application. The console application registers itself as an Event Processor and starts waiting for unprocessed events in the Event Hub. Because the default retention period for events in the Event Hub is one day, you receive all the messages sent by the publisher console application in the previous example. If you run your Event Publisher console application without stopping the Event Processor console application, you can see the messages appear in the Event Processor console almost in real time as they are sent to the Event Hub. This simple example also shows how the Event Hub distributes the events across the different partitions.
Exam Tip
The Azure Event Hub is a service appropriate for processing huge amounts of events with low latency. You should consider the Event Hub as the starting point in an event processing pipeline. You can use the Event Hub as the event source of the Event Grid service.
Need More Review? Event Hubs Concepts
The Azure Event Hub service is designed to work with big data pipelines where you need to process millions of events per second. In those scenarios, making a bad decision when planning the deployment of an Event Hub can have a big effect on the performance. You can learn more about the Event Hub service by reading the article at https://docs.microsoft.com/en-in/azure/event-hubs/event-hubs-features.
In the previous skill, we reviewed how to use event-driven services, in which a publisher pushes a lightweight notification, or event, to the event management system and forgets about how the event is handled or whether it is even processed.
In this section, we are going to review how to develop message-based solutions using Azure services. In general terms, a message is raw data produced by a service with the goal of being stored or processed elsewhere. This means that the publisher of the message expects some other system or subscriber to process the message. Because of this expectation, the subscriber needs to notify the publisher about the status of the message.
This skill covers how to:
Implement solutions that use Azure Service Bus
Implement solutions that use Azure Queue Storage queues
Azure Service Bus is an enterprise-level integration message broker that allows different applications to communicate with each other in a reliable way. A message is raw data that an application sends asynchronously to the broker to be processed by another application connected to the broker. The message can contain JSON, XML, or text information.
There are some concepts that we need to review before starting to work with the Azure Service Bus:
Namespace This is a container for all messaging components. A single namespace can contain multiple queues and topics. You can use namespaces as application containers that associate a single solution to a single namespace. The different components of your solution connect to the topics and queues in the namespace.
Queue A queue is the container of messages. The queue stores the message until the receiving application retrieves and processes it. The message queue works on a FIFO (First-In, First-Out) basis. When a new message arrives at the queue, the Service Bus service assigns a timestamp to the message. Until it is retrieved and processed, the message is held in redundant storage. Queues are appropriate for point-to-point communication scenarios in which a single application needs to communicate with another single application (a minimal sketch of this scenario appears after this list).
Topic You use topics for sending and receiving messages. The difference between queues and topics is that topics can have several receiving applications; topics are used in publish/subscribe scenarios. A topic can have multiple subscriptions, and each subscription to a topic receives a copy of the message sent to the topic.
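The following minimal sketch illustrates the point-to-point queue scenario described above before we move on to the topic-based example. It assumes the Microsoft.Azure.ServiceBus NuGet package used later in this section and a queue named "orders" that already exists in the namespace; the connection string and queue name are placeholders, not values from the text.
// C# .NET
using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;
namespace <your_project_name>
{
    class Program
    {
        static async Task Main(string[] args)
        {
            var queueClient = new QueueClient("<your_service_bus_connection_string>", "orders");

            // The sender pushes a message to the queue and expects that a receiver will process it.
            await queueClient.SendAsync(new Message(Encoding.UTF8.GetBytes("New purchase order")));

            // The receiver processes the message and completes it so that it is removed from the queue.
            queueClient.RegisterMessageHandler(
                async (message, token) =>
                {
                    Console.WriteLine($"Received: {Encoding.UTF8.GetString(message.Body)}");
                    await queueClient.CompleteAsync(message.SystemProperties.LockToken);
                },
                new MessageHandlerOptions(exceptionArgs => Task.CompletedTask)
                {
                    MaxConcurrentCalls = 1,
                    AutoComplete = false
                });

            Console.ReadLine();
            await queueClient.CloseAsync();
        }
    }
}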
Use the following procedure to create an Azure Service Bus namespace; then you can create a topic in the namespace. We are going to use that topic to create two console applications to send and receive the messages from the topic:
Now you are going to create two console applications. One console application will publish messages to the Service Bus Topic; the other console application will subscribe to the Service Bus Topic, process the messages, and complete them. Use the following procedure to create the console application that publishes messages to the Service Bus Topic:
Listing 6-17 Program.cs
// C# .NET
using Microsoft.Azure.ServiceBus;
using System;
using System.Text;
using System.Threading.Tasks;

namespace <your_project_name>
{
    class Program
    {
        const string ServiceBusConnectionString = "<your_service_bus_connection_string>";
        const string TopicName = "<your_topic_name>";
        const int numberOfMessagesToSend = 100;
        static ITopicClient topicClient;

        static async Task Main(string[] args)
        {
            topicClient = new TopicClient(ServiceBusConnectionString, TopicName);

            Console.WriteLine("Press ENTER key to exit after sending all the messages.");
            Console.WriteLine();

            // Send messages.
            try
            {
                for (var i = 0; i < numberOfMessagesToSend; i++)
                {
                    // Create a new message to send to the topic.
                    string messageBody = $"Message {i} {DateTime.Now}";
                    var message = new Message(Encoding.UTF8.GetBytes(messageBody));

                    // Write the body of the message to the console.
                    Console.WriteLine($"Sending message: {messageBody}");

                    // Send the message to the topic.
                    await topicClient.SendAsync(message);
                }
            }
            catch (Exception exception)
            {
                Console.WriteLine($"{DateTime.Now} :: Exception: {exception.Message}");
            }

            Console.ReadKey();
            await topicClient.CloseAsync();
        }
    }
}
You can now press F5 and publish messages to the topic. Once you publish the messages, you should be able to see an increase in the Message Count column in the Overview blade of your Service Bus Topic. The next steps show how to create the second console application that subscribes to the topic and processes the messages in the topic:
Listing 6-18 Program.cs
// C# .NET
using Microsoft.Azure.ServiceBus;
using System;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace <your_project_name>
{
    class Program
    {
        const string ServiceBusConnectionString = "<your_service_bus_connection_string>";
        const string TopicName = "<your_topic_name>";
        const string SubscriptionName = "<your_subscription_name>";
        static ISubscriptionClient subscriptionClient;

        static void Main(string[] args)
        {
            subscriptionClient = new SubscriptionClient(ServiceBusConnectionString, TopicName, SubscriptionName);

            Console.WriteLine("Press ENTER key to exit after receiving all the messages.");

            // Configure the message handler options in terms of exception handling,
            // number of concurrent messages to deliver, etc.
            var messageHandlerOptions = new MessageHandlerOptions(ExceptionReceivedHandler)
            {
                // Maximum number of concurrent calls to the callback ProcessMessagesAsync(), set to 1 for simplicity.
                // Set it according to how many messages the application wants to process in parallel.
                MaxConcurrentCalls = 1,

                // Indicates whether the message pump should automatically complete the messages
                // after returning from user callback.
                // False below indicates the complete operation is handled by the user callback
                // as in ProcessMessagesAsync().
                AutoComplete = false
            };

            // Register the function that processes messages.
            subscriptionClient.RegisterMessageHandler(ProcessMessagesAsync, messageHandlerOptions);

            Console.ReadKey();
            subscriptionClient.CloseAsync();
        }

        static async Task ProcessMessagesAsync(Message message, CancellationToken token)
        {
            // Process the message.
            Console.WriteLine($"Received message: SequenceNumber:{message.SystemProperties.SequenceNumber} Body:{Encoding.UTF8.GetString(message.Body)}");

            // Complete the message so that it is not received again.
            // This can be done only if the subscriptionClient is created in
            // ReceiveMode.PeekLock mode (which is the default).
            await subscriptionClient.CompleteAsync(message.SystemProperties.LockToken);

            // Note: Use the cancellationToken passed as necessary to determine if the
            // subscriptionClient has already been closed.
            // If subscriptionClient has already been closed, you can choose to not call
            // CompleteAsync() or AbandonAsync() etc. to avoid unnecessary exceptions.
        }

        // Use this handler to examine the exceptions received on the message pump.
        static Task ExceptionReceivedHandler(ExceptionReceivedEventArgs exceptionReceivedEventArgs)
        {
            Console.WriteLine($"Message handler encountered an exception {exceptionReceivedEventArgs.Exception}.");
            var context = exceptionReceivedEventArgs.ExceptionReceivedContext;
            Console.WriteLine("Exception context for troubleshooting:");
            Console.WriteLine($"- Endpoint: {context.Endpoint}");
            Console.WriteLine($"- Entity Path: {context.EntityPath}");
            Console.WriteLine($"- Executing Action: {context.Action}");
            return Task.CompletedTask;
        }
    }
}
You can now press F5 and run the console application. As the console application processes the messages in the topic, you can see that the count of the messages in the subscription is decreasing.
Need More Review? Service Bus Advanced Features
You can learn more about Service Bus in the following articles:
Queues, Topics, and Subscriptions https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-queues-topics-subscriptions
Service Bus Performance Improvements https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-performance-improvements
Topic Filters and Actions https://docs.microsoft.com/en-us/azure/service-bus-messaging/topic-filters
Azure Queue Storage is the first service that Microsoft released for managing message queues. Although Azure Service Bus and Azure Queue Storage share some features, such as providing message queue services, Azure Queue Storage is more appropriate when your application needs to store more than 80 GB of messages in a queue. Also, although the queues in the service work on a FIFO (First-In, First-Out) basis, the order of the messages is not guaranteed.
Note: Azure Queue Storage vs. Azure Service Bus
You can review a complete list of differences between these two queuing services at https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-azure-and-service-bus-queues-compared-contrasted.
The maximum size of a single message that you can send to an Azure Queue is 64 KB, although the total size of the queue can grow larger than 80 GB. You can access an Azure Queue only by using the REST API or the .NET Azure Storage SDK. Here are the steps to create an Azure Queue Storage account and a queue for sending and receiving messages:
At this point, you can create queues in your Azure Storage account by using the Azure Portal. You can also add messages to the queue using the Azure Portal. This approach is useful for development or testing purposes, but it is not suitable for applications. Use the following steps to create a console application that creates a new queue in your Azure Storage account.
The application also sends and reads messages from the queue:
Add the Microsoft.Azure.Storage.Common and Microsoft.Azure.Storage.Queue NuGet packages to the console project.
Listing 6-19 Program.cs
// C# .NET
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Queue;
using System;
using System.Collections.Generic;
using System.Linq;

namespace <your_project_name>
{
    class Program
    {
        private const string connectionString = "<your_storage_account_connection_string>";
        private const string queueName = "az203queue";
        private const int maxNumOfMessages = 10;

        static void Main(string[] args)
        {
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
            CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

            //Get a reference to the queue.
            CloudQueue queue = queueClient.GetQueueReference(queueName);

            //Create the queue if it doesn't exist already
            queue.CreateIfNotExists();

            //Sending messages to the queue.
            for (int i = 0; i < maxNumOfMessages; i++)
            {
                CloudQueueMessage message = new CloudQueueMessage($"Message {i} {DateTime.Now}");
                queue.AddMessage(message);
            }

            //Getting the length of the queue
            queue.FetchAttributes();
            int? cachedMessageCount = queue.ApproximateMessageCount;

            //Reading messages from the queue without removing them
            Console.WriteLine("Reading message from the queue without removing them from the queue");
            List<CloudQueueMessage> peekedMessages = (queue.PeekMessages((int)cachedMessageCount)).ToList();
            foreach (CloudQueueMessage peekedMessage in peekedMessages)
            {
                Console.WriteLine($"Message read from the queue: {peekedMessage.AsString}");

                //Getting the length of the queue
                queue.FetchAttributes();
                int? queueLength = queue.ApproximateMessageCount;
                Console.WriteLine($"Current length of the queue {queueLength}");
            }

            //Reading messages and removing them from the queue
            Console.WriteLine("Reading message from the queue removing");
            List<CloudQueueMessage> messages = (queue.GetMessages((int)cachedMessageCount)).ToList();
            foreach (CloudQueueMessage message in messages)
            {
                Console.WriteLine($"Message read from the queue: {message.AsString}");

                //You need to process the message in less than 30 seconds.
                queue.DeleteMessage(message);

                //Getting the length of the queue
                queue.FetchAttributes();
                int? queueLength = queue.ApproximateMessageCount;
                Console.WriteLine($"Current length of the queue {queueLength}");
            }
        }
    }
}
Press F5 to execute the console application that sends and reads messages from the queue. You can see how the messages are added to the queue by using the Azure Portal and navigating to your Azure Storage account and choosing Queues > az203queue. You will see a queue similar to the one shown in Figure 6-25.
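By default, a message read with GetMessage or GetMessages stays invisible to other consumers for 30 seconds; if your processing may take longer, you can request a longer visibility timeout when you read the message. The following minimal sketch assumes the same queue and NuGet packages as Listing 6-19; the connection string is a placeholder, and the five-minute value is only an example.
// C# .NET
using System;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Queue;
namespace <your_project_name>
{
    class Program
    {
        static void Main(string[] args)
        {
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<your_storage_account_connection_string>");
            CloudQueue queue = storageAccount.CreateCloudQueueClient().GetQueueReference("az203queue");

            // Request a five-minute visibility timeout so the message does not reappear in the
            // queue after the default 30 seconds while it is still being processed.
            CloudQueueMessage message = queue.GetMessage(TimeSpan.FromMinutes(5));
            if (message != null)
            {
                Console.WriteLine($"Processing: {message.AsString}");
                queue.DeleteMessage(message);
            }
        }
    }
}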
Need More Review? Publish/Subscribe Pattern
Although the Azure Queue Storage service doesn’t provide the ability to create subscriptions to the queues, you can easily implement the publish-subscribe pattern for communicating applications using the Azure Queue Storage. You can learn how to implement this pattern by reviewing the article at https://docs.microsoft.com/en-us/learn/modules/communicate-between-apps-with-azure-queue-storage/.
Azure App Service Logic Apps allows you to interconnect different services without needing to create specific code for the interconnection.
Logic App Workflows define the steps needed to exchange information between applications.
Microsoft provides connectors for sending and receiving information to and from different services.
Triggers are events fired on the source systems.
Actions are each of the steps performed in a workflow.
Azure Logic Apps provides a graphical editor that eases the process of creating workflows.
You can create custom connectors to connect your application with Azure Logic Apps.
A Custom Connector is a wrapper for a REST or SOAP API.
You can create custom connectors for Azure Logic Apps, Microsoft Flow, and Microsoft PowerApps.
You cannot reuse custom connectors created for Microsoft Flow or Microsoft PowerApps with Azure Logic Apps.
You can export your Logic Apps as Azure Resource Manager templates.
You can edit and modify the Logic Apps templates in Visual Studio.
Azure Search service is built on top of the Apache Lucene search engine.
An Azure Search index contains the information you need to add to the search engine.
Azure Search indexes are composed of documents.
Azure Search indexes and documents are conceptually equivalent to tables and rows in a database.
When defining an Azure Search index, you should use the Azure Portal for the initial definition.
You cannot edit or change the definition of an Azure Search index by using the Azure Portal.
You cannot upload data to a search index by using the Azure Portal.
You should use code to edit or modify a search index.
You need to use code to upload data to a search index.
There are two methods to import data to a search index—push and pull.
The push method uploads the actual data, in JSON format, to the search index.
The pull method connects the search index to a supported data source and automatically imports the data.
The push method is less restrictive than the pull method.
The push method has lower latency when performing search operations.
The attributes configured in the field definition affect the physical storage of the index.
The attributes configured in the field definition affect the search queries that you can perform on your search index.
The API Management service allows you to publish your back-end REST or SOAP APIs using a common and secure front end.
You need to create subscriptions in the APIM service to authenticate access to the API.
You need to create a Product to publish a back-end API.
You can publish only some operations of your back-end APIs.
APIM Policies allow you to modify the behavior of the APIM gateway.
An event is a change in the state of an entity.
In an event-driven architecture, the publisher doesn’t have the expectation that the event is processed or stored by a subscriber.
Azure Event Grid is a service for implementing event-driven architectures.
An Event Grid Topic is an endpoint to which a publisher service can send events.
Subscribers are services that read events from an Event Grid Topic.
You can configure several types of services as event sources or event subscribers in Azure Event Grid.
You can create custom events to send them to the Event Grid.
You can subscribe to your custom application with an Event Grid Topic by using webhooks.
The Azure Notification Hub is a service that unifies the push notifications on mobile platforms.
You can connect the push notifications services from the different manufacturers to the Azure Notification Hub.
The Azure Event Hub is the entry point for big data event pipelines.
Azure Event Hub is specialized to ingest millions of events per second with low latency.
You can use Azure Event Hub as an event source for the Event Grid service.
You can use AMQP, Kafka, and HTTPS for connecting to Azure Event Hub.
In a message-driven architecture, the publisher application has the expectation that the message is processed or stored by the subscriber.
The subscriber needs to change the state once the message is processed.
A message is raw data sent by a publisher that needs to be processed by a subscriber.
Azure Service Bus and Azure Queue are message broker services.
In this thought experiment, you will demonstrate your skills and knowledge of the topics covered in this chapter. You can find answers to this thought experiment in the next section.
Your organization has several Line-Of-Business (LOB) applications deployed on Azure and on-premises environments. Some of the information managed by these LOB applications overlaps across more than one application. All your LOB applications allow you to use SOAP or REST APIs for connecting to the application.
Your organization needs to implement business processes that require sharing information between the LOB applications. Answer the following questions about connecting Azure services and third-party applications:
This section contains the solutions to the thought experiment. Each answer explains why the answer choice is correct.