Now that I kind of know how to use this interface and have done one question, it's time to start into the next set and hopefully get 5-10 knocked out.
Well, I was on the right track here, but having no background in Azure it's kind of a shot in the dark, even being familiar with MSFT stuff. I mean, I'm sort of familiar with Azure, but given my shock at what it does in the last post, it's clear that I have a lot to learn. I would assume that you would add the rule and then enable auto-scaling, but that could go either way. However, it does seem you would want to add the rule and then say the rule auto-applies? I mean, OK, you enable the auto-scaling with no logic haha. Anyway, splitting hairs. I would also assume that the rule contained the condition but, also, wrong — it's the other way around, the condition contains the rules.
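To keep that ordering straight for myself, here's a rough sketch of how the portal nests things — just a plain Python dict in my own shorthand, not real ARM template syntax, and all the names and thresholds are made up:

```python
# Rough sketch of how Azure autoscale nests its pieces (my own shorthand,
# not actual ARM template syntax): the scale condition owns the rules.
autoscale_setting = {
    "name": "my-autoscale",  # hypothetical setting name
    "scale_condition": {
        "instance_limits": {"min": 1, "max": 3, "default": 1},
        "rules": [
            {
                # scale OUT when average CPU goes over 70% for 10 minutes
                "metric": "CpuPercentage",
                "operator": ">",
                "threshold": 70,
                "window_minutes": 10,
                "action": "increase_count_by_1",
            },
            {
                # scale back IN when CPU drops under 30%
                "metric": "CpuPercentage",
                "operator": "<",
                "threshold": 30,
                "window_minutes": 10,
                "action": "decrease_count_by_1",
            },
        ],
    },
}

# The condition contains the rules, not the other way around.
print(len(autoscale_setting["scale_condition"]["rules"]))  # → 2
```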
When walking through the UI it makes sense, but it doesn't talk about Azure App Service tiers, so I should probably look at that. What's online isn't super clear, but it seems like it will scale out to an additional instance, so it's possible that weighing the price of pushing another instance vs. setting it to a higher tier would be an issue for cost optimization. Maybe there is a video on YouTube about this… well, there's a page with a video that I found helpful: How and When to Scale Up/Out Using Azure Analysis Services
Let’s start with when to scale up your queries. You need to scale up when your reports are slow, so you’re reporting out of Power BI and the throughput isn’t working for your needs. What you’re doing with scaling up is adding more resources. The QPU is a combination of your CPU, memory and other factors like the number of users.
Memory checks are straightforward. You run the metrics in the Azure portal and you can see what your memory usage is and whether your memory limit or hard memory limit settings are being saturated. If so, you need to either upgrade your tier or adjust the level within your current tier.
CPU bottlenecks are a bit tougher to figure out. You can get an idea by starting to watch your QPUs to see if you're saturating those, using those metrics and looking at the logs within the Azure portal. Then you want to watch your processing pool job queue length and your processing pool busy non-I/O threads. This should give you an idea of how it's performing.
For the most part, you’re going to want to scale up when the processing engine is taking too long to process the data to build your models.
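Just to make the checks above concrete for myself, here's a toy decision sketch. The parameter names are loosely based on the portal metrics mentioned above, but the thresholds and the function itself are entirely my own invention:

```python
def should_scale_up(memory_usage, memory_limit_hard,
                    qpu_usage, qpu_capacity, job_queue_length):
    """Toy version of the scale-up checks: saturated memory, nearly
    saturated QPUs, or a backed-up processing pool job queue all point
    at needing a bigger tier. Thresholds here are made up."""
    if memory_usage >= memory_limit_hard:        # memory is saturated
        return True
    if qpu_capacity and qpu_usage / qpu_capacity > 0.9:  # QPUs ~maxed out
        return True
    if job_queue_length > 0:                     # jobs waiting on threads
        return True
    return False

# Healthy server: nothing saturated
print(should_scale_up(20, 25, 40, 100, 0))   # → False
# Memory pinned at its hard limit
print(should_scale_up(25, 25, 40, 100, 0))   # → True
```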
Next up, scaling out. You'll want to scale out if you're having problems with reporting responsiveness because the reporting load is saturating what you currently have available. Typically, in cases with a large number of users, you can fix this by scaling out and adding more nodes.
You can add up to 7 additional query replicas; these are read-only replicas that you can report off of, but the processing is handled on the initial instance of Azure Analysis Services while queries are distributed across those query replicas. Hence, processing doesn't affect the responsiveness of the reports.
After you separate the model processing from the query engine, you can measure the performance by watching the Log Analytics and query processing unit metrics and see how they're performing. If you're still saturating those, you'll need to re-evaluate whether you need additional QPUs or to upgrade your tiers.
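If I'm reading the docs right, once replicas are enabled the plain server name load-balances queries across the replica pool, and you append `:rw` to the server name to target the primary (read-write) instance for processing and management. A tiny sketch, with a made-up server name:

```python
# Hypothetical Azure Analysis Services server name (region + name made up).
server = "asazure://westus.asazure.windows.net/myaasserver"

# Queries (Power BI etc.) hit the plain name and get spread across replicas.
query_endpoint = server

# Processing/management connections target the primary via the :rw qualifier.
processing_endpoint = server + ":rw"

print(processing_endpoint)  # → asazure://westus.asazure.windows.net/myaasserver:rw
```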
The thing about this is that it's not mentioning tiers beyond Standard, but it is saying up or out. Up being a 'better machine' and out being a replica machine at the same price point, as I understand it. Anyway, these are the current tiers:
Honestly, it's a fairly basic concept, but to calculate out cost you'll probably need some kind of Azure Pricing Calculator anyway. I liked this link too: Horizontal vs Vertical scaling – Azure Autoscaling … moving on
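Just to convince myself the up-vs-out arithmetic really is that basic, here's a back-of-the-napkin sketch. The hourly rates are completely made up — plug in real numbers from the Azure Pricing Calculator:

```python
# Made-up hourly rates for two hypothetical tiers — use the Azure
# Pricing Calculator for real numbers.
RATES = {"S1": 1.00, "S2": 2.00}

def monthly_cost(tier, instances=1, hours=730):
    """Replicas bill at the same rate as the primary, so scaling out
    to N instances is just N times the tier's hourly rate."""
    return RATES[tier] * instances * hours

scale_out = monthly_cost("S1", instances=2)  # 1 primary + 1 query replica
scale_up = monthly_cost("S2", instances=1)   # one bigger box instead

print(scale_out)  # → 1460.0
print(scale_up)   # → 1460.0
```

With these fake rates the two options tie exactly, which is kind of the point: whether up or out is cheaper depends entirely on the real tier prices, so you have to run the numbers.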
First of all, I have no idea what QnA Maker is, at all, but I did get this right — except the terms were backwards. Which means it wasn't right. Not sure if this was legible. Anyway, here is the base article and not the 'how to': QnA Maker
QnA Maker is a cloud-based API service that lets you create a conversational question-and-answer layer over your existing data. Use it to build a knowledge base by extracting questions and answers from your semi-structured content, including FAQs, manuals, and documents. Answer users’ questions with the best answers from the QnAs in your knowledge base—automatically. Your knowledge base gets smarter, too, as it continually learns from user behavior.
I mean, I could have figured that out based on the name, but let's find a docs article with, like, descriptors: Quickstart: Create, train, and publish your QnA Maker knowledge base. OK, so I was confused by this one based on the question, but it starts with this:
Create your first QnA Maker knowledge base
Sign in to the QnAMaker.ai portal with your Azure credentials.
In the QnA Maker portal, select Create a knowledge base.
On the Create page, skip Step 1 if you already have your QnA Maker resource. If you haven't created the resource yet, select Create a QnA service. You are directed to the Azure portal to set up a QnA Maker service in your subscription. Remember the Azure Active Directory ID, Subscription, and QnA resource name you selected when you created the resource. When you are done creating the resource in the Azure portal, return to the QnA Maker portal, refresh the browser page, and continue to Step 2.
There is an entire article about this that explains the QnA Maker management service: Manage QnA Maker resources
That one walks through using and creating things with it, and it's basically the engine that makes the API function. So basically anything having to do with the actual interaction uses that. The terms seem to be a little hazy, but that is what they're talking about.
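From what I can tell from the docs, the actual interaction goes through the runtime's generateAnswer endpoint on the App Service. Here's a sketch of what that request looks like — the host, knowledge base ID, and key are all placeholders, and nothing here actually gets sent:

```python
# Sketch of a QnA Maker runtime request (generateAnswer).
# Host, kb id, and key are hypothetical placeholders.
runtime_host = "https://my-qna-service.azurewebsites.net"
kb_id = "00000000-0000-0000-0000-000000000000"
endpoint_key = "<endpoint-key>"

url = f"{runtime_host}/qnamaker/knowledgebases/{kb_id}/generateAnswer"
headers = {
    "Authorization": f"EndpointKey {endpoint_key}",
    "Content-Type": "application/json",
}
body = {"question": "How do I reset my password?"}

# Actually sending it would be something like:
#   requests.post(url, headers=headers, json=body)
print(url.endswith("/generateAnswer"))  # → True
```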
Now, as for the runtime, this is a little confusing if you're thinking it's insights into how people interact with your data. One would assume that runtime would be associated with hardware performance, but again, terms can be confusing at times. This is also from the setup article:
The QnAMaker runtime is part of the Azure App Service instance that’s deployed when you create a QnAMaker service in the Azure portal. Updates are made periodically to the runtime. The QnA Maker App Service instance is in auto-update mode after the April 2019 site extension release (version 5+). This update is designed to take care of ZERO downtime during upgrades.
You can check your current version at https://www.qnamaker.ai/UserSettings. If your version is older than version 5.x, you must restart App Service to apply the latest updates:
So, you can sort of see info on how the process is running here, and I'm assuming that also has processor and RAM usage info? Unclear, but I'm sure this is part of a standard format that I'll figure out as we move along.
Now, if you want to understand how people are interacting with your data, this is the information you're looking for: Get analytics on your knowledge base
QnA Maker stores all chat logs and other telemetry, if you have enabled App Insights during the creation of your QnA Maker service. Run the sample queries to get your chat logs from App Insights.
So, these terms are not backwards at all and I’m not really sure what the hell I was thinking when I answered it the way that I did but now I have a new level of clarity on all sorts of things. Moving on…
Ok so… this is a whole-ass set of stuff to get into. The answer seemed obvious to me, but what does the Azure AD Connect wizard do? The last time I looked into Azure and on-prem connectivity was 2012, and I'm assuming a lot has changed, so let's start there…. holy shit… OK: What is hybrid identity with Azure Active Directory?
Alright… let's start with What is Azure AD Connect?, and I'll have the cheesecake, yes, the entire thing, thanks. haha… anyway.
Azure AD Connect is the Microsoft tool designed to meet and accomplish your hybrid identity goals. It provides the following features:
Password hash synchronization – A sign-in method that synchronizes a hash of a user's on-premises AD password with Azure AD.
Pass-through authentication – A sign-in method that allows users to use the same password on-premises and in the cloud, but doesn’t require the additional infrastructure of a federated environment.
Federation integration – Federation is an optional part of Azure AD Connect and can be used to configure a hybrid environment using an on-premises AD FS infrastructure. It also provides AD FS management capabilities such as certificate renewal and additional AD FS server deployments.
Synchronization – Responsible for creating users, groups, and other objects, as well as making sure identity information for your on-premises users and groups matches the cloud. This synchronization also includes password hashes.
Health Monitoring – Azure AD Connect Health can provide robust monitoring and a central location in the Azure portal to view this activity.
So it seems like using a wizard is bad, but you know, we've been through the server thing before and it's a great idea to know, like, most things: Azure AD Connect sync: Understand and customize synchronization
The Azure Active Directory Connect synchronization services (Azure AD Connect sync) is a main component of Azure AD Connect. It takes care of all the operations related to synchronizing identity data between your on-premises environment and Azure AD. Azure AD Connect sync is the successor of DirSync, Azure AD Sync, and Forefront Identity Manager with the Azure Active Directory Connector configured.
This is a huge can of worms, haha. Wow, this is exciting. Anyway, so here's this: Introduction to the Azure AD Connect Synchronization Service Manager UI
The Synchronization Service Manager UI is used to configure more advanced aspects of the sync engine and to see the operational aspects of the service.
You start the Synchronization Service Manager UI from the start menu. It is named Synchronization Service and can be found in the Azure AD Connect group.
Again, this is as far into this as I'm going, but yeah, I love this crap haha… time to move on
I don't know what any of these things are, obviously, so let's define them:
- Azure Service Bus – Microsoft Azure Service Bus is a fully managed enterprise integration message broker. Service Bus can decouple applications and services. Service Bus offers a reliable and secure platform for asynchronous transfer of data and state. Data is transferred between different applications and services using messages. A message is in binary format and can contain JSON, XML, or just text. For more information, see Integration Services.
- Azure Relay – The Azure Relay service enables you to securely expose services that run in your corporate network to the public cloud. You can do so without opening a port on your firewall, or making intrusive changes to your corporate network infrastructure.
- Azure Event Grid – Azure Event Grid allows you to easily build applications with event-based architectures. First, select the Azure resource you would like to subscribe to, and then give the event handler or WebHook endpoint to send the event to. Event Grid has built-in support for events coming from Azure services, like storage blobs and resource groups. Event Grid also has support for your own events, using custom topics.
- Azure Event Hub – Azure Event Hubs is a big data streaming platform and event ingestion service. It can receive and process millions of events per second. Data sent to an event hub can be transformed and stored by using any real-time analytics provider or batching/storage adapters.
I'm now understanding them, and the 'Restaurant Telemetry' one seems like it would be Event Hub. The 'Inventory' answer makes sense, but to be honest I'm not sure about the first one, 'Shopping Cart'; however, I think it's also Service Bus.
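My rough mental model of the distinctions above, as a little chooser sketch — the scenario names come from the practice question, and the mapping is my own best guess, not an official answer key:

```python
# My rough mental model for picking between the four services above.
# Scenario names are from the practice question; the mapping is my guess.
def pick_service(scenario):
    guide = {
        # ordered, transactional messaging between application components
        "shopping cart": "Service Bus",
        "inventory": "Service Bus",
        # massive firehose of telemetry events to ingest
        "restaurant telemetry": "Event Hubs",
        # reacting to discrete events raised by Azure resources
        "blob created notification": "Event Grid",
        # exposing an on-prem service without opening firewall ports
        "on-prem api": "Relay",
    }
    return guide[scenario.lower()]

print(pick_service("Restaurant Telemetry"))  # → Event Hubs
print(pick_service("Shopping Cart"))         # → Service Bus
```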
Anyway, that's about all for now, and I've accomplished all the things I wanted to do yesterday, this morning. So that's pretty cool. I am so fucking excited about Azure! I'm happy I got the Sec+ and Net+, but this type of stuff is so much fun to learn!