Maybe this won’t turn into several blog posts, but I’ve quickly learned that this is like starting from scratch on infrastructure: you may think you know how something works, but this is cloud, so we do it a little bit differently. Which is fine, but you know, definitional stuff and concepts. For example, this first one. Clearly, it’s always encrypted, but what are we calling that?

Anyway, I don’t know what the rest of this shit is, so let’s get to lookin’! Normally I would have a good system for designing this, but since we are on this new block editor, we will see how this goes (you know, the old bulleted list with hyperlinks and a short blurb to the side).
- Advanced data security for Azure SQL Database – Advanced data security is a unified package for advanced SQL security capabilities. It includes functionality for discovering and classifying sensitive data, surfacing and mitigating potential database vulnerabilities, and detecting anomalous activities that could indicate a threat to your database. It provides a single go-to location for enabling and managing these capabilities. — really a product that does assessment and remediation
- Always Encrypted – Always Encrypted is a feature designed to protect sensitive data, such as credit card numbers or national identification numbers (for example, U.S. social security numbers), stored in Azure SQL Database or SQL Server databases. Always Encrypted allows clients to encrypt sensitive data inside client applications and never reveal the encryption keys to the Database Engine (SQL Database or SQL Server). As a result, Always Encrypted provides a separation between those who own the data and can view it, and those who manage the data but should have no access. By ensuring on-premises database administrators, cloud database operators, or other high-privileged unauthorized users, can’t access the encrypted data, Always Encrypted enables customers to confidently store sensitive data outside of their direct control. This allows organizations to store their data in Azure, and enable delegation of on-premises database administration to third parties, or to reduce security clearance requirements for their own DBA staff.
- Elastic pools – SQL Database elastic pools are a simple, cost-effective solution for managing and scaling multiple databases that have varying and unpredictable usage demands. The databases in an elastic pool are on a single Azure SQL Database server and share a set number of resources at a set price. Elastic pools in Azure SQL Database enable SaaS developers to optimize the price performance for a group of databases within a prescribed budget while delivering performance elasticity for each database.
- Transparent data encryption for SQL Database and Azure Synapse – Transparent data encryption (TDE) helps protect Azure SQL Database, Azure SQL Managed Instance, and Synapse SQL in Azure Synapse Analytics against the threat of malicious offline activity by encrypting data at rest. It performs real-time encryption and decryption of the database, associated backups, and transaction log files at rest without requiring changes to the application. By default, TDE is enabled for all newly deployed Azure SQL databases and needs to be manually enabled for older databases of Azure SQL Database, Azure SQL Managed Instance, or Azure Synapse. — I find this confusing since it seems to indicate this only covers data in an ‘at rest’ state, but OK (there’s a quick way to check the TDE state right after this list)
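Since TDE is supposed to be on by default for new Azure SQL databases, the easiest way I can see to confirm it is to query the sys.dm_database_encryption_keys DMV from the database itself. A minimal sketch with pyodbc; the server name, database name, and credentials are placeholders:

```python
# Quick sanity check of the TDE state described above. Placeholder server/database/
# credentials; assumes the Microsoft ODBC Driver 17 for SQL Server is installed.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;Uid=myadmin;Pwd=<password>;Encrypt=yes;"
)

# encryption_state 3 means the database (data files, at rest) is encrypted.
for name, state in conn.execute(
    "SELECT DB_NAME(database_id), encryption_state FROM sys.dm_database_encryption_keys"
):
    print(name, state)
```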

So I took some time off after writing one question and basically remodeled a bathroom for some reason. It started with cleaning the grout. Which turned into regrouting because grout is cheap, which turned into painting the walls, and naturally you have to replace all the hardware and paint the cabinets while applying a distressed finish covered with a shiny lacquer. Anyway, the current episode of ‘This Old House’ is finished. Thankfully, I didn’t spend too much money on it, but it does look great! Anyway, let’s, uh, get back to work. Sound good? Great. So for the first part of this, after learning that Always Encrypted is for NPPI data, which seems obvious, the next few things I have no idea what they are, so let’s get into that:
- Encrypt a Column of Data – One would think, based on the name, that this would apply to this scenario; however, it does not. This is a normal encryption scenario where you generate keys to decrypt the data. Anyway, what we are looking for is instructions for setting up Always Encrypted, which apparently is here: Query columns using Always Encrypted with SQL Server Management Studio. And now we have a bullet with a quote, bear with me here: “To enable Always Encrypted, type Column Encryption Setting = Enabled. To disable Always Encrypted, specify Column Encryption Setting = Disabled or remove the setting of Column Encryption Setting from the Additional Properties tab (its default value is Disabled).” So that has to be enabled for Always Encrypted to work (see the connection-string sketch after this list).
- Public Database Role – Every database user belongs to the public database role. When a user has not been granted or denied specific permissions on a securable object, the user inherits the permissions granted to public on that object. Database users cannot be removed from the public role. – basically it’s kind of extra and doesn’t matter
- The encryption keys bit, per the second article linked in the first bullet “Always Encrypted allows clients to encrypt sensitive data inside client applications and never reveal the encryption keys to the Database Engine”
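So the moving part on the client side is really just that one connection-string setting. Here’s a minimal sketch of what that looks like from an app using pyodbc; with the Microsoft ODBC driver the keyword is ColumnEncryption rather than the SSMS wording, and the server, database, table, and credentials below are made up. The app also still needs access to the column master key (cert store or Key Vault) for any decryption to actually happen.

```python
# Minimal sketch: enabling Always Encrypted from a client app. All names and
# credentials are placeholders; assumes ODBC Driver 17 for SQL Server and that this
# client can reach the column master key (e.g., Windows cert store or Azure Key Vault).
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;Uid=myadmin;Pwd=<password>;Encrypt=yes;"
    "ColumnEncryption=Enabled;"  # the client-side switch this whole bullet is about
)

# With the setting on, the driver encrypts the parameter before it reaches the
# Database Engine and decrypts the SSN column on the way back; the engine itself
# never sees the keys or the plaintext.
cur = conn.cursor()
cur.execute("SELECT FirstName, SSN FROM dbo.Patients WHERE SSN = ?", "795-73-9838")
print(cur.fetchall())
```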
So here we are, realizing that this Azure cert is application dev, SQL, Windows Server, networking, virtualization, and containers. Damn, we are really getting into some shit now, boys. And girls. Maybe it’s only one or the other that reads this. Who knows. Anyway, type 1 for Burt Reynolds and 2 for Jim Croce. If you didn’t pick it up, this is a mustache contest. Actually the photos are backwards because it’s clear that Croce is number 1, but type 2 for Croce. Unless you really think the guy running blocker for Coors and banging Sally Field is A-number-1. I don’t even think he was really bangin’ her, but I could be wrong.


Anyway,

You know, at first I found this confusing because it doesn’t state that the appliance router is functioning as a subnet gateway, and I don’t think it gave the IP of the appliance either, but that appears to be the situation: traffic hits the soft router that’s functioning as a gateway and then goes on from there. If you don’t know what an appliance is: Virtual Appliance
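For my own notes, the way I understand it is that the subnet gets a user-defined route whose next hop is the appliance’s private IP, and that’s what makes the appliance act as the gateway. A rough sketch using the azure-mgmt-network SDK; the resource names, the 0.0.0.0/0 prefix, and the 10.0.2.4 address are all made up for illustration:

```python
# Sketch of a user-defined route that sends a subnet's traffic through a network
# virtual appliance. Subscription, resource group, route table, prefix, and the
# appliance's private IP are all placeholder values.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.routes.begin_create_or_update(
    "my-rg",
    "my-route-table",        # the route table associated with the source subnet
    "through-appliance",
    {
        "address_prefix": "0.0.0.0/0",        # send everything...
        "next_hop_type": "VirtualAppliance",  # ...to the appliance...
        "next_hop_ip_address": "10.0.2.4",    # ...at its private IP
    },
).result()
```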
Anyway, here’s Jim Croce

Autoscale is obvious; managed disks offer a lot of benefits, but I can’t find anything directly stating that you have to use them in this scenario. So I’m still a little confused. No idea what the extra cost is, but given how they work I would assume you have to use them with autoscale, though I don’t see any indicator of that. Anyway, here is some more info on ‘Managed Disks’
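For what it’s worth, “using managed disks” in a scale set mostly just means the storage profile references a managed disk type instead of pointing at storage-account VHD URIs. A purely illustrative fragment (placeholder values, and not a claim that autoscale strictly requires it):

```python
# Illustrative ARM-template-style fragment of a scale set's storage profile using
# managed disks; values are placeholders, not a recommendation.
scale_set_storage_profile = {
    "osDisk": {
        "createOption": "FromImage",
        "caching": "ReadWrite",
        "managedDisk": {"storageAccountType": "Standard_LRS"},  # platform manages the disk
    },
    "imageReference": {
        "publisher": "MicrosoftWindowsServer",
        "offer": "WindowsServer",
        "sku": "2019-Datacenter",
        "version": "latest",
    },
}
```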

This one has a doc associated, so I’m going to start there: Use Azure Import/Export service to import data to Azure Files
Well, this simply mentions that you need the two CSV files, and not really in specific language; it’s just one of the many recommendations, along with having a FedEx account for some reason? But you do have to have them. It doesn’t go into why or anything like that, but here is info on the creation of the two documents:
- Preparing hard drives for an Import Job – The WAImportExport tool is the drive preparation and repair tool that you can use with the Microsoft Azure Import/Export service. You can use this tool to copy data to the hard drives you are going to ship to an Azure datacenter. After an import job has completed, you can use this tool to repair any blobs that were corrupted, were missing, or conflicted with other blobs. After you receive the drives from a completed export job, you can use this tool to repair any files that were corrupted or missing on the drives. In this article, we go over the use of this tool.
What is dataset CSV
The dataset CSV file, the value of the /dataset flag, is a CSV file that contains a list of directories and/or a list of files to be copied to target drives. The first step to creating an import job is to determine which directories and files you are going to import. This can be a list of directories, a list of unique files, or a combination of those two. When a directory is included, all files in the directory and its subdirectories will be part of the import job.
For each directory or file to be imported, you must identify a destination virtual directory or blob in the Azure Blob service. You will use these targets as inputs to the WAImportExport tool. Directories should be delimited with the forward slash character “/”.
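To make that concrete, here’s roughly what a dataset CSV might look like, written out with a quick Python snippet. The column headers and values follow my reading of the blob-flavored example in the doc and are illustrative only, so check the current doc before preparing real drives:

```python
# Hypothetical dataset.csv mapping local paths to destinations in the storage account.
# Headers and values are illustrative, based on my reading of the WAImportExport doc.
from pathlib import Path

dataset_csv = (
    "BasePath,DstBlobPathOrPrefix,BlobType,Disposition,MetadataFile,PropertiesFile\n"
    '"F:\\exportdata\\photos\\","backup-container/photos/",BlockBlob,rename,"None",None\n'
    '"F:\\exportdata\\inventory.db","backup-container/inventory.db",BlockBlob,rename,"None",None\n'
)
Path("dataset.csv").write_text(dataset_csv)
```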
What is driveset CSV
The value of the /InitialDriveSet or /AdditionalDriveSet flag is a CSV file that contains the list of disks to which the drive letters are mapped so that the tool can correctly pick the list of disks to be prepared. If the data size is greater than a single disk size, the WAImportExport tool will distribute the data across multiple disks enlisted in this CSV file in an optimized way.
There is no limit on the number of disks the data can be written to in a single session. The tool will distribute data based on disk size and folder size. It will select the disk that is most optimized for the object size. When the data is uploaded to the storage account, it will be converged back to the directory structure that was specified in the dataset file. The doc then walks through the steps for creating a driveset CSV.
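And a similarly hypothetical driveset CSV, again just a sketch of the shape I gathered from the doc (the drive letters, format/encryption options, and the BitLocker key placeholder are all made up):

```python
# Hypothetical driveset.csv listing the local drives the tool may prepare.
# Headers and values are illustrative, based on my reading of the WAImportExport doc.
from pathlib import Path

driveset_csv = (
    "DriveLetter,FormatOption,SilentOrPromptOnFormat,Encryption,ExistingBitLockerKey\n"
    "G,Format,SilentMode,Encrypt,\n"
    "H,AlreadyFormatted,SilentMode,AlreadyEncrypted,<existing BitLocker key>\n"
)
Path("driveset.csv").write_text(driveset_csv)
```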
Anyway, back to the new format being more or less confusing, but I think I sort of understand this. Man, when I thought Server had a lot to learn, I was a little off. Not really, but I’m saying there is a ton of stuff to learn with this and it’s going to take a while to get familiar with. I’m not really sure how much more or different stuff they could put in the new exams, but I’ll be working on this until changes are made that indicate that new material is the way to go or the only option.