Azure, Pt 7! Now with more Dev blogs! and LOASP

So I’ve kind of been worrying lately about educating myself out of a job, because recruiters keep calling to ask if I know something about a thing I hold a certification for, and when I explain both the hands-on experience and the education, they’re kind of shocked to learn that I actually know what I’m talking about and can point out examples of using it. Not all of them, but a fair number. Anyway, I like to learn and am now fairly proficient at picking up new tech concepts. I was working for a company doing password resets for user logins, mostly, while holding an MCSA on Server 2012, so being ‘under-employed’ for my education level is less stressful than being unemployed. I’m kind of trying to take it slow while realizing that I’m easily bored and like to have things to do. Also my desk is great, so I love sitting here and studying. That said, I ordered an art history book and am haggling with a guy on eBay over the price of a 3-volume set about the crusades to occupy my time now that I’ve completely rebuilt my living space haha. It’s much more expensive on Amazon, but this is what I was talking about: History of The Crusades, 3 Volume Set: The First Crusade, The Kingdom of Jerusalem, The Kingdom of… by Steven Runciman. I was kind of thinking about starting another blog to write about thoughts from those two books, so I don’t have to worry so much about being completely overqualified for everything and having to speak with people who don’t believe me and have little motivation to help me find appropriate employment. I’m not saying that every person I speak with has this attitude, but it’s kind of a running joke. Besides, I love art and history, and they can be very good subjects to know a lot about for understanding certain situations. This provides a path for education (I plan to keep learning as long as I’m breathing) and possibly won’t educate me out of a job. Don’t know though. Depends on my mood. 
Currently I’m thinking I’m enjoying learning Azure and might just flip-flop between the two subjects. Three subjects, sorry. Anyway, let’s get into Azure.

There are a few issues here. First of all, it says 3 actions and you can only pick one. Second, I’m assuming you have to have a gateway subnet before you can actually create the VPN gateway? Assuming they simply mean the subnet that’s assigned to that gateway. Anyway, let’s take a look to verify this with the linked article: Step-By-Step: Configuring a site-to-site VPN Gateway between Azure and On-Premise

After this is created, then you actually create the gateway.

Then you create the VPN

There is actually another step in this before creating the VPN that the question seems to be skipping.

Ok, so this one is missing a few things. It had the “further information” button but it was totally blank, so what the hell are blueprint files? Is this a thing, or is it simply a file called blueprint? Anyway, let’s start with Google: What is Azure Blueprints?

Just as a blueprint allows an engineer or an architect to sketch a project’s design parameters, Azure Blueprints enables cloud architects and central information technology groups to define a repeatable set of Azure resources that implements and adheres to an organization’s standards, patterns, and requirements. Azure Blueprints makes it possible for development teams to rapidly build and stand up new environments with trust they’re building within organizational compliance with a set of built-in components, such as networking, to speed up development and delivery.

So now we have an idea of this, but regardless we seem to be just copying files. Anyway, let’s check this out: Use an Azure file share with Windows

So it’s using SMB and requiring a key to log in to the Azure share, but it doesn’t seem to be encrypting the traffic. To my knowledge SMB doesn’t encrypt traffic, but let’s find out!

SMB Encryption uses the Advanced Encryption Standard (AES)-CCM algorithm to encrypt and decrypt the data. AES-CCM also provides data integrity validation (signing) for encrypted file shares, regardless of the SMB signing settings. If you want to enable SMB signing without encryption, you can continue to do this. For more information, see The Basics of SMB Signing.

Yep, it’s encrypted. You may have to actually turn it on, and it only works on SMB 3.0 or later; SMB 2.0 can’t do encryption at all, but you shouldn’t be using that anyway. If I’m understanding it right.

Anchorman GIFs | POPSUGAR Entertainment

So what’s ‘Storage Explorer’, Doctor Nick? I’m not sure, let’s see if it’s real: Get started with Storage Explorer

Microsoft Azure Storage Explorer is a standalone app that makes it easy to work with Azure Storage data on Windows, macOS, and Linux. In this article, you’ll learn several ways of connecting to and managing your Azure storage accounts.

Sure does seem like you would be able to drag and drop, doesn’t it? But really, you’re just logging into your Azure drives through a desktop app rather than a web browser. I wonder if it runs better in Chrome. Probably lol

This makes no sense, because 5 minutes ago ‘Network Contributor’ did not mean you could ‘Create a Subnet’! Is a virtual network different from a subnet? The fuck are they talking about here.

ok boomer. I must have been mistaken about some detail. I’ll get it ironed out.

Basic shell commands for network troubleshooting are so fucking primitive, I can’t even. lol, anyway. One would assume that ‘Diagnostics’ did not mean ‘troubleshoot’. Anyway, is Connection Troubleshoot real? Also, the fuck do they mean by VM Blade??? The woooorlllddd may never know. I’m not even going there with that one. I’m not inclined to believe that it is. Troubleshooting connectivity problems between Azure VMs

I have no fucking idea what’s going on here, and the Overview is of no help. Let’s start with ‘Entity Framework’

The Entity Framework provides the glue between your object oriented code and the SQL Azure relational database in a framework that is fully compatible with your skills and development tools. Integrated into Visual Studio, and part of ADO.NET, the entity framework provides object relational map capabilities that help an application developer focus on the needs of the application as opposed to the complexities of bridging disparate data representations.

dev blogs don’t get cited

Parks and Recreation - April Ludgate's Best Moments (Supercut ...

Just kidding lol. Why use the Entity Framework with SQL Azure?

Look, I don’t even know what the API error is!!! How much do you expect me to research this without MORE DATA. …. ok boomers, all right. Connection Resiliency

Connection resiliency automatically retries failed database commands. The feature can be used with any database by supplying an “execution strategy”, which encapsulates the logic necessary to detect failures and retry commands. EF Core providers can supply execution strategies tailored to their specific database failure conditions and optimal retry policies.

Looks like it’s being used here, but the specifics of ‘exponential backoff’ aren’t covered, and it’s probably some SQL stuff that I’m not getting into today. Still, I at least have an idea of what the hell this is, even though they never say what the API error is beyond it appearing to be an issue with data insertion.
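The docs don’t spell out the ‘exponential backoff’ part here, but the pattern itself is simple enough to sketch. This is just an illustration in Python of the general retry-with-backoff idea, not how EF Core’s execution strategies are actually implemented (the names retry_with_backoff and flaky_insert are made up for the example):

```python
import random
import time

def retry_with_backoff(operation, max_retries=5, base_delay=1.0, max_delay=30.0):
    """Retry a flaky operation, roughly doubling the wait between attempts.

    The delay grows as base_delay * 2**attempt, capped at max_delay, with a
    little random jitter so many clients don't all retry in lockstep.
    """
    for attempt in range(max_retries):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_retries - 1:
                raise  # out of retries, give up
            delay = min(base_delay * (2 ** attempt), max_delay)
            time.sleep(delay + random.uniform(0, delay * 0.1))

# A pretend "database command" that fails twice with a transient error,
# then succeeds on the third try.
calls = {"count": 0}
def flaky_insert():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("transient failure")
    return "row inserted"

result = retry_with_backoff(flaky_insert, base_delay=0.01)
```

A real execution strategy also has to decide which errors count as transient, since retrying a syntax error forever helps nobody.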

That’s all for now, may go for a run and then learn some more. Running 8Ks fairly consistently. Much faster than last year. Ran a mile in 10:45, but trying to get to a 60-minute 10K eventually, so even a 10-minute mile isn’t fast enough for that at all. Whatever, I’ll figure it out. Fuck having a girlfriend lol

Azure, PT 6

It’s been an interesting few days. Lots of solid leads on jobs and lots of the same old same old. Sometimes it’s almost as if people ask questions about things they themselves don’t understand, while expecting you to know less than them, and are surprised by the fact that you know what you’re talking about, without checking your research material or realizing how much effort goes into obtaining certifications. Anyway, I think I’m pretty much done playing ‘This Old House’ and excited to blog from this location.


Yep, totally bought stuff to make scented black candles to sell on Etsy because I’m “a lame ass mall goth kid.” Soap making also requires about the same stuff, but with a soap base instead of wax, and I do very much like using fancy bar soap for some reason. Judge away as I eat my chicken nuggets and MacNCheese. lol. Regardless, excited to get back to work on computer stuff. I’m fairly sure I can have the 300 passed before the deadline, but no idea if it will still count. Honestly, if I don’t get a job and really “hunker down”, I’m sure I can get both of them knocked out. Still waiting to hear what MSFT says about taking one old and one new test though. This would be my preferred method. If you haven’t heard, they are retiring the 300 and 301 in September. After having acquired 3 significant certs last year, I’m not in a huge rush to get this one knocked out, but I can if need be. Anyway, I have had about half a bottle of wine and am ready to see what we can find in Azure :::says some incompetent gibberish that is somehow approved of and generates a response:::

The Musician Portraits of John Singer Sargent | Operavore | WQXR

I’ll be real honest, I have no idea what the hell they are talking about, which is awesome because it gives me a lot of stuff to learn and read, so let’s start with some definitions here:

  • Microservices – Microservices are a software architecture style in which applications are composed of small, independent modules that communicate with each other using well-defined API contracts. These service modules are highly decoupled building blocks that are small enough to implement a single functionality. The purpose of microservices architectures is to make it easier to develop and scale applications. Microservice architectures foster collaboration between autonomous teams and enable them to bring new functionalities to market faster.
  • Service Fabric Cluster – (I have a loose idea on this one) Azure Service Fabric is a distributed systems platform that makes it easy to package, deploy, and manage scalable and reliable microservices and containers. Service Fabric also addresses the significant challenges in developing and managing cloud native applications. Developers and administrators can avoid complex infrastructure problems and focus on implementing mission-critical, demanding workloads that are scalable, reliable, and manageable. Service Fabric represents the next-generation platform for building and managing these enterprise-class, tier-1, cloud-scale applications running in containers.

It was at this point that I finished my bottle of wine and realized I was sleepy and required a good solid nap. I attempted to watch Blown Away with Tommy Lee Jones, realized I had to pay for it, and then switched back to the usual TCM programming. Woke up with a mild case of depression, found funny memes that made fun of people who were not smart and rude, made myself a bagel, and now we are back on track. No closer to finding stable and suitable employment though. Anyway, free photo from Blown Away, and I’m back to creating content for the purpose of sustaining my existence in a meaningful fashion.

TOMMY LEE JONES BLOWN AWAY (1994 Stock Photo - Alamy
  • Scale Agility – there isn’t a quick definition on this, but I’m pretty sure what they mean is the ability to scale up and down quickly.

Anyway, now that we understand what all those terms are, the answer makes sense. There is so much DevOps stuff in Azure that’s completely new to me. It’s sort of overwhelming, but not really. They talk about separation of roles and so forth, but it’s not really that so much as deploying a router as an application rather than a physical bit of hardware. As to why admins are expected to know how to launch and maintain Docker chat bots, that’s beyond me.

There is an interesting note in the last one: once the instances spin up, they don’t scale back in unless memory usage drops below 50%. The graph is out of order, but it spins up to 5 in the second one, and then the memory/CPU usage never drops enough to let it go back to 3 or fewer.
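That scale-in rule is just threshold logic with a floor and a ceiling. Here’s a toy Python version of it (the thresholds and names are mine, and real Azure autoscale rules also include a duration window and a cooldown, which this ignores):

```python
def autoscale_decision(instances, usage, scale_out_at=75, scale_in_below=50,
                       minimum=2, maximum=5):
    """Return the new instance count for one evaluation of the rule.

    Scale out by one when usage is at or above scale_out_at, scale in by
    one only when usage drops below scale_in_below; otherwise hold steady.
    """
    if usage >= scale_out_at and instances < maximum:
        return instances + 1
    if usage < scale_in_below and instances > minimum:
        return instances - 1
    return instances

# Once we're at 5 instances, usage hovering at 60% means we stay at 5:
# not high enough to scale out, not low enough to scale back in.
stuck = autoscale_decision(5, 60)
```

The gap between the two thresholds is deliberate (it’s hysteresis); if scale-out and scale-in used the same number, the instance count would flap up and down constantly.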

This is interesting because I’m not sure how Hyper-V works in Azure. Also, if it runs through a gateway, is that considered giving it a public address? I would assume so, but let’s take a look at the network adapter situation for VMs in Azure. Add network interfaces to or remove network interfaces from virtual machines – per this, it seems like you simply add a network adapter? I don’t know if the UI is the same as regular Hyper-V. I would have assumed so, but this makes it seem as if that were not the case. I’ll get into that later. Ok, so per this Configure a private IP address for a VM using the Azure portal, the UI is totally different, and it looks like you can give a VM a private address and use a public gateway that has NAT. The interesting thing is that I’m assuming this could all still be disconnected from the internet without an actual public gateway.

This one is kind of interesting, because what it’s getting at is that as long as there is a policy saying that machine is backed up, then you basically can’t get rid of that machine. I’m left wondering if the Recovery Services vault is set for only one machine or several machines.

It seems like you have a vault set for one machine, which is what the incorrect answer D was kind of hinting at.

This is using OAuth 2.0, and after having gone through Sec+ it becomes really obvious that one of these must use that technology. Clearly D or E would be the choice. The MFA thing kind of throws things off a bit. A and B have nothing to do with this at all. Amazing how that … works. Nowhere in this does it say that MFA is required under the API conditions list, so I’m really left with D and E.

  • Bot Framework Portal – this appears to be a portal to login to in order to build a bot.
  • Bot Framework Authentication – The Azure Bot Service v4 SDK facilitates the development of bots that can access online resources that require authentication. Your bot does not need to manage authentication tokens. Azure does it for you using OAuth2 to generate a token, based on each user’s credentials. Your bot uses the token generated by Azure to access those resources. In this way, the user does not have to provide ID and password to the bot to access a secured resource but only to a trusted identity provider.

And there we are, this uses OAuth2. Holy fuck, MSFT isn’t using CHAP or some bullshit that makes no sense and has decided to go with the norm here? Clap!
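The “your bot does not need to manage authentication tokens” part boils down to something caching and refreshing the token for you. Here’s a rough Python sketch of that idea; fetch_token stands in for whatever actually calls the OAuth2 token endpoint, and none of these names come from the Bot Framework SDK:

```python
import time

class TokenCache:
    """Hand out a cached access token, refreshing shortly before expiry."""

    def __init__(self, fetch_token, skew=60):
        self.fetch_token = fetch_token  # returns (token, lifetime_in_seconds)
        self.skew = skew                # refresh this many seconds early
        self._token = None
        self._expires_at = 0.0

    def get(self):
        if self._token is None or time.time() >= self._expires_at - self.skew:
            self._token, lifetime = self.fetch_token()
            self._expires_at = time.time() + lifetime
        return self._token

# Count how often we actually hit the (pretend) token endpoint.
fetches = {"count": 0}
def fake_fetch():
    fetches["count"] += 1
    return (f"token-{fetches['count']}", 3600)

cache = TokenCache(fake_fetch)
first, second = cache.get(), cache.get()
```

Two get() calls, one fetch; the bot code just asks for a token and never touches the credential exchange.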

Anyway, I think that’s all for now and I feel like I’ve learned a ton. Another day in wonka land of nonsense Azure posting.

Send Mails from Event Hub via Azure Functions

Great post!

CloudWizardInc.Com

Azure Event Hubs is an event ingestion service for big data streaming workloads. It is capable of receiving and processing millions of events per second. Data sent to an event hub can be transformed and stored by using any real-time analytics provider or batching/storage adapters.

It can also be used to receive machine telemetry data in a manufacturing organization. A monitoring and alerting system can be deployed on Event Hubs.

In this article, we will learn to send an email with SendGrid for disconnected Machines based on the telemetry data received in the Event Hub. SendGrid is a cloud-based email service that provides reliable transactional email delivery, scalability, and real-time analytics along with flexible APIs that make custom integration easy

So Let’s begin.

Pre-requisites

  • Event Hub should be up and running.
  • Events data received by Event Hub should have the schema below:
{
   "MachineName":"AB01",
   "ServerName":"XXXYYZZZZ01",
   "Process":"CNC",
   "LastStatus":3,
   "Status":"Connect",
   "Time":"2020-05-30T16:18:31.7058659Z"
}

  • To…

View original post 419 more words

Azure Pt. 5

Well, this is take 2 on this post, as I didn’t make a draft of the first one. I’ve been taking it easy lately and kind of focusing on applying for jobs, contact with temp agencies, interviews (I had 3 on Friday! hoping one of them works out), as well as doing some home renovation stuff. I have a thread on Twitter, seen here, for home renovation stuff. Anyway, I have an amazing super goth bathroom with a shelf in the shower for candelabra-lit showers that I’m enjoying immensely, and feel free to make fun of me about this haha

Touring Bela Lugosi's Los Angeles haunts and hangouts - Curbed LA

As well as a black/white shabby chic looking desk that’s also covered in glitter and candles. You really only notice the glitter as it flickers off the reflection of the flame. It’s pretty cool. My desktop PC was about 400 dollars used like 5 years ago, so I can’t do much more than tweet and write blog posts from that machine. Which, other than making music, is pretty much all I do anyway. Regardless, it would be nice to have another FL machine. So let’s get into this pots. I mean, post. I thoroughly enjoy Azure but have been so into HGTV while having time off haha. By time off I mean, the world is recovering from a pandemic, people are burning cities down and rioting over police brutality, and the ATL police decided they would go ahead and shoot another guy. In the back.

Which member of the band 'The Cure' was most responsible for the ...

Anyway, let’s get into Azure…

I kind of covered part of this, but I don’t know everything about it. The other thing about these questions is you can sort of guess, because the answers on the left go in matching-size boxes. Honestly, I find it helpful. Anyway, let’s look at it step by step. Use the Azure Import/Export service to import data to Azure Blob Storage

So, these steps are honestly kind of random, because you are physically shipping a drive to MSFT and they manually import it to a blob for you. They do actually ship the physical drive back to you if you request it, though. I feel like I went over this but somehow was confused by the fact that they didn’t transfer the data over VPN.

Anyway, that’s the bit about returning the drives.

Naturally, one would assume that MSFT wouldn’t pick their underutilized proprietary web language as the most common app language in use. I picked Java just because, while considering that the answer probably was .NET.

This question is kind of confusing. The second part is easy, but I’m not super clear on the math on the first one, so let’s get into that. The part I don’t understand here is that I’m not seeing a time delimiter of ‘after (unit of time) at (CPU usage), then create the condition of (spin 2 more VMs up)’. Perhaps the time limit is a standard notion? I have no idea where to find this info. So, according to this Understand Autoscale settings

See where it says ‘past 10 mins’? That’s a configurable setting that I didn’t see in the question. So I think this one has some specific issues regarding it.

The small text in the first one is the important part: it’s a load balancer, and the details of the back-end part aren’t what it’s asking about. It’s asking how to set up a load balancer that’s connecting several things. Anyway, let’s start with the linked article. Actually, let’s start here: What is Azure Application Gateway?

Azure Application Gateway is a web traffic load balancer that enables you to manage traffic to your web applications. Traditional load balancers operate at the transport layer (OSI layer 4 – TCP and UDP) and route traffic based on source IP address and port, to a destination IP address and port. Application Gateway can make routing decisions based on additional attributes of an HTTP request, for example URI path or host headers. For example, you can route traffic based on the incoming URL. So if /images is in the incoming URL, you can route traffic to a specific set of servers (known as a pool) configured for images. If /video is in the URL, that traffic is routed to another pool that’s optimized for videos.
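The /images and /video routing it describes is easy to picture as code. A quick Python sketch of path-based routing, with the pool names invented for the example (a real Application Gateway matches configured path patterns; this is just the gist):

```python
def pick_backend_pool(url_path, rules, default_pool="default-pool"):
    """Choose a backend pool by URL path prefix.

    Longest matching prefix wins, so a more specific rule like
    /images/thumbnails would beat a general /images rule.
    """
    best, best_len = default_pool, 0
    for prefix, pool in rules.items():
        if url_path.startswith(prefix) and len(prefix) > best_len:
            best, best_len = pool, len(prefix)
    return best

rules = {"/images": "image-pool", "/video": "video-pool"}
```

So pick_backend_pool("/images/cat.png", rules) lands on the image pool and anything unmatched falls through to the default, which is exactly the layer-7 trick a layer-4 load balancer can’t do, since it never looks at the URL.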

And this is the linked article: Tutorial: Create an application gateway with path-based routing rules using the Azure portal

This is the part they are talking about, and given the options it makes sense, but somehow Azure is still slightly confusing. Like it’s Windows, but on weed lol

La Ptite Grenouille Montréal presents Tribute to Sublime - June ...

I got to mail stuff? Things are called different things! You got routing tables that are all catawampus!

Well, this one, to be honest, I don’t really know what it is, and I thought about drinking a cup of coffee before tackling this. Decided I could manage though.

  • Azure Backup Server – Back up files, folders, system state using the Microsoft Azure Recovery Services (MARS) agent. Or use the DPM or Azure Backup Server (MABS) agent to protect on-premises VMs (Hyper-V and VMWare) and other on-premises workloads
  • Recovery Services Vault Recovery Services vaults are based on the Azure Resource Manager model of Azure, however Backup vaults were based on the Azure Service Manager model. When you upgrade a Backup vault to a Recovery Services vault, the backup data remains intact during and after the upgrade process. Recovery Services vaults provide features not available for Backup vaults
  • A backup policy – not really an Azure thing
  • A recovery plan – not really an Azure thing

The backup server is actually an application. Not entirely clear, but it appears to be used for, well, making backups. It’s interesting that even in Azure there is so much focus on backup, but I suppose they are not writing your code or managing your machines. However, hardware failure does seem to be off the table. Anyway, that’s all for now. I’ll probably create another one of these later today.

Azure, Pt. 4

Maybe this won’t turn into several blog posts, but I quickly have learned that this is like starting from scratch on infrastructure, because you may think you know how something works, but this is cloud, so we do it a little bit different. Which is fine, but you know, definitional stuff and concepts. For example, this first one. Clearly, it’s always encrypted, but what are we calling that?

Anyway, I don’t know what the rest of this shit is, so let’s get to lookin! Normally I would have a good system for designing this, but since we are on this new block editor, we will see how this goes (you know, the old bulleted list with hyperlinks and a short blurb to the side).

  • Advanced data security for Azure SQL Database – Advanced data security is a unified package for advanced SQL security capabilities. It includes functionality for discovering and classifying sensitive data, surfacing and mitigating potential database vulnerabilities, and detecting anomalous activities that could indicate a threat to your database. It provides a single go-to location for enabling and managing these capabilities. — really a product that does assessment and remediation
  • Always Encrypted – Always Encrypted is a feature designed to protect sensitive data, such as credit card numbers or national identification numbers (for example, U.S. social security numbers), stored in Azure SQL Database or SQL Server databases. Always Encrypted allows clients to encrypt sensitive data inside client applications and never reveal the encryption keys to the Database Engine (SQL Database or SQL Server). As a result, Always Encrypted provides a separation between those who own the data and can view it, and those who manage the data but should have no access. By ensuring on-premises database administrators, cloud database operators, or other high-privileged unauthorized users, can’t access the encrypted data, Always Encrypted enables customers to confidently store sensitive data outside of their direct control. This allows organizations to store their data in Azure, and enable delegation of on-premises database administration to third parties, or to reduce security clearance requirements for their own DBA staff.
  • Elastic pools – SQL Database elastic pools are a simple, cost-effective solution for managing and scaling multiple databases that have varying and unpredictable usage demands. The databases in an elastic pool are on a single Azure SQL Database server and share a set number of resources at a set price. Elastic pools in Azure SQL Database enable SaaS developers to optimize the price performance for a group of databases within a prescribed budget while delivering performance elasticity for each database.
  • Transparent data encryption for SQL Database and Azure Synapse – Transparent data encryption (TDE) helps protect Azure SQL Database, Azure SQL Managed Instance, and Synapse SQL in Azure Synapse Analytics against the threat of malicious offline activity by encrypting data at rest. It performs real-time encryption and decryption of the database, associated backups, and transaction log files at rest without requiring changes to the application. By default, TDE is enabled for all newly deployed Azure SQL databases and needs to be manually enabled for older databases of Azure SQL Database, Azure SQL Managed Instance, or Azure Synapse. — I find this confusing as it seems to indicate this is only for data in an ‘at rest’ state but ok

So I took some time off after writing one question and basically remodeled a bathroom for some reason. It started with cleaning the grout. Which turned into regrouting, because grout is cheap, which turned into painting the walls, and naturally you have to replace all the hardware and paint the cabinets while applying a distressed finish covered with a shiny lacquer. Anyway, the current episode of ‘This Old House’ is finished. Thankfully, I didn’t spend too much money on it, but it does look great! Anyway, let’s, uh, get back to work. Sound good? Great. So the first part of this, after learning that Always Encrypted is for NPPI data, seems obvious; the next few things I have no idea what they are, so let’s get into that:

  • Encrypt a Column of Data – One would think, based on this name, that it would apply to this scenario; however, it does not. This is a normal encryption scenario where you generate keys to decrypt the data. Anyway, what we are looking for is instructions for setting up Always Encrypted, which apparently is here: Query columns using Always Encrypted with SQL Server Management Studio. And now we have a bullet with a quote, bear with me here: “To enable Always Encrypted, type Column Encryption Setting = Enabled. To disable Always Encrypted, specify Column Encryption Setting = Disabled or remove the setting of Column Encryption Setting from the Additional Properties tab (its default value is Disabled).” So that has to be enabled for Always Encrypted to work.
  • Public Database Role – Every database user belongs to the public database role. When a user has not been granted or denied specific permissions on a securable object, the user inherits the permissions granted to public on that object. Database users cannot be removed from the public role. – basically its kind of extra and doesn’t matter
  • The encryption keys bit, per the second article linked in the first bullet “Always Encrypted allows clients to encrypt sensitive data inside client applications and never reveal the encryption keys to the Database Engine”
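So the client side turns Always Encrypted on through the connection string. As a toy illustration of just the string assembly (Python; the server and database names are made up, and you should check your actual driver’s docs for the exact keyword it expects — this mirrors the SSMS wording quoted above):

```python
def build_connection_string(server, database, always_encrypted=True):
    """Assemble a SQL connection string including the Column Encryption
    Setting keyword, as described in the quoted SSMS instructions."""
    parts = {
        "Server": server,
        "Database": database,
        "Column Encryption Setting": "Enabled" if always_encrypted else "Disabled",
    }
    return ";".join(f"{key}={value}" for key, value in parts.items())

conn_str = build_connection_string("tcp:myserver.example", "PatientDB")
```

The point being that the switch lives client-side: the driver does the encrypting and decrypting, and the database engine only ever sees ciphertext.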

So here we are, realizing that this Azure cert is application dev, SQL, Windows Server, networking, virtualization, and containers. Damn, we are really getting into some shit now, boys. And girls. Maybe it’s only one or the other that reads this. Who knows. Anyway, type 1 for Burt Reynolds and 2 for Jim Croce. If you didn’t pick it up, this is a mustache contest. Actually, the photos are backwards, because it’s clear that Croce is number 1, but type 2 for Croce. Unless you really think the guy running blocker for Coors and banging Sally Field is A-number-1. I don’t even think he was really banging her, but I could be wrong.

The Story Behind The Song: I Got A Name by Jim Croce | Louder
Burt Reynolds: A Star With the Pedal to the Metal - The New York Times

Anyway,

You know, at first I found this confusing, because it doesn’t state that the appliance router is functioning as a subnet gateway, and I don’t think it gave the IP of the appliance either, but that appears to be the situation: it hits the soft router that’s functioning as a gateway and then goes from there. If you don’t know what an appliance is: Virtual Appliance

Anyway, here’s Jim Croce

Autoscale is obvious. Managed disks offer a lot of benefits, but I can’t find anything directly stating that you have to use them in this scenario, so I’m still a little confused. No idea what the extra cost is, but given how they work I would assume you have to use them with autoscale, though I don’t see any indicator of that. Anyway, here is some more info on ‘Managed Disks’

This one has a doc associated, so I’m going to start there: Use Azure Import/Export service to import data to Azure Files

Well, this simply mentions that you need them, and not really in specific language; it’s one of the many recommendations, along with having a FedEx account for some reason. But you do have to have them. It doesn’t go into why or anything like that, but here is info on the creation of the two documents:

  • Preparing hard drives for an Import Job – The WAImportExport tool is the drive preparation and repair tool that you can use with the Microsoft Azure Import/Export service. You can use this tool to copy data to the hard drives you are going to ship to an Azure datacenter. After an import job has completed, you can use this tool to repair any blobs that were corrupted, were missing, or conflicted with other blobs. After you receive the drives from a completed export job, you can use this tool to repair any files that were corrupted or missing on the drives. In this article, we go over the use of this tool.

What is dataset CSV

The value of the /dataset flag is a CSV file that contains a list of directories and/or a list of files to be copied to target drives. The first step to creating an import job is to determine which directories and files you are going to import. This can be a list of directories, a list of unique files, or a combination of those two. When a directory is included, all files in the directory and its subdirectories will be part of the import job.

For each directory or file to be imported, you must identify a destination virtual directory or blob in the Azure Blob service. You will use these targets as inputs to the WAImportExport tool. Directories should be delimited with the forward slash character “/”.

What is driveset CSV

The value of the /InitialDriveSet or /AdditionalDriveSet flag is a CSV file that contains the list of disks to which the drive letters are mapped so that the tool can correctly pick the list of disks to be prepared. If the data size is greater than a single disk size, the WAImportExport tool will distribute the data across multiple disks enlisted in this CSV file in an optimized way.

There is no limit on the number of disks the data can be written to in a single session. The tool will distribute data based on disk size and folder size. It will select the disk that is most optimized for the object-size. The data when uploaded to the storage account will be converged back to the directory structure that was specified in dataset file. In order to create a driveset CSV, follow the steps below.
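Since the dataset file is just a CSV of source-to-destination mappings, generating one is easy to sketch. The column names below are my best reading of the WAImportExport format and should be checked against the tool’s own sample CSV before trusting them:

```python
import csv
import io

def make_dataset_csv(entries):
    """Build a dataset-style CSV in memory: each row maps a local directory
    or file to a virtual directory or blob target in Azure storage."""
    buffer = io.StringIO()
    writer = csv.writer(buffer, lineterminator="\n")
    writer.writerow(["BasePath", "DstBlobPathOrPrefix", "BlobType"])
    for source, destination in entries:
        # per the doc above, blob targets are delimited with forward slashes
        writer.writerow([source, destination.replace("\\", "/"), "BlockBlob"])
    return buffer.getvalue()

rows = [
    (r"H:\Photos", "backup/photos/"),                       # whole directory
    (r"H:\Video\wedding.mp4", "backup/video/wedding.mp4"),  # single file
]
csv_text = make_dataset_csv(rows)
```

A directory row sweeps in everything under it, matching the “all files in the directory and its subdirectories” behavior the doc describes.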

Anyway, back to the new format being more or less confusing, but I think I sort of understand this. Man, when I thought Server had a lot to learn, I was a little off. Not really, but I’m saying there is a ton of stuff to learn with this, and it’s going to take a while to get familiar with. I’m not really sure how much more or different stuff they could put in the new exams, but I’ll be working on this until changes are made that indicate that new material is the way to go or the only option.

How to Reduce the Costs of your Azure IaaS VMs – Thomas Maurer

I hadn't considered this, but yeah, those licenses are expensive.

If you already have existing Windows Server and SQL Server on-premises licenses with Software Assurance, you can use them for Azure virtual machines (VMs). This will allow you to save the Pay-as-you-go cost for Windows Server and SQL Server licenses. The Azure Hybrid Benefit applies not only to Azure VMs but also on Azure SQL Database PaaS services and the Azure Dedicated Host. If you want to know more about how to take advantage of the Azure Hybrid Benefit, check out the Microsoft Azure Docs page.

Azure 3.3, I'll be on this for a while

Welp, yesterday went fairly well. Ended up going for brunch and leaving the house for the first time in like 2 or 3 months to actually do something. Anyway, I think this is going to be a short post unless I go through some more questions.

Wow, a slightly tricky question where they actually expound upon the answer, so I don't have to look around all confused. Having never used Azure for storage, or really at all: root is not C: but /. Good note. I don't think this would work the same way on a local install, but I could be wrong. Well, per this (which does state that ADD can be used to copy files into a build), it has to be where Docker is installed, and you don't annotate the drive. Interesting note, if I'm reading it correctly.

The Docker documentation doesn't use a drive anywhere either. I'm not sure why I didn't notice that haha
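To illustrate the point, here's a made-up Dockerfile fragment (not from the question): source paths in ADD/COPY are relative to the build context you hand to `docker build`, and destination paths are rooted at / with no drive letter, even on a Windows host:

```
# Hypothetical example: both paths are drive-letter free.
# "scripts/setup.sh" is relative to the build context directory;
# "/app/setup.sh" is an absolute path inside the image's filesystem.
FROM ubuntu:trusty
WORKDIR /app
COPY scripts/setup.sh /app/setup.sh
```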

My first instinct on this one was to select the two correct answers, but then I considered answer A and went with that and C. The reason I thought forwarded traffic was more important than remote gateways was that remote gateways seemed redundant for some reason. It seems like if you use gateway transit it has to go through a gateway and into the other one, right? I don't know about this one; I need to read the article. Probably a good place to start: Virtual network peering

Virtual network peering enables you to seamlessly connect networks in Azure Virtual Network. The virtual networks appear as one for connectivity purposes. The traffic between virtual machines uses the Microsoft backbone infrastructure. Like traffic between virtual machines in the same network, traffic is routed through Microsoft’s private network only.

Azure supports the following types of peering:

  • Virtual network peering: Connect virtual networks within the same Azure region.
  • Global virtual network peering: Connect virtual networks across Azure regions.

The benefits of using virtual network peering, whether local or global, include:

  • A low-latency, high-bandwidth connection between resources in different virtual networks.
  • The ability for resources in one virtual network to communicate with resources in a different virtual network.
  • The ability to transfer data between virtual networks across Azure subscriptions, Azure Active Directory tenants, deployment models, and Azure regions.
  • The ability to peer virtual networks created through the Azure Resource Manager.
  • The ability to peer a virtual network created through Resource Manager to one created through the classic deployment model. To learn more about Azure deployment models, see Understand Azure deployment models.
  • No downtime to resources in either virtual network when creating the peering, or after the peering is created.

Network traffic between peered virtual networks is private. Traffic between the virtual networks is kept on the Microsoft backbone network. No public Internet, gateways, or encryption is required in the communication between the virtual networks.

So that's helpful, but it's not the specifics we are looking for. This one has more info: Create, change, or delete a virtual network peering

So it turns out this is a fairly specific scenario, as it doesn't indicate that you're using a VPN; it says hub and spoke, which apparently works the same way as using a VPN. Forwarded traffic is also covered in that article, which explains why you would want to enable it and gives scenario examples. I'll let you click through to the article if you're interested, but I'm sure you're excited about my highlighted notes in this bad boy. Anyway, that's all for this one.
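For reference, and only as a sketch (the resource group and VNet names here are invented), the hub-and-spoke options map onto flags of `az network vnet peering create`: the spoke side sets --use-remote-gateways to ride the hub's VPN gateway, and the hub side sets --allow-gateway-transit plus --allow-forwarded-traffic so traffic arriving through the gateway can be forwarded on:

```
# Spoke -> hub: use the hub's gateway for on-premises connectivity.
az network vnet peering create \
  --name spoke-to-hub \
  --resource-group rg-network \
  --vnet-name vnet-spoke1 \
  --remote-vnet vnet-hub \
  --allow-vnet-access \
  --allow-forwarded-traffic \
  --use-remote-gateways

# Hub -> spoke: let the hub's gateway carry the spoke's traffic.
az network vnet peering create \
  --name hub-to-spoke \
  --resource-group rg-network \
  --vnet-name vnet-hub \
  --remote-vnet vnet-spoke1 \
  --allow-vnet-access \
  --allow-forwarded-traffic \
  --allow-gateway-transit
```

Note that a peering can't set --use-remote-gateways and --allow-gateway-transit on the same side; the two ends play different roles.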

Azure! Part 3.2… Or Network Watcher, NSG’s and more!

I'm unemployed at the moment and doing lots of interviews, but with this COVID-19 stuff there's not a lot going on. Unemployment is also kind of tough, as my employer has filed a claim but they are still sitting on it. My bills are paid up for this month, but I'm pretty sure I'll have to cash out my small 401k, as it doesn't look like unemployment is coming through any time soon. I complain, but there are people in much worse positions. Also, I paid a company to redo my resume, I don't think I mentioned that, and sent them a list of information about my blog and what's covered on various certs. Excited to get it back, because I'm not really certain how to organize some of that stuff, and based on emails it looked like they could tell I was a highly skilled, hard-working professional, but who knows, they are corporate linguistic experts. Anyway, let's get to work.

I found it kind of surprising with this one that they didn't give an idea of why the setup is with VM3, but I have no idea what NSGs are, so I think we need to start there. Network security groups

You can use an Azure network security group to filter network traffic to and from Azure resources in an Azure virtual network. A network security group contains security rules that allow or deny inbound network traffic to, or outbound network traffic from, several types of Azure resources. For each rule, you can specify source and destination, port, and protocol. This article describes the properties of a network security group rule, the default security rules that are applied, and the rule properties that you can modify to create an augmented security rule.

This is kind of basic stuff, but there is a specific flavor to it, and I'm starting to realize I might want to watch one of those hour- or two-hour-long videos on Azure networking basics. However, the idea is that NSGs basically function as a rule set, as if traffic were going through a configured switch. At least that's my understanding so far. If it functions like Hyper-V, it may prove to provide too many non-useful granular settings, but hopefully that isn't the case. I mean, that's my experience using Hyper-V in 2019, but maybe you had a different experience.

Anyway, where were we? Oh yeah, all right, let's get into Azure Network Watcher. Also, a static diagram seems like a good idea, as you could see what NSG was applied to a resource (right, I couldn't figure that out) to view conflicting rules, until you see what Azure Network Watcher is. In the below video you can see the tool in use, and from the starting point up to about 5 minutes in they are talking about this scenario. They also go into diagramming uptime and so forth, but what I would like to see is whether the tool shows in real time if a connection is broken and offers a reason as to why. That doesn't seem that hard, but I could be wrong haha
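Since I keep describing NSGs as "a rule set like a configured switch," here's a tiny Python sketch of the evaluation order as I understand it: rules are checked lowest priority number first, the first match wins, and anything unmatched is denied (Azure's implicit DenyAll). The field names are my own shorthand, not the real Azure schema, and I'm only matching on port to keep it short.

```python
# Toy model of NSG rule evaluation -- illustrative only, not the Azure API.
from dataclasses import dataclass

@dataclass
class Rule:
    priority: int   # lower number = evaluated first
    port: int       # destination port this rule matches
    action: str     # "Allow" or "Deny"

def evaluate(rules, port):
    """Return 'Allow' or 'Deny' for traffic to the given port."""
    for rule in sorted(rules, key=lambda r: r.priority):
        if rule.port == port:
            return rule.action  # first matching rule decides
    return "Deny"  # implicit default: deny anything no rule allows

rules = [
    Rule(priority=300, port=443, action="Allow"),
    Rule(priority=100, port=3389, action="Deny"),
    Rule(priority=200, port=3389, action="Allow"),  # never wins: 100 matches first
]

print(evaluate(rules, 443))   # Allow
print(evaluate(rules, 3389))  # Deny (priority 100 beats 200)
print(evaluate(rules, 8080))  # Deny (no matching rule)
```

The priority-100-beats-200 case is exactly the kind of conflicting-rules situation the Network Watcher diagram view is supposed to help you spot.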

At around 6 minutes, you can see that if you run through some things it will tell you more information, but I'm wondering about a heads-up display with real-time diagram situations. Anyway, you can view and change network diagrams from Network Watcher.

You know, since I've found that being more thorough is helpful when studying for certification tests, let's just make this a big long post where we learn about Azure networking. So let's look at these other answers. We can start with Azure Monitor. Since these are tools, I think videos might be more helpful, and I found this Azure Monitor video to be the most helpful. There is another video that looks much slicker, but to be honest this one has the best description and tool use cases.

So I searched YouTube for videos to come up with this one and discovered it was on a page. This image shows that you can get network insights using Azure Monitor, but I'm not seeing it on the page and he doesn't go into it in the video, so I'm going to assume it's more performance-related than being a super useful tool to diagnose network issues, as that's probably what Network Watcher is for. Anyway, Azure Monitor overview

Azure Monitor can collect data from a variety of sources. You can think of monitoring data for your applications in tiers ranging from your application, any operating system and services it relies on, down to the platform itself. Azure Monitor collects data from each of the following tiers:

  • Application monitoring data: Data about the performance and functionality of the code you have written, regardless of its platform.
  • Guest OS monitoring data: Data about the operating system on which your application is running. This could be running in Azure, another cloud, or on-premises.
  • Azure resource monitoring data: Data about the operation of an Azure resource.
  • Azure subscription monitoring data: Data about the operation and management of an Azure subscription, as well as data about the health and operation of Azure itself.
  • Azure tenant monitoring data: Data about the operation of tenant-level Azure services, such as Azure Active Directory.

Basically, it seems like a place to sort logs pertaining to machine and app performance.

Ok, so what's a Traffic Manager profile? Well, let's start here: What is Traffic Manager?

Azure Traffic Manager is a DNS-based traffic load balancer that enables you to distribute traffic optimally to services across global Azure regions, while providing high availability and responsiveness.

Traffic Manager uses DNS to direct client requests to the most appropriate service endpoint based on a traffic-routing method and the health of the endpoints. An endpoint is any Internet-facing service hosted inside or outside of Azure. Traffic Manager provides a range of traffic-routing methods and endpoint monitoring options to suit different application needs and automatic failover models. Traffic Manager is resilient to failure, including the failure of an entire Azure region.
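To make the "traffic-routing method plus endpoint health" idea concrete, here's a toy Python model of the priority routing method: DNS answers point at the highest-priority endpoint that is currently healthy, so a dead primary fails over to the next one automatically. The endpoint names are made up, and real Traffic Manager does this at the DNS layer rather than in application code.

```python
# Illustrative sketch of priority-based failover, not the Azure SDK.
def pick_endpoint(endpoints):
    """endpoints: list of (priority, name, healthy) tuples; lower priority wins.
    Returns the name Traffic Manager would hand out, or None if all are down."""
    for _priority, name, healthy in sorted(endpoints):
        if healthy:
            return name
    return None

endpoints = [
    (1, "primary-eastus", False),        # primary is down
    (2, "secondary-westeurope", True),   # next in line, healthy
    (3, "dr-southeastasia", True),
]

print(pick_endpoint(endpoints))  # secondary-westeurope
```

Swap the routing method (performance, weighted, geographic) and only the sort key changes; the health check part stays the same.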

Oh man, I keep hearing "Azure region" mentioned, but I haven't gotten into that yet. Might as well grab that while we are thinking about it:

A region is a set of datacenters deployed within a latency-defined perimeter and connected through a dedicated regional low-latency network.

With more global regions than any other cloud provider, Azure gives customers the flexibility to deploy applications where they need to. Azure is generally available in 53 regions around the world, with plans announced for 5 additional regions.

Ok, that's straightforward and interesting, but let's get back to load balancing with hybrid cloud options, I mean Azure Traffic Manager… anyway, yeah, it's a powerful load balancer, and MSFT has some really great documentation about how to set it up and use it, like this profile for low latency that even goes into actually creating VMs, installing IIS and all that, and then finally gets into creating the profile that directs the traffic: Tutorial: Improve website response using Traffic Manager. Anyway, I think that's all for this question. I'm going to do one more, watch an Azure networking video that I probably won't link but that you can find if you can use Google, and then maybe go downtown for a nice long run.

I got the first two right in this question, but I have no idea what they are talking about with a probe. The other two are basic networking questions. I mean, maybe not basic as in home router, but at… literally nothing I can say at this point won't sound pretentious as hell haha. Anyway, let's figure out what a probe is. The naming convention here is a little wonky, but I can read through the idea to understand what it is: Application Gateway health monitoring overview

An application gateway automatically configures a default health probe when you don’t set up any custom probe configuration. The monitoring behavior works by making an HTTP request to the IP addresses configured for the back-end pool. For default probes if the backend http settings are configured for HTTPS, the probe uses HTTPS as well to test health of the backends.

For example: You configure your application gateway to use back-end servers A, B, and C to receive HTTP network traffic on port 80. The default health monitoring tests the three servers every 30 seconds for a healthy HTTP response. A healthy HTTP response has a status code between 200 and 399.

If the default probe check fails for server A, the application gateway removes it from its back-end pool, and network traffic stops flowing to this server. The default probe still continues to check for server A every 30 seconds. When server A responds successfully to one request from a default health probe, it’s added back as healthy to the back-end pool, and traffic starts flowing to the server again.

So a probe is basically a heartbeat, and the naming conventions for that concept are usually changed; everyone calls it something different. It's one server saying "hey, are you up?" to another server, though perhaps this is a little more in-depth, as heartbeats don't usually have rule sets identified with them, but this is for larger-scale infrastructure.
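The default-probe behavior described above boils down to a couple of lines, so here's an illustrative Python sketch (not real Application Gateway code): a status code in 200–399 counts as healthy, and anything else gets the server pulled from the pool until a probe succeeds again.

```python
# Toy version of the default health-probe rule -- illustrative only.
def is_healthy(status_code):
    """A healthy HTTP response has a status code between 200 and 399."""
    return 200 <= status_code <= 399

def probe_pool(pool, responses):
    """pool: server names; responses: name -> last probe status code.
    Returns the servers that should keep receiving traffic."""
    return [s for s in pool if is_healthy(responses.get(s, 0))]

# Server A returns 503, so it gets removed until a later probe succeeds.
responses = {"A": 503, "B": 200, "C": 301}
print(probe_pool(["A", "B", "C"], responses))  # ['B', 'C']
```

The real gateway just re-runs this check every 30 seconds and adds A back the moment a probe comes back in the healthy range.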

Honestly, the next two questions in this set are not as expansive, so I may try to figure out some more stuff. Who knows. Anyway, thank you to whoever actually reads this blog! I appreciate your viewership of this thing that I put time and money into haha

Azure Pt. 3.1, Container image hosting…

Well, I paid a company to redo my resume. Not a lot of stuff out there at the moment, but my current resume format seems a bit crowded, and it's probably good to have someone who deals with resumes day in and day out take a look at it and figure out how to organize things and what to highlight. So much Linux experience with NAPA, and I'm attempting to target a Windows Server admin role, but we will see how that goes haha. Anyway, back to Azure and realizing that this two-test cert could take most of the year. This is fine with me. If I have extra time maybe I'll start into the 70-744, but they are no longer offering that as of January next year, and to be honest a Network+ and a Security+ with two MCSAs sounds better to me than a Core Infrastructure MCSE, as it's vendor-diverse and implies the same thing with less confusion as to meaning. That's not to say that I don't want the Core Infrastructure MCSE, but I'm not sure I have the time/value for it, unfortunately, though it would feel awesome to pass the 70-744. It's also becoming very apparent that cloud computing is the future, so here we are. Anyway, this is the part where I start throwing in questions and trying to figure out what everything is.

One would assume that the data has to go into some type of storage for a container, assuming we are using Docker, as that's what I learned about on the 2016 MCSA, but who knows. Let's take a look at how containers work in Azure. This may take a while, or it may not. Who knows. Let's start with the link in the question: Deploy an Azure Web App Container

I don't know what YAML is. I've heard the term thrown around, but I'm not super familiar with it, so let's sort through that. You can see in the screenshot that it's pointing to a container registry. Earlier they ran a command to pull your Docker image from a GitHub repo that has sample container images, or you would assume it's sample images, but really it's just code pointing to default Docker test images, as seen below:

# This Dockerfile is for a test container to demonstrate use of docker-compose with Azure Pipelines
# See http://docs.microsoft.com/azure/devops/pipelines/languages/docker for more information

FROM ubuntu:trusty

RUN apt-get update && apt-get install -yq curl && apt-get clean

WORKDIR /app

ADD docs/test.sh /app/test.sh

CMD ["bash", "test.sh"]

Kind of confused by this, as it's running an update command and doesn't seem to point to an image, appearing instead to update an image, which has a large potential to break any running apps you have on a container haha. Anyway, if you're interested, install Docker on your machine and then download something like this: Couchbase, and you've got a small VM running on your machine. Now let's figure out what YAML is.

YAML (a recursive acronym for "YAML Ain't Markup Language") is a human-readable data-serialization language. It is commonly used for configuration files and in applications where data is being stored or transmitted. YAML targets many of the same communications applications as Extensible Markup Language (XML) but has a minimal syntax which intentionally differs from SGML.[1] It uses both Python-style indentation to indicate nesting, and a more compact format that uses [] for lists and {} for maps,[1] making YAML 1.2 a superset of JSON.[2]
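To show what that actually looks like, here's a small made-up pipeline-flavored snippet (the keys are illustrative, not a working azure-pipelines.yml): indentation nests maps, a leading dash marks list items, and the last line uses the compact {}/[] flow style, which is the JSON-like part of the superset claim.

```yaml
pool:
  vmImage: ubuntu-latest     # indentation = nesting, like Python
steps:
  - script: docker build -t myimage .   # "-" marks a list item
    displayName: Build the image
# flow style: the same map/list structures written JSON-style
tags: {env: test, owners: [alice, bob]}
```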

Funny thing about programming languages, I've learned that everything in 2020 is basically Java or XML haha. I've also found it very helpful to occasionally poke around in Kali and walk through some basic stuff on VulnHub, as it promotes familiarity with Linux, and unless you want to sit around and build web apps at home or something it's sort of like Leapfrog learning. It also comes with free super cool sunglasses and a hoodie (follow @viss on Twitter for more info haha)


Edit: now with hackerman HD images courtesy of Viss that I wasn’t sure where I had saved

Anyway, back to Docker on Azure. The point being, this is DevOps, and they have linked an article that is fairly specific and slightly confusing for infrastructure people with no background in containers. For the sake of "this is the article they linked," I'm going to start with Pipelines and then dig into the container process, because the linked repo is concerning: Azure Pipelines

There is a lot going on with Pipelines lol, but this little graphic sums it up in common-folk talk the best. As an added bonus, be sure to note the underlying hackerman joke of "deploy to target." Also, have you read AWS documentation? Can you read technical documentation well haha, JFC! I think we have a little comedy club going for those… you know what, never mind. It's better this way. Ok, here we go: Build An Image

This one makes sense: you throw in your Dockerfile, and this is a template that has images and isn't just piping code in. I don't know, I could be fucking this up, as I haven't used Docker super extensively, but I sort of get the basics. All right, well, it's kind of clear on that one. Again, I'm not really in DevOps, so I would have to do some more research and testing, but I have found that for learning this stuff the "hackerman" stuff is a great way to figure some things out. Again, I don't recommend it for cool points, but local DEF CON groups can be great fun. Anyway, Container Registry? (Normally I don't link the sales pages, but I found this one helpful.)

Wow! Pipelines for patching! Not… building images haha


And now we are back on Docker with the technical documentation: Introduction to private Docker container registries in Azure (why not just use Docker Hub, pull your image over, and pipe code to the container? Who knows… I'm not in sales or DevOps haha)

Azure Container Registry is a managed, private Docker registry service based on the open-source Docker Registry 2.0. Create and maintain Azure container registries to store and manage your private Docker container images and related artifacts.

Use Azure container registries with your existing container development and deployment pipelines, or use Azure Container Registry Tasks to build container images in Azure. Build on demand, or fully automate builds with triggers such as source code commits and base image updates.

All right, this is kind of what I was looking for Quickstart: Create a private container registry using the Azure portal

An Azure container registry is a private Docker registry in Azure where you can store and manage private Docker container images and related artifacts. In this quickstart, you create a container registry with the Azure portal. Then, use Docker commands to push a container image into the registry, and finally pull and run the image from your registry.

To log in to the registry to work with container images, this quickstart requires that you are running the Azure CLI (version 2.0.55 or later recommended). Run az --version to find the version. If you need to install or upgrade, see Install Azure CLI.

You must also have Docker installed locally. Docker provides packages that easily configure Docker on any Mac, Windows, or Linux system.
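The push/pull flow the quickstart describes is short enough to sketch here; the registry and image names below are invented, and the key detail is that the local image gets re-tagged with the registry's login server name before pushing:

```
# Create a registry and log in to it (names are hypothetical).
az acr create --resource-group rg-acr --name myregistry --sku Basic
az acr login --name myregistry

# Tag a local image with the registry's login server, then push it.
docker tag couchbase:latest myregistry.azurecr.io/couchbase:v1
docker push myregistry.azurecr.io/couchbase:v1

# Pull and run it from anywhere that can authenticate to the registry.
docker pull myregistry.azurecr.io/couchbase:v1
```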

Alright, so you have to be running Windows 10 Pro to get Docker, because virtualization is locked out on the Home version, and as such I can't run any hypervisor on my current main machine, but I have two older machines that work fine for that, and I use this one for blogging and FL Studio mostly. I may upgrade at some point, but jesus christ is it a pain in the ass to get a large SSD and Windows 10 if you go through the Dell site to order a PC. Not to mention, I would like to simply put in my volume license key that I bought off of eBay for a quarter of the price of what Dell charges for Windows 10 and have it mysteriously work so I can fuck with Docker when I feel like it. Sorry for cussing. Back on track: it is looking like you may not be able to use public image repositories per the MSFT-suggested method, but I'm sure there are ways around that. Maybe? Regardless, now we know where Docker images are hosted. Honestly, I think that's a good place to stop for now, as this turned into a wall of text fairly quickly.
