Azure Security Group Automation [Guest Post]

16 Dec

Mission Statement

In this blog post, our guest author (@islamtg) discusses how to automatically update an Azure AD security group. This can be useful for something like Power BI, where we have a security group that has specific permissions on a service. Through this we will try to address the following topics:

  • Eliminating the manual task of adding users to a security group.
  • Eliminating the process of manually cleaning up inactive users and invalid (or no longer valid) email addresses.
  • Creating an email cleanup list that can be sent to the admins on a daily basis, which can then be used to automate cleaning up bad emails.
  • Creating an automated process for multiple security groups.
    • Creating a read-only security group for a particular service, or a server admin security group.
    • Adding users to a group that is assigned a specific license.

The Problem

Updating security groups manually can become a tedious and time-consuming task, and it can also delay proper access for individuals or even organizations, so why not automate it?

Proposed architecture of the project

  • We will be using Power Automate, but these concepts also apply to Logic Apps.
    • Some of the actions require Premium licensing for Power Automate. There will be a workaround architecture that uses SSIS and SharePoint Online; this will help if you don’t have a Premium license for Power Automate or you cannot create Logic Apps due to cost or other limitations.
  • If you are going to use Logic Apps, please note:
    • If the data is on premises, you will need to set up an Azure data gateway for Logic Apps.
    • If it is an Infrastructure as a Service (IaaS) solution, you will still need a data gateway.
    • If it is in Azure and is a Platform as a Service (PaaS) solution, you do not need a data gateway.

Best Practice

  • If you are using Logic Apps to access data on premises, set up a data gateway.
    • If you have an existing gateway that is used by Power BI, you can potentially reuse it.
  • Use stored procedures to get the data from SQL Server.
  • Create a least-privilege account for read-only purposes in SQL Server that can execute stored procedures.

Security Group Member list automation

As soon as you mention automation, the question of how much code is involved comes to mind. Well, the good news is this will be codeless! We will be using Power Automate, which is part of the Power Platform family, to accomplish this; the same steps can be applied with Logic Apps.


  1. Prerequisites:
    • SQL login
    • SQL stored proc to get users' UPNs/emails
    • Account that has permissions on the security group to add new members
  2. Configure a data gateway for the Logic App (optional, only required if data is on-premises):
    • Configuration for Power BI, Power Apps, Power Automate
    • Configuration for Logic Apps
  3. Create a flow:
    • Power Automate or Logic App


SQL Login

We want to create a login, or use an existing one, that has least-privilege permissions; we really just need a login with execute permissions on a stored proc that reads from the users table. The login will look something like the below.

 -- Creates the login user_exec with password 'Password123'.
 CREATE LOGIN user_exec
     WITH PASSWORD = 'Password123';

 -- Creates a database user for the login created above.
 CREATE USER user_exec FOR LOGIN user_exec;

SQL Stored Proc

For the stored proc we will want to keep it simple, since it will just be retrieving the email addresses of the users.

 -- Creates stored procedure GetListOfUPNs.
 CREATE PROCEDURE [dbo].[GetListOfUPNs]
 AS
 BEGIN
     DECLARE @Email NVARCHAR(MAX);

     SELECT @Email = CONCAT(COALESCE(@Email + ',', ''), [EmailAddress])
         FROM [dbo].[User]
         WHERE void = 0;

     SELECT @Email;
 END
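The CONCAT/COALESCE pattern above accumulates one comma-separated string without a leading comma; a quick sketch of the same accumulation, with made-up emails:

```python
emails = ["alice@contoso.com", "bob@contoso.com", "carol@contoso.com"]

# Equivalent of SELECT @Email = CONCAT(COALESCE(@Email + ',', ''), EmailAddress):
# prepend a comma only once the accumulator is non-empty
acc = None
for e in emails:
    acc = ((acc + ",") if acc else "") + e

print(acc)  # alice@contoso.com,bob@contoso.com,carol@contoso.com
```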

We then need to grant the login created in the previous section the ability to execute this stored proc.

 -- Grant permission to the login to execute the stored proc
 GRANT EXECUTE ON [dbo].[GetListOfUPNs]
     TO user_exec;

And this concludes the SQL portions that are needed for the flow.

The Security Group

In this section we will look at what is needed for the security group; this will require an AD admin to help create or modify the permissions of the group.

In this case we are creating a new security group; we will call it PBI-Auto-Group and assign a few people to it. Configuring the owners is important, since you must be an owner of the group to be able to use Power Automate to update it.

I will keep the members empty since we can automate that process. So we will need an AD admin to make us an owner of the existing group or a newly created one.

Configure a data Gateway (Optional, only required if data is on-premises.)

Note: for these sections we will be using the enterprise data gateway, not the personal one.

Setup the gateway for Power BI, Power Apps, Power Automate

Note: if you already have a data gateway created for Power BI, you can skip the steps below. If you are using Logic Apps, see the Logic Apps section for how to set up its gateway.

Since the data is going to be on-premises and there is no gateway installed on the machine, we will need to configure the data gateway so that we create a secure connection from Azure. This will allow us to use Logic Apps and the SQL action to query the data from on-premises.

First, we will need to install the data gateway app on a machine; we can download it from this link. Once downloaded, we can start to configure the gateway by following the steps below.

We need to run the executable and following that we will need to agree to the terms of using this tool.

Following that, we will need to enter an email address that will be used with the gateway; ideally, the user installing the gateway should be the admin of the gateway.

Once we authenticate, we will need to set a name for the gateway and create a recovery key. IMPORTANT: save the key somewhere safe, because during a migration or update of the gateway we will need to provide it, and there is no way to recover it (as of writing this blog post).

Once installation is completed you should see the following screen, which confirms that the gateway is set up for use by Power Automate, Power Apps and Power BI. You will notice there is an option to create a gateway in Azure for Logic Apps; we will get to that shortly.

Setup the gateway for Logic Apps

You may not need this if you have Power Automate Premium. If you don’t, and you want to query the on-premises machine to get the emails, you will need to set up the gateway in Azure for Logic Apps.

While we have the on-premises data gateway app open, we can click the link to configure the gateway in Azure for Logic Apps.

You will be redirected to the Azure portal; complete the following steps:

  1. Add a unique resource name for the gateway
  2. Select the correct subscription
  3. Select the resource group or create a new one
  4. Select the region of the gateway you installed in the previous step, which is specified under the Logic Apps section (in our example above, we used West Central for both)
  5. Select the gateway that you created for the purpose of this task or an existing one

Create the gateway and reset the app.

Finally we have successfully configured the gateway for the logic app to connect to our on-premises data. Now we are ready to create the flow.

Create a Flow

Power Automate / Logic App

We will build the flow on a schedule, so we will use a recurrence trigger. You can set it to run whenever you want; the best option is during a period with less activity.

Following that we will choose the SQL "Execute stored procedure" action. We then need to create a new SQL connection to our database. Note that this is a premium action which requires a license; if you do not have one, look at the workaround section.

Select the ellipsis (…) and, under My connections, add a new connection.

Once we choose that option, we need to fill in the information we have from the previous sections; look at the screenshot below as a reference. We have to select "Connect via on-premises data gateway" and then specify the gateway we have access to. Finally, enter the SQL Server name and the database name, along with Basic as the authentication type, which is going to be our previously created SQL login.

Once you have setup the connection you can now select the server, the database name, and the procedure name (see screenshot below).

Note: I would recommend using the cool copy-to-clipboard / use-from-clipboard feature in Power Automate; skip ahead to see how to do it.

Now let's create a variable that will be our UPN array. To do this, add a new action, type "var" in the search bar, and select "Initialize variable." Give it a meaningful name and leave the initial value blank; we will set it later.

We will also need to do some validation to make sure that there are no bad emails. So we will also create a scope that will get the existing members of the group we want to update. This means we need to initialize a few more array and string variables.

For the last two variables, we will take advantage of a cool feature in Power Automate: copy to clipboard and use from clipboard, a great way to recreate an action based on an existing one. Our last two variables will be called bademails, which will store a string, and removedEmails, which will be an array.

Use the Copy to my clipboard functionality and then modify the type and the name. The last two variables should look something like this:

The step after that is to get the result of the stored procedure, which is the UPN list. Create a compose action and, in the inputs, add dynamic content; select ResultSets, similar to the screen below.

The results will look like this when we retrieve them from the output of the stored procedure.

    "Table1": [
        { "Column1": "email1,email2,email3, … " }
    ]

This is not an array, but rather it is JSON, so we will flatten it out to make it into an array which we can use later on.

To do that we will need to create another compose action, and in this one we will enter an expression rather than dynamic content. The expression we will use is:


This will give the item in the JSON results that is inside of the Table1 element.

Now let’s test and see what we have so far. Click on the beaker-looking icon in the top right corner.

The result from the last compose action should show

       { "Column1": "email1,email2,email3, … " }

Great news! Now we can set our array; we will need to use a set variable action.

Then we set the value to the following expression


This should flatten the JSON object into our upnlist array. Now that we have an array, we need to start manipulating it and finally insert these email addresses into the group. We can create our scope, list the members of the group we want, and from there append to the array.
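Outside of Power Automate, the flattening we just did amounts to the following (the payload values here are hypothetical):

```python
import json

# Hypothetical ResultSets payload from the Execute Stored Procedure action
raw = '{"Table1": [{"Column1": "alice@contoso.com,bob@contoso.com"}]}'
results = json.loads(raw)

column = results["Table1"][0]["Column1"]  # the single row inside Table1
upn_list = column.split(",")              # flatten the CSV string into an array

print(upn_list)  # ['alice@contoso.com', 'bob@contoso.com']
```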

We are going to use the O365 action "List group members" inside of the Lookup Members scope. From there we are going to select the group we want to update; if we wanted to also pass in a security group, that could be another option. So now the last action should start to look something like this screenshot:

After the step above we will need to iterate through each member. To do so we will use the "apply to each" action, and in its required field, "Select an output from previous steps," we will pass in the dynamic content named value; this is built-in content that passes in the array of group members inside that group. Following that we will select the "Append to array variable" action, and in the value box we will select expression. To avoid string-matching issues, convert the UPNs to lower case and do a string compare to get the matches. The expression to convert to lower case should be as follows: toLower(items('IterateThroughGroup')?['userPrincipalName']). Now the full scope should look like the screenshot below:

For our next scope, we will be adding users and creating a list of inactive and bad emails to remove. We will use the trick above to copy the previous scope from the clipboard. Once we have the scope, we will add a compose action where we split the array by comma (,) so that we can iterate through it and do our validations. In the inputs of the compose we will use this expression:


Now we need to iterate through each item in the array, so we will create an "apply to each" action and use the outputs of the "SplittingTheArray" compose. We will do the trick above to copy the append-to-array action to the clipboard and modify it for the list-to-compare array. Within the append-to-array action we will assign the value using the toLower function, toLower(items('IteratingUPNArray')). The scope should look like this now:

We can now do the validations: if the email is good/valid, continue with adding the user; if it's a bad/invalid email, append it to the bad email list. Following this validation process, we end the flow by sending an email to the DBAs/AD admins to do some cleanup.

To start the validation process we will use the "Search for user" action, which is under Office 365 Users, and we will use the same expression, toLower(items('IteratingUPNArray')), inside the search term field. We will then use an if condition to check whether the user is valid, with the following expression: length(outputs('SearchForUsers')?['body/value']). We compare whether the length of the "SearchForUsers" output is greater than 0: if it is, there is a matching item in Active Directory and therefore a valid email; if it is not, the email is invalid and will be appended to the bad emails list.

In the next screenshots we will see the body output of "SearchForUsers".

Note: if you get an error using this expression length(outputs('SearchForUsers')?['body/value']) you can try this length(body('SearchForUsers')?['value'])

Looks like we have an email that was bad/invalid and did not satisfy the criteria of the if condition, and it has been added to the bademails list. For the bad emails we used the "Append to string variable" action rather than appending to an array.

Next, we have to compare and see if the users are in the group or not, and if so then we skip adding them and move to the next user. We will need an if condition for this one again, this time we will use the current item with the list we got from the previous scope the Lookup Members scope.

We will say that if the ListExistingGroupMembers array does not contain the current item (which is a user login), it means we have to add it to the group. Following that, in the "if yes" side of the condition (meaning true), we will use the O365 action "Add member to group" so that we add the new email(s) to the existing security group. The expression for the current item will again be toLower(items('IteratingUPNArray')).

And that is the last step in this scope, in this scope we are doing a few things:

  1. Validating an email is good and exists on the tenant
  2. Validating that the email is not in the group
  3. Adding bad emails to a list
  4. Adding the new emails to the group while ignoring the already existing emails.

All that with minimal to no code, just minor expressions.
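The logic this scope implements can be sketched outside of Power Automate like so (all names and emails here are hypothetical stand-ins for the flow's variables):

```python
# Hypothetical data standing in for the flow's variables
existing_members = ["alice@contoso.com"]                 # ListExistingGroupMembers, lowercased
tenant_users = {"alice@contoso.com", "bob@contoso.com"}  # emails "Search for user" can resolve

upn_list = ["Alice@contoso.com", "bob@contoso.com", "ghost@contoso.com"]

bad_emails, to_add = [], []
for upn in upn_list:
    upn = upn.lower()                    # toLower() avoids string-matching issues
    if upn not in tenant_users:          # length(SearchForUsers output) == 0 -> invalid
        bad_emails.append(upn)
    elif upn not in existing_members:    # valid and not yet in the group -> add
        to_add.append(upn)

print(bad_emails)  # ['ghost@contoso.com']
print(to_add)      # ['bob@contoso.com']
```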

The next scope as you may have guessed is going to delete no longer existing or inactive members.

We will create a new scope very similar to the Lookup Members scope, so a quick copy to clipboard and reuse is the way we will start it.

There will be some minor modifications. First we will create an if condition inside of that "apply to each" action, and we will compare the ListToCompare array with the current item in lower case, which should be the most recent user list of the group. If the email is not contained in the ListToCompare array, it means the user is no longer valid/needed in the group and will be removed.

The expression for this will be toLower(items('IterateThroughGroup_2')?['userPrincipalName']), along with the HTML tag <br>. This tag has been used in two places, the removedemails array and the bademails string, because we will be sending an email containing both bad and removed emails.
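A sketch of the removal logic, again with hypothetical data:

```python
# Hypothetical data: fresh UPN list from SQL vs. current group membership
list_to_compare = ["alice@contoso.com", "bob@contoso.com"]  # ListToCompare, lowercased
group_members = ["Alice@contoso.com", "stale@contoso.com"]  # current members of the group

# Any current member missing from the fresh list gets flagged for removal
removed_emails = [m.lower() for m in group_members if m.lower() not in list_to_compare]

# Entries are joined with <br> so they render one per line in the notification email
summary = "<br>".join(removed_emails)
print(summary)  # stale@contoso.com
```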

The final outcome of the scope is in the below screenshot.

Finally, we need to create two send email actions: one for when the process succeeds, and the other for when there is a failure in the flow. After we select the first send email action, we will create a parallel branch and select yet another send email action.

For the failure branch we will need to set the "run after" setting and configure it to run only if the previous action has failed, is skipped, or has timed out.

And finally, the last step of the flow should look like the screenshot below:

With this we are ready to deploy and no longer worry about updating security group(s).


When invoking a Stored Procedure on an on-premises SQL Server, we have the following limitations:

  1. Output values for OUTPUT parameters are not returned. You can still specify input values for OUTPUT parameters.
  2. Return value is not available.
  3. Only the first result set is returned.
  4. Without premium we cannot use the SQL Actions in Power Automate.
  5. Without an Azure Gateway we cannot use logic apps to connect to the on-premises SQL.


As we saw, the process can be done completely in Power Automate or Logic Apps. Yet we might not have premium licensing for Power Automate, or we might not want to incur the cost of creating Logic Apps, so we can use the following workaround.

Workaround architecture

NOTE: It might be worthwhile to explore the licensing of Power Automate to create other solutions.

Workaround Prerequisites


  1. Prerequisites:
    • SQL Login.
    • SSIS package to export the Data from SQL
    • Account that has permissions on the Security group to add new members.
  2. SharePoint
    • Account access
    • PowerShell Script to upload the file to SharePoint.
  3. Create a Flow:
    • Power Automate.
    • Logic App.

Step 1 is the same as in the architecture above; no changes there.


We will need to use an account that is a member or owner of the SharePoint site. We will also need to create an app ID and secret, and get the tenant ID.

To do this step you can follow the steps in this link.

Once we have the App id and the App Secret and the Tenant id we can move on to the SSIS Package and the PowerShell script.

SSIS Breakdown

In the SSIS package we are doing two simple tasks. First is a data flow task, which will be used to run a SQL query whose output is exported to a flat file. We will use a .txt file with a comma to separate the values, which is the email list.

Within the OLE DB source inside of the data flow task, we use the query below to get the list of active emails:
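The original query isn't reproduced here; assuming the same table and column names as the stored procedure in the first architecture, it would be something along these lines (one email per row; the SSIS flat file destination writes them out comma-separated):

```sql
-- Hypothetical query mirroring GetListOfUPNs: active (non-void) users' emails
SELECT [EmailAddress]
FROM [dbo].[User]
WHERE void = 0;
```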

Once the query has completed, we then take that output and export it to the flat file mentioned above.

So far nothing too difficult, just a simple data flow task. The next step after the data flow is an execute process task. This is used because there is no way to upload files to SharePoint using SSIS. We will need to create a script that would upload the file there. For this, we will use PowerShell.

One thing to note is that we need to use the switch -ExecutionPolicy Bypass, because when we run the PowerShell script remotely it prompts the user to click "OK" to move to the next step; that switch bypasses the prompt. Then we use the -F (short for -File) switch, which executes the script from the location specified.

-ExecutionPolicy Bypass -F \\PATHtoPSScript\Csv2SpoREST.ps1

PowerShell Script Breakdown

In the code snippet below, we are specifying the use of TLS 1.2, and then we request the token used to authenticate to the SharePoint site, passing the content type in the header.

The next portion of the below snippet is identifying the Client ID, Client Secret, Tenant ID and Tenant Name of the SharePoint site, to get these details you can reference this link.

With the body we are passing the Client ID, Client Secret, Tenant ID and Tenant Name of the SharePoint site along with the header to allow us to authenticate to the SharePoint site and finally we are invoking the web URL.

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
# SharePoint token endpoint URL and request headers
$url = ''
$headers = @{
    'Content-Type' = 'application/x-www-form-urlencoded'
    'Accept'       = '*/*'
}

# client id and secret
$client_Id     = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
$client_secret = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

# tenant ID and name
$TenantID   = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'
$TenantName = 'SharePointTenantname'
$body = $("grant_type=client_credentials&client_id={0}%40{1}&client_secret={2}%3D&resource=00000003-0000-0ff1-ce00-000000000000%2F{3}{4}" -f $client_Id,$TenantID,$client_secret,$TenantName,$TenantID)

$item = Invoke-WebRequest -Method POST -Uri $url -Headers $headers -Body $body -UseBasicParsing

# Parse the token response
$response = ConvertFrom-Json $item.Content

$AbsoluteSiteUrl = "https://"

The next portion of the PowerShell script specifies the on-premises file created by the SSIS package we talked about in the previous section; this is the file we want to upload to the SharePoint site. Because Get-Content returns the file content as an array of lines, we combine that array into a single string before uploading the file in its entirety. The second portion of the script looks like the below.

$FilePath        = "FilePATHHERE"
$FileName2Upload = "upnlist.txt"

$url = $("{0}/_api/web/GetFolderByServerRelativePath(decodedurl=%27DocumentsGroupMembers%27)/Files/add(url=%27{1}%27,overwrite=true)" `
            -f $AbsoluteSiteUrl, $FileName2Upload)

# Creating the json header
$headers = @{
    'Authorization' = $("{0} {1}" -f $response.token_type,$response.access_token)
    'Accept'        = 'application/json; odata=verbose'
    'Content-Type'  = 'text/csv'
}

# Read the file exported by the SSIS package
$file_to_process = Get-Content $("{0}\{1}" -f $FilePath, $FileName2Upload)

$string2upload = ""

# If Get-Content returned an array of lines, join them into one long string
# so that we can upload the file to SharePoint in one go.
if ( $file_to_process.GetType().BaseType.Name -eq 'Array' ) {
    foreach ( $file in $file_to_process ) {
        $string2upload = $string2upload + $file
    }
}
else {
    $string2upload = $file_to_process
}

Invoke-WebRequest -Method POST -Uri $url -Headers $headers -Body $string2upload -UseBasicParsing

Create a Flow

Power Automate / Logic App

Similar to the above architecture, minus the SQL action and the gateway. The trigger will be “when an item is created or modified on a SharePoint site.”

We will use the When an item is created or modified which is a SharePoint trigger.

It will require a SharePoint site and a List Name/Document Library.

I will be using a document library called DocumentsGroupMEmber. (mind the misspelling)

The trigger should look like the below.

This replaces the recurrence trigger from the proposed architecture.

In the Add and Validate Users scope from the proposed architecture above, we will need to add an action before the compose; this action will be "Get file content." It requires a site address, which will be the same as above, and a file identifier. In the file identifier field we will need to select the dynamic content called Identifier.

We will then need to modify the SplittingTheArray compose.

We will need to change the expression to look at the body of the action that preceded it.

split(body('Get_file_Content'), ',')

These are the only two changes we need to do here.

We have our SSIS package which creates our file and uploads it to the SharePoint site and the document library we want to use. So, lets run the SSIS package and look at the Automation.

The final flow should look like the below; these were the minor changes from the proposed architecture.

We have cleaned up the flow and removed unneeded variables and actions from the proposed architecture.

We removed the UPNListArray Variable and the 2 compose actions after the BadEmailList Variable. We also removed the SQL Execute Stored Proc action.

Everything else stays the same.


Other Solutions

For a different approach on this take a look at the article below.

Azure Active Directory Security Group Automation with Power Automate


Posted by on December 16, 2020 in Database Administration, Other


