
Azure Security Fundamentals: Azure SQL Database

I’m continuing my Kloud blog series on the security fundamentals for various Microsoft Azure services; my most recent post covers Azure SQL Database.

Please go and have a read!


Azure SQL Database – Row Level Security Walkthrough

According to De La Soul, three is the Magic Number, so here’s the third post on the new Azure SQL Database features released over the last couple of months.

Previously I looked at Dynamic Data Masking and Auditing, both features that deliver Azure-specific functionality (or that rely on Azure features). This post will cover Row-Level Security (RLS), which will be familiar to many on-prem SQL Server admins and, until this release, was a feature missing from Azure SQL Database.

As with my previous posts I’m going to work with the AdventureWorks sample database that I have loaded into Azure SQL Database.

Note that I’m not looking to take a “best practices” approach here to how I implement policies and permissions – for that you can read MSDN or other SQL Server-specific blogs :).

Testing it out

The remainder of this post is mostly going to be SQL and some result sets – no need for pretty UI here!

The AdventureWorks database is a good sample as it has Sales information which includes quota, bonus, commission and year-to-date sales figures. If we run the query below we get a good feel for the data available.

SELECT PER.FirstName, PER.LastName, 
       SP.SalesQuota, SP.Bonus,
       SP.CommissionPct, SP.SalesYTD 
FROM Sales.SalesPerson AS SP
INNER JOIN Person.Person AS PER 
ON SP.BusinessEntityID = PER.BusinessEntityID

The results of this query are shown below.

FirstName         LastName              SalesQuota   Bonus      CommissionPct  SalesYTD
----------------- --------------------- ------------ ---------- -------------- -------------
Stephen           Jiang                 NULL         0.00       0.00           559697.5639
Michael           Blythe                300000.00    4100.00    0.012          3763178.1787
Linda             Mitchell              250000.00    2000.00    0.015          4251368.5497
Jillian           Carson                250000.00    2500.00    0.015          3189418.3662
Garrett           Vargas                250000.00    500.00     0.01           1453719.4653
Tsvi              Reiter                300000.00    6700.00    0.01           2315185.611
Pamela            Ansman-Wolfe          250000.00    5000.00    0.01           1352577.1325
Shu               Ito                   250000.00    3550.00    0.01           2458535.6169
José              Saraiva               250000.00    5000.00    0.015          2604540.7172
David             Campbell              250000.00    3500.00    0.012          1573012.9383
Tete              Mensa-Annan           300000.00    3900.00    0.019          1576562.1966
Syed              Abbas                 NULL         0.00       0.00           172524.4512
Lynn              Tsoflias              250000.00    5650.00    0.018          1421810.9242
Amy               Alberts               NULL         0.00       0.00           519905.932
Rachel            Valdez                250000.00    75.00      0.018          1827066.7118
Jae               Pak                   250000.00    5150.00    0.02           4116871.2277
Ranjit            Varkey Chudukatil     250000.00    985.00     0.016          3121616.3202

(17 row(s) affected)

So… some details there we don’t necessarily want to share amongst the sales team!

I’m going to set up RLS so that only someone logged in as a dbo-equivalent user can see all entries, with individual salespeople only able to see their own row.

The below SQL shows how to set this up.


-- create a test user (we'll do it without a login for this demo)
CREATE USER LindaMitchell WITHOUT LOGIN;

-- grant them select permissions on the two tables in our query
GRANT SELECT ON Person.Person TO LindaMitchell;
GRANT SELECT ON Sales.SalesPerson TO LindaMitchell;
GO

-- create a 'security' schema as per Microsoft's recommendations
-- (CREATE SCHEMA must be the first statement in its batch, hence the GO above)
CREATE SCHEMA Security;
GO

-- create an inline table-valued function that will provide our filter functionality
CREATE FUNCTION Security.fn_securitypredicate(@FirstName AS nvarchar(50), @LastName AS nvarchar(50))
    RETURNS TABLE
WITH SCHEMABINDING
AS
    RETURN SELECT 1 AS fn_securitypredicate_result
    WHERE USER_NAME() = (@FirstName + @LastName) OR USER_NAME() = 'dbo';
GO

-- now create the security policy on the Person table, passing in the FirstName and LastName columns
CREATE SECURITY POLICY SalesPersonFilter
ADD FILTER PREDICATE Security.fn_securitypredicate(FirstName, LastName)
ON Person.Person
WITH (STATE = ON);

Now that we have this set up we can re-run the previous query and will still get the same output as shown above, because the predicate allows anyone whose user name is ‘dbo’.
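
If you want to sanity-check the user context that the security predicate compares against, a quick query you can run (both now, and again inside the EXEC AS block shown below) is:

-- Returns 'dbo' for the server admin connection, and 'LindaMitchell' while impersonating that user
SELECT USER_NAME();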

However, what do we get when we run the query in the user context of ‘LindaMitchell’? Let’s find out.

EXEC AS USER = 'LindaMitchell';

SELECT PER.FirstName, PER.LastName, 
       SP.SalesQuota, SP.Bonus, 
       SP.CommissionPct, SP.SalesYTD 
FROM Sales.SalesPerson AS SP
INNER JOIN Person.Person AS PER 
ON SP.BusinessEntityID = PER.BusinessEntityID

REVERT;

Which results in:

FirstName         LastName              SalesQuota   Bonus      CommissionPct  SalesYTD
----------------- --------------------- ------------ ---------- -------------- -------------
Linda             Mitchell              250000.00    2000.00    0.015          4251368.5497

(1 row(s) affected)

Which is exactly what we wanted – Linda can only see her own data in this table!
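
If you want to pause or remove the demo once you’re done, the policy can be switched off or dropped without touching the underlying data. A minimal clean-up sketch, assuming the object names used above:

-- Temporarily disable the filter (all rows become visible to every user again)
ALTER SECURITY POLICY SalesPersonFilter WITH (STATE = OFF);

-- Or remove the demo objects entirely
DROP SECURITY POLICY SalesPersonFilter;
DROP FUNCTION Security.fn_securitypredicate;
DROP SCHEMA Security;
DROP USER LindaMitchell;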

So there we have the final piece of the major new features in Azure SQL Database – now an even better match for your migration to Azure.

Happy days!


Azure SQL Database – Auditing Walkthrough

I enjoyed writing my post on the new Dynamic Data Masking feature in Azure SQL Database so much that I thought I’d take a look at another recently added feature – Auditing.

Getting set up

I’m going to use the same AdventureWorks sample database from the last post for this one too.

Firstly, I’m going to initiate some changes at the actual Server level which individual Databases can inherit. So I’m going to open the Database Server and switch to the Auditing & Security tab as shown below.

Server Audit Settings

On this page I’m going to configure the settings shown below and select a storage account (which I’ve blanked out below). The Auditing feature utilises Azure Table Storage, which is a great choice as it (theoretically) supports up to 500TB of storage per storage account and, as Troy Hunt blogged, easily handles millions of rows.

Server Audit Default Settings

Note: while you’re here, make sure to download the report template (an Excel workbook), which we’ll use later.

Now that we have the server configured we’ll need to make a few minor tweaks at the Database level too. I am going to disable the use of unsecured connections to this database by flipping the following setting on the Database’s Auditing & Security tab and then saving.

Secure Connection Strings

Testing it out

For this testing I’m going to use SQL Server Management Studio again… so let’s log in…

Failed Login

D’oh! I forgot – I’m doing this on a 4G connection and the source IP I’m at isn’t in the firewall for the Database so my login failed. Let’s go and fix that and then try again. We’ll check later to see if that failure was logged or not.

I’m going to start by using the free Azure Storage Explorer to allow me to view the raw data in the Azure Storage Table into which my Azure SQL audit logs are being written. A sample is shown below.

Audit Log - Azure Explorer
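
If you’d rather poke at the logs from PowerShell than a GUI tool, the sketch below lists the tables in the audit storage account. The account name, key and the ‘SQLDBAuditLogs’ name prefix are placeholders for illustration – substitute your own values and check the actual table names in your account.

# Hypothetical storage account name and key - use the account you configured for auditing
$ctx = New-AzureStorageContext -StorageAccountName "myauditstorage" -StorageAccountKey "<storage-key>"

# List the tables; the audit log table name prefix is assumed here
Get-AzureStorageTable -Context $ctx | Where-Object { $_.Name -like "SQLDBAuditLogs*" }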

This is all very nice, but as you can see it will quickly become difficult to manage, and roll-up reporting will be a challenge. This is where the spreadsheet we downloaded comes into play: we can leverage Power Query to provide useful reporting.

If you open the spreadsheet you will see that it has instructions on how to set it up (hint: I’m not going to cover those again here).

After you have updated the Power Query data source to point to your audit logs you have a dynamic workbook tied to your logs which supports slicing and dicing of your audit data (and that you can publish to a Power BI dashboard if you so choose). Note this isn’t a one-time data sample – I can refresh my data as required as long as I have an Internet connection. A sample from the Excel workbook is shown below that highlights the Login Failure information on my server.

Audit Logs in Power Query

As with the other recently announced Azure SQL Database features such as Dynamic Data Masking, this offering is very much a “v1” implementation. Expect to see changes and increasing depth in this offering over time, because if anyone needs more information it’s the guys in the tinfoil hats!


Azure SQL Database – Dynamic Data Masking Walkthrough

Microsoft recently announced the public preview of the Dynamic Data Masking (DDM) feature for Azure SQL Database that holds a lot of potential for on-the-fly data obfuscation that traditionally would have required either custom business logic or third party systems.

In this post I am going to take the opportunity to walk through how we can set this up for an existing database. For the purpose of this post I am going to utilise the AdventureWorks sample database for Azure SQL Database which you can download from Codeplex.

Our Azure SQL Database Server

Firstly we need to understand our Azure SQL Database Server instance configuration. At the time of writing, any existing Azure SQL Database Server, or one created without specifying a Version flag, will use the “2.0” Azure SQL Database engine, which does not support DDM.

If this is your scenario then you will need to upgrade your server to v12 using the Azure Management Portal – make sure you follow the planning guidance provided by Microsoft. Note that an upgrade has minimal impact on running databases – there will be a brief interruption to service at the very end of the upgrade, but for the rest of the process your database will remain online.
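
If you’re not sure which engine an existing server is running, a quick check from any database connection is the query below – a v12 server reports a version string beginning with 12.0.

-- e.g. "Microsoft SQL Azure (RTM) - 12.0...." on a v12 server
SELECT @@VERSION;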

I’m going to set up my server using the following PowerShell, but if you want to specify a server name then you will need to use the preview portal (https://portal.azure.com/). Note the Version parameter I’m providing to specify the new engine release.


# Should return 0.8.14
(Get-Module Azure -ListAvailable).Version

# Note the backticks - each continued line must end with one so PowerShell
# treats this as a single command
New-AzureSqlDatabaseServer -Location "West US" `
                           -AdministratorLogin "bobsyouradmin" `
                           -AdministratorLoginPassword "N0tRe4l?" `
                           -Version "12.0"

Notes:

  • At the time of writing there is a known issue with this Cmdlet due to the time it takes to complete exceeding the maximum PowerShell timeout. The server will still be created but you will receive an error.
  • Make sure you open up the firewall on the server to allow your client machine to connect. You can do this via the Azure portal or PowerShell – a PowerShell sketch follows this list.
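
As an example, a rule for a single client IP can be added with the same PowerShell module used above. The rule name and IP address below are placeholders – substitute your server name and your machine’s public IP.

New-AzureSqlDatabaseServerFirewallRule -ServerName "{servername}" `
                                       -RuleName "MyClientMachine" `
                                       -StartIpAddress "203.0.113.10" `
                                       -EndIpAddress "203.0.113.10"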

Once the server has been created I’ll have an Azure SQL Database instance with the v12 engine ready for use. My next step is to deploy the AdventureWorks database schema and data. We can do this by leveraging the downloaded package (from Codeplex) and running the following command at an admin-elevated command prompt.


CreateAdventureWorksForSQLAzure.cmd {servername}.database.windows.net {user}@{servername} {password}

Once this command completes we will have a database set up that we can now query. I’m going to simply use SQL Server Management Studio (SSMS) for this. You will need a version recent enough to support Azure SQL Database (SSMS for SQL Server 2012 or 2014 will do).

This is the test query we’ll use to show how data masking works. I’m going to run the initial query as our admin user (‘bobsyouradmin’).

SELECT TOP 10 BusinessEntityID, FirstName, MiddleName, LastName 
FROM Person.Person

Which results in the following.

BusinessEntityID FirstName     MiddleName LastName
---------------- ------------- ---------- -----------
285              Syed          E          Abbas
293              Catherine     R.         Abel
295              Kim           NULL       Abercrombie
2170             Kim           NULL       Abercrombie
38               Kim           B          Abercrombie
211              Hazem         E          Abolrous
2357             Sam           NULL       Abolrous
297              Humberto      NULL       Acevedo
291              Gustavo       NULL       Achong
299              Pilar         NULL       Ackerman

(10 row(s) affected)

Getting the Mask On

Now that we’ve got everything set up, let’s go and mask the LastName field. Firstly we need to switch on Dynamic Data Masking for our Azure SQL Database instance via the Azure Portal.

Navigate to the Azure SQL Database instance and then select the Auditing & Security tab on the very right.

This is the Audit and Security Option

Then scroll down to the bottom and switch on Dynamic Data Masking as shown below. I’m going to add our current login as a privileged login too (this means it can query the data without a mask).

Data Mask Setup

At this stage, if I re-run the query above with the same credentials I will get back exactly what I saw before. This is expected because I excluded my admin user from the masking (and I’m not using a secured connection string).

Let’s go ahead and create another login and grant it access to the database – we’ll just make it a data reader so that it can run the SELECT query but not update anything.

Firstly, create a login for the server:

CREATE LOGIN untrusteddev 
WITH PASSWORD = 'N0tL0Lc4t5^' 
GO

Then create a user in the Master and AdventureWorks databases:

CREATE USER untrusteddev
FOR LOGIN untrusteddev
WITH DEFAULT_SCHEMA = [dbo]
GO

-- Add user to the database data reader role (not needed in Master DB)
EXEC sp_addrolemember N'db_datareader', N'untrusteddev'
GO

Now let’s close our connection to the Azure SQL Database and create a new one with our new login.

As we don’t trust the developer we’re going to give him a special connection setting to use. Instead of the standard

{servername}.database.windows.net 

we will provide the developer with

{servername}.database.secure.windows.net

We will turn off the ability for clients to connect via the old un-secured hostname as well.

Before we give our developer access to the database let’s make a tweak by adding a mask to our Person table. On the Auditing & Security page for our database I’m going to click “Add Mask” in the bottom menu and then add the mask as shown below.

Mask Rule

Now that this is in place we can provide our developer with the necessary connection details and they can connect via SSMS like so:

Secured Connection

When the developer logs in everything looks pretty much as it always does.

Let’s pop open a new Query Window in our AdventureWorks database and re-run our SQL query on the Person table from above.

SELECT TOP 10 BusinessEntityID, FirstName, MiddleName, LastName 
FROM Person.Person

So can we see the full Person details as before? Our result set looks like this:

BusinessEntityID FirstName     MiddleName LastName
---------------- ------------- ---------- -----------
285              Syed          E          XXXXXXXX
293              Catherine     R.         XXXXXXXX
295              Kim           NULL       XXXXXXXX
2170             Kim           NULL       XXXXXXXX
38               Kim           B          XXXXXXXX
211              Hazem         E          XXXXXXXX
2357             Sam           NULL       XXXXXXXX
297              Humberto      NULL       XXXXXXXX
291              Gustavo       NULL       XXXXXXXX
299              Pilar         NULL       XXXXXXXX

(10 row(s) affected)

The key takeaway from this is that we haven’t actually changed the underlying data – the database still contains the LastName of the Person – it’s just that the user context of the call means a configured mask is applied to the data as it is retrieved from the server.

We can use this approach for a lot of different data types and I expect eventually you’ll have a lot more customisation you can apply here. This is very much a “v1” release and will no doubt be added to over time. This is a great feature that may eventually remove some need to perform mass data cleansing operations when populating dev/test data sources with “production” data.
