Tag Archives

12 Articles

SQL Server 2017: It’s here!

You have waited anxiously for this day: the day you could finally get your hands on the latest version of the best database platform available. Today is that day. SQL Server 2017 is available, and it runs on Windows, Linux and Docker. Find more info here: https://www.microsoft.com/en-us/sql-server/sql-server-2017. What changed? So many things that it is hard to mention them all. Have a look at the Ignite sessions to get a feel for what happened.

 

Power BI Refresh scenarios against data storage solutions explained

A recurring theme with customers and partners is Power BI data refresh. More specifically, there is some confusion about which refresh scenarios require a Pro license and which can be done with the Free version of Power BI. I made the diagram below to help explain this. It shows the refresh scenarios against data storage solutions, such as SQL Azure, SQL Server in a virtual machine in Azure, or SQL Server on premises. I used these as examples; there are other options as well, but I think the overall picture carries over to other data storage solutions.

The diagram shows the refresh scenarios that can be done using a Power BI Free account as orange lines and the refresh scenarios that need Power BI Pro as green lines. As shown in the diagram, if you want to refresh against on-premises sources or a database running in a VM in Azure, you will need a gateway and Power BI Pro. This applies not only to the creator of the report and the refresh schedule, but also to every consumer. If you use PaaS solutions for data storage in Azure, such as SQL Azure, it becomes a bit more nuanced and really depends on the type of refresh required. If you need a refresh cycle of more than once a day (either up to 8 times per 24 hours, or live), you will need Power BI Pro. If once a day is enough, you can refresh against a PaaS source such as SQL Azure using Power BI Free. Again, the license requirement carries over from author to viewer: if the author of the report requires Pro, then the viewers also need Pro.

Power BI Refresh scenarios against data storage solutions

Hope this helps. If you have any questions or feedback, please comment below!

SQL Server on Linux? Yes, really!

If someone had told me a few years ago that Microsoft would go multi-platform and that SQL Server would also be released on Linux, I would have declared them crazy. But since the arrival of Satya Nadella as CEO and people like Scott Guthrie (CVP for Cloud & Enterprise), a fresh wind has been blowing through Microsoft!

As a Microsoft employee I naturally felt something coming, but it is always great to see the news land in the outside world: yesterday Microsoft announced the Linux version of SQL Server in preview. For now this covers only the RDBMS and not the other SQL tools such as SSRS and SSAS, but it is a great start to be able to run the (according to Gartner) leading database on Linux as well!

I would love it if you would comment on this blog with your first impressions of the performance of this preview version after testing it.

Scott's blog post about the announcement: http://blogs.microsoft.com/blog/2016/03/07/announcing-sql-server-on-linux/

SQL greetings,

Harry

Platform as a Service (PAAS) moving at rocket speed!

As an Enterprise Architect in an organization, life has always been dynamic, to say the least! It is your responsibility to keep up with the latest developments in ICT, both in technology and in architecture. In the old days of on-premises only, that was a big challenge. But with the Cloud as an integral part of your information systems, it has become even more complex.

But still… the Cloud mostly meant moving VMs to Amazon or Microsoft. Architecturally not that complex. Identity & access, of course, but that's about it. Then came Platform as a Service (PAAS). That was something completely different! Not moving VMs to the Cloud, but moving complete technical workloads to the Cloud: an ESB in the Cloud, Media Services, federated identity, Storage, and so on.

This does impact your architecture!

A blazing 78 new PAAS services were introduced in Azure in 2014. So it is moving at rocket speed! And to be fair, not only at Microsoft: other Cloud vendors are also moving into the PAAS area with new services.

What is the impact for you as Enterprise Architect?

In your normal day-to-day work you make choices based on software you can purchase and implement in your own data center. But now you should at least ask yourself, for every choice you have to make: do I want to do this myself, or shall I take this as a service from one of the Cloud vendors?

An example: your organization wants to use Cloud services from multiple Cloud vendors, but you want a single sign-on experience for your users. You can buy a federated identity server, research how to connect to each Cloud vendor, and then build the connections yourself. But you can also use the Windows Azure Active Directory federation service (ADFS) from Microsoft, with over 2,600 Cloud applications already pre-integrated.

Second example: you have a new web application to deploy. Again, you can buy a few servers, install IIS, SQL Server and the application, and schedule things like backups, patch management, storage, etc. But you can also take a web role to host the web application and an Azure SQL database to host your data, and let Microsoft worry about backups, three replicas for DR, patching the servers, and so on.

So my message to all you Enterprise Architects out there: examine the PAAS offerings from the Cloud vendors carefully before making expensive buy decisions. My recommendations to check out:

Azure Service Bus, Azure Machine Learning, WAAS, BizTalk Services and Azure SQL Database. In the next blog post I will dig deeper into Azure SQL Database.

SQL Server 2014 Generally Available since April 1st

This is not a late April Fools' joke. SQL Server 2014 was made generally available on April 1st, 2014. See the original blog post here: http://blogs.technet.com/b/dataplatforminsider/archive/2014/04/01/sql-server-2014-now-generally-available.aspx

I encourage you to download and try SQL Server 2014, or run it in a VM in our Azure cloud. You might also want to block your calendar for a live stream on April 15: “Accelerate Your Insights” (http://www.microsoft.com/en-us/server-cloud/new.aspx?WT.mc_id=Blog_SQL_Launch_SQL2014).

Setup and register a Data Management Gateway for Power BI

In this post we will have a look at how to install, setup and register a Data Management Gateway for Power BI. It involves installing software on a server which will function as the gateway as well as configuration through the Power BI Admin Center. Once set up you can use the gateway to publish on-premises sources for use through Power BI.

To create and register a Data Management Gateway, follow these steps:
    1. In the Power BI Admin Center, click gateways and click ‘new gateway’.
    2. Specify a name for the gateway and an optional description. Then, select ‘Store the credentials securely in the cloud’ to store credentials in the gateway rather than on the machine where the gateway is installed. Storing credentials in the cloud enables you to restore the gateway to another machine if the original one is lost. Click ‘Next’.
    3. Click ‘Create’ to create the gateway. Then we need to copy the Gateway Key for later use. Click ‘Download’ to download the Data Management Gateway software. Install the software on your machine and start the Gateway Setup wizard.
    4. In the wizard paste the Gateway key you copied. Finally go to the Power BI Admin Center and click ‘Finish’. Your gateway should now be listed as ‘Online’.
    5. Switch back to your setup wizard and choose whether you want to use an existing certificate or a Power BI-generated certificate to protect the credentials. Click ‘Next’. On the specify endpoint access page, specify whether you want to use HTTP or HTTPS and make the configuration settings accordingly. Click ‘Next’ and ‘Finish’ to close the wizard. Make sure the gateway name matches the name in the Admin Center and that it shows as Registered and Started.

That’s it! You have now successfully installed and registered a Data Management Gateway for Power BI. Next up is creating sources. In a future post I’ll discuss how to create an OData feed through the Power BI Admin Center.

Publish SSIS Packages as OData Feed Sources with Power BI

With the upcoming advent of Power BI, on-premises data sources can be published as OData feeds to users. By default you can only register SQL Server data sources. With a clever trick you can also register SSIS packages as data sources, using the new Data Streaming Destination and the Microsoft OLE DB Provider for SQL Server Integration Services (SSISOLEDB).

How it works is that you use a special destination (the Data Streaming Destination) in the SSIS package. Then, when you publish the package to the SSIS Catalog, you can run the SSIS Data Feed Publishing Wizard, which enables you to create a view. That view can then be published as an OData feed, as can any other view.

In this post we will walk through an example. You will need SQL 2012 and the SSIS Data Feed Publishing Components, available here: http://www.microsoft.com/en-us/download/confirmation.aspx?id=39931 (pre-release).

  1. Install the SSIS Data Feed Publishing Components. After installation, make sure the components are successfully installed and configured by using SQL Server Management Studio. Connect to your SQL Server, expand Server Objects, Linked Servers, Providers and make sure you see the SSISOLEDB provider. Double-click SSISOLEDB, enable ‘Allow Inprocess’ if it is not enabled and click OK.

     

     

  2. Create a new Integration Services project (or, of course, edit an existing one); I’ll work with a new one just to keep it simple. I named my project SSISODataPublishing. Drop a Data Flow task onto the control flow and double-click it.
  3. In the data flow, define a source (I connected to an AdventureWorks database). Also specify the table you want to publish as OData feed (I used DimProduct).
  4. Now drag and drop the Data Streaming Destination to the data flow. Of course in real life the package would not be this simple: you would get the data from sources, transform it and then finally use the destination. For demonstration purposes I am keeping it simple here. Connect the OLE DB Source to the new Data Streaming Destination like so:

  5. What I noticed is that the Data Streaming Destination does not support all data types. For example, LargePhoto is a varbinary column and is not supported. To fix this we need to deselect it in our OLE DB source. Do this by editing the source: on the Columns page, deselect LargePhoto and click OK:
  6. Now there is nothing left to do but deploy the package. Right-click the project in the Solution Explorer and choose Deploy. Click Next and use Browse to enter the destination SQL Server name to publish the project to the Integration Services catalog. Then click Browse to enter the path. If you run into the following information message, you have not set up the SSISDB catalog correctly yet:
     

    To fix this go to SQL Server Management Studio, connect to your SQL Server, right-click Integration Services Catalog and choose ‘Create Catalog…’. In the next dialog enter a password for the encryption key protection and click OK.

    After this return to your Integration Services Deployment Wizard and choose browse to enter the path again. Now you can create a new folder in the catalog and select it. Click OK to enter the path:

    Click Next and choose Deploy to finish the deployment.

     

  7. When the deployment finishes, make sure all items show Passed and no errors or warnings exist. Then click Close.
  8. Now that the deployment has been done, we need to publish the package as view in a SQL Server database. To do that we need to use a new wizard called the SSIS Data Feed Publishing Wizard, which can be started by running the ISDataFeedPublishingWizard.exe from C:\Program Files\Microsoft SQL Server\110\DTS\Binn or just search for Data Feed Publishing Wizard in Windows.
  9. In the wizard, click Next on the welcome page.
  10. On the package settings page enter the name of the server where you deployed the package and enter the package. If you have any parameters configured you can enter values here or bind them to environment variables:
     

    When done click ‘Next’.

  11. On the Publish Settings page select a database where you want the view for the SSIS package to be created and enter a name for the view. Make sure to choose a LinkedServer. If it does not exist already, the wizard will create it. Note you can also specify Use32BitRuntime and Timeout options here:
  12. Before clicking ‘Next’, choose ‘Advanced’ and specify the Schema you want to use, if you want to use encryption to encrypt data before sending it over the network and what certificate you will use for the encryption. I left everything at the default (schema dbo, no encryption). Click ‘Next’.
  13. On the validation page you will see a warning if no linked server exists yet. That is not a problem, since the wizard will create it automatically. Make sure that no other warnings or errors exist and click ‘Next’.
  14. Review the selections you made and click ‘Publish’ to actually create the view.
  15. Review the progress and make sure no warnings or errors exist. Note that the query you will need to use to get the output data of the package is displayed at the bottom of this screen:
  16. Click ‘Finish’.
  17. Test the view by running the query above in SQL Server Management Studio:
  18. So far all is OK. The last step is to create a data source and enable the OData feed in the Power BI Admin Center. You will need a working Data Management Gateway configured in order to do this. When done, you can use the feed in Power Query. See my post on how to configure the Data Management Gateway.
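For reference, the view the wizard publishes is essentially a wrapper around OPENQUERY against the SSISOLEDB linked server. A sketch of what it roughly looks like (the view, linked server, folder, project and package names here are illustrative, not literally what the wizard generates for you):

```sql
-- Roughly the shape of the view created by the SSIS Data Feed Publishing Wizard:
-- querying it starts the package and streams the Data Streaming Destination output.
CREATE VIEW [dbo].[DimProductFeed]
AS
SELECT * FROM OPENQUERY([SSISFeedServer],
    N'Folder=SSISODataPublishing;Project=SSISODataPublishing;Package=Package.dtsx');
```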

How to use a configuration (.config) file in SQL Server CLR

Config files (web.config, app.config) are a great way to store configuration settings outside of your program’s code. If the configuration changes (for example, you move from development to test, or from test to production), you do not need to change your code and re-compile. You simply change the config file and you are good to go. In SQL CLR you can also use this concept; however, there are some gotchas.

First create your SQL Server CLR project. Note that you will have to have the correct versions of SQL Server Data Tools. You can download what you need from here: http://msdn.microsoft.com/en-us/data/hh297027. I have SQL 2012 installed and am using Visual Studio 2012, so I installed the SQL Server Data Tools for Visual Studio 2012 from the link above.

To start create a new database project and add a new SQL CLR C# User Defined Function (you can also use VB if you like). Add a reference to System.Configuration to your project. Make sure you include the following line in your cs file: ‘using System.Configuration;’.

Now, the config file needs to be stored in the \Binn folder under the root path of the SQL Instance. To get the root folder run the following code in SQL Server Management Studio:
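One way to retrieve the instance root path (a sketch using the xp_instance_regread extended stored procedure; the registry key shown is the standard setup key for a SQL Server instance):

```sql
-- Read the instance root path from the registry
DECLARE @SQLRoot nvarchar(512);
EXEC master.dbo.xp_instance_regread
    N'HKEY_LOCAL_MACHINE',
    N'SOFTWARE\Microsoft\MSSQLServer\Setup',
    N'SQLPath',
    @SQLRoot OUTPUT;
SELECT @SQLRoot AS InstanceRootPath;
```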


Add ‘\Binn’ to the result of this query. My path was: C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Binn. There, create a file named ‘sqlservr.exe.config’ (make sure to get the name right!). You can use Visual Studio for this. A sample config file would be as follows:
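A minimal sqlservr.exe.config along these lines (the connection string value is an example; MyConnString is the setting name the CLR code retrieves):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <connectionStrings>
    <!-- Example value; only the name MyConnString matters to the CLR code -->
    <add name="MyConnString"
         connectionString="Data Source=MyServer;Initial Catalog=MyDatabase;Integrated Security=True" />
  </connectionStrings>
</configuration>
```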


Of course you can specify anything you want in the config file. To keep things simple I chose only to configure a sample connection string.

In your cs file, all you need to do is create a function that retrieves your configuration from the config file (in my case just the connection string).

Here is the code you will need to put in your CS file:
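A sketch of what that function can look like (the function name GetConnString is mine, for illustration; System.Configuration is among the assemblies supported in SQL CLR):

```csharp
using System.Configuration;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    // Returns the MyConnString connection string from sqlservr.exe.config
    [SqlFunction]
    public static SqlString GetConnString()
    {
        ConnectionStringSettings setting =
            ConfigurationManager.ConnectionStrings["MyConnString"];
        return setting == null
            ? SqlString.Null
            : new SqlString(setting.ConnectionString);
    }
}
```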


As you can see, this code retrieves the value of the MyConnString setting from the config file.

If, after the code has been deployed, you make any changes to the configuration file that you would like to be picked up, you will have to either restart the SQL instance (not really an option in most cases) or free the system memory caches, which unloads the CLR memory clerk, by running the following in SQL Server Management Studio:
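A sketch of that cache flush (note that FREESYSTEMCACHE ('ALL') clears all caches, not just the CLR clerk, so use it with care on a busy system):

```sql
-- Frees the system caches, including the CLR memory clerk,
-- so the config file is re-read on next use
DBCC FREESYSTEMCACHE ('ALL');
```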


This is unlike normal .NET applications where configuration file changes take effect immediately.

When done, right-click your solution in the Solution Explorer, choose ‘Publish’ and publish your solution to the SQL Server. Then, in Management Studio, you should be able to find your function in the database you deployed to:


Now we can test our function by executing it:
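Assuming the function was deployed as dbo.GetConnString (a name I am using for illustration), the test is a simple SELECT:

```sql
-- Run this against the database you deployed the function to
SELECT dbo.GetConnString() AS MyConnString;
```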


Be sure to execute this against the database you deployed to.
Your connection string should be returned:


There you have it: you can now use configuration files from SQL Server CLR.

 


Generating perfect test data using Visual Studio and custom Data Generators

In many development projects the need arises to fill a database with some random data. By random data I do not mean mere gibberish, but actual data such as email addresses, phone numbers, SSNs, numeric values, dates, datetimes, etc.

With Visual Studio 2010 you can do this very easily. All you need is an edition of Visual Studio 2010 that includes the database projects (I think you will need Ultimate, but I am not sure). And of course you will need a database; I used SQL Express in this example.

However, it turns out that the data generators included in Visual Studio are somewhat limited. Visual Studio can generate data for many data types in SQL Server, but generating, for example, a phone number, SSN or email address is not possible.

Fortunately, one can easily create custom data generators, which is exactly what we will do: I will build two custom data generators, one for email addresses and one that generates Lorem Ipsum text.

Building the custom data generators
Start with creating a new class library project. In the solution explorer, right-click the project and click ‘Add Reference’. Click the ‘.NET’ tab and select the following components: Microsoft.Data.Schema, Microsoft.Data.Schema.SQL and Microsoft.Data.Schema.Tools. Click ‘OK’ to add these references.

Open your class (I renamed my class and class file to LoremIpsumGenerator) and add the following using statements:

Now we need to inherit the Generator class by changing our class definition to:

Also add the DatabaseSchemaProviderCompatibilityAttribute to your class declaration as so:

Now we need to add an output property. This will be used to wire up the output of your generator to Visual Studio Data Generation Plans.

Also, if you need to specify some parameters to your data generator we need to create some input properties. I have created three: one for the number of words to generate, one to indicate if the data needs to start with ‘Lorem Ipsum…’ every time and one to specify if just random words can be selected or if we are more interested in sentences. Also note the default settings at the bottom.

Then all we need to do is override the OnGenerateNextValues() method. Since I wanted to unit test my generator, I have just one line in this method, which calls a public void method (which my testing command line project can call as well). In this method (called Generate) I build the output based on Lorem Ipsum and set my Output property.
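Pieced together, the skeleton of such a generator looks roughly like this (a sketch: the class, property and namespace names are mine, and the attribute parameters shown are illustrative; the base types come from the Microsoft.Data.Schema assemblies referenced earlier):

```csharp
using Microsoft.Data.Schema;
using Microsoft.Data.Schema.Tools.DataGenerator;

namespace MyDataGeneratorsLibrary
{
    // Declare compatibility with the database schema provider
    [DatabaseSchemaProviderCompatibility(typeof(DatabaseSchemaProvider))]
    public class LoremIpsumGenerator : Generator
    {
        // Output property: wired up to a column in the Data Generation Plan
        [Output(Description = "Generated Lorem Ipsum text")]
        public string OutputText { get; private set; }

        // Input properties: configurable per column in the Properties window
        [Input(Description = "Number of words to generate", DefaultValue = 30)]
        public int NumberOfWords { get; set; }

        [Input(Description = "Always start with 'Lorem ipsum...'", DefaultValue = true)]
        public bool StartWithLoremIpsum { get; set; }

        protected override void OnGenerateNextValues()
        {
            base.OnGenerateNextValues();
            Generate();
        }

        // Public so a test project can call it directly
        public void Generate()
        {
            // ...build the Lorem Ipsum string here and assign it...
            OutputText = "Lorem ipsum dolor sit amet";
        }
    }
}
```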

I will not discuss how exactly I managed to get the Lorem Ipsum words returned. If you want to know you can download the source code. Also, my email address generator follows the same structure of output, inputs (in this case none) and a generate function that creates the email address string and sets the output property.

Registering the data generators
Once your data generators build successfully, we need to sign the assembly with a strong name. To do this, right-click your project’s name and choose Properties. Go to the Signing tab, select the ‘Sign the assembly’ check box and under ‘Choose a strong name key file’ choose ‘New…’. Follow the wizard and enter a password. Your screen now looks like this:

Now save all, and rebuild your solution.

Next we will need to create an XML file to accompany your DLL. Just add an XML file to your project and name it [YourProjectsName].Extensions.xml. My project (and thus my DLL) is named MyDataGeneratorsLibrary, so the filename of the XML is MyDataGeneratorsLibrary.Extensions.xml.

Open the XML and replace the contents with this (you will have to fill in your own values).
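A sketch of what the file contains (the type values must match your own class names and namespace, the assembly value your own DLL, and the PublicKeyToken placeholder must be replaced with your token; the exact attribute set may differ slightly by Visual Studio version):

```xml
<?xml version="1.0" encoding="utf-8"?>
<extensions xmlns="urn:Microsoft.Data.Schema.Extensions"
            assembly="MyDataGeneratorsLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=REPLACE_WITH_TOKEN"
            version="1.0">
  <extension type="MyDataGeneratorsLibrary.LoremIpsumGenerator" enabled="true" />
  <extension type="MyDataGeneratorsLibrary.EmailAddressGenerator" enabled="true" />
</extensions>
```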

As you can see, we need an extension element for each generator. In my library I have created two generators (one for Lorem Ipsum text and one for email addresses). You will also have to specify the correct type (which is your class name including namespace) and the assembly (which again is your DLL / project name). The last item to enter is the public key token. To get it, open your Visual Studio Command Prompt and use

sn -T MyDataGeneratorsLibrary.dll

to get the public key token returned to you. You will need to enter the correct path and file name. Copy and paste the public key token into the XML.

One last thing I did is right-click the XML and choose Properties. I changed the ‘Build Action’ to ‘Content’ and the ‘Copy to Output Directory’ property to ‘Copy always’. It is not strictly required, but it makes the next step just a bit easier.

Now, build your solution and navigate to your solution’s directory. In there, go to the bin\Debug folder and copy your DLL and XML file.

Now it is time to register the data generators with Visual Studio. Be sure your Visual Studio 2010 instance is closed and navigate to the following folder: C:\Program Files (x86)\Microsoft Visual Studio 10.0\VSTSDB\Extensions . It is good practice to keep your own extensions in a separate folder, so I created a folder ‘CustomGenerators’ and I advise you do the same. Next, paste your DLL and XML files into this folder.

Start Visual Studio.

Using the data generators to generate data
In Visual Studio, add a new project to your solution or create a new solution. Select SQL Server Database Project as the project type and enter a fancy name. I assume you already have built your database schema. Right click on your project name and choose ‘Import Objects and Settings’ and follow the wizard to get your database schema into your project. Next, add a new Data Generation Plan and open it. Select the table you want to use and specify the number of rows to insert.

You will see a list of the columns in the table. To the right (under ‘Generator’) you can specify which generator you want to use to generate data. The list should now contain your custom generators. Select one and click the preview button to test your generator. Here is my screenshot:

If you have any input properties specified you can change the values by selecting the row and opening the properties window, see below screenshot for my Lorem Ipsum generator.

That concludes my introduction to custom data generators. Download the source code (with the Lorem Ipsum and email address generators) on GitHub.

Installing Master Data Services add-in for Excel 2013

I recently picked up my new laptop, which of course runs Office 2013 and Windows 8.
When you try to install the Master Data Services add-in for SQL 2012, you may run into a warning that says you need Office 2010.

There is an easy fix: just install the Master Data Services add-in for SQL 2012 SP1 (get it here: http://www.microsoft.com/en-us/download/details.aspx?id=35581). Be sure to pick the 32- or 64-bit version to match your Office version.

Oh, and of course you will have to get the other prerequisite as well: the Visual Studio 2010 Tools for Office runtime (find it here: http://www.microsoft.com/en-my/download/details.aspx?id=35594).

See http://support.microsoft.com/kb/2774422 for more information.
