Monthly Archives


CTP2 of the new SQL Server 2016 has been released to the public

With this new beta release of SQL Server we get better insight into what we can expect from the full product.
Some of the top capabilities in SQL Server 2016 CTP2 are:

  • Real-time Operational Analytics & In-Memory OLTP, enhanced for up to 30x faster transactions for a greater number of applications. Customers can configure the in-memory columnstore to work on top of a transactional database to achieve real-time operational analytics with breakthrough OLTP performance.
  • Always Encrypted helps protect data at rest, in motion and while in use, on-premises and in the cloud.
    With Always Encrypted, SQL Server can perform operations on encrypted data.
    Best of all, the encryption key resides with the application in the customer’s trusted environment.
  • Stretch Database technology keeps historical data at users’ fingertips by transparently stretching warm and cold data in a more secure manner to Microsoft Azure on demand without application changes.

I advise you to take a closer look at stretched databases in particular.

It is a technique that originates in Azure, where it is already very easy to scale a working database up or down.
And not only a single database!
A set of databases (called an Elastic Pool) can also be tailored to meet the demand for power of each individual database as it varies over time.
It is a brand new concept that you could use in use-cases such as having a database per customer or per project, where the scaling needs differ per database and over time.

In my next blog post I will dig deeper into the new concept of elastic databases.

Harry

Automatically building a Microsoft BI machine using PowerShell – preparation: install files using Azure File Service (post #3)

This post is #3 in the series to automatically build a Microsoft BI machine using PowerShell – see the start of series.


In this series so far:

  • Start of series – introduction and layout of subjects
  • Post #2 – Preparation: install files using Azure disk


In our last post we looked at one way of working with the install files required for automating the installation of a BI machine: using disks. This post will focus on sharing the install files using the Azure File Service, which exposes file shares using the standard SMB 2.1 protocol. It is in some ways an addition to storage accounts; see http://blogs.msdn.com/b/windowsazurestorage/archive/2014/05/12/introducing-microsoft-azure-file-service.aspx for more information. This service is in beta at the moment, so you will need to subscribe to the beta using the Azure Preview portal: http://azure.microsoft.com/en-us/services/preview/. Look for ‘Azure Files’ in the list and click ‘Try it’ to get your account activated for the preview.

The Azure File Service is not exposed in any portal, probably because it is in preview. Also, keep in mind that while the service is in preview, existing storage accounts will not have access to the File Service, so we will need to create a new storage account as well. To do this, log in to the portal, click ‘New’ and create a new storage account. After the storage account has been created, you will need to use PowerShell to create a file share. Make sure you have the latest version of Azure PowerShell installed and then run the following in Azure PowerShell or the ISE:
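
A minimal sketch of such a script, using the Azure Storage cmdlets of that generation; the account name, key and share name below are placeholders you will need to replace with your own:

    # Placeholders: use your own storage account name and key (from the portal).
    $accountName = "mystorageaccount"
    $accountKey  = "<storage account key>"

    # Build a storage context for the new account and create a file share on it.
    $ctx = New-AzureStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
    New-AzureStorageShare -Name "installfiles" -Context $ctx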

After this runs, you should be able to access the file share in multiple ways, but the easiest way I found is mapping the share as a folder in a VM by running:
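
Something along these lines should do it; the storage account name doubles as the user name and the account key as the password (again, placeholders):

    # Map the share to drive Z: from inside the VM (run in the VM, not on your desktop).
    net use z: \\mystorageaccount.file.core.windows.net\installfiles /u:mystorageaccount "<storage account key>"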

Now you can download and store files on the share just as you can with disks, as discussed in post #2 on install files using Azure disks.
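
For example, once the share is mapped, copying a downloaded installer over is a plain file copy (paths are placeholders):

    # Copy downloaded installers to the mapped share.
    Copy-Item -Path "C:\Downloads\*" -Destination "Z:\Resources" -Recurse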

Next post will be our final step of the preparation: logging.


Automatically building a Microsoft BI machine using PowerShell – preparation: install files using disk (post #2)

This post is #2 in the series to automatically build a Microsoft BI machine using PowerShell – see the start of series.

The first step in our preparation is making the install files available. I see two options for this: using a VHD / disk, or using the Azure File Service. In this post we will walk through how to make the install files available using a VHD / disk in Azure.

The way this works is by creating a new disk on which you store the installer files. After you have created the virtual machine that you would like to set up automatically, you attach that disk so the installer files are available to the machine and thus to the automatic installer script.

To create a new disk, log in to the Azure portal (either the production or the preview portal) and navigate to Virtual Machines. Select a machine you have created; this could be the one you will be setting up or any other VM, since the disk you will create is re-usable across machines. Once you have a machine selected, click ‘Attach’ and select ‘Attach empty disk’ to create a new empty disk:

Enter a file name for the new disk and set the size in GB you expect to be using. No need to change the host cache settings here. When done, click the button at the bottom right:
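
Since this series is all about automation: the same can be scripted with the classic (Service Management) Azure cmdlets. A minimal sketch, where the cloud service name, VM name, size and label are placeholders:

    # Attach a new, empty 50 GB data disk to an existing VM.
    Get-AzureVM -ServiceName "myservice" -Name "myvm" |
        Add-AzureDataDisk -CreateNew -DiskSizeInGB 50 -DiskLabel "InstallFiles" -LUN 0 |
        Update-AzureVM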

After the disk has been created, log in to the machine you attached the disk to using Remote Desktop, so you can download the install files and save them to the disk you have just attached. In the VM, start ‘Disk Management’ (for example by right-clicking the Windows button and selecting it from the list). You will see a notification to initialize the disk. Accept the defaults and click OK:

Once the initialization is done we need to create a partition on the disk. In Disk Management, right-click on the new disk and choose ‘New Simple Volume’. Follow the steps of the wizard, taking note of the drive letter assigned. Also make sure to set a volume label and wait for the format to finish.
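
Initializing and formatting can also be scripted inside the VM with the Storage cmdlets that ship with Windows Server 2012 R2. A minimal sketch, assuming the new disk is the only raw disk attached and using a placeholder volume label:

    # Initialize the raw disk, create one partition spanning it, assign a
    # drive letter and format it as NTFS with a label.
    Get-Disk | Where-Object PartitionStyle -eq 'RAW' |
        Initialize-Disk -PassThru |
        New-Partition -AssignDriveLetter -UseMaximumSize |
        Format-Volume -FileSystem NTFS -NewFileSystemLabel "InstallFiles" -Confirm:$false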

Once the formatting is done, you can download and store your install files on the disk. I created a folder in the root of the disk named ‘Resources’ and created a sub-folder per software item required. I saved the install files in these folders; the scripts we will create will point to them. I store the scripts on the same install disk as well.
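
A folder layout like that is quickly scripted too; the drive letter and folder names below are placeholders matching my setup:

    # Create the Resources root with a sub-folder per software item.
    $root = "F:\Resources"
    "SQLServer2014", "SharePoint2013", "Scripts" | ForEach-Object {
        New-Item -Path (Join-Path $root $_) -ItemType Directory -Force
    }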

When you are done downloading the files (and, later on, creating the scripts), you can move the disk to another machine: in the portal, select the VM that currently has the install disk attached, click ‘Detach disk’ and choose the disk to detach. You can then re-attach the disk to another VM.
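
Scripted, the detach and re-attach could look roughly like this with the classic Azure cmdlets (service, VM and disk names are placeholders, and the LUN must match the one used when attaching):

    # Detach the data disk at LUN 0 from the first VM...
    Get-AzureVM -ServiceName "myservice" -Name "myvm" |
        Remove-AzureDataDisk -LUN 0 |
        Update-AzureVM

    # ...and attach the existing disk to another VM by its disk name.
    Get-AzureVM -ServiceName "myservice" -Name "othervm" |
        Add-AzureDataDisk -Import -DiskName "InstallFiles" -LUN 0 |
        Update-AzureVM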

This is an OK way to work with the install files. In the next post we will explore an alternative way using Azure File Service.


Automatically building a Microsoft BI machine using PowerShell – Start of Series

I used to spend quite some time building and re-building Microsoft BI demo machines. As you can imagine, this manual process takes a lot of time and effort. Therefore (and also for my own education on PowerShell) I decided to look into automating the whole process, which I will explain in this series of posts.

The goal

In the end, we want to have a virtual machine that is configured as follows: Windows Server 2012 R2 with the Active Directory Domain Controller role. Additionally, SQL Server 2014 is installed and configured, as well as SharePoint 2013. Finally, the BI tools, like Power Pivot and Power View, are configured.

Ok, but how do we build such a machine?

Here are the steps to take. I always do them in this order, partly because there are some dependencies and partly because it stops me from going insane.

  0. Install Windows (doh). I will skip this step (therefore it is number 0) since I use Azure and a VM in Azure comes with Windows Server pre-installed. I happen to use Windows Server 2012 R2, by the way.
  1. Disable Internet Explorer Enhanced Security Configuration. Although it is a great idea (see http://technet.microsoft.com/en-us/library/dd883248(v=WS.10).aspx for more info on this), it is hard to give a good demo on a machine with this thing on, so the first step is disabling it (a sketch of this step follows below the list).
  2. Set up Active Directory; AD is required for the PowerPivot service.
  3. After AD has been set up we need to promote the Domain Controller.
  4. After promotion we configure a very unrestrictive password policy; remember, this is just a demo machine!
  5. Virus protection is important, even for a demo machine; therefore set up System Center Endpoint Protection.
  6. Install SQL Server 2014.
  7. Install SharePoint.
  8. Install the PowerPivot Service.
  9. Configure the PowerPivot Service.
  10. Configure the last parts of the PowerPivot Service.
  11. Configure Master Data Services.
  12. Configure Data Quality Services.
  13. Configure the other SharePoint Service Applications.
  14. Activate the SharePoint site features.
  15. Add favorites in Internet Explorer pointing to the MDS and SharePoint sites.
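
To give a feel for what these scripts will look like, here is a minimal sketch of step 1: disabling IE Enhanced Security Configuration by flipping the IsInstalled flag on its two well-known Active Setup registry keys:

    # Active Setup component IDs for IE ESC (administrators and users).
    $adminKey = "HKLM:\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A7-37EF-4b3f-8CFC-4F3A74704073}"
    $userKey  = "HKLM:\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A8-37EF-4b3f-8CFC-4F3A74704073}"

    # Setting IsInstalled to 0 switches IE ESC off for both groups.
    Set-ItemProperty -Path $adminKey -Name "IsInstalled" -Value 0
    Set-ItemProperty -Path $userKey  -Name "IsInstalled" -Value 0

    # Restart Explorer so the change takes effect immediately.
    Stop-Process -Name explorer -Force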

In this blog series I will share my PowerShell code to accomplish all this. Please note that I am not a developer, so things can probably be done a lot smarter :)

The next step is the preparation: the install files.

First look: Project Oxford – powerful face, vision, language and speech APIs

Just announced: Project Oxford (http://www.projectoxford.ai), a project that aims to provide powerful APIs for developers who are looking to add face, speech and language recognition capabilities to their applications. The Project Oxford site provides a way to interact with the APIs currently available.

In this first look we will focus on the Face APIs, which can be used for face detection (finding faces in photos, determining gender and age), face verification (checking that the person in two photos is the same), similar face searching, face grouping and face identification.
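
As an illustration, calling the preview face detection endpoint from PowerShell could look roughly like the sketch below. The endpoint path, query parameters and response shape are my assumptions based on the preview documentation and may differ; the subscription key and photo URL are placeholders:

    # Placeholders: get a subscription key from the Project Oxford site.
    $key  = "<your Project Oxford subscription key>"
    $body = @{ url = "http://example.com/photo.jpg" } | ConvertTo-Json

    # Ask the detection endpoint to estimate age and gender for each face found.
    $faces = Invoke-RestMethod `
        -Uri "https://api.projectoxford.ai/face/v0/detections?analyzesAge=true&analyzesGender=true" `
        -Method Post `
        -Headers @{ "Ocp-Apim-Subscription-Key" = $key } `
        -ContentType "application/json" `
        -Body $body

    # Each detected face should include a rectangle plus estimated age and gender.
    $faces | ForEach-Object { $_.attributes }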

Here is a sample result of the Face Detection API (you can do this yourself too!) using a photo of me:

This is actually quite good: this picture was taken a while ago and I was 29 at the time, so not too far off.

The Face Verification API allows you to check if the person in two photos is the same. Here is the result of my test (again, do this yourself!):

Have a look at that, I turn out to be the same person in both photos :)

This is very powerful stuff and I am looking forward to using it in projects. I will keep you posted on that.
