
Multilingual SSRS reports – Scenario 2: Change RDL

This is the third post in my series about multilingual SSRS reports. If you missed the introduction, you can find it here.

In this post we will talk about the second implementation scenario, which changes the RDL after creating it. The diagram below helps to understand this:

This means that developing the report is independent of making it available in multiple languages: there is no impact on the process of creating a report, whereas the first scenario (custom assembly) had a rather big impact.

The downside of this solution is that a separate process manipulates the RDL after it has been developed: if the RDL language schema changes (and it does with just about every new release of SQL Server) you will have to check whether your code still works.

Now, the process that changes the RDL could do two things: 1) change the original RDL and add localization that will actually localize the report at run time or 2) change the original RDL and make a copy of it for every language (essentially you get the same report multiple times).

The first option is just an automated version of scenario 1 (the custom assembly) which we discussed earlier. It eliminates the biggest issue with that scenario: the manual work that has to be repeated for every label. What this option does not do, however, is translate parameter prompts, which option 2 does. The downside of option 2 is that multiple copies of the report get created, one for each language. That has no impact on rendering the report and may be a good choice if you want to manage each language separately. You will need to decide for yourself what you want to do; the basic architecture of this scenario stays the same.

In this post we will deal with the latter option (option 2).

I envision the process that changes the RDL as a process that gets executed periodically. The process reads the RDL and translates any text it finds, again using the translation table, resource file or whatever solution you picked for storing translations.

Implementing the process is out of scope for this blog because it is a matter of reading an XML file (RDL is XML), changing some items and writing it to disk. Any .NET developer could do it, for example using XPath.
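To make the idea concrete, here is a minimal C# sketch of such a process using XmlDocument and XPath. This is an illustration only: the translation dictionary stands in for your translation table or resource file, and the file names are placeholders. The one real gotcha it demonstrates is that RDL declares a default XML namespace, which every XPath query must reference through a registered prefix.

```csharp
using System.Collections.Generic;
using System.Xml;

public static class RdlLocalizer
{
    // Replace every literal TextRun value in the RDL with its translation.
    // Values that are expressions (they start with '=') are left untouched.
    public static void Localize(XmlDocument rdl, IDictionary<string, string> translations)
    {
        // RDL uses a default XML namespace; XPath needs a prefix bound to it.
        var ns = new XmlNamespaceManager(rdl.NameTable);
        ns.AddNamespace("r", rdl.DocumentElement.NamespaceURI);

        foreach (XmlNode value in rdl.SelectNodes("//r:TextRun/r:Value", ns))
        {
            string translated;
            if (!value.InnerText.StartsWith("=") &&
                translations.TryGetValue(value.InnerText, out translated))
            {
                value.InnerText = translated;
            }
        }
    }

    // Option 2 from this scenario: read the original RDL, write a copy per language.
    public static void LocalizeFile(string inputPath, string outputPath,
                                    IDictionary<string, string> translations)
    {
        var rdl = new XmlDocument();
        rdl.Load(inputPath);
        Localize(rdl, translations);
        rdl.Save(outputPath);
    }
}
```

Calling LocalizeFile("Report.rdl", "Report.nl-NL.rdl", dutchTranslations) would then produce a Dutch copy of the report; the same approach extends to the other localizable items discussed in this post.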

The trick of course is knowing what to find in the RDL and what to change.

The simplified structure of RDL (SQL 2012) is the following (I stripped away all that is not related to localization):


<?xml version="1.0" encoding="utf-8"?>
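As a rough sketch (element names follow the 2010/01 schema, intermediate elements are collapsed for brevity as in the paths listed next, and the Name values are placeholders):

```xml
<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition">
  <DataSets>
    <DataSet Name="DataSet1">
      <CommandText>SELECT ...</CommandText>
    </DataSet>
  </DataSets>
  <ReportSections>
    <ReportSection>
      <Body>
        <ReportItems>
          <!-- TextBox, Tablix and Chart elements live here -->
        </ReportItems>
      </Body>
      <Page>
        <PageHeader>
          <ReportItems />
        </PageHeader>
        <PageFooter>
          <ReportItems />
        </PageFooter>
      </Page>
    </ReportSection>
  </ReportSections>
  <ReportParameters>
    <ReportParameter Name="Parameter1">
      <Prompt>Prompt text</Prompt>
    </ReportParameter>
  </ReportParameters>
</Report>
```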

As you can see, there are just a couple of items we need to look for when scanning the RDL:

  • Report.DataSets.DataSet
    DataSets define the queries to the source systems. If we want to localize result sets we need to manipulate the query here.
  • Report.ReportSections.ReportSection.Body.ReportItems / Report.ReportSections.ReportSection.Page.PageHeader.ReportItems / Report.ReportSections.ReportSection.Page.PageFooter.ReportItems
    ReportItems can be TextBox, Chart, Tablix, which will be discussed in more detail later.
  • Report.ReportParameters.ReportParameter
    Parameter prompts can be localized here.

Localizing a DataSet
A dataset defines the <CommandText>, which is essentially the query to the source system. When changing the RDL, one can easily add a where-clause to the query indicating the language to render:

Where translationtable.language='en-us'.

What you will be looking for is Report.DataSets.DataSet.CommandText to do this.
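For example, after manipulation the relevant part of a DataSet could look like this (table and column names are placeholders, intermediate elements collapsed for brevity):

```xml
<DataSet Name="DataSet1">
  <CommandText>
    SELECT item, translation
    FROM   translationtable
    WHERE  translationtable.language = 'en-us'
  </CommandText>
</DataSet>
```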

Localizing ReportItems
ReportItems can be TextBoxes, Charts and Tablixes, each of which carries one or more labels that need localization.
The structure of a TextBox looks like this:
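A simplified sketch (note that the RDL schema spells the element Textbox; the Name and label values are placeholders):

```xml
<Textbox Name="Textbox1">
  <Paragraphs>
    <Paragraph>
      <TextRuns>
        <TextRun>
          <Value>Label to localize</Value>
        </TextRun>
      </TextRuns>
    </Paragraph>
  </Paragraphs>
</Textbox>
```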


You will want to localize what is inside the <Value></Value> tag.

For Tablixes you will also be looking for the <Value></Value> tags inside the TextRuns in cells, which are just TextBoxes. Here is the basic structure of a Tablix:
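Simplified, with placeholder names and most of the layout elements stripped:

```xml
<Tablix Name="Tablix1">
  <TablixBody>
    <TablixRows>
      <TablixRow>
        <TablixCells>
          <TablixCell>
            <CellContents>
              <!-- a regular TextBox; its TextRun Value holds the label -->
              <Textbox Name="Textbox2">
                <Paragraphs>
                  <Paragraph>
                    <TextRuns>
                      <TextRun>
                        <Value>Header to localize</Value>
                      </TextRun>
                    </TextRuns>
                  </Paragraph>
                </Paragraphs>
              </Textbox>
            </CellContents>
          </TablixCell>
        </TablixCells>
      </TablixRow>
    </TablixRows>
  </TablixBody>
</Tablix>
```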


You will want to change what is inside the <Value></Value> tag of each TextBox. The TextBox here has the same structure as above.

Finally, Charts are a bit different; their basic structure is like this:

            <ChartSeries Name=""/>
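Expanding that fragment into a simplified sketch of the localizable chart elements (the captions are placeholders):

```xml
<Chart Name="Chart1">
  <ChartData>
    <ChartSeriesCollection>
      <ChartSeries Name="Series1" />
    </ChartSeriesCollection>
  </ChartData>
  <ChartAreas>
    <ChartArea>
      <ChartCategoryAxes>
        <ChartAxis>
          <ChartAxisTitle>
            <Caption>Category axis title</Caption>
          </ChartAxisTitle>
        </ChartAxis>
      </ChartCategoryAxes>
      <ChartValueAxes>
        <ChartAxis>
          <ChartAxisTitle>
            <Caption>Value axis title</Caption>
          </ChartAxisTitle>
        </ChartAxis>
      </ChartValueAxes>
    </ChartArea>
  </ChartAreas>
  <ChartLegends>
    <ChartLegend>
      <ChartLegendTitle>
        <Caption>Legend title</Caption>
      </ChartLegendTitle>
    </ChartLegend>
  </ChartLegends>
  <ChartNoDataMessage>
    <Caption>No data available</Caption>
  </ChartNoDataMessage>
</Chart>
```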

You can localize the following items on charts:

  • Series name: Chart.ChartData.ChartSeriesCollection.ChartSeries.Name
  • Axis title: Chart.ChartAreas.ChartArea.ChartCategoryAxes.ChartAxis.ChartAxisTitle.Caption and Chart.ChartAreas.ChartArea.ChartValueAxes.ChartAxis.ChartAxisTitle.Caption
  • Chart legend title: Chart.ChartLegends.ChartLegend.ChartLegendTitle.Caption
  • No data message: Chart.ChartNoDataMessage.Caption

Localizing Report Parameters
ReportParameters define data types, default values, valid values and also the prompts. The prompt is what you will want to localize.

The basic structure of the ReportParameter definition is:
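Simplified, with placeholder values:

```xml
<ReportParameter Name="Language">
  <DataType>String</DataType>
  <DefaultValue>
    <Values>
      <Value>en-us</Value>
    </Values>
  </DefaultValue>
  <Prompt>Select a language</Prompt>
</ReportParameter>
```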


You will be looking for the ReportParameter.Prompt tag.

That concludes our overview of changing the RDL to implement localization. I agree it is not the most elegant solution, as it increases dependency and complexity in your environment; however, it is fairly simple to implement and provides a complete localization opportunity, from datasets to report items and even parameter prompts.

Stay tuned for the next implementation scenario!

Power BI and why you should care

Yesterday Microsoft announced Power BI for Office 365: a self-service Business Intelligence solution delivered through Excel and Office 365. Power BI ties together the various bits and pieces we already had (Power View, PowerPivot, GeoFlow, Data Explorer) and also introduces some exciting new functionality. In this post I will introduce you to Power BI and discuss the various capabilities. Future blog posts will deal with the components more in-depth.

(By the way, do not let the tag “for Office 365” put you off; GeoFlow (now Power Map) and Data Explorer (now Power Query) are available as add-ins for Excel regardless of whether you use Office 365 or not.)

Power BI
The image below shows the Power BI platform. I have broken it down in two segments: Excel and Office 365.


With Power BI we take the next step in making Excel a true BI tool. BI developers used to smile when I talked about Excel and told me Excel helped end users create non-transparent, spaghetti-like BI solutions. I have to admit that used to be true. Now with Power BI, anything you do with data in Excel, from loading and cleansing it using Power Query, to modelling and enriching it using PowerPivot, to finally displaying it using Power View and Power Map, is structured and traceable. No more page-long formulas. No more copy-paste, hidden sheets, linked formulas and other nightmares for us BI folk.


ETL – Power Query
Previously named Data Explorer, Power Query is our self-service ETL tool in Excel. Power Query can connect to just about any data source you throw at it and enables you to load data, cleanse it and then use it in your PowerPivot model. It even includes a natural language search function that helps you find information in your organization or on the web if you do not actually know where the info could be. Imagine having loaded something from your corporate data warehouse and then adding relevant external information (such as weather or population info) from the web without having to leave Excel! Once you have loaded data you can add and drop columns, change data types, split columns, combine tables, filter data, remove duplicates, etc. Power Query not only connects to “standard” databases and files, but also includes connections to Facebook, Hadoop (HDFS and Azure HDInsight), SharePoint and any OData feed. All steps you take in Power Query are stored in a script, so it is clear where data came from, what happened to it along the way and where it got displayed. See my screenshot below: I did a web search for ‘population of European cities’ and clicked on a Wikipedia page to get the data into Excel.


Analysis – PowerPivot
PowerPivot has been out for a while and has gotten quite some attention. PowerPivot allows you to do data modelling with massive amounts of data in Excel. With massive, I mean huge. I keep repeating this as long as I keep meeting people who still think that Excel cannot handle 5 million records (that happened to me yesterday). With PowerPivot it is easy to load data from various sources, link them together (essentially creating a data model) and apply formulas. Using PowerPivot you create a structured model for your data in Excel. And it is fast. (Did I already mention it can handle lots of data?).

Continuing on from the example I started above, I added an Excel sheet with stores per city and their sales to the data model (the Excel sheet has just one sheet, which contains a simple table listing StoreID, SalesAmount and City). Then I related the two tables by dragging City from my Excel sheet to the Name column of the Wikipedia data I loaded using Power Query. The resulting data model is shown here:

Now I can do interesting stuff, such as add a calculation to figure out sales per inhabitant (Sales per Capita), by adding a column to the Stores table with the following formula: =[SalesAmount]/RELATED(Cities[Population]) (dividing SalesAmount by the related city’s population).


Reporting – Power View
Ah yes, Power View. The tool that is so easy to use that even my mom can use it (and it’s true). Power View enables you to create great-looking, interactive reports with just a few mouse clicks right there in Excel. Just select what you need, decide how to show it and you’re done. Power View includes all the standard things: tables, matrixes, column charts, bar charts, pie charts. It also includes some great features that introduce time as a factor in your analysis by allowing you to create scatter plots with a play axis (think bouncing bubbles). Moreover, Power View can display images right there in your report and includes 2D mapping functionality.


In my example, with just a few clicks I created this report (I selected the Netherlands as country in the bottom right graph to show the highlighting capabilities in the other graphs). Also note the texts above each item to understand what is displayed here.


Geospatial – Power Map
Power Map (previously known as GeoFlow) is a very powerful 3D mapping tool. It allows you to plot any data on a map, as long as it makes sense. For example, just trying to plot your products on a map might not make sense. However, plotting your stores on a map makes a lot of sense. You do not need to specify longitude and latitude or other fancy stuff. Just some text is enough and the tool will go out and try to plot it on a map. Just try it: enter some city or venue names in Excel and click Insert → Map. Two more clicks and you have plotted the information on a map!

In my example, here is what I created using Power Map. Again, this took me just two minutes:

(Above shows total sales amount and sales per capita per city, plotted on the 3D map).


BI Sites
A BI site is an optimized workspace dedicated to BI. You might call this a data marketplace: it is a one-stop shop where you go to get anything related to BI. You go there to consume a report, create a new analysis, share an analysis, discover some new insights using the items provided and find information.


Natural language query – Q&A
This is a feature I particularly love! It gets us closer to Minority Report: just type what you’re looking for and we’ll find it and display it. Once information is published to the BI site (for example through the Data Management Gateway (below), but also just by uploading an Excel sheet), you can search through all that information just by typing a question. In my example this might look like ‘sum of sales amount by country’. You can change the way the information is displayed by adding ‘as map’ or ‘as bar chart’ to your question. I do not have a demo available right now, so I’ll just include a screenshot here. The user typed ‘number of gold medals by country in 2008’; the information is retrieved from an Excel sheet (note that the user has not explicitly asked to get data from that particular sheet) and shown as a map (since we know this is geospatial information).


Manage and monitor
Power BI empowers data stewards: business users can grant access to published data sets and track who is accessing the data and how often. This brings to mind the PowerPivot management dashboard we know and love.


Data Management Gateway
The data management gateway allows IT to build connections to internal data sources (think your data warehouse or other LOB information source) so reports that are published to BI sites can get that data easily.


Mobile Access
Last but not least: mobile access (woohoo!). Users can access their reports through an HTML5-enabled browser or through a mobile application on Windows or iPad. This means that Silverlight is no longer a requirement for accessing Power View reports. Other platforms might be added later.


And this is relevant…how?
So you have read this and maybe read some other blogs as well. You’re thinking to yourself: why should I care?

My question then is: do you use Excel? Well yes, any person who has ever worked with a PC has used Excel.

Exactly. That’s why you should care. You should care because the good old Excel which you thought you knew so well has suddenly transformed into a cool kid on the block with lots of great and really easy-to-use features.

Those features enable you to find any data, work your magic and then gain insight from that data. Just think about that. How could you use this in your business? And in your personal life? (I am looking to buy a house myself. Power BI has allowed me to understand in which neighborhoods I would like to live and which not, just by finding and visualizing data.) I know there are specialized, paid services for that (they involve sending a text and paying for the info). I did it myself in half an hour, paid nothing and learned a lot more about the question at hand.

Try out connecting to Facebook for example, and plot your friends on a map! Or find out who has not disclosed their gender to Facebook… :)

This might be a revolution: Power BI brings the might of information analysis tools to anyone to consume any data for any scenario. The possibilities are endless. It is just a matter of using your creativity.

Since you are human, using your creativity is probably what you really want to do. Power BI: be creative with data.

Multilingual SSRS reports – Scenario 1: Assembly

This is the second post in my series about making SSRS reports multilingual. If you missed the first post, the introduction and comparison of the solution scenarios, you can find it here.

This solution consists of an assembly registered in each report that should support multiple languages / localization. The assembly retrieves translations from a source, such as a database, resource file, web service or online translation service. Anything is possible here.

This solution is limited in that it will translate neither dataset results nor parameter prompts. In addition, there is an impact on the report creation process, as you will see further down. The upsides are that it is a relatively straightforward solution with only a small impact on report rendering performance.

1. Creating the assembly
To implement this solution we first need the assembly that will do the actual translation / localization. To build this assembly fire up Visual Studio and create a new Class library project. I will use C# here (and in all my samples on this blog, but another language would work just as well). I named the project SSRSMultiLingualAssembly.

When the new solution and project has been created, rename Class1 to something that makes more sense. I renamed it to SSRSMultiLingual.

Open the class and add the following method to it:

public string Translate(String cultureInfo, String item)
{
    //Put logic here
    //return the translated string
    return "PLACEHOLDER: " + item + " translated into " + cultureInfo;
}

What you need to do is complete the method: look up the translation of item for the given cultureInfo and return the translated result. How you do that is very dependent on your actual situation and fairly independent of the scenario you choose to implement.
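For illustration only, here is a minimal sketch backed by an in-memory dictionary; a translation table, resource file or web service lookup would follow the same pattern. The cultures and labels below are made-up placeholders:

```csharp
using System;
using System.Collections.Generic;

namespace SSRSMultiLingualAssembly
{
    public class SSRSMultiLingual
    {
        // Placeholder translation store, keyed by culture and then by label.
        // In a real solution, load this from your translation table,
        // resource files or an online translation service.
        private static readonly Dictionary<string, Dictionary<string, string>> translations =
            new Dictionary<string, Dictionary<string, string>>(StringComparer.OrdinalIgnoreCase)
            {
                { "nl-NL", new Dictionary<string, string> { { "Sales", "Verkoop" } } },
                { "de-DE", new Dictionary<string, string> { { "Sales", "Umsatz" } } }
            };

        public string Translate(string cultureInfo, string item)
        {
            Dictionary<string, string> labels;
            string translated;
            if (translations.TryGetValue(cultureInfo, out labels) &&
                labels.TryGetValue(item, out translated))
            {
                return translated;
            }
            // No translation found: fall back to the original label.
            return item;
        }
    }
}
```

Falling back to the untranslated label keeps the report readable when a translation is missing, which is usually preferable to showing an error.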

For now, let’s continue to the SSRS side of things to tie things together. Later we can deal with actually making it do something.

After you have added the code, build the solution and make sure there are no errors.

2. Copy assembly to the SSRS folders

To make the assembly you have just built accessible from reports, you will need to copy it to the SSRS directories, including any resource files or whatever else your assembly needs to work. Of course you could make this part of a custom build action to automate it, but for now here is the manual method.

Copy the assembly into the Report Server path: C:\Program Files\Microsoft SQL Server\MSRS11.MSSQLSERVER\Reporting Services\ReportServer\bin
(this path may change based on installation location and version of SQL you have installed). This will register the assembly with the report server, which is strictly only required after deployment of the reports to the server and not during design / development time.

In order to make it accessible at design time, the assembly also needs to be copied to the report designer folder. Copy and paste the assembly to C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies (again, this path may change based on installation location and the version of SQL / Visual Studio you are working with). Note the (x86) in C:\Program Files (x86): the same directory also exists in C:\Program Files if you are working on an x64 system (which I assume you are).

By the way, if you use reporting services in native mode (i.e. not integrated with SharePoint) you also might have to copy the same assembly to C:\Program Files\Microsoft SQL Server\MSRS11.MSSQLSERVER\Reporting Services\ReportManager\bin. I have not tried this since I am running SSRS in SharePoint integrated mode.

Note that you could also add the assembly to the global assembly cache (GAC).

3. Add the assembly to your reports
In order for your reports to successfully use the assembly to translate text, open a report in Report Designer (your Visual Studio environment).

From the report menu select report properties. In the dialog box click references.

Click add and then click on the ellipsis button (…) at the end of the newly added row. In the next dialog box (add reference) click browse, locate the assembly and click ok.

Verify that your assembly has been added. In the bottom part of the dialog enter your class name (assemblyname.classname) and set up an instance name. Note that the class name is case sensitive. Your instance name can be anything, I chose to set my instance name to ‘myML’.

Close the dialog by clicking ok.

Now that we have added the custom assembly to the report it is time to call the translate function to translate text!

4. Calling the Translate function
Now that the report has a reference to the custom assembly, let’s wire up the last part to actually translate text. Start by either adding a textbox to your report or selecting a textbox (or any other label for that matter). Right-click the textbox and select expression… (Expressions allow you to do some limited programming that gets executed when the report renders). In the expression dialog box, enter the following code:

=Code.<your instance name>.Translate(User!Language, "<your text to translate>")

For example, with the settings I made the expression dialog box looks like this:

Click ok to close this dialog. Never mind the red squiggly line.

Now, the big moment is there! In report designer click preview and see the results of your labor!

Come on, pat yourself on your back, you have successfully registered a custom assembly and used it in a report. That wasn’t too hard now was it?

Note that you will have to set up this expression with the correct parameter value for every single text on the report that you want to translate. This is very laborious and error prone. Also, this is the reason why this solution does not translate parameters.

5. Change Reporting Services configuration and deploy report
Once you have completed the report and are certain it works OK, it is time to deploy it to the report server. For the assembly to work there, you will need to have the assembly in the appropriate places (see above), make some configuration changes and restart your Report Server after any change to the assembly, including the first registration.

If you do not make the required configuration changes you will get the following error when deploying:

Assuming you have already copied the assembly to the correct directories as indicated above, let’s make the required configuration changes. You will have to open the rssrvpolicy.config file located in C:\Program Files\Microsoft SQL Server\MSRS11.MSSQLSERVER\Reporting Services\ReportServer. Open the file and find the <CodeGroup> tags. There will be a bunch of them. Basically what we need to do is add one of our own after the last <CodeGroup> tag. This is what you will have to add:

<CodeGroup class="UnionCodeGroup" version="1" PermissionSetName="FullTrust" Name="SSRSMultiLingualAssembly" Description="Grants full trust to the multilingual assembly">
  <IMembershipCondition class="UrlMembershipCondition" version="1" Url="C:\Program Files\Microsoft SQL Server\MSRS11.MSSQLSERVER\Reporting Services\ReportServer\bin\SSRSMultiLingualAssembly.dll" />
</CodeGroup>

Change the Url value to match your situation. This code group gives the assembly in the directory specified above (which you will have to change based on your installation) full trust permissions. For testing purposes you could drop the SSRSMultiLingualAssembly.dll part and give any assembly in the directory full trust; in production, however, you will have to set up the code group as specifically as above, so why not do it right from the start?

Now it is time to restart the Reporting Services service. To do this either use services.msc or use SQL Configuration Manager.

Now, deploy the report and render it. You can test different languages by switching your Internet Explorer to another language (Settings → Internet Options → Languages).

Rejoice! You have successfully implemented this scenario of localization of SSRS reports. It may be time to check out the other implementation scenarios.





Multilingual SSRS reports

When thinking about supporting multiple languages for SSRS reports, most people think about changing the display language of one or more of the following items:

  • Labels (text boxes, axis labels, table headers, etc.)
  • Dataset results
  • Parameter prompts

When considering presenting SSRS reports in multiple languages, one also needs to consider where to store translations. Translations can either be:

  • stored in a data source (database, cube) or resource file, either accessed directly from the report or through a service call
  • retrieved from an external translation service, such as Bing Translate

In most solutions retrieving a predefined translation would be preferable to doing an automatic translation using an external translation service, since the quality of the results can be questionable. A scenario I come across often is that the external translation service is only used when a predefined translation is not available. The scenarios discussed in this series of posts do not pose any requirement on how the translation is retrieved. Reporting Services can pass the user’s language setting along. That can then be used to get the right translation.

Additionally, some of the items considered when thinking about a multilingual SSRS solution are:

  • Can the report be developed once and presented in multiple languages? All of the scenarios discussed in this series provide this capability.
  • Impact on report creation process. Does the solution chosen require manual activities when designing the report?
  • Performance
  • Implementation complexity

In this series of blog posts we talk about options for implementing multilingual SSRS reports. Below is a quick scoring of each option on the above capabilities and requirements. Of course the importance of requirements and the correct choice depends on the situation.

Each scenario will be discussed in a post in this series.

The comparison covers four scenarios (Assembly, Change RDL, Report Definition Customization and Custom ReportViewer), each scored on the following requirements:

  • Translate labels (textboxes, axis labels, table headers, etc.)
  • Translate dataset results
  • Translate parameter prompts
  • Impact on report creation
  • Impact on report rendering performance
  • Implementation complexity

Did I miss a requirement or solution? Please let me know!

Next time: Scenario 1: Assembly.

Do you want to jump to a specific scenario? Here you go:

Scenario 1: Assembly
Scenario 2: Change RDL
Scenario 3: Report Definition Customization
Scenario 4: Custom ReportViewer

Meet Paul

Those of you who have attended one of my talks on BI probably know this story. I get asked about it a lot, so I wanted to share it more permanently. For me this story sums up the chance we have with Microsoft BI to fix one of the biggest issues in the corporate world.

Some time ago I worked as a BI consultant on a data warehouse project at a major customer. All floors in the 20-floor office were like the ones you see in movies: mindless, endless rows of cubicles. My cubicle was one in what they called ‘the front row’, which I think meant ‘at the central aisle’, which ran from the door to the managers’ offices in the back.

One day the door opened and someone who looked a bit like Charlie Chaplin walked out onto the floor. He was dressed like an old-school English gentleman, complete with hat, newspaper and umbrella. He wore a yellowish dress shirt, blue suspenders and a brown tie with little blue bears on it. I estimated him to be about 70 years old. I am not that good at guessing ages, he might have been 75. Anyway, it was clear that he was well beyond retirement age. He looked around a bit and waited. In the back of the floor his arrival was noticed and someone hurried over to him and guided him to one of those identical cubicles a little further from where I was. Since I felt this was going to be interesting I went to get some coffee and made sure I passed that cubicle on my way. ‘Charlie’ sat at an old computer (remember those CRT monitors?) and I saw him do something that grabbed my attention: he started Microsoft Excel version 5.0. When I got back from the coffee machine I stopped at his cubicle again and saw him busily typing away. Some moments later a matrix printer that also stood there sprang into action and started spitting out some papers. He collected his stuff, took a quick glance at the papers, handed them over to the guy who had greeted him at the door and left.

I had the chance to peek at what was on those papers and I am no expert but to me it seemed a lot like a profit and loss statement. That got me puzzled even more, so instead of returning to my cubicle I walked over to the office of the BI manager, who was also my project lead. I described what I saw (‘older man came in, sat in a cubicle, pushed some buttons, printed some pages and left’). The BI manager looked at me and nodded: ‘You just met Paul’.

He continued: ‘Paul used to work for us and retired about five years ago. In his long employment here he made a big Excel spreadsheet that enables us to generate a profit and loss statement. We hire Paul twice a year just to come in here, push some buttons and get us that statement. We pay him handsomely for that service because we need that statement for the financial authorities here. If we fail to provide the statement even once, we might lose our license.’

Stunned, I looked at him and said: ‘I am going to ask you a tough question.’ He replied: ‘I know what you are going to ask, so go ahead.’ I said: ‘Let’s imagine that, heaven forbid, Paul dies tomorrow.’ He froze, looked me straight in the eye and said: ‘We would go bankrupt or lose our license.’

Although this might seem a little overdone, this is a true story. Think for a moment about what this could mean for you and your company. Do you think you have a Paul in your company? I am sure you have; every customer I talk to recognizes this story in some shape or form. Do you have any idea what he has built and how dependent the company is on it?

It is time to find Paul, talk to him and make sure you understand what he built. If you can, migrate his stuff over to a more corporate solution. In any case, we need to get this under control. This is not a tiny little company I am describing here; this is a multi-million dollar business and the P&L statement comes from a black-box Excel 5.0 sheet that Paul built and only Paul knows how to run.

MDS / DQS integration on a domain controller

Normally I would never advise installing anything on a domain controller, let alone SQL Server, MDS and DQS. However, if you have a BI demo machine you will probably have all this (and more) running on the same box. At least I do. :)

If you do, you will probably get this error message when you try to enable the DQS integration from Master Data Services Configuration Manager after successfully installing DQS and MDS.

When clicking the button ‘Enable integration with Data Quality Services’, an error will pop up:

Here is where it gets a bit confusing. If you read the error message closely, it seems that MDS is looking for a local account on your machine instead of a domain account. However, with it being a domain controller, you cannot create local accounts…

To make this work you need to do the following:

  1. Add a Windows User Login into SQL Server for [YourDomain]\MDS_ServiceAccounts.


  2. Then run the following query against your DQS_MAIN database, which creates a user on the DQS_MAIN database which maps to the login you just created and adds the user to the DQS_Administrator role. Of course you can also do this using the UI. Make sure to enter your DOMAIN in the query below before executing.

    use [DQS_MAIN]
    CREATE USER [MDS_ServiceAccounts] FOR LOGIN [YourDomain\MDS_ServiceAccounts]
    exec sp_addrolemember @rolename=N'dqs_administrator', @membername=N'MDS_ServiceAccounts'
  3. When done go back to the Master Data Services configuration manager and hit the button again. Now it should come back with:

Victory! :)


SQL Server 2012 and SharePoint 2013 – Better Together session on repeat

On June 10th we will be hosting a SQL Server 2012 and SharePoint 2013 – Better Together session aimed at partners at our Microsoft office in the Netherlands! This is the third delivery, because the first two deliveries were overbooked and highly valued.

More information at

Looking forward to meeting you there!

(Please note that this session will be in Dutch..)

SQL Server 2012 Unboxing

Just about every new consumer technology device gets greeted with “unboxing” videos on YouTube. A lot of the people I talk to really need to start unboxing SQL Server 2012 and understand what is in the box. Most of them already have access to SQL Server 2012 and still think it is just a database. There is so much more! This post is aimed at providing a quick overview of what exactly is in the box, with pointers to where you can find documentation.

  1. Database Engine (SSDE)
    First off, let’s start with the product that gave SQL its name: the database. This is without doubt the best known product of the whole SQL suite and also the most used. More often than not this is also the only product people use and know. Find out more here:
  2. Data Quality Services (DQS / SSDQS)
    Introduced with SQL Server 2012, DQS is a knowledge-driven data quality solution that works on the premise of specifying what defines data quality in a knowledge base and using it to cleanse data automatically during ETL (see SSIS below) and Master Data Management (see MDS below) processes, or manually.
  3. Analysis Services (SSAS)
    Analysis Services is SQL Server’s analytical database or cube. It features both more traditional cubes and tabular models, provides self-service analysis capabilities and includes data mining. See
  4. Integration Services (SSIS)
    Integration Services is a full-blown ETL tool and can be used for all sorts of data integration solutions. SSIS features a drag-and-drop interface to build the solution and provides a lot of components out of the box, with connectors to and from just about any database, file storage or file format. If need be, you can also use the power of .NET to build the exact behavior required. SSIS also integrates with DQS to use data quality knowledge bases during ETL processes. For more info visit:
  5. Master Data Services (MDS)
    Master Data Services enables users to build a Master Data Management solution on top of SQL Server. MDS integrates with DQS to make data quality aspects a part of the overall MDM solution. See
  6. Reporting Services (SSRS)
    Reporting Services is the enterprise reporting solution that delivers web-enabled reports that can get information from a variety of sources and be rendered in various formats (including Excel, Word and PDF). Reports can be retrieved on demand, on a subscription basis or based on an alert. Find out more here:
  7. StreamInsight
    StreamInsight is Microsoft’s Complex Event Processor (CEP). CEP technology enables high-throughput, real-time (low latency) processing of streams of data (events). Examples include financial trading, web analytics, sensor data, etc. StreamInsight provides a familiar development platform based on .NET to quickly start using real-time information. See:

That concludes the quick unboxing of SQL Server 2012. Although there is a lot more to say (about features, but also around editions and capabilities), this should give you a good idea of what is in the box. Bottom line: there is a lot more to SQL Server than just a database!
