Friday, December 28, 2012

SSAS Tabular model fails with error "OLE DB or ODBC error: Login failed for user 'domain\instance$'.; 28000"

If you are trying to create an SSAS Tabular model and you get an error similar to the one below, then you are reading the right post, because I am going to explain how to resolve the issue.
The error looks similar to the following:


OLE DB or ODBC error: Login failed for user 'Domain\instancename$'.; 28000.

A connection could not be made to the data source with the DataSourceID of 'e2d72aea-e51d-4816-a6f6-c47e98716eff', Name of 'SqlServer localhost AdventureWorks2012DW 2'.

An error occurred while processing the partition 'DimCustomer_80fa8624-8c3f-43f0-9e17-950292667334' in table 'DimCustomer_80fa8624-8c3f-43f0-9e17-950292667334'.

The current operation was cancelled because another operation in the transaction failed.

Resolution:

If you are creating a tabular model with the "Impersonation Information" set to "Service Account", then you will have to add "NT AUTHORITY\NETWORK SERVICE" as a login under "Security" on the relational database server.

Connect to the relational database engine, expand the "Security" folder, and add a new login for NT AUTHORITY\NETWORK SERVICE.
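If you prefer T-SQL over the SSMS UI, a minimal script along these lines should work; the database name is taken from the error message above, so adjust it and the role membership to your environment:

        USE [master];
        GO
        -- Create the login for the machine account (skip if it already exists)
        CREATE LOGIN [NT AUTHORITY\NETWORK SERVICE] FROM WINDOWS;
        GO
        USE [AdventureWorks2012DW];
        GO
        CREATE USER [NT AUTHORITY\NETWORK SERVICE] FOR LOGIN [NT AUTHORITY\NETWORK SERVICE];
        -- Grant read access so processing can fetch the data
        EXEC sp_addrolemember 'db_datareader', 'NT AUTHORITY\NETWORK SERVICE';
        GO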




Friday, July 27, 2012

Errors in the OLAP storage engine: The attribute key cannot be found when processing

The most common error while processing a cube, and the one everyone faces when they are new to cubes, is "The attribute key cannot be found". Someone new to SSAS will probably find it hard to get to the exact root cause of the error, so today I am going to explain this error message and its resolution. I have created the following example error message to explain the error in a more detailed way.
Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'dbo_FactSales', Column: 'ProductID', Value: '1111'. The attribute is 'Product ID'.

The above error says that the fact table "FactSales" contains a ProductID value of '1111', but that ProductID is not present in your dimension table. A primary key/foreign key relationship exists between the ProductID column of the dimension table and the fact table "FactSales", and the cube is unable to find ProductID 1111 in the dimension table.

So the first step is to check whether your dimension and fact tables both contain the value mentioned in the error message (Value: '1111' in the above example). The most likely case is that your fact table contains the value (ProductID = 1111) but the dimension table does not. If so, your dimension table is not populated properly, so bring ProductID = 1111 into your dimension table. If ProductID 1111 is present in both the dimension table and the fact table, then your cube dimension is simply not up to date yet: run a ProcessUpdate on the corresponding dimension first, and then try to process the measure group or partitions.
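To find all the offending keys at once, you can run a query along these lines against the relational source (a sketch; the dimension table name DimProduct is an assumption here, so substitute your own):

        -- Fact rows whose ProductID has no matching row in the dimension table
        SELECT DISTINCT f.ProductID
        FROM dbo.FactSales AS f
        LEFT JOIN dbo.DimProduct AS p
            ON p.ProductID = f.ProductID
        WHERE p.ProductID IS NULL;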
If you are doing daily processing of your cube (using a SQL Server Agent job), always run ProcessUpdate on your dimensions first and then process your measure groups/partitions.


Wednesday, July 25, 2012

MDX for getting data for last 7 or 15 days

This is a very common requirement in most organizations: analyzing the sum of the last 7 or 15 days of data. If you are asked to write MDX for such a requirement, you can write it in the following way.

MDX for getting the SUM of last 7 days.


WITH
  MEMBER [Measures].[Sum Of Last 7 Days] AS
    Sum
    (
      {
          [Date].[Calendar].CurrentMember.Lag(6)
        :
          [Date].[Calendar].CurrentMember
      }
     ,[Measures].[Internet Sales Amount]
    )
SELECT
  {
    [Measures].[Internet Sales Amount]
   ,[Measures].[Sum Of Last 7 Days]
  } ON COLUMNS
FROM [Adventure Works]
WHERE
  [Date].[Calendar].[Date].&[20070827];

MDX for getting the SUM of last 15 days.

WITH
  MEMBER [Measures].[Sum Of Last 15 Days] AS
    Sum
    (
      {
          [Date].[Calendar].CurrentMember.Lag(14)
        :
          [Date].[Calendar].CurrentMember
      }
     ,[Measures].[Internet Sales Amount]
    )
SELECT
  {
    [Measures].[Internet Sales Amount]
   ,[Measures].[Sum Of Last 15 Days]
  } ON COLUMNS
FROM [Adventure Works]
WHERE
  [Date].[Calendar].[Date].&[20070827];

You can use the above MDX to get the sum over any number of trailing days by just changing the index in the .Lag(Index) function; note that .Lag(6) covers 7 days because the range includes the current member itself.
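As an alternative to the Lag(n) : CurrentMember range, MDX also has the LastPeriods() function, which returns the set of the last n periods ending at the given member. A sketch of the 7-day version using it:

WITH
  MEMBER [Measures].[Sum Of Last 7 Days] AS
    Sum
    (
      LastPeriods(7, [Date].[Calendar].CurrentMember)
     ,[Measures].[Internet Sales Amount]
    )
SELECT
  {
    [Measures].[Internet Sales Amount]
   ,[Measures].[Sum Of Last 7 Days]
  } ON COLUMNS
FROM [Adventure Works]
WHERE
  [Date].[Calendar].[Date].&[20070827];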

Tuesday, July 24, 2012

Read partition QueryDefinition (query) using AMO

I found some similar questions on the MSDN forums asking how to use AMO to get the SQL query used in a partition's QueryDefinition. I answered one such question on the MSDN forum (MSDN thread), but I thought I would share the same on the blog too so others can check it if they need to.


To use AMO, you will need "Microsoft.AnalysisServices.dll". Reference the dll in your project so you can use all the classes of the AnalysisServices namespace. After referencing it, you can use the following code to get the query used in a partition's QueryDefinition.



        Dim objServer As Server
        Dim objDatabase As Database
        Dim objCube As Cube
        Dim objMeasureGroup As MeasureGroup
        Dim objPartitionSource As QueryBinding
        Dim strQuery As String

        objServer = New Server
        objServer.Connect("localhost")

        objDatabase = objServer.Databases.FindByName("Adventure Works DW 2008R2")
        objCube = objDatabase.Cubes.FindByName("Adventure Works")
        objMeasureGroup = objCube.MeasureGroups.FindByName("Internet Sales")

        For Each objPartition As Partition In objMeasureGroup.Partitions
            ' A partition's Source is a QueryBinding only when the partition
            ' is bound to a query; table-bound partitions are skipped here.
            objPartitionSource = TryCast(objPartition.Source, QueryBinding)
            If objPartitionSource IsNot Nothing Then
                strQuery = objPartitionSource.QueryDefinition
            End If
        Next

        objServer.Disconnect()

While executing the above code, strQuery receives the query from each partition's QueryDefinition; since the loop overwrites it on every iteration, it holds the last partition's query once the loop finishes.

Monday, July 23, 2012

No mapping between account names and security IDs was done...

Sometimes you may come across the following error message while deploying a cube: "No mapping between account names and security IDs was done...". I have answered a similar question on the MSDN forums, and the thread for the same is MSDN Thread.

This error occurs while deploying a cube when an Active Directory user has been removed from your Active Directory, but a reference to that AD user name still exists in your SSAS cube roles.
Connect to Analysis Services and go to the "Roles" folder. Open the role, go to the "Membership" tab, and check whether "Specify the users and groups for this role" contains any SID (Security Identifier), which looks somewhat like "S-1-5-21-3623811015-3361044348-30300820-1013". If yes, remove it; if no such SID is present, remove and re-add the Active Directory group and check by doing the deployment again.
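If you have many roles, you can also list the role members programmatically with AMO and spot the orphaned SIDs that way. This is a minimal sketch, assuming the Microsoft.AnalysisServices.dll reference, a localhost instance and the "Adventure Works DW 2008R2" database; adjust the names to your environment:

        Dim objServer As New Server
        objServer.Connect("localhost")

        Dim objDatabase As Database = objServer.Databases.FindByName("Adventure Works DW 2008R2")

        ' Orphaned members show up as raw SIDs (e.g. "S-1-5-21-...")
        ' instead of readable account names.
        For Each objRole As Role In objDatabase.Roles
            For Each objMember As RoleMember In objRole.Members
                Console.WriteLine(objRole.Name & " : " & objMember.Name)
            Next
        Next

        objServer.Disconnect()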

Sunday, July 22, 2012

OLE DB error: OLE DB or ODBC error: Query timeout expired; HYT00

I found many threads on the MSDN forums asking for a solution to the following error message:
"OLE DB error: OLE DB or ODBC error: Query timeout expired; HYT00"
One such thread, which I answered, is MSDN Thread, so I thought I would share some insights on the root cause and the solution.

Whenever we process an entire cube database, the SSAS engine processes all the objects in parallel: dimensions first and then measure groups. If your relational tables hold a huge volume of data and the cube has to fetch a huge volume of data while processing, processing can exceed 1 hour (3600 seconds), and you will then get this "Query timeout" error message. To rectify the issue, you can increase the query timeout from 3600 seconds to 5400 seconds (i.e. 1 hour 30 minutes) or to any value as per your requirement. The next question is where to change this, and the answer is the SSAS server property named "ExternalCommandTimeout". This server property sets the number of seconds that SSAS waits before timing out commands (queries) issued to external data sources.

Now, to change the server property, follow these steps:

1. Connect to SSAS using SSMS, right-click on the instance name and select the "Properties" option.


2. When you click on the "Properties" option, you will get the "Analysis Server Properties" window. Check the checkbox "Show Advanced (All) Properties" and go to the property named "ExternalCommandTimeout". You will find the default value of 3600 seconds. Change the value as per your requirement and click the OK button.
You do not need to restart the service after changing the value of this property.
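If you prefer to change the property programmatically, the same setting can be read and updated through AMO's ServerProperties collection. This is a minimal sketch, assuming the Microsoft.AnalysisServices.dll reference and a localhost instance:

        Dim objServer As New Server
        objServer.Connect("localhost")

        ' Read the current value, then raise it to 5400 seconds (1 hour 30 mins).
        Dim objProperty As ServerProperty = objServer.ServerProperties("ExternalCommandTimeout")
        Console.WriteLine("Current value: " & objProperty.Value)

        objProperty.Value = "5400"
        objServer.Update()          ' sends the changed property to the server

        objServer.Disconnect()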


After changing the property, your cube processing will not time out until it exceeds the value you have set for ExternalCommandTimeout.

Friday, July 20, 2012

Automate creation of cube partitions

In most SSAS solutions, developers create cube partitions per year, semester, quarter or month, depending on the volume of data the fact table contains. Now consider a case in which you have created cube partitions for the previous and current year. The next question that strikes your mind is: "How do I create a new partition before the new year begins?" You could create the partition manually in the SSAS project file and deploy that project before the new year begins, but is that really an efficient solution? You would have to create a new partition and deploy the code before every new year. So the next question is: "Can I automate the creation of partitions?" The answer is yes, and this post is all about that. I am going to explain the case where partitions are created on a yearly basis; if your cube is partitioned per quarter or month, manually modifying the code before every quarter or month begins is a real maintenance task of its own, so automation helps even more. So let's discuss this in detail.

You can automate the creation of partitions by writing some AMO code under "Script Task" of SSIS package. Step-by-step instructions for the same are as follows;

1. Create an SSIS project. I have created a project and package with the name "CreateCubePartitions", but you can use whatever name you want.


2. Drag and drop a "Script Task" from the Toolbox onto the "Control Flow" tab of the package.


3. I have renamed the "Script Task" to "Create Cube Partitions", but you can rename it as per your need. Right-click on the script task and click on the "Edit" option.






4. When you click on the "Edit" option, it will open the "Script Task Editor". From the "ScriptLanguage" option, select "Microsoft Visual Basic 2008" (I am going to demonstrate this example using VB, which is why I selected "Microsoft Visual Basic 2008", but you can use "Microsoft Visual C# 2008" too). After selecting the script language, click the "Edit Script" button; this opens the script editor. Click the "Show All Files" option in the "Project Explorer". You can refer to the following snapshot.










5. Right-click on the "References" node and click on the "Add Reference" option.






6. When you click on the "Add Reference" option, it will open the "Add Reference" form. Under the ".NET" tab, click on "Analysis Management Objects" and click the OK button; you will then find "Microsoft.AnalysisServices.dll" under the References node, which I have highlighted in the following snapshot. (This example is based on SSAS 2008, which is why I selected the 10.0.0.0 version of the dll; select the dll version that matches the version of SSAS installed on your server.)





7. After referencing the dll, you will need to import it so you can use the classes and objects of the AnalysisServices namespace. For that, just write Imports Microsoft.AnalysisServices at the top of your class.



8. Now you can write your AMO code using the different classes of the AnalysisServices namespace. The following is sample code which goes inside Sub Main(); you can modify it as per your requirements.



Public Sub Main()

        Dim objServer As Server
        Dim objDatabase As Database
        Dim objCube As Cube
        Dim objFindPartition As Partition
        Dim objNewPartition As Partition
        Dim strCurrentYear As String
        Dim strNextYear As String
        Dim strQuery As String
        Dim objSource As QueryBinding
        Dim strQueryForNewPartition As String
        Dim strNewPartitionName As String

        ' String.Contains and String.Replace expect strings,
        ' so convert the years up front.
        strCurrentYear = Now.Year.ToString()
        strNextYear = (Now.Year + 1).ToString()

        objServer = New Server
        objServer.Connect("localhost")

        objDatabase = objServer.Databases.FindByName("Adventure Works DW 2008R2")
        objCube = objDatabase.Cubes.FindByName("Adventure Works")

        For Each objMeasureGrp As MeasureGroup In objCube.MeasureGroups

            strQuery = String.Empty
            objSource = Nothing
            strQueryForNewPartition = String.Empty
            strNewPartitionName = String.Empty

            ' Find the current year's partition and derive next year's
            ' name and source query from it.
            For Each objFindPartition In objMeasureGrp.Partitions
                If objFindPartition.Name.Contains(strCurrentYear) Then
                    objSource = TryCast(objFindPartition.Source, QueryBinding)
                    If objSource IsNot Nothing Then
                        strQuery = objSource.QueryDefinition
                        strQueryForNewPartition = strQuery.Replace(strCurrentYear, strNextYear)
                        strNewPartitionName = objFindPartition.Name.Replace(strCurrentYear, strNextYear)
                    End If
                End If
            Next

            ' Only create a new partition if a current-year,
            ' query-bound partition was found in this measure group.
            If objSource IsNot Nothing Then
                objNewPartition = New Partition
                objNewPartition.ID = strNewPartitionName
                objNewPartition.Name = strNewPartitionName
                objNewPartition.StorageMode = StorageMode.Molap
                objNewPartition.Source = New QueryBinding(objSource.DataSourceID, strQueryForNewPartition)
                objMeasureGrp.Partitions.Add(objNewPartition)
                objMeasureGrp.Update(UpdateOptions.ExpandFull)
                objNewPartition.Process(ProcessType.ProcessFull)
            End If
        Next

        objCube.Update(UpdateOptions.Default)
        objDatabase.Update(UpdateOptions.Default)

        objServer.Disconnect()

        Dts.TaskResult = ScriptResults.Success
End Sub



9. With that, the package is done, and you can schedule it using SQL Server Agent. Create a new job, and under the Schedule tab, in "New job schedule", select a Monthly frequency, pick an appropriate day, and set it to recur every 12 months. Schedule the job quarterly or monthly instead if you create your cube partitions on a quarterly or monthly basis.



Monday, July 16, 2012

Fetch cube data using .Net code

When I went through the MSDN SSAS forum, I found that there are many professionals who want to know how to fetch cube data using .NET code. The following is one such thread from the MSDN forum:
SSAS MSDN Forum thread. So I thought let us discuss this topic here on my blog.

To fetch cube data using .NET code, you will need a reference to Microsoft.AnalysisServices.AdomdClient.dll. If you don't have this dll, you can download it from the Microsoft download center here: Microsoft download. Open the link and download the msi file for "Microsoft SQL Server 2008 R2 ADOMD.NET" as per your server architecture (i.e. x86, x64 or IA64).
After downloading, run the msi file. After a successful run, go to the location "C:\Program Files\Microsoft.NET\ADOMD.NET\100\" and check whether the dll named "Microsoft.AnalysisServices.AdomdClient.dll" is present there.


Create a .NET project and add the reference to "Microsoft.AnalysisServices.AdomdClient.dll" under the References node.




After adding the dll reference, you can import the AdomdClient namespace and use the different classes and objects of that namespace.



After all the above steps, you can write the following code, which fetches cube data and fills it into a data table.



        ' Connect to the local SSAS instance and the sample database.
        Dim objConnection As New AdomdConnection("Data Source=localhost;Initial Catalog=Adventure Works DW 2008R2")
        Dim objCommand As New AdomdCommand()
        Dim objDatatable As New DataTable
        Dim strCommand As String

        ' MDX that returns Reseller Sales Amount by product category.
        strCommand = "SELECT [Measures].[Reseller Sales Amount] ON COLUMNS,"
        strCommand = strCommand & " [Product].[Category].[Category] ON ROWS "
        strCommand = strCommand & " FROM [Adventure Works]"

        objConnection.Open()
        objCommand.Connection = objConnection
        objCommand.CommandText = strCommand

        ' AdomdDataAdapter flattens the cellset into the DataTable.
        Dim objDataAdapter As New AdomdDataAdapter(objCommand)
        objDataAdapter.Fill(objDatatable)

        objConnection.Close()
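To verify the result, you can loop over the filled DataTable; with the MDX above, the first column holds the product category and the second the Reseller Sales Amount:

        For Each objRow As DataRow In objDatatable.Rows
            Console.WriteLine(objRow(0).ToString() & " : " & objRow(1).ToString())
        Next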









Thursday, July 12, 2012

Process Cube objects using AMO

Today I am going to share code for processing cube objects using AMO (Analysis Management Objects). To use AMO, you need to reference Microsoft.AnalysisServices.dll. The following is the code you can use for processing: in this sample, I process the dimensions first (i.e. ProcessUpdate on the dimensions) and then the whole cube. It is just a sample; you can modify it as per your requirements.


        Dim objServer As Server
        Dim objDatabase As Database

        objServer = New Server
        objServer.Connect("localhost")

        ' Capture the processing commands as XMLA instead of running them
        ' one by one; nothing executes until ExecuteCaptureLog is called.
        objServer.CaptureXml = True

        objDatabase = objServer.Databases.FindByName("Adventure Works DW 2008R2")

        ' ProcessUpdate the dimensions first so new attribute keys exist
        ' before the measure groups are processed.
        For Each objDimension As Dimension In objDatabase.Dimensions
            objDimension.Process(ProcessType.ProcessUpdate)
        Next

        For Each objCube As Cube In objDatabase.Cubes
            objCube.Process(ProcessType.ProcessData)
            objCube.Process(ProcessType.ProcessIndexes)
        Next

        ' Run the captured commands in a transaction, in parallel.
        objServer.ExecuteCaptureLog(True, True)


Analysis Services starts processing the cube objects only when you run the "ExecuteCaptureLog" function.
"ExecuteCaptureLog" accepts up to three parameters: the first indicates whether the captured log entries will be executed in a transaction, the second whether they will be executed in parallel, and the third (optional) whether the affected objects will be processed as well.

If you just want to retrieve the generated XMLA, then instead of the "ExecuteCaptureLog" function you can use the function named "ConcatenateCaptureLog", which returns the XMLA string.
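For example, a minimal sketch using the same transactional and parallel flags as above:

        Dim strXmla As String
        strXmla = objServer.ConcatenateCaptureLog(True, True)
        Console.WriteLine(strXmla)   ' inspect or save the XMLA instead of executing it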